The Null Device

Posts matching tags 'psychology'

2014/12/21

Youth subcultures, as many middle-aged former teenagers have lamented, aren't what they used to be. In the old days, you see, you had to make your choices (or have them made for you by which class you had been born into and, in the light of this, what would get you camaraderie and what would get you beaten up), and stick with them. You were either a Mod or a Rocker, or a Punk or a Grebo or something, and you would live (and, in some cases, die) by that; you wore the uniform, listened to the music, and had little tolerance for other subcultures; if you fancied the other side's soundtrack or sartorial style, you would keep it to yourself, or else. But kids these days (kids these days...!) treat subculture as if it were a supermarket, or perhaps Noel Fielding's dressing-up chest; the entire back-catalogue of young cool is there for the taking and the mashing up, with elements going in and out of style by the season, to be worn as accessories. Beyond the dress-up element, the default becomes a globalised homogenate, a sort of international Brooklyn/Berlin/Harajuku of skinny jeans/folkbeards/vividly coloured sunglasses/patterned fabrics worn by the international Hipster, and, more intensely and urgently, by their adolescent precursor, the Scene Kid. Subcultural music, meanwhile, is a post-ironic soup of the last few decades of influences, refracted through the prisms of trend blogs (drum machines, hazy synths, skronky/choppy guitars, That Krautrock Drum Beat, and so on). Parties are inevitably called “raves”, whether or not they bear any similarity to MDMA-fuelled bacchanales around the M25 circa 1987. Increasingly, the content of the subculture becomes interchangeable, and the process of performing a subculture becomes the subculture; almost like a masked ball, or a postmodern reënactment society for the youth tribes of the 20th century.

But once one goes beyond the idea of a subculture as being based around fashion or music, things sometimes start to get much more unusual.
One case in point is the Tulpamancy subculture, which could be summed up in three words as “extreme imaginary friends”. Tulpamancers essentially invent imaginary friends and believe in them really hard, to the point of voluntarily inducing dissociative personalities in themselves, hiving off one part of their minds to become another, autonomous personality with whom they can interact.

The term tulpa is a Tibetan word meaning a sentient being created from pure thought; the practice crossed over from Tibetan mysticism into the Western occult/esoteric fringe in the early 20th century (the explorer Alexandra David-Neel was one pioneer), but the modern version owes more to internet “geek” subcultures; it started amongst Bronies (dudes who are really into My Little Pony, which may be either a repudiation of gender dichotomies or the ontological equivalent of a frat-bro panty raid on the idea of “girl”, or both or neither), before spreading to other branches of “geek” culture/fandom.

Tulpas remained the preserve of occultists until 2009, when the subject appeared on the discussion boards of 4chan. A few anonymous members started to experiment with creating tulpas. Things snowballed in 2012 when adult fans of My Little Pony: Friendship is Magic – known as “bronies” to anyone who's been near a computer for the past three years – caught on. They created a new forum on Reddit and crafted tulpas based on their favourite characters from the show.

In the cross-pollinating fields of the internet, it wasn’t long before tulpamancy also started to attract manga and fantasy fans. “My tulpa is called Jasmine,” says Ele. “She’s a human but from an alternative reality where she can do magic. I created her a dozen years ago for a fantasy series I write and then made her into a tulpa.”

Being a fandom subculture, there are, of course, plenty of drawings (of varying levels of execution) depicting tulpas; one probably would not be too surprised to find that many look either like anime bishonen with fox ears/snouts and/or variants on Hot Topic Darkling. Because, of course, that's what one's magical alter-ego looks like in fandom.

As for the creation of, and interaction with, these tulpas, an entire methodology has evolved. Tulpamancers don't so much consciously think up their spirit critters as mentally create a wonderland, imagine themselves in it, and let the tulpas rise up from their subconscious to meet them. From then on, they practise imagining them, allowing them to become clearer, until ultimately they are able to hallucinate them in everyday reality, which is where the fun starts:

While voice is the most common way tulpas communicate with their hosts, tulpamancers can learn to stroke their tulpa’s fur, feel their breath on their neck and even experience sexual contact.

Tulpas soon get curious about their host’s body; some want to experience life as a “meatperson”. Indulgent hosts then use a practice called “switching”, which allows their tulpa to possess their body while they watch from the ringside of consciousness.

This, of course, sounds a lot like dissociative identity disorder, something not generally seen as desirable. Some tulpamancers, though, have turned that claim on its head: rather than dissociation being a disorder, or a symptom of one, what if it could be a way of self-medicating or coping? What if, in other words, the optimal number of personalities in one body is, in some circumstances, greater than one?

Koomer’s case is rare, and for Veissière “schizophrenia [could be understood as]… an incapacitating example of ‘involuntary Tulpas’", therefore, by forming positive relationships with their symptoms, sufferers can start to recover. It's an idea shared by the “Hearing Voices Movement”, who challenge the medical models of schizophrenia and suggest that pathologisation aggravates symptoms. “My schizophrenia manifested itself by having many thoughts and ideas all conflicting and shouting at me,” says Logan, who wanted his last name withheld. “Turning them into tulpas gave those thoughts a face and allowed them to be sorted out in a way that made sense.”

In January 2012, Facebook conducted a psychology experiment on 689,003 unknowing users, modifying the stories they saw in their news feeds to see whether this affected their emotional state. The experiment was automatically performed over one week by randomly selecting users and randomly assigning them to two groups; one had items with positive words like “love” and “nice” filtered out of their news feeds, whereas the other had items with negative words similarly removed; the software then tracked the affect of their status updates to see whether this affected them. The result was that it did: those who saw only positive and neutral posts tended, on average, to post more cheerfully than those who saw only negative and neutral ones. (The experiment, it must be said, was entirely automated, with human researchers never seeing the users' identities or posts.)
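The design, as described, amounts to a randomised filter over each user's feed. A toy sketch of that logic might look something like the following; the word lists and function names are hypothetical illustrations, not Facebook's actual code, which has never been published:

```python
import random

# Hypothetical sentiment word lists; the real study used the LIWC lexicon.
POSITIVE = {"love", "nice", "great", "happy"}
NEGATIVE = {"sad", "hate", "awful", "hurt"}

def assign_condition(rng):
    """Randomly assign a user to one of the two filtering conditions."""
    return rng.choice(["suppress_positive", "suppress_negative"])

def filter_feed(posts, condition):
    """Drop any post containing a word of the suppressed polarity."""
    banned = POSITIVE if condition == "suppress_positive" else NEGATIVE
    return [p for p in posts if not (set(p.lower().split()) & banned)]

rng = random.Random(0)
feed = ["I love this band", "what an awful day", "meeting at noon"]
condition = assign_condition(rng)
print(condition, filter_feed(feed, condition))
```

The actual experiment then measured the sentiment of each user's own subsequent status updates, which is where the (small but statistically significant) contagion effect showed up.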

Of course, this sort of experiment sounds colossally unethical, not to mention irresponsible. The potential adverse consequences are too easy to imagine, and too hard to comfortably dismiss. If some 345,000 people's feeds were modulated to feed them a week of negativity in the form of what they thought were their friends' updates, what proportion of those were adversely affected beyond feeling bummed out for a week? Out of 345,000, what would be the expected number of relationship breakups, serious arguments, alcoholic relapses, or even incidents of self-harm set off by the online social world looking somewhat bleaker and more joyless? And while it may seem that the other cohort, who got a week's worth of sunshine and rainbows, were done a favour, this is not the case; riding the heady rush of good vibes, some of them may have made bad decisions, taking gambles on bad odds because they felt lucky, or dismissing warning signs of problems. And then there's the fact that messages from their friends and family members were deliberately withheld from them if they ran against the goals of the experiment. What if someone in the negative cohort was cut off from communications with a loved one far away, for just long enough to introduce a grain of suspicion into their relationship, or someone in the positive cohort didn't learn about a close friend's problems and was unable to offer support?

In academe, this sort of thing would not pass an ethics committee, where informed consent is required. However, Facebook is not an academic operation, but a private entity operating in the mythical wild frontier of the Free Market, where anything both parties consent to (“consent” here being defined in the loosest sense) goes. And when you signed up for a Facebook account, you consented to them doing pretty much whatever they like with your personal information and the relationships mediated through their service. If you don't like it, that's fine; it's a free market, and you're welcome to delete your account and go to Google Plus. Or if Google's ad-targeting and data mining don't appeal, to build your own service and persuade everyone you wish to keep in touch with to use it. (Except that you can't; these aren't the wild 1990s, when a student could build LiveJournal in his dorm room; nowadays, the legal liabilities and regulatory compliance requirements would keep anyone other than multinational corporations with deep pockets out of the game.) Or go back to emailing a handful of friends, in the hope that they'll reply to your emails in the spare time left over after keeping up with Facebook. Or only socialising with people who live within walking distance of the same pub as you. Or, for that matter, go full Kaczynski and live in a shack in the woods. And when you've had enough of trapping squirrels for your food and mumbling to yourself as you stare at the corner each night, you can slink back to the bright lights, tail between your legs, reconnect with Mr. Zuckerberg's Magical Social Casino, where all your friends are, and once again find yourself privy to sweet, sweet commercially-mediated social interaction. In the end, we all come back. We know that, in this age, the alternative is self-imposed exile and social death, and so does Facebook, so they can do what they like to us.

As novel as this may seem, this is another instance of the neoliberal settlement, tearing up prior settlements and regulations in favour of a flat, market-based system, rationalised by a wilful refusal to even consider the disparities of power dynamics (“there is no such thing as an unfair deal in a free market, because you can always walk away and take a better offer from one of the ∞ other competitors”, goes the argument taken to its platygæan conclusion). Just as in deregulated economies, classes of participants (students, patients, passengers) all become customers, with their roles and rights replaced by what the Invisible Hand Of The Free Market deals out (i.e., what the providers can get away with them acquiescing to when squeezed hard enough), here those using a means of communication become involuntary guinea pigs in a disruptive, and (for half of them) literally unpleasant experiment. All that Facebook has to provide, in theory, is something marginally better than social isolation, and everything is, by definition, as fair as can be.

Facebook have offered an explanation, saying that the experiment was intended to “make the content people see as relevant and engaging as possible”. Which, given the legendarily opaque Facebook feed algorithm, and how it determines which of your friends' posts get a look-in amongst the precious spaces between the ads and sponsored posts, is small comfort. Tell you what, Facebook: why don't you stop trying to make my feed more “relevant” and “engaging” and just give me what my friends, acquaintances and groups post, unfiltered and in chronological order, and let me filter it as I see fit?

The ascent up the Maslow hierarchy of needs might have a dark side; a US psychologist claims that the ideal of self-actualisation has created a world in which romantic relationships are more likely to fail. Eli Finkel of Northwestern University posits the “suffocation” model of marriage, which asserts that, as the needs we have of a partner have changed from shared survival in a hostile environment, through romantic love, to mutual self-discovery, and as the time couples spend with one another decreases due to external constraints, it becomes harder for any actual relationship with another human being (especially one who also wishes to discover themselves) to fit the bill:

"People used to marry for basic things like food and shelter. In the 1800s, you didn't have to have profound insight into your partner's core essence to tend to the chickens or build a sound physical structure against the snow," Finkel said. "Back then, the idea of marrying for love was ludicrous."

"In 2014, you are really hoping that your partner can help you on a voyage of discovery and personal growth, but your partner cannot do that unless he or she really knows who you are, and really understands your core essence. That requires much greater investment of time and psychological resources," he said.

Book of Lamentations, a review of the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, as a work of dystopian literature:

If the novel has an overbearing literary influence, it’s undoubtedly Jorge Luis Borges. The American Psychiatric Association takes his technique of lifting quotes from or writing faux-serious reviews for entirely imagined books and pushes it to the limit: Here, we have an entire book, something that purports to be a kind of encyclopedia of madness, a Library of Babel for the mind, containing everything that can possibly be wrong with a human being. Perhaps as an attempt to ward off the uncommitted reader, the novel begins with a lengthy account of the system of classifications used – one with an obvious debt to the Borgesian Celestial Emporium of Benevolent Knowledge, in which animals are exhaustively classified according to such sets as “those belonging to the Emperor,” “those that, at a distance, resemble flies,” and “those that are included in this classification.”

As you read, you slowly grow aware that the book’s real object of fascination isn’t the various sicknesses described in its pages, but the sickness inherent in their arrangement... This mad project is clearly something that its authors are fixated on to a somewhat unreasonable extent. In a retrospectively predictable ironic twist, this precise tendency is outlined in the book itself. The entry for obsessive-compulsive disorder with poor insight describes this taxonomical obsession in deadpan tones: “repetitive behavior, the goal of which is […] to prevent some dreaded event or situation.” Our narrator seems to believe that by compiling an exhaustive list of everything that might go askew in the human mind, this wrong state might somehow be overcome or averted. References to compulsive behavior throughout the book repeatedly refer to the “fear of dirt in someone with an obsession about contamination.” The tragic clincher comes when we’re told, “the individual does not recognize that the obsessions or compulsions are excessive or unreasonable.” This mad project is so overwhelming that its originator can’t even tell that they’ve subsumed themselves within its matrix. We’re dealing with a truly unreliable narrator here, not one that misleads us about the course of events (the narrator is compulsive, they do have poor insight), but one whose entire conceptual framework is radically off-kilter. As such, the entire story is a portrait of the narrator’s own particular madness. With this realization, DSM-5 starts to enter the realm of the properly dystopian.

A behavioural economist from Yale has posited the theory that how one's primary language handles the future tense influences the amount of planning one does for the future, with one consequence being that English speakers save less for their old age than speakers of languages such as Mandarin and Yoruba, which lack a separate future tense and instead treat the future as part of the present. Professor Keith Chen's theory is that, in doing so, such languages encourage and entrench habits of thought more conducive to mindfulness of one's future than languages where the future is hived off into a separate grammatical tense:

Prof Chen divides the world's languages into two groups, depending on how they treat the concept of time. Strong future-time reference languages (strong FTR) require their speakers to use a different tense when speaking of the future. Weak future-time reference (weak FTR) languages do not.

"The act of savings is fundamentally about understanding that your future self - the person you're saving for - is in some sense equivalent to your present self," Prof Chen told the BBC's Business Daily. "If your language separates the future and the present in its grammar that seems to lead you to slightly disassociate the future from the present every time you speak.

The effect is not limited to exotic non-European languages; similar differences are present in European languages to an extent (for example, one often uses the present tense in German to refer to events in the future, which is not the case in English, French or Italian; whether this has any causal relationship with the higher rate of personal saving in Germany remains to be determined).

If this effect holds true, all may not be lost; one could consciously intervene in English to an extent without breaking too much, by forcing oneself to say things like “I'm going to the seminar” rather than “I will go to the seminar”. Further flattenings-out of the future tense, however, get more awkward; saying, at age 29, “I'm retiring to the south of France” could raise a few eyebrows.

Researchers in the US have been investigating the question of what is “cool” from a psychological perspective, hitting upon the dichotomy between the two opposite poles the term can describe: on one hand, agreeability and popularity; on the other, a vaguely antisocial countercultural/oppositional stance reflected in the classic iconography of rebels and outlaws from the history of cool:

"I got my first sunglasses when I was about 13," said Dar-Nimrod. "There wasn't a cooler kid on the block for the next few days. I was looking cool because I was distant from people. My emotions were not something they could read. I put a filter between me and everyone else. That, in my mind, made me cool. Today, that doesn't seem to be supported. If anything, sociability is considered to be cool, being nice is considered to be cool. And in an oxymoron, being passionate is considered to be cool—at least, it is part of the dominant perception of what coolness is. How can you combine the idea of cool—emotionally controlled and distant—with passionate?"

"We have a kind of a schizophrenic coolness concept in our mind," Dar-Nimrod said. "Almost any one of us will be cool in some people's eyes, which suggests the idiosyncratic way coolness is evaluated. But some will be judged as cool in many people's eyes, which suggests there is a core valuation to coolness, and today that does not seem to be the historical nature of cool. We suggest there is some transition from the countercultural cool to a generic version of it's good and I like it. But this transition is by no way completed."

The researchers claim that the concept of “cool” is mutating away from the oppositional/rebellious sense and towards straight agreeability.

If this phenomenon does bear out, there may be a number of possible explanations. Perhaps, as the countercultural struggles against the repressive hegemony of the “squares” have receded into folk memory of The Fifties, and everyone wears jeans, listens to rock and has smoked a joint at least once in their lives, the idea of the rebel is left with even less of a cause than before. Perhaps the shift in the meaning of “cool” has something to do with the ongoing process of commodification of the counterculture, with the sneers and icy glares of vintage cool now being little more than a mask for agreeable dudes to put on when the occasion suits. Or perhaps, in the information age, being agreeable and well-connected confers a greater advantage than being tough and detached. One would imagine that this would be the case in most normal situations, in which case the old world of tough guys and strong, silent types would have been an anomalous case, a hostile environment which traumatised its inhabitants into growing expensive carapaces of character armour.

Another option would be that the meaning of “cool” is not, in fact, changing (this study doesn't seem to involve surveys done decades earlier to gauge what people thought at the time, and compares living attitudes with canned stereotypes), and that the word “cool” has several meanings; when it's used as a term of approval for a person, it has always indicated agreeability, whereas when talking about fictional characters, it suggested a certain type of antiheroic asshole.

Cracked's David Wong has a list of five telltale indicators of a bullshit political story; in this case, a “bullshit political story” is one which ignores the actual issues and treats politics as a sporting event, appealing to the audience's identification with one team or other:

The answer is that many (if not most) people don't follow politics in order to find out who to vote for as part of their duty as citizens living in a democracy. They follow it purely as a form of entertainment. They're like sports fans, rooting for their "team" to win. And as you're going to find out, virtually all political news coverage is written to appeal to those people. They're the most rabid "consumers" of news, and their traffic is the most reliable, so the news is tailored to appeal to them. In the business, they derisively call it "horse race journalism," where the stories focus purely on the "sport" of politics rather than the consequences.

The telltale signs are stories with the word “gaffe” in the headline (generally some content-free event giving one half of the stadium cause to hoot and jeer at what dumbasses the other side are), anything about a politician “blasting” the other side (which appeals to the audience's inner wrestling fan), weasel-worded headlines asking a question (the answer to which is generally “probably not”), headlines attempting to escalate random low-ranking members of one political side, generally with non-mainstream opinions, to the status of “lawmakers” or “advisors” and demanding that the leadership take responsibility for them, and real-world political issues being framed as a “blow to” one political side or other:

That's where the gaffe stories come in. See, in this game, your "team" scores a point each time the other team says something stupid. It lets all of the supporters of your team mock and humiliate the supporters of the opposing team, on Internet message boards and around water coolers and in coffee shops nationwide. "Haha! The supposed 'genius' Obama thinks there are 57 states in the U.S.!" "Oh, yeah? Well, your last president said he was going to help terrorists plan their next attack!"

Hey, did you know that Barack Obama is an out-of-touch elitist because he puts fancy Dijon mustard on his hamburgers? Did you know that Mitt Romney is an insane sociopath because he once made his pet dog ride on top of his car 26 years ago? Did you know John Kerry can't relate to the average person because he puts Swiss cheese on his Philly cheese steaks? Did you know that George W. Bush hates foreigners so much that he wiped his hand after shaking hands with a Haitian? Did you know that all of this is petty schoolyard bullshit that wastes valuable time and energy that you'll never get back?

And, as smarter commentators have pointed out, there's an even bigger problem with this: It actually implies that the issue itself is completely unimportant. For instance, if the courts overturn some regulation about mercury in the water or Congress blocks car mileage standards, it always gets reported as "A Blow to Environmentalists." Oh, no, it's not a blow to the people who have to drink the water or breathe the air, or the taxpayers who have to fund the regulations, or the businesses that lose jobs over it. It's either a "blow to environmentalists" or it's not. They specifically make it sound like the effects extend purely to some fringe special interest group and absolutely no one else.

I'm telling you from experience, watching political races this way is addictive as shit. You have thousands of years of violent tribal instincts pumping through your veins, itching for a fight. That makes you an easy tool for manipulation, and every good politician and pundit knows how to push those buttons to make people march neatly in formation. Don't succumb. Or else you'll start supporting the most bullshit legislation just because your guy is for it. Or you'll start knee-jerk rejecting anything the other "team" proposes. Not because it's bad for the country, but because you want to deny them a "win."

When testing drugs for treating depression on lab mice, it is important to have ways of determining whether or not a mouse is depressed, or suffering from the mouse equivalent of depression. Not surprisingly, mice deemed to be depressed are the ones which give up and stop struggling when faced with difficulty, and which get little joy from life:

Forced swimming test. The rat or mouse is placed into a cylinder partially filled with water from which escape is difficult. The longer it swims, the more actively it is trying to escape; if it stops swimming, this cessation is interpreted as depressionlike behavior, a kind of animal fatalism.

Sugar water preference. The preference an animal shows for sugar water is taken as an indication of its ability to derive pleasure, a quality that is missing in depression. Most rodents, when given two identical-looking sources of water, will drink much more of the sweetened water than the plain water. Rodents exposed to chronic stress or whose brains have been manipulated show no such preference.

People who tested as "extroverts" on the personality test tended to have more friends, but their networks tended to be more sparse, meaning that they made friends with lots of different people who are less likely to know each other.

The researchers also found that people with long last names tended to be more neurotic, perhaps because "a lifetime of having one's long last name misspelled may lead to a person expressing more anxiety and quickness to anger," according to the study, which is being presented this week at the Computer Human Interaction conference in Vancouver.

Psychologist Bruce Levine makes the claim that, in the US, the psychological profession has a bias towards conformism and authoritarianism, and against anti-authoritarian tendencies. This bias apparently results from the institutional structure of the profession, which selects for and reinforces pro-conformist and pro-authoritarian tendencies, and manifests itself, among other things, in those who exhibit “anti-authoritarian tendencies” being caught, diagnosed with various mental illnesses and medicated into compliance before they can develop into actual troublemakers:

In my career as a psychologist, I have talked with hundreds of people previously diagnosed by other professionals with oppositional defiant disorder, attention deficit hyperactive disorder, anxiety disorder and other psychiatric illnesses, and I am struck by (1) how many of those diagnosed are essentially anti-authoritarians, and (2) how those professionals who have diagnosed them are not.

Anti-authoritarians question whether an authority is a legitimate one before taking that authority seriously. Evaluating the legitimacy of authorities includes assessing whether or not authorities actually know what they are talking about, are honest, and care about those people who are respecting their authority. And when anti-authoritarians assess an authority to be illegitimate, they challenge and resist that authority—sometimes aggressively and sometimes passive-aggressively, sometimes wisely and sometimes not.

Some activists lament how few anti-authoritarians there appear to be in the United States. One reason could be that many natural anti-authoritarians are now psychopathologized and medicated before they achieve political consciousness of society’s most oppressive authorities.

Showing hostility to or resentment of authority will get one diagnosed with various conditions, such as “oppositional defiant disorder (ODD)”, a condition which manifests itself in deficits in “rule-governed behaviour”, and for which, as for many parts of the human condition, there are many types of corrective medication these days. (Compare this to the condition of “sluggish schizophrenia”, which only existed in the Soviet Union and manifested itself as a rejection of the self-evident truth of Marxism-Leninism.)

While pretty much every hierarchical society has mechanisms for encouraging conformity to some degree, Dr. Levine's contention is that the increase in psychiatric medication in recent years may be leading to a more authoritarian and conformist society.

1. Exploit pattern recognition. I magically produce four silver dollars, one at a time, with the back of my hand toward you. Then I allow you to see the palm of my hand empty before a fifth coin appears. As Homo sapiens, you grasp the pattern, and take away the impression that I produced all five coins from a hand whose palm was empty.

2. Make the secret a lot more trouble than the trick seems worth. You will be fooled by a trick if it involves more time, money and practice than you (or any other sane onlooker) would be willing to invest. My partner, Penn, and I once produced 500 live cockroaches from a top hat on the desk of talk-show host David Letterman. To prepare this took weeks. We hired an entomologist who provided slow-moving, camera-friendly cockroaches (the kind from under your stove don’t hang around for close-ups) and taught us to pick the bugs up without screaming like preadolescent girls. Then we built a secret compartment out of foam-core (one of the few materials cockroaches can’t cling to) and worked out a devious routine for sneaking the compartment into the hat. More trouble than the trick was worth? To you, probably. But not to magicians.

3. It’s hard to think critically if you’re laughing. We often follow a secret move immediately with a joke. A viewer has only so much attention to give, and if he’s laughing, his mind is too busy with the joke to backtrack rationally.

Almost every major retailer, from grocery chains to investment banks to the U.S. Postal Service, has a “predictive analytics” department devoted to understanding not just consumers’ shopping habits but also their personal habits, so as to more efficiently market to them. “But Target has always been one of the smartest at this,” says Eric Siegel, a consultant and the chairman of a conference called Predictive Analytics World. “We’re living through a golden age of behavioral research. It’s amazing how much we can figure out about how people think now.”

A specific case the article describes is that of finding new parents-to-be, who will soon be in a position of having to form new spending habits, and doing so before the competition can identify them from public information, which is to say, inferring from statistical information whether a customer is likely to be pregnant, and at which stage, and then subtly manipulating her spending through the various milestones of her child's development:

The only problem is that identifying pregnant customers is harder than it sounds. Target has a baby-shower registry, and Pole started there, observing how shopping habits changed as a woman approached her due date, which women on the registry had willingly disclosed. He ran test after test, analyzing the data, and before long some useful patterns emerged. Lotions, for example. Lots of people buy lotion, but one of Pole’s colleagues noticed that women on the baby registry were buying larger quantities of unscented lotion around the beginning of their second trimester. Another analyst noted that sometime in the first 20 weeks, pregnant women loaded up on supplements like calcium, magnesium and zinc. Many shoppers purchase soap and cotton balls, but when someone suddenly starts buying lots of scent-free soap and extra-big bags of cotton balls, in addition to hand sanitizers and washcloths, it signals they could be getting close to their delivery date.

One Target employee I spoke to provided a hypothetical example. Take a fictional Target shopper named Jenny Ward, who is 23, lives in Atlanta and in March bought cocoa-butter lotion, a purse large enough to double as a diaper bag, zinc and magnesium supplements and a bright blue rug. There’s, say, an 87 percent chance that she’s pregnant and that her delivery date is sometime in late August. What’s more, because of the data attached to her Guest ID number, Target knows how to trigger Jenny’s habits. They know that if she receives a coupon via e-mail, it will most likely cue her to buy online. They know that if she receives an ad in the mail on Friday, she frequently uses it on a weekend trip to the store. And they know that if they reward her with a printed receipt that entitles her to a free cup of Starbucks coffee, she’ll use it when she comes back again.
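The article doesn't publish Pole's actual model, but the "87 percent chance" figure suggests purchases being combined into a probability score. A minimal sketch of how such a score might work, assuming a simple logistic model; the item weights and the baseline are invented for illustration, not Target's:

```python
import math

# Hypothetical per-item evidence weights (log-odds contributions); the real
# model and its product list are proprietary and not given in the article.
WEIGHTS = {
    "unscented lotion": 1.2,
    "zinc supplement": 0.8,
    "magnesium supplement": 0.8,
    "large purse": 0.6,
    "bright blue rug": 0.3,
    "scent-free soap": 0.9,
    "extra-large cotton balls": 0.7,
}
BASELINE = -3.0  # assumed prior log-odds of pregnancy for an arbitrary shopper


def pregnancy_score(basket):
    """Combine item evidence into a probability via the logistic function."""
    log_odds = BASELINE + sum(WEIGHTS.get(item, 0.0) for item in basket)
    return 1.0 / (1.0 + math.exp(-log_odds))


# The article's fictional shopper, Jenny Ward:
jenny = ["unscented lotion", "large purse", "zinc supplement",
         "magnesium supplement", "bright blue rug"]
print(round(pregnancy_score(jenny), 2))
```

The point of the additive structure is that no single purchase is telling, but a basket of individually innocuous items can be.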

The uncanny accuracy of the algorithm is demonstrated by an anecdote about an angry father storming into a Target store, demanding to know why they had sent his teenaged daughter coupons for nappies and prams, and then, some time later, returning to apologise, having discovered that she had, in fact, become pregnant.

Of course, there is the small issue of how to use this wealth of information in a plausibly deniable way, without being obviously creepy. People, after all, tend to react badly to being surreptitiously watched and manipulated, especially so when deeply personal matters are involved:

“With the pregnancy products, though, we learned that some women react badly,” the executive said. “Then we started mixing in all these ads for things we knew pregnant women would never buy, so the baby ads looked random. We’d put an ad for a lawn mower next to diapers. We’d put a coupon for wineglasses next to infant clothes. That way, it looked like all the products were chosen by chance. And we found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons. She just assumes that everyone else on her block got the same mailer for diapers and cribs. As long as we don’t spook her, it works.”

When a psychologist or psychiatrist testifies during a defendant’s competency hearing, the psychologist or psychiatrist shall wear a cone-shaped hat that is not less than two feet tall. The surface of the hat shall be imprinted with stars and lightning bolts. Additionally, a psychologist or psychiatrist shall be required to don a white beard that is not less than 18 inches in length, and shall punctuate crucial elements of his testimony by stabbing the air with a wand. Whenever a psychologist or psychiatrist provides expert testimony regarding a defendant’s competency, the bailiff shall contemporaneously dim the courtroom lights and administer two strikes to a Chinese gong…

The amendment passed unanimously, but was removed from the final law, to the detriment of the theatrical beard and Chinese gong industries.

Idea of the day: the Happy Recession; the idea that the internet, by driving costs down, will permanently deflate both prices and wages; the post-internet world, it seems, jams econo:

The most pernicious aspect of Internet entertainment is that it’s so easy to measure and so easy to mass-produce. So the moment something on the Internet gets fun enough to be competitive with the real-world analogue, it starts getting relentlessly improved until it’s vastly superior. World of Warcraft soaks up upwards of forty hours per week from serious fans, who pay about $15 per month for their subscriptions. Few other hobbies can consume so much time at such a low cost.

The web makes it easier to access non-traditional employees at much lower salaries. As we argued in our Demand Media analysis, the real story here is that a stay-at-home mom with a Masters in Journalism can write content that is good enough compared to a typical Madison Avenue copywriter, especially when the rate is $15 per article instead of six figures per year. This disaggregation of writing skill means that companies no longer have to hire good writers in order to write 5% good copy and 95% mediocre work; they can outsource the mediocre stuff and relegate the high-end work to a short-term freelancer.

The web offers cheap social status: In the long term, this may have a bigger effect than the web merely making digitizable products cheaper. Social status games drive a huge amount of economic activity: people strive to get into high-paying, high prestige career tracks, to win promotions and attendant raises, to live in the best neighborhoods and send their kids to the best schools. Few status games lack some kind of economic output—people who play sports well below the professional level still get some job opportunities out of it.

One could probably also add a geographical factor to this: in the age of cheap, ubiquitous connectivity, access to economic and cultural opportunities is less dependent on being located in a buzzing metropolis or creative-class hive. If a copywriter or app developer can work from anywhere, if music and art scenes (or whatever replaces the post-punk rock'n'roll-era construction of the "music scene" in the cultural ecosystem) are centred around blogs rather than physical venues, and if one no longer needs to move to find like-minded people, then there would be less competition for living in the most desirable areas, as the price of living elsewhere no longer includes disconnection from so many opportunities.

My companion, a senior UK investment banker, and I are discussing the most successful banking types we know and what makes them tick. I argue that they often conform to the characteristics displayed by social psychopaths. To my surprise, my friend agrees.

He then makes an astonishing confession: "At one major investment bank for which I worked, we used psychometric testing to recruit social psychopaths because their characteristics exactly suited them to senior corporate finance roles."

Here was one of the biggest investment banks in the world seeking psychopaths as recruits.

Writing in the Pinboard blog, Maciej Ceglowski tears apart the concept "social graph", saying that it is neither social nor a graph, but a sort of pseudoscience invented by socially-challenged geeks and now peddled by hucksters out to monetise you and your relationships:

Last week Forbes even went to the extent of calling the social graph an exploitable resource comparable to crude oil, with riches to those who figure out how to mine it and refine it. I think this is a fascinating metaphor. If the social graph is crude oil, doesn't that make our friends and colleagues the little animals that get crushed and buried underground?

The first part of his argument has to do with the inadequacy of the "social graph" model for representing all the nuances of human social relationships in the real world; the many gradations of friendship and acquaintance, the ways relationships change and evolve, making a mockery of nailed-down static representations; the way that describing a relationship can change it in some cases, and various issues of privacy and multi-faceted identity, things which exist trivially in the real world, even if they're in violation of the Zuckerberg Doctrine.

One big sticking point is privacy. Do I really want to find out that my pastor and I share the same dominatrix? If not, then who is going to be in charge of maintaining all the access control lists for every node and edge so that some information is not shared? You can either have a decentralized, communally owned social graph (like Fitzpatrick envisioned) or good privacy controls, but not the two together.

This obsession with modeling has led us into a social version of the Uncanny Valley, that weird phenomenon from computer graphics where the more faithfully you try to represent something human, the creepier it becomes. As the model becomes more expressive, we really start to notice the places where it fails.

You might almost think that the whole scheme had been cooked up by a bunch of hyperintelligent but hopelessly socially naive people, and you would not be wrong. Asking computer nerds to design social software is a little bit like hiring a Mormon bartender. Our industry abounds in people for whom social interaction has always been more of a puzzle to be reverse-engineered than a good time to be had, and the result is these vaguely Martian protocols.

Of course, whilst the idea of the social graph may not be good for modelling real-life social interactions with naturalistic fidelity, it has been a boon for targeting advertising; the illusion of social fulfilment is enough to keep people clicking and volunteering information about themselves. From the advertisers' point of view, the fish not only jump right into the boat, they fillet themselves in mid-air and bring their own wedges of lemon:

Imagine the U.S. Census as conducted by direct marketers - that's the social graph. Social networks exist to sell you crap. The icky feeling you get when your friend starts to talk to you about Amway, or when you spot someone passing out business cards at a birthday party, is the entire driving force behind a site like Facebook.

There is some good news, though: while general-purpose social web sites with the ambition of mediating (and monetising) the entirety of human social interaction may fail creepily as they approach their goal, special-purpose online communities can thrive in their niches:

The funny thing is, no one's really hiding the secret of how to make awesome online communities. Give people something cool to do and a way to talk to each other, moderate a little bit, and your job is done. Games like Eve Online or WoW have developed entire economies on top of what's basically a message board. MetaFilter, Reddit, LiveJournal and SA all started with a couple of buttons and a textfield and have produced some fascinating subcultures. And maybe the purest (!) example is 4chan, a Lord of the Flies community that invents all the stuff you end up sharing elsewhere: image macros, copypasta, rage comics, the lolrus. The data model for 4chan is three fields long - image, timestamp, text.
Now tell me one bit of original culture that's ever come out of Facebook.
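Ceglowski's "three fields long" claim is nearly literal, and the contrast with a fully modelled social graph is easy to make concrete. A sketch of that minimal data model, taking the quote at face value; the field and class names are assumptions, not 4chan's actual schema:

```python
from dataclasses import dataclass
import time

# The quote's entire data model: no accounts, no profiles, no friendship
# edges -- just an image, a timestamp, and some text.
@dataclass
class Post:
    image: str        # filename or URL of the attached image, if any
    timestamp: float  # seconds since the epoch
    text: str         # the post body

# A "thread" is then nothing more than a list of posts:
thread = [Post(image="lolrus.jpg", timestamp=time.time(), text="I HAS A BUCKET")]
print(thread[0].text)
```

The design choice being illustrated: the subculture lives in what people do with the three fields, not in any richer model of who they are.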

I wonder whether there is a dichotomy there between sites and networks; would a special-interest site that used, say, Facebook's social graph as a means of identifying users (rather than having its own system of accounts, usernames, profiles, and optionally friendship/trust edges) be infected by the Zuckerbergian malaise?

Psychologists argue that a person loses their moral identity in a large group, and empathy and guilt - the qualities that stop us behaving like criminals - are corroded. "Morality is inversely proportional to the number of observers. When you have a large group that's relatively anonymous, you can essentially do anything you like," according to Dr James Thompson, honorary senior lecturer in psychology at University College London.

He rejects the notion that some of the looters are passively going with the flow once the violence has taken place, insisting there is always a choice to be made.

Workman argues that some of those taking part may adopt an ad hoc moral code in their minds - "these rich people have things I don't have so it's only right that I take it". But there's evidence to suggest that gang leaders tend to have psychopathic tendencies, he says.

[Criminologist Prof. John Pitts] says most of the rioters are from poor estates who have no "stake in conformity", who have nothing to lose. "They have no career to think about. They are not 'us'. They live out there on the margins, enraged, disappointed, capable of doing some awful things."

Today in weaponised sociolinguistics: the US intelligence research agency IARPA is running a programme to collect and catalogue metaphors used in different cultures, hopefully revealing how the Other thinks. This follows on from the work of cognitive linguist George Lakoff, who theorised that whoever controls the metaphors used in language can tilt the playing field extensively:

Conceptual metaphors have been big business over the last few years. During the last Bush administration, Lakoff – a Democrat – set up the Rockridge Institute, a foundation that sought to reclaim metaphor as a tool of political communication from the right. The Republicans, he argued, had successfully set the terms of the national conversation by the way they framed their metaphors, in talking about the danger of ‘surrendering’ to terrorism or to the ‘wave’ of ‘illegal immigrants’. Not every Democrat agreed with his diagnosis that the central problem with American politics was that it was governed by the frame of the family, that conservatives were proponents of ‘authoritarian strict-father families’ while progressives reflected a ‘nurturant parent model, which values freedom, opportunity and community building’ (‘psychobabble’ was one verdict, ‘hooey’ another).

But there’s precious little evidence that they tell you what people think. One Lakoff-inspired study that at first glance resembles the Metaphor Program was carried out in the mid-1990s by Richard D. Anderson, a political scientist and Sovietologist at UCLA, who compared Brezhnev-era speeches by Politburo members with ‘transitional’ speeches made in 1989 and with post-1991 texts by post-Soviet politicians. He found, conclusively, that in the three periods of his study the metaphors used had changed entirely: ‘metaphors of personal superiority’, ‘metaphors of distance’, ‘metaphors of subordination’ were out; ‘metaphors of equality’ and ‘metaphors of choice’ were in. There was a measurable change in the prevailing metaphors that reflected the changing political situation. He concluded that ‘the change in Russian political discourse has been such as to promote the emergence of democracy’, that – in essence – the metaphors both revealed and enabled a change in thinking. On the other hand, he could more sensibly have concluded that the political system had changed and therefore the metaphors had to change too, because if a politician isn’t aware of what metaphors he’s using who is?

The article is vague on the actual IARPA research programme, but reveals that it involves extracting metaphors from large bodies of texts in four languages (Farsi, Mexican Spanish, Russian and English) and classifying them according to emotional affect.

If we don’t know how irony works and we don’t know how it is used by the enemy, we cannot identify it. As a result, we cannot take appropriate steps to neutralize ironizing threat postures. This fundamental problem is compounded by the enormous diversity of ironic modes in different world cultures and languages. Without the ability to detect and localize irony consistently, intelligence agents and agencies are likely to lose valuable time and resources pursuing chimerical leads and to overlook actionable instances of insolence. The first step toward addressing this situation is a multilingual, collaborative, and collative initiative that will generate an encyclopedic global inventory of ironic modalities and strategies. More than a handbook or field guide, the work product of this effort will take the shape of a vast, searchable, networked database of all known ironies. Making use of a sophisticated analytic markup language, this “Ironic Cloud” will be navigable by means of specific ironic tropes (e.g., litotes, hyperbole, innuendo, etc.), by geographical region or language field (e.g., Iran, North Korea, Mandarin Chinese, Davos, etc.), as well as by specific keywords (e.g., nose, jet ski, liberal arts, Hermès, night soil, etc.) By means of constantly reweighted nodal linkages, the Ironic Cloud will be to some extent self-organizing in real time and thus capable of signaling large-scale realignments in the “weather” of global irony as well as providing early warnings concerning the irruption of idiosyncratic ironic microclimates in particular locations—potential indications of geopolitical, economic, or cultural hot spots.

The proposal goes on to suggest possibilities of using irony as a weapon:

Superpower-level political entities (e.g., Roman Empire, George W. Bush, large corporations, etc.) have tended to look on irony as a “weapon of the weak” and thus adopted a primarily defensive posture in the face of ironic assault. But a historically sensitive consideration of major strategic realignments suggests that many critical inflection points in geopolitics (e.g., Second Punic War, American Revolution, etc.) have involved the tactical redeployment of “guerrilla” techniques and tools by regional hegemons. There is reason to think that irony, properly concentrated and effectively mobilized, might well become a very powerful armament on the “battlefield of the future,” serving as a nonlethal—or even lethal—sidearm in the hands of human fighters in an information-intensive projection of awesome force. Without further fundamental research into the neurological and psychological basis of irony, it is difficult to say for certain how such systems might work, but the general mechanism is clear enough: irony manifestly involves a sudden and profound “doubling” of the inner life of the human subject. The ironizer no longer maintains an integrated and holistic perspective on the topic at hand but rather experiences something like a small tear in the consciousness, whereby the overt and covert meanings of a given text or expression are sundered. We do not now know just how far this tear could be opened—and we do not understand what the possible vital consequences might be.

A new study has shown that violent video games decrease crime rates. While they do increase aggression in the players, the incapacitation effect of the players being drawn into sitting in front of a computer or console for extended periods of time, and thus unlikely to attack anything larger than a plate of nachos in reality, outweighs this.

Cognitive bias of the day: the name letter effect, which causes people to be subconsciously more favourably inclined towards names and words that resemble their own name:

The researchers then moved on to career choices. They combed the records of the American Dental Association and the American Bar association looking for people named either Dennis, Denice, Dena, Denver, et cetera, or Lawrence, Larry, Laura, Lauren, et cetera. That is: were there more dentists named Dennis and lawyers named Lawrence than vice versa? Of the various statistical analyses they performed, most said yes, some at < .001 level. Other studies determined that there was a suspicious surplus of geologists named Geoffrey, and that hardware store owners were more likely to have names starting with 'H' compared to roofing store owners, who were more likely to have names starting with 'R'.

Some other miscellaneous findings: people are more likely to donate to Presidential candidates whose names begin with the same letter as their own, people are more likely to marry spouses whose names begin with the same letter as their own, that women are more likely to show name preference effects than men (but why?), and that batters with names beginning in 'K' are more likely than others to strike out (strikeouts being symbolized by a 'K' on the records).
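The dentists-named-Dennis comparison above is essentially a 2×2 contingency-table test: name prefix (Den-/Law-) against profession (dentist/lawyer). A sketch of the arithmetic, using a hand-rolled Pearson chi-square; the counts here are invented for illustration, not the figures from the ADA and ABA records:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: rows = first name starts with Den-/Law-,
# columns = profession is dentist/lawyer.
#                dentist  lawyer
# Den- names        482     257
# Law- names        260     515
stat = chi_square_2x2(482, 257, 260, 515)

# For 1 degree of freedom, a statistic above 10.83 corresponds to p < .001,
# the significance level the quoted study reports for some of its analyses.
print(stat > 10.83)
```

With counts this lopsided the statistic is enormous; the study's real effect sizes were far subtler, which is why it needed whole professional registries to detect them.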

I met an American CEO, Al Dunlap, formerly of the Sunbeam Corporation, who redefined a great many of the psychopath traits to me as "business positives": Grandiose sense of self-worth? "You've got to believe in yourself." (As he told me this, he was standing underneath a giant oil painting of himself.) Cunning/manipulative? "That's leadership."

I wondered if sometimes the difference between a psychopath in Broadmoor and a psychopath on Wall Street was the luck of being born into a stable, rich family.

The article, which is excerpted from Ronson's new book, "The Psychopath Test", also follows the story of Tony, a youth who, when tried for a violent crime, feigned insanity in an attempt to avoid prison, and instead was diagnosed as a manipulative psychopath and committed to Broadmoor, the notorious high-security psychiatric hospital. After 12 years amongst killers and the criminally insane, he secured a hearing, which found him, whilst mildly psychopathic, fit to be released into society:

"The thing is, Jon," Tony said as I looked up from the papers, "what you've got to realise is, everyone is a bit psychopathic. You are. I am." He paused. "Well, obviously I am," he said.
"What will you do now?" I asked.
"Maybe move to Belgium," he said. "There's this woman I fancy. But she's married. I'll have to get her divorced."

The current research fills this gap by testing the hypothesis that one cultural product—word use in popular song lyrics—changes over time in harmony with cultural changes in individualistic traits. Linguistic analyses of the most popular songs from 1980–2007 demonstrated changes in word use that mirror psychological change. Over time, use of words related to self-focus and antisocial behavior increased, whereas words related to other-focus, social interactions, and positive emotion decreased. These findings offer novel evidence regarding the need to investigate how changes in the tangible artifacts of the sociocultural environment can provide a window into understanding cultural changes in psychological processes.

I wonder how much of this is actually emblematic of a deeper cultural shift towards short-term values. A world in which everything is a dynamic market of novelty and possibility, and "love" just means a temporary arrangement for mutually negotiated gratification.

When presented with a choice of partners, they showed no overall preference for either males or females. When just a male was introduced into the cage, the modified males were far more likely to mount the male and emit a "mating call" normally given off when encountering females than unmodified males were.

However, a preference for females could be "restored" by injecting serotonin into the brain.

The researchers have cautioned against drawing conclusions about human sexuality from the result.

The lazy takeaway from this, as seen in news sites, is that serotonin affects sexual orientation, with the suggestion that low serotonin might be the secret to the inexplicable condition known as homosexuality. I'm wondering whether a more plausible conclusion is that, with sexual selection being about competition amongst fit individuals, a prerequisite for having an active sexual preference is passing an internal test of subjective fitness, i.e., being aware that one has sufficiently high status to be picky. In other words, mice without functioning serotonin receptors perceive themselves as losers who will take anything that's warm, regarding it as more than they're entitled to, and therefore as a win.

The Hathaway Effect: an observation that when Hollywood celebrity Anne Hathaway makes headlines, stock in Berkshire Hathaway, Warren Buffett's company, rises, for no other apparent reason. Is it a demonstration of human irrationality (significant numbers of investors getting a good feeling about a stock by virtue of having its name in their mind for an unconnected reason), or, as the article suggests, automated trading systems picking up entertainment headlines and buying stock on the basis of them?
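The second explanation has a simple failure mode behind it: a sentiment bot matching company names against headlines word by word will conflate the actress with the holding company. A toy sketch of that conflation, assuming nothing about how real trading systems work; the company list and headline are invented:

```python
# Minimal illustration of the suggested mechanism: naive name matching
# between a news headline and a watchlist of companies.
COMPANIES = {"BRK.A": "Berkshire Hathaway", "AAPL": "Apple"}


def tickers_mentioned(headline):
    """Return tickers whose company name loosely overlaps the headline."""
    hits = []
    for ticker, name in COMPANIES.items():
        # naive token match: any word of the company name found in the headline
        if any(word.lower() in headline.lower() for word in name.split()):
            hits.append(ticker)
    return hits


print(tickers_mentioned("Anne Hathaway dazzles at the Oscars"))  # → ['BRK.A']
```

A system like this, fed to a sentiment scorer, would register a glowing Oscars headline as good news for Berkshire, which is exactly the confusion the article posits.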

Positivity considered harmful (2):
A new study suggests that social software such as Facebook may be making its users unhappy, by causing them to overestimate how contented their peers are with their lives (unlike themselves). The theory goes that, as these sites are self-curated experiences where users present generally positive images of themselves, other users don't get well-rounded views of how an online acquaintance's life is going, but have a cognitive bias to thinking that they do. Consequently, we overestimate our online acquaintances' life satisfaction, compare it to our own, and feel unhappy:

The human habit of overestimating other people's happiness is nothing new, of course. Jordan points to a quote by Montesquieu: "If we only wanted to be happy it would be easy; but we want to be happier than other people, which is almost always difficult, since we think them happier than they are." But social networking may be making this tendency worse. Jordan's research doesn't look at Facebook explicitly, but if his conclusions are correct, it follows that the site would have a special power to make us sadder and lonelier. By showcasing the most witty, joyful, bullet-pointed versions of people's lives, and inviting constant comparisons in which we tend to see ourselves as the losers, Facebook appears to exploit an Achilles' heel of human nature. And women—an especially unhappy bunch of late—may be especially vulnerable to keeping up with what they imagine is the happiness of the Joneses.

Which makes sense, assuming that one buys the assumption that social software strongly discourages expressions of negativity or unhappiness. This is clearly not the case on all social sites; witness, for example, the (somewhat old) stereotype of the LiveJournal Angstpuppy, characterised by demonstrative levels of self-pity, often encoded into musical and/or sartorial preferences. Granted, that was in an earlier, weirder internet, and might get one unfriended or laughed at in today's more mainstream networks, though one does see a fair amount of kvetching on Facebook. Perhaps the best solution for the collective mental health is to encourage a culture of moderate self-pity and commiseration?

At its brainiest, this sensibility expresses itself in the group blog Boing Boing, a self-described “directory of wonderful things.” Tellingly, the trope “just look at this!,” a transport of rapture at the wonderfulness of whatever it is, has become a refrain on the site, as in: ”Just look at this awesome underwear made from banana fibers. Just look at it.” Or: “Just look at this awesome steampunk bananagun. Just look at it.” Or: “Just look at this bad-ass volcano.” Or: “Just look at this illustration of an ancient carnivorous whale.” Because that’s what the curators of wunderkammern do—draw back the curtain, like Charles Willson Peale in “The Artist in His Museum,” exposing a world of “wonderful things,” natural (bad-ass volcanoes, carnivorous whales) and unnatural (steampunk bananaguns, banana-fiber underwear), calculated to make us marvel.

Of course, there is a downside to this relentless boosterism: the positive becomes the norm (how many things can you "favourite"?); meanwhile, critical thought becomes delegitimised. When everybody's building shrines to their likes, any expression of negativity is an attack on someone's personal taste, making one a "hater" (a term originally from hip-hop culture which, tellingly, gained mainstream currency in the past decade). From this relentlessly upbeat point of view, critics are no more legitimate than griefers, the players in multi-player games who destroy others' achievements motivated by sadism:

At their wound-licking, hater-hatin’ worst, the politics of enthusiasm bespeak the intellectual flaccidity of a victim culture that sees even reasoned critiques as a mean-spirited assault on the believer, rather than an intellectual challenge to his beliefs. Journal writer Christopher John Farley is worth quoting again: dodging the argument by smearing the critic, the term “hater” tars “all criticism—no matter the merits—as the product of hateful minds.” No matter the merits.

The culture of enthusiasm, and the culture of disenthusiasm (which Dery mentions), seems to be founded on the assumption that we are defined by the things we like and dislike. It's a form of commodity fetishism taken into the cultural sphere, though one step removed from the accumulation of material goods, rather dealing with approval and disapproval. Not surprisingly, it's often associated with youth subcultures; take, for example, punks' leather jackets; the names which appear on the back, and those omitted for obviousness or inauthenticity, signal their wearers' authenticity and legitimacy in the culture. (Hipsters take it further, into the realm of irony, where one's status is measured by how close one can surf to the void of kitsch; being into, say, Hall & Oates or M.C. Hammer, is worth more than safe choices like Joy Division and the Velvet Underground, which are so obvious a part of every civilised person's background that trumpeting one's enthusiasm for them is immediately suspect.)

However, likes and dislikes, when worn as badges of identity, can become mere totemism. Do you like, say, The Strokes or Barack Obama, because you find them interesting, or because you wish to be identified as the kind of person who does? Or, as A Softer World put it:

Cultural products (a term which encompasses everything from pop stars to public intellectuals, from comic books to politicians) can fulfil two functions: they can be valued for their content or function (does this band rock? Is this book interesting?), or for their role in establishing the consumer's identity. Much like vinyl record sleeves framed on trendy apartment walls by people who don't own turntables to project an aura of cool, favourite books or movies or bands or public figures can be trotted out to buttress one's public image, without ever being fully digested. (Witness, for example, the outspokenly religious American "Conservatives" who idolise Ayn Rand, a strident atheist who expressed a Nietzschean contempt for religion.) Likes and dislikes, in other words, are like flags, saluted or burned often out of habit or social obligation as much as any intrinsic value they may hold.

At the end, Dery points out that, far more interesting and telling than what we like or dislike are the things we both like and dislike, or else find fascinating; things which compel us with a mixture of fascination and repulsion, in whatever quantities, rather than neatly falling into one side or the other of the love/hate binary.

Freed from the confining binary of loving versus loathing, Facebook Like-ing versus hateration, we can imagine an index of obsessions, an inventory of intrigues that more accurately traces the chalk outline of who we truly are.

Imagine a more anarchic politics of enthusiasm, poetically embodied in a simulacrum of the self that preserves our repulsive attractions and attractive repulsions, reducing us not to our Favorites, nor even to our likes and dislikes, but to our obscure obsessions, our recurrent themes, the passing fixations that briefly grip us, then are gone—not our favorite things, but the things that Favorite us, whether we like it, or even know it, or not.

New research has shown that oxytocin, the neurochemical which promotes feelings of love and trust, also induces racism, or to be more precise, sharper discrimination against those ethnically or culturally different from oneself and one's group:

When asked to resolve a moral dilemma, such as choosing to save five lives from a runaway train by sacrificing one life, oxytocin-sniffing Dutch men more often saved fellow countrymen over Arabs and Germans than those who didn’t get a hormonal whiff.

“Earlier research of oxytocin paints a very rosy view of it. We thought it was odd a neurological system that survived evolution would make people indiscriminately loving toward others,” said social psychologist Carsten De Dreu of the University of Amsterdam, co-author of a Jan. 10 study in the Proceedings of the National Academy of Sciences. “Under oxytocin we saw an increase of in-group favoritism, which has the downside of discrimination against people who are not part of your group.”

The questions this raises are interesting. In modern Western society at least, the idea of love is almost a secular religion; it is seen as an unequivocally positive phenomenon, whose only fault is that it is, alas, not everywhere, not washing over everyone and making everything alright. Anyone who dissents from this opinion must be some kind of pitiably twisted curmudgeon; entire subgenres of Hollywood romantic comedies have been made about such sourpusses seeing the light and gaining a new faith in the redeeming power of love, replete with montage sequences. But if the biological conditions underlying the phenomenon of love also measurably amplify less positive tendencies, such as reducing empathy for those outside one's in-group, could love follow religion into becoming something once seen as universally good that is subjected to more radical reassessment? Perhaps, in future, we'll see the same rational scepticism that has been applied to the virtue of religious faith applied to the universal beneficence of love.

"The anterior cingulate is a part of the brain that is on the middle surface of the brain at the front and we found that the thickness of the grey matter, where the nerve cells of neurons are, was thicker the more people described themselves as liberal or left wing and thinner the more they described themselves as conservative or right wing," he told the programme.

"The amygdala is a part of the brain which is very old and very ancient and thought to be very primitive and to do with the detection of emotions. The right amygdala was larger in those people who described themselves as conservative.

There are correlations between word categories and age; older people use more first-person plurals, positive emotions and references to religion and family, while young people tend to talk in the first-person singular (presumably adolescent alienation?), mention sadness and death, swear a lot and talk about sex, music and TV.

People with more friends talk more about social processes and other people, and have higher total word counts; conversely, talking about home, family and emotions is correlated with having fewer friends, though the most strongly correlated categories are time and the past.

Positive emotions are one of the most likely categories to be liked, but least likely to attract comments. Negative emotions, however, attract a lot of comments (presumably from the people posting empathetic "Don't Like" messages).

The one thing less likeable than negative emotions is talk about sleeping.

People who talk about metaphysical or religious subjects are most likely to be friends. And people who use prepositions a lot tend not to be friends with people who swear a lot or exhibit anger or negative emotions.

Schoental, an expert in microfungi, thought that Mozart died from mycotoxin poisoning. Drake, a neurosurgeon, proposed a diagnosis of subdural haematoma after a skull fracture identified on a cranium that is not Mozart’s. Ehrlich, a rheumatologist, believed he died from Behçet’s syndrome. Langegger, a psychiatrist, contended that he died from a psychosomatic condition. Little, a transplant surgeon, thought he could have saved Mozart by a liver transplant. Brown, a cardiologist, claimed he succumbed to endocarditis. On the basis of a translation error of Jahn’s biography of Mozart, Rappoport, a pathologist, thought Mozart died of cerebral haemorrhage. Ludewig, a pharmacologist, suggested poisoning or self poisoning by drinking wine adulterated with lead compounds. For some, Mozart manifested cachexia or hyperthyroidism, but for others it was obesity or hypothyroidism. Ludendorff, a psychiatrist, and her apostles, claimed in 1936 that Mozart had been murdered by the Jews, the Freemasons, or the Jesuits, and assassination is not excluded by musicologists like Autexier, Carr, and Taboga.

What clearly emerges is that Mozart’s medical historiography is made out of various alternatives, with a general time trend as tenable diagnostic hypotheses are progressively exhausted: the more recent they are the less probable. The most likely diagnoses—such as influenza, typhoid fever, and typhus—were proposed first, and only rare and irrelevant conditions such as Goodpasture’s syndrome, Wegener’s granulomatosis, Still’s disease, or Henoch-Schönlein syndrome were left for those who came later.

Thus, highly selective readings of the sources, blatant misquotations, and perversions of the diagnostic criteria have led to shoddy medical interpretations. Mozart allegedly had thought disorder, delusions, musical dysfluency, and epileptic fits, plus he did not actually compose music but merely displayed musical hallucinations. He was a manic depressive, a pathological gambler, and had an array of psychiatric conditions such as Capgras’ syndrome, attention deficit/hyperactive disorder, paranoid disorder, obsessional disorder, dependent personality disorder, and passive-aggressive disorder. This has resulted in psychiatric narratives that blend an uninterrupted long tradition of defamation—the film Amadeus was one of the last public expressions of this tradition.

This phenomenon is Mozart’s medical nemesis. It covers the hidden intent to pull an exceptional creator down from his pedestal through some obscure need to cut great artists down to size. It is reminiscent of Rameau’s nephew in Diderot’s novel who says about people of exceptional creativity: “I never heard any single one of them praised without it making me secretly furious. I am full of envy. When I hear some degrading feature about their private life, I listen with pleasure. This brings me closer to them. It makes me bear my mediocrity more easily.”

It is proposed that happiness be classified as a psychiatric disorder and be included in future editions of the major diagnostic manuals under the new name: major affective disorder, pleasant type. In a review of the relevant literature it is shown that happiness is statistically abnormal, consists of a discrete cluster of symptoms, is associated with a range of cognitive abnormalities, and probably reflects the abnormal functioning of the central nervous system. One possible objection to this proposal remains--that happiness is not negatively valued. However, this objection is dismissed as scientifically irrelevant.

Satoshi Kanazawa, an evolutionary psychology researcher at the London School of Economics, has published a list of ten controversial assertions about human nature. They range from well-trodden ones (men being naturally sexually promiscuous and drawn to younger partners; several points derive from the asymmetry of sexual selection) to more contentious ones: Kanazawa contends that most suicide bombers are Muslims because of polygyny, with the sexual frustration of a society in which powerful men monopolise the pool of women serving as a powerful motivation. This sounds a bit reductionistic, and would suggest that suicide bombers are predominantly of low status or prospects, which has not been the case. Meanwhile, liberals are said to be more intelligent than conservatives (as measured by IQ scores) because conservatism is a no-brainer:

"The ability to think and reason endowed our ancestors with advantages in solving evolutionarily novel problems for which they did not have innate solutions. As a result, more intelligent people are more likely to recognise and understand such novel entities and situations than less intelligent people, and some of these entities and situations are preferences, values, and lifestyles," Dr Kanazawa said.

Humans are evolutionarily designed to be conservative, caring mostly about their family and friends. Being liberal and caring about an indefinite number of genetically unrelated strangers is evolutionarily novel. So more intelligent children may be more likely to grow up to be liberals.

Also, both creativity and criminality have a common basis in costly peacock-tail behaviour:

The tendency to commit crimes peaks in adolescence and then rapidly declines. But this curve is not limited to crime – it is also evident in every quantifiable human behaviour that is seen by potential mates and costly (not affordable by all sexual competitors). In the competition for mates men may act violently or they may express their competitiveness through their creative activities.

Today's big question: does country music increase suicide rates? The authors of one paper think that it does, and that country music fans are at significantly higher risk of suicide than nonfans, for reasons involving gun ownership, marital discord and the inherent job and financial stresses affecting America's working poor (which are often referred to in country song lyrics). The authors of a second paper, however, dispute this, claiming methodological errors and arguing that there is no evidence of country music making people more likely to off themselves than any other genre. (Whether music in general, or music with lyrics more specifically, correlates with depression or suicide risk is, of course, another question.)

Internet dating pivots around profiles; lists of attributes, paragraphs where you attempt to make yourself sound appealing, a handful of flattering photographs. But there's already a problem. Dozens of books and websites offer advice on how to write profiles; third-party services even charge 40 quid to save you the bother. As a result, the uniformity is hilarious. Everyone loves travelling, particularly to Machu Picchu – which, if the profiles are to be believed, is an Inca site swarming with thousands of backpacking singletons. Men are singularly obsessed with skiing. All of us love to curl up on the sofa with a bottle of wine and a DVD (or a VD, as one unfortunately misspelled profile said).

But we're forced to filter the mass of potential datees, and we do it savagely. We start to adopt a power-shopping mentality, disregarding people for arbitrary reasons; as my friend Sam put it, we cruise past people's pictures as if they're caravans in Daltons Weekly. "Yeah, no, no, yeah – ooh, yes! – no, no, ugh." It's a compelling, but ultimately exhausting, process that these services have adapted, refined and streamlined because it's a brilliant way for them to make money. While a service might lure you with a strapline saying "Meet sexy singles in your area", the truth is more like, "Reject perfectly decent singles in your area while waiting for the maddeningly elusive sexy ones." Everyone is trading off current opportunities against future possibilities. In a thoughtful moment, you might even realise there are people you've had relationships with in the past who, if they appeared as an online match, you might reject. And when you're the one being rejected, it can hurt.

Long-term internet dating participants know only too well, however, the cycle of knock-back followed by a speedy return to the site in search of someone else. You start seeing the same faces across multiple sites, and some people (especially men) will start to play the percentage game, firing off multiple cut-and-paste emails in the hope that someone will reply. One friend of mine was even sent a cheery message of introduction from a man who she had already had a disastrous date with via another dating website.

All the adults in the study were shown what they were led to believe was a test version of a new online news magazine. They were also given a limited time to look over either a negative or a positive version of 10 pre-selected articles.
Each story was also paired with a photograph depicting someone of either the younger or the older age group.
The researchers found that older people were more likely to choose to read negative articles about those younger than themselves. They also tended to show less interest in articles about older people, whether negative or positive.

The study concluded that this is a result of a youth-centric society, and that stories which take young people down a few notches serve to boost the self-esteem of older readers.

I wonder whether this factor, plus the ageing of the baby boom cohort and the populist bent of the market-driven media, could be behind so many beat-ups in the news, from scare stories about killer hoodies to dire warnings about internet addiction, shrinking attention spans and the imminent collapse of civilisation as we know it. (And whether, historically, the same factor has played a part in fuelling moral panics about youth-oriented trends such as rock'n'roll music, comic books, swing dancing, and so on.)

Peter McGraw, a behavioural economist from Colorado, has a grand unified theory of humour: he calls it the Benign Violation Theory; the gist of it is that, for something to be amusing, it has to involve a violation of norms, albeit one in which nobody is actually harmed.

Every kind of humor McGraw and Warren could think of fit into the BVT. Slapstick worked: Falling down the stairs, a physical violation, is only funny if nobody's actually hurt. A dirty joke trades on moral or social violations, but it's only going to get a laugh if the person listening is liberated enough to consider risqué subjects such as sex benign. Puns can be seen as violations of linguistic norms, though only cerebral types and grammarians care enough about the violation to chuckle.

McGraw believes the BVT may even help explain why, biologically, humans evolved with the ability to laugh. It is clearly a beneficial trait to be able to correctly perceive when a violation is benign and communicate that to others via laughter, he points out. Early humans who were afraid of every apparent violation, real or not, weren't going to last long — nor were those who took one look at a woolly mammoth charging their way and did nothing but bust a gut.

Which more or less makes sense, though McGraw's attempt to explain laughter as a reaction to being tickled by this theory seems to be grasping at straws. (I'd be more inclined to believe that the internal state arising from being tickled is quite different from that arising from perceiving a joke, even though they have the same external symptom.)

A theory of humour I once saw elsewhere suggested that laughter was a reflexive reaction to a frame of reference suddenly and abruptly being changed, and to being suddenly faced with the need to reevaluate an entire story, scene or proposition, especially if it has become more exciting or unusual in doing so. Of course, this is biased towards conceptual humour, such as a told joke in which a sudden wordplay causes the carefully constructed word-picture to come crashing down (take, for example: "When I die, I want to die peacefully in my sleep like my grandfather, not screaming like the passengers in his car"), or else stepping out of the frame and wantonly changing the (implied) terms of reference of the text of the first part of the joke ("What's orange and sounds like a parrot? A carrot"). This act of conceptual violence triggers a minor earthquake in the listener's mind, which manifests itself as laughter (or a groan of disapproval if they've heard the joke before). Slapstick (and the bodily-function gross-out gags on which current Hollywood comedies are founded) are basically this for people who'd rather not mess with ideas. But both seem to be encompassed by the benign-violation framework.

Of course, the benignness is a negotiable point. One can tell a joke in which people die horribly (or worse), if the people are clearly hypothetical, stuffed straw dummies whose only purpose is to be sacrificed in a joke. Among bigots, jokes at the expense of out-groups also work because, by being dehumanised, the outgroup don't count as actual people. (A popularly tolerated echo of this is the lawyer joke, which works because nobody really believes in the possibility of exterminating all members of a profession.)

Huffington Post co-founder Jonah Peretti has posted a presentation, titled "Mormons, Mullets and Maniacs", on what makes online content "viral", i.e., likely to be passed along by bored people.

One key point: content that goes viral tends to appeal to people's personality disorders, or at least gives them an opportunity to score points, laugh at or put down those they disagree with, or express their obsessions, self-identification or narcissistic attention-seeking tendencies.

A new theory claims that human monogamy is a direct result of the development of beer; or more precisely, firstly, that monogamy was the result of social changes that arose from the shift from a nomadic to an agricultural (and thus hierarchical and patriarchal) lifestyle, and secondly, that the main impetus to move to agriculture wasn't so much a desire to build cities or empires as to brew beer.

The Rap Guide To Human Nature is a hip-hop album about evolutionary psychology by a Canadian "rap troubador" named Baba Brinkman. It's not a joke: the beats are sharp, and Brinkman rhymes with the speed and dexterity of an accomplished rapper, deftly laying out the theories and controversies of evolutionary psychology, from kin selection to the biological roots of religious and political belief, twin studies to alternative models of human nature, and of course to areas such as sexual competition and social status where hip-hop culture and evolutionary psychology intersect. Note that, as expected from rap, the lyrics are probably not suitable for children.

Cow Clicker: a distillation of addictive, potentially expensive Facebook games to their purest essence:

You get a cow. You can click on it. In six hours, you can click it again. Clicking earns you clicks. You can buy custom "premium" cows through micropayments (the Cow Clicker currency is called "mooney"), and you can buy your way out of the time delay by spending it. You can publish feed stories about clicking your cow, and you can click friends' cow clicks in their feed stories.
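The whole mechanic described above reduces to a timestamped cooldown plus a counter. A minimal sketch in Python (the class and constant names are illustrative, not taken from the actual game):

```python
import time

COOLDOWN = 6 * 60 * 60  # six hours, in seconds

class Cow:
    """Toy model of the click-cooldown-currency loop described above."""
    def __init__(self):
        self.clicks = 0           # clicks are also the currency earned
        self.next_click_at = 0.0  # timestamp when the next click is allowed

    def click(self, now=None):
        """Click the cow; returns True if the click counted."""
        now = time.time() if now is None else now
        if now < self.next_click_at:
            return False  # still on cooldown
        self.clicks += 1
        self.next_click_at = now + COOLDOWN
        return True

    def pay_to_skip(self, mooney_spent):
        """Spending premium currency clears the time delay, as in the game."""
        if mooney_spent > 0:
            self.next_click_at = 0.0

cow = Cow()
assert cow.click(now=0)        # first click counts
assert not cow.click(now=100)  # a second click within six hours does not
cow.pay_to_skip(1)
assert cow.click(now=100)      # paying your way out of the delay works
```

That is, more or less, the entire game, which is Cow Clicker's point.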

Until now, Google and social software haven't been ideas that went together naturally. The famously engineering-focussed company had experimented with social, though mostly in engineers' 20% time, and with mixed results. Orkut became spectacularly successful in Brazil, but largely bobbed along in the wake of Friendster elsewhere until the vastly technically inferior MySpace came along and seized the market, Google Friend Connect got its lunch eaten by Facebook Connect, and other forays into social made the mistake of being a bit too clever and automatically inferring the user's social graph from their online activity, crossing the line between nifty and disturbing.

Now, however, this is likely to change. There are rumours afoot that Google have made social software a strategic priority, establishing teams to work on the problem of social as part of their regular 80% job, and that a social platform, possibly named Google Me, is in the works. Of course, as far as social platforms go, Facebook have the area sewn up, with a pretty sophisticated API, leaving little space for newcomers (or even Google) to expand into, unless they find and solve problems in the way Facebook does it.

Which brings us to this slide presentation from Google user-experience researcher Paul Adams. The presentation rigorously examines the social uses of software, and the natures of social connections (Adams mentions strong ties and weak ties, and adds a third category, temporary ties, or pairs of people involved in once-off interactions; think someone you buy something from on eBay) and pinpoints possible shortcomings of simple models such as Facebook's (the fact that people have different social circles and needs to expose different facets of their identities to different circles, and that tools such as Facebook's privacy filters have a high overhead to use satisfactorily in this way), not to mention unresolved mismatches between the way human beings intuitively perceive social interaction working and the way it does in the age of social software (for example, we are not intuitively prepared for the idea of our conversations being recorded and made searchable). All in all, it looks like a pretty rigorous survey of social software, condensed down to 216 slides. (An expanded version may be the contents of a book, Social Circles, which comes out in August.)

If Google, who have not given much weight to social software in the past, are investing in this level of research into it, they may well have a Facebook-beating social platform in the works. Though (assuming that it exists, of course) only time will tell whether Google have finally grasped social enough to pull it off.

A team of evolutionary psychologists have revised Maslow's Hierarchy of Needs.
The original hierarchy is a pyramid of needs, with basic ones (food, shelter and, because it was invented in the 1960s, sex) at the bottom, and subsequent layers adding more advanced ones, like love, esteem and, at the apex, self-actualisation. Douglas Kenrick's team, however, does away with all that fluffy human-potential thinking and replaces it with the brute certainties of evolutionary psychology: at the top is not self-actualisation but parenting; i.e., doing what your genes built you to do and passing them on. The levels below have to do with acquiring and retaining a genetically fit mate, and building up the necessary social status to compete for the prize.

I am generally a fan of evolutionary psychology as an explanatory tool, though this doesn't sit well with me; it strikes me as a bit too reductionistic, and a bit too basic a model. Is the ultimate goal really to breed? Can we say that someone who has settled down in anonymous suburbia with a stable if dull job and started pumping out the children is more fulfilled than one who has found self-actualisation (through social, creative or otherwise constructive pursuits) but is childless? Are those who choose the latter path deluding themselves? It seems to say so.

ABC Radio National's All In The Mind recently interviewed a US psychiatrist who claims that psychiatry was used as a weapon against the civil rights movement in the 1960s. According to Jonathan Metzl, author of The Protest Psychosis: How Schizophrenia Became A Black Disease, the definition of schizophrenia was tweaked to apply to a lot of discontented African-Americans, with many activists being institutionalised in mental hospitals. (Until then, schizophrenia had been seen as a passive, disengaged condition mostly affecting white women; as mental institutions were repurposed for containing the civil rights movement, many such patients were rediagnosed with depression and deinstitutionalised.)

All of a sudden in 1968 the second Diagnostic Manual comes out, the DSM 2, in the context of probably the most racially charged year in the history of the Civil Rights Movement—1968, where there are many riots, many protests. And also the DSM 2 importantly added language in the paranoid sub-type of schizophrenia, it added several important terms, it said the new criteria included aggression, hostility and projection. These hadn't been characteristics in DSM 1 and the manual explained, 'the patient manifests the characteristics of aggression and hostility and also attributes to others characteristics he cannot accept in himself.'

Even the advertisements at the time for sedative drugs used for treating patients echoed this racial paranoia:

I unearthed a series of advertisements for serious tranquilisers, Haldol, Stelazine, Thorazine that either represented African iconography, so African tribal masks, and would use incredibly charged racial language—so it would say this is the tool of primitive psychiatry and they would show these African masks—or images that quite literally showed, shockingly enough, angry black men protesting in the streets. And there's one image I reproduce in the beginning of the book, it's a Haldol advertisement that shows an angry black man in a burning urban scene who's shaking his fist. And the important point for both of these is that the iconography from these images literally appearing in the leading psychiatric journals was taken directly from the themes of the Civil Rights movement. The kind of Return to Africa Movement played out in these African scenes, and the idea of a clenched fist which was...

This wasn't the first example of psychiatry being used in the service of racism in the US; in the 1850s, a surgeon named Samuel Cartwright put forward the theory that escaped slaves were suffering from illnesses he called drapetomania and dysaesthesia aethiopica; his argument being that, as Negroes were psychologically unfit to cope with the pressures of freedom, escaping from one's rightful master was a sign of mental illness. This idea was, of course, very useful to those with a stake in maintaining the status quo, and flourished for some time for that reason.

Anyway, the shift in the meaning of schizophrenia during the civil rights era was subsequently remedied, partly by a deliberate programme to harmonise diagnoses with those used in Europe, though one might argue that the likelihood of the mentally ill slipping through the cracks into the prison system is part of the legacy of this phenomenon (according to Metzl, those diagnosed with schizophrenia in the US today are far more likely to end up in prison than in hospital; given that in America's neo-Calvinist penology, prisons are emphatically places of punishment first and rehabilitation a distant second, this is particularly disturbing).

Meanwhile, back in Europe, a converse relationship between mental illness and radical politics was posited from the other side; West Germany's Sozialistisches Patientenkollektiv, a radical Marxist group composed of mental patients and the odd psychiatrist, argued that mental illness was a cultural construct, a reaction to the iniquities of capitalism.

Google have developed facial recognition technology, capable of identifying individuals in photographs. Given the privacy implications (that plus Google Goggles would be the ultimate stalker tool), they're wisely being very careful about what, if anything, they do with it.

Spoonflower is a web-based company that will print your designs onto fabric and send it to you. Now if only we could get them talking to Blank Label (a web-based service that lets you design custom shirts, though currently only from a somewhat conservative range of fabrics), then that would be awesome.

The results of the mathematical analysis showed that when both members of a union are similar emotionally they have an “optimal effort policy,” which results in a happy, long-lasting relationship. The policy can break down if there is a tendency to reduce the effort because maintaining it causes discomfort, or because a lower degree of effort results in instability. Paradoxically, according to the second law model, a union everyone hopes will last forever is likely to break up, a feature Rey calls the “failure paradox”.

The paper may be found here. (Aside: note the use of the Unicode ♥ character in the equations; I wonder how common unusual Unicode symbols are in mathematical or scientific papers these days.)

A Facebook intern and PhD student in human-computer interaction has used Facebook to measure the relationship between sharing and wellbeing. Moira Burke's study, based on measuring interactions between Facebook users who filled in surveys, found, unsurprisingly, that active sharing (such as posting content and sending messages) is more strongly correlated with wellbeing than passive consumption.

In user interface design, sometimes worse is better, as in the case of the Bloomberg Terminal, a proprietary computer terminal used by financial traders. The Bloomberg Terminal's interface, which hasn't been updated for a decade or so, is generally seen as cluttered and ugly. Proposals for more elegant redesigns have been knocked back, because the existing users like the macho ugliness of the interface and the aura of hardcore expertise it bestows on them:

Simplifying the interface of the terminal would not be accepted by most users because, as ethnographic studies show, they take pride on manipulating Bloomberg's current "complex" interface. The pain inflicted by blatant UI flaws such as black background color and yellow and orange text is strangely transformed into the rewarding experience of feeling and looking like a hard-core professional.

In other words, the Bloomberg Terminal is one of a class of items whose bad design is a feature serving a higher-level social function; in this case, the function is that of being a badge of proficiency or status, and an artificial handicap to keep usurpers out. In this way, it functions somewhere between the tail of a peacock (which is expensive to grow and makes one more visible to predators, but having one (and being alive) also acts as proof of fitness) and the regalia and rituals of Freemasonry back when it was a force to be reckoned with. Of course, secrets are inherently leaky and can hold power only for so long, so sooner or later, perhaps someone (possibly Apple or Google?) will come along with a more elegantly-designed system that will demystify what it does and, in doing so, hole Bloomberg's boat below the waterline (unless they do so first).

If you've ever found yourself compelled to keep playing a video game, despite realising that you're not actually enjoying it, you may have been a victim of the Behaviourist conditioning techniques game designers use to get people hooked. Video game designers are applying Skinnerian techniques of behaviour reinforcement to compel players to keep playing, to get hooked early, and to invest more time (and often money) into levelling up. (And playing a game does not necessarily equal enjoying it; the stimulus of getting unpredictable rewards, and the fear of losing one's carefully built-up progress, are enough to compel a player who might otherwise have preferred to do something else.)

His theories are based around the work of BF Skinner, who discovered you could control behavior by training subjects with simple stimulus and reward. He invented the "Skinner Box," a cage containing a small animal that, for instance, presses a lever to get food pellets. Now, I'm not saying this guy at Microsoft sees gamers as a bunch of rats in a Skinner box. I'm just saying that he illustrates his theory of game design using pictures of rats in a Skinner box. This sort of thing caused games researcher Nick Yee to once call Everquest a "Virtual Skinner Box."

First, set up the "pellets" so that they come fast at first, and then slower and slower as time goes on. This is why they make it very easy to earn rewards (or level up) in the beginning of an MMO, but then the time and effort between levels increases exponentially. Once the gamer has experienced the rush of leveling up early, the delayed gratification actually increases the pleasure of the later levels. That video game behavior expert at Microsoft found that gamers play more and more frantically as they approach a new level.
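The "fast at first, then slower" pellet schedule amounts to an exponentially growing cost per level. A toy sketch, with made-up base and growth numbers rather than figures from any actual MMO:

```python
# Exponential levelling curve: early levels come quickly, later ones
# take exponentially more effort, per the schedule described above.
BASE_XP = 100  # effort required for the first level (illustrative)
GROWTH = 1.5   # each level costs 50% more than the last (illustrative)

def xp_for_level(level):
    """Effort needed to go from `level` to `level + 1`."""
    return int(BASE_XP * GROWTH ** (level - 1))

# The gap widens fast: level 1 costs 100, level 10 nearly 40 times that.
costs = [xp_for_level(n) for n in range(1, 11)]
```

The player has banked the easy early rushes by the time the curve steepens, which is exactly when walking away starts to feel like losing an investment.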

Behaviourist game design techniques are becoming more prevalent in the age of online games, where the maker's revenue comes not from once-off purchases but from time (and money) spent in the course of playing the game; hence, game designers have to get their players hooked before the other guy comes along and milks them. And milking is perhaps an apt metaphor, given that one of the leading examples of this sort of game design is the Facebook game FarmVille, which, by all accounts is more of a socially conditioned obligation than a ludic activity:

Farmville is not a good game. While Caillois tells us that games offer a break from responsibility and routine, Farmville is defined by responsibility and routine. Users advance through the game by harvesting crops at scheduled intervals; if you plant a field of pumpkins at noon, for example, you must return to harvest at eight o’clock that evening or risk losing the crop. Each pumpkin costs thirty coins and occupies one square of your farm, so if you own a fourteen by fourteen farm a field of pumpkins costs nearly six thousand coins to plant. Planting requires the user to click on each square three times: once to harvest the previous crop, once to re-plow the square of land, and once to plant the new seeds. This means that a fourteen by fourteen plot of land—which is relatively small for Farmville—takes almost six hundred mouse-clicks to farm, and obligates you to return in a few hours to do it again. This doesn’t sound like much fun, Mr. Caillois. Why would anyone do this?
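The arithmetic in the excerpt above checks out, and is worth seeing worked through; every figure here (30 coins per pumpkin, three clicks per square, a fourteen-by-fourteen plot) comes straight from the quote.

```python
# The FarmVille arithmetic from the quote, worked through.
side = 14
squares = side * side        # 196 squares on a 14x14 farm

coin_cost = squares * 30     # 30 coins per pumpkin planted
clicks = squares * 3         # harvest + re-plow + plant, per square

print(squares, coin_cost, clicks)  # 196 5880 588
```

Hence "nearly six thousand coins" (5,880) and "almost six hundred mouse-clicks" (588), repeated every few hours on a schedule the crop timer dictates.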

The secret to Farmville’s popularity is neither gameplay nor aesthetics. Farmville is popular because it entangles users in a web of social obligations. When users log into Facebook, they are reminded that their neighbors have sent them gifts, posted bonuses on their walls, and helped with each others’ farms. In turn, they are obligated to return the courtesies. As the French sociologist Marcel Mauss tells us, gifts are never free: they bind the giver and receiver in a loop of reciprocity. It is rude to refuse a gift, and ruder still to not return the kindness. We play Farmville, then, because we are trying to be good to one another. We play Farmville because we are polite, cultivated people.

On a tangent, there is a blog titled The Psychology of Games; some of its content has to do with psychological manipulation techniques to control and monetise gamers, though it also covers examples of game theory (in the Prisoner's Dilemma sense) in games, psychoeconomics, the enjoyment of gaming as an activity, and, indeed, a wealth of psychological phenomena as illustrated through video gaming.

Sleep Talkin' Man: a log of the bizarre, surreal and often obscenity-filled utterances of a man afflicted with the condition of sleep talking, as transcribed (and sometimes recorded and posted online) by his wife:

"Don't move a muscle. Bushbabies are everywhere... everywhere... Shoot the fucking big-eyed wanky shite fucks! Kick 'em. Stamp them. Poke 'em in their big eyes! Take that for scaring the crap out of me."

"My badger's gonna unleash hell on your ass. Badgertastic!"

"It's a good thing your breath smells of shit. It colors your words beautifully. Gives it an edge."

A new study from Bristol University has looked into the differences between cat owners and dog owners. As well as the usual stereotypes (cat owners are more likely to be women who live alone), they discovered that cat owners are more likely to have degrees than dog owners (47.2% of households with cats have one person with a degree, compared to 38.4% with a dog):

"Our best guess is that it's to do with working hours and perhaps commuting to work, meaning people have a less suitable lifestyle for a dog. It's really just a hunch though."

Or perhaps there are common psychological traits associated with a fondness for cats and a likelihood to apply oneself to study (or, indeed, a fondness for dogs and a likelihood to quit wastin' time and go out into the real world)?

The authors found that a small number of users had developed a compulsive internet habit, replacing real life social interaction with online chat rooms and social networking sites.

They classed 18 respondents - 1.2% of the total - as "internet addicts".
This group spent proportionately more time on sex, gambling and online community websites... The internet addicts were significantly more depressed than the non-addicted group, with a depression score five times higher.

Of course, the whole concept of "internet addiction" is a dubious one, often tinged with tabloid-style moral panic, so there's a danger that advocates of the "internet addiction" industry will wave this study around as proof, ignoring the fact that the addictive behaviours it describes are more usefully classed as gambling and/or pornography addiction.

The report does not put forward any causal links between heavy internet use and depression. Do specific patterns of internet use weaken social contacts, contributing to depression, or do depressed people use the internet to self-medicate?

Also, the inclusion of online community websites along with sex and gambling websites seems somewhat dubious; while the latter are masturbatory replacements for natural stimuli, especially stimuli that someone leading an impoverished life may lack, can one really claim that social community sites substitute for and weaken social ties rather than facilitating them? I recall a study from a few years ago which showed that users of social web sites actually have stronger social connections, and improved wellbeing as a result. Though it is always possible that various characteristics of particular social websites (which may be influenced by their design and/or emergent from organic patterns of use) influence their ability to facilitate psychologically useful social ties.

The boffins at OKCupid have posted another statistical tour of the mysteries of human sexual attraction, this time looking at profile pictures, and what makes them work (or fail). Some of the findings: the "MySpace shot", cheesy as it may sound, does work for women (though only if they're looking for something other than interesting conversation), and if you're male, you're advised to get your shirt off:

Canadian psychology professor Bob Altemeyer has made available online the text of a book examining the psychology of authoritarianism. Altemeyer looks at what he calls Right-Wing Authoritarianism, a personality trait which manifests itself in a high degree of submission to the established authorities, high levels of aggression in the name of the authorities, and a high level of conventionalism, and correlates with the political right, at least in North America. (He also mentions left-wing authoritarianism—think dogmatic Maoism or similar—in passing, though dismisses it as having all but died out in North America, whereas right-wing authoritarianism is going from strength to strength.)

It’s about what happened to the American government after "conservatives" gained control of Congress in the 1990s and the White House in 2000. It’s about the disastrous decisions that government made, which have created the enormous problems we face now. It’s about the corruption that rotted the Congress. It’s about how traditional conservatism has nearly been destroyed by authoritarianism. It’s about how the “Religious Right” teamed up with amoral authoritarian leaders to push its un-democratic agenda onto the country.

For example, take the following statement: “Once our government leaders and the authorities condemn the dangerous elements in our society, it will be the duty of every patriotic citizen to help stomp out the rot that is poisoning our country from within.” Sounds like something Hitler would say, right? Want to guess how many politicians, how many lawmakers in the United States agreed with it? Want to guess what they had in common?

Altemeyer puts forward a Right-Wing Authoritarian personality scale, with higher scores correlating with the trait. High-RWA individuals have a "Daddy knows best" attitude to the authorities. They defer to their leaders, and even while they often believe that the law, however harsh, must be obeyed, they will exempt their leaders from this if the ends justify the means (such as approving of illegal activities against "radicals" or "enemies of society"). They view the world in terms of in-groups and out-groups, with little sympathy for the latter and an us-vs.-them outlook; they exhibit aggression against those seen to be transgressing against the norms of society, and are quicker than average to join with others to take action against them. And, being highly conventional, they interpret a lot of things as existential threats to the established order. (Authoritarianism, in other words, seems to tie in with a survival-values worldview, driven by the perception of existential threats and the need to deal with them.) Being driven by faith in authority, high-RWAs are more capable than most of compartmentalising contradictory beliefs and resisting challenges to their beliefs posed by logic or evidence.

The Authoritarians looks at the RWA scale and other phenomena, such as religious fundamentalism, social-dominance orientation and real-world politics. Not surprisingly, there are correlations between right-wing authoritarianism and religious fundamentalism, and both are strong predictors of prejudice against out-groups. (Paradoxically, many high-RWA people exhibit both racial prejudices and hostility to overt racism, largely due to not seeing themselves or their peers as racially prejudiced; this would be the dampening effect authoritarianism has on insight and analysis.) Meanwhile, there are both parallels and differences between right-wing authoritarian followers and people who score highly on the social dominance scale; the former don't necessarily want personal power, whereas the latter are less likely to be religious or constrained by rules, though will often happily feign religiosity as a means to an end. Some individuals, of course, score highly on both scales. Because authoritarian followers are receptive to messages that feel right, and are suspicious of critical thought, right-wing authoritarian movements attract more than their share of power-hungry sociopaths willing to pound the right talking points to get willing, unquestioning followers.

The bad news is, the authoritarians have been ascendant over the past decade (in the US, Altemeyer says, they have largely seized the Republican Party). The good news is that right-wing authoritarianism, as a tendency, can be defeated. Studies have found that fear increases RWA scores, in effect making people shut up and follow the leader. (This was used to great effect by the Bush White House, for example, by instituting a prominent colour-coded terror threat level, seldom dipping below "severe", and raising it inexplicably before elections.) Fearful societies are governed by authoritarian survival values; without fear, those values have a harder time getting a grip. Exposure to people unlike oneself and one's "in-group" also weakens authoritarian tendencies, as does a liberal education. A study cited by Altemeyer showed university students' RWA scores declining steadily over the course of their studies, and remaining low throughout their lives. (Parenting, meanwhile, causes one's RWA scores to increase slightly.)

There are other ideas Altemeyer's Right-Wing Authoritarianism scale ties into, such as Lakoff's strict-father/nurturing-parent family dichotomy (which Altemeyer looks at though finds weakly connected), Milgram's obedience experiment, Zimbardo's Stanford Prison Experiment, and theories about the mass psychology of fascism. (Of which this strikes me as one of the more useful ones; while it may be fun to posit connections between fascism and manned flight or the mass spectacle of rock'n'roll, those are probably less useful for actually understanding the threat of fascism as a mass movement.)

The Gervais Principle, or the social psychology of how organisations really function, as seen in the Office TV comedies:

Now, after four years, I’ve finally figured the show out. The Office is not a random series of cynical gags aimed at momentarily alleviating the existential despair of low-level grunts. It is a fully-realized theory of management that falsifies 83.8% of the business section of the bookstore. The theory begins with Hugh MacLeod’s well-known cartoon, Company Hierarchy ..., and its cornerstone is something I will call The Gervais Principle, which supersedes both the Peter Principle and its successor, The Dilbert Principle.

The MacLeod hierarchy, and the theory which is the cornerstone of Ricky Gervais' comedy (and its US remake), divides organisations into three psychological types, somewhat facetiously labelled Sociopaths (i.e., those driven by the desire to control and dominate, without whom no decisions would be made), Losers (i.e., those who have made the tradeoff of security for control of their destiny; these need not necessarily be losers in the colloquial sense) and the Clueless (who are in the middle of the hierarchy, but are one level below losers in self-awareness; whereas the loser typically puts in the minimum they can get away with, the clueless give their loyalty to the organisation out of a misplaced faith that it will be reciprocated). Initially, organisations start off with a few Sociopaths in the driving seat and a corps of Losers doing the gruntwork in exchange for a regular paycheque; as they get larger, a layer of Clueless is added, and expands. This layer may be imagined as a dense, inert substance, which serves to keep the otherwise inherently unstable organisation from imploding.

A sociopath-entrepreneur with an idea recruits just enough losers to kick off the cycle. As it grows it requires a clueless layer to turn it into a controlled reaction rather than a runaway explosion. Eventually, as value hits diminishing returns, both the sociopaths and losers make their exits, and the clueless start to dominate. Finally, the hollow brittle shell collapses on itself and anything of value is recycled by the sociopaths according to meta-firm logic.

The Gervais Principle builds on this, and describes how Losers who put in more than is in their best interest get promoted to middle-management, not because of their talents, or because of their incompetence (as per the Peter Principle or Dilbert Principle), but because they are most useful as pebbles in the insulating layer of the Clueless.

Sociopaths, in their own best interests, knowingly promote over-performing losers into middle-management, groom under-performing losers into sociopaths, and leave the average bare-minimum-effort losers to fend for themselves.

A loser who can be suckered into bad bargains is set to become one of the clueless. That’s why they are promoted: they are worth even more as clueless pawns in the middle than as direct producers at the bottom, where the average, rationally-disengaged loser will do. At the bottom, the overperformers can merely add a predictable amount of value. In the middle they can be used by the sociopaths to escape the consequences of high-risk machinations like re-orgs.

Which brings us to the other major management book that is consistent with the Gervais Principle: Images of Organization, Gareth Morgan’s magisterial study of the metaphors through which we understand organizations. Of the eight systemic metaphors in the book, the one that is most relevant here is the metaphor of an organization as a psychic prison. The image is derived from Plato’s allegory of the cave, which I won’t get into here. Suffice it to say that it divides people into those who get how the world really works (the sociopaths and the self-aware slacker losers) and those who don’t (the over-performer losers and the clueless in the middle).

The OKCupid people have been running a free online dating service, backed by psychological matching algorithms driven by user-written tests, for many years, and have built up a huge corpus of data about how people interact. Now they have started a blog, where they discuss the statistical findings that may be gathered from comparing people's profiles and message counts.

Race has a slightly greater influence (of a few percentage points either way), presumably because of uneven distribution of cultural backgrounds, but it is still fairly small. (Keep in mind that the match scores are computed from how users answer others' questions, and not from explicitly asking questions like "would you date a Virgo/Polynesian/Buddhist".) Religion, however, turns out to be a lot more telling:

According to this, atheists, agnostics, Jews and Buddhists seem to get along just swell (in fact, Buddhists appear to be slightly more compatible with the nonbelievers than with other Buddhists), whereas the Christians, Hindus and Muslims tend to be somewhat more contentious, getting along less well not only with other religions but also with each other. Additionally, the more seriously one takes religion, it seems, the less likely one is to get along with others.

Love may be blind, but it also seems that it, or at least attraction, is deeply racist.

On a lighter note, OKCupid have crunched the word frequencies of successful and unsuccessful opening messages and discovered what to write if you want a reply. Netspeak and "hip" misspellings ('u', 'luv', 'wat') and physical compliments are out, whereas mentions of specific interests are helpful. Unsurprisingly, mentioning religion is generally a bad idea as well.

Romcoms don't merely provide an evening's harmless escapism. They help underpin one of the most potent doctrines of our culture: the sanctity of romantic love. It's a doctrine in which many find relief from the materialism, apathy and banality of a society no longer hallowed by religious transcendence. Yet it comes at a price.

The involuntary cognitive state that Jennifer Aniston finds herself depicting so frequently is real enough, but not particularly mystical. Brain scans show it to be generated by the frisky interaction of chemicals like norepinephrine and dopamine. If this hubbub's triggered by recognition of genetic quality, as now seems to be assumed, that would explain why Aniston and her ilk have to be so annoyingly good-looking.

What we call love induces some of the worst behaviour that we're likely to encounter. Yet when this occurs, it usually invites no censure, let alone punishment. Romantic love is a get-out-of-jail-free card that legitimises actions which would otherwise be thought contemptible. Home-wreckers steal something cherished far more deeply than money or possessions. Nonetheless, they go on to build their happiness on the misery of others without having to endure the slightest disapproval. After all, they had no choice but to do what they did: they were in love.

In other cultures, romantic love enjoys no comparable status. Our own ancestors might find our veneration of it as puzzling as we find their worship of pagan gods. In our otherwise disrespectful age, the persistence of its dominion is rather remarkable. Would it have proved so enduring without the big screen's relentless promotion of its supposedly limitless benefits?

I have been wondering whether the emphasis on romantic love in popular culture (a significant proportion of mainstream pop songs seem to be about the transitions into and out of the state of being in love, for example) and the sexualisation of the media are not two sides of the same coin, namely a focus on relations between people as being a marketplace of potential partners, rather than less glamorous and less dynamic forms of relations. Could this be a sort of social Reaganism/Thatcherism, the ideological assertion that "everything is a market" translated into the realm of interpersonal relations?

The clever experiments demonstrated that love makes us think differently in that it triggers global processing, which in turn promotes creative thinking and interferes with analytic thinking. Thinking about sex, however, has the opposite effect: it triggers local processing, which in turn promotes analytic thinking and interferes with creativity.

Why does love make us think more globally? The researchers suggest that romantic love induces a long-term perspective, whereas sexual desire induces a short-term perspective. This is because love typically entails wishes and goals of prolonged attachment with a person, whereas sexual desire is typically focused on engaging in sexual activities in the "here and now". Consistent with this idea, when the researchers asked people to imagine a romantic date or a casual sex encounter, they found that those who imagined dates imagined them as occurring farther into the future than those who imagined casual sex.

A global processing style promotes creative thinking because it helps raise remote and uncommon associations. Consider, for example, the act of finding a gift for your partner. If we think about a gift while in a local mindset, then we’ll probably focus on more literal and concrete options, most of which involve a tangible object wrapped in colorful paper. We’ll probably consider the usual suspects, such as a watch, a book, or perfume. However, thinking about a gift more globally might inspire us to consider a gift as "anything that will make him/her happy". This may, in turn, bring to mind more diverse and original ideas, such as going on a joint vacation, writing a song, or cleaning and remodeling the house. Of course, this doesn’t mean we should always think globally. While local processing might interfere with creativity, it also promotes analytic thinking, which requires us to apply logical rules. For example, if you are looking for a piece of furniture in a big display according to a pre-defined list of criteria (e.g., size, color, price), a local mindset may help you find a match, by preventing you from being side-tracked by attractive but irrelevant options and by making you pay more attention to relevant details.

I wonder how this ties into other things, such as holism and reductionism. Or whether there's a correlation between short-term thinking and a prevalence of sexualised imagery/metaphors.

The seats are bright blue and have stickers above them marking them out as special seats for the use of obese persons. Strangely enough, they seem almost always empty; presumably, a lot of those who can fit into a regular seat or can bear standing are not keen to self-identify themselves as severely overweight.

One wonders how facility planners could cater for a widening population without coming up against stigma and denial. They could, of course, go back to flat benches without divisions, except that those allow homeless people to sleep on them, which is unacceptable for various reasons. (Not all people are sufficiently enlightened and compassionate to share their daily commute with the aromatically homeless, and if public transport facilities adopted a secondary role as a homeless shelter, this would drive out many of those who are sufficiently well-off to avoid public transport, putting more cars on the road, and resulting in the money spent on running the actual trains being wasted, but I digress.) Possibly some sort of design with all seats being double-width with a low-key, or movable, divider in the middle, would do the trick; though that could have the unintended consequence of encouraging amorous couples.

Wikipedia link of the day:
Spite houses, or where malice and architecture intersect:

A spite house is a building (generally found in an urban environment) which was constructed or modified because the builder felt wronged by someone who did not want it there. Typically built to annoy someone, in most cases a neighbor, these buildings serve primarily as obstructions, blocking out light or access to neighboring buildings, or as flamboyant symbols of defiance.[1][2] Because actually inhabiting such structures is usually a secondary goal at most, they often have strange and impractical layouts.

Dr McComb and her team set up an experiment which tested human responses to the different purring types. She says: “When humans were played purrs recorded while cats were actively seeking food at equal volume to purrs recorded in non-solicitation contexts, even those with no experience of cats judged the ‘solicitation’ purrs to be more urgent and less pleasant.”

Not all cats, however, use this solicitation purring: “It seems to most often develop in cats that have a one-on-one with their owners rather than in large households where there is a lot going on and such purring might get overlooked. Meowing seems to be more common in these situations.”

Cats tend to use the "soliciting purr" at times such as early in the morning, to elicit compliance from humans who may otherwise prefer to do something else, such as remaining asleep. It appears to be individually learned rather than an evolved instinct. There are more details, including embedded video, here.

The year was 1374. In dozens of medieval towns scattered along the valley of the River Rhine hundreds of people were seized by an agonising compulsion to dance. Scarcely pausing to rest or eat, they danced for hours or even days in succession. They were victims of one of the strangest afflictions in Western history. Within weeks the mania had engulfed large areas of north-eastern France and the Netherlands, and only after several months did the epidemic subside. In the following century there were only a few isolated outbreaks of compulsive dancing. Then it reappeared, explosively, in the city of Strasbourg in 1518. Chronicles indicate that it then consumed about 400 men, women and children, causing dozens of deaths (Waller, 2008).

Not long before the Strasbourg dancing epidemic, an equally strange compulsion had gripped a nunnery in the Spanish Netherlands. In 1491 several nuns were ‘possessed’ by devilish familiars which impelled them to race around like dogs, jump out of trees in imitation of birds or miaow and claw their way up tree trunks in the manner of cats. Such possession epidemics were by no means confined to nunneries, but nuns were disproportionately affected (Newman, 1998). Over the next 200 years, in nunneries everywhere from Rome to Paris, hundreds were plunged into states of frantic delirium during which they foamed, screamed and convulsed, sexually propositioned exorcists and priests, and confessed to having carnal relations with devils or Christ.

The article examines these phenomena, dismissing various theories (such as them being caused by ergotism, or the consumption of bread contaminated with hallucinogenic mould), and makes the case that they were culture-bound psychogenic illnesses, enabled by accepted beliefs about the supernatural and triggered by stress:

Similarly, it is only by taking cultural context seriously that we can explain the striking epidemiological facts that possession crises so often struck religious houses and that men were far less often the victims of mass diabolical possession. The daily lives of nuns were saturated in a mystical supernaturalism, their imaginations vivid with devils, demons, Satanic familiars and wrathful saints. They believed implicitly in the possibility of possession and so made themselves susceptible to it. Evangelical Mother Superiors often made them more vulnerable by encouraging trance and ecstasy; mind-altering forms of worship prepared them for later entering involuntary possession states. Moreover, early modern women were imbued with the idea that as the tainted heirs of Eve they were more liable to succumb to Satan, a misogynistic trope that often heightened their suggestibility.

Theological conventions also conditioned the behaviour of demoniac nuns. This is apparent from the fact that nearly all possession epidemics occurred within a single 300-year period, from around 1400 to the early 1700s. The reason is that only during this period did religious writers insist that such events were possible (Newman 1998). Theologians, inquisitors and exorcists established the rules of mass demonic possession to which dissociating nuns then unconsciously conformed: writhing, foaming, convulsing, dancing, laughing, speaking in tongues and making obscene gestures and propositions. These were shocking but entirely stereotypical performances based on deep-seated beliefs about Satan’s depravity drawn from religious writings and from accounts of previous possessions. For centuries, then, distress and pious fear worked in concert to produce epidemics of dancing and possession.

The article concludes with examples of modern occurrences of such phenomena, from the rather feeble examples (such as epidemics of fainting) one could find in a materialistic post-Enlightenment society to "spirit possession" among factory workers drawn from rural communities in Malaysia and Singapore, to delusions of penis-stealing witchcraft in western Africa.

In hot places, human life seems to be worth less. Iraq is a place where an army of Meursaults in the form of soldiers and "contractors" are constantly on the brink of breaching their own culture's taboos, driven to acts of violence by the murderous heat. Under the sun, nothing matters any more.

Some say that cultural psychology changes as you move closer to the equator. In Discovering Psychology with Philip Zimbardo, a précis of a film about cultural psychology made for Stanford University, James Jones of the University of Delaware argues that a way of being has evolved near the equator featuring particular uses of time, rhythm, improvisation, orality, and spirituality. What he describes is a failure to defer gratification:

Cod-Vic usually has an element of knowing perversity to it, relishing its bad taste rather than recoiling from it. Cod-Vic has also opened the floodgates to a host of one-liners, cheap novelties, and dead ends--some delightful, others less so. The most successful Cod-Vic offerings seem to strike a deft balance, borrowing from overwrought Victorian forms while maintaining a modern crispness and rigor.

Cod-Mod is another form of retro, but an altogether less problematic one than Cod-Vic since its clean lines, unadorned materials and simple color schemes lend a laid-back, contemporary air of unaffected sophistication, making it much easier to defend as an aesthetic. Cod-Mod has been mined heavily by indie music (Camera Obscura and Belle & Sebastian, anyone?), as well as interior design types like Jonathan Adler and film directors like Wes Anderson. Oh yeah, and Ikea.

To figure out the current state and direction of the global economy, economists are turning to somewhat unusual indicators, such as the membership of extramarital infidelity websites and the price of prostitution in Latvia:

The Web site crunched its traffic and membership numbers and found that there was a big increase in both when there was a turning point in the FTSE-100 index, which measures the leading companies listed in London. When the market collapses, people plot affairs. And when the bulls rage, the same thing happens. When it is trading sideways, they stick with their partners.

“It has to do with people’s confidence levels,” says Rosie Freeman-Jones, a spokeswoman for the site. “When the markets are up, they think they can have an affair because they feel they can get away with anything. When the market hits the bottom, they are looking for a way to relieve the pressure.”

And here is more information on the prostitution index, and why prostitution prices make a good economic indicator.

Anyway the problem is that most industries have contractual arrangements which fix prices. Wages are very hard to flex downwards. Rents are fixed over sustained periods and the like. All of this means that people go bust rather than reduce prices – simply because prices are sticky.

Well – most prices. The contractual terms of prostitution are short (an hour, a night) and entry to the industry is unconstrained. That means that the prices are very flexible. Extraordinarily flexible.

Martin could often see precisely what cardholders were purchasing, and he discovered that the brands we buy are the windows into our souls — or at least into our willingness to make good on our debts. His data indicated, for instance, that people who bought cheap, generic automotive oil were much more likely to miss a credit-card payment than someone who got the expensive, name-brand stuff. People who bought carbon-monoxide monitors for their homes or those little felt pads that stop chair legs from scratching the floor almost never missed payments. Anyone who purchased a chrome-skull car accessory or a “Mega Thruster Exhaust System” was pretty likely to miss paying his bill eventually.

Martin’s measurements were so precise that he could tell you the “riskiest” drinking establishment in Canada — Sharx Pool Bar in Montreal, where 47 percent of the patrons who used their Canadian Tire card missed four payments over 12 months. He could also tell you the “safest” products — premium birdseed and a device called a “snow roof rake” that homeowners use to remove high-up snowdrifts so they don’t fall on pedestrians.

By the time he publicized his findings, a small industry of math fanatics — many of them former credit-card executives — had started consulting for the major banks that issued cards, and they began using Martin’s findings and other research to build psychological profiles. Why did birdseed and snow-rake buyers pay off their debts? The answer, research indicated, was that those consumers felt a sense of responsibility toward the world, manifested in their spending on birds they didn’t own and pedestrians they might not know. Why were felt-pad buyers so upstanding? Because they wanted to protect their belongings, be they hardwood floors or credit scores. Why did chrome-skull owners skip out on their debts? “The person who buys a skull for their car, they are like people who go to a bar named Sharx,” Martin told me. “Would you give them a loan?”

It's not only your purchasing record that's mined for psychological data, though:

Most of the major credit-card companies have set up systems to comb through cardholders’ data for signs that someone is going to stop making payments. Are cardholders suddenly logging in at 1 in the morning? It might signal sleeplessness due to anxiety. Are they using their cards for groceries? It might mean they are trying to conserve their cash. Have they started using their cards for therapy sessions? Do they call the card company in the middle of the day, when they should be at work? What do they say when a customer-service representative asks how they’re feeling? Are their sighs long or short? Do they respond better to a comforting or bullying tone?

The card companies have, as you might imagine, a variety of uses for this data. On the blunter side of the spectrum, signs of potential unreliability (bills for dive bars or marriage counselling services, unusual login patterns) may trigger card companies to raise interest rates or start pushing more aggressively for repayment. More subtly, though, if your credit card company calls you to discuss your bill, the person talking to you will be trained in psychological techniques and will have on their screen a detailed psychological profile of you, all the better to elicit compliance:

Santana had actually already sought permission from the bank to settle for as little as $10,000. It’s an open secret that if a debtor is willing to wait long enough, he can probably get away with paying almost nothing, as long as he doesn’t mind hurting his credit score. So Santana knew he should jump at the offer. But as an amateur psychologist, Santana was eager to make his own diagnosis — and presumably boost his own commission.

“I don’t think that’s going to work,” Santana told the man. Santana’s classes had focused on Abraham Maslow’s hierarchy of needs, a still-popular midcentury theory of human motivation. Santana had initially put this guy on the “love/belonging” level of Maslow’s hierarchy and built his pitch around his relationship with his ex-wife. But Santana was beginning to suspect that the debtor was actually in the “esteem” phase, where respect is a primary driver. So he switched tactics.

“You spent this money,” Santana said. “You made a promise. Now you have to decide what kind of a world you want to live in. Do you want to live around people who break their promises? How are you going to tell your friends or your kids that you can’t honor your word?”

The man mulled it over, and a few days later called back and said he’d pay $12,000.

“Boom, baby!” Santana shouted as he put down the phone. “It’s all about getting inside their heads and understanding what they need to hear,” he told me later. “It really feels great to know I’m helping people in pain.”

Of course, another way to look at this is that, had the chump (who, according to the article, had recently been left by his wife) not offered to pay up, the friendly man from the card company would have known exactly which buttons to push to kick him down further. Which is all very well (Personal Responsibility, after all, is What Made America Great, as any card-carrying Libertarian will tell you), were it not for the inherent asymmetry of going up against a huge organisation with frighteningly powerful intelligence-gathering abilities and no interest in your welfare beyond what's required to maximise its profits.

When US filmmaker Andrea Wachner was invited to attend her 10-year high-school reunion in the affluent Los Angeles suburb of Palos Verdes, she didn't want to go; so she recruited an exotic dancer to pretend to be her, fitting her with an earpiece and coaching her interactively on the people she was meeting. Tattooed, scantily-clad "Cricket" claimed that she was Andrea, had had reconstructive surgery and suffered amnesia after a car accident, and that she was working as a stripper to pay for her graduate school tuition. She was followed by a camera crew, ostensibly making a documentary about the daily lives of artists. Cricket finished off her performance by doing a striptease to a Lisa Loeb song.

Most of the people were taken in by this, or were at least sufficiently uncertain not to raise a fuss in case they ended up making fools of themselves, and found out only later, when Wachner posted the video to YouTube as a teaser for "I Remember Andrea Better", a 40-minute documentary she was making about the incident.

The anxious interval: The anxious interval is the recent past. It's long enough ago to feel not-contemporary, but not long enough ago to feel utterly removed. It's at an uncomfortable distance, which is why I call it "anxious". You could think of the anxious interval as the temporal equivalent of the uncanny valley, that place where robots are similar enough to us to give us an uncomfortable shudder. You could also say the anxious interval is a place, a style, a set of references we avoid, repress, sublimate, have selective amnesia about, stow away, throw out, deliberately forget.

An example: Devendra Banhart and the scene that was called Freak Folk or New Weird America. The Wire magazine cover feature on New Weird America dates from August 2003. By April 2005 the San Francisco Chronicle is telling us that Freak Folk Flies High. By June 2006 the New York Times is telling its readers that "a music scene called freak folk is bursting up from underground" but adding that "it looked like a trend of the moment a couple of years ago". By 2009, it's safe to say that a reference to Freak Folk would be more likely to puncture your credibility than bolster it. Freak Folk is in "the anxious interval".

The goldmine is the cultural era the present is currently reviving. I've put a picture of Buggles, because in general we're reviving the 80s at the moment. You know, the guy from Hot Chip wears Buggles-like glasses, and so on. The goldmine is a goldmine for people who run secondhand clothes stores and have lots of stock from the requisite era, or people who are selling synths from that era, or people who've got a bunch of cheap Chinese Ray Ban copy frames. The smartest people in the present are remembering the goldmine and sifting through its waters like a crowd of panhandlers.

The battlefront is the area right at the edge of the goldmine -- the place where the acceptable and lucrative revival era meets a time which is currently repressed, neglected, and a-slumber. What's so interesting about the battlefront is that the process of reassessment is so visible here, and the revaluation is so daringly and consciously done. An elite of taste-leaders and taste-formers unafraid of ridicule are hard at work here, foraging for bargains, bringing an unacceptable era into fresh acceptability. There's a kind of shuddering repulsion for long-neglected, long-repressed artifacts, and yet something compellingly taboo about them. Their hiddenness makes them fascinating -- it's as if their very sublimation has given these cultural objects some kind of big power over our unconscious. The best curators and fashionistas are to be found at the battlefront, battling for the fascinating-repellant things they find in that twilit zone between acceptability and unacceptability.

Brynin found that the euphoria of first love can damage future relationships. "Remarkably, it seems that the secret to long-term happiness in a relationship is to skip a first relationship," said Brynin. "In an ideal world, you would wake up already in your second relationship."

While researching the components of successful long-term partnerships, Brynin found intense first loves could set unrealistic benchmarks, against which we judge future relationships. "If you had a very passionate first relationship and allow that feeling to become your benchmark for a relationship dynamic, then it becomes inevitable that future, more adult partnerships will seem boring and a disappointment," he said.

Cory Doctorow, freelance writer and novelist, has written a short article on how to write productively in the age of ubiquitous distraction. The advice he gives is rather novel; he dismisses the usual advice about switching off one's internet connection, and is also scornful of the idea of ceremony, or of setting the right mood. (And understandably so; acknowledging the idea of there being a right mood or atmosphere for evoking one's inner muse could lead to finding excuses, consciously or subconsciously, for not actually doing anything.)

The single worst piece of writing advice I ever got was to stay away from the Internet because it would only waste my time and wouldn't help my writing. This advice was wrong creatively, professionally, artistically, and personally, but I know where the writer who doled it out was coming from. Every now and again, when I see a new website, game, or service, I sense the tug of an attention black hole: a time-sink that is just waiting to fill my every discretionary moment with distraction. As a co-parenting new father who writes at least a book per year, half-a-dozen columns a month, ten or more blog posts a day, plus assorted novellas and stories and speeches, I know just how short time can be and how dangerous distraction is.

Short, regular work schedule. When I'm working on a story or novel, I set a modest daily goal — usually a page or two — and then I meet it every day, doing nothing else while I'm working on it. It's not plausible or desirable to try to get the world to go away for hours at a time, but it's entirely possible to make it all shut up for 20 minutes. Writing a page every day gets me more than a novel per year — do the math — and there's always 20 minutes to be found in a day, no matter what else is going on. Twenty minutes is a short enough interval that it can be claimed from a sleep or meal-break (though this shouldn't become a habit). The secret is to do it every day, weekends included, to keep the momentum going, and to allow your thoughts to wander to your next day's page between sessions. Try to find one or two vivid sensory details to work into the next page, or a bon mot, so that you've already got some material when you sit down at the keyboard.
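Incidentally, the "do the math" aside above does check out, under the usual assumptions of roughly 250 words to a manuscript page and about 80,000 words to a novel (my figures, not Doctorow's):

```python
# Checking Doctorow's "do the math" aside, with assumed figures:
# ~250 words per manuscript page, ~80,000 words in a typical novel.
words_per_page = 250
days_per_year = 365
novel_length = 80_000

words_per_year = words_per_page * days_per_year  # 91,250
print(words_per_year > novel_length)             # True: a page a day out-writes a novel
```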

Leave yourself a rough edge.
When you hit your daily word-goal, stop. Stop even if you're in the middle of a sentence. Especially if you're in the middle of a sentence. That way, when you sit down at the keyboard the next day, your first five or ten words are already ordained, so that you get a little push before you begin your work. Knitters leave a bit of yarn sticking out of the day's knitting so they know where to pick up the next day — they call it the "hint." Potters leave a rough edge on the wet clay before they wrap it in plastic for the night — it's hard to build on a smooth edge.

Realtime communications tools are deadly. The biggest impediment to concentration is your computer's ecosystem of interruption technologies: IM, email alerts, RSS alerts, Skype rings, etc. Anything that requires you to wait for a response, even subconsciously, occupies your attention. Anything that leaps up on your screen to announce something new, occupies your attention. The more you can train your friends and family to use email, message boards, and similar technologies that allow you to save up your conversation for planned sessions instead of demanding your attention right now helps you carve out your 20 minutes. By all means, schedule a chat — voice, text, or video — when it's needed, but leaving your IM running is like sitting down to work after hanging a giant "DISTRACT ME" sign over your desk, one that shines brightly enough to be seen by the entire world.

They found fans of films such as Runaway Bride and Notting Hill often fail to communicate with their partner. Many held the view if someone is meant to be with you, then they should know what you want without you telling them.

Kimberly Johnson, who also worked on the study, said: "Films do capture the excitement of new relationships but they also wrongly suggest that trust and committed love exist from the moment people meet, whereas these are qualities that normally take years to develop."

In a landmark 1991 E.R.P. study conducted at a prison in Vancouver, Robert Hare and two graduate students showed that psychopaths process words like “hate” and “love” differently from the way normal people do. In another study, at the Bronx V.A. Medical Center, Hare, Joanne Intrator, and others found that psychopaths processed emotional words in a different part of the brain. Instead of showing activity in the limbic region, in the midbrain, which is the emotional-processing center, psychopaths showed activity only in the front of the brain, in the language center. Hare explained to me, “It was as if they could only understand emotions linguistically. They knew the words but not the music, as it were.”

Today, Kiehl and Hare have a complementary but complicated relationship. Kiehl claims Hare as a mentor, and sees his own work as validating Hare’s checklist, by advancing a neurological mechanism for psychopathy. Hare is less gung ho about using fMRI as a diagnostic tool. “Some claim, in a sense, this is the new phrenology,” Hare said, referring to the discredited nineteenth-century practice of reading the bumps on people’s heads, “only this time the bumps are on the inside.”

But the problem is that “psychopathic behavior”—egocentricity, for example, or lack of realistic long-term goals—is present in far more than one per cent of the adult male population. This blurriness in the psychopathic profile can make it possible to see psychopaths everywhere or nowhere. In the mid-fifties, Robert Lindner, the author of “Rebel Without a Cause: A Hypnoanalysis of a Criminal Psychopath,” explained juvenile delinquency as an outbreak of mass psychopathy. Norman Mailer inverted this notion in “The White Negro,” admiring the hipster as a “philosophical psychopath” for having the courage of nonconformity. In the sixties, sociopathy replaced psychopathy as the dominant construct. Now, in our age of genetic determinism, society is once again seeing psychopaths everywhere, and this will no doubt provoke others to say they are nowhere, and the cycle of overexposure and underfunding will continue.

One researcher is studying prison inmates (a population in which psychopathy is, as one might expect, greatly over-represented) using a brain scanner and tests reminiscent of the Voight-Kampff test in Blade Runner, measuring their responses to moral questions that non-psychopathic individuals would respond to viscerally.

The fMRI machine started up with a high-pitched whirring sound. I began to see photographs. One was of a baby covered with blood. I thought first about the blood, then realized the circumstances—birth—and rated the moral offense zero. A man was lying on the ground with his face beaten to a bloody pulp: I scored this high. There was a picture of Osama bin Laden. I scored it four, although I felt that I was making more of an intellectual than a moral judgment. Two guys inadvertently butting heads in a soccer game got a zero, but then I changed it to a one, because perhaps a foul was called. I had considered deliberately giving wrong answers, as a psychopath might. But instead I worked at my task earnestly, like a good fifth grader.

A biologist and a sociologist have put forward a new theory of brain development and mental disorders. Crespi and Badcock's theory posits a spectrum running between autism and related social dysfunctions on one side and schizophrenia, depression and bipolar disorder on the other, with the struggle between maternal and paternal genes in the womb determining where the child's neurology will fall on this axis:

Dr. Crespi and Dr. Badcock propose that an evolutionary tug of war between genes from the father’s sperm and the mother’s egg can, in effect, tip brain development in one of two ways. A strong bias toward the father pushes a developing brain along the autistic spectrum, toward a fascination with objects, patterns, mechanical systems, at the expense of social development. A bias toward the mother moves the growing brain along what the researchers call the psychotic spectrum, toward hypersensitivity to mood, their own and others’. This, according to the theory, increases a child’s risk of developing schizophrenia later on, as well as mood problems like bipolar disorder and depression.

It was Dr. Badcock who noticed that some problems associated with autism, like a failure to meet another’s gaze, are direct contrasts to those found in people with schizophrenia, who often believe they are being watched. Where children with autism appear blind to others’ thinking and intentions, people with schizophrenia see intention and meaning everywhere, in their delusions. The idea expands on the “extreme male brain” theory of autism proposed by Dr. Simon Baron-Cohen of Cambridge.

“Think of the grandiosity in schizophrenia, how some people think that they are Jesus, or Napoleon, or omnipotent,” Dr. Crespi said, “and then contrast this with the underdeveloped sense of self in autism. Autistic kids often talk about themselves in the third person.”

Faced with a wave of ostalgie, misty-eyed nostalgia for the fallen East German Communist regime, Germany's educational authorities have created a mockup of an East German classroom, in which school students would be subjected to the Communist experience. There they would be threatened with disciplinary action for wearing Western clothes, ordered to sing Communist marching songs and told of field trips to border guard regiments, by a "teacher" attired in authentic East German synthetic fabrics. One student would also volunteer in advance to play the child of dissidents, who would then be alternately criticised and ignored by the teachers. What the organisers hadn't planned on was that the whole thing would turn into a small-scale reenactment of the Stanford Prison Experiment, with dissident "Steffen"'s erstwhile classmates turning on him and joining in persecuting him like good cogs in the totalitarian machine:

The other pupils began to ostracise "Steffen" themselves and accused him of disrupting the class. Although they were encouraged to stand up against the system before the session, none of the pupils rallied to Steffen's support when he was told he could not visit the border-guard unit, or at any other time.

During these sessions Elke Urban models herself on Margot Honecker, the leader's wife who was also a hardline education minister. She said that only one group had dared to stand up and defend the dissident pupil during her classes. "I deliberately create a totalitarian atmosphere and I am still always shocked how quickly and easily people are conditioned by it," she said. "East Germany may have left a pile of Stasi files behind rather than a pile of corpses, but the similarities with the Nazi regime are there."

Research from 1915 through to the 1950s suggested that the vast majority of dreams are in black and white but the tide turned in the sixties, and later results suggested that up to 83 per cent of dreams contain some colour.

Only 4.4 per cent of the under-25s' dreams were black and white. The over-55s who had had access to colour TV and film during their childhood also reported a very low proportion of just 7.3 per cent. But the over-55s who had only had access to black-and-white media reported dreaming in black and white roughly a quarter of the time.

It isn't clear what sorts of dreams people who grew up without television have; whether they're less visual and more verbal, more three-dimensional, or just less spectacular.

BLINK and you would have missed it. The expression of disgust on former US president Bill Clinton's face during his speech to the Democratic National Convention as he says "Obama" lasts for just a fraction of a second. But to Paul Ekman it was glaringly obvious.

"Given that he probably feels jilted that his wife Hillary didn't get the nomination, I would have to say that the entire speech was actually given very gracefully," says Ekman, who has studied people's facial expressions and how they relate to what they are thinking for over 40 years.

Another algorithm scores politicians on the amount of spin, or manipulative content-free language, in their speeches, using word frequencies:

The algorithm counts usage of first person pronouns - "I" tends to indicate less spin than "we", for example. It also searches out phrases that offer qualifications or clarifications of more general statements, since speeches that contain few such amendments tend to be high on spin. Finally, increased rates of action verbs such as "go" and "going", and negatively charged words, such as "hate" and "enemy", also indicate greater levels of spin. Skillicorn had his software tackle a database of 150 speeches from politicians involved in the 2008 US election race (see diagram).

In general though, Obama's speeches contain considerably higher spin than either McCain or Clinton. For example, for their speeches accepting their party's nomination for president, Obama's speech scored a spin value of 6.7 - where 0 is the average level of spin within all the political speeches analysed, and positive values represent higher spin. In contrast, McCain's speech scored -7.58, while Hillary Clinton's speech at the Democratic National Convention scored 0.15. Skillicorn also found that Sarah Palin's speeches contain slightly more spin than average.
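As a rough illustration of the word-frequency approach described above, one could sketch something along these lines (the word lists and weights here are my own naive stand-ins, not Skillicorn's actual trained model):

```python
# Naive "spin" score sketch: counts the word categories described above.
# Word lists and weights are illustrative assumptions only.
import re

FIRST_SINGULAR = {"i", "me", "my"}        # less spin
FIRST_PLURAL   = {"we", "us", "our"}      # more spin
ACTION_VERBS   = {"go", "going", "fight", "change"}
NEGATIVE_WORDS = {"hate", "enemy", "fear"}
QUALIFIERS     = {"however", "although", "unless", "except"}  # reduce spin

def spin_score(speech: str) -> float:
    words = re.findall(r"[a-z']+", speech.lower())
    n = len(words) or 1
    count = lambda vocab: sum(w in vocab for w in words)
    # First-person-plural, action and negative words raise the score;
    # qualifiers and first-person-singular usage lower it.
    raw = (count(FIRST_PLURAL) + count(ACTION_VERBS) + count(NEGATIVE_WORDS)
           - count(FIRST_SINGULAR) - count(QUALIFIERS))
    return raw / n  # normalise by speech length
```

A real model would, of course, be calibrated against a labelled corpus rather than hand-picked word lists.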

So whilst Obama is one slick player, the straight-talkin', plain-dealin' McCain has little to rejoice about, according to a different metric:

"The voice analysis profile for McCain looks very much like someone who is clinically depressed," says Pollermann, a psychologist who uses voice analysis software in her work with patients. Previous research on mirror neurons has shown that listening to depressed voices can make others feel depressed themselves, she says.

Additionally, McCain's voice and facial movements often do not match up, says Pollermann, and he often smiles in a manner that commonly conveys sarcasm when addressing controversial statements. "That might lead to what I would call a lack of credibility."

In a study last year, Dr Thomas Jackson of Loughborough University, England, found that it takes an average of 64 seconds to recover your train of thought after interruption by email (bit.ly/email2). So people who check their email every five minutes waste 8½ hours a week figuring out what they were doing moments before.
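A quick back-of-the-envelope check of that figure, assuming a 40-hour working week (the article doesn't state the week length, so that part is my assumption):

```python
# Back-of-the-envelope check of the figure quoted above,
# assuming a 40-hour working week.
recovery_seconds = 64           # per interruption (Jackson's finding)
check_interval_minutes = 5
work_hours_per_week = 40

checks_per_week = work_hours_per_week * 60 / check_interval_minutes  # 480
wasted_hours = checks_per_week * recovery_seconds / 3600
print(round(wasted_hours, 1))   # 8.5
```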

The distractive (and some would say destructive) effects of email come down partly to the psychology of addiction and reinforcement:

Tom Stafford, a lecturer at the University of Sheffield, England, and co-author of the book Mind Hacks, believes that the same fundamental learning mechanisms that drive gambling addicts are also at work in email users. "Both slot machines and email follow something called a 'variable interval reinforcement schedule' which has been established as the way to train in the strongest habits," he says.

"This means that rather than reward an action every time it is performed, you reward it sometimes, but not in a predictable way. So with email, usually when I check it there is nothing interesting, but every so often there's something wonderful - an invite out or maybe some juicy gossip - and I get a reward." This is enough to make it difficult for us to resist checking email, even when we've only just looked. The obvious solution is to process email in batches, but this is difficult. One company delayed delivery by five minutes, but had so many complaints that they had to revert to instantaneous delivery. People knew that there were emails there and chafed at the bit to get hold of them.

On a similar tangent, one of the tips for getting more done in Tim Ferriss' The 4-Hour Work Week is a somewhat counterintuitive-sounding low-information diet; rather than bingeing on magazines, news, books, blogs, podcasts and such, he advocates cutting these out as much as possible, reasoning that we can get by on much less information than we habitually consume and still know enough, whilst having more time to actually do things. The holes in what we know will soon enough be filled by what we hear in smalltalk, learn from friends, or unavoidably see on the front pages at the newspaper kiosk on the way to the shops. (Which sort of makes sense; think, for example, of how much you know of the plots of various well-known movies you haven't seen or books you haven't read. Whether or not you've seen Star Wars or read Animal Farm (to cite two examples), you can probably come up with a summary of what they're about.)

A study recently published in the Australasian Psychiatry journal has found correlations between musical preferences and a variety of mental illnesses and antisocial tendencies, and recommends that doctors ask their teenaged patients what sorts of music they listen to. The study, by Dr. Felicity Baker of the University of Queensland, is not online, but these articles contain various points from it. Among them:

There are associations between listening to heavy metal and suicidal ideation, depression and drug use, while both metal and trance, techno and "medieval music" are connected with self-harm (though, apparently, only when associated with the goth subculture). Outside of the goth subculture, it seems, dance music is just associated with drug use.

Different forms of rap/hip-hop are associated with different levels of criminality and delinquency, as well as violence and misogyny; apparently the worst is "French rap". I wouldn't have guessed that enough Australian teenagers would understand French well enough to get into the sound of les banlieues. Could it be that teenagers are learning French for the street cred?

Those who are into jazz tend to be misfits and loners (one could presumably call this the Howard Moon Effect). Is jazz a big thing among today's teens, or did they lump a whole bunch of non-pop/non-dance genres, like post-rock, krautrock, Balkan/klezmer/gypsy and nu-gazer, in with jazz?

There's an intriguing article in the Guardian about the descendants of German Nazis who converted to Judaism and moved to Israel. The article interviews several such converts (the son of an SS man who's an Orthodox rabbi, a left-wing lesbian campaigner for Palestinian rights, and a professor of Jewish Studies who is related to Hitler, and who describes his (Israeli-born, Arab-hating) son as a "fascist").

One somewhat obvious explanation for this phenomenon is the assuagement of guilt, by rejecting the oppressor population one came from and identifying with the victims; this explanation is floated by an expert on the psychology of the children of perpetrators. Interestingly, though, none of those interviewed, when asked why they converted to Judaism, mentions the Holocaust or Nazism, instead giving theological reasons:

"During my theological studies at university it became clear that I couldn't be a minister in the church," he says. "I concluded that Christianity was paganism. One of [its] most important dogmas is that God became man, and if God becomes man then man also can become God." He pauses. "Hitler became a kind of god."

I tell Bar-On they talk obsessively about the Trinity. But is incredulity really a reason for abandoning a religion with a three-in-one god for one that still believes bushes talk and that waves are parted by the will of God? "That is another way of saying what I have already told you," he says. "They want to join the community of the victim. They may have their own way of rationalising it."

A new study has revealed a correlation between the number of bumper stickers on a car and the aggressiveness of the driver's behaviour, presumably as bumper stickers indicate a territorial mindset on the part of the driver. Interestingly enough, there was no correlation between the content of the bumper stickers and the driver's behaviour, so a "Visualise World Peace" sticker would be as much of a danger sign as a "Don't Mess With Texas" one.

Psychology experiments have shown that subliminal exposure to brands can prime people with the attributes those brands have cultivated. For example, when students were exposed to either an Apple or IBM logo and asked to list all the uses for a brick they could imagine, the Apple ("creativity, nonconformity") group came up with significantly more than the IBM ("tradition, responsibility") group. In a subsequent experiment, candidates primed with the Disney logo behaved more honestly than those primed with the logo of E! Channel (which, I believe, is a celebrity-gossip cable-TV channel in the US).

The practical consequences of this are interesting: if this is to be taken at face value then, by the sheer power of subliminal conditioning and marketing, brands do have magical properties, and branded products would perform better than physically identical unbranded ones. A brand logo is a macro, a tightly-encoded package of ideas, instantaneously decoded by appropriately conditioned consumers (and that means all of us, given the studies showing that young children learn to recognise brands before they learn to read), and priming has been shown to work. (In one experiment (previously mentioned here), students were asked to sort words, and then surreptitiously timed as they walked down the corridor on leaving. Those given words relating to old age—including, memorably, "Florida"—walked more slowly than those given youth-related words. Another experiment showed that exposure to alcohol-related words increased men's sex drive.)

Putting these facts together, it seems that using an Apple computer would make you more creative, even if you work in the same version of Microsoft Word you could as easily use on Windows, though so would having an Apple iPod, and Nike shoes could make you run faster than generic trainers of exactly the same composition, and so on. It's not necessarily even limited to brands, but could extend to any perceptible medium associated with qualities or values. It'd be interesting to see whether, for example, if one took two groups of students and, after surreptitiously exposing half of them to Belle & Sebastian and the other half to 50 Cent, asked them to play a game, members of one group would be more aggressive or competitive than the other.

Anyway, this finding could be seen as a justification for big brands' steep markups of otherwise average products: they're not exploiting a gullible public, they're selling the psychological magic of their brand. Though if you don't want to pay the markup, you could just as easily clip ads out of papers and tape them around your cubicle/kitchen/locker/wherever, which might get you a similar result, at the risk of making you look like a tragic. Just keep reminding yourself that you're not a gullible dupe or an unpaid human billboard, but a cunningly rebellious pirate, sticking it to The Man by stealing his magic without paying.

Striking another blow against the modern idea that 100% cheerfulness is attainable or desirable, an expert on mood disorders at King's College argues that depression may be good for you:

The fact it has survived so long - and not been eradicated by evolution - indicates it has helped the human race become stronger.

"I have received e-mails from ex-sufferers saying in retrospect it probably did help them because they changed direction, a new career for example, and as a result they're more content day-to-day than before the depression."

Aristotle believed depression to be of great value because of the insights it could bring. There is also an increased empathy in people who have or have had depression, he says, because they become more attuned to other people's suffering.

However, contrary to popular speculation, it's not technical skills that make engineers attractive recruits to radical groups. Rather, the authors pose the hypothesis that "engineers have a 'mindset' that makes them a particularly good match for Islamism," which becomes explosive when fused by the repression and vigorous radicalization triggered by the social conditions they endured in Islamic countries.

Whether American, Canadian or Islamic, they pointed out that a disproportionate share of engineers seem to have a mindset that makes them open to the quintessential right-wing features of "monism" (why argue where there is one best solution) and by "simplism" (if only people were rational, remedies would be simple).

The internet, with its detachment between online and offline actions and its lack of a private register, has spawned the phenomenon of griefers, or highly organised subcultures of people (mostly young men) who delight in ruining other people's online fun:

Consider the case of the Avatar class Titan, flown by the Band of Brothers Guild in the massively multiplayer deep-space EVE Online. The vessel was far bigger and far deadlier than any other in the game. Kilometers in length and well over a million metric tons unloaded, it had never once been destroyed in combat. Only a handful of player alliances had ever acquired a Titan, and this one, in particular, had cost the players who bankrolled it in-game resources worth more than $10,000.

So, naturally, Commander Sesfan Qu'lah, chief executive of the GoonFleet Corporation and leader of the greater GoonSwarm Alliance — better known outside EVE as Isaiah Houston, senior and medieval-history major at Penn State University — led a Something Awful invasion force to attack and destroy it.

"The ability to inflict that huge amount of actual, real-life damage on someone is amazingly satisfying," says Houston. "The way that you win in EVE is you basically make life so miserable for someone else that they actually quit the game and don't come back."

To see the philosophy in action, skim the pages of Something Awful or Encyclopedia Dramatica, where it seems every pocket of the Web harbors objects of ridicule. Vampire goths with MySpace pages, white supremacist bloggers, self-diagnosed Asperger's sufferers coming out to share their struggles with the online world — all these and many others have been found guilty of taking themselves seriously and condemned to crude but hilarious derision.

Griefers defend their behaviour by claiming that they're merely giving those who take the internet far too seriously a reality check. The implied subtext is that anything that happens online is just a game and doesn't count. Though, given how the internet has become a mainstream part of many people's lives (witness, for example, the rise of social networking websites), this assertion makes about as much sense as Tom Hodgkinson's call to kill your Facebook account, throw away your email address and instead socialise in the pub with people near you. There's not a great leap from asserting that anything that happens online doesn't really count to absurd Luddite claims like "if you don't know what someone smells like, they're a stranger".

On the other hand, there is no such thing as the right to be respected, or even to not be ridiculed. If one posts a web page detailing one's peculiar political views, conspiracy theories and/or sexual fetishes online, one can expect to be laughed at and even snidely remarked about. Though there is a distinction between demolishing someone's homepage in a blog or discussion forum and actively gathering a posse and going out to hound them off the net.

Griefing happens in the real world, though it's usually called other things, such as bullying. The difference is that the internet has democratised bullying. In the real world, in more conformist societies, bullies can typically only be those either holding or contending for alpha social status, enforcing an exaggerated version of majority values by picking on those perceived not to conform to them (witness the use of the word "gay", sometimes semi-euphemised as "ghey", as a general-purpose term of derision), and in more liberal or pluralistic environments, even that is frowned upon. Online, anyone can find a group of like-minded misfits, make up a cool-sounding name, set up a virtual clubhouse and start picking on mutually agreed targets, with little fear of social consequences.

Similarly, another experiment used a cookie-scented candle to further gauge whether appetitive stimulus affects consumer behavior. Female study participants in a room with a hidden chocolate-chip cookie scented candle were much more likely to make an unplanned purchase of a new sweater -- even when told they were on a tight budget -- than those randomly assigned to a room with a hidden unscented candle (67 percent vs. 17 percent).

The researchers make the further claim that "the presence of an attractive woman in the trading room might propel an investor to choose the investment option providing smaller but sooner rewards".

"It was a relief I think for both of us that we were not carbon copies. As similar as we looked when we compared pictures of ourselves as kids, as adults we have our own distinct style."

"We had the same favourite book and the same favourite film, Wings of Desire," says Elyse. "It was amazing," says Paula. "We felt we were conducting our own informal study on nature versus nurture in a way".

Which raises the question: how do you know that the way you live, and what you accept as normal today, is not actually part of some psychological experiment?

Don't envy the super-rich, this article says; their wealth has almost certainly made them miserable:

According to de Vries, the super-rich are increasingly succumbing to what has been labelled Wealth Fatigue Syndrome (WFS). When money is available in near-limitless quantities, the victim sinks into a kind of inertia.

"The rich are never happy, no matter what they have," he told CNN. "There was this man who owned a 100ft yacht. I said: 'This is a terrific boat.' He said: 'Look down the harbour.' We looked down the marina, and there were boats two and three times as large. He said: 'My 100ft yacht today is like a dinghy compared to these other boats.' When else in history has someone been able to call a 100ft yacht a dinghy?"

Some of our friends have jumped from nice five-bedroom houses in South Kensington to gated mansions in St John's Wood, complete with hot and cold running staff. But many who join the super-rich find it hard to keep their old circles of support. Happiness studies have repeatedly shown that being marginally better off than your neighbours makes you feel good, but being a hundred times richer makes you feel worse. So either you change your friends or live with the envy of others.

The article goes on to expound numerous other causes of wealth-induced misery: social support networks break down, as relationships with old friends are strained by the wealth disparity and poisoned by real or perceived envy; and all the cars, yachts and new houses your money can buy you just become boring much more quickly. (It's the hedonic treadmill effect: as one becomes acclimatised to one's level of comfort and contentment, it takes ever more to not succumb to ennui.) Meanwhile, the wives of the super-rich (and most of the super-rich are men; presumably husbands of super-rich women or gay partners would suffer the same) suffer the same psychological consequences as the unemployed (that is, when they're not traded in for younger, prettier models), and their children, shuttled between nannies and estates, often end up clinically depressed.

The conclusion is that money can buy happiness — but only up to a point. A key component of happiness is social connectedness, of the sort that cannot be bought:

The happiest nations, he says, are those where people feel most equal, even if that means being less wealthy. Pentecost, a tiny island in the South Pacific, has recently been voted the happiest place on earth. They don't have WFS – in fact, they don't have money; they use pigs' horns instead.

In places such as Pentecost, people actually talk to each other – indeed, belonging to a community is one of the single most important prerequisites for happiness.

The bones of Cameron’s argument, set out in The Myth of Mars and Venus, are that Gray et al have no scientific basis for their claims. Great sheaves of academic papers, says Cameron, show that the language skills of men and women are almost identical. Indeed, the central tenets of the Mars and Venus culture – that women talk more than men, that men are more direct, that women are more verbally skilled – can all be debunked by scientific research. A recent study in the American journal Science, for instance, found men and women speak almost exactly the same number of words a day: 16,000.

Where the book becomes interesting is when she asks why we have become interested in these myths. “The first point to make is that in the past 20 years we have become obsessed by communication,” she says. “And that’s not just in relationships; it’s in customer care, it’s in politics. All problems are seen to be communication problems.

Cameron is not simply irritated that the Mars and Venus books have filled too many Christmas stockings. Her fervour on this issue runs deeper. There is, she thinks, something regressive, deeply conservative, in this outlook because what it seems to be saying is that we can’t change.

The author, Deborah Cameron, is a feminist philologist and Rupert Murdoch Professor of Language at Oxford (really); other than the Mars-and-Venus brigade, she has in her sights Darwinists (which, I'm guessing, means the likes of Steven Pinker and/or Richard Dawkins), Tories, man-hating "pseudo-feminists" and punctuation/grammar pedants:

“You had people like Prince Charles and Norman Tebbit inferring that if people were making spelling mistakes it was only a short step to them coming in dirty to school and then there’d be no motivation for them to stay out of crime,” says Cameron. “There were these illogical slippery slope arguments: how, if children didn’t know how to use the colon properly, it was only a few steps from drug-taking and criminality. There was a deep moral and social dimension to it all.

Their experiments showed that the mere thought of one's mortality can trigger a range of emotions--from disdain for other races, religions, and nations, to a preference for charismatic over pragmatic leaders, to a heightened attraction to traditional mores.

To test the hypothesis that recognition of mortality evokes "worldview defense"--their term for the range of emotions, from intolerance to religiosity to a preference for law and order, that they believe thoughts of death can trigger--they assembled 22 Tucson municipal court judges. They told the judges they wanted to test the relationship between personality traits and bail decisions, but, for one group, they inserted in the middle of the personality questionnaire two exercises meant to evoke awareness of their mortality. One asked the judges to "briefly describe the emotions that the thought of your own death arouses in you"; the other required them to "jot down, as specifically as you can, what you think will happen to you physically as you die and once you are physically dead." They then asked the judges to set bail in the hypothetical case of a prostitute whom the prosecutor claimed was a flight risk. The judges who did the mortality exercises set an average bail of $455. The control group that did not do the exercises set it at an average of $50. The psychologists knew they were onto something.

The researchers did other experiments involving priming one group of candidates with the thought of their mortality. In one example, they found that awareness of one's mortality can induce xenophobia and distrust of difference (students at a Christian college who did the exercises had a more negative opinion of an essay they were told was written by a Jewish author than a control group did) and aggressive patriotism (those who did the exercises took a far more negative view of an essay critical of the United States, and also expressed more reverence for national icons).

After 9/11, the researchers did experiments specifically showing that Bush's popularity in the US was enhanced by Americans' awareness of their mortality:

The control group that completed a personality survey, but did not do the mortality exercises, predictably favored Kerry by four to one. But the students who did the mortality exercises favored Bush by more than two to one. This strongly suggested that Bush's popularity was sustained by mortality reminders. The psychologists concluded in a paper published after the election that the government terror warnings, the release of Osama bin Laden's video on October 29, and the Bush campaign's reiteration of the terrorist threat (Cheney on election eve: "If we make the wrong choice, then the danger is that we'll get hit again") were integral to Bush's victory over Kerry. "From a terror management perspective," they wrote, "the United States' electorate was exposed to a wide-ranging multidimensional mortality salience induction."

The induction of mortality salience is also claimed to have been instrumental in popular antagonism towards perceived enemies (including France, Germany and Canada), and in a mass shift towards reactionary conservative positions in defense of tradition and religious dictates, from rising opposition to abortion, gay marriage and liberal attitudes to the rise of the "strict father" model of the family, which, on 10 September 2001, had seemed like a laughable relic of the 1950s:

Indeed, from 2001 to 2004, polls show an increase in opposition to abortion and gay marriage, along with a growing religiosity. According to Gallup, the percentage of voters who believed abortion should be "illegal in all circumstances" rose from 17 percent in 2000 to 20 percent in 2002 and would still be at 19 percent in 2004. Even church attendance by atheists, according to one poll, increased from 3 to 10 percent from August to November 2001.

In the 1980s, some figure associated with the Thatcher government in the UK was quoted as saying that "the facts of life are Conservative". Whether or not that is the case, it seems that the facts of death are.

Iraqis have often been observed weeping and wailing in apparent anguish, but the study offers evidence indicating this may not be exclusively an outward expression of anger or a desire for revenge. It also provocatively suggests that this grief can possess an American-like personal quality, and is not simply a tribal lamentation ritual.

Psychologists and anthropologists have thus far largely discounted the study, claiming it has the same bias as a 1971 Stanford University study that concluded that many Vietnamese showed signs of psychological trauma from nearly a quarter century of continuous war in southeast Asia.

"We are, in truth, still a long way from determining if Iraqis are exhibiting actual, U.S.-grade sadness," Mayo Clinic neuropsychologist Norman Blum said. "At present, we see no reason for the popular press to report on Iraqi emotions as if they are real."

If you want drivers to cut you more slack when you're cycling, however, the advice is to dress as a woman (if you're not one already, that is):

His findings, published in the March 2007 issue of Accident Analysis & Prevention, state that when Walker wore a helmet drivers typically drove an average of 3.35 inches closer to his bike than when his noggin wasn't covered. But, if he wore a wig of long, brown locks -- appearing to be a woman from behind -- he was granted 2.2 inches more room to ride.

The exact reasons for this are not known, though one theory is that it is because the Japanese attempt to suppress their emotions in the presence of others more than the loud, demonstrative gaijin do, and in such cases, the eyes provide more of a clue to someone's emotional state. One consequence of this, of course, is the difference between the way Westerners and Japanese draw happy-face symbols in ASCII characters, with the Japanese smiley looking like ^_^ (note the emphasis on the eyes), and the Western one being the familiar :-):

So when Yuki entered graduate school and began communicating with American scholars over e-mail, he was often confused by their use of emoticons such as smiley faces :) and sad faces, or :(.

"It took some time before I finally understood that they were faces," he wrote in an e-mail. In Japan, emoticons tend to emphasize the eyes, such as the happy face (^_^) and the sad face (;_;). "After seeing the difference between American and Japanese emoticons, it dawned on me that the faces looked exactly like typical American and Japanese smiles," he said.

The company thinks it can glean information about an individual's preferences and personality type by tracking their online behaviour, which could then be sold to advertisers. Details such as whether a person is more likely to be aggressive, hostile or dishonest could be obtained and stored for future use, it says.

The patent says: "User dialogue (eg from role playing games, simulation games, etc) may be used to characterise the user (eg literate, profane, blunt or polite, quiet etc). Also, user play may be used to characterise the user (eg cautious, risk-taker, aggressive, non-confrontational, stealthy, honest, cooperative, uncooperative, etc)."

Players who spend a lot of time exploring "may be interested in vacations, so the system may show ads for vacations". And those who spend more time talking to other characters will see adverts for mobile phones.

Not all the inferences made by monitoring user activity rely on subtle psychological clues, however. "In a car racing game, after a user crashes his Honda Civic, an announcer could be used to advertise by saying 'if he had a Hummer, he would have gotten the better of that altercation', etc," the patent says. And: "If the user has been playing for over two hours continuously, the system may display ads for Pizza Hut, Coke, coffee."

Data collection in 1984 was deliberate; today's is inadvertent. In the information society, we generate data naturally. In Orwell's world, people were naturally anonymous; today, we leave digital footprints everywhere.

1984's Big Brother was run by the state; today's Big Brother is market driven. Data brokers like ChoicePoint and credit bureaus like Experian aren't trying to build a police state; they're just trying to turn a profit. Of course these companies will take advantage of a national ID; they'd be stupid not to. And the correlations, data mining and precise categorizing they can do is why the U.S. government buys commercial data from them.

And finally, the police state of 1984 was deliberately constructed, while today's is naturally emergent. There's no reason to postulate a malicious police force and a government trying to subvert our freedoms. Computerized processes naturally throw off personalized data; companies save it for marketing purposes, and even the most well-intentioned law enforcement agency will make use of it.

A funny thing happened during a recent test of a military mine-disposal robot:

At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.

Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.

The human in command of the exercise, however -- an Army colonel -- blew a fuse.

The colonel ordered the test stopped.

Why? asked Tilden. What's wrong?

The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.

This test, he charged, was inhumane.

This is not the only incident of a curious camaraderie developing between soldiers and robots; the article describes other stories of US troops in Afghanistan and Iraq (where robots are being widely deployed) befriending their mechanical compatriots, ascribing quirks of individual robots to personalities, giving them names and (virtual) battlefield honours, and even going fishing with their robot buddies.

Which could be more evidence that the human mind automatically perceives anything whose actions show signs of intention as having psychological states. An example (from the book Mind Hacks) describes children being shown a display of two shapes moving on a screen, one after the other; when asked what was happening, they described one shape as chasing the other. Could it be that when a machine tries to defuse a bomb, its operators, even though they know it is a machine, can't help but mentally classify it as being "alive"?

And very likely the phenomena of this early "epidemic of depression" and the suppression of communal rituals and festivities are entangled in various ways. It could be, for example, that, as a result of their illness, depressed individuals lost their taste for communal festivities and even came to view them with revulsion. But there are other possibilities. First, that both the rise of depression and the decline of festivities are symptomatic of some deeper, underlying psychological change, which began about 400 years ago and persists, in some form, in our own time. The second, more intriguing possibility is that the disappearance of traditional festivities was itself a factor contributing to depression.

One approaches the subject of "deeper, underlying psychological change" with some trepidation, but fortunately, in this case, many respected scholars have already visited this difficult terrain. "Historians of European culture are in substantial agreement," Lionel Trilling wrote in 1972, "that in the late 16th and early 17th centuries, something like a mutation in human nature took place." This change has been called the rise of subjectivity or the discovery of the inner self and since it can be assumed that all people, in all historical periods, have some sense of selfhood and capacity for subjective reflection, we are really talking about an intensification, and a fairly drastic one, of the universal human capacity to face the world as an autonomous "I", separate from, and largely distrustful of, "them".

But the new kind of personality that arose in 16th- and 17th-century Europe was by no means as autonomous and self-defining as claimed. For far from being detached from the immediate human environment, the newly self-centered individual is continually preoccupied with judging the expectations of others and his or her own success in meeting them: "How am I doing?" this supposedly autonomous "self" wants to know. "What kind of an impression am I making?"

If this hypothesis is correct, then the epidemic of depression and mental illness that began in the 1600s (which Ehrenreich provides supporting evidence for, in historical records) is a side-effect of a step in the evolution of human psychology that began at around that time, with the pressures of communication, trade and social organisation dragging the human mind kicking and screaming from a sleepy collective life to a more dynamic way of living. In this case, a lot of the anxiety, angst and low-level distress people feel routinely is not a result of human nature, but rather human nature reacting against "unnatural" circumstances. Small wonder that many have sought relief in an annihilation of the self, from hippie communes to Communist utopias, from meditation to severe religious submission, from the Arcadian pastoral utopias throughout art (Tolkien, William Morris and the Arcade Fire to name three examples off the top of my head) to the transcendental nihilism of drugs (take, for example, Lou Reed wishing he had been born "a thousand years ago" in Heroin).

So where does that leave us? Perhaps, given enough time (hundreds if not thousands of years), human psychology will evolve in depression-resistant directions, assuming that some kind of technological catastrophe doesn't cut the process short. Genetic evolution is slow, but cultural evolution is faster, and it could be argued that our technologies and cultural institutions are part of the "extended phenotype" of humanity, and that the invention of antidepressant drugs is an adaptation to these changes in our environment. It's a crude, reactionary adaptation, merely treating the symptoms, though there is hope on the horizon. There has recently been a lot of focus on the study of the psychology of happiness, and on what factors make for environments conducive to sustainable happiness. With any luck, this will lead to improvements in areas from urban planning to social policy to economics.

Then again, if the hypothesis is true, would it be possible to somehow get the best of both worlds? Could one have the happy, fulfilling collective connectedness people (allegedly) had before the 16th century, whilst retaining the gains made since then? Or is the very presence of subjective thought, the demarcation between the self and the collective, poisonous?

(On the other hand, L. Ron Hubbard claims that depression comes from humanity's early ancestor, the clam, and the tension between the desire to open and close its hinge.)

After Stephen Fry commented that British actors have an unfair advantage in America because Americans mistake British accents for brilliance, the BBC has published a piece on what a British accent gets you in the US. (And, apparently, a "British accent" includes anything from Hugh Grant plumminess to deepest darkest Geordie.)

"For most Americans, there's no distinction between British accents. For us, there's just one sort of British accent, and it's better than any American accent - more educated, more genteel," says Rosina Lippi-Green, a US academic and author of English with an Accent: Language, Ideology and Discrimination in the United States.

"There was a sitcom called Dead Like Me with a Brit [Callum Blue] in it. He was a scruffy, 20-something drug dealer. Even he had that sort of patina - his was not an RP accent, it was a working class London accent."

Katharine Jones, author of Accent of Privilege: English Identities and Anglophilia in the US, says the "educated and cultured" associations have a long history. "British etiquette books have been used for years; and although Americans say they have no class system, they do - and the American upper class apes the British upper class."

Another point the article makes: British expatriates in Australia (where their accent is associated with complaining and being bad at cricket, and/or where refinement and intelligence have traditionally been associated with weakness and/or metaphorical or literal homosexuality rather than any positive attributes) tend to lose their accents pretty quickly, whereas those in the US (where their accents make them appear intelligent and sophisticated, and often get them preferential treatment) retain theirs. Funny, that.

While it may seem that we live in an age of unprecedented violence and atrocity, according to Steven Pinker, violence has been steadily declining over the past few centuries, and we are now living at the most peaceful time in the history of humanity so far.

The decline of violence, he tells us, is a fractal phenomenon - we see it over the centuries, the decades and the years. That said, we see a tipping point in the 16th century - the age of reason - particularly in England and Holland.

One on one death has plummeted through the middle ages, with an "elbow" of the curve in the 16th century. Despite a slight uptick in the 1960s - "perhaps those who thought that rock and roll would lead to a decline in moral values had it right" we've seen two orders of magnitude fall in one on one violence from the middle ages to today. State sponsored violence has also fallen sharply - we've seen a 90% reduction in genocide since the end of the cold war. State on state conflicts are dropping every decade.

Pinker then calls bullshit on the Rousseauvian "noble savage" myth that, in some state of long-lost primordial innocence, our distant ancestors lived in blessed harmony with one another, and that ills such as warfare and violence are the result of the noxious effects of language/capitalism/agriculture/urbanisation.

Until 10,000 years ago, all humans were hunter gatherers. This is the group that some believe lived in primordial harmony - there's no evidence of this. Studying current hunter-gatherer tribes, the percent of male adults who die in violence is extraordinary - from 20 to 60% of all males. Even during the violent 20th century, with two world wars, less than 2% of males worldwide died in warfare.

The Middle Ages were filled with mutilation and torture as routine punishments for transgressions we'd punish with fines today. This was merely another charming feature of a time that featured pastimes like "cat burning", dropping cats into a fire for entertainment purposes. Some of the most creative inventions of the Middle Ages were fantastically cruel forms of corporal punishment.

Pinker offers several reasons for the illusion that violence is increasing and the past was more idyllic: improved communications (we have more awareness of acts of violence, petty and enormous, than people had in earlier centuries), the cognitive illusion that makes memorable events (which include acts of spectacular brutality) seem more common, and the fact that popular standards of what's acceptable are changing faster than behaviour actually is. He also offers four explanations for why violence is becoming less common: the Hobbesian hypothesis (that states with monopolies on violence reduce it), a decline in the belief that life is cheap, the rise of more non-zero-sum games such as international trade, which make potential rivals more valuable alive than dead, and the hypothesis of the "expanding circle":

By default, we empathize with a small group of people, our friends and family. Everyone else is subhuman. But over time, we've seen this circle expand, from village to clan to tribe to nation to other races, both sexes and eventually other species. As we learn to expand our circles wider and wider, perhaps violence becomes increasingly unacceptable.

Take time to talk - have an hour-long conversation with a loved one each week

Phone a friend whom you have not spoken to for a while and arrange to meet up

Give yourself a treat every day and take the time to really enjoy it

Have a good laugh at least once a day

Get physical - exercise for half an hour three times a week

Smile at and/or say hello to a stranger at least once each day

Cut your TV viewing by half

Spread some kindness - do a good turn for someone every day

The team, which consists of a psychologist, a psychotherapist, two "workplace specialists", a "social entrepreneur" and "Richard Reeves, whose expertise spans philosophy, public policy and economics", has been given the task of increasing the levels of happiness in Slough, a town whose name has stood as a byword for post-industrial alienation and the dehumanising effects of modernity since John Betjeman penned his famous ode to the place in 1937.

Culture-bound syndrome of the day: "Paris Syndrome". This is a condition affecting Japanese tourists who travel to Paris, romantic scenes from Amélie in their minds, only to discover that the city is considerably dirtier and—shock, horror—full of very rude people. This shock can cause a psychiatric breakdown:

An encounter with a rude taxi driver, or a Parisian waiter who shouts at customers who cannot speak fluent French, might be laughed off by those from other Western cultures.

But for the Japanese - used to a more polite and helpful society in which voices are rarely raised in anger - the experience of their dream city turning into a nightmare can simply be too much.

This year alone, the Japanese embassy in Paris has had to repatriate four people with a doctor or nurse on board the plane to help them get over the shock.

As many as 12 Japanese tourists fall victim to Paris Syndrome each year. The Japanese embassy has established a 24-hour hotline to help those afflicted.

Psychologists Sara Gutierres, Ph.D., and Douglas Kenrick, Ph.D., both of Arizona State University, demonstrated that the contrast effect operates powerfully in the sphere of person-to-person attraction as well. In a series of studies over the past two decades, they have shown that, more than any of us might suspect, judgments of attractiveness (of ourselves and of others) depend on the situation in which we find ourselves. For example, a woman of average attractiveness seems a lot less attractive than she actually is if a viewer has first seen a highly attractive woman. If a man is talking to a beautiful female at a cocktail party and is then joined by a less attractive one, the second woman will seem relatively unattractive.

The strange thing is, being bombarded with visions of beautiful women (or for women, socially powerful men) doesn't make us think our partners are less physically attractive. It doesn't change our perception of our partner. Instead, by some sleight of mind, it distorts our idea of the pool of possibilities.

Our minds have not caught up. They haven't evolved to correct for MTV. "Our research suggests that our brains don't discount the women on the cover of Cosmo even when subjects know these women are models. Subjects judge an average attractive woman as less desirable as a date after just having seen models," Kenrick says.

So the women men count as possibilities are not real possibilities for most of them. That leads to a lot of guys sitting at home alone with their fantasies of unobtainable supermodels, stuck in a secret, sorry state that makes them unable to access real love for real women. Or, as Kenrick finds, a lot of guys on college campuses whining, "There are no attractive women to date."

This effect apparently manifests itself in higher rates of divorce and persistent singleness: people are exposed to quantities of images of attractiveness their brains are not evolutionarily adapted to handle, and so develop dissatisfaction with actual potential partners.

Mind you, this article is rather male-centric (it's partly a survey of studies, and partly a lament from the head of a Los Angeles PR agency, kvetching bitterly about all the unfeasibly gorgeous women he is surrounded by and how their presence is making his life a hell), and doesn't cover the female perspective; i.e., whether women are bombarded with images of unfeasibly attractive potential male partners, and whether this causes them to feel dissatisfied with actual partners (or potential partners) to the same extent.

Cognitive models developed by my advisor Gail Carpenter suggest that a more effective way to evaluate an intuition is to consider its mnemonic associations. If you can mentally trace some of the cognitive links of an intuition (through a process similar to priming), these links may suggest whether the intuition is meaningfully connected to the correct answer or whether the link is trivial, incidental, or wrong. For example, given the question "Bucharest is the capital of what European country?", you might have an intuition that the answer is Hungary, because the actual capital of Hungary--Budapest--sounds like "Bucharest" and is thus unconsciously linked. In this case, naively following your unexamined intuition would lead you away from the correct response: Romania.

My $250,000 question presented me with a case of pure intuition. "The department store Sears got its start by selling what specific product in its first catalog?" Since pop culture esoterica and business origins are outside my domains of interest, I did not know the answer. But for some reason, even before the four possible answers appeared, I thought of watches. When "watches" turned up as one of the choices, I reflected on it further. I did not feel any certainty. But why did my brain come up with "watches"? ... As I concentrated on my watch intuition, I began to think about railroads. My brain's memory pattern of watches was somehow linked to a memory pattern of railroads, and my railroad memory also evoked a memory of Sears. Though I still could not work out the explicit connection between watches and Sears, I satisfied myself that "watches" had some deep mnemonic relationship to both railroads and Sears--perhaps at some point in my life I had read that Sears originally delivered their watch catalogs by railroad?

Later, in the tranquility of my apartment, I discovered that 23-year-old railroad station agent Richard Sears sold watches to other station agents along the Minneapolis and St. Louis Railway for a full year before meeting up with Alvah C. Roebuck. I never did discover how this obscure factoid had left its faint trace upon my brain.

In researching his book, Dr. Rosenblatt said even though many couples said they slept better alone, they still shared a bed. "When I asked why, they looked at me as if I'd asked them why they keep breathing," he said.

The subjects he interviewed invariably had their own side of the bed, and responsibilities like putting out the cat or opening the windows before turning in. They usually had rituals like watching the television news before lights out or snuggling before falling to sleep. And they often had signals for when they wanted affection, wanted to talk or wanted to be left alone.

"How they arrived at these systems could be said to mirror their relationships," said Dr. Rosenblatt. The most successful systems were those formed out of compromise and sensitivity to the other's needs.

Jerry Seinfeld compares telling a joke to attempting to leap a metaphorical canyon, taking the audience with him. The set-up is the nearside cliff, and the punchline is the far side. If they're too far apart, the listeners don't make it to the other side. And if they are too close together, the audience just steps across the gap without experiencing any exhilarating leap. The joke-hearer gets far more pleasure from the joke if he or she has to do a little work.

The surprise mechanism doesn't work without effective timing. It's almost impossible to explain in print because our eyes always skip ahead to the punchline before the set-up is properly digested. But next time you listen to a comedian, listen to the pauses. They're not that funny on their own - obviously, they're just tiny silences - but the point is, neither are the jokes.

It's not that long, wordy jokes can't be funny, but if too much is explained, there's no logical leap for the audience to make, and the paradigm shift which elicits laughter is lost.

Compare: I'm not a homosexual. Mind you, I might be mistaken for one if I went to the north of England. In places like Newcastle, there's such a culture of macho posturing that they go out in their shirtsleeves in all kinds of weather, so if you wear a coat they think you're gay.
And: I'm not gay. Unless you're from Newcastle and by 'gay' you mean, 'owns a coat'.

In RPS circles a common mantra is "Rock is for Rookies", because males have a tendency to lead with Rock on their opening throw. It has a lot to do with the idea that Rock is perceived as "strong" and "forceful", so guys tend to fall back on it. Use this knowledge to take an easy first win by playing Paper. This tactic is best done in pedestrian matches against someone who doesn't play that much and generally won't work in tournament play.

When playing with someone who is not experienced at RPS, look out for double runs, or in other words, the same throw twice. When this happens you can safely eliminate that throw and guarantee yourself at worst a stalemate in the next game. So, when you see a two-Scissors run, you know their next move will be Rock or Paper, so Paper is your best move. Why does this work? People hate being predictable, and the perceived hallmark of predictability is to come out with the same throw three times in a row.

When playing against someone who asks you to remind them about the rules, take the opportunity to subtly "suggest a throw" as you explain, by physically showing them the throw you want them to play; i.e., "Paper beats Rock, Rock beats Scissors (show Scissors), Scissors (show Scissors again) beats Paper." Believe it or not, when people are not paying attention their subconscious mind will often accept your "suggestion". A very similar technique is used by magicians to get someone to take a specific card from the deck.
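The double-run heuristic above is mechanical enough to sketch in a few lines of Python (the function name and the "Rock is for Rookies" fallback are my own illustration, not anything from the RPS literature):

```python
# Which throw each sign beats.
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def double_run_response(history):
    """Suggest a throw given the opponent's history of throws.

    If the opponent's last two throws are identical, assume they won't
    repeat a third time; playing the sign their repeated throw beats
    then wins against one remaining option and ties the other, i.e. a
    guaranteed at-worst stalemate. Otherwise fall back to "paper",
    exploiting the "Rock is for Rookies" opening tendency.
    """
    if len(history) >= 2 and history[-1] == history[-2]:
        return BEATS[history[-1]]
    return "paper"
```

For example, after a two-Scissors run the function returns "paper", which beats the opponent's Rock and ties their Paper, exactly as the post describes.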

Organisers have suggested people instead turn up in "bunny ears, a Spam tin outfit, an astronaut's helmet, a witch's hat or just a plain old Buzz Lightyear lycra all-in-one" for the concerts which include performances by the Pet Shop Boys and the Scissor Sisters.

In another experiment, students who'd just come back from their break were polled. The ones who'd partied it up regretted their actions -- while those who studied were virtuously smug. But when asked to recall the spring break from the previous year, suddenly more students regretted their choice not to party. When alumni were asked to recall their spring breaks of 40 years ago, the results were even starker: Those who hadn't been doing beer shots out of a barber's chair were stricken with remorse.

Why the reversal? Why do we opt for virtue in the short term, but prefer vice in the long term? The reason, the researchers suggest, is in the mechanics of guilt: it's an intense and painful emotion in the here and now, but fades over time. As they write:

Whereas guilt is an acute, hot emotion, missing out is a colder, contemplative feeling. Therefore, indulgence guilt is expected to predominate in the temporal proximity of the relevant self-control choice, but subsequently diminish over time.

Now there's a conclusion that will deeply freak out social conservatives.

The study was done in America in 2006; I wonder whether a similar study done in a culture (say, Japan) with a stronger sense of duty would yield different results.

More evidence of neoteny being a characteristic of evolutionary advancement: as coping with the modern world requires more flexibility, immaturity levels in adults are rising. Which sounds alarming, until you consider that "maturity" (and the nebulous "wisdom" that comes with it) is a sclerotic, set-in-one's-ways inflexibility and resistance to change, which no longer cuts it:

"The psychological neoteny effect of formal education is an accidental by-product -- the main role of education is to increase general, abstract intelligence and prepare for economic activity," he explained. "But formal education requires a child-like stance of receptivity to new learning, and cognitive flexibility."

"When formal education continues into the early twenties," he continued, "it probably, to an extent, counteracts the attainment of psychological maturity, which would otherwise occur at about this age.

While the human mind responds to new information over the course of any individual's lifetime, Charlton argues that past physical environments were more stable and allowed for a state of psychological maturity. In hunter-gatherer societies, that maturity was probably achieved during a person's late teens or early twenties, he said.

"By contrast, many modern adults fail to attain this maturity, and such failure is common and indeed characteristic of highly educated and, on the whole, effective and socially valuable people," he said.

Some of the symptoms of neoteny include novelty-seeking, which ties in with the possibility of a "neophilia gene" previously mentioned here. In fact, if there was a genetic mutation that caused neophilia, the abovementioned article suggests that, in today's environment, it would be strongly selected for.

More on the subject of happiness and its exact nature: Edge.org talks to Daniel Gilbert, a researcher on the subject (he is director of Harvard's Hedonic Psychology Laboratory):

My research with Tim Wilson shows that when people try to simulate future events -- and to simulate their emotional reactions to those events -- they make systematic errors. Modern people take the ability to imagine the future for granted, but it turns out that this is one of our species' most recently acquired abilities -- no more than three million years old. The part of our brain that enables us to simulate the future is one of nature's newest inventions, so it isn't surprising that when we try to use this new ability to imagine our futures, we make some rookie errors. The main error, of course, is that we vastly overestimate the hedonic consequences of any event. Neither positive nor negative events hit us as hard or for as long as we anticipate.

We're all told that variety is the spice of life. But variety is not just over-rated, it may actually have a cost. Research shows that people do tend to seek more variety than they should. We all think we should try a different doughnut every time we go to the shop, but the fact is that people are measurably happier when they have their favorite on every visit -- provided the visits are sufficiently separated in time.

Those last four words are the important ones. If you had to eat 4 donuts in rapid succession, variety would indeed spice up your experience and you'd be wise to seek it. But if you had to eat 4 donuts on 4 separate Mondays, variety would lower your overall enjoyment. The human brain has tremendous difficulty reasoning about time, and thus we tend to seek variety whether the doughnuts are separated by minutes or months.

Even in a technologically sophisticated society, some people retain the romantic notion that human unhappiness results from the loss of our primal innocence. I think that's nonsense. Every generation has the illusion that things were easier and better in a simpler past, but the fact is that things are easier and better today than at any time in human history.

Our primal innocence is what keeps us whacking each other over the head with sticks, and it is not what allows us to paint a Mona Lisa or design a space shuttle. It gives rise to obesity and global warming, not Miles Davis or the Magna Carta. If humankind flourishes rather than flounders over the next thousand years, it will be because we embraced learning and reason, and not because we surrendered to some fantasy about returning to an ancient Eden that never really was.

Researchers created an artificial "music market" of 14,341 participants drawn from a teen-interest Web site. Upon entering the study's Internet market, the participants were randomly, and unknowingly, assigned to either an "independent" group or a "social influence" group. Participants could then browse through a collection of unknown songs by unknown bands.

In the independent condition, participants chose which songs to listen to based solely on the names of the bands and their songs. While listening to the song, they were asked to rate it from one star ("I hate it") to five stars ("I love it"). They were also given the option of downloading the song for keeps.

In the social influence group, participants were provided with the same song list, but could also see how many times each song had been downloaded.

Researchers found that, in the social influence group, songs that became popular stayed popular and unpopular songs stayed unpopular, largely regardless of the quality established by the independent group's ratings. They also found that as a particular song's popularity increased, participants selected it more often.

So what drives participants to choose low-quality songs over high-quality ones?
"People are faced with too many options, in this case 48 songs. Since you can't listen to all of them, a natural shortcut is to listen to what other people are listening to," Salganik said. "I think that's what happens in the real world where there's a tremendous overload of songs."

Which certainly explains [insert massively popular yet mediocre artist in the genre of your choice here].
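The rich-get-richer feedback loop Salganik describes can be illustrated with a toy simulation (entirely my own sketch, not the study's actual design): each new listener picks a song with probability proportional to its visible download count plus a small intrinsic-quality term, so early random luck snowballs into large popularity differences.

```python
import random

def simulate_market(n_songs=48, n_listeners=10_000, seed=0):
    """Toy cumulative-advantage market: listeners weight their choice
    by current download counts, so popularity is self-reinforcing."""
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(n_songs)]   # intrinsic appeal
    downloads = [1] * n_songs                          # visible counts
    for _ in range(n_listeners):
        # Choice probability ~ downloads + quality: mostly social proof,
        # with only a small nudge from how good the song actually is.
        weights = [d + q for d, q in zip(downloads, quality)]
        choice = rng.choices(range(n_songs), weights=weights)[0]
        downloads[choice] += 1
    return quality, downloads
```

Running this, the most-downloaded song ends up far above the average, and which song wins depends heavily on the random seed rather than on the quality scores -- the "tremendous overload of songs" shortcut in miniature.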

You go to the supermarket and stop by some shelves offering French and German wine. You buy a bottle of French wine. After going through the checkout you are asked what made you choose that bottle of wine. You say something like "It was the right price", or "I liked the label". Did you notice the French music playing as you took it off the shelf? You probably did. Did it affect your choice of wine? No, you say, it didn't.

That's funny, because on the days we play French music nearly 80% of people buying wine from those shelves choose French wine, and on the days we play German music the opposite happens.

The study in question used stereotypical examples of national music (French accordion music and German "oom-pah" band music), yielding the results mentioned, and is effective primarily due to its subtlety. It would not be enough to make someone not intending to buy wine buy some, but is enough to influence the choice of wine.

What would be the effect, I wonder, of having someone stand by the shelves saying to the customers as they passed "Why don't you buy a French wine today"? My hunch is that you'd make people think about their decision a lot more - just by trying to persuade them you'd turn the decision from a low involvement one into a high involvement one. People would start to discount your suggestion. But the suggestion made by the music doesn't trigger any kind of monitoring. Instead, the authors of this study believe, it triggers memories associated with the music - preferences and frames of reference. Simply put, hearing the French music activates ideas of 'Frenchness' - maybe making customers remember how much they like French wine, or how much they enjoyed their last trip to France. For a decision which people aren't very involved with, with low costs either way (both the French and German wines are pretty similar, remember, except for their nationality) this is enough to swing the choice.

John Birmingham puts forward the case that the political right pretty much has a monopoly on humour, with the left having become too puritanical and politically correct to laugh, with the voices that dare to be outrageous being predominantly right-wing, from shock-jocks and reactionary bloggers to institutions like VICE Magazine (infamously offending the uptight by pejoratively calling things "gay") and the creators of South Park and Team America (who skewered Hollywood liberals and left-wing sanctimony alike).

Of course, this relies on a rather broad definition of "right-wing", as anything that goes against a doctrinaire liberal/progressive view of propriety and "political correctness". By this token, one would classify Coco Rosie as a right-wing band, placing them in the same ideological milieu as Pat Robertson and Little Green Footballs, because one of their number attended "Kill Whitey" parties. And while VICE's Gavin McInnes claimed in American Conservative to represent a hip new conservatism (a view he later retracted, claiming he was joking/being ironic), the cocaine-snorting, nihilistic libertinism epitomised in the magazine, as much as it may offend "liberals" (or straw-man caricatures thereof), hardly fits well with the canon of conservatism and its emphasis on values, tradition and authority. However, it does fit in with the recently noted shift towards Hobbesian nihilism and radical individualism.

On a tangent: some American conservatives are concerned about FOXNews' alarming slide to the radical left; the channel, once the shining beacon of all things Right-thinking, has been compromising its Fair And Balanced™ reputation by running programmes on topics such as global warming. Pundits blame the influx of liberally-inclined ex-CNN reporters, the staffers having spent too long in Godless New York, away from the Biblical certainties of the Red States, or Murdoch not really being "One Of Us", but rather a cynical opportunist.

"We did not see any increased activation of the parts of the brain normally engaged during reasoning," said Drew Westen, director of clinical psychology at Emory University. "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts."

The test subjects on both sides of the political aisle reached totally biased conclusions by ignoring information that could not rationally be discounted, Westen and his colleagues say. Then, with their minds made up, brain activity ceased in the areas that deal with negative emotions such as disgust. But activity spiked in the circuits involved in reward, a response similar to what addicts experience when they get a fix, Westen explained.

The New York Times has a long and interesting article on the Japanese phenomenon of hikikomori, or of young Japanese dropping out of society and shutting themselves in their rooms for months at a time, emerging only to go to convenience stores at night or not at all:

A leading psychiatrist claims that one million Japanese are hikikomori, which, if true, translates into roughly 1 percent of the population. Even other experts' more conservative estimates, ranging between 100,000 and 320,000 sufferers, are alarming, given how dire the consequences may be. As a hikikomori ages, the odds that he'll re-enter the world decline. Indeed, some experts predict that most hikikomori who are withdrawn for a year or more may never fully recover. That means that even if they emerge from their rooms, they either won't get a full-time job or won't be involved in a long-term relationship. And some will never leave home. In many cases, their parents are now approaching retirement, and once they die, the fate of the shut-ins - whose social and work skills, if they ever existed, will have atrophied - is an open question.

In other societies the response from many youths would be different. If they didn't fit into the mainstream, they might join a gang or become a Goth or be part of some other subculture. But in Japan, where uniformity is still prized and reputations and outward appearances are paramount, rebellion comes in muted forms, like hikikomori. Any urge a hikikomori might have to venture into the world to have a romantic relationship or sex, for instance, is overridden by his self-loathing and the need to shut his door so that his failures, real or perceived, will be cloaked from the world.

By Japanese standards, his room was enormous, with a wall of delicate shoji screens leading to a rock garden. But it was hard to imagine what he did there all day. There were no stacks of manga, the popular Japanese comic books, no DVD's, no computer games, all things found in the rooms of most hikikomori. The TV was broken, and the hard drive was missing from his computer. There were a few papers on his desk, including a newsletter from New Start that Kawakami brought on her last visit. Otherwise, the only evidence that this was a hikikomori's room were three holes in the wall - the size of fists. Shut-ins often describe punching their walls in a fit of anger or frustration at their parents or at their own lives. The holes were suggestive too of the practice of "cutting" among American adolescent girls. Both acts seemed to be attempts to infuse feeling into a numb life.

By the time parents seek help, often their child has been shut in for a year or more. "When they call," Dr. Saito said, "I offer them three choices: 1) Come to me for counseling; 2) Kick your child out; 3) Accept your child's state and be prepared to take care of him for the rest of your life. They choose Option 1." He also offers poignantly simple parenting tips, like not leaving dinner at a child's doorstep. "You make dinner and call him to the table, and if he doesn't come then let him fend for himself." In addition to meals, parents often provide monetary allowances for their adult child, and in rare cases, if a child has become verbally or physically abusive, parents move out, leaving their home to the shut-in.

Parents of hikikomori now have support programmes to turn to, including volunteers known as "rental sisters", who try to befriend their children and coax them out of their rooms and into support centres, often over months or years.

There are multiple theories trying to explain the hikikomori phenomenon, but several frame it as a conscious rejection of the high pressure to conform and succeed placed on individuals in Japanese society; a conscious, if not particularly sustainable, decision to drop out of the traditional school-university-work career path.

A US professor of psychology has found the difference between English and American smiles. Apparently, Americans smile naturally and warmly, whereas an English smile is a suppressed grimace, or a signal of acquiescence to hierarchy (which, apparently, is internalised in the English national character):

Keltner hit upon this difference in national smiles by accident. He was studying teasing in American fraternity houses and found that low-status frat members, when they were teased, smiled using the risorius muscle - a facial muscle that pulls the lips sideways - as well as the zygomatic major, which lifts up the lips. It resulted in a sickly smile that said, in effect, I understand you must paddle me, brother, but not too hard, please. Several years later, Keltner went to England on sabbatical and noticed that the English had a peculiar deferential smile that reminded him of those he had seen among the junior American frat members. Like the frat brothers', the English smile telegraphed an acknowledgment of hierarchy rather than just expressing pleasure.

A hoaxer in the US Midwest has reprised the Milgram obedience experiments by calling fast-food restaurants posing as a police officer and instructing managers to strip-search employees, subjecting them to bizarre and degrading ordeals. The managers in question, being selected for unthinking obedience, never realised that anything was wrong, accepting "Officer Scott"'s authoritative tone of voice, stated reasons and the sounds of police radios in the background as sufficient reason to start obeying, and the fact that they were already obeying as sufficient reason to keep doing so, up to committing rape.

On May 29, 2002, a girl celebrating her 18th birthday -- in her first hour of her first day on the job at the McDonald's in Roosevelt, Iowa -- was forced to strip, jog naked and assume a series of embarrassing poses, all at the direction of a caller on the phone, according to court and news accounts.

He had mastered the police officer's calm but authoritative demeanor. He sprinkled law-enforcement jargon into every conversation. And he did his homework. He researched the names of regional managers and local police officers in advance, and mentioned them by name to bolster his credibility. He called some restaurants in advance, somehow getting names and descriptions of victims so he could accurately describe them later.

In her book, "Making Fast Food: From the Frying Pan into the Fryer," Canadian sociologist Ester Reiter concludes that the most prized trait in fast-food workers is obedience. "The assembly-line process very deliberately tries to take away any thought or discretion from workers," said Reiter, who teaches at Toronto's York University and who spent 10 months working at a Burger King as part of her research. "They are appendages to the machine."

Several people who followed orders were jailed for rape and related crimes. The hoaxer was later found to be a 38-year-old prison guard with a fantasy of being a police officer. Meanwhile, one of the victims is suing McDonald's for allowing this to happen; McDonald's, in turn, blames her for not reading the employee manual (which stated that strip searches were prohibited) and for not recognising that the caller wasn't a real police officer.

Robert Marbury, an artist who photographed dozens of Manhattan bumper fauna for a project in 2000 (see urbanbeast.com/faq/strapped.html), said he had once asked a trash hauler why he had a family of three mismatched bears strapped to his rig. "He said: 'Yo, man, I drive a garbage truck. How am I going to get the ladies to look at me?' " Mr. Marbury recalled.

Monroe Denton, a lecturer in art history at the School of Visual Arts, traced the phenomenon's roots back to the figureheads that have animated bows of ships since the time of the pharaohs. "There was some sort of heraldic device to deny the fact of this gigantic machine," he said. "You would have these humanizing forms, anthropomorphic forms - a device that both proclaims the identity of the machine and conceals it."

"There's a transference in this," she said. "There's this soft, flesh-and-bone sanitation worker, who knows very well they could be crushed against this truck. The creature could be the sanitation worker in a very dangerous position, so the animal could be a stand-in."

"Binding a soft thing to a very powerful truck - there's a kind of macho thing about that," she said.

Scooby's story lends credence to the theory of Mr. Denton, the art historian, that the grille-mounted stuffed animal draws from the same well as the "abject art" movement that flourished in the 1990's and trafficked heavily in images of filth and of distressed bodies.

In 1950, a book titled The Authoritarian Personality posited that, far from being alien, fascist tendencies were commonplace in American society. The book is best known for the "F Scale", a test of how inclined one would be, should the opportunity present itself, to don the jackboots of authoritarianism. The test consisted of a number of multiple-choice questions, with the answers added to give a score; in that sense, it's an ancestor of numerous OKCupid tests and LiveJournal memes.

The F Scale is not perfect: for one, it focuses almost exclusively on a strain of traditionalist, right-wing authoritarianism, ignoring other strains, such as Soviet-style social engineering. (This could be because one of the authors was the famous Marxist critical theorist Theodor Adorno, and/or because authoritarian utopianism à la Lenin never had more than niche popularity in the US, where the research was carried out.) However, according to this article, it's more relevant today than it was when it was written:

In the June 19, 2005, issue of The New York Times Magazine, the journalist Russell Shorto interviewed activists against gay marriage and concluded that they were motivated not by a defense of traditional marriage, but by hatred of homosexuality itself. "Their passion," Shorto wrote, "comes from their conviction that homosexuality is a sin, is immoral, harms children and spreads disease. Not only that, but they see homosexuality itself as a kind of disease, one that afflicts not only individuals but also society at large and that shares one of the prominent features of a disease: It seeks to spread itself." It is not difficult to conclude where those people would have stood on the F scale.

Consider the case of John R. Bolton, now our ambassador to the United Nations. While testifying about Bolton's often contentious personality, Carl Ford Jr., a former head of intelligence within the U.S. State Department, called him "a quintessential kiss-up, kick-down sort of guy." Surely, in one pithy sentence, that perfectly summarizes the characteristics of those who identify with strength and disparage weakness. Everything Americans have learned about Bolton -- his temper tantrums, intolerance of dissent, and black-and-white view of the world -- steps right out of the clinical material assembled by the authors of The Authoritarian Personality.

One item on the F scale, in particular, seems to capture in just a few words the way that many Christian-right politicians view the world in an age of terror: "Too many people today are living in an unnatural, soft way; we should return to the fundamentals, to a more red-blooded, active way of life."

Dr Hyde said gender differences accounted for no effect, or only a very small one, in most of the psychological variables examined. She said only throwing distance and physical aggression showed marked gender differences.

It turns out that there are stereotypical male and female behaviours -- but they disappear as soon as the actor is not identified by sex:

Dr Hyde highlighted one study in which participants were not identified as male or female and wore no identification; given the opportunity to act aggressively, neither sex conformed to its stereotyped image.

They actually did the opposite to what was expected - they did not stick to the stereotype of aggressive males and passive females.

The Huffington Post has some speculations on why so disproportionately many arrested paedophiles are found to be hardcore Star Trek fans (as documented here and here):

Star Trek paraphernalia has so routinely been found at the homes of the pedophiles they've arrested that it has become a gruesome joke in the squad room. (On the wall, there is a Star Trek poster with the detectives' faces replacing those of the crew members).

Several theories are given, including Captain Kirk's inability to hold down adult romantic relationships, a pervasive message that women are toxic (which, presumably, would strike a chord with men rejecting adult heterosexual relationships; incidentally, does Star Trek have a significant gay following?), and "bad" impulses being attributed to external forces (a cop-out familiar to child molesters). But most controversial could be the claim that Star Trek's very utopian basis is fundamentally inclined towards appealing to paedophiles:

In perversion, there is an attempt to obliterate any distinctions that provoke unconscious anxiety. First and foremost, this entails a denial of the difference between the sexes and the difference between the generations. Pedophiles are, at the very least, attempting to deny the difference between the generations. The utopian fantasy here is to normalize sex between adults and children.

According to Dr. Peter Mezan, a psychoanalyst in New York City, "There is an impulse that is common to perversion and to utopian thinking. The wish is to create a world in which differences make no difference. The great utopian thinkers have been immensely inspiring, but there is a reason that utopian communities have never worked out. In the name of equality of every sort and in the attempt to eliminate the tensions that normally divide us, they propose to create a marvelously unnatural world without the usual boundaries. But then it gets all fucked up."

Think of Michael Jackson. He has attempted to eradicate just about every sexual, generational, and racial difference -- and to construct an alternate utopian reality in Neverland. While there is certainly a futuristic quality to his clothing and mask-like facial features, it is unclear whether he watches Star Trek or just looks as if he does.

If the utopianism-paedophilia connection holds true, I wonder how many arrested paedophiles are pacifists, anarchists, libertarians, Esperantists or Momus-style emotional communists.

Researchers in the US have made advances in the production of cultured meat, i.e., meat grown in nutrients from cell colonies, but barriers, both technical and cultural, remain:

They envisage muscle cells growing on huge sheets that would be regularly stretched to exercise the cells as they grow. Once enough cells had grown, they would be scraped off and shaped into processed meat products such as chicken nuggets.

The idea of doing away with traditional livestock and growing steaks from scratch dates back at least 70 years. In a horizon-scanning essay from 1932, Winston Churchill said: "Fifty years hence we shall escape the absurdity of growing a whole chicken in order to eat the breast or wing by growing these parts separately under a suitable medium."

"Right now, it would be possible to produce something like spam at an incredibly high cost, but the know-how to grow something that has structure, such as a steak, is a long way off," said Mr Matheny.

"It won't appeal to someone who gave up meat because they think it's morally wrong to eat flesh or someone who doesn't want to eat anything unnatural," Ms Bennett [of the Vegetarian Society] added.

Of course, once it is possible to grow meat from donor cells without killing a living thing, a lot of things become possible. How long, I wonder, until some transgressive technogoth type decides to grow steaks from their own muscle cells and holds a cannibal dinner party? Despite the fact that no-one gets hurt, a lot of people would find this beyond the pale, and given how disgust often translates into legislation, chances are the practice will become outlawed in a great many countries. Which, of course, would only drive it underground and give it prestige and cachet.

Willer administered a gender identity survey to a sample of male and female Cornell undergraduates in the fall of 2004. Participants were randomly assigned to receive feedback that their responses indicated either a masculine or a feminine identity. While women's responses were unchanged regardless of the feedback they received, men's reactions "were strongly affected by this feedback," Willer said.

He questioned subjects about their political attitudes, including how they felt about a same-sex marriage ban and their support for President Bush's handling of the Iraq War. "I created composites from subjects' answers to these and other questions," he said. "I also gave subjects a car-buying vignette, presented as part of a study of purchasing a new car."

With this in mind, perhaps SUV manufacturers will start running ads, with no brand names on them, impugning their audience's masculinity. I can see them now: "Hey you," a crew-cut, neckless drill-sergeant type shouts from the TV, "you call yourself a man? You ain't a man, you're a big girl's blouse!" Two ads later, a spot for the Hummer or the latest ultra-macho urban assault vehicle appears. Within the next week, sales go through the roof as office drones compensate for their perceived emasculation.

Meanwhile the researchers in question next intend to measure respondents' testosterone levels and also test their attitudes to violence against women.

Someone named Claire Mills has created a magazine for synæsthetes. Named "Syn", it consists of 48 pages of personal accounts, commentary and theories about the condition. It's only available as a bunch of graphic files on the web though, which is a tad inconvenient to browse.

Londoners probably shouldn't relax too soon, if The Times' claims that a third terror cell is poised to strike are correct. The claims apparently come from a Special Branch conference at Scotland Yard, though elsewhere, the authorities have been downplaying them. In any case, this matter is far from over, and it may take months to find those behind the bombings. The fact that one of them managed to slip through the net and leave the country a few days after the alarm was raised doesn't raise one's confidence.

Elie Godsi, a consultant clinical psychologist, says that there is a huge stigma attached to terrorists who fail, which means they are unable to return to their communities.

"There is a great deal of stigma in having not succeeded," said the forensic psychologist from the University of Nottingham and author of Making Sense of Madness and Badness. "They will regroup and try again or try to take their own lives."

Psychologists are now looking at magic tricks for answers to how the mind works. Developed and refined over centuries, such tricks and techniques are now being recognised as containing a lot of folk knowledge about the low-level workings of consciousness and perception:

A card trick that lasts four or five minutes, for example, might have 20 pages of detailed text to describe exactly where to look, what to say, what to do and so on. And a lot of the understanding of a trick has to be from the perspective of the audience.

Our brains filter out a huge amount of the mass of sensory input flooding in from our environment. Kuhn explains that we see what we expect to see and what our brains are interested in. "Our visual representation of the world is much more impoverished than we would assume. People can be looking at something without being aware of it. Perception doesn't just involve looking at an object but attending to it."

The new study followed almost 1500 Australians, initially aged over 70. Those who at the start reported regular close personal or phone contact with five or more friends were 22% less likely to die in the next decade than those who had reported fewer, more-distant friends. But the presence or absence of close ties with children or other relatives had no impact on survival.

All of which flies in the face of the conventional wisdom that supportive, tightly-knit families are the key to health and happiness. Though the explanation could be that this wisdom is not so much a truth as an evolved cultural response to the environment: a mantra of family values repeatedly recited to counterbalance the ambient irritation of interacting with family members and to reinforce stable social structures.

Social psychologists in the UK have enumerated the nine types of (romantic) love. These include a "grown-up version" that involves mutual trust, recognition and support, the "Cupid's dart" variety, "hedonistic love" and "dyadic partnership love" (where two people become a single unit, presumably delegating cognitive tasks to each other), as well as various transitional forms.

According to the Toronto police sex crimes unit, the vast majority of people arrested for child pornography offences are obsessed with Star Trek:

The first thing detectives from the Toronto police sex crimes unit saw when they entered Roderick Cowan's apartment was an autographed picture of William Shatner. Along with the photos on the computer of Scott Faichnie, also busted for possessing child porn, they found a snapshot of the pediatric nurse and Boy Scout leader wearing a dress "Federation" uniform. Another suspect had a TV remote control shaped like a phaser. Yet another had a Star Trek credit card in his wallet. One was using "Picard" as his screen name. In the 3 1/2 years since police in Canada's biggest city established a special unit to tackle child pornography, investigators have been through so many dwellings packed with sci-fi books, DVDs, toys and collectibles like Klingon swords and sashes that it's become a dark squadroom joke. "We always say there are two types of pedophiles: Star Trek and Star Wars," says Det. Ian Lamond, the unit's second-in-command. "But it's mostly Star Trek."

And there's more on the claims here, including letters from indignant Trekkies complaining that the article vilifies Trekkies whilst failing to put forward the "ethics, morality and message" of their faith, er, TV show.

The initial drive - lust - is brought on by surges of sex hormones, such as testosterone and estrogen. These induce an indiscriminate scramble for physical gratification. Attraction transpires once a more-or-less appropriate object is found (with the right body language and speed and tone of voice) and is tied to a panoply of sleep and eating disorders.

Obsessive thoughts regarding the Loved One and compulsive acts are also common. Perception is distorted as is cognition. "Love is blind" and the lover easily fails the reality test. Falling in love involves the enhanced secretion of b-Phenylethylamine (PEA, or the "love chemical") in the first 2 to 4 years of the relationship.
This natural drug creates a euphoric high and helps obscure the failings and shortcomings of the potential mate. Such oblivion - perceiving only the spouse's good sides while discarding her bad ones - is a pathology akin to the primitive psychological defense mechanism known as "splitting". Narcissists - patients suffering from Narcissistic Personality Disorder - also idealize romantic or intimate partners. A similar cognitive-emotional impairment is common in many mental health conditions.

Love, in all its phases and manifestations, is an addiction, probably to the various forms of internally secreted norepinephrine, such as the aforementioned amphetamine-like PEA. Love, in other words, is a form of substance abuse. The withdrawal of romantic love has serious mental health repercussions.

Also via Mind Hacks, studies of the neurology of sarcasm, which turns out to be a good subject for exploring subjects' theory of mind, being carried out at the fantastically-named Rambam Medical Center in Haifa, Israel.

Blogging ambulanceman Tom Reynolds talks about the weather, or, more precisely, about how the weather influences the sorts of accidents and incidents that occur:

If the weather is grey and overcast, we tend to go to more old folk who are sitting indoors, or more commonly, falling over indoors. Sometimes you get the impression that they just want someone to talk to -- or to not be alone. There also seem to be more suicide attempts as well -- and it is fairly well known that suicide rates go up in springtime. So on those rainy spring days you end up seeing a lot of Paracetamol overdoses.

Spring and Autumn rains (and in England, Summer rains) bring with them car vs car collisions, as an infrequent rain lifts off the layer of rubber and pollution left on the road by passing cars and the roads become a skid pan. Fallen leaves on the road don't help, and neither do the effects of the rapidly changing hours of daylight on a driver's bodyclock.

The hot weather also brings out the people who start drinking at lunchtime and continue throughout the day; tie this in with a lot of sporting fixtures, and we find ourselves going to a lot of fights in a lot of pubs.

It is well known that alcohol causes many men to find women more attractive than they otherwise would (the effect is colloquially known as "beer goggles"). Now, researchers have found that actual alcohol isn't even required; exposure to alcohol-related words is enough, at least among men who expect alcohol to have that effect.

First, the subjects answered questionnaires that asked whether they thought alcohol affected their sex drive. Afterward, the men were divided into two groups and placed next to computers. One group was shown words that described alcohol, such as liquor, beer and keg. The other group saw words like water, soda and coffee.

The researchers found that the group of men who expected alcohol to enhance their sex drive found the women in the photos more attractive after viewing the alcohol cue words. The group that expected alcohol to reduce their sex drive found the women to be less attractive.

This echoes another priming experiment (reported in both Mind Hacks and Malcolm Gladwell's Blink), in which students surreptitiously exposed to old-age-related words like "wrinkled", "senior" and "Florida" were found to walk more slowly down a corridor than those not primed in this way.

A look at the U.S. Secret Service's tools for breaking encryption on seized data. Not surprisingly, they use a network of distributed machines to help brute-force keys. Cleverly enough, before they do so, they assemble a custom dictionary of potential keys/starting points from all the data on the seized machine (including files, web browsing histories, and presumably terminology associated with the topics of the web sites visited). (via /.)

"If we've got a suspect and we know from looking at his computer that he likes motorcycle Web sites, for example, we can pull words down off of those sites and create a unique dictionary of passwords of motorcycle terms," the Secret Service's Lewis said.

Hansen recalled one case several years ago in which police in the United Kingdom used AccessData's technology to crack the encryption key of a suspect who frequently worked with horses. Using custom lists of words associated with all things equine, investigators quickly zeroed in on his password, which Hansen says was some obscure word used to describe one component of a stirrup.
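The technique described above -- harvesting a suspect's own files for frequently used words, then trying those as password candidates -- can be sketched in a few lines of Python. This is a toy illustration of the general idea, not the Secret Service's or AccessData's actual tooling; the function names and the trivial variant rules are invented for the example:

```python
import re
from collections import Counter

def build_custom_dictionary(documents, min_len=4, max_words=10000):
    """Harvest candidate password words from a corpus of seized text,
    most frequent first -- the premise being that a suspect's interests
    (horses, motorcycles...) leak into their password choices."""
    counts = Counter()
    for text in documents:
        # Pull out alphabetic words of at least min_len characters.
        for word in re.findall(r"[A-Za-z]{%d,}" % min_len, text):
            counts[word.lower()] += 1
    return [word for word, _ in counts.most_common(max_words)]

def dictionary_attack(try_password, wordlist):
    """Feed each candidate (plus a couple of trivial variants) to a
    password-checking oracle; return the first match, or None."""
    for word in wordlist:
        for candidate in (word, word.capitalize(), word + "1"):
            if try_password(candidate):
                return candidate
    return None
```

Fed a corpus of equine web pages, the obscure stirrup terminology that eventually cracked the case described above would sit near the top of such a candidate list, long before an exhaustive brute-force search would reach it.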

The moral of this story is: if you're planning the perfect crime using computers and encryption, you may find it wise to develop an obscure interest and not mention it by electronic means. Or, for that matter, let it show up in credit card receipts, library records, personal effects, or any other information the authorities could get. Which could be trickier than it sounds.

He produces a sheet of blank paper and issues an instruction: Draw a picture. "Try to catch me out; make it a bit obscure," he orders. "Don't draw a house; don't draw a stick man." Walking to another room and out of sight, he decrees that the picture should be concealed until the end of the interview - whereupon, he claims, he will reveal what it is.

Recently, he said, he used his talents to defuse a situation in which an aggressive youth approached him on the street, yelling, "What are you looking at?" (Brown responded with a rapid series of diversionary non sequiturs, he said; the man burst into tears.)

Instructing me to concentrate, he pulls out a blank sheet of paper and begins sketching, chatting all the while. He tells me he "sees" a conical shape with spots on it - some sort of decorated lamp with a blob on top. And knock me down if he does not produce a near-exact replica of my drawing, the only differences being that his has more dots than mine, and his stripes are horizontal, not vertical.

Channel 4 has a Derren Brown microsite here, with streaming video and explanations of some of the tricks (such as making people fall asleep in phone booths). Think of it as the human equivalent of the buffer overrun attack.

He said many are "destabilised by falling in love, or suffer on account of their love being unrequited" and this could lead to a suicide attempt. Few studies deal with the "specific problem of lovesickness", he said.

Which sounds like a bit of a cop-out to me; I don't doubt that unrequited love has caused many mental breakdowns and suicides, though I wonder how much of that comes from the biology of the condition and how much comes from the social expectation that, when someone you fancy doesn't fancy you, you're entitled to whine, pout, go all emo and become temporarily not responsible for your behaviour. For example, during the Victorian era, many women would faint in certain social situations. This was not because the biology of the female sex is susceptible to sudden consciousness loss, but because of programmed-in social expectation. Could it be that losing one's shit over that one special person in the world who doesn't reciprocate one's passion is a similar case of cultural conditioning?

(Which is not to say that romantic love or sexual attraction is culturally constructed; I don't for a moment entertain the blank-slate theory of human nature. However, it's more than conceivable that expectations of how such urges are expressed, and of how much they may affect one's behaviour, are strongly shaped by culture, and that those expectations gain more influence as the biological and physical causes of behaviour are revealed and the self-sustaining illusion of sovereign free will is weakened.)

Then again, now that unrequited love is recognised as a bona fide medical condition, perhaps some pharmaceutical company will seize the opportunity and bring out an anti-unrequited-love drug, a sort of Prozac for the heart which quickly and conveniently cures this debilitating ailment, further streamlining the human condition.

Mind Hacks, the new O'Reilly book of cool tricks and experiments in applied neurology, is out now; the O'Reilly page includes some sample chapters in PDF format. If that's not enough, there is a Mind Hacks blog, constantly publishing new material on similar topics.

The timelessness of our desire to get drunk has led anthropologists such as Kate Fox, director of the Social Issues Research Centre in Oxford, to speculate about the British character. She concluded that we are all suffering from a "congenital sociability disorder", a disease whose symptoms are akin to a kind of autism combined with agoraphobia. In plain talk, the British are uniquely buttoned up and starched stiff. Animal watcher Desmond Morris says that if we were monkeys we would be picking imaginary fleas out of each other's fur, in an act of "social grooming", a pretext for prolonging social encounters. Instead we have for centuries propped up the bar.

A national characteristic has been identified in numerous scientific trials. In one, British volunteers were plied with drinks, all purporting to be alcohol, half of which were placebos. Everyone became equally loud, crude and garrulous, the technically sober behaving identically to the genuinely drunk. Similar tests carried out on volunteers from Mediterranean countries found no such associations. Scientists concluded that British people invested alcohol with "magical disinhibiting powers".

Via tyrsalvia, a fascinating article on why people vote as they do. As many have undoubtedly suspected, very few people vote rationally, i.e., considering and understanding the issues or policy platforms in question, with the vast majority of votes being cast for reasons unconnected to ideology, political belief or the candidates' visions:

Converse claimed that only around ten per cent of the public has what can be called, even generously, a political belief system. He named these people 'ideologues,' by which he meant not that they are fanatics but that they have a reasonable grasp of "what goes with what" -- of how a set of opinions adds up to a coherent political philosophy. Non-ideologues may use terms like "liberal" and "conservative," but Converse thought that they basically don't know what they're talking about, and that their beliefs are characterized by what he termed a lack of "constraint": they can't see how one opinion (that taxes should be lower, for example) logically ought to rule out other opinions (such as the belief that there should be more government programs). About forty-two per cent of voters, according to Converse's interpretation of surveys of the 1956 electorate, vote on the basis not of ideology but of perceived self-interest. The rest form political preferences either from their sense of whether times are good or bad (about twenty-five per cent) or from factors that have no discernible "issue content" whatever. Converse put twenty-two per cent of the electorate in this last category. In other words, about twice as many people have no political views as have a coherent political belief system.

Philip Converse's study, published in 1964, reignited doubts about the meaningfulness of democracy, and three theories have emerged about how a democracy really works. Theory 1 says that electoral outcomes are essentially arbitrary: the signal (decisions made rationally by informed voters) is overwhelmed by noise (reactions to slogans, misinformation, sensational news, random personal associations -- by some accounts, the colours of politicians' neckties matter more to their fates than their policy positions -- and even satisfaction or otherwise with things out of politicians' control, such as the weather). Theory 2 states that democratic decisions are made by elites who control the media and have the power to send the messages the apolitical bulk of the public responds to; i.e., the electoral process is essentially a low-pass filter on the opinions of Rupert Murdoch and his fellow oligarchs. Theory 3 states that the cues people respond to are heuristics which, for most intents, are as good as doing one's own research; these include consulting peers' opinions and intuitive judgments, i.e., "low-information rationality".

An analogy (though one that Popkin is careful to dissociate himself from) would be to buying an expensive item like a house or a stereo system. A tiny fraction of consumers has the knowledge to discriminate among the entire range of available stereo components, and to make an informed choice based on assessments of cost and performance. Most of us rely on the advice of two or three friends who have recently made serious stereo-system purchases, possibly some online screen shopping, and the pitch of the salesman at J&R Music World. We eyeball the product, associate idiosyncratically with the brand name, and choose from the gut. When we ask "experts" for their wisdom, mostly we are hoping for an "objective" ratification of our instinctive desire to buy the coolest-looking stuff. Usually, we're O.K. Our tacit calculation is that the marginal utility of more research is smaller than the benefit of immediate ownership.

The use of these heuristics leaves plenty of blind spots in the electoral process.

Bartels has also found that when people do focus on specific policies they are often unable to distinguish their own interests. ...
When people are asked whether they favor Bush's policy of repealing the estate tax, two-thirds say yes--even though the estate tax affects only the wealthiest one or two per cent of the population. Ninety-eight per cent of Americans do not leave estates large enough for the tax to kick in. But people have some notion--Bartels refers to it as "unenlightened self-interest"--that they will be better off if the tax is repealed. What is most remarkable about this opinion is that it is unconstrained by other beliefs. Repeal is supported by sixty-six per cent of people who believe that the income gap between the richest and the poorest Americans has increased in recent decades, and that this is a bad thing. And it's supported by sixty-eight per cent of people who say that the rich pay too little in taxes. Most Americans simply do not make a connection between tax policy and the over-all economic condition of the country.

Stanley Milgram, the social psychologist responsible for the how-much-pain-would-you-inflict-whilst-following-orders experiment and the six-degrees-of-separation experiment, is also known for another experiment: getting able-bodied students to ask subway passengers for their seats. Surprisingly many passengers gave up their seats (though some with scornful remarks), though what surprised Milgram was the degree of social mortification, and indeed physical distress, felt by the students asking the question. (via FmH)

According to psychological studies, common folk wisdom about spotting liars is little more than superstition. Liars do not habitually avert their gaze, touch their noses or fidget any more than truth tellers do. Which is not to say that there might not be patterns of behaviour suggestive of lying:

Fibbers tend to move their arms, hands, and fingers less and blink less than people telling the truth do, and liars' voices can become more tense or high-pitched. The extra effort needed to remember what they've already said and to keep their stories consistent may cause liars to restrain their movements and fill their speech with pauses. People shading the truth tend to make fewer speech errors than truth tellers do, and they rarely backtrack to fill in forgotten or incorrect details.

Liars may also feel fear and guilt or delight at fooling people. Such emotions can trigger a change in facial expression so brief that most observers never notice. Paul Ekman, a retired psychologist from the University of California, San Francisco, terms these split-second phenomena "microexpressions." He says these emotional clues are as important as gestures, voice, and speech patterns in uncovering deceitfulness.

Also, it emerges that most people are poor at identifying liars, scoring little better than chance. Though one study, which included a group of US Secret Service officials, showed that they were considerably better at detecting liars than the average person.

Insightful essay of the day: The Monkeysphere; that's related to the concept (Dunbar's number) that humans have a hardwired limit of about 150 people they can care about, or even intuitively consider as people rather than abstractions. Understanding this explains a lot of the suffering and cruelty in the world (via caycos@lj):

But think of Osama Bin Laden. Did you just picture a camouflaged man hiding in a cave, drawing up suicide missions? Or are you thinking of a man who gets hungry and has a favorite food and who had a childhood crush on a girl and who has athlete's foot and chronic headaches and laughs when a friend farts, a man who wakes up in the morning with a boner and loves volleyball and fusses over his spoiled children and haggles over the price of a car and who goes on Seinfeld-esque rants about too much ice in his drinks?

Something in you, just now, probably was offended by that. You think I'm trying to build sympathy for the murderous bastard. Do you see the equation? Simply knowing random human facts about him immediately tugs at our sympathy strings. He comes closer to our Monkeysphere, he takes on dimension.

Now, the cold truth is my Bin Laden is just as desperately in need of a bullet to the skull as the raving four-color caricature on some redneck's T-shirt. The key to understanding people like him, though, is realizing that we are the caricature on his T-shirt.

A look at the stigma attached to women keeping cats. After all, everybody knows that people who keep cats are emotional basketcases; normal, healthy people either keep dogs (a proper, well-adjusted person's pet) or fish (which are more of a hobby than a pet) or don't need the emotional crutch of keeping animals:

Supposedly, to look into the female singleton's trolley is to gaze upon human despair in its purest form. The meals-for-one, the glossy magazines shrieking their self-help messages so loudly people three aisles away can hear, those furtive bars of high-quality chocolate bought as a substitute for the low-quality sex they were having before they decided enough was enough. All these items could be exhibited as evidence in the socio-emotional kangaroo courts that even today persist in judging the solitary female as worthless and hopeless simply because she is mate-less. However, the real clincher is the six-pack of top-of-the-range cat food. A kilo of heroin couldn't be more socially incriminating.

Militantly anti-cat Melanie Reid wrote recently: "Feminism has been blamed for many things but there is no doubt that it is also partly responsible for the rise of the cat."

It's very easy to deal with boyfriends who complain you treat your cat better than you do them. Just say: "Once you've produced evidence that an ancient civilisation worshipped you, then perhaps we'll talk." It could even be argued that some men end up being very poor cat substitutes. ("I just couldn't meet the right cat so I decided to have a relationship instead.")

The article also has a list of famous cat-haters, including William Shakespeare and King Louis XIV. They left John Ashcroft off the list.

Btw, what about men who keep/prefer cats; are they also considered to be psychological liabilities, or perhaps cat-fanciers are Not Real Men (see also: vegetarians, Belle & Sebastian fans, non-followers of sports teams)?

"The more you post the evidence of legislative control, such as traffic signs, the less the driver is trying to use his or her own senses," says Hamilton-Baillie, noting he has a habit of walking randomly across roads -- much to his wife's consternation. "So the less you can advertise the presence of the state in terms of authority, the more effective this approach can be."

Contrast this approach with that of the United Kingdom and the United States, where education campaigns from the 1960s onward were based on maintaining a clear separation between the highway and the rest of the public realm. Children were trained to modify their behavior and, under pain of death, to stay out of the street. "But as soon as you emphasize separation of functions, you have a more dangerous environment," says Hamilton-Baillie. "Because then the driver sees that he or she has priority. And the child who forgets for a moment and chases a ball across the street is a child in the wrong place."

The new school of traffic engineering also draws on conclusions from evolutionary biology and psychology:

Subvert, don't attack, the dominant paradigm. Or, as David Engwicht, a shared-spaces proponent in Brisbane, Australia, has written: "Implicit in the whole notion of second-generation traffic calming is the idea that significant social change only happens when we amplify the paradoxical 'submerged voice' as opposed to tearing down the 'dominant voice.'" Engwicht, a plenary speaker at the Walk 21 Cities for People Conference in Copenhagen this June, argues that controlling a driver's natural propensity for speed is futile. A more effective approach is to engage the driver by emphasizing "uncertainty and intrigue" in the street environment -- for example, planting a tree in the middle of the street instead of putting up a stop sign.

Safety analysts have known for several decades that the maximum vehicle speed at which pedestrians can escape severe injury upon impact is just under 20 miles per hour. Research also suggests that an individual's ability to interact and retain eye contact with other human beings diminishes rapidly at speeds greater than 20 miles per hour. One theory behind this magic bullet, says Hamilton-Baillie, is that 20 mph is the "maximum theoretical running speed" for human beings. (Evolutionary biologist E.O. Wilson has drawn similar conclusions.) "This is of interest," he says, "because it suggests that our physiology and psychology has evolved based around the potential maximum impact on the speed of human beings." The ramifications go beyond safety, says Hamilton-Baillie, to bear directly on the interplay between speed, traffic controls and vehicle capacity. Evidence from countries and cities that have introduced a design speed of 30 kilometers per hour (about 18.5 mph) -- as many of the European Union nations are doing -- shows that slower speeds improve traffic flow and reduce congestion.

Tonight I went to see Eternal Sunshine of the Spotless Mind. I really enjoyed it. The writing was by Charlie Kaufman (Adaptation, Being John Malkovich), though he wasn't being as much of a clever-dick as he usually was (though the somewhat self-absorbedly neurotic voice-over at the start had me worried for a while). The direction was by Michel Gondry (who did a number of other films with Kaufman, as well as videos for Björk and French TV commercials), and makes the most of the visual idiom.

I'd classify the film as speculative fiction (one could have called it "science fiction", only this term has been hijacked to mean action movies with dark metallic corridors lit by strips of neon, 1-piece jumpsuits, futuristic gadgets and lots of blinking lights). Basically, the story is this: neurotic boy (Jim Carrey, who's not at all the buffoon he's best known as) meets psycho hairdye girl (Kate Winslet, looking like too many cute-but-insane punk/goth/raver chicks you've probably met), and they hook up; then, sometime later, their relationship falls apart, and she goes to a clinic to have all memories of him eradicated from her mind. He runs into her, she doesn't recognise him then goes to the same clinic to do the same. Only as it's happening, he suddenly has a change of heart and races around the landscape of his mind, trying to save the memories of her from the erasure technicians. There's more to the story, such as one of the technicians (played by Elijah "Frodo" Wood, only looking like a member of a nu-metal boy-band) hitting on a way of using attractive female patients' erased memories to hook up with them, and the issue of whether erasing all memories of a failed relationship would be mostly good or mostly bad.

The gist of the film seems to be that erasing a memory is not the same as preventing an event from recurring; in the film, characters whose memories of their relationships with each other have been erased hook up again and repeat their connections. If you're a sentimentalist or wish to consider the film as a romantic comedy, it could be about the power of destiny and true love and soulmates being brought together by powerful forces beyond their control, like the angels people talk about on American daytime TV talk shows or something. If you're more of a skeptic or a cynic, it's more about one being destined to repeat mistakes if one doesn't remember them. On a deeper level, I thought it was about how the dynamic workings of the human mind are composed not so much of things (such as memories of moments and people) as of processes; you can lose your memories, but if you're still the same person you were before (and, having lost the lessons learned since, you may well be), the internal processes of your mind will guide you into repeating what you have forgotten, or at least riffing off it. (Maybe if one were to start a real-life Lacuna Inc., one would combine aversion conditioning with memory erasure, to make the patients avoid their problem exes, but I digress.)

It looks like antidepressants are wreaking havoc with the human courtship/mating/bonding instincts, doing everything from dampening sexual desire and disrupting the positive feedback mechanisms that lead to emotional bonding to preventing those affected from recognising bad relationships or feeling any desire to seek good ones: (via FmH)

Serotonin enhancers can also dampen the sex drive of men and even their ability to ejaculate. These men naturally shy away from bedding women, leading to increased loneliness, setting up a vicious cycle of depression. Also, without frequent orgasms, men and women don't have the flood of oxytocin and vasopressin that promote relationship bonding. Men might enjoy a woman's company, but never fall head over heels for her. Semen may also be critical in retaining a woman's interest, as recent studies indicate that men may alter women's emotional states through chemicals transmitted through semen.

One guy on SSRIs would look at a beautiful woman and recognise that intellectually, but he said there was no oomph. He described being on the drugs as if the lenses in his glasses somehow had been changed. He wanted off the drugs. Even if he couldn't chase women because he was a married man, he still wanted to enjoy looking.

Thomson also worries that some women could suffer a double whammy where antidepressants hinder their natural judgment to leave a bad relationship and also blunt their ability to spot healthy, desirable new mates. Indeed, he recalls that one patient wasn't healthily distressed when an abusive ex-boyfriend with a history of stalking showed up at her door.

Though I'm sure that humanity, a technological species, will adapt. Perhaps we'll become a species of solitary cube-dwellers, choosing breeding partners by some market-based mechanism or computerised matching system, and future generations will find it incredible that, once upon a time, people relied on wild, atavistic passions to select their breeding mates?

Journal of a Schizophrenic, a lucid first-person account of one person experiencing (and later playing pool with) a "voice" in his head and having delusions of persecution at work.

Me: Jesus H. Christ, I need to see a shrink.
It: That's not necessary. We could play some pool, and talk. Not out loud, please.
Me: .... Are you really there? Prove it - tell me something I don't know.
It: That's not how it works. We share the same physical body, we have the same hippocampus, the same brain; the same memories. We think differently, act differently.
Me: Then how can we play a game of pool?
It: It will be very difficult. Set up the table, take a shot, and then step back. I promise that I will step back after the game.
Me: No way. This is insane.

Sometimes, I confess, I do miss the voice. I suppose it's difficult to have a more meaningful metaphysical conversation than with someone in your own head. And true: I don't have very many friends; I spend very little time with my family; I've never had a girlfriend or a boyfriend. But the thing is, I wouldn't have it any other way. I live in my books, through my writing, and because of my ideas. While I am often alone, I am very rarely lonely.

An article suggesting that "positive thinking" is a route to disaster, by making one incapable of acknowledging reality. Goths, emo kids and LiveJournal angstpuppies can take heart in the knowledge that they're a lot healthier than the shiny happy people around them:

Increasingly it is becoming unacceptable to voice legitimate distress. If you lose your job, become chronically ill, or fall prey to loneliness or depression, you are likely to be told - often abrasively - to look on the bright side. With unseemly haste, people rush to put an optimistic gloss on a disaster or to suggest a patently unworkable solution. We seem to be cultivating an intolerance of pain - even our own. An acquaintance once told me that quite the most difficult aspect of her cancer was her friends' strident insistence that she develop a positive attitude, and her guilt at being unable to do so.

In our global world, we can no longer afford to edit out the uncomfortable spectacle of human misery. In the past, we have sometimes pursued policies that have resulted in great suffering, telling ourselves that all would ultimately be well. We have let conflicts fester until they have become intractable. We have supported such allies as Saddam Hussein, ignoring the atrocities they inflict upon their people. We are now rightly outraged by his massacre of his Kurdish subjects, but at the time we ineffectually turned a blind eye. Today we are reaping the reward of our heedless karma. The pain that we ignored in some parts of the world has hardened into murderous rage.

It's an interesting argument: suggesting that we in the West have allowed atrocities to be carried out in our names because of positive thinking. It reminds me of the claim, a number of years ago, that the (then) stock market boom was the result of massive use of Prozac and similar antidepressants causing investors and others to become irrationally optimistic. Could such chemically-induced optimism possibly make people more willing to cause suffering elsewhere in the world, in pursuit of their ideas? Or, perhaps, if you want an image of the future, imagine a jackboot with a smiling face, forever.

The ceramics teacher announced on opening day that he was dividing the class into two groups. All those on the left side of the studio, he said, would be graded solely on the quantity of work they produced, all those on the right solely on its quality. His procedure was simple: on the final day of class he would bring in his bathroom scales and weigh the work of the "quantity" group: fifty pounds of pots rated an "A", forty pounds a "B", and so on. Those being graded on "quality", however, needed to produce only one pot - albeit a perfect one - to get an "A". Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the "quantity" group was busily churning out piles of work - and learning from their mistakes - the "quality" group had sat theorizing about perfection, and in the end had little more to show for their efforts than grandiose theories and a pile of dead clay.

The latest plague in the dating ecosystem is the "Whimpster", which seems to be like a cross between a metrosexual and a pathologically insecure emo boi:

Simply put: He is male. He is white. He is wimpy. He looks a little bit emo, a little bit hipster, and he's more dangerous than you'd think. So, the next time you wake up next to someone whispering acrimonious nothings about his ex-girlfriend instead of going down on you, you'll know a little more about this seemingly gentle boy you went home with. This is the 'dark side' of Lloyd Dobler, of our precious Duckie, and life with him is much different after the credits roll. Whimpsters are men who use cultural artifacts and politically correct platitudes in place of the empty spaces where real thought and emotion should be. Whimpsters are men who unwittingly enjoy Bukowski's misogyny. Whimpsters walk a tenuous tightrope between their secreted, terribly warped masculinity and the mainstream manliness that they claim to abhor.

(I wonder what happens when a whimpster meets a quirkyalone.)

If you have the fortune to be non-heterosexual, there are now lots of things you can call yourself, from traditional words like "gay" and "lesbian" to niche genders like "boydyke", "genderqueer" and "boi" (which, here, doesn't mean "masculine specimen of alternative yoof subculture") (via FmH)

A US psychologist has devised a mathematical model that can predict divorce with 94% accuracy, by studying videotapes of arguments within couples and analysing modes of interaction and physiological data to get "bitterness ratings".

An article looking at seven sentiments most people won't admit to having, and feel embarrassed at even thinking about; and these are not any stereotypically Freudian sexual kinks either. They are: feeling uncomfortable around physically disabled people, publicly expressing grief for people one didn't care about in life/"Harolding" at funerals, schadenfreude, playing favourites with one's children, judging people by their wealth, feeling relieved when someone in chronic pain dies, and having sexual fantasies about people other than one's partner.

Why jump on the bandwagon, when the bandwagon is a hearse? There are self-serving reasons: Evolutionary psychologists argue that the public expression of grief boosts your reputation as a trustworthy member of the community.

Sudden tragic death can inspire emotional rubbernecking in anyone. (How many of us have boasted about near misses--say, driving through an intersection five minutes before a fatal crash?) A national catastrophe such as September 11 brings this behavior out of the woodwork. That fall, people felt compelled to disclose that they had friends of friends of friends in the World Trade Center. New Yorkers morbidly compared notes: How close were you? What did you see? Who did you know? (In this creepy social gambit, the "winner" is the person most directly affected by the attack.) The same calculus was at work in other states or countries, where the comparison was not what you saw firsthand but who you knew in New York City or Washington, D.C.

New research shows that men who get married are more likely to suffer mental health problems, whilst men who remain single are most likely to suffer depression, with simply shacking up with a partner being the optimum solution. For men, that is; for women, unmarried cohabitation is, according to the Queen Mary University study, the worst of all options, with celibacy being best, followed by marriage. More ammunition for the claim that heterosexuality is an inherently adversarial zero-sum proposition.

Psychologists in Canada have proven that the mere sight of an attractive woman can induce men to act irrationally, impulsively. Candidates of both sexes were shown images of people's faces, ranked in terms of attractiveness (and taken from amihotornot.com), and then offered a choice between a small amount of money tomorrow or a larger one at some variable time in the future; men who saw images of highly attractive women were more inclined to go for the smaller reward available the next day (an irrational behaviour, much like those exhibited by drug addicts). Female candidates showed no difference in their behaviour when shown images of attractive or unattractive men.
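The "smaller-sooner versus larger-later" choice in experiments like this is conventionally modelled with a hyperbolic discount function, V = A / (1 + kD), where a higher discount rate k corresponds to more impulsive choices. A minimal sketch (the amounts, delays and k values here are illustrative, not taken from the study):

```python
def hyperbolic_value(amount, delay_days, k):
    """Subjective present value of a delayed reward: V = A / (1 + k*D).

    k is the individual's discount rate; a higher k means steeper
    discounting of the future, i.e. more impulsive choices.
    """
    return amount / (1 + k * delay_days)

# A patient chooser (low k) prefers $100 in a month to $60 tomorrow...
assert hyperbolic_value(100, 30, k=0.01) > hyperbolic_value(60, 1, k=0.01)

# ...while an impulsive chooser (high k) grabs the smaller, sooner reward,
# which is roughly what the men shown attractive faces did.
assert hyperbolic_value(100, 30, k=0.5) < hyperbolic_value(60, 1, k=0.5)
```

On this model, the effect the researchers describe amounts to the sight of an attractive face temporarily inflating a man's k.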

"We believe romantic love is a developed form of one of three primary brain networks that evolved to direct mammalian reproduction," says researcher Helen Fisher, PhD, of Rutgers University in New Brunswick, NJ. "The sex drive evolved to motivate individuals to seek sex with any appropriate partner. Attraction, the mammalian precursor of romantic love, evolved to enable individuals to pursue preferred mating partners, thereby conserving courtship time and energy. The brain circuitry for male-female attachment evolved to enable individuals to remain with a mate long enough to complete species-specific parenting duties."

In the new research, Zak and his colleagues find that when someone observes that another person trusts them, oxytocin - a hormone that circulates in the brain and the body - rises. The stronger the signal of trust, the more oxytocin increases. In addition, the more oxytocin increases, the more trustworthy (reciprocating trust) people are.

"Interestingly, participants in this experiment were unable to articulate why they behaved the way they did, but nonetheless their brains guided them to behave in 'socially desirable ways,' that is, to be trustworthy," says Zak. "This tells us that human beings are exquisitely attuned to interpreting and responding to social signals."
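The experimental setup behind findings like Zak's is typically a variant of the "investment game": one player sends some of an endowment, the experimenter multiplies it, and the second player chooses how much to return. A minimal sketch of the payoff structure (the endowment of 10 units and the tripling multiplier are the conventional parameters, not figures from this article):

```python
def trust_game(sent, returned_fraction, endowment=10, multiplier=3):
    """One round of the two-player investment ('trust') game.

    The investor sends part of an endowment; the experimenter multiplies
    it; the trustee returns some fraction of the multiplied amount.
    Returns (investor_payoff, trustee_payoff).
    """
    assert 0 <= sent <= endowment
    assert 0 <= returned_fraction <= 1
    multiplied = sent * multiplier
    returned = multiplied * returned_fraction
    return endowment - sent + returned, multiplied - returned

# A strong signal of trust (sending everything), reciprocated fairly:
investor, trustee = trust_game(sent=10, returned_fraction=0.5)
# investor: 0 + 15 = 15; trustee: 30 - 15 = 15 -- both beat the
# no-trust baseline of (10, 0).
```

The amount sent is the "signal of trust" in Zak's terms; his claim is that the trustee's oxytocin, and hence the fraction returned, rises with it.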

(Or, perhaps, that what we know as the conscious mind doesn't so much make decisions or control our behaviour as rationalise it; could it be that the conscious mind does little more than provide a running commentary for the many physical processes happening in the brain and nervous system, and the (advantageous) illusion of a coherent, unified "self"? But I digress.)

He paid male students $10 to come into the lab and leave a saliva sample. Unbeknownst to the men, the scientists staged a five-minute chat with a twentysomething female research assistant before they spit. This brief brush set the men's hormones surging: testosterone levels in their spit shot up around 30%. The higher a man's hormone soared, the more the female research assistant judged that he was out to impress - by talking about himself, for example.

Scientists have discovered how earworms work; apparently they work a lot like histamines. A successful earworm creates a "cognitive itch" in the brain which can only be "scratched" by repeating the tune over and over.

Mozart's children would "infuriate" him by playing melody and scales on the piano below his room - but stopping before completing the tune. "He would have to rush down and complete the scale because he couldn't bear to listen to an unresolved scale," Mr Smith related.

It may be cause for some concern that this research was presented at a conference on Consumer Psychology.

Since music draws on so many of the brain's faculties, it vouches for the health of the organ as a whole. And since music in ancient cultures seems often to have been linked with dancing, a good fitness indicator for the rest of the body, anyone who could sing and dance well was advertising the general excellence of their mental and physical genes to a potential mate.

Note: this applies to music performance, not music composition. Jimi Hendrix got laid like, well, a rock star, and if you're in a band you may be in with a chance. Sitting at your computer composing techno tunes is not a sign of Darwinian fitness, and unlikely to pull 'em in. As for DJing, the jury's still out.

Group singing, or chorusing, may have been an intermediate step in this process, he suggests. He has preliminary evidence that singing in church produces endorphins, a class of brain hormone thought to be important in social bonding, he said.

(I've heard it claimed in NLP circles that singing together puts one into a receptive state by the fact that one is, by necessity, breathing in synchronisation with everybody else in the church (including the celebrant of the service) and thus is more receptive to their beliefs; hence, it works as a sort of hypnotic persuasion technique.)

Mori's Uncanny Valley is the phenomenon in human perception of human-like entities that accounts for people feeling revulsion when they see zombies in a horror movie. Put simply, the theory postulates that the relationship between similarity to human appearance and movement and emotional response is not a straight line; instead, there is a peak shortly before the appearance becomes completely human -- and then response dives into visceral horror, as the not-quite-human object enters the realm of moving corpses, blasphemous abominations and Things That Should Not Be, looking too human, yet somehow loathsomely unnatural.
First postulated in the 1970s, the Uncanny Valley theory is behind the advice to make all human-like agents/robots look slightly stylised, just enough to appear distinctly non-human and not trigger the sensations of horror.

Via the story of the guy who mistook his girlfriend for a robot -- or rather, made a lifelike animated head modelled on said girlfriend's head, and wired with cameras, motors and software. David Hanson, the roboticist in question, is not an adherent of the Uncanny Valley theory, or at least believes that he can cross said valley and come out the other side.
(via jwz)

Scientists prove that marriage kills creativity; working from a database of biographies of scientists, they discovered that creative genius gets turned off like a tap as soon as one marries and settles down; much the same thing applies to geniuses in music, painting and writing. The good news is that criminality suffers much the same fate (which suggests something like what Greg Egan termed the Clockwork Orange Hypothesis; that genius and criminality or violence are interrelated). The decline in testosterone levels after a man settles down is believed to be related; it is unclear, though, whether the study was performed exclusively on men or on both men and women.

Though that may be why all the well-known writers and artists out of their 20s have rocky relationship histories; perhaps they're just the ones who escaped domestication?

An interesting online research project investigating what makes a face beautiful. It looks at various hypotheses (that the most mathematically average faces are the most beautiful, that beauty is symmetry, and that infantile features account for beauty in female faces) using computer-generated morphed faces, and comes to some interesting conclusions (i.e., that while extremely asymmetric faces are considered ugly, symmetry by itself does not correlate with beauty).
Finally, it concludes that people subconsciously associate beauty with virtue, and that in this age of visual media, we are subject to a sort of "beauty pollution", being bombarded with images of artificially beautiful faces.
(via bOING bOING)

Interactive art vs. animal rights
A Danish art gallery director is facing trial for animal cruelty after hosting an exhibit featuring goldfish swimming in a blender. The artist who created the exhibit, Marco Evaristti, said that he wanted to make people "do battle with their conscience" when confronted with the switch. Throughout the course of the exhibition, two members of the public decided to press the switch (out of curiosity, disbelief that the blender could be live, or sheer sociopathic callousness; who knows?), killing the live goldfish. The gallery director is being sued for failing to cut off the electricity supply to the blender, which he says he didn't do so as not to interfere with the artist's vision.

You know Brunswick St. is past its use-by date when the funky alternative clothing and bauble shops start opening childrenswear shops. The funky kids who hung out there in years gone by, back when it was "cool" and "alternative", are now in their 30s, married, mortgaged and with children of their own, and so the groovy boutiques have moved with the times and opened babywear boutiques, and the cycle continues. Though it makes one wonder what kids whose parents dressed them in Brunswick St. alternative clothes from the day they were born will do when they reach adolescence and need to individuate themselves.

Scientists have found out that women need larger computer monitors for working in graphical environments than men do, because of their lower spatial processing ability. With a standard monitor giving a 35 degree viewing angle, women tend to be on average 20% slower than men at certain spatial tasks; given two screens with a 100 degree viewing angle, this difference disappears.
If this is true, this could affect the big-arse monitor's significance as a macho status symbol.

Is George W. Bush a psychopath? This piece examines the evidence, comparing the clinical definition of the psychopathic personality syndrome to an article profiling Bush and other media. (And then, of course, there are those stories of him blowing up frogs as a small child, and taunting death-row inmates as Governor of Texas.)

Of course, this isn't a valid clinical test, but it suggests that he may be one. Which makes one wonder what proportion of politicians are psychopaths (a useful mutation for that line of work).

Hare puts the average North American incidence of psychopathy at 1 per cent of the population, but the damage they inflict on society is out of all proportion to their numbers, not least because they gravitate to high-profile professions that offer the promise of control over others, such as law, politics, business management ... and journalism.

A Canadian researcher has found that the more older brothers a boy has, the more likely he is to be gay. It's not clear what the causal factors are, but the implications are quite peculiar. If it holds true, then (a) there were more gay men in the past (when people had larger families) than there are now, and (b) conservative religious subcultures, which encourage large families, may have more homosexuals among them than secular liberal society.

An article looking at the phenomenon of earworms, or pieces of music which get stuck in one's head and resist removal.

Stuck song syndrome annoyed, frustrated, and irritated women significantly more than men. And earworm attacks were more frequent -- and lasted longer -- for musicians and music lovers. Slightly neurotic people also seemed to suffer more.
Kellaris hasn't yet found a cure. Women are more likely to try to get rid of the offending ditties. Men are just as likely to do nothing as to fight their earworms.

Unfortunately, the good doctor has not yet found a surefire way of stopping the bastards; though some folk cures include using another song to dislodge it, or trying to complete the song (breaking the loop).
(via 1.0)

With the possible exception of Ronald Reagan, whose fabled aloofness and privateness were probably signs of a deep introverted streak (many actors, I've read, are introverts, and many introverts, when socializing, feel like actors), introverts are not considered "naturals" in politics.

Isn't that the truth. If you're a natural introvert, being social can be like acting, as it requires running an extra layer of emulation and extra effort; which is why we introverts get tired by having to do so for long periods of time, or why we can be grumpy or unsociable when tired.

In our extrovertist society, being outgoing is considered normal and therefore desirable, a mark of happiness, confidence, leadership. Extroverts are seen as bighearted, vibrant, warm, empathic. "People person" is a compliment. Introverts are described with words like "guarded," "loner," "reserved," "taciturn," "self-contained," "private"--narrow, ungenerous words, words that suggest emotional parsimony and smallness of personality.

Sometimes, as we gasp for air amid the fog of their 98-percent-content-free talk, we wonder if extroverts even bother to listen to themselves. Still, we endure stoically, because the etiquette books--written, no doubt, by extroverts--regard declining to banter as rude and gaps in conversation as awkward. We can only dream that someday, when our condition is more widely understood, when perhaps an Introverts' Rights movement has blossomed and borne fruit, it will not be impolite to say "I'm an introvert. You are a wonderful person and I like you. But now please shush."

What's more likely is that introversion will be medicalised (possibly classified as a form of Asperger's Syndrome; either that or a social anxiety disorder) and there will be drugs released to "cure" it. If your kid doesn't like playing sports with other kids and prefers to read books, or (less anachronistically) uses their computer for designing imaginary cities/languages/worlds rather than instant-messaging their friends, you will be able (and expected) to give them drugs that make them into a fully-functioning, ruggedly outgoing extrovert.
Sure, they'll lose a lot of their creativity and capacity for abstract thought, but they probably weren't going to be the next Albert Einstein anyway, and isn't it much better that they play well with other kids?

Social phenomena seen through American high school archetypes:
A WSJ writer claims that public criticism of SUVs (that's American for big-arse personal monster trucks) is geeks against jocks, the bookish dweebs who got beaten up in high school dressing up their rage at their former persecutors' wealth and soulless materialism in the garb of moral outrage, out of sheer impotent spite.

This anti-SUV fervor strikes me as a classic geek assault on jock culture. Here are the geeks: thoughtful, socially and environmentally conscious. They understand that only spiritually shallow people could possibly get pleasure from a motor vehicle. Then there are those jocks. They cruise through life infuriatingly unaware of how morally inferior they are to the geeks. They make money, become popular, play golf and have homes that are too large. And they're happy! For all the wrong reasons! And so every few years the geeks pick on some feature of jock life (McMansions, corporations, fraternities, country clubs) and get all worked up about it. And you know what? The jocks don't care! They just keep being happy. The geeks write, protest and fume. The jocks go to St. Croix.

By the same token, one could dismiss any progressive concern as the whining of sore losers (or, indeed, "Jealousy masquerading as Class Consciousness"). Though, even if there is some truth in it (pertaining to the psychological motivations of some progressive activist types), that doesn't invalidate the argument.
(via Plastic)

Saddam == Osama (part 2):
A recent poll of 1,200 Americans asked a very simple question: "To the best of your knowledge, how many of the September 11 hijackers were Iraqi citizens?"
44% said that most or some were Iraqis; only 17% knew that none of them were.
65% of Americans also believe that Iraq and al-Qaeda are in very close collaboration, of which there is scant convincing evidence (or at least little that has been made public). It looks like the White House has succeeded in conditioning the American people to associate long-time Bush family foe Saddam Hussein with the 9/11 attacks, all using psychological techniques:
(via bOING bOING)

It is not at all unreasonable to conclude that the suspected national identities of the hijackers -- 15 Saudis, one Egyptian, one Lebanese, and two from the United Arab Emirates -- must have been heard or read by everybody on at least several occasions. From there the raw information must have made its way to innumerable lunch rooms, bars and family dinner tables across the country, where it was debated and discussed. Though it was somewhat subversive and unpatriotic to ask why, there was an insatiable national hunger to know who. Even the realpolitik diplomatic strategy of the Bush administration -- to play down the frequency of dots leading to Saudi Arabia -- should not have penetrated sufficiently to impede free access to information that was clearly in the public domain.

So most Americans knew that there weren't any Iraqis involved, but (if the polls are representative) were persuaded into revising this knowledge by emotional conditioning and repeated association of Iraq with terror by authority figures; a textbook example of the effectiveness of persuasion techniques at editing the public memory, owing equally to Noam Chomsky and Robert Cialdini.

To the behavioral psychologist, the truth about the hijackers' nationalities might seem a victim of a chronic state of inattention. Conditioning has rendered Americans hyper-responsive to emotional and sensory dynamics triggered by the news media, and relatively uninterested in intellectual content. Nobody understands this better than Rupert Murdoch, who has created an empire out of punchy anti-intellectualism. And few understand better how to use it to their advantage than the Bush White House. George W. Bush is, after all, the anti-intellectual's president.

It was a case of psychological transference on a national scale. The transformation came not by cognitive argument, but by emotional association -- Iraq was described persistently in the emotionally charged post-9/11 vocabulary and context, most often by an association with fear, anxiety and alarm.

Meanwhile, the involvement of that staunch bulwark of Truth, Justice and the American Way, Saudi Arabia, has been deemphasised, to the point where most Americans, aware only that the Saudis are Our Allies, would subconsciously edit out any recollection of cognitively dissonant facts (such as that 15 of the 19 hijackers hailed from the sternly fundamentalist desert kingdom).

(Oh, and remember that British Intelligence dossier which proved beyond doubt that Iraq is guilty? Well, it turns out that it was plagiarised from academic articles about the 1991 Gulf War.)

Brits have a much more philosophical attitude in general. They think that they have a much more developed sense of irony than the Americans do. They mean that Americans are terribly earnest and terribly straightforward and gung-ho. It's like having a very big dog in the house that keeps panting, "Like me! Like me! Like me!"

I used to say to my husband, "How could they build an entire country with no closet space?" But they did. They forgot that we needed it. For most of the States that's not true. They have space. They have items.

A British guy I knew went out on a date with an American girl and she told him every terrible thing that ever happened to her and all of her issues and hang-ups. And he said to her, "Can't we just flirt? And talk about the weather? Why do I need to know all this?"

(Apparently, according to her, Oprah Winfrey is a fundamental aspect of how Americans think, and in particular their fixation with self-improvement and perfectibility.)

(What is it about music producers and guns? First Joe Meek went berserk in the '60s, then Martin Hannett started shooting telephones, and now Phil Spector's gun-waving antics have come to a tragic denouement.)

Meanwhile, it has recently emerged that, on the other side of the world,
a religious organisation has been rounding up single mothers, sexual abuse
victims and orphans as well as girls who were too dangerously pretty to be
allowed out, stripping them of their identities, forbidding them to speak,
and forcing them into slavery, making a tidy profit from their labour.
Is this in Saudi Arabia? Nigeria? Or perhaps one of the excesses of the Taliban? No; it's the work of the Catholic Church in Ireland, and its "Magdalene Laundries", the last of which closed way back in 1996.

Beds were placed at a 20 degree angle, making them near-impossible to sleep on, and the floors of the 6ft by 3ft cells were scattered with bricks and other geometric blocks to prevent prisoners from walking backwards and forwards, according to the account of Laurencic's trial.
The only option left to prisoners was staring at the walls, which were curved and covered with mind-altering patterns of cubes, squares, straight lines and spirals which utilised tricks of colour, perspective and scale to cause mental confusion and distress. Lighting effects gave the impression that the dizzying patterns on the wall were moving.

The surrealistic cells were used to torture Francoist Fascists, as well as (of course) members of rival leftist factions and splinter groups.
(via Charlie's Diary)

(If they built something like that these days, mind you, they could probably pass it off as the latest clubbing sensation and charge admission for it.)

Teresa Nielsen Hayden has a long and detailed blog entry about the bizarre and disturbing world of animal hoarders (you know, the unhinged, reclusive types who pack dozens of cats or dogs into their tiny apartments until the neighbours complain about the smell of the filth and rotting carcasses and have them institutionalised).
(via bOING bOING)

Psychologists* claim to have found the formula for happiness. Happiness is, apparently, equal to P+(5E)+(3H), where P stands for "Personal Characteristics", E for "Existence" and H for "higher-order needs". I was almost expecting a 42 in there somewhere.

(In contrast, London-based Canadian/American indie band The Lollies state that happiness is "a myth dreamed up by advertising men to sell you mortgages, sports cars and deodorants". Whom do you believe?)

* or at least a "life coach", whatever that is; I suspect it means some sort of Tony Robbins-like guru type.
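Taken at face value, the formula is trivial to compute; here is a toy sketch (the 1-to-10 self-rating scale and example scores are assumptions for illustration, not part of the reported formula beyond the 5× and 3× weightings):

```python
def happiness(personal, existence, higher_order):
    """Toy version of the claimed formula: Happiness = P + (5*E) + (3*H).

    personal:     P, personal characteristics (outlook, adaptability)
    existence:    E, existence needs (health, friendships, money)
    higher_order: H, higher-order needs (self-esteem, ambitions, humour)
    Each input is assumed to be a self-rated score, e.g. from 1 to 10.
    """
    return personal + 5 * existence + 3 * higher_order

print(happiness(7, 6, 8))  # 7 + 30 + 24 = 61
```

Note the weightings: by this formula, "existence" needs dominate, which is perhaps the only non-obvious claim being made.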

Everybody's trash to somebody, baby:
A psychologist has claimed that your mobile phone ringtone
says a lot more about you than you may like.
For example, young schoolgirls choose Top 10 pop songs to fit in; young, competitive men who choose pop songs, however, do so to make themselves look "safe" and to camouflage their manipulative, sexually predatory natures. Meanwhile, themes from action movies are frequently chosen by young professionals who want to be seen as dynamic problem-solvers; however, those who have the time to set up their phones like that typically have dull lives that fail to live up to this image:

"They don't have huge excitement in their lives but like to think that they do. I doubt you'd find a firefighter or ambulance driver with a ring tone like that."

The strange, intensely paranoid world of Mafia, a former psychology experiment turned parlour game popular amongst writers and the like.

Here's how Mafia works: The party gathers in a room. Everyone is instructed to close his or her eyes, and three people are secretly selected to be in the "mafia" by the game leader, known as "God" or "the Mayor," whose job is to manage the action. No one knows who the mafia are except the mafia themselves, who are allowed to identify one another by opening their eyes. Later, when the whole group, called "the village", collectively opens its eyes, they are launched into the game: Through conversation, argument, questioning and accusation, a freewheeling group inquisition takes place to root out the mafia and kill them before the mafia kills the villagers.
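The turn structure described above is mechanical enough to sketch in code. This is a toy simulation only (all names and parameters are mine, not from the article), and it deliberately replaces the game's whole point, the conversational deduction, with random lynching; comparing the two is a decent way to see how much work the arguing actually does:

```python
import random

def play_mafia(n_players=10, n_mafia=3, seed=None):
    """Minimal mechanical sketch of Mafia's night/day turn structure.

    No real deduction happens: the village lynches at random each day,
    which is precisely what the conversational game improves upon.
    Returns the winning side: 'mafia' or 'village'.
    """
    rng = random.Random(seed)
    players = list(range(n_players))
    mafia = set(rng.sample(players, n_mafia))
    alive = set(players)
    while True:
        # Night: the mafia secretly kill one villager.
        victims = sorted(alive - mafia)
        if victims:
            alive.discard(rng.choice(victims))
        if not (alive - mafia):
            return 'mafia'      # no villagers left
        # Day: the village lynches a random living player.
        alive.discard(rng.choice(sorted(alive)))
        if not (alive & mafia):
            return 'village'    # all mafia rooted out
        if len(alive & mafia) >= len(alive - mafia):
            return 'mafia'      # mafia now control the vote
```

With the standard 3-in-10 setup and random lynching, the mafia win the large majority of runs, which suggests why the real game rewards paranoid scrutiny of everyone's behaviour.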

An article about synaesthesia, and in particular, the tendency to associate colours with letters. The article gives a table of letters and their colours; are they more or less universal, or specific to one particular case?

I suspect that synaesthesia isn't all that exotic, and most people experience mild forms of it. I for one remember associating letters with colours when I was younger, though the colours were different (A,E and M were red, B was green, C and G were orange-yellow, and H was either red or blue). Some years later, I developed the theory that the mapping came from a set of alphabet blocks I played with when I was an infant.

One theory suggests that artists and criminals have a lot in common psychologically, such as a disdain for the rules of normalcy and often a primal rage, one which is expressed with creation in one case and violence in the other.
(via FmH)

Read: The Childhood Origins of Terrorism, an essay which suggests that Islamic Fundamentalist extremism and terrorism are products of abusive child-rearing practices found in parts of the Middle East, and the resultant culture of self-loathing and violent absolutism, and draws comparisons between the psychologies of terrorists and serial killers.
Which sort of makes sense, though sounds suspiciously Freudian in places.
(via FmH)

At the very moment of impact: do your best. Every deity and the spirits of your dead comrades are watching you intently. Just before the collision it is essential that you do not shut your eyes for a moment so as not to miss the target. Many have crashed into the targets with wide-open eyes. They will tell you what fun they had.

A sobering interview with a professor of psychology who has studied the psychology of genocide, or why ordinary people commit atrocities. Eye-opening, and not optimistic; basically, his thesis is that the line between everyday civilisation and genocide is rather thin; that it's very easy to dehumanise an "enemy", thus enabling ordinary people to commit atrocities against their now no-longer "human" foes and still see themselves as good people, and that genocide is not, as is widely believed, an intrinsically male specialisation.

And in the Holocaust you had incidences of this, too -- I'm thinking of Jan Gross' book, entitled "Neighbors," about a small village in Poland named Jedwabne where the Catholic half of the village killed the Jewish half simply because they were given permission to do so. You realize how thin this veneer of civilization is that we put up. We say we live as neighbors and in a community, but when something happens structurally that says now you have permission to persecute, to take from, to even kill people that you've lived with for years, the relative ease with which people can do that is incredible.

It's one thing to understand killing, but killing with brutality and killing with zest and killing by taking trophies as American soldiers did with massacres of American Indians, is another thing. Why is that necessary? You'll even notice that in executions throughout World War II, the person's back is always toward the executioner. There really is no logistical reason for that in terms of ease of killing, it's more just a psychological defense of not having to see the victim.

And, according to Waller, genocide and other atrocities are likely to increase as more people compete for fewer resources.

A psychology study has found evidence that alcohol makes you more attractive -- at least when someone else is drinking it.
In a study of 120 male and female students, researchers found that two pints of beer increased the perceived attractiveness of the opposite sex by about 25%.
This is hypothesised to be a result of alcohol stimulating the nucleus accumbens, which is responsible for judging facial attractiveness.

When memes compete for mindshare in the ideosphere, one of the things they're selected for is emotional impact. The most sensational story wins, as does the most disgusting urban legend, according to this paper.
(via FmH)

(Which all makes sense; by the same token, there are other (so far anecdotal) laws of memetics.
For example, it has been observed that urban legends that mention a "brand" of
some category mutate to refer to the best-known brand. (For example, the one about some small fried-chicken restaurant chain supporting the Ku Klux Klan mutated into an urban legend about KFC, and it's probable that the "Albert Einstein said we only use 10% of our brains" UL started as a claim about some lesser known very smart person making that statement.) I'd speculate that this is the result of a selection for economy or consistency with one's existing knowledge/memes, or a streamlining process that erodes memes into more agile forms.)

A study at Imperial College, London has found that eccentrics become more extreme with age. The researchers speculate that this is due to the human nervous system becoming less plastic, and less capable of covering up eccentricities to better fit in; Eliot, from whom I got the link, suggests it may instead be due to people becoming less concerned about others' opinions as they grow old. Though I wonder whether, given that thought and consciousness are physical processes, one is not a physical side-effect of the other.

Researchers in Wales have found that the types of books you read affect your dreams. Adults who read fiction have stranger dreams than those who don't, and are more likely to remember them; meanwhile, fantasy readers have more nightmares and lucid dreams, while romance readers have more emotionally intense dreams.

Ah yes, the SIRC Guide to Flirting, enumerating the rules of the game in anthropological terms.
Useful for Martian scholars of Earth customs, or if you'd like to flirt but the flirting parts of your brain have become rewired for obscure programming languages or train spotting or something like that. Or just read it for the many insights into human psychology that emerge in such a subject:

Research has also shown that men have a tendency to mistake friendly behaviour for sexual flirting. This is not because they are stupid or deluded, but because they tend to see the world in more sexual terms than women. There is also evidence to suggest that women are naturally more socially skilled than men, better at interpreting people's behaviour and responding appropriately. Indeed, scientists have recently claimed that women have a special 'diplomacy gene' which men lack.

The "diplomacy gene" theory makes sense; one thing I've noticed is that, in many close couples, a sort of specialisation develops whereby the woman handles most of the social interaction, even with old friends of her partner.
(via one.point.zero)

And that's what's fascinating about online dating. It reflects the human propensity for choice and classification, and the fact that technology is being molded to meet those propensities. By online dating Darwin might have been disturbed, but he would not have been surprised.

Moreover, the truly innocent are often truly hoodwinked, according to the anonymous author of Saferdating.com, a site with extremely detailed advice and gruesome online dating stories, started by a woman who met her husband through the Internet, but "went through hellish experiences" beforehand. Online dating anecdotes posted on Saferdating.com have titles like "Determining Honesty Is Like Military Intelligence" and "A Horror Story of Cons and Scams."

"Our study showed if people are communicating with someone they believe to be attractive, they edit and rewrite more than if they don't care whether they are impressing them." Walther's chief concern is that email correspondence can lead to a dangerous wish fulfillment for the perfect love. "It is nearly impossible for people to live up to such an artificially high, idealized range of expectations," he noted.

Scientists at University College London have discovered that
making eye contact with an attractive face
stimulates the brain's reward centre.
Eye contact with an unattractive person triggers no such reward, and looking
at a beautiful face without eye contact triggers a much diminished reward.
The reward is not tied to the sex of the face.
This goes some way towards explaining snap judgments made of people, and
the evolutionary basis of evaluating social status and attractiveness.

Tenser, said the Tensor:
Scientists are studying Stuck Tune Syndrome, commonly
known as 'earworms', or the condition in which a melody or song starts repeating
in one's head, becoming impossible to dislodge. A researcher at the
University of Cincinnati is investigating what causes a song to become an earworm:

Kellaris, a marketing teacher who moonlights as a bouzouki player in a Greek band, theorizes that certain types of music operate like mental mosquito bites. They create a "cognitive itch" that can only be scratched by replaying the tune in the mind. The more the brain scratches, the worse the itch gets. The syndrome is triggered when "the brain detects an incongruity or something 'exceptional' in the musical stimulus," he explained in a report made earlier this year to the Society for Consumer Psychology.

The fact that the researcher in question is a marketing teacher, and working
in "consumer psychology", is slightly worrying, making one wonder exactly how
the research is going to be used. (See Egan, Greg, Beyond the Whistle Test.)

A classic example is "If You're Happy and You Know It," he says. The melody in each verse builds sequentially from the previous verse... With each "happy and you know it" line, the melody changes slightly, "but in a predictable way," he says. "It's the same pattern, which makes it more memorable."

Researchers at Glasgow University have found that
advertisements can alter your memory,
making you "remember" happy childhood memories, associated with a brand,
which never actually happened:

One root beer manufacturer, Stewart's, had discovered that many adults appeared to remember growing up drinking their product from bottles.
This was impossible since the company only began full-scale distribution 10 years ago. Before that, Stewart's root beer was available only from soda fountains.
However, the bottles were adorned with slogans such as "original", "old fashioned" and "since 1924", which conjured up images of times gone by.

Poetry by suicidal poets contains telltale signs of suicidal tendencies, in the form of
linguistic patterns. Researchers in the US applied computer analysis techniques
to poetry by poets who took their own lives, including Sylvia Plath, as well as a control
group of non-suicidal poets, and determined that the suicidal poets used many
more singular first-person references, and fewer words associated with
interpersonal communication, thus suggesting detachment and self-absorption.
Surprisingly, emotional words such as "love" and "hate" did not vary
significantly between the two groups.
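The pronoun-counting half of such an analysis is easy to approximate. Here is a toy sketch (the word lists are my own stand-ins; a real study would use a validated text-analysis lexicon with hundreds of entries per category, and proper tokenisation):

```python
import re

# Illustrative word lists only; real lexica are far larger.
FIRST_PERSON_SINGULAR = {"i", "me", "my", "mine", "myself"}
COMMUNICATION_WORDS = {"talk", "share", "listen", "tell", "speak"}

def pronoun_profile(text):
    """Rates (per 100 words) of first-person-singular references
    and of words associated with interpersonal communication."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0, 0.0
    fps = sum(w in FIRST_PERSON_SINGULAR for w in words)
    comm = sum(w in COMMUNICATION_WORDS for w in words)
    return 100 * fps / len(words), 100 * comm / len(words)
```

The study's claim, in these terms, is that suicidal poets score high on the first rate and low on the second relative to the control group.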

A psychological study commissioned by printer manufacturer Lexmark claims that
the fonts you use
reveal your personality. Courier is conservative, used by "old-school"
journalists, Helvetica shows that you're "in touch with contemporary issues"
and Times New Roman shows trustworthiness and compromise between old and new.
And presumably all of the above show that the user is too apathetic to actually
find and install some less overused fonts.
(via Meg)

To find out what it's like to be fat in public,
London-based reporter Angela Bottolph spent a week in a fat suit, and noted down how people reacted to
her differently when she was twice her normal size:

Sunday: I go to Waitrose and buy the healthy-ish food I usually buy. I'm becoming incredibly self-conscious around food in public. I'm too embarrassed to buy Pringles, and I'm dithering by the ice-cream when I realise the entire aisle is looking at me as if I'm an alcoholic on the rampage in an off-licence. I half expect someone to take my basket away and say, 'I think you've had enough already...'

Jane Mackay, former GP and currently artist-in-residence for the Cambridge
University Musical Society, has synæsthesia; a neurological condition which
enables her to see the colours of sounds. She describes this experience in
her own words.

"And my sister and I used to argue about our colours for the days of the week - my Wednesday is a lemony-yellow with angles in the middle of it, hers is green.

"Brian Perkins, the BBC Radio Four newsreader, has an amazingly rich, chocolatey-brown voice. Yet 'Perkins' is a rather wishy-washy yellow-green, so I always forget his surname."

"I had a wonderful sneeze once, from someone sitting behind me in a concert. It was a really lovely turquoise that came across my shoulder in a triangular sheet."

(I once mentally associated letters of the alphabet with colours (A, E and
M were red, B and F were green and C was yellow), though I think that
originated in a set of wooden blocks I had as an infant.)

Researchers at Stanford University have found that
pessimists' and optimists' brains work differently. Optimists' brains
showed stronger reactions to images of rainbows and puppies, whereas pessimists'
brains showed no reaction there, but a reaction to images of angry people,
cemeteries and spiders (albeit not as much in the emotional centres of the
brain).
