Friday, September 30, 2011

Brooks, in his latest column, exposes our "convenient" self-image as rational, moral creatures.

"Nobody is against empathy. Nonetheless, it’s insufficient. These days empathy has become a shortcut. It has become a way to experience delicious moral emotions without confronting the weaknesses in our nature that prevent us from actually acting upon them. It has become a way to experience the illusion of moral progress without having to do the nasty work of making moral judgments. In a culture that is inarticulate about moral categories and touchy about giving offense, teaching empathy is a safe way for schools and other institutions to seem virtuous without risking controversy or hurting anybody’s feelings.

People who actually perform pro-social action don’t only feel for those who are suffering, they feel compelled to act by a sense of duty. Their lives are structured by sacred codes."

The team developed ways to scan huge volumes of data to find slight abnormalities — computational biomarkers — that indicate defects in the heart muscle and nervous system. These included looking for subtle variability in the shape of apparently normal-looking heartbeats over time; specific sequences of changes in heart rate; and a comparison of a patient’s long-term ECG signal with those of other patients with similar histories.

They found that looking for these particular biomarkers in addition to using the traditional assessment tools helped to predict 50 percent more deaths. The best thing is that the data is already routinely collected, so implementing the system would not be costly.
- More Here
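The "subtle variability" idea can be illustrated with a toy calculation. This is not the team's actual method — just a sketch of one standard beat-to-beat variability metric (RMSSD over the intervals between heartbeats), with made-up numbers:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats,
    a common measure of beat-to-beat heart-rate variability."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Two hypothetical patients with the same average rate (~75 bpm):
steady = [800, 802, 799, 801, 800, 798, 801]      # small successive changes
variable = [760, 845, 780, 830, 770, 840, 775]    # subtly erratic rhythm

print(round(rmssd(steady), 1))
print(round(rmssd(variable), 1))
```

The two records would look almost identical on a chart of average heart rate, which is why metrics over the full long-term ECG signal can surface risk that traditional assessments miss.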

"You're taught to chase after all the usual brass rings; you try to be on this "who's who" list or that top 100 list; you chase after the big money and you figure out how big your corner office is; you worry about whether you have a fancy enough title or a fancy enough car. That's the message that's sent each and every day, or has been in our culture for far too long — that through material possessions, through a ruthless competition pursued only on your own behalf — that's how you will measure success. Now, you can take that road — and it may work for some. But at this critical juncture in our nation's history, at this difficult time, let me suggest that such an approach won't get you where you want to go; it displays a poverty of ambition."- 44th

"In what is the strongest evidence yet that the genetic material in food survives digestion and circulates through the body, fragments of plant RNA have been found swimming in the bloodstreams of people and cows. What's more, the study by Chen-Yu Zhang of Nanjing University in China and his colleagues shows that some of these plant RNAs muffle gene expression and raise cholesterol levels in mice. The discovery opens up a new way to turn food into medicine: we may be able to design plants that change our genes for the better.

The genetic material in question is microRNA - tiny strands of RNA between 19 and 24 "letters" or nucleotides long. It is found in almost all cells with a nucleus and travels from cell to cell in the blood. Zhang and his colleagues wondered whether all the miRNA strands in our blood are made by our cells - or whether some comes from our food instead."

Even if RNA or DNA does not pass unscathed from food to eater, food can change gene expression in other ways. For example, cosmetics researchers recently suggested that a pill containing a mix of food extracts can influence our genes and boost collagen production in the skin, reducing the appearance of wrinkles (New Scientist, 24 September, p 10).

If Zhang's findings are replicated, we may discover that our blood is swimming with RNA from all kinds of plants. To date, all investigation of this possibility has been motivated by concerns that genes from genetically modified crops could harm health (see "Let's talk about GM crops"). But the new study opens the possibility of designing diets and plants with therapeutic effects.

Psychology research shows that, when people with similar opinions are put together, their views become more radical. In Going to Extremes: How Like Minds Unite and Divide, Cass R. Sunstein, the legal scholar who is now administrator of the White House Office of Information and Regulatory Affairs, reviews a variety of evidence and concludes, “When people talk to like-minded others, they tend to amplify their preexisting views, and to do so in a way that reduces their internal diversity.”
It is true that several respected political scientists have suggested that elites play a larger role in polarization than my analysis would suggest. But those arguments founder on a simple point: Political scientist Gary Jacobson has found that people’s views on politics have not diverged considerably from those of their representatives. This suggests that polarization is not primarily an elite-driven phenomenon. As Bill Galston and Pietro Nivola of Brookings explain, “Polarized politics are partly here, so to speak, by popular demand. And inasmuch as that is the case, undoing it may prove especially difficult.”

"The history of invention is not the history of a necessary future to which we must adapt or die, but rather of failed futures, and of futures firmly fixed in the past. We do not have a history of invention, but instead histories of the invention of only some of the technologies which were later successful."

Almost 30 years ago, the organisational theorist Karl Weick made an observation that campaigners on everything from global warming to homelessness have been ignoring ever since. Sometimes, he pointed out, convincing the world that you're fighting a Very Serious Problem actually makes it harder to solve. In a paper entitled Small Wins: Redefining The Scale Of Social Problems, Weick argued that perceiving challenges as huge made people seize up – disabling "the very resources of thought and action needed to change them". The history of gay rights, feminism and environmentalism, he claimed, showed that pursuing little victories was the better plan. They delivered quick motivation boosts, triggering a snowball effect. Want to change the world? First, stop trying to change the world.

By collecting diary entries from 238 people at seven companies, the authors generated 12,000 person-days of data on moods and activities at work. The striking conclusion is that a sense of incremental progress is vastly more important to happiness than either a grand mission or financial incentives – though 95% of the bosses didn't realise it. Small wins "had a surprisingly strong positive effect, and small losses a surprisingly strong negative one." Which chimes with recent research among US entrepreneurs by the business scholar Saras Sarasvathy: whatever they tell you on TV's Dragons' Den, the successful ones rarely made long-range business plans, and scorned market research. They went for quick wins – a few sales, then a few more – instead. Their philosophy was "ready, fire, aim".

The best gift you can give your kid this Xmas!! More on Amazon Fire here

"It's small, light (just 14.6 ounces), flat, and has an IPS screen and a look and feel that's reminiscent of an iPhone 3GS. Inside, a mobile processor of unknown lineage purrs away at an undisclosed speed, although Amazon felt the need to tell us it's a "dual core" one. The screen has 169 pixels per inch, with a resolution of 1024 by 600 pixels, and Amazon is also bold enough to say the IPS tech that drives it is "similar technology to that used on the iPad," because Apple's made a big thing about its choice of screen tech in the past. It's got 8GB of on-board storage, good for "80 apps, plus either 10 movies or 800 songs or 6,000 books," but its connection to Amazon's cloud content feed means the internal storage size isn't really an issue. The battery is good for eight hours of "continuous reading" or one work or college day, or 7.5 hours of video "with wireless off," and Amazon notes the battery performance will vary with use, such as browsing the web. It has no 3G, but does have a Wi-Fi card that supports up to 802.11n networks, either public or private, but can't do ad-hoc or peer-to-peer wireless connections. And in terms of connecting to a computer, there are zero system requirements "because it's wireless and doesn't require a computer" (a slightly irrelevant dig at Apple's need to tether iDevices to a computer)."

"There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human's body will be cured."
- Richard Feynman

Tuesday, September 27, 2011

After events occur, people tend to rationalize them, creating the illusion that they were expected all along.

The frustrating aspect of Black Swan Theory is that it provokes people to ask when the next black swan will be. This completely misses the point. Asking when the next black swan is or whether or not it is possible to avoid black swans tells me that you don’t understand Black Swan Theory. By definition black swans can’t be predicted or avoided. Yet, people continue to believe that both are possible.

Why?

I think it goes back to the narrative fallacy, which is what the third bullet point highlighted. As Taleb explains, the narrative fallacy describes our “limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them.” I italicize “limited ability” to suggest that the narrative fallacy is unavoidable. We simply cannot assess the past without creating a narrative that fits with the present and projects the future. For better or for worse, this means we will continue to believe that past experience guarantees knowledge of the future. It’s a strange paradox: we will always believe that we have a grip on the future even when we are repeatedly surprised by what it brings us.

Slate has an eye-opening (read "warning") six-part series on robots stealing our jobs (learn probability, statistics and of course programming to make sure we can compete and earn a decent living).

"Artificial intelligence machines are getting so good, so quickly, that they're poised to replace humans across a wide range of industries. In the next decade, we'll see machines barge into areas of the economy that we'd never suspected possible—they'll be diagnosing your diseases, dispensing your medicine, handling your lawsuits, making fundamental scientific discoveries, and even writing stories just like this one. Economic theory holds that as these industries are revolutionized by technology, prices for their services will decline, and society as a whole will benefit. As I conducted my research, I found this argument convincing—robotic lawyers, for instance, will bring cheap legal services to the masses who can't afford lawyers today. But there's a dark side, too: Imagine you've spent three years in law school, two more years clerking, and the last decade trying to make partner—and now here comes a machine that can do much of your $400-per-hour job faster, and for a fraction of the cost. What do you do now?

There is already some evidence that information technology has done permanent damage to workers in a large sector of the economy. This specifically applies to workers who are considered "middle skilled," meaning that they need some training, but not much, to do their jobs.

Middle-skilled jobs include many that are generally recognized to be antiquated—secretaries, administrative workers, repairmen, and manufacturing workers, among others. Since the 1980s, across several industrialized nations (including the United States), the number of workers in these job categories has been rapidly declining (the pace of the decline increased greatly during the last recession). Instead, most job growth has been at the poles, in professions that require very high skills and earn high wages, and in the service sector, where most jobs require few skills and pay tiny wages."

"It is remarkable that probability theory, which originated in the consideration of games of chance, should have become the most important object of human knowledge... The most important questions of life are, for the most part, really only problems of probability."
- Pierre-Simon Laplace

Monday, September 26, 2011

Fareed Zakaria nails it: "I think it's worth taking wage cuts to keep manufacturing jobs in America. You see, if those jobs keep going overseas, we could lose our most important job creator of all: innovation. Take Amazon's trendy reading device, the Kindle. America invented it, but Taiwan makes it. Liveris says foreign manufacturers learn from the process of making stuff, which helps them invent new products themselves. Innovation happens on the factory floor, not just in research labs. What you don't realize is that when you move the making somewhere else, the people who know how to make it have the intellectual know-how to make the next one."

"In their classic review of the relationship between intelligence, creativity, and personality, Frank X. Barron and David Harrington concluded that creative people tend to take more risks and are more impulsive (low Conscientiousness) but they also see themselves as competent and hard-working (high Conscientiousness). This seeming paradox (that creative people are simultaneously both high and low in Conscientiousness) can be resolved by recognizing there are two different aspects of Conscientiousness, each one with opposite correlations with creativity. Creative individuals tend to be more self-focused, independent, and intrinsically motivated (lower dependability/orderliness) while also being more driven, persistent, and gritty (higher industriousness/achievement)."
- More Here

“From the oyster to the eagle, from the swine to the tiger, all animals are to be found in men and each of them exists in some man, sometimes several at the time. Animals are nothing but the portrayal of our virtues and vices made manifest to our eyes, the visible reflections of our souls. God displays them to us to give us food for thought.”
- Victor Hugo

Sunday, September 25, 2011

To state the obvious: this is not a black swan. I had to start a new label before Michael Lewis writes a hilarious postmortem, 60 Minutes has an exclusive, the president gives another angry speech and before the ubiquity of "enlightenment"... so it goes...

Since the beginning of the recession, most types of credit have gone down. The only exception to that rule has been student loans. A recent piece in the Atlantic noted that student debt has grown by 511 percent since 1999. At that time, only $90 billion in student loans were outstanding—by the second quarter of 2011, that balance was up to $550 billion, according to the New York Fed. And the Department of Education estimates that outstanding loans total closer to $805 billion—and that number will pass $1 trillion soon. As student loans rise, so has delinquency. Phil Izzo at the Wall Street Journal reported that 11.2 percent of student loans were more than 90 days past due and that rate was steadily going up. “Only credit cards had a higher rate of delinquency — 12.2 percent — but those numbers have been on a steady decline for the past four quarters,” he noted.

In looking for ways to map the connections of neurons, scientists are turning to viruses – in particular, those that have evolved to infect neurons and to spread from one to another through the synaptic connections between them. One of these is the deadly rabies virus – a specialist in infecting neurons. Rabies is typically transmitted from one infected animal to another through saliva, often via a bite, which releases viral particles that infect peripheral neurons. From there, it spreads backwards into the spinal cord and brain, passing from the initially infected neurons into every neuron that connects to them. This continues in the next neurons, resulting in the rapid spread of infection throughout the entire nervous system.

The fact that the virus can spread from an infected neuron to other neurons connected to it makes it an almost perfect vector for tracing these connections in the brains of experimental animals. To make it perfect required some modifications.

First, using the tools of molecular biology, researchers have modified the genome of the rabies virus, so that, as well as its own genes, the virus now carries so-called marker proteins, like the well-known green fluorescent protein from jellyfish. When ultraviolet light is shone on this protein, it fluoresces, giving off a vivid green light. Neurons infected with the virus (either directly or via synaptic connections) can thus be beautifully visualised.

A major problem, however, is that the rabies virus is too efficient – it continues to spread to all the neurons connected to each of the neurons connected to the first neuron, obscuring the pattern we are interested in. To get around this, the virus had to be crippled by removing one of the genes it needs to spread.

"People weren't getting their jobs through their friends. They were getting them through their acquaintances. Why is this? Granovetter argues that it is because when it comes to finding out about new jobs -- or, for that matter, new information, or new ideas -- "weak ties" are always more important than strong ties. Your friends, after all, occupy the same world that you do. They might work with you, or live near you, and go to the same churches, schools, or parties. How much, then, would they know that you wouldn't know? Your acquaintances, on the other hand, by definition occupy a very different world than you. They are much more likely to know something that you don't. To capture this apparent paradox, Granovetter coined a marvelous phrase: the strength of weak ties. Acquaintances, in short, represent a source of social power, and the more acquaintances you have the more powerful you are."

Saturday, September 24, 2011

"What makes Steve’s methodology different from everyone else’s is that he always believed the most important decisions you make are not the things you do – but the things that you decide not to do. He’s a minimalist."

"Saving the world notwithstanding, the idea that decentralizing animal agriculture leads to greater animal welfare seems sensible enough. But does that mean serious animal suffering will be avoided? Hardly. An animal can be raised with care, fed a healthy diet, lavished with human affection, and kept relatively safe from external threats. However, more often than not, that animal is still going to be slaughtered. Putting aside for now the ethical implications of killing a sentient being in order to eat well, I want to ask a more tangible question: will the animal be slaughtered with compassion?

Please note: the point here is not to embarrass or castigate urban homesteaders who have boldly sought to take control of their own meat supply. Although I personally find the prospect of raising and killing an animal with my own hands to be deeply saddening, I admire these farmers for being deliberate with their lives and bucking the industrial meat system. My point, instead, is to draw on the published blogs of these (mostly urban) pioneers to highlight the rarely publicized fact that when amateurs take charge of the slaughter, the consequences can be anything but compassionate. Nobody can accurately say how representative the following accounts are, but their popularity suggests that, at the least, they’re not exceptional."

"What’s a progressive consumption tax? First of all, it’s not a sales or value-added tax, neither of which takes individual income into account. Those taxes are imposed on the spot when someone buys a good or a service.

Under a progressive consumption tax, taxpayers would report their incomes, much as they do now. They’d also report their annual savings, much as they do for tax-exempt retirement accounts. The tax would be based on “taxable consumption” — the difference between their income and annual savings, less a standard deduction of, say, $30,000 for a family of four. Rates on additional expenditures would start low and rise gradually with taxable consumption.

Because savings would be tax-exempt, the biggest spenders would save more and spend less on luxury goods, leading to greater investment and economic growth, without any need for government to micromanage anyone’s behavior. Consumers in the tier just below, influenced by those at the top, would also spend less, and so on, all the way down the income ladder. In short, such a tax would attenuate the expenditure cascade that has made life for middle-income families so expensive.

Adopting a progressive consumption tax would be like creating wealth out of thin air. Its magical quality stems from the fact that luxury spending is strongly context-dependent, just as antlers are. If everyone spends less, you can still have the biggest mansion — or antlers — on the block, but you’ll also be able to do many other useful things. The money saved could help resolve the current fiscal impasse. And it could also be used to fix roads and bridges and support a host of other genuine improvements.

Changing the tax code in a fundamental way won’t be easy. But as the late Herb Stein, Richard M. Nixon’s chief economist, once remarked, “if something can’t go on forever, it won’t.” The dysfunctional system now in place threatens to destroy our economic future. If we don’t change it now, we’ll have to change it later.

Can anyone imagine anti-government conservatives embracing a progressive consumption tax? Actually, yes. It’s perhaps the only important policy option that could win support from both ends of the political spectrum. A version of it, for example, was proposed in 1995 by Senators Pete V. Domenici, a Republican, and Sam Nunn, a Democrat.

Many conservatives advocate a flat tax — essentially, a national sales tax — but most realize that its adoption is unlikely because it would fall disproportionately on people with lower incomes. If the alternative is to stick with the current income tax, however, a progressive consumption tax begins to look pretty good to many conservatives. And once they understand how it would reshape spending patterns, they often become openly enthusiastic about it.

In 1997, shortly after publishing an article advocating this kind of tax, I received a warm letter from Milton Friedman, widely hailed as the patron saint of small-government conservatism."
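The "taxable consumption" base described in the excerpt reduces to simple arithmetic: income minus annual savings minus the standard deduction, with progressive rates on the remainder. A minimal sketch — the $30,000 deduction for a family of four is from the article, but the bracket thresholds and rates below are purely illustrative assumptions:

```python
def taxable_consumption(income, savings, standard_deduction=30_000):
    """Taxable consumption = income - annual savings - standard deduction.
    The $30,000 deduction figure comes from the article."""
    return max(0, income - savings - standard_deduction)

def consumption_tax(base, brackets=((50_000, 0.10),
                                    (100_000, 0.20),
                                    (float("inf"), 0.40))):
    """Rates that 'start low and rise gradually' with taxable consumption.
    These thresholds and rates are invented for illustration."""
    tax, lower = 0.0, 0
    for upper, rate in brackets:
        if base > lower:
            tax += (min(base, upper) - lower) * rate
        lower = upper
    return tax

# A family earning $120,000 that saves $20,000 is taxed on $70,000
base = taxable_consumption(120_000, 20_000)
print(base, consumption_tax(base))
```

Note the incentive: every extra dollar saved shrinks the tax base dollar for dollar, which is the mechanism the article credits with damping luxury-spending cascades.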

"I believe that banking institutions are more dangerous to our liberties than standing armies. If the American people ever allow private banks to control the issue of their currency, first by inflation, then by deflation, the banks and corporations that will grow up around [the banks] will deprive the people of all property until their children wake up homeless on the continent their fathers conquered. The issuing power should be taken from the banks and restored to the people, to whom it properly belongs."
- Thomas Jefferson

Friday, September 23, 2011

Teens gravitate toward peers for another, more powerful reason: to invest in the future rather than the past. We enter a world made by our parents. But we will live most of our lives, and prosper (or not) in a world run and remade by our peers. Knowing, understanding, and building relationships with them bears critically on success. Socially savvy rats or monkeys, for instance, generally get the best nesting areas or territories, the most food and water, more allies, and more sex with better and fitter mates. And no species is more intricately and deeply social than humans are.

This supremely human characteristic makes peer relations not a sideshow but the main show. Some brain-scan studies, in fact, suggest that our brains react to peer exclusion much as they respond to threats to physical health or food supply. At a neural level, in other words, we perceive social rejection as a threat to existence. Knowing this might make it easier to abide the hysteria of a 13-year-old deceived by a friend or the gloom of a 15-year-old not invited to a party. These people! we lament. They react to social ups and downs as if their fates depended upon them! They're right. They do.

In scientific terms, teenagers can be a pain in the ass. But they are quite possibly the most fully, crucially adaptive human beings around. Without them, humanity might not have so readily spread across the globe. In times of doubt, take inspiration in one last distinction of the teen brain—a final key to both its clumsiness and its remarkable adaptability. This is the prolonged plasticity of those late-developing frontal areas as they slowly mature. As noted earlier, these areas are the last to lay down the fatty myelin insulation—the brain's white matter—that speeds transmission. And at first glance this seems like bad news: If we need these areas for the complex task of entering the world, why aren't they running at full speed when the challenges are most daunting?

The answer is that speed comes at the price of flexibility. While a myelin coating greatly accelerates an axon's bandwidth, it also inhibits the growth of new branches from the axon. According to Douglas Fields, an NIH neuroscientist who has spent years studying myelin, "This makes the period when a brain area lays down myelin a sort of crucial period of learning—the wiring is getting upgraded, but once that's done, it's harder to change."

"In an 1892 game against its archrival, Yale, the Harvard football team was the first to deploy a “flying wedge,” based on Napoleon’s surprise concentrations of military force. In an editorial calling for the abolition of the play, The New York Times described it as “half a ton of bone and muscle coming into collision with a man weighing 160 or 170 pounds,” noting that surgeons often had to be called onto the field. Three years later, the continuing mayhem prompted the Harvard faculty to take the first of two votes to abolish football. Charles Eliot, the university’s president, brought up other concerns. “Deaths and injuries are not the strongest argument against football,” declared Eliot. “That cheating and brutality are profitable is the main evil.” Still, Harvard football persisted. In 1903, fervent alumni built Harvard Stadium with zero college funds. The team’s first paid head coach, Bill Reid, started in 1905 at nearly twice the average salary for a full professor.

A newspaper story from that year, illustrated with the Grim Reaper laughing on a goalpost, counted 25 college players killed during football season. A fairy-tale version of the founding of the NCAA holds that President Theodore Roosevelt, upset by a photograph of a bloodied Swarthmore College player, vowed to civilize or destroy football. The real story is that Roosevelt maneuvered shrewdly to preserve the sport—and give a boost to his beloved Harvard. After McClure’s magazine published a story on corrupt teams with phantom students, a muckraker exposed Walter Camp’s $100,000 slush fund at Yale. In response to mounting outrage, Roosevelt summoned leaders from Harvard, Princeton, and Yale to the White House, where Camp parried mounting criticism and conceded nothing irresponsible in the college football rules he’d established. At Roosevelt’s behest, the three schools issued a public statement that college sports must reform to survive, and representatives from 68 colleges founded a new organization that would soon be called the National Collegiate Athletic Association. A Haverford College official was confirmed as secretary but then promptly resigned in favor of Bill Reid, the new Harvard coach, who instituted new rules that benefited Harvard’s playing style at the expense of Yale’s. At a stroke, Roosevelt saved football and dethroned Yale."

"Climb the mountains and get their good tidings. Nature's peace will flow into you as sunshine flows into trees. The winds will blow their own freshness into you, and the storms their energy, while cares will drop off like autumn leaves."
- John Muir

Thursday, September 22, 2011

Jan Lorenz recently found the same thing. Swiss college students formed a wise crowd when answering questions independently, but once they could find out what their peers had guessed, their answers became more inaccurate. In his summary of the study, Jonah Lehrer wrote, “The range of guesses dramatically narrowed; people were mindlessly imitating each other. Instead of canceling out their errors, they ended up magnifying their biases, which is why each round led to worse guesses.”

Is the crowd doomed to groupthink? Not quite. King found that he could steer them back towards a wiser guess by giving them the current best guess. When this happened, the median returned to a respectable 795. So the crowd loses its wisdom when it gets random pieces of information about what its members think, but it regains its wisdom if it finds out what the most successful individual said.

King says that this mirrors what happens in real life. The crowd may be a social beast, but it isn’t an indiscriminate one. Certain individuals wield disproportionate influence, and groups of soldiers, employees, players and even animals often rely on leaders when they make decisions.

There’s a reason for this. When King provided his volunteers with the best previous guess, their range of answers was narrower with fewer extreme predictions. Their collective answers were also about as accurate in small groups of 10 people as they were in larger ones of 70. King writes, “Copying successful individuals can enable accuracy at both the individual and group level, even at small group sizes.”
- More from Ed Yong Here
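King's two conditions — random social information versus broadcasting the best previous guess — can be caricatured in a toy simulation. Everything here (the error distribution, the 0.8 "pull" toward what each person hears) is an assumption for illustration; only the idea of steering the crowd with the best guess comes from the study:

```python
import random
import statistics

random.seed(1)
TRUE_VALUE = 795  # the figure the steered crowd recovered in the excerpt

# 70 independent guesses: individual errors are large but roughly cancel
crowd = [TRUE_VALUE * random.uniform(0.5, 1.5) for _ in range(70)]

# Social condition: each person hears one random peer and drifts toward
# them, so errors correlate and the range of guesses narrows
social = [g + 0.8 * (random.choice(crowd) - g) for g in crowd]

# Steered condition: everyone hears the current best (closest) guess
best = min(crowd, key=lambda g: abs(g - TRUE_VALUE))
steered = [g + 0.8 * (best - g) for g in crowd]

for label, guesses in (("independent", crowd),
                       ("social", social),
                       ("steered", steered)):
    print(label, round(statistics.median(guesses)))
```

Because the steered condition pulls every guess toward the most accurate one rather than toward an arbitrary peer, its median can only move closer to the true value — a crude version of "copying successful individuals can enable accuracy at both the individual and group level."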

Lying by Sam Harris. A simple click for $1.99 and the euphoria of finishing a book in 40 minutes - the pleasures of Kindle!! Sam Harris delivers as usual, this time much better in a small, power-packed version. A much easier and better read than My Experiments with Truth. I know that was a bad analogy, but this book is more pragmatic, sans the stigma of the Mahatma.
Those "quintessential" white lies start with "how was your day?" and grow into WMDs in Iraq (I highly recommend the 2009 British movie The Invention of Lying). We as a civilization can do much better than that. Are we ready to start our own experiments with truth and to face the consequences? I did, and faced the consequences, but I refuse to give up even after an immense social cost. The returns to my conscience were priceless.

On Lying: Whatever our purpose in telling them, lies can be gross or subtle. Some entail elaborate ruses or forged documents. Others consist merely of euphemisms or tactical silences. But it is in believing one thing while intending to communicate another that every lie is born.
Lying is, almost by definition, a refusal to cooperate with others. It condenses a lack of trust and trustworthiness into a single act. It is both a failure of understanding and an unwillingness to be understood. To lie is to recoil from relationship.

On White Lies: What could be wrong with truly "white lies"? First, they are still lies. And in telling them, we incur all the problems of being less than straightforward in our dealings with other people. Sincerity, authenticity, integrity, mutual understanding - these and other sources of moral wealth are destroyed the moment we deliberately misrepresent our beliefs, whether or not our lies are ever discovered. A white lie is simply a denial of these realities. It is a refusal to offer honest guidance in a storm. Even on so touchy a subject, lying seems a clear failure of friendship.

On Honesty: Honesty is a gift we can give to others. It is also a source of power and an engine of simplicity. Knowing that we will attempt to tell the truth, whatever the circumstances, leaves us with little to prepare for. We can simply be ourselves.
Vulnerability comes in pretending to be someone you are not.
A wasteland of embarrassment and social upheaval can be neatly avoided by following a single precept in life. DO NOT LIE.

What would our world be like if we ceased to worry about "right" and "wrong," or "good" and "evil," and simply acted so as to maximize well-being, our own and that of others? Would we lose anything important?
- Sam Harris

Wednesday, September 21, 2011

"With Greece and Ireland in economic shreds, while Portugal, Spain, and perhaps even Italy head south, only one nation can save Europe from financial Armageddon: a highly reluctant Germany. The ironies—like the fact that bankers from Düsseldorf were the ultimate patsies in Wall Street’s con game—pile up quickly as Michael Lewis investigates German attitudes toward money, excrement, and the country’s Nazi past, all of which help explain its peculiar new status."

"The toilet-paper puppy arrived in 1972, at the suggestion of an executive at the Scott Paper Company's Andrex line in Britain. It soon became one of the most beloved advertising icons in the United Kingdom—such a success that in 2003, a few years after Kimberly-Clark bought out Scott, the company adopted the yellow Labrador as the spokesanimal for its own Cottonelle brand. The little dog on the package was supposed to convey vulnerability and a need for gentle treatment, a company rep told the New York Times.

The brand icons for the major toilet-paper companies have remained fairly constant in recent years. There have been a few minor changes: In 2010, the Andrex puppy received a CGI upgrade, and the Charmin bear was redrawn to show flecks of cartoon toilet paper on its cartoon behind."

Elaborative Encoding: The basic principle is to turn our non-synesthetic brains into synesthesia-driven machines (excellent video on synesthesia here) and to make memories out of vivid imagery. Our memories weren't built for the modern world. Like our vision, our capacity for language, our ability to walk upright, and every other one of our biological faculties, our memories evolved through a process of natural selection in an environment that was quite different from the one we live in today.
The principle underlying all memory techniques is that our brains don't remember all types of information equally well. As exceptional as we are at remembering visual imagery, we're terrible at remembering other kinds of information, like lists of words or numbers. The point of memory techniques is to take the kinds of memories our brains aren't good at holding on to and transform them into the kind of memories our brains are built for.

Remembering Names: Use the Baker/baker paradox. The paradox goes like this: a researcher shows two people the same photograph of a face and tells one of them that the guy is a baker and the other that his last name is Baker. A couple of days later, the researcher shows the same two people the same photograph and asks for the accompanying word. The person who was told the man's profession is much more likely to remember it than the person who was given his surname. Why such different recall for the same word? When we hear that the man is a baker, the fact gets embedded in a whole network of ideas about what it means to be a baker: he bakes bread, wears a big white hat, etc. The name Baker, on the other hand, is tethered only to a memory of the person's face. That link is tenuous, and should it dissolve, the name will float off irretrievably into the netherworld of lost memories. But when it comes to the man's profession, there are multiple strings to reel the memory back in. Even if you don't at first remember that the man is a baker, perhaps you get some vague sense of breadiness about him, or see some association between his face and a big white hat, or maybe you conjure up a memory of a neighborhood bakery. The secret to success in the names-and-faces event (and to remembering people's names in the real world) is simply to turn Bakers into bakers, Foers into fours, or Reagans into ray guns.
Another, less effective, technique is the phonological loop, which is just a fancy name for the little voice we hear inside our head when we talk to ourselves. The phonological loop acts as an echo, producing a short-term memory buffer that can store sounds for just a couple of seconds if we're not rehearsing them.

Remembering Numbers: Use the chunking technique. Chunking is a way to decrease the number of items we have to remember by increasing the size of each item. Chunking is the reason phone numbers are broken into two parts plus an area code, and credit card numbers are split into groups of four. And chunking is extremely relevant to the question of why experts so often have such exceptional memories.
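As a toy illustration (mine, not from the book), chunking a raw credit-card-style digit string into groups of four can be sketched in a few lines of Python - four chunks are far easier to hold in mind than sixteen digits:

```python
def chunk(digits, size=4):
    """Split a string of digits into fixed-size chunks,
    reducing sixteen items to remember down to four."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

print(chunk("4111111111111111"))  # ['4111', '1111', '1111', '1111']
```

The same idea works for phone numbers with a chunk size of three: `chunk("8005551234", 3)` yields groups you can rehearse as units.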

Memory Palace: The Greek poet Simonides, according to the story, left a banquet early after being accosted by a messenger, and the building where he had been celebrating crumbled to the ground the very instant he left, leaving the victims inside unrecognizable. Simonides was able to identify the bodies by mentally walking through the banqueting hall and remembering where each guest had been sitting. By mentally retracing a familiar location, Simonides invented the idea of the memory palace.
A memory palace is a location or series of locations that you know well. You mentally walk through the palace and place the item (or word) you are trying to learn or remember in each location. The location must be a place you know well, for example it could be your house or a city that you know well, depending on what it is you are trying to learn.
When you want to recall the items, you retrace your steps mentally, picking the items up as you go. For language learning, as you learn a word you visualize it in the location you have chosen; the act of visualization helps later recall. The idea is like a smaller version of the movie Inception (those awesome dreams!!): create as many familiar places as memory palaces (it helps if you travel a lot), then destroy and re-create them as the need arises. And yes, it's easier said than done!!

Overcoming the O.K. plateau (complacency): Use the techniques from the speed-typing literature. When people first learn to use a keyboard, they improve very quickly from sloppy finger-pecking to careful two-handed typing, until eventually the fingers move so effortlessly across the keys that the whole process becomes unconscious and the fingers seem to take on a mind of their own. At this point, most people's typing skills stop progressing. They reach a plateau.
In the 1960s, the psychologists Paul Fitts and Michael Posner attempted to explain this by describing the three stages anyone goes through when acquiring a new skill. During the first phase, known as the "cognitive stage," we're intellectualizing the task and discovering new strategies to accomplish it more proficiently. During the second, "associative stage," we're concentrating less, making fewer major errors, and generally becoming more efficient. Finally, we reach the "autonomous stage," when we figure we've gotten as good as we need to get at the task and are basically running on autopilot. During the autonomous stage, we lose conscious control over what we're doing. Most of the time that's a good thing, since the mind has one less thing to worry about.

This is related to Galton's wall (after Sir Francis Galton's 1869 book Hereditary Genius), the argument that a person can only improve at physical and mental activities up to a certain point, which cannot by any education or exertion be overpassed. But psychologists now believe that Galton's wall has much less to do with our innate limits than with what we consider an acceptable level of performance. What separates experts from the rest of us is that they engage in a very directed, highly focused routine called deliberate practice. Experts develop strategies for consciously keeping themselves out of the autonomous stage while they practice by doing three things: focusing on their technique, staying goal-oriented, and getting constant and immediate feedback on their performance. In other words, they force themselves to stay in the cognitive phase.

We have outsourced our memories so effortlessly that I even had to outsource these notes on the art of remembering everything. So the obvious question: why invest in our memory, given the ubiquity of external memories? Foer demolishes that question:

"How we perceive the world and how we act in it are products of how and what we remember. We're all just a bundle of habits shaped by our memories. And to the extent that we control our lives, we do so by gradually altering those habits, which is to say the networks of our memory. No lasting joke, invention, insight, or work of art was ever produced by external memory (not yet, at least). Memory training is not just for the sake of performing party tricks; it's about nurturing something profoundly and essentially human."

Monday, September 19, 2011

"And since 1945 in Europe and the Americas, we’ve seen steep declines in the number of deaths from interstate wars, ethnic riots, and military coups, even in South America. Worldwide, the number of battle deaths has fallen from 65,000 per conflict per year to less than 2,000 deaths in this decade. Since the end of the Cold War in the early 1990s, we have seen fewer civil wars, a 90 percent reduction in the number of deaths by genocide, and even a reversal in the 1960s-era uptick in violent crime.
Given these facts, why do so many people imagine that we live in an age of violence and killing? The first reason, I believe, is that we have better reporting. As political scientist James Payne once quipped, the Associated Press is a better chronicler of wars across the globe than were 16th-century monks. There’s also a cognitive illusion at work. Cognitive psychologists know that the easier it is to recall an event, the more likely we are to believe it will happen again. Gory war zone images from TV are burned into memory, but we never see reports of many more people dying in their beds of old age. And in the realms of opinion and advocacy, no one ever attracted supporters and donors by saying that things just seem to be getting better and better. Taken together, all these factors help create an atmosphere of dread in the contemporary mind, one that does not stand the test of reality.
Finally, there is the fact that our behavior often falls short of our rising expectations. Violence has gone down in part because people got sick of carnage and cruelty. That’s a psychological process that seems to be continuing, but it outpaces changes in behavior. So today some of us are outraged—rightly so—if a murderer is executed in Texas by lethal injection after a 15-year appeal process. We don’t consider that a couple of hundred years ago a person could be burned at the stake for criticizing the king after a trial that lasted 10 minutes. Today we should look at capital punishment as evidence of how high our standards have risen, rather than how low our behavior can sink.

Then there is the scenario sketched by philosopher Peter Singer. Evolution, he suggests, bequeathed people a small kernel of empathy, which by default they apply only within a narrow circle of friends and relations. Over the millennia, people’s moral circles have expanded to encompass larger and larger polities: the clan, the tribe, the nation, both sexes, other races, and even animals. The circle may have been pushed outward by expanding networks of reciprocity, à la Wright, but it might also be inflated by the inexorable logic of the Golden Rule: The more one knows and thinks about other living things, the harder it is to privilege one’s own interests over theirs. The empathy escalator may also be powered by cosmopolitanism, in which journalism, memoir, and realistic fiction make the inner lives of other people, and the precariousness of one’s own lot in life, more palpable—the feeling that “there but for fortune go I.”
Whatever its causes, the decline of violence has profound implications. It is not a license for complacency: We enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to end it, and so we should work to end the appalling violence in our time. Nor is it necessarily grounds for optimism about the immediate future, since the world has never before had national leaders who combine pre-modern sensibilities with modern weapons."

Saturday, September 17, 2011

"Sea turtles interact with a variety of fishing gears across their broad geographic distributions and ontogenetic habitat shifts. Cumulative assessments of multi-gear bycatch impacts on sea turtle populations are critical for coherent fisheries bycatch management, but such estimates are difficult to achieve, due to low fisheries observer effort, and a single-species, single-fishery management focus. We compiled the first cumulative estimates of sea turtle bycatch across fisheries of the United States between 1990 and 2007, before and after implementation of fisheries-specific bycatch mitigation measures. An annual mean of 346,500 turtle interactions was estimated to result in 71,000 annual deaths prior to establishment of bycatch mitigation measures in US fisheries. Current bycatch estimates (since implementation of mitigation measures) are ∼60% lower (137,800 interactions) and mortality estimates are ∼94% lower (4600 deaths) than pre-regulation estimates. The Southeast/Gulf of Mexico Shrimp Trawl fishery accounts for the overwhelming majority of sea turtle bycatch (up to 98%) in US fisheries, but estimates of bycatch in this fishery are fraught with high uncertainty due to lack of observer coverage. Our estimates represent minimum annual interactions and mortality because our methods were conservative and we could not analyze unobserved fisheries potentially interacting with sea turtles. Although considerable progress has been made in reducing sea turtle bycatch in US fisheries, management still needs improvement. We suggest that sea turtle bycatch limits be set across US fisheries, using an approach similar to the Potential Biological Removal algorithm mandated by the Marine Mammal Protection Act."
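The percentage reductions quoted in the abstract can be verified with a few lines of arithmetic (my sketch; the figures are the paper's):

```python
# Pre- and post-regulation estimates from the abstract
interactions_pre, interactions_post = 346_500, 137_800
deaths_pre, deaths_post = 71_000, 4_600

interaction_drop = 1 - interactions_post / interactions_pre
mortality_drop = 1 - deaths_post / deaths_pre

print(f"{interaction_drop:.0%} fewer interactions")  # 60% fewer interactions
print(f"{mortality_drop:.0%} fewer deaths")          # 94% fewer deaths
```

The numbers do round to the ~60% and ~94% reductions the authors report.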

“If you always put limits on everything you do, physical or anything else, it will spread into your work and into your life. There are no limits. There are only plateaus, and you must not stay there, you must go beyond them. If it kills you, it kills you.”
- Bruce Lee

Friday, September 16, 2011

"When you’re looking for a good snooze in warm bedsheets or a cozy couch potato session on the couch, nothing’s quite as nice as scoring that warm spot where the family fuzzball was just powernapping."

"Over the past decades, Americans have developed an absurd view of the power of government. Many voters seem to think that government has the power to protect them from the consequences of their sins. Then they get angry and cynical when it turns out that it can’t"

- More Here and as expected, Brooks will also be promoting Daniel Kahneman's forthcoming book Thinking, Fast and Slow (scheduled to be released October 25th 2011).

Thursday, September 15, 2011

I couldn't stop thinking about dogs (and the Siberian silver fox study) when I began reading Stephen Jay Gould's 1979 essay on the occasion of the 50th anniversary of Mickey Mouse, A Biological Homage to Mickey Mouse. Ironically, I had missed the obvious analogy Gould was drawing between Mickey Mouse and us.

As Mickey's personality softened, his appearance changed. Many Disney fans are aware of this transformation through time, but few (I suspect) have recognized the coordinating theme behind all the alterations--in fact, I am not sure that the Disney artists themselves explicitly realized what they were doing, since the changes appeared in such a halting and piecemeal fashion. In short, the blander and inoffensive Mickey became progressively more juvenile in appearance. (Since Mickey's chronological age never altered--like most cartoon characters he stands impervious to the ravages of time--this change in appearance at a constant age is a true evolutionary transformation. Progressive juvenilization as an evolutionary phenomenon is called neoteny. More on this later.)
A marked slowdown of developmental rates has triggered our neoteny. Primates are slow developers among mammals. We have very long periods of gestation, markedly extended childhoods, and the longest life span of any mammal. The morphological features of eternal youth have served us well. Our enlarged brain is, at least in part, a result of extending rapid prenatal growth rates to later ages. (In all mammals, the brain grows rapidly in utero but often very little after birth. We have extended this fetal phase into postnatal life.)
But the changes in timing themselves have been just as important. We are preeminently learning animals, and our extended childhood permits the transference of culture by education. Many animals display flexibility and play in childhood but follow rigidly programmed patterns as adults. Lorenz writes, in the same article: "The characteristic which is so vital for the human peculiarity of the true man--that of always remaining in a state of development--is quite certainly a gift which we owe to the neotenous nature of mankind."

In short, we, like Mickey, never grow up although we, alas, do grow old.

Time for the perpetual adolescents is curiously static. They are in no great hurry: to succeed, to get work, to lay down achievements. Perhaps this is partly because longevity has increased in recent decades -- if one doesn't make it to 90 nowadays, one feels slightly cheated -- but more likely it is that time doesn't seem to the perpetual adolescent the excruciatingly finite matter, the precious commodity, it indubitably is. For the perpetual adolescent, time is almost endlessly expandable. Why not go to law school in one's late thirties, or take the premed requirements in one's early forties, or wait even later than that to have children? Time enough to toss away one's twenties, maybe even one's thirties; 40 is soon enough to get serious about life; maybe 50, when you think about it, is the best time really to get going in earnest.

At a certain point in American life, the young ceased to be viewed as a transient class and youth as a phase of life through which everyone soon passed. Instead, youthfulness was vaunted and carried a special moral status. Adolescence triumphed, becoming a permanent condition. As one grew older, one was presented with two choices: to seem an old fogey for attempting to live according to one's own standard of adulthood, or to go with the flow and adopt some variant of pulling one's long gray hair back into a ponytail, struggling into the spandex shorts, working on those abs, and ending one's days among the Rip Van With-Its. Not, I think, a handsome set of alternatives.

The greatest sins, Santayana thought, are those that set out to strangle human nature. This is of course what is being done in cultivating perpetual adolescence, while putting off maturity for as long as possible. Maturity provides a more articulated sense of the ebb and flow, the ups and downs, of life, a more subtly reticulated graph of human possibility. Above all, it values a clear and fit conception of reality. Maturity is ever cognizant that the clock is running, life is finite, and among the greatest mistakes is to believe otherwise. Maturity doesn't exclude playfulness or high humor. Far from it. The mature understand that the bitterest joke of all is that the quickest way to grow old lies in the hopeless attempt to stay forever young.

“Whether it’s the pioneer in the Conestoga wagon or someone coming here in the 1920s from southern Italy, there was this idea in America that if you worked hard and you showed real grit, that you could be successful,” he said. “Strangely, we’ve now forgotten that. People who have an easy time of things, who get 800s on their SAT’s, I worry that those people get feedback that everything they’re doing is great. And I think as a result, we are actually setting them up for long-term failure. When that person suddenly has to face up to a difficult moment, then I think they’re screwed, to be honest. I don’t think they’ve grown the capacities to be able to handle that.”

Duckworth’s research convinced Levin and Randolph that they should try to foster self-control and grit in their students. Yet those didn’t seem like the only character strengths that mattered. The full list of 24, on the other hand, felt too unwieldy. So they asked Peterson if he could narrow the list down to a more manageable handful, and he identified a set of strengths that were, according to his research, especially likely to predict life satisfaction and high achievement. After a few small adjustments (Levin and Randolph opted to drop love in favor of curiosity), they settled on a final list: zest, grit, self-control, social intelligence, gratitude, optimism and curiosity.

In fact, though, the character-strength approach of Seligman and Peterson isn’t an expansion of programs like CARE; if anything, it is a repudiation of them. In 2008, a national organization called the Character Education Partnership published a paper that divided character education into two categories: programs that develop “moral character,” which embodies ethical values like fairness, generosity and integrity; and those that address “performance character,” which includes values like effort, diligence and perseverance. The CARE program falls firmly on the “moral character” side of the divide, while the seven strengths that Randolph and Levin have chosen for their schools lean much more heavily toward performance character: while they do have a moral component, strengths like zest, optimism, social intelligence and curiosity aren’t particularly heroic; they make you think of Steve Jobs or Bill Clinton more than the Rev. Martin Luther King Jr. or Gandhi.

Wednesday, September 14, 2011

Incognito: The Secret Lives of the Brain by David Eagleman. Most of us are oblivious to the new Copernican revolution of neuroscience. Yet another eloquently written book to enlighten the masses on the unconscious mind - David Eagleman and Jonah Lehrer are the best in the business, period. Having said that, if one happens to regularly follow the neuroscience blogs and books, at times it gets boring, since the same characters and studies seem to be reiterated (despite their importance) - Damasio's card game, Phineas Gage, Parkinson's/dopamine/gambling, trolley-ology, et al. You get the point, right?

Of course, the most important chapter of the book is on the field pioneered by Eagleman - neuroethics. This will be a game changer, and it's only a matter of time before neuroethics forces the Roe vs. Wade debate into the abyss. Given that nature has abstained from defining uniqueness in our neural network and has gifted it with perpetual neural plasticity, a neural history would probably be the only hope to vindicate or convict, at least during the nascency of neuroethics. It would make immense sense to start a yearly neural checkup akin to physical and dental checkups. Ironically, the dissonance of Cartesian dualism has deluded us for centuries by branding anything neural as taboo. A neural history would also be the seminal force behind Minority Report-style vigilance and would only help bring down medical costs. The question is, are we as a civilization ready to shed our preconceived notions about mental health? So much for neural plasticity!!

Johann Wolfgang von Goethe commemorated the immensity of Galileo's discovery: "Of all the discoveries and opinions, none may have exerted a greater effect on the human spirit... The world had scarcely become known as round and complete in itself when it was asked to waive the tremendous privilege of being the center of the universe. Never, perhaps, was a greater demand made on mankind, for by this admission so many things vanished in mist and smoke! What became of our Eden, our world of innocence, piety and poetry; the testimony of the senses; the conviction of a poetic religious faith? No wonder his contemporaries did not wish to let all this go and offered every possible resistance to a doctrine which in its converts authorized and demanded a freedom of view and greatness of thought so far unknown, indeed not even dreamed of."

Darwin foresaw the neuroscience revolution when he wrote the following lines in the Origin of Species: "In the distant future I see open fields for far more important researches. Psychology will be based on a new foundation, that of the necessary acquirement of each mental power and capacity by gradation."

Eagleman offers a new perspective on non-human animal consciousness: "So are other animals conscious? Science currently has no meaningful way to make a measurement to answer that question - but I offer two intuitions. First, consciousness is probably not an all-or-nothing quality, but comes in degrees. Second, I suggest that an animal's degree of consciousness will parallel its intellectual flexibility. The more subroutines an animal possesses, the more it will require a CEO to lead the organization. The CEO keeps the subroutines unified; it is the warden of the zombies. To put this another way, a small corporation does not require a CEO who earns three million dollars a year, but a large corporation does. The only difference is the number of workers the CEO has to keep track of, allocate among, and set goals for."

"In every work of genius we recognize our own rejected thoughts; they come back to us with a certain alienated majesty. Great works of art have no more affecting lesson for us than this. They teach us to abide by our spontaneous impression with good-humored inflexibility then most when the whole cry of voices is on the other side. Else tomorrow a stranger will say with masterly good sense precisely what we have thought and felt all the time…"
- Self Reliance by Ralph Waldo Emerson

IBM Intelligent Transportation provides a comprehensive picture of what’s coming down the road at any given time to alleviate congestion, improve traffic management, optimize road capacity, rapidly respond to incidents and enhance the travel experience.

Improves citywide traffic planning and management even where infrastructure is constrained and expansion is not an option

IBM Intelligent Transportation utilizes IBM Intelligent Operations Center to enable real-time communication and collaboration with other city agencies to coordinate actions and resolve issues in an efficient manner.

IBM Intelligent Transportation and IBM Intelligent Operations Center are part of the family of IBM Smarter City Solutions designed to enable cities to deliver exceptional service to their citizens.

Health insurer WellPoint and IBM are set to announce a deal regarding the implementation of the Watson computer system for health care, The Wall Street Journal reports, adding that this would be "the first time the high-profile project will result in a commercial application." While the "exact terms of the agreement weren't disclosed," IBM's Steven Mills tells the paper that the system "could be used in settings as varied as call centers and offices doing engineering and scientific work, and he believes the Watson technology carries the potential to grow into a business generating $1 billion of annual revenue," the WSJ says. WellPoint officials "ultimately want to provide the Watson service more broadly to physicians who treat complicated chronic conditions, and they hope to create an application that could be accessed directly by patients seeking health information," the paper adds. WellPoint Executive Vice President Lori Beer tells the Journal that at present, it's "too soon to tell" whether her firm plans to eventually sell Watson-based services to medical providers.

About Me

I have this "little" 75 lb chocolate-colored guy named Max, and he has been the catalyst for my metamorphosis. Ever since he came into my life, I have been trying to subdue that ape inside me. Blogging is a proclamation of my ignorance to the world, the willingness to learn, and an effort to get rid of my cognitive dissonances.