Saturday, December 31, 2016

There are, presumably, genuine jerks in the world. ... They don’t think of themselves as jerks, because jerk self-knowledge is hard to come by.

Psychologist Simine Vazire at the University of California, Davis argues that we tend to have good self-knowledge of our own traits when those traits are both evaluatively neutral (in the sense that it’s not especially good or bad to have those traits) and straightforwardly observable. ...

The question “am I really, truly a self-important jerk?” is highly evaluatively loaded, so you will be highly motivated to reach a favored answer: “No, of course not!” Being a jerk is also not straightforwardly observable, so you will have plenty of room to reinterpret evidence to suit: “Sure, maybe I was a little grumpy with that cashier, but she deserved it for forgetting to put my double shot in a tall cup.” ...

To be a jerk is to be ignorant in a certain way—ignorant of the value of others, ignorant of the merit of their ideas and plans, dismissive of their desires and beliefs, unforgiving of their perceived inferiority. ...

To sharpen our conception of jerkitude, it’s helpful also to consider the jerk’s opposite: the sweetheart. Maybe you know one or two of these people—habitually alert to the needs and interests of others, solicitous of others’ thoughts and preferences, liable in cases of conflict to suspect that the fault might lie with them rather than with the other party. ...

...ironically, it is often the sweethearts who are most worried that they have been acting like jerks...

If the essence of jerkitude is a failure to appreciate the perspectives of others around you, this suggests what might be a non-obvious path to self-knowledge: looking not at yourself but at other people. Instead of gazing into the mirror, turn away from the mirror and notice the colors in which the world seems to be painted. Are you surrounded by fools and non-entities, by people with bad taste and silly desires, by boring people undeserving of your attention...?

If this is how the world regularly looks to you, then I have bad news. Likely, you are the jerk. This is not how the world looks to most people, and it is not how the world actually is. You have a distorted vision. You are not seeing the individuality and potential of the people around you.

Thirty-eight hundred years ago, on the hot river plains of what is now southern Iraq, a Babylonian student did a bit of schoolwork that centuries later would change our understanding of ancient mathematics. The student scooped up a palm-sized clump of wet clay, formed a disc about the size and shape of a hamburger, and let it dry down a bit in the sun. On the surface of the moist clay the student drew a diagram that showed the people of the Old Babylonian Period (1,900–1,700 B.C.E.) fully understood the principles of the “Pythagorean Theorem” 1300 years before Greek geometer Pythagoras was born, and were also capable of calculating the square root of two to six decimal places. ...
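The tablet records the square root of two in base-60 digits (1;24,51,10). How the scribe reached that precision is not known; a common modern reconstruction is the iterative "divide and average" scheme now called the Babylonian method. The sketch below is an illustration of that scheme, not a claim about the tablet's actual procedure:

```python
# A minimal sketch of the "Babylonian method" for square roots:
# repeatedly average a guess with n divided by the guess.
# (Illustrative only; the scribe's actual procedure is unknown.)

def babylonian_sqrt(n, iterations=4):
    x = 1.0  # initial guess
    for _ in range(iterations):
        x = (x + n / x) / 2  # average the guess with n / guess
    return x

# The tablet's sexagesimal value 1;24,51,10 converted to decimal:
tablet_value = 1 + 24/60 + 51/60**2 + 10/60**3

print(round(babylonian_sqrt(2), 6))  # converges toward 1.414214
print(round(tablet_value, 6))        # 1.414213
```

A handful of iterations already matches the tablet's accuracy, which is part of why historians find an iterative procedure plausible.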

“This geometry tablet is one of the most-reproduced cultural objects that Yale owns — it’s published in mathematics textbooks the world over,” says Professor Benjamin Foster, curator of the Babylonian Collection, which includes the tablet. ...

The tablet, formally known as YBC 7289, “Old Babylonian Period Mathematical Text,” came to Yale in 1909 as part of a much larger collection of cuneiform tablets assembled by J. Pierpont Morgan and donated to Yale.

When Walt Disney’s “Bambi” opened in 1942, critics praised its spare, haunting visual style, vastly different from anything Disney had done before.

But what they did not know was that the film’s striking appearance had been created by a Chinese immigrant artist, who took as his inspiration the landscape paintings of the Song dynasty. The full extent of his contribution to “Bambi,” which remains a high-water mark for film animation, would not be widely known for decades.

Like the film’s title character, the artist, Tyrus Wong, weathered irrevocable separation from his mother — and, in the hope of making a life in the United States, incarceration, isolation and rigorous interrogation — all when he was still a child.

In the years that followed, he endured poverty, discrimination and chronic lack of recognition, not only for his work at Disney but also for his fine art, before finding acclaim in his 90s.

Mr. Wong died on Friday at 106. A Hollywood studio artist, painter, printmaker, calligrapher, greeting-card illustrator and, in later years, maker of fantastical kites, he was one of the most celebrated Chinese-American artists of the 20th century.

But because of the marginalization to which Asian-Americans were long subject, he passed much of his career unknown to the general public. ...

Mr. Wong, newly married and needing steady work, joined Disney in 1938 as an “in-betweener,” creating the thousands of intermediate drawings that bring animated sequences to life. ...

A reprieve came in the late 1930s, when Mr. Wong learned that Disney was adapting “Bambi, a Life in the Woods,” the 1923 novel by the Austrian writer Felix Salten about a fawn whose mother is killed by a hunter.

In trying to animate the book, Disney had reached an impasse. The studio had enjoyed great success in 1937 with its animated film “Snow White and the Seven Dwarfs,” a baroque production in which every detail of the backgrounds — every petal on every flower, every leaf on every tree — was meticulously represented.

In an attempt to use a similar style for “Bambi,” it found that the ornate backgrounds camouflaged the deer and other forest creatures on which the narrative centered.

Mr. Wong spied his chance.

“I said, ‘Gee, this is all outdoor scenery,’” he recalled in a video interview years afterward, adding: “I said, ‘Gee, I’m a landscape painter!’”

“Walt Disney went crazy over them,” said Mr. Canemaker, who wrote about Mr. Wong in his book “Before the Animation Begins: The Art and Lives of Disney Inspirational Sketch Artists” (1996). “He said, ‘I love this indefinite quality, the mysterious quality of the forest.’”

Mr. Wong was unofficially promoted to the rank of inspirational sketch artist.

“But he was more than that,” Mr. Canemaker explained. “He was the designer; he was the person they went to when they had questions about the color, about how to lay something out. He even influenced the music and the special effects: Just by the look of the drawings, he inspired people.”

Mr. Wong spent two years painting the illustrations that would inform every aspect of “Bambi.” Throughout the finished film — lent a brooding quality by its stark landscapes; misty, desaturated palette; and figures often seen in silhouette — his influence is unmistakable.

But in 1941, in the wake of a bitter employees’ strike, Disney fired Mr. Wong. Though he had chosen not to strike — he felt the studio had been good to him, Mr. Canemaker said — he was let go amid the lingering climate of post-strike resentments.

On “Bambi,” Mr. Wong’s name appears, quite far down in the credits, as a mere “background” artist. ...

In 2001, in formal recognition of his influence on “Bambi,” Mr. Wong was named a Disney Legend. The honor — whose previous recipients include Fred MacMurray, Julie Andrews and Annette Funicello — is bestowed by the Walt Disney Company for outstanding contributions. ...

When his daughters were small, Mr. Wong encouraged them to make art, as his father had encouraged him. Yet he would not let them have coloring books.

The reason was simple: He did not want his children constrained, he said, by lines laid down by others.

Friday, December 30, 2016

In a phenomenon that may be completely unique to the Korean culture – which comprises some 75 million people who live on the Korean peninsula, and another 7 million in the global diaspora -- only three surnames, Kim, Lee and Park, account for the appellations of nearly one-half of all Koreans. ...

On the whole, according to various accounts, there are no more than about 250 surnames currently in use in Korea (in contrast, in Japan and the Netherlands there are more than 100,000 active surnames in each society). Korea's paucity of surnames and the heavy concentration of a handful of those names are linked to the peninsula's long feudal history and its complex relationships with aggressive neighbors China and Japan. ...

Korean names use Chinese characters, reflecting the Korean aristocracy's adoption of Confucian naming models (i.e., full names) as long ago as the fifth century. Commoners on the peninsula did not have that privilege.

“For much of Korean history, only the elite had surnames,” [Professor of Asian Studies Donald] Baker said. “Those elites tended to adopt surnames that would make it plausible to claim that they had ancestors from China, then the country Koreans admired the most. There were only a few such surnames. So, when commoners began acquiring surnames [later], they grabbed one already in use to bask in the prestige of the families that were already using that surname.” Baker further noted that Korea was an aristocratic society until the modern era, with only a few families at the top of the social ladder. “That limited the number of 'high-prestige' surnames available,” he added. ...

Sung-Yoon Lee, assistant professor of Korean Studies at the Fletcher School of Law and Diplomacy at Tufts University in Boston, told IBTimes that during the late Silla period of Korean history (coinciding with the ninth and 10th centuries of the Christian era), the practice of adopting Chinese-character surnames among the Korean nobility became popular. ...

Eugene Y. Park, Associate Professor of History and Director of the James Joo-Jin Kim Program in Korean Studies at University of Pennsylvania, told IBTimes that by 1392 (the start of the Chosun dynasty), roughly 70 percent of Koreans were using surnames -- meaning everyone but slaves.

By the time the Japanese Empire seized Korea in 1910 (upon the collapse of the Chosun dynasty), most Koreans already had surnames, and those who didn’t simply adopted the surnames of their masters, who had a limited number of names available.

Tuesday, December 27, 2016

In the 1920s major studios sent scouts to spot promising young talent and contract them for years of work. ...

With strict contracts, morality clauses, and minimal child labor laws, studio bosses were able to push child stars at breakneck pace. Garland worked six days per week, sometimes 18-hour shifts of constant singing and dancing to pump out as many movies as possible. To keep her energy up and force her weight down, studios plied her with “pep pills,” amphetamine uppers to keep her perky and alert all day. When she couldn’t sleep, they supplied sleeping pills.

“After four hours they’d wake us up and give us the pep pills again,” said Garland. She was using throughout the filming of Wizard of Oz.

In 1941, at age 19, Garland married composer David Rose. MGM did not approve, and ordered her back to work within 24 hours of the wedding. When she became pregnant, her mother Ethel worked with the studio to arrange for an abortion. She was innocent little Dorothy, after all. The public wasn’t ready to see her as a mother, a grown-up.

Meanwhile, MGM manipulated Garland’s publicity. When she gained weight, she was made to take more speed, while press reps told magazines she ate like a truck driver. Her persona was not her own, and she was given little time to discover herself outside the movies.

Friday, December 23, 2016

Nicholas Kristof: As you know better than I, the Scriptures themselves indicate that the Resurrection wasn’t so clear cut. Mary Magdalene didn’t initially recognize the risen Jesus, nor did some disciples, and the gospels are fuzzy about Jesus’ literal presence — especially Mark, the first gospel to be written. So if you take these passages as meaning that Jesus literally rose from the dead, why the fuzziness?

Tim Keller: I wouldn’t characterize the New Testament descriptions of the risen Jesus as fuzzy. They are very concrete in their details. Yes, Mary doesn’t recognize Jesus at first, but then she does. The two disciples on the road to Emmaus (Luke 24) also don’t recognize Jesus at first. Their experience was analogous to meeting someone you last saw as a child 20 years ago. Many historians have argued that this has the ring of eyewitness authenticity. If you were making up a story about the Resurrection, would you have imagined that Jesus was altered enough to not be identified immediately but not so much that he couldn’t be recognized after a few moments? As for Mark’s gospel, yes, it ends very abruptly without getting to the Resurrection, but most scholars believe that the last part of the book or scroll was lost to us.

Skeptics should consider another surprising aspect of these accounts. Mary Magdalene is named as the first eyewitness of the risen Christ, and other women are mentioned as the earliest eyewitnesses in the other gospels, too. This was a time in which the testimony of women was not admissible evidence in courts because of their low social status. The early pagan critics of Christianity latched on to this and dismissed the Resurrection as the word of “hysterical females.” If the gospel writers were inventing these narratives, they would never have put women in them. So they didn’t invent them.

The Christian Church is pretty much inexplicable if we don’t believe in a physical resurrection. N.T. Wright has argued in “The Resurrection of the Son of God” that it is difficult to come up with any historically plausible alternate explanation for the birth of the Christian movement. It is hard to account for thousands of Jews virtually overnight worshiping a human being as divine when everything about their religion and culture conditioned them to believe that was not only impossible, but deeply heretical. The best explanation for the change was that many hundreds of them had actually seen Jesus with their own eyes.

--Nicholas Kristof and Tim Keller, NYT, on evidence and faith. Other topics covered in the conversation include whether secularism really requires less faith than religiosity, the rationality of believing in miracles, and whether it's unfair that only those with a direct relationship with Jesus go to heaven.

Tuesday, December 20, 2016

Presbyterian reverend Thomas Bayes had no reason to suspect he’d make any lasting contribution to humankind. Born in England at the beginning of the 18th century, Bayes was a quiet and questioning man. ... Yet an argument he wrote before his death in 1761 would shape the course of history. It would help Alan Turing decode the German Enigma cipher, the United States Navy locate Soviet subs, and statisticians determine the authorship of the Federalist Papers. Today it has helped unlock the secrets of the brain.

It all began in 1748, when the philosopher David Hume published An Enquiry Concerning Human Understanding, calling into question, among other things, the existence of miracles. According to Hume, the probability of people inaccurately claiming that they’d seen Jesus’ resurrection far outweighed the probability that the event had occurred in the first place. This did not sit well with the reverend.

Inspired to prove Hume wrong, Bayes tried to quantify the probability of an event. He came up with a simple fictional scenario to start: Consider a ball thrown onto a flat table behind your back. You can make a guess as to where it landed, but there’s no way to know for certain how accurate you were, at least not without looking. Then, he says, have a colleague throw another ball onto the table and tell you whether it landed to the right or left of the first ball. If it landed to the right, for example, the first ball is more likely to be on the left side of the table (a position farther left leaves more space to the first ball’s right for the second ball to land). With each new ball your colleague throws, you can update your guess to better model the location of the original ball. In a similar fashion, Bayes thought, the various testimonials to Christ’s resurrection suggested the event couldn’t be discounted the way Hume asserted.
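The thought experiment above can be simulated directly. The sketch below (a hypothetical illustration; the variable names and parameters are mine, not Bayes') starts from a uniform prior over the first ball's position and updates it after each left/right report, showing the estimate homing in on the true spot:

```python
import random

# Simulate Bayes' ball experiment on a unit-length table.
# (Illustrative sketch; names and parameters are assumptions.)
random.seed(1)
true_position = random.random()  # where the first ball landed (unknown to us)

# Discretize candidate positions; start with a uniform prior.
candidates = [i / 1000 for i in range(1001)]
posterior = [1.0] * len(candidates)

for _ in range(100):  # the colleague throws 100 more balls
    throw = random.random()
    landed_right = throw > true_position
    for i, p in enumerate(candidates):
        # If the first ball were at p, a new throw lands to its
        # right with probability (1 - p) and to its left with p.
        posterior[i] *= (1 - p) if landed_right else p

total = sum(posterior)
posterior = [w / total for w in posterior]
estimate = sum(p * w for p, w in zip(candidates, posterior))
print(round(true_position, 2), round(estimate, 2))  # the two agree closely
```

Each report only says "left" or "right," yet accumulating many such weak observations concentrates the posterior tightly around the true position, which is the "multiplication of even fallible evidence" point Price drew from Bayes' work.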

In 1767, Richard Price, Bayes’ friend, published “On the Importance of Christianity, its Evidences, and the Objections which have been made to it,” which used Bayes’ ideas to mount a challenge to Hume’s argument. “The basic probabilistic point” of Price’s article, says statistician and historian Stephen Stigler, “was that Hume underestimated the impact of there being a number of independent witnesses to a miracle, and that Bayes’ results showed how the multiplication of even fallible evidence could overwhelm the great improbability of an event and establish it as fact.”

Sunday, December 18, 2016

To find out whether high-end rice cookers truly make a difference in the taste of rice, The Wall Street Journal conducted a blind taste test under the guidance of ricemeister Toyozou Nishijima. Mr. Nishijima is one of about 4,000 rice experts in Japan who have passed a rigorous exam by the Japan Rice Retailers' Association, testing their knowledge as well as their abilities to blend, store and polish rice correctly and identify rice varietals in a taste test.

Using four different rice cookers -- flagship models by Matsushita Electric Industrial Co. Ltd., Hitachi Ltd. and Mitsubishi Electric as well as a three-year-old, less expensive Matsushita rice cooker I use at home for comparison -- we made two cups of rice in each, using standard rice from a grocery store and domestically bottled soft water. A Mitsubishi Electric spokesman had warned us against using expensive European mineral water because its high mineral content gives a flavor that gets in the way of the taste of the rice.

When Mr. Nishijima arrived at our Tokyo office, we put four bowls of rice in front of him. At his request, we provided bottled soft water so he could cleanse his palate between tastings. We served the rice in bowls made of porcelain, a material that's free of any smells that could interfere with the scent of the rice.

For each bowl, Mr. Nishijima first took a close look at the rice, smelled it, and then took a bite, chewing slowly. We didn't tell him which rice cooker brands we used, but he guessed the manufacturers of three of them correctly.

"The taste preferences of the developer that created the original concept for the rice cookers often translate directly into the manufacturers' characteristic," said Mr. Nishijima. ...

The old Matsushita machine, for example, is a quintessential Matsushita rice cooker, he said. The rice was softer and grew sweeter with chewing, a characteristic he said women tended to appreciate more because they tend to chew their rice longer than men. Mr. Nishijima was impressed with the latest high-end Matsushita model because it made very sticky rice with a sweetness that could be tasted right away.

Mr. Nishijima said the rice made by Hitachi's high-end rice cooker was a little too sticky and lacked sweetness. "The rice has absorbed too much water, but it could be preferred by older people, who want softer rice," he said.

As for the Mitsubishi rice cooker, Mr. Nishijima said it made a harder rice that seemed a little dry. While it was difficult to taste the sweetness of the rice right after it cooked, the sweetness normally increases after several hours, he said.

Friday, December 16, 2016

What I was experiencing at that moment was a psychological phenomenon known as subjective time dilation — the strange temporal elasticity that so often accompanies moments of acute danger. The sensation was immortalized in the scene in The Matrix when Neo is able to perceive the passage of bullets through the air around him. ...

But how, exactly, does your brain slow down time? Maybe the same way a machine does. When you take a slow-motion video with your phone, it ramps up the rate at which frames are recorded, then plays them back at the standard number of frames per second. To find out if the brain takes the same approach, researchers at Baylor and the University of Texas enlisted volunteers who were willing to subject themselves to a nerve-shattering experience called the Suspended Catch Air Device at the Zero Gravity amusement park in Dallas. The participants were strapped into a harness, then dropped from a 150-foot-high tower. After a 2.5-second free fall, a net stopped their descent.

To ascertain whether adrenaline made their brains speed up, the scientists had invented a wrist-mounted device they called the “perceptual chronometer.” The LED display shows pairs of numbers. If the numbers are set to change more quickly than a person’s mind can perceive, he or she will see only a blur. The scientists set the perceptual chronometer’s flicker rate just beyond what their subjects could perceive when calm. If the intense fear of free fall ramped up the subjects’ perception rate, they should be able to discern the numbers.

As instructed, the subjects looked at the device as they fell. (For the most part — one subject was so terrified she kept her eyes clamped shut.) They saw — a blur. They were no better able to discern the numbers in free fall than they were when safely on the ground.

What this tells us is that our brains don’t speed up when we’re in danger. Instead, the rush of fear hormones causes the brain to retain richer memories of what’s happening. ...

Since the brain estimates the passage of time by how much information is stored within a given interval, richer memories make it feel like more time has passed.

Of the many oddities that are culturally specific to Japan — from cat cafés to graveyard eviction notices to the infamous Suicide Forest, where an estimated 100 people per year take their own lives — perhaps none is as little known, and curious, as “the evaporated people.”

Since the mid-1990s, it’s estimated that at least 100,000 Japanese men and women vanish annually. They are the architects of their own disappearances, banishing themselves over indignities large and small: divorce, debt, job loss, failing an exam.

“It’s so taboo,” Mauger tells The Post. “It’s something you can’t really talk about. But people can disappear because there’s another society underneath Japan’s society. When people disappear, they know they can find a way to survive.”

These lost souls, it turns out, live in lost cities of their own making.

The city of Sanya, as Mauger writes, isn’t located on any map. Technically, it doesn’t even exist. It’s a slum within Tokyo, one whose name has been erased by authorities. What work can be found here is run by the yakuza — the Japanese mafia — or employers looking for cheap, off-the-books labor. The evaporated live in tiny, squalid hotel rooms, often without internet or private toilets. Talking in most hotels is forbidden after 6 p.m. ...

A shadow economy has emerged to service those who want never to be found — who want to make their disappearances look like abductions, their homes look like they’ve been robbed, no paper trail or financial transactions to track them down.

Nighttime Movers was one such company, started by a man named Shou Hatori. He’d run a legitimate moving service until one night, in a karaoke bar, a woman asked if Hatori could arrange for her to “disappear, along with her furniture. She said she could not stand her husband’s debts, which were ruining her life.”

Hatori charged $3,400 per midnight move. His clientele was vast: from housewives who’d shopped their families into debt to women whose husbands had left them to university students who were sick of doing chores in their dorms.

In most countries, sleeping on the job isn’t just frowned upon; it may get you fired.

But in Japan, napping in the office is common and culturally accepted. And in fact, it is often seen as a subtle sign of diligence: You must be working yourself to exhaustion.

The word for it is “inemuri.” It is often translated as “sleeping on duty,” but Dr. Brigitte Steger, a senior lecturer in Japanese studies at Downing College, Cambridge, who has written a book on the topic, says it would be more accurate to render it as “sleeping while present.” ...

Inemuri has been practiced in Japan for at least 1,000 years, and it is not restricted to the workplace. People may nap in department stores, cafes, restaurants or even a snug spot on a busy city sidewalk.

Sleeping in public is especially prevalent on commuter trains, no matter how crowded; they often turn into de facto bedrooms. It helps that Japan has a very low crime rate. ...

Sleeping in social situations can even enhance your reputation. Dr. Steger recalled a group dinner at a restaurant where the male guest of a female colleague fell asleep at the table. The other guests complimented his “gentlemanly behavior” — that he chose to stay present and sleep, rather than excuse himself.

One reason public sleeping may be so common in Japan is because people get so little sleep at home. A 2015 government study found that 39.5 percent of Japanese adults slept less than six hours a night. ...

Dr. Steger pointed out that closed eyes may not always equal shut-eye: A person may close them just to build a sphere of privacy in a society with little of it.

Thursday, December 15, 2016

We need something extra. The thought reverberated for Sam Koch. In four days, the Baltimore Ravens would face one of the NFL's most explosive punt returners. Koch, a veteran punter in his ninth season, wanted to have a little something extra for him.

So as the Ravens gathered for their Wednesday practice, special-teams coordinator Jerry Rosburg suggested a twist. Let's see if we can fool him. Koch began experimenting. He angled his body toward the right sideline, a pre-snap position that usually indicates the kick's direction, and torqued his hips and right leg toward the left sideline without changing his horizon -- ultimately sending the ball some 40 yards toward the opposite sideline a returner would expect.

"After a few minutes," Rosburg said, "we knew we had something."

The effect was immediate and, without exaggeration, has turned punting strategy in the NFL upside down. Yet almost no one has noticed. ...

Koch punted six times in that initial game, a Week 9 matchup in 2014 with Pittsburgh Steelers returner Antonio Brown on the field. Brown made four fair catches, and the other two punts rolled out of bounds.

That success sparked further attempts to devise unpredictable punts. One year later, Koch has roughly 10 distinctly different kicks in what the Ravens refer to as his "golf bag."

Some are designed to hook toward the sideline with maximum hang time. Others use an intentionally low trajectory to aid coverage teams. He has a knuckler and one kick that drops, from the returner's perspective, roughly in the shape of the letter "S." Two weeks ago, he debuted a "boomerang" punt that does just what you would imagine it might. Most, but not all, of these punts are intended to discourage a clean catch and minimize the return. ...

The NFL changes every day, but there are only a few moments in each generation when it transforms. This is one of them. In plain sight, Sam Koch and the Ravens have introduced a new way to punt.

...we surveyed representative samples of the adult populations in Germany and the U.S. and implemented three randomized experiments on how the provision of information affects support for education spending. ...

We find that a vast majority of the public in both countries underestimates current levels of school spending and teacher salaries. Absent the provision of information, an absolute majority in both countries supports increased government spending on education, with somewhat higher levels of support among Germans than Americans (71 percent vs. 60 percent).

Our first survey experiment shows that citizens of both countries also react similarly to two information treatments, with treatment effects (relative to the control mean) hardly differing. Informing respondents about the current level of annual public education spending per student reduces support for increased spending by more than one quarter (to 50 percent in Germany and 43 percent in the U.S.). Additionally stating that the spending increase would be financed through higher taxation reduces support by more than half compared to the control group (to 30 percent in Germany and 26 percent in the U.S.), with the shares in support no longer differing significantly between the two countries.

When respondents are informed about current salary levels, the share who support increases in teacher salaries declines sharply by about 40 percent (relative to the control mean) in both countries, although baseline support is much lower in Germany. ...

Further analysis confirms that these treatment effects reflect actual information effects, rather than simply the effect of being primed to think about monetary values as opposed to, say, observable conditions in local schools before reporting support for spending increases (Iyengar et al. (1984); Krosnick and Kinder (1990)). In both countries, treatment effects are substantially larger for respondents who underestimated actual levels and are almost zero for respondents who had already been well informed prior to the information treatment.

--Martin West, Ludger Woessmann, Philipp Lergetporer, and Katharina Werner, "How information affects support for education spending: Evidence from survey experiments in Germany and the United States," on why we don't spend more on education

Monday, December 12, 2016

In 1776, whether you were declaring America independent from the crown or swearing your loyalty to King George III, your pronunciation would have been much the same. At that time, American and British accents hadn't yet diverged. What's surprising, though, is that Hollywood costume dramas get it all wrong: The Patriots and the Redcoats spoke with accents that were much closer to the contemporary American accent than to the Queen's English.

It is the standard British accent that has drastically changed in the past two centuries, while the typical American accent has changed only subtly.

Traditional English, whether spoken in the British Isles or the American colonies, was largely "rhotic." Rhotic speakers pronounce the "R" sound in such words as "hard" and "winter," while non-rhotic speakers do not. ...

It was around the time of the American Revolution that non-rhotic speech came into use among the upper class in southern England, in and around London. According to John Algeo in "The Cambridge History of the English Language" (Cambridge University Press, 2001), this shift occurred because people of low birth rank who had become wealthy during the Industrial Revolution were seeking ways to distinguish themselves from other commoners; they cultivated the prestigious non-rhotic pronunciation in order to demonstrate their new upper-class status.

"London pronunciation became the prerogative of a new breed of specialists — orthoepists and teachers of elocution. The orthoepists decided upon correct pronunciations, compiled pronouncing dictionaries and, in private and expensive tutoring sessions, drilled enterprising citizens in fashionable articulation," Algeo wrote.

Friday, December 2, 2016

For social-science nerds, [Michael] Lewis provides the back story to Dr. Kahneman and Dr. Tversky’s most famous papers. But the real drama of “The Undoing Project” — the scenes on the peak-end highlight reel — revolve around Dr. Tversky and Dr. Kahneman, both as individuals and as a creative pair. ...

As often happens in collaborations, one person, fairly or unfairly, wound up getting more credit for the work, and in this case, it was Dr. Tversky. For Dr. Kahneman, this imbalance generated terrible tension and envy. That we know the details of such a close relationship, and its rocky emotional topography, is astonishing. Dr. Kahneman, 82, now at Princeton, seldom speaks to writers. But Mr. Lewis, as we know, is no ordinary author. ...

And what do we learn? That envy really is corrosive. That successful marriages involve, as the psychologist Marcel Zentner discovered, “positive illusions.” That world-famous psychologists can be blind to the needs of those around them. And that even winning a Nobel doesn’t guarantee self-esteem. Late in life, Dr. Kahneman remained a rattling kettle of self-doubt.

In a remarkable note on his sources, Mr. Lewis reveals that for years he watched Dr. Kahneman agonize over his 2011 book, “Thinking, Fast and Slow,” which became both a critical and a fan favorite. “Every few months he’d be consumed with despair, and announce that he was giving up writing altogether — before he destroyed his own reputation,” Mr. Lewis writes. “To forestall his book’s publication he paid a friend to find people who might convince him not to publish it.”

Dr. Tversky never fully understood these fits of doubt. Nor did he see how he made them worse. “I needed to get away,” Dr. Kahneman said. “He possessed my mind.”

Thursday, December 1, 2016

Their formative event, the famous-in-retrospect “Funeral for the ’80s” in Central Park, was not especially well thought out. “It wasn’t like we were really earnest about this,” said Mr. Wink, the group’s big talker despite his years as a silent Blue Man. “It was more like, ‘Let’s put it out of its misery and make way for something new.’”

But he was savvy enough to send a news release to MTV. The V.J. Kurt Loder and a cameraman came along to witness a bunch of blue people carrying a coffin, making portentous pronouncements and setting fake fire to ’80s symbols they found objectionable, including Rambo. The audience: perhaps two dozen. ...

MTV hyped the story, Mr. Goldman said, “and through the magic of editing, made it look like you’d missed the Sex Pistols” if you missed the event.

The ’80s were still not over — this was 1988 — and the Blue Men began refining the Blue Man. ...

In 2010, Mr. Goldman sold his one-third share to GF Capital, a private equity fund.