13 June 2011

The cast of absurd characters (juxtaposed with the brooding and passive narrator) is fun, but the real star of the book is Murakami's prose (translated by Jay Rubin). The everyday blends seamlessly with the bizarre. Even the mundane is unmistakably sinister. And fairly regularly there is a metaphor or an aside that completely throws you for a loop.

I won't pretend to be able to comment coherently about the plot or what it means. (I'm not sure anyone can. But I certainly can't.) This story is like a dream committed to paper, one of those books that is about the ride rather than the conclusion. A fun read (and I usually stay far away from contemporary fiction). Recommended.

In The Seven Sins of Memory (paperback, Kindle), psychologist Daniel Schacter attempts to characterize the failure modes of human memory. Schacter enumerates seven "sins" of memory: transience, absent-mindedness, blocking, misattribution, suggestibility, bias, and persistence. At a high level, these failures include not being able to recall things when desired, recalling things when not desired, and recalling things incorrectly.

I suppose most of us are keenly aware of the first and second categories of memory errors, because those failures are obvious the moment they happen. The last category, recalling things incorrectly, represents the "silent failure" mode of memory, and is rather more pernicious (and less well understood). The sin of bias, for example, reflects the fact that recalling is an active process. One manifestation of bias is that recollections of the past are suspiciously similar to conditions in the present. For example, "[people] whose views on political issues have changed over time often recall incorrectly past attitudes as highly similar to present ones." Another sin, suggestibility, has major consequences for how we should interpret the testimony of witnesses in court.

Schacter explains each sin in detail and surveys the literature that sheds light on the specifics of each failure mode. Frequently, understanding these specifics suggests useful tips and techniques that can be used to counteract or "work around" the failure. (The tips are also simple— very little is said of memory palaces, Mega Memory, and other very tiring techniques.) These days, the portable electronic devices we carry around are also useful tools for augmenting our memories.

In the last part of the book Schacter speculates on why memory evolved these failure modes.

For example, forgetting the details of events gradually over time (transience of memory) is clearly economical, since we have a greater need for recalling things in the immediate past. So it appears that the brain initially encodes detailed records in memory ("We started the morning with continental breakfast at the hotel...") but gradually replaces the detailed record with the gist of what happened ("I had a great time"). When you are trying to recall an event from long ago, you use the gist of the memory and other techniques (inference, your knowledge of what usually happens under similar circumstances, a wild-ass guess, etc.) to reconstruct the event rather than recalling it directly. The brain uses lossy compression, and for events further in the past it dials up the lossiness! Some sins, like transience, appear to be evolutionary adaptations (i.e. actually useful behaviors), but others are the byproducts of evolutionary design trade-offs. All told, memory is not a poorly implemented and unreliable hack, but rather a pretty decent system given the hardware constraints.
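To push the lossy-compression analogy a bit further (this is my own toy sketch, not anything from the book), you can picture it in a few lines of Python: a memory keeps fewer details as it ages, and recall past a certain age falls back to reconstructing from the gist alone.

```python
# Toy model of memory as lossy compression (purely illustrative):
# a "memory" retains fewer details the older it is, and recall
# beyond a threshold reconstructs the event from the gist.

def compress(details, age_in_years):
    """Keep fewer details as the memory ages (dialing up the lossiness)."""
    keep = max(0, len(details) - age_in_years)
    return details[:keep]

def recall(gist, details, age_in_years):
    remembered = compress(details, age_in_years)
    if remembered:
        return gist + ": " + ", ".join(remembered)
    # No detailed record survives: reconstruct from the gist alone.
    return gist + " (reconstructed from gist)"

details = ["continental breakfast", "walk on the beach", "sunburn"]
gist = "I had a great time"

print(recall(gist, details, 0))  # fresh: gist plus every detail
print(recall(gist, details, 2))  # older: gist plus some details
print(recall(gist, details, 5))  # long ago: gist-based reconstruction
```

Of course, the brain does nothing so linear or tidy; the point is only that "reconstruct from the gist" is cheap to store and usually good enough, which is exactly the economy the book describes.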

This book is ostensibly about memory but, memory being such an integral part of our awareness, there is also plenty about attention, learning, and cognitive biases. This book is chock-full of useful information about one's mind and how to use it just a bit more effectively. Recommended.

The Man Who Tasted Shapes (paperback, Kindle), by Richard Cytowic, is the story of one of the neurologists who brought synesthesia back into the field of view of psychologists. Synesthesia is a neurological condition in which stimulation in one sense leads to involuntary (and consistent) experiences in another sense. (Famous artists believed to be synesthetes include Billy Joel, David Hockney, and Itzhak Perlman.) In some of the more common variants of synesthesia, individual letters on a printed page, or individual musical notes, are involuntarily experienced as having colors. While synesthesia was known and widely studied even in the 19th century, by the mid 20th century the predominant view was that synesthesia was not really interesting to investigate and that the phenomena that synesthetes experienced were just "in their heads". Unfortunately, before long, few doctors were even aware that synesthesia existed, despite the fact that its prevalence is generally estimated to be at least 1 in 2000. Thanks to the work of Cytowic and others, synesthesia is now something that gets covered in an intro psychology course (the one I took, anyway).

The first part of the book is sort of a memoir and a medical detective story, in which Cytowic— working with a couple of patients over a period of years— assembles clues about synesthesia, digs deep into the literature, and begins to conduct experiments to learn more about its genesis. While the dialogues do seem a bit overdramatized and redundant at times, it is a good portrait of how science is done. Not the technology part— the fMRI and other imaging devices are only incidental to the plot— but rather the imaginative part of science: synthesizing theories based on the mass of available evidence.

The conclusion of the "mystery" is a bit anticlimactic, since we still don't have a complete understanding of what mechanism is responsible for synesthesia, just a few tantalizing clues. Cross-talk between regions of the brain responsible for processing different senses has been implicated. Indeed, one hypothesis holds that all newborns are synesthetic but lose that ability over time as connections in the brain are pruned. Interestingly, though, a substantial fraction of people appear to have latent synesthetic abilities. Studies have shown that the incidence of synesthesia among trained meditators is about one hundred times higher than the baseline prevalence! So it appears that synesthesia represents a natural background process of the brain, one that reaches conscious awareness in only a select few people.

This observation is a springboard for the second part of the book, which is a series of essays on "the primacy of emotion". Synesthesia is not an isolated phenomenon in the respect I mentioned: much of what goes on in our brains is not accessible to self-awareness except possibly under extraordinary circumstances. It now seems that the conscious mind is in fact not even in the driver's seat when it comes to planning and execution. (See: Bereitschaftspotential) The essays in this section are a reflection on the implications of this somewhat unsettling model of mind. I enjoyed some of the essays but found just as many of them to be completely opaque.

Despite its flaws I found this to be an enjoyable and eye-opening read. Recommended.

08 June 2011

The Hunger Games (paperback, Kindle), by Suzanne Collins, is a novel set in a future North America. Each year twenty-four teenagers are selected from the twelve Districts and thrown into a wilderness arena to fight each other to the death in a televised event. The Hunger Games are the Capitol's way of demonstrating its absolute power and humiliating the districts. The heroine, Katniss, volunteers to enter The Hunger Games in her sister's stead. Katniss is very clever and quickly figures out what she is going to have to do (and how to play everyone— including but not limited to the other contestants) in order to win.

(Yes, the premise is a lot like Battle Royale; or "Survivor", with killing.)

This is a fun and not-very-long read. It is well paced and there are hardly any lulls in the action. I had trouble putting down this book. Recommended.

The Lifecycle of Software Objects, by Ted Chiang, is a novella centering around sophisticated virtual pets— "digital entities" ("digients") that are positioned as somewhere between pets and children.

The story follows the rise and fall of the startup that creates the digients, as well as some of the programmers and "owners", who become increasingly attached to them.

The storyline and the implications of the technology are well thought out, and you could see it all unfolding the way it's described. But as a whole the story just seemed kind of empty. None of the story threads really resolves at the end, which I would be willing to forgive if the premise of the story were thought-provoking. But I felt like I had seen enough pieces of the plot before that I did not find the questions Chiang raises particularly interesting. Perhaps I am just AI-jaded. Decent airplane reading but little more than that.

In The Birth of Plenty (paperback, Kindle), William Bernstein proposes a framework to help make sense of how civilizations get on the treadmill of sustained economic growth, something that has only been attained in the last 200 years. Why was it the English who managed it first, and not, say, the Chinese or the Muslims, both of whom had a tradition of scientific discovery and mathematical inquiry going back a thousand years or more?

Bernstein argues that four factors are necessary for nations to break out of stagnancy:

1. Property rights, including protection both from the government (i.e. rule of law) and from others (robbers, highwaymen, etc.)
2. Scientific rationality
3. Efficient capital markets (easy access to capital)
4. Useful power (for work and transport) and communications

The core of the book is case studies of the four factors and how they were (or weren't) implemented in various countries. It's history and politics and economics but Bernstein makes it very interesting and explains the relationships between the relevant threads of history.

The last part of the book concerns the modern era and the future. It would be really interesting to be able to use this sort of analysis in order to direct policy decisions, but it seems to me that the four-factor story has become, at least in part, a just-so story. After all, one can read about the tenets and method of science anywhere; capital is easy to come by (if not in your home country, you can obtain it from abroad); machines for work, transport, and communications can be shipped to anywhere in the world. With respect to the latter three factors, the cat is out of the bag. The only place in the modern world where any of those are not readily available is, possibly, North Korea. So it may be that no future natural experiment could assign any explanatory role to factors two, three, and four.

Instead, the first factor plays a key role: if we would merely supply the rule of law, says Bernstein— if people believed that they were not in mortal danger and that the fruits of their labor were safe from seizure— then people would have the motive to innovate (with the other three factors guaranteeing the means and opportunity). Not a new idea, but there is a twist. The four factors are logically independent of democracy; they can be present even under totalitarian regimes. And Bernstein cites some evidence that suggests that it's prosperity, borne of the four factors, that makes people ready for democracy. The causality between prosperity and democracy runs opposite to the direction many people assume.

Now, if we take this narrative seriously, then we really have to rethink the way we approach foreign aid and humanitarianism. Promoting (or installing) democracy in a country won't, by itself, lead to prosperity. And just giving money to poor countries is no help at all, if not worse. In both cases the change amounts to planting a seed in infertile soil. What's needed to bootstrap third-world countries today is, almost invariably, the first factor: property rights, considered very broadly. People need to believe that they and their possessions are secure; they need judges and lawyers. The absence of the requisite social institutions is so corrosive that economies cannot thrive, no matter how the political leaders are chosen or how much money there is lying around. Conversely, Bernstein suggests that once people have attained some critical level of prosperity they tend to lean towards democracy anyway. But that can lag economic prosperity by years if not decades.

This book is well worth reading, even if you take issue with some of Bernstein's analysis. People are becoming increasingly mindful, and they want to know what they can do to make the world a more decent place to live; but no one (except ideologues, I suppose) is so committed to a particular means towards that end that they would persist in advocating something that demonstrably didn't work.

Economic history is complicated, and the scope of this book is very ambitious, but Bernstein manages to make it an easy (and even engrossing at times) read. Recommended.

Disclosure

I'm a software engineer at DNAnexus, Inc. This blog represents the opinion of myself and no one else. Unless specifically noted otherwise, I do not receive free review copies of books or other products mentioned here.