28 December 2009

The GSM algorithm, technically known as the A5/1 privacy algorithm, is a binary code — which is made exclusively of 0's and 1's — that has kept digital phone conversations private since the GSM standard was adopted in 1988.

But the A5/1 algorithm is a 64-bit binary code, the modern standard at the time it was developed, but simpler than the 128-bit codes used today to encrypt calls on third-generation networks. The new codes have twice as many 0's and 1's.

That last statement, while technically true, is remarkably vacuous—even when compared to other sentences in the same article. To an uninitiated reader it gives no hint as to the relative complexity of the two codes (a 128-bit code being 18 billion billion times harder to guess in the absence of any other vulnerabilities).
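To spell out where that factor comes from (a back-of-the-envelope check, nothing more): each additional key bit doubles the number of possible keys, so a 128-bit key space is 2^64 times larger than a 64-bit one.

```python
# Each extra key bit doubles the keyspace, so going from 64-bit to
# 128-bit keys multiplies the number of possible keys by 2**64.
ratio = 2 ** (128 - 64)
print(ratio)             # 18446744073709551616
print(ratio / 10 ** 18)  # about 18.4 -- i.e., "18 billion billion"
```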

09 December 2009

As someone who took a couple of economics classes (and knows just enough to be dangerous), I find AT&T's continued network troubles to be quite puzzling.

[AT&T] has been criticized by owners of the [iPhone] for delayed text and voice messages, sluggish download speeds and other network problems.

[President and CEO for AT&T Mobility and Consumer Markets Ralph] de la Vega cited the heaviest data users, saying that 40 percent of AT&T’s data traffic came from just 3 percent of its smartphone customers.

But he emphasized that the company would first focus on educating consumers about their data consumption in the hope that doing so would encourage them to cut back, even though they are paying for unlimited data use.

I laughed out loud when I read the headline. AT&T thinks that educating users will get them to consume less data, even if it gives them no incentive to do so. I think this is about as likely to work as encouraging people to emit less carbon while giving them no incentive to do so. The article does say AT&T might be considering a non-constant pricing plan; I hope they realize soon that this is a really good idea.

The unlimited data plan is untenable with today's technology. When you, as a user, actually try to take advantage of your "unlimited" data plan, not only are you limited by the mediocrity of the network, you are making other users very, very sad by congesting the network! Offering unlimited plans only makes sense when you have actually built out sufficient capacity to cover the demand. That's why unlimited long distance calls (on both landlines and cell phones) are a good idea (now), and unlimited data plans aren't.

Carriers and consumers claim to prefer unlimited data plans because they're simple. But that simplicity comes at a huge cost to quality, which I bet many people would be willing to pay some amount to avoid.

Personally, I would welcome pay-per-byte pricing to the wireless industry (though I'm not holding my breath). By making people pay an amount commensurate with their impact on other users, it would avert the tragedy of the commons that is AT&T's network today.

Perhaps more importantly, pay-per-byte also provides the right incentives on the network provider's end. When you pay by the byte, your provider has an incentive to build out capacity, because they want to deliver those bytes to you as fast as possible, so they can free up their resources, so they can push you more bytes, which makes them even more money. Under an unlimited pricing structure, the provider has every incentive to drag their feet. Building out capacity costs them money now, but doing nothing at all doesn't cost them anything until your contract expires.

07 December 2009

William Stein has written a personal account of how he ended up writing Sage (the free software computer algebra system).

For many years, Stein worked on various bits and pieces of mathematical software to satisfy his own research needs. But with just him and a couple of other people working on it, they kept very low expectations for what would eventually become Sage. After all, how could a small group of people match the work of the thousands of engineers and mathematicians who were hired by the proprietary math software companies?

Stein only decided that Sage had to succeed when his license for Magma was terminated, and he realized (1) how insane it was to be dependent on proprietary secrets for math research and (2) how much leverage proprietary software makers had over him and his career:

Isn't it weird that mathematics can be done that way? In 2004, almost everybody in the world doing serious computations with elliptic curves, modular forms, etc., were using Magma. Magma was the industry standard, Magma had won for the foreseeable future. David Kohel and I were a big reason why. And yet what kind of mathematics is it, when much of my work fundamentally depends on a bunch of secret algorithms? That's just insane. [...]

Anyway, John Cannon's email [...] seriously scared me. I wasn't in any way confident that Sage would ever replace Magma for my work and teaching, and I had big plans involving interactive mathematical web pages. These plans were temporarily on hold as I was drawn into Sage. But they were still there. What John did with that email is tell me, in no uncertain terms, that if I was going to create those interactive mathematical web pages, they couldn't depend on Magma. "This is to formally advise you that your permission to run a general-purpose calculator based on Magma ends." I was scared. It was also the first time I saw just how much power John Cannon had over my life and over my dreams. That email was sent on a whim. I hadn't got any official permission to run that Magma calculator for a specific amount of time (just open ended permission). What John made crystal clear to me was that he could destroy my entire longterm plans on a whim. I looked around for other options, and there just weren't any. Sage had to succeed. But still I was certain that it just wasn't humanly possible, given that I had to do almost all the work, with limited funding and time.

26 November 2009

The Diving Bell and the Butterfly is based on the memoir of the same name by Jean-Dominique Bauby. Formerly the editor of Elle, Bauby suffered a stroke that left him with locked-in syndrome, a condition in which he remained conscious but unable to speak or move anything but his left eye. He dictated his book by blinking. (His speech therapist would read the letters of the alphabet in decreasing order of frequency, and he would blink at the letter he wanted to use.)

The title is a reference to Bauby's corporeal imprisonment and to how he escapes from it with his vivid imagination. Director Julian Schnabel does a good job portraying the terror and frustration of Bauby's impotence (some riveting camerawork here, if you can believe that) as well as his fanciful daydreams.

Diving Bell is a moving story of mind over matter and the power of the human spirit. Bauby's condition arouses pity, yet the focus is not on that but on his humanity. We get a view into his wishes, vices, regrets, dreams, and memories. I appreciated the fact that Schnabel doesn't lionize Bauby or overdo the sentimentality. Bauby is a courageous but flawed man, and it is only his warts that make him recognizable as a real person to those of us who have been more fortunate.

I hope you are having a good Thanksgiving and that you have much to be thankful for (yes, even if you are not an American!).

Among my blessings I count my good health; good friends; my best friend; a wonderful family; a job where I feel I can advance not just innovation (i.e. novelty), but progress; and being able to enjoy life in general with few worries.

25 November 2009

After reading Visual Explanations I was intrigued enough to pick up Tufte's classic, The Visual Display of Quantitative Information. Overall it is quite good. With numerous examples, Tufte shows when graphics and tables can be used to illuminate the truth, and he presents some pieces of a theory to govern their design. Tufte also calls out the use of misleading data graphics in newspapers, ads, corporate publications, and other sources.

Some takeaways:

Graphics are frequently considered a crutch to lean on when text is deemed boring; but, when used judiciously, they can actually present and reveal data in a much more information-dense and easily retained manner than text can. To make effective graphics takes statistical training, not just artistic training.

Creating graphics with "integrity" is, in large part, making sure that the relative sizes perceived by the eye are commensurate with the relative sizes in the real data. Tufte displays examples of many tricks that people have used in order to make differences appear more or less significant than they are. And it's not just about how the data are drawn but also often about what data are drawn.

Tufte also presents some general design principles, including:

Use as little ink on the page as you can, within reason. Avoid redundancy and remove inessential elements.

Avoid busy-looking textures and other "chartjunk."

Arrays of small similar graphics ("small multiples") are an elegant way to present multivariate data.

Integrate graphics with the narrative text when possible.

I was struck by Tufte's lament that computer graphics often evoke the thought "Isn't it remarkable that the computer can be programmed to draw like that?" rather than "My, what interesting data." This seems to be no less true in 2009 than when it was written. Given Tufte's opinions about PowerPoint, I do wonder what he would say about Apple's Keynote. PowerPoint presentations are usually merely inane or unattractive; Keynote presentations, with their typical distorted 3-D charts, are often downright misleading.

While I thought most of Tufte's book was valuable, he occasionally appears to make dubious logical jumps and comparisons to prove his point:

He speaks of a 2.2 megapixel grayscale astronomical survey map being subdivided into "2,275,328 rectangles" as if it is equivalent to a table with as many entries. The eye can perceive macro- and micro-structure in a graphic but not every last detail with fidelity. Thus the effective content of the image is much smaller than 2,275,328 elements. Throughout, Tufte shows some odd fascination with numbers like these and seems to labor under the delusion that their exact magnitudes are meaningful.
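A quick sanity check on that number, assuming the binary convention of 2^20 pixels per "megapixel": the 2,275,328 rectangles are simply the image's pixels.

```python
# 2,275,328 "rectangles" is just the pixel count of the survey map,
# which works out to the quoted 2.2 megapixels (1 MP = 2**20 pixels).
pixels = 2_275_328
print(pixels / 2 ** 20)   # ~2.17
```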

Tufte does some rethinking of how box plots, axes, and other graphical elements might look. But he seems to be driven only by his maxim of "reduce ink." In his favorite box plot alternative the different components are barely distinguishable from each other. The simplicity of the resulting graphic does not nearly make up for the fact that the information therein is much less easily perceived.

Still, I recommend this book. Reading it is like flipping through a curated gallery of data graphics designs, many of them strikingly elegant. It is an easy read (I read it in one sitting), but as many of us have to marshal data to make an argument once in a while, we could use some advice on how to do so effectively.

I watched Where The Wild Things Are. I did not enjoy it very much. Max and the beasts are supposed to be a portrait of children, but I just found them to be alternately grating, self-absorbed, and depressing. And whatever worthwhile ideas there were in the film were just lost in a vast sea of inanity (beasts throwing rocks at each other, etc.). It just felt too long. I might have enjoyed a short (or a 48-page picture book) with the same content, though.

The research, published as a featured article in the journal Obesity Research in 2005, showed that people eating from soup bowls that don't empty ate 73 percent more soup than those eating from normal bowls, said Wansink.

When presented with a cornucopia of soup, most subjects were completely oblivious:

[We] brought 62 people in for a free soup lunch [...] we found that those with refillable bowls ate 73% more soup, but did not feel any more full [...] Only 2 individuals ever realized [what] was happening.

01 November 2009

Edward Tufte's Visual Explanations is a complement to his earlier works The Visual Display of Quantitative Information and Envisioning Information (neither of which I have read).

Engineers like myself may benefit most from the first few chapters, in which Tufte presents case studies that demonstrate the importance of data representation in statistical graphics. Whether the graphics obscure or reveal cause/effect can mean the difference between life (as when John Snow identified tainted well water as the cause of the London cholera epidemic of 1854) and death (as when NASA made the catastrophic decision to launch the space shuttle Challenger in cold weather on 28 January 1986). Tufte calls attention to some important issues in data presentation, but I think his criticism of the NASA engineers is not really justifiable (hindsight being 20/20 and all).

Visual Explanations is ostensibly about showing cause and effect graphically, but there are many nuggets throughout that apply to informational graphics more generally. Tufte gives a suite of graphical "design patterns" that are good to keep in mind: eliminating unnecessary clutter, facilitating direct comparisons, using the right amount of contrast, and so on.

The later sections of the book have less specific practical advice but plenty of striking images. The last chapter focuses on the idea of a "visual confection" (Tufte's terminology, I believe)— an image, frequently fanciful, synthesized from smaller images or parts, that tells a coherent story. Perhaps the most poignant of these is the public safety message from The Washington Post, "Why is the Potomac River So Dangerous?", which was later reproduced on a metal sign next to the river.

One takeaway is the notion that a printed graphic is often much more illuminating than a photograph. A photograph can only show you what something looks like. But a cutaway, schematic, or other illustration can show you how something works, by virtue of calling attention to the important details, suppressing the unimportant ones, and indicating cause/effect or the passage of time. Illustrations are an effective and efficient way of conveying an idea directly to the mind's eye.

The book itself is an exemplar of the strategies that Tufte advocates. Illustrations appear inline or near where they are referenced in the text. There are no awkward page breaks that require the reader to flip back and forth between an illustration and the text that describes it. The case studies are bite-sized and easy to digest.

Recommended, and an easy read.

This is one of those books that colors one's perception of the world. For example, I can't help but think that some Google Maps engineers took some of Tufte's advice to heart in their recent visual redesign. Many of the map styles are now more harmonious and suppress unneeded detail and distractions.

One nagging thing that I still don't understand about myself is why I often succumb to well-documented psychological biases, even though I'm acutely aware of these biases. One example is my failure at affective forecasting, such as believing that I will be happy for a long time after some accomplishment (e.g. publishing a new book), when in fact the happiness dissipates more quickly than anticipated. Another is succumbing to the male sexual overperception bias, misperceiving a woman's friendliness as sexual interest. A third is undue optimism about how quickly I can complete work projects, despite many years of experience in underestimating the time actually required. One would think that explicit knowledge of these well-documented psychological biases and years of experience with them would allow a person to cognitively override the biases. But they don't.

The limitations of human cognition are sobering, and sometimes saddening.

"They Write the Right Stuff" (2007) is a Fast Company article about the methodology of the group that writes the on-board software for the space shuttle. The shuttle software has one of the lowest defect rates known of any large project, and it's all due to the process and the culture that surrounds its development. Some of the most interesting takeaways:

Specifications. People who work together on the shuttle software have to be absolutely sure that they are on the same page, that they agree about every aspect of what every part of the software will do. Before any code is written, the requirements for the on-board software are documented in excruciating detail (currently, 40,000 pages of specs for 420,000 lines of code). Compared to software in industry, the discipline needed here seems superhuman. You cannot dive into coding until you understand precisely what needs to be done. There is no "let me build a prototype and see how it works"; no unnecessary hacks or flourishes in the code; no rock-star programmers.

Continuous improvement. Every time an error is discovered, the team doesn't just fix the error. They document the circumstances surrounding the bug and its discovery; identify how the development process allowed the bug to creep in, and amend it to prevent future occurrences; and look for latent errors that have the same source.

This process costs a lot of money, and it's not fast. But for the software that controls billions of dollars worth of equipment and (in part) determines whether astronauts live or die, it's probably worth it.

16 October 2009

Some big newspapers are still not very good at this whole "web site" thing. As someone who reads more blogs than newspapers, I am particularly jarred sometimes by their misuse of hyperlinks. Two examples:

"Thanks to Twitter/all tweeters for fantastic support over past 16 hours! Great victory for free speech," Mr. Rusbridger wrote on his Twitter feed. Words tied to the case were among the most mentioned on Twitter Tuesday.

Even so, this article does not contain a single hyperlink.

The New York Times's "Bits" blog errs on the other side, having too many hyperlinks. They recently ran an article about the launch of Google Wave (29 Sep 2009). Unlike in the WSJ, there are useful hyperlinks to primary sources, such as blog posts. But the text is peppered with additional useless hyperlinks, like "Google" and "Microsoft" being linked to curated "More information about this company..." pages (example).

I find this practice borderline sleazy, and only a little bit better than those advertisements that show up in bubbles when you mouseover highlighted words on certain websites. Why? Because it violates the usual conventions between writer and reader. If words in a block of prose are hyperlinked, the target of the link is assumed to be relevant to the matter under discussion. (Paul Grice called this the Maxim of Relation.) A company history hardly contributes anything to my immediate understanding of the article's subject. When conventions like these are broken it makes it more difficult to identify, and extract information from, the real substantive links.

The solution? Put the links in a sidebar. Bullets under a heading such as "Companies mentioned in this article" would make the context and intent crystal clear.

10 October 2009

L.A. Confidential is a great crime drama. The period (1950's) atmosphere is cute and the screenplay is very good. What really makes the film is Guy Pearce's character. His political scheming is oddly compelling and, since he is sort of a nerd, I really wanted to root for him.

My Neighbor Totoro is supposed to be the iconic Miyazaki film, but I didn't love it. There were some good elements (the interplay between the children is adorable) but overall it seemed very disjointed and random at times. Of course, maybe I would have interpreted everything differently if I had known, at the time that I was watching it, that the film is an allegory in which the Totoro is the grim reaper...

17 September 2009

Beatles Rock Band has gameplay that is basically unchanged from that of Rock Band 2. However, it's the music that really steals the show. Many of the songs are just strikingly beautiful. I've been trying to get my hands on some of the Beatles' albums.

I've heard the Beatles before, of course, but I don't think I've ever listened to the Beatles much before last week.

Surprising fact of the week:

Nonetheless, The Beatles, with 28 million albums sold since 2000, may overtake Eminem's 32 million to become the decade's best-selling artist. That race ends Dec. 31.

13 September 2009

Norman Borlaug was an agricultural scientist recognized as the father of the Green Revolution, the technological transformation of agriculture in the second half of the 20th century. For his work in expanding the world's food supply, in particular creating high-yield varieties of wheat, Borlaug was awarded the Nobel Peace Prize in 1970.

30 August 2009

The main driver for the story is the use of "bobbles," stasis fields that suspend the flow of time inside, while being totally impervious to outside forces. The plot is set fifty million years in the future, when a few small groups of humans emerge from their bobbles, surprised to find themselves on an otherwise uninhabited Earth. But now that it is incumbent upon these few to repopulate the planet, there are power grabs, and politics, and class warfare, and a murder mystery.

I enjoyed reading Marooned in Realtime. Vinge paints a world where a few people have incredible power over space and time (though they are far from omnipotent). I thought a couple of aspects were particularly interesting: humans becoming increasingly reliant on augmented cognition (ahem, internet access?), and what happens to people when they can live for decades— or millennia— alone.

(And yet, even in the bizarre new era of the story, some of the human conflicts are still very recognizable.)

However, I thought that the conclusion degenerated into a bunch of clichés and didn't really leave me with anything satisfying. Still, the novel is an eye-opener.

26 August 2009

Magic Cube 4D is software simulating a four-dimensional analogue of the Rubik's cube.

Magic Cube 5D is... I won't insult your intelligence. Like the 3-dimensional Rubik's cube, it comes in a variety of sizes. And Levi Wegner has performed an extraordinary feat:

Levi Wegner recently solved the 6^5 puzzle! This monster has 12,960 stickers, and it took him 24 days, averaging roughly six hours per day. It is a good thing that the program supports macros otherwise this 1.9 million twist solution would have been essentially impossible.

This is so awesome as to pretty much defy comprehension.
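Incidentally, the quoted sticker count is easy to verify: a 5-dimensional cube has 2 * 5 = 10 facets, and on a size-n puzzle each facet is an n^4 grid of stickers. A two-line sketch:

```python
# A d-dimensional cube has 2*d facets; in 5D that's 10. On a size-n
# 5D puzzle, each facet is an n**4 hyper-grid of stickers.
def stickers_5d(n):
    return 10 * n ** 4

print(stickers_5d(6))  # 12960 -- matches the quoted sticker count
print(stickers_5d(7))  # 24010 -- the still-unsolved larger puzzle
```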

The 7^5 puzzle remains unsolved, if you want to make a name for yourself.

Or you can try the Magic120Cell, which looks like an explosion at a dodecahedron factory:

24 August 2009

David Pogue reviews some new digital cameras that do better image processing on low-light images. Which is already a killer feature, but there's more:

The Sony performs two other stunts that will make your jaw drop. From its much larger, zoomier cousin, the HX1, the WX1 inherits Sweep Panorama mode. As you whip the camera in an arc around your body, it quietly snaps 10 consecutive photos, figures out how to connect them, and spits out a finished 270-degree panorama. Talk about wide-angle!

[...]

The Sony's other great trick is capturing 10 images in a one-second burst, a capture speed that puts most other pocket cams to shame. Unfortunately, the camera locks up for 18 seconds afterward, as it processes all those shots.

22 August 2009

The BBC has a feature on the history of Unix, in celebration of its 40th anniversary this month. In 1969, Ken Thompson and Dennis Ritchie had been meaning to write a new operating system for a while, but it wasn't until August that Thompson got some time to himself. Then he hammered out the core of Unix in one month:

[In] August 1969, Ken Thompson's wife took their new baby to see relatives on the West Coast. She was due to be gone for a month and Thompson decided to use his time constructively — by writing the core of what became Unix.

He allocated one week each to the four core components of operating system, shell, editor and assembler.

20 August 2009

I ordered a CD and was pleasantly surprised to learn that it came in a paper (biodegradable!) case. And not a simple sleeve, either— a pretty fashionable case with a cutout for the disc, and an album booklet glued in.

Of course, Amazon shipped this form factor in a bubble wrap envelope, while I believe they ship most CDs in a cardboard mailer. Oh well. You can't win them all.

18 August 2009

In high school I was taught that in order to keep the creative juices flowing while writing, you had to always keep your hands moving— either writing about trivialities or even just scribbling, if you were really drawing a blank.

Perhaps that is the reason I developed this weird habit of tapping Ctrl, Shift, Alt, and all the other no-op keys I can find on any given keyboard.

(If you were curious, and I really doubt you were, I also wave my hands around when I draw a blank while speaking.)

16 August 2009

Phil Schiller responded to Daring Fireball after the controversy in which Apple pressured app makers to remove profanity from a dictionary app. Daring Fireball seemed impressed:

This is music to my ears. That Schiller was willing to respond in such detail and length, on the record, is the first proof I’ve seen that Apple’s leadership is trying to make the course correction that many of us see as necessary for the long-term success of the platform. The improvement I consider most important is a significant focus on fairness, consistency, and common sense in the App Store review process.

(Emphasis mine.) This optimism stems from what I can only assume is a colossal failure of imagination.

This sort of stuff doesn't happen every six months just because there is some misunderstanding about Apple's policies or some level of inconsistency in their enforcement. In the world of technology, policy is basically worthless because tech tends to change faster than policies can anticipate.

The decisive factor here is Apple's technological powers.

Apple will mess with your phone if (1) it has the technological power to do so and (2) if the expected present value of doing so is positive. It's hard to measure criterion (2) but easy to measure criterion (1). If you are the kind of person who doesn't want Apple to mess with your phone, buy a phone that Apple cannot mess with.

11 August 2009

Star Trek II: The Wrath of Khan is quite good. It's pretty clever and well-written. And unlike many of the Star Trek films that have come since, it has some memorable characters (even the villain!). William Shatner and Leonard Nimoy make a great pair onscreen.

Dark City is highly reminiscent of The Matrix (which it preceded, by a year). The story is fairly intriguing but the acting is so-so. The two things really worth seeing are the cool (and dark) visual style and Kiefer Sutherland playing a mad scientist type.

08 August 2009

I got a new road bike a couple of months ago (a Mercier Galaxy), and I've been quite happy with it, especially because now it's just so fun to go outside and get some fresh air and exercise.

There is something oddly satisfying about getting into the "zone" and in proper bike pedaling rhythm (about 90-100 min⁻¹, if you wanted to know). And the agility of the bike, when compared to the old mountain bike I've been riding around since grade school, is positively exhilarating.

(Tangentially related: 100 min⁻¹ is also the AHA-recommended rate for performing chest compressions during CPR. It's easy to get this right because it's also the beat of the Bee Gees' "Stayin' Alive". Who needs a cadence meter?)

05 August 2009

One passage from Moneyball (review) really struck me. Bill James, one of the first real advocates of the statistical approach to baseball, remarked on the difficulty of getting traction with his ideas:

Seven years into his literary career, in the 1985 Baseball Abstract, James formally gave up any hope that baseball insiders would be reasonable. "When I started writing I thought if I proved X was a stupid thing to do that people would stop doing X," he said. "I was wrong."

Perhaps this is just the way we are wired to think about things, but it is what it is. Advocates for all causes would do well to remember that the most effective arguments are a mixture of not only evidence but also some combination of flattery, repetition, inspiration, subtlety, awe, live demonstrations, and/or fear.

Moneyball: The Art of Winning an Unfair Game, by Michael Lewis, is a book about Billy Beane, general manager of the Oakland Athletics, and how under his management the A's became one of the top teams in MLB on a budget that was a fraction of that of many other teams.

Billy Beane's secret weapon is math. And this puts him at odds with baseball's conventional wisdom. The A's scouts are always second-guessing the work of Beane and his assistant, Paul DePodesta. The scouts travel around the country and recommend players on the basis of what comes down to good looks and wishful extrapolation. Beane and DePodesta, on the other hand, have accumulated massive amounts of data about potential players. They can characterize every aspect of a player's performance and they know how to put a price on it. But they can't explain this to the old guard:

No one in big league baseball cares about how often a college player walks; Paul cares about it more than just about anything else. He doesn't explain why walks are important. He doesn't explain that he has gone back and studied which amateur hitters made it to the big leagues, and which did not, and why. He doesn't explain that the important traits in a baseball player were not all equally important. That foot speed, fielding ability, even raw power tended to be dramatically overpriced. That the ability to control the strike zone was the greatest indicator of future success. That the number of walks a hitter drew was the best indicator of whether he understood how to control the strike zone. Paul doesn't say that if a guy has a keen eye at the plate in college, he'll likely keep that keen eye in the pros. He doesn't explain that plate discipline might be an innate trait, rather than something a free-swinging amateur can be taught in the pros. He doesn't talk about all the other statistically based insights— the overwhelming importance of on-base percentage, the significance of pitches seen per plate appearance— that he uses to value precisely a hitter's contribution to a baseball offense. He doesn't stress the importance of generalizing from a large body of evidence as opposed to a small one. He doesn't explain anything because Billy doesn't want him to. Billy was forever telling Paul that when you try to explain probability theory to baseball guys, you just end up confusing them.

The scouts (and the management of every team except the A's) were looking for personalities. They wanted players who were power hitters and would steal bases (and were, did I mention, good looking?), even though stealing bases doesn't win games. They looked at numbers like RBI and saves, even though RBI doesn't win games. Beane and DePodesta had sophisticated models and they attempted to maximize one thing, the one thing that actually matters: expected runs per dollar, and hence, games won per dollar. So Beane was highly effective in trading away his star (read: overvalued) players and replacing them with unknowns (read: undervalued) who were nearly as good. He knew what traits were important and what deficiencies could be ignored. (Good thing, because, the A's budget being what it was, Beane often had to settle for what were superficially "damaged goods".)

If you can picture all this, consider that Beane managed the team's training and strategy the same way he chose the draft picks. Under Beane's management, the A's had a phenomenal 2002 season, including a 20-game winning streak.

I don't even like baseball (full disclosure: I do like math), and I found this book riveting. Michael Lewis lays out the story very well— even though the book centers around statistics, there is no shortage of the human element here. Moneyball is an easy read, and highly recommended.

(Moneyball is being made into a movie, starring Brad Pitt as Billy Beane!)

25 July 2009

I've made hummus from scratch a couple of times now, and I love it. (I love chickpeas!) In comparison, store-bought hummus is thin, tastes bland, and kind of has a chemical/artificial flavor. Making your own hummus is really easy. Here's my recipe.

There's a lot of room for variation, and people have different preferences along the various dimensions of hummus: consistency, creaminess, spiciness, tartness, etc. The approximate amounts below will get you started; I recommend some experimentation. (This recipe is capable of making a thick hummus, suitable for pita chips as well as sandwiches.)

Instructions: In a food processor, blend the following ingredients:

2 cans chickpeas. I boil these, remove as many of the husks as I care to, and drain them. This softens them and takes some of the salt away.

4 cloves garlic. Toasting these in a pan, dry, for a few minutes takes some of the bite off.

2 tbsp olive oil

1/2 tbsp lemon juice

1/2 tsp sesame oil

salt

pepper (black and/or cayenne)

paprika (I love hot paprika)

a little (on the order of a few tbsp) water

This will not get you all the way there, but you should be able to obtain your ideal hummus by using this as your starting point and performing gradient descent with the last 7 ingredients. (Translation: season to taste.) Be careful not to use too much sesame oil; it can quickly overwhelm the rest of the flavors.

Empty into a bowl and let it sit in the fridge for two hours to let the flavors mix.

22 July 2009

Esther's German Bakery has weißwurst (weisswurst) on their menu, and it's not bad— the best I've found in the States (N = 2). They even serve it with the traditional sweet mustard (as well as a pretzel and radishes).

14 July 2009

I read The Fabric of Reality by David Deutsch. With Brian Greene's similarly-titled book still fresh in my mind, when I read the words "theory of everything" I was expecting a book about the implications of a quantum theory of gravity. But Deutsch's aims are far grander than that. He imagines a "theory of everything" in which we can identify common principles that apply to everything from fundamental physics to biology, computation, and economics.

Deutsch rightly makes a point of trying to change scientific dialogue to center around explanations rather than predictions. After all, any idiot can come up with a theory that is consistent with observed evidence (cough, creationism, cough). Instead, we have to judge theories by their explanatory power. In the "theory of everything" as he imagines it, Deutsch does not picture all knowledge as being explained in terms of fundamental physics, because an explanation which crosses too many levels of abstraction is not illuminating at all. Rather, Deutsch makes the case that we can uncover general principles that connect apparently disparate fields, and that such links are the only way we can really comprehend the universe. He gives examples of principles from quantum physics, evolution, epistemology, and the theory of computation— what he calls "the four strands"— that can help us understand all of the others.

I think the vision is a good one but its execution here is just too muddled. According to Deutsch the multiverse (many-worlds) theory is the only sensible interpretation of quantum mechanics. But he really just hand-waves his way to this conclusion without making any compelling arguments in its favor. Now, I'll grant that there is ambiguity surrounding the "correct" interpretation of QM. But it's still a huge leap from there to claim, as Deutsch does, that the multiverse theory has implications for morality, ethics, and free will. I just don't buy that.

What really bothers me about this book is Deutsch's highly specific and fantastic extrapolation. He asserts vast generalizations of principles such as the Church-Turing thesis, but his justifications seem to be nothing more than wishful thinking. Deutsch does advertise his claims as speculation, but it seems that he is falling into one of the traps he himself warns against: that is, believing that pure logic can make any meaningful statements about the physical universe.

There are many interesting nuggets in this book, and it is certainly intellectually challenging, but on the whole I wouldn't recommend it.

09 July 2009

One of the more overlooked features of the new iPhone 3.0 is support for a new open standard for live video streaming over HTTP, which promises to open up standards-based video broadcasting to a wide audience while giving mobile users an optimized picture as they roam between WiFi and mobile networks. [...]

Essentially, Apple wants a standard for streaming video that anyone can use so that it can continue selling hardware without being either shut out of the market by proprietary software [Microsoft's "smooth streaming" method], or held captive by it...

Oh, Apple doesn't want to be held hostage by proprietary software, huh? How cute.

28 June 2009

I just finished reading The Special Theory of Relativity by David Bohm, based on Bohm's undergraduate lectures at Birkbeck College. Bohm approaches relativity in a somewhat unusual way: he focuses on exposing the implicit assumptions that underlie our common-sense notions of spacetime; it is these assumptions that make relativity seem paradoxical for many people. For that reason I consider it worth reading, even if you've studied relativity before. In addition to the material on relativity itself, Bohm also has some insightful comments about the nature of scientific inquiry as well as about the development of human perception and how we acquire our common-sense notions of spacetime.

Some take-home points:

Lorentz and proponents of the ether theory actually had predicted many of the well-known relativistic effects (e.g. length contraction and time dilation for moving objects). One of Einstein's major contributions was to focus solely on relationships between objects, which are in principle observable, rather than on substances like the ether, which are not.

Relativity of simultaneity (i.e. the fact that observers can disagree on whether events occur at the same time or not) is very counterintuitive and can be thought of as the source of a lot of the apparent paradoxes in relativity. For example, when you measure the length of an object you record where its ends are at the same time, but the notion of "at the same time" is relative. Length contraction occurs because different observers disagree on what it is that should be measured.

It is kind of cute to characterize relativity as saying that "everything is relative," but it is really the things the theory says are not relative that are of interest (e.g. the speed of light, the spacetime interval, and proper time and mass). It is through invariants that we can understand the aspects of a situation that are really fundamental to it and separate them from those that are contingent, or relative to our viewpoint. The idea is like being able to perceive that a table is circular even though it appears as ellipses of various shapes and sizes as we walk around the room. Bohm argues that the same process of inferring invariants is at the heart of both scientific inquiry and human perception.

In GEB, Hofstadter builds up the argument that minds and machines are fundamentally the same, in the sense that both can be represented by mechanical/mathematical rules. In particular, he shows that a sufficiently complex formal system gains the ability to reason and make statements about itself.

On the way, Hofstadter takes a tour through music, art, logic, neurology, computer science, genetics, Zen... you name it. The scope of this book is astounding.

I think the most intriguing theme is the idea of taking a step back and making generalizations (or induction, if you like) about a system. This property is at the core of what we would call intelligence and is commonly believed to be one of the things that separates us from most animals (and from computers). Hofstadter relates an anecdote about the Sphex wasp to argue that animals are just hard-wired to handle a finite repertoire. But it is kind of chilling when you realize that the same is likely true of humans— the only thing that is different is the size of our repertoire:

[The Sphex wasp] has no ability to notice when the same thing occurs over and over and over again in its system, for to notice such a thing would be to jump out of the system, even if only ever so slightly. It simply does not notice the sameness of the repetitions. [...] Are there highly repetitious situations which occur in our lives time and time again, and which we handle in the identical stupid way each time, because we don't have enough of an overview to perceive their sameness?

Hofstadter concludes the book with some speculation about how the mind might be "implemented" in hardware, i.e. neurons (for example, how high-level pattern recognition happens, and how features in the mind might be represented in hardware) and how we might begin to understand the physical basis for high-level features of the mind (intentions, emotions, etc.). You might find this part interesting even if you skimmed over (or wanted to skim over) the more mathematical content of the book. GEB was written 30 years ago but much of it is still relevant— little of Hofstadter's speculation about neurology has been resolved since.

19 June 2009

Slumdog Millionaire was pretty good (provided you think of it as a modern fairy tale, kind of). It paints quite a stark picture of poverty and class divisions in modern India. The ending doesn't totally hang together but the Bollywood dance number before the credits more than makes up for that.

The Terminator. As it turns out I watched Terminators 1-3 in reverse order, concluding with The Terminator most recently. It was kind of interesting watching the plots get better and the special effects get worse. (It looks like The Terminator was done in stop-motion. That sounds about right for 1984.) The movie was quite enjoyable. Perhaps it just seemed better because, having seen Terminators 2 and 3 already, the original now seemed more pregnant with meaning.

Up. Beautiful concept, top notch visuals. The beginning was poignant and fantastic, but by the middle it had kind of degenerated into "archetypal summer cartoon action movie". Not bad, although it was no WALL-E. But 98% on RT? There must have been something in the water at the screenings.

13 June 2009

I've long suspected that it was no coincidence that the Moon always keeps the same face towards the Earth, but I never actually knew why until yesterday.

Short answer

There is indeed a negative feedback loop that tends to synchronize the Moon's rotation with its orbit. This phenomenon is called tidal locking.

Longer answer

The Earth exerts a tidal force on the Moon, elongating it along the Earth-Moon axis (and compressing it in the perpendicular directions). However, since the Moon resists being deformed to some degree (don't we all?), if it rotates faster or slower than its orbital angular velocity, the axis of elongation runs ahead of, or behind, the Earth-Moon axis. In either case, the Earth's gravity exerts a torque on the Moon that slows down or speeds up its rotation, respectively. Thus the Moon tends to keep the same face towards the Earth.
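As a cartoon of this negative feedback (this is not the real tidal torque law, which depends on the lag angle and falls off steeply with distance; here the braking torque is simply taken proportional to the spin-orbit mismatch, with arbitrary illustrative constants), the spin rate relaxes toward the orbital rate:

```python
# Toy model of tidal locking: a torque proportional to the difference
# between spin rate and orbital rate drives the two together.
# The constant k, the time step, and the rates are all arbitrary.

def spin_history(omega0, orbital_rate, k=0.1, dt=0.1, steps=1000):
    """Euler-integrate d(omega)/dt = -k * (omega - orbital_rate)."""
    omega = omega0
    for _ in range(steps):
        omega += -k * (omega - orbital_rate) * dt
    return omega

# A moon spinning too fast is braked; one spinning too slowly is spun up.
fast = spin_history(omega0=2.0, orbital_rate=1.0)
slow = spin_history(omega0=0.5, orbital_rate=1.0)
print(fast, slow)  # both approach the orbital rate of 1.0
```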

10 June 2009

As part of this year's focus on education, the UN Global Alliance for Information and Communication Technology and Development (GAID) presented the newly formed University of the People, a non-profit institution offering higher education to the masses.

[...]

For hundreds of millions of people around the world higher education is no more than a dream, Shai Reshef, the founder of the University of the People, told reporters. [...] Mr. Reshef said that this University opened the gate to these people to continue their studies from home and at minimal cost by using open-source technology, open course materials, e-learning methods and peer-to-peer teaching.

07 June 2009

I previously
raved about Dan Ariely's Predictably Irrational. So I also
eagerly watched the two TED talks he has given, which cover a lot of the
same ideas presented in his book, but with a couple of novel anecdotes.
They are kind of like the Reader's Digest version of Predictably
Irrational, and I recommend them if you can spare 35 minutes:

Now, in any introductory psychology or economics class you learn a lot about
cognitive failings, or apparent deviations from rationality. I think that
what makes Dan Ariely's books and talks so valuable is that in addition to
pointing out our flaws, he gives advice on how we can work around them in
order to make life better for people. I think the conclusions from these two
TED talks are, in particular, quite important:

We can create more effective institutions if we design them so that they
take into account our cognitive limitations, rather than designing them
under the assumption that we (the users) are beings of perfect rationality.

We could find many better ways of doing things if only we were willing to
test our intuitions with experiments.

01 June 2009

Most of the problems described in the book do not have super subtle roots. When you read about each issue it is easy to understand in retrospect the source of the problem and how to avoid it. Yet, and this is really sobering, even the designers of the Java libraries made many of the mistakes described (the book has lots of Java war stories). And they are still paying for some of them so that they can maintain API compatibility. This book represents a lot of collective experience that Java programmers have acquired over the years. So it's really valuable to have all these tips in one place. In addition, Bloch's writing is fantastically lucid, one of the best among all technical books I've read.

I had read Effective Java before but now that I'm actually working on some big Java projects (for work, not hobby) I think I had a much higher absorption rate than before.

I had the first edition of Effective Java on hand, which was written for Java 1.3. The second edition was published in 2008 and includes new chapters about, among other things, generics, which were introduced in Java 1.5. Conveniently enough, the book's website has the chapter on generics as a free download. Since generics can be sort of tricky to get right I recommend reading that chapter (did I mention it's free?).

24 May 2009

When
You Are Engulfed in Flames is David Sedaris's latest book, and the
first of his works that I've read. A couple of the stories are quite
endearing, especially the ones about Sedaris's close encounters with death,
as well as his quest to quit smoking. But many of the others just strike me
as "Ha ha, look how quirky we were or are." While When You Are Engulfed
in Flames is at least mildly amusing and entertaining throughout, it
only rises to the level of laugh-out-loud funny in a couple of places.

As an aside, I noted that David Sedaris was puzzled by some of the same
things in Japan that puzzled me during my recent trip to Taiwan, namely, (1)
Why is everyone so exceedingly polite? and (2) Why is green salad served at
breakfast?

The
Fabric of the Cosmos: Space, Time, and the Texture of Reality is
Brian Greene's sort-of followup to The Elegant Universe.
Fabric of the Cosmos covers the principles of relativity and quantum
mechanics and the attempts to unify them (with string theory and its
variants). One overarching theme that Greene considers is how all these
different theories have different implications about the true nature of space
and time— are they purely artificial constructions, or are they
fundamental concepts, or are they emergent phenomena arising from something
more fundamental? And considering these alternative theories about the nature
of the universe is a lot more interesting than just asking "Can we smash tiny
particles into tinier particles?" (Although, high-energy physics is indeed
still an important apparatus.)

Having not yet totally forgotten my undergraduate physics classes, I was
pretty impressed by Greene's treatment of relativity and quantum mechanics.
He makes quite lucid analogies that convey the basic principles without much
math. His treatment of string theory and the possibility of extra space
dimensions was pretty enlightening, too.

The part where my eyes started to glaze over was during the chapters in the
middle about the Higgs field and cosmology. At this point it seemed like the
analogies that Greene used were just analogies for the sake of not using
scary terminology. They seemed to me kind of hollow and didn't really provide
any interesting insights about the underlying phenomena.

Overall, Fabric is a well-written guided tour of modern physics and
is worth a read (especially if you have not read The Elegant
Universe). My only complaint is that after reading so much physics
without any math I feel sort of swindled. I feel compelled to go purchase a
proper string theory textbook, which, I suppose, is pretty high praise for
Professor Greene.

23 May 2009

In tasks involving hand-eye coordination (for example, returning a serve in tennis), the brain has to estimate a quantity u based on an observation v. Bayes' rule tells us that the optimal estimate of u's distribution depends on both the prior distribution of u as well as on the evidence v:

P(u | v) ∝ P(u) P(v | u)

Intuitively we know that anytime we make a decision based on evidence, the decision critically depends on the uncertainty associated with the evidence (how "trustworthy" the evidence is). This is actually encoded in the equation above. If the observation contains little information about the actual value, then we put more weight on the prior. In the extreme case, if the distribution P(v | u) is independent of (i.e. contains no information about) u then the estimate is exactly the prior:

P(u | v) ∝ P(u)

But if the evidence tells us a lot, then we put less weight on the prior. In the extreme case, if P(v | u) = δ(v, u), we can actually ignore the prior:

P(u | v) = δ(v, u)

So, we have to integrate the two pieces of information— the prior and the evidence— to make an estimate, while accounting for the reliability of the evidence.
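This prior-versus-evidence weighting has a clean closed form in the special case where both the prior and the likelihood are Gaussian: the posterior mean is an average of the prior mean and the observation, weighted by their precisions (inverse variances). A sketch, with made-up numbers:

```python
# Bayesian estimate for a Gaussian prior and Gaussian likelihood:
# the posterior mean is a precision-weighted average of the prior
# mean and the observation. All numbers below are illustrative.

def posterior_mean(prior_mu, prior_var, obs, obs_var):
    w_prior = 1.0 / prior_var  # precision of the prior
    w_obs = 1.0 / obs_var      # precision of the evidence
    return (w_prior * prior_mu + w_obs * obs) / (w_prior + w_obs)

prior_mu, prior_var = 0.0, 1.0

# Trustworthy evidence (small variance): the estimate hugs the observation.
print(posterior_mean(prior_mu, prior_var, obs=2.0, obs_var=0.01))

# Noisy evidence (large variance): the estimate stays near the prior.
print(posterior_mean(prior_mu, prior_var, obs=2.0, obs_var=100.0))
```

Dialing the observation variance up and down smoothly interpolates between the two extreme cases above.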

Now, if this sounds complicated, you can take some consolation in the fact that you actually already know all this stuff. In a paper published in Nature, Körding and Wolpert (2004) described an experimental setup in which they asked volunteers to complete a hand-eye coordination task. The subjects' task performance indicates that the human brain maintains estimates of the prior distribution and the evidence uncertainty, and combines them in a way that is consistent with the Bayesian estimate above (and inconsistent with a couple of alternative models of cognition).

That's right: we appear to be hard-wired for Bayes' rule. This is pretty amazing, if you ask me.

Now critics might argue that the ODF format is underspecified, but that's true in some sense of every standard. It's almost as if Microsoft did not even make a good faith effort here (I know, shocking):

Remember, it is not particularly difficult or clever to take an adverse reading of a standard to make an incompatible, non-interoperable product. [...] The difference between minimal conformance and interoperability is well illustrated in these tests.

30 April 2009

What Happened is Scott McClellan's political memoir and the story of his time as press secretary under the Bush administration. It is a pretty good recap of the major scandals of the administration: the "sixteen words" controversy, the Valerie Plame leak and the subsequent special investigation, and hurricane Katrina.

What is really interesting is McClellan's assessment of Bush's character and of how the administration chose to deal with Congress and the media. In McClellan's eyes, Bush wasn't an idiot, but he was intellectually dishonest, never seeking out views opposed to his own. And Bush wasn't a liar, but many in his administration were. McClellan has harsh words for the Bush administration's strategy of constantly trying to manipulate the press for short-term gain, as if it had still been waging a campaign— the so-called "permanent campaign." Perhaps the height of hypocrisy was the Bush-authorized leaking of Valerie Plame's name from classified documents, for the purpose of discrediting critics of the administration.

So why was Bush so set on the Iraq war in the first place? Because apparently, he was gripped by an awe-inspiring vision of a democratic Middle East. It borders on the unbelievable that he put so many in danger to indulge this dream.

What Happened is a light-reading history of the Bush administration and an interesting look inside the White House. However, if you are a political junkie, which I'm not, I suspect you might not find enough of substance in here to keep you entertained.

Persepolis is a comic book, the story of a girl's coming of age during the Iranian Revolution. It's deeply moving to watch the installation of a theocracy through the eyes of a little girl who blossoms into a liberal-minded woman. Persepolis is both quite poignant and entertaining.

24 April 2009

The 10th Ubuntu release, 9.04 (Jaunty), was made yesterday. Since I started using Ubuntu (I think I tried the very first release, 4.10, and have been using it full-time since 2006 or so) it has come a long way, although many of its core strengths have been there from the beginning.

Good looks and good desktop interaction are important for attracting users, but the press rarely talks about the architectural/design qualities that help to keep users. Ubuntu (and many other GNU/Linuxes):

is a joy to use because it bends over backwards to fit my needs, instead of requiring me to adapt to the computer

being free software, is actually something I trust

is actually easy to acquire and install, and runs on almost anything

Windows and Mac OS have none of those qualities.

The total effect is that I have peace of mind that Ubuntu is something that I'll be able to use for as long as I care to use it, and that it's not a toy OS that I'm going to outgrow.

I'd like to offer my congratulations to the Ubuntu community on a great release (on time, I might add) and my thanks for a really remarkable gift.

16 April 2009

I love that the icon for the Synaptic Package Manager is, get this, boxed software, a CD-ROM, and, yes, a 3.5" floppy disk. Especially because these are all forms of distribution that are being rendered obsolete by, among others, Synaptic.

Not that I could have done a better job of making an icon to illustrate something that is totally intangible.

14 April 2009

Scott Ritchie has written an interesting post about the challenges that Wine developers face, as well as the value of regression testing (yeah! testing!) and the importance of a stable release process (yeah! stability!).

05 April 2009

I had no idea. Not only are internationalized domain names— domain names with non-ASCII characters— already here (at least, they work for me in Firefox 3.5), there is a cool URL-shortening service called tinyarro.ws that uses them to make really short URLs. For example: http://✿.ws/ਛ

I guess this is nice if you are using Twitter.

(There is a standard, IDNA, by which domain names with international characters are translated, at the client side, into longer names that contain only ASCII characters.)
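Python's standard library ships an IDNA (2003) codec, so you can watch the client-side translation happen. (I'm using the classic "bücher" example rather than the symbols above, since the older codec rejects many symbol characters.)

```python
# IDNA translation: Unicode domain name -> ASCII-compatible encoding.
# Each non-ASCII label gets an "xn--" prefix plus a Punycode suffix.

ascii_form = "bücher.example".encode("idna")
print(ascii_form)  # b'xn--bcher-kva.example'

# The translation is reversible on the client side.
print(ascii_form.decode("idna"))  # bücher.example
```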

Given the phishing possibilities associated with IDNs, this is positively terrifying. Either web browsers will need to alert users to the presence of international characters, or people will have to use a completely different mental model of trust on the web when following hyperlinks.

Update, 4 August 2009: it is my understanding that Firefox shows the translated (i.e. ASCII-only) URLs in the address and status bar, but, unfortunately, only for certain TLDs (like .com).

04 April 2009

There are now more than 25,000 programs, or applications, in the iPhone App Store, many of them written by people like Mr. Nicholas whose modern Horatio Alger dreams revolve around a SIM card. But the chances of hitting the iPhone jackpot keep getting slimmer: the Apple store is already crowded with look-alike games and kitschy applications, and fresh inventory keeps arriving daily. Many of the simple but clever concepts that sell briskly — applications, for instance, that make the iPhone screen look like a frothing pint of beer or a koi pond — are already taken.

This reminds me of the 1990s, when there were about a million crummy shareware apps, most of them redundant (and, I imagine, unprofitable). This waste of resources is one of the costs of proprietary software development.

29 March 2009

This is an interesting case of "your body protecting itself from you" (though somewhat inadvertently).

In the body, alcohol (ethanol) is converted by the enzyme alcohol dehydrogenase to acetaldehyde, which is in turn converted by the enzyme aldehyde dehydrogenase (ALDH2) into acetic acid. The intermediate product acetaldehyde is a toxin that contributes to some of the unpleasant reactions associated with hangovers.

Many people of East Asian ancestry have a mutation in the gene encoding alcohol dehydrogenase which makes it particularly effective at producing acetaldehyde, as well as a mutation in the gene encoding ALDH2 that makes it less effective at metabolizing acetaldehyde. Consequently, these people experience the alcohol flush reaction: flushing, nausea, elevated heart rate, and headache almost immediately after drinking alcohol.

Now researchers have found that people with a deficient ALDH2 gene are at elevated risk for squamous cell esophageal cancer, a form of throat cancer, if they drink alcohol. But people with two copies of the deficient ALDH2 are already averse to alcohol! The ones at risk are those with just one copy of the deficient ALDH2, because although they are still at elevated risk of cancer, their alcohol flush reaction is weaker and many of them develop a tolerance against it.

24 March 2009

France, which completely reprocesses its recyclable material, stores all the unused remains — from 30 years of generating 75% of its electricity from nuclear energy — beneath the floor of a single room at La Hague.

Meanwhile, we have a huge stash of nuclear waste at Yucca Mountain, and it has become a lightning rod for criticism. Why? Because the reprocessing of that "waste" into more useful nuclear products, as France does, was banned in 1977 by Jimmy Carter.

The Faculty of the Massachusetts Institute of Technology is committed to disseminating the fruits of its research and scholarship as widely as possible. In keeping with that commitment, the Faculty adopts the following policy: Each Faculty member grants to the Massachusetts Institute of Technology nonexclusive permission to make available his or her scholarly articles and to exercise the copyright in those articles for the purpose of open dissemination. In legal terms, each Faculty member grants to MIT a nonexclusive, irrevocable, paid-up, worldwide license to exercise any and all rights under copyright relating to each of his or her scholarly articles, in any medium, provided that the articles are not sold for a profit, and to authorize others to do the same. The policy will apply to all scholarly articles written while the person is a member of the Faculty except for any articles completed before the adoption of this policy and any articles for which the Faculty member entered into an incompatible licensing or assignment agreement before the adoption of this policy.

This is beautiful. It's heartening that MIT is taking a stand to share knowledge rather than hoard it.

15 March 2009

Buying digital music is so convenient that I do it without thinking twice now (I get mine from Amazon). DRM is a thing of the past and formats are basically standardized, so I can play my music on all of my devices, using only free software. You don't need any proprietary software to acquire your music either (on Amazon it's just click and go, and a download starts, at least for individual tracks). You don't have to get in the car or wait for a CD to be shipped. Digital music costs less than CDs ever did, for both singles and albums.

They say the best way to combat copyright infringement is to make buying music legally more convenient than getting it through the illegal channels. I'm certainly sold. Buying music online is finally hassle-free and reliable. And music you bought online is basically as durable as CDs.

Now, I can't wait until I can say the same thing of movies and books. Reliable e-books will be spectacular.

09 March 2009

[When] you need financial models the most — on days like Black Monday in 1987 when the Dow dropped 20 percent — they might break down. [...] Dr. Merton and Dr. Scholes won the Nobel in economic science in 1997 for the stock options model. Only a year later Long Term Capital Management, a highly leveraged hedge fund whose directors included the two Nobelists, collapsed and had to be bailed out to the tune of $3.65 billion by a group of banks.

07 March 2009

The NYT Magazine has an interesting article about Zipcar and some of the business decisions that have been instrumental in its success. Perhaps first and foremost, its management realized that in order to make an impact, Zipcar would have to operate at a large scale— even if it meant angering environmentalists:

To some environmentalists, it was anathema to make driving seem fun [with the slogan "Wheels When You Want Them"]. [Founder Robin] Chase was unconcerned. "The whole game at Zipcar was to get people to join so that they would sell their car or not buy one. I used to joke that I would have put a Hummer in the fleet if it would get people to join."

More people will get on board if you ask them to drive less than if you ask them to stop driving.

Other ideas: fostering a sense of community (cf. Yelp)...

Chase threw potluck parties for them, mixers, a swim at Walden Pond. It didn’t matter that only a few people showed up: simply knowing that such events were taking place seemed to burnish the Zipster identity. Customers were encouraged to come up with quirky names for each Zipcar; naming the cars, Chase found, personalized them and encouraged members to treat them gently.

And picking a good name...

Then we tried 'U.S. Carshare.' That was how I learned that 40 percent of the people I talked to had an extremely negative reaction to the word 'sharing.' The word makes people nervous. They feel they're being scolded or told to wait their turn. At that point I banned my staff from using the phrase 'car sharing.' Do we call hotels 'bed sharing'? That's way too intimate.

02 March 2009

Assemblyman Joel Anderson, a San Diego-area Republican, decided to introduce his bill after reading that terrorists who plotted attacks in Israel and India used popular sites such as Google Earth and Microsoft's Virtual Earth.

His bill would restrict the images such Web sites could post online. Clear, detailed images of schools, hospitals, churches and all government buildings [...] would not be allowed.

28 February 2009

A designer, Jamie Divine, had picked out a blue that everyone on his team liked. But a product manager tested a different color with users and found they were more likely to click on the toolbar if it was painted a greener shade.

As trivial as color choices might seem, clicks are a key part of Google’s revenue stream, and anything that enhances clicks means more money. Mr. Divine’s team resisted the greener hue, so Ms. [Marissa] Mayer split the difference by choosing a shade halfway between those of the two camps.

Her decision was diplomatic, but it also amounted to relying on her gut rather than research. Since then, she said, she has asked her team to test the 41 gradations between the competing blues to see which ones consumers might prefer.
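The 41-shade test is easy to picture: the candidate colors are just evenly spaced points on the line between the two camps' blues in RGB space. A minimal sketch in Python of how one might generate them; the endpoint hex values here are invented for illustration, since Google's actual colors were not published:

```python
def lerp_color(a, b, t):
    """Linearly interpolate between two RGB triples at parameter t in [0, 1]."""
    return tuple(round(a[i] + (b[i] - a[i]) * t) for i in range(3))

def gradations(start_hex, end_hex, steps):
    """Return `steps` evenly spaced hex colors from start_hex to end_hex."""
    start = tuple(int(start_hex[i:i + 2], 16) for i in (1, 3, 5))
    end = tuple(int(end_hex[i:i + 2], 16) for i in (1, 3, 5))
    return ["#%02x%02x%02x" % lerp_color(start, end, i / (steps - 1))
            for i in range(steps)]

# Hypothetical endpoints for the two camps' blues:
shades = gradations("#2200cc", "#1a66b3", 41)
print(len(shades))              # 41
print(shades[0], shades[-1])    # the two original blues
```

Each shade would then be served to a slice of users, and the one with the highest click-through rate wins.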

[...] bullshitters misrepresent themselves to their audience not as liars do, that is, by deliberately making false claims about what is true. In fact, bullshit need not be untrue at all.

Rather, bullshitters seek to convey a certain impression of themselves without being concerned about whether anything at all is true. They quietly change the rules governing their end of the conversation so that claims about truth and falsity are irrelevant. Frankfurt concludes that although bullshit can take many innocent forms, excessive indulgence in it can eventually undermine the practitioner's capacity to tell the truth in a way that lying does not. Liars at least acknowledge that it matters what is true. By virtue of this, Frankfurt writes, bullshit is a greater enemy of the truth than lies are.

19 February 2009

We like to think that we make decisions logically and rationally. This is one of the assumptions that underlies much of economics, public policy, and modern life. We just assume that people can determine what is best for themselves and that they act accordingly. Things operate this way even though we are all well aware that no one is perfect. We all make mistakes and suboptimal decisions.

In Predictably Irrational, MIT economist Dan Ariely argues that these deviations from rationality are not just random mistakes. We systematically make the same kinds of cognitive mistakes, day after day, even after the error of our ways is pointed out to us.

Ariely cites a number of psychological studies (most of which he designed or co-designed) and generalizes from them to shed light on various classes of irrational behavior or cognitive failings. (Just one example: people know they will procrastinate, but they systematically underestimate the extent to which they will do it.) And he reflects on the implications for our lives and for public policy. The book is a quick read and is pretty entertaining, and I recommend it. There are quite a few laugh-out-loud moments, mostly while Ariely describes his clever experimental setups, many of them involving unwitting MIT undergraduates.

Psychology fascinates me because for all the talk about people being different, there are so many ways in which people are all the same. And being able to learn about itself may be one of the most amazing feats of the human mind. But there is something about the idea of "predictable irrationality" that is really sobering. We often imagine that, as human beings, our intelligence and our capacity for introspection allow us to transcend our flaws: once we have identified a problem, we can learn from it and take steps to prevent it from happening again. That is one of the things that makes me (and many others, I imagine) so optimistic about the future. But as it turns out, there is such a thing as "human nature," and we are shackled to it, no matter how willing and observant we are. We make the same mistakes over and over again. We all have the same weaknesses, the same frailties.

16 February 2009

My goal is to read 15 books in the first half of 2009. So you should expect to be seeing a lot more book reviews here. But first...

Man on Wire (IMDB). A documentary about Philippe Petit, the Frenchman who, in 1974, strung a tightrope between the towers of the World Trade Center and went walking on it. The idea of a bunch of guys plotting something "illegal... but not wicked or mean" is uplifting in a way. Mr. Petit is quite the character, and the film is in large part a portrait of him. I love this one saying of his:

It's impossible, that's sure. So let's start working.

This film is well done. It has all the suspense of a heist movie (but without the robbery), and the footage of the act itself is something to see. I spent about half the movie agape. But don't take my word for it: at the time of writing, Man on Wire is 100% fresh with 137 reviews on RT, and it has been nominated for the Academy Award for Best Documentary Feature of 2008.

Rock Band 2. I got RB2 for the Wii over the holidays, and it's a lot of fun. There is a strange visceral appeal to drumming. You can really get into the "zone" in a way that you don't when you're playing the guitar.

Battlestar Galactica. I've watched Season 1 and am starting Season 2. It's quite bleak, which is somewhat refreshing: in the story, humanity is on the run from robots and on the edge of extinction. It's the military/political and human drama that really makes the show, though.

15 February 2009

I have noticed that the importance of my paper mail is inversely related to the importance advertised on the envelope. Envelopes that say Important Cardholder Information always turn out to contain some useless offer. And then, of course, my credit cards and checks arrive in envelopes with no distinguishing features at all.

That always reminds me of this story, from Bruce Schneier's Beyond Fear (link):

At 3,106 carats, a little under a pound and a half, the Cullinan Diamond was the largest uncut diamond ever discovered. It was extracted from the earth at the Premier Mine, near Pretoria, South Africa, in 1905. Appreciating the literal enormity of the find, the Transvaal government bought the diamond as a gift for King Edward VII. Transporting the stone to England was a huge security problem, of course, and there was much debate on how best to do it. Detectives were sent from London to guard it on its journey. News leaked that a certain steamer was carrying it, and the presence of the detectives confirmed this. But the diamond on that steamer was a fake. Only a few people knew of the real plan; they packed the Cullinan in a small box, stuck a three-shilling stamp on it, and sent it to England anonymously by unregistered parcel post.

14 February 2009

Instant feedback helps people learn quickly. And computers can often help close the feedback loop, whether it's learning to play the drums or learning to drive efficiently:

Though the new [2010 Honda] Insight gets significantly lower mileage than the original, Honda has loaded it with an array of gauges and displays intended to coach drivers to be more economical. For instance, the speedometer's background color changes from blue to green as one's driving becomes "more environmentally responsible." Readouts reward the frugal driver with an "eco score"; if you excel, you win a digital trophy surrounded by a wreath.

You need either these video-game techniques or a lot of driving lessons to help people develop unconscious competence.
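The color-shifting speedometer is, at bottom, just a function from a measured quantity (instantaneous fuel economy) to a visual cue. A toy sketch of such a mapping in Python; the mpg range and the RGB endpoints are invented for illustration and are not Honda's actual calibration:

```python
def eco_color(mpg, worst=20.0, best=60.0):
    """Map instantaneous fuel economy to a blue-to-green color.

    The 20-60 mpg calibration range is made up for illustration.
    Returns an (r, g, b) triple that shifts from blue toward green
    as the driving becomes more economical.
    """
    # Clamp to [0, 1]: 0 = worst economy (blue), 1 = best economy (green).
    t = max(0.0, min(1.0, (mpg - worst) / (best - worst)))
    blue = (40, 90, 200)
    green = (40, 200, 90)
    return tuple(round(b + (g - b) * t) for b, g in zip(blue, green))

print(eco_color(20))  # fully blue
print(eco_color(60))  # fully green
```

Redraw the gauge with that color every fraction of a second and the driver gets continuous, glanceable feedback without reading a single number.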

So I have a lot of old stuff to clean up and throw away. I am a packrat, but I do know when things are way past their useful lifetime. Definitely on the list of things that are going to go:

Anything with a PS/2 or parallel port connector.

Power adapters where I have no idea what they plug into.

Hard disk drives no larger than 20GB. I plugged one in, just for kicks and so that I could wipe it. It was 2GB, and the disk label was WINDOWS_95. The computer I just ordered will have three times as much RAM as that drive had storage, and about 1500 times more disk.

Old proprietary software. I have scores of CDs from an MSDN subscription circa 2002. These are the "AOL CDs" of this decade. I hate throwing away bits, but I rationalized this to myself: given their age and their proprietary-ness, I am pretty sure they have negative value. If someone installed any of these things and tried to use them, they would be worse off than if they got something Free, for free, off the internet.

02 February 2009

There are also two new modes: Smile and Blink Warning. In smile mode, the camera can actually tell a smile and will wait to shoot the picture until it detects one. Blink warning actually displays a warning message when, in the camera's opinion, the subject has blinked when the picture was taken. Since a lot of people have that annoying tendency, the warning message quickly lets you take another shot without having to check the picture.

Smile detection and blink detection, coming soon to a point-and-shoot that will cost less than my current camera did. Wow.

26 January 2009

The NYT has an interesting article about how the depression is driving trends in technology, including the adoption of netbooks, virtualization, and Linux. Netbooks are interesting and all, but I'm more fascinated by the software side of this.

Recessions "can cause people to think more about the effective use of their assets," said Craig R. Barrett, the retiring chairman of Intel.

If recessions are good for anything, it's that they force us to use all our resources to the fullest and in the most efficient manner. And people are noticing that this is not really compatible with the idea of proprietary software, which basically means starting with something infinitely malleable (software) and then crippling it in some way that suits the vendor. Not so great.

Disclosure

I'm a software engineer at DNAnexus, Inc. This blog represents my opinions and no one else's. Unless specifically noted otherwise, I do not receive free review copies of books or other products mentioned here.