Cass Sunstein, author of the popular 2008 non-fiction book Nudge, Harvard Law professor and husband of US Ambassador to the UN Samantha Power, punctuated his visit to Dartmouth this week with an extremely provocative speech in Filene Auditorium. In his speech, “Free by Default,” Sunstein presented and weighed the pros and cons of personalized default rules in comparison to two other models: impersonal default rules and active choosing.

Sunstein explained the theories of libertarian paternalism and choice architecture to the audience and demonstrated that the goal of each decision-making model is to minimize the costs of making decisions and of errors. Default rules can play an extremely important role in this: they establish what counts as a loss and what counts as a gain, overcome the decision-maker’s inertia and carry the power of suggestion. Sunstein pointed to defaults that promoted renewable energy use in Germany and double-sided printing at a Swedish university — options people would have had to actively opt out of if they so desired — and to the net positive impact of such defaults. Still, defaulting people into options that don’t necessarily fit their preferences or needs can lead to crucial errors and setbacks.

The contrary option of active choosing, in which the decision-maker has to opt in to everything (selecting an ice cream flavor, for instance, instead of always being served vanilla by default), has some merits: it handles diversity well, forces people to overcome inertia and eliminates wariness. Still, in unfamiliar or technologically complicated areas, or when choice itself becomes a burden, active choosing leads to negative impacts. Some even go so far as to say that it is a luxury to have decisions made for you.

To find a compromise between these two options, Sunstein explained the personalized default rule. He crafted an extremely relatable presentation, using examples ranging from Netflix’s “films you might like” suggestions to Amazon.com’s targeted advertisements and Pandora’s music selections to demonstrate some of the sectors in which big data is making personalized defaults more and more reliable and user-friendly. He argued that, while still crude and lacking enough data to personalize for every individual, personalized defaults serve informed decision-makers who are reasonably looking to simplify and enhance their lives. The personalized default, while imperfect, preserves active choice while making the options most likely to be selected more prominent and accessible in the first place.

The presentation provoked an interesting discussion afterward, with questions about which sectors can benefit from personalized default rules, how voting and shared decision-making (as in the medical profession) might be affected, and what the ideal outcome of using such defaults might be.