September/October 2017 (Silbey)

As word of my forthcoming service as Chair of the Faculty traveled, several colleagues offered congratulations. Others offered commiseration. A few asked me what I wanted to accomplish. This was provocative! After only two months on the job, I know that there is not much free space for extended contemplation and creativity. I am, however, obliged to write regular columns for the Faculty Newsletter and I have decided to use these spaces to think with my colleagues, perhaps to be provocative myself, about what we do as teachers and scholars and the challenges presently confronting higher education.

A few years ago, several of our colleagues asked me for what they described as cultural help. With almost exactly the same words, they told me, "My students are brilliant, creative, can do anything, but they have no idea what is worth doing." "They can make anything, but all they can say is 'awesome.' Can you help us understand this? What should we do?" I am still struggling with these questions. I think many of us are. This is the real immediate challenge, as important as the cost of higher education, political turmoil, or climate change.

I begin by looking backwards. Across the last 70 years, beginning with the 1949 Lewis Report and culminating with the recent Task Force on the Future of MIT Education, the faculty has periodically taken stock of its educational commitments, each report restating MIT's vision for higher education. Reading through these reports reveals an interesting transformation: post-World War II worries about the capacity of an engineering school to produce socially responsible citizens have evolved into an institutional ambition to spread globally capacities for innovation. The Institute’s objective in 1949 was stated simply: to educate "the professional man who is an outstanding citizen." Sixty-five years later, the 2014 Report on the Future of MIT Education envisions the Institute as an "ecosystem for ongoing research, learning, and innovation."

What happened to our goal of responsible citizenship and civic responsibility? Does it comfortably go unstated in our broader vision of an innovation ecosystem? Might this be a moment for conversation about our shared commitments and responsibilities?

Educational institutions have historically been committed to imparting existing knowledge to new generations of undergraduate students, while nurturing their appetite to make new knowledge. What happens when the mission changes: when the emphasis shifts from accumulating and transmitting existing knowledge to breaking free of it in order to create new ideas, new things? The two missions may not fit together seamlessly. Might we risk, inadvertently, devaluing the making and accumulation of knowledge?

How do innovation and ethical responsibility fit together? Innovation, like efficiency, lacks a politics unless we give it one. Toward what ends are we working and teaching?

Certainly, any flourishing organization will, and should, change over three-quarters of a century. Of course, change is neither easy nor unidirectional, and the tensions between preserving what is excellent and ambitions to make improvements are persistent. Yet, at particular historical moments the push for change can be noticeably stronger than usual. It is often unclear whether change agents are responding to needs or are themselves the impetus driving the conditions for change. In the last two academic years alone, the faculty has approved six new undergraduate majors, nine minors, four graduate degrees, as well as eight additional modifications; committees are currently exploring new degrees and the pattern of major enrollments. The currently voiced discontent with MIT's undergraduate curriculum can be interpreted as the latest permutation in a history of continuing reevaluation and renewal or, perhaps, an expression of something else more contemporary: the widespread embrace of disruptive innovation.

In 1971, also a period of political upheaval, and also a time when the curriculum was congested by the perceived expansion of technological knowledge, Benson Snyder, at the time MIT psychiatrist-in-chief and Dean of Institute Relations, offered a trenchant analysis of higher education. He argued that the experience of undergraduates was marked by a discrepancy between explicit demands (such as completing assignments) and unstated academic and social norms. Referring to the tacit norms as the "hidden curriculum," Snyder detected a conflict between students and instructors rooted in instructors' assumptions and values, students' expectations (which are inchoate and in a process of development), and the historical moment and social context in which the parties found themselves (rapidly changing cultural norms). Snyder interpreted the conflict over the hidden curriculum as a source of students' anxiety, depression, and alienation. Education was reframed, for and by students, as a competition, a type of game to master rather than a quest for knowledge or a process of moral development. The problem Snyder identified is not only the expanding formal curriculum but also the various messages embedded in the way we organize and justify the curriculum.

How much has changed over these decades? What is today's hidden curriculum? Certainly the call for teaching about, and producing, innovation is clear. However, Snyder was concerned about something less explicit: "the kinds of dissonance that are created by the distance between" the formal expectations and the informal responses and messages. "The hidden curriculum imparts to the students what particular performance is wanted from them," Snyder wrote. What does innovation communicate? What does it demand of our students? Entrepreneurship or truth? While the two are not mutually exclusive, are we asking them to pursue profit and market-making in lieu of knowledge and responsible citizenship? What do the students hear in the call for innovation?

Innovation, likened or not to disruption, is not simply an injunction or aspiration; as the Harvard historian Jill Lepore writes, it is a theory of social change, an explanation for how the world works. Moreover, as a model of change, it supplants alternative accounts. “The eighteenth century embraced the idea of progress; the nineteenth century had evolution; the twentieth century had growth and then innovation. Our era has [added] disruption, which, despite its futurism, is atavistic. It’s a theory of history founded on a profound anxiety about financial collapse, an apocalyptic fear of global devastation, and shaky evidence,” Lepore writes. [Jill Lepore, "The Disruption Machine: What the Gospel of Innovation Gets Wrong," The New Yorker, June 23, 2014.] She documents the shabby empirical evidence for this theory of change as disruption. But, in this age of hyper-communication, instant analysis, and media-driven frenzy, we rarely take the unhurried time to critically engage with the rapidly circulating narratives, such as those whose protagonist is innovation. "Even people who cherish the idea of progress," Lepore claims, and who "point to improvements like the eradication of contagious diseases and the education of girls, have been hard-pressed to hold on to it while reckoning with two World Wars, the Holocaust and Hiroshima, genocide and global warming. Replacing 'progress' with 'innovation' skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer."

Disruptive innovation is an idea bred at Harvard Business School. Like many management models, it is expertly marketed and has thus become the moment's gospel. Recipes for management efficiency, even on a smaller scale, embed theories of change, often without sufficient scrutiny of their social impact. I note, however, the stated mission of the MIT Sloan School: to develop principled innovative leaders who improve the world.

How can we distinguish the well-marketed hype of disruptive innovation from a predictive theory of social change with which we can engage as teachers and scholars? At MIT there are several fine classes that do exactly that: systematically explore the major contending models for analyzing and explaining historical (temporal) and social (distributional and structural) change.

Such explorations reveal how the major social movements that have characterized modern history – beginning with the early peasant revolts, the liberal revolutions, the socialist movements of the nineteenth and twentieth centuries, and the post-colonial liberations, as well as the environmental movement of more contemporary times – have each represented transformations in understandings of social change, specifically accounts of when and how human agency and power shape the world.

Lepore reminds us that,

". . . innovation and disruption are ideas that originated in the arena of business but which have since been applied to arenas whose values and goals are remote from the values and goals of business. Public schools, colleges and universities, churches, museums, and many hospitals, all of which have been subjected to disruptive innovation, have revenues and expenses and infrastructures, but they aren’t industries in the same way that manufacturers of hard-disk drives or truck engines or dry-goods are industries. Journalism isn’t an industry in that sense, either.

"Doctors have obligations to their patients, teachers to their students, pastors to their congregations, curators to the public, and journalists to their readers – obligations that lie outside the realm of earnings, and are fundamentally different from the obligations that a business executive has to employees, partners, and investors. . . . Charging for admission, membership, subscriptions and, for some, earning profits are similarities these institutions have with businesses. Still, that doesn't make them industries, which turn things into commodities and sell them for gain."

Accounting for the differences among these institutions in the narratives about disruptive innovation is critical if we are to understand the complexity of change. When we talk about education as innovation, are we burying talk of responsibility? Or, possibly, is this meme creating new spaces in which to be more self-conscious, less hidden or opaque about our responsibilities?

What is a college education for? Certainly, as a university in which 75% of students graduate as engineers – even if not all will work as engineers – we are providing job training. But how are we educating these engineers? When I rode the New York City subways as a teenager, I was always dismayed by the abundant, often government-sponsored advertisements that said, "Get an education, get a job." It hadn’t occurred to me that one got an education in order to get a job. Perhaps my naiveté was a residue of pre-feminist consciousness, but I think not; I always knew I had to, and would, go to work. But I thought one got an education to learn how to think, to discover what had happened in the past and why the world is the way it is, to "know stuff," as my teenage vocabulary might have put it. Yes, education could get me a better-paying job, but I cannot recall a single class in college or graduate school that actually prepared me to teach – except by mimicking my professors, and that was not always for the better. Nor, when I went to graduate school, were we taught how to do research; again, one had to mimic what our professors did, without explicit methodological instruction. Times have surely changed, and in many ways for the better.

In a challenging speech welcoming the freshman class to the University of Chicago, the sociologist Andrew Abbott explained to the students that few, with the exception of scientists and engineers, will ever work at a job in which their college major is required preparation.

Doctors, lawyers, ministers, writers, and business leaders will have studied many different, often seemingly irrelevant and impractical, subjects. The sociological data show that worldly success does not depend on what is studied in college, nor is it predicted by college performance.

Being admitted to a selective institution by itself places these students on a lifelong trajectory in the upper tiers of American society. The selection by the college, and the choices made following college, are what predict worldly success.

What a college education offers, however, even for scientists and engineers, is the chance to learn how to make distinctions, what Abbott calls mental gymnastics: to learn to see the world from multiple perspectives, with complex dimensions, and often to slow down as you encounter phenomena and take a closer look. "We should not want education now in order to get something later," he writes. Education is an end in itself simply because it makes life better. [Andrew Abbott, "The Aims of Education Address," The University of Chicago Record, November 21, 2002, 4-8.] The educated life is better because each event becomes a more complicated experience, a puzzle for unraveling and understanding, apprehension and, yes, sensual enjoyment. When jobs call for a college education, the employer seeks a workforce that can do the mental gymnastics that balance uncertainty with action, multiple demands with opposing interests. A recent study by Wellesley College sociologist Lee Cuba and colleagues describes a college education as exactly that: learning to make choices, "a liminal space and place in which students make lots of decisions that serve as practice for the many more they will make as adults." [Lee Cuba, Nancy Jennings, Suzanne Lovett, and Joseph Swingle, Practice for Life: Making Decisions in College (Cambridge: Harvard University Press, 2016), 170.]

I like to think about MIT as a modest institution that has managed over its lifetime to do some extraordinary things. What happens when we see ourselves no longer as a modest yet successful university but as extraordinary, a global innovation leader? We celebrate both mens et manus, teaching ways of doing things but also ways of thinking. Is the hidden curriculum the fact that we also teach ways of being and feeling? MIT has always excelled at teaching how to make things; it may be what we are uniquely good at relative to other institutions. Perhaps, however, MIT's true power comes in those moments when it is able also to teach people ways of thinking and feeling that they take into those ways of making and doing.