179 posts categorized "Postacademic"

July 08, 2015

Rebecca Schuman writes in Slate about UC Irvine’s new program to get graduate students out in five years, and sees an important benefit: a climb down from the idea of the dissertation as an "endlessly protracted super-project that is so difficult, so important, and takes so long that by the end its writer feels both entitled to a place in its field and unfit for any other type of work."

What a great line!

I finished my degree in five years, thanks to being intensely tactical about my dissertation (though this was mainly my advisor’s doing) and to being, luckily, well-funded. So I think the five-year benchmark makes good sense. Graduate school can be a good experience, but not when it turns into the academic equivalent of the siege of Leningrad.

August 03, 2014

If you can reach this Chronicle of Higher Education article on “Things You Should Know Before Publishing a Book,” read it. Elizabeth Knoll worked in both academic and trade publishing, and her insights into the editing process, contract negotiations, jacket design, and other things are spot on.

I worked for a bit with Elizabeth when I was a postdoc. I first sent my eclipse expeditions book to her at UC Press, but she left for W. H. Freeman before I was able to finish the revisions; then I got another job and moved to Chicago, the revisions got delayed, and UC lost interest in the book. Fortunately, Stanford picked it up.

Anyway, having learned quite a bit about the publishing process from my most recent book, I can recommend the article.

July 22, 2014

At least that’s the impression I get from this Atlantic piece by David Wheeler, which describes issues facing new clergy that would sound very familiar at the AHA: older pastors are retiring but not being replaced with full-time positions, the amount of time for contemplation is down, and high levels of personal debt are a way of life.

This in particular jumped out at me:

Working two jobs has become so common for clergy members, in fact, that churches and seminaries have a euphemistic term for it: bi-vocational ministry.

Working multiple jobs is nothing new to pastors of small, rural congregations. But many of those pastors never went to seminary and never expected to have a full-time ministerial job in the first place. What’s new is the across-the-board increase in bi-vocational ministry in Protestant denominations both large and small, which has effectively shut down one pathway to a stable—if humble—middle-class career….

Sometimes evangelical pastors, especially those planting a new church in an economically disadvantaged area, intentionally choose a bi-vocational life. Fredrickson says these pastors often “sense that they will be able to serve their neighborhood better if they are engaged on a regular basis in their community.” One example of a deliberately bi-vocational church is Love Chapel Hill in North Carolina, where five co-pastors share the workload of the church and work other jobs on the side.

“We are reaching an eclectic group of people,” says Mat LeRoy, one of the five co-pastors. “We have a growing core of young families and professionals, a large collection of college and grad students from [the University of North Carolina at Chapel Hill] and a beautiful group of local homeless friends. With this type of socioeconomic diversity, bi-vocational ministry is currently a strategic necessity for a sustainable outreach.” He adds, “This is not an easy choice for us, but it is worth it to continue our mission in our community."

As someone who’s done a lot of thinking about (and experiments around) the viability of being a scholar outside the traditional academic track, a lot of this sounds familiar. The sense that there are advantages to this kind of foot-in-two-worlds situation that can outweigh the disadvantages; the problem that if it’s not what you expect and train for, it can be a rude shock when you graduate; and the structural factors that make this not a crisis but something more like a state of exile.

July 08, 2014

I think it’s fair to say that everyone who writes a book, unless they’re an academic*, hopes that it’ll do well enough for them to start writing full-time. But the reality has always been that unless your living expenses are shockingly low, or you had a wealthy distant uncle who left you an inheritance, the odds are against making a decent living as a writer.

Those odds are getting even worse now, according to a new survey of British writers by the Authors' Licensing and Collecting Society. The ALCS surveyed "almost 2,500 working writers," according to The Guardian, and it found that

the median income of the professional author in 2013 was just £11,000, a drop of 29% since 2005 [the last time such a survey was conducted] when the figure was £12,330 (£15,450 if adjusted for inflation), and well below the £16,850 figure the Joseph Rowntree Foundation says is needed to achieve a minimum standard of living. The typical median income of all writers was even less: £4,000 in 2013, compared to £5,012 in real terms in 2005, and £8,810 in 2000.
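
A quick check of the arithmetic, since the percentages are easy to misread (this is my reading of the Guardian's figures, not something the survey spells out): the 29% drop is measured against the inflation-adjusted 2005 figure; against the nominal figure, the drop would be closer to 11%.

\[
\frac{15{,}450 - 11{,}000}{15{,}450} \approx 29\%, \qquad \frac{12{,}330 - 11{,}000}{12{,}330} \approx 11\%
\]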

Not surprisingly, this translates into fewer people writing full-time: “in 2013, just 11.5% of professional authors – those who dedicate the majority of their time to writing – earned their incomes solely from writing. This compares with 2005, when 40% of professional authors said that they did so."

And this isn’t a problem that’s affecting bad writers:

James Smythe published his first novel in 2010 with an indie publisher, and he has published five with HarperCollins. He has been shortlisted for major science fiction awards, been glowingly reviewed, and won the Wales book of the year. He told the Guardian that his novels had never earned out. "Being a writer can't be treated like it's a job. It maybe was once, but no writer can treat it as such nowadays. There's no ground beneath your feet in terms of income, and you can't rely on money to come when you need it," said Smythe, who also teaches at Roehampton University.

I’m not convinced that you have to be a full-time writer to do good work: the people who both have careers and manage to write are too numerous for us to conclude that the muse only comes when you’re unemployed. We might all like to be Ernest Hemingway, writing in the mornings and fishing and drinking in the afternoons, but we’re more likely to be William Carlos Williams (physician) or Wallace Stevens (insurance). Or at best, writing full-time is something we’re able to do on one project, but not another.

But the decline is still troubling for two reasons. First, the absolute decline in the amount of money writers get for the same work makes carrying out any sort of creative life more challenging. The cushion of an advance or some royalties can make the difference between finishing the next book while your publisher and readership still remember who you are, and seeing your moment pass.

But second, it’s another sign that the publishing industry as a whole is in bad shape and getting worse. No industry in which incomes are on the decline can be considered healthy.

So don’t give up your day job. Learn how to write in the morning.

* Not to cast aspersions on my former life, or my first book for that matter. What I mean is that the academic case is different because the reward doesn’t come as a royalty check, but tenure or promotion. Such indirect payouts for book-writing are rare, though consultants who write about their work can also make money from higher fees and speaking gigs. The late, great Russ Ackoff once told me that he made more money off one day's consulting than from the combined royalties of his books (and he'd written about two dozen of them)-- but he could charge so much for consulting because he'd written those books. So if you're a partner in Accenture's outsourcing practice group (or whatever), your young adult trilogy isn't going to get you much additional social capital, or real capital.

November 10, 2013

A few weeks ago I spoke at a memorial service for one of my thesis advisors, Riki Kuklick. While I was at Penn I also gave a couple other talks, on postacademic careers and contemplative computing; but all three turned out, one way or another, to touch on Riki and her influence on me.

After I returned home, I noodled around with the talks, and eventually put them together. The result wouldn't have been appropriate in any of the three venues, but it better reflects what I was struggling to say in separate places on different days.

Introduction

In September 2013 I returned to Philadelphia to speak at a memorial service for one of my favorite professors, Henrika Kuklick. Exactly thirty Septembers earlier, I had stepped into my first classroom with Riki, for her course on the sociology of knowledge. It was the beginning of an association that would shape the next eight years of my life at Penn, and beyond.

Even though my father was a professor, and I was lucky to have some great teachers and role models at Penn, Riki lived the life of the mind in a way that was especially vivid and accessible. It goes without saying that she was as brilliant as the other professors who most deeply influenced me at Penn-- her colleagues Rob Kohler and Thomas Hughes; art historian David Brownlee; and strategist and systems thinker Russ Ackoff-- but she was a great model for aspiring scholars.

Riki took unreserved, transparent pleasure in the craft of scholarship, in writing, teaching, talking shop with students. Her stories of her latest agony writing what she called "the Great American monograph" kept me and other graduate students entertained.

For students trying to become scholars, her willingness to pull back the curtain on academic life was refreshing and reassuring. My decision to work on Victorian science was influenced in no small part by her accounts of living in England and working in the archives there.

The Problem of the Real World

The importance of academic models like Riki for aspiring scholars shouldn't be underestimated, because academic life is often looked at skeptically by people who see themselves as firmly rooted in the "real world."

As my years at Penn stretched on, some of my old friends and relatives expressed the opinion that all this education was just a way of avoiding the real world. The real world was the place where people DID things, made money, got stuff done. The university was fine if it helped you get a job, but otherwise there was little point to it. Well, if the university was NOT the real world, then I wanted no part of the real world. I wanted to be a professor; the campus would be MY real world.

That didn't work out: I graduated into a terrible job market, and after finishing my first book and a couple postdocs, I became a consultant. But then I made a surprising discovery: the "real world" was actually a great place to pursue the life of the mind.

Working as a futurist means grappling constantly with epistemological issues around the possibility of predicting the future, your professional credibility, and the standards by which your work should be judged-- all familiar themes in the sociology of science. In the mid-1990s, thanks to the growth of the Internet, the rising importance of the service economy, the ferocious pace of technological and global change, and other factors, the boundary between the world of ideas and the "real world" was collapsing. In order to survive in today's economy, organizations have to think seriously about what they're doing and why, and have models that explain how the world works and how it's changing. In their worldly impact, ideas are more real than ever.

One reason I was able to continue my own intellectual life was that I had Riki's pursuit of it as a model. There was nothing unreal about the life of the mind the way she lived it, or her love of the craft of scholarship. Her own professional life was lived in the ivory tower; she would have regarded the prospect of working with C-suite executives with horror. Despite this, she gave me the means to see the life of the mind as a devotion rather than just a profession, as an internal discipline as well as an academic one.

In a sense, I was also applying to my own life another lesson Riki taught me: that we should question what others believe is inevitable and inescapable, because what appears fixed may in fact be contingent and changeable. The expertise that may seem unassailable, the assumptions that seem self-evident, the truths that claim to be eternal, all may not be as real as they seem-- or like a great movie, their greatness may be a blend of hard work, clever staging, and a willing suspension of disbelief.

Seeing that the boundaries between the academic world and "real world" could be more porous than I'd believed helped me create a life that borrowed from both worlds. It let me uproot my own well-cultivated prejudice against corporate life. It freed me to reimagine academic life as something more portable and useful than I'd previously imagined. It let me see that one could make a life that combined the vita activa and vita contemplativa.

Another Real World: IRL

That experience of moving between worlds had a subtle but important resonance in my latest book. While writing The Distraction Addiction, I ran up against the sensibility that Facebook, text messaging, the Web, and the other things that make up the digital world can ONLY be distractions from a well-lived life; that proximate physical interactions are naturally superior to anything we can experience online; and that the best solution to our electronic troubles is simply to turn technologies off. We should get offline in order to spend more time in the real world, where we can have a real life. The simple and apparently innocuous acronym "IRL" turns out to be a kind of intellectual virus. It packs a lot of unexpected information and moral judgment in a very small package.

This claim is one side of an argument that's into its third decade. In the 1990s and the early days of the World Wide Web, figures like John Perry Barlow and Esther Dyson declared that cyberspace was a new world separate from and superior to the physical world; critics answered that the Internet was a threat to literature, social development, even our memory and cognitive abilities. To me this debate had a ring of familiarity. If the distinction between the academic world and real world doesn't make a lot of sense, I wondered, could the same be true of the apparently huge gap between digital life and real life?

Merging Worlds

Once I dug deeper, I saw that just as the distance between academic life and real life was overhyped, so too was the distance between digital life and real life. Technologies like smartphones, locative services, and wireless Internet access have erased the functional boundary between bits and atoms, while ecommerce, email, and social media have woven the digital world into our everyday lives.

Even more profoundly, I realized, using technologies is not something that makes us less human, or takes us away from our natural selves. Since the invention of stone tools two million years ago, human bodies have co-evolved with our physical tools, while our minds have co-evolved with our cognitive tools. We are, as philosopher Andy Clark puts it, natural-born cyborgs. At its best, this entanglement of person and technology extends our cognitive and physical abilities, gives us great pleasure, and makes us more human.

The challenge with smartphones and social media, then, is not to learn to give them up, but to learn to use them wisely. We need to practice what I call contemplative computing, developing ways of working and interacting with information technologies that help us be more mindful and focused-- and thus better people-- rather than be endlessly distracted and frustrated.

By better understanding the nature of attention and distraction, by studying how our interactions with technologies go bad, and by experimenting with new ways of using them, we can resolve the paradoxes these technologies seem to bring into our lives. Using them wisely helps us become wiser about ourselves. Being more mindful about HOW we use technologies helps us be more mindful WHILE using them.

This leads me to argue that we should push back against the moral distinction between academic or digital life on the one hand, and real life on the other. We shouldn't think in terms of a "real life" versus a "digital life" any more than we should think of our lives in the library or laboratory as unreal.

IRL = In Richer Life

To put it another way, we should redefine what the acronym IRL means. When people talk about "going IRL," one of the things they're doing is expressing a desire for self-improvement: turning off the devices, going camping, or spending time with family and friends. The impulse is laudable, but the assumption that it can only happen when you hit the off switch is incorrect.

Instead, we should think of RL as a richer life, one that isn't driven mainly by distractions, but reflects a serious attempt to create meaning in the world, to do things that matter with our lives, to build and extend our selves. This is an effort in which the thoughtful, judicious, mindful use of technology can play a role-- and in which those habits of mind that we think of as "academic" can also be intensely useful. We can build lives that aren't merely real, but richer, using tools that take form in silicon and electrons, or tools that are encoded in words and ideas.

Practicing contemplative computing requires taking a more critical, ethnographic approach to how we use technology: asking basic questions about why we use technologies, noticing our unconscious habits, and attending to how we think about these tools and how they affect the way we think about ourselves. All these ideas could have come from one of Riki's classes, even though they're applied in an area that seems outside her scholarly interests.

Riki and the Richer Life

But that ability to follow ideas wherever they lead, to pursue diversions until they reveal something unexpected yet connected to your original interests, is just me channeling another of Riki's habits.

Riki was an astonishing conversationalist-- indeed it was hard to get a word in edgewise. If you didn't know her you might listen to her monologues and think she was just free associating. But if you listened carefully, you discovered that she would start a sentence, interrupt herself and veer off onto another subject, then do it again, and again-- and then, systematically work her way back, until twenty minutes later she finished that first sentence. That ability to draw together a dozen different subjects in a single conversation, to weave between and weave together different ideas, never failed to amaze her students, and I suspect there's an echo of it in my writing even today.

But in a sense the questions I'm working on now are not outside her area at all. What Riki showed me, through her work and her life, is that far from being an escape from real life, the life of the mind can serve as a model for how to build richer lives.

Indeed, there's a parallel between our engagement with books and ideas, and our dual lives in the physical and digital worlds.

The categories of "real world" on one hand, and "digital world" or "academic world" on the other, can be remade, and in the course of doing so, we can make better, richer lives for ourselves. A more thoughtful understanding of our everyday engagements with technology can make our lives better. It's an attempt to make sense of how we should define what it means to be human, how to think about the divide between people and technologies, and to see that the challenge and the opportunity we face is not to learn how to live in real life, but to learn how better to use tools and time to have a richer life.

May 14, 2013

Yesterday I found out that one of my mentors from college and graduate school, Henrika Kuklick, died.

Riki was one of the professors who got me hooked on the history of science, and along with Rob Kohler helped make me who I am. In the fall of my freshman year I had taken a seminar with Tom Hughes, mainly because it sounded interesting and he had a Ph.D. from the University of Virginia, and then in the spring had a class with Hughes and Rob, who would go on to be my undergraduate and graduate advisor. In my sophomore year I took Riki's sociology of science class, and from then on hardly a semester went by when I wasn't taking something with her.

Riki was a kind of intellectual performer I'd never encountered before. I never knew anyone who could keep track of so many thoughts: I marveled at how she could start a sentence, divert herself, then go off on something else, but then work her way back up and finish the sentence 20 minutes later. She had a kind of unreserved enthusiasm for life and ideas that really resonated with me; my decision to work on Victorian science was influenced in no small part by her description of living in England and working in the archives there. When I was a bit older and had more of a critical sensibility, I found her scholarship to be really outstanding, erudite without being purposely complicated: I taught her Great Zimbabwe Ruins article in several of my classes, and it always went over well.

She was also a great person and teacher, always supportive and generous, great at helping you think through arguments. Not the closest reader, though; lots of chapters came back with "Good work" scrawled at the end, and little more. (That's why you needed Rob Kohler on your committee. That man could line edit a diffraction grating.)

There are lots of people who can hardly remember their classes from college, or the professors they had. Riki, in contrast, introduced me to a set of questions about the ways people, ideas, and technologies interact that I'm still dealing with. It's why I dedicated my first book to her and Rob. And I think I'll spend the rest of my life working on things that we talked about. Fortunately they're very big questions.

I find that as I close in on 50, I don't particularly notice my age: I've had some grey hair since I was in graduate school (it'll do that to you); aside from bifocals, I'm in no worse physical shape than I was (though that's not the highest bar ever set); and more important, I'm a better writer and thinker than I've ever been in my life. But what I can't comprehend is other people getting older, too: my parents are in their 70s, which I find weird, and Riki was 70, which to me is inconceivable: my memory of her was fixed in the 1980s.

It's one of life's ironies that the gap a person leaves when they're gone is as large as the impact they made when they were alive. By that standard, Riki's passing leaves a very large gap indeed.

November 21, 2012

"We practitioners and quants aren't too fazed by remarks on the part of academics – it would be like prostitutes listening to technical commentary by nuns." (From his new book Antifragile, rather negatively reviewed in the Guardian)

Ferguson's critics have simply misunderstood for whom Ferguson was writing that piece. They imagine that he is working as a professor or as a journalist, and that his standards slipped below those of academia or the media. Neither is right. Look at his speaking agent's Web site. The fee: 50 to 75 grand per appearance. That number means that the entire economics of Ferguson's writing career, and many other writing careers, has been permanently altered. Nonfiction writers can and do make vastly more from speaking, and more easily, than they could ever make any other way, including by writing bestselling books or being a Harvard professor. Articles and ideas are only as good as the fees you can get for talking about them. They are merely billboards for the messengers.

That number means that Ferguson doesn't have to please his publishers; he doesn't have to please his editors; he sure as hell doesn't have to please scholars. He has to please corporations and high-net-worth individuals, the people who can pay 50 to 75K to hear him talk. That incredibly sloppy article was a way of communicating to them: I am one of you. I can give a great rousing talk about Obama's failures at any event you want to have me at.

What's so worrying about this trend is that Niall Ferguson, once upon a time, was the best. I'm one of the few people who have actually read his history of the Rothschilds, The World's Banker, all 1,040 pages of the thing, and it is brilliant, a model of archival research. I find it fantastically depressing that the man who could write that book could end up writing a book like Civilization or an article with just as much naked silliness as the Newsweek cover.

I feel very much the same way about Victor Davis Hanson, a man whose military history is really absolutely first-rate, whose The Other Greeks fairly exploded with insight into Greek society and philosophy, but who's been mailing in sloppy, thoughtless pieces ever since he left the farm for The Farm. Sad.

July 27, 2012

George Monbiot calls publishers like Elsevier and Springer "the most ruthless capitalists in the Western world":

What we see here is pure rentier capitalism: monopolising a public resource then charging exorbitant fees to use it. Another term for it is economic parasitism. To obtain the knowledge for which we have already paid, we must surrender our feu to the lairds of learning.

Open-access publishing, despite its promise, and some excellent resources such as the Public Library of Science and the physics database arxiv.org, has failed to displace the monopolists…. The reason is that the big publishers have rounded up the journals with the highest academic impact factors, in which publication is essential for researchers trying to secure grants and advance their careers. You can start reading open-access journals, but you can’t stop reading the closed ones.

June 05, 2012

Michael Lewis' Princeton commencement address is terrific. After the obligatory opening joke ("Members of the Princeton Class of 2012. Give yourself a round of applause. The next time you look around a church and see everyone dressed in black it’ll be awkward to cheer. Enjoy the moment"), he talks about writing Liar's Poker and the role of luck in making that book possible:

I was 28 years old. I had a career, a little fame, a small fortune and a new life narrative. All of a sudden people were telling me I was born to be a writer. This was absurd. Even I could see there was another, truer narrative, with luck as its theme. What were the odds of being seated at that dinner next to that Salomon Brothers lady? Of landing inside the best Wall Street firm from which to write the story of an age? Of landing in the seat with the best view of the business? Of having parents who didn’t disinherit me but instead sighed and said “do it if you must?” Of having had that sense of must kindled inside me by a professor of art history at Princeton? Of having been let into Princeton in the first place?

This isn’t just false humility. It’s false humility with a point. My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don’t want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.

where students decide on a career plan -- academic or nonacademic -- they want to embark on by the end of their second-year of graduate study, file the plan with their department, and then prepare projects and dissertation work that would support that career…. This would represent a dramatic shift from the current norm, whereby many humanities grad students say that their entire program is designed for an academic career, and that they only start to consider other options when they are going on the job market -- a bit late to shape their preparation for nonacademic options.

February 13, 2012

I'm going to blow through this quickly, so I can get back to real stuff, but I couldn't let this awfulness go unremarked: Gary Olson's latest essay in the Chronicle of Higher Education, on "How Not to Reform Humanities Scholarship." The piece starts by noting "the growing number of commentators" at the recent Modern Language Association meeting "who were recommending changes in how the discipline conceives scholarly work." I suspect if you went to any MLA between, say, 1960 and today, you could print that sentence and it would ring true, but let's take Olson's word that such calls are becoming more frequent and confident.

Certainly, he says, the number of people contacting him to say how terrible such things would be is on the rise. Whatever their good intentions,

Such recommendations, my callers unanimously agreed, would damage not only the careers of aspiring and new professors but also the reputation of the humanities. The proposed changes would also present substantial challenges to academic administrators charged with evaluating scholarship for tenure and promotion.

I'll just note four huge problems with the essay.

The first is the clumsy use of "some people worry about something, so that's evidence" as a form of argument. (One might argue that in a soft field like the humanities perception is reality, but given that this is an essay arguing for the strength of humanistic thinking and scholarship, I think Olson doesn't want to go there.) So you get claims like this:

Some veteran faculty members worry that graduate students and young faculty members—all members of the fast-paced digital world—are losing... their capacity for deep concentration—the type of cognitive absorption essential to close, meditative reading and to sustained, richly complex writing.

And this:

[A]llowing doctoral students to produce alternative projects may well disadvantage them on the job market, as hiring committees—or at least some members of them—may not be as receptive to experimental forms and may favor candidates who have, in fact, produced a monograph…. "I can just imagine how my colleagues in our very traditional department would respond to a colleague's tenure application if most of the work were digital," said one department chair. "We would have a clash of cultures and values, and, sadly, I know who would win."

And finally this:

It is true that more and more online journals are claiming to employ a peer-review process. That could be a positive development if we can arrive at a point where the community of scholars has confidence that the review process in online venues is as rigorous as it is in top-tier print journals. At the present, however, many scholars are still skeptical that the processes are equivalent.

Now, the argument that people don't concentrate any more in the digital age is one worth having; my book contends that while plenty of people feel like their faculties of concentration and memory are under assault, it absolutely doesn't have to be this way. Connection is inevitable, but distraction is a choice. But "some people say" is not proof.

Nor, I think, is the argument that "we shouldn't do it because the old fogies would shake their canes and yell, you kids get off my lawn" particularly convincing. It's an unfortunate reality that some people don't like new stuff. But that's not a reason not to do new things that are good.

The second problem is that, tragically and not surprisingly, the assumption is that humanities Ph.D.s are all bound for academic jobs, and that training for other professions is more or less unthinkable.

Hence the equivalence of "job market" and "hiring committees," even though 1) only a fraction of humanities Ph.D.s are ever going to get tenure-track jobs, and 2) other industries are much more likely to see the value of an innovative piece of work than the search committee chair whose last book was published by Yale UP in 1977. Google's HR people won't care that you haven't produced a monograph, so long as you've created something else that displays imagination, an ability to think deeply, and a capacity for focus.

More generally, the essay betrays an unwillingness, shared by far too many members of this generation of scholars, to admit that their field is not in some temporary crisis from which it will soon recover, and that good people are ground up and denied futures for structural reasons. Instead, you get things like this:

Besides, the typical rationale for abandoning the traditional dissertation—that the time-to-degree for the humanities doctorate is too long—is not a function of the monograph as a genre; it is a function of some dissertators' personal lives, as they attempt to juggle numerous priorities along with completing a dissertation.

So what recommendations does the essay embrace? How do we move forward to improve humanities scholarship?

This is the third problem with the essay: for the life of me, I cannot tell.

Olson doesn't seem to say, except to imply that we need more of the same, only better funded. Like too many academics, he seems to believe that if we wait long enough, the fairies will come and sprinkle gold dust on everything. There's no effort to distinguish good reform proposals from bad, to suggest how the rigor of traditional peer review could be brought to electronic journals, or to say how we might use other Web-based metrics (trackbacks, hits, number of comments, and other updated bibliometrics, for example) to help make informed judgments about digital scholarship.

Fourth and finally, I think the essay gives very short shrift to older faculty. As the son of someone who retired after twenty years at CSM, and then immediately went to Singapore for two years, I've seen at first hand that the relationship between age and personal conservatism is only as strong as you want it to be. Olson ends up constructing two sets of straw men, digital Panglosses and aging Cassandras, and thus does justice to neither.

January 03, 2012

The Republican candidate Newt Gingrich and the cable channel History have both followed the same formula for success, by elevating fantasy over actual history. The difference, however, is that Newt wants to carry his sensational vision of a bygone age into office.

I think the Grafton and Grossman essay points in the right direction, and it inspires two suggestions and a caveat.

First, for students in the early stages of the dissertation, it could be tremendously helpful for a department to bring in a literary agent for a day. There are agents who specialize in academic-to-trade crossover projects, and the business is competitive enough for there to be some younger agents who'd find the prospect of representing an entire department interesting. In an afternoon, the agent could explain how the whole selling books for money thing works, and interested students can pitch their dissertations as book proposals.

It would be just the beginning of the process of turning a thesis into a trade book, not the end; but you have to start somewhere, and if it's possible to craft a Ph.D. with an eye to immediately converting it into a trade press manuscript-- preferably by just stripping out the footnotes and some of the academic framing in chapter 1-- that would do a lot to acculturate young Ph.D.s to the idea that they don't have to make Faustian bargains to make a living writing. (Of course you can if you want, but the academic vs. trade route is not a choice between freedom and serfdom: it's a choice between two different sets of pressures and constraints.)

This would do several things: help demystify the world of trade publishing, give students a sense of how their projects could be crafted for a broader audience, and, for at least some, get some funding for the writing. Not every dissertation is the next "Longitude," but I'll bet a surprising number could be crafted for the trades. My agent was phenomenally valuable in shaping my current book; without her, I'd still be trying to get MIT Press to return my phone calls. Instead, I'm in a very different position.

This might also help deal with a second issue. The biggest thing I had to deal with after finishing my dissertation was a sense of narrowed professional horizons. The cruel irony is that newly-minted history Ph.D.s tend to have a sense that they're LESS able to survive in the world than when they graduated from college, and often less interested in doing so. I'm not really sure there's a whole lot anyone can do to reduce this. It can help to bring in people like me who've had intellectually interesting lives (interesting to me at least) outside academia, but I think graduate school requires internalizing the cultural norms in order to survive-- not to mention justify the intense focus on a narrow subject, deferred income, etc.

At the same time, there's a critical thing that must be maintained in graduate school at all costs. Spaces for contemplation are being torn up faster than rain forests: just look at the mania for collaborative spaces in library architecture, the assumption that knowledge work is all about networking and idea-sharing, the arguments among (both evangelical and liberal) Protestant ministers over bringing social media into church services ("RT Luke 3:16 LOL #atchurch"), etc. etc.

If there is one great thing I got from graduate school that has sustained me in all my professional endeavors, it's not just the capacity to write and produce knowledge-- scholarly knowledge, popular pieces, even slightly disreputable consulting "product" with what Stephen Colbert might call "knowledginess"-- but an understanding that serious thinking really requires time and sustained, slightly manic, attention. There are precious few places outside universities-- and fewer and fewer places within the academic "marketplace of ideas" (kill me now)-- that take the vita contemplativa seriously; one of the best things you can do for students is help them learn how to live that life, and to make it portable.

October 03, 2011

Inside Higher Ed reports that the American Historical Association has just released a position paper, co-authored by AHA president Anthony Grafton and AHA executive director James Grossman, arguing that non-academic careers for history Ph.D.s shouldn't be thought of as some kind of aberration or "plan B," but recognized as the new normal.

For years now, humanities and other disciplines have promoted "alternative" careers for new Ph.D.s, trying both to increase the range of opportunities available to new graduates and to ease the competition just a bit in the academic job market.

The president and executive director of the American Historical Association have just released a statement calling for their field to abandon the idea that any career path -- including those paths outside of academe -- be classified as "alternative." It is time, they argue, to admit that the academic job market is not coming back anytime soon, that many new Ph.D.s who find jobs outside academe find rewarding work (both financially and intellectually), and that the doctoral experience needs to change in some ways so that new Ph.D.s have more options.

It's only taken 20+ years to recognize that graduate training promotes an outmoded, unrealistic (and, I would argue, unnecessarily narrow) set of career expectations. But maybe attitudes will actually start to change. Or perhaps graduate programs will just reduce their enrollments by 50%, to reflect the permanently reduced size of the academic job market, and to keep from having to change their way of working.

September 22, 2011

After a couple months of work, I've been thinking about the experience of writing a serious non-fiction book. It's been a stretch for me, in quite a good way so far: I'm writing about something big that I'm passionate about, but in a manner that I find new and very challenging.

First, writing without footnotes is a pretty liberating experience. I'm the sort of scholarly writer who likes to recapitulate his entire intellectual history in the first five footnotes, and to construct a dense thicket of citations to support my main text. If anybody doesn't know, this is part of an academic game that has several goals: creating a defensive barrier below your work that keeps it from being undermined ("well, yes, you would think that's a flaw in my argument if you haven't read these sixteen other things"), creating a place for your work in the literature, and sending little mash notes to people whose work you like. The downside is that this is an enormously time-consuming game, and it's a great way to procrastinate; if you get too caught up in it, it gets harder to actually write.

Little, Brown doesn't do footnotes; instead, their books have bibliographic essays at the end. This means that I don't have to document every claim I make as I write it; I need to keep track of what I'm doing and where things come from, of course, but there's a whole slice of literary labor that I can forget about. The standard in the industry is also to quote other people sparingly, unless they're Shakespeare or Yogi Berra; as my editor explained, they're buying MY ideas, not my gloss on someone else's.

The result of all this has been that I'm writing faster; it also means that I'm constructing a different kind of relationship between this work and my sources, and between my authorial self and other writers in the field.

Put most simply, knowing that I can't impress readers with spectacular acts of citation jujitsu means that I have to make the work itself more compelling, and my own voice more authoritative. A footnote citing half a dozen books can be the intellectual version of an incomplete sentence, an erudite way of saying, "Well, you know..." With this, I have to actually FINISH the thoughts, and make them mine.

The practice of quoting other writers sparingly has one big benefit: it means I'm doing more interviews. Even when I can pull a quote from something a person has written, it's better to quote from an interview. This is, in effect, a great excuse to have conversations with interesting people, which is something I always enjoy. And fortunately they're quite forgiving when we go over things they've already written about; few people actively dislike talking about their work, and most of them know how this game works.

What I find really unexpected is that this kind of authority-- writing that depends more on what the AUTHOR does than on whom the author cites-- is, for me at least, truer to the ideal of scholarly authority. It forces you to take complete responsibility for your ideas. (Even at the Institute, while we didn't use footnotes, when readers-- usually clients or prospective clients-- challenged our ideas, we often responded by saying, in effect, we're just telling you what our expert sources told us.) You can argue that some writers abuse this, by appropriating other people's ideas, or not sufficiently acknowledging their debts; I hope to avoid that, but I can now see how it happens.

I've also been struck by how much writing is a business, albeit one that requires a high degree of focus and creativity. Even after editing the Encyclopaedia Britannica, publishing an academic monograph, and turning out articles in newspapers, Scientific American, and lots of academic journals, I'm learning a LOT about how the trade book market works, and it's pretty different from everything else.

With my ambitious 1,000 word/day writing schedule, I'm also having to be very ruthless about my time and avoiding distractions. I'm not always successful (WILL JACK EVER ESCAPE FROM THE OTHERS? WHAT THE HELL IS THAT SMOKE MONSTER?), but this kind of writing requires starting early (the days when I'm up before 6, and get some writing done before I have to take the kids to school, are the most satisfying), and not giving up. People who think you get inspired, then rush to the keyboard and write in a creative frenzy, have it exactly backwards: you sit at the keyboard, and hope you can get to that state.

At the same time, while you need to hit your deadlines, you also need to be creative: anyone can tell the difference between what I've written when I'm really engaged and passionate, and what I write when I'm turning out Product. People don't tell their friends that they have to read this Product; Terry Gross doesn't interview writers about Product. They want strong, passionate writing, and creating it is... a challenge.

Paradoxically, I think setting a 1,000 word/day pace for myself turns out to be a good way to bring on that more creative state, that feeling of being entangled with the work. The more you're able to write to a schedule, the more likely you are to hit those great moments when you feel like you're transcribing ideas that come from somewhere other than your own mind. It can take at least a day to get to that mental state where the ideas really flow well; inspiration doesn't come in a flash, but after a long run-up. Put another way, those states can, to some degree, be induced: you can start wordsmithing and end up doing something really creative. This helps explain Frans Johansson's observation (in The Medici Effect) that creative people do some of their best, most memorable work when they're doing a LOT of work. We assume that masterpieces are the result of long solitary focus on a single problem, but they're more usually part of a bigger enterprise.

April 05, 2011

This echoes feelings I've had, and I've heard plenty of other people express:

I have given up my secure academic job as Reader at the University of the West of England for the vagaries of life as a freelance. And why? Because I want to work - really work - and my job made that impossible.

Am I mad? The losses include a reliable salary, a pension, sick pay, a heated room, and a computer that someone comes to mend when it breaks down. But I can do without those (I think). The gain is a true academic life at last. I can devote my time to thinking, and reading and writing; to sharing ideas with others; to asking questions of the universe and trying to find the answers. The simple fact is that I could not do these things and the job.

I'm reading Blackmore's Zen and the Art of Consciousness, which I think does a brilliant job of communicating how difficult meditation and mindfulness exercises are.

December 14, 2010

Okay, all publishing involves at least a little bit of vanity, but... I recently published an article in an Elsevier journal, and today they sent me a message about their "article services."

Think all you can get are reprints? Think again! I could get an "eye-catching, full-color, poster of your article on the cover of the journal," or "an attractive color poster" of my article, "Perfect for your lab or office," or a "Certificate of Publication... in a high-quality frame, dark brown wood with gold trim."

Just in time for the holidays! Except probably not.

I wonder in which countries, or which disciplines, these things sell. Academic life has lots of well-worn rules about display and status, and the book-lined office, piles of paper on the desk (and floor and extra chair), and a harried yet abstracted expression are all signifiers of The Life and how well you play it. (Few things mark the boundary between tenured faculty and adjuncts more powerfully than their control of space: the bare office shared with two other people fairly screams, "I'm just here temporarily, pay no attention to me.")

But having a poster advertising an article... that seems over the top, at least in the places I taught. But maybe in places that are very status- and publication-conscious, it's actually useful to have such in-your-face markers of accomplishment?

[To the tune of Keith Jarrett Trio, "You Took Advantage Of Me [Live]," from the album Yesterdays [Live] (a 2-star song, imo).]

August 06, 2010

in the findings of fact Judge Walker ended up relying almost exclusively on the plaintiffs’ witnesses, above all Nancy Cott, whose work is indeed excellent. Gays and lesbians should thank her, George Chauncey, Hendrik Hartog, and indeed the entire American historical profession that supported their work. Historians have made very clear what marriage has and has not been throughout American history, even if it doesn’t necessarily square with many of our received understandings about the way things always were. The truth is that marriage has been a continuously changing institution, not one settled for all time, and that same-sex marriage is simply one more change in a direction that aligns marriage more closely with our ideals and values. Historians have performed an invaluable service to the cause of liberty and human dignity in this case.

[To the tune of Shirley Bassey, "Diamonds Are Forever (Mantronik Diamond Cut Club Mix)," from the album The Remix Album...Diamonds Are Forever (a 4-star song, imo).]

July 03, 2010

I have a bunch of books-- probably a couple hundred-- from my professional/scholarly collection that I want to give away. Most are history (with an emphasis on European and British history), history of science (largely modern, but a respectable smattering of early modern), STS, and contemporary technology and business. Many are duplicates (how did I get three copies of Rheingold's The Virtual Community and Benedict Anderson's Imagined Communities?); others are books I've carried around for years and realize I will never read again (holla, Renaissance Self-Fashioning!); and various others no longer match my current or likely future interests (Bernal's 3-volume history and David Lindberg's The Beginnings of Western Science are both great, but I'm not likely to teach intro history of science again).

I would prefer they go to someone in the field-- ideally a history or STS grad student or postdoc-- rather than just be donated to my local library's book sale, and I don't want to go to the trouble of putting them up on eBay. Is there an academic equivalent of Freecycle that I can use to connect with some worthy soul (who will agree to pay shipping)?

[To the tune of Johann Sebastian Bach, "Contrapunctus III (Fuga A 4 Voci)," from the album The Art of Fugue Vol. 1 (a 2-star song, imo).]

May 27, 2010

Steve Eisman, "the outspoken investor whose huge wager against the subprime mortgage market was chronicled by author Michael Lewis in his bestselling book The Big Short," talking about the for-profit education industry:

Until recently, I thought that there would never again be an opportunity to be involved with an industry as socially destructive and morally bankrupt as the subprime mortgage industry. I was wrong. The for-profit education industry has proven equal to the task.

As Mother Jones elaborates,

Driving much of the growth, Eisman explained, was the sector's easy access to federally guaranteed debt through Title IV student loans. In 2009, he said, for-profit educators raked in almost one-quarter of the $89 billion in available Title IV loans and grants, despite having only 10 percent of the nation's postsecondary students.

Eisman attributes the industry's success to a Bush administration that stripped away regulations and increased the private sector's access to public funds. "The government, the students, and the taxpayer bear all the risk and the for-profit industry reaps all the rewards," Eisman said. "This is similar to the subprime mortgage sector in that the subprime originators bore far less risk than the investors in their mortgage paper."...

Another similarity between subprime lending and for-profit education is this, Eisman said: Both push low-income Americans into something they can't afford.... [Finally], the industry's era of massive profits—ITT is more profitable on a margin basis than Apple, he notes—are about to end, thanks to new government regulations in the pipeline.

Wonderful.
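
To spell out the disproportion in those Title IV numbers (my arithmetic, not Eisman's): roughly a quarter of $89 billion is about $22 billion, flowing to schools that enroll only 10 percent of the nation's postsecondary students-- about two and a half times their proportional share.

\[
0.25 \times \$89\ \text{billion} \approx \$22\ \text{billion}, \qquad \frac{25\%}{10\%} = 2.5
\]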

[To the tune of Pink Floyd, "Shine On You Crazy Diamond (Parts I-V)," from the album Wish You Were Here (a 4-star song, imo).]

May 26, 2010

A while ago I wrote about reinventing academic talks. It got me thinking about how to better design workshops or conferences that bring together scholars or scientists (who, broadly speaking, like to think about stuff) with policy people, corporate strategists, and military people (who, broadly speaking, also like to think, but really like to DO).

I noticed a traffic spike on the blog, thanks to Lexi Lord's essay on post-academic life in the recent Chronicle of Higher Education (thanks, Lexi!). She talks about how she decided to leave a tenure-track position, and her discovery of the fact that you don't have to be an academic to have an interesting intellectual life (and indeed, can have a more interesting one if you're not a professor):

Because I live in a large city, as opposed to the small college towns where I was a professor, I live in a world of museums, lectures, public seminars, extraordinary bookstores, fantastic archives, and libraries. I live in a place that has racial as well as ethnic diversity. All of those factors encourage me to think about historical problems in a rigorous albeit different fashion from how I saw them in academe....

I live where a lot of archives are—which makes research easier than it was in academe. I write and publish. My new book, researched and written completely outside academe, was just published by Johns Hopkins University Press.

Since leaving academe, I have continued to endorse the belief that being an intellectual entails analyzing and understanding issues from multiple angles. I hope that in advising their undergraduates, academics will encourage their students to share that view. More important, I hope faculty members will encourage students to do informational interviews and extensive research on career options—before entering a Ph.D. program, which is, after all, only one path to the life of the mind.

This is always good advice, but it's especially timely, given that last night I had an experience that reminded me of the increased feasibility of pursuing academic projects outside the university.

I recently became interested in the concept of unintended consequences, and how the term is used to either describe or excuse the unexpected. It would be obvious to start such an essay with "a Raymond Williams Keywords-like analysis of its history," and last night I decided to poke around a little bit and see if I could find some early uses.

A little time on Google Scholar turned up the fact that Robert Merton wrote an article about the term in 1936, and died with a book on unintended consequences still unfinished-- a warning that I should be very tactical in how I approach the subject. (The fact that Google Scholar has "Stand on the shoulders of giants" as its motto warms my heart, since Merton wrote a book on the phrase.) That took a few minutes.

I then jumped over to the Stanford Library Web site, to see if Poole's Index of 19th Century Periodicals was online. When I was writing my dissertation, I spent a LOT OF TIME with Poole's-- it was an invaluable resource, and I remember many hours in the Penn and UC Berkeley libraries, looking for article citations, then tracking them down in the stacks. Instead, I quickly found the 19C Index, an online repository / directory that includes Poole's, but also a number of other 19th century indexes, publications, scanned magazines, etc.

For the next couple hours, I tracked down various combinations of unintended, unexpected, and unanticipated, and effects or consequences; by bedtime, I had a couple pages' worth of material written (most of it footnotes and quotations, of course).

All this happened on my couch, with the "Biggest Loser" finale in the background.

I wouldn't give up those days spent in the library for anything, and I still really enjoy going to libraries to read and write. But the point of the story is this: while fifteen years ago (when I did it) successfully leaving academia but remaining intellectual required geography and attitude-- I could do it because I was living in Chicago, Lexi was in DC, and we both were willing to keep a growth mindset about the next phase of our lives-- today, resources like 19C make it even easier to do serious scholarly work-- at least preliminary scholarly writing-- without being close to libraries. I'm about three miles away from Green Library, but with kids, work, and other stuff, it's hard to get there, and impossible to just dash over to the reference section to check up on something (as I could when I was single and living a mile from the Berkeley campus).

So what Lexi argues in her recent piece, and what I argued years ago, is more true than ever: the raw resources for pursuing academic projects are more accessible and portable than ever. It still often requires maintaining some kind of connection with an academic institution-- my Stanford affiliation gets me access to the online databases like 19C and JSTOR-- and you still have to manage all the logistical stuff required to carve out time for yourself, but the Web at least seriously lowers the barriers to getting access to the resources necessary to support a real intellectual life.

I know that projects like JSTOR are intended to support academics, but I think they're even more valuable for people who are doing serious intellectual work but who aren't academics. These services were designed to support scholarship, and they're doing that... but the most profound benefits aren't going to the people they were originally designed for.

April 06, 2010

Literature, like other fields including history and political science, has looked to the technology of brain imaging and the principles of evolution to provide empirical evidence for unprovable theories.

Interest has bloomed during the last decade. Elaine Scarry, a professor of English at Harvard, has since 2000 hosted a seminar on cognitive theory and the arts. Over the years participants have explored, for example, how the visual cortex works in order to explain why Impressionist paintings give the appearance of shimmering. In a few weeks Stephen Kosslyn, a psychologist at Harvard, will give a talk about mental imagery and memory, both of which are invoked while reading.

While this is very interesting, the practice of drawing on the sciences (particularly cognitive science) to inform the humanities is less new than the article suggests: E. H. Gombrich's classic Art and Illusion opens with a discussion of the latest findings on perception and cognition (from the 1950s, obviously) and how they should be applied to art history and criticism.

[To the tune of Tabla Beat Science, "Tala Matrix," from the album Live In San Francisco At Stern Grove [Disc 2] (a 3-star song, imo).]

February 10, 2010

Graduate school in the humanities is a trap. It is designed that way. It is structurally based on limiting the options of students and socializing them into believing that it is shameful to abandon "the life of the mind." That's why most graduate programs resist reducing the numbers of admitted students or providing them with skills and networks that could enable them to do anything but join the ever-growing ranks of impoverished, demoralized, and damaged graduate students and adjuncts for whom most of academe denies any responsibility.

February 09, 2010

When France’s most dashing philosopher took aim at Immanuel Kant in his latest book, calling him “raving mad” and a “fake”, his observations were greeted with the usual adulation. To support his attack, Bernard-Henri Lévy — a showman-penseur known simply by his initials, BHL — cited the little-known 20th-century thinker Jean-Baptiste Botul.

There was one problem: Botul was invented by a journalist in 1999 as an elaborate joke, and BHL has become the laughing stock of the Left Bank....

Mr Lévy admitted last night that he had been fooled by Botul, the creation of a literary journalist, Frédéric Pages, but he was not exactly contrite.

Appearing on Canal+ television, he said he had always admired The Sex Life of Immanuel Kant and that its arguments were solid, whether written by Botul or Pages. “I salute the artist [Pages],” he said, adding with a philosophical flourish: “Hats off for this invented-but-more-real-than-real Kant, whose portrait, whether signed Botul, Pages or John Smith, seems to be in harmony with my idea of a Kant who was tormented by demons that were less theoretical than it seemed.”

Granted I haven't had any coffee this morning, but it sounds like Lévy's argument is, "Yes the work I cite is fiction, but it says what I think, so I'll continue to reference it." Which sounds rather like an appeal to truthiness: it's not true, but it kind of looks true, and confirms my own beliefs, so I'm going to find it convincing.

[To the tune of They Might Be Giants, "Lazyhead and Sleepybones," from the album No! (a 3-star song, imo).]

December 18, 2009

With colleges and universities cutting back because of the recession, the job outlook for graduate students in language and literature is bleaker than ever before.

According to the Modern Language Association’s forecast of job listings, released Thursday, faculty positions will decline 37 percent, the biggest drop since the group began tracking its job listings 35 years ago.

The projection, based on a comparison between the number of jobs listed in October 2008 and October 2009, follows a 26 percent drop the previous year.

I read this, and wondered what bothered me about it. Obviously the news itself is bad, but not surprising: the academic job market has been a disaster area for a generation now, it's not going to get better, and anyone who thinks it will is delusional. When I was in grad school in the late 1980s, the conventional wisdom was that we were hitting the job market at exactly the right time: the generation that was hired during the Great Expansion in the 1950s and 1960s would retire, and we'd cruise into those positions.

Needless to say, that didn't happen, and the fact that many of those jobs were converted into short-term positions should have been a clear signal that The Market Had Changed.

But this is old news. What gets me about this piece, I realized, is how it's framed. It equates "the job outlook for graduate students in language and literature" with "the academic job market": there's no sense that Ph.D.s might be capable of doing SOMETHING ELSE with all that knowledge. Demonstrably wrong, guys.

[To the tune of British Sea Power, "Something Wicked," from the album The Decline Of British Sea Power (It's a 1-star song, imo).]

October 24, 2009

It was 2003, and Warren, an earnest-sounding and ever enthusiastic Harvard law professor who specializes in bankruptcy, was on the set of Dr. Phil. She had written a book with her daughter called The Two-Income Trap: Why Middle-Class Mothers & Fathers Are Going Broke, and she'd expected to sit next to the host and explain its key points. Instead, Dr. Phil was interviewing a stressed-out couple with serious medical and financial troubles. After they mentioned they had obtained a second mortgage to pay off their credit card debt, the lights went up on Warren, and Dr. Phil asked her if this had been a smart step. No, she declared, because now they could lose their home if they defaulted.

As soon as her turn was over, Warren found herself thinking, "You've been doing this work for 20 years now, and it is unlikely that any of it has had as direct an impact as these 45 seconds." She had reached millions, some of whom might actually pay attention to her advice. "So here you are, Miss Fancy-Pants Professor at Harvard. What do you plan to do now? Is it all about writing more academic articles, or is it about making a difference for the families you study? I made a decision right then: It was for the families, not the self-aggrandizement of scholarship."

Since then she's proved herself to be surprisingly mediagenic, in a very understated, just-drove-the-minivan-to-the-office kind of way. At the same time, she's not given to oversimplification or jargon: she's really good at explaining the stakes in TARP (she's part of the Congressional office that tries to oversee TARP), where the money's going, or why we don't know where the money's going. (Check out her appearance-- in two parts-- on The Daily Show.)

Yet despite, or more accurately because of, her willingness to choose "families" over "the self-aggrandizement of scholarship,"

Harvard economists... dismiss Warren as insufficiently theoretical. "They think she shouldn't be talking about bankruptcy except as someone in the economics department would—that is, with formulas and theorems, not about how it affects real people."

I suppose this drives me around the bend for two reasons.

First, my work has sometimes been accused of being insufficiently theoretical (usually in readers' reports), as if theory is the sine qua non of importance. As Taibbi would put it, first of all, few kinds of scholarly work are harder, or less likely to stand the test of time, than theory; and second, what the fuck? When did we all turn into mini-Derridas? Isn't theory a tool? I mean, we all use word processors, but I don't see many of my colleagues rushing to create their own versions of Microsoft Word.

Second, probably the single greatest personal intellectual epiphany I've had since leaving academia is that the real world actually has interesting problems: not just problems that you ought to deal with because life as we know it could get pretty screwed up if we don't, but problems that are actually intellectually engaging, make use of the cognitive muscles you developed in academia, force you to develop new abilities, and expose you to interesting questions you would never have discovered otherwise. The assumption that academia is where people grapple with interesting questions, and the business world is where stupid things happen, is just wrong.

[If] someone like Elizabeth Warren doesn't want that responsibility, well, she shouldn't have gone into office and gone on TV making all that sense and shit. She's pushed for transparency in the Fed, is openly furious about the misuse of bailout money, and seems to take personally the chicanery that credit card companies and banks use to game the suckers out there. I simply cannot see her suddenly flipping and holding $2000-a-plate fundraisers with Lloyd Blankfein and Jamie Dimon.

[To the tune of Bruce Springsteen, "This Land Is Your Land," from the album Live 1975-85 (I give it 4 stars).]

October 07, 2009

Fights broke out as law students queued for up to 11 hours last night to secure the dissertation supervisor of their choice at Brunel University.

More than 100 students queued outside Brunel Law School overnight in the hope of working with their preferred academic, after the school introduced a first-come, first-served supervisor-allocation system.

I love the University's utterly tone-deaf response.

A spokesman for Brunel said the university was “very concerned” that law students had queued overnight and was “disappointed to see the lengths to which some feel they have had to go”.

“In preparing for their dissertation, students are informed that neither their choice of topic nor their first choice of supervisor can be guaranteed. It seems that they have done all they can to try to achieve their first topics and supervisors.”

Ummm.... why should that be disappointing, or any kind of surprise?

[To the tune of Sun Kil Moon, "Neverending Math Equation," from the album Tiny Cities (I give it 3 stars).]

September 27, 2009

The cause of the meltdown in global financial markets is obvious: leveraged trading in financial instruments that bear no relationship to the things they are supposed to be secured against.... The academy, too, is a market - a large one in which the value of any piece of research is ultimately secured against the world. If the world is not as described or predicted in the article or book, the research is worthless. A paper that claims that autism is caused by vaccination or terrorism by poverty is valuable only if it turns out to be a good explanation of autism or terrorism. That is why an original and true explanation is the gold standard of academic markets....

The academic market is also like the financial market in another way. Stocks trade above their value, which leads to bubbles and crashes. Brain-imaging studies, for example, are a current bubble, not because they don't tell us anything about the brain, but because the claims made for them so vastly exceed the information they actually provide.... [E]very week we read in the science pages that brain-imaging studies prove X, where X is what the readers or columnists already believe. Women can't read maps! Men like sex! Childhood trauma affects brain development! There is an Angelina Jolie neuron!...

[Much scholarship by] [h]istorians, anthropologists, linguists and even philosophers... is unsecured and highly leveraged. By this I mean that people in the humanities often do not write about the world or the people in it. Rather, they write about what somebody wrote about what somebody else wrote about what somebody else wrote. This is called erudition (not free association), and scholars sell it to their audience as a valuable insight about the nature of terrorism or globalisation or the influence of the internet (preferably all three). Almost every grant application in the humanities mentions these three topics, but the relationship between them and the names and concepts dropped en route are utterly obscure.

None of this would matter if the market were basically self-correcting like the science market, or erratic but brutally self-correcting like the financial markets.... [But] the main corrective mechanism in the humanities is reputation built on publication and, since publication is often based on reputation, the danger of a bubble is extreme. Someone who takes a supervisor's advice to base a career on writing about Slavoj Zizek is in the position of an investor deciding to invest in Bear Stearns on the advice of Lehman Brothers. The price is high and predicted - by those who have a vested interest - to rise further....

Compare the citation for a Nobel prizewinner in chemistry or physics with the way humanities research is evaluated. The Nobel citations are accessible to any intelligent reader.... Things sometimes seem to go the other way with the big names in the humanities. A problem (eg, terrorism) is misdescribed (eg, as an expression of subaltern response to modernity) and a raft of pseudo-explanation is recruited to leave everyone baffled.

From an essay in the Times Higher Education series on the seven deadly sins of academia: when I first read it, this piece on lust made my eyeballs hurt, and not in a good way:

When Willie Sutton was asked why he robbed banks, he is famously said to have replied, "because that's where the money is". Equally, the universities are where the male scholars and the female acolytes are. Separate the acolytes from the scholars by prohibiting intimacy between staff and students (thus confirming that sex between them is indeed transgressive - the best sex being transgressive, as any married person will soulfully confirm) and the consequences are inevitable.

The fault lies with the females. The myth is that an affair between a student and her academic lover represents an abuse of his power. What power? Thanks to the accountability imposed by the Quality Assurance Agency and other intrusive bodies, the days are gone when a scholar could trade sex for upgrades....

Normal girls - more interested in abs than in labs, more interested in pecs than specs, more interested in triceps than tripos - will abjure their lecturers for the company of their peers, but nonetheless, most male lecturers know that, most years, there will be a girl in class who flashes her admiration and who asks for advice on her essays. What to do?

Enjoy her! She's a perk. She doesn't yet know that you are only Casaubon to her Dorothea, Howard Kirk to her Felicity Phee, and she will flaunt you her curves. Which you should admire daily to spice up your sex, nightly, with the wife.... And in any case, you should have learnt by now that all cats are grey in the dark.

So, sow your oats while you are young but enjoy the views - and only the views - when you are older.

Crooked Timber comments that this is a "classic example of the sort of thing where having shown a draft to a single close female friend might have saved the day, and in the process offered a useful insight into the distinction between the concept 'refreshingly un-PC' and the concept 'creepy'."

However, the author answers his critics this way:

This is a moral piece that says that middle aged male academics and young female undergraduates should not sleep together. Rather, people should exercise self-restraint. Because transgressional sex is inappropriate, the piece uses inappropriate and transgressional language to underscore the point - a conventional literary device. At a couple of places, the piece confounds expectations, another conventional literary device, designed to maintain the reader's interest. Sex between academics and students is not funny, and should not be a source of humour. But employing humour to highlight the ways by which people try to resolve the dissonance between what is publicly expected of them and how they actually feel - not just in this context - reaches back to origins of humour itself. In his introduction, [editor] Matthew [Reisz] wondered how many of his contributors would enter into the spirit of levity that inspired the idea of the seven deadly academic sins (submitting a piece on prevarication late, etc) and I suspected that one could get to heart of all that is wrong with sex between scholars and students by employing the good ol' boy language of middle aged male collusion. I'm not sure I'm wrong.

If it's intended to be a piece whose style and tone exemplify its subject, I have to admit it does a decent job. Naturally the piece has generated a huge number of comments, though this one takes the prize:

Professor Kealey assumes that every male academic’s wife mustn’t be that attractive. How wrong! I, for example, am far hotter than any of my husband’s young and inexperienced students could ever (unfortunately for them) hope to be.

Hear hear. When I was in Oxford, walking around in the evening and trying to navigate around the crowds of students in high heels and skirts, I'd sometimes think, "They might be interesting in twenty years." I can't be the only man who reacts like that.

[To the tune of Radiohead, "Scatterbrain (As Dead As Leaves)," from the album Hail To The Thief (I give it 1 star).]

August 26, 2009

Via Crooked Timber, this Inside Higher Ed review of Diego Gambetta's Codes of the Underworld: How Criminals Communicate has a great comparison between projected incompetence among mafiosi, who, according to Gambetta, cheerfully "let the professionals and the entrepreneurs take care of the actual business operations" and admit that they're only good at shaking people down, and a certain brand of Italian academic, the "baroni (barons) who oversee the selection committees involved in Italian academic promotions."

While some fields are more meritocratic than others, the struggle for advancement often involves a great deal of horse trading. "The barons operate on the basis of a pact of reciprocity, which requires a lot of trust, for debts are repaid years later. Debts and credits are even passed on from generation to generation within a professor's 'lineage,' and professors close to retirement are excluded from the current deals, for they will not be around long enough to return favors."

The most powerful figures in this system, says Gambetta, tend to be the least intellectually distinguished. They do little research, publish rarely, and at best are derivative of "some foreign author on whose fame they hope to ride.... Also, and this is what is the most intriguing, they do not try to hide their weakness. One has the impression that they almost flaunt it in personal contacts."

Well, one also has the impression that the author is here on the verge of writing a satirical novel. But a friend who is interested in both the politics and academic life of Italy tells me that this account is all too recognizably accurate, in some fields anyway. Gambetta calls the system "an academic kakistocracy, or government by the worst," which is definitely an expression I can see catching on.

[To the tune of Thievery Corporation Feat. Sister Nancy, "Originality," from the album Versions (I give it 1 star).]

More often than I can believe, someone will preface a reading by saying, "I just wrote this last night." Why on earth, I wonder, would you read something that raw? Generally public readings are set up months in advance. It's not like the speakers don't know they're going to have to have something ready.... But then I remembered that arrogance is often the conjoined twin of insecurity. What those writers wanted us to know, perhaps, was that this new work was the result of pure talent: Just think, audience, how good this would be if it were coupled with labor? If the piece stinks, it's simply a matter of timing. It's not my fault. I could do better, really, I could. I just didn't have the time....

Most academics don't present hastily written papers. But they do something almost as bad. They read their papers aloud. Some professors read their lectures. It's common practice, I know, but frankly, it bugs me. It's hard enough for an audience to follow a short story, where, presumably, some attention is being paid to crafting narrative tension. Having to track audibly an argument written in long, convoluted sentences and leaden, jargon-ridden prose can feel like a forced drowning.... Reading instead of presenting is, I think, the academic equivalent of "I just dashed this off last night." It's an act borne out of (choose as many as apply): fear, insecurity, arrogance, procrastination, habit, poor training, or lack of regard for the audience. It's also just plain lazy. It's a lot of work to think something through and then write it out as a conference paper. Taking the next step—understanding what you've done and figuring out how to summarize it extemporaneously—seems to be one that many are willing to forsake.

The piece is a reminder of just how different the kinds of talks I've done for the last few years, and the sorts of intellectual events I'm usually involved in, are from conventional academic presentations. I spend huge amounts of time preparing for the workshops I facilitate: I go over every activity, every breakout session, think about the posters I need to create, the instructions I should give, what I should and shouldn't say, and what outcomes the client and I want.

All this preparation generates one of two things: artifacts and other materials that help organize an event (or that help participants stay self-organized and -aware of what they're supposed to be doing); and a clearer understanding of what I need to do for the day to succeed. What that preparation doesn't generate is a perfectly-planned day: all that planning, I know, is to prepare me to succeed despite the fact that something is going to happen that requires me to adapt and adjust.

What you absolutely cannot do in an environment like this is throw something together the night before; nor can you write it all out and assume you can just follow the script mindlessly-- the two options Toor describes.

Why are these events so different? Two reasons. First, the facilitated workshop, much more than the academic conference, is explicitly about the production of shared meaning. The aim after a day or two is to have a common vision of the future, a common roadmap, and a common understanding of what an organization's strategy should be. You don't necessarily have that as an outcome of a scholarly conference. Second, workshops are a means to an end, not an end in themselves: they're supposed to catalyze action, not be the end of action.

With the proliferation of interesting kinds of workshops, novel forms of meetings, and now the rise of the unconference, I think it's high time we thought about how we could reinvent academic (maybe mainly humanities) conferences. There's no reason we can't create a better model that satisfies conference speakers' professional needs (e.g., the line on the c.v., the publicity, the chance to interview for jobs) and personal ones (e.g., the opportunity for subsidized travel to see your friends), as well as the needs of conference organizers and the profession/discipline as a whole-- and that is a lot more interesting and engaging. So many academic events I go to end on an optimistic note, or generate lots of interest in moving on to actually doing something... but then dissipate, and at best yield an edited volume. Sitting in a stuffy (or over air-conditioned) hotel conference room, listening to someone read a talk, and feeling the collective interest and enthusiasm generated by the event evaporate days afterward-- aren't there better ways we could all spend our time?

Seriously, I'd really like to do this.

[To the tune of Walter Wanderly, "One Note Samba," from the album Out Of Sight (I give it 1 star).]

July 26, 2009

“Academics, like teenagers, sometimes don’t have any sense regarding the degree to which they are conformists.”

So says Thomas Bouchard, the Minnesota psychologist known for his study of twins raised apart, in a retirement interview with Constance Holden in the journal Science....

The strength of this urge to conform can silence even those who have good reason to think the majority is wrong. You’re an expert because all your peers recognize you as such. But if you start to get too far out of line with what your peers believe, they will look at you askance and start to withdraw the informal title of “expert” they have implicitly bestowed on you. Then you’ll bear the less comfortable label of “maverick,” which is only a few stops short of “scapegoat” or “pariah.”

A remarkable first-hand description of this phenomenon was provided a few months ago by the economist Robert Shiller, co-inventor of the Case-Shiller house price index. Dr. Shiller was concerned about what he saw as an impending house price bubble when he served as an adviser to the Federal Reserve Bank of New York up until 2004.

So why didn’t he burst his lungs warning about the impending collapse of the housing market? “In my position on the panel, I felt the need to use restraint,” he relates. “While I warned about the bubbles I believed were developing in the stock and housing markets, I did so very gently, and felt vulnerable expressing such quirky views. Deviating too far from consensus leaves one feeling potentially ostracized from the group, with the risk that one may be terminated.”

June 18, 2009

I've been in Bloomington, Indiana for a conference on visualization and the history and philosophy of science. It's one of those events that brings together my old life as an historian, and my new life as a futurist: on one hand we're mainly talking about how visualizations of scientific communities and social dynamics can be used by historians and philosophers; on the other I suspect that there are cool things I could do with these maps to forecast the future of science.

There's one other think-tank person here, which saves me from being the one non-academic Ph.D. in the room, the scholarly equivalent of Stephen Colbert's one black friend.

There have been some efforts to use scientometric (or "science of science") maps in the history of science, but so far as I know, most of this work has followed fairly conventional historiographic paths: for example, mapping the Darwin or Mersenne correspondence, or asking questions about the growth of scholarly networks. We've not yet used them to do something radically new, like using geographical coding to calculate the speed of the transmission of ideas or instruments, or constructing agent-based models of scientific communities and seeing how they evolve over time. But that's why we're here-- to think about how we could create such things, and what benefit they might bring.
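To make that concrete, here's the sort of toy model I have in mind-- a minimal sketch in which everything is hypothetical (the cities, the correspondence links, the transmission probability), not anything actually built at the conference: scholars in a few cities pass an idea along correspondence links, and geographic coordinates let you compute a crude transmission speed.

```python
# A toy agent-based model of idea transmission, in the spirit of the
# paragraph above. Everything here is hypothetical: the cities, the
# correspondence network, and the transmission probability.
import math
import random

# "Geographical coding": hypothetical cities with (latitude, longitude).
CITIES = {
    "Paris":  (48.86, 2.35),
    "London": (51.51, -0.13),
    "Leiden": (52.16, 4.49),
    "Rome":   (41.90, 12.50),
    "Danzig": (54.35, 18.65),
}

# Hypothetical correspondence network: who writes to whom.
LINKS = {
    "Paris":  ["London", "Leiden", "Rome"],
    "London": ["Paris", "Leiden"],
    "Leiden": ["Paris", "London", "Danzig"],
    "Rome":   ["Paris"],
    "Danzig": ["Leiden"],
}

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points, in km."""
    (lat1, lon1), (lat2, lon2) = a, b
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

def run(seed="Paris", p_transmit=0.5, years=20, rng=None):
    """Each simulated year, every city that holds the idea may pass it to
    each correspondent with probability p_transmit. Returns the year each
    city first adopted the idea."""
    rng = rng or random.Random(1)  # fixed seed so the sketch is reproducible
    adopted = {seed: 0}
    for year in range(1, years + 1):
        for city in list(adopted):  # snapshot: new adopters transmit next year
            for neighbor in LINKS[city]:
                if neighbor not in adopted and rng.random() < p_transmit:
                    adopted[neighbor] = year
    return adopted

if __name__ == "__main__":
    adoption = run()
    # Crude "speed of transmission": distance from the seed over years to arrive.
    for city, year in sorted(adoption.items(), key=lambda kv: kv[1]):
        if year > 0:
            km = distance_km(CITIES["Paris"], CITIES[city])
            print(f"{city}: arrived in year {year}, ~{km / year:.0f} km/year")
```

A real version would be fitted to actual correspondence data-- Darwin or Mersenne, say-- but even a toy like this lets you ask how the shape of a network changes how fast ideas move.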

I quite like Bloomington, or the few blocks of Bloomington that I've seen.

The place is enormous. It has roughly the same number of students as Berkeley, but physically it's much larger. It also takes collegiate Gothic (a somewhat stripped-down, modernized version) to a scale I don't think I've ever seen before. If you took Princeton or Bryn Mawr, put it on a balloon, then blew up the balloon to five times its previous size, you'd get the IU campus. Yale and the University of Chicago bear some family resemblance to Oxford or Cambridge, thanks to their small scale; IU takes Gothic where it's never gone before.

The town has a lot of restaurants, and a lot of foreign food, for a place its size. Tuesday night I had dinner at an Ethiopian restaurant, and last night it was Thai at Siam House. (Both are a serious challenge to dieting!) One local attributed this to the long presence of foreign students at IU, some of whom brought spouses or other relatives who went into the restaurant business. I have no way of knowing if this is true, but for whatever reason, there's good food here.

May 28, 2009

In the last few days I've been doing a lot of stuff: biking, organizing a Memorial Day dinner, preparing for a week-long trip to the East Coast, thinking about the craft and design of workshops. (These are the expert workshops that I organize all over the place.)

In many ways these are very different activities, but I really enjoy them all. I recently realized that despite their differences, they actually share a few qualities.

1) They're active, embodied knowledge.

Obviously bicycling is physical, but cooking is a nice combination of fine motor skill and lifting big heavy things (or in my case, avoiding setting myself on fire); you're always on your feet in a workshop; and travel is pretty physically strenuous, for good and bad reasons. Maybe I'm getting older, I'm less of a couch potato, or my ADD is increasing (and I know these are somewhat mutually exclusive explanations), but I find my patience with sitting for long hours and just reading is decreasing. I can do it, but I'm happier engaging my body. And nothing is better than activities where you're involving your body, but you have to think about what you're doing. (Gregg Zachary had a great piece last year on the rediscovery of the virtues of manual work. I'm part of a movement.)

2) They have hard deadlines.

My capacity for finishing things that have open-ended deadlines, or fake deadlines ("so we all agree that we'll finish our tasks by next week, right? right?"), is plummeting to near zero. I have too much other stuff in my life that absolutely has to get done.

So hard deadlines are good for me now. Essential even. The workshop starts at exactly this time, the plane leaves at exactly that time, the guests are arriving now.

Hard deadlines also put a nice bound on craftwork, by preventing you from tinkering forever with something. A paragraph could always be better, but as Sennett writes, the demands of the trade force craftsmen to accept limits, to do the best job they can within the time they have, and to learn to be satisfied with that. As graphic designers say, "Finished is Good."

3) They require preparation.

The day of the cookout, I spent hours chopping vegetables, checking marinades, cleaning off platters (you can never have too many platters at a BBQ), locating plates and cups, setting up staging areas for food and drinks, laying out tools, etc. (I noticed, though, that this wasn't tedious, it was pleasant. It was a classic example of what Csíkszentmihályi calls flow.)

Likewise, when you travel, you've got to think a lot about what to pack, how to structure your time, how to get from place to place, etc. A bike won't work with a flat tire, nor will a cyclist if he's dehydrated, so you'd better be prepared for those possibilities. Every ride requires some kind of adjustment: technical climbs mess up gears; thorns flatten tires; I get hungry. Having the resources to deal with those things lets me keep riding.

With workshops, you have to think in advance about everything, and I mean everything: you have to go over the agenda minute-by-minute, think about the flow of the day, tinker with questions and exercises to eliminate ambiguity and focus people, lay out materials, move the furniture around, make sure the caterers know when to appear, etc., etc. (Indeed, there are things that we normally don't think about that I'd like to start experimenting with, like lighting and ambient sound, making some activities more embodied and physical-- sitting is exhausting-- and playing with the day's menu to keep people from getting weighed down by muffins and too much coffee.)

Good preparation doesn't require you to think just about one thing. It requires you to think about a lot of different things, big and small; to think about timing and process; about division of labor; about contingencies and strategies. That's part of what makes it pleasant.

Some of that preparation is meant to help you keep things on track, and do things exactly the right way. But most serious preparation isn't about scripting. Rather, it's about making it possible for you to adapt to whatever actually happens. I've never had a workshop run exactly the way I imagined it would: more people show up, they turn out to be interested in other things than we'd discussed before, the room isn't laid out the way we expected-- a thousand different things can go akimbo.

I used to think that the point of planning workshops in such great detail was so I'd have more control over them. Wrong. You never have control. You have whatever you have when you get in the room. The point of doing all that planning is to deeply understand the intentionality and philosophy behind the workshop, so you can improvise your way to the same end-point, and you have the tools at hand to do so.

[Update: I've realized that this is my complaint about humanities graduate training: it socializes you to believe that you possess skills that are useful only in a very specific future-- namely tenure-track jobs in your field-- and trains you to believe that you're less qualified to succeed at a different future, and that any other future is a failure.]

If you know that you're going to go off the map-- if events are going to conspire to send you in another direction, and they will-- the best that you can do is have the right gear, and a clear picture of where you want to go.

4) They have serendipity.

The upside of plans not working out the way you expect is that they can work out better. Sometimes the very coolest thing isn't on the map, and the only way to find it is to venture into the unknown.

One of the great pleasures of having a big party is that mixing up friends who don't know each other can have pleasant results for everyone. The best rides are ones that have a brilliant hill and view that you didn't know about. The best trips are the ones that expose you to something you've never seen before, or didn't even know was cool. I fell in love with Budapest not because I'd always wanted to go there, but because it's an amazing, complicated, Old World post-socialist place that I find alternately fascinating and frustrating. I love London because it rewards walking: I know it well enough to be able to navigate by Tube or on foot, but every time I go out in the evening I discover something-- a little square, a park, a row of businesses-- that charms and captivates, and that I'd never heard of.

Workshops have serendipity too. Tons of it. You want to build connections between ideas or fields that even experts hadn't seen before, or explore the cross-impact of trends that people normally think about separately. When that works, the results are awesome-- and the amazing thing is, the results are awesome a lot more often than you'd expect. You never know what the outcome of a workshop is going to be-- and if you do, there's really no point in having it in the first place. This doesn't mean that a workshop shouldn't have certain goals or deliverables; far from it. But it's like an evening walk in London: you know where you're going to end up, you know that there are certain landmarks you'll pass, but you don't know what else you're going to see along the way. Your job is to be open to the serendipity, so you can take advantage of it.

5) They draw people out.

I mean this in two senses. First, they can push you to do things you didn't know you could. Good rides challenge you to do things you didn't think you were capable of, or leave you exhausted but happy with your performance.

Second, they open up a space for people to contribute. My wife used the cookout as an opportunity to repot a bunch of flowers in the backyard, dig out and repot some aging bamboo, and do other things on her gardening/home improvement list. Once kids started arriving, my daughter made (or taught the kids how to make) balloon swords, which they then played with all evening. I hadn't thought of either of these, but people commented on how nice the backyard looked, and the kids all left exhausted and uninjured. Win.

Workshops require both kinds of drawing out. Running a workshop isn't an exercise in controlling other people; it's the hard task of creating a venue in which everyone can think seriously, think differently, and think together.

It's also not about getting a certain result, but about creating the conditions out of which interesting new things will emerge. Of course, workshops have objectives, but as a facilitator, you have to approach them obliquely, and recognize that the actual work and thinking will be done by participants: you're just ("just" isn't quite the right word!) there to help make it happen.

You can challenge people, but you can't order them to be innovative. You can try to get guests to mingle or introduce them to each other, but you can't make them be chatty and friendly. You can also push yourself, but you must recognize that pushing doesn't get you everything: you can get to the airport on time, but you can't control the weather and need to be able to go with whatever the situation presents.

[Photo: my son on a happier ride]

This morning I got an unexpected lesson on pushing versus flow from my son. We were biking to school, and he has the habit of standing up while pedaling. I can't get him to stop (he's seven, after all), so I was trying to teach him how to do it in a way that maintains his balance. He got frustrated and mad, which made him distracted; and so he took a spill. Bad enough to break the mirror on his bike, add a couple nicks to the brakes or handlebars, and require some ice and band-aids when he got to school. Fortunately nothing on him was broken, and he'll be fine.

As I try to tell the kids, biking is one of those things that demands mindfulness: you have to watch the road, know what gear you're in, know where the cars are, know how tired you are. You can push yourself, but if you lose your concentration-- if you lose the flow-- you're likely to crash. In the course of pushing him, I made him lose what little flow he had.

Still, any spill that doesn't send you to urgent care is a learning opportunity, not an accident. And as a friend of mine wrote after hearing about the crash,

But falling is an essential part of growth. It teaches you where the boundaries are. If you never push hard enough to fall, you will never know if you could grow twice as much or twice as fast-- because you are playing it safe.

So across all these activities-- and maybe across everything you do-- hitting that mix of pushing and flow, planning but staying open to serendipity, and being active is key.

[To the tune of Keith Jarrett, "Hourglass, Part 2," from the album Staircase (I give it 4 stars).]

April 14, 2009

Nothing in it about penis-shaped helicopters, but this Anthony Grafton piece about going to graduate school is pretty good-- a combination of encouragement about the inherent (if quirky) rewards of academic apprenticeship with some (maybe too gentle) warnings about the downside. I particularly like this little "then-and-now" gem:

In the ’60s, as universities expanded around the country and the world, job offers strewed the desks of bright Ph.D. candidates like autumn leaves in Vallombrosa. [ed: This is the kind of thing that separates writers like Tony from us mere mortals. I have no idea what it means, but I feel more erudite just reading that reference.] One friend of mine opened an envelope that had been buried under detritus on his desk and discovered that he’d been offered a job two years before and never even answered.

Not very likely to happen these days, but my father (who got his Ph.D. in 1970, and his first tenure-track job three or four years before) confirms that yes, that's what it was like back then.

Tony advises readers that they shouldn't "jump [into grad school] before you find out exactly what lies below," though I wonder if it's really possible to "find out" what it's like, or what it'll do for (or to) you, with anything approaching exactitude.

Of course you should talk to lots of people, but for most prospective students that universe will only include current students and faculty. The students will alternately be glowing about grad school and their prospects, or will try to give you the scary "real" story. The faculty will be pretty useless as advisors about the realities of grad school: life looks very different at the head of the seminar table.

On the face of it, talking to students and faculty is a pretty logical decision, but the problem is this: odds are, you're not going to get a Ph.D. and then be a professor at the kind of university you aspire to attend. Further, while they're helpful about the day-to-day reality of school, graduate students are going to be useless sources about the long-term effects of going to graduate school-- either in economic or career terms, or in psychological terms. At the same time, other people who could be very informative-- people who've been ABD for 15 years; people who finished their Ph.D.s and then went to Wall Street, the World Bank, or think tanks; students who dropped out before their orals-- are much harder to track down.

So there's an inverse relationship between the availability of experts to consult, and the likelihood that their expertise is actually going to be useful in your own life.

When I was an undergrad (I was one of those nerdy kids who went straight from college to grad school-- actually, I started taking graduate classes as a sophomore), I never thought about talking to people who'd almost finished the programs I was looking at but dropped out, or people who didn't become academics. It turns out, of course, that it would have been far more useful for me to talk to Ph.D.s who'd gone into business. But those people aren't as easy to find as the ones in the faculty lounge or TA offices.

This is actually an example of a bigger problem that people and organizations face when thinking about the future: we tend to confine our research to cases that are relatively easy to find, and look only at successes (successful cases, organizations, or people), and not at failures. Getting a handle on that space-- or at least a more realistic appreciation of the likelihood of the unexpected happening-- is one of the toughest things you can do as a forecaster, or parent, or human. After all, success is what we want, and it's easy to understand; failure is what we want to avoid, and people fail for all sorts of unpredictable reasons. Success is what a strategy or good decision or first-rate school can bring you; failure is what'll happen if you don't get those things. We don't explore the possibility that we could get those things, execute properly, and still not reach our goal; but that happens all the time. Success, we think, is comprehensible and predictable (and not largely determined by the economic state of universities and how expansive faculty hiring is allowed to be in any given season); failure is random, or something that'll happen to other people. But in reality, we're probably going to end up one of those other people. We're better off if we know that in advance.

And if we know that the definition of "failure" is sometimes as arbitrary as the forces that determine whether it happens to us or not. I can testify that it's possible to have an interesting intellectual life without being an academic (though having a library card does help). As Grafton notes,

Even if you don’t finish, or finish and don’t wind up as a professor, the skills you learn in grad school can be of value in a range of other venues. Some of my most successful former students work as scholars, teachers or writers outside the academy. But as you might expect, few follow this path without some bitterness. And no wonder. A fair number of professors treat students who leave the academy, even after experiencing terrible difficulties, as renegades and wash their hands of them. Be prepared.

March 09, 2009

Having spent so much time thinking about young Ph.D.s developing postacademic lives, it never really occurred to me that there would be similar problems of professional marginality at the end of one's career. But Siris argues that the failure of philosophy-- which, one imagines, would be second only to history as a scholarly activity in which age is a virtue rather than a disadvantage-- to find a place for emeritus scholars in the profession represents "the second failure of academia":

[E]veryone assumes that retirement is and must be the end of the road: that the only reason you'd retire is because you've become dead wood. And no one has recognized that this is a symptom of a profound failure on our part, one almost as profound as the failure to prevent 'adjunctification'.

It is utterly absurd that we have no standard options after retirement for senior philosophers who still want to be actively involved in philosophy. If anything, retirement should standardly be the next stage after tenure, not an exit from the field but another kind of removal of constraints.

Perhaps we get something vaguely like this in how some departments treat emeritus professors; but only vaguely, and only like. We are failing people at the end as we are at the beginning.

But what gets me is that everyone takes it for granted: suggest retirement and it is assumed you are suggesting uselessness -- and, given the way the system's set up, that's a not unreasonable assumption. But it needs to be brought to consciousness that this is a failure that needs to be overcome, not a reasonable feature of the landscape.

How many fields are like this? Most of them, I'll bet. And it reflects our somewhat schizophrenic attitudes towards age, experience, and work: we alternately talk about experience and skill being the most valuable things an organization can have, but at the same time sometimes imagine real innovation only coming from twentysomethings who sleep under their desks. Even in academia, some fields-- mathematics and theoretical physics, for example-- assume that the really brilliant work is done by the young, and if you don't have a major discovery by the time you're 30, you never will.

This idea struck a chord for personal reasons. My father just retired from his professorship at the Colorado School of Mines, to take advantage of some new professional opportunities, and to give himself more time to work on writing projects. His impulse to see retirement not as a chance to kick back, but to do the work he really wants, is hardly unusual. And I expect if I ever get to that age, I'll approach retirement the same way. Assuming retirement, or something like it, still exists.

Actually, Theodore Roszak (whom I visited a few years ago, and whose work I've written about) makes a really good point in his book The Longevity Revolution: the concept of retirement as a period of time that you could do something with is a very modern invention. It used to be that you were likely to die within a few years of retirement (assuming you made it that far), and for part of that time you were likely to be an invalid. In contrast, now people regularly face years or decades of life in retirement, and fewer and fewer of them are content with the idea of just running out the clock in Florida (and more and more can't afford it anyway). So if academia is behind the curve in recognizing post-retirement as a productive time, that's probably not a surprise-- though anything that wastes talent is always a shame.

January 26, 2009

It's hard to tell young people that universities recognize that their idealism and energy — and lack of information — are an exploitable resource. For universities, the impact of graduate programs on the lives of those students is an acceptable externality, like dumping toxins into a river. If you cannot find a tenure-track position, your university will no longer court you; it will pretend you do not exist and will act as if your unemployability is entirely your fault. It will make you feel ashamed, and you will probably just disappear, convinced it's right rather than that the game was rigged from the beginning.

December 08, 2008

Frank Rich's New York Times piece cautioning that "the brightest are not always the best" is very good.

IN 1992, David Halberstam wrote a new introduction for the 20th-anniversary edition of “The Best and the Brightest,” his classic history of the hubristic J.F.K. team that would ultimately mire America in Vietnam. He noted that the book’s title had entered the language, but not quite as he had hoped. “It is often misused,” he wrote, “failing to carry the tone or irony that the original intended.”...

The stewards of the Vietnam fiasco had pedigrees uncannily reminiscent of some major Obama appointees. McGeorge Bundy, the national security adviser, was, as Halberstam put it, “a legend in his time at Groton, the brightest boy at Yale, dean of Harvard College at a precocious age.” His deputy, Walt Rostow, “had always been a prodigy, always the youngest to do something,” whether at Yale, M.I.T. or as a Rhodes scholar. Robert McNamara, the defense secretary, was the youngest and highest paid Harvard Business School assistant professor of his era before making a mark as a World War II Army analyst, and, at age 44, becoming the first non-Ford to lead the Ford Motor Company.

The rest is history that would destroy the presidency of Lyndon Johnson and inflict grave national wounds that only now are healing.

For those of us who come out of this kind of world, or at least have been influenced by and wanted to emulate these kinds of people, it's a nice little reminder that brains-- or the particular forms of intelligence that are bred in the hothouses of academia and think-tanks-- aren't everything.

I've been thinking about this for a while, because I've recently become aware of how formative the experience of graduate school was for me (or perhaps how formative I've allowed it to become), and how much I've had to unlearn-- and am still unlearning-- the habits that I developed there and as a young academic.

In my current incarnation (as a mortgage owner, to say nothing of someone who lives at the interface of marketplaces and ideas), the contempt for money that I learned as a young professor-to-be is definitely a maladaptation. It's good to not be motivated primarily by money (unless you're in a job like banking, where that makes sense), but it's always bad to be careless about it, or to be uncomfortable talking about it-- something that as a consultant you can NEVER get away with. (Academic contempt for money is also to some degree a product of two other things: the fact that you're likely never to see much of it anyway, and that once you're tenured, you never really have to worry about it again. Your income is not large but extremely secure.)

Likewise, the assumption that you have to rewrite things a dozen times, worry over them for months, and get as much of your argument exactly right before you can let someone else see it is definitely not attuned to the way the rest of the world works. The tiniest fraction of ideas are meant to be timeless; a slightly larger sliver might last for years; but the fact is, most ideas are perishable goods that need to be churned out, circulated, and monetized before times change. (Actually, a lot of scientific and scholarly ideas are like this, too.) Timely goodness is better than obsolete excellence.

I think of myself as living mainly in my own mind: aside from family and friends, most of my conversations take place with books and words. This has tended to translate into a mild (or maybe not so mild) disregard for the world. But recently I realized that I need to be hyper-effective in the world to be effective in my own world. The demands of the everyday don't go away; instead of ignoring them, you need to be able to deal with them with ruthless efficiency, so you have time and bandwidth for what really matters.

Johnson, after his first Kennedy cabinet meeting, raved to his mentor, the speaker of the House, Sam Rayburn, about all the president’s brilliant men. “You may be right, and they may be every bit as intelligent as you say,” Rayburn responded, “but I’d feel a whole lot better about them if just one of them had run for sheriff once.”

October 23, 2008

Up at the Carnegie Foundation for the Advancement of Teaching this evening, for the opening of a conference on tinkering. It looks like it's going to be a really fascinating event. There are lots of cool people, it's a wonderful subject, and the venue is really nice.

October 14, 2008

Wal-Mart, the nation’s largest private employer, long criticized for its workplace policies, is a “more-honest employer” of part-time workers than colleges that employ thousands of adjunct faculty members. That was the harsh message delivered to a group of college human-resources officials here on Monday by one of their own: Angelo-Gene Monaco, associate vice president for human resources and employee relations at the University of Akron....

“We helped create a highly educated part of the working poor, and it’s starting to get attention from outsiders,” he said, noting that unions are trying to organize part-timers, and lawmakers in nearly a dozen states are examining the issue.... “We rely on them for a very important function, and we assume that they will continue to accept mistreatment in return.”
