In some areas, though, what we’ve needed isn’t a new legal framework so much as better application and enforcement of what’s in place. To that end, last week Weil addressed the problem of worker misclassification—the administrative sleight of hand by which an employer deems its workers to be independent contractors rather than employees—by issuing an “Administrator’s Interpretation” of the relevant portions of the Fair Labor Standards Act (FLSA).

Misclassified employees are often denied the critical benefits and protections to which they are entitled. Misclassification also generates substantial losses for the federal government and state governments in the form of lower tax revenues, as well as for state unemployment insurance and workers’ compensation funds. It forces workers to pay the entirety of their payroll (FICA) tax. It also tips the scales against all of the employers who play by the rules and undermines the economy.

Anyone who’s so much as glanced at the Business section lately is likely aware that this very issue of worker misclassification has been dogging the ride-sharing service Uber, which last month saw the California Labor Commissioner’s Office side with an Uber driver who objected to being classified as an independent contractor. It’s a question that’s shadowed the wider gig economy for years, with many companies successfully exploiting what have been seen as gray areas in the relevant laws. It’s Weil’s hope that by providing workers and employers with a clear understanding of the rules—and continuing to police their application—some of the pressures that’ve fissured our workplace may be relieved.

21 July 2015

In the wake of the Supreme Court’s decision in Obergefell v. Hodges, many have looked to the Roe v. Wade ruling in an effort to forecast the cultural and political impact of federally sanctioned same-sex marriage. But as Mary Ziegler shows in After Roe: The Lost History of the Abortion Debate, the narrative that now surrounds that decision and its effects overly simplifies what was a very fluid situation, projecting today’s polarities further back in time than the history supports. Below, Ziegler outlines the folly in declaring Obergefell the next Roe when we still have such trouble understanding the first.

-----

Critics of the Supreme Court’s recent marriage equality decision, Obergefell v. Hodges, have predicted that it will be the next Roe v. Wade. Indeed, comparison of Obergefell and Roe has become almost mandatory for Republican presidential candidates and conservative activists.

Some use the analogy to criticize the Court for short-circuiting political debate about marriage in the states. Others clearly want the Court to worry about the consequences of striking down marriage bans in so many states. Politicians turn to these arguments because they might appeal to voters otherwise supportive of marriage equality. If Obergefell is indeed the next Roe, the Court may have once again escalated the culture wars, making it harder for Americans to agree on issues of sexuality or gender, much less on same-sex marriage itself.

Are the Court’s critics right? At first blush, the comparison between Roe and Obergefell seems unconvincing. While the nation remains divided about abortion, polls have shown increasing support for marriage equality, particularly among younger Americans. For many, the moral issues also seem easily distinguishable. Pro-life Americans argue that a woman’s decision to choose abortion harms a person not included in the decision. Longstanding disagreements about fetal personhood aside, the argument that abortion is not a purely self-regarding act is much easier to make than any related claim about marriage equality.

16 July 2015

C. Namwali Serpell’s Seven Modes of Uncertainty has roots in her observation that while she found great pleasure in the experience of not knowing what’s really happening in a book, she hated feeling uncertain in her life. Seven Modes is an attempt to draw the uncertainty we recognize in stories into relation with the familiar uncertainty of life, and to consider whether literary uncertainty could perhaps help us understand how to actually live with the anxiety of not knowing.

Serpell argues that literary uncertainty affords diverse modes of experience with aesthetic, affective, and ethical dimensions, and that it emerges over time from a reader’s shifting responses to complex structures of conflicting information. Think of the destabilizing feeling of a shifting point of view, of hearing the same story from a different perspective, or of a scene repeating until what seemed normal grows dream-like, uncanny. She uses readings of books by the likes of Vladimir Nabokov, Toni Morrison, and Tom McCarthy to show that novels are “structurally suggestive,” affording readings that in turn afford ethical experiences, positive and negative.

Fairly or not, this study of how literature can influence by disrupting what we think we know is itself given a hint of uncertainty when one learns that Serpell herself is actually a decorated writer of fiction. Indeed, this month her short story The Sack won the 2015 Caine Prize for African Writing, awarded annually for outstanding English-language short fiction by an African writer. The Caine, which has been described as the African writing equivalent of the Booker Prize, carries a £10,000 award, a sum Serpell opted to share with fellow shortlistees Segun Afolabi, Elnathan John, FT Kola, and Masande Ntshanga. Serpell described the decision as an act of “mutiny,” and showed the capaciousness of her concern for structure in explaining that choice to Huck magazine:

Maybe it’s because I’ve been teaching about mutiny that that word is so present to me. It came from a sense that the prize itself is structured like a competition. Prizes are often competitions, but this particular prize brings the shortlisted writers together for a week before the ceremony. We did panels together. We did readings together. We hung out together. We drank together. We ate together and we talked about our families and our work.

When you spend that much time supporting each other, it felt horrible to be pitted against each other. You could feel the difference in the atmosphere when we would all be talking and hanging out and a stranger would come up and say: “Good luck,” or “Who do you think is going to win?” Suddenly the tenor would change. It’s so uncomfortable to be asked to compete with your friends. Writing to me has never seemed like a competitive sport.

The more I thought about it, I figured the reason the prize is structured this way is because of the money. The money has to go to one person and for people to be interested, we have to drum up this sense of drama. I thought instead of attacking the prize, which is a wonderful thing, I’ll go for the source of its structure, which, for me, seemed to be the money.

In the BBC Africa “Masterclass” video below, produced to mark Serpell’s Caine Prize win, she describes the power and possibility of short stories:

Readers can perhaps look forward to further ethically productive uncertainty at Serpell’s hand: she’s at work on a novel.

09 July 2015

In Stuff and Money in the Time of the French Revolution, Rebecca Spang turns to one of modern history’s most infamous examples of monetary innovation to demonstrate that money is as much a social and political mediator as it is an economic instrument. In her tracing of the creation and abandonment of the assignats—a currency initially defined by French revolutionaries as “circulating land”—we gain not just a new understanding of the Revolution but also greater insight into larger truths about the chasms that can arise between intentions and outcomes, political ideals and practical realities. With those lessons in mind, we asked Spang how the French Revolution’s failed monetary experiment may help us understand the present eurozone impasse.

-----

Q: What happens if we think about the eurozone crisis in terms provided by the history of the French Revolution?

In the eighteenth century, fiscal-monetary crises provoked two major revolutions. “No taxation without representation” was an early rallying cry of the American Revolution, and something similar is true of France as well. Throughout the eighteenth century, the French monarchy repeatedly tried to tap the vast wealth of the nobility and the Catholic Church. The rich and the super-rich stymied these attempts at more equitable taxation by charging the monarchy with “despotism” and positioning themselves as defenders of the public good. Noblemen and magistrates thereby successfully protected their own privileges—including their largely tax-exempt status—by claiming to be at the vanguard of resisting oppression. Some would say this is what anti-EU Conservatives in Great Britain are doing today; you could also think of the Koch brothers and other wealthy Tea Party supporters in the US. In the short term, it was an effective strategy. But it had its limits.

After several years of stalemate and near government shutdown, the King agreed that the Estates-General (the French parliamentary body) would be allowed to vote on any new taxes. It was men elected to that body who rejected centuries-old procedure and instead—speaking, as the elites had done, in the name of “the public”—took the revolutionary step of proclaiming a National Assembly. The French Revolution was a case where no one was trying to start a revolution: the King and his ministers wanted to increase tax revenues, while the political elite wanted to protect their wealth. Similarly, when European officials say that Greece needs to honor its debt, they are taking a conservative position. But it’s one that is having radical effects.

Looking at the case of the French Revolution, we see that trying to hold onto power and privilege by claiming to have the public good at heart can easily backfire. It empowers others to make the same claim. In this sense, calling a referendum was a stroke of genius on Alexis Tsipras’s part. It lets Syriza take the political-moral high ground, and rightly so. At the same time, the Revolution’s history is the story of one unintended consequence after another. So I find it hard to join those who are cheering the current situation as a victory for “democracy” and the beginning of the end of neoliberal austerity. Because I know we don’t know what comes next.

26 June 2015

“Same-sex desire alone does not equal gayness. In order to be gay, a man has to learn to relate to the world around him in a distinctive way.” So writes David Halperin in How To Be Gay, expressing the contentious notion behind his controversial University of Michigan course of the same name. “‘Gay,’” he continues, “refers not just to something you are, but also to something you do... Gayness, then, is not a state or condition. It’s a mode of perception, an attitude, an ethos: in short, it is a practice.” American gay male life, he argues, is reflective of “a common culture” and “shared sense of self” that must be acquired, a “characteristic relation to mainstream culture” that must be “discovered” by gay male subjects who “resist the summons to experience the world in heterosexual and heteronormative ways.”

To Halperin, gay culture is something of value, able to enhance or enrich the perspective of people of any sexual orientation. But for Halperin, as for others, the gay rights movement seemed to have prioritized “normality” in ways that both denied and jeopardized gay culture, perhaps nowhere more so than in the effort to legalize gay marriage. With the U.S. Supreme Court having deemed same-sex marriage a right, we look to Halperin’s take on what “gay marriage” might mean for “gay culture.”

-----

When gay people are deprived of a common, communal existence, of a social world of their own, the keynote of gay politics ceases to be resistance to heterosexual oppression and becomes, instead, assimilation—that is, accommodation to the mainstream, the drive to social acceptance and integration into society as a whole. It’s all about the need to fit in, to adapt yourself to the locality in which you already happen to be living and working. Issues like gay military service or marriage equality, which had formerly been about access to benefits, distributive justice, and the removal of discriminatory barriers, now become struggles over the symbolism of social belonging. They are reframed to center around social recognition, the definition of citizenship, the meaning of patriotism, the practice of religious worship, the idea of family. There are still important material demands behind such struggles for inclusion, but they tend to be subordinated, at least in the rhetoric of the movement, to the goals of assimilation and conformity.

In such a context, gay culture seems an increasingly bizarre, insubstantial, intangible, nebulous, irrelevant notion. It is the sign of a failure (or refusal) to assimilate. What would gay people want nowadays with a separate culture anyway? Such a thing might have made sense in the Bad Old Days of social oppression and exclusion. Now it is simply a barrier to progress. It impedes the achievement of assimilation. No wonder we keep asking, with barely suppressed impatience, why gay culture doesn’t simply disappear. Surely social acceptance and integration will spell the end of gay culture. Since gay people are no longer so oppressed, there is little reason for them to band together in separate social groups, let alone to form distinct cultural communities. The assimilation of gay people into straight society has put an end to all that. Gay culture is a vestige from an earlier time. It is archaic, obsolete. Gay culture has no future.

These predictions, I believe, overlook a crucial consideration. Social acceptance, the decriminalization of gay sex, the legalization of homosexual social and sexual institutions, the removal of barriers to same-sex marriage, to military service, to the priesthood and psychoanalysis, along with other previously off-limits professions, should not be confused with the end of sexual normativity, let alone the collapse of heterosexual dominance.

12 June 2015

Like set pieces of life, duels and their drama have starred in Western literature for centuries, a history presented by John Leigh in the newly published Touché: The Duel in Literature. “Writers are drawn to duels,” Leigh explains, “in the interests of discovering something fundamental about human beings and the way they variously organize and delude themselves, the way they face one another, their fears, and, ultimately, death.” Fiction’s many famous bouts have helped the duel to sustain a cultural heft resonant of both honor and absurdity, able at a mention to evoke intense and complex emotions of a nature known to every era, as Leigh shows below.

-----

In the weeks before last month’s General Election in Britain, politicians had begun to look bored by their own campaigns. The Premier League had already been won, the royal baby was refusing to appear, and the weather was dull. Suddenly, however, a challenge to a duel was made public, and a nation stirred. Yanek Zlinski, who was identified by the media as a Polish “Prince” (the application of the scare quotes has yet to provoke any further challenges), had invited Nigel Farage, the leader of the UK Independence Party (UKIP), to cross swords in Hyde Park.

The proposed duel, of course, never took place. But Zlinski had made his point. He wished to suggest that UKIP’s stance on immigration—much of which comes to Britain from the poorer nations of Eastern Europe—constituted an affront rather than an argument and did not deserve to be dignified by opposition in dialogue. Furthermore, by challenging the Englishman to a duel, he intimated that Poland remained a no less honourable member of old Europe, enjoying with Britain a common allegiance to a cosmopolitan aristocratic culture.

The stunt would indeed appear to hark back to noble traditions of political antagonism, when campaigns (honouring the etymology of that word) would take politicians out onto the field in order to sort out their differences. Of course, the opposing sides in the House of Commons are still separated by the length of two swords. Yet this direct appeal to one of the party leaders may actually be a symptom of the Americanisation of British politics, for television debates, recently introduced into the UK after considerable debate about their form, appear to have encouraged the electorate to envisage leaders as more presidential personalities.

09 June 2015

“Not so dismal,” ran the headline in The Economist. Early every summer, the city of Trento in the Italian Alps hosts a four-day festival of economics that punctures the idea that economics has to be a “dismal science.” Banners of famous economists hang over the medieval and Renaissance streets, huge orange tents host temporary bookstores, politicians (this year the Prime Ministers of Italy and France) rub shoulders with journalists, and a gigantic screen broadcasts lectures to crowds in the square outside the cathedral—a necessity considering the overflow from the packed palazzos, theaters and public buildings that host the free talks.

01 June 2015

This month Harvard commemorates the 100th anniversary of the opening of the Harry Elkins Widener Memorial Library, the university’s flagship and still the largest university library in the world. Along with lectures and events celebrating the library’s life, the university offered the following “ode,” narrated by actor and Harvard alum John Lithgow:

To mark the completion of an extensive five-year Widener renovation project in 2004, the Harvard College Library published Widener: Biography of a Library. The book, written by then-Harvard Library Bulletin editor and current metaLAB Associate Director Matthew Battles, tells the story of Widener as that of higher education itself in the midst of the social, political, and cultural tumult of the twentieth century. It’s above all, though, the story of an unsung institution at the center of all that the university was and has become:

The story of Widener is as much the tale of the students, staff, and scholars who used it as it is the record of benefactors and collections. In the life of a library as richly faceted as Widener, one dimension is as irreplaceable as the other. The life of Widener emerges from the archival traces of those people who worked in and used the library; in library rules and regulations; in the letters of students and staff; in the notes and reports of its caretakers; in the photographs and drawings of it, inside and out. Amid its products—the stream of publications, lectures, and courses that flow from it—traces of Widener and its history are obscure. In our historical sensibility, libraries are transmitters, not subjects, of historical knowledge. Historian Alistair Black has called the library a transparent institution—its condition is that of medium, not message, and still less messenger.

Battles’s metaLAB home has done much to further our thinking on libraries lately, from The Library Beyond the Book, by Battles and his metaLAB colleague Jeffrey Schnapp, to Cold Storage, a terrific documentary look at Harvard’s offsite book depository, to the drone-shot footage documenting the very scale of Harvard’s collections, as seen in both Cold Storage and Lithgow’s ode above.

19 May 2015

One can hardly fling a mortarboard these days without hitting someone decrying the state of American higher education. And there’s much to decry. One particular strain of critique centers on the issue of choice, with students suffering from either too many or too few options. The problem apparently lends itself to gastronomic metaphor, as two recent takes make clear.

First there’s the cafeteria. Redesigning America’s Community Colleges, which we published last month, seems to have been the right book at the right time, coming just on the heels of President Obama having introduced a $60 billion initiative to make two years of tuition-free community college accessible to everyone. That plan essentially bolsters what the community college system was designed to do: expand college enrollments, particularly among underrepresented students, and to do so at a low cost. As authors Thomas Bailey, Shanna Smith Jaggars, and Davis Jenkins explain, though, while community colleges have been extraordinarily successful in increasing access they’ve not excelled at ushering their students along to graduation. “Colleges designed to maximize course enrollment,” they write, “are not well designed to maximize completion of high-quality programs of study.”

The root of the problem is that students face a clutter of choices in an absence of guidance:

The emphasis on low-cost enrollment has encouraged colleges to offer an array of often-disconnected courses, programs, and support services that students are expected to navigate mostly on their own. Students are confused by a plethora of poorly explained program, transfer, and career options; moreover, on closer scrutiny many programs do not clearly lead to the further education and employment outcomes they are advertised to help students achieve. We refer to this as a cafeteria-style, self-service model.

We argue that to improve outcomes, colleges need to move away from the prevailing cafeteria-style model. Instead, they need to engage faculty and student services professionals in creating more clearly structured, educationally coherent program pathways that lead to students’ end goals, and in rethinking instruction and student support services in ways that facilitate students’ learning and success as they progress along these paths. In short, to maximize both access and success, a fundamental redesign is necessary. We refer to the resulting strategy as the guided pathways model.

Then there’s the grocery store. Leonard Cassuto, turning the gastronomic metaphor on graduate education in the humanities, describes a different failure of choice:

Imagine a distant grocery store that advertises that it can meet all of your cooking and eating needs. You make the trip, and when you get there you discover that you can only buy lemon grass, pomelos, and Sriracha sauce. You ask about the limited selection, and the manager tells you to wait till next week, when they’ll be selling pimentos, artichoke hearts, and brandied cherries.

That’s what it’s like to pick your courses when you’re a beginning graduate student in the humanities. Term by term, year by year, the graduate course offerings in humanities departments don’t make sense together. They’re a hodgepodge of specialized inquiries: snapshots of books and articles in progress by professors who know what they’re teaching, but aren’t much aware of what’s being taught in colleagues’ courses alongside their own.

He continued:

Humanities graduate students pick through the eccentric course offerings on the buffet table and try to make a balanced meal out of them. They know that they have to nourish themselves for the comprehensive exam that’s ahead. But how do you gather together a bunch of specialized inquiries into preparation for a general and comprehensive one? Graduate students in the humanities usually solve that problem—that is, they pass their comps—but it takes time, and they have little of that to spare.

In each case it falls to the student to create a coherent program of study, or—to continue the theme—a balanced diet. Community college students are presented with a wide range of courses but relatively little help in translating those options into a specific career goal or the path to achieve it; graduate students are given a clear goal of comprehensive competence, but a slate of courses too specialized to help them prepare.

Both Cassuto and the authors of Redesigning America’s Community Colleges recommend moving to a more student-centered model that helps learners toward their goal, be it an associate’s degree or a PhD. Despite any limits such restructuring may appear to impose on the freedom of faculty to pursue their interests, “doing right by our students,” Cassuto concludes, “is a form of academic responsibility.”

13 May 2015

“Never during my three years as a law student or two years as a law clerk at federal courts did I hear of the ‘Insular Cases.’ Yet the series of US Supreme Court decisions gathered under this name established a doctrine, in force to this day, determining that the US Constitution does not apply fully to territories acquired through conquest after the Spanish-American War and the signing of the Treaty of Paris in 1898.” So writes Harvard Law School Dean Martha Minow in her Preface to Reconsidering the Insular Cases: The Past and Future of the American Empire, the latest publication from Harvard Law School’s Human Rights Program. The volume grew out of a February 2014 conference at Harvard that was organized to interrogate this century-old series of Supreme Court decisions that, judging by Minow’s experience, aren’t often given the attention they deserve. As Juan R. Torruella, the Puerto Rico-born Judge of the US Court of Appeals for the First Circuit, expressed in his conference keynote—included in the printed volume—“the Insular Cases represent classic Plessy v. Ferguson legal doctrine and thought that should be eradicated from present-day constitutional reasoning.”

In his Introduction to the book, excerpted below, Harvard Law School professor Gerald L. Neuman further elaborates on the need to continually question a doctrine that raises crucial issues of both constitutional law and human rights.

-----

The US Supreme Court’s decisions in the Insular Cases of 1901 provided the legal framework for the governance of a colonial empire in the Atlantic and the Pacific, loosening the constraints of constitutional principle in order to facilitate rule over the subjected areas and their inhabitants. In the wake of the Spanish-American War and the transfer of several of Spain’s imperial possessions, the metaphorically expressed question “Does the Constitution follow the flag?” became newly urgent. The Supreme Court majority gave a new answer: not entirely.

The most important of the 1901 decisions was Downes v. Bidwell, in which the court divided five to four in favor of congressional power to discriminate between the mainland and the new territories in customs matters. In retrospect, the crucial opinion was the concurrence of Justice Edward Douglass White. He accepted that the US Constitution governed the actions of the United States at any location, but he contended that it was still necessary to determine the appropriate geographical scope of each constitutional provision. The applicability of a constitutional limitation to a particular territory would depend on the situation of the territory and its relations to the United States. If the United States acquired a new territory and did not admit it as a state, then Congress could choose whether to “incorporate” the territory into the United States as an integral part or to treat it merely as a territory appurtenant to the United States. White thereby adopted a distinction that had been suggested by the political scientist Abbott Lawrence Lowell in an article in the Harvard Law Review, although White added some elements of his own in elaborating the consequences. For incorporated territories, the Bill of Rights and other constitutional limitations would apply in the usual way. For unincorporated territories, only “fundamental” restrictions on government power would apply. Moreover, an unincorporated territory could be kept in subordination indefinitely, without the prospect of future statehood. The United States had to have the same power to acquire and govern overseas territories and populations as the European colonial powers were exercising under international law.

About

The Harvard University Press Blog brings you books, ideas, and news from Harvard University Press. Founded in 1913, Harvard University Press has published such iconic works as Bernard Bailyn’s The Ideological Origins of the American Revolution, John Rawls’s A Theory of Justice, and Sarah Blaffer Hrdy’s The Woman That Never Evolved.