JSTOR's Hidden Power

Few people outside of academia had heard of JSTOR, an aggregator and distributor of digital versions of academic journals, until a young activist named Aaron Swartz took his own life last January. Swartz downloaded, without proper authorization, a great many articles from JSTOR via MIT’s servers—as he had earlier downloaded and distributed millions of federal court documents in the PACER database—because he passionately believed that information should be as free as possible and as widely available as possible.

Because of Swartz’s particular commitments, and because his death brought so much attention to those commitments, much of the conversation about JSTOR and similar databases since he took his life has been about the value of open access to academic and other scholarly work. And open access is indeed something worth fighting for, and something to which databases like JSTOR—and Project Muse, and the Elsevier books and journals in the sciences, and several other major distributors—are necessarily opposed. (See this recent contretemps for ample evidence of that opposition.)

But open access is not the only issue here, and if academics ever do manage to achieve an end-run around such distributors, they’ll have to confront some deeply entrenched habits of their own. In fact, those habits strengthen the cause of the distributors, and could make it much harder for open access to win the day.

Most academic journals get started at particular institutions, arising from the interests of a professor or two or three. But while small numbers of people can edit such journals, actually publishing and distributing them is more complicated. Eventually some academic presses came to specialize in such work—in America, Oxford University Press and Johns Hopkins University Press are probably the most prominent—and they provide multiple services to journal editors: They not only print and distribute, they also provide a kind of imprimatur, a seal of academic approval from well-regarded presses. To get your journal taken up by Oxford or Johns Hopkins is something of a coup.

It’s easy to see how these powers have been amplified in the digital age—and they’re powers that have had an enormous influence on how academic work gets done, from high-school students to the more elevated reaches of the professoriate. JSTOR (where the Oxford University Press journals, among many others, went) and Project Muse (which was created by Johns Hopkins University Press specifically for its journals) can make a very strong case for the value of their services to everyone in the academic ecosystem.

To the editors of journals, they say: We can get your articles—including long-forgotten ones, decades old—read and used by countless thousands of people who otherwise never would have heard of them.

To libraries, they say: You don't need to devote your staff’s limited time and energy to sifting through thousands of academic journals, trying to figure out which ones to buy access to. Just pay one fee to us—and perhaps to a couple of other equally prestigious services—and we’ll give your community instant access to thousands and thousands of peer-reviewed academic articles the quality of which we solemnly vouch for.

To students, they say: Figuring out what sources to use for your research paper is hard, isn’t it? You never know whether your professor is going to acknowledge a given source as reliable and appropriate, do you? Well, just search our database and use what you find there, and you’ll be good as gold.

And to faculty, they say: Students really have no idea how to evaluate sources, do they? And who has time to teach them? It’s not like you don't have enough to do already. So just point them to us, and they’ll be good as gold—and you’ll have one less thing to think about.

As a teacher, I can’t deny feeling the force of that last pitch. For the past few years I’ve been asking first-year literature students to evaluate critical sources of different kinds—books, print articles, online articles—on some work we’re reading in the class. I ask them to describe what they discover and to answer these key questions: Would you cite this source in a paper for class? Why or why not? The results of this assignment have been consistently, shall we say, sobering. (“You think that is a trustworthy source??”) Training students to be attentive and discriminating in their use of sources is difficult and time-consuming. It is enormously tempting to say “just use Project Muse and JSTOR articles” and be done with it.

But to do so is to evade a significant opportunity to teach students some things that they very much need to know. Years ago, in one of the first books to reckon seriously with the effects of the internet and digitization on education, James O’Donnell wrote that if in the past your professor served as the primary local source of knowledge in his or her special field, “the real roles of the professor in an information-rich world will be…to advise, guide, and encourage students wading through the deep waters of the information flood.” To turn over that task to Project Muse and JSTOR seems to me an abdication of one of the teacher’s key responsibilities in our time. But our digital tools make that abdication very easy indeed.

And so, no matter how appealing the idea of open access is, and how consonant with the core values of academic life, it may run into obstacles other than the one usually cited, which is greed. Those who want to make money from academic publications may be less of a problem, in the long run, than academics who can't resist the temptation to offload some of what they think of as—and what may often be fairly described as—the drudgeries of teaching. Few of us are as committed to open access as Aaron Swartz was: We may say we don't like the power that has fallen into the hands of the big aggregators and distributors, but our behavior, when faced with the genuine services those companies provide, indicates something different. Are we willing to change that behavior, to take on greater responsibility for instructing our students in the quest for reliable sources and genuine knowledge?

Such a change is much easier for faculty members, like me, who teach relatively few classes with relatively few students, all of whom are well-prepared for academic work. For those with heavier teaching loads, and especially for adjuncts, who do much of the actual teaching in American universities and are likely to do still more in the future, it’s almost impossible not to employ whatever labor-saving devices they can find. Thus we see how many, and how intractable, are the structural impediments to serious education in America.

Again: Open access to academic work should happen. But while the big distributing services play a role in preventing it from happening, they don’t play the only role. Our desire—in many cases, need—to save time and effort also serves to keep this system running. But it needs to change, if our students are going to learn the skills they need to navigate “the deep waters of the information flood.”
