Underpinning a disdain for social media in higher education is the assumption that incoming students have an inherent aptitude for new technologies

"If you took a soldier from a thousand years ago and put them on a battlefield, they'd be dead," Howard Rheingold, a professor teaching virtual community and social media at Stanford University, told me one morning via Skype. "If you took a doctor from a thousand years ago and put them in a modern surgical theater, they would have no idea what to do. Take a professor from a thousand years ago and put them in a modern classroom, they would know where to stand and what to do."

Terms like "digital native" and "digital immigrant" have been used by marketers as a way of differentiating generations.

This tale is not new. The vaunted halls of academia move slowly and cautiously. Research is produced, reviewed, and vetted before it is given credibility, and at times this deliberate pace poses problems, philosophical, pedagogical, or otherwise, for professors. But the rise of social media may change that. With social media becoming increasingly pervasive on college campuses, in classrooms and in dormitories, a shift in how higher education approaches the medium is under way, if at a much slower rate than in the professional world.

For the last several years, the teaching of social media has been reactive, confined either to non-matriculated night classes for working professionals or to business schools, where budding marketers learn and hone their online and social marketing skills.

"Some people were affected directly in their everyday lives by this thing called social media," says Mihaela Vorvoreanu, an assistant professor of computer graphics technology in the College of Technology at Purdue University, who teaches a doctoral level research seminar in social media. "They had to figure it out. There was no choice about it. They had to learn about it." So they went back to school to learn how to create Facebook campaigns, how to incorporate SEO best-practices, how to blog, and how to create social media strategies.

But as social interactions and technologies mature, the pendulum has swung. Professors now approach the teaching of social media from a pedagogical perspective as much as a practical one.

Teaching Social Media Theory

In communications, business, psychology, anthropology, sociology, and information technology departments across the nation, theories of social media -- and how to teach it -- are becoming more prevalent. Sarah Smith-Robbins, professor and Director of Emerging Technologies at the Kelley School of Business at Indiana University, teaches a course called "Social and Digital Marketing." "We go over the theories behind social media: why do things go viral, the social theories of how people act and how they communicate to a network, or one person at a time, and why do certain tools work the way they do for us," she says. With an obvious slant towards the professional, these theoretical questions help students grasp the fundamentals of social media, beyond posting personal status updates on Facebook or Twitter. Instead of understanding social media as products, students are encouraged to treat status updates as part of a larger information ecosystem.

"As faculty, we're always trying to engage our students better," Smith-Robbins says. "If we see them using a tool like Facebook, there's this huge temptation to say, 'Well, I use Facebook in class,' because that's where they're at. More times than not, it doesn't work because it has to be a pedagogical decision first, rather than a technology decision. Plus, all these tools have their own culture and if you try to use them for something different, you're more often than not going to make mistakes."

With social media being a pervasive, if not invasive, aspect of our lives, it makes perfect sense for the Ivory Tower to embrace it from a theoretical perspective, helping students understand the technology and its effect on their daily lives, as well as the epistemological question of "how do we know what we know?" At Bradley University, Heidi Rottier started a social media class in the university's marketing department and has now, with her marketing department colleagues, created a social media minor to address these questions.

"Helping [the students] understand that in all they do, in all the traditional media world, now has to be translated and useful in the social media world," she says. "Our thought in creating the concentration was we wanted students who were uniquely prepared with a strong marketing background to then do the social media side."

Each discipline approaches the teaching of social media differently: The medium is new enough that there is no canon shaping social media, just conceptual frameworks for examining its effects on students' lives and communities and on society as a whole. The task of academics is to give students a vocabulary to understand these perspectives, along with the tools to make sense of the theoretical discussions and to think critically about social media.

The Anti-Social Media Faction

However, not all academics are embracing the teaching of the social world. "There are people who are anti-social media, who couldn't care less one way or another," says Reynol Junco, a professor in the Department of Academic Development and Counseling at Lock Haven University. "Opinions run the gamut from hostility to preaching of the benefits of social media. The assumption from faculty is that students are on Facebook and not doing anything else, and taking all their time away from study."

Junco teaches a Social Media in Higher Ed course, exploring multiple ways social media can be pedagogically incorporated into higher education. "It amazes me how many courses are out there about social media that don't use the tools in the course," said Smith-Robbins. "It's like studying to be a doctor and never touching a body, and then going into practice. You gotta get your hands on it, know how it works. You can only theorize about how these communities work if you're not willing to actually go in and see how they function."

Professor Vorvoreanu agrees: "I don't think you have the credibility of doing research, of writing about [it], unless you get to really know that culture. And the best way of knowing the culture is to actually be immersed in it."

This anti-social media outlook cited by several professors is endemic in an educational system that has, as Rheingold puts it, "no positive incentives for innovating in pedagogy."

Social media creates an opportunity to change this outlook -- whether through, as Rheingold says, deprogramming students in the way they learn by using the tools available to them, or by throwing them out into the business world, via internships, armed with theories of social media. Of course, the latter has led to an even more interesting set of assumptions.

The Myth of the Digital Native

Underpinning a disdain for social media in higher education is the assumption that incoming students already have an inherent aptitude for new technologies. Students in Professor Smith-Robbins's class kept running into a strange obstacle at their various internships. Many of them were born after 1992, right at the beginning of the popularization of the Internet, and their employers, executives at Fortune 500 companies, believed the students inherently understood social media.

Terms like "digital native" (those born during or after the introduction of digital technology -- computer, Internet, etc. -- and have an assumed greater understanding of how technology works because they've been using digital technology their entire lives) and "digital immigrant" (those born before this introduction and have had to adapt and adopt the technology at a later point in life) have been bandied around by experts and marketers as ways of classifying and differentiating between generations, and, more importantly, the expectations of those who fall into either category.

Professor Vorvoreanu pointedly declares these terms a myth. "It's a myth that's harmed this current generation," she said. "And the way it has harmed them is because it has stopped educators from teaching what they need to teach. It has scared educators into thinking students know more than us. God forbid we learn something from our students. And, so, we assumed these kids already know, and we don't teach them. And we expect them to know things and we grade them; we evaluate them; we hire them based on what we think, we assume, they know. And they don't. How would you know this stuff if no one ever bothered to point it out to you that this is something you should be learning, because everyone assumes you already know?"

Another issue stemming from the notion of the "digital native" is a lack of critical literacy. Because students of the Digital Age never had to acclimate to the sweeping change from analog to digital, and because they are assumed to possess innate technological knowledge based solely on the year they were born, they are rarely pushed to step back and think critically about the tools they use or the sheer velocity of recent innovations.

"We have on our hands the last generation of educators who do remember life before these tools, and so therefore, we have an opportunity to teach some critical literacy that these students may not get otherwise; this generation may not get otherwise," Smith-Robbins says.

One way to approach critical literacy is by changing the pedagogy. Rheingold, who is at the forefront of the social media classroom, believes in collaborative learning, putting the onus on students to learn not just from him, but from each other. The instructor serves as a facilitator, but the students have to want to be there, process the information, and use it in a productive way.

"The students teach each other much more than they used to," he said. "They need some guidance on how to do that, and they need a little bit of an awakening because they've been in a kind of test-trance for so many years."

As the study of social media swings from the practical to the theoretical, many institutional issues will arise, from internal politics to disputes over which department owns social media. But what seems clear is that teaching social media through a traditional mode will not suffice. And while "digital native" is a misnomer at best and a marketing myth at worst, students (and newly minted professors) who grew up in the Digital Age will have a different set of expectations -- about education, about professors, about life -- upon entering school.

"The issues around social media -- community, identity, presentation of self, social capital, public sphere, collective action; a lot of important topics from other disciplines -- aren't really being raised in academia," said Rheingold. "They ought to be because these topics, not only academically, in terms of the shifts in media and literacy that they're triggering in the world, are where the students live and work."
