3 Quarks Daily

Wednesday, February 03, 2016

Something happens to a novel as it ages, but what? It doesn’t ripen or deepen in the manner of cheese and wine, and it doesn’t fall apart, at least not figuratively. Fiction has no half-life. We age alongside the novels we’ve read, and only one of us is actively deteriorating. Which is to say that a novel is perishable only by virtue of being stored in such a leaky cask: our heads. With just a few years’ passage, a novel can thus seem “dated” or “irrelevant” or (God help us) “problematic.” When a novel survives this strange process, and gets reissued in a handsome 20th-anniversary edition, it’s tempting to hold it up and say, “It withstood the test of time.” Most would intend such a statement as praise, but is a 20-year-old novel successful merely because it seems cleverly predictive or contains scenarios that feel “relevant” to later audiences? If that were the mark of enduring fiction, Philip K. Dick would be the greatest novelist of all time.

David Foster Wallace understood the paradox of attempting to write fiction that spoke to posterity and a contemporary audience simultaneously, with equal force. In an essay written while he was at work on "Infinite Jest," Wallace referred to the "oracular foresight" of writers such as Don DeLillo, whose best novels — "White Noise," "Libra," "Underworld" — address their contemporary audience like a shouting desert prophet while laying out for posterity the coldly amused analysis of some long-dead professor emeritus. Wallace felt that the "mimetic deployment of pop culture icons" by writers who lacked DeLillo's observational powers "compromises fiction's seriousness by dating it out of the Platonic Always where it ought to reside." Yet "Infinite Jest" rarely seems as though it resides within this Platonic Always, which Wallace rejected in any event. (As with many of Wallace's more manifesto-ish proclamations, he was not planting a flag so much as secretly burning one.) We are now at least half a decade beyond the years Wallace intended his novel's subsidized time schema — Year of the Whopper, Year of the Depend Adult Undergarment — to represent. Read today, the book's intellectually slapstick vision of corporatism run amok embeds it within the early to mid-1990s as firmly and emblematically as "The Simpsons" and grunge music. It is very much a novel of its time.

How is it, then, that “Infinite Jest” still feels so transcendentally, electrically alive? Theory 1: As a novel about an “entertainment” weaponized to enslave and destroy all who look upon it, “Infinite Jest” is the first great Internet novel.

When I first saw the TV show, I loved it. The first season is perfectly balanced between individual stories (the story of Hailey, the young oboe player who gets her shot at the big time; the story of Gael Garcia Bernal's hotshot conductor Rodrigo, brought in to save the NYSO) and the story of the larger thing of which they are all a part: the orchestra, the music. As Hailey and Rodrigo enter the orchestra from opposite ends — the bottom and the top — we get an upstairs-downstairs picture of the institution, and at the same time, we are invited to consider the value of an undervalued art. In its first season, Mozart in the Jungle was essentially Slings and Arrows: the story of a group of players who must constantly, loudly, and insistently declare that the show must go on, because the people they must convince are themselves, because it's anything but clear that it can, and because they are the ones who must put their asses and livelihoods on the line, day after day, night after night. Their belief, their faith, and their gamble are the only things that make the show go on: the only certainty is that if they stopped saying it, it wouldn't. And so the conductor, musicians, and staff of the orchestra all have to keep insisting that they are New York's orchestra — that they are our orchestra, your orchestra — to keep the ship afloat, and to counteract the worrying realization that no one seems to want them very much. Their audience is aging and dying, their patrons are more interested in cultural capital than in the music, and true lovers of their art are few and far between (and usually broke). Orchestras are expensive and tickets are hard to sell; the orchestra's main audience seems to be itself, while Rodrigo De Souza struggles to live up to the expectations of the ghost of Mozart.

The conclusion to the first season is superbly and powerfully crafted: the show, it turns out, goes on, but only once a few illusions are shed. The conductor must become a member of the orchestra; individual themes must be subordinated for the good of the music. At its best, it turns out, it isn’t about you. It’s about the music.

But the more I think about Tindall’s book, the more I realize how rose-colored that idealism ultimately is. The show not only makes classical music seem fun and sexy, and occasionally dangerous, but it draws you into its idealism, the abiding faith of its musicians that the thing itself is not only enough, on its own, but that it’s the only thing that matters. And this is exactly the passion for the music that the industry uses to keep itself going, and which keeps, in turn, its most exploited workers and musicians in their place.

Robert Greene II on Jonathan Zimmerman's Too Hot to Handle: A Global History of Sex Education, over at US Intellectual History:

In 1947, just six years after Life Magazine declared the rise of the “American Century,” the American Social Hygiene Association (ASHA) distributed sex education materials to 47 different nations and over 60 organizations across the globe. Formed in the Progressive Era and now combined with the political, economic, and military might of the United States in the early years of the Cold War, the ASHA and similar groups reflected the rise of sex education in Europe, the United States, and eventually the non-western world throughout the twentieth century. Scenes of educators, government officials, and health care workers unpacking sex education films and written materials for classrooms in places such as East Asia, Africa, and Latin America symbolized the rising hopes and influence of western sex educators after World War II.

However, if little about the postwar world looked familiar to those who had lived through the first half of the century, historian Jonathan Zimmerman’s Too Hot to Handle: A Global History of Sex Education (Princeton, 2015) emphasizes the enduring limits of sex education in both the United States and abroad. This brief but ambitious survey of sex education since 1898 tempers claims about modernity and revolutionary change with a narrative largely about continuity. Despite their resources and often evangelical commitment, sex educators in virtually every country from the United States to the Middle East faced strikingly similar opposition to their efforts to teach students about the physical, emotional, and social realities of sexuality. Rather than simply conflicts over specific curriculum materials or the training of teachers, the passionate arguments over the nature and future of sex education in the last century illuminate larger intellectual debates over culture, power, and rights, both individual and parental, in a globalized world.

The governments of Egypt and Turkey are brazenly leading a multi-pronged assault on writers, artists and intellectuals. Turkish president Recep Tayyip Erdoğan last month denounced his critics among Turkish academics as treasonous fifth columnists of foreign powers; many of them have been subsequently dismissed and suspended. Both Turkey and Egypt have imprisoned journalists, provoking international protests. But the suppression of intellectual and creative freedoms is assuming much cannier forms in India, a country with formal and apparently free democratic institutions.

Controlled by upper-caste Hindu nationalists, Indian universities have been purging “anti-nationals” from both syllabuses and campuses for some months now. In a shocking turn of events last month, Rohith Vemula, a PhD student in Hyderabad, killed himself. Accused of “anti-national” political opinions, the impoverished research scholar, who belonged to one of India’s traditionally and cruelly disadvantaged castes, was suspended, and, after his fellowship was cancelled, expelled from student housing. Letters from Modi’s government in Delhi to university authorities revealed that the latter were under relentless pressure to move against “extremist and anti-national politics” on campus. Vemula’s heartbreaking suicide note attests to the near-total isolation and despair of a gifted writer and thinker.

The extended family of upper-caste nationalists plainly aims at total domination of the public sphere. But they don't only use the bullying power of the leviathan state – one quickly identified by local and foreign critics – to grind down their apparent enemies. They pursue them through police cases and legal petitions by private individuals – a number of criminal complaints have been filed against writers and artists in India. They create a climate of impunity, in which emboldened mobs ransack newspaper offices, art galleries and cinemas.

In his newest book, historian Greg Grandin provides background to Herman Melville's classic Benito Cereno, an 1855 short novel about a slave rebellion. Reflecting on this story written more than a century and a half ago, Grandin opens up space for further research by those investigating the Black Atlantic. Melville's novel told the story of a concerned and liberal sea captain, Amasa Delano, who boarded the slave ship San Dominick and encountered a deferential slave, Babo, caring for his slave master, who had taken ill. Delano was moved by the humble slave's concern for his master, the ship's captain. Not until the end of that day does Delano realize that he has been deceived and that the slaves on the San Dominick have revolted and actually taken charge. Babo was not a deferential slave, solicitous toward his sick master, but rather the revolt's ringleader! When Delano discovers the true circumstances, he directs his armed team of sailors to round up the rebels, and a fight ensues. The Melville narrative is extraordinary, ironic, and liberatory, and Grandin's book provides some remarkable background material to the novel. Benito Cereno was based on actual events recorded in the real Delano's 1817 journal. Grandin provides a context for grasping the late eighteenth- to early nineteenth-century revolutionary era, a whole period of slave rebellion. Three of Grandin's contributions deserve special explication here: his emphasis on the Muslim influence among the slaves, the brutality of the free-enterprise seal-hunting industry, and the harsh march of slaves over the Andes.

First, one of Grandin’s striking contributions is his situating the real and the fictional Amasa Delano, and fictional Babo, within a revolutionary history and alerting his readers to the Muslim background of many West African slaves. Grandin reports that “some estimate as many as 10 percent” of over twelve million African slaves taken to America were Muslims (195). Grandin shows us, in fact, that when Protestant Delano meets Babo, he is possibly not confronted with a Christian slave, but a Muslim one, a Muslim brother of those who rose up and fought for a decade to acquire Haitian independence in 1804.

More here. (Note: At least one post will be dedicated to honoring Black History Month throughout February.)

Tuesday, February 02, 2016

It is estimated that between 1890 and 1925, an African American was lynched every two and a half days. The academic and intellectual community was no different from the bulk of mainstream America. People of African descent were visibly absent from any scholarship or intellectual discourse that dealt with human civilization. African Americans were so dehumanized, and their history so distorted in academia, that slavery, peonage, segregation, and lynching were considered justifiable conditions. Under Woodson's direction, and with contributions from other African American and white scholars, Negro History Week was launched on a serious platform in 1926 to neutralize the apparent ignorance and deliberate distortion of Black history. The theme of Black History Month 2016 is "Hallowed Grounds: Sites of African American Memories."

February was selected by Carter Godwin Woodson, a noted historian and publisher and a pioneer of American Black history. He selected February because the month has enormous significance in Black American history. First, it celebrates two historical figures who had a great impact on the Black population: Abraham Lincoln and Frederick Douglass. Other reasons February is significant: W.E.B. Du Bois, a civil rights leader and co-founder of the N.A.A.C.P., was born on February 23, 1868. The 15th Amendment to the United States Constitution, which gave Blacks the right to vote, was passed on February 3, 1870. The first Black senator, Hiram R. Revels, took office on February 25, 1870. The N.A.A.C.P. (National Association for the Advancement of Colored People) was founded in New York City on February 12, 1909. And Malcolm X, the militant leader who promoted Black Nationalism, was shot and killed by Black Muslims on February 21, 1965.

More here. (Note: At least one post will be dedicated to honoring Black History Month throughout February.)

One evening in the late fall, Lucien Majors, 84, sat at his kitchen table, his wife Jan by his side, as he described a recent dream. Mr. Majors had end-stage bladder cancer and was in renal failure. As he spoke with a doctor from Hospice Buffalo, he was alert but faltering. In the dream, he said, he was in his car with his great pal, Carmen. His three sons, teenagers, were in the back seat, joking around. “We’re driving down Clinton Street,” said Mr. Majors, his watery, pale blue eyes widening with delight at the thought of the road trip. “We were looking for the Grand Canyon.” And then they saw it. “We talked about how amazing, because there it was — all this time, the Grand Canyon was just at the end of Clinton Street!” Mr. Majors had not spoken with Carmen in more than 20 years. His sons are in their late 50s and early 60s. “Why do you think your boys were in the car?” asked Dr. Christopher W. Kerr, a Hospice Buffalo palliative care physician who researches the therapeutic role of patients’ end-of-life dreams and visions. “My sons are the greatest accomplishment of my life,” Mr. Majors said. He died three weeks later.

For thousands of years, the dreams and visions of the dying have captivated cultures, which imbued them with sacred import. Anthropologists, theologians and sociologists have studied these so-called deathbed phenomena. They appear in medieval writings and Renaissance paintings, in Shakespearean works and set pieces from 19th-century American and British novels, particularly by Dickens. One of the most famous moments in film is the mysterious deathbed murmur in “Citizen Kane”: “Rosebud!” Even the law reveres a dying person’s final words, allowing them to be admitted as evidence in an unusual exception to hearsay rules.

Consists of two tight-twisted, separate strands
Conjoined as one: and not unlike, in fact,
Our own familiar silver wedding bands,
Though these are loosely woven, inexact,
With wide interstices, so that each makes
A circle of ellipses. Tightly caught
At random intervals, two little snakes
Of wire are crimped into a snaggled knot,

That four short ends, sharp bevel-cut, present
Unsheathed, ingenious fangs. And when in place,
Stretched taut, or strewn in loose coils, may prevent
The passage through some designated space

Of beast, or man. You got used to the stench;
The mud was worse than being under fire,
My father said. A detail left the trench
At night, to get the dead back from the wire,

And no one volunteered. They stood, to view
Our brief exchange of rings and vows, for both
Our fathers had survived that war: and knew
Of death, and bright entanglement, and troth.

Researchers at Harvard-affiliated Boston Children’s Hospital have, for the first time, visualized the origins of cancer from the first affected cell and watched its spread in a live animal. Their work, published in the Jan. 29 issue of Science, could change the way scientists understand melanoma and other cancers and lead to new, early treatments before the cancer has taken hold.

“An important mystery has been why some cells in the body already have mutations seen in cancer, but do not yet fully behave like the cancer,” says the paper’s first author, Charles Kaufman, a postdoctoral fellow in the Zon Laboratory at Boston Children’s Hospital. “We found that the beginning of cancer occurs after activation of an oncogene or loss of a tumor suppressor, and involves a change that takes a single cell back to a stem cell state.”

That change, Kaufman and colleagues found, involves a set of genes that could be targeted to stop cancer from ever starting.

The study imaged live zebrafish over time to track the development of melanoma. All the fish had the human cancer mutation BRAFV600E — found in most benign moles — and had also lost the tumor suppressor gene p53.

In Israel and the occupied Palestinian territories, 2016 has begun much as 2015 ended — with unacceptable levels of violence and a polarized public discourse. That polarization showed itself in the halls of the United Nations last week when I pointed out a simple truth: History proves that people will always resist occupation.

Some sought to shoot the messenger — twisting my words into a misguided justification for violence. The stabbings, vehicle rammings and other attacks by Palestinians targeting Israeli civilians are reprehensible. So, too, are the incitement of violence and the glorification of killers.

Nothing excuses terrorism. I condemn it categorically. It is inconceivable, though, that security measures alone will stop the violence. As I warned the Security Council last week, Palestinian frustration and grievances are growing under the weight of nearly a half-century of occupation. Ignoring this won’t make it disappear. No one can deny that the everyday reality of occupation provokes anger and despair, which are major drivers of violence and extremism and undermine any hope of a negotiated two-state solution.

Israeli settlements keep expanding. The government has approved plans for over 150 new homes in illegal settlements in the occupied West Bank. Last month, 370 acres in the West Bank were declared “state land,” a status that typically leads to exclusive Israeli settler use.

At the same time, thousands of Palestinian homes in the West Bank risk demolition because of obstacles that may be legal on paper but are discriminatory in practice.

Monday, February 01, 2016

For much of the 20th Century, the U.S. was a culinary backwater. Outside some immigrant enclaves where old world traditions were preserved, Americans thought of food as nutrition and fuel. Food was to be cheap, nutritious (according to the standards of the day) and above all convenient; the pleasures of food, if attended to at all, were a minor domestic treat unworthy of much public discussion.

How times have changed! Today, celebrity chefs strut across the stage like rock stars, a whole TV network is devoted to explaining the intricacies of fermentation or how to butcher a hog, countless blogs recount last night's meal in excruciating detail, and competitions for culinary capo make the evening news. We talk endlessly about the pleasures of food, conversations that are supported by specialty food shops, artisan producers, and aisles of fresh, organic produce in the supermarket. Restaurants, even small neighborhood establishments, feature chefs who cook with creativity and panache.

Why this sudden interest in food? As I argue in American Foodie: Taste, Art and the Cultural Revolution, our current interest in food is a search for authenticity, face-to-face contact, local control, and personal creativity amidst a world that is increasingly standardized, bureaucratic, digitized, and impersonal. In contemporary life, the public world of work, with its incessant demands for efficiency and profit, has colonized our private lives. The pressures of a competitive, unstable labor market, the so-called "gig" economy, along with intrusive communications technology make it increasingly difficult to escape a work world governed by the value of efficiency. This relentless acceleration of demands compresses our sense of time so we feel like there is never enough of it. Standardization destroys the uniqueness of localities, and our social lives are spread across the globe in superficial networks of "contacts" where we interact with brands instead of whole persons. The idea that something besides production and consumption should occupy our attention, such as a sense of community or self-examination, seems quaint and inefficient—a waste of time. Thus, we lose touch with ourselves while internalizing the self-as-commodity theme and hiving off all aspects of our lives that might harm our "brand"—a homogenized, marketable self. Even our vaunted and precious capacity to choose is endangered, for we no longer choose based on a sensibility shaped by our unique experiences; instead our sensibilities are constructed by corporate choice architects, whose surveys and data mining shepherd our decisions.

I came dangerously close to not becoming a mathematician. Like many people, I found that math in school left me irritated and bored. I have a poor memory and I'm not a detail-oriented person [1]. The arbitrary rules to be memorized and the fiddly and unforgiving nature of calculations made each homework assignment a minefield of point-losing opportunities. And the problems! To "motivate" us with "applications," the problems were meant to be real-world, yet always seemed to involve the patently ridiculous: rectangular pastures, conical barns, spherical cows. I don't know how anyone can refer to such obviously contrived problems as "real-world" with a straight face.

Or, worse, problems were completely devoid of any motivation whatsoever. I have a strong memory of having to learn how to multiply matrices together. The rules were clearly designed to maximize the number of calculations required and, hence, the chances of making a mistake. I can't imagine who thought this was a good topic for fifteen-year-olds. Not a word was said about why we should learn such a thing, or why anyone, anywhere should care. Oh, to have known something about how matrices are used in geometry and computer graphics, or to store and manipulate data, or to compute probabilities in Markov processes. Heck, just pointing out that it is an example of a "multiplication" where AB and BA are not equal would have been a great start!
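That last point takes only a few lines to demonstrate. Here is a minimal sketch in Python (plain nested lists, no libraries; the `matmul` helper is mine, written for illustration) showing that matrix multiplication is not commutative:

```python
def matmul(A, B):
    """Multiply matrices A and B, each given as a list of rows."""
    n, m, p = len(A), len(B), len(B[0])
    # Entry (i, j) of the product is the dot product of
    # row i of A with column j of B.
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]  # a permutation matrix

print(matmul(A, B))  # [[2, 1], [4, 3]] -- the columns of A swapped
print(matmul(B, A))  # [[3, 4], [1, 2]] -- the rows of A swapped
```

Multiplying by B on the right permutes A's columns, while multiplying on the left permutes its rows — two genuinely different operations, so AB ≠ BA. That geometric reading (matrices as transformations, order as composition) is exactly the motivation the classroom version left out.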

Of course my experience is the rule, not the exception. Paul Lockhart wrote a fantastic essay in 2002 entitled "A Mathematician's Lament" which captures the situation perfectly. It's like requiring everyone to be able to read music without ever letting them hear a tune, only saying it will be needed in some unspecified way as a working adult. Or like teaching reading using only tax forms and TV repair manuals. Everyone with an interest in math or education should read it. You can read it here. As Lockhart writes,

...if I had to design a mechanism for the express purpose of destroying a child’s natural curiosity and love of pattern-making, I couldn’t possibly do as good a job as is currently being done— I simply wouldn’t have the imagination to come up with the kind of senseless, soul-crushing ideas that constitute contemporary mathematics education.

When it comes to political questions, reasonable people disagree. Reasonable disagreement persists also in philosophy, religion, and a broad array of interpersonal matters. That's life. And, indeed, we must live; we must make decisions, set plans, and adopt policies that affect, interest, and impact others. Our decisions have drawbacks, actions have consequences, and plans impose costs on others. We cannot always just go our own way; we have to consult others in trying to figure out how to go on. Hence disagreements arise.

Any view important enough to stimulate disagreement is a view that will look to some reasonable others as prohibitively costly, suboptimal, incorrect, or foolhardy. Thus assessing the drawbacks of one's view is where the argument concerning its overall merit begins, not where it ends. Thoughtful people are aware that their views will strike some reasonable others as manifestly rejectable, and consequently, thoughtful people take reasonable criticism not always as an attack on their proposals, but rather as an occasion for thinking and saying more about them. In some instances, the case can be made that the drawbacks of one's view must be borne (because, perhaps, the viable alternatives are yet even worse); in other cases, it might be arguable that the costs of adopting one's view are merely apparent or on the whole insignificant. The point is that it's plainly insipid to proceed as if the fact that an opponent's view is imperfect were a decisive reason to reject it. Showing that an interlocutor's proposal is thoroughly criticizable is never the end of the matter. What must also be shown is that the interlocutor's criticizable proposal is inferior to the other (criticizable) proposals worth considering. And that comparative task requires us to allow our interlocutors to respond to our criticisms.

The trouble is that so much popular political debate seems to presuppose that the only political view worth accepting would be one that could not be reasonably criticized.

Nearly half a million applications for asylum submitted by refugees were processed by German authorities in 2015, according to the German Federal Office for Refugees and Migration. The number of people who were officially registered in Germany as potential asylum seekers was even far higher – roughly one million in 2015 – which suggests that Germany anticipates an even higher number of official asylum applications for 2016. Chancellor Angela Merkel has defied many critics even in her own party and cabinet by emphasizing that Germany can and will take on more refugees, most of whom are coming from war-torn countries such as Syria, Iraq and Afghanistan. "We can do it!" ("Wir schaffen das!") was the phrase she used in September of 2015 to convey her optimism and determination in the face of ever-growing numbers of refugees, the gradual rise of support for far-right extremist demonstrations, and violent attacks by far-right extremists on refugee centers in Germany.

The German media and right-wing populists are currently obsessing about statistics such as the fact that the far-right and libertarian party AfD (Alternative für Deutschland - Alternative for Germany) would garner 10% of the popular vote, or that the vast majority of the refugees are male and could lead to a demographic gender shift if they remain in Germany. While such statistics serve as an important barometer of the political climate in the German electorate, and help prepare for the challenges faced by the refugees and German society in the coming years, they do not address the fundamental philosophical questions raised by this refugee crisis. In the latest issue of the popular German philosophy periodical "Philosophie Magazin", the editors asked philosophers and other academic scholars to weigh in on some of the key issues and challenges in the face of this crisis.

Should we be motivated by a sense of global responsibility when we are confronted with the terrible suffering experienced by refugees whose homes have been destroyed? The sociologist Hartmut Rosa at the University of Jena responds to this question by suggesting that we should focus on Verbundenheit ("connectedness") instead of Verantwortung ("responsibility"). Demanding that those of us who lead privileged lives of safety and reasonable material comfort should feel individually responsible for the suffering of others can lead to a sense of moral exhaustion. Are we responsible for the suffering of millions of people in Syria and East Africa? Are we responsible for the extinction of species as a consequence of climate change? Instead of atomizing – and thus perhaps even rendering irrelevant – the abstract concept of individual responsibility, we should become aware of how we are all connected.

I stopped caring about him sometime between January and May, when the weather changed and the leaves came back. He went on that big white pill and couldn't have aged cheese or avocado and I sat at the table in the kitchen, watching him watch me.

The yelling wouldn't stop until he'd had enough, when his eyes no longer felt right in his head and he'd rather lay down than stand there, fist in mouth, cat rubbing against both legs.

He once told me that depression comes in waves but that makes it sound too beautiful. There was nothing good about the bad.

Sometimes we'd try to fight it before it hit. I'd take a shower. He'd shave his face. Vacuum the hallway rug. But it never worked and the top would blow off and it would be hands to throats again, just like that.

Teacups shook in their skin, books fell over on themselves and I wanted to see how it would all play out. Does he get the girl in the end? Or does she leave during a quiet moment, smiling as she turns away. His hand pressed against her like an ear.

"No sooner does man discover intelligence than he tries to involve it in his own stupidity." ~ Jacques Yves Cousteau

Over the course of my last few posts I have been groping towards some kind of meeting point between, on the one hand, the current wave of information technologies, as represented by artificial intelligence (AI), social media and robotics; and on the other, what might be termed, for the sake of brevity, the social condition. The thought experiment is hardly virtual, and is in fact unfolding before us in real time, but as I have been considering the issues at stake, there are significant blind spots that will demand elaboration by many commentators in the years and decades to come. Assuming that, as Marc Andreessen put it, software (and the physical objects in which it is increasingly becoming embodied) will continue to "eat the world", how can we expect these technological goods to be distributed across society?

It's actually kind of difficult to envision this even being a problem in the first place. It's true that, up until the first years of this century, there was some discussion of the so-called ‘digital divide', where certain segments of the population would not be able to get onto the ‘Internet superhighway' (another term that has fallen into disuse, perhaps because it feels like we never get out of our cars anymore). These were the segments of society that were already disadvantaged in some respect, where circumstances of poverty and/or geography prevented the delivery of physical and therefore digital services. To a lesser extent, those on the wrong side of the divide may also have landed there because of language proficiency or age.

Sunday, January 31, 2016

For the first time, scientists have pinned down a molecular process in the brain that helps to trigger schizophrenia. The researchers involved in the landmark study, which was published Wednesday in the journal Nature, say the discovery of this new genetic pathway probably reveals what goes wrong neurologically in a young person diagnosed with the devastating disorder.

The study marks a watershed moment, with the potential for early detection and new treatments that were unthinkable just a year ago, according to Steven Hyman, director of the Stanley Center for Psychiatric Research at the Broad Institute of MIT and Harvard. Hyman, a former director of the National Institute of Mental Health, calls it "the most significant mechanistic study about schizophrenia ever."

"I’m a crusty, old, curmudgeonly skeptic," he said. "But I’m almost giddy about these findings."

The researchers, chiefly from the Broad Institute, Harvard Medical School and Boston Children's Hospital, found that a person's risk of schizophrenia is dramatically increased if they inherit variants of a gene important to "synaptic pruning" -- the healthy reduction during adolescence of brain cell connections that are no longer needed.

In patients with schizophrenia, a variation at a single position in the DNA sequence marks too many synapses for removal, and the pruning goes out of control. The result is an abnormal loss of gray matter.

"Today," Sanders said as he announced his campaign last May, "we begin a political revolution to transform our country economically, politically, socially and environmentally."

And what, exactly, does he mean by that?

On its surface, the concept is simple: Sanders wants to organize and mobilize the people against the powerful — specifically, corporations and the wealthy. He argues that by building a movement among average Americans, he'll be able to win elections, defeat special interests, push liberal reforms into law, and build an economy that works for everyone.

But when you drill down into the details, Sanders's hoped-for political revolution is more complicated than that — and more interesting.

For one, it's a contested theory about the best way to advance progressive policies. For another, it's an electoral argument about how the Democratic Party can expand its appeal among white voters and change the existing partisan math. It's a case that the system is so broken and corrupt that extreme measures are needed to shake it up. And it's a usually implicit, sometimes explicit critique of President Obama and his party.

Tallulah: Netflix purchased streaming rights to Sian Heder’s writing/directing debut for $5 million. The film stars Ellen Page as Lu, a free-spirited young woman who, distraught after being left by her boyfriend, wanders an upscale hotel looking for leftovers. When she’s mistaken for a maid by one of the hotel’s patrons, she decides to “rescue” the patron’s child from its mother. Lu takes the child to the house of her boyfriend’s mother, Margo (played by Allison Janney). What follows is a deep story of two women from different worlds trying to understand one another.

Under the Shadow: Netflix also bought the rights to this tense and haunting horror film set in Tehran during the 1980s. Directed by Babak Anvari, the Farsi-language film centers on an aspiring doctor named Shideh (Narges Rashidi) who wants to continue her medical studies but faces pushback because of her history of political activism. As the story develops, supernatural forces seem to mix with the real-world horrors of war.