Blog Archives

At the beginning of each semester I tell my students the very thing that journalist Zat Rana gets at in a recent article for Quartz when I deliver a mini-sermon about my complete ban on phones — and also, for almost all purposes, laptops — in my classroom. A smartphone or almost any cell phone in your hand, on your desk, or even in your pocket as you’re trying to concentrate on other important things is a vampire demon powered by dystopian corporate overlords whose sole purpose is to suck your soul by siphoning away your attention and immersing you in a portable customized Matrix.

Or as Rana says, in less metaphorical language:

One of the biggest problems of our generation is that while the ability to manage our attention is becoming increasingly valuable, the world around us is being designed to steal away as much of it as possible. . . . Companies like Google and Facebook aren’t just creating products anymore. They’re building ecosystems. And the most effective way to monetize an ecosystem is to begin with engagement. It’s by designing their features to ensure that we give up as much of our attention as possible.

Rana offers three pieces of sound advice for helping to reclaim your attention (which is the asset referred to in the title): mindfulness meditation, “ruthless single-tasking,” and regular periods of deliberate detachment from the digital world.

Interestingly, it looks like there’s a mini-wave of this type of awareness building in the mediasphere. Rana’s article for Quartz was published on October 2. Four days later The Guardian published a provocative and alarming piece with this teaser: “Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention.” It’s a very long and in-depth article. Here’s a taste:

There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity — even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”

But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein’s peers believe can be attributed to the rise of social media and the attention-based market that drives it. . . .

Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. . . .

“The dynamics of the attention economy are structurally set up to undermine the human will,” [ex-Google strategist James Williams] says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”

In the same vein, Nicholas Carr (no stranger to The Teeming Brain’s pages) published a similarly aimed — and even a similarly titled — essay in the Weekend Review section of The Wall Street Journal on the very day the Guardian article appeared (October 6). “Research suggests that as the brain grows dependent on phone technology, the intellect weakens,” says the teaser. Here’s a representative passage from the essay itself:

Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object in the environment that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.” Media and communication devices, from telephones to TV sets, have always tapped into this instinct. Whether turned on or switched off, they promise an unending supply of information and experiences. By design, they grab and hold our attention in ways natural objects never could.

But even in the history of captivating media, the smartphone stands out. It’s an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what [Adrian] Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it’s part of the surroundings — which it always is. Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library, and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That’s what a smartphone represents to us. No wonder we can’t take our minds off it.

At his blog, Carr noted that his essay and the Guardian article had appeared on the same day. He also remarked on the similarity between their titles, calling it a “telling coincidence” and commenting:

It’s been clear for some time that smartphones and social-media apps are powerful distraction machines. They routinely divide our attention. But the “hijack” metaphor — I took it from Adrian Ward’s article “Supernormal” — implies a phenomenon greater and more insidious than distraction. To hijack something is to seize control of it from its rightful owner. What’s up for grabs is your mind.

Perhaps the most astonishing thing about all of this is that John Carpenter warned us about it three decades ago, and not vaguely, but quite specifically and pointedly. The only difference is that the technology in his (quasi-)fictional presentation was television. Well, that, plus the fact that his evil overlords really were ill-intentioned, whereas ours may be in some cases as much victims of their own devices as we are. In any event:

There is a signal broadcast every second of every day through our television sets. Even when the set is turned off, the brain receives the input. . . . Our impulses are being redirected. We are living in an artificially induced state of consciousness that resembles sleep. . . . We have been lulled into a trance.

Nick Hanauer, himself a certified member of America’s ruling “one percent,” warns his fellow plutocrats that the pitchforks are coming for all of them: “No society can sustain this kind of rising inequality. In fact, there is no example in human history where wealth accumulated like this and the pitchforks didn’t eventually come out. You show me a highly unequal society, and I will show you a police state. Or an uprising. There are no counterexamples. None. It’s not if, it’s when.”

Religion scholar Timothy Beal (author of the wonderful Religion and Its Monsters) examines the relationship between our spiritual impulse and our enduring fascination with the monsters of supernatural horror: “The import of the spiritual mainstream is holistic and ‘cosmic,’ speaking to our desire for grounding and orientation within a meaningfully integrated and interconnected whole. The monsters of contemporary horror, on the other hand, often remind us of the more chaotic, disorienting, and ungrounding dimensions of religion, envisioning an everyday life that is not without fear and trembling.”

There is something about biting and blood that we never get over. Luis Suarez and his bite debated round the world in the World Cup. Bram Stoker’s Dracula, and the Victorian tale of castles and darkness that we still feel at our throats. That story has had amazing staying power. “I want to suck your blood!” and all the rest. Built off the story of Transylvania’s real Vlad the Impaler. Back to Europe’s long struggle with the Turkish caliphate. The story never dies. This hour On Point: the history and myth of literature’s great vampire — Dracula.

Writer, artist, and photographer Karen Emslie writes from first-person experience about the terror — and bliss — of sleep paralysis (while holding to a reductive neurobiological understanding of the phenomenon): “[S]leep paralysis has naturally spawned some very scary stories and films. But as a writer and filmmaker as well as a long-time percipient, I have another story to tell. Beyond the sheer terror, sleep paralysis can open a doorway to thrilling, extraordinary, and quite enjoyable altered states.”

Welcome to the new Gilded Age: “We haven’t just gone back to nineteenth-century levels of income inequality, we’re also on a path back to ‘patrimonial capitalism,’ in which the commanding heights of the economy are controlled not by talented individuals but by family dynasties.”

Welcome to the end of night: “An eternal electric day is creeping across the globe, but our brains and bodies cannot cope in a world without darkness.”

Beware modern medicine, which decreases our chance of a good death: “Death used to be a spiritual ordeal; now it’s a technological flailing. We’ve taken a domestic and religious event, in which the most important factor was the dying person’s state of mind, and moved it into the hospital and mechanized it, putting patients, families, doctors, and nurses at the mercy of technology.”

If you keep planning, but failing, to ditch Facebook and other social media, maybe this explains: “It is hard to resist a technology that is also a tool of pleasure. The Luddites smashed their power looms, but who wants to smash Facebook — with all one’s photos, birthday greetings, and invitations? New digital technologies, particularly social media, make money by encouraging us to spend our lives on their platforms.”

Worried about the rise of Big Data? Then good news! The whole field may be total bullshit: “At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, [the major claims about Big Data] can be ‘complete bollocks. Absolute nonsense.'”

My first publication in print form came in the 2002 Del Rey horror anthology The Children of Cthulhu, where I was honored to share book space with many authors whom I had long admired. Among these was Alan Dean Foster, some of whose earlier work, including several of his Star Wars novels and the story basis he provided for Star Trek: The Motion Picture, played an important part in my science fiction education during adolescence. Then in 2011 I had the distinct pleasure of meeting and chatting with him at MythosCon. So he’s always on my radar, and that’s why it’s nice to come across some high praise for his classic novelization of Alien, which is now out in a 35th Anniversary Edition.

The presiding reflection for today’s offering of recommended and necessary reading and viewing comes from British novelist and essayist Tim Parks, who elicits an important truth about the silence that so many of us seek, or say we think we seek, amid a culture of clamor:

Arguably, when we have a perception of being tormented by noise, a lot of that noise is actually in our heads — the interminable fizz of anxious thoughts or the self-regarding monologue that for much of the time constitutes our consciousness. And it’s a noise in constant interaction with modern methods of so-called communication: the internet, the mobile phone, Google glasses. Our objection to noise in the outer world, very often, is that it makes it harder to focus on the buzz we produce for ourselves in our inner world.

. . . Our desire for silence often has more to do with an inner silence than an outer. Or a combination of the two. Noise provokes our anger, or at least an engagement, and prevents inner silence. But absence of noise exposes us to the loud voice in our heads. This voice is constitutive of what we call self. If we want it to fall silent, aren’t we yearning for the end of self? For death, perhaps. So talk about silence becomes talk about consciousness, the nature of selfhood, and the modern dilemma in general: the desire to invest in the self and the desire for the end of the self.

. . . What silence and meditation leaves us wondering, after we stand up, unexpectedly refreshed and well-disposed after an hour of stillness and silence, is whether there isn’t something deeply perverse in this culture of ours, even in its greatest achievements in narrative and art. So much of what we read, even when it is great entertainment, is deeply unhelpful.

* * *

Taken (The New Yorker)
Under civil forfeiture, Americans who haven’t been charged with wrongdoing can be stripped of their cash, cars, and even homes. Is that all we’re losing?

Big Banks Conspiracy is Destroying America (Paul B. Farrell for Marketwatch)
How market manipulation is now standard policy as all banks have become Goldman wannabes. “Goldman Sachs has become just another second-stringer in the new global Big Banks Conspiracy as capitalism appears about to self-destruct Adam Smith’s ideal and trigger the third major market crash of the 21st century, followed by a collapse of the economy, driving America and the world deep into a new Great Depression. Be prepared.”

“Organic stories” (The New Inquiry)
Thoughts inspired by Facebook’s recent tweak of its News Feed algorithm. “Facebook is like a television that monitors to see how much you are laughing and changes the channel if it decides you aren’t laughing hard enough. It hopes to engrain in users the idea that if your response to something isn’t recordable, it doesn’t exist, because for Facebook, that is true. Your pleasure is its product, what it wants to sell to marketers, so if you don’t evince it, you are a worthless user wasting Facebook’s server space. In the world according to Facebook, emotional interiority doesn’t exist.”

Gothic Nightmares: Fuseli, Blake and the Romantic Imagination (Tate)
“Gothic Nightmares explores the work of Henry Fuseli (1741–1825) and William Blake (1757–1827) in the context of the Gothic — the taste for fantastic and supernatural themes which dominated British culture from around 1770 to 1830. Featuring over 120 works by these artists and their contemporaries, the exhibition creates a vivid image of a period of cultural turmoil and daring artistic invention.”

We all have dreaming minds, and we are all capable of being terrified (Tate Etc.)
A conversation between novelists Patrick McGrath and Louise Welsh, held in conjunction with the above-linked Tate exhibit, about horror, gothicism, Fuseli’s “Nightmare,” and the way the dark-dreaming mind has given rise to a genre that has attracted writers, filmmakers, musicians and artists across the centuries.

‘Angel’ priest visits Missouri accident scene (USA Today)
A pointedly Fortean or Harpurian (think Daimonic Reality) incident that has achieved shocking media prominence in the past couple of days. “Emergency workers and community members in eastern Missouri are not sure what to make of a mystery priest who showed up at a critical accident scene Sunday morning and whose prayer seemed to change life-threatening events for the positive. . . . Even odder, the black-garbed priest does not appear in any of the nearly 70 photos of the scene of the accident in which a 19-year-old girl almost died.”

Mexican drug cartels whose operatives once rarely ventured beyond the U.S. border are dispatching some of their most trusted agents to live and work deep inside the United States — an emboldened presence that experts believe is meant to tighten their grip on the world’s most lucrative narcotics market and maximize profits. . . . [A] wide-ranging Associated Press review of federal court cases and government drug-enforcement data, plus interviews with many top law enforcement officials, indicate the groups have begun deploying agents from their inner circles to the U.S. Cartel operatives are suspected of running drug-distribution networks in at least nine non-border states, often in middle-class suburbs in the Midwest, South and Northeast. “It’s probably the most serious threat the United States has faced from organized crime,” said Jack Riley, head of the Drug Enforcement Administration’s Chicago office.

. . . . Years ago, Mexico faced the same problem — of then-nascent cartels expanding their power — “and didn’t nip the problem in the bud,” said Jack Killorin, head of an anti-trafficking program in Atlanta for the Office of National Drug Control Policy. “And see where they are now.” Riley sounds a similar alarm: “People think, ‘The border’s 1,700 miles away. This isn’t our problem.’ Well, it is. These days, we operate as if Chicago is on the border.”

. . . . “This is the first time we’ve been seeing it — cartels who have their operatives actually sent here,” said Richard Pearson, a lieutenant with the Louisville Metropolitan Police Department, which arrested four alleged operatives of the Zetas cartel in November in the suburb of Okolona. People who live on the tree-lined street where authorities seized more than 2,400 pounds of marijuana and more than $1 million in cash were shocked to learn their low-key neighbors were accused of working for one of Mexico’s most violent drug syndicates, Pearson said.

. . . . In Chicago, the police commander who oversees narcotics investigations, James O’Grady, said street-gang disputes over turf account for most of the city’s uptick in murders last year, when slayings topped 500 for the first time since 2008. Although the cartels aren’t dictating the territorial wars, they are the source of drugs. Riley’s assessment is stark: He argues that the cartels should be seen as an underlying cause of Chicago’s disturbingly high murder rate. “They are the puppeteers,” he said. “Maybe the shooter didn’t know and maybe the victim didn’t know that. But if you follow it down the line, the cartels are ultimately responsible.”

[EDITOR’S NOTE: Last year I abandoned Google’s search engine (for DuckDuckGo), Google Mail (for Zoho), Google Docs (for various substitutes), and Google Reader (for Netvibes) because of the company’s decision, mentioned by Morozov in this op-ed, to sew together privacy data from more than 60 of its products/services into a single, mega-master One Profile to Rule Them All for each of its users. Here, Morozov lays out some of the more far-reaching intentions behind, and meanings and implications of, Google’s move. Be sure to read his words in the mutually illuminating light of the article directly below about the new Facebook phone.]

Let’s give credit where it is due: Google is not hiding its revolutionary ambitions. As its co-founder Larry Page put it in 2004, eventually its search function “will be included in people’s brains” so that “when you think about something and don’t really know much about it, you will automatically get information”.

Science fiction? The implant is a rhetorical flourish but Mr Page’s utopian project is not a distant dream. In reality, the implant does not have to be connected to our brains. We carry it in our pockets — it’s called a smartphone.

So long as Google can interpret — and predict — our intentions, Mr Page’s vision of a continuous and frictionless information supply could be fulfilled. However, to realise this vision, Google needs a wealth of data about us. Knowing what we search for helps — but so does knowing about our movements, our surroundings, our daily routines and our favourite cat videos.

. . . . [W]hen last year Google announced its privacy policy, which would bring the data collected through its more than 60 online services under one roof, that move made sense. The obvious reason for doing so is to make individual user profiles even more appealing to advertisers: when Google tracks you it can predict what ads to serve you much better than when it tracks you only across one such service.

But there is another reason, of course — and it has to do with the Grand Implant Agenda: the more Google knows about us, the easier it can make predictions about what we want – or will want in the near future. Google Now, the company’s latest offering, is meant to do just that: by tracking our every email, appointment and social networking activity, it can predict where we need to be, when, and with whom. Perhaps, it might even order a car to drive us there — the whole point is to relieve us of active decision-making. The implant future is already here — it’s just not evenly resisted.

[T]he biggest play here is not technical or strategic, but rhetorical. Facebook wants to change the way people think about technologies. . . . Throughout Zuckerberg’s talk, people and Facebook friends were used interchangeably. And for Zuckerberg and his employees, I think this is technically true. For them, all the people they care about are not only on Facebook, but active users who devote time and resources to building digital streams that are legible to other people as their lives. So, while you can read the Facebook phone announcement as the story of the company’s deeper integration with Google’s Android operating system, I also read Facebook Home as a story of the integration that Facebook’s employees have with their own product. And they’d like for the rest of the world to experience what they do.

. . . . Why do I think it is so important not to allow Zuckerberg to redefine “people” as “Facebook friends”? Because we need to be able to evaluate this technology’s impact very specifically within Facebook’s culture and aims. Facebook Home is not a story about “making the world more open and connected,” in general. This a story about Facebook “making the world more open and connected,” with all the specific definitions the company brings to those ideas.

. . . . It’s not that I think Facebook communications are inferior to other ones, whether that’s face-to-face, Twitter, talking on the phone, or standard text messaging. That’s not the point. The point is that they are *not the same* as these other things.

. . . . Will it be worth opening up every part of your phone interaction to Facebook in order to access that experience? Do you want your definition of a computer to center on Facebook Friends and the limited et [sic] of actions you can take with them? I can’t answer that for you, but I can say that it is a tradeoff, and the more you think about it, the better.

[EDITOR’S NOTE: Yes, it’s Morozov again. The man is all but ubiquitous today, and that’s a good thing, because he’s pointedly worth listening to. In the case of this particular piece, he’s pointedly worth listening to very slowly and deeply, because this is some seriously insightful — and darkly, counterculturally revolutionary — stuff that he’s laying out about the hijacking of our collective cultural discourse by a kind of linguistic-conceptual virus that disguises the ideological core assumptions of digital techno-utopianism under a cloak of inevitability, so that any serious critical examination of them becomes literally unthinkable.]

While the brightest minds of Silicon Valley are “disrupting” whatever industry is too crippled to fend off their advances, something odd is happening to our language. Old, trusted words no longer mean what they used to mean; often, they don’t mean anything at all. Our language, much like everything these days, has been hacked. Fuzzy, contentious, and complex ideas have been stripped of their subversive connotations and replaced by cleaner, shinier, and emptier alternatives; long-running debates about politics, rights, and freedoms have been recast in the seemingly natural language of economics, innovation, and efficiency. Complexity, as it turns out, is not particularly viral.

. . . [A] clique of techno-entrepreneurs has hijacked our language and, with it, our reason. In the last decade or so, Silicon Valley has triggered its own wave of linguistic innovation, a wave so massive that a completely new way to analyze and describe the world — a silicon mentality of sorts — has emerged in its wake. The old language has been rendered useless; our pre-Internet vocabulary, we are told, needs an upgrade.

. . . That we would eventually be robbed of a meaningful language to discuss technology was entirely predictable. That the conceptual imperialism of Silicon Valley would also pollute the rest of our vocabulary wasn’t.

The enduring emptiness of our technology debates has one main cause, and his name is Tim O’Reilly. The founder and CEO of O’Reilly Media, a seemingly omnipotent publisher of technology books and a tireless organizer of trendy conferences, O’Reilly is one of the most influential thinkers in Silicon Valley. Entire fields of thought — from computing to management theory to public administration — have already surrendered to his buzzwordophilia, but O’Reilly keeps pressing on. Over the past fifteen years, he has given us such gems of analytical precision as “open source,” “Web 2.0,” “government as a platform,” and “architecture of participation.” O’Reilly doesn’t coin all of his favorite expressions, but he promotes them with religious zeal and enviable perseverance. While Washington prides itself on Frank Luntz, the Republican strategist who rebranded “global warming” as “climate change” and turned “estate tax” into “death tax,” Silicon Valley has found its own Frank Luntz in Tim O’Reilly.

[EDITOR’S NOTE: Stanislav Grof is an icon and a legend in the field of transpersonal psychology, and is one of the field’s founders. H. R. Giger is an icon and a legend in the world of art, having made his mark as a painter, sculptor, and set designer with a genius for the dark and surreal, with his most famous work probably being his Academy Award-winning design of the aliens and their environment in the Alien film franchise, followed closely by his breathtaking semi-Lovecraft-inspired paintings in the 1977 book Necronomicon. In this interview, Grof muses — pun definitely intended — on the transpersonal/transcendent sources of Giger’s inspiration.]

I first encountered his work in Necronomicon, which was a large format, high-quality paperback. I couldn’t believe what I saw. It was absolutely amazing. Now, I have a good understanding of him, not only because we have spent a lot of personal time together, but I had the chance to interview him for many, many hours for the book; and during that time, I was able to find out not only about his life but also about how he works.

It’s extraordinary. Some of his large paintings cover one wall in his house, and these amazing compositions are frequently arranged symmetrically. I found out that particularly when he is working with an airbrush, he has absolutely no idea what he is painting. He just begins in the left upper corner and aims the airbrush at the canvas. Then, as he told me, something just comes through, and he is himself surprised by what emerges.

In discussing Giger’s genius, I quote what Friedrich Nietzsche wrote in Thus Spoke Zarathustra (1885) about his own state of consciousness while creating:

If one had the smallest vestige of superstition left in one, it would hardly be possible to set aside the idea that one is mere incarnation, mouthpiece, or medium of an almighty power. The idea of revelation, in the sense that something, which profoundly convulses and shatters one, become suddenly visible and audible with indescribable certainty and accuracy, describes the simple fact. One hears—one does not seek; one takes—one does not ask who gives; a thought suddenly flashes up like lightning, it comes with necessity, without faltering—I never had any choice in the matter.

In essence, something grabs you and comes through, and you basically become a channel for it. You’re not really the creator of it. You’re a mediator. Hans Rudi certainly falls into that category.

Several years ago, I had the privilege and pleasure to spend some time with Oliver Stone, visionary genius who has portrayed in his films with extraordinary artistic power the shadow side of modern humanity. At one point, we talked about Ridley Scott’s movie Alien and the discussion focused on H. R. Giger, whose creature and set designs were the key element in the film’s success. At the 1979 Academy Awards ceremony, held at the Dorothy Chandler Pavilion in Los Angeles in April 1980, Giger received an Oscar for best achievement in visual effects for his work on Alien.

I have known Giger’s work since the publication of his Necronomicon and have always felt a deep admiration for him, not only as an artistic genius, but also a visionary with an uncanny ability to depict the deep dark recesses of the human psyche revealed by modern consciousness research. In our discussion, I shared my feelings with Oliver Stone, who turned out to be himself a great admirer of Giger. His opinion about Giger and his place in the world of art and in human culture was very original and interesting. “I do not know anybody else,” he said, “who has so accurately portrayed the soul of modern humanity. A few decades from now when they will talk about the twentieth century, they will think of Giger.”

Although Oliver Stone’s statement momentarily surprised me by its extreme nature, I immediately realized that it reflected a profound truth. Since then, I have often recalled this conversation when confronted with various disturbing aspects of Western industrial civilization and with the alarming developments in the countries affected by technological progress. There is no other artist who has captured with equal power the ills plaguing modern society – the rampaging technology taking over human life, suicidal destruction of the ecosystem of the earth, violence reaching apocalyptic proportions, sexual excesses, insanity of life driving people to mass consumption of tranquilizers and narcotic drugs, and the alienation individuals experience in relation to their bodies, to each other, and to nature.

. . . Giger’s art clearly comes from the depth of the collective unconscious, especially when we consider his prolific creative process. He reports that he often has no a priori concept of what a painting would look like. When creating some of his giant paintings, for instance, he started in the upper left corner and aimed the airbrush toward the canvas. The creative force was simply pouring through him, and he became its instrument. And yet the end result was a perfect composition and often showed remarkable bilateral symmetry.

. . . Giger’s determined quest for creative self-expression is inseparable from his relentless self-exploration and self-healing. In the analytic psychology of C. G. Jung, integration of the Shadow and the Anima, two quintessential motifs in Giger’s art, are seen as critical therapeutic steps in what Jung calls the process of individuation. Giger himself experiences his art as healing and as an important way to maintain his sanity. His art can also have a healing impact on those who are open to it because, like a Greek tragedy, it can facilitate powerful emotional catharsis for the viewers by exposing and revealing dark secrets of the human psyche.

This week: How entire U.S. towns now rely on food stamps. The regrets of the Iraqi “sledgehammer man,” whose image became famous in Western media when Saddam’s statue fell. The Obama administration’s epic (and hypocritical) focus on secrecy. The demise of Google Reader and what it portends for Net-i-fied life and culture. The sinister rise of an all-pervasive — and unblinkingly embraced — Orwellian Big Brotherism in the age of Big Data, with a focus on Facebook’s “Like” button, Google Glass, and Google’s vision of “a future of frictionless, continuous shopping.” A surge of ghost sightings and spiritual troubles among survivors of Japan’s earthquake and tsunami. The rise of the “Little Free Libraries” movement in America and abroad. Read the rest of this entry →

The Teeming Brain’s “Recommended Reading” series has been on hiatus since last November. And now it’s back, with a slightly altered/streamlined format (read: no graphics, just links and text) that’s more sustainable in the context of your trusty editor’s various other claims on time, energy, and attention.

[EDITOR’S NOTE: An American ex-pat living in Spain — and deeply loving it — explains how the country’s apocalyptically awful socioeconomic situation is forcing him and his wife to leave. Reading his description of the situation, one can’t help but extrapolate from it and speculate about the possible preview it provides for what will happen, and already is happening, elsewhere, especially since the stated causes and progression of the crisis sound so very familiar here in, e.g., the United States.]

Spanish paro [unemployment] has already surpassed the worst levels of the American Great Depression. The Red Cross recently launched a campaign to combat hunger in Spain, redirecting resources previously dedicated to Haiti. More than one in every four children lives in a household below the poverty line. Things are bad in a way no one could have imagined even five years ago.

. . . . Spain’s unemployment figures depress me because they seem to presage collapse, but the reality of life in a country with so many unemployed is even sadder. Elisa and I relocated from Córdoba to Madrid this past April, and since then almost every day I see a corriente, or average, person rooting around in the trash in search of food — never mind homeless people, who now also have competition at soup kitchens and food banks. The border between the perennially homeless and the newly homeless is increasingly porous and irrelevant.

. . . . What brought Spain to this point? The Spanish economic boom in the years preceding the crisis was a fairy tale we’re all familiar with: subprime mortgages, unchecked speculation, laughable regulation, political complicity — a world built on fictions. The Spanish version had a result even more disastrous than elsewhere because far too many of the country’s economic eggs were in the construction-sector basket.

. . . . On top of the increasingly untenable work situation, the comportment of police in the face of demonstrations is becoming more brutal and frightening. In September we happened to leave Neptune Plaza just minutes before police began beating demonstrators who had nonviolently surrounded the congress. In a restaurant we watched live TV coverage of defenseless people holding up their hands and yet still receiving blows. The next morning a shocking video appeared of police launching projectiles in a train station. A few days later the head of the riot police was awarded a medal by the government.

[EDITOR’S NOTE: The title and subject of this article would tend to invite scorn and skepticism for their seemingly over-the-top invocation of outlandish science fictional-type fears if it weren’t for the fact that, as described by the following excerpts, the imminent rise of autonomous killer robots, as well as the present rise of serious opposition to them by people in positions of authority and respect, really and truly is happening.]

A new global campaign to persuade nations to ban “killer robots” before they reach the production stage is to be launched in the UK by a group of academics, pressure groups and Nobel peace prize laureates. Robot warfare and autonomous weapons, the next step from unmanned drones, are already being worked on by scientists and will be available within the decade, said Dr Noel Sharkey, a leading robotics and artificial intelligence expert and professor at Sheffield University. He believes that development of the weapons is taking place in an effectively unregulated environment, with little attention being paid to moral implications and international law.

The Stop the Killer Robots campaign will be launched in April at the House of Commons and includes many of the groups that successfully campaigned to have international action taken against cluster bombs and landmines. They hope to secure a similar global treaty against autonomous weapons.

“These things are not science fiction; they are well into development,” said Sharkey.

. . . . US political activist Jody Williams, who won a Nobel peace prize for her work at the International Campaign to Ban Landmines, is expected to join Sharkey at the launch at the House of Commons. . . “Killer robots loom over our future if we do not take action to ban them now,” she said. “The six Nobel peace laureates involved in the Nobel Women’s Initiative fully support the call for an international treaty to ban fully autonomous weaponised robots.”

[EDITOR’S NOTE: This is simply riveting, and in a way that explodes an all-too-easy and glib dismissal along the lines of “Yeah, we already know junk food is addictive. So what else is new?” Moss names names and gives specifics in an article, adapted from his new book Salt, Sugar, Fat: How the Food Giants Hooked Us, that sounds like it could blow open the junk food industry in much the same way the famous 1996 Vanity Fair article that served as the basis for Michael Mann’s The Insider blew open the shady world of Big Tobacco.]

The public and the food companies have known for decades now. . . . that sugary, salty, fatty foods are not good for us in the quantities that we consume them. So why are the diabetes and obesity and hypertension numbers still spiraling out of control? It’s not just a matter of poor willpower on the part of the consumer and a give-the-people-what-they-want attitude on the part of the food manufacturers. What I found, over four years of research and reporting, was a conscious effort — taking place in labs and marketing meetings and grocery-store aisles — to get people hooked on foods that are convenient and inexpensive. I talked to more than 300 people in or formerly employed by the processed-food industry, from scientists to marketers to C.E.O.’s. Some were willing whistle-blowers, while others spoke reluctantly when presented with some of the thousands of pages of secret memos that I obtained from inside the food industry’s operations. What follows is a series of small case studies of a handful of characters whose work then, and perspective now, sheds light on how the foods are created and sold to people who, while not powerless, are extremely vulnerable to the intensity of these companies’ industrial formulations and selling campaigns.

. . . . If Americans snacked only occasionally, and in small amounts, this would not present the enormous problem that it does. But because so much money and effort has been invested over decades in engineering and then relentlessly selling these products, the effects are seemingly impossible to unwind. More than 30 years have passed since Robert Lin first tangled with Frito-Lay on the imperative of the company to deal with the formulation of its snacks, but as we sat at his dining-room table, sifting through his records, the feelings of regret still played on his face. In his view, three decades had been lost, time that he and a lot of other smart scientists could have spent searching for ways to ease the addiction to salt, sugar and fat. “I couldn’t do much about it,” he told me. “I feel so sorry for the public.”

Teaser: A revolution in technology is allowing previously inanimate objects—from cars to trash cans to teapots—to talk back to us and even guide our behavior. But how much control are we willing to give up?

In 2010, Google Chief Financial Officer Patrick Pichette told an Australian news program that his company “is really an engineering company, with all these computer scientists that see the world as a completely broken place.” Just last week in Singapore, he restated Google’s notion that the world is a “broken” place whose problems, from traffic jams to inconvenient shopping experiences to excessive energy use, can be solved by technology. The futurist and game designer Jane McGonigal, a favorite of the TED crowd, also likes to talk about how “reality is broken” but can be fixed by making the real world more like a videogame, with points for doing good. From smart cars to smart glasses, “smart” is Silicon Valley’s shorthand for transforming present-day social reality and the hapless souls who inhabit it.

But there is reason to worry about this approaching revolution. As smart technologies become more intrusive, they risk undermining our autonomy by suppressing behaviors that someone somewhere has deemed undesirable. Smart forks inform us that we are eating too fast. Smart toothbrushes urge us to spend more time brushing our teeth. Smart sensors in our cars can tell if we drive too fast or brake too suddenly. These devices can give us useful feedback, but they can also share everything they know about our habits with institutions whose interests are not identical with our own. Insurance companies already offer significant discounts to drivers who agree to install smart sensors in order to monitor their driving habits. How long will it be before customers can’t get auto insurance without surrendering to such surveillance? And how long will it be before the self-tracking of our health (weight, diet, steps taken in a day) graduates from being a recreational novelty to a virtual requirement?

[EDITOR’S NOTE: When somebody of Rushkoff’s status and stature as a commentator on the world of information technology goes and does (and says) something like this, you can know that a sea change is brewing.]

I used to be able to justify using Facebook as a cost of doing business. As a writer and sometime activist who needs to promote my books and articles and occasionally rally people to one cause or another, I found Facebook fast and convenient. Though I never really used it to socialize, I figured it was okay to let other people do that, and I benefited from their behavior.

I can no longer justify this arrangement. Today I am surrendering my Facebook account, because my participation on the site is simply too inconsistent with the values I espouse in my work. In my upcoming book Present Shock, I chronicle some of what happens when we can no longer manage our many online presences. I argue — as I always have — for engaging with technology as conscious human beings, and dispensing with technologies that take that agency away.

Facebook is just such a technology. It does things on our behalf when we’re not even there. It actively misrepresents us to our friends, and — worse — misrepresents those who have befriended us to still others. To enable this dysfunctional situation — I call it “digiphrenia” — would be at the very least hypocritical. But to participate on Facebook as an author, in a way specifically intended to draw out the “likes” and resulting vulnerability of others, is untenable.

Facebook has never been merely a social platform. Rather, it exploits our social interactions the way a Tupperware party does. Facebook does not exist to help us make friends, but to turn our network of connections, brand preferences, and activities over time — our “social graphs” — into a commodity for others to exploit. We Facebook users have been building a treasure trove of big data that government and corporate researchers have been mining to predict and influence what we buy and whom we vote for. We have been handing over to them vast quantities of information about ourselves and our friends, loved ones and acquaintances. With this information, Facebook and the “big data” research firms purchasing their data predict still more things about us.

The word “Luddite” continues to be applied with contempt to anyone with doubts about technology, especially the nuclear kind. Luddites today are no longer faced with human factory owners and vulnerable machines. As well-known President and unintentional Luddite D.D. Eisenhower prophesied when he left office, there is now a permanent power establishment of admirals, generals and corporate CEO’s, up against whom us average poor bastards are completely outclassed, although Ike didn’t put it quite that way. We are all supposed to keep tranquil and allow it to go on, even though, because of the data revolution, it becomes every day less possible to fool any of the people any of the time.

If our world survives, the next great challenge to watch out for will come — you heard it here first — when the curves of research and development in artificial intelligence, molecular biology and robotics all converge. Oboy. It will be amazing and unpredictable, and even the biggest of brass, let us devoutly hope, are going to be caught flat-footed. It is certainly something for all good Luddites to look forward to if, God willing, we should live so long. Meantime, as Americans, we can take comfort, however minimal and cold, from Lord Byron’s mischievously improvised song, in which he, like other observers of the time, saw clear identification between the first Luddites and our own revolutionary origins. It begins:

As the Liberty lads o’er the sea
Bought their freedom, and cheaply, with blood,
So we, boys, we
Will die fighting, or live free,
And down with all kings but King Ludd!

Modern science has achieved high credibility and prestige, not only for its intellectual plausibility, but because of its immense practical successes. Modern science, and the technology it has made possible, has fundamentally changed the circumstances of human life on this planet. One result of this has been the ideology of scientism, which asserts that science is the only valid avenue to truth. On the part of believers there has been the understandable impetus to present belief itself as being based on science. The prototypical figure in this has been Mary Baker Eddy, founder of a denomination aptly called Christian Science, with Jesus transformed into someone called Christ, Scientist. Not only does this do violence to the Jesus found in the New Testament, but equally so to science as an intellectual discipline. In the same line there have been attempts to establish a Christian economics, a Christian sociology, and so forth. Such constructions are as implausible as a Christian geology, or a Christian dermatology.

But there is something more fundamental involved in all of this: The refusal to accept the fact that there is more than one way to perceive reality.

. . . . If [the historian] wants to claim the status of “science” for his discipline, he has no alternative to following in the “naturalistic tradition”. The acts of God (miraculous or otherwise) cannot be empirically investigated or falsified. How the historian then looks at the same phenomenon, such as a Biblical account of ancient events, will obviously depend on his theology. If he believes in Biblical inerrancy — every sentence is literally true — he will definitely have some serious problems. But there are other, more flexible ways of looking for revelation “in, with and under” the Biblical text. In that case, even the most rigorous historical scholarship cannot undermine the approach of faith.

The Extinction Papers – Chapter Two

I am routinely wrong about many things. The enduring popularity of televised talent shows. The assured success of former Raider Bill Callahan as the new head coach of my 2004 Nebraska Cornhuskers. The viability of something called Twitter. While the second one caused me more pain (barely edged out by the first), the last might be my biggest miss as a cynical and formerly smug prognosticator.

From what I knew of Twitter at the time, I just couldn’t imagine that this insignificant and seemingly limited tentacle of social media would be embraced, let alone last long enough to metastasize into a societal norm, and even a verb (“tweeting” <shudder>). Allowing one to send out uninteresting life updates in 140 characters from the line at the grocery store (“Ugh! I’m SO ANNOYED by people who pay for their cat food with checks! FML!”), or the gym (“Just ripped off 15 reps at 230 on bench, bruh . . . Feeling pumped”), or from their own living room (“Watching re-runs of ‘Cagney & Lacey’ on Oxygen, y’all, and gotta’ admit, Tyne Daly is at the top of her game”), just didn’t seem to have any cachet, let alone meaning. Even with the proliferation of insipid reality programming, I still didn’t foresee the voracious interest in the mundane minutiae of the lives of everyday people. I had no idea that sharing random thoughts on traffic lights or a blurry phone pic of what one is about to eat for lunch would enthrall a nation, let alone a world. One would assume that a so-called enlightened civilization would have more important things to occupy its hopefully expanding brains than your college roommate’s recent sock purchase at Target.

But, I was wrong. Lords of Light, was I ever wrong. People dig this shit. CRAVE this shit. JOIN IN on this shit.

So I sat, baffled — with my quiet, unintelligent phone stowed somewhere in my bag — by the explosion of Twitter and the flood of tweets that were now an essential part of seemingly everyone’s daily lives. And baffled I remained, until I remembered that in the 21st century, EVERYONE wants to be famous and recognized, even if only amongst a small group of friends, family, and online acquaintances. This is the era where fame trumps all, trampling the desire for talent, happiness, and stability, and just barely edging out success. Fame is king, queen, emperor, and god. As such, it attracts acolytes of the Status Cult, who routinely have sacrificed and will sacrifice anything upon the freshly stained, newly hewn titanium altar to achieve immortality, which these days can last only a few minutes, falling far short of that promised Golden Fifteen.

One of the most subtle and subversive pieces of social criticism in Fahrenheit 451 comes early in the book when Montag, a fireman (i.e., book burner) who eventually wakes up to a recognition of his society’s essential character as a fascist-totalitarian dark age, chats with a teenaged girl named Clarisse. Or rather, it’s she who chats with him. The dumbed-down denizens of Bradbury’s keenly envisioned future dystopia of ignorance, repression, distraction, and dissipation are more fond of television, music, games, sports, sedatives, and other amusements than they are of real human contact, and when Clarisse suddenly shows up, introduces herself, and begins talking to Montag on a succession of evenings as he walks home from work, he’s considerably discomfited. But he finds her intriguing, and eventually he comes to look forward to their talks, so that when she unexpectedly disappears — presumably having been taken away by the repressive central government (a suspicion that’s confirmed later in the novel) — he’s deeply disturbed by it.

At one point in their conversations, he asks her why she isn’t in school. Her response reflects a profound inversion and perversion of what it means to be “antisocial” as judged by the surrounding society:

“Oh, they don’t miss me,” she said. “I’m antisocial, they say. I don’t mix. It’s so strange. I’m very social indeed. It all depends on what you mean by social, doesn’t it? Social to me means talking to you about things like this.” She rattled some chestnuts that had fallen off the tree in the front yard. “Or talking about how strange the world is. Being with people is nice. But I don’t think it’s social to get a bunch of people together and then not let them talk, do you? An hour of TV class, an hour of basketball or baseball or running, another hour of transcription history or painting pictures, and more sports, but do you know, we never ask questions, or at least most don’t; they just run the answers at you, bing, bing, bing, and us sitting there for four more hours of film-teacher. That’s not social to me at all. It’s a lot of funnels and a lot of water poured down the spout and out the bottom, and them telling us it’s wine when it’s not. They run us so ragged by the end of the day we can’t do anything but go to bed or head for a Fun Park to bully people around … I guess I’m everything they say I am, all right. I haven’t any friends. That’s supposed to prove I’m abnormal.”

Although Bradbury’s critique in this passage is aimed largely at the public school system, his description of Clarisse’s ironic plight, in which her authentic human sociability earns her the label “antisocial” — a label that, as the book later shows, is tantamount to a criminal charge in this particular (semi-)fictional dystopia — has wider resonances in today’s world of cultural dominance by social media. In fact, we may be seeing a similar inversion and perversion of language and values play out right before our eyes at this very cultural moment.

About

The Teeming Brain explores news, trends, and developments in religion, horror, science fiction, fantasy, the paranormal, creativity, consciousness, and culture. It also tracks apocalyptic and dystopian trends in science, technology, politics, ecology, economics, the media, the arts, education, and society at large. Its founder and primary author is Matt Cardin.

