Thursday, August 29, 2013

There were several stories on the BBC last week that at first glance appear unrelated, but with a deeper perspective all deal with the same thing. All of the stories share a similar structure. First, they optimistically discuss some sort of new scientific miracle or breakthrough. Then, they explain why the new innovation or idea is so urgent, usually because of some dire need, with plenty of statistics to back it up. Then they explain the downsides and stumbling blocks to adoption of the new idea. And finally, there are often details which show just how ridiculous these innovations are, and how inferior they are to alternatives.

Now what makes these stories all the same? The simple fact that all of these scientific breakthroughs and innovations are desperate measures – they are invented to deal with the dire situations that a previous scientific innovation has unleashed upon the world – overpopulation, ecosystem destruction, pollution, climate change, desertification, malnutrition, fossil fuel dependence, and so on. Or, they are ways to keep the existing status quo going at all costs. They are all solid evidence of the fact that we need to innovate ever faster in order to avoid disaster thanks to past innovations.

These articles are all illustrations of the vicious circle principle - the idea that we need to innovate ever faster to keep disaster from occurring. Every time we innovate, we need to make society more complex, and the stakes for failure get higher. At the same time, the vast majority of people are worse off, with only a few elites at the top of the pyramid benefiting. This has been the case since the invention of agriculture with one exception – the extraordinary bounty created by the harnessing of fossil fuels and the application of science. But those abilities are increasingly being applied to fix problems caused by fossil fuel use, such as climate change, overpopulation, resource scarcity, and economic collapse.

Scientists took cells from a cow and, at an institute in the Netherlands, turned them into strips of muscle that they combined to make a patty... The burger was cooked by chef Richard McGeown, from Cornwall, and tasted by food critics Hanni Ruetzler and Josh Schonwald.

Stem cells are the body's "master cells", the templates from which specialised tissue such as nerve or skin cells develop. Most institutes working in this area are trying to grow human tissue for transplantation to replace worn-out or diseased muscle, nerve cells or cartilage. Prof Post is using similar techniques to grow muscle and fat for food. He starts with stem cells extracted from cow muscle tissue. In the laboratory, these are cultured with nutrients and growth-promoting chemicals to help them develop and multiply.

Three weeks later, there are more than a million stem cells, which are put into smaller dishes where they coalesce into small strips of muscle about a centimetre long and a few millimetres thick. These strips are collected into small pellets, which are frozen. When there are enough, they are defrosted and compacted into a patty just before being cooked. Because the meat is initially white in colour, Helen Breewood - who works with Prof Post - is trying to make the lab-grown muscle look red by adding the naturally-occurring compound myoglobin.

Yum! And why do we need to turn to a fossil-fuel powered university laboratory to procure animal flesh, something humans have been doing for literally thousands of years, and harvesting from nature for perhaps millions of years before that?

Researchers say the technology could be a sustainable way of meeting what they say is a growing demand for meat.... The world's population is continuing to increase and an ever greater proportion want to eat meat. To meet that demand farmers will need to use more energy, water and land - and the consequent increase in greenhouse gas emission will be substantial... An independent study found that lab-grown beef uses 45% less energy than the average global representative figure for farming cattle. It also produces 96% fewer greenhouse gas emissions and requires 99% less land.

Ah, I get it now. We're running out of land, water, and energy, while the demand for meat keeps growing. So we're kind of up against it, aren't we? But rather than address the root causes of this situation, we're going to grow meat in a lab like in some sort of bad science-fiction novel from the seventies. And we can imagine there is some sort of billionaire funding all of this:

Sergey Brin, co-founder of Google, has been revealed as the project's mystery backer. He funded the £215,000 ($330,000) research.

Of course he did. But I’m guessing Mr. Brin can afford the finest Kobe beef imported from Japan; petri dish meat is what the rest of us are going to eat. And no doubt it will be a good deal less tasty than what Mr. Brin has for dinner:

One food expert said it was "close to meat, but not that juicy" and another said it tasted like a real burger. Upon tasting the burger, Austrian food researcher Ms Ruetzler said: "I was expecting the texture to be more soft... there is quite some intense taste; it's close to meat, but it's not that juicy. The consistency is perfect, but I miss salt and pepper. "This is meat to me. It's not falling apart." Food writer Mr Schonwald said: "The mouthfeel is like meat. I miss the fat, there's a leanness to it, but the general bite feels like a hamburger. "What was consistently different was flavour."

There are usually a few curmudgeons with crazy ideas about reforming the social system who can safely be ignored:

Critics of the technology say that eating less meat would be an easier way to tackle predicted food shortages... Prof Tara Garnett, head of the Food Policy Research Network at Oxford University, said decision-makers needed to look beyond technological solutions. "We have a situation where 1.4 billion people in the world are overweight and obese, and at the same time one billion people worldwide go to bed hungry," she said. "That's just weird and unacceptable. The solutions don't just lie with producing more food but changing the systems of supply and access and affordability, so not just more food but better food gets to the people who need it."

There are just a few obstacles to be overcome, though, before we can save the world with our latest techno-fix:

Currently, this is a work in progress. The burger revealed on Monday was coloured red with beetroot juice. The researchers have also added breadcrumbs, caramel and saffron, which were intended to add to the taste, although Ms Ruetzler said she could not taste these. At the moment, scientists can only make small pieces of meat; larger ones would require artificial circulatory systems to distribute nutrients and oxygen.

Prof Mark Post, of Maastricht University, the scientist behind the burger, remarked: "It's a very good start."

As meat prices rise, consumers may change their minds, but this is an argument insect enthusiasts are wary of pushing too hard. "What we don't want people to think, which has been some people's interpretation, is that insects are just meat for those who don't have money," says Eduardo Galante, professor of zoology at Alicante University and an expert in entomology. "When you look at the comments below news articles on the internet here in Spain you can see that lots of people have got the message that 'salaries are going down, economically we're doing really badly, and we're going to have to eat insects.' It's very important how we transmit the message. I'm convinced that, if one day an important chef decides to put insects on the menu here, then from that day on people will eat insects...

It's true that the movement of westerners towards eating insects has been dwarfed by the movement of people in the rest of the world towards meat. According to the FAO, in 1961 the Chinese consumed 3.6kg of meat per person. In 2002 that figure had increased to 52.4kg. If the world's two billion entomophagists were to decide that insects are disgusting, there's no way of producing enough meat to feed them all. Even vegetarians will find prices of staples rising as more land is set aside for animal feed.

The "dystopian" future of meat for the rich, insects for the poor, however, has to be better than "meat for the rich, nothing for the poor". Either way, the mission of groups such as Ento and the Nordic Food Lab could be more important than we think. If the new question is "how do we eat insects?" or, indeed, "how do we cook lab-burgers?", then it's chefs, not biologists or chemists, who will have to lead the way.

Quite a long way from hunting and gathering, eh? Not so much, really: most likely, pre-humans like Australopithecus and earlier human ancestors relied on insects as sources of protein before they developed the large brains required for tool use, fire, and big-game hunting. Progress!

Scientists in the Philippines are weeks from submitting a genetically modified variety of rice to the authorities for biosafety evaluations. They claim it could be in the fields within a year, but national regulators will have the final say. Supporters say it will help the 1.7 million Filipino children who suffer vitamin A deficiency - which reduces immunity and can cause blindness. But campaigners say "Golden Rice" is a dangerous way to tackle malnutrition. They say that it threatens the Philippines' staple food.

So we need to genetically modify rice to prevent vitamin A deficiency, eh? And why, exactly, are 1.7 million children suffering from vitamin A deficiency (a number that would have constituted a significant portion of the human race in pre-agricultural times) in the first place, such that we need to start screwing around with the genetic structure of our staple grains?

Rice is by far the most important crop in the Philippines, with the average Filipino eating 100kg (dry weight) per year. Two thirds of households don't eat enough to meet their dietary energy needs, and most of the calories they do get come from rice.

Aha. So this techno-fix is intended to rectify malnutrition caused by poverty. But we're not going to fix the poverty, are we? This innovation, like the one above, has been a long time in the making and has required a lot of resources to pull off:

It has taken scientists more than two decades to boost the beta-carotene in Golden Rice to meaningful levels. But Dr Antonio Alfonso, who leads the project at PhilRice, says the product is now ready.

Oh, so twenty years later it’s finally ready. That’s good. Of course the powers that be want genetically engineered rice to be the answer, since people will be dependent upon solutions provided by agribusiness and engineered in labs. While they claim that this is not being done for profit, it's hard not to see how we will become dependent on the major corporations that have the money to genetically modify seeds, produce pesticides, and so on. They don't want to tackle the fundamental problems that are causing children to experience vitamin A deficiency. That would upset the apple cart in some way, and that is unacceptable. And while it's not mentioned in the article, I know from other sources that yet another tech billionaire, Bill Gates, is one of the main backers, in the interests of "philanthropy" of course.

Once again, there are some critics:

In fields outside the town of Tayabas in south Luzon, Dr Chito Medina, national coordinator of charity MASIPAG, is working with farmers to improve the diversity of their crops using organic growing techniques. He argues that a more diverse harvest contains naturally high levels of Vitamin A and other nutrients, making Golden Rice redundant. "Malnutrition is a broader issue, therefore the solution needs to be broader also," he explained. "The more important thing is alleviating poverty, providing more diverse seeds to farmers so they can grow more diverse crops and having more diverse food and a more balanced diet. Then there would be no vitamin deficiencies at all.

"There are so many natural sources of Vitamin A, especially in tropical countries: almost all green and leafy vegetables, yellow vegetables and fruits like mangos and cantaloupes." Dr Medina added: "We have a variety of sweet potato which has five times the level of Vitamin A than there is in Golden Rice. Ecologically, this is more sustainable and it's the way agriculture should be in the future.

"Economically, it generates more income for farmers because there are fewer expenditures: they don't have to buy chemical pesticides, fertilisers or seeds."

Not having to buy chemical pesticides, fertilizers or seeds is not part of the plan. No, we're probably going to go the GM crop route. And what about climate change? What sort of techno-fixes are we talking about here? Well, how about "powdered rain"? Can 'powdered rain' make drought a thing of the past?

The lack of water is a growing, global problem that seems intractable. While the UN estimates that a large majority of the water we use goes on irrigation, researchers have been working on a range of ideas that make the water we use in agriculture last longer.

There has been a great deal of excitement and some dramatic headlines in recent weeks about a product that is said to have the potential to overcome the global challenge of growing crops in arid conditions. "Solid Rain" is a powder that's capable of absorbing enormous amounts of water and releasing it slowly over a year so that plants can survive and thrive in the middle of a drought.

So we'll use this solid rain made in labs to replace the natural rain that's no longer occurring thanks to climate change? Wonderful. Once again, some people are skeptical:

But not everyone is convinced that Solid Rain is a significant solution to the problem of drought. Dr Linda Chalker-Scott from Washington State University says that these types of products have been known to gardeners for several years now. "They're hardly new, and there's no scientific evidence to suggest that they hold water for a year, or last for 10 years in the soil," she told BBC News. "An additional practical problem is that gels can do as much harm as good. As the gels begin to dry out, they soak up surrounding water more vigorously. That means they will start taking water directly from plant roots," she added. Dr Chalker-Scott says that research she carried out in Seattle with newly transplanted trees showed that wood chip mulching was just as effective as adding powdered materials and gels to the soil. And it was significantly cheaper.

But mulch is not as profitable, is it? Lab-grown meat, Golden Rice, Solid Rain; these all come from the high-tech laboratories of the world’s corporations. And they are all innovations desperately designed to keep the status quo going at all costs. And speaking of high-tech laboratories: Critical phase for Iter fusion dream:

Since the 1950s, fusion has offered the dream of almost limitless energy - copying the fireball process that powers the Sun - fuelled by two readily available forms of hydrogen. The attraction is a combination of cheap fuel, relatively little radioactive waste and no emissions of greenhouse gases.

But the technical challenges of not only handling such an extreme process but also designing ways of extracting energy from it have always been immense. In fact, fusion has long been described as so difficult to achieve that it's always been touted as being "30 years away".

I don’t think we need to elaborate too much on the fundamental reasoning behind this research – trying to come up with enough power to keep global civilization growing and expanding in the age of fossil fuel depletion. Fracking, which could be the ultimate example of “innovation” being used to preserve the status quo despite horrible negative consequences, also figured in the BBC news for that week: Fracking: Water concerns persist?

Recent studies from the US have again raised questions about the impact of hydraulic fracturing, or fracking, on water supplies. These show that chemicals, including methane and arsenic, have been found more often in water wells near natural gas extraction sites. Despite this, the actual causes of the contamination are not clear.

In the UK, water companies say drinking supplies must be protected "at all costs" if fracking becomes commonplace. Hydraulic fracturing's impact on water is a concern because the gas and oil they aim to extract are normally quite deep in the earth compared with our drinking supplies. The fracking pipes go through the drinking water aquifers and there are worries that any cracks in the lining of the drilling wells could contaminate supplies.

And as for high-tech, the challenges of fusion make lab grown meat and GM rice look like stone-age technology by comparison:

It will involve creating a plasma of superheated gas reaching temperatures of more than 200 million C - conditions hot enough to force deuterium and tritium atoms to fuse together and release energy. The whole process will take place inside a giant magnetic field in the shape of a ring - the only way such extreme heat can be contained.

The plant at JET has managed to achieve fusion reactions in very short bursts but required the use of more power than it was able to produce. The reactor at Iter is on a much larger scale and is designed to generate 10 times more power - 500 MW - than it will consume.

The unique chemical structure of the compound could lead to a new class of antibiotic medicines. Thomas Frieden, director of the US Centers for Disease Control and Prevention, recently warned of the risk posed by antibiotic-resistant "nightmare" bacteria while Sally Davies, UK Chief Medical Officer, described them as a "ticking time bomb" that threatens national security. The Infectious Disease Society of America has expressed concern that the rate of antibiotic development to counter resistance is insufficient. This makes this latest discovery particularly welcome news.

Much of the time, the point that so much of our scientific research is desperately trying to fix conditions caused by earlier discoveries, or to prolong the status quo, is rather academic. But here we can see a whole host of stories proving the point beyond any doubt, all in one week in one British publication! (okay, two). Genetically modified rice, lab-grown meat, insect farming, fusion reactors, hydraulic fracturing, GM crops, trawling the sea bed for antibiotics; this ought to give you a good picture of where we’re headed as a civilization. These are not new breakthroughs that are raising the living standards of the world's population. They are desperate measures, aimed at averting disaster. Increasingly, this is the subject of our “innovation” (that, or making products nobody needs and convincing us we want them).

What if in the Philippines we encouraged diverse perennial polycultures? What if we integrated animals into these perennial polycultures? What if we encouraged natural pest control, including eating the pests? What if we encouraged water conservation through terracing, mulch and storage? What if we protected the tree canopy through agroforestry? What if we allowed small farmers to get good prices for what they grow instead of exporting it to foreign markets? What if we used these farms to harvest energy sources like wood and bamboo, along with wind and solar power? What if we protected the natural environment? This would solve most of the problems the techno-fixes above are meant to address.

But those types of solutions are not on the table are they? They are not considered "innovations" that will save us. So the next time you read a story in the media of some sort of miracle techno-fix solution, ask yourself: what fundamental problem is it trying to solve, is there a simpler/easier way that gets to the root of the problem, and who benefits? I think you'll look at a lot of stories very differently.

In the Philippines, coconut oil was until recently used solely for roasting and baking. Now, it’s used to produce vast quantities of biodiesel.

Romulo Arancon, executive director of the Asian and Pacific Coconut Community, says using coconuts to produce fuel cuts down on the high cost of importing fossil fuels. “Using coconut oil instead of diesel will make countries more and more independent. Now we’re trying to increase the coconut harvest without destroying the environment. It’s important that making biodiesel does not compete with food plants,” said Arancon.

The eco-friendly fuel is not only cheaper but also releases a minimal amount of CO2. Plus, coconut-based diesel oil is also said to smell much better than the traditional variety.

Monday, August 26, 2013

So David Autor and David Dorn have a piece in The New York Times this weekend about how technology is destroying middle class jobs and “polarizing” the workforce into high-wage and low-wage occupations. We’re back to my society of nurses, cooks and engineers. Nothing we haven’t talked about before, but it’s good to see the discussion taken to a wider audience:

So computerization is not reducing the quantity of jobs, but rather degrading the quality of jobs for a significant subset of workers. Demand for highly educated workers who excel in abstract tasks is robust, but the middle of the labor market, where the routine task-intensive jobs lie, is sagging. Workers without college education therefore concentrate in manual task-intensive jobs — like food services, cleaning and security — which are numerous but offer low wages, precarious job security and few prospects for upward mobility. This bifurcation of job opportunities has contributed to the historic rise in income inequality.

This ties in nicely with David Graeber’s essay about the rise of bullshit jobs. These “bullshit jobs” are one category of jobs that we seem to be creating in spades. While these “creative managerial” types are celebrated by economists, their salaries have to come from somewhere, and most of it comes from gouging the general public, especially through the costs of college and healthcare, and to some extent from certain government positions, particularly at the federal level. Most of these “creative managerial” tasks that Autor and others like him describe are total bullshit: moving money around, filing unnecessary paperwork, sitting through meetings, taking clients to sporting events, convincing people to buy stuff they don’t want or need, frauds and scams; things like that.

The other note is to highlight Matt Taibbi’s latest exposé summarizing the college loan ripoff being perpetrated on the American public. So while we claim that “professional and managerial” occupations are the wave of the future, we charge a king’s ransom to be part of this class. We claim that college is the new high school degree, but we turn people into debt serfs to get the “work chit” of a college degree. What sense does this make?

School, health care - it seems like in America every necessary institution in the running of a modern society is swiftly transformed into some sort of massive grift for the benefit of a few insiders while society decays around them. I doubt all this college has really made anyone smarter. In the past, people just read books and educated themselves. All it has led to is debt and grade inflation. Many people have noted that the average eighth grade textbook from a hundred years ago is as challenging as college material today. Start funding college like high school (Salon)

The cause is more fundamental than the cycles of the economy: The country is turning out far more college graduates than jobs exist in the areas traditionally reserved for them: the managerial, technical and professional occupations. The Bureau of Labor Statistics tells us that we now have 115,000 janitors, 83,000 bartenders, 323,000 restaurant servers, and 80,000 heavy-duty truck drivers with bachelor’s degrees -- a number exceeding that of uniformed personnel in the U.S. Army.

Note also that productivity gains are being kept entirely by the ownership class, with workers getting none of it, as this Naked Capitalism piece asserts. Again I ask: who are the real makers and who the real takers? This has little to do with automation and more to do with a vast surplus of labor. The lump of labor fallacy is not a fallacy.

I’d also like to point out a simple logical fallacy in economic arguments. Economists always claim mass immigration does nothing bad to wages, because all of those immigrant workers are also consumers, so they expand the size of the overall economy, cancelling out any negative effects of labor surplus. However, economists insist that paying more wages to workers will ruin the economy, despite the fact that those same workers are also consumers. Somehow paying more in wages to people who actually work for a living won't expand the economy, unlike mass immigration. Hmmm.

Scientists have long surmised that moods affect health. But the underlying cellular mechanisms were murky until they began looking at gene-expression profiles inside white blood cells. Gene expression is the complex process by which genes direct the production of proteins. These proteins jump-start other processes, which in the case of white blood cells control much of the body’s immune response.

It turned out that different forms of happiness were associated with quite different gene-expression profiles. Specifically, those volunteers whose happiness, according to their questionnaires, was primarily hedonic, to use the scientific term, or based on consuming things, had surprisingly unhealthy profiles, with relatively high levels of biological markers known to promote increased inflammation throughout the body. Such inflammation has been linked to the development of cancer, diabetes and cardiovascular disease. They also had relatively low levels of other markers that increase antibody production, to better fight off infections.

The volunteers whose happiness was more eudaemonic, or based on a sense of higher purpose and service to others — a small minority of the overall group — had profiles that displayed augmented levels of antibody-producing gene expression and lower levels of the pro-inflammatory expression.

What this finding indicates, says Steven W. Cole, a professor of medicine at U.C.L.A. and senior author of the study, published last month in The Proceedings of the National Academy of Sciences, is that “our genes can tell the difference” between a purpose-driven life and a shallower one even when our conscious minds cannot. Of course, genes cannot actually perceive or judge our behavior, so the shift in gene expression is very likely driven by an evolutionary strategy of working for the common good.

If Americans exercised more and ate and smoked less, this conventional wisdom holds, the United States would surely start moving up in the global health rankings.

But many epidemiologists — scientists who study health outcomes — have their doubts. They point out that the United States ranked as one of the world’s healthiest nations back in the 1950s, a time when Americans smoked heavily, ate a diet that would horrify any 21st-century nutritionist, and hardly ever exercised.

And none of these determinants matter more, these researchers contend, than economic inequality, the divide between the affluent and everyone else. Over 170 studies worldwide have so far linked income inequality to health outcomes. The more unequal a modern society, the studies show, the more unhealthy most everyone in it — and not the poor alone.

We're constantly told that the digital world we have at our fingertips is the crowning achievement of our civilization. But what if it's incompatible with the good life?

Indeed, tech anxiety abounds. And I take it seriously. Some people feel something is amiss in their relationships, and that technology is to blame. There's a move, cataloged in nearly every magazine, towards seeing the offline as authentic and the online as hollow, false, unreal. This may be a false distinction, digital dualism, as Nathan Jurgenson calls it, but it's a widespread reaction to the technologies at hand. What was once an exciting new way to make friends now feels overengineered, or -- more damningly in the current climate -- processed.

Processed foods were once the time-saving, awe-inducing markers of an upwardly mobile household. (Check out this ad for dextrose.) Now, among the upper middle classes, they're a sure sign that someone does not have a firm grip on what the good life is. Processed food, Michael Pollan would tell you, is not even really food at all. And it tangles you up in huge economic webs that stretch across the globe. So while Farm Bill politics make larger-scale solutions impractical, the answer, mostly, is to eat local, organic food -- prepared like Grandma would.

This logic has been extended to digital friendships. Processed relationships get scare quotes: Facebook "friends." Processed relationships can't be as genuine or authentic or honest as real life friendships. Processed relationships generate data for Facebook and Twitter and Google and the NSA. So the solution is to make local friends, hang out organically, and only communicate through means your Grandma would recognize. It's so conservative it's radical!

But our brains are designed to more easily be stimulated than satisfied. "The brain seems to be more stingy with mechanisms for pleasure than for desire," Berridge has said. This makes evolutionary sense. Creatures that lack motivation, that find it easy to slip into oblivious rapture, are likely to lead short (if happy) lives. So nature imbued us with an unquenchable drive to discover, to explore. Stanford University neuroscientist Brian Knutson has been putting people in MRI scanners and looking inside their brains as they play an investing game. He has consistently found that the pictures inside our skulls show that the possibility of a payoff is much more stimulating than actually getting one.

Just how powerful (and separate) wanting is from liking is illustrated in animal experiments. Berridge writes that studies have shown that rats whose dopamine neurons have been destroyed retain the ability to walk, chew, and swallow but will starve to death even if food is right under their noses because they have lost the will to go get it. Conversely, Berridge discovered that rats with a mutation that floods their brains with dopamine learned more quickly than normal rats how to negotiate a runway to reach the food. But once they got it, they didn't find the food more pleasurable than the nonenhanced rats. (No, the rats didn't provide a Zagat rating; scientists measure rats' facial reactions to food.)

That study has implications for drug addiction and other compulsive behaviors. Berridge has proposed that in some addictions the brain becomes sensitized to the wanting cycle of a particular reward. So addicts become obsessively driven to seek the reward, even as the reward itself becomes progressively less rewarding once obtained. "The dopamine system does not have satiety built into it," Berridge explains. "And under certain conditions it can lead us to irrational wants, excessive wants we'd be better off without." So we find ourselves letting one Google search lead to another, while often feeling the information is not vital and knowing we should stop. "As long as you sit there, the consumption renews the appetite," he explains.

Actually all our electronic communication devices—e-mail, Facebook feeds, texts, Twitter—are feeding the same drive as our searches. Since we're restless, easily bored creatures, our gadgets give us in abundance qualities the seeking/wanting system finds particularly exciting. Novelty is one. Panksepp says the dopamine system is activated by finding something unexpected or by the anticipation of something new. If the rewards come unpredictably—as e-mail, texts, updates do—we get even more carried away. No wonder we call it a "CrackBerry."

The system is also activated by particular types of cues that a reward is coming. In order to have the maximum effect, the cues should be small, discrete, specific—like the bell Pavlov rang for his dogs. Panksepp says a way to drive animals into a frenzy is to give them only tiny bits of food: This simultaneously stimulating and unsatisfying tease sends the seeking system into hyperactivity. Berridge says the "ding" announcing a new e-mail or the vibration that signals the arrival of a text message serves as a reward cue for us. And when we respond, we get a little piece of news (Twitter, anyone?), making us want more. These information nuggets may be as uniquely potent for humans as a Froot Loop to a rat. When you give a rat a minuscule dose of sugar, it engenders "a panting appetite," Berridge says—a powerful and not necessarily pleasant state.

If humans are seeking machines, we've now created the perfect machines to allow us to seek endlessly. This perhaps should make us cautious.

...for most of human existence, most people have got by with very little private space, as I found when I spoke to John L Locke, professor of linguistics at Ohio University and the author of Eavesdropping: An Intimate History (2010). Locke told me that internal walls are a relatively recent innovation. There are many anthropological reports of pre-modern societies whose members happily coexisted while carrying out almost all of their lives in public view.

You might argue, then, that the internet is simply taking us back to something like a state of nature. However, hunter-gatherer societies never had to worry about invisible strangers; not to mention nosy governments, rapacious corporations or HR bosses. And even in the most open cultures, there are usually rituals of withdrawal from the arena. ‘People have always sought refuge from the public gaze,’ Locke said, citing the work of Paul Fejos, a Hungarian-born anthropologist who, in the 1940s, studied the Yagua people of Northern Peru, who lived in houses of up to 50 people. There were no partitions, but inhabitants could achieve privacy any time they wanted by simply turning away. ‘No one in the house,’ wrote Fejos, ‘will look upon, or observe, one who is in private facing the wall, no matter how urgently he may wish to talk to him.’

The need for privacy remains, but the means to meet it — our privacy instincts — are no longer fit for purpose

From the 1960s onwards, Thomas Gregor, professor of anthropology at Vanderbilt University in Nashville, studied an indigenous Brazilian tribe called the Mehinaku, who lived in oval huts with no internal walls, each housing a family of 10 or 12. Mehinaku villagers were expected to remove themselves altogether from the life of the village at important stages of life, such as adolescence. When a boy hit puberty, he disappeared into the jungle, returning a man. In today's digital culture, of course, this is precisely the stage at which we make our lives most exposed to the public gaze.

Grimmelmann thinks the suggestion that we are voluntarily waving goodbye to privacy is nonsense: ‘The way we think about privacy might change, but the instinct for it runs deep.’ He points out that today’s teenagers retain as fierce a sense of their own private space as previous generations. But it’s much easier to shut the bedroom door than it is to prevent the spread of your texts or photos through an online network. The need for privacy remains, but the means to meet it — our privacy instincts — are no longer fit for purpose.

Over time, we will probably get smarter about online sharing. But right now, we’re pretty stupid about it. Perhaps this is because, at some primal level, we don’t really believe in the internet. Humans evolved their instinct for privacy in a world where words and acts disappeared the moment they were spoken or made. Our brains are barely getting used to the idea that our thoughts or actions can be written down or photographed, let alone take on a free-floating, indestructible life of their own. Until we catch up, we’ll continue to overshare.

It was almost exactly this feeling that led me to sigh just three days ago that “sometimes the war over ideas feels exhausting and pointless”. And again, I only write online as a hobby; I’m sure the exhausting and pointless feeling is magnified exponentially when you do this all day. I am often thankful to have my spreadsheets, datasets, and Stata code to retreat into. Stata may argue with me sometimes, but at least she never trolls me.

I’m tempted to say that Roberts’ burnout offers a lesson for everyone, but I don’t know if this is something only people who argue about policy on the internet experience or if normal people on the internet get it too. His complaints sound an awful lot like those I hear from people who “quit” facebook. Will internet burnout ever reach the point of becoming a widespread phenomenon? Tyler Cowen has written about “threshold earners” who manage to step off the materialist treadmill:

A threshold earner is someone who seeks to earn a certain amount of money and no more. If wages go up, that person will respond by seeking less work or by working less hard or less often. That person simply wants to “get by” in terms of absolute earning power in order to experience other gains in the form of leisure—whether spending time with friends and family, walking in the woods and so on. Luck aside, that person’s income will never rise much above the threshold.

Thursday, August 22, 2013

The always-excellent David Graeber hits another home run talking about something we’ve raised here many times before: that many of our jobs are totally unnecessary, or even socially harmful. Our excruciatingly long hours are more about proving to ourselves that our job is useful, or trying to impress the boss and get a promotion, than about providing any sort of necessary good or service. He calls these bullshit jobs. And he cites them as the primary reason we’re not all working less, as past economists predicted:

There’s every reason to believe he [Keynes] was right. In technological terms, we are quite capable of this. And yet it didn’t happen. Instead, technology has been marshaled, if anything, to figure out ways to make us all work more. In order to achieve this, jobs have had to be created that are, effectively, pointless. Huge swathes of people, in Europe and North America in particular, spend their entire working lives performing tasks they secretly believe do not really need to be performed.... rather than allowing a massive reduction of working hours to free the world’s population to pursue their own projects, pleasures, visions, and ideas, we have seen the ballooning not even so much of the “service” sector as of the administrative sector, up to and including the creation of whole new industries like financial services or telemarketing, or the unprecedented expansion of sectors like corporate law, academic and health administration, human resources, and public relations. And these numbers do not even reflect on all those people whose job is to provide administrative, technical, or security support for these industries, or for that matter the whole host of ancillary industries (dog-washers, all-night pizza deliverymen) that only exist because everyone else is spending so much of their time working in all the other ones.

There is a profound psychological violence here. How can one even begin to speak of dignity in labour when one secretly feels one’s job should not exist? How can it not create a sense of deep rage and resentment? Yet it is the peculiar genius of our society that its rulers have figured out a way...to ensure that rage is directed precisely against those who actually do get to do meaningful work. For instance: in our society, there seems a general rule that, the more obviously one’s work benefits other people, the less one is likely to be paid for it. Again, an objective measure is hard to find, but one easy way to get a sense is to ask: what would happen were this entire class of people to simply disappear? Say what you like about nurses, garbage collectors, or mechanics, it’s obvious that were they to vanish in a puff of smoke, the results would be immediate and catastrophic. A world without teachers or dock-workers would soon be in trouble...Even more perverse, there seems to be a broad sense that this is the way things should be. This is one of the secret strengths of right-wing populism. You can see it when tabloids whip up resentment against tube workers for paralysing London during contract disputes: the very fact that tube workers can paralyse London shows that their work is actually necessary, but this seems to be precisely what annoys people. It’s even clearer in the US, where Republicans have had remarkable success mobilizing resentment against school teachers, or auto workers (and not, significantly, against the school administrators or auto industry managers who actually cause the problems) for their supposedly bloated wages and benefits.

It’s as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is precisely what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as it had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the very sort of problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers it doesn’t really need to employ. Still, somehow, it happens.

While corporations may engage in ruthless downsizing, the layoffs and speed-ups invariably fall on that class of people who are actually making, moving, fixing and maintaining things; through some strange alchemy no one can quite explain, the number of salaried paper-pushers ultimately seems to expand, and more and more employees find themselves, not unlike Soviet workers actually, working 40 or even 50 hour weeks on paper, but effectively working 15 hours just as Keynes predicted, since the rest of their time is spent organizing or attending motivational seminars, updating their facebook profiles or downloading TV box-sets.

Health care and education are two areas where we see this phenomenon in spades, and it’s no coincidence we’re paying through the nose in both. In higher ed, there seems to be one administrator for every teacher, and I’ve seen administrator-to-teacher figures that keep getting ever more ridiculous. These administrators man a desk all day long and do things like fill out forms and schedule meetings. They seem to spend their days coming up with new busywork to justify their positions more than anything else.

The same goes for health care. Go into any office and you’ll see a front office full of people on computers filling out paperwork all day long rather than helping patients. I personally have seen rooms full of hospital administrators who do nothing but go to meetings all day and make “decisions.” There seems to be one of these characters for every five doctors or nurses. They never see a patient or cure a disease, yet they make enormous, princely salaries, and commute in from those bloated McMansions springing up in distant exurbs out by the freeway. They jump on all sorts of buzzwords to justify what they do: let’s get new “metrics” so we can implement “Toyota principles,” or some nonsense like this. And this “decider” class tries to squeeze ever more value from the people actually doing front-line work, while they hit the links at 4:30. A lot of similar bloat is going on in government, too.

I’ve noticed that these bullshit jobs seem to be occupied primarily by women. I think that’s why female unemployment numbers are better than men’s. The exception is the upper management echelon, which seems to be primarily men trading on their golf-course social connections, but this is a smaller caste, and a bit harder to get admitted to. Another disturbing thing is that the “marketers” are pretty much running the show now. You know these people: toned, tanned and square-jawed, dressed impeccably at all times, outgoing, garrulous and charming; they schmooze all day long and collect enormous paychecks without knowing how to actually do anything but manipulate people. A disproportionate number of them are devotees of Ayn Rand and see themselves as superior for all their sporting junkets and flesh-pressing, while people with actual specialized training and knowledge are just handmaidens to their greatness. Where I work, architects are second-class citizens, marketers are kings and queens, and “getting work” is all that matters.

What I’m surprised he doesn’t point out is how this plays into the ‘makers’ versus ‘takers’ rhetoric successfully deployed by right-wing populism. In that rhetoric, the people branded “takers” are usually those doing real and necessary jobs, while the “makers” are the professional lunch-eating class whose role is to “allocate capital.” So the Wal-Mart worker who resorts to food stamps, because the company cut her hours to part-time to avoid paying benefits, is a “taker,” and the corporate media whips up anger against her by whining about the tax “burden” (which is extremely minimal for social insurance) and the debt that they have convinced us will bring about the end of the world. And people on both the left and the right eat it up.

Graeber argues that keeping people’s noses to the grindstone keeps them too busy to argue for a better deal. I’m inclined to agree; you saw a massive push for social change in the 1960s and 1970s, the height of middle-class prosperity in America. Today, with people working two jobs under huge debt burdens, they are just trying to survive from day to day, and, sure enough, there is much less pushback. People are just too busy. The golden age of leisure was also the golden age of protest and experimentation. Add to that the fact that no one reads anymore; they just soak up whatever he-said/she-said nonsense is being put out by a media controlled entirely by corporations.

First, if you look back historically, the idea that the lower classes needed to be kept busy for their own sake was presented in moralistic terms but was in fact ruthlessly economic. The whole point of making the peasants work instead of faffing around and drinking was to enable them to be exploited by the newly-emerging entrepreneurial class.

In other words, a big part of the capitalist exercise is to find or create workers to exploit. Graeber has the story backwards. The moral fable (idleness is bad for the perp and putting him to work is thus a moral undertaking) was not, as Graeber suggests, because lazy people are proto-insurrectionists. It is that people who are self-sufficient and have time on their hands on top of that drove the early capitalists nuts. They were exploitable resources lying fallow, no different to them than a gold vein in the next hill that the numbnick farmer/owner was unwilling to mine because he liked the view and was perfectly content grazing sheep.

A second problem with Graeber’s discussion is the idea that leisure is a good thing given how we now have society ordered in America (trust me, this statement is not as nutty as it seems when put that baldly).

Monday, August 19, 2013

Blogging’s been a little light lately and may get lighter – I’ve been swamped at work, have a side project going (some fun designs for Milwaukee’s oldest urban garden), and have managed to acquire something of a social life :-). Hopefully everybody’s on vacation.

Today’s my 40th birthday. I never thought I would be this old. I wish I could celebrate more, but it’s Monday and all that. At some point you realize that there is more sand in the bottom of the hourglass than the top, and you wonder where it all went. But enough of that.

Anyway, briefly, in the spirit of The Onion:

Human extinction to cost 500 trillion dollars (BBC)

According to a new study, the cost to the global economy of total and complete human extinction would be approximately 500 trillion dollars.

“That’s a huge amount”, said the lead author of the study from the Miskatonic University economics department, Dr. Paul Gradgrind.

“It makes one think about the potential economic impact of human extinction, and the harm that would ensue for future economic growth,” he said.

“We’ve run the numbers, and we believe that we can reasonably estimate the elimination of every human life on earth as costing 500 trillion over perhaps the next decade.”

Calculating the impact to the economy of the removal of the human species for the rest of the planet’s history was a difficult task facing the economists.

“There are a lot of variables involved. When you think of all the goods and services humans provide, as well as their value as consumers, you realize just how large the number is,” he added.

“We need to make an economic case that preserving the human species will be cost-effective in the long-run,” he said. “We believe this study does that. Maybe now, we can convince the world’s leaders to take action.”

The study has its critics, however.

“Attempting to head off human extinction would harm economic growth and hurt the market,” said Dr. J. Goebbels of the Heritage Foundation, one of the study’s lead critics. The study was also pilloried in The Wall Street Journal.

“We believe that human extinction can be managed through market forces, as long as they are free from government interference,” argued an editorial in The Journal on Monday.

Sunday, August 18, 2013

I hope my readers not interested in architecture will indulge me, because I'd like to explore this topic a bit further. This will be something of an addendum to my last post, and a grab-bag; those not interested can move on, but hopefully non-architects will find the discussion illuminating.

Regarding the practicality of architectural education, I meant to include this excellent interview with Joe Lstiburek of Building Science from Inhabitat, which I think emphasizes some important points I tried to make:

Inhabitat: What does building science really mean? Did it not exist 50 years ago?

Joe Lstiburek: Well, it always existed. It’s really the technical side of architecture that architects gave up. If architects did their job there wouldn’t be any need for building science. You know, I’m flabbergasted by the architectural profession giving up control of such a profitable part of the industry, which is the interaction of the building enclosure with the climate and the people and the mechanical system.

You know, this occurred because of the change in the focus on the education of the architects, the school. They’re focused – they’re trained in art. They’re not trained in physics and material science to actually execute their designs.

Back in the day, 100 years ago, or maybe 50 years ago in Europe, architects were trained like master builders. They understood structure. They understood mechanical. They understood physics. They understood material science. They understood how everything worked together. The focus now on the architectural education is all art and what’s missing are all of those other pieces — one of those missing pieces is building science or building physics.

Wednesday, August 14, 2013

Last week I was going to include this piece, which I ran across in The New York Times: Writers as Architects. As someone with a foot in both worlds, the article seemed especially relevant to me. But I came away from it with a different impression than I expected.

This article perfectly illustrates a point I’ve tried to make for some time. Simply put, architecture is not sculpture! The pieces in the article are lovely pieces of sculpture, but they are not architecture! Indeed, the interpretation in physical form of abstract concepts is not unheard of in the arts. But ask yourself: could any of these “buildings” stand up for more than five minutes? Or keep the wind and the rain out? Could you find your way around in them? Could they meet their intended purpose (and if there is no intended purpose, why waste the resources)? Could they be constructed economically? What materials are they made from? I have no idea.

I think the answer to all of these is clearly no. So why do we call these things architecture? Because they are vaguely building-shaped? Why not be honest and say writers as sculptors? After all, I quite like these pieces as sculpture; I’d enjoy looking at them in an art museum. Clearly they are not intended to be constructed. So why is this “writers as architects”? Have we become so confused that we don't know the difference between sculptors and architects anymore?

Architecture, unlike sculpture, is a practical art, and that’s the way it should be. That means architecture operates under a different set of limitations than sculpture (or painting or poetry or dance). Buildings have to do a million things, from facilitating the activities taking place inside them (office work, medicine, education, government) to keeping people thermally comfortable to resisting earthquakes. Rather than beginning with these limitations, architects resist them at every turn, instead trying to impose some overall “vision” that no one seems to understand (often just an attempt to stand out from the crowd). Then massive amounts of engineering complexity and money are thrown at these buildings to make the design work and compensate for the fact that practical considerations were, for all intents and purposes, ignored. And all too often that fails. Here are a few differences between architecture and sculpture:

1. Sculpture stands alone as an objet d’art.

2. Sculpture is not inhabited by human beings. No activity is intended to take place inside sculpture.

3. There are no life-safety or exiting requirements for sculpture. People will not lose their lives due to poorly designed sculpture.

4. The forces of gravity do not affect sculpture in the same way. The scale is totally different. Sculpture does not have to “stand up” or resist lateral forces like wind and earthquakes.

5. There is (typically) no direct context for sculpture. Sculpture does not create a built environment and is not placed side-by-side with existing sculptures, often from different time periods.

6. The cost in money and resources of a building is orders of magnitude greater than that of a sculpture. Efficiency matters.

7. There are no systems for water, air, light, electricity, telecommunications, and the like inside sculpture.

8. Sculptures (typically) do not use energy. The resource efficiency of a sculpture is not determined by its form, because it uses no resources.

9. Sculpture is (typically) not built to last for decades or centuries. It is less vulnerable to freeze/thaw cycles, UV damage, and wear-and-tear. Sculpture does not have to deal with the weather.
I'm sure there are more. Both are three-dimensional. Both should be beautiful and inspiring. Those are the similarities, but they mostly end there.

Here’s the real reason: note that this was done under the aegis of an architecture school. And now you have a good look inside how architecture students are taught nowadays, and why Johnny can’t design a decent building that keeps out the rain, doesn’t look like an amoeba, or costs less than 100 million dollars. Rather than “the timeless way of building,” students are all taught to be the next design “genius” in the mold of Frank Gehry, Thom Mayne, or Zaha Hadid. The twenty-year-old kids at Harvard or Columbia are taught design-speak, set loose to individually give form to international art museums and Chinese skyscrapers, and told to let other people (engineers, contractors and consultants) figure out the messy details. Later, students fight a desperate rear-guard action to acquire all the information they never got in school. What is the point of architecture school again? It used to be that architects had to know how a building went together before they became “innovative” designers. Now it’s the reverse. Yet the profession digs in its heels in the face of the mass unemployment of its graduates and arrogantly proclaims that it teaches its students “how to think.” Apparently, not very well.

Now you know why architects have lost all relevance to the practical world of buildings. Architects are little more than corporate whores, because corporations are the only organizations capable of building such wasteful vanity projects on a massive scale.

Architects today are not architects, nor are they trained as such. Instead they are trained as sculptors working in an abstract world free from wind, rain and snow, building codes, user needs, ductwork runs and chases, piping, fire-rated enclosures, lateral bracing, and so on. They come up with their “vision” and then rationalize their abstract concepts in some ridiculous way. And then their buildings fail: they are perennially too hot or too cold or too loud, have too few bathrooms, leave people tripping on the stairs or getting lost; the architects get sued, the buildings don’t comply with codes, or they come in ridiculously over budget.

Lest you think I exaggerate, please see this site: http://futuresplus.net/. These are actual graduate thesis projects! This is what the profession celebrates, and it is what students are taught. It’s no coincidence that when architecture retreated to elite universities in the post-war period as the only valid form of training (enforced by licensing and accreditation), everything began to go downhill. A lot of these projects are brilliant art, but are they architecture?

People are quicker to pull out a smartphone to check the time than to look at their watch, and even quicker to go shopping on their phone than to go to a mall. The shopping experience as we know it today is rapidly disintegrating. More people are making their purchases online, and now they can do it from anywhere, no longer shopping only from home. The smartphone has allowed us to actually interact with our surrounding spaces. Shang-Jen Victor Tung begins to look at changing the way in which we shape, organize and even stock for today’s modern shopper. The architecture will no longer stand as a static strip mall, but will strive to connect with each customer through digital means.

Tan Akinci’s project is located at the site of Vienna’s Westbahnhof train station, at the end point of the commercial center of Mariahilferstrasse. It comes out of the same studio as a project we showcased earlier, Asemic Forest by Shahira Hammad. However, instead of contaminating the site, the project pushes to create a public space which defines the boundaries and flow of the site. However, as is the problem with many scripted projects, we tend to lose a level of spatial relationship and scale to the surroundings. Diagrammatically, as you read through the sections you can see the light pull of movement and flow through the project, which builds to a dramatic and strong end with its connection to the street.

Note how many of these projects are impractical and unbuildable. Often it’s beautiful art, but architects seem to have abandoned the three-dimensional world altogether in favor of really cool renderings. That’s fine if you’re making Avatar or the sequel to Blade Runner, but it’s not so good if you want buildings to work in the real world.

It’s simplistic but true: modernist architecture has evolved from modern art, particularly sculpture. It is obsessed with “meaning” and “message.” It is obsessed with novelty and symbolism. It does not look to the past, nor respect any traditions of building and construction. It does not see the beauty inherent in forms forged by practical concerns, and this is too bad. For in nature, every form follows function; every form is ultimately derived from the forces that surround us, from gravity to the surface tension of water. By constructing abstract sculptures on a massive scale, and treating public criticism as so much braying from ignorant philistines, architecture has abandoned the field to become an insular profession with less and less relevance to the built environment that people actually inhabit.

Sadly, what the abandonment of practical building by architects and the retreat into insular abstract intellectualism has meant is that most buildings go the other direction: bargain-basement utilitarian crud sprawling across the landscape with no thought or aesthetics whatsoever. Architects have instead confined themselves to art museums in China and prestige projects for universities, rather than any kind of relevance to the wider society. Architects have even taken on the air of modern artists, cladding themselves in black turtlenecks, scarves and horn-rimmed glasses and becoming celebrities in the mode of Duchamp, Dali or Warhol. Owning a “Gehry” is just like owning a “Picasso,” except we all have to live with the mess.

It wouldn’t be so bad if this “freedom” made buildings substantially more beautiful. But I don’t think it has. If anything, the public is more alienated from buildings than ever. Architecture has no relevance to their suburban world of particle-board shacks and cinder-block strip malls, and the public gazes from afar at the multi-million-dollar steel and glass skyscrapers and boutiques of the one percent in cities where “ordinary” people cannot afford to live. Older places of beauty are snatched up by the oligarchy, and “ordinary” people are priced out. Asked to name their favorite buildings, most people choose ones at least sixty years old, while clamoring to tear down anything built after 1970 without shedding a tear (or, in fact, actually celebrating). So it doesn’t seem like the modern high-concept sculpture approach emanating from the university departments has really served us that well.

Let’s stop treating architecture as sculpture and rediscover how to build user-friendly, contextual buildings more concerned with place-making than with novelty or showmanship. Let’s stop training students to be design “stars” and instead teach them how to work with clients and put clients’ concerns first. Let’s have students spend less time in rendering software and more time on actual construction sites. Let’s stop pretending architecture students are the same as industrial designers giving objects form, rather than practitioners of a set of principles unique to buildings. Let’s not treat history as bunk, nor limitations as merely something to be overcome. Let’s have architects think about how their buildings actually work and how they are actually used. Let’s teach students the rules of good design and structure before we allow them to break those rules. Otherwise, future architects will be little more than people teaching students how to put together a bunch of glue and cardboard to symbolize Gravity’s Rainbow.

Sunday, August 11, 2013

I've noticed a disturbing trend. Across all levels of society, people who do actual work, be they laborers, professionals, blue-collar, white-collar, pink-collar, skilled, unskilled, highly educated, specialized, unspecialized, etc., at every level from engineers to teachers to chefs to airline pilots - people who actually work for a living - are getting paid less and less and have to work harder and harder. Meanwhile, certain individuals who sit atop the economic pyramid are getting an ever-larger share of the fruits of society, while claiming that they "earned" it (if so, why didn't they earn it in previous decades?).

Without those workers, of course, society would fall apart. Without firefighters responding to blazes, nurses tending to patients, systems administrators making sure computers run smoothly, air traffic controllers making sure planes can land safely, or pipefitters maintaining mechanical and plumbing systems, the things we enjoy today - modern healthcare, air travel, electricity, safe streets - would not exist. Yet the people who make it happen are getting poorer and poorer, while the professional lunch-eating and golfing class has more money than it can spend in twenty lifetimes.

This is aggravated by the fact that post-Fordist economies seem to be capable of producing only jobs with abysmal pay. Two of the three largest employers in America are Wal-Mart and McDonald's, yet we maintain the fiction that these are just second jobs for housewives or summer jobs for teenagers. Keep in mind, those are just two employers; there are also umpteen retailers like Target, JC Penney and Kohl's, and fast-food shacks like Arby's, Wendy's and Taco Bell. Outfits like Manpower and Kelly Services are also booming. And let's not even talk about unpaid internships, prison labor, migrant workers, H-1B visas, "undocumented" immigrants, and the like. The "service economy" is a low-wage economy, and a cruel joke played on all of us. It's yet another bullshit story from the economic priesthood that we've managed to accept without question, as if it were some sort of religious doctrine.

A small technocratic elite still makes relatively decent wages, and everyone is working themselves raw in a desperate effort to join this vanishingly small privileged class. That technocratic elite recruits worldwide; odds are your company's new accounting manager is as likely to be from Bangalore as from Bangor. Meanwhile, anyone not in this elite is blamed for their own fate - for not getting enough education, or for studying the wrong thing (the devil take you if you do not pursue a STEM degree). If you don't want to sink yourself into debt for the mere "chance" of landing a position that pays enough to cover the cost, then apparently you're not entrepreneurial enough to succeed. What would be considered a rational cost/benefit decision is recast as not believing you have control over your own destiny. Musical chairs is not a way to structure a workforce.

All of the costs of becoming an effective economic member of society have been shifted onto the backs of employees themselves, so is it any wonder that we have fewer and fewer people who are qualified to be economically productive? Yet corporate America does nothing to rectify this, even while constantly complaining about a skills mismatch.

Thankfully, there is finally some pushback. If nothing else, the Occupy movement has pushed inequality to the top of the national agenda, so that even the President and the corporate-owned media are paying lip service to doing something about it (as long as they don't actually do something about it, of course). Economists are starting to point out that reducing workers to penury is not smart in an economy dependent upon consumer spending in a mass market. And, most hearteningly, fast-food workers have walked off the job in major cities around the country.

Faced with a total abdication of governance at the national level, municipalities have started to pass legislation of their own mandating decent wages. It is telling that some companies, even those with billion-dollar profits, are so resistant to the idea of paying their employees decent wages that they are refusing to open stores - that is, abandoning those markets altogether! To which we should all say good riddance - we don't need their crap goods and crap jobs. That's the first step to independence from the corporate wealth pump that siphons money away from communities and into the Sodom and Gomorrah of Wall Street. It's the first step toward local economies.

The white rump that forms the core of the Republican party has managed to maintain abnormally high wages, but the increasingly young, racially diverse, and urbanized future of the country - who have been getting low-wage jobs and temp work even with large amounts of education - has finally begun to do what previous generations did: fight for more. I see this as one of the major trends that will unfold more and more, and it's an encouraging development. The trickle-down doctrine that Americans have swallowed for years is finally losing its hold on America.

There was once a time when the people at the top of society were seen as parasites. They knew their fortunes were derived from the people who worked for them, and that if they took too much, they would destroy the very society that made them rich in the first place. Now we've gone to the opposite extreme, which sees working America as mere parasites on the financial aristocracy, dependent on its good graces to have a job at all. I recall seeing in the news a Tea Partier who loudly proclaimed "no poor person ever gave me a job" while protesting for lower taxes on the rich and fewer workplace regulations. I think such extreme idiocy is finally on its way out, as Calvinist America gets a strain of Liberation theology from Latin America.

I've often noted that at no point in history have we been more interdependent, and at no point in history has our wealth been produced more collectively (since complexity and globalization mean that no one person has the knowledge to produce much of anything anymore - what single person can build a car or a microchip, or run an Internet?), yet at no point has our wealth been concentrated in so few hands! I forget the exact statistics, but the wealth of the Forbes billionaires alone could eliminate global poverty twenty times over. You would think they would be hiding their wealth, but instead it is flaunted, so sure are they of their control over the hearts and minds of the world's workers. Communism is dead, they proclaim, secure in the knowledge that any notion of economic justice died along with it. Perhaps we're finally seeing signs that the people who really make society function have had enough.

The faculty are unhappy (Marginal Revolution). This shows that even in extremely wealthy industries, the people on the front lines are getting the shaft, while university administrators collect seven-figure paychecks and lavish perks, from housing to travel:

Public university professors don’t enter the profession to get rich. But some faculty are having trouble paying bills, and have even qualified for food stamps, Olson said. “For somebody to go five to seven years beyond college to obtain a Ph.D. degree and to realize that you are in need of federal assistance to make ends meet — and that’s for a tenure-track position –” is devastating.

Adding what some view as insult to injury, a recently published database of public employee salaries shows that some professors earn less than their colleagues at local high schools without doctorates.

We can afford to pay these workers: a petition titled “Economists in Support of a $10.50 U.S. Minimum Wage” estimates that McDonald’s could recoup half the cost of such an increase simply by hiking the price of a Big Mac from $4 to $4.05. One item; 1 percent.
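The petition's arithmetic is easy to check for yourself. A minimal sketch (the $4 and $4.05 prices are the article's figures; a 5-cent hike on a $4 item works out to 1.25 percent, which the petition rounds down to "1 percent"):

```python
# Check the petition's Big Mac arithmetic (prices taken from the article).
old_price = 4.00   # Big Mac price before the hike, in dollars
new_price = 4.05   # proposed price after a 5-cent hike

increase = new_price - old_price
percent = increase / old_price * 100
print(f"Increase: ${increase:.2f} per burger, or {percent:.2f}% of the price")
```

So a nickel on a single menu item - a price change most customers would never notice - is claimed to cover half the cost of the wage increase.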

So the only reason this kind of outrage continues is that many of the ultrarich are denying the needs and suppressing the rights of our lowest-paid workers. These workers face huge odds, but equal challenges were overcome in both the 1930s and the 1960s by bold and sometimes “crazy” actions. Government support was mild then, and it's weaker now; but perhaps the midterm elections will change that.

The recession killed 60 percent of $15- to $20-an-hour jobs, which should be the lowest-paying ones. Around 20 percent have returned, but the rest are being replaced by those paying less than $13 an hour. Thus median income for working-age households fell more than 10 percent from 2000 to 2010.

A vast majority of Americans are much closer in income to McDonald’s workers than to corporate C.E.O.’s. Yet we tolerate the fact that one in seven of our fellow Americans lives in poverty, with half of those people working tough jobs. Do we want to be part of that? Surely, better scenarios exist. And victory for the lowest-wage workers will have a positive impact on wages for everyone.

You thought Ryanair's attendants had it bad? Wait 'til you hear about their pilots (The Independent). Old, but relevant. I'm always amazed at how little everyone involved in keeping planes in the air is paid, from pilots to mechanics to security personnel to stewardesses. Millions of people literally trust their lives to people earning poverty-level wages every single day! This is so fucked up! Meanwhile, airline CEOs get paid millions despite constantly losing money and requiring government bailouts.

Employers added a seasonally adjusted 162,000 jobs in July, the fewest since March, the Labor Department said Friday, and hiring was also weaker in May and June than initially reported. Moreover, more than half the job gains were in the restaurant and retail sectors, both of which pay well under $20 an hour on average.