What's a Hipcrime? You committed one when you opened this blog. Keep it up. It's our only hope.

Thursday, April 4, 2013

Jawbz

This is a very important piece - it explains exactly why so many jobs are necessarily miserable in a capitalist economy. And it does so by using mathematics and equations, so that even economists can understand it!:

You know it's true. A boss needs to be able to dump you at the drop of a hat. Maybe it's to boost profits. Maybe it's to cut costs. Or maybe it's just because it feels good.

And a boss needs to be able to dump you without it having detrimental effects. There must be ready replacements, eager to crank it up, the moment you're out the door. And if morale suffers, because the buddies you left behind miss you, the boss will want to send them packing too, and bring in a fresh batch of wage-slaves.

Quite simply, there is little place for satisfying roles, the kind where you get to learn more and more interesting stuff over time. The only good on-the-job learning is no learning at all. Or, if you must learn, thirty minutes tops to master a dead-end role.

Workers in seven of the 10 largest occupations typically earn less than $30,000 a year, according to new data published Friday by the Bureau of Labor Statistics. That's a far cry from the nation's average annual pay of $45,790.

Food prep workers are the third most-common job in the U.S., but have the lowest pay, at a mere $18,720 a year for 2012. Cashiers and waiters are also popular professions, but the average pay at these jobs tallies up to less than $21,000 annually. There are 4.3 million retail sales workers out there, making them the most common job, but the position pays only $25,310 for the year.

Among the 10 most popular professions, only the nation's 2.6 million registered nurses earn a good living, bringing home nearly $68,000 a year on average. Another two of the most common jobs -- secretaries and customer service representatives -- have an average annual wage of about $33,000.

It's the job of the future! And it's terrible! It's low-wage, low-tech, long hours, in most cases it's not covered by minimum wage or overtime protections, and it's projected to grow by 70 percent between 2010 and 2020. That's right: It's home health care work.

The average hourly wage is just $9.70 an hour, according to the Labor Department.
For those in the industry who work full-time, this amounts to roughly $20,000 a year. Many health care aides only work part-time though—and they do not receive benefits.
Under these conditions, it's no surprise then that about 40% of home aides rely on public assistance, such as Medicaid and food stamps, just to get by.
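The quote's hourly-to-annual arithmetic checks out. A minimal sketch, assuming a 40-hour week and 52 paid weeks (the article's "full-time" figure is consistent with this):

```python
# Rough annual pay for a full-time home health aide at the quoted wage.
# 40 hours/week and 52 paid weeks are assumptions; the article says
# only "full-time" and "roughly $20,000 a year."
hourly_wage = 9.70
hours_per_week = 40
weeks_per_year = 52

annual_pay = hourly_wage * hours_per_week * weeks_per_year
print(f"${annual_pay:,.0f}")  # → $20,176
```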

Many researchers have documented a strong, ongoing increase in the demand for skills in the decades leading up to 2000. In this paper, we document a decline in that demand in the years since 2000, even as the supply of high education workers continues to grow. We go on to show that, in response to this demand reversal, high-skilled workers have moved down the occupational ladder and have begun to perform jobs traditionally performed by lower-skilled workers. This de-skilling process, in turn, results in high-skilled workers pushing low-skilled workers even further down the occupational ladder and, to some degree, out of the labor force altogether.

'Is the Demand for Skill Falling?' (Economist's View). Mark Thoma comments, "If true, this would upset nearly everyone’s narrative apple cart, including mine." And one of the comments reads, "just check BLS: The 30 occupations with the largest projected employment growth, 2010-20: http://www.bls.gov/news.release/ecopro.t06.htm. Other than nurses and teachers, most of the top ten don't even require a high school diploma."

The US has gained 387,000 managers and lost almost 2m clerical jobs since 2007, as new technologies replace office workers and plunge the American middle class deeper into crisis.

Data from the Bureau of Labor Statistics divide the US workforce into 821 jobs from dishwasher to librarian. They show rapid structural shifts – on top of a cyclical unemployment rate of 7.7 per cent – that may increase income inequality.

One probable cause of rising inequality is new computing technologies that destroy some middle-class occupations even as they create jobs for highly skilled workers who can exploit them.

The number of clerical workers such as book-keepers, tellers, data entry keyers, file clerks and typists has been falling, pointing to a structural decline. The number of retail cashiers has also dropped – indicating that internet shopping and self-checkout systems may be eroding another occupation.

Employment growth came from healthcare, management, computing and food service jobs. The number of personal care aides is up 390,000 since 2007. Demand for people who figure out how to replace clerical workers – such as operations managers, management analysts and logisticians – grew substantially.

…But salaries for many of the fast-growing occupations are lower than those they are replacing. The average wage for a clerical job in 2012 was $34,410 compared with $24,550 for a post in personal care. The average computing wage was $80,180 and $108,570 for managers.

"Even though the U.S. is more competitive globally, manufacturing doesn't give you the kind of direct job creation it did in years past," said Joseph G. Carson, director of global economic research at AllianceBernstein, a Wall Street investment firm. "At the end of the day you still want a strong manufacturing base, but there aren't as many people on the factory floor."

Indeed, while the sector has added 500,000 jobs since the recession ended and the value of what the nation's factories churn out is close to a high, there are nonetheless two million fewer manufacturing workers today than in 2007. Ever since the early 1960s, the share of jobs in manufacturing has been on a nearly uninterrupted downward slope, now accounting for less than 9 percent of all employment in the United States.

The Wall Street Journal reported last week that, according to Labor Department data, roughly 284,000 American college graduates are working minimum wage jobs. While that is down from its 2010 peak, it is still double the number who worked such jobs before the Great Recession.

As the Journal notes, the share of college educated workers in minimum wage jobs hasn’t changed — it is still roughly 8 percent. What has increased, however, is the number of college graduates working for hourly pay:

Instead, they’re ending up slightly higher up the ladder, in jobs that pay an hourly wage. In 2002, college graduates made up 13% of all hourly workers. That figure has risen every year since, hitting 17.8% in 2012. There are now 13.4 million college graduates working for hourly pay, up 19% since the start of the recession. While the Labor Department doesn’t provide data on how much those jobs pay, it’s a safe bet most of them aren’t the kind of jobs students were hoping for when they graduated.

These increases, both in minimum and hourly wage jobs, are likely due to explosive growth in low-wage sectors since the end of the recession. Low-wage occupations accounted for a fifth of job losses during the recession, but they made up 58 percent of the jobs added since the recession’s end, according to a recent study from the National Employment Law Project. Meanwhile, mid-wage occupations, which would generally cater to recent college graduates, made up 60 percent of recession losses and just 22 percent of jobs added since it ended.

Compared with December 2007, when the recession officially began, there are 5.8 million fewer Americans working full time. In that same period, there has been an increase of 2.8 million working part time. Part-time workers — defined as people who usually work fewer than 35 hours a week — are still a minority of the work force, but their share is growing.
When the recession began, 16.9 percent of those working usually worked part time. That share rose sharply in 2008 and 2009 and has not fallen much since then. Today the share of workers with part-time jobs is 19.2 percent.

This would not be so troubling if people were electing to work fewer hours. But that is not the case.

Basically all of the growth in part-time workers has been among people reluctantly working few hours because of either slack business conditions or an inability to find a full-time job. Together these people are considered to be working part time “for economic reasons.” Their numbers have grown by 3.4 million since the downturn began.

The number of people working part time “for noneconomic reasons,” on the other hand, has fallen by 639,000 since the recession began.

The way we live now: it takes a B.A. to find a job as a file clerk, according to the New York Times, which declared today that the college degree is the new high school diploma.

Of course, most of us don't have to read the paper of record to know that because we're living it, either through firsthand experience or friends and family who've looked for a job in the past few years. But this piece has some poignant, depressing quotes from employees at Busch, Slipakoff & Schuh, a 45-person Atlanta-based law firm where everyone — the receptionist, paralegals, administrative assistants, file clerks, and in-house courier who basically just runs around transporting documents — went to a four-year college.

"College graduates are just more career-oriented," said Adam Slipakoff, the firm's managing partner. "Going to college means they are making a real commitment to their futures. They're not just looking for a paycheck." (Um, those are directly conflicting statements, right?)

The number of job openings has increased to levels not seen since the height of the financial crisis, but vacancies are staying unfilled much longer than they used to — an average of 23 business days today compared to a low of 15 in mid-2009, according to a new measure of Labor Department data by the economists Steven J. Davis, Jason Faberman and John Haltiwanger.

Some have attributed the more extended process to a mismatch between the requirements of the four million jobs available and the skills held by many of the 12 million unemployed. That’s probably true in a few high-skilled fields, like nursing or biotech, but for a large majority of positions where candidates are plentiful, the bigger problem seems to be a sort of hiring paralysis.

We are quickly becoming a nation of permanent freelancers and temps. In 2006, the last time the federal government counted, the number of independent and contingent workers—contractors, temps, and the self-employed—stood at 42.6 million, or about 30 percent of the workforce. How many are there today? We have no idea, since 2006 was the last year that the government bothered to count this huge and growing sector of the American workforce.

America’s stark class divides are a product of its ongoing economic transformation. As the ranks of the working class have shrunk due to the devastating one-two punch of automation and globalization, two other classes have swelled. On the one hand, there is the creative class of scientists and engineers; business professionals and knowledge workers; artists, entertainers, media workers and cultural creatives. Numbering more than 40 million, they account for almost a third of the American workforce. With average annual earnings of more than $70,000, they collect almost half of all U.S. wages and salaries and control some 70 percent of the nation’s discretionary income.

But in parallel, another much larger class has arisen. More than 60 million Americans belong to the service class. These are some of America’s fastest-growing job categories, such as food preparation, personal care, and retail sales, but on average they earn just over $30,000 in annual wages, and many quite a bit less than that.

America once had a dream. For almost two-thirds of us, that dream is either dead or dying.

The math is terrifying. Add the ranks of the unemployed, the displaced, and the disconnected to these tens of millions of low-wage service workers, and the population of post-industrialism’s left-behinds surges to as many as two-thirds of all Americans. This is a much larger, and perhaps more permanent, version of the economic, social, and cultural underclass that Michael Harrington long ago dubbed "the other America." In fact, it is our majority.

For years, medical students who chose a residency in radiology were said to be on the ROAD to happiness. The acronym highlighted the specialties — radiology, ophthalmology, anesthesiology and dermatology — said to promise the best lifestyle for doctors, including the most money for the least grueling work.

Not anymore. Radiologists still make twice as much as family doctors, but are high on the list of specialists whose incomes are in steepest decline. Recent radiology graduates with huge medical school debts are having trouble finding work, let alone the $400,000-and-up dream jobs that beckoned as they signed on for five to seven years of relatively low-paid labor as trainees. On Internet forums, younger radiology residents agonize about whether it is too late to switch tracks.

"I never really thought about whether I would actually like being a lawyer, which seems absurd to me now," Manhattan-based lawyer Emily, 28, told the Guardian, "but is extremely common among young people applying to law school."

Emily's debt after graduating with a degree from Fordham Law is $139,000 – average Fordham debt: $134,319 – and though her salary of $170,000 at a Manhattan law firm is enough to pay off the bill in 10 years, she's locked into a career she doesn't enjoy at least until then. Her loan payment: $1,700 per month.

"I haven't enjoyed my career so far, and I honestly don't think I will," Emily said. "I am tied to working at a big law firm if I want any chance at repaying my loans in a reasonable time and it is not a good job. And I'm one of the lucky ones because I'm employed!"
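For what it's worth, Emily's numbers hang together under the standard fixed-rate amortization formula, M = P·r / (1 − (1 + r)^−n). A minimal sketch, assuming a 7.9% rate — the federal Grad PLUS rate at the time; the article doesn't state her actual rate:

```python
# Monthly payment on a fixed-rate loan: M = P * r / (1 - (1 + r)**-n)
# P = principal, r = monthly rate, n = number of monthly payments.
# The 7.9% annual rate is an assumption; the Guardian piece gives only
# the balance ($139,000), the term (10 years), and the payment (~$1,700).
principal = 139_000
annual_rate = 0.079          # assumed, not stated in the article
n_payments = 10 * 12         # 10-year repayment

r = annual_rate / 12
monthly_payment = principal * r / (1 - (1 + r) ** -n_payments)
print(f"${monthly_payment:,.0f}/month")  # lands close to the quoted $1,700
```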

For me, it began in college, taking rigorous pre-medical courses against a heavy yearly tuition burden: $27,000 per year for four years. I was one of the fortunate ones. Because I excelled in a competitive academic environment in high school and was able to maintain a position in the top tier of my class, I obtained an academic scholarship covering 70% of this tuition. I was fortunate to have graduated from college with "only" $25,000 in student debt. Two weeks after finishing my undergraduate education, I began medical school. After including books, various exams that would typically cost $1,000 to $3,000 per test, and medical school tuition, my education costs amounted to $45,000 per year.

Unlike most other fields of study, the demands of medical school education, with daytime classes and nighttime studying, make it nearly impossible to hold down an extra source of income. I spent an additional $5,000 in my final year for application fees and interview travel as I sought a residency position in Internal Medicine. After being "matched" into a residency position in Michigan, I took out yet another $10,000 loan to relocate and pay for my final expenses in medical school, as moving expenses are not paid for by training programs.

At that point, with medical school completed, I was only halfway through my journey to becoming a doctor. I recall a moment then, sitting with a group of students in a room with a financial advisor who was saying something about how to consolidate loans. I stared meekly at numbers on a piece of paper listing what I owed for the two degrees that I had earned, knowing full well that I didn't yet have the ability to earn a dime. I didn't know whether to cry at the number or be happy that mine was lower than most of my friends. My number was $196,000.

No, you will not get a job—not because, like Kafka’s mouse, you went in the “wrong” direction, but because today’s academic job market is a “market” in the sense that one stall selling fiddlehead ferns in the middle of a strip mall is a “farmer’s market.” In the place of actual jobs are adjunct positions: benefit-free, office-free academic servitude in which you will earn $18,000 a year for the rest of your life.

But how did this happen? Colleges and universities have more students than ever—and charge higher tuition than ever—so whither the humanities professorship amid all the resort-like luxury dormitories and gleaming student centers? Is the humanities professorship extinct because at this very second, thousands of parents of wide-eyed college freshmen are discouraging them from taking literature, philosophy, foreign languages or history (the disciplines that comprised a college education in its entirety for thousands of years, but whatever), even though quite unlike humanities Ph.D.s, humanities B.A. degrees are actually among the most hirable? Or is it, as Rosenbaum and others have suggested, that the overproduction of obtuse torrents of jargon has caused my profession to hasten its own irrelevance?

Who cares? None of this will be sorted out in the five to 10 years it takes you to get a Ph.D. So don’t. Sure, you may be drawn to the advanced study of literature like my late grandmother to her three daily packs of Kools—but in the 1950s, smokers didn’t know any better. In 2005 when I began my own Ph.D., I should have known better, but I didn’t. Now that you know better, will you listen? Or will you think that somehow you can beat odds that would be ludicrous in any other context?

Look, smarty-pants, let me put this in overcomplicated language you can understand: Ludwig Wittgenstein concluded his Tractatus Logico-Philosophicus by proclaiming that anyone who “understood” the work would know to discard everything in it after reading it, to “throw away the ladder” after reaching the top, as it were. But with academia, you don’t need to put yourself through five to 10 years of the hardest work you will ever do, followed by four years (and counting) of rejection and dejection, simply to conclude that the experience was ill-advised. When it comes to graduate school, you should just chuck the ladder before you try to climb it. You’ve only got to run the other way.

Get used to "the new normal." Oh, and from the article above, I just had to include this fable from Kafka, because it sums up the entire job market so well:

"Alas," said the mouse, "the world gets smaller every day. At first it was so wide that I ran along and was happy to see walls appearing to my right and left, but these high walls converged so quickly that I’m already in the last room, and there in the corner is the trap into which I must run."

Well, the Marxist view is because you need to sell your labor power in order to survive, unless you can loaf off investment income.

But really, I'm always amazed at stories about lottery winners who still trudge into work and sit in a cubicle every day. I mean, WTF, they now have tons of money and this is all they can think of to do with their lives? And the media always reports this as some kind of noble gesture rather than the tragedy it is. It seems many people are so well trained that they don't even know what freedom is. It's like a zoo where the cage is opened, and the animal does not want to leave.

And I never understood the whole "my work is my whole life's purpose - I can't live without it" idea here in America. Whose work is that important? Most of us have ultimately very little choice or say over job or career. Maybe if you're an artisan or scientist or inventor or work for yourself in some capacity. But where does this idea come from that working 40 hours a week for some company or individual is so important that it not only pays the bills but defines you as a person? It's bizarre. Especially since having "a job" is a fairly recent modern invention.

Yeah, and some lottery winners even kill themselves afterwards. That might be due to their loss of friends/family, etc., when they come into money, though.

Work is horrible. You remember Ran Prieur wrote about the adjustment period after you drop out? Something like: it takes a few years to find the internal motivation needed to get stuff done when you're not being forced to work. I think we're all susceptible to that. And thinking about job-quitters rather than lottery winners, there's huge social pressure to go back to work, since if you're not on the clock for the entirety of your adult life, you're somehow a bum.

I just read this interesting post on The Last Psychiatrist. It's about SSI and SSDI payments serving as a makeshift minimum income for the unemployable. I wonder how many mid-twenties white guys with BA degrees are on this...