Breaking Point Blog

Tuesday, October 23, 2012

I admit that I have a tendency to talk more about
language than about substance when analyzing political speech, but then again
there isn’t much substance to be had. And besides, the little things matter. Failing to take note of tricks of language and Orwellian efforts at
branding can mean lowering your guard against political manipulation. So I don’t feel bad about picking at nits,
because I believe those nits have a tendency to grow and consume public dialogue
if they aren’t gotten rid of.

There was one talking point in last night’s
presidential debate that I found exceptionally aggravating. It was the point that Mitt Romney made about the perilous situation that the United States faces in the world because Iran is now “four years closer” to having a nuclear weapon. Considering that that was repeated four times over the course of the ninety minutes, it seemed to me that it relied upon the assumption of an audience that wasn’t really paying close attention. It grabbed my attention as an utterly empty
political slogan the first time it was uttered, and the rest of the audience
had three additional opportunities to get that same impression.

There’s really no other impression to be had. Saying that Iran is four years closer to a nuclear weapon is like saying that I’m four years closer to a Nobel Prize. Chances are pretty good that I’m not going to get one of those. I guess you could argue that even if I don’t, as long as I keep working towards a relevant goal – writing literature, for instance – I might be closer to having a Nobel Prize when I die than I am right now. But speaking more colloquially, if a certain outcome is never going to happen, one doesn’t get closer to it over time.

Now, in the event that I am going to win a Nobel Prize, the hypothetical statement is true. I am four years closer to that outcome than I was four years ago. Similarly, if at any point in the future, under any circumstances, Iran develops and builds a nuclear weapon, then they are presently four years closer to that outcome than they were four years ago. That’s the way time works. Future events get closer every day. Would a Mitt Romney presidency unlock some secret of the universe that would allow time to flow backwards within the borders of one country? Or was this intended as a subtle metaphor for bombing nations back to the Stone Age?

At this point, I imagine a lot of people will narrow their eyes disdainfully at me and hiss, “Oh, you know what he meant.” And in a vague sense, yes, of course I do. Obviously he meant to suggest that President Obama’s first term allowed the Iranian regime to become materially closer to having the knowledge and resources required for building a nuclear weapon. But I don’t know what Romney meant by “four years closer,” presumably because “four years closer” doesn’t mean anything. It’s the vaguest and least substantive way he could have phrased what he was trying to say, and since it was repeated four times over, that must have been deliberate.

If Romney had had a substantive claim to make
about a worsening Iranian threat resulting from an Obama presidency, he had
ample opportunity to make it. He didn’t. That’s not to say that there’s no such
argument to be made, but it does suggest that Romney’s claims relied on bullshit
– a disregard for the truth value of what he was saying, in favor of whatever
would serve his ends if it happened, incidentally, to be true.

That behavior is quite in keeping with Romney’s
entire approach to his campaign. Typically, he seems to accomplish this exploitation of politically convenient narratives by plainly reversing position, or by outright lying. Now, with the third debate, he seems to have uncovered the perfect means of utilizing bullshit, which is by making claims that cannot be contradicted because they provide absolutely no information, even as they sound damning for the opposing party. I suppose that in that way Romney has
succeeded in emulating Reagan.

The public cannot allow such casual disregard for
truth or rational argumentation to stand as a relevant political tactic. We cannot allow politicians, corporations, or anyone else to believe that they can sway us by branding and rhetoric alone, without having to appeal to factual data or to be transparent about their own views. Such dialogue will only improve if we hone our ability to parse it and separate it and call out the bullshit. We’re failing in that
responsibility if Mitt Romney believes he can say four times in the same debate
that we’re four years closer to the future, and have that somehow count in his
favor.

Friday, October 12, 2012

I just happened upon a clip from Chris Matthews’ coverage of the supporter gatherings prior to the Vice Presidential debate. It is not enormously significant, but it is a delicious bit of video, which I have an irresistible urge to comment upon. The roughly one-minute clip begins with Matthews interviewing a random Obama supporter. Just as he asks her about her health care
situation, an old woman interjects from off camera by shrieking the word “communist!”
in a voice that would have made it notably fitting if she had followed up with,
“burn him!”

Everyone in frame reacts to the shout, but the
woman being interviewed shakes it off and takes a few seconds to explain that
she and her husband had recently lost health insurance for the first time in
their lives. Chris Matthews lets her finish
her answer, but the speed with which he departs when she reaches the end of her
sentence suggests an almost Pavlovian response to the shrill voice at the edge
of the crowd.He lowers the microphone
immediately and says, “Okay let’s go over to this lady,” whereupon he seeks out
the person who yelled communist, in order to ask her what she meant by it.

What follows is a stunningly awkward exchange in
which Matthews asks the woman exceptionally unchallenging questions,
essentially just repetitions of “what do you mean?” and she repeatedly fails to
answer them, instead chiding the professional journalist and commentator to “study
it out, just study it out,” derisively referring to him as “buddy,” and
asserting that she knows what she means. It would be painful to watch if I had any inkling that the woman had sufficient self-awareness to be embarrassed by it. It would be hilarious if it weren’t such a tragic commentary on the state of political discourse. Watch it if you like:

Obviously, our culture and systems of information
need to be reformed enough to precipitate a breaking point whereby nobody can
remain so self-satisfied in their own ignorance as this woman showed herself to
be. Her willingness to gather at a
political rally and shout her views on national television suggests that she is
firmly committed to them, but even in the space of a minute, her complete
inability to explain or defend those views paints the image of someone who has
absolutely no idea what she’s talking about, but also doesn’t care that she’s
not informed and doesn’t think she has to be.

I watch this woman wag her head at Chris Matthews
and pause at length before shooting back, “You don’t know?” when asked what she
means by “communist,” and I see someone who believes that in the face of any
challenge to their worldview, a self-righteous attitude eliminates the need for
facts and rationality, every time. It is indicative of a sociopathic mindset that takes confidence and strength to trump all else, and that mindset seems to be breeding extensively in the modern population. That in turn is indicative
of a serious cultural failure in America, though unfortunately one that is near
impossible to overturn.

Far less difficult to attain is the personal
breaking point that this clip seems to point to, though I must admit that I don’t know which side of it I ought to come down on. In watching the clip, the thought almost immediately
crossed my mind that maybe this woman was some sort of amateur satirist aiming
to portray the Republican opposition as deluded and irrational, and even that
maybe she had been planted there by some group on the left.I entertain those thoughts because, as with
most conspiracy theories, it’s simply easier to believe than the frightful
reality, which in this case would be that America is long on individuals who
form firm, aggressive opinions on the basis of the extracts of ether and
bullshit.

I know that my skepticism about public ignorance
is unsustainable. Indeed, I know that it can be harmful, because it’s a sort of ignorance in itself. Fundamental to my personal philosophy is the idea that you can’t hope to effectively solve a problem if you deliberately avoid recognizing the reality and extent of that problem. Public ignorance is the problem at the root
of all other problems, because it is that which allows people to avoid reality,
and thus deny solutions.

The problem here is that I don’t know whether I
should be pushing myself towards the breaking point of taking public ignorance
for granted, or if instead I should find a way to keep from assuming that
conspiracies are afoot while still giving individuals the benefit of the doubt
as regards their level of information. In other words, one might say that witnessing ignorance of the proportions on display in this clip challenges me to avoid two negative breaking points, which threaten to make me either overly cynical about human stupidity or overly cynical about political manipulations.

I’d venture to guess that not a lot of people have
carefully-reasoned assessments of their fellow men, so this is a personal
breaking point that others may have to contend with as well, but being
personal, it’s of secondary importance. What this video clip has brought to mind that could be addressed on a
large scale right now is a question for the media about how to handle firm
opinions voiced by the public.

I honestly can’t decide whether to praise or
criticize Chris Matthews’ response to the political heckler. Part of me wants to criticize just because I used to get a lot of enjoyment out of focusing my ire for the news media against Matthews, who, despite being a bright guy, was terrible at his job back when I considered MSNBC a news organization. Now that his job is “partisan” rather than “journalist,” he doesn’t seem so bad. Okay, it also helps that I don’t have a TV. But in any event, even if
Matthews remains professionally an idiot, the woman he had his brief exchange
with is an idiot in much larger terms, and to an unquantifiably greater extent.

The relevant question, then, is, “Did Matthews
have good enough reason to focus the attentions of the microphone and camera on
this woman’s dimwittedly vociferous views?” On the one hand, by giving her a voice once she’d asked for it, and contributing no commentary of his own, Matthews allowed the woman to provide her own refutation of her talking points. The exchange conveyed the impression that extremist views are based on no information, which of course they often are. That’s a good fact to put on display when the opportunity arises.

On the other hand, we have to remember the
shamelessness with which the old woman held her ideas in absence of evidence or
personal understanding. Such shamelessness probably isn’t much affected by having a mirror held up to its own ignorance, and that fact threatens to let this incident stand as encouragement for other people like her. As I said, the greatest breaking point involved here is also all but unattainable: the creation of a culture that prevents the embrace of ignorance. For the foreseeable future, lack of information and presence of strong opinions will continue to go hand-in-hand among a sizable portion of the American public. It will take generations of concerted effort to change that fact. But that doesn’t mean that opinionated idiots
will always be activists.

I estimate that much less comprehensive cultural
changes could prevent people who hold uninformed opinions from being so vocal
and so public with those opinions. And one thing that probably doesn’t help is giving voice to those opinions, in all their self-righteous vacuity, on national television. Viewers at home whose perspectives on American politics don’t go much farther than “he’s a communist!” won’t be shamed or enlightened by their impromptu spokesperson’s self-defeat, just as she wasn’t shamed or enlightened by it. To the
contrary, the presence on the airwaves of uninformed declarations and accusations
provides more fodder for lazy people to find something to parrot as they make
the leap from uninformed citizen to armchair activist.

The opinions that are screeched from the sidelines
are the ones that most need to be debunked once they’re present, but they’re
also the ones that most need to be disallowed from taking the field. Overall political discourse is cheapened not only by their ignorance but also by their lack of decorum. As regards ethics, I think I am so committed a deontologist that I have internalized Kant’s categorical imperative. When I see things like this video clip and start wondering what ought to have been done in the situation, I find myself universalizing the act I witnessed and looking for its effect on the moral system.

In this case, what would the effect be if
journalists always turned their attention to the loudest and most abrasive commenter
on the scene, as Matthews seems to have done? He even turned his attention away from the woman who was contributing relevant anecdotes to the public understanding, in order to give the shrill, ancient cold warrior a chance to explain her unexplainable views. I fear that the current state of journalism is not far from embracing the loudest participant in any debate. The hypothetical result is that all of American politics becomes a shouting match, and that is seemingly not far from the situation we already face.

In light of that threat of a still more corrupted
political and journalistic landscape, I’m tempted to say that although the
woman’s response was rather satisfying, the better thing to do in that
situation and all similar situations is to keep the person who’s shouting
epithets off of our television screens. But I’d be interested to know what readers think of the effects of
either encouraging or discouraging uninformed speech.

Thursday, October 4, 2012

Wednesday’s debate made it clear to me that American politics remains tragically far from an essential breaking point in how people talk about the economy. One of the nice things about living in the computer age is the access that it gives us to transcripts and the Ctrl+F keystroke. That’s great for people who are interested in analyzing branding strategies and verbal rhetoric. It is perhaps more useful in instances where the trends are changing, which even a casual listen to the recent debate would have revealed is not presently the case.

In an article at AND Magazine months ago, I criticized the American political system for its tendency to entirely eschew reference to poverty except for when such reference is politically, and momentarily, useful. The entire narrative of political economy on both sides relies on the mass delusion that everyone in America is, or at least will be, middle class. Concordantly, in a ninety-minute debate that was entirely about the economy, President Obama and Governor Romney spoke solely to that artificially inflated middle class, and almost entirely about them.

I waited with low expectations for any mention whatsoever of the poor, and finally encountered one about half an hour into the debate. It didn’t exactly create a new trend in the dialogue afterwards.

The word “poor” was used exactly four times (excluding Jim Lehrer speculating about the job he did as moderator). Each of those four instances belonged to Mitt Romney, three of them appearing in the discussion of shifting Medicaid to the states. His use of the term in that context, and that context alone, only furthers the evidence of federal disregard for lower-class Americans and for policies that might positively affect them en masse.

Said Governor Romney:

I would like to take the Medicaid dollars that go to states and say to a state, you're going to get what you got last year, plus inflation, plus 1 percent, and then you're going to manage your care for your poor in the way you think best.

In other words, addressing the horrors of poverty is not and ought not be in the purview of the president of the United States, nor even of the rest of the federal government. I would even go so far as to say that the subtext of this – as seems to be in keeping with Romney’s deluded views about universal economic opportunity and social mobility – is that poverty is a transient state, and not something that is made more or less likely by policy.

What is especially objectionable in this skeleton of a platform on the subject is the fact that he presents poverty as an affliction that calls only for “care,” which is a word that sounds decidedly passive in comparison with other alternatives, like “intervention.” But why should society intervene to create economic opportunity or to close the gulf between social classes, which is yawning wider with every passing year? Why indeed, when for the likes of Mitt Romney, staying poor is a matter of choice, or the result of a lack of ambition? Given that worldview, it is the job of the individual state to see to it that their poor don’t die for their failings, but it is nobody’s job to provide the afflicted with a hand up in their struggle to escape the circumstances into which they were either born or thrust.

And Obama’s refusal to let the word “poor,” or even the phrase “low-income” pass his lips suggests that there is no ideological opposite of that perspective represented on the national stage. There is no political worldview that dares accept poverty or economic disparity as a reality in this country, much less acknowledge it as the result of policies or flaws in the economic structure of the nation.

The fourth time that Governor Romney used the word “poor,” he immediately corrected himself, referring to Title I of the Elementary and Secondary Education Act as affecting “poor kids,” then saying “or lower-income kids, rather.” The slip may have momentarily belied his message, but it also pointed to the skill with which he had trained in properly re-branding his views for the debate audience. It is difficult to think of a reason why Romney would have corrected himself via synonym, if not because he knew that he was not supposed to use the word “poor” in the context of national policy.

In fact, the same training was on display in his discussion of the segment of American society that politicians are allowed to talk about. The word “middle” appeared in the debate, with regard to socio-economic status, a total of thirty-one times. Nineteen of those instances belonged to President Obama, and twelve to Governor Romney. But what’s especially remarkable is the distinct difference in corollary words used by each man. Nine of the twelve times Romney used the word “middle,” it was to refer to “middle-income” Americans. Obama, by contrast, used the phrase “middle class” in all nineteen cases.

This matters either to what each man believes about the distribution of wealth in America, or to the image that he wants to present of it, or both. With the deliberate application of a less common term, Romney, evidently, is trying to promote the fantasy that there is no social class in America, that there are those that work harder and earn more, and those that work less and earn proportionally, but no natural division between them. That narrative has underlain a great many of his public comments, though nobody but Mitt Romney can say for certain whether he truly believes it or if it simply advances his goals.

Obama, by contrast, sticks to the term “middle class” throughout discussions of economic policy. Speaking about impacts of policy upon only that group, which almost all Americans believe they are a part of, is politically beneficial, though ideologically weak. But also, speaking about them as a social class must be a conscious decision on some level.

Obama heard Romney refer to middle-income Americans nine times during their exchanges, and he never modified his language to match or to directly challenge the rhetoric. Either he wasn’t paying attention to the vagaries of diction, or he believed it to be politically preferable to advance a narrative that acknowledges class divisions as an American reality. Naturally, if the latter is the case, I’m on his side. But then I wonder where, if class divisions exist, is the lower class, the poor, the perennially underprivileged? And what do the president’s – or the governor’s – policies propose to do for them?

We never hear an answer to that question. And we never have. Certainly never during this election cycle. Never even throughout my adult life. There is a clear division between the candidates and their parties in terms of how they understand the social structures of America. In the Republican narrative, economic opportunity exists in equal measure for all and income levels differ, but never according to external factors, never in a way that patriotic Americans could construe as unfair. In the Democratic narrative, government actually has a role to play in the economic lives of its citizens, because there are natural divisions and inequities that must be controlled.

One of these, of course, is the groundwork for the lesser of two evils. But judging by the language of debates and national speeches, in neither narrative do poor citizens exist. And that makes each side similarly a party to the same grand delusion. Towards what end does every national candidate embrace that willful ignorance? Is it that they are afraid of acknowledging society’s evils? Is it tacit acceptance of the selfishness of voters, which guides them to turn a blind eye to that which doesn’t personally affect them? It can’t be that they’re simply unwilling to take on a problem that they know they can’t solve, because the entire process of campaigning is built on making impossible and contradictory promises.

Perhaps that’s just it. The political lives of our would-be leaders are constructed around convincing voters to believe in absurd fantasies about the candidate and his capabilities. Perhaps nobody wants to mention poverty because everybody has already been convinced of the fantasy that it’s not a real problem in America. Everybody, that is, except for the poor. They exist.

Tuesday, September 4, 2012

The Buffalo Zoo celebrated the
traditionally-last weekend of summer by offering a ninety percent
discount on admission on Labor Day. Since one dollar is something I
can just about afford on a good week, I took a holiday-morning bike
ride around Delaware Park and then queued up with the mass of people,
mostly families with small children, who had just as readily sprung
at the opportunity for a cheap cultural activity.

Considering the lines at the gate, I
was surprised that the scene inside was not as claustrophobic as it
could have been. It took a little jostling or waiting in the wings to
get a proper angle, but everyone seemed to get their opportunity to
look at the cute, or fearsome, or comic animals. I freely admit that
I was mostly there just to take another look at some of my favorite
creatures, to watch the polar bear swim in its artificial pond, far
from the threatened environment of its natural-born fellows, to grin
down on the docile capybaras lounging in the rainforest exhibit, to
rediscover my respect for the vulture, which I first developed when I wrote
a report on the species in elementary school, to look for big cats
pacing like in Rilke's description of the panther.

But even though this excursion wasn't
exactly intended as a fact-finding field trip, I never go to a museum
or zoo or aquarium without trying to learn something about the stuff
I'm looking at. Not a heck of a lot changes at the Buffalo Zoo from
year to year, and I think I had been there about a year ago, so it's
not as if I could have expected to discover an animal the existence
of which I was altogether unaware of. But there's only so much I can
commit to memory, so naturally I find myself rediscovering things on
subsequent visits to the same places of learning. I always seem to
forget, for instance, that the Rocky Mountain Bighorn Sheep are
capable of running at up to fifty miles per hour. The up-side of my
disappointment at not retaining encyclopedic recollections – a
failure that seems to become ever-worse as I age – is that I
sometimes get to re-experience the joy of learning something
interesting all over again.

Even if I don't read all of the
wildlife facts, of which there aren't even that many at the Buffalo
Zoo, I do at the very least try to get the names of the animals
right. This is more than I can say of the vast majority of the other
patrons that I encountered yesterday. It having been a year since my
last visit, I found myself trying to actively identify each species,
endeavoring to commit to memory the ones that escaped me this time
around. This is natural to me, and I thought it was part of the
essential purpose of going to the zoo. I always took it to be a place
where you went not merely to look at animals as in a menagerie, but
to find out something about the wider world by discovering what they
are and from where they come. I especially thought that that was why
parents took their children to the zoo. I'd always assumed that it
was meant as a supplement to a child's primary education, a way to
instill curiosity and gauge the direction of nascent scholarship.
Apparently I was quite wrong about this as well.

Most any time that I go to places like
zoos or museums and find myself crowded by children and their adult
chaperones, I am downright shocked by the lack of interest that
parents have in conveying any information whatsoever to their
charges, or even in encouraging those children to learn anything on
their own. I fear that my disdain paints me as a killjoy and that the
average reader will see me as attaching far too much significance to
the conduct of people who are on a simple, light-hearted family
outing. But that's just the trouble. I worry that people attach
entirely too little significance to such everyday opportunities to
influence the character, values, and perspective of impressionable
children.

As much as Americans today recognize
and lament the widespread failure of education and the failure of
modern children to live up to appropriate standards, I think
commentators and individual parents are too much inclined to see that
failure as institutional and too little inclined to consider it as
social and cultural. If the behavior of parents at zoos and museums
is indicative of their broader attitudes, it suggests that people
have widely forfeited the recognition of personal responsibility for
the education of their own children, instead handing that
responsibility off to schools as if the process of raising an
intellectually astute and ambitious child is something that can be
consolidated into a specific set of hours in specific locales.

If that is indeed the view – if the
need for education is recognized, but only recognized as being needed
somewhere outside the home – then I can only conclude that people
don't really value education at all. That is, they don't value
education as it ought to be valued, for its own sake, as both a
public and a personal good. You can't expect children to learn well
and perform at a high level in school if the culture that they're
coming up in is one that portrays education as a sort of
obligation and something that brings good things to the learner, but
is not good enough in its own right to be worth pursuing in absence
of the social obligations of homework and exams.

What else can I conclude from regularly
observing that perfectly middle-class parents, far from exhibiting
much intellectual curiosity of their own, don’t even respond to the
intellectual curiosities of their own children? But perhaps that’s a
little unfair. At the zoo yesterday I did find one or two adults
expressing curiosity to the extent that they pressed their faces to
the glass and perplexedly asked of no one in particular, “What is
it?” They just didn't express a great deal of interest in actually
doing anything to satisfy their curiosity. They just couldn't be
bothered to walk back two feet in order to read the damn nameplate.

This is entirely their own affair when
the adults are on their own and solely responsible for their own
edification or ignorance. But it gets under my skin when their own
lack of care for finding answers threatens to be transmitted to a
child who is still blessed by wide-eyed eagerness to comprehend the
world around him, whatever aspects of it should set themselves before
him.

Just a few exhibits down from where I
heard one unresolved ejaculation of “What is it?” I found myself
looking at another glass enclosure that housed three wallabies
crouching at the back of their habitat, when a family walked around
me to look at the same. It consisted of a couple with a daughter
just barely of speaking age and a son perhaps six years old. The
parents looked, glassy-eyed, into the scene while the boy excitedly
called out “kangaroos!” I had started moving away from the
exhibit, but noticing the boy being met with silence, I said simply
“wallabies,” partly in hopes that his parents would hear me and
realize, if they did not realize it on their own, that their son had
made a reasonable but slightly mistaken assumption about what they
were looking at.

However, I was essentially met with
silence, too, except in that the boy, perhaps hearing me or perhaps
just seeking acknowledgment from his parents, repeated “kangaroos.”
Noticing that they weren't going to say anything and that their eyes
had apparently still not passed over the signs that clearly stated
the name of the species, I repeated, with the boy more specifically
in mind, “wallabies.” Now looking squarely at me, and
inquisitively, the boy again said “kangaroos.” It could not have
been more obvious that the child was interested in being corrected.
He wanted to learn, as most children do when simply presented with
the opportunity. This child was young, but most likely old enough to
sound out the word “wall – a – bye” if he knew where to look,
and if he was made to realize that he didn't know the answer without
looking. But to do that, he would need an example to follow, a pair
of parents who had the tools to find out answers for themselves, and
cared to give their children the same.

The child looking to me instead of his
parents for that meager bit of instruction, I addressed him directly,
explaining, “No, these are wallabies. Kangaroos are big; these are
smaller.” And at that he turned to his parents and his younger
sibling to repeat it to them: “These aren't kangaroos, the man
says.” At that I was walking away, and I can only hope that their
son's claim finally prompted them to look at the sign and sound out
“wall – a – bees.” It was up to them to take an interest on
their own, but it seemed to me that the child, being a child, not
only wanted to know about these things in the zoo, but wanted others
to know about them too.

I experienced the same thing elsewhere.
In the crowded rainforest exhibit, I, being a nerd, spoke straight to
the capybaras, telling them that I just wanted them to know that they
are the largest rodents on Earth, and that that's awesome and they
should be proud. A young girl just beside me asked, seemingly of no
one in particular, "What are those called?" It could be that she heard
me demonstrating some knowledge of them and figured that I had the
answer, or it could be that she, like so many young children, thought
her parents would have all the answers she sought.

She had not spoken straight to me, and
that being the case, I would think that a scientifically interested
parent, one familiar with zoos, would say something like, “I don't
know, let me look at this information card over here so we can find
out.” The parents did not move, of course, so I turned to the child
and told her, “Those are called capybaras.” Naturally, she then
looked back to her parents and sought to inform them of what they did
not inform themselves: “They're called capee-bears.” The parents
did not repeat the information; they did not move to confirm it or
commit it to memory; they did not give her any indication that she
should feel proud of having learned something, that she should be
thankful for the knowledge, or that she should seek to learn other
things as well.

The desire to learn is so natural and
so passionate among children. How poorly we must regard it as a
society that students evidently end up so thoroughly dissuaded from
eager learning long before reaching the lower threshold of adulthood.
What standards can we possibly expect students to meet if we handicap
them in all the faculties that might prompt them to aim above the
mark? If this culture persists, the most likely solution is simply to
expect less of students, as has already become the defining feature
of decades of devolution in higher education.

In the future of this culture, we may
as well just rename familiar animals to match the absent
understandings of parents and their children. Having been to a couple
of zoos and aquariums in recent years, I've found that as far as
doting children and intellectually incurious parents are concerned,
every lemur is called King Julien and every clownfish is Nemo. This
really aggravates me. My best friend is terrifically fond of the
Niagara Aquarium, so I have gone there with her on several occasions.
Upon every visit, without fail, one can hear at least half a dozen
parents exclaiming, “All right, let's find Nemo,” or, “There's
Nemo.” I think I've heard the word “clownfish” used by a parent
to a child exactly once.

I have no doubt that some of these
parents are just lazy and find “Nemo” easy to remember, but I
warrant that a number of them may have good intentions. They're
probably trying to use pop culture as a way to facilitate their
children's interest in the natural world. But there's more than one
reason why this is misguided. For one thing, my several visits to the
aquarium have made it clear that children don't need some
secondary point of reference in order to take an interest in the
natural world, because the natural world is terrifically fascinating.
And that's especially obvious when you're a child.

So using an animated film as a way of
connecting with an aquatic exhibit is extraneous, but far worse than
that it obfuscates children's understanding of what they're actually
looking at. It disregards the separation between fantasy and reality,
it suppresses knowledge of the actual species name, and it encourages
children to understand the creature through an individual depiction
and not through objective facts. And then on top of all of this, for
many families the fixation on something that is recognizable from
fiction overrides the significance of everything else that's on
display. People walk in the door and say, “Find Nemo!” and they
breeze through ninety percent of the aquarium to get to something
that won't teach a child very much that he doesn't already know. If
they didn't immediately put that idea in his head, they might be
astonished by how much he doesn't care about the clownfish once he's
seen the solitary-social penguins, the balloonfish with their
glittering eyes, the sharks skulking past viewing windows, the
challengingly camouflaged rockfish, and so on and so on.

When parents almost thoughtlessly
constrain the purpose of visits to zoos and aquariums and museums,
they probably think, more often than not, that they are doing it for
the benefit of their children, that they are moving to retain a young
attention span and provide its owner a quick shot of enrichment while
they can. In fact, I think such parents and caregivers should
consider that they might have it all backwards and that the feelings
of stress and impatience are all their own, and merely projected onto
their children. They should concern themselves less with what their
children are looking to get out of the experience, and more with what
they themselves are after. If the answer is “knowledge, and lots
of it,” they can probably expect much more of their children's
interest in the moment. But they likely won't be able to go on
expecting it as those children age in the presence of a society that
doesn't care particularly much for learning.

Wednesday, August 29, 2012

I recently applied for a job in
Wyoming. It was an entry-level reporting position in a small town,
and it was advertised via an unusual posting that seemed to encourage
a unique cover letter from me. I delivered that, received a response
that may or may not have been a form letter, and, on its request,
replied with a confirmation of my sincere interest in the position.

The original ad put more emphasis on
the setting of the job than on the job itself, and the response
really drove that home, emphasizing that the remote location was “not
a romantic getaway by any means,” which “might not suit
everyone.” My cover letter clearly outlined how I had always hoped
to live and work in a remote location after graduating from college
in the big city, and that the job seemed perfect for me. In my
confirmation of interest, I disputed the notion that it wasn't a
romantic getaway, and made it clear that in any event it was a place
I could see residing happily, especially if I had a career to build
upon there.

The editor sent a form letter to all
still-interested applicants to the effect that she would have more
time to go over the applications after a specific date. A week after
that date she wrote to me directly to confirm that I was not
to be interviewed, and in that brief message, she emphasized yet
again the apparent insecurities of her entire organization regarding
its setting, and explained that she had found someone who she thought
would bring a lot to the paper while also enjoying the surroundings.

When I actually hear back from
no-longer-prospective employers these days, I am no longer shy about
pushing them to the limits of their patience in pursuit of
explanations, and in this case I was really confused. I wrote to ask
her if I had somehow given the impression that I wouldn't have been
able to tolerate living in the sort of remote region that I had just
used two sincere letters to explain that I specifically wanted to
live in. She kindly pointed to a specific line in my second message.
This was the comment that sank my application:

Speaking more generally, I'm not so
concerned with what the job or its surroundings can bring to me, as
with what I can bring to them.

Am I crazy for
being nonplussed by her reaction? That line came after two solid
paragraphs of explaining why the job and its surroundings appealed to
me, which followed upon an entire prior letter of the same, and yet
all of that was apparently wiped from this editor's short-term memory
by my decision to make the point that my values make me more
interested in doing a perfect job than having a job I consider
perfect.

I can't interpret
this in any other way than that I was refused an interview for yet
another job that I would have done fantastically well because I was
insufficiently selfish. The briefly-prospective employer has given
me the distinct impression that the job went to somebody whose
application placed more emphasis on how much he wanted someone to
give him that job, and less on how well he would perform its duties.

It's another
example of the seemingly backwards hiring practices that have been
dogging me for six goddamn years, and I took the opportunity to press
this person on it, writing back:

I've gotten a certain impression
many times over from people responsible for hiring. In your capacity
as such a person, which goal would you rank ahead of the other, if
you had to choose between them? 1) Finding someone who will do the
best job. 2) Finding someone who is least likely to leave the job.

I give her a lot of
credit for having been so communicative with me overall, but her
response to this question was pathetic:

It depends. I try to find a good
balance between the two.

Did I not make
myself clear? I know she tries to find a good balance between the
two. What I asked was which one was more important, and she simply
dodged the question, avoiding any acknowledgment that there is a
fragile value system at play in hiring practices. And though I can't
wrest a confirmation of this from anyone in a position to give it, I
consistently get the impression that human resource departments and
hiring managers are interested in finding people just good enough for
the open position that the company won't have to do anything to keep
that employee on board, because they'll probably never get a better
offer.

Other people that
I've known have been crippled in their job searches by this employer
culture, as well. Acquiring more qualifications often seems to harm
job seekers more than it helps – such as teaching at the college
level when one is looking for a career in early childhood education.
It's evidently not worth taking the risk on hiring a good educator, a
good writer, a good anything, if there's a good chance that their
ambitions extend beyond the position one is looking to fill.

Obviously
no one has admitted to this outright, but this most recent editor
rather distinctly suggested it. Her rejection of my application was
phrased so as to directly contradict the line that sank my
application, the one in which I said it was most important to me that
I bring value to the organization that hires me. She wrote, “The
job and its surroundings are to me much more important.”

Much more important than what? Than
the person you hire being a good worker, a talented writer, a
committed journalist, a person of decent character? All of that
takes a backseat to believing that the job and its surroundings are
exactly what the applicant wants and that nothing will tempt him away
from whatever you're to offer him?

Anecdotal evidence doesn't count for
much – you can always find some example that supports what you
believe about the world – but at the same time that I and others I
have known seem to absorb the damaging effects of these employer
practices, I know of one person who appeared to be decidedly on the
good side of them.

My ex-girlfriend never graduated high
school, having gotten a GED instead. When I met her, her work history was no
longer than mine. During the time that I
knew her, she routinely quit jobs without notice. I later found she
took the same approach to relationships – find something better,
sever ties immediately. Despite the fact that her resume didn't
suggest impressive qualifications and the fact that she probably
didn't have great references from prior employers, she had little
problem walking out of one job and into another.

Why on Earth was she capable of being
hired immediately, whereas if I applied for the same jobs my resume
would be rejected without so much as a phone interview? The only
logical conclusion I can come to is the same observation about
employer culture. I can easily imagine hiring managers looking at
her past history and deciding, “this girl doesn't have a lot of
prospects in front of her; we'd be offering something that she should
be truly grateful for.” They may have been wrong on both points,
as to her gratitude and her future outlook, but her mediocre resume
gave them good reason to believe that hiring her wasn't a gamble.

With every job I've had, my managers
have regarded me as having a work ethic that exceeds that of my
coworkers. My performance and responsiveness to training have been
roundly praised. The one time in my life that I got to work in an
office, I received a year-end bonus that exceeded that of the person
who had been promoted out of my position, even though I had only been
there for six months. Despite all of this, actually finding a job is
damn near impossible for me. I don't have a bit of doubt that I
would perform the responsibilities of any job that I applied for with
more competence and conviction than just about anyone competing with
me for it. But I'm nearly as confident that that's not primarily
what employers are looking for.

Of course, it could be that I'm taking
too positive a view of myself. It could be that I'm just a terrible
applicant. But I'm not about to assume that explanation in absence
of evidence for it, and I'm certainly not getting any from the sorts
of employers from whom I'm seeking jobs.

Before applying for this job in
Wyoming, I was rejected without interview for another one that I was
even better qualified for, and which was also out of my area. When I
asked why, the editor did see fit to get back to me, but her response
was utterly meaningless on the point of qualifications. She said only
that the person she hired “had what she needed.” But she also pointed
out that he had grown up in the area of the job, so I rephrased my
question and asked whether, if I'd had the same qualifications I do
now but had grown up in that region, I would have been at least
interviewed.

Her response still makes me angry, and
I expect that it will for as long as I struggle to have a legitimate
career before the end of my twenties. She wrote back with one line:
“Ed, I'm sorry. I'm not going to break it down.”

I had asked a straightforward yes-or-no
question. I was looking for some indication, even if perfectly
vague, as to whether my inability to secure a simple interview was
attributable to being underqualified, overqualified, or simply having
qualifications different from those that match the sorts of jobs I
apply for. I didn't ask her to address any of that, though. All
she had to do was say “yes,” “no,” or even “maybe.” To
do so would have taken less effort than it took to type what she did.

To date, I can't conceive of any reason
why she would respond that way, other than to be deliberately rude.
This is my entire life we're talking about, and all that a person
like her needs to do to give me a little more insight into why it
remains so far off the rails is to say either “yes” or “no,”
and she couldn't even do that.

I guess in light of that I should feel
very pleased with the Wyoming editor for putting forth the effort to
dodge my question in a way that at least seemed like an answer.
Maybe that counts as progress.

Wednesday, August 8, 2012

Last week’s issue of the New Yorker included an
article by Andrew Marantz in “The Talk of the Town” that I found unusually
inspirational. That article also included reference to a fact that I think is
deplorably neglected and under-explored: “… the Chronicle of Higher Education
recently reported that in the past few years ‘the percentage of graduate-degree
holders who receive food stamps or some other aid more than doubled.’” People
who are relatively familiar with my views on institutional education will
recognize this as fodder for my ire over the socially endemic assumptions about
the economic value of college education.

(If you want to get acquainted with those views,
please read this,
and this, and
this,
and this.)

Marantz went on to connect this situation to what
he says has been called the crisis in the academy, defined by the very
situation that I have been watching develop for years, in which the academic
labor market is so glutted with highly educated people that terrific scholars
are sometimes shouldered out of any sort of employment. Actually, Marantz – I
think just by way of a slightly clumsy transition – identifies the two issues
with each other, as if a need for public assistance and the absence of a
high-profile academic post were equivalent. There is a middle ground being
needlessly excluded there.

Still, both issues desperately need to be
addressed in their own right, and Marantz highlights two individuals who have
taken steps to combat the lesser crisis among would-be academics. Ajay Singh Chaudhary and Abby Kluchin
recognized a demand for education among people who could not afford either the
time or the money to take the relevant courses at universities, and they
responded by teaching their disciplines in cafés over the course of several
weeks, at a cost of a few hundred dollars.

Marantz calls their business venture, the Brooklyn
Institute for Social Research, “a locavore pedagogy shop,” and I think that’s
as good a term as any for what I expect is part of a trend in education which
will increasingly challenge the large, money-driven institutions that so many
students are finding deliver little in the way of outcomes aside from a
crushing debt load.

I can still recall how excited I was years ago,
when my disdain for institutional education was still in its childhood – not its
infancy, mind you; that disdain actually predates my NYU enrollment – when I
heard a story on the news about private genetic engineering labs that people
were running in their basements. After
my graduation, I began to advocate with particular verve for the outright
rejection of the formal institutions. I
wanted, and still want, people who legitimately care about education, to show
that commitment in their private lives by educating themselves and one another
and exploring in private settings those new ideas which might be suppressed in
the academy, in favor of the status quo.

At the time that seemed like an easy thing to
accomplish with the social sciences and humanities, but the idea of moving
physical sciences out of the institution and into more intimate settings seemed
quite challenging. Seeing evidence that
not only were people up to the challenge but that they were actually doing it
thrilled me and gave me great hope for the future of smaller scale scholarly
structures.

It’s been a long time, but Marantz’s article
finally gives me hope that the trend is continuing, and that it’s embracing not
only private experimentation and scholarship, but small-scale
education. With formal tertiary education demanding more and more
financial investment from students and delivering less and less
financial reward, as well as questionable educational outcomes, I expect
people to gravitate in growing numbers towards alternative forms of both
teaching and learning.

There are others in addition to the Brooklyn
Institute, of course. The internet
provides curious individuals with many opportunities to absorb lectures for
free and in their own time through uploads of actual college courses, video
channels designed for broad-based education, TED Talks, and so on. At least one company that I know of sells
entire college courses on DVD for students to acquire at a fraction of the cost
of tuition.

I fully expect more competitors to join in this
trend, and so I expect that education in the future will look much different
than it looks under the formal structures of today. Unless the costs
or the benefits of colleges and universities dramatically shift, the
schooling of the future will in large part be much more local and much
more collaborative. The alternatives that provide that character have
about as much knowledge to offer as the status quo, given the volume
of unemployed scholars. The only thing that they decidedly lack is
accreditation. But if degrees from accredited schools continue to
deliver such dubious
prospects for employment and financial security, what value will accreditation
really have?

