December 31, 2012

Sweet, naive, trusting, and above all likable, the ditz represented the extreme tail of the bell curve for self-consciousness, namely at the opposite end from hyper-self-aware. She was not socially awkward like a nerd, sperg, or other weirdo -- her wiring was fine, it's just that she was running on auto-pilot most of the time. Her mind was never blinded by the spotlight of self-monitoring, although that meant it would often look like the lights were on but nobody was home.

Quite a few female characters from '70s, '80s, and early '90s TV shows were, like, totally ditzy, however exaggerated the trait became as a series ran out of jokes. And they weren't just teenage girls like Mallory from Family Ties, Kimmy from Full House, and Kelly from Married with Children. They were in their 20s, like Chrissy from Three's Company, Hilary from The Fresh Prince of Bel-Air, and Lucy from Twin Peaks. And they were still spacey even in middle age and retirement, like Edith from All in the Family and Rose from The Golden Girls. The whole society back then was just a lot less self-conscious.

The fundamentally wholesome nature of the ditz can be contrasted with the sassy, jaded, cynical, and repulsive nature of the oh-so-self-aware diva who has become a cultural mainstay over the past 20 years. When the writers and actresses do attempt to portray a ditz, their heightened self-consciousness turns the character into a blackface version of the ditz, snarky and stupid.

In fact, the writers and actresses have opted more for a different character type altogether, what the TV Tropes people call "The Brainless Beauty". In their minds, self-awareness = reflection = reason = intelligence. And so, ditziness = go-with-the-flow = intuition = stupidity. Where the older type was scatterbrained, her replacement is downright moronic. She cannot therefore be likable, except in the sense of refraining from looking down on her because she can't help being an idiot, and laughing at her wouldn't be nice.

Looking through the examples of "The Ditz" in the TV Tropes list, it seems like most of the ones from the past 20 years are closer to the type who is dimwitted and unlikable, or at best inoffensive, like Phoebe from Friends or Karen from Mean Girls. The only genuine exception is Cher from Clueless; she was a real breath of fresh air in the '90s, the last wholesome-ditzy kind of girl.

Ironic ages are so poor at irony because the background level of self-awareness makes the delivery too on-the-nose. It requires sincerity to appear unrehearsed, and the ditz's delivery was beyond understated. No one did it better when it came to oblivious double entendres.

Is there something I can take out for you?

Does the decline of the ditz reflect a real-world change, or was the character type just some Hollywood fad? It sure seems real to me. I remember ditzy girls in elementary school (later '80s, early '90s), not to mention among the teenage babysitters that my friends and I had. And even though I didn't know them, random girls at the mall often gave off a ditzy vibe, that strange spacey look on their faces, like they were cruising through daily life without a navigator who's fully alert.

There were a few ditzes left by high school (later '90s), but at my school anyway they caught so much shit, it must have ultimately woken them up from the dream-state. Clueless and kind-hearted was out, savvy and sassy was in. By the 2000s, I don't recall any true ditzes among the students I tutored, or among my fellow 20-somethings, let alone among older women. I'm sure there are still a handful drifting around out there somewhere, but even if you wanted to find out where, they wouldn't have the presence of mind to send up a flare to let you know.

Although the ditz may be dead for now, I wouldn't be surprised if she showed up again the next time the crime rate starts rising for a while. I don't recall seeing ditzy female characters from the culture of the mid-century, although there are several in Fitzgerald's short stories from the Jazz Age. As a matter of fact, the actresses who played the two older ditzes from the last heyday (Edith and Rose) were both born in the first half of the 1920s, making them part of the Greatest Generation, who are the counterparts of today's Generation X and Gen Y, based on where they were born in relation to the peak of the crime rate. Ditziness seems to have largely skipped the Silent Generation, who appear to have been more self-conscious as teenagers. Ditto the Millennials, who are their contemporary counterparts.

In the meantime, let's remember how pleasant these girls were to interact with. The ultimate in easy-going, you could play any tune, and they'd happily dance right along. Indeed, their largely reactive nature in social situations required that the man take the lead, the exact opposite of the self-aware diva, whose courtship behavior consists of either bitchily pushing men away or sleazily seizing them by the arm. And although their scatterbrained behavior made them a little unreliable, their trusting and amiable nature still made them good friends and co-workers, especially compared to those who are unreliable due to their Machiavellian mindset.

In the good old days, even the least desirable of the common female character types was still harmless and wholesome.

December 28, 2012

What's so shocking about how sheltered kids are these days is not only the extent of their cocooning but just how rapidly the world changed. Most of our collectively recalled history comes from Baby Boomers, and therefore most of our images of how kids behaved before Ritalin and child safety seats take place during the 1960s. The Sandlot, The Wonder Years, and Forrest Gump -- that's the standard image of pre-domesticated childhood in the popular imagination.

Naively connecting the two dots of the 1960s to the 1990s and after suggests that it all went slowly downhill in the 1970s. And yet in this case as in so many others, the '60s were just the beginning for a phenomenon that would peak in the '80s or early '90s. Generation X and the mini-cohort just after them, Gen Y, have just as many memories, and just as vivid ones, as the Boomers do, so why is there such a huge gap in the popular imagination of what daily life was like 20 to 30 years ago?

I think the Boomers were simply fortunate to enter their nostalgic years during a romantic era -- the 1980s -- when no one would look at you funny for looking back on the good old days. And the not-so-good old days too -- witness the steady stream of pop culture about the Vietnam War throughout the '80s and early '90s. Now the Boomers were going to reflect on what it all meant from an adult perspective, not through a child's or adolescent's eyes.

Even the younger generation hopped on board, despite having no personal memories of those times -- Stand by Me is way more popular with Gen X than with the Boomers whose early years it depicts. I imagine the same was true for Happy Days, Back to the Future, and the whole rockabilly revival.

Nowadays, though, we have sunken too deeply into sarcasm, irony, antipathy, and rationalism for an honest, upbeat look back at the '80s to catch on broadly. It would have to be performed in blackface, like the musical Rock of Ages. But so what if it wouldn't be popular? You really wouldn't want to be in-demand during such snarky and sappy times.

So in my own little way, I hope this romp through pictures of the lives of children during the Reagan and Bush years will help fill a gap in the popular imagination. It'll focus mostly on how independent and rambunctious we were encouraged to act, and how eager we were to oblige. I'll probably have more posts to follow that cover other aspects of life back then.

I've related so many personal stories over the past several years that I thought it'd be better to focus on my younger twin brothers. That way it won't be biased by wanting to remember my own childhood as greater than it was. It will also show how sharp the divide is between Gen Y and Millennials, the first to be raised by helicopter parents. I was born in 1980, but my brothers weren't born until '82, and even they lived normal, exciting lives. These scenes are mostly from the late '80s and early '90s and show how late the trend lasted. They show both suburban and rural settings to emphasize how much "ain't that America?" there was even in the 'burbs.

First, here's one from early 1983, when I'm not even 2 1/2 years old, but still super-stoked to get to help Pap carry some firewood in from the garage at my grandparents' house. I think I'd already started to help chop it outside first, at least the smaller pieces.

This trip to the Smithsonian is, I'd guess, from the summer of '87. My brother and I are on the horn/plate area, but just look at how much the kids are climbing all over the thing. Also notice how few parents are in sight, and how those that are nearby are either paying them no mind (assured that the kids aren't stupid or clumsy enough to kill themselves) or looking on from a distance.

This next one is a bit hard to make out, but it looks like spring, anytime between '87 and '89. The distance alone makes it unusual from today's perspective of parents snapping pictures from up close. My mother kept herself hidden in the kitchen, like in some wildlife documentary, so as not to disturb their natural behavior.

They've got a plastic kids' table set up with what looks like milk and probably cookies for sale to passersby. Letting milk sit out in the afternoon sun? Rookie mistake, dudes. But they learned for the next time and started selling Kool-Aid. I can only assume they thought there was an untapped niche for those growing wary of the crowded market in lemonade. You don't ever see kids selling stuff on their own anymore, but it was nothing newsworthy in the entrepreneurial Eighties. They're discussing something, who knows what, but it's all taking place unmediated by grown-ups. If you don't learn how to represent yourself in a social interaction, you'll never feel comfortable around other people.

And now we turn to the real shockers. One of my brothers was born in camouflage, and here he is setting off on his first real hunting trip with two uncles, sometime in the late '80s. Yep, that's a real .22 rifle in the hands of a second- or third-grader. I don't think he hit anything himself, but he did bring back a grouse head to scare the girls at school with. Nowadays Pap would probably be fined or locked up for teaching the three of us to shoot at such a young age.

This one's from late 1988. My grandmother's caption reads, Let's go to the woods "Pap" -- we would always bug him to take us out there so we could run around, chop tree branches off, push big dead trees over, sever a thick vine to go swinging on out across the ravine, and whatever else would put hair on our chest. It wasn't the anarchy of spoiled brats given total license with dangerous toys -- we actually had to learn how to use those things properly or suffer the consequences. I'm carrying an ax, the center brother a pick-ax, and the left brother a hatchet. All that red... and not givin' a fuck.

The next two scenes are from their 8th birthday party in 1990. One of their friends aims a mini sawed-off shotgun at the other, who points back with a life-size rifle. And not a grown-up in sight to steer them away! Also notice the lack of a big orange knob at the muzzle, or any other garish colors and weird shapes that would signal right away that it's just a replica for babies. Kids' guns looked like real guns at one time.

The final scene from Scarface? Nope, just another party where we're pretending to blow each other away, point blank. Just think: that's one of my parents taking a picture of their own precious little dear with a gun pressed against the back of his skull! We indulged in so much more black humor back then -- everything's so damn serious and weepy these days. Everybody is a snarkmeister, yet nobody wants to "go there" or "be that guy". It's like the dork squad took over the whole country.

Easter 1991, when they're still 8. So many things going right here it'd make a modern parent's head explode: he must be 15 feet up in the air, he's distracted and only holding on with one hand, his brother isn't doing much to spot him on the ground, and as usual there are no grown-ups in sight.

I had a summertime job when I was 10, and my brother took a stab at earning a living at that age too. He's on the left; I'm there on the right to lend a hand. This is from just before the summer of '93, and he's set up a table selling baseball cards at one of those small-scale conventions that they used to have before people took it way too seriously and would only patronize Lord of the Rings-scale conventions -- i.e., for dorks. This was in a church basement, I think. You can't see it, but on the floor behind the table he's got his functioning cash register -- back then lots of kids had been bitten by the Alex P. Keaton bug.

Finally, here's one from somewhere between '92 and '94. The society was starting to move away from the "you can do it!" spirit and back toward the cult of managerialism last seen during the mid-century. But the decline had only begun; you could still find unsupervised activities that were technically illegal yet necessary for proper development. Pap let us drive his tractor once we'd gotten to be around 10 or 11 or so, and as you can see, it didn't matter if a pedestrian was chasing you or not. Shoot, they need to learn how to make a clean getaway while they're still young, don't they?

Ax-swinging little boy, can you be as far away as you appear? We flung ourselves into some weird parallel dimension and must squint to make out your face anymore.

December 27, 2012

As much as we may have gone against our parents' wishes or even explicit instructions back in the good old days, we never dared to disrespect them to their face -- let alone call them a swear word. Well, teenagers did seem more defiant back then, but definitely not toddlers or children who weren't big enough to hit back.

Since it never happened, I'm not really sure what our punishment would've been -- ax to the head, maybe? Impalement out in the front yard as a lesson to the other neighborhood brats? I don't know, and I'm glad I never found out.

Yet today it's quite common to hear 3-to-7 year-olds shouting commands and insults right in their parents' face, in public. Whether outside the nearby daycare center, inside Starbucks, or around the supermarket, I can't count the number of times I've heard some little squirt scream "Stupid!" "You give me that RIGHT NOW!" "STOP! You stop being a bad mommy!" One more taboo that has disappeared in my own lifetime...

Still, that's nothing compared to what you'll hear when they're around their family in a private space, where they know strangers won't shoot them a nasty glare, where they know their blood relatives will tolerate way more shit than the general public would.

So far I've heard "asshole!" and "little bitch!" from my nearly 5 year-old nephew when he gets angry during playtime. Well, your parents may not automatically spank you for cursing right to their face, but when you're playing with Uncle Agnostic, homey don't play dat. WHOMP!

I've had to give him three hard spankings so far, but if that only happens once or twice a year, he'll just write it off as the occasional cost of having to visit family. I doubt even a steady stream of spankings from his parents alone would correct him. It's the punishment they get from the outside world that sets them straight. If the other kids on the playground shirk their duty of beating up on the bratty kids, if the grown-ups refrain from walking over to pinch a little shit's ear in public, then children learn that it's OK to fuck around with other people.

I've tried to drive that point home by, perhaps hopelessly, trying to reason with my nephew -- that no one likes to be called bad names, so if he calls those big boys over there a swear word, they won't be like Uncle Agnostic, and they'll punch him in the face, beat him up, and never want to play with him again.

It's no wonder the Millennials continue to act so retarded and disrespectful well past their college years. Socialization grows out of socializing. A generation of micro-managed drones content in the snugness of their cocoons cannot turn out other than dismissive or mousy as adolescents and adults.

So pitch in everyone, and paddle a preschooler today. Let's all help to keep our community clean.

December 26, 2012

Every time I see my nephew, who's now 4 years and 9 months old, it crystallizes something I'd already seen more broadly but not as vividly. For example, children these days have almost no interest in playing on their own or with others their own age, especially if it's playing outside.

Instead, they want to stay indoors, and they keep bugging you to join in their activities -- not just here or there, and not just to bring something to your attention for approval, but like you have to play along with them whenever they get into something. And they don't just single out a particular family member -- everyone present has to drop the grown-up things they're doing and play with or supervise the kiddies.

Back in the '80s, all I wanted my parents to do regarding toys was to buy them and if necessary put them together. On special occasions they might take pictures, shoot a home movie, or have us speak into the microphone for a tape recording. But on the whole, we wanted to play with our toys, run around the yard, and play in the sandbox without adult supervision, let alone adult participation. Even better if we could take off altogether on our bikes, skates, or on foot, to go visit a friend's house, the park, the pool, the arcade, the mall, or any other place our parents did not control.

And our parents were only too happy to leave us be. They had socializing of their own to carry on, TV shows of their own to watch ("aw man, the grown-ups are hogging the TV for Jeopardy and Matlock again"), or time to just relax, free from the distractions of Hey look at this! and Hey look at that! Yeah, they interacted with us here and there, but mostly they left us to our world and we left them to theirs. The only times they would usually get involved were when we couldn't do something by ourselves, like reading us a book.

These kinds of dependent behaviors show that the social isolation of children over the past 20 years is not only due to helicopter parents struggling to lock up their kids, but just as well to the stunted goals of the kids themselves. In the good old days, if our parents wanted to imprison us when we really wanted to have fun, we might've just flown the coop.

December 24, 2012

During the lead-up to Christmas, you may have heard two polar opposite camps pushing the same message -- that Christmas is not a True Christian (TM) holiday, so it would be best if we just dumped it. Those groups are, first, the hardcore new atheist types, who think they're pulling the rug out from under the pretensions of tradition for a practice that only got going during the Victorian era; and second, the church of hardcore Christian Cosplayers for whom any deviation from authentic practice of ye olden days, like 1000 years ago or longer, is heresy.

The new atheist types offer nothing to replace our existing set of rituals, and can therefore be dismissed right away. Some fraction are just lamewads who don't want any special rituals in their lives at all, and the other fraction are those who accept some ritual component to life, but only if it's asocial or anti-social, such as camping out for Hobbit tickets or clawing your fellow man during a Black Friday looting session, respectively. Coming together to go caroling, non-ironically, would be beyond the pale.

But the Cosplay Christian objection must be taken more seriously, because they do propose specific holidays to focus on more than Christmas, whether earlier Catholic traditions like Advent or even further back like keeping the Sabbath, observing Passover, etc., as the Jews in Jesus' time would have. Whatever they include, they definitely exclude practices from the pagan or barbarian groups from the Medieval period and earlier. In particular Christmas and Easter would be out due to their pagan roots.

Still, we should keep traditions that are worth keeping. How can we distinguish those that do their job well as traditions from those that do not? One quick check is to see how well a tradition has fared over time. By this measure, following kosher food taboos would seem to be fair game for junking, as just about every religious group that descends from the religion of the Jews in Jesus' time has junked them. Moving from the realm of rituals to beliefs, the vivid pictures of Heaven and Hell -- of the land of the dead in general -- are among the most memorable and widely held beliefs, even though they're derived more from Proto-Indo-European and other pre-Christian religions, such as Zoroastrianism. Their success would recommend that we keep them.

In fact, when we look at our most enduring holiday rituals, it seems like most of them have pagan rather than strictly Christian roots -- not just the overall spirit of Christmas but specific activities like caroling, the rite of spring that accompanies Easter, the harvest-time indulgence of Thanksgiving, the carnivalesque role-reversal of Halloween, the calendrical rite of the New Year, and the patriotic celebrations of the Fourth of July. Or at least as these holidays existed up through the 1980s, before becoming more atomized and drained of spectacle. But hey, lasting all that time is still pretty damn good.

Is this part of a more general pattern? Indeed it is. Here is a wonderful, brief cross-cultural review of rituals and their relationship to societal structure. (They also look at how different parts of the ritual relate to each other, apart from social context.) The authors Atkinson and Whitehouse look separately at dysphoric and euphoric rituals -- the former are painful ones like initiation rites, and the latter are more enjoyable ones like public celebrations.

Their data come from a compendium of anthropological fieldwork, so they have no idea how long the practices have been around. But what appears to give a ritual a better chance of being passed on through the generations is a high level of arousal -- that feeling of elation, of getting pumped up. Rushing around the mall with only thoughts of gifts for others on your mind (while anticipating what you'll be getting in turn), the thunder and explosion of fireworks on the Fourth of July, getting sloshed and dancing the night away on New Year's Eve, taking on a different, more hell-raising persona on Halloween without fear of being punished for it later -- it's no wonder these traditions survived as long as they did: they're just so much fun! They lift you up out of your ordinary routine and throw you into a special experience.

What societal variables have an influence on how arousing a group's euphoric rituals are? Controlling for other factors, it was only the presence of classical religion (i.e. the major "world religions" like Islam, Christianity, Buddhism, Hinduism, etc.). And that influence was negative. As the authors discuss, these religions tend to have a more literate and doctrinal emphasis, not simply a set of energetic corporeal activities that everybody just joins in without knowing exactly -- or even remotely -- "what it's all about". World religions try to domesticate the wild frenzy of earlier religious rituals.

And yet just reciting the catechism is not enough to glue the adherents together. Look at how pitiful the track record is for the various sects of Marxism / Communism. They had their own guides written in question-and-answer format, derived from some higher authority, allowing them to be transmitted quickly and widely. Yet by not offering anything exciting to ever take part in, they failed to replace religion. Far and away the strongest repulsive force of leftist sects is their joyless treadmill of an alternative lifestyle, one damned meeting after another.

Unlike modern systems of doctrine, those spawned by the world religions could only have caught on by tolerating or even encouraging the adopters to carry on some of their traditional rituals. Today, although belief in those doctrines has been fading, the pagan rituals haven't suffered as precipitous a decline. (Though there too, people these days don't seem as interested in losing themselves in ritual -- they'd rather continue their boring, never-ending session of web-surfing.)

The power of religion to bind members of a community together and guide them in their daily lives cannot be denied. But as we grope our way forward in a de-Christianizing phase of history, we should keep our eyes open to what has served us well in the past, and try now as then to incorporate those rituals into whatever new system we converge toward. Again, the non-answers of the new atheists mean we can ignore them when trying to figure out what to replace existing rituals with, or how to modify them. At the same time, only an idiot would dismiss the Greek and Roman heritage as mere pagan philosophy and barbarian mythology, let alone the broader Proto-Indo-European tradition.

Given how uncertain this process will be, it doesn't make sense to chart out a blueprint and start ticking off the boxes once we've exchanged this ritual for that one, or modified this one in such-and-such a way. It will involve lots of trial and error, but then it always has. Still, we should remember that traditional traditions have withstood a stronger test of time than new-fangled traditions, and that we might end up relying heavily on the barbarians for our rituals, even as we rely on the civilized for our doctrines.

December 23, 2012

When people feel more like secluding themselves from the rest of their neighborhood, they think primarily about the kinds of spaces and structures that they'll be inhabiting on a daily basis. The built environment therefore shows some of the most striking changes over time, as societies cycle through more outgoing and more cocooning phases.

Earlier I looked at the rise of privacy fences over the past 20 years, and still have yet to keep my promise to cover the same trend during the mid-century. It was there; I just haven't felt like writing it all up yet.

Another example came from a source that wasn't even trying to brainstorm all the various ways that we started cocooning more over the past 20 years. Last night I caught an episode of That's So '90s on the DIY network, and it covered all sorts of trends and fads from that decade. One was the proliferation of compost piles -- basically a large, smelly heap of yard waste, food scraps, and critter shit.

They showed clips from '90s-era home & garden TV shows advising how to set it up, and they all placed them along a fence, i.e. next to your property line. Hey, what better way to drive away your neighbors than to pile foul odors right up against their supposed backyard sanctuary? When we started ours in the mid-'90s, I remember placing it along a fence, and I think the neighbors on the other side had theirs there too. Shoot, our compost piles socialized with each other more than we did!

Converting a small chunk of your backyard into a garbage dump also had the side effect of keeping you out of your own backyard. You holed up indoors not only to avoid the nearby compost piles, but your own as well. If everybody just pitched in to make the whole block smell like grunge, then we could all enjoy the feeling of never wanting to hang out in our backyards. Talk about neighbors helping neighbors.

Returning to a point I brought up in the post on perfume, we tend not to remember smells too well. A good number of books have been written detailing what different periods looked like, sounded like, and felt like. Taste and especially smell don't leave such strong memories, so reconstructing their histories is much harder to do. But when evoked, these memories -- or even vivid descriptions to someone who wasn't there -- can be powerful. Someone wrote that in the 1970s, New York City smelled like piss and sex. Well, in the 1990s, suburban backyards smelled like shit and mold.

Why are they not as prevalent now as during the '90s? Some trends of that decade represented an overshooting of anti-socialness. It was the acrimonious initial phase of the divorce of every individual from their community. After the split had been completely effected, we didn't need to act so hostile -- by now, it's understood that we don't want anything to do with each other. So although we are more fragmented now than in the '90s, we're at least more amicable than we were then, although obviously far less so than during the '60s through the '80s.

December 22, 2012

Was TV always so dumb and predictable? I've been tuning in to some classic shows after visiting home for Christmas, and it's striking how even ordinary sit-coms from the good old days show craftsmanship in their writing. Nothing mind-blowing, but novel and unexpected enough to catch your attention, and with the narrative threads woven together naturally.

This avoids the tedium and predictability of more recent comedy shows and movies, where a single major thread is pursued. Two equally important plotlines allow the possibility of a problematic interaction which is later resolved. With only one, the only things that can get in the way of the plot advancing are separate little obstacles that are eventually surmounted. These weaker, often contrived obstacles don't allow much tension to build up -- you sense that the main plotline is too dominant, and that it'll just bulldoze clumsily over the mini-obstacles. If there is a second plotline at all, it unfolds entirely independently of the main one. Family Guy on TV and Superbad on film are two examples of this recent trend toward comic narrative monotony.

And the unpretentious nature of TV and film from back then steered clear of the opposite extreme, where two supposedly strong and independent threads are headed on a collision course that you can see coming from a mile away. That announces the writer's authorship too loudly. It also defuses the unpredictability that comes from an episode where the two threads have no obvious relationship to each other. In this latter case, you're wondering, if they interact at all, how and when it'll all come together. It keeps things exciting.

An example of the telegraphed-crash approach is the movie Rushmore, where early on two main characters first develop an unusual friendship (one teenaged, one middle-aged) and then fall for the same woman. It's obvious how it will roughly end, since the woman is an adult teacher at the school where the younger guy is a student, while the middle-aged guy is a wealthy industrialist. Clearly she'll choose the older over the younger of the two. Whatever other little surprises this may allow, it still gives the movie a fatalistic rather than spontaneous quality.

I can't think of sit-coms that go to this other extreme, perhaps because the narrative structure is more suited to feature-length comedy-drama movies. I mention it mainly to show that older comedy shows maintained an artful balancing act, not merely choosing one extreme over another.

To give one quick example, I just saw a 1978 episode of Three's Company which begins with two separate plotlines: 1) Jack sucks up to Janet in the hopes that she'll give him, rather than Chrissy, a pair of Frank Sinatra tickets so he can impress a girl with a classy date, and 2) Mr. Roper buys a parakeet as an anniversary gift for his wife, but gives it to Jack, Janet, and Chrissy to watch over and keep secret until the day arrives. These threads have no obvious connection at all, and don't interact until near the end.

Jack has won the Frank Sinatra tickets and starts getting all swaggery as he calls up the girl to ask her out. In his excitement during the phone call, he plops down on the couch and squashes the box that the parakeet had been kept in. Mr. and Mrs. Roper show up so that he can present her with her gift, but with the bird dead, Jack quickly improvises and hands Mrs. Roper the concert tickets as the anniversary gift, with Mr. Roper playing along, his wife none the wiser, crisis averted.

In a final flourish, Chrissy reveals that she'd let the bird out of the box earlier, so that Jack has sacrificed his tickets -- and a good shot at scoring with a Swedish babe -- all for nothing. I think that was the gist anyway, as I got distracted at that point by the credits rolling in a pop-up box as the show was still going. Even re-runs these days are sliced up to fit in another 3 or 4 minutes of ads.

In any case, the interaction between the two plotlines was not foreseeable, and neither was the resolution. Instead of mindlessly marching forward, these ornamental twists and turns give the show a spontaneity and energy that's been lacking in more recent sit-coms. It's not the stuff of storytelling legend, but then it's only meant to be an enjoyable 30-minute romp, and surprises like that make it worth tuning in for. You don't really notice how boring the newer TV shows are until you watch something from a time of skilled narrative decoration.

December 19, 2012

It's taken surprisingly long for the idea even to occur to me, but I haven't yet looked into the history and cycles of perfume and cologne. It seems natural enough, after looking into the visual culture, music, literature, etc., to see if they too track the trend in the crime rate. Because scents leave no trace, it's hard to know which were popular when and among whom without first-hand testimony. Fortunately it seems like there's enough written down to piece together the history.

Before writing that up, though, I thought it'd be worth taking a quick look at what the industry's ad campaigns used to look like. Today, they're certainly the first ones we'd think of when calling to mind examples of trashy or offensive ads.

My earliest memory of fragrance ads is the Kate Moss campaign for Obsession. She's nude and looks to be about 13 -- which I didn't mind back in 1993, when I was also 13. But taking a second look nearly 20 years later, yeah, I see why everybody got so worked up over them, especially since they were plastered everywhere.

I never noticed any change away from that trend in the meantime, so I'd always assumed they were like that back through, well, I didn't think about it exactly, but probably back through the 1970s and late '60s.

It turns out the mainstream porno perfume ad is a product of the past 20 years. It's unrelated to the sexual revolution -- indeed, it sprang up just as young people began to have less sex and with fewer partners. Real life was becoming more prudish, so people wanted to look at really bizarre ads to shock themselves awake? I don't know.

An easy way to see these changes is to look at the ad campaigns for a single perfume over time. Ideally you'd want to look at the ads for many popular perfumes in a given year, and do so across however many years. But the quick way works pretty well too.

Here is a gallery that shows the evolution of the ads for Coco by Chanel, starting with its debut in the mid-1980s. You can find other popular perfumes by decade by searching Fragrantica and then doing a Google image search for their ads.

To anyone whose memories only go back to the naked pedo-looking Kate Moss, the ads of the '80s appear to come from an earlier, defunct civilization. The women are glamorous, mysterious, and playful, rather than cheap, obvious, and bitchy.

The main difference, at least for me, is how the '80s babes may come off as mischievous, but it's all in the spirit of good fun, wanting to challenge the men to approach them and see how well they can do. Their followers from the past 20 years give off a haughty and bored-with-you kind of vibe, while still showing way more of their bodies -- combined, they project a harsh warning of "look but don't touch". They're attention whores who don't need to interact with someone else to please themselves.

That focus on sassiness actually detracts from all the skin shown by the recent models: a half-naked woman who insists on keeping the man at arm's length appears uncomfortable with her own body. Just by looking, you can tell it'd be like making it with a sack of potatoes. The earlier pictures, although revealing less skin, show a more sensual woman.

Also note the total lack of shock value in the earlier ads. Even in the one from '88-'89, it takes a while for the mind to register that you're looking at a woman with no top on. It's not so on-the-nose. Desperate attempts to shock the audience, like the S&M, heroin, and generic prostitute imagery of the past 20 years, only show how vegetative the viewers' daily lives must be. Back in the '80s, you didn't need to shake them awake -- they were already rarin' to go.

If you're still skeptical, just see for yourself. Find some popular perfumes from previous decades with the Fragrantica search thingie, then check out their ad campaigns on Google images. Especially if you were born after the early '70s, you'll be surprised by how tasteful, if provocative, the perfume ads used to be.

December 17, 2012

Judging from the Lord of the Rings movie that I fell asleep during, the new Hobbit movie must be a real snoozer -- blowing a kid's movie up into a hundred-hour trilogy? Jesus.

Although I won't go see it, I was interested to hear that Peter Jackson has been trying to push for a new technology in recording and displaying the movie -- capturing it at 48 frames per second, or twice the standard rate. It's actually closer to the frame rate used for TV shows. By taking twice as many snapshots per unit of time, the result is more photorealistic.

And yet by pushing photorealism too far, it winds up looking merely like TV. Here's someone's attempt to re-create the effect as accurately as possible through a YouTube video.

Now, 24 frames per second has been the industry standard since the birth of the talkie era in the late 1920s. It provides just enough of a flow of images to be convincing, yet not so much flow that it looks ordinary. It conveys that the movie is something special, like the brushstrokes of a painting.

As it turns out, there was an earlier attempt to boost the frame rate of blockbuster movies, then as now to achieve less flicker and more realism. When would you guess that was? That's right -- the 1950s. Oklahoma! (1955) and Around the World in Eighty Days (1956), both of which landed in the top 10 at the box office, were filmed at 30 fps when the Todd-AO process began taking the industry by storm.

They soon dropped the technology and returned to 24 fps for their subsequent hit movies like South Pacific, Cleopatra, and The Sound of Music. Based on the tepid reaction to the Hobbit's high frame rate, I assume it too will become marginal after perhaps another couple of trials.

The approach to movie-making in the mid-century might be best described as the Bombastic Ordinary. And in so many ways, movies of the past two decades have slowly revived that approach: not only taking a stab at higher frame rates, but also running times well over 2 hours, epic themes and plotlines overflowing with backstory, a long depth of field that displays too many distracting objects in clear focus, the worsening of that problem by the use of 3-D, other sensory gimmicks like vibrating the theater seats, and dull visual effects, whether the rubber suit of many a cheesy '50s sci-fi flick or the porridge-like CGI of today.

The general audience in both periods has so little appreciation for the unusual, the sacred, or the sublime that no profit-seeking movie producer would make something that stood out as visually special. It has to look as much as possible like everyday real life. And yet audiences "won't leave their homes" unless there's some kind of different experience, hence the bloated narratives, pretentious acting, and fantasy settings that don't feel tantalizingly exotic, but too remote to get truly lost in.

Our Neo-Fifties zeitgeist is easiest to see in the visual culture because it's palpable rather than abstract, and because the visual culture of the past is fairly well preserved, unlike fuzzier social trends. Those who refuse to believe that teenagers these days have such low sex drives cannot wave away how similar our movies look and feel to mid-century movies.

A girl living in our house moved out this weekend, and I'm reminded again of how little they do to maintain a common, shared, or public space.

Usually the person moving out will clean the bathroom or something one last time, trying to leave it the same as or better than they found it. Nothing from this one, though, and she never did clean any of it during her semester-length stay here, other than Windexing the mirror once.

But with people moving in and out somewhat often, it's impossible to enforce a regular system of maintenance. You just don't know each other well enough. The only time I've had chick housemates who contributed to cleaning a common area was when our landlord, a grad student living in the house, made it a condition of renting -- you would be part of a weekly rotation of cleaning the bathroom.

Don't even look at, let alone enter, the bathroom shared by strangers in a freshman girls' dorm room or suite. They're so insistent on not playing the role of Cinderella to the wicked step-sisters that no one ends up pitching in at all. So gross.

Then there's women's public bathrooms. Sometimes when the men's is occupied, I'll go into the women's instead, obviously only when it's made for a single person and locks. Guys, you wouldn't believe how much junk you'll find lying around on the floor there -- crumpled paper towels that missed the trash can, unused paper towels that must have come out too many at a time from the holder, random sheets of toilet paper, and the occasional bits of trash that also missed the trash can.

When a woman causes a piece of junk to fall to the ground in the bathroom, she isn't very likely to pick it up like a man would. "Oh shit... well, fuck it, somebody who works here will pick it up. I mean, it's like what they get paid to do." And of course none of the women after her will pick it up either -- "Who do these bitches think I am, expecting me to pick up their trash after them?"

A guy walking into such a situation is more likely to get angry that a bunch of slobs have messed the place up. He'd chew them out if he could find them, but since that ship has sailed, he'll just take one for the team and pick up some of the junk. It's a dirty job, but someone's gotta do it.

A men's public restroom almost always looks nicer because those who use it feel like they're part of a team they haven't even met, and they're all willing to pitch in here and there to maintain the facilities. The only way it can get bad is if bums are allowed in -- they're almost always male, and one bum can really fuck a place up.

Sure, a slightly messier bathroom isn't the end of the world. But these behaviors generalize across all public, common, or shared spaces. Women are adapted to the private or domestic sphere, which includes only themselves, their offspring, any of their own kin living there, and a boyfriend or husband if he's around too. They're great at keeping their home clean if it's their space. But once it becomes a space shared with anyone other than kin or a boyfriend/husband, lots of luck with having them join the teamwork.

Men are the opposite, more adapted to larger and broader social networks involving all manner of genetic strangers, and stretching far away from home base. So while they may tend to let their personal desk or their entire bedroom become too cluttered, that's only because their mind is on the maintenance of so many different spaces that they frequent. They're willing to let their own personal space suffer a bit, as long as they're pitching in to keep their various public spaces in decent shape. Besides, if they're living with a female family member or a girlfriend/wife, she'll probably clean up his spaces in the house, however annoyed she may get.

I don't think it's correct to see the difference as entitled behavior from women and generous behavior from men. If women are cleaning up after the men they live with, while the men take care of public spaces, then the two complement each other. But once you get girls living with housemates -- male or female -- then they start shirking their domestic-sphere duties, while also enjoying the efforts of males to preserve the public spaces.

So, it's females being single and childless for so long that lies at the root of this parasitic problem.

These examples are some of the most vivid you can point to whenever you hear someone complaining about how under-represented women are in any domain that involves teamwork. "If women ruled the world..." -- then our parks and plazas would become as crummy as a women's public bathroom.

December 15, 2012

Why, of all possible speech abnormalities or eccentricities, does the lisp stand out among queer speech?

There is no single form of a lisp; it's just a generic term that captures all sorts of messed-up ways of pronouncing sibilant sounds like "s" or "z". So objecting that gays don't produce a sound exactly like "th", only something close to it (a "hyper-corrected" form), is just trying to wave the problem away. They obviously have some very distinctive error in sibilant pronunciation, and to the ear of the average person the closest thing it sounds like is "th". Hence the common phrase "gay lisp".

Moreover, I've heard the gay lisp even among Spanish-speakers, whether from Spain or from Latin America, and in a handful of East Asian languages like Mandarin Chinese, Korean, and Japanese. So it's something that strikes them regardless of their native language. It is not a language-specific thing like choice of slang, but a common involuntary disruption to queer speech.

Here are some clues that the gay lisp is another part of the larger pattern of gays being stunted in childhood:

As a functional speech disorder, lisping has no clear known cause. It is often referred to as a speech delay of unknown origin. . . .

Lisping is also associated with immature development. Some children will adopt a lisp as a means of gaining attention. Other children will begin to lisp after they have experienced unusual stress or trauma. This behavior is part of a regression into a more secure period and can include other types of regressive behaviors such as bed wetting or wanting to sleep with the light on in the bedroom.

First, it is a delay, meaning stunted growth along the natural developmental path. It's not any old error like pronouncing "b" like "k". It's the kind of error that a stunted child makes, although most normal children will outgrow it.

And second, a child may speak with a lisp not because they got stuck in an immature state, but because they are regressing to it after already having reached the intended, mature state of pronunciation. Regressing in order to gain attention -- reminds you of every faggot drama queen you've ever run into, doesn't it?

The review says that lisping tends to disappear for most children after a trouble period of ages 4 to 8. That reinforces the claim I've been making throughout this series of investigations into gay Peter Pan-isms: that they're stunted after toddlerhood but before older childhood, when they're on the cusp of puberty. They are in that resolutely "girls are yucky" phase of elementary school before 4th or 5th grade, when boys start to mature more socially, emotionally, and physically.

Finally, lisping is neither an exaggerated masculine nor an exaggerated feminine way of speaking, so we can rule out the popular yet wrong-headed views of male homosexuality that see it as hyper-masculinization or feminization. Females are more neotenous, so it is easy to confuse feminization with infantilization. But then along come cases like the gay lisp or the lack of a nurturing instinct that conclusively point to stunted development as the root cause.

December 12, 2012

I first remember the term "hooking up" going viral in the early 2000s, when groups of promoters hit the busy areas of our college to advertise for an internet dating site, campushookups.com -- surprisingly no longer online. It sounded like a code word for a one-night stand, like "wanna get laid this weekend" dot com.

Ever since then it's evolved into a vague term more like "fooling around" or "making out" that involves some minimal amount of activity, but leaves how far it went unspecified, partly for privacy and partly for making it sound like you got further than you actually did.

To put their fingers on what young people actually mean by the phrase, the authors of this article asked a group of college freshman girls about their most recent "hookup," using that term explicitly. These girls were born around 1991-'92.

What went on? No shock that 98% involved kissing, but touching the breasts? -- only 67%. And scarcely half involved genital touching outside or underneath clothing. Merely 27% got as far as intercourse, and ditto for oral. Of course, oral and intercourse aren't mutually exclusive, but they do co-occur pretty highly today among young people who are sexually active, so maybe 1/3 involved either-or.

Well, there you have it: "hooking up" means you certainly kissed, probably did not go all the way or get a blowjob, and had a 50-50 shot at reaching third base, with somewhat better chances of at least making it to second. So the next time you hear or read some college kids casually talking about how Chase and Becca hooked up over the weekend, don't be so scandalized. It's just another case of the dork squad trying to embellish their experience-free experiences.

The article makes some other interesting observations. Girls hooking up had an average of three drinks beforehand, with 64% having at least one. That echoes a point I made using data from the Youth Risk Behavior Survey -- that kids these days are drinking less, except before having sex. Part of that is because Millennials are so afraid of taking risks and so afraid of real human interactions that they need to get pretty pissed just to get kissy-kissy with each other. But it also means the physical act itself (drunk sex) won't be that spectacular, which is crucial to making sure that it be forgettable, not something magical that might attach them emotionally to another person. They just want to scratch their itch (evidently not even that deeply), and get right back to their busy schedules of playing video games and refreshing Facebook to find new comments to like.

As for who their hookup partners were, only 14% were strangers, while 47% were friends, 23% acquaintances, and 12% ex-boyfriends (lol). Not only were their partners already well known to them, they tended to stick with the same known individual -- 44% said it wasn't the first hookup with them. So it's not as though they're confining themselves to their social circle while still playing round-robin. It's another sign of the Millennials' cautiousness, low trust, and monogamousness.

A review article on dating and mating among Millennials came to the same conclusion about how uncomfortable it feels to young people to lock eyes with a stranger and get into the mood of ships that pass in the night:

To meet eyes with a stranger: weird

As for how Millennials find people to date, Rhoades and her colleagues found there is a lot of online dating after college. But while in college, people meet mostly through friends or at clubs or parties. But even in those places, they meet through a group of friends and acquaintances. Millennials are far less likely than those of previous generations to go where singles hang out or date someone they meet simply by chance.

"This generation is so socially connected to each other and the world because of technology that the idea of dating someone you meet on the bus while commuting to work seems pretty far afield. They want to be connected to the person they date in some social way," says Rhoades. . . .

Bogle teaches a class called Love, Marriage, and Parenting and says her students don't see the romance in having their eyes meet a stranger's across a crowded room. In fact, they think it's weird. "They felt it was far more normal to meet someone on the computer, rather than to meet a stranger that just happens to be in the same public space as you are," she says.

The verdict is in -- spontaneity is creepy. You know, then, that the girls aren't feeling butterflies in the stomach, falling head over heels, and so on. It's a pure quid pro quo -- I'll handle your junk if you handle mine -- with a partner chosen to maximize the convenience of the transaction.

Drained of all possible passion, it's no wonder that the girls hooking up in the first study don't feel blown away by the experience, giving it an average rating of 5 out of 7, compared to the 6 out of 7 that they gave to their romantic interactions with a long-term partner. Physical activity within a relationship also went farther, with 56% of events going all the way, and involved almost no alcohol to dull the sensation (average of 0.5 drinks beforehand, and only 16% having at least one).

This comparison is important to deflate another lame attempt by Millennials to glam up their boring lives -- i.e., that they're too busy having fun and playing around to settle down into a long-term relationship. In fact, those hooking up are not rounding as many bases and are not getting as much enjoyment out of their encounters as those in a real relationship. Far from being an extraverted, wild-child form of behavior, hooking up is more the choice of numbed-out cocooners.

December 11, 2012

Why are some middleman minorities bitterly despised and others fondly admired? I took a crack at this problem in an earlier post comparing the fates of the Parsis (a commercial and professional elite group of Persians in India) vs. the Ashkenazi Jews and the Chinese. In short, the more charitable the elite minority group is toward its host population, both in spirit and in deed, the more they are valued as a national treasure.

The danger that a host faces in bringing in guests is that they could take advantage of his hospitality. By proving themselves to be altruistic rather than parasitic, the Parsis not only have alleviated those fears, but have given their Indian hosts good reason to indefinitely extend their welcome to their Parsi guests. Indeed they are taking whatever measures they can to keep the Parsis from disappearing due to falling birth rates: "The Indian government is to fund new fertility clinics to help save its dwindling Parsi population which is now under threat of extinction," reads the sub-headline of a recent article.

Can you imagine the government of any Eastern European country investing taxpayer dollars into boosting the fertility of its Ashkenazi Jewish population? Or Southeast Asian governments doing so for their Chinese populations? The Parsis, however, are the targets of an anti-pogrom -- a concerted, collective attempt to send their numbers soaring.

This comparison just goes to show that there's nothing inevitable about an elite-status minority of foreign origin becoming the objects of envy, hate, and violence.

Still, couldn't the Parsis have some major faults, some big cost they impose on their hosts that is simply outweighed by the even larger benefit they provide? If they did, it would show up as some kind of scandal. So I searched Google for Parsi scandals -- and found nothing. Neither in India nor in America (where many reside), and neither in the present nor in the past. There has to be something, but it is apparently so rare in frequency and so minor in severity that it does not show up in Google searches. Leave a comment if you discover any.

Jeez, even the Anglosphere has its share of scandals, whether in business, politics, religion, or entertainment. And you figure that if such bad behavior had truly taken place among the Parsis, the Indians would've noticed and blown it into a national scandal. The country is riven by all sorts of ethnic faction lines, so people are quick to notice when one of the many Other groups steps out of line -- it only proves how irredeemably wicked they are, how unlike Us they are. If no one has spoken up yet, that must be because the Parsis are close to model citizens.

About the only scandal involving them I can think of is the news that Freddie Mercury, the lead singer of Queen, had contracted HIV and soon died from AIDS. Even this was not a scandal of parasitizing his host country, but more of a shocking revelation about what he'd done off-stage.

In contrast, the history of Ashkenazi scandal is too unwieldy to tightly summarize. (The parallel case histories of the Chinese are omitted to save space.) It seems to have gotten worse after Jewish emancipation in Europe in the 19th C., just as blacks became more disdainful and disruptive of mainstream America after they'd won the desegregation and Civil Rights movements of the 1940s through the early '60s. They felt their hosts getting softer, and so felt more emboldened to press their own demands.

The 19th C. timing also reflects the rise of industrialization, which allows Jews to exercise disproportionate influence -- there were suddenly so many industries that reached up to a national or international scale. And that is the quantity that must be measured -- not the number of scandalous figures per capita among Jews, but this probability multiplied by the damage caused by the scandal.

As a brief reminder of this history, in the Victorian era there was Karl Marx, co-founder and arch-propagandist of Communism, to be followed by scores of early-adopter Jews in Communist movements throughout Europe, culminating in Leon Trotsky. In just about any financial scandal, you can expect to find a good number of Jews, right up through Michael Milken, Bernie Madoff, Lloyd Blankfein, et al., of the present day.

Just about all of the atomic spies for the Soviets in the Ozzie and Harriet years were Ashkenazi Jews, including the infamous Rosenbergs. Even worse, the spies were born and raised in America, not recent emigres who might be expected to have low attachment to the country. So much for gratitude toward the hosts of your birth country, eh? The same sorry charade played out decades later once Jews saw Israel as the up-and-coming state to spy for in order to score ideological points, not those has-been Communists. Jonathan Pollard is just the tip of the iceberg.

In addition to individual opportunism and organized subversion, Ashkenazi Jews are also notorious for sex scandals striking their religious communities. They're fortunate not to be a very religious people, or there would be even more. There are repeated such scandals in New York City's Hasidic community, including the recent conviction of Rabbi Nechemya Weberman for molesting a girl he was counseling, starting when she was 12. Nor are such scandals confined to America (link):

Another major Jewish organization in Australia is embroiled in a child sex abuse scandal, adding to the trauma triggered by recent revelations of similar cases involving students at schools in Melbourne run by Chabad-Lubavitch and Adass Israel.

Although not as scandalous as child sex abuse, Ashkenazi Jews have also played a disproportionate role in the American porn industry, both as producers and performers.

Jews may be joined by gentiles in all of these scandals, though not nearly to the same degree. This makes the scandalous activities of Jews harder to ignore and to rationalize away once discovered. The contrast with the Parsis makes this harder still -- why don't the Parsi priests use their influence and trusted positions to molest 12-year-olds? And why don't the Parsi professionals spy on behalf of their co-ethnics back in Iran? And why don't Parsi merchants and captains of industry try to leave others holding the bag after a financial screw-up, or stoop to peddling smut for a living? It's almost as though they held sexual relations sacred, citizenship sacred, even their career sacred, drawing a line that you just don't cross.

In our increasingly profane, trivial, and selfish world, it can be no wonder that their hosts are pulling out all the stops to rescue the Parsi community from demographic decline.

December 8, 2012

In The Complete Directory to Prime Time Network and Cable TV Shows, 1946--Present, there's a list of primetime TV theme songs that made it into the top 60 on the Billboard singles charts. These are original songs, not existing ones adopted by a new TV show. In the 2007 edition of the directory, the authors have this to say:

Increasingly this has become a historical list because TV themes are disappearing, a victim of very short breaks between programs and the desire of networks to give viewers as few reasons as possible to tune out. The days of the long, leisurely theme song that set the mood -- and perhaps the premise -- for a show (remember the theme for Gilligan's Island?) are long gone. Many of today's hit series, such as Lost and Grey's Anatomy, have no theme at all.

I don't keep up too much with TV, but I had no idea it'd gotten so bad that some shows had no theme song at all. From what I have watched, it sure seems like TV nowadays just sounds boring. Not just the absence of a catchy theme song, but it doesn't seem like music plays much of a role throughout the episode either. Original songs, existing songs, instrumental score -- you hardly hear any of that, like you would have in Miami Vice or The Wonder Years.

Because the lack of music extends throughout the episode, it isn't the result of shorter breaks between programs.

In fact, TV shows are not the only media that have seen their music disappear. Movies used to have hit soundtracks, not just the score but the set of songs that punctuated the action. There was that one lame song from Titanic that became a hit in 1997, but you'd have to go back to 1992 to find several movies that were top 10 at the box office that people remember for their music -- Aladdin, The Bodyguard, Wayne's World, maybe Basic Instinct. Throw in Forrest Gump from 1994 too. But the past 20 years have generally seen music evaporate from the movie-going experience. Either it's not there, or it's not memorable.

Porno movies used to have an original soundtrack too -- nothing special of course, but it did its job of setting the mood and helping you to forget yourself. Sounds better than some fake screeching from the girl and dorky remarks from the male crew. Again, no big loss to society; the point is to establish the breadth of the disappearance of music in visual media, all during the same time frame.

Shit, MTV hasn't played music on television for 15 years.

And what ever happened to catchy jingles in TV commercials? "The best part of wakin' up, is Folgers in your cup!" Good ol' 1984, man. No one expects commercials to sound mind-blowing, but at least don't make them grating on the ears ("Aflaaaaac!"). Now the only ones that don't sound annoying make use of oldies from the '80s, like a recent ad for Reese's Peanut Butter Cups that featured "(Feels Like) Heaven" by Fiction Factory.

And of course pop music in its own world has become progressively lamer over the past 20 years. Musicians just can't do what they used to, whether it's in their own songs, jingles, TV theme songs, or movie soundtracks.

Now, onto those theme songs that became Billboard hits. Below are two charts showing how many songs charted, by 5-year periods. The number shows the highest position it attained. I included more than one recording of the song, by different artists, if they came out during the original run, but not if they were covered later on. That only excluded a few; most theme songs were hits when the shows themselves were hits.

Note that there are no entries after 1995, when the Friends theme caught on. Again this is from the 2007 edition, though I doubt anything in the last 5 years has charted. Technically, "Bad Boys" came out in 1987, but was not famous at all. It only charted in 1993 after it was released as a single following the popularity of Cops. So it's more or less an original theme song -- one that just about all audience members would not have heard before.

Almost all of the hit theme songs come from rising-crime times, no surprise there. They don't steadily rise in frequency, though, nor would an index that weighted each song by its chart position. The peak is the latter half of the '70s, both by frequency and by how highly the songs ranked.

I think if you added in catchy ad jingles, porno music, and especially movie soundtracks, you would see a steady rise through at least the mid-'80s. It's tempting to conclude that within the rising-crime surge of TV themes, the periods of greater production were when pop music wasn't quite as happening, leaving more musicians free to write the TV themes instead.

You do have to control for how good the rest of pop music was, though. If it was great, then a halfway-catchy TV theme probably would not have been able to defeat the competition for a spot on the Billboard charts. There were plenty of enjoyable theme songs like those from Cheers, Family Ties, Golden Girls, Alf, just to name a few, that could have squeaked in if the '80s hadn't been churning out one great pop song after another. But when the catchy theme for Laverne and Shirley hit the charts in 1976, it only had to contend with Barry Manilow, Bay City Rollers, and Wings.

Heh, that's as good a note as any to end on. Just goes to show how boring the culture has become that it can't even make a groovy, upbeat theme song anymore.

December 6, 2012

We live in a culture plagued by hover-handing, where a guy is too nervous to touch a girl with his hand in a situation where it's not only permissible but required by basic manners. So although it comes off as rude and makes him look awkward, he just lets his hand float somewhere near the girl's body. This site has some of the more famous examples, while this one is updated with depressing frequency.

Last year I reviewed evidence from pictures of real life that the young people of the mid-20th century also appeared to be flagrant hover-handers. I hardly found any such pictures of young people from the '60s through the '80s. My basic conclusion was that these changes over time reflect how sociable vs. cocooning people are -- outgoing people are more comfortable being touchy-feely with one another.

But after looking more into the phenomenon, I've noticed that there seem to be two different contexts where it occurs, one with slight and one with major offenses. That variation ought to give us more clues about what's driving the behavior.

The first situation is where a guy has met up with some way-out-of-his-league chick who he will never see again, let alone get the chance to touch again. Often he's an autistic nerd at some comic book or video game convention who has paid to have his picture taken with a famous actress, or who decides to snap a picture with some of the hired guns ("booth babes") paid to make the convention look less slovenly and ugly.

In these cases, the guy tends to commit only minor hover hand. I guess he feels like he paid whatever it costs for the actress picture, or when would he ever see the booth babe again -- there'd be no way for him to feel embarrassed by her later, or for her to spread gossip about him to anyone in his social circle. Granted most of these pussies aren't going to fully contact her, but short of that they're going to go for the gold. Look through a lot of the pictures at the two sites linked above, and you'll see. Here are some examples:

The hand is as close as it can get without touching, including the fingers that curve in a cup-like way to stay pretty close to the surface.

Unlike these fly-by-night situations, where he could suffer no consequences, the second situation involves a guy -- usually not even a nerd, but someone from all walks of life -- and a girl who he's connected to in some kind of long-term or lasting relationship. It could be boyfriend/girlfriend, close friends, prom dates, clubbing buddies, or any combination. The setting is some kind of highly sociable event like a house party, school dance, beach, bar, or nightclub. They're having their picture taken to capture their feelings for each other.

Oddly, these cases show the severest degree of hover-handing -- and only by the guy. Often the girl is leaning toward or into the guy's chest, perhaps pressing her head against his, and non-awkwardly touching him with her open hands. She's always smiling and gives no loud signals of feeling uncomfortable touching him or being touched by him. Yet when he's with girls he knows, today's young male holds his arm stiffly, even farther away from the girl next to him. Again, look through those sites to see for yourself. Here are a few examples:

Now the hand is much farther away, nowhere close to the surface, and the fingers may even be partly balled up so that even if his hand did accidentally graze her body, only the outside of his hand would do so, not the sensitive inner surface.

Why is hover-handing so much worse when they're socially closer? He must be more concerned about how his level of hands-on-ness could affect his reputation in the future. That's something that the nerd paying for a picture with Summer Glau doesn't have to worry about. The guy in the second situation doesn't want to become known in his social circle as an octopus who paws every girl he stands next to. And what better way is there to signal how unlikely you are to get touchy-feely than by refusing to clasp the shoulder even of your close friend or girlfriend? By comparison, all other girls would stand less of a chance still of being touched by you.

The dating and mating lives of young people have become much more monogamous over the past 20 years (see the Youth Risk Behavior Survey). A young guy trying to pursue a promiscuous strategy these days is not going to get too far, at least compared to where he could've gotten in the '60s, '70s, or '80s. And young guys themselves pick up on that in real life after a while. When they try to place their hands on the girl who's wiggling her butt around in their crotch at a dance, and she either removes their hands or simply bolts off back to her friends, they eventually learn that the ubiquitous attention-whoring of their female peers does not actually lead to sex.

So now that girls insist a lot more strongly on monogamous commitment than before, they're going to want to see some honest signals. One easy way for a signal to be honest is for it to be costly. And it's harder to imagine a more costly behavior than a young guy restraining himself when he could be touching the bare skin of some babe who's already pressing herself against his body. Like, how could she expect him to cheat on her if he's not even comfortable touching a girl he's already gained the trust of and may be dating?

I don't think it's a conscious, Machiavellian deception by the guys either. They appear genuinely awkward touching girls they know, not like they're just faking it for the moment, and then they'll turn around next week and cheat on her with several different girls. Rather than guys consciously altering their true preferences, it looks like girls are just choosing a different kind of guy, someone who deep down gets uncomfortable laying his hands on a girl's body. That would certainly fit with all the other lines of evidence that females are starting to choose the doormat, good-provider, doofus dad type for their long-term relationships.

This signal does not seem vulnerable to being faked either, like if some promiscuous guy just forced himself into a hover-handing gesture to gain the girl's trust, and then loved her and left her. After all, it takes more than one instance of hover-handing to convince the girl that you're a low-libido beta-male. It's just one more hoop the guy will jump through as part of the ever longer, blue-balling courtship process of the 21st century. A truly promiscuous guy is going to figure that all of that isn't worth the trouble, and try to get a different set of girls in some other way.

And of course this idea explains why hover-handing was so common in the mid-century: aside from being a less sociable culture, it was also one where females insisted on males being meek, monogamous providers. Not like what the Flaming Youth had experienced during the Roaring Twenties.

There are other types of societies where men go to great lengths to not touch women, like the tropical gardening (horticulturalist) groups where men hang out only with each other, and seclude women and their polluting sexuality, except once in a while when the business needs to get done. However, in those places the men are incredibly violent, boastful, and gaudy in appearance. Highland New Guinea, for example. They are cad rather than dad societies. The guys above, though, are peaceful, meek, and drab. So ours doesn't seem to be a case of the Fear of Women phenomenon.

A falling-crime period almost by definition is one that feminizes men. In earlier stages of pacification, maybe that was a good thing (or not), bringing men from incredibly violent extremes back toward the center. But at least since the 19th century, our falling-crime periods, having already made their largest advances, are taking us from a moderate level of masculinity toward outright feminization -- the Victorian era, the mid-century, and the Millennial era.

An upward tick in the crime rate back to where it was in the '60s, '70s, or '80s would keep us very far from savagery, but move us back far enough to let men enjoy a basic level of guy-like behavior. Now they're so warped that they feel paralyzed just touching their own girlfriends.

December 5, 2012

It's strange how, over the same period that Christmas has become de-sacralized, we nevertheless begin to see and hear about it much earlier than before, giving the holiday more exposure time.

When I was little, Thanksgiving and Christmas had a buffer of maybe a couple weeks where we were in the orbit of neither the one nor the other. Now there's stuff up for Christmas even before Thanksgiving, but certainly in the days after, the songs, decorations, and everything else start to go up almost immediately. I remember getting our Christmas tree pretty close to Christmas day, maybe a week or two at most beforehand -- and I don't recall seeing trees sold any further back. I recall putting the ornaments on as a Christmas Eve tradition, although our stockings might have gone up a week before. Ditto for buying egg nog, candy canes, and Christmas stuff in general.

In Christmas Vacation from 1989, they don't show the preparations unfolding right after Thanksgiving. It seems like it's set well into December, with Thanksgiving nowhere in sight, all of it taking place no more than one or two weeks before Christmas. Not until 1993 did newspapers begin mentioning Black Friday behavior.

What's the connection between an ever lower appreciation of and respect for Christmas and a seemingly greater length of time that we've got it on our minds? I think the function of much earlier preparations has been to dilute our emotional investment in the holiday, spreading it thin over a month instead of concentrating it into no more than two weeks. A longer but less intense series of encounters leaves a weaker impression on us than a shorter but more intense series.

It also deflates the surprise and excitement that we ought to feel on the day itself -- shoot, we only saw it coming a month away -- so again it doesn't make as much of an impact on us.

Finally, it sets us up for fatigue. When our investment is concentrated, it doesn't last long enough for us to get bored of it. But when it's strung out for a month, those already small investments toward the end are not even giving us the mild enjoyment that they did at the start. By weeks 3 and 4, we've habituated and no longer respond emotionally to the sounds of yet another broadcast of some song, nor to our 20th viewing of the neighbors' decorations.

All of these effects seem to be understood by the general public, who complain about how early the Christmas season keeps getting. But because these practices have only grown over time, it must be that most people want to induce emotional detachment from the holiday by stretching out the lead-up period. That interpretation fits much better with all the other evidence of people increasingly not caring about the holiday at all.

This same process has affected Halloween, the second-most obvious case of a holiday that's been corrupted within my own lifetime. Party plans, decorations, candy, costumes going on sale, costumes being thought up and crafted by hand, etc. -- all that begins around the start of October. All of this boring homework for so many weeks deflates the excitement that should have surrounded just the week of Halloween.

Thanksgiving probably comes next, with preparations beginning right after Halloween on November 1st. And again it doesn't feel that special once it comes around because we've more or less been incorporating it into our ordinary routine for several weeks beforehand.

It's almost as though we have entire months rather than single days for celebrations. By spreading them out over a whole month, we blend holiday-related thoughts and activities into our daily routine, violating the taboo of keeping the ordinary and the special apart to complement each other. The season was more memorable back when we might have had generic fall or winter-time feelings in November and December, but not specifically about Thanksgiving or Christmas until right before they arrived.

Although they're not major holidays, I sense something similar but to a lesser degree for Valentine's Day and Easter.

But there are two genuine exceptions -- the 4th of July and New Year's Eve. Maybe it's just because it would seem silly preparing in June for July 4th, and because New Year's Eve falls so close after Christmas that it can't compete for attention during the earlier weeks of December. Or perhaps these two holidays would be safe from dilution-by-preparation even if they fell on more agreeable dates. They are the least religious or supernatural, for one thing, and so haven't been targeted as harshly as the others have.

Still, I think people begin making detailed plans for these two earlier than they used to, especially what they're going to be doing on New Year's Eve. It's not yet so bad that people count down the days to the New Year 31 days in advance -- talk about anti-climactic at midnight on January 1st -- but still.

The lesson here is that even the most secular holidays and rituals are not safe from the corrosive efforts to strip our communal lives of special meaning and activity. The ramp-up in secularism over the past 20 years seems to be part of something larger -- the drive to atomize and cocoon ourselves. That means religious activity has to go, but so does all sorts of other secular community-binding rituals like the raucous 4th of July parade and spectacular fireworks display, or trick-or-treating on Halloween.

That's why it's important to stick up for religious activity (beliefs / theology is not very important), even if you're not very religious. All that other good stuff goes along with it to give us enjoyable lives; therefore it all comes under common attack. And what kind of team member would you be if you didn't back up your other team members, whether you're on the friendliest terms with them or not?

December 3, 2012

When societal cohesion begins to weaken, individuals begin looting public resources or otherwise trying to have some group of others foot the bill for one's own way of living. Only when cohesion is strong do people hold such parasitic behavior to be taboo.

The worst cases involve siphoning off funds intended for someone else. Over the past 20 or so years, Americans have begun to do just that with the disability insurance program of the Social Security Administration. When you think of workers getting compensated for disabilities, you think of a construction worker getting knocked in the head by a careless crane operator, and being unable to work for a while afterward.

It may have been that way once, but it is now mostly paying out benefits to people claiming to be disabled, not from injuries, congenital defects, or other accidents beyond their control, but from musculo-skeletal disorders and mental disorders. In practice, that means someone fat enough to feign or truly show back pain, and some con artist who gets a shrink to label the patient's dickwad behavior as a mood disorder. "Yeah, I suffer from intense anxiety when I think about going to work. Give me a check instead."

Not much of the change has to do with demographics, workplace safety, etc. (See here for a recent review.) It is simply a shift in the mentality of lower and lower-middle class Americans -- like, if the elites aren't going to look out for the little guy and just plunder the country for their own benefit, why shouldn't we find a system to game ourselves? It is a clear sign of how quickly trust and solidarity have unraveled since the 1990s.

How do we know it wasn't always so bad? People are always complaining about how fragmented the society became in the 1960s, a fragmentation that has supposedly persisted ever since, to greater or lesser degrees. But in reality Americans kept getting closer to each other socially and culturally, and feeling more national pride, from the '60s through the '80s. As a result, they would have felt it to be the height of shamefulness to selfishly try to milk a federal insurance program meant to protect disabled workers.

Below is a graph showing the number of applications to the disability insurance program over time, controlling for the size of the population aged 20 to 59. (You can get the benefits if you're 18-64, but I have data already at hand on essentially the same group of 20-59 year-olds.) I chose the number of applications, rather than number of awards, number of beneficiaries, amount of pay-outs, etc., to get as close as possible to the grassroots demand for these benefits. Whether an applicant ultimately gets an award also depends on the supply side at the SSA. Most others look at the number of beneficiaries, etc., because they're concerned with the unsustainability of the payment system, but I'm concerned with how likely people are just to stick their hand out in the first place.

The first year of data for the program is usually 1960, but for some reason the SSA website only provided it back to 1965. In any case, the rate of applying more than doubled from near its inception in the early '60s through the first half of the '70s. I think a good amount of this is just a natural increase when a brand-new form of government insurance is offered. It takes time for the truly deserving to learn about it and feel secure enough to rely on it instead of alternatives, in case of accidents. The Occupational Safety and Health Administration only got going in 1971, so it's possible that there was a real rising need for disability benefits before then. Unfortunately the Bureau of Labor Statistics only has data on workplace accidents and fatalities back to the early '90s, so this is just a guess.

What's truly astonishing is the steady decline from a peak in 1974 to a bottom in 1989, a 42% drop. This period is known for the "Right Turn" in American politics, where even the lower-status citizens felt like they shouldn't rely so much on the government for support, unless they really needed it, because going on the dole made them feel ashamed and pitiful. Try to get support from sources that are meant more for you specifically, like your family and your church, and leave the national funds for someone else living who-knows-where who needs it more than you do.

And by some miracle, the later '70s and '80s did not see average Americans getting unusually poorer, sicker, or more miserable. I say "unusually" because eating more carbohydrates has caused a steady erosion of our heart health, not to mention causing the obesity and diabetes epidemics, but the period of falling selfishness did not do anything extra to worsen such trends. Inequality has also gotten steadily worse since the early '70s, but again the choice to refrain from milking the disability program did not send it soaring. If anything it looks like the early-mid-'90s saw a greater jump in the Gini coefficient. And anyway, the explosion in disability applications since the '90s has done nothing to reverse these trends in health and prosperity.

The graphs look more or less the same for the outcomes -- percent of some age group receiving disability benefits, etc. The graph on applications, though, shows that the swings up and down over time are not due only to changes on the governmental side, but also to the tide turning -- and then turning back again -- among my fellow Americans. In a democracy, we more or less get the policies that we ask for, so there's a whole population-level zeitgeist of rising or falling support for able-bodied people to sponge off of a disability program.

Finally, to return to the theme of an earlier post on the geography of selfishness as measured by Google searches for Black Friday deals, let's repeat that for searches for "social security disability". Darker means greater search volume, lighter means less.

Once more it looks like the power of the western frontier in American history lasts even through today. The solidarity of the eastern half of the country is roasted and toasted, whereas the Plains and Mountain states are doing OK. Don't be misled by the large colored areas in Utah and Arizona, since they're measuring searches in the Salt Lake City and Phoenix metro areas, not really all that territory shaded in. Even the L.A. and San Francisco areas don't search as much as their blue-state brethren along the Bos-Wash corridor, Chicago, and Dallas (an increasingly liberal city since 2006).

Mapping out the percent of qualifying residents who actually receive, rather than search Google for, disability benefits would show the same rough picture, but the data are only at the state level and lack the resolution seen above. The only big difference I noticed was that selfish blue-staters relied more on dickwad behavior disorders, and selfish red-staters on lardass scooter-riding disorders, when holding out their hand to Uncle Sam.

As all other writers have pointed out, the program as it exists is unsustainable, but not because the government has been colonized by aliens bent on shelling out benefits to a public that did not ask for them. Rather the program cannot last because we no longer enjoy a national culture and resulting feeling of solidarity that would keep us from crossing the line of something held sacred. And again, the lower-status Americans feeding at the disability trough are only aping the behavior of their superiors, who can't import enough cheap labor, or who seek bailouts when their risky investments blow up in their faces.