By most measures, today marks my full transition from relative youth to middle age. Although, when I really think about it, didn’t I hit middle age in December ’07, when I turned thirty-eight? The average life expectancy of an American male is about seventy-seven, right? And for Black males, it’s barely sixty-five. Given my family history, though, I won’t hit middle age for another two years. My maternal grandfather turned ninety-six three months ago, and my paternal grandfather lived until he was ninety-six. Even my father’s still moving along at seventy-five, despite his battle with alcoholism between the ages of twenty and fifty-eight.

I do feel things in my body and mind that until a few years ago were merely minor aches and pains. My right hip is misaligned with my left hip, likely from years of walking at warp speed, lots of basketball, and six years of my running regimen. My L-5 vertebra is a bit compressed, due to years of activity, including many years hunched over a keyboard trying to make myself into a writer, author and educator. My right knee has been a bother since I was twenty-four, but the issue has gotten worse in the past two years (maybe time for some HGH or microfracture surgery?). I now have white-coat syndrome (because most doctors and nurses get on my last nerve), and I’m mildly anemic. No, folks, forty-six isn’t the new thirty-six, even if I can still run forty yards in under five seconds, pop a three over my son’s outstretched hand or leg press 360 pounds.

Me via Photo Booth, December 17, 2015. (Donald Earl Collins).

But I still have good health and a mostly healthy body and mind. Since I turned twenty-seven, my weight has never been higher than 241 pounds (including clothes, wallet, phone, and keys) or lower than 212 (I weigh 229 now). I can still memorize when inspired to do so, remember virtually anything important from my life from the age of four to the present, and could still probably win at Jeopardy if I ever got the call.

What’s more impressive, though, is who remains in my life now that I’m no longer “young.” My friends live all over the map, from the DC area to Pittsburgh to the Bay Area and New York, from Atlanta to Athens and from Seattle to Shanghai. I’ve made peace (mostly) with my family and my past, even if they aren’t always at peace with me. There’s my wife and son, of course, who are most likely the reason I’m still “young” relative to my age. Though I remain a Christian, I do not have the blind faith or evangelical -isms of my youth, and I’m at peace with that as well. I’m probably further to the left culturally and politically than I was at sixteen, twenty-six, or thirty-six. Because I’ve learned, sadly, that so much of what I was taught or fed growing up was either incorrect or a complete lie. But even with that sad disillusionment, I’ve come to accept the possibility of change for myself and the Sisyphean task that this nation and world always has been.

Yet even the idea of middle age has changed in the minds of capitalists as the Baby Boomer generation has begun retirement and all of them have received their first AARP cards. Before 2000, the ad folks and entertainment folks had split up adults into the age demographics of 18-34, 35-44, 45-64, and 65 and up. Now, it’s 18-24, 25-54, and 55 and up. This privileges Baby Boomers (as usual) and props up Millennials (folks who used to be Gen Y). My middle age is not the same as Baby Boomers’ middle age. Even in demographic representations, money-grubbing capitalists give us Gen Xers little respect.

Rodney Dangerfield quipped this funny line in Back to School (1986):

Coach Turnbull: What’s a guy your age doing here with these kids?
Thornton (played by Dangerfield): I’m lookin’ for the fountain of middle age.

Maybe when I’m sixty-five (like Rodney Dangerfield was in this film), I’ll be looking for the Fountain of Middle Age, too. But my choice will be to stand in it for the next thirty or forty years!

Da Vinci’s Demons Season 3 Poster, October 2015. (Starz via http://www.ew.com/). Fair use due to direct connection with subject matter and non-commercial use.

I finished catching up on the Starz series Da Vinci’s Demons over the weekend, which included me watching the final episode via Xfinity On Demand last night. I will not provide any spoilers. But, I am at a loss regarding the one constant with most period pieces or fantasy TV series and movies. I simply do not understand the need for a British accent in everything that isn’t American in Hollywood or its near-equivalent. If my understanding of history is correct, Romans didn’t speak in British English with English colloquialisms like “Needs must” or “bollocks.” Alexander the Great was never “knackered” or “gobsmacked.” And Leonardo da Vinci as played by Tom Riley saying “codswallop” or “fortnight” or any of a hundred British sayings with a nasally, pretentious accent is about as accurate as firing a shotgun at a barn’s broadside from 1,000 meters away.

In general, I love hearing and reading the differences in British and other forms of English when not spoken by Americans. With so many TV series and movies now done with international casts, though, it has turned into a form of cultural imperialism. Especially since I know that until two centuries ago, the average Brit sounded more like the average Canadian or American than like James Bond while attending Oxford. The change in accent was part of the United Kingdom of Great Britain’s rise to superpower empire status during the eighteenth and nineteenth centuries. Anyone who wanted to become part of the British gentry class — including newly monied industrialists and merchants — hired speech coaches to train them to speak English in the manner that we hear from most British English speakers in 2015.

Even with the rise of the US as the world’s leading superpower and its gigantic cultural and economic footprint, British English still is the language and sound of empire. At least where Hollywood is concerned. Because British English sounds imperial, high-class, important, urgent, and intelligent, according to the ears of many an English speaker, whether in the UK, the US, or elsewhere in the world. It doesn’t matter if the modern version of f–k didn’t exist until the eighteenth century. It doesn’t matter if some shows like Game of Thrones or movies like Lord of the Rings are complete fantasy and should have no adherence to any particular language or sound. British English is all that and a bag of chips, the cat’s meow, is dope for the English-speaking world.

For me, though, all I’m hearing with shows like Da Vinci’s Demons, Rome, Spartacus and The Borgias (the Showtime version), and with movies like 300 and Gladiator is a narrative framed by imperialism and the triumph of the West. It’s not just the British English accent and the imperialism it represents. It’s also how producers, directors, and actors consistently frame period pieces in the West vs. East, civilization vs. savagery narrative. Be they Ottoman Turks, Germanic barbarians or Persian emperors, they’re all simplistic antagonists for cinematic fodder that reinforces the West as good, godly and victorious. Or, rather, to quote Pittsburgh Steelers’ head coach Mike Tomlin, those without British accents are “the dead Indians” in these modern-day “cowboy” movies.

“Linguistic Imperialism” – what the world might look like if borders were based upon the 10 major languages, May 10, 2014. (The Economist via http://pinterest.com ).

And like Tomlin, in the case of screen entertainment, I’m kind of tired of the same theme running over and over again. Depicting the Ottoman Empire or Achaemenid Persia as evil heathens when compared with Italy in the 1480s or Greece in 480 BCE is not only a historical lie. It also reinforces the idea that when the West wields its power and imperialism, the world is always a better place. If the climate summit in Paris demonstrated anything, it showed how much damage the West has done to the world.

A century or two from now — assuming our globalized world doesn’t fall apart due to wars and climate change — entertainment will likely feature accents unfamiliar to today’s Western ears. Even if one of entertainment’s primary languages remains English, it will likely have another accent. Ace!

At my chiropractic appointment yesterday morning, my bone-cracking doctor of fourteen years and I got into a discussion of our holiday plans over the next couple of weeks. She and her family will visit with extended kin in Virginia, while we’re heading to Pittsburgh to see my in-laws. During our conversation, my chiropractor brought up some of the family traditions she’s preserved with her handful of Christmases with her young daughter and two sons. Traditions like Danish pork roast for dinner, and ornaments and other hand-me-downs from her grandparents and other ancestors as part of trimming the tree.

“I wouldn’t know anything about traditions. Matter of fact, there were eight years growing up where we didn’t even celebrate Christmas,” I said, with no forethought about what her reaction might be.

“Oh, I’m so sorry,” my chiropractor said in a quiet yet somewhat shocked tone, as if I’d ruined the Christmas spirit for her kids.

“That’s what happens when you grow up in poverty,” I said apologetically, realizing that I might have cost my chiropractor some peace of mind this holiday season.

Even at nearly forty-six, I can still say things without thinking, causing others to have to think more than they normally would. Sometimes, it’s without intent or malice, sometimes it’s because I don’t give a crap what people may think. Regardless, it’s certainly not because I want people to feel sorry for me or to give me a hug.

The truth is, the only holiday traditions I have come either from my wife or her family or were born out of my circumstances. Like making super-sweet Kool-Aid from two packs of Fruit Punch mix and cutting it with either ginger ale or Sierra Mist for Thanksgiving or Christmas. Or getting our son’s Christmas presents ready the night before without him knowing. Or making a holiday/birthday cake for me and us (since my birthday is two days after Christmas). And often going to a soup kitchen, homeless shelter or other venue to give away clothes, toys, money, and my time, knowing that no matter how I might feel about my life, plenty of others have it much worse.

The truth is also more complicated than simple poverty. Up until my eighth birthday in ’77, my Mom and me and Darren (with either my father or my idiot stepfather) celebrated Darren’s birthday, Christmas and my birthday as separate or nearly separate events. Some of my best times growing up were those days. Then, when the hyperinflation of the late-1970s kicked in — along with a second marriage and two more mouths to feed — Christmases ’78 and ’79 consisted of a fake two-foot table tree, a new shirt or sweater and a new pair of slacks. There were no birthday celebrations for me.

A contemporary Candelabrum in the style of a traditional Menorah. United Kingdom, Chanukah service, December 2014. (Gil Dekel; http://www.poeticmind.co.uk; via 39james via Wikipedia). Released to public domain via CC-SA-4.0.

Between Christmas ’80 and Christmas ’88, we didn’t even have the fake dwarf tree. Of course, four of those years we were Hebrew-Israelites. But there is this holiday known as Chanukah that also occurs in December, in which Torah believers celebrate the Festival of Lights with eight days of gifts and giving. But these were also the worst of our poverty-stricken years, and we could barely afford one candle for the menorah, much less eight or nine. The best gift I got those years was my idiot stepfather being out of the apartment at 616 and on the prowl for other victims for his fast-talking nonsense about making money and living a godly way-of-life. I also attempted suicide on my fourteenth birthday, not exactly a tradition worth repeating.

Finally, in December ’89, we had our first Christmas at 616 with my Mom having divorced my now idiot ex-stepfather. She bought a fake full-sized tree. I bought my four younger siblings gifts big and small for the holiday. My mom even made me a Duncan Hines chocolate cake with vanilla icing for my twentieth birthday that year. We didn’t have much, but what we did that year meant so much as we moved into the 1990s.

In all of my adult Christmases, I’ve actually only done one in Pittsburgh prior to our trip coming up in eleven days. It was Christmas ’98. That week, perhaps the only important tradition I’ve ever been a part of began. I moved in with my then girlfriend Angelia, mostly as a cost-cutting measure, partly out of love and concern for our respective futures. We’ve been living together and celebrating the holidays ever since!

I wrote this a bit more than three years ago, when the Supreme Court first heard the case of one Abigail Fisher against the University of Texas admissions policy in October 2012:

Abigail Fisher has joined Allan Bakke, Jennifer Gratz/Patrick Hamacher and Barbara Grutter as part of a list of Whites who have used race as an excuse because they faced a road block for maybe the only time in their lives. The idea that we should have race-neutral college and graduate school admissions policies in a country that’s far from race-neutral shows an enormous sense of unacknowledged entitlement and privilege.

Here’s why. Using myself as an example, I graduated Mount Vernon HS (NY) in 1987, 14th out of 545 students (the top 3% of my class), with a 3.83 GPA on a 4.0 scale, with an 1120 SAT (a 1220 on today’s SAT). I didn’t get into Yale, but was accepted at Columbia and the University of Pittsburgh. Money was an issue, as I ended up going to Pitt because they offered me an academic scholarship, while Columbia offered a private investigator into my father’s finances. Still, my grades would’ve easily knocked Fisher out of contention at UT-Austin, as well as Gratz and Hamacher.

I also think about the two decades I’ve spent teaching high school, college and graduate students. The most consistently obstinate students I’ve taught have been White students who thought they knew more than me. They didn’t get that context always matters when interpreting history, especially something like affirmative action. For those students, for Fisher, et al., and for the Supreme Court, entitlement matters more than context. Facts, circumstances be damned.

I was wrong about one thing in my earlier post. I based my comparison of my SAT score from 1986 on revisions to the standardized test in the 1990s, not in the 2000s, when they added a third section. Based on that, my educated guess for a score in that period would’ve been between 1850 and 2000 (between the 60th and 70th percentile).

Today, the Supreme Court heard from Fisher’s and the University of Texas’ lawyers — again, about the efficacy of using race as part of a larger formula for achieving demographic diversity in the state higher education system. During today’s oral arguments, the ever-brilliantly racist Justice Antonin Scalia pressed the University of Texas on why they needed to account for race (and apparently, for class as well) in their admissions plan at all, considering the academic issues many Black students face.

Scalia said, “it does not benefit African-Americans to — to get them into the University of Texas where they do not do well, as opposed to having them go to a less-advanced school, a less — a slower-track school where they do well.” He added that “most of the black scientists in this country don’t come from schools like the University of Texas. They come from lesser schools where they do not feel that they’re — that they’re being pushed ahead in — in classes that are too — too fast for them.”

It is fairly obvious that Scalia and at least three other justices (including his intellectual puppet Justice Clarence Thomas) would do away with affirmative action sooner than Scalia and Thomas could suck down two one-gallon tubs of rocky road ice cream. But the veneer of racism, the assumption that Blacks are “too slow” for elite public universities, the Social Darwinist interpretation of higher education? Or the assumption that Blacks who go to lesser-known institutions, particularly HBCUs, are getting a lesser and slower education as a result? Scalia doesn’t know his history, and doesn’t care to know the history of Blacks in higher education at all.

Given the direction the Supreme Court is leaning, it may take a burgeoning Black Lives Matter movement of the scope of the Civil Rights Movement of fifty years ago to reverse this court’s attempt at a twenty-first century version of Plessy v. Ferguson (1896). Let’s not forget, though. There are millions of Scalias and Thomases out there who firmly believe that African Americans — even those with excellent grades, high test scores, and lots of passion and intellectual drive — deserve nothing more than a jail cell, a janitor job, or a bullet to the brain.

But what makes their perspective worse is that Scalia et al. are cutting off their collective noses to spite the country’s face. It won’t be just high-achieving African American students losing out if the court curtails or renders race-based admissions policies unconstitutional. A decision like that will hurt millions of White students as well. Not just because segregated higher education could eliminate a diversity of ideas and thinking and poison the wealth of knowledge and efforts toward a better American society through the benefits of the college experience. It will also mean that Whites like Abigail Fisher will no longer have an easy and vulnerable scapegoat for their educational failures. The Abigail Fishers will be experiencing their own form of stereotype threat. Oh, how will they hold on to their narcissism, their intellectual delusions of grandeur then?
