I’ve never really liked Halloween. Probably because most of my Octobers growing up in Mount Vernon, New York were pretty horrible ones. The worst of those late Octobers were in the early 80s, starting in ’81.

That year, Halloween was a forbidden holiday in my life anyway. But the trick was on me. On a day just before Halloween, my day’s meal consisted of an ice cream sandwich as hard as a rock. The lunch at A.B. Davis Middle School that Friday — as it was most Fridays back then — was a grilled ham and cheese sandwich with fries, not exactly a Hebrew-Israelite’s diet. It was also about thirty degrees outside and partly cloudy, unusually cold for early fall in New York. So I stood near the steps leading down to the back of Davis, which led to the athletic field below. The field had turned a dirty yellow-green, the color of mid-fall. It matched how I felt about my life on that day.

The only reason I even had a rock-hard ice cream sandwich for lunch was because I’d won one of our seventh grade social studies teacher Mr. Court’s bets. He’d made an incorrect historical assertion in class, and I caught it, collecting a quarter from him that morning. Still, I learned, fully and truly for the first time, how poor me and my family had become, all while bitterly jamming the ice cream sandwich down my throat. So much for discovering my inner Hebrew-Israelite self through fasting and eating kosher foods!

I very quickly grew to hate hearing the words Hebrew-Israelite, especially since I’d never been to a traditional synagogue, much less Israel, Palestine, or even Ethiopia. Our Hebrew-Israelite ways had left us with little to eat when I was at home. There was a benefit to all of this. It made the fasting part of fasting and prayer easier. Not easy, just easier. My first Yom Kippur ceremony was difficult. We fasted on fruit for three days, and I barely made it through school each of those days. I almost passed out from the lack of food.

My older brother Darren, meanwhile, had decided that “the day of atonement” and all things Torah didn’t include his stomach. By the end of October, I would watch him take his kufi off as he boarded his bus for The Clear View School (see “About My Brother” from December ’07). I caught Darren walking near our apartment building with the last of a Hostess Apple Pie and its wrapper during Yom Kippur. He had snuck around the building to eat his contraband. What made this transgression worse was that Hostess used lard to create its desserts. And Darren, once caught, just stared at me and smiled.

My Mom was too busy and tired for me to think about complaining to her about this or about the issues I faced during my first days of Humanities. For more than three years, my Mom’s income had dropped so much compared to rising food and energy prices that we didn’t have food in the house for the last ten days of every month. Sometimes we didn’t have heat either, because we were usually two or three weeks behind on

Anthracite coal (like the lump of coal that was my life in '81), March 7, 2007. (United States Geological Survey). In public domain.

the Con Edison bill. I also knew that we were consistently behind on rent. I felt as isolated as a kidnapped tweener chained to a radiator in a basement with walled-off windows.

Lack of food and heat at home weren’t the only problems. My Mom had popped out two of my younger brothers in the previous three years. We lived at 616 in a 1,200 square-foot, two bedroom and one bathroom apartment, so overcrowding had become an issue. Me and Darren were sharing a bedroom with our two siblings.

Not only did I start to believe that my then idiot stepfather Maurice Washington — oops, Judah ben Israel — had colluded with his version of God to play a cruel trick on my mother and my family. Not only did it finally dawn on me that we had slid into poverty somewhere between the beginning of ’79 and Halloween ’81. But I knew that we were in a family crisis: financial, material and spiritual. And there was nothing, absolutely nothing, I knew to do about it. Not even asking for candy would’ve helped.

Modified image of Apple iPad 2 with college cap, tassel and diploma, October 27, 2011. (http://digitaltrends.com/Donald Earl Collins). Qualifies as fair use under US Copyright laws because of the low resolution, alterations and subject of this blog post.

When I attended the Center for American Progress’ two-hour conference on Anya Kamenetz’s Gates Foundation-funded ebook The Edupunks’ Guide to a DIY Credential last month (see my recent post “Education Incorporated”), there was one thing that the experts on the panel kept bringing up. All of them agreed that the Information Age and the possibilities of online education are so great that the days of the traditional university are numbered. Like the dismantling of the traditional newsroom model of newspapers and magazines over the past fifteen years, the traditional university model was within ten years of becoming obsolete, at least according to this not-so-objective group of commentators.

Student loan debt the equivalent of an American house in 1980, questions about the usefulness of a four-year degree in the world of work, and the relevance of academic coursework to the “real world” will be the biggest reasons why hundreds of higher education institutions could be out of business by the 2020s.

Despite my background, I am not some apologist for the state of American higher education today. There are quite a few things wrong with the current model, especially for standard state and regional public institutions, historically-Black colleges and universities, and most two- and four-year community colleges. But, for a variety of reasons, I’m not completely sold on the brave-new world of online education either, currently dominated by for-profit colleges and technical postsecondary schools. Most, like University of Phoenix, charge as much for tuition as traditional four-year institutions, minus the academic and social supports necessary to retain and graduate students.

Still, this doesn’t mean that there won’t be a new kind of higher education for those millions of us who won’t have the grades and/or can’t afford to attend an elite or near elite institution. You know, somewhere between Harvard, UC Berkeley and the University of Pittsburgh. I think, ultimately, that ten years from now, going to college will be as simple as clicking on the iCollege or iUniversity app on your iPhone, iPad, iTV, or whatever Apple, Inc. comes up with next.

Out of University of Phoenix, DeVry Institute, Kaplan University, Capella University and ITT Technical Institute, I picked Apple? Why, pray tell? Because, believe it or not, Apple has the history of collaboration, technical expertise, and innovative vision — even without the great Steve Jobs — to pull off moving the higher education platform to an accredited application that even Harvard, Yale, Princeton and Oxford couldn’t thumb their noses at (though they may have to hold their noses, at least at first). After all, Apple has moved into the mainstream music arena and into the land of Hollywood and made it work to their advantage. Not to mention, to our advantage as consumers. Why not higher education?

How would this work? For starters, Apple could work with a slew of professors and teachers in fields as varied as astronomy, construction engineering, history, medicine, psychology and theater arts to put together adaptive virtual classrooms. The key word here is adaptive. It would be like EA Sports’ Madden NFL ’12, where each teacher or professor would be put in a lab with sensors attached to them and a classroom full of students asking every possible question and providing every possible answer to a given topic or series of topics that would add up to a course. And Apple would do this over and over again for, say, the 3,000 or so possible courses that an undergraduate student would take, not only in the US, but anywhere in the world.

That alone would make this a decent idea. But combining it with Apple’s ability to negotiate contracts and agreements — in this case, with accrediting agencies and with major universities across the country — would make iCollege or iUniversity a great idea. Because of these deals, iCollege or iUniversity students could transfer their credits to a UC Berkeley, Harvard or University of Maryland if they so chose. More importantly, since acting in a play, shooting film projects or doing a laser light show can’t necessarily be done online, deals with schools, technical institutes and even major corporations would make it possible for any student’s iCollege or iUniversity experience to be well-rounded and tailored to their needs.

What’s more, once Apple makes the $100 or $200 million investment to set this up, it could set reasonable prices for postsecondary credentials. For an industry or job-related certificate: $5,000. For a two-year or associate’s degree: $10,000. For a four-year degree: $30,000. The extra costs for the degrees would cover technical changes, the consulting fees for using professors and teachers as part of the iCollege or iUniversity app, and the costs of students taking face-to-face courses as part of the process. But with five million, ten million, even a hundred million students enrolled around the world, Apple could make $50 billion in profits from such an app. Every single year.
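For a rough sense of scale, here’s a quick back-of-the-envelope sketch of those numbers. Everything in it is speculative: the prices are this post’s hypotheticals, not real Apple pricing, and the enrollment figures are illustrative.

```python
# Back-of-the-envelope check of the hypothetical iCollege/iUniversity pricing
# above. All figures are this post's speculative numbers, not real Apple data.

PRICES = {
    "certificate": (5_000, 1),   # (total price in USD, years to complete)
    "associates": (10_000, 2),
    "bachelors": (30_000, 4),
}

def annual_revenue(enrollment: int, credential: str) -> int:
    """Spread a credential's total price evenly over its duration."""
    price, years = PRICES[credential]
    return enrollment * price // years

# Ten million four-year students alone would gross $75 billion a year:
print(annual_revenue(10_000_000, "bachelors"))  # 75000000000
```

Even with these toy numbers, a $50-billion-a-year figure is in the right ballpark, though it treats revenue as profit and ignores the delivery costs mentioned above.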

Yet I know there are numerous faculty and administrators in academia who’d throw a fit upon reading this (or any similar) idea. The fact is, we’ve had a corporatized sort of education at the college level for at least fifty years now. Our scientific and engineering communities are fully entrenched in the military-industrial complex. Even history professors fight for endowed chairs. Between capital campaigns, as well as corporations and dying CEOs investing in schools and having buildings named after them, we’re past the point of no return. We in higher education need to get ahead of the tide before being drowned by it. For once.

In the past couple of years, in my down time between teaching and occasional consulting, I sometimes listen to ESPN 980 Washington, DC. I get a perverse pleasure out of it, especially this time of the year. With football season on, there’s nothing better than hearing Washington Redskins (hereafter known in this blog as Deadskins) fans whine and complain and kvetch after a loss. It’s not just because I became a New York Giants fan when I was a teenager. Nor is it because I’m also a Pittsburgh Steelers fan. No, there’s something different about these Deadskins fans, something that might be related to the personality of this area.

For those who don’t read my blog regularly: I’ve only been a DC/Maryland resident consistently since August ’99 (I did live in DC briefly in ’95 while doing my doctoral thesis research). I grew up in Mount Vernon, New York — in the greater NYC metropolitan area, not upstate, just in case — for the first seventeen years and eight months of my life. I went to school and lived and taught in Pittsburgh for twelve other years. Not to mention spent the equivalent of five weeks in the Bay Area, three weeks in Atlanta, and two weeks apiece in Chicago and Boston. From a sports perspective, none of these areas and cities have fans as delusional as Deadskins fans, at least not on a game-by-game or week-to-week basis.

The best way to show the difference is to take the same scenario and apply it to each city or area in which I’ve lived. The hometown team has just lost a game, in the ugliest possible way. Tune into a sports talk show or read the headlines for each team in their respective cities/areas, and this is what you’d read or see:

After a Giants loss:

“They suck! They suck! Did you see what Eli did with that throw! I’m tired of these guys screwin’ up! I want Coughlin’s head on a f–ing pike!”

That would go on for a day, maybe two, and if it’s really bad, maybe for three or four days. Then eventually, the Giants fan base settles down to, “What do we have to do to win this week?”

This would go on for a couple of days. Then, like the little engine building up momentum, the Stiller’s talk would turn to, “What do we have to do to win this week?”

After a Deadskins loss:

And this goes on all week long, every week they lose, and through every off-season. Until talk show hosts like the great former Georgetown coach John Thompson literally cut callers off due to their “high levels of ignorance.”

It’s not as if the other places I’ve lived and the teams I’ve rooted for haven’t seen any hardship. Heck, the Steelers went through a twelve-year decline between their fourth Super Bowl win in ’80 and the hiring of Bill Cowher in January ’92, the last five seasons of which I witnessed first-hand. Not a single fan jumped off a bridge because of a loss, or took a rocket up to the moon over a victory. Maybe the hard realities that Pittsburgh as a city had to live with, including the loss of its industrial base for jobs, had something to do with the fans’ realism around the Steelers.

As a Giants fan, I appreciated their realism, and it helped make me a Stillers fan, too. Coming from an area where my team had won their first Super Bowl in ’87, only three years removed from a 3-12-1 season in ’83, I thought that some Johnny-Come-Lately types expected too much in a strike/scab-shortened season. But, even with that, the cycle of Jackie Gleason-esque fits of rage, followed by calm rationalism, was a reflection of the New York City I knew in the ’80s, sometimes ugly, but usually manageable. Even in troubled times.

This cycle also made the Giants fans of the ’80s more psychologically stable than the average Deadskins fan, then and now. Yes, it’s a reflection of an area of the nation that is also out of touch, where the worst effects of the Great Recession aren’t equally felt. The expectations of Deadskins fans are about as realistic as it was to believe that Mayor Vincent Gray could move DC government the same way Adrian Fenty did, only without the ruffled feathers.

This form of delusion, though, otherwise known as bipolar disorder, where the highs are euphoric and the lows can make you suicidal, may be catching on. It mirrors our expectations of the economy and our politics.

Good or well cartoon, January 24, 2009. (Gaurav: http://mbatutes.com). Qualifies as fair use under US Copyright laws because of lower quality resolution with purpose for describing a part of speech.

Puberty is such a strange time in life, whether you’re male or female. It’s even worse when you don’t have the maturity to deal with the changes, physical, social and emotional. Add to that no real guidance from parents, teachers or other adult figures about how to deal, and you have a recipe for disaster, even social suicide. That became the case for me thirty years ago, in a fight with a history behind it, with my late ex-classmate, Brandie Weston.

But I planted the seeds for it almost a year and a half before, the second Saturday in May ’80, the very first time I met Brandie. My father Jimme had taken me and Darren to his “girlfriend’s” two-bedroom apartment on Mount Vernon’s South Side. His alleged girlfriend — drinking buddy, really — turned out to be Brandie’s mother. When Brandie walked through the apartment door an hour into the visit, I greeted her with the words, “Wow, she’s fat!” as if I’d complimented her on her great beauty (for the full story, see “First Impressions and Brandie” post from May ’10).

Irony, of course, has been one of the ways I’ve come to be sure that there is a God. Why else would I have ended up in Humanities and in the same classroom with Brandie in September ’81? I had no doubt that Brandie told her Pennington-Grimes friends about the incident as soon as she saw me in class on the first day of seventh grade. Every time I saw them, my shy “Hi’s” were greeted with grunts, names like “dumb ass” and “idiot,” or just plain ignored.

My cold war with Brandie became a fight only weeks into seventh grade. It wasn’t much of a fight, though. It was in Mrs. Sesay’s classroom, our 7S homeroom where we started the day, ended the day, and had our first-period English class. At the end of this day in October, Brandie was clearing out of the room from the back, passing by my seat on the left side of the classroom toward the door.

“Dumb ass,” Brandie said out of nowhere, as usual.

“You’re stupid,” I said, not even bothering to look up as I put my plastic Mead, three-ring and five-subject notebook in my book bag.

Mike Tyson laying out Larry Holmes, January 22, 1988.

Within a couple of seconds, I got pushed from behind. I turned, and Brandie threw a punch into my chest. I threw one back into her right arm as she recoiled from landing her first punch. We were fighting in the back of the classroom.

It was two semi-nerds in a fight of words, lots of shoves, and a flurry of half-hearted punches. It was an ugly display, like watching a Larry Holmes fight or Muhammad Ali in his last days before his retirement. In one corner, at five-foot-two and 120 pounds was me, in the other, at five-foot-seven and about 150 pounds was Brandie. I certainly didn’t want to fight a girl. Brandie seemed to think that she could pound me into the ground, hitting me on top of the head a few times.

At one point I punched Brandie in the chest, only to find that her chest felt spongy. It dawned on me that Brandie had breasts. I stopped pushing and punching her right then and there, somewhat in shock from the revelation.

“You’re a pervert!” Brandie yelled while two of her friends pulled her away from me.

I didn’t know what “pervert” meant — not that I would’ve admitted such a thing, since I was the “smartest kid in the whole world.”

“Well, you’re an adverb!” I yelled in response. Somehow that was what I pulled out of my brain to call Brandie.

Of all the words — adverb? That ended our fight in horrific laughter from Brandie and the classmates who witnessed it. It was another About A Boy moment. It was a weird moment, even for me. I was embarrassed, all but sure that my classmates thought that I was incredibly dumb.

My fight with Brandie had awakened some sleeping wolves, what I called the Italian Club long before we actually formed one for our Italian class. And these preteens had social adjustment issues that made me, Hebrew-Israelite and all, look like a model preteen by comparison — a story for another post. But, still, on the next to last Friday in October three decades ago, I made myself into the adverb greater, as in an asshole to a greater degree. Seriously.

Close Encounters Of The Third Kind (1977) Screen Shot, October 18, 2011. (Donald Earl Collins).

Over the course of nearly a decade, between January ’02 and December ’09, I exchanged emails with, interviewed by phone, and visited nearly thirty former classmates, teachers and administrators from Mount Vernon, New York public schools for my book manuscript Boy @ The Window. Not to mention family members willing to be honest about life in Mount Vernon and 616 East Lincoln Avenue. Not to mention my family intervention nearly ten years ago. The saying “you can’t go home again” is such an understatement.

At times, my walks down memory lane have left me verklempt, or feeling that I’ve entered the Twilight Zone. Meeting with a former tormentor from my Davis Middle School days was strangely pleasant, while talking with my class’ salutatorian was both illuminating and a little weird. I met with some former Humanities classmates who seemed more ornery than former Georgetown coach John Thompson after a sleepless night dealing with idiot refs. I talked on the phone with former classmates and teachers who either couldn’t remember details about our school, or flat-out lied about some of the things they had said to me and about me twenty-five or thirty years ago.

But of all of those meetings and time machine-like encounters, none made me more nervous than my interview with Crush #1 five years ago. I was nervous for any number of reasons. I hadn’t seen her in nearly seventeen years when I went to see her in the Old South in October ’06. My plan was to be up front about my crush, my borderline love for her back in ’82, which would make anyone anxious or feel really silly, I guess.

And I was stuck at this point of my memoir, the part about how my crush on Crush #1 came about, and how abuse and domestic violence at 616 brought it to a crashing end, between March and August ’82. I knew what to write. I just didn’t want to relive all of those emotions, as they had led me to seriously consider suicide within a year and a half of all of that.

Salvador Dali, The Persistence of Memory (1931), October 18, 2011. (http://www.moma.org). In public domain.

What I walked into on that rainy October ’06 day mirrored my own Silver Spring, Maryland residence. It was a modern-day carpeted flat in an apartment-home townhouse, appearing as lived-in by the scattered toys in the living room and foyer. Crush #1 was making stew peas. If I’d been in another frame of mind, a look of shock would’ve come over me. Crush #1 cooking? Put that above the fold of the New York Times! Yet since I was willing to expect anything from the new Crush #1, I wasn’t all that surprised.

Her husband greeted me warmly, which was a bit of a surprise. I’ve been around enough couples to get a sense of how these kinds of interactions are supposed to work, regardless of sexual orientation. It’s where the husband or the “man of the house” sizes me up, regardless of my intentions. Crush #1 walked out of the kitchen and gave me a hug, the kind friends give each other after seventeen years apart.

Then I met her daughter, this chip off the not-so-old block, a great combination of Crush #1’s and her husband’s facial features. She was an adorable four-year-old wanting to learn about the world around her. We shook hands and made animal noises for about two minutes. I felt at home. It was as if I had walked into my apartment and had the chance to see myself, my wife, and my kid in action, with sarcastic banter and silly noises included.

There was so much to discuss and so little time. So I started where the twelve-year-old in me would’ve if he had a voice. I asked about her mother, her family, her growing-up years in New York, her time in school and in Humanities. What came out was so different from what I expected because it was so similar to my experience and because our similar experiences occurred during the same time frame.

It was all so normal, so typical for people from our respective backgrounds, so, well, human. I liked this real-life version of Crush #1, and not in that twelve-year-old, I-think-I’m-in-love kind of way. That was something else I really didn’t expect. Not only did I enjoy the visit. I enjoyed getting to know one of my ex-classmates for the first time.

A generation ago, most of us in education worried about a federal government takeover of America’s 15,000 school districts with mandated standards. Wow, that prediction was way off, wasn’t it? (Oh, wait, the No Child Left Behind Act, passed in 2001, created a new era of national standards for accountability, not to mention high-stakes testing).

President George W. Bush signs into law the No Child Left Behind Act, Hamilton HS, Hamilton, Ohio, January 8, 2002. (http://www.whitehouse.gov). In public domain.

Now, we worry with good reason, as corporate interests inject themselves into education reform at every level. This has brought an imbalance to the education reform conversation that hasn’t existed since the days of Andrew Carnegie and the height of immigration of swarthy peoples from Southern and Eastern Europe. Now, as a century ago, it’s the inclusion-vs.-exclusion debate. Whether to provide the best possible education for all comers, or sort and kick out as many “dull-minded” “undesirables” (both terms literally used in 1911 to describe the learning disabled, immigrants and Black migrants) in K-12 schools as possible.

But this debate today — if we can really call it that — includes higher education. Of course, we know better than to call the millions of potential students who need some sort of post-high school training and education “morons” (also a 1911 term used by White psychologists who assumed anyone not WASP didn’t have the mental capacity to make use of a high school education). Yet we do attempt to sort these students and potential students into categories, like “adult learners,” “non-traditional students,” even “Edupunks,” a term popularized by Anya Kamenetz.

Anya Kamenetz, author of Edupunks Guide, at University of South Dakota, August 27, 2010. (http://www.usd.edu). In public domain.

None of this has eliminated a common refrain in our field: that a four-year degree “isn’t for everyone,” as Kamenetz said to me after I asked her a tough question regarding the accessibility of her ideas for a Do-It-Yourself university (DIYu) process of pursuing a college degree. It was at a conference hosted by the Center for American Progress, but paid for by the Bill & Melinda Gates Foundation. Her idea, while helpful to tech-savvy 18-to-30-year-olds with enough income to make this piecemeal education process work, was unhelpful to low-income students and students of color over the age of thirty. And Kamenetz’s response was the typical exclusionary one.

Apparently, in our current economic climate, a full-time job isn’t for everyone either. Still, despite this reality, the Gates Foundation, Lumina Foundation for Education, and the Hewlett Foundation have all adopted similar models of thought around K-12 and higher education reform that have legitimized the work of people like Wendy Kopp and Michelle Rhee (Teach for America and The New Teacher Project, respectively) and institutions like University of Phoenix and Kaplan University. Models which draw heavily from corporate paradigms for success, including the punishment of failure. But they haven’t led millions of us to jobs in the new economy.

I recently attended a meeting hosted by the Brookings Institution’s Hamilton Project, in which authors presented a series of papers on K-12 education reform, supposedly with cutting-edge ideas. Like one presented by MacArthur Foundation “genius” (emphasis on the quotation marks) Award winner Roland Fryer on providing student incentives linked to immediate and long-term educational goals for those most at risk of dropping out of school, like low-income boys of color. Examples of paying fifth graders in Houston and New York $2 for every book they read or for completing their homework weren’t so much cutting-edge as they were unremarkable. Incentives are fine, if you can pay for them or show how they nurture a passion for learning beyond the goal of completing individual tasks. This, of course, the “genius” couldn’t show.

Fryer’s ridiculous assumption that because money in K-12 education had doubled since 1970, funding wasn’t the issue would’ve made me laugh as a high school senior. When you account for inflation, K-12 funding has declined since the 1970s, and not by a small amount. And by the way, the millennial generation has created a new demand for schools, as the number of new schools and schools in need of renovations eats into that doubling over four decades. Fryer’s exclusion of data that a first-year graduate student wouldn’t have missed made me realize that most people in the field are so desperate for ideas that anything that sounds new must be good or cutting-edge. Especially if it’s funded by the Gates Foundation.

It’s not just the Gates Foundation, per se. It’s the idea that since things aren’t working for millions of students and undereducated workers, a model that concentrates on teacher effectiveness and treating students as customers — whether in fifth grade or in college — is the best way to go. This attitude has become so pervasive among well-funded education reformers that the idea of increasing funding for schools, or of making schools from pre-K on focus on all students in need of college/workforce readiness is about as welcome as Michael Moore at a Koch Brothers fundraiser.

Early college high schools and single-track, college-prep K-12 school districts, two of the great secrets of K-12 and higher education reform, remain secrets because they are difficult to bring to scale and require more upfront investment than most philanthropists and businesses are willing to make. Not to mention, they represent the hard work of real reform, work that won’t make people like Kamenetz, Fryer, Kopp and Rhee stars. But by all means, let’s continue to fund every harebrained idea as if tweaking our education system will yield results like a nuclear fusion plant on steroids.

Signed Copy of Faces at the Bottom of the Well, October 8, 2011. (Donald Earl Collins).

In a twenty-four-hour span on Wednesday, three American giants died. The Rev. Fred Shuttlesworth, the ultimate Civil Rights activist, was reported dead first, by mid-afternoon on the fifth. Then, in quick succession, the media reported two other deaths. Apple co-founder, two-time CEO and holder of 300-plus patents Steve Jobs passed around 7 pm. Civil Rights activist, law professor, critical race theorist and best-selling author Derrick Bell also passed that evening, very quietly.

The media — social, cable and otherwise — dutifully dedicated itself to rolling out every author and person connected to Jobs the Visionary, Jobs the Thomas Edison of the Information Age, Jobs the Innovative Entrepreneur. By 9:30 pm, even my ambivalence about Jobs the Capitalist (as tweeted @decollins1969) would’ve been seen as heretical, even by the folks whom Jobs had fired over the years, or who’d had their jobs outsourced to China in the past ten years.

No doubt that Steve Jobs, may he rest in peace, was a sort-of Wizard of Menlo Park, California (really, Silicon Valley, but I’m taking poetic license here). But, as much as I love my MacBook, iPod, iTunes, iMovie and iPhoto, and the other Apple products I’ve used since I wrote an AP English paper on an Apple IIe my senior year at Mount Vernon High School in ’87, I didn’t get this outpouring of love and sorrow two days ago.

Then it occurred to me that I was watching two stories. One story was of a generation that saw Jobs as the man who fused technological innovation with cultural relevancy: the folks who grew up while Jobs was in the midst of his second coming at Apple, as he remade the niche company into (more or less) the largest corporation in the world. The other story was the media story, the Baby Boomer story of a cultural rebel who made good as an Information Age capitalist while maintaining his Zen-ness, an ultimate cultural outsider-corporate insider.

As much as I think people should admire the late Steve Jobs — and there’s quite a bit to admire about his life — there’s so much more to admire about Shuttlesworth and Bell. Shuttlesworth survived multiple attempts on his life, was threatened too many times to count, co-founded the Southern Christian Leadership Conference in 1957 (along with MLK and others) and helped lead the campaign to integrate Birmingham, Alabama in the early 1960s, among many accomplishments. Rev. Shuttlesworth literally gave his blood, sweat and tears for civil rights and equality, but I didn’t see anyone put a candle on an iPad for him Wednesday night.

Bell, well, I’m a bit more biased about Professor Bell. I met him two years before he published Faces at the Bottom of the Well. Bell gave a talk at the University of Pittsburgh Law School (his JD alma mater) in October ’90 on his essay “The Racial Preference Licensing Act,” one that would end up in the book. I thought that the idea that racist businesses could opt out of an integrated America by buying a license and paying a race tax, in order to deliberately bar Blacks and others of color from their services and jobs, was truly radical. The slightly older Pitt Law students, Black and White, were up in arms. One went so far as to suggest that Bell was somehow now working for the other side, those who’d like to turn back the clock to the days of Jim Crow.

Through it all, Professor Bell just smiled and joked, and most of all, explained. His story about this Act was a way of getting ahead of the tide of politicians and judges that had been eroding Black gains since the mid-1970s, of moving beyond the crucible of the Civil Rights era — integration at any cost. Bell wasn’t suggesting self-segregation. He was hoping to provoke a larger discussion of the kind of equality Blacks and progressives should hope to achieve in a post-Civil Rights era. One in which all deny racism and racial inequality, but put it in practice in their words and actions every day.

Bell’s ambivalence about the achievements of his generation, about the legacy of the Civil Rights Movement, about desegregation, made him the target of traditional Civil Rights royalty — the “How dare you!” crowd. But it made me and many others from the generation that actually remembers Steve Jobs as the guy who co-built the world’s first personal computer in his garage big fans of Professor Bell.

To turn your back on three decades’ worth of struggle and success because you foresaw the coming storm around race. To bridge the divide between the Baby Boomer/Civil Rights generation and us post-Civil Rights folks by turning complex legal theories into allegorical stories. To take a stand that cost you your job at Harvard Law to ensure that the next Asian American female candidate would be given a real chance at a job. Bell’s my hero, and I don’t have a lot of people I’d call a hero.

The media might have put Bell and Shuttlesworth at the bottom of their news-cycle well — no doubt, race and the media’s consistent attempts to ignore race were a factor here — but it’s up to all of us to winch them out of that well to the top. And I think that Jobs would agree with that. May they all RIP.

There's also a Kindle edition on Amazon.com. The enhanced edition can be read only with Kindle Fire, an iPad or a full-color tablet. The links to the enhanced edition through Apple's iBookstore and the Barnes & Noble NOOK edition are below. The link to the Amazon Kindle version is also immediately below: