Steve Harvey Claims Hollywood Is More Racist Than America

Steve Harvey should be a very happy man. He has a hit daytime talk show, still hosts a popular TV game show, has best-selling books that have become box office hit movies, is a popular self-described relationship expert (please, no snickering), and has a very successful comedy career to boot.

But there are some things that Harvey is not happy about.

In a recent piece about him in The Hollywood Reporter, Harvey lashed out at the unfairness he sees in Hollywood:

“Hollywood is still very racist. Hollywood is more racist than America is. They put things on TV that they think the masses will like. Well, the masses have changed. The election of President Obama should prove that. And television should look entirely different. Kerry Washington should not be the first African-American female to head up a drama series in 40 years. In 40 years! That’s crazy.”

The article went on to reveal exactly when Harvey’s eyes were first opened to this, though it was pretty obvious to everyone else:

“… it was an education in the thinly veiled ghettoization of network television. At the time, he says, a high-ranking WB executive explained to him that new networks invest in shows starring African-Americans because they bring a guaranteed audience. ‘But as they build the network and get more eyeballs, they slowly start phasing them out,’ explains Harvey, and the networks try to woo higher-income brackets with a less diverse slate of programming that is perceived as more palatable to the mainstream.”

I’m sorry, but who didn’t know that?

So do you agree with Harvey? Are you surprised, or is he just stating the obvious?