Is America really a ‘center-right’ nation?

What's Going On

One of the most persistent truisms in politics is that America is a center-right nation. It’s as common as the idea that Ronald Reagan was the quintessential conservative president. And it’s equally untrue. America is a centrist country, but it’s also one in which the “right” is being constantly redefined.

Reagan: left, right and center

Ronald Reagan was a changeling. He was a labor organizer as head of the Screen Actors Guild, yet as president he broke the air traffic controllers union. He was all for gun control as California governor in the late 1960s, when it was Huey Newton and the Black Panthers who had the guns, and an ally of the NRA when he ran for president in 1980. He reverted to supporting gun control after he and his press secretary, Jim Brady, were shot by a would-be assassin in 1981, and even helped President Bill Clinton get an assault weapons ban through Congress in 1994.

Reagan kicked off his presidential campaign with a “states’ rights” speech near Philadelphia, Mississippi, site of the infamous murder of three civil rights workers in 1964, blowing a clear dog whistle to southern racists. He made the term “welfare queen” common parlance. But he also made Gen. Colin Powell the first black national security adviser to an American president, and his term saw the rise of a generation of black Republicans that included Powell and Condoleezza Rice.

Reagan didn’t just do immigration reform: he did full-on amnesty for some 3 million undocumented immigrants.

He cut a deal with Democratic House Speaker Tip O’Neill to preserve Social Security and strengthen Medicare — a program he had railed against as evil socialism in the 1960s. And he raised taxes — multiple times — increased the debt ceiling, and exploded the deficit.