Is the USA Center-Right? An Issue-by-Issue Breakdown

A persistent belief among many in the political and media establishments holds that the United States is a “center-right nation,” one that considers progressives far too liberal for mainstream positions of power.

If you look purely at electoral outcomes, those who assert this appear to have a fairly strong point. The last several decades of federal politics have been dominated by center-right policies, and truly left-wing politicians have been largely marginalized (e.g., Bernie Sanders). Even Clinton and Obama—the last two Democratic presidents who, in theory, should be leftists—were corporate-friendly moderates who triangulated during negotiations with Republicans to pass center-right policy compromises (e.g., Obama’s Heritage Foundation-inspired ACA or Clinton’s Defense of Marriage Act compromise).

While electoral results support the idea of a center-right USA, looking beyond electoral politics—which involves a mixture of policy choices, party politics, fundraising, and propaganda—and…