The University We Need

Have American universities declined beyond hope of recovery? Of course not. Their decline has lasted only about 50 years, and in another 50 years they might well improve. Right now, however, the signs are not good. Almost all American universities have grown less interested in education and more interested in ideology. While their ideology has variants, its goals are always “diversity,” “inclusivity,” “equality,” and “sustainability” and its aim is the defeat of “racism,” “sexism,” “heteronormativity,” and “elitism,” without examining the merits of these principles or tolerating dissent from them. The professors and administrators who are still interested in traditional education are becoming steadily fewer and less visible. Most of those with traditional training and scholarly interests are near retirement and anyway have learned to keep quiet, since otherwise they would probably have been forced out of the profession long ago.

The universities are making progress in imposing their ideas off campus as well. Many recent university graduates believe that well-educated people can hold only left-wing views, and academic opinion has moved the attitudes of most Americans at least slightly to the left.

Meanwhile, as universities turn away from traditional education, American college degrees have never cost more and have never meant less. Students have grown much less interested in the postmodernist sort of liberal-arts education offered to them, more attracted to pre-professional programs, and more distracted by sports, drinking, drugs, and sex. While several, mostly small, colleges and universities stand apart from campus leftism, most of their students are just as interested in these distractions, and in any case such places have next to no influence on other universities or on public opinion. Even if a few conservative colleges offer a good education—and it has to be said that most do not—the degrees they provide are less valuable than those of the more prestigious universities.

What can be done? Critics have called attention to the problem in articles and books for more than 40 years, with no obvious effects. Cutting state spending on higher education has also been tried, and its main effects have been a vast increase in student debt and wholesale replacement of regular faculty with wretchedly paid and often underqualified adjunct professors. The spread of adjuncts has partly achieved another proposed solution: the abolition of tenure. The main results of this weakening of tenure have been to endanger the remaining professors who hold minority views and to shift still more power to administrators opposed to traditional education. By now too few dissenting administrators and professors are left to make reform from within a realistic option. In 1987, a group of professors founded the National Association of Scholars with all the right principles, but it and similar organizations have barely slowed the trends they oppose. Donors who have tried to use their money to encourage traditional education or a free exchange of ideas have seen their donations either refused or spent contrary to their wishes. The problem has grown too big and systemic for small or gradual solutions.

Yet elements of a potential solution exist. The growing dissatisfaction with the current regime could serve as the foundation for a new type of university altogether. People familiar with the glorious history of the Western university tradition are increasingly troubled by the intolerance on campus and inability of these schools to provide a good education in literature, history, the arts, and the sciences. The universities have moved so far to the left that they are now condemning views held by most citizens, parents, students, and donors. Even most professors are dissatisfied with their pay, their lack of prestige, their overbearing administrators, and the exploding numbers of adjunct professors. (The adjuncts, who now constitute well over half of the American professoriat, are unhappier still.) The fashions that have shaped today’s universities have resulted not from a reasoned debate but from a herd instinct, a sense of inevitability, and intellectual intimidation. These fashions began at a handful of leading universities—especially Harvard, Princeton, Yale, Berkeley, and Stanford—and have spread through their influence.

A new university, standing apart from the culture of this failed system, would offer the best hope for halting and ultimately reversing the dismal trends we now see.

Founding such a university would be a great work, one that could not be completed in a year or two. But before we can even begin, we need a conceptual blueprint. This essay is a thought experiment of sorts, a way of thinking through how such an institution could be created and what practices would best ensure its success.

A moment’s reflection should confirm how strange it is that no leading university has been founded in the United States since Leland Stanford endowed one in Palo Alto in 1891. American education has expanded exponentially during that time. Before founding his university, Stanford had a fortune that, adjusted for inflation, would not even put him among Forbes’s 400 richest Americans today, when the country has more and richer donors than ever before. In 2014, donors gave about $38 billion to higher education, more than the total endowment of Harvard (about $36 billion) and almost double the endowments of Princeton or Stanford (about $21 billion each). Many donors are troubled by the general campus hostility to free speech, capitalism, religion, and traditional education, but, with no good university of another kind to support, they give either to their alma mater, to existing schools, or to other causes.

These frustrated donors could find a cause in a new leading university with a full range of academic programs.

The university would not need to be larger than Princeton, which has around 1,000 professors, 5,000 undergraduates, and 2,500 graduate students. (Princeton’s administrative staff of roughly 1,000 is much larger than it needs to be.) Above that minimum, size ceases to be an advantage: Princeton is a far more important university than Arizona State, which has ten times as many students and faculty. An initial donation of several billion dollars, a sum within the means of many wealthy Americans, would probably attract enough additional donations to make a new leading university a reality. Paying for such a university would become still easier if it were founded (as Stanford was) as part of a new town planned by developers who would help fund the institution and create a pleasant place for its students and faculty to work and live.

Yet to begin, a donation of just several million dollars would suffice to form a planning group, with office space, a small staff, a travel budget, and fees for outside consultants and fundraisers. It could include professors from the National Association of Scholars and other experts on higher education who favor the project. This group could be given a year to prepare and publish a plan for a new university, with a deadline of five years to found the university if sufficient funds were pledged for it. The group’s plan should include a statement of principles (but not a “mission statement,” which at universities has become the last refuge of the scoundrel). Besides estimates of the basic costs of each stage of the university’s development and a proposed location for it, the plan should include an administrative structure, an undergraduate curriculum, and procedures for faculty hiring and student admissions.

America’s leading universities share some characteristics worth emulating. For one thing, their locations fit what might be called the Oxbridge model: They are within reach of an important metropolis but not so near as to be overshadowed by it. Just as Oxford and Cambridge are about an hour and a half away from London, so Princeton and Yale are within an hour and a half of New York, while Harvard, Stanford, and Berkeley are closer to the centers of the smaller metropolitan areas of Boston and San Francisco. All these universities dominate college towns of their own that range in population from Princeton’s 29,000 to Oxford’s 160,000. And all these towns have some attractive neighborhoods that combine the benefits of small towns with the amenities of big cities and (of course) of major universities. Accordingly, Oxford, Princeton, and Berkeley have more distinct and cohesive academic communities than universities located within major cities such as the University of London, Columbia, or UCLA. On the other hand, universities that are too far from major cities are at a disadvantage in attracting national attention and the best students and faculty.

Given all this, the best place for a new university might well be the metropolitan area of Washington, D.C., which now has no leading university. Washington has unique connections to the news media and government agencies that could give a new university much-needed visibility and influence in public affairs (and opportunities for internships). Washington also has major academic resources, such as the Smithsonian Institution, the National Gallery, the National Archives, and especially the Library of Congress. With access to the Library of Congress and the large and growing number of books and periodicals available online, a new university could forgo the full expense of assembling a great research library and could make do with a good library of its own, which should be affordable now that used books are becoming relatively cheap. The Washington exurbs are also a promising location for a new college town, which could attract people not directly connected with the university. There are suitable sites for such a town within 50 miles of Washington. (Since the 1960s, the successful planned towns of Reston, Virginia, and Columbia, Maryland, have both been developed within 25 miles of Washington, closer than Stanford is to San Francisco.) No comparable locations are available near, say, New York City.

Since almost all major universities now discriminate systematically against moderates, conservatives, religious believers, and people interested in traditional education, a university that put academic freedom and quality first could attract excellent professors and students from leading universities, lesser universities, and more conservative institutions where academics are undervalued. The university’s professors would on average be more independent-minded, more interesting, and more accomplished than professors at today’s leading schools, and unlike them would represent the views of the majority of educated people outside academia.

A concentration of moderate and conservative professors at a university that encouraged and rewarded them could form a real intellectual community from professors now scattered at different institutions across the country. The national media, who now look for experts and opinion leaders at Harvard, Princeton, and Berkeley—but not at conservative schools such as Hillsdale College, Baylor University, or Ave Maria University—might well seek experts and opinion leaders at a new leading school, if only to make news through a lively debate.

The new university should be traditional in character but not specifically “conservative” in politics. It should seek faculty and students who are interested in academics as such, not just as a vehicle for ideological expression and activism. The only ideologies it should deliberately exclude are postmodernism, deconstructionism, and other relativistic doctrines that insist nothing is objectively true and everything is an instrument of power. Although the university should welcome students and faculty of any religion or none, it would do well to dedicate itself formally to traditional Christianity and Judaism. Recent years have shown that the absence of religion from public life can quickly harden into outright hostility to religion, and that many of the main groups defending the right to hold moral views outside the leftist consensus are religious. The new university should nonetheless defend the rights of all students and citizens to express unfashionable views, even without invoking religion. This would require a strong legal department to contest the growing body of government regulations that are incompatible with free speech and academic quality.

Except for a language requirement restricted to languages with important literatures, the university’s curriculum should avoid “distribution requirements,” which force students to choose from lists of specialized courses in various fields. (For example, students at Harvard can satisfy their general requirement in “Aesthetic and Interpretive Understanding” with such courses as “American Dreams from Scarface to Easy Rider.”) These requirements now make a coherent education almost impossible because they force students to take overly specialized and doctrinaire courses. If all students should know something about a subject, they should be required to take a general survey course on it. The main outlines of a good program in general education should be obvious to anyone without a bias against Western civilization; examples can be found in the required program at Columbia and the Common Core required until recently at the University of Chicago. The courses should include great books that every educated person should have read, and the reading lists should be similar for all students to give them a common foundation of knowledge. Contrary to present practice at most universities, the new university should encourage survey courses on basic subjects and discourage idiosyncratic courses on narrow topics. The new university should also adopt guidelines for its undergraduate majors, which could ban departmental distribution requirements, encourage tutorials and survey courses, allow some courses to be prerequisites for others, but otherwise leave students discretion to plan their own courses of study.

Students should be admitted on the basis of academic criteria. Along with grades, essays, and test scores, interviews in person or by telephone or Skype can be especially useful for determining whether students show real signs of intellectual life. While admitting all applicants with the finest overall academic qualifications, the university should also admit some with extraordinary abilities in particular academic fields even if they are not necessarily “well-rounded.” Students should also be selected to ensure a variety of majors, as determined by the interests they mention when they apply. Since a university is among other things a social community, some attention should be given to students’ personalities, at least to the extent of holding antisocial applicants to higher intellectual standards than others. Again for social reasons, the university should make an effort to keep the student body from being lopsidedly male or female. Such adjustments, however, should not lead to rejecting any outstanding students or to admitting any undistinguished students. Easily offended students, or students who insist on saving the world before learning about it, should be encouraged to go elsewhere.

Hiring the right people to be the university’s president, provost, deans, and department chairmen would be essential, but the administration should remain as small and inexpensive as possible. Bloated administrative structures can in time elevate bureaucratic considerations over educational policy. This means the new university should have no vice presidents, few deans, and no associate or assistant deans. Special care should be given to hiring the dean of admissions and the department chairmen, who should be not only distinguished scholars but also gifted talent scouts. Although ideally the president should also be a distinguished scholar, an exception could be made for a figure with special talents as a fundraiser and as a public spokesman. The provost and deans should, however, be professors in the university’s departments, with ranks corresponding to their academic achievements, and with salaries never more than one and a half times those of the best-paid professors outside the administration. In order to discourage the growth of a special class of professional administrators, professors should frequently move in and out of the university administration.

Department chairmen should have the primary responsibility for hiring faculty in their departments, not just at first but permanently. A major problem with academic hiring today is that no single person is really responsible for any department as a whole. Professors end up choosing their colleagues not in the interests of the university, department, or students, but on the basis of their own likes and dislikes and to avoid being overshadowed by superior colleagues. Finding excellent scholars who are eager to hire other scholars as good as themselves or better is always hard, but it needs to be done only once for each department if the department chairman is in charge of hiring. The administration should study each department chairman’s hiring recommendations carefully and veto proposed offers to scholars who are less than distinguished. The administration should also always be ready to replace the department chairmen, who would naturally remain professors after being replaced as chairmen.

Each advertised position should be broadly defined, usually leaving the professorial rank open, in order to attract the largest number of applicants. The department chairmen should actively recruit outstanding scholars who might otherwise not apply, including foreign scholars with a good command of English. Positions should also be created for any truly great scholars who could be recruited. The speaking and teaching skills of applicants should be judged from guest lectures rather than from teaching evaluations by students, which can be manipulated by easy assignments and lenient grading.

The main grounds for hiring professors should be their records of research and publication, judged by originality, importance, accuracy, rigor, and clarity. A professor who has written original, important, accurate, rigorous, and lucid works will almost certainly be a good teacher of good students and will probably also be heard outside the university.

Professorial salaries at the new university should be on average somewhat higher than salaries paid at established leading universities. This would allow professors hired away from the leading universities to be compensated for moving to a new institution with a still-developing reputation. The salary scale should be made public, with clearly defined ranks and the same salary for every professor at each rank, ranging, for example, from Assistant Professor I to Full Professor XII. Each professor’s rank should be based on his academic and intellectual accomplishments. Significant deficiencies in teaching, particularly giving inflated grades, should be penalized. (Inflated grades can be detected through a statistical comparison of the grades professors give with their students’ overall grade-point averages.) Adjunct professors should be few (and mostly not academics), and paid regular professorial salaries adjusted for their teaching loads and qualifications (around ten times what most adjuncts are paid now). Faculty committees, which at most universities provide many distractions and few advantages, should be kept to a minimum.
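The statistical comparison mentioned above can be sketched in a few lines of Python. The data, names, and the half-point threshold here are entirely hypothetical, chosen only to illustrate the idea of comparing the grades a professor awards against the recipients’ overall grade-point averages:

```python
# Illustrative sketch (hypothetical data): flag professors whose average
# awarded grade consistently exceeds their students' overall GPAs.
# Each record pairs a grade the professor awarded with that student's
# overall GPA, both on a 4.0 scale.

def inflation_index(records):
    """Mean gap between the grades a professor awards and the overall
    GPAs of the students receiving them. Positive values suggest the
    professor grades more leniently than the student's other courses."""
    gaps = [awarded - gpa for awarded, gpa in records]
    return sum(gaps) / len(gaps)

grades = {
    "Prof. A": [(3.9, 3.1), (4.0, 3.3), (3.8, 3.0)],  # consistently lenient
    "Prof. B": [(3.0, 3.2), (2.8, 3.0), (3.1, 3.3)],  # tracks overall GPA
}

for prof, records in grades.items():
    idx = inflation_index(records)
    flag = "possible inflation" if idx > 0.5 else "ok"
    print(f"{prof}: {idx:+.2f} ({flag})")
```

A real deployment would of course need larger samples and controls for course difficulty; this merely shows that the comparison is mechanically simple.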

Since the university would soon grow too large to be a single community where everyone knew everyone else, it would need smaller units. Such units in American universities, including Harvard houses and Yale colleges, have failed to develop the sense of community of Oxford and Cambridge colleges because the American units lack a real function in the process of education, which is run instead by academic departments. The best solution would probably be to have departmental colleges, with residences, dining halls, classrooms, and faculty offices organized around departments or groups of related departments. Some junior faculty and graduate students would serve as tutors and live in the departmental colleges with the undergraduates. There should be no vocational departments—in other words, there should be a department of economics, but not a department of business administration. There should be no programs of women’s or ethnic studies, which usually turn out to be ideological rather than academic. All students should have private rooms to keep roommates from disturbing one another’s studying or sleeping. Residential entryways should be segregated by sex and subject to sensible visiting hours. Students should be required to live on campus.

Every student should be required before enrolling to subscribe to an honor code, which should include pledges never to engage in cheating or fraud and never to obstruct the free speech of others. Serious violations of the honor code should be punished by expulsion. Deciding exactly what to do about underage drinking, misdemeanor drug use, and sexual activity is admittedly difficult for universities today, when our laws consider most college-age students responsible enough to vote but too irresponsible to drink a glass of beer. But as a rule, the university should avoid turning every form of behavior into either a crime or a civil right. Rape is a serious crime and should be investigated and punished not by the university but by the police and the courts. If misguided regulations force the university to try rape cases, specially hired lawyers and retired judges should handle them in a setting as much like a courtroom as possible. If the university does its job well, most students should have little time for alcohol or drugs, and those who let drugs or alcohol affect their studies would soon be suspended or dismissed for academic reasons. The university should limit its athletic programs to intramural teams and to providing students and professors with facilities for exercise and recreation.

Some might argue that starting a new university would be more difficult and expensive than reforming an old one. In theory, no doubt, enlightened trustees at an existing university could name a determined and forceful new president, who could then select new deans and department chairmen, introduce a rigorous curriculum, reform hiring and admissions, and even reorganize the university around departmental colleges. In practice, however, such a president would surely face a student and faculty revolt over the new curriculum and the new program of hiring and admissions. A newly founded university could choose administrators, professors, and students who supported its goals, but attempting reforms at an existing university would create discord that would last for years and disrupt any changes. On the other hand, the example and competition of a successful new university would eventually make reforms easier to promote at other universities.

Another argument against such a university is that it would discredit itself among academics and intellectuals because its faculty and student body would be disproportionately male and white or Asian. In fact, today’s academic hiring and admissions usually favor only blacks, Hispanics, and women who hold views that the universities favor. And those women, blacks, and Hispanics who want to study subjects unrelated to the group identities that they are supposed to share are seldom hired at leading institutions and are given little preference in admissions. The new university should therefore be able to recruit significant numbers of good female, black, and Hispanic professors and students. In any case, the university should resist the dogma that something is wrong with a society unless every activity and profession has the same proportion of each race and sex as the population as a whole. Discrimination against women and minorities is practically nonexistent in American higher education today, and it is certainly negligible in comparison with the massive discrimination in favor of women and minorities and against moderates and conservatives.

The leading universities enjoy three great advantages: money, prestige, and a lack of competition. Yet these institutions spend most of their money on things that contribute nothing to academics, such as bloated and overpaid administrative staffs, perpetually hoarded endowments, unnecessary new facilities, lavish intercollegiate athletics, and ideological programs of no academic value. A recent survey found that Harvard spent 40 percent of its budget on administration and just 29 percent on instruction. The minute percentage of money the leading universities spend on bidding for professors usually goes to candidates chosen for their ideology, race, or gender. Growing public awareness of the decline in higher education has damaged the prestige of its leading institutions but hasn’t hurt their ability to get students, professors, and donations, because most of the less prestigious schools have declined even more. This lack of competition is in fact the essential advantage today’s leading institutions possess. The way to break their stranglehold on the system is to create competition where none now exists. This plan for a new university would do exactly that.

Must-Reads from Magazine

Chicago, Illinois — Andy has little time to chitchat. There are hundreds of hot towels to sort and fold, and when that’s done, there are yet more to wash and dry. The 41-year-old is one of half a dozen laundry-room workers at Misericordia, a community for people with disabilities in the Windy City. He and his colleagues, all of whom are intellectually disabled and reside on the Misericordia “campus,” know that their work has purpose, and they delight in each task and every busy hour.

In addition to his job at the laundry room, Andy holds two others. “For two days I work at Sacred Heart”—a nearby Catholic school—“and at Target. Target is a store, a big super-store. At Sacred Heart, I sweep floors and tables.”

“Ah, so you’re the janitor there?” I follow up.

“No, no! I just clean. I love working there.”

Andy’s packed schedule is typical for the higher-functioning residents at Misericordia, many of whom juggle multiple jobs. Their work at Misericordia helps meet real community needs—laundry, recycling, gardening, cooking, baking, and so on—while preparing residents for the private labor market. Andy has already found competitive employment (at Target), but many others rely on Misericordia’s own programs to stay active and employed.

Yet if progressive lawmakers and minimum-wage crusaders have their way, many of these opportunities will disappear, along with the Depression-era law that makes them possible.

The law, Section 14(c) of the Fair Labor Standards Act, permits employers to pay people with disabilities a specialized wage based on their ability to perform various jobs. It thus encourages the hiring of the disabled while ensuring that they are paid a wage commensurate with their productivity. The law safeguards against abuse by, among other things, requiring employers to regularly review and adjust wages as disabled employees make productivity gains. Many of these employers are nonprofit entities that exist solely to provide meaningful work for the disabled.

Only 20 percent of Americans with disabilities participate in the labor force. The share is even smaller among those with intellectual and developmental disabilities. For this group, work isn’t mainly about money—most of the Misericordia residents are oblivious to how much they get paid—so much as it is about purpose and community. What the disabled seek from work is “the feeling of safety, the opportunity to work alongside friends, and an atmosphere of kindness and understanding,” says Scott Mendel, chairman of Together for Choice, which campaigns for freedom of choice for the disabled and their families. (Mendel’s daughter, who has cerebral palsy, lives and works at Misericordia.)

Abstract principles of economic justice, divorced from economic realities and the lived experience of people with disabilities, are a recipe for disaster in this area. Yet that’s the approach taken by too many progressives these days.

Last month, for example, seven Senate progressives led by Elizabeth Warren of Massachusetts wrote a letter to Labor Secretary Alexander Acosta denouncing Section 14(c) for setting “low expectations for workers with disabilities” and relegating them to “second-class” status. The senators also took issue with so-called sheltered workshops, like those at Misericordia, which are specifically designed to help the disabled find pathways to market employment. Activists at the state level, meanwhile, continue to press for the abolition of such programs, and they have already succeeded in restricting them in a number of jurisdictions, most notably in Pennsylvania, where such settings have been all but eliminated.

While there have been a few notorious cases of 14(c) and sheltered-workshop abuse over the years, existing law provides mechanisms for punishing firms for misconduct. Getting rid of 14(c) and sheltered workshops, however, could leave hundreds of thousands of disabled people unemployed. Activists have yet to explain what it is they expect these newly jobless to do with their time.

Competitive employment simply isn’t an option for many of the most disabled. And even those like Andy, who are employed in the private economy, tend to work at most 20 hours a week at their competitive jobs. What would they do with the rest of their time, if sheltered workshops didn’t exist? Most likely, they would “veg out” in front of a television. Squeezing the 14(c) program and forcing private employers to pay minimum wage to workers whose productivity falls far short of the norm wouldn’t improve the lot of the disabled; it would leave them jobless.

Economic reality is reality no less for the disabled.

Nor have progressives accounted for the effects on the lives of the disabled in jurisdictions that have restricted sheltered workshops. “None of these states have done an adequate job of ascertaining whether these actions actually enhanced the quality of life for the individuals affected,” a study in the Social Improvement Journal concluded last year. Less time in sheltered workshops, the study found, “was not replaced with a corollary increase in the use of more integrated forms of employment.” Rather, “these individuals were essentially unemployed, engaging in made-up day activities.”

Make-work is not what Andy and his colleagues are up to today at Misericordia. They complete real tasks, which benefit their fellow residents in concrete ways. “This work is training, but it also gives them meaning,” one Misericordia director told me. “It’s not just doing meaningless work, but it’s going toward something. We’re not setting them up to do something that someone else takes apart. This is something that’s needed.” Yet, in the name of economic justice, progressives are on the verge of depriving men and women like Andy of the dignity of work and the freedom of choice that non-disabled Americans take for granted.


To paraphrase New York Times columnist Ross Douthat (with apologies), the less Republicans do in office, the more popular they generally become. That is, when the GOP exists solely in voters’ minds as a bulwark against cultural and political liberalism, it can cobble together a winning coalition. Likewise, Democrats regain the national trust when they serve only as an obstacle to Republican objectives. It’s when both parties begin to talk about what they want to do with their power that they get into trouble.

That is an over-simplification, but the core thesis is an astute one. In an age of negative partisanship and without an acute foreign or domestic crisis to focus the national mind, it’s not unreasonable to presume that both parties’ chief value is defined in negative terms by the public. Considering how little of the national dialogue has to do with policy these days, general principles and heuristics are probably how most marginal voters navigate the political environment.

Somewhere along the way, though, Democrats managed to convince themselves that they cannot just be the anti-Donald Trump party. Their most influential members have become convinced that the party needs to articulate a positive agenda beyond a set of vague principles. For the moment, Democrats who merely want to present themselves as unobjectionable alternatives to Trumpism without going into much broader detail appear to be losing the argument.

According to a study of campaign-season advertisements released on Friday by the USA Today Network and conducted by Kantar Media’s Campaign Media Analysis Group, Democrats are not leaning into their opposition to Trump. While over 44,000 pro-Trump advertisements from Republican candidates have aired on local broadcast networks, only about 20,000 Democratic ads have highlighted a candidate’s anti-Trump bona fides. “Trump has been mentioned in 27 [percent] of Democratic ads for Congress, overwhelmingly in a negative light,” the study revealed. In the same period during the 2014 midterm election cycle, by contrast, 60 percent of Republican advertisements featured President Barack Obama in a negative light.

There are plenty of caveats that should prevent observers from drawing too many broad conclusions about what this means. First, comparing the political environment in 2018 to 2014 is apples and oranges. Recall that 2014 was Barack Obama’s second midterm election, so naturally enthusiasm among the incumbent party’s base to rally to the president’s defense wanes while the “out-party’s” anxiety over the incumbent president grows. If Donald Trump’s job-approval rating is still anemic in September, it is reasonable to expect that Republican candidates will soft-pedal their support for the president just as Democrats did in 2010. Second, Democrats competing against fellow Democrats in primary races may not feel the need to emphasize their opposition to the president, since that doesn’t create a stark enough contrast with their opponent.

And yet, the net effect of the primary season is the same. Democrats aren’t just informing voters of their opposition to how Trump and the Republican Party have managed the nation’s affairs; they’re describing what they would do differently. By and large, the Democratic Party’s agenda consists of “doubling” spending on social-welfare programs, education, and infrastructure, and promising a series of five-year-plan prestige projects. But Democratic candidates are also leaning heavily into divisive social issues.

The themes that Democratic ads have embraced so far range from support for new gun-control measures (“f*** the NRA” was one New Mexico candidate’s message), to protecting public funding for Planned Parenthood, to promoting support for same-sex marriage rights, to attacking Sinclair Broadcasting (which happened to own the network on which that particular ad ran). A number of Democratic candidates are running on their support for a single-payer health-care system, including the progressive candidate in Nebraska’s GOP-leaning 2nd Congressional District who narrowly defeated an establishment-backed former House member this week, putting that seat farther out of Democrats’ reach in November.

In the end, messages like these animate the Democratic Party’s progressive base, but they have the potential to alienate swing voters. That may not be enough to overcome the electorate’s tendency to reward the “out-party” in a president’s first midterm election. And yet, the risk Democrats run by being specific about what they actually want to do with renewed political power cannot be dismissed. Democrats in the activist base are convinced that embracing conflict-ridden identity politics is a moral imperative, and the party’s establishmentarian leaders appear to believe that being anti-Trump is not enough to ensure the party’s success in November. All the while, the Democratic Party’s position in the polls continues to deteriorate.


A running theme in Jonah Goldberg’s fantastic new book, Suicide of the West, is the extent to which those who were bequeathed the blessings associated with classically liberal capitalist models of governance are cursed with crippling insecurity. Western economic and political advancement has followed a consistently upward trajectory, albeit in fits and starts. Yet, the chief beneficiaries of this unprecedented prosperity seem unaware of that fact. In boom or bust, the verdict of many in the prosperous West remains the same: the capitalist model is flawed and failing.

Capitalism’s detractors are as likely to denounce the exploitative nature of free markets during a downturn as they are to lament the displacement and disorientation that follows when the economy roars. The bottom line is static; only the emphasis changes. Though this tendency is a bipartisan one, capitalism’s skeptics are still more at home on the left. With the lingering effects of the Great Recession all but behind us, the liberal argument against capitalism’s excesses has shifted from mitigating the effects on low-skilled workers to warnings about the pernicious effects of prosperity.

Matthew Stewart’s expansive piece in The Atlantic this month is a valuable addition to the genre. In it, Stewart attacks the rise of a permanent aristocracy resulting from the plague of “income inequality,” but his argument is not a recitation of the Democratic Party’s 2012 election themes. It isn’t just the mythic “1 percent” (or, in the author’s estimation, the “top 0.1 percent”) but the top 9.9 percent that has not only accrued unearned benefits from capitalist society but has fixed the system to ensure that those benefits are hereditary.

Stewart laments the rise of a new Gilded Age in America, which is anecdotally exemplified by his own comfort and prosperity—a spoil he appears to view as plunder stolen from the blue-collar service providers he regularly patronizes. You see, he is a member of a new aristocracy, which leverages its economic and social capital to wall itself off from the rest of the world and preserve its influence. He and those like him have “mastered the old trick of consolidating wealth and passing privilege along at the expense of other people’s children.” This corruption, and Stewart’s insecurity about it, are, he contends, products of consumerism. “The traditional story of economic growth in America has been one of arriving, building, inviting friends, and building some more,” Stewart wrote. “The story we’re writing looks more like one of slamming doors shut behind us and slowly suffocating under a mass of commercial-grade kitchen appliances.”

Though he diverges from the kind of scientistic Marxism reanimated by Thomas Piketty, Stewart nevertheless appeals to some familiar Soviet-style dialectical materialism. “Inequality necessarily entrenches itself through other, nonfinancial, intrinsically invidious forms of wealth and power,” he wrote. “We use these other forms of capital to project our advantages into life itself.” In this way, Stewart can have it all. The privilege enjoyed by the aristocracy is a symptom of Western capitalism’s sickness, but so, too, are the advantages bestowed on the underprivileged. Affirmative action programs in schools, for example, function in part to “indulge rich people in the belief that their college is open to all on the basis of merit.”

It goes on like this for another 13,000 words and, thus, has the strategic advantage of being impervious to comprehensive rebuttal outside of a book. Stewart does make some valuable observations about entrenched interests, noxious rent-seekers, and the perils of empowering the state to pick economic winners and losers. Where his argument runs aground is his claim that meritocracy in America is an illusion. The notion that capitalism is a brutal zero-sum game, in which true advancement is rendered unattainable by unseen forces, is a foundational plank of the liberal American ethos. This is not new. Not new at all.

Much of Stewart’s thesis can be found in a 2004 report in The Economist, which alleges that the American upper-middle class has created a set of “sticky” conditions that preserve its status and result in what Teddy Roosevelt warned could become an American version of a “hereditary aristocracy.” In 2013, the American economist Joseph Stiglitz warned that the American dream is dead, and the notion that the United States is a place of opportunity is a myth. “Since capitalism required losers, the myth of the melting pot was necessary to promote the belief in individual mobility through hard work and competition,” read a line from a 1973 edition of a National Council for the Social Studies-issued handbook for teachers. The Southern Poverty Law Center, which for some reason produces a curriculum for teachers, has long recommended that educators advise students that poverty is a result of systemic factors and not individual choices. Even today, a cottage industry has arisen around the notion that Western largess is decadence, that meritocracy is a myth, and that arguments to the contrary are acts of subversion.

The belief that American meritocracy is a myth persists despite wildly dynamic conditions on the ground. As the Brookings Institution noted, 60 percent of employed black women in 1940 worked as household servants, compared with just 2.2 percent today. Between 1940 and 1970, “black men cut the income gap by about a third,” wrote Abigail and Stephan Thernstrom in 1998. The black professional class, ranging from doctors to university lecturers, exploded in the latter half of the 20th Century, as did African-American home ownership and life expectancy rates. The African-American story is not unique. The average American income in 1990 was just $23,730 annually. Today, it’s $58,700—a figure that well outpaces inflation and outstrips most of the developed world. The American middle class is doing just fine, and that experience has not come at the expense of Americans at or near the poverty line. As the economic recovery began to take hold in 2014, poverty rates declined precipitously across the board, though that effect was more keenly felt by minority groups, which recovered at faster rates than their white counterparts.

As National Review’s Max Bloom pointed out last year, 13 of the world’s top 25 universities and 21 of the world’s 50 largest universities are located in America. The United States attracts substantial foreign investment, inflating America’s much-misunderstood trade deficit. The influx of foreign immigrants and legal permanent residents streaming into America looking to take advantage of its meritocratic system rivals or exceeds immigration rates at the turn of the 20th Century. You could be forgiven for concluding that American meritocracy is self-evident to all who have not been informed of the general liberal consensus. Indeed, according to an October 2016 essay in The Atlantic by Victor Tan Chen, the United States so “fetishizes” meritocracy that it has become “exhausting” and ultimately “harmful” to its “egalitarian ideals.”

Stewart is not wrong that there has been a notable decline in economic mobility in this decade. That condition is attributable to many factors, ranging from the collapse of the mortgage market to the erosion of the nuclear family among lower- to middle-class Americans (a charge supported by none-too-conservative venues like the New York Times and the Brookings Institution). But Mr. Stewart will surely rejoice in the discovery that downward economic mobility is alive and well among the upper class. National Review’s Kevin Williamson observed in March of this year that the Forbes billionaires list includes remarkably few heirs to old money. “According to the Bureau of Labor Statistics, inherited wealth accounts for about 15 percent of the assets of the wealthiest Americans,” he wrote. Moreover, that list is not static; it churns, and that churn is reflective of America’s economic dynamism. In 2017, for example, “hedge fund managers have been displaced over the last two years not only by technology billionaires but by a fish stick king, meat processor, vodka distiller, ice tea brewer and hair care products peddler.”

There is plenty to be said in favor of America’s efforts to achieve meritocracy, imperfect as those efforts may be. But few seem to be touting them, preferring instead to peddle the idea that the ideal of success in America is a hollow simulacrum designed to fool its citizens into toiling toward no discernible end. Stewart’s piece is a fine addition to a saturated marketplace in which consumers are desperate to reward purveyors of bad news. Here’s to his success.


We try, we really do try, to sort through the increasingly problematic “Russian collusion” narrative and establish a timeline of sorts—and figure out what’s real and what’s nonsense. Do we succeed? Give a listen.


On July 4, 1863, Rabbi Sabato Morais of Philadelphia’s Mikveh Israel congregation ascended the pulpit to deliver the Sabbath sermon. Those assembled in the synagogue knew that over the previous few days, Union and Confederate forces had been locked in an epic battle at Gettysburg, but they had no idea who had won or whether Confederate forces would continue onward to Washington or Philadelphia. That year, July 4 coincided with the 17th of Tammuz, when Jews commemorate the Roman breach of the walls of Jerusalem. Morais prayed that God not allow Jerusalem’s fate to befall the American capital and assured his audience that he had not forgotten the joyous date on which he spoke: “I am not indifferent, my dear friends, to the event, which, four score and seven years ago, brought to this new world light and joy.”

An immigrant from Italy, Morais had taught himself English using the King James Bible. Few Americans spoke in this manner; not even Abraham Lincoln did. Three days later, the president himself reflected before an audience: “How long ago is it?—eighty-odd years—since on the Fourth of July for the first time in the history of the world a nation by its representatives assembled and declared as a self-evident truth that ‘all men are created equal.’” Not until several months later, at the dedication of the Gettysburg cemetery, would Lincoln refer to the birth of our nation in Morais’s manner, making “four score and seven years ago” one of the most famous phrases in the English language and thereby endowing his address with a prophetic tenor and scriptural quality.

This has led historians, including Jonathan Sarna and Marc Saperstein, to suggest that Lincoln may have read Morais’s sermon, which had been widely circulated. Whether or not this was so, the Gettysburg Address parallels Morais’s remarks in that it, too, joins mourning for the fallen with a recognition of American independence, allowing those who had died to define our appreciation for the day that our “forefathers brought forth a new nation conceived in liberty.” Lincoln’s words stressed that a nation must always link civic celebration of its independence with the lives given on its behalf. Visiting the cemetery at Gettysburg, he argued, requires us to dedicate ourselves to the unfinished work that “they who fought here have thus far so nobly advanced.” He went on: “From these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion,” thereby ensuring that “these dead shall not have died in vain.”

The literary link between Morais’s recalling of Jerusalem and Lincoln’s Gettysburg Address makes it all the more striking that it is the Jews of today’s Judea who make manifest the lessons of Lincoln’s words. Just as the battle of Gettysburg concluded on July 3, Israelis hold their Memorial Day commemorations on the day before their Independence Day celebrations. On the morning of the Fourth of Iyar, a siren sounds throughout the land, with all pausing their everyday activities in reverent memory of those who died. There are few more stunning images of Israel today than those of highways on which thousands of cars grind to a halt, all travelers standing at the roadside, and all heads bowing in commemoration. Throughout the day, cemeteries are visited by the family members of those lost. Only in the evening does the somber Yom Hazikaron give way to the joy of the Fifth of Iyar’s Yom Ha’atzmaut, Independence Day. For anyone who has experienced it, the two days define each other. Those assembled in Israel’s cemeteries facing the unbearable loss of loved ones do so in the knowledge that it is the sacrifice of their beloved family members that makes the next day’s celebration of independence possible. And the celebration of independence is begun with the acknowledgement by millions of citizens that those who lie in those cemeteries, who gave “their last full measure of devotion,” obligate the living to ensure that the dead did not die in vain.

The American version of Memorial Day, like the Gettysburg Address itself, began as a means of decorating and honoring the graves of Civil War dead. It is unconnected to the Fourth of July, which takes place five weeks later. Both holidays are observed by many (though not all) Americans as escapes from work, and too few ponder the link between the sacrifice of American dead and the freedom that we the living enjoy. There is thus no denying that the Israelis’ insistence on linking their Independence Day celebration with their Memorial Day is not only more appropriate; it is more American, a truer fulfillment of Lincoln’s message at Gettysburg.

In studying the Hebrew calendar of 1776, I was struck by the fact that the original Fourth of July, like that of 1863, fell on the 17th of Tammuz. It is, perhaps, another reminder that Gettysburg and America’s birth must always be joined in our minds, and linked in our civic observance. It is, of course, beyond unlikely that Memorial Day will be moved to adjoin the Fourth of July. Yet that should not prevent us from learning from the Israeli example. Imagine if the third of July were dedicated to remembering the battle that concluded on that date. Imagine if “Gettysburg Day” involved a brief moment of commemoration by “us, the living” for those who gave the last full measure of devotion. Imagine if tens—perhaps hundreds—of millions of Americans paused in unison from their leisure activities for a minute or two to reflect on the sacrifice of generations past. Surely our observance of the Independence Day that followed could not fail to be affected; surely the Fourth of July would be marked in a manner more worthy of a great nation.
