By Josh Zeitz

Politico Magazine July 20, 2015

Vietnam wasn’t supposed to rear its head in 2016. With the election of Barack Obama, the first president to have come of age after the war’s close, many political observers expected that the quadrennial debate over who served and who dodged—an issue in every presidential election from 1992 through 2004—was at last over. Leave it to Trump to drag it back into the public square on Saturday, when he derogated the wartime service of Sen. John McCain, a combat veteran who endured five years of torture as a POW in the notorious Hanoi Hilton. “I like people that weren’t captured,” he said.

The Donald, who received a medical deferment in 1968 for bone spurs in his heels, seems genuinely confused by the backlash. It would be easy to write his nescience off as a form of adolescent self-absorption (though, in fairness to adolescents, most probably know how to recognize a war hero when they see one).

But part of his problem owes to a lasting historical legacy of the Vietnam War. Simply put, Vietnam was an internal class war as well as a war against a foreign belligerent. Unlike all American conflicts that preceded it, Vietnam drew sharp lines between those with means and those without. Young men from privileged backgrounds who served in Vietnam, like John McCain and John Kerry, usually did so electively, and as officers. Most working-class men, on the other hand, had no choice. They could join or be drafted, and almost always they served in the enlisted ranks.

We tend to lump the “sixties generation” into one undifferentiated cohort. But there was considerable divergence between the experiences of working-class men and those of their more privileged peers. This departure explains much about politics in the 1970s and 1980s, as well as some of Donald Trump’s current struggle.

***

In September 1967 the New York Times spent several days following a group of 18-year-old students as they arrived at area colleges. Freshman orientation, the paper observed, was a wonderland of “boat rides, excursions and get-together dinners.” From the moment of their arrival, freshmen were greeted with open arms and made to feel like important members of the collegiate community. At Columbia University, volunteers helped them move into their dorm rooms. University administrators hosted teas and lunch receptions to welcome them to campus. At nearby schools like Vassar and Hofstra, students learned that they were free to attend faculty and administration meetings. At Baruch College, part of the City University of New York system, the associate dean assured freshmen that if they had “any problems or complaints, come and talk to me about it. My door is always open.”

Hundreds of miles and many worlds away, young men like Ron Kovic experienced an altogether different rite of passage. Filing off a military bus at Parris Island, South Carolina, in the pitch dark of night, Kovic and his fellow Marine recruits were greeted by a tall, muscular drill instructor who gave them three seconds to line up on yellow-painted footprints spanning the hard concrete parade deck. “Awright, ladies!” the DI barked. “My name is Staff Sergeant Joseph. This is Sergeant Mullins. I am your senior drill instructor. You will obey both of us. You will listen to everything we say. You will do everything we tell you to do. Your souls today may belong to God, but your asses belong to the United States Marine Corps.”

While college deans invited incoming students to join them for sandwiches and orientation lectures, Staff Sergeant Joseph berated his trainees. “There are eighty of you, eighty young warm bodies,” he yelled, “eighty sweatpeas … and I want you maggots to know today that you belong to me … until I have made you into marines.”

Roughly 27 million young men came of draft age between 1964 and 1973—the peak years of American military engagement in Southeast Asia. Of that total, 2.5 million men served in the Vietnam War. Roughly 25 percent of all enlisted men who served in Vietnam were from poor families, 55 percent from working-class families, and 20 percent from the ranks of the middle class. In an era when half of all Americans claimed at least some post-secondary education, only 20 percent of Vietnam War servicemen had been to college, while a staggering 19 percent had not completed 12th grade. “When I was in high school, I knew I wasn’t going to college,” remembered a typical recruit. “It was really out of the question. Even graduating from high school was a big thing in my family.”

Among enlisted men who fought in Vietnam, roughly one-third were drafted, one-third joined entirely out of choice and one-third were “draft-motivated” enlistees who expected to be swept up by the Selective Service and volunteered in hopes of choosing the branch and location of their service. Many recruits who joined of their own volition had few alternative options. Unemployment rates for young men hovered around 12.5 percent in the late 1960s (over double that figure for young black men), and even in places where unemployment was low, companies were reluctant to hire and train young working-class men, for fear they would soon be drafted. “You try to get a job,” explained one such unemployed man, “and the first thing they ask you is if you fulfilled your military service.”

By contrast, middle-class boomers enjoyed a host of options in avoiding the draft. The government extended deferments to students enrolled in college or graduate school, but only to those who were full-time students. For one draftee who was working his way through the University of Hartford, the deferment system proved useless. “I was in school,” he recalled. “But I was only carrying a course load of nine credits. You had to have 12 or 15 back then [to earn a deferment]. But I was working two jobs and didn’t have time for another three credits.” Selective Service snatched him up.

Potential conscripts could also avoid the draft if they furnished military authorities with proof of psychiatric or medical ineligibility, but as a general rule, few working-class families enjoyed regular access to private physicians who could furnish or fabricate evidence of long-term treatment for a qualifying disability. Even something as simple as orthodontic braces was grounds for ineligibility, but few working-class men could afford to pay $2,000 for elective dental work.

Because of the built-in bias in the draft system, Vietnam split Americans by class and geography. Three affluent towns in Massachusetts—Milton, Lexington and Wellesley—lost 11 young men in the war out of a total population of roughly 100,000. Nearby Dorchester, a working-class enclave with a comparable population, saw 42 of its sons die in Southeast Asia. A study conducted in Illinois found that young men from working-class neighborhoods were four times as likely to be killed in the war as men from middle-class neighborhoods, while in New York, Newsday studied the backgrounds of 400 Long Island men who died in Vietnam and concluded that they “were overwhelmingly white, working-class men. Their parents were typically blue collar or clerical workers, mailmen, factory workers, building tradesmen, and so on.” In 1970, where a man lived, who his parents were, and how he grew up mattered enormously.

***

For most enlisted men who fought on the front lines in Vietnam, boot camp followed a predictable pattern. “They strip you, first your hair,” one veteran recalled. “I never saw myself bald before. … Guys I had been talking to not an hour before—we were laughing and joking—I didn’t recognize no more. … It’s weird how different people look without their hair. That’s the first step.” New servicemen entered a grueling routine of physical and mental conditioning that began each day at 4:00 a.m. and lasted until after sunset. Long hours of pushups, sit-ups, marches and outdoor infantry training were de rigueur.

After basic, new servicemen underwent several weeks of training for their military occupational specialty (MOS) and then shipped off for the balance of their service. For many enlisted men, this meant 12 or 13 months in Vietnam, followed by another six months of stateside service.

From the very start, war was surreal. Rather than send servicemen by military transport, the government contracted with commercial airlines to shuttle fresh troops to Southeast Asia. The sleek civilian jets were “all painted in their designer colors, puce and canary yellow,” remembered one veteran. “There were stewardesses on the plane, air conditioning. You would think we were going to Phoenix or something.” One veteran remembered that “you could cut the fear on that plane with a knife. You could smell it.”

By Matt Jacobs

HNN July 21, 2015

The restoration of U.S. and Cuban diplomatic ties is quite an event, particularly given the hostility that defined relations between the two countries for so long. President Obama’s decision to re-open an embassy in Havana and Raul Castro’s agreement to do the same in Washington continues the thaw in U.S.-Cuban relations. The steps taken by both countries have generated much publicity over the past few months. Numerous U.S. media outlets have produced stories on the implications for Obama’s legacy and the potential fallout for 2016 presidential candidates. As usual, Washington politicians and pundits have focused their attention on the reasons for the U.S. shift. Yet it is not President Obama’s decision to seek normalization that warrants the most attention; much more can be learned by concentrating on the Cuban government’s reasoning behind its determination to chart a new course in U.S.-Cuban relations.

Havana’s recent decisions are deeply rooted in what can best be termed Cuba’s “revolutionary pragmatism.” Though the Castro government continually speaks the language of revolutionary change, it has also taken a sensible view of foreign policy matters when necessary. Such an approach has guided Cuban engagement with the world from the 1960s to the present.

“Revolutionary pragmatism” traces back to the very beginning of the Castro regime. In the years immediately following the Cuban Revolution, for example, a top issue in U.S.-Cuban relations was Fidel Castro’s support for anti-U.S. guerrilla movements throughout Latin America. Castro repeatedly challenged Latin Americans and others around the world to stand up to the United States. He famously declared in 1962 that it was “the duty of every revolutionary to make the revolution. In America and the world, it is known that the revolution will be victorious, but it is improper revolutionary behavior to sit at one’s doorstep waiting for the corpse of imperialism to pass by.”

Yet, privately, Castro proved willing to develop a foreign policy based on practical considerations. On a recent research trip to Cuba I gained access to the Foreign Ministry Archive in Havana and was surprised at what I found. Many detailed reports from the early 1960s discussed the prospects for revolution in Central and South America, but concluded that conditions were not ripe in many nations for radical change. This reality led to a more pragmatic position being taken by leaders in Havana as they approached Latin America.

The most documented aid came in the form of training young Latin Americans who traveled to Cuba in guerrilla tactics. As historian Piero Gleijeses’s excellent studies demonstrate, Castro turned his attention to Africa as early as 1964. Havana’s decision to abandon any large-scale support for revolutionary groups in Latin America was not made due to a lack of enthusiasm for challenging Washington’s traditional sphere of influence, but owed instead to practical considerations.

Similarly, in the 1980s, when the Sandinista triumph in Nicaragua offered Havana an ally in Latin America, Castro held to “revolutionary pragmatism.” He counseled Daniel Ortega not to antagonize elite economic interests too much. On a visit to Managua, Castro even declared that allowing some capitalism in the Nicaraguan economy did not violate revolutionary principles. He bluntly told Nicaraguan leaders that they did not have to follow the path taken by Cuba: “Each revolution is different from the others.”

Perhaps the greatest illustration of Cuban flexibility was the Castro regime’s response to the collapse of the Soviet Union. In June 1990, after receiving word that aid from Moscow would no longer flow to Havana, Fidel Castro announced a national emergency. He called his initiative “the Special Period in Peacetime.” Cuba welcomed foreign investment, tourism, and the U.S. dollar, and allowed small-scale private businesses. While many prognosticators predicted a complete collapse of the Castro regime, the revolutionary government endured due to its ability to adapt.

Thus, recent developments must be viewed within their proper historical context. As it has in the past, Castro’s regime is pursuing “revolutionary pragmatism.”

The impetus for the change in Cuba’s approach owes to several factors. First, since the death of Hugo Chávez in 2013, Venezuela has become a questionable economic ally. Political instability coupled with a crumbling economy has likely caused Havana to view its key economic patron in Caracas as increasingly unreliable. A complete breakdown of order in Venezuela would deal a heavy blow to the Cuban economy. Thus, a better economic relationship with the United States is one way of insulating the island from a changing relationship with Venezuela.

Other reasons for Cuba’s rapprochement with the United States owe to domestic concerns. Since taking power in 2008, Raul Castro has been open to reforms in an attempt to make socialism work for the twenty-first century. Over the last few years the Cuban government has relaxed controls over certain sectors of the economy, but reforms have been slow and halting. Anyone who has spent time in Havana cannot help but notice the aging infrastructure and inefficient public transportation system. A key to any reform agenda is attracting foreign investment, and the United States stands as an attractive partner.

Furthermore, as Raul is poised to step down from power in 2018, Cuba is starting to make preparations for a successful turnover. An improving relationship with Washington may help his likely successor, Miguel Díaz-Canel, better navigate the transfer. In sum, at this point, normalization of U.S.-Cuban relations serves Havana’s best interests.

As far back as 2010, Raul Castro declared during a national address that “we reform, or we sink.” His recent push for renewed relations with the United States will likely bring an influx of U.S. tourists and more capital from American businesses. In turn, this could set Cuba down the path of other communist nations that embraced elements of capitalism, notably China and Vietnam. Just how far Raul will go with his reform agenda remains to be seen.

Ultimately, a U.S.-Cuban thaw is a positive step. Antagonism between the two countries serves no one, especially the Cuban people. Yet, we should not see the recent shifts as merely Washington changing course. The steps taken by Havana are equally important and should be viewed as part of a long history of shrewd diplomacy. While Cuban foreign policy has traditionally been revolutionary in rhetoric, it has proven once again to be pragmatic in practice.

Matt Jacobs received his PhD in History from Ohio University in 2015. This fall he will be a Visiting Assistant Professor of Intelligence Studies and Global Affairs at Embry-Riddle’s College of Security and Intelligence. He has conducted research at the Cuban National Archive and the Cuban Foreign Ministry Archive, both in Havana.

The New Deal welfare state was exclusionary and inequitable. We must envision and organize for something better.

Jacobin July 23, 2015

Since the creation of the free-market Liberty League by the DuPont brothers in 1934, hostile corporate leaders, financiers, economists, and lawmakers have been bent on destroying Franklin Roosevelt’s New Deal welfare state.

Wisconsin workers have seen their right to collective bargaining outlined in the New Deal’s Wagner Act gutted, while public pensions, created during the Great Depression to bolster public employment and ensure long-term economic security, have been attacked from Alaska to Florida. Congress also continues to chip away at the state-sponsored provision of basic needs, recently targeting the food stamp program (originally created under FDR) by proposing that all recipients hold jobs, suffer lifetime limits, and receive lower overall benefits.

To many observers, it appears that the New Deal and its safety net have been shredded. Political scientists and others have argued that the perilous individual economic risk that Americans faced before the New Deal has been foisted back on them as its collective protections have withered. With the shocking growth in economic inequality that has arisen alongside cuts to the New Deal, freedom from want — the keystone of Roosevelt’s “Four Freedoms” — has been chipped away to a pebble. It’s enough to make Americans long for a revival of the politics of the 1930s.

But we should be clear-eyed rather than nostalgic about the demise of the welfare state.

The New Deal was a flawed welfare system. It was built through exclusions and inequities, embracing some Americans while cutting out many others. Though its programs enveloped a wider swath of citizens over time — more non-whites, more women, and more marginal workers — their entrance into the safety net was hard fought and politically controversial. The fractured inequities the New Deal produced among the populace never really disappeared and, in some ways, widened and sharpened the divide between those inside of the New Deal’s protective web and those beyond it.

This was in part because New Deal programs were not the only vehicles for social welfare, but existed alongside other programs that also differentiated among citizens and their entitlements. The New Deal was never synonymous with the welfare state as many European countries developed it: comprehensive and universal social welfare programs for populations enjoying roughly equal citizenship rights.

Instead, the New Deal was part of a hodgepodge of varied and sometimes hidden social welfare programs — some public, some private — that rewarded different groups of Americans for different reasons.

Seeing the New Deal alongside the unwieldy and unequal panoply of American social welfare prevents us from indulging in an exceptionalist narrative of its history, or embracing a misplaced nostalgia for a glorious historical moment.

For Whom Was the New Deal a Deal?

For workers in steady industrial jobs, working year-round, the New Deal provided economic security through unionization, labor protections, and social insurance. The lucky Americans who held these jobs were largely male and white, beneficiaries of the sex and race-segregated labor markets of the time.

Reinforced by the economic growth of World War II, the GI Bill, postwar prosperity, and the union-corporate accords of the 1950s, New Deal supports afforded these men — and their families — a higher standard of living, even when they were too old or sick to work, than any common citizens of the United States had ever experienced.

Their unions protected them in the workplace. Their bank accounts were insured. For some, the Federal Housing Administration provided loans. Unemployment insurance offered unprecedented protection from the vagaries of the volatile capitalist economy. And Social Security offered the promise of retirement or, upon death, the protection of wives and children — a historic first for the working class.

The New Deal cast the net of economic security wider than ever before, but not wide enough to bring in vast numbers of Americans who labored outside of the steady, salaried primary labor market.

Unskilled non-industrial workers never made it inside the original New Deal’s safety net. Southern and Western representatives of agricultural interests would not abide social protections and entitlements for the largely non-white agricultural workforce in their states.

Southerners lobbied for the right to discriminate against African Americans, whom they feared would leave plantations and domestic work for higher paid public works jobs. One DuPont vice president, an early member of the Liberty League, wrote angrily to a political sympathizer about the “Five negroes on my place in South Carolina [who] refused work this spring . . . saying they had easy jobs with the government.”

If applied equally, the New Deal’s public works and public welfare programs could offer economic alternatives to poorly paid, exploited African-American agricultural and domestic labor. Southerners traded their votes for the white, male industrial programs of the New Deal in order to prevent such eventualities in their states.

Domestic workers, employed throughout the country and largely non-white and female, were not entitled either to labor protections or social insurance. Non-whites, largely African Americans and Mexican Americans, were denied insurance or union protections because of their low status in the secondary labor market, and they were also discriminated against in New Deal recovery and public works programs.

The National Recovery Administration gave hiring preference to whites and sanctioned separate, lower pay scales for African Americans. The Public Works Administration and Works Progress Administration offered fewer programs in the agricultural areas of the country where non-whites were concentrated.

The Civilian Conservation Corps operated racially segregated camps, and the New Deal’s agricultural programs offered incentives to white landowners to throw African-American tenants and sharecroppers off the land. In the South and West, state and local leaders used the discretionary powers granted by the federal public assistance programs to limit cash assistance to African Americans.

Women, like non-whites, found that the New Deal did not provide them a very good deal, at least not directly. In the 1930s, women constituted between 24 and 30 percent of workers in the labor market (their numbers in the labor market increased over the course of the decade). And although the Wagner Act legalized unionization, sex-segregated labor markets untouched by the New Deal meant that women only had access to about 10 to 15 percent of unionized jobs.

Fired in the face of men’s perceived need to possess the scarce jobs of the 1930s, many women sought public works jobs to support themselves. But women only received about 12 percent of New Deal public works program jobs — less than half as much as their representation in the labor market. The New Deal public works jobs that were open to them often placed them in the traditional sex-segregated female positions in which they labored in the private sector, like domestic service, sewing, and nursing.

Imagining them as secondary and non-essential, the New Deal cast women as less than full economic citizens, failing to offer even regular, salaried full-time female workers the same access to social and economic security as men. New Deal policymakers imagined women as fundamentally dependent on male breadwinners, and constructed New Deal social welfare programs around that image.

Social Security — the keystone of the New Deal welfare programs — also yoked women to their husbands in old age. Married women who had paid into the Social Security program would have to accept a share of their higher-paid husbands’ payments and forfeit their own. Policymakers instead siphoned off married women’s payments into the general revenues of the program.

Networks of Exclusion

The New Deal was not just limited — it was also only one of numerous coexisting systems of welfare provision. Over the course of the twentieth century, millions of Americans derived social and economic support through myriad other government “welfare states” outside the New Deal orbit. These programs tended to accentuate the inequities institutionalized in the New Deal, bringing greater economic security to white, male breadwinners in the primary labor market.

The military welfare state for veterans and active duty personnel shored up the economic and social security of the millions of Americans — overwhelmingly men — who served in the wars of the twentieth century.

The post–World War II GI Bill was practically a New Deal of its own. It vaulted millions of American men and their families into the middle class through tuition payments and stipends, and home, farm, and small-business loans. GI Bills for the veterans of Korea and Vietnam, while not as generous as the original, continued the tradition of veterans’ support.

With the creation of the all-volunteer armed forces in 1973, the military began to offer generous social and economic welfare programs in order to recruit and retain the mostly male personnel it needed. For the over ten million personnel who have served since then, and their tens of millions of spouses and children, the military has offered what might be the most comprehensive social welfare system in the United States.

The post–World War II tax system formed another bulwark, providing write-offs for heterosexual marriage, children, and home ownership. Often unrecognized, these provisions operated as what Suzanne Mettler has called a hidden welfare state, and their credits helped build the Ozzie and Harriet suburbs that sustained millions of white men and their families.

Many American men with good jobs in the primary labor market were also able to access a private safety net in addition to a public one. White-collar salaried workers for America’s large blue chip corporations — overwhelmingly male and white — as well as unionized blue-collar workers in America’s postwar factories — again, mostly male and white — negotiated private employer–provided insurance and medical programs. Subsidized and encouraged by the government through corporate tax incentives, private employee benefits supplied the largely male managerial and unionized industrial workforce a private supplement to the New Deal welfare state under which they were already covered.

Franklin Roosevelt’s New Deal thus provided one important avenue of social welfare rather than the sole path to welfare provision. But these patchwork welfare states all worked in a kind of herky-jerky synchronicity to shore up the well-being of the initial beneficiaries of the New Deal, while leaving most non-whites and women with second-class social and economic citizenship.

The New Deal Legacy

There is now a vigorous debate among historians about the New Deal’s legacy. Some, like Jefferson Cowie and Nick Salvatore, argue that the New Deal’s exclusions, while real, should not diminish its achievements.

The Wagner Act, fair labor standards, Social Security, unemployment insurance, public assistance, and public works programs — all provided greater “collective economic security” to more Americans than ever before. The New Deal programs established the basis for a principle of social and economic protection that, they argue, could in theory be expanded to others.

But a wealth of scholarship, by people like Ira Katznelson and Alice Kessler-Harris, reveals that sanguine analyses like these overlook the compromised foundation of the New Deal’s achievements: it was precisely the exclusion of blacks and Mexicans, and the imagining of women as dependent wives, that allowed for the creation of a New Deal welfare state for white male breadwinners in regularized industrial and union jobs. The architecture of protection for white men was built in part on the backs of those who were denied full economic and social citizenship.

Good, protected jobs and social welfare now existed as the laudable opposite of lesser jobs — and lesser citizens. Southern and Western landowners could still exploit non-white labor in the fields or on the docks. African Americans and women would face barriers to challenging white men in the primary labor market, while married women would continue to be reliant on male breadwinners and provide needed domestic labor in those homes. The limited citizenship of many non-whites and women was traded for — and literally made possible by — the granting of full social and economic citizenship to white men.

Over time New Deal programs did expand to include more Americans. Social Security was extended to nearly 90 percent of American workers by the 1970s, and by the mid-1970s poverty among older Americans had dramatically declined. Unemployment insurance also expanded significantly, softening some of the hardship of the business cycle. New programs covering disabilities of various kinds, both through insurance and public assistance, were created from the 1940s through the 1970s, and in the past twenty years have constituted the fastest-growing realm of social protection.

As these expansions took place, marginal workers, women, and African Americans began to finally demand their own “New Deal.” In fact, entitlement to social support and economic protection constituted one of the central goals of both the Black Freedom Movement and the feminist movement. Women and nonwhites argued that the New Deal’s support programs were hallmarks of equal citizenship. In these ways, “rights” movements actually functioned as fights for equal access to the safety net that white men already enjoyed.

But those already on the inside of the New Deal met the requests of women and non-whites for access to entitlements with sharp rebukes.

Historians like Thomas Sugrue, Lisa Levenstein, and David Freund have documented how white, mostly male communities of workers and homeowners rejected African Americans’ claims to the social protections that whites enjoyed, such as union-protected jobs, FHA loans, and access to public hospitals and schools. Marisa Chappell and Donald Critchlow have likewise demonstrated the ferocious backlash against women’s claims to equal employment opportunities and the feminist movement’s requests for social protections like childcare or maternity leave.

Traditional New Deal supporters balked at including non-whites and women, who now sought first-class citizenship. Indeed, they turned on those aspects of the welfare state most likely to benefit non-whites and women — public assistance, food stamps, and public housing.

Some in the traditional New Deal coalition colored the War on Poverty as a “black” program and rejected it even as Lyndon Johnson’s Great Society brought more highways, parks, and college loans to their suburban communities. They charged social movements demanding equal social protection with being divisive “individual rights” movements that undermined the imagined “collective spirit” of the “universal” New Deal and its legacy.

Even today, the charge that social movements’ focus on “individual rights” somehow fractured the liberal left and killed the New Deal coalition and its social welfare legacies carries weight among liberal and left scholars, even though it patently echoes the original unequal exclusion and entitlement of the New Deal.

Of course, this charge fuels even more fire among opponents of the New Deal, whose resistance to the inclusion of blacks and women formed an important prong of their assault.

Conservative Republicans and Southern Democrats had already beaten back additions to the New Deal — they soundly rejected legislation for full employment and universal health care in the 1940s. But their plans for rollback of the existing welfare state accelerated in the 1960s and 1970s amid the breakdown of the postwar economic order, just as African Americans and women began to gain access to the more inclusive programs of the New Deal as well as additional modes of social welfare.

In the 1980s, the frenzy over the “underclass” purportedly created by the War on Poverty and the obsession with eliminating Aid to Families with Dependent Children (AFDC) could not be understood without reference to the ways that racist and sexist ideas merged with philosophical support for “free” markets and opposition to “socialism” in the face of economic crisis.

In the past ten years, the Right has skillfully employed gendered and racialized dog whistles to delegitimize government itself through a strategy of “welfare-ization” of the state. Republicans liken public school teachers and road builders to “welfare queens” who bilk the taxpayers through their bloated benefits and dependency on government.

As Gov. Scott Walker said when he eliminated the right of Wisconsin’s public workers to collectively bargain for their wages and benefits: “We can no longer live in a society where public sector workers are the haves and the taxpayers who foot the bill are the have nots.” The jobs and pensions of government workers are now fair game.

Something From Nothing

Many of the social and political movements of today, on both the Left and the Right, lay bare the legacies of the New Deal’s opportunities and limitations. Those long cut out of the New Deal — and other social welfare programs — are still trying to secure first-class citizenship.

Retail and service workers enduring low wages, irregular hours, lack of benefits and time off, and layoffs are now involved in an increasingly recognized national movement: the $15 per hour movement. Minimum wage laws are being approved in municipalities and states, with successful democratic referenda behind many of them. Unorganized workers in various economic sectors are circumventing unionization and joining worker centers. Notably, these centers focus on entire communities of low-wage workers, and the communities’ needs, endeavoring to organize for social welfare, not just on-the-job protections.

The immigration movement — a literal fight for full citizenship — seeks access to all the social and economic protections undocumented people are now denied. Even the Black Lives Matter movement, which is primarily about policing, reflects the crises faced by communities that lack social and economic security and full citizenship. First-class citizenship means protection from police brutality along with rights to social and economic protections that can and should be shared among all citizens.

Yet the timing for these movements’ claims to the welfare state is precarious. The most marginal Americans are grasping for victories at the same moment that the long-term New Deal programs that first built a white, male middle class are coming under fire — the gutting of collective bargaining, the assault on public pensions, and even the continued threat of Social Security privatization. Whether the new movements will lay the basis for a fuller welfare state or are a last gasp before a full unraveling remains to be seen.

In this context New Deal nostalgia is a trap. It deludes us about happier times that were not in fact happy for many Americans. While the New Deal offered an unprecedented safety net for many, its holes allowed at least half of the population to fall through. And its dependence on unjust social arrangements accentuated inequalities among the population, as other parts of America’s piecemeal social welfare system amplified original exclusions.

Nostalgia’s backward-looking wistfulness discourages the vision necessary for change. New Dealers themselves never called on nostalgia for inspiration. With no existing welfare state, they could only look forward.

Those of us who value social and economic security, and embrace a radical program of social provision that challenges the drives of capital, must also look forward. We face a challenge just as difficult as the one facing activists and reformers in the 1930s — but of a far different kind.

Today we confront anti-state, pro-corporate politics stronger and more pervasive than during the Depression. We must incorporate the claims of a far more diverse set of Americans — all those still waiting on their New Deal — and we are not inventing welfare, but taking on the unprecedented task of building from the ashes after its end.

If we are to realize the long overdue New Deal for everyone — and then go beyond even that — we’ll need an abundance of imagination rather than nostalgia.

Jennifer Mittelstadt is an associate professor of history at Rutgers University. Her latest book, The Rise of the Military Welfare State, will be published by Harvard University Press this fall.

The US’s repeated imperialist interventions in Haiti have left a legacy of despotism.

Jacobin July 22, 2015

US Marines marching in Haiti in 1934. Bettmann / CORBIS

On July 28, 1915, the United States invaded Haiti and imposed its diktat on the nation for close to two decades. The immediate pretext for the military intervention was the country’s chronic political instability, which had culminated in the overthrow, mob killing, and bloody dismemberment of President Jean Vilbrun Guillaume Sam.

The American takeover was in tune with the Monroe Doctrine, first declared in 1823, which justified the United States’ presumption that it had the unilateral right to interfere in the domestic affairs of Latin America. But it was not until the late 1800s, when America had become a major world capitalist power, that it actually acquired the capacity to fulfill its extra-continental imperial ambitions. In 1898 it seized Cuba, Puerto Rico, and Guam and soon afterwards took control of the Philippines, the Dominican Republic, and Haiti.

The US’s goal was to transform the Caribbean into an “American Mediterranean” insulated from the influence of French, German, and Spanish power.

The 1915 invasion was in fact the culmination of America’s earlier interferences in Haiti — on eight separate occasions US marines had temporarily landed to allegedly “protect American lives and property.” The latter part of this claim was more accurate than the former, for these earlier skirmishes served to solidify and enhance the presence of American banking interests.

This priority became clear when, on December 17, 1914, US marines, acting on the orders of US Secretary of State William Jennings Bryan, forcibly removed Haiti’s entire gold reserve — valued at $500,000 — from the vaults of Banque Nationale. The bullion was transported to New York on the gunboat Machias and deposited in the National City Bank.

American imperialism had thus announced its designs; it was bent on undercutting French and German economic dominance as well as signaling to Haitian authorities that they would be forced to pay their debt to US private banks. From Washington’s perspective, Haiti had to establish a political order serving American economic and strategic objectives. Ultimately, the means to that end was an occupation.

The first task of the occupiers was to select a new president to replace Sam. Rosalvo Bobo, who headed a caco army that led the insurrection ending with Sam’s brutal demise, was on the verge of moving into the Palais National. The United States, however, had other ideas. Washington viewed Bobo as too nationalistic to assume the reins of power.

While Capt. Edward Beach, the chief of staff of Adm. Caperton, who led the Marines’ takeover of Haiti, acknowledged Bobo’s immense popularity, he deemed him “utterly unsuited to be Haiti’s President” because he was “an idealist and dreamer.” In fact, Beach informed Bobo that the United States considered him “a menace and a curse to [Haiti]” and thus forbade him to stand as a candidate for the presidency.

A revolutionary nationalist like Bobo was inimical to American interests. While he was being forced into exile and his cacos were launching a futile uprising against the occupying forces, Adm. Caperton installed a new president who would “realize that Haiti must agree to any terms laid down by the United States.” This new president was Philippe Sudré Dartiguenave.

The US not only imposed the unpopular Dartiguenave on Haiti, it also compelled Haitian authorities to sign a treaty legalizing the occupation. Caperton had orders “to remove all opposition” to the treaty’s ratification. If that failed, the United States had every intention to “retain control” and “proceed to complete the pacification of Haiti.”

Not surprisingly, on November 11, 1915 the Haitian Senate ratified the treaty and placed the country under an American protectorate. The United States was to take full control of the country’s military, law enforcement, and financial system. The repressive and fraudulent means by which the occupation was rendered officially “legal” symbolized what “democracy” and “constitutional rule” meant under imperial rule.

Not satisfied with the mere ratification of the treaty, the United States sought to compel the Haitian National Assembly to adopt a new constitution made in Washington. Faced with the assembly’s opposition, Maj. Smedley Butler, the head of the Gendarmerie d’Haiti — the military contingent created by the United States to replace the Haitian army that it had disbanded — arbitrarily dissolved the assembly.

Having no room to maneuver, Dartiguenave signed the decree of dissolution. In waging their own coup d’état, the occupying forces continued a long-held practice of Haitian politics, but they modernized it. As Butler proclaimed, the gendarmerie had to dissolve the assembly “by genuinely Marine Corps methods” because it had become “so impudent.”

The “impudence” of the assembly partly stemmed from its refusal to grant foreigners the right to own property in Haiti. The US found this refusal unacceptable and decided that a coup was warranted to impose the laws of the capitalist market.

Armed with military power, imbued with an imperial mentality, and convinced of their “manifest destiny” and racial superiority, the American occupiers expected deference and obedience from Haitians. In fact, the key American policymakers in both Washington and Port-au-Prince entertained racist phobias and stereotypes and were bewildered by Haitian culture.

At best, the occupiers regarded Haitians as the product of a bizarre mixture of African and Latin cultures who had to be treated like children lacking the education, maturity, and discipline for self-government. At worst, Haitians were like their African forbears, inferior human beings, “savages,” “cannibals,” “gooks,” and “niggers.”

Robert Lansing, the secretary of state in the Woodrow Wilson administration, exemplified the racist American view:

The experience of Liberia and Haiti show that the African race are devoid of any capacity for political organization and lack genius for government. Unquestionably there is in them an inherent tendency to revert to savagery and to cast aside the shackles of civilization which are irksome to their physical nature . . . It is that which makes the Negro problem practically unsolvable.

For the occupiers, Haitians thus had no capacity to run their own affairs or even appreciate the alleged benefits of America’s invasion. As High Commissioner Russell put it, “Haitian mentality only recognizes force, and appeal to reason and logic is unthinkable.”

And indeed, the American-led gendarmerie used brutal force to impose its grip on Haitian society and squash all opposition. Adm. Caperton declared martial law on September 3, 1915. It would last fourteen years, facilitating the establishment of a new regime of corvée (forced, unpaid labor), as well as the brutal suppression of the caco guerrilla resistance against American forces.

Overseen by the repressive control of the gendarmerie, the unpopular corvée system compelled peasants to work as virtual “slave gangs.” The massive mobilization of coerced labor helped build roads that reached remote areas of the territory; the creation of a viable network of transportation was not merely a means of spurring economic and commercial development, but a result of American strategic considerations.

Putting down the cacos who had supported Bobo and joined the popular guerrillas of Charlemagne Péralte required the penetration of the countryside to prevent any further recruitment of peasants into the forces of resistance.

The corvée system of forced labor extraction and the military repression of the guerrillas were thus symbiotically connected. Riddled with abuse, the corvée nevertheless failed to stifle opposition. Instead, coercing the peasantry to labor on infrastructural projects only fueled greater resistance to the occupation.

Popular support for the cacos grew, and soon there was an embryonic movement of national liberation with an increasingly sophisticated guerrilla force under the leadership of Péralte. Péralte, who called himself Chef Suprême de la Révolution en Haïti, explained that he was fighting the occupiers to gain Haiti’s liberation from American imperialism.

In the eyes of American authorities, however, the cacos, Péralte, and his supporters were nothing but “bandits,” “criminals,” and “killers” who had to be thoroughly “pacified.” And so they were. Péralte was shot on November 1, 1919 and his successor, Benoît Batraville, suffered a similar fate on May 19 of the following year. By 1921 the American pacification of the country was virtually complete. Some 2,000 insurgents had been killed, and more than 11,000 of their sympathizers had been incarcerated.

Still, pacification did not imply popular acquiescence. It is true that the traditional Haitian elites initially collaborated with and even welcomed American imperialism. But as they experienced the unmitigated racism of the occupying forces, the elites turned against them and espoused varied forms of nationalist resistance.

While not inclined to back the caco insurgents, these elites developed a sense of nationhood that curbed the significance of color but had little impact on the salience of class identities. In the eyes of most Haitians, those who had participated actively in the occupation machinery, like President Dartiguenave or his successor, Louis Borno, were opportunistic collaborators or simply traitors.

In fact, many of these collaborators had authoritarian reflexes and shared some of the paternalistic and racist ideology of their American overlords. Convinced that Haitians were not prepared for any democratic form of self-government, these elites believed in the despotisme éclairé of the plus capables (the enlightened despotism of the most capable).

In addition, from their privileged class position they regarded the rest of their compatriots — especially the peasantry — with contempt. In an official letter to the nation’s prefects, President Borno openly expressed this disdain:

Our rural population, which represents nine-tenths of the Haitian population, is almost totally illiterate, ignorant and poor . . . it is still incapable of exercising the right to vote, and would be the easy prey of those bold speculators whose conscience hesitates at no lie.

[The] present electoral body . . . is characterized by a flagrant inability to assume . . . the heavy responsibilities of a political action.

Borno was a dictator, but a dictator under American control. His rule embodied what Haitians called la dictature bicéphale, the “dual despotism” of American imperialism and its domestic clients. This regime of repression had unintended consequences. It intensified the level of nationalist resistance to the occupation and contributed to a convergence of interests between intellectuals, students, public workers, and peasants.

This growing mobilization against the occupation precipitated the 1929 Marchaterre massacre, when some fifteen hundred peasants protesting high taxation confronted armed marines who then opened fire on the crowd. Twenty-four Haitians died and fifty-one were wounded. The massacre set in motion a series of events that would eventually lead the United States to reassess its policies and presence in Haiti.

President Herbert Hoover created a commission whose primary objective was to investigate “when and how we are to withdraw from Haiti.” The commission — which took the name of its chair, Cameron Forbes, who served in the Philippines as chief constabulary and then as governor — acknowledged that the US had not accomplished its mission and that it had failed “to understand the social problems of Haiti.”

While the commission astonishingly claimed that the occupation’s failure was due to the “brusque attempt to plant democracy there by drill and harrow” and to “its determination to set up a middle class,” it ultimately recommended the withdrawal of the United States from Haiti.

The commission advised, however, that the withdrawal not be immediate, but rather that it should take place only after the successful “Haitianization” of the public services as well as the gendarmerie. Forbes also understood that President Borno had no legitimacy and could be sacrificed. Borno was forced to retire and arrange the election of an interim successor who would in turn organize general elections. Sténio Vincent, a moderate nationalist who favored a gradual, negotiated ending to the occupation, thus became president in November 1930.

Vincent’s gradualism was in tune with the Forbes Commission’s recommendation for the accelerated Haitianization of the commanding ranks of the government and the eventual withdrawal of all American troops. While Forbes and Vincent operated on the assumption that the United States’ withdrawal would not occur until 1936, the election of Franklin Roosevelt in 1932 altered events.

Roosevelt’s new “Good Neighbor” strategy toward Latin America was rooted in the premise that direct occupation through military intervention was expensive, counterproductive, and in most instances unnecessary. It was not that the forceful occupation of another country was precluded; it simply became a last resort.

Roosevelt understood that in Latin America, the United States could impose its hegemony through local allies and surrogates, especially through military corps and officers that it had trained, organized, and equipped. It is this perspective that explains the American decision to withdraw from Haiti. In fact, what Haitians came to call “second independence” arrived two months earlier than expected. On a visit to Cape Haitien, in the north of the country, Roosevelt announced that the American occupation would end on August 15, 1934.

After close to twenty years of dual dictatorship, Haitians were left with a changed nation. American rule had contributed to the centralization of power in Port-au-Prince and the modernization of the monarchical presidentialism that had always characterized Haitian politics. With the American occupation, praetorian power came to reside in the barracks of the capital, which had supplanted the regional armed bands that had hitherto been decisive in the making, and unmaking, of political regimes.

Moreover, the subordination of the Haitian president to American marine forces had nurtured a politics of military vetoes and interference that would eventually undermine civilian authority and help incite the numerous coups of post-occupation Haiti. To remain in office, the executive would have to depend on the support of the military, which had been centralized in Port-au-Prince.

The supremacy of Port-au-Prince also implied the privileging of urban classes to the detriment of the rural population. Peasants continued to be excluded from the moral community of les plus capables, and they came under a strict policing regime of law and order.

The occupation never intended to cut the roots of authoritarianism; instead, it planted them in a more rational and modern terrain. By establishing a communication network that became a means of policing and punishing the population, and by creating a more effective and disciplined coercive force, American rule left a legacy of authoritarian and centralized power. It suppressed whatever democratic and popular forms of accountability and protests it confronted, and nurtured the old patterns of fraudulent electoral practices, giving the armed forces ultimate veto on who would rule Haiti.

Elections during the occupation, and for more than seventy years afterward, were never truly free and fair. In most cases, the outcome of elections had less to do with the actual popular vote than with compromises reached between Haiti’s ruling classes and imperial forces. Thus, elections lacked the degree of honesty and openness required to define a democratic order. The occupation imposed its rule through fraud, violence, and deceit, and little changed after it ended.

It is true that the imperial presence from 1915 to 1934 contributed to the building of a modest infrastructure of roads and clinics, but it did so with the most paternalistic and racist energy. American authorities convinced themselves that their mission was to bring development and civilization to Haiti. They presumed that Haitians were utterly incapable of doing so on their own. Not surprisingly, they used methods of command and control to achieve their project, a practice that reinforced the existing authoritarian patterns of unaccountable, undemocratic governance.

Interestingly, when one examines the strategy and rhetoric from the 1915–1934 occupation, one can see that it foreshadowed the contemporary “modernization” and “failed states” theories that have justified western interventionism during and after the Cold War era. Except for its unmitigated racism, the old interventionism differs little from the twenty-first century doctrines of “humanitarian militarism” and “responsibility to protect.”

In fact, since the fall of the US-backed Duvalier dictatorship in 1986 and the catastrophic earthquake of 2010, the country has been involved in an unending democratic transition marred by persistent imperial interventions that have transformed it into a quasi-protectorate of the international community.

Foreign powers, particularly the United States and to a lesser extent France and Canada, have regarded Haiti as a “failed state” that could not function without the massive political, military, and economic presence of outsiders.

One hundred years after the first American occupation and three decades after Jean-Claude Duvalier’s popular ouster, Haiti has been reoccupied twice by American marines, who have paved the way for the current, interminable, and humiliating presence of a United Nations “peace-keeping” force. The imperial language has barely changed. American rhetoric justifies occupation in the name of “stability,” “domestic security,” and the dangers of “populist and anti-market political forces.” The US continues to promise the development of a modern capitalist economy, a middle class society, and a democratic order.

That all of these occupations failed miserably to achieve these goals indicates the obdurate limits and contradictions of any project of development sponsored and imposed by imperial forces. These occupations also warn us about the justifications, dangers, and vicissitudes of interventions in the current era of neoliberal globalization.

Facilitated by the corruption of Haiti’s ruling classes, imperial interventions old and new have consistently failed to deliver what they promise; in fact, they have condemned Haiti to virtual trusteeship, leaving a vassal country suffering from a recurring emergency syndrome.

We’re History July 17, 2015

“To all who come to this happy place: Welcome.” With those words, Walter Elias Disney officially dedicated Disneyland on July 17, 1955. But Disneyland almost didn’t happen, and its opening day was nearly a total disaster. If not for Disney’s indomitable will and savvy deal-making, “The Happiest Place on Earth” would never have succeeded.

Walt Disney was a man always on the lookout for “the next big thing.” He had burst onto the entertainment scene in the 1920s with the first sound-synchronized cartoons. He then pioneered the first color cartoons, and, thanks to the invention of the multi-plane camera, the first cartoons with visual depth. Then, of course, in 1937 he produced Snow White and the Seven Dwarfs, the first feature-length animated film and a spectacular technological and artistic achievement that became a worldwide sensation. By the late 1940s, however, Disney was growing tired of animation: the stock studio characters were growing stale, the shorts were increasingly formulaic and uninspired, and the feature films paled by comparison to his great achievements of Snow White and Fantasia (1940). Moreover, Disney’s controversial foray into wartime propaganda cost the studio financially and hurt his reputation as an innovative artist.

Disappointed and bored, Disney turned to a different kind of entertainment: amusement parks. Since boyhood, he had always been fascinated by magic, theater, and public fairs. His father Elias had been a construction worker on the famed Chicago World’s Fair in 1893, and it is not too far-fetched to imagine that young Walt heard many tales of that Fair’s fantastical wonders. As early as the 1930s, Disney had toyed with mechanical “flea circuses” and miniature attractions that depicted scenes from American history. After World War Two, though, the tinkering became more serious. Disney selected his favorite animators, design artists, and story writers to form a separate company: WED Enterprises (taking the name from his initials). Under the leadership of engineer and retired navy admiral Joe Fowler, these teams of “Imagineers,” as Disney called them, produced concept art and attraction designs for the would-be park.

Simultaneously, Disney delved into live action films, and the Imagineers who worked on them gained valuable experience with set design and staging. Highly stylized cinematic successes such as Treasure Island (1950) became templates for the park. Disneyland, as Disney envisioned it, would essentially be an immersive experience in which guests would participate in a live-action show. Employees would be “cast members,” and time on the job would be called “on stage.” Even the entrance to the park was designed to be theatrical, with attraction posters, popcorn, and “red carpet” concrete.

Crafting concept art and set design was easy, but actually building the park presented enormous problems. In July 1953, Disney hired the Stanford Research Institute to study the park’s potential profitability and to scout possible locations. After an exhaustive search, the team concluded the venture would make money and should be located in sleepy Anaheim, California, near a new freeway. Next up was getting the cash to begin construction. Walt’s older brother Roy, who handled the studio’s finances and who had a knack for finding ways to fund Walt’s dreams, was deeply skeptical – there would be no big brother bailout this time, as there had been for past projects. Instead, Disney had to turn to outside investors, namely ABC Television, TWA, Richfield Oil, Monsanto, Kodak, Carnation, and Pepsi. ABC would front the initial cash in exchange for producing and airing a new weekly Disneyland television show, while the other corporations agreed to sponsor individual attractions. With impressive ease and speed, by July 1954 Disney had assembled the deals and the money needed to break ground.

Construction was frantic. Before the first shovel touched soil, Disney had agreed to an opening date of July 17, 1955, and every episode of the phenomenally successful Disneyland show reinforced that deadline and built tremendous anticipation around the world. The hectic pace and the unprecedented nature of what was effectively a massive urban planning project (entailing a 160-acre city with a main street, town hall, shops, and restaurants; a river with passenger boats; a castle; and four unique “lands”) resulted in a plethora of problems: money shortages, labor strikes, scarcity of asphalt, and rivers that would not hold water, just to name a few. As the summer of 1955 neared, construction crews were operating round-the-clock. Men, material, and money were exhausted to meet the deadline.

Finally, the day arrived. Disney, along with his entertainment buddies Art Linkletter and Ronald Reagan, went on the air for an unprecedented two-hour live broadcast of Disneyland. A smashing 90 million viewers tuned in, enthralled by the combination of live television and Disney magic. The audience never saw the disastrous failures of that momentous occasion. The reality was that the park was barely finished. The asphalt had been so recently poured that women’s heels sank into the streets; “Tomorrowland” was incomplete, and banners had to be hung at the last moment to hide the construction; gas leaks temporarily shut down “Fantasyland”; restaurants ran out of food; there were not enough drinking fountains and restrooms; and forged tickets resulted in suffocating crowds that outstripped the park’s capacity.

“Black Sunday,” as it was known by cast members and Imagineers, was a day of crises, but in the following weeks and months, Disney and his crew patched up the problems, finished the park, and set about creating new, even more exciting attractions. Influential urban planner James Rouse would soon call Disneyland “the greatest piece of urban design in the U.S. today,” and millions of visitors from around the globe flocked to see Disney’s marvel. Even celebrities and world leaders were eager to experience the excitement: Vice President Richard Nixon and his family visited a month after Disneyland opened, with Nixon chirping, “This is a paradise for children and grown-ups, too. My children have been after me for weeks to bring them here.” In 1957, the King of Morocco loved the park so much he snuck out of his hotel for a second visit, and that same year, former president Harry Truman joined the fun, joking that he would not ride the Dumbo attraction since elephants were a symbol of the Republican Party. Two years later, Senator John Kennedy and King Hussein of Jordan made the trip. The list goes on and on. And it is worth noting that the park itself has changed dramatically from those early days, improving (“plussing,” in Disney’s words) old attractions, adding new rides, and debuting cutting-edge technology, such as the Matterhorn Bobsleds in 1959, the world’s first metal-tubing roller coaster.

Disneyland today is both the same park of Disney’s dream from 1955 and a very different one. Visitors can still feel the special touch and presence of “Uncle Walt,” but park attractions and the technological innovations now surpass anything Disney could have imagined. Just as important as the thrills and pixie dust, however, is Disneyland’s role in Disney’s growing interest in urban planning and evolving partnership with major corporations. The park’s tremendous success (financially and logistically) gave Disney and his studio the momentum and money needed to delve into an even bigger project: an entire “city of the future” in central Florida. There, Disney, along with corporate backing and unprecedented political autonomy granted by the state, would build more than a Disneyland: he would build a Disney World.

Michael Todd Landis is an Assistant Professor of History at Tarleton State University. He is the author of Northern Men with Southern Loyalties: The Democratic Party and the Sectional Crisis (Cornell, 2014).

National Security Archive Electronic Briefing Book No. 522

Posted – July 20, 2015

Edited by John Prados and Arturo Jimenez-Bacardi

Washington, D.C., July 20, 2015 – Forty years ago this year, Congress’s first serious inquiry into CIA abuses faced many of the same political and bureaucratic obstructions as Senate investigators have confronted in assessing Intelligence Community performance since the September 11, 2001, terrorist attacks. Records posted today for the first time by the National Security Archive document the often rough-and-tumble, behind-the-scenes dynamics between Congress and the Executive Branch during the “Year of Intelligence” – highlighted by the investigations of the congressional Church and Pike committees.
In 1975, it was then-Deputy Chief of Staff Dick Cheney who spearheaded the Ford White House’s hostile approach to Congress, which required the CIA to submit all proposed responses to Capitol Hill for prior presidential approval and featured the explicit intent to keep investigators away from the most sensitive records. Those events presaged the battles between the Senate Select Committee on Intelligence (SSCI) and the U.S. Intelligence Community since 2012 over plans to publish the former’s 6,000-page report on the CIA’s rendition, detention and interrogation program.

Among White House and Intelligence Community stated concerns during the period of the Church and Pike inquiries were preserving the effectiveness of the CIA and reassuring future operatives who might fear their “heads may be on the block” for their actions, no matter how well-intentioned. But intelligence officials also worried that disclosures of agency operations would be “disastrous” for the CIA’s standing in the world: “We are a great power and it is important that we be perceived as such,” a memo to the president warned, urging that “our intelligence capability to a certain extent be cloaked in mystery and held in awe.”

Related to today’s posting, a much larger compilation of 1,000 documents, many of them previously classified, was published in June 2015 in the online collection CIA Covert Operations II: The Year of Intelligence, 1975, the second in a series on the CIA through the Digital National Security Archive, a joint project with the scholarly publisher ProQuest.

Today’s e-book touches on the high points of one major aspect of the 1975 experience – the Church committee’s efforts to obtain evidence for its inquiry and countervailing work by the White House and CIA to limit and restrict the Senate’s access. Documents posted today show that:

The White House of President Gerald R. Ford, spearheaded by deputy assistant to the president Richard Cheney, quickly seized control of the administration’s response to the congressional investigations.

Lists of records to which the Church committee requested access for its investigation were reviewed in detail and Mr. Cheney ultimately decided whether to provide them in each case.

Specific records in categories approved for access were first sent to the White House for individual review and recommendation by National Security Council staff, followed by approval from Mr. Cheney.

The White House required the CIA to propose measures to govern Church committee access to CIA materials. These accommodation measures were then reviewed both by Deputy Assistant Cheney and Counsel to the President Philip Buchen.

CIA accommodation measures were explicitly designed to keep Church committee investigators away from its most important records.

National Security Council officials convened at the White House to express themselves in advance regarding proposed CIA testimony to the Church committee.

* * * * *

Fortieth Anniversary: The Church Committee, the White House and the CIA, Spring 1975

Washington, D.C., July 17, 2015 – Forty years ago there had never been a public investigation directly aimed at the Central Intelligence Agency. Congressional oversight had long been a matter of secret subcommittees and information privately shared with small circles of legislators. What reviews of intelligence there had been were carried out in the dark by reliable people. Even with major flops like the abortive Bay of Pigs invasion of Cuba, official investigations were confined to the executive branch—within the CIA itself or under the auspices of the National Security Council (NSC) at the White House. Public dissatisfaction with the CIA had grown through the Vietnam War, however. Concern spiked with the Watergate scandal, then revelations of the agency’s role in attempting to raise a Russian submarine off the floor of the Pacific Ocean, and its hand in the Chilean political troubles that led to the overthrow and death of Salvador Allende. On December 22, 1974, curiosity and concern became a firestorm of anger when the New York Times revealed that the CIA had spied on American citizens, monitored their political views, and carried this out on a massive scale for many years. The allegations were white hot because the law prohibited the CIA from engaging in domestic activity.[1]

President Gerald R. Ford, then vacationing at Vail, Colorado, was surprised to be told of the new CIA exposé, followed on succeeding days by yet more allegations of CIA skullduggery. The president demanded an explanation from CIA director William E. Colby. The agency’s chief responded with a report confirming that New York Times reporter Seymour Hersh had uncovered real CIA operations. It became apparent in the course of a few days that investigation was inevitable. Ford made an effort to take control of the issue early in January 1975 by creating a presidential commission under Vice President Nelson A. Rockefeller. This was an inquiry of the old kind, one by minions who could be trusted to stay away from sensitive matters. But just a couple of weeks later President Ford let the cat out of the bag himself, admitting to visiting newspaper editors that he wanted to control the investigation to avoid sensitive issues like CIA participation in assassination plots. The mention of assassination plots blew the lid off the investigation issue. By the end of January 1975 both the United States Senate and the House of Representatives had established special committees to investigate U.S. intelligence.

The Senate committee would be the first to pull itself together and begin its inquiry. It was led by Idaho Democratic Senator Frank Church. The House committee would be delayed for months by squabbles over collusion with the CIA involving its own secret House overseers, so the first struggles over access to agency material were fought with the Senate.

Meanwhile the “Colby Report,” the preliminary response the CIA director had compiled to meet President Ford’s demand for information, became a focus that helped define how the various players approached the question of CIA information. At an early Senate hearing, on January 15, 1975, Director Colby tried to walk the thin line of belittling the press revelations while acknowledging that some of them were true. He recited the same story line he had used with the Rockefeller Commission, which naturally drew from the Colby Report. Colby’s testimony infuriated Henry Kissinger, then occupying the posts of national security adviser and secretary of state simultaneously. Kissinger was outraged both that Colby was speaking out of school and that he had not cleared his remarks with the White House in advance.

The White House reaction, which might sound predictable today, belonged to a pattern just being set in 1975. Why White House officials would have expected the CIA director to do anything other than rely upon the Colby Report for his testimony, none of them has ever said. Philip Buchen, the president’s counsel and point man on these investigations, left no record. Richard Cheney, the top operator, wrote a memoir that distances him from the action despite information that has become available about his central role.[2] Kissinger writes of Colby burning a match in a gasoline depot.[3] But Colby’s only alternative would have been to say nothing, and that, in the superheated political climate of January 1975, was not a viable option. The Colby Report had been pulled together very quickly because it was based on a larger in-house study, informally known as “The Family Jewels,” that had been ordered by Colby’s predecessor. If the CIA director had to say anything, there could be no doubt he was going to rely upon these sources. Nevertheless, the White House determined to ride herd on the CIA in its dealings with the investigations.

William E. Colby, who had returned to CIA from a Vietnam assignment just in time to witness the fallout from the Watergate scandal, wrote that he learned from Watergate that a CIA “distancing” strategy actually left suspicions behind, and that he recognized the congressional authority to investigate. On its own the CIA would have been on very shaky legal and constitutional ground in denying Congress. Investigation of the executive branch had always been a recognized role for the legislature, and law existed which specified that executive branch agencies could not deny any information necessary to an investigation by a properly constituted congressional inquiry. The denial strategy just would not work.[4] Therefore, from Colby’s perspective, the application of White House control was a good thing. Apart from the anger expressed towards him personally, the White House intervention gave Director Colby a useful and convenient rationale for denying information to congressional investigators.

CIA deputy general counsel Walter Pforzheimer was simply wrong when he told an interviewer in 1998 that whatever Bill Colby had in his mind on a given day was going to come out.[5] The documents selected below from the National Security Archive’s “CIA Covert Operations II” set demonstrate conclusively just how far from the truth was the perception that the CIA was giving away the store in 1975. Rather than letting out all the secrets, what happened during the Year of Intelligence was a very carefully contrived process in which the Ford White House asserted its prerogative to approve every release and the CIA followed suit. The Church Committee laid out its demands for information; they were reviewed and frequently denied; and the committee ended by appealing, cajoling, negotiating, or begging for data.

From the outset, Ford administration strategy relied upon giving the appearance of cooperation while invoking national security to shield information. On February 22, for example, Director Colby met with secretaries Kissinger, James R. Schlesinger, and others on measures to protect data (Document 1). Five days later, with Senator Church, Colby agreed to waive the agency’s employee secrecy oaths in dealing with the committee, and he promised to provide such basic material as organization charts, budget information, and legal authorities. In return Colby obtained the senator’s consent to subject his committee investigators to the kind of secrecy agreements considered at the White House on February 22.[6] The language the secrecy agreements eventually employed appears in Document 14.

On March 5, 1975, Senators Church and John Tower, the committee’s Republican vice-chairman, together with their staff chiefs, went to the White House to meet with President Ford and key administration officials. In his coat pocket Senator Church carried a list of White House documents he required for the investigation. Among them were the materials Colby and Church had discussed a week earlier. President Ford’s talking points for the meeting (Document 4) promised cooperation but emphasized a need not to “cripple” the intelligence agencies or reduce the CIA “to the level of a newspaper clipping and filing agency.” Senator Church promised not to be a “wrecking crew.” Mr. Ford played the Senate investigation against that of the House of Representatives, remarking that he did not want to put on paper his agreement to cooperate as he would then be forced into a similar agreement with the House committee, and, the president went on, “It’s best not to formalize. Let’s proceed on a case-by-case basis.”[7]

What the promised cooperation actually meant began to become evident after March 12, when Senator Church sent a new list of required documents directly to Director Colby at the CIA (Document 5). Only a few days later, in an encounter with a senior Colby aide, a senior staffer for the House committee complained that CIA’s exceptions to what it was being asked to produce were “so broad as to encompass nearly everything” (Document 6). On March 24 a White House official, possibly assistant to the president Richard Cheney, returned a copy of the Church-CIA document request that had been annotated to indicate approvals for the provision of the materials (Document 9). A wide swath of documents was to be denied.

On March 25, the CIA provided a progress report on its arrangements and guidelines (Document 10). Material bearing on the president would be entirely denied to investigators, who would be entitled only to briefings based on real material. At the next level there would be “fondling files” to be consulted only at the CIA or another originating agency.

On March 29 Robert C. McFarlane, then an NSC staff member, completed his evaluation of White House documents requested by the Church committee (Document 11). Again covert operations, which the Church investigators had an obvious interest in examining, were notably underplayed.

A favorite device of the CIA and White House in meeting the demands of congressional investigators became the “abstract.” Classified abstracts of much larger bodies of work, reports, reviews, or histories were repeatedly compiled. There are examples in several of the exhibits included here (Document 11, Document 21, Document 23). This permitted the administration to claim it had made an effort to satisfy investigators without, in fact, giving much away.
Faced with this problem, the administration came up with a device too clever by half – seeking a Freedom of Information Act (FOIA) exemption for agency documents submitted to Congress (Document 26). This would free the CIA and other agencies to keep secret the material given to Congress, refusing any FOIA request for that information. At the same time, since the CIA position was that classified information disclosed by any means other than direct release by the agency remained officially secret, if Congress did decide to release CIA documents the agency would be able to pretend the information was still classified. Fortunately for the public, in 1975 there was no political possibility the CIA could succeed in obtaining an FOIA exemption of that kind.

The insoluble problem which remained for Ford administration officials was that the Congress, a coordinate branch of government, had complete authority within its sphere. White House and CIA officials could never resolve the implications of this for the secrecy of their records. This quandary would arise again in the fall of 1975 with the House investigating committee, but it was already a factor during the spring with Church. An identical dilemma has faced the present-day CIA confronted with the SSCI inquiry into its torture and detention programs. The CIA’s top lawyer in 1975, John Warner, frankly advised that, while he could see no plain authority for the Congress to release classified data, there was no question the Congress had a constitutional immunity from the consequences of releasing any piece of classified information (Document 15). Ranging over the same set of thorny issues, on April 14 NSC staffer McFarlane wrote that until the secrecy could be guaranteed, no classified information should be supplied to the Church committee (Document 19). (A forthcoming National Security Archive posting will address the revival of this issue, even more sharply, with the Pike Committee.)

With intractable secrecy issues still unresolved, the Church committee continued moving ahead with its investigation, closing in on the subject of covert operations that Ford officials had sought to avoid. White House lawyer Philip Buchen drafted an approval memo for a decision on what to cover in dealing with Church on covert operations. He wanted to restrict access to senators Church and Tower only, and to permit discussion of just eight covert operations (Document 27). Deputy national security adviser Brent Scowcroft issued a different decision memo. This provided that CIA present a briefing that spoke of covert operations in general, without reference to techniques, times or places (Document 28). The testimony Director Colby would present would be reviewed in advance by the NSC.

The electronic briefing book ends with Director Colby’s account to the National Security Council of his meeting with the committee heads and his testimony to the full committee (Document 31). Reacting to Colby’s remarks, the irrepressible Kissinger declares it “an act of national humiliation” for a nation to have laws that prevent a president from ordering an assassination.

This White House Memorandum of Conversation shows Kissinger warning that the investigations of the intelligence community could be “as damaging to the intelligence community as McCarthy was to the Foreign Service,” potentially leading to “the drying up of the imagination of the people on which we depend. If people think they will be indicted ten years later for what they do.” As such, the group comes up with several plans for controlling the investigations and the information they share with the committees. Suggestions include setting up “secrecy agreements” with committee members in order to control the materials and information that will be shared with them, and giving committee members secret documents in hopes that they are leaked to the press, allowing the White House to refuse to share more documents citing “executive privilege.”

On February 22, 1975, President Ford is informed that Senators Frank Church and John Tower, the Chairman and Vice Chairman, respectively, of the Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities, have asked to meet with the President to discuss the committee’s inquiries and to request the President’s full cooperation with the committee.

On February 25, 1975, a White House staffer with the initials HCD sends a note to William Kendall, the Deputy Assistant to the President for Congressional Relations, to try to set up a meeting with the President and Senators Church and Tower before March 5 so that Henry Kissinger can attend. (The meeting is eventually scheduled for that date and Kissinger is unable to attend.)

On March 4, 1975, President Ford receives a briefing memo to prepare for his meeting the following day with Senators Church and Tower. As part of the President’s talking points, he is to mention that he wants to cooperate with the committee; however, a series of dire warnings is presented. First, in the process of investigating allegations of impropriety, it is essential that “we not cripple the effectiveness of the institutions which are so critically important to the very survival of this country.” Second, while “willful wrongdoing cannot be tolerated,” we must also “be careful that we do not create the impression among loyal dedicated intelligence personnel that their heads may be on the block in later years for actions they undertook in the belief they were serving their country… ” If this were to happen, “the CIA would be reduced to the level of a newspaper clipping and filing service.” Third, the memo warns that “disclosures would be disastrous” for perceptions of the U.S. around the world: “We are a great power and it is important that we be perceived as such.” Therefore, it is necessary that “our intelligence capability to a certain extent be cloaked in mystery and held in awe.” Given such concerns, the memo concludes that we must “tread very carefully” and that the goal must be “to preserve and enhance the confidence of the American people in their intelligence organizations.” To do so, the White House will cooperate with the committee; however, there will be a “presumption against providing sensitive material not indispensably material to the inquiry.”

Document 5: Church Committee, Letter from Senator Frank Church to CIA Director William Colby, March 12, 1975.

This letter from Senator Church to CIA Director Colby outlines the types of documents that the Committee is requesting from the CIA, including files from the U.S. Intelligence Board staff, the Offices of the Director and Deputy Director, the General Counsel, the Legislative Counsel, Comptroller, Inspector General, Historical Studies, and Finance, as well as documents pertaining to Colby’s testimony to the Senate Appropriations Committee on January 15, 1975.

In this CIA memorandum of conversation, Jack Boos, Staff Member for the House Select Committee on Intelligence, is quoted expressing his frustration at the CIA guidelines for access to documents, asking, “How soon will you invoke executive privilege?” and noting that Director Colby’s exceptions to documents “are so broad as to encompass nearly everything.” The memo then explains a number of security provisions that could be used during the investigations. Ultimately, the document concludes that, “I am not sure that Boos was reassured that the Committee and its Staff would have the access they think they need.”

This is a copy of Senator Church’s March 12 letter [Document 5] to CIA Director Colby requesting documents from the CIA, bearing Cheney’s handwritten notations of White House decisions as to which to provide the Committee. Items to be withheld were marked “no” or left blank. More than half of the requested “special studies” (eleven of twenty) were to be denied, including those on covert operations, among them the postmortem on the Bay of Pigs debacle. Also denied was the 1967 “Katzenbach Report” and a companion study of CIA activities at American universities, both triggered by the very public revelation that the CIA had funded voluntary groups of American university students (the National Student Association). All reports of the Bureau of the Budget or the Office of Management and Budget (OMB) – White House entities – were to be denied, which was unfortunate because the most recent management study of the intelligence community that existed had been done by OMB in November 1971. Access to the Colby Report was ruled out and the “Family Jewels” were to be protected, save for a CIA Inspector General’s study that had covered some of the same ground.

This memo provides a progress report and outlines CIA guidelines for dealing with inquiries from Congress. It explains the creation of an Ad Hoc Coordinating Group tasked as the principal mechanism for coordinating the exchange of information within the CIA, with the White House, with the USIB, and with the investigating Committees. The memo also describes security procedures being set up to share information with Congress, and outlines four levels of security that will be applied. The CIA would create a central index to record all materials released and to index and abstract papers and testimony. Officials offered a “central reading room” – at CIA – for investigators to go to read agency documents. The data deemed most sensitive will “not be available to Select Committee Staff in its raw form.” This included memos to and from the president. Congressional investigators would be restricted to briefings about that material. At the next level there would be “‘fondling files’” that could be viewed only at the originating agency, with “specific limitations placed upon them by the agencies concerned.” At the next lower level documents would actually be given to the investigators but only in a sanitized form. Only at the lowest level would data be directly provided in unexpurgated form. If the briefings did not satisfy the committee there would be negotiations. The memo then explains a process where Committee members could challenge such limitations, which could open the door for documents to be read in the presence of authorized agency personnel. The memo also outlines a strategy for dealing with press inquiries and allegations.

Document 11: White House, National Security Council, Memorandum, from Robert C. McFarlane to Brent Scowcroft, “Submission of Documents to the Senate Select Committee on Intelligence Operations,” March 29, 1975.

This NSC Report by Robert McFarlane to General Scowcroft describes seventy-two NSC directives/documents and recommends whether they should be made available to the Church Committee. The McFarlane evaluation recommended in favor of the provision of general organizational material but it was thin in regard to approvals of material related to covert operations. The directive then in force governing approval of covert operations, called National Security Decision Memorandum 40, was approved for release. But many earlier documents in the NSC-10 and NSC-5412 series were to be restricted to a classified abstract. NSAM 124, which had created the Special Group (Counter-Insurgency) in the Kennedy administration, should be denied, McFarlane advised. Also to be denied were presidential decision memos on Radio Free Europe (1961) and psychological warfare operations in wartime.

This document is a White House summary of a 1949 National Security Council policy review of intelligence functions, requested by the Church Committee. “The CIA and National Organization for Intelligence” found that the CIA had failed to produce coordinated national intelligence estimates and that it engaged in much duplicative activity. The report advocated the consolidation of CIA operational activities under a single directorate, separated administratively and physically, to the extent possible, from the analytical side of the agency. This became an impetus for creation of the CIA operations directorate. The note goes on to warn that the policy review, also known as NSC-50, cited two more interim reports that the Church Committee might be interested in but did not know about, and which Ford officials had not examined.

Document 13: White House, Memorandum from Philip Buchen to President Ford, “Request of Senate Select Committee to Study Governmental Operations with respect to Intelligence Activities for Information,” April 2, 1975.

This White House memo for President Ford explains that the documents requested by the Church Committee have been reviewed by the offices of Jack Marsh, Counselor to the President, and Brent Scowcroft, the National Security Adviser. The memo recommends the release of several documents, including the Colby Report. However, some documents “which are so sensitive or so central to the Presidency” will not be made available, and in the future might only be revealed to Senators Church and Tower. The memo refers to documents at Tabs A, B, and C: Tab A is Senator Church’s information request list [Document 5] in this EBB, Tab B is Robert McFarlane’s NSC analysis of the materials [Document 11], and Tab C is the Colby Report.

This is the draft employee agreement concerning the treatment of classified information that all committee staff members had to sign. Director Colby had told colleagues at a February 22 meeting [Document 1] that agreements of this kind would prevent the committees from releasing CIA information. He obtained Church’s agreement to constrain investigators in this way on February 27.

Document 15: Central Intelligence Agency, Open Memorandum by John Warner, CIA General Counsel, “Authority of Congress to Release Classified Data,” April 11, 1975.

This CIA memo by General Counsel John Warner concludes that, “I have found no express authority for Congress to publicly release information classified by the executive branch pursuant to an Executive Order issued by the President.” At the same time, Warner notes that, “Congress is constitutionally immunized, at least in part, against any consequences flowing from release and disclosure of classified information.” The memo goes on to explain the law behind his conclusions and presents different scenarios in which members of Congress are and are not liable for disclosing classified information. The implication is that there is no effective recourse to Congress releasing any information it desires.

This White House Office of Congressional Relations memo notes that the Church Committee is seeking Executive Directives to the CIA, which the Committee believes fall within its jurisdiction and are not protected by executive privilege.

This heavily excised memo concerns follow-on requests from the Church Committee. It takes up several of the items listed by Senator Church, describing documents requested by the committee concerning CIA Director Colby’s January 15, 1975 testimony to the Senate Appropriations Committee, and argues in favor of releasing some of them.

This is a brief summary of four CIA reports requested by the Church Committee. The document also outlines potential problems that might arise if they are shared with the committee. These include: a 1954 “Report on the Covert Activities of the Central Intelligence Agency” (Doolittle Report); a 1962 “Final Report of Working Group on Organization and Activities;” a 1965 “Review of Selected NSA Cryptanalytic Efforts” (The Bissell Report); and, also from 1965, “The Long Range Plan of the Central Intelligence Agency.” To take one example, the Doolittle Report was reviewed for names of personnel that had appeared before the review board, many of which were deleted, at a loss to historical interpretation of the period. In 1975 deleting this material arguably served no identifiable national security purpose.

Document 19: White House, Memorandum from Robert McFarlane to James Wilderotter, “Procedures for Safeguarding Classified Information,” April 14, 1975.

This NSC memo raises several concerns regarding safeguarding classified information, based on Senator Church’s claim that the committee has the “right to make public any document provided to it … ” The memo objects to this claim, arguing that such decisions should be made by the President. The memo concludes that pending the resolution of this matter, “no classified information should be provided to the Committee.”

Document 20: White House, Memorandum for the Record by James Wilderotter, “Meeting with David W. Belin, April 15, 1975,” April 16, 1975.

This memo summarizes a meeting between White House functionary James Wilderotter and David Belin, Executive Director of the Rockefeller Commission, concerning the sharing of documents with the Church Committee. Belin agrees with Wilderotter’s requests to limit the sharing of some documents, including the omission of materials relating to assassination plots raised in a CIA Inspector General’s report.

This is a brief summary of two CIA reports requested by the Church Committee. The document outlines potential problems that might arise if the reports are shared with the Committee. These include: from 1967, “Report on Strategic Warning;” and from 1968, “Intelligence Activities and Foreign Policy.”

This memo presents Wilderotter’s recommendation with respect to half a dozen studies and reviews of intelligence and asks senior officials to note their approval or rejection. One of the six, concerning National Security Agency electronic spying (the Bissell Report of February 1965), should be withheld, it is argued, with no decision made at least until its contents have been discussed in a general way with senior Church Committee staff. Wilderotter assesses the other five as suitable for revelation to selected committee members at CIA headquarters, with classified abstracts furnished to the full committee at some later date.

Wilderotter writes a cover memorandum describing a set of internal CIA histories that have been requested, and encloses brief CIA summaries of the histories. The CIA notes categorize each of the documents and describe in general terms what they cover. Eleven can be released to the Church Committee but five others will need further review before they are released. In fact, in subsequent years a number of these histories were declassified, entered the public domain, and were published without any perceptible negative effect on the national security.[8]

This is a CIA fax to the White House responding to the Church Committee request for all materials relating to CIA Director Colby’s December 22, 1974 Report and January 15, 1975 Senate testimony. The memo warns that such a request may raise “policy issues involved here which we should focus on soon and also discuss strategy.”

This memorandum summarizes two CIA reports requested by the Church Committee: from 1964, the “Middle East Task Force Report” (the Nolting Report); and from 1966, “Foreign Intelligence Collection Requirements” (the Cunningham Report). Wilderotter recommends that the Cunningham Report be made available at CIA headquarters, while the Nolting Report should be discussed only in general terms with Senators Church and Tower. Summaries of and concerns regarding both reports are provided.

Document 26: Central Intelligence Agency, Office of Legislative Counsel, Memorandum for the Record, “Requests under the Freedom of Information Act for Agency Documents Provided Select Committees and the President’s Commission,” May 8, 1975.

Source: NARA, CIA CREST Files, CIA-RDP77M00144R000400020052-2

This CIA memorandum discusses the potential of adding an exemption “to relieve the Agency from responding to requests under the Freedom of Information Act for Agency documents provided to the House and Senate Select Committees and the President’s Commission investigating CIA.” Should such an amendment fail to pass Congress, a Joint Resolution is proposed under which any material furnished to the Committees “shall not be publicly disclosed without the express approval of the Chairman of these Select Committees and the head of the particular agency or department involved.”

This approval memorandum seeks President Ford’s decision on a strategy of allowing administration officials to avoid testifying to the Church Committee at all. Instead of testimony, Buchen recommends briefing only Senators Church and Tower on CIA covert actions. In this context the President would permit specific discussion of ten covert operations, with current activities to be covered only generically. For example, he would permit discussion of the Katzenbach report of 1967, which had been denied in the initial White House review of the Church document requests [Document 9]. Buchen lays out the pros and cons of his suggested course of action.

This memo outlines the president’s authorization of a more restrictive course than Philip Buchen had recommended. Director Colby is to discuss covert operations with Senators Church and Tower alone, and only in the most generic terms. The memo requests that the final text of the CIA’s briefing to Congress be reviewed by the Counsel to the President (Buchen).

In this advance briefing memo, White House Counsel Buchen discusses a strategy for dealing with further Church Committee requests on covert operations. Topics include what CIA Director Colby should discuss with Senators Church and Tower the following day. The goal of Colby’s briefing should be to “induce the Chairman and ranking Minority Member to impose limitations on the further investigation of the subjects covered.” The CIA should seek to confine the discussion to a limited list of cases of covert operations, and impose restrictions on committee staff access to CIA records of its activities.

In this set of National Security Council Meeting Minutes, CIA Director Colby describes his testimony to the Church Committee: “it was like being a prisoner in the dock, there was a real interrogation. All the questions were on assassination and it was like ‘when did you stop beating your wife?’” Colby also recounts a discussion on assassinations he had with Senator Church: “I also told them that our policy and our orders are very clear: we will have nothing to do with assassination. Church ended by saying that is not enough. We need to have a law which prohibits assassination in time of peace.” To which Kissinger responded, “It is an act of insanity and national humiliation to have a law prohibiting the President from ordering assassination.”

[1] Seymour Hersh, “Huge C.I.A. Operation reported in U.S. against Anti-War Forces, Other Dissidents in Nixon Years,” New York Times, December 22, 1974, p. 1.

[2] Richard Cheney with Liz Cheney, In My Time: A Personal and Political Memoir (New York: Simon & Schuster, 2011).