The Hedgehog Review: Vol. 18 No. 1 (Spring 2016)

Liberated as Hell: The Autonomous Worker and the Hollowed Workplace

Brent Cebul


In the early 1990s, Monika Miller was chosen to test-drive the office of the future. An associate media director in the Los Angeles office of the Chiat/Day ad agency, Miller learned she’d been freed of the burdens of her desk, chair, and personal office space. Her boss, Jay Chiat, had had an epiphany while skiing in the Rockies. He would no longer keep his creative workers penned in cubicles. His “virtual office” would be an expression of equality, featuring open, non-hierarchical communal spaces that would inspire creativity, collaboration, and flexibility—even playfulness. Unencumbered workers would sign out the day’s phone or computer from a tech “concierge.” News coverage exalting the firm’s brave new way of work featured photographs of the Tilt-A-Whirl amusement park cars Chiat installed for one-to-one meetings.

For Miller, however, liberation was “like a bad dream.” With nowhere to keep her supplies and files, she bought a Radio Flyer wagon. “Everyone thought it was so cute,” she recalled. “I’d be trudging down the hall,” she said, recollecting the daily search for a free space, “and they’d laugh and say, ‘Oh look, here she comes with that little red wagon.’” But when their turn came to be liberated, there was little laughing among Miller’s coworkers. Soon, the office was a blur of motion, with staff repeatedly visiting their cars in which they stashed papers and pens. Others hid files around the office. “Every day,” Miller said, “there’d be these frantic email messages like, ‘Has anybody seen my binder? Does anyone know where my files are?’” Chiat soon conceded the need for small lockers. There, he offered with unalloyed scorn, workers could “put their dog pictures, or whatever.” For employees newly habituated to turf battles, the lockers simply offered more fodder for jokes that they attended Chiat/Day High School.1

Today, “hot desking” or “hoteling”—the use of flexible office space to enable a firm to employ more workers than its space would traditionally have accommodated—is de rigueur, hardly worthy of the attention lavished on Chiat/Day nearly twenty-five years ago. Companies that now require some portion of their work force to reserve space prior to arriving for work include Booz Allen Hamilton, American Express, GlaxoSmithKline, IBM, and the BBC. The US government became a hot-desker, too, when the General Services Administration assigned 3,400 workers to its updated headquarters, which had previously housed just 2,200.2 Internet start-ups Sharedesk and Zipcube even offer “sharing economy” desks and cubicles, making it possible for someone to work for one entity from the free space of another.

The casualization of the workplace has gone hand in hand with the rise of the precarious economy, characterized by flexible or temporary employment. “Precarious” workers cobble together a handful of assignments or contracts, working from home, coffee shops, or their car. According to one 2014 assessment, fifty-three million Americans do freelance work, including some 38 percent of millennials.3 In May 2015, the US Department of Commerce reported that the temporary services industry hit a record high, with 2.9 million Americans employed as temporary laborers or office workers.4 These workers join millions more who increasingly find their workspace transformed into liminal space.

White-collar office workers, too, more and more engage in piecemeal, project-oriented work. Many of these workers lack traditional office arrangements and, along with start-ups, have driven demand for coworking and collaboration spaces where freelancers or entrepreneurs pay a monthly fee for daily workspace. Often designed with a coffee shop or clubhouse feel, cowork spaces nevertheless require that members claim a new spot each morning and pack up their belongings each night. One estimate suggests that coworking membership may grow by 40 percent per year, passing one million by 2018. A survey of 2,700 coworking businesses found that 70 percent were struggling to keep up with demand, and 60 percent planned to expand.5

The hollowing of the traditional workspace, whether blue- or white-collar, and its implications for social provision are the subjects of Guy Standing’s The Precariat: The New Dangerous Class. In this extended jeremiad, Standing mixes a range of cultural assertions, episodic country-level data, and idiosyncratic comparisons and correlations to explain the origins and consequences of our present labor regime. Standing’s main objective is to sound an alarm: An “incipient political monster” is rising from the margins of the “neoliberal” economy. This monster, he explains, is the “‘bad’ precariat,” formerly working class, politically marginalized, “angry and bitter,” and “drawn to populist neo-fascism.” In contrast, a “‘good’ precariat,” comprising mostly young, well-educated Western workers, hopes to “confront [its] insecurities with policies and institutions to redistribute security and provide opportunities for all to develop their talents.”6

Standing’s purpose is to persuade scholars and, one presumes, the precariat itself, that this emerging segment of the global workforce, which includes white-collar consultants, temporary laborers, and underemployed retail workers, “has class characteristics” and should mobilize to win a new category of rights decoupled from its traditional linkage with certain categories of work.7 A new system of “work rights” would contain a more capacious set of rights claims based on our new realities: modern work is done by people who are not employees; entire sectors of productive work were ignored by the old system (e.g., the caring economy); much work of social value (e.g., volunteer work) “should also become a zone of rights.” To carry out this seismic transformation, Standing urges the precariat, good and bad, to “demand construction of an international work-rights regime.”8

As Standing notes, the possibility of “class” formation today is dependent upon many more variables than simply economic position. Yet in his account, the primary characteristics the members of the precariat share are absences: labor security, social security, positive work-based identities, community support in times of need, state benefits, private benefits, labor solidarity, even, he asserts, a future. Precarity, he argues, generates the “four A’s”: anger, anomie, anxiety, and alienation.

Standing’s prescriptions for greater security, however, may fail to launch because his diagnoses overlook an important aspect of American life: workers’ evolving attitudes toward the ways, sites, and meanings of work. To understand shifting work regimes and their changing cultural implications, we must examine what work has meant for individuals’ sense of self and social commitments. Asking how older work regimes have shaped our current cultural moment suggests that Standing’s “four A’s” and Monika Miller’s “liberation” resulted not simply from economic dictates but also from cultural forces that encouraged people to imagine and pursue new ways of work. Understanding these forces will be critical for shaping a new, more equitable regime of work rights.

The Postwar Work Regime—and Its Discontents

As sociologist Arne L. Kalleberg puts it in Good Jobs, Bad Jobs, the two and a half decades following World War II constituted “an age of security.” Management, rather than the dictates of shareholder value, structured most firms’ interests. In unionized sectors such as manufacturing, construction, and transportation, labor was generally treated as a fixed cost, and unions negotiated secure and well-paying jobs in exchange for labor peace. Society defined a “good job” as one that offered stability, health insurance, and a pension. Meanwhile, “bad jobs” made up the sector of the economy that had always been precarious, occupied by African Americans, immigrants, and women denied access to “good jobs.”9

By 1956, the United States had become a nation of clerks and industrial laborers. That year, more than twenty million Americans worked in blue-collar occupations, while the total engaged in white-collar work was fast approaching twenty-seven million.10 Often forgotten today, these jobs, blue-collar and white-collar alike, offered little in the way of traditional intrinsic benefits—for example, personal satisfaction, or steady accumulation of new skills. Social theorists worried that the bureaucratic, hierarchical white-collar firm and the routinized, blue-collar factory run according to the dehumanizing principles of Frederick Taylor’s scientific management also threatened to undermine the entrepreneurial individualism they considered exceptionally American. Social scientists turned their attention to “adjusting men to machines” and to discovering ways to nurture individualism in the bureaucratic workplace. The magazine Applied Anthropology, for instance, explored “practical problems in human engineering.”11

As Daniel Bell wrote in 1959, “Our emphasis has been on economic growth, increased output, but not on what kind of men are being molded by the work process.” Bell observed that “few individuals think of ‘the job’ as a place to seek any fulfillment…work itself, the daily tasks which the individual is called upon to perform” was something to be gotten over with as quickly as possible. Was “meaningfulness in work any less important” in 1959, he asked, than worker safety had been five decades earlier?12 A generation of sociologists pondered, as C. Wright Mills put it in 1951, what the new “economic and political situation” meant “for the inner life and the external career of the individual.”13

For Mills, in place of the independence, personal satisfaction, and autonomy that characterized the nineteenth-century entrepreneur or craftsman (all of which he overstated), modern work offered perverse intrinsic benefits. In typically acerbic prose, Mills charged that the “fetishism of the enterprise, and identification with the firm, are often as relevant for the white-collar hirelings as for the managers.” The “salesgirl does not think of herself in terms of what she does, but as being ‘with Saks’ or ‘working at Time.’” For office workers, Mills charged, “periodical salary increases and initial salaries were both ranked below such considerations” as “the state of the equipment, the appearance of the place, the ‘class of the people.’”14 (Mills thus anticipated the skewering, in the 1999 film Office Space, of a particularly debased worker and his cherished stapler.) Americans were at risk of becoming anesthetized by meaningless work, mindless consumption, and mass society. Mills zeroed in on the psychological roots and implications of the shifting moral order.

Stimulating individual creativity in the workplace was soon considered essential to nurturing individualism and the entrepreneurial spirit. As historian Jamie Cohen-Cole reveals in The Open Mind, research into the sources of creativity boomed after 1950. In 1963, John Gardner, president of the Carnegie Corporation, described creativity’s vogue in Self-Renewal: The Individual and the Innovative Society. Creativity, he wrote,

is a word of dizzying popularity.… It is more than a word today; it is an incantation. People think of it as a kind of wonder drug, powerful and presumably painless; and everybody wants a prescription. It is part of a growing resistance to the tyranny of formula, a new respect for individuality, a dawning recognition of the potentialities of the liberated mind.15

By emphasizing creativity in work, perhaps Americans could rediscover within the bureaucratized workplace the sense of individualism that had made the nation strong.

Amidst these concerns that new sites and habits of office work enervated the American spirit, a new species of advice writers emerged. The most influential of these “management theorists” was Peter Drucker, whose Viennese roots cultivated an intimate sense of the threats to individualism posed by the rampant bureaucratization of fascism and communism. Drucker contended that reforming modern capitalism by improving its signature institution—the corporation—could renew individualism and humanism. By the late 1950s, Drucker had completed the heroic, psychological, and meritocratic reformulation of the office worker. Employees, he explained, “expect to be ‘intellectuals,’” and are deflated when “they find that they are just ‘staff.’” He re-labeled white-collar employees “knowledge workers.” In 1960, Douglas McGregor amplified Drucker’s arguments in The Human Side of Enterprise, urging managers to nurture subordinates’ talents and initiative not through discipline but through cooperation, openness, and merit, fitting individualism to the modern corporate setting. Borrowing from psychologist Abraham Maslow, McGregor fused this meritocratic ethos to the kind of psychologizing that had vexed C. Wright Mills. Managers, he explained, needed to focus on “the satisfaction of higher-level ego” and encourage workers to meet “self-actualization needs.”16

Intellectual rationalization or honorific titles (the 1960s saw organizational charts become cluttered with vice presidents) did little to alter the reality of workers’ struggle to reconcile themselves psychologically to the structures and character of the modern workplace. The latent frustration with work and the sites of work that critics had sensed in the 1940s and 1950s found popular, psychologized expression in the 1960s. By 1968, public opinion analyst Samuel Lubell charged, “our society seems to have developed a predilection, even craze, for reading psychological explanations into anything and everything that happens, moving as far toward this extreme as Marxians once did in assigning an economic cause to anything and everything.”17 One widely cited 1960 article described “talking, fun, and fooling” as a means of “‘psychological survival’” that kept manufacturing workers from, as one put it, “going nuts.”18

Rather than adapt to the sites of work, the student New Left rejected dehumanizing factory or white-collar work, implicating each in the military-industrial complex, the seeming inevitability of nuclear annihilation, and the immoral prosecution of war in Vietnam. In seeking to “name the system,” as Students for a Democratic Society leader Paul Potter put it in 1965, young people believed that the sites and systems of work offered essential material support for inhumane and immoral developments at home and abroad and simultaneously inured a majority of Americans to those processes. “What kind of system is it,” Potter asked, “that creates faceless and terrible bureaucracies and makes those the place where people spend their lives and do their work…?”19

Drawing upon a kindred sense of unease and outrage, in 1967 left-wing artist Ben Shahn published “In Defense of Chaos” in Ramparts, a magazine aligned with the antiwar, student left. Disturbed by his own complicity in allowing his life “to be well ordered for me by benign and unseen forces,” Shahn sought the spark of “freedom,” “the poetic element in a dull and ordered world!” Nevertheless, he closed with a more measured call, an acknowledgment, perhaps, that bureaucratic rationality was here to stay, though it could use a tousling from time to time: “I don’t propose to release Chaos and just turn her loose upon the human race—we still must have banks and plane schedules. But I think it would be nice if we just made a pet of her and let her go free from time to time to get a breath of fresh air and romp around a little among the Planned Society.”20

Having earlier pondered the implications of mass conformity, the cult of efficiency, and workers’ discontents, Daniel Bell now puzzled over where such unrest was headed. The “adversary culture,” he posited, “declares in sweeping fashion that the status quo represents a state of absolute repression, so that, in a widening gyre, new and fresh assaults on the social structure are mounted.” Cautiously, he noted that “such disjunctions…historically have paved the way for more direct social revolutions.”21 Bell was wise to hedge his bets on widespread social revolution. The workplace and its workers, however, were ripe for change.

Toward a Flexible—and Riskier—Regime

As the cultural and radical left worked to name and undermine the system, the majority of American workers by the early 1970s sought simply to manage the seismic shifts wrenching the economy and its institutions. Studs Terkel invoked a bygone era to describe the upheaval afflicting the moral landscape of Western work. “Bob Cratchit may still be hanging on,” he wrote in 1972, “but Scrooge has been replaced by the conglomerate. Hardly a chance for Christmas spirit here. Who knows Bob’s name in this outfit—let alone his lame child’s?”22

The 1970s saw workers undertake a range of efforts to humanize the workplace, carve out space for self-expression through work, and reclaim a modicum of control over their work. Unionized manufacturing employees, for instance, demanded greater creativity and independence in their work lives. Striking autoworkers’ “shoulder-length hair, beards, Afros and mod clothing,” Newsweek reported, suggested that employees at GM’s Lordstown, Ohio, plant were building “an industrial Woodstock.”23 Other factory workers sought personal satisfaction by creating systems of job rotation, acquiring new skills and meeting new challenges.24

Such trappings and efforts formed the surface of a deep-rooted existential anxiety spreading among a generation of routinized laborers. Increasingly familiar with psychologized notions of the self, workers worried their mechanized work lives conveyed something essential about themselves. “I wanted to be somebody,” said Ford autoworker Dewey Burton. “I wanted to have some kind of recognition, you know, to be more tomorrow than I was yesterday.” But, he admitted, “I realized I was killing myself, and there wasn’t going to be any reward for my suicide.” Another worker lamented, “It takes so much to just make it that there’s no time for dreams and no energy for making them come true—and I’m not so sure anymore that it’s ever going to get better.”25 As Reinhard Bendix and Seymour Martin Lipset argued, more and more blue-collar workers dreamed of “being one’s own boss.” The ideal of “the ‘individual enterprise,’” they found, had “become by and large a working-class phenomenon.”26

Kindred anxieties found expression in white-collar office towers. In Boston, a group of female accounting secretaries organized to at once unravel the company’s hierarchy and seize greater responsibilities. After being fired for her efforts to democratize and humanize her workplace, Karen Nussbaum went on to organize 9to5, a Boston “local” reflecting a national push by predominantly female clerical workers to subvert office hierarchies, often by focusing on office layout and design. As one secretary-activist told an interviewer, “I think I’d like to see more flexibility. And I think I’d do away with job levels and just make everybody more equal.”27 On practical and material levels, especially from female workers’ perspectives, the emergence of “dual-earner families,” Arne Kalleberg notes, made it important for workers to have an increasing degree of flexibility in their schedules.28

Faced, on the one hand, with the felt need to find personal expression through work, while, on the other, confronting sites of work drained of autonomy, flexibility, humanism, or creativity, a majority of American workers expressed themselves in the “psychological themes” Mills had initially located in the white-collar worker. The 1970s became a decade in which an updated self-help industry flourished. Thomas A. Harris’s I’m OK, You’re OK popularized the psychiatric practice of Transactional Analysis. The book was a New York Times best seller for nearly two years. Robert J. Ringer scored two Times number one best sellers with his proto-libertarian version of self-help, popularizing Milton Friedman’s rational, self-interested economics in Winning through Intimidation (1973) and Looking Out for Number One (1977). As Peter Marin put it in Harper’s in 1975, the “new world view emerging” focused squarely on the self, with “individual survival as its sole good.”29

Prior to the arrival of a new economic order, then, the moral and cultural orders of work had begun to shift. Against the stultifying and alienating structures of postwar work, Americans sought new forms and sites of autonomy and authenticity. In this emergent order, as Charles Taylor argued, “notions like self-fulfillment” and “the ideal of authenticity” commanded the kind of “moral force” that had once characterized earlier cultural orders such as the Protestant ethic. As Taylor described it, the era’s “most powerful moral ideal” granted “crucial moral importance to a kind of contact with myself, with my own inner nature.” “Being true to myself means being true to my own originality, and that is something only I can articulate and discover. In articulating it,” Taylor argued, “I am also defining myself.”30

If the postwar years had been characterized by efforts to reconcile the traditional (mythical?) American moral order of individualism and merit with mass society and bureaucracy, the 1970s saw the tensions between the two explode. True individualism, authenticity, and autonomy were completely incompatible with bureaucracy. Postwar theorists and policymakers sought ways to shape the market and its sites of work in ways that might protect and nurture individualism. In the 1970s and 1980s, Americans turned to the market to create not only new ways and means of working, but also of defining themselves—psychologically, economically, and autonomously.

The values of individualism that flourished at the time offered cultural legitimation for the political project of unleashing markets and undoing collective forms of security. For privileged white-collar workers and tech entrepreneurs, the creative destruction of Silicon Valley and the high-stakes corporate raiding of high finance offered many of the intrinsic psychological benefits an earlier generation of management theorists had sought to foster through creativity. The money was certainly good, but “the satisfaction of higher-level ego” was off the charts.

Typifying the new breed of management theorist, Tom Peters spread the gospel of obliterating bureaucracy, not adjusting workers to it. (The title of one of his books, Thriving on Chaos, suggested Peters’s earlier cultural affinity with leftists like Ben Shahn.) As he reported in Liberation Management, one admirable innovator “demolished the corporate superstructure” “in about 100 days”; another company succeeded in “decimating the central staff ranks.” As he gleefully put it, “Could 38 of every 40 ‘staffers’ really be excess baggage? Yup!” Compared to those who labored under the “old standards,” Peters crowed, the workers of the future would be “liberated as hell.”31

For many in the working classes, however, the economic forces unleashed by the convergence of their embrace of the culture of individualism, deindustrialization, and the Reagan economy constituted a social crisis. The sociologist Jennifer Silva has documented the working-class transformations that have resulted from the broad diffusion of the therapeutic ethos and the adoption of the ethic of authenticity in the context of mounting precarity. In Coming Up Short, Silva describes, for instance, how “Monica,” a young working-class white woman, rejected traditional, routinized work and “redefined success in terms of passion and creativity.” Following a series of fleeting jobs and relationships, Monica found that depending on family or work to “center her sense of self would leave her constantly seeking.” “I have a strong work ethics [sic],” Monica said, “but at the same time I’m not going to slave away at a job that I hate.” Her goal was to find work wherever her “passion” took her.32 Operating firmly within the ethic of authenticity, Monica found that the intrinsic benefits of finding her passions outweighed, for the time being, her interest in material or emotional stability.

This quest for personal authenticity and autonomy in the face of unreliable communities and institutions has become a defining feature of the modern working class. Silva details many cases of working-class whites and African Americans deploying therapeutic metaphors to “make a virtue out of not asking for help, out of rejecting dependence and surviving completely on their own.” As a twenty-eight-year-old line cook explained, “When I start feeling helpless, I just have to make a conscious decision to not feel that way…. No one else is going to fix me but me,” he said. The result is a new cultural “common sense” that rejects structural explanations for racism, sexism, or inequality. Because an older social contract was shredded (or never existed for some), structural inequalities must be overcome individually. As Silva argues, “The cultural logic of neoliberalism resonates at the deepest level of the self.”33

Seeking New Forms of Security

Guy Standing’s call for a new regime of work rights thus sits at the nexus of global economic forces and a longer history of anxieties about the workplace. The class of worker Standing considers “good” has shown flashes of interest in forms of collective security. Coworking spaces indicate a quest for a form of solidarity. One in New York City, Prime Produce, is even organizing itself on the model of a medieval guild. As a founder described it, its “crafted social innovation” was designed to put human interaction first. Guilds, he said, “blocked innovation that dehumanized work” and “were always responsible to people first.”34

Cowork spaces might also reflect a further segmenting and segregation of the market and workplace. As opposed to the vertically integrated office or factory that was home to at least a modicum of class, racial, and educational diversity, cowork spaces offer the potential to seal off entire segments of like-minded, similarly cultured, and educated workers who earn enough to afford workspace dues.

Such efforts to create new forms of security or solidarity may end up simply reinforcing inequality and social division. Avoiding this outcome may require rediscovering what working people share, not what they lack (the latter being Standing’s primary way of denoting the precariat). Across the precarious economy, workers identify control over their work and non-work time as a signal virtue. While critics lament that Americans work more for less than at any point since World War II, many Americans find great intrinsic value in working for themselves. Though many workers would prefer steady employment, many also reject the prospect of repetitive, bureaucratized work. A poll of freelancers, for instance, found that nearly nine in ten would turn down a traditional full-time job if offered one.35 A Florida man values driving for Uber because it allows him to set his own schedule (“I need to be able to drop the kids off at school and…take [them] to appointments”) while also enabling him to devote time to his passion: getting his “photography business up in full swing.” Uber, he maintains, “was the best way to keep from going to the government for assistance.”36

Today, then, we might pose anew a question Daniel Bell asked nearly six decades ago: “If the slogan of ‘workers’ control’ is raised, the simple starting point, perhaps, is to ask: workers’ control over what?”37 Flexible work arrangements allow workers to control their work and their time spent working. A historically unprecedented valorization of individual autonomy is a core cultural virtue today. Faith in broader institutions is weak. To forge new forms of security and solidarity, we must understand our neoliberal moment as a cultural as well as an economic construct. Until we do, appeals to social solidarity that necessarily form the basis of a critically important new regime of rights and security will fall flat.

15. John Gardner, Self-Renewal: The Individual and the Innovative Society (New York: Harper & Row, 1963), 32; quoted in Jamie Cohen-Cole, The Open Mind: Cold War Politics and the Sciences of Human Nature (Chicago, IL: University of Chicago Press, 2014), 35.

Brent Cebul is the Mellon Postdoctoral Research Scholar in the Digital Humanities at the University of Richmond and an associate fellow at the Institute for Advanced Studies in Culture. He is the author of the forthcoming Developmental State: Business, Poverty, and Economic Empowerment from the New Deal to the New Democrats and is at work on a book tentatively titled Bootstraps Nation: A History of Self Help in Modern America.
