Total Quality Management: Empirical, Conceptual, and Practical Issues

Copyright Cornell University, Johnson Graduate School. This material is published under license from the publisher through the Gale Group, Farmington Hills, Michigan.

In recent years, total quality management (TQM) has become something of a social movement in the United States. This commentary returns to the writings of the movement's founders--W. Edwards Deming, Joseph Juran, and Kaoru Ishikawa--to assess the coherence, distinctiveness, and likely perseverance of this provocative management philosophy. We identify a number of gaps in what is known about TQM processes and outcomes and explore the congruence between TQM practices and behavioral science knowledge about motivation, learning, and change in social systems. The commentary concludes with a prognosis about the future of TQM--including some speculations about what will be needed if TQM is to take root and prosper in the years to come.(*)

It has now been a decade since the core ideas of total quality management (TQM) set forth by W. Edwards Deming, Joseph Juran, and Kaoru Ishikawa gained significant acceptance in the U.S. management community. In that decade, TQM has become something of a social movement. It has spread from its industrial origins to health care organizations, public bureaucracies, nonprofit organizations, and educational institutions. It has become increasingly prominent in the popular press, in the portfolios of trainers and consultants, and, more recently, in the scholarly literature.(1) Institutions specifically chartered to promote TQM have been established, and a discernible TQM ideology has developed and diffused throughout the managerial community. And, in its maturity, TQM has become controversial--something whose worth and impact people argue about.

Some writers have asserted that TQM provides a historically unique approach to improving organizational effectiveness, one that has a solid conceptual foundation and, at the same time, offers a strategy for improving performance that takes account of how people and organizations actually operate (Wruck and Jensen, 1994). A more skeptical view is that TQM is but one in a long line of programs--in the tradition of T-groups, job enrichment, management by objectives, and a host of others--that have burst upon the managerial scene rich with promise, only to give way in a few years to yet another new management fashion.

In this commentary, we provide a conceptual analysis of TQM that places these competing claims in perspective. We ask whether there really is such a thing as TQM or whether it has become mainly a banner under which a potpourri of essentially unrelated organizational changes are undertaken. We document how TQM activities and outcomes have been measured and evaluated by researchers and note some significant gaps in what has been learned. We explore the uneasy relation between behavioral processes that are central to TQM practice and mainline organizational scholarship about those same processes. And we conclude with an overall assessment of the current state of TQM theory and practice, including some speculations about what may be required if this potentially powerful approach is to take root and prosper in the years to come.

IS THERE SUCH A THING AS TQM?

As is inevitable for any idea that enjoys wide popularity in managerial and scholarly circles, total quality management has come to mean different things to different people. There is now such a diversity of things done under the name "total quality" that it has become unclear whether TQM still has an identifiable conceptual core, if it ever did. We begin with a close examination of what the movement's founders had to say about what TQM was supposed to be, and then we assess how TQM as currently practiced stacks up against the founders' values and prescriptions.

Virtually everything that has been written about TQM explicitly draws on the works of W. Edwards Deming, Joseph Juran, and Kaoru Ishikawa, the primary authorities of the TQM movement (for a review, see Crosby, 1989). Rather than providing here a precis of their writings, we draw on them to determine whether there exists among them (1) a coherent philosophical position that specifies the core values to be sought in TQM programs and (2) a distinctive set of interventions (structures, systems, and/or work practices) that are intended specifically to promote those values.

TQM Philosophy

Deming, Ishikawa, and Juran share the view that an organization's primary purpose is to stay in business, so that it can promote the stability of the community, generate products and services that are useful to customers, and provide a setting for the satisfaction and growth of organization members (Juran, 1969: 1-5; Ishikawa, 1985: 1; Deming, 1986: preface). The focus is on the preservation and health of the organization, but there also are explicitly stated values about the organization's context (the community and customers) and about the well-being of individual organization members: As Ishikawa (1985: 27) said, "An organization whose members are not happy and cannot be happy does not deserve to exist." The TQM strategy for achieving its normative outcomes is rooted in four interlocked assumptions--about quality, people, organizations, and the role of senior management.

Assumptions. The first assumption is about quality: producing quality work is assumed to be less costly to an organization than is poor workmanship. A fundamental premise of TQM is that the costs of poor quality (such as inspection, rework, lost customers, and so on) are far greater than the costs of developing processes that produce high-quality products and services. Although the organizational purposes espoused by the TQM authorities do not explicitly address traditional economic and accounting criteria of organizational effectiveness, their view is that organizations that produce quality goods will eventually do better even on traditional measures such as profitability than will organizations that attempt to keep costs low by compromising quality (Juran, 1974: 5.1-5.15; Ishikawa, 1985: 104-105; Deming, 1986: 11-12). The strong version of this assumption, implicit in Juran and Ishikawa but explicit and prominent in Deming's writing, is that producing quality products and services is not merely less costly but, in fact, is absolutely essential to long-term organizational survival (Deming, 1993: xi-xii).

The second assumption is about people. Employees naturally care about the quality of work they do and will take initiatives to improve it--so long as they are provided with the tools and training that are needed for quality improvement, and management pays attention to their ideas. As stated by Juran (1974: 4.54), "The human being exhibits an instinctive drive for precision, beauty and perfection. When unrestrained by economics, this drive has created the art treasures of the ages." Deming and Ishikawa add that an organization must remove all organizational systems that create fear--such as punishment for poor performance, appraisal systems that involve the comparative evaluation of employees, and merit pay (Ishikawa, 1985: 26; Deming, 1986: 101-109).

The third assumption is that organizations are systems of highly interdependent parts, and the central problems they face invariably cross traditional functional lines. To produce high-quality products efficiently, for example, product designers must address manufacturing challenges and trade-offs as part of the design process. Deming and Juran are insistent that cross-functional problems must be addressed collectively by representatives of all relevant functions (Juran, 1969: 80-85; Deming, 1993: 50-93). Ishikawa, by contrast, is much less system-oriented: He states that cross-functional teams should not set overall directions; rather, each line division should set its own goals using local objective-setting procedures (Ishikawa, 1985: 116-117).

The final assumption concerns senior management. Quality is viewed as ultimately and inescapably the responsibility of top management. Because senior managers create the organizational systems that determine how products and services are designed and produced, the quality-improvement process must begin with management's own commitment to total quality. Employees' work effectiveness is viewed as a direct function of the quality of the systems that managers create (Juran, 1974: 21.1-21.4; Ishikawa, 1985: 122-128; Deming, 1986: 248-249).

Change principles. TQM authorities specify four principles that should guide any organizational interventions intended to improve quality. The first is to focus on work processes. The quality of products and services depends most of all on the processes by which they are designed and produced. It is not sufficient to provide clear direction about hoped-for outcomes; in addition, management must train and coach employees to assess, analyze, and improve work processes (Juran, 1974: 2.11-2.17; Ishikawa, 1985: 60; Deming, 1986: 52).

The second principle is analysis of variability. Uncontrolled variance in processes or outcomes is the primary cause of quality problems and must be analyzed and controlled by those who perform an organization's front-line work. Only when the root causes of variability have been identified are employees in a position to take appropriate steps to improve work processes. According to Deming (1986: 20), "The central problem of management . . . is to understand better the meaning of variation, and to extract the information contained in variation" (see also Juran, 1974: 2.10-2.17; Ishikawa, 1985: chap. 12).

The third principle is management by fact. TQM calls for the use of systematically collected data at every point in a problem-solving cycle--from determining high-priority problems, through analyzing their causes, to selecting and testing solutions (Juran, 1974: 22.1-28.1; Ishikawa, 1985: 104-105; Deming, 1986: chap. 8). Although Deming, Ishikawa, and Juran differ in their preferred analytical tools, each bases his quality-improvement program on collecting data, using statistics, and testing solutions by experiment.

The fourth principle is learning and continuous improvement. The long-term health of an enterprise depends on treating quality improvement as a never-ending quest. Opportunities to develop better methods for carrying out work always exist, and a commitment to continuous improvement ensures that people will never stop learning about the work they do (Juran, 1969: 2-3; Ishikawa, 1985: 55-56; Deming, 1986: 49-52).

TQM Interventions

Despite some differences in emphasis, the three TQM authorities have a common philosophical orientation and share a set of core values about people, organizations, and change processes. They prescribe five interventions to realize those values.

Explicit identification and measurement of customer requirements. To achieve quality, it is essential to know what customers want and to provide products or services that meet their requirements (Ishikawa, 1985: 43). It is necessary, therefore, for organization members to assess directly customer requirements such as durability, reliability, and speed of service (Juran, 1974: 2.2; Deming, 1986: 177-182). Some customers are external to the organization, others are internal, as when the output of some organization members is passed on to others. TQM defines the next process down the line as the "customer" for each process. Within the organization, then, the assessment of customer requirements serves as a tool to foster cross-functional cooperation (Ishikawa, 1985: 107-108).

With data about customer requirements in hand, quality improvement can focus specifically on those aspects of work processes that are most consequential for customer satisfaction. Even so, high quality is not assured. Some organizations actively manipulate customer preferences (for example, through advertising) to bring them into line with what the organization already is able to provide. And customers may define their own requirements in terms of existing products and services that may be low in quality (Hayes and Abernathy, 1980). Deming (1993: 7-9) suggests that this may be especially characteristic of customers in the United States, because they have grown accustomed to poor-quality products and services; U.S. organizations that rely too heavily on what customers say they want risk setting quality standards far below what employees actually are capable of achieving.

Creation of supplier partnerships. TQM authorities suggest that organizations should choose vendors on the basis of quality, rather than solely on price. Moreover, they recommend that organizations work directly with raw material suppliers to ensure that their materials are of the highest quality possible (Juran, 1974: 10.1-10.35; Ishikawa, 1985, chap. 9; Deming, 1986: 31-43).

Use of cross-functional teams to identify and solve quality problems. Although cross-functional teams can be used in multiple ways in TQM programs, their main purpose is to identify and analyze the "vital few" problems of the organization (Ishikawa, 1985: 113-119; Deming, 1993: 85-89). Juran (1969) refers to such teams as the "steering arm" of a quality effort. Other teams, also cross-functional, are created to diagnose the causes of problems that have been identified by the steering arm and to develop and test possible solutions to them. Diagnostic teams can be either temporary task forces or continuing organizational entities. In both cases, department heads are included as team members to ensure that stakeholder departments will cooperate when the time comes to implement the team's recommendations. Juran, far more than Deming, advocates the use of quality-improvement teams within functions. But the team composition principle is the same: Choose people who can provide access to the data necessary for testing potential solutions and who are critical to implementing the solutions developed (Juran, 1969: 78-89).

Use of scientific methods to monitor performance and to identify points of high leverage for performance improvement. The three TQM authorities are of one voice in advocating the use of statistical tools to monitor and analyze work processes (Juran, 1974: chaps. 22-27; Ishikawa, 1985: 109-120; Deming, 1986: chaps. 8-9). A wide variety of statistical tools are available to identify the points of highest leverage for quality improvement, to evaluate alternative solutions to identified problems, and to document the results of process changes. Many of the tools involve applications of probability theory to generate findings that then can be summarized pictorially. Literally dozens of "quality tools" have been described in the literature (for a review, see Sashkin and Kiser, 1993). Three of the most commonly used tools are control charts, Pareto analysis, and cost-of-quality analysis.

A control chart provides a pictorial representation of the outputs of an ongoing process. Control charts are used to monitor the performance of a process and to determine whether that process is "in control"--whether the variance produced by the process is random or attributable to specific causes. It is assumed that all processes produce variance, but a stable process fluctuates randomly. Therefore, data from a stable process will tend to fall within predictable bounds. Scrutiny of a control chart allows the user to (1) determine whether a given process is in need of improvement, (2) identify points outside the control range so that the causes of uncontrolled variance can be sought, and (3) reassess the process after experimental attempts to improve it are completed (Deming, 1986: 323-346).
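The logic of a control chart can be sketched in a few lines of code. This is a minimal illustration, not taken from the commentary or from Deming's own procedures: it uses the common 3-sigma rule to set the "predictable bounds" described above and flags points outside them as candidates for special-cause investigation. The defect counts are hypothetical.

```python
# Minimal sketch of an individuals control chart: derive control limits
# from the process's own variation, then flag out-of-control points.
# The 3-sigma rule and the sample data are illustrative assumptions.

def control_limits(data, sigmas=3):
    """Return (center line, lower limit, upper limit) for the data."""
    n = len(data)
    mean = sum(data) / n
    variance = sum((x - mean) ** 2 for x in data) / n
    return mean, mean - sigmas * variance ** 0.5, mean + sigmas * variance ** 0.5

def out_of_control(data, sigmas=3):
    """Indices of points whose variance is attributable to specific causes."""
    _, lower, upper = control_limits(data, sigmas)
    return [i for i, x in enumerate(data) if x < lower or x > upper]

# Hypothetical daily defect counts; the spike on day 7 suggests an
# assignable cause, while the rest is ordinary random fluctuation.
defects = [4, 5, 3, 6, 4, 5, 4, 20, 5, 3, 4, 6]
print(out_of_control(defects))  # → [7]
```

A stable process would yield an empty list; any flagged index marks a point whose cause should be sought before the process itself is redesigned.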

Pareto analysis is used to identify the major factors that contribute to a problem and to distinguish the "vital few" from the "trivial many" causes. Pareto charts are used when each separate contributor to a problem can be quantified. For example, a group attempting to identify the vital few causes of high inventory costs would list each inventory item in order of total dollar value of materials kept in stock. Those materials that turn out to be major contributors to inventory costs are then addressed first (Juran, 1969: 43-54).
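The inventory example above can be sketched as follows. The item names, dollar values, and the 80 percent cutoff are hypothetical assumptions chosen for illustration; the procedure itself is just the ranking-and-accumulation logic Juran describes.

```python
# Minimal sketch of Pareto analysis: rank contributors to a problem by
# quantified impact and return the "vital few" that account for a chosen
# share of the total. All names and dollar values are hypothetical.

def pareto(contributors, threshold=0.8):
    """Return contributors, largest first, covering `threshold` of the total."""
    ranked = sorted(contributors.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(contributors.values())
    vital, running = [], 0.0
    for name, value in ranked:
        vital.append(name)
        running += value
        if running / total >= threshold:
            break
    return vital

# Dollar value of materials kept in stock, by inventory item.
inventory_costs = {
    "steel plate": 52000, "fasteners": 1800, "bearings": 26000,
    "lubricants": 900, "castings": 14000, "packaging": 2300,
}
print(pareto(inventory_costs))  # → ['steel plate', 'bearings']
```

Here two of six items account for roughly 80 percent of inventory cost, so improvement effort would be addressed to those two first, leaving the "trivial many" for later.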

Cost-of-quality analysis is used to highlight the cost savings that can be achieved by doing the work right the first time. The analysis involves quantifying all costs associated with maintaining acceptable quality levels, such as the costs of preventing errors, and then comparing these with the costs incurred by failures to achieve acceptable quality, such as the cost of rework. …
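The arithmetic of a cost-of-quality comparison is simple, as this sketch shows. The cost categories follow the distinction drawn above between maintaining acceptable quality and failing to achieve it; every figure is a hypothetical illustration.

```python
# Minimal sketch of a cost-of-quality analysis: compare the costs of
# doing the work right the first time with the costs incurred by
# failures to achieve acceptable quality. All figures are hypothetical.

conformance = {          # costs of preventing and detecting errors
    "training": 12000,
    "process design": 18000,
    "inspection": 9000,
}
nonconformance = {       # costs incurred when quality falls short
    "rework": 31000,
    "scrap": 14000,
    "warranty claims": 22000,
    "lost customers": 40000,
}

cost_of_conformance = sum(conformance.values())        # 39000
cost_of_nonconformance = sum(nonconformance.values())  # 107000
savings = cost_of_nonconformance - cost_of_conformance
print(savings)  # → 68000
```

In this hypothetical case the failure costs are well over twice the prevention costs, which is the kind of gap the analysis is meant to make visible to management.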
