Intel Corporation is the largest semiconductor manufacturer in the world, with 11 fabrication facilities and six assembly and test facilities. Intel has changed the global marketplace dramatically since it was founded in 1968; the company invented the microprocessor, the "computer on a chip" that made possible the first handheld calculators and personal computers (PCs). By the early 21st century, Intel's microprocessors were found in approximately 80 percent of PCs worldwide. The company's product line also includes chipsets and motherboards; flash memory used in wireless communications and other applications; networking devices and equipment for accessing the Internet, local area networks, and home networks; and embedded control microchips used in networking products, laser printers, factory automation instruments, cellular phone base stations, and other applications. Intel has remained competitive through a combination of clever marketing, well-supported research and development, superior manufacturing proficiency, a vital corporate culture, prowess in legal matters, and an ongoing alliance with software giant Microsoft Corporation often referred to as "Wintel."

1968–79: From DRAM to the 8086

Intel's founders, Robert Noyce and Gordon Moore, were among the eight founders of Fairchild Semiconductor Corporation, established in 1957. While at Fairchild, Noyce and Moore invented the integrated circuit; in 1968, they decided to form their own company. They were soon joined by Andrew Grove, a Hungarian refugee who had arrived in the United States in 1956 and joined Fairchild in 1963. Grove would remain president and CEO of Intel into the 1990s.

To obtain start-up capital, Noyce and Moore approached Arthur Rock, a venture capitalist, with a one-page business plan simply stating their intention of developing large-scale integrated circuits. Rock, who had helped start Fairchild Semiconductor, as well as Teledyne and Scientific Data Systems, had confidence in Noyce and Moore and provided $3 million in capital. The company was incorporated on July 18, 1968, as N M Electronics (the letters standing for Noyce Moore), but quickly changed its name to Intel, formed from the first syllables of "integrated electronics." Intel gathered another $2 million in capital before going public in 1971.

Noyce and Moore's scanty business proposal belied a clear plan to produce large-scale integrated (LSI) semiconductor memories. At that time, semiconductor memories were ten times more expensive than standard magnetic core memories. Costs were falling, however, and Intel's founders surmised that with the greater speed and efficiency of LSI technology, semiconductors would soon replace magnetic cores. Within a few months of its startup, Intel produced the 3101 Schottky bipolar memory, a high-speed random access memory (RAM) chip. The 3101 proved popular enough to sustain the company until the 1101, a metal oxide semiconductor (MOS) chip, was perfected and introduced in 1969. The following year, Intel introduced the 1103, a 1-kilobit (K) dynamic RAM, or DRAM, which was the first chip large enough to store a significant amount of information. With the 1103, Intel finally had a chip that really did begin to replace magnetic cores; DRAMs eventually proved indispensable to the personal computer.

The company's most dramatic impact on the computer industry involved its 1971 introduction of the 4004, the world's first microprocessor. Like many of Intel's innovations, the microprocessor was a byproduct of efforts to develop another technology. When a Japanese calculator manufacturer, Busicom, asked Intel to design cost-effective chips for a series of calculators, Intel engineer Ted Hoff was assigned to the project; during his search for such a design, Hoff conceived a plan for a central processing unit (CPU) on one chip. The 4004, which crammed 2,300 transistors onto a one-eighth- by one-sixth-inch chip, had the power of the old 3,000-cubic-foot ENIAC computer, which depended on 38,000 vacuum tubes.

Although Intel initially focused on the microprocessor as a computer enhancement that would allow users to add more memory to their units, the microprocessor's great potential—for everything from calculators to cash registers and traffic lights—soon became clear. The applications were facilitated by Intel's introduction of the 8008, an 8-bit microprocessor developed along with the 4004 but oriented toward data and character (rather than arithmetic) manipulation. The 8080, introduced in 1974, was the first truly general purpose microprocessor. For $360, Intel sold a whole computer on one chip, while conventional computers sold for thousands of dollars. The response was overwhelming. The 8080 soon became the industry standard and Intel the industry leader in the 8-bit market.

In response to ensuing competition in the manufacture of 8-bit microprocessors, Intel introduced the 8085, a faster chip with more functions. The company was also developing two more advanced projects, the 32-bit 432 and the 16-bit 8086. The 8086 was introduced in 1978 but took two years to achieve wide use, and, during this time, Motorola, Inc. produced a competing chip (the 68000) that seemed to be selling faster. Intel responded with a massive sales effort to establish its architecture as the standard. When International Business Machines Corporation (IBM) chose the 8088, the 8086's 8-bit-bus cousin, for its personal computer in 1980, Intel seemed to have beaten out the competition.

During the 1970s, Intel had also developed the erasable programmable read-only memory (EPROM), another revolutionary but unintended research byproduct. Intel physicist Dov Frohman was working on the reliability problems of the silicon gate used in the MOS process when he realized that the disconnected, or "floating," gates that were causing malfunctions could be used to create a chip that was erasable and reprogrammable. Since conventional ROM chips had to be permanently programmed during manufacture, any change required the manufacture of a whole new chip. With EPROM, however, Intel could offer customers chips that could be erased and reprogrammed with ultraviolet light and electricity. At its introduction in 1971, EPROM was a novelty without much of a market. But the microprocessor, invented at the same time, created a demand for memory; the EPROM offered memory that could be conveniently used to test microprocessors.

Another major development at Intel during this time was that of peripheral controller chips. Streamlined for specific tasks and stripped of unneeded functions, peripheral chips could greatly increase a computer's abilities without raising software development costs. One of Intel's most important developments in peripherals was the coprocessor, first introduced in 1980. Coprocessor chips were an extension of the CPU that could handle specific computer-intensive tasks more efficiently than the CPU itself. Once again, innovation kept Intel ahead of its competition.

Intel's rapid growth, from the 12 employees at its founding in 1968 to 15,000 in 1980, demanded a careful approach to corporate culture. Noyce, Moore, and Grove, who remembered their frustration with Fairchild's bureaucratic bottlenecks, found that defining a workable management style was important. Informal weekly lunches with employees kept communication lines open while the company was small, but that system had become unwieldy. Thus, the founders installed a carefully outlined program emphasizing openness, decision-making on the lowest levels, discipline, and problem solving rather than paper shuffling. Moreover, the company's top executives eschewed such luxuries as limousines, expense account lunches, and private parking spaces to establish a sense of teamwork with their subordinates.

In an interview with the Harvard Business Review in 1980, Noyce remarked on the company's hiring policy, stating, "we expect people to work hard. We expect them to be here when they are committed to be here; we measure absolutely everything that we can in terms of performance." Employee incentives included options on Intel stock, and technological breakthroughs were celebrated with custom-bottled champagne—"Vintage Intel" marked the first $250 million quarter, in 1983—the year sales reached $1 billion for the first time.

Company Perspectives:

For over 35 years, Intel Corporation has developed technology enabling the computer and Internet revolution that has changed the world. Founded in 1968 to build semiconductor memory products, Intel introduced the world's first microprocessor in 1971. Today, Intel supplies the computing and communications industries with chips, boards, systems, and software building blocks that are the "ingredients" of computers, servers and networking and communications products. These products are used by industry members to create advanced computing and communications systems. Intel's mission is to do a great job for our customers, employees, and stockholders by being the preeminent building block supplier to the worldwide digital economy.

1980s: From 286 to 486

During the 1974 recession, Intel was forced to lay off 30 percent of its employees, and morale declined substantially as a result. Thus, in 1981, when economic struggles again surfaced, instead of laying off more employees, Intel accelerated new product development with the "125 Percent Solution," which asked exempt employees to work two extra hours per day, without pay, for six months. A brief surge in sales the following year did not last, and, again, instead of resorting to layoffs, Intel imposed pay cuts of up to 10 percent. Such measures were not universally popular among the workforce, but, by June 1983, all cuts had been restored and retroactive raises had been made. Moreover, in December 1982, IBM paid $250 million for a 12 percent share of Intel, giving the company not only a strong capital boost, but also strong ties to the undisputed industry leader. IBM would eventually increase its stake to 20 percent before selling its Intel stock in 1987.

During the early 1980s, Intel began to slip in some of its markets. Fierce competition in DRAMs, static RAMs, and EPROMs left Intel concentrating on microprocessors. While competitors claimed that Intel simply gave away its DRAM market, Moore told Business Week in 1988 that the company deliberately focused on microprocessors as the least cyclical field in which to operate. Customer service, an area Intel had been able to overlook for years as it dominated its markets, became more important as highly efficient Japanese and other increasingly innovative competitors challenged Intel's position. In addition, Intel's manufacturing record, strained in years past by undercapacity, needed fixing. Fab 7, Intel's seventh wafer-fabrication plant, opened in 1983 only to face two years of troubled operations before reaching full capacity. Between 1984 and 1988, Intel closed eight old plants, and in 1988 it spent some $450 million on new technology to bring its manufacturing capacity into line with its developmental prowess.

Despite these retrenchments, the company continued to excel in the microprocessor market. In 1982 Intel introduced its 80286 microprocessor, the chip that quickly came to dominate the upper-end PC market after IBM came out with the 286-powered PC/AT. The 286 was followed in 1985 by Intel's 80386 chip, which was popularized in 1987 by the Compaq DESKPRO 386 and which, despite bugs when it first came out, became one of the most popular chips on the market. While the 286 brought to the personal computer a speed and power that gave larger computers their first real challenge, the 386 offered even greater speed and power together with the ability to run more than one program at a time. The 386 featured 32-bit architecture and 275,000 transistors, more than twice the number of the 286.

In 1989 Intel introduced the 80486, a chip Business Week heralded as "a veritable mainframe-on-a-chip." The 486 included 1.2 million transistors and the first built-in math coprocessor, and was 50 times faster than the 4004, the first microprocessor. In designing the i486, Intel resisted an industry trend toward RISC (reduced instruction-set computing), a chip design that eliminated rarely used instructions in order to gain speed. Intel argued that what RISC chips gained in speed they lost in flexibility and that, moreover, RISC chips were not compatible with software already on the market, which Intel felt would secure the 486's position. A new chip, the 64-bit i860 announced in early 1989, however, did make use of RISC technology to offer what Intel claimed would be a "supercomputer on a chip."

Also in 1989, a major lawsuit that Intel had filed against NEC Corporation five years before was decided. Intel had claimed that NEC violated its copyright on the microcode, or embedded software instructions, of Intel's 8086 and 8088 chips. Although Intel had licensed NEC to produce the microcode, NEC had subsequently designed a similar chip of its own. At issue was whether microcode could be copyrighted. The court ruled that it could but that NEC had not violated any copyright in the case at hand. The suit made public some issues surrounding Intel's reputation. Some rivals and consumers, for example, claimed that Intel used its size and power to repress competition through such tactics as filing "meritless" lawsuits and tying microprocessor sales to other chips. Other observers, however, praised Intel's protection of its intellectual property and, subsequently, its profits. The Federal Trade Commission conducted a two-year investigation of Intel's practices and did not recommend criminal charges against the company, but two rival companies, Advanced Micro Devices, Inc. (AMD) and Cyrix Corporation, filed antitrust lawsuits against Intel in 1993.

1990s: The Pentium Decade

Intel's annual net income topped $1 billion for the first time in 1992, following a very successful, brand-building marketing campaign. Intel ads aggressively sought to bolster consumer interest in and demand for computers that featured "Intel Inside." By late 1993, the company's brand equity totaled $17.8 billion, more than three times its 1992 sales. Also during this time, Intel began to branch out from chipmaking. In 1992 the company's Intel Products Group introduced network, communications, and personal conferencing products for retail sale directly to PC users.

In 1993 Intel released its fifth-generation Pentium processor, a trademarked chip capable of executing over 100 million instructions per second (MIPS) and supporting, for example, real-time video communication. The Pentium processor, with its 3.1 million transistors, was up to five times more powerful than the 33-megahertz Intel 486 DX microprocessor (and 1,500 times the speed of the 4004), but, in an unusual marketing maneuver, the company suggested that "all but the most demanding users" would seek out PCs powered by the previous chip. The Pentium's reputation was initially sullied by the revelation of an embedded mathematical flaw, but Intel moved quickly to fix the problem.

Key Dates:

1971:

Intel introduces the world's first microprocessor (the 4004) and goes public.

1974:

Company introduces the first general purpose microprocessor (the 8080).

1980:

IBM chooses the Intel microprocessor for the first personal computer.

1983:

Revenues exceed $1 billion for the first time.

1992:

Net income tops $1 billion for the first time.

1993:

The fifth generation chip, the Pentium, debuts.

1996:

Revenues surpass $20 billion, net income exceeds $5 billion.

1997:

Company introduces the Pentium II microprocessor.

1999:

Intel debuts the Pentium III and is added to the Dow Jones Industrial Average.

2000:

The Pentium 4 hits the market.

2003:

The Centrino technology for mobile computers is launched.

The company enjoyed a dramatic 50 percent revenue increase in 1993, reaching $8.78 billion, up from $5.84 billion in 1992. Moreover, Intel's net income leapt 115 percent to $2.3 billion, repudiating Wall Street's worries that competition had squeezed profit margins. While Intel faced strong competition from rival chips such as the PowerPC, developed by Motorola and former partner IBM, its place at the leading edge of technology was undisputed.

A key initiative that kept Intel ahead of its competitors was the company's move beyond chip design into computer design. With the advent of the Pentium, Intel began designing chipsets and motherboards, the latter being the PC circuit board that combined a microprocessor and a chipset into the basic subsystem of a PC. With the company now selling the guts of a PC, dozens of computer manufacturers began making and selling Pentium-based machines.

In the mid-1990s, as sales of PCs accelerated and multimedia and the Internet were beginning to emerge, Intel continued developing ever more powerful microprocessors. In 1995 the Pentium Pro hit the market sporting 5.5 million transistors and capable of performing up to 300 MIPS. Intel next added MMX technology to its existing line of Pentium processors. MMX consisted of a new set of instructions that was designed specifically to improve the multimedia performance of personal computers. Fueled by exploding demand, revenues hit $20.85 billion by 1996, while net income soared to $5.16 billion.

At this point Intel was continuing its longtime strategy of designing new, more powerful chips for the top end of the market while allowing previous-generation microprocessors to migrate down to the lower segments of the market. With the introduction of the Pentium II in May 1997, however, the company adopted a new strategy of developing a range of microprocessors for every segment of the computing market. The Pentium II, with 7.5 million transistors, debuted with a top-end model clocked at 300 megahertz. Originally designed for high-end desktop PCs, the Pentium II was soon adapted for use in notebook and laptop computers. The following year came the launch of the Celeron processor, which was designed specifically for the value PC desktop sector, a rapidly growing segment of the market ever since the early 1997 debut of a sub-$1,000 PC from Compaq. Also in 1998 Intel for the first time designed a microprocessor, the Pentium II Xeon, especially for midrange and higher-end servers and workstations. At the same time Intel was moving into another burgeoning sector, that of embedded control chips for networking and other applications, such as digital set-top boxes.

Meanwhile Intel settled a dispute with Digital Equipment Corporation (DEC) over the development of the Pentium chip by acquiring DEC's semiconductor operations. In May 1997 Craig R. Barrett was named president of Intel, having joined the company in 1974, served as head of manufacturing starting in 1985, and been named chief operating officer in 1993. Grove remained chairman and CEO for another year, whereupon Barrett succeeded him as CEO, with Grove retaining the chairmanship. In early 1999 Intel reached a settlement with the Federal Trade Commission on an antitrust suit, thereby avoiding the protracted litigation and negative publicity that beset its Wintel partner, Microsoft, in the late 1990s. Reflecting the increasing importance of technology to the U.S. economy, Intel was added to the Dow Jones Industrial Average in November 1999.

During the late 1990s Intel made several strategic acquisitions that rapidly gave the company a significant presence in areas outside its microprocessor core: wireless communications products, such as flash memory for mobile phones and two-way pagers; networking building blocks, such as hubs, switches, and routers; and embedded control chips for laser printers, storage media, and automotive systems. Intel also entered the market for e-commerce services, rapidly building up the largest business-to-business e-commerce site in the world, with $1 billion per month in online sales by mid-1999. The company was not neglecting its core, however; in 1999 Intel had its largest microprocessor launch ever with the simultaneous introduction of 15 Pentium III and Pentium III Xeon processors.

New Strategies in the Less Buoyant Early 2000s

The new product launches continued in 2000, but they were accompanied by an uncharacteristic series of blunders. In February arch-rival AMD had bested Intel by releasing the first 1-gigahertz chip, the Athlon, which had the added benefit of being cheaper than the Pentium III. Intel responded by rushing a 1.13-gigahertz version of the Pentium III to market, but the processor simply did not work right and thousands had to be recalled. Further embarrassment came when the firm had to recall a million motherboards because of a faulty chip. Intel had also underestimated growth in PC sales, leaving its production capacity insufficient to meet the demands of computer makers, and it cancelled plans to develop a low-end microprocessor called Timna that had been slated for budget PCs. Intel also continued to encounter problems developing the complex Itanium 64-bit processor, the company's first, which was specifically designed, in partnership with Hewlett-Packard Company, to meet the needs of powerful Internet servers. The long-delayed Itanium, seven years in the making at a cost of $2 billion, finally reached the market in 2001, receiving a rather muted initial reception. (The Itanium line was later refocused on the high-end server market.) On the bright side, Intel successfully released the Pentium 4 in November 2000. This processor included 42 million transistors and ran at an initial speed of 1.5 gigahertz, enabling Intel to regain the lead in the ongoing chip-speed battle with AMD. Despite all of the year's travails, Intel reached new heights in financial performance, earning $10.54 billion in profits on revenues of $33.73 billion.

The bursting of the Internet bubble posed new challenges for Intel in 2001 as consumer spending on computers dropped off and corporate information technology managers pulled back as well. The fierce competition from AMD prompted Intel to initiate a brutal price war, which cut both revenues and profits, and it also slashed Intel's worldwide share of the microprocessor market to below 80 percent, compared to the 86.7 percent figure from 1998. In 2001 Barrett began jettisoning many of the new ventures and acquisitions that were part of the late 1990s diversification drive, in a renewed refocusing on microprocessors. Revenues for 2001 fell 21 percent to $26.54 billion—the first such drop since the mid-1980s tech recession—while profits plummeted 87 percent to $1.29 billion. Early the following year, Paul Otellini was named president and chief operating officer, with Barrett remaining CEO. Otellini had served in a variety of marketing and management positions since joining the company in 1974, most recently serving as head of Intel's core operating unit, the architecture group, which was responsible for developing microprocessors, chipsets, and motherboards for desktop and notebook computers and for servers.

As the technology downturn continued in 2002, Intel cut thousands of workers from its payroll to reduce costs. Behind the scenes, an important change occurred in the company's approach to designing chips. Since the 1980s Intel had maintained its leading position by creating ever-faster processors. But by the early 2000s speed was becoming less important to the majority of PC users, who were mainly employing their desktop PCs and laptops to surf the Internet and run basic programs, such as word processors. Intel decided to deemphasize speed in favor of designing chips to better fit the way people were actually using their computers and to do so using technology "platforms," which were composed of several chips rather than a single microprocessor. The first fruit of this endeavor was Centrino, launched in early 2003. Centrino was a combination of chips specifically designed for portable computers. It included the Pentium M microprocessor, which while not sporting top speeds consumed much less power than the typical chip, providing for longer battery life (and reduced energy consumption when installed in desktop computers). The Pentium M was also smaller in size, making it less expensive to manufacture. Centrino also included a supporting chipset to further improve battery life and graphics performance as well as a wireless radio chip for connecting to the burgeoning number of wireless (Wi-Fi) networks being installed at corporate offices, in retail outlets, and within homes.

Buoyed by the success of Centrino, Intel's revenues hit a new high in 2004, $34.21 billion, despite a number of manufacturing glitches, product delays, and schedule changes during the year. Intel abandoned its efforts to develop television display chips and also scrapped plans to introduce the first 4-gigahertz processor because of problems with overheating. The profits of $7.52 billion were an impressive 33 percent higher than the previous year but below the peak reached in 2000.

In May 2005 Otellini became only the fifth CEO in Intel history and the first non-engineer to hold the post. At the same time, Barrett succeeded Grove as chairman. One of the key legacies of Barrett's tenure was surely the huge outlay of capital, as much as $32 billion over six years, expended to rebuild Intel's manufacturing base, enabling the firm to increase capacity to meet chip demand and add new capabilities to its products. At the same time, Otellini was credited with leading the push toward platforms, and this approach was institutionalized in a 2005 reorganization that divided the company into five market-focused groups: corporate computing, the digital home, mobile computing, healthcare, and channels (PCs for small manufacturers). Otellini was also shifting the product development effort toward so-called dual-core technology, featuring two computing engines on a single piece of silicon. In this realm, Intel was competing fiercely with, and playing catch-up to, AMD, which released its first dual-core chips for PCs in 2005, whereas Intel was aiming to produce three lines of dual-core processors, for notebooks, desktops, and servers, during the second half of 2006. Like the Centrino technology, dual-core chips were being developed to extend battery life in laptops and cut power costs for desktop PCs and servers. They were also intended to improve performance while avoiding the problems with overheating that had plagued some of the fastest single-processor models. Intel was simultaneously beginning work on multicore platforms with three or more "brains." Two other developments from mid-2005 held potential long-term significance. AMD filed a wide-ranging antitrust suit in U.S. federal court accusing Intel of using illegal inducements and coercion to discourage computer makers from buying AMD's computer chips. This action followed an antitrust ruling against Intel in Japan earlier in the year.
In the meantime, in what seemed a significant coup, Intel reached an agreement with Apple Computer, Inc., whereby Apple would begin shifting its Macintosh computers from IBM's PowerPC chips to Intel chips.

Intel Corporation is the largest semiconductor manufacturer in the world, with major facilities in the United States, Europe, and Asia. Intel has changed the world dramatically since it was founded in 1968; the company invented the microprocessor, the “computer on a chip” that made possible the first handheld calculators and personal computers (PCs). By the early 21st century, Intel’s microprocessors were found in more than 80 percent of PCs worldwide. The company’s product line also includes chipsets and motherboards; flash memory used in wireless communications and other applications; hubs, switches, routers, and other products for Ethernet networks; and embedded control chips used in networking products, laser printers, imaging devices, storage media, and other applications. Intel remained competitive through a combination of clever marketing, well-supported research and development, superior manufacturing proficiency, a vital corporate culture, legal proficiency, and an ongoing alliance with software giant Microsoft Corporation often referred to as “Wintel.”

1968–79: From DRAM to the 8086

Intel’s founders, Robert Noyce and Gordon Moore, were among the eight founders of Fairchild Semiconductor, established in 1957. While at Fairchild, Noyce and Moore invented the integrated circuit, and, in 1968, they decided to form their own company. They were soon joined by Andrew Grove, a Hungarian refugee who had arrived in the United States in 1956 and joined Fairchild in 1963. Grove would remain president and CEO of Intel into the 1990s.

To obtain start-up capital, Noyce and Moore approached Arthur Rock, a venture capitalist, with a one-page business plan simply stating their intention of developing large-scale integrated circuits. Rock, who had helped start Fairchild Semiconductor, as well as Teledyne and Scientific Data Systems, had confidence in Noyce and Moore and provided $3 million in capital. The company was incorporated on July 18, 1968, as N M Electronics (the letters standing for Noyce Moore), but quickly changed its name to Intel, formed from the first syllables of “integrated electronics.” Intel gathered another $2 million in capital before going public in 1971.

Noyce and Moore’s scanty business proposal belied a clear plan to produce large-scale integrated (LSI) semiconductor memories. At that time, semiconductor memories were ten times more expensive than standard magnetic core memories. Costs were falling, however, and Intel’s founders felt that with the greater speed and efficiency of LSI technology, semiconductors would soon replace magnetic cores. Within a few months of its startup, Intel produced the 3101 Schottky bipolar memory, a high-speed random access memory (RAM) chip The 3101 proved popular enough to sustain the company until the 1101, a metal oxide semiconductor (MOS) chip, was perfected and introduced in 1969. The following year, Intel introduced the 1103, a one Kilobyte (K) dynamic RAM, or DRAM, which was the first chip large enough to store a significant amount of information. With the 1103, Intel finally had a chip that really did begin to replace magnetic cores; DRAMs eventually proved indispensable to the personal computer.

The company’s most dramatic impact on the computer industry involved its 1971 introduction of the 4004, the world’s
first microprocessor. Like many of Intel’s innovations, the microprocessor was a byproduct of efforts to develop another technology. When a Japanese calculator manufacturer, Busicom, asked Intel to design cost-effective chips for a series of calculators, Intel engineer Ted Hoff was assigned to the project; during his search for such a design, Hoff conceived a plan for a central processing unit (CPU) on one chip. The 4004, which crammed 2,300 transistors onto a one-eighth- by one-sixth-inch chip, had the power of the old 3,000-cubic-foot ENIAC computer, which depended on 38,000 vacuum tubes.

Although Intel initially focused on the microprocessor as a computer enhancement that would allow users to add more memory to their units, the microprocessor’s great potential—for everything from calculators to cash registers and traffic lights—soon became clear. These applications were facilitated by Intel’s introduction of the 8008, an 8-bit microprocessor developed along with the 4004 but oriented toward data and character (rather than arithmetic) manipulation. The 8080, introduced in 1974, was the first truly general purpose microprocessor. For $360, Intel sold a whole computer on one chip, while conventional computers sold for thousands of dollars. The response was overwhelming. The 8080 soon became the industry standard and Intel the industry leader in the 8-bit market.

In response to ensuing competition in the manufacture of 8-bit microprocessors, Intel introduced the 8085, a faster chip with more functions. The company was also developing two more advanced projects, the 32-bit 432 and the 16-bit 8086. The 8086 was introduced in 1978 but took two years to achieve wide use, and, during this time, Motorola produced a competing chip (the 68000) that seemed to be selling faster. Intel responded with a massive sales effort to establish its architecture as the standard. When International Business Machines Corporation (IBM) chose the 8088, the 8086’s 8-bit-bus cousin, for its personal computer in 1980, Intel seemed to have beaten out the competition.

During the 1970s, Intel had also developed the erasable programmable read-only memory (EPROM), another revolutionary but unintended research byproduct. Intel physicist Dov Frohman was working on the reliability problems of the silicon gate used in the MOS process when he realized that the disconnected, or “floating,” gates that were causing malfunctions could be used to create a chip that was erasable and reprogrammable. Since conventional ROM chips had to be permanently programmed during manufacture, any change required the manufacture of a whole new chip. With EPROM, however, Intel could offer customers chips that could be erased and reprogrammed with ultraviolet light and electricity. At its introduction in 1971, EPROM was a novelty without much of a market. But the microprocessor, invented at the same time, created a demand for memory; the EPROM offered memory that could be conveniently used to test microprocessors.

Another major development at Intel during this time was that of peripheral controller chips. Streamlined for specific tasks and stripped of unneeded functions, peripheral chips could greatly increase a computer’s abilities without raising software development costs. One of Intel’s most important developments in peripherals was the coprocessor, first introduced in 1980. Coprocessor chips were an extension of the CPU that could handle specific computer-intensive tasks more efficiently than the CPU itself. Once again, innovation kept Intel ahead of its competition.

Intel’s rapid growth, from the 12 employees at its founding in 1968 to 15,000 in 1980, demanded a careful approach to corporate culture. Noyce, Moore, and Grove, who remembered their frustration with Fairchild’s bureaucratic bottlenecks, found that defining a workable management style was important. Informal weekly lunches with employees kept communication lines open while the company was small, but that system had become unwieldy. Thus, the founders installed a carefully outlined program emphasizing openness, decision making on the lowest levels, discipline, and problem solving rather than paper shuffling. Moreover, the company’s top executives eschewed such luxuries as limousines, expense account lunches, and private parking spaces to establish a sense of teamwork with their subordinates.

In an interview with the Harvard Business Review in 1980, Noyce remarked on the company’s hiring policy, stating, “we expect people to work hard. We expect them to be here when they are committed to be here; we measure absolutely everything that we can in terms of performance.” Employee incentives included options on Intel stock, and technological breakthroughs were celebrated with custom-bottled champagne—“Vintage Intel” marked the first $250 million quarter, in 1983—the year sales reached $1 billion for the first time.

1980s: From 286 to 486

During the 1974 recession, Intel was forced to lay off 30 percent of its employees, and morale declined substantially as a result. Thus, in 1981, when economic struggles again surfaced, instead of laying off more employees, Intel accelerated new product development with the “125 Percent Solution,” which asked exempt employees to work two extra hours per day, without pay, for six months. A brief surge in sales the following year did not last, and, again, instead of more layoffs, Intel imposed pay cuts of up to ten percent. Such measures were not popular with all of its workforce, but, by June 1983, all cuts had been restored and retroactive raises had been made. Moreover, in December 1982, IBM paid $250 million for a 12 percent share of Intel, giving the company not only a strong capital boost, but also strong ties to the undisputed industry leader. IBM would eventually increase its stake to 20 percent before selling its Intel stock in 1987.

Company Perspectives

The Internet revolution requires a wholesale reengineering of the infrastructure for commerce and communications. In five to eight years, we believe the world will be linked by one billion connected computers, through tens of millions of servers, generating trillions of dollars of e-commerce. As we shift our focus from a PC-dominated industry to an Internet-dominated economy, we are positioning ourselves to provide key technologies to help drive this transformation.

During the early 1980s, Intel began to slip in some of its markets. Fierce competition in DRAMs, static RAMs, and
EPROMs left Intel concentrating on microprocessors. While competitors claimed that Intel simply gave away its DRAM market, Moore told Business Week in 1988 that the company deliberately focused on microprocessors as the least cyclical field in which to operate. Customer service, an area Intel had been able to overlook for years as it dominated its markets, became more important as highly efficient Japanese and other increasingly innovative competitors challenged Intel’s position. In addition, Intel’s manufacturing record, strained in years past by undercapacity, needed fixing. Fab 7, Intel’s seventh wafer-fabrication plant, opened in 1983 only to face two years of troubled operations before reaching full capacity. Between 1984 and 1988, Intel closed eight old plants, and in 1988 it spent some $450 million on new technology to bring its manufacturing capacity into line with its developmental prowess.

Despite these retrenchments, the company continued to excel in the microprocessor market. In 1982 Intel introduced its 80286 microprocessor, the chip that quickly came to dominate the upper-end PC market, when IBM came out with the 286-powered PC/AT. The 286 was followed in 1985 by Intel’s 80386 chip, popularized in 1987 by the Compaq DESKPRO 386, which, despite bugs when it first came out, became one of the most popular chips on the market. While the 286 brought to the personal computer a speed and power that gave larger computers their first real challenge, the 386 offered even greater speed and power together with the ability to run more than one program at a time. The 386 featured 32-bit architecture and 275,000 transistors—more than twice the number of the 286.

In 1989 Intel introduced the 80486, a chip Business Week heralded as “a veritable mainframe-on-a-chip.” The 486 included 1.2 million transistors and the first built-in math coprocessor, and was 50 times faster than the 4004, the first microprocessor. In designing the i486, Intel resisted an industry trend toward RISC (reduced instruction-set computing), a chip design that eliminated rarely used instructions in order to gain speed. Intel argued that what RISC chips gained in speed they lost in flexibility and that, moreover, RISC chips were not compatible with software already on the market, which Intel felt would secure the 486’s position. A new chip, the 64-bit i860 announced in early 1989, however, did make use of RISC technology to offer what Intel claimed would be a “supercomputer on a chip.”

Also in 1989, a major lawsuit that Intel had filed against NEC Corporation five years before was decided. Intel had claimed that NEC violated its copyright on the microcode, or embedded software instructions, of Intel’s 8086 and 8088 chips. Although Intel had licensed NEC to produce the microcode, NEC had subsequently designed a similar chip of its own. At issue was whether microcode could be copyrighted. The court ruled that it could but that NEC had not violated any copyright in the case at hand. The suit made public some issues surrounding Intel’s reputation. Some rivals and consumers, for example, claimed that Intel used its size and power to repress competition through such tactics as filing “meritless” lawsuits and tying microprocessor sales to other chips. Other observers, however, praised Intel’s protection of its intellectual property and, subsequently, its profits. The Federal Trade Commission conducted a two-year investigation of Intel’s practices and did not recommend criminal charges against the company, but two rival companies—Advanced Micro Devices Inc. and Cyrix Corp.—filed antitrust lawsuits against Intel in 1993.

1990s: The Pentium Decade

Intel’s annual net income topped $1 billion for the first time in 1992, following a very successful, brand-building marketing campaign. Intel ads aggressively sought to bolster consumer interest in and demand for computers that featured “Intel Inside.” By late 1993, the company’s brand equity totaled $17.8 billion—more than three times its 1992 sales. Also during this time, Intel began to branch out from chipmaking. In 1992, the company’s Intel Products Group introduced network, communications, and personal conferencing products for retail sale directly to PC users.

In 1993 Intel released its fifth-generation Pentium processor, a trademarked chip capable of executing over 100 million instructions per second (MIPS) and supporting, for example, real-time video communication. The Pentium processor, with its 3.1 million transistors, was up to five times more powerful than the 33-megahertz Intel 486 DX microprocessor (and 1,500 times the speed of the 4004), but, in an unusual marketing maneuver, the company suggested that “all but the most demanding users” would seek out PCs powered by the previous chip. The Pentium’s reputation was initially sullied by the revelation of an embedded mathematical flaw, but Intel moved quickly to fix the problem.

1999: Intel debuts the Pentium III and is added to the Dow Jones Industrial Average.

2000: The first Intel 1-gigahertz processor hits the market.

The company enjoyed a dramatic 50 percent revenue increase in 1993, reaching $8.78 billion from $5.84 billion in 1992. Moreover, Intel’s net income leapt 115 percent to $2.3 billion, repudiating Wall Street’s worries that competition had squeezed profit margins. While Intel faced strong competition, notably from the PowerPC chip produced by giant Motorola, Inc. and former partner IBM, its place at the leading edge of technology was undisputed.

A key initiative that kept Intel ahead of its competitors was the company’s move beyond chip design into computer design. With the advent of the Pentium, Intel began designing chipsets and motherboards—the latter being the PC circuit board that combined a microprocessor and a chipset into the basic subsystem of a PC. With the company now selling the guts of a PC, dozens of computer manufacturers began making and selling Pentium-based machines.

In the mid-1990s, as sales of PCs accelerated and multimedia and the Internet were beginning to emerge, Intel continued developing ever more powerful microprocessors. In 1995 the Pentium Pro hit the market sporting 5.5 million transistors and capable of performing up to 300 MIPS. Intel next added MMX technology to its existing line of Pentium processors. MMX consisted of a new set of instructions that was designed specifically to improve the multimedia performance of personal computers. Fueled by exploding demand, revenues hit $20.85 billion by 1996, while net income soared to $5.16 billion.

At this point Intel was continuing its longtime strategy of designing new, more powerful chips for the top end of the market while allowing previous-generation microprocessors to migrate down to the lower segments of the market. With the introduction of the Pentium II in May 1997, however, the company adopted a new strategy of developing a range of microprocessors for every segment of the computing market. The Pentium II, with 7.5 million transistors, debuted with a top-end model that clocked at 300 MHz. Originally designed for high-end desktop PCs, the Pentium II was soon adapted for use in notebook and laptop computers. The following year came the launch of the Celeron processor, which was designed specifically for the value PC desktop sector, a rapidly growing segment of the market ever since the early 1997 debut of a sub-$1,000 PC from Compaq. Also in 1998 Intel for the first time designed a microprocessor—the Pentium II Xeon—especially for midrange and higher-end servers and workstations. At the same time Intel was moving into another burgeoning sector, that of embedded control chips for networking and other applications, such as digital set-top boxes.

Meanwhile Intel settled a dispute with Digital Equipment Corporation (DEC) over the development of the Pentium chip by acquiring DEC’s semiconductor operations. In May 1997 Craig R. Barrett was named president of Intel; he had joined the company in 1974, headed manufacturing starting in 1985, and served as chief operating officer beginning in 1993. Grove remained chairman and CEO for one more year, whereupon Barrett succeeded him as CEO, with Grove retaining the chairmanship. In early 1999 Intel reached a settlement with the Federal Trade Commission on an antitrust suit, thereby avoiding the protracted litigation and negative publicity that beset its Wintel partner, Microsoft, in the late 1990s. Reflecting the increasing importance of technology to the U.S. economy, Intel was added to the Dow Jones Industrial Average in November 1999.

During the late 1990s Intel made several strategic acquisitions that rapidly gave the company a significant presence in areas outside its microprocessor core: wireless communications products, such as flash memory for mobile phones and two-way pagers; networking building blocks, such as hubs, switches, and routers; and embedded control chips for laser printers, storage media, and automotive systems. Intel also entered the market for e-commerce services, rapidly building up the largest business-to-business e-commerce site in the world, with $1 billion per month in online sales by mid-1999. The company was not neglecting its core, however; in 1999 Intel had its largest microprocessor launch ever with the simultaneous introduction of 15 Pentium III and Pentium III Xeon processors. In early 2000 a one-gigahertz Pentium III chip hit the market. Later in 2000 came the debut of the next generation processor for the early 21st century, the Itanium, the company’s first 64-bit processor, which was initially designed to meet the needs of powerful Internet servers. With its continuing development of ever more powerful processors and its aggressive expansion into other key technology areas, Intel appeared certain to remain one of the linchpins of the information economy in the new millennium.

Intel Corporation

Intel has changed the world dramatically since it was founded in 1968: the company invented the microprocessor, the “computer on a chip” that has made everything from the first handheld calculator to today’s powerful personal computers possible.

Intel’s founders, Robert Noyce and Gordon Moore, were among the eight founders of Fairchild Semiconductor in 1957. Noyce is the co-inventor of the integrated circuit, and Moore made some of the basic discoveries that led to it. In 1968 the two men, feeling frustrated by Fairchild’s size, decided to leave to form their own company. They were joined soon after by Andrew Grove, who had been at Fairchild since 1963 and is president and CEO of Intel today.

Noyce and Moore asked Arthur Rock, a venture capitalist who had helped start Fairchild Semiconductor as well as Teledyne and Scientific Data Systems, for help in raising the money to start their company. With a one-page business plan saying simply that they were going into large-scale integrated circuits, they soon had $3 million in capital. The company was incorporated on July 18, 1968 as N M Electronics (for Noyce Moore), but quickly changed its name to Intel, from the first syllables of “integrated electronics.” Intel gathered another $2 million in capital before going public in 1971.

Despite their scanty business outline, Noyce and Moore had a clear plan for their company. They planned to produce large-scale integrated (LSI) semiconductor memories. At that time, semiconductor memories were ten times more expensive than standard magnetic core memories. But costs were dropping, and Intel’s founders felt that with the greater speed and efficiency of LSI technology, semiconductors would soon replace magnetic cores.

Within a few months of its startup, Intel was able to produce the 3101 Schottky bipolar memory, a high-speed random access memory (RAM). The 3101 proved popular enough to sustain the company until the 1101, a metal oxide semiconductor (MOS) chip, was perfected and introduced in 1969.

Intel’s next product, introduced in 1970, was the 1103, a 1K dynamic RAM, or DRAM. It was the first chip large enough to store a significant amount of information; today DRAMs are indispensable to every computer. With the 1103, Intel finally had a chip that really did begin to replace magnetic cores.

But the company’s most dramatic impact on the computer industry didn’t come until the introduction of the 4004, the world’s first microprocessor. Like many of Intel’s innovations, the microprocessor was the by-product of a search for something else. When a Japanese calculator manufacturer asked Intel to design the chips for a series of calculators, an engineer named Ted Hoff was assigned to the project. In his search for cost-effectiveness, he eventually conceived a plan for a central processing unit (CPU) on one chip. The 4004, which crammed 2,300 transistors onto a one-eighth- by one-sixth-inch chip, had the power of the old 3,000-cubic-foot ENIAC computer, which depended on 18,000 vacuum tubes. The microprocessor was born.

Although at first Intel saw the new device as a way to sell more memory, it was soon clear that the microprocessor held great potential for everything from calculators to cash registers and traffic lights. With the 1972 introduction of the 8008, an 8-bit microprocessor developed along with the 4004 but oriented toward data and character (rather than arithmetic) manipulation, microprocessors were off and running. The 8080, introduced in 1974, was the first truly general-purpose microprocessor. For $360, Intel sold a whole computer, on one chip, when real computers cost thousands of dollars. The response was overwhelming. The 8080 soon became the industry standard and Intel the industry leader in the 8-bit market.

Competitors began to produce 8-bit microprocessors quickly, however. Intel responded with the 8085, a faster chip with more functions. Meanwhile, the company was working on two more advanced projects, the 32-bit 432 and the 16-bit 8086. The 8086 was introduced in 1978, but it took two years for it to catch on. In that time, Motorola produced a competing chip (the 68000) that seemed to catch on faster. Intel responded with a massive sales effort to establish its architecture as the standard. When, among other things, IBM chose the 8088, the 8086’s 8-bit cousin, for its personal computer in 1980, Intel’s battle over architecture was won.

In the meantime, in 1971 Intel had also developed the erasable programmable read-only memory (EPROM), another revolutionary but unintended research by-product. An Intel physicist named Dov Frohman was working on the reliability problems of the silicon gate used in the MOS process when he realized that the disconnected, or “floating,” gates that were causing malfunctions could be used to create a chip that was erasable and reprogrammable.

Standard ROM chips had to be permanently programmed during manufacture; to make a change you had to manufacture a whole new chip. With EPROM, Intel could offer customers chips that could be erased and reprogrammed with ultraviolet light and electricity. At its introduction in 1971, EPROM was a novelty without much of a market. But the microprocessor, invented at the same time, created a demand for memory; the EPROM offered memory that could be conveniently used to test microprocessors. It was a match made in heaven.

Another major development at Intel during the 1970s was peripheral controller chips. Streamlined for specific tasks and stripped of unneeded functions, peripheral chips could greatly increase a computer’s abilities without raising software-development costs. One of Intel’s most important developments in peripherals was the co-processor, first introduced in 1980. Co-processor chips are an extension of the CPU that can handle specific compute-intensive tasks more efficiently than the CPU itself. Once again, innovation kept Intel ahead of its competition.

Intel’s rapid growth, from the 12 employees at its founding in 1968 to 15,000 in 1980, demanded a careful approach to corporate “culture.” Since Noyce, Moore, and Grove had left Fairchild because its size had created frustrating bureaucratic bottlenecks, defining a workable management style was important to Intel’s founders. Informal weekly lunches with employees kept communication lines open while the company was small, but the system didn’t last for long. Instead, a carefully outlined program emphasizing openness, decision making at the lowest levels, discipline, and problem solving rather than paper shuffling has preserved Intel’s remarkable ability to innovate. Intel makes no bones about the kind of employees it looks for. In an interview with the Harvard Business Review in 1980, Noyce said, “We expect people to work hard. We expect them to be here when they are committed to be here; we measure absolutely everything that we can in terms of performance.” Until recently, employees who arrived after 8:10 a.m. signed a late list. But this strict attitude is balanced by company commitment. Incentives include options on Intel stock, and breakthroughs are celebrated with custom-bottled champagne (“Vintage Intel” marked the first $250 million quarter, in 1983—the year sales reached $1 billion for the first time).

The 1974 recession, during which Intel laid off 30% of its employees, was traumatic enough that when 1981 rolled around, Intel accelerated new product development instead of laying off workers, adopting the “125% Solution” (asking exempt employees to work two extra hours a day, without pay, for six months) in hopes of fighting its way through the recession poised to tackle the next upturn. When a brief surge in sales in 1982 didn’t last, again instead of layoffs, Intel imposed pay cuts of up to 10%. Such measures weren’t universally popular, but by June 1983 all cuts had been restored and retroactive raises had been made. And in December 1982, IBM paid $250 million for a 12% share of Intel, giving the company not only a strong capital boost but also strong ties to the undisputed industry leader (IBM eventually increased its stake to 20% before selling its Intel stock in 1987).

During the early 1980s, Intel began to slip in some of its markets. Fierce competition in DRAMs, static RAMs, and EPROMs left Intel concentrating on microprocessors. While competitors claimed that Intel simply gave away its DRAM market, Moore told Business Week in 1988 that his company chose microprocessors as the field with the most promise.

The company also struggled during the mid-1980s to overcome its reputation for arrogance in marketing. Customer service, an area Intel had been able to overlook for years as it dominated its markets, became more important as highly efficient Japanese manufacturers and other increasingly innovative competitors challenged Intel’s position. In addition, Intel’s manufacturing record, strained in years past by undercapacity, needed fixing. Fab 7, Intel’s seventh wafer-fabrication plant, opened in 1983 only to face two years of troubled operations before reaching full capacity. Between 1984 and 1988, Intel closed eight old plants, and in 1988 it spent some $450 million on new technology to bring its manufacturing capacity into line with its developmental prowess.

But in the microprocessor market, Intel continued to excel. In 1982, Intel introduced its 80286 microprocessor, the chip that, when IBM came out with the 286-powered PC/AT, quickly came to dominate the upper-end PC market. The 286 was followed in 1985 by Intel’s 80386 chip, popularized in 1987 by the Compaq 386; despite bugs when it first came out, the 386 is one of the most popular chips on the market today. While the 286 brought to the PC a speed and power that gave larger computers their first real challenge, the 386 offered even greater speed and power together with the ability to run more than one program at a time. And in 1989 Intel introduced the 80486, a chip Business Week heralded as “a veritable mainframe-on-a-chip.”

In designing the i486, Intel resisted an industry trend toward RISC (reduced instruction-set computing), a way of designing chips that eliminates rarely used instructions to gain speed. Intel argued that what RISC chips gained in speed they lost in flexibility—and the ability to run the software already on the market, something Intel feels will secure the 486’s position despite whatever speed competitors can offer through non-compatible RISC chips. But a new chip, the 64-bit i860 announced in early 1989, does make use of RISC technology to offer what Intel has claimed will be a “supercomputer on a chip.”

Also in 1989 an important lawsuit Intel had filed against NEC in 1984 was decided. Intel had claimed that NEC violated its copyright on the microcode, or embedded software instructions, of Intel’s 8086 and 8088 chips, which Intel had licensed NEC to produce, when NEC designed a similar chip of its own. At issue was whether microcode could even be copyrighted. The court ruled that it could be—but that NEC had not violated any copyright in the case at hand.

While Intel faces strong competition both from domestic chipmakers like the giant Motorola and the smaller Sun Microsystems and MIPS Computer Systems, its place at the edge of technology is undisputed. Meanwhile, it has also begun to branch out from chipmaking. The company has begun to build computers of its own, including a parallel processing one. It also has a growing systems business, providing customers with everything they need to build computer systems of their own.

Indeed, it is not technology, but more mundane matters like manufacturing, marketing, and management that are likely to pose challenges for Intel in the 1990s. All three of the company’s founders will be looking toward retirement in the next few years, and how the company handles its first leadership transition may make as much difference to its future as the speed of its next chip.

Intel Corporation

Computer Sciences
COPYRIGHT 2002 The Gale Group Inc.

Intel Corporation

The Intel Corporation of Santa Clara, California, was founded in 1968 by Robert Noyce (1927–1990), co-inventor of the integrated circuit, and his colleague Gordon Moore (1929–), the originator of "Moore's law." The name Intel was a shortened form of Integrated Electronics. Noyce and Moore were joined by Andy Grove, and the three, all formerly from Fairchild Semiconductor, led the firm on its initial mission to produce the world's first semiconductor-based memory chips. The company went on to commercialize the microprocessor, the product that Intel is best known for today.

In 1969 a Japanese manufacturer, Busicom, commissioned Intel engineers to design a set of a dozen custom chips for its new family of scientific calculators. At the time, all logic chips were custom-designed for each customer's product. Logic chips perform calculations and execute programs, unlike memory chips, which store instructions and data.

Intel engineer Marcian "Ted" Hoff improved on Busicom's idea. Instead of designing twelve custom chips, he recommended a set of only four chips for logic and memory, which featured a central processing unit. Although Busicom was satisfied with this alternative approach, Hoff realized its full potential—the design team had created the first general-purpose microprocessor chip, though the term "microprocessor" would not appear for many years.

But, there was a problem. Intel did not own the rights to the new technology—Busicom did. Hoff urged company officials to buy the design from its former client. But others in the company claimed that Intel's future lay in fast and inexpensive memory chips, not in logic chips. Eventually, Hoff's side won by arguing that the success of the new logic chips would enhance the market for memory chips.

Busicom, strapped for cash, agreed to sell the rights for the four-chip set for $60,000. Intel used that agreement as the basis for its microprocessor business, eventually becoming a powerful global corporation. Sales in 2000 reached $33.7 billion.

In 1971, armed with its new technology, Intel introduced the model 4004 microprocessor, which sold for $200 and could perform 60,000 operations per second. It was the size of a thumbnail, featured 2,300 transistors on a sliver of silicon, and could deliver the same amount of computing power as the first electronic computer, ENIAC. In 1972 the model 8008 microprocessor featured 3,500 transistors. Although that was powerful at the time, it was primitive compared to the Pentium 4 processor offered in 2000, which had 42 million transistors.

A series of chips followed, each more powerful than the previous one. By 1981 Intel's 16-bit 8086 and 8-bit 8088 processors took the design world by storm, winning an unprecedented 2,500 design awards in a single year. That year, IBM selected the Intel 8088 microprocessor to run its first desktop personal computer, the IBM-PC.

The significance of the IBM alliance was not immediately evident. An Intel sales engineer who worked on the IBM project said, "At the time, a great account was one that generated 10,000 units a year. Nobody comprehended that the scale of the PC business would grow to tens of millions of units every year." The success of the IBM-PC helped change the company's direction. In 1986 Intel left the memory-chip market to focus on microprocessors and, under the leadership of Andy Grove, who succeeded Moore as CEO in 1987, the company became the world's dominant supplier of microprocessors.

Moore's law, which predicts ever-more complex circuits, drives Intel's designers. By constantly reducing the size of transistors within chips, Intel has reduced their cost. Smaller chips are cheaper because more of them can be made from a single expensive silicon wafer. There are additional benefits. Smaller chips work faster, system reliability is increased, and power requirements are reduced.

To make these tiny chips successfully, Intel's manufacturing technology has had to improve constantly. The earliest chips were made by workers wearing smocks. By 2001, microprocessors were created in sterile environments, called cleanrooms, that are thousands of times cleaner than those of twenty-five years ago. Robots move the silicon wafers from process to process. Operators working in these cleanrooms wear non-linting, anti-static fabric, called bunny suits, with face masks, safety glasses, gloves, shoe coverings, and even special breathing equipment.

As Intel grew to become the world's largest chipmaker, its dominant market share did not go unnoticed by competitors and the federal government. In 1998 the Federal Trade Commission announced an investigation into allegations of anti-competitive business practices. The company cooperated fully during the nine-month inquiry. The case was settled before it went to court.

Intel continues to probe the limits of microprocessor design. In 2000 company engineers demonstrated a 0.13-micron process technology using ultra-small transistor gates and the thinnest of thin films. In time, this advance will allow the company to manufacture chips with transistors that are approximately 1/1000th the width of a human hair.

Time magazine named Intel CEO Andy Grove, a Hungarian immigrant born András Gróf, its 1997 Man of the Year, calling him "the person most responsible for the amazing growth in the power and innovative potential of microchips."

Intel

Intel A US corporation that is a leading manufacturer of integrated circuits (chips), particularly noted for its important range of microprocessor chips. The current range is shown in the table. The Pentium processor, Intel's most highly integrated semiconductor device, together with the previous generations – the Intel486 and Intel386 processors – run most current operating systems and support leading graphical user interfaces. In the Intel486 and Intel386 range, the standard DX suffix is replaced by SX to denote a lower-performance CPU without a built-in mathematics coprocessor, while the SL suffix is for a variant with low power consumption for mobile computers. The DX2 and DX4 ranges have doubled and tripled internal clock speeds respectively. The clock rates indicated (June 1995) are undergoing frequent upward modification.

All Intel486 and Intel386 processors are informally known by the numbers alone, and all used to have an 80 prefix. For instance, 386, 80386, Intel386 (Trademark), and i386 (Trademark) are synonymous. Preceding the 80386 range were the 80286, the 8086, and the 8088. Processors from the 8086 to the i486SX have optional math coprocessors distinguished by having a 7 in their number instead of a 6; hence the i387, i487.

Intel was the first manufacturer of microprocessors with the 4004 and 8008 chip sets. The original IBM PC and its successors and clones all used Intel processors or copies of them. In addition to its processor chips, Intel also sells system products, including both board-level products and the Paragon range of supercomputers. It is ranked number 41 in terms of revenue in the list of the world's top IT suppliers (1993 figures).
