Note on Generations of Computers

Computer Generations

The history of the computer is often described in terms of its generations. Each generation is marked by a key technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more reliable devices. This division of computers according to development period, memory, processing speed, efficiency, storage and so on is called a computer generation. There are five computer generations:

First Generation (1946-1958)

First generation computers used vacuum tubes for circuitry and magnetic drums for memory. They were huge and expensive to operate. Because they consumed a great deal of electricity, they generated a lot of heat, which often caused malfunctions in the system. Examples: ENIAC, UNIVAC, Mark I.

Features:

They used vacuum tubes/valves as their main electronic components.

Storage capacity was limited to 1 KB to 4 KB.

They used machine-level language for programming.

Processing speed was in milliseconds.

They used magnetic drums for primary memory.

Drawbacks:

They were difficult to maintain.

There was no facility for linking programs.

Logical programming was difficult.

Second Generation (1959-1964)

The transistor, invented in 1947, did not see extensive use in computers until it replaced the vacuum tube in this generation. The transistor was far superior to the vacuum tube, making computers smaller, faster, cheaper, more energy-efficient and more reliable than first generation computers. Examples: IBM 1401, UNIVAC II, IBM 1620.

Features:

They used transistors in place of vacuum tubes. A single transistor was roughly equivalent to 1,000 vacuum tubes.

The speed of processing increased to microseconds.

They used the magnetic core as primary memory and magnetic tapes as auxiliary memory.

They were much smaller and more reliable.

They used assembly language for programming.

Third Generation (1965-1974)

The development of the integrated circuit (IC) was the major turning point of the third generation. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Examples: IBM 360, PDP-8, etc.

Fourth Generation (1975-1990)

The development of the microprocessor gave rise to the fourth generation of computers. A microprocessor has thousands of integrated circuits built onto a single silicon chip. The Intel 4004 chip, developed in 1971, was the first microprocessor.

Features:

Microprocessors were used in place of discrete transistor circuits. Large Scale Integration (LSI), with thousands of transistors on a chip, and Very Large Scale Integration (VLSI), with hundreds of thousands of transistors on a chip, made the microprocessor possible.

The speed of processing increased to picoseconds; billions of instructions could be processed in a second.

They are very user-friendly computers that support many high-level languages for programming.

Fifth Generation (1991-present)

Fifth generation computers are based on Artificial Intelligence (AI) and are still in development. Since the 1990s, computers that support voice recognition systems (VRS) have been developed.

Features:

These computers will use parallel processors made from superconductors, Gallium Arsenide (GaAs) or biochips.

They will possess Artificial Intelligence (AI).

They will be able to accept input in natural language.

Artificial Intelligence (AI)

It is the technology that will be used in fifth generation computers. A computer having AI will be able to understand natural language, think and make decisions.

Artificial Intelligence is a branch of computer science concerned with making computers able to reason, learn and understand natural language like a human being. Robots are made using this technology.

Fifth generation computers will be based on biochips and superconductor chips. They will possess Artificial Intelligence and will be able to program themselves according to the user's instructions. IBM's Deep Blue, the chess-playing computer, is an example of a computer with Artificial Intelligence.