Major technological advances that allow computers to run faster, shrink in size and become more efficient delineate the five generations of computers. First-generation computers used vacuum tubes, second-generation computers had transistors, and third-generation computers advanced to integrated circuits. Fourth-generation computers contain microprocessors, and future, fifth-generation devices will focus on artificial intelligence.
First-generation computers contained vacuum tubes to transmit electricity and magnetic drums as a memory-storage medium. These types of computers took up entire rooms. This generation lasted from 1940 to 1956.
Second-generation devices became smaller thanks to transistors. Transistors were more energy efficient, but computers still generated a lot of heat. The binary machine languages of the first generation gave way to assembly languages and then to early high-level languages such as COBOL and FORTRAN. The second generation lasted from 1956 to 1963.
The third generation of computers, from 1964 to 1971, developed integrated circuits on silicon chips to provide processing power. People used keyboards and monitors to interact with the processors instead of punched cards.
The fourth generation, which began in 1971, features microprocessors that pack thousands of integrated circuits onto a single tiny silicon chip. IBM introduced its first computer for the home user in 1981, and this generation of computers also includes small, handheld devices such as tablets and smartphones.
The theoretical fifth generation of computers centers on artificial intelligence. Computer engineers believe quantum computation, voice recognition, parallel processing and nanotechnology will lead to computers that respond instantly to natural human speech and learn on their own.