Mrs. Pam Abbott
Chemistry 131
November 13, 2013
The History of the Computer
For people living in the 21st century, the computer has become an essential tool. It can provide information, help correct the mistakes people make, and keep track of many things. It can also perform many different calculations, and this ability to calculate was one of the needs that sparked the idea of the computer so many years ago.
Every computer supports some form of input, processing, and output. This is true even of a device as simple as the abacus, where input, processing, and output are simply the acts of moving pebbles into new positions, seeing the changed positions, and counting. In a nutshell, this is what computing is all about: we input information, the computer processes it according to its basic logic or the program currently running, and it outputs the results.
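To make this cycle concrete, the short Python sketch below is only a hypothetical illustration, not drawn from any machine discussed in this paper: it takes numbers as input, processes them by adding them together, much like counting pebbles on an abacus, and then outputs the total.

    # A minimal sketch of the input -> processing -> output cycle described above.
    # The "program" here is simply an addition rule.

    def process(numbers):
        """Processing step: apply the current 'program' (summation) to the input."""
        return sum(numbers)

    def main():
        # Input step: read whitespace-separated numbers typed by the user.
        raw = input("Enter numbers separated by spaces: ")
        numbers = [int(token) for token in raw.split()]

        # Processing step.
        total = process(numbers)

        # Output step: report the result.
        print("Total:", total)

    if __name__ == "__main__":
        main()

However simple, the sketch follows the same pattern as the machines described below: information goes in, a program acts on it, and a result comes out.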
The first generation of computers spans the 1940s and 1950s. The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was the ENIAC. It was digital, although it did not operate with binary code, and it was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supported input from an IBM card reader, and sent output to an IBM card punch. It took up "167 square meters, weighed 27 tons, and consum[ed] 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors" (Firdausi).
Transistors replaced vacuum tubes and ushered in the second generation of computers. Transistor computers consumed far less power, produced far less heat, and were much smaller than first-generation machines, although they were still large by today's standards. "The first transistor computer was created at the University of Manchester in 1953. The most popular of transistor computers was IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC" (Rao).
The invention of the integrated circuit, also known as the microchip, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. It also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties, microchips started making their way into computers, but the process was gradual, and the second generation of computers held on. Minicomputers appeared first; the earliest were still based on discrete, non-microchip transistors, while later models were hybrids built from both transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first- and second-generation computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the use of microchips in computers grew.
The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led toward single-chip CPUs, in which all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004. "The Intel 4004 chip, developed in 1971, located all the components of the computer" (Staff).
The first microcomputers were a weird bunch. They often came in kits, and many were essentially