FROM THE STUDYING MIND OF ANDREW–The breakthrough of the transistor heralded a new era for computers. First-generation computers such as ENIAC and EDVAC were massive machines that spanned whole rooms and contained thousands of vacuum tubes. While this design worked, it left little room for improvement: making such a machine faster meant making it bigger. The transistor, however, brought an age of miniaturization, effectively replacing the vacuum tube. The first transistors were built on a germanium semiconductor base and used junctions within the crystal, acting much like diodes, to control the flow of electrons; this design became known as the grown-junction transistor. However, it was not the final form. While the grown-junction transistor was important, the design was unstable. Germanium, despite holding the potential for great speed, proved too unreliable for commercial use. Silicon was later found to be a far more suitable semiconductor for transistors.
This discovery is what jump-started the transistor into the mass market. It is a perfect example of the evolutionary process that defines the world of computing. This is exactly the form of development that Ralph Gomory, mathematician and IBM executive, believes is important in studying computers. He finds that the world of computers is defined by an evolutionary process, as are almost all technological innovations. Yet, while the evolution of the transistor into its many forms was important, he believes that the creation of the transistor itself was a true revolutionary breakthrough, a term he admits he does not use lightly. Gomory states that this computing revolution is of a magnitude and importance equal to that of the steam engine.
The large computers that made up the first computer era were extremely costly, which often restricted them to government and research use. The main form of interaction with these computers was the punch card: holes punched in specific positions represented the binary logic behind a program. In time, computers began to utilize the transistor and various other components, becoming more reliable and manageable. However, these new transistor-based computers were not much of an improvement for the average consumer. It took a new invention to lead toward the second generation of computers.
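The idea behind the punch card can be sketched in a few lines of code. This is a toy illustration, not a historical card format (real formats such as IBM's 80-column cards used more elaborate zone-and-digit encodings): here each column is simply read as a pattern of holes, with a hole standing for a 1 bit and a blank for a 0 bit.

```python
def decode_column(column):
    """Read a column of hole marks ('*' = hole, '.' = blank) as a binary number."""
    bits = "".join("1" if mark == "*" else "0" for mark in column)
    return int(bits, 2)

# Three hypothetical card columns and the values they encode:
columns = ["..*.", ".**.", "*..*"]
values = [decode_column(c) for c in columns]
print(values)  # [2, 6, 9]
```

The programmer's job, in effect, was to work out these bit patterns by hand and punch them into cardstock, one instruction or datum at a time.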
In 1958, the integrated circuit was developed by Jack Kilby at Texas Instruments. The integrated circuit took all of the components of an electronic circuit and miniaturized them onto a single semiconductor substrate. The integrated circuit in turn led to the development of the microprocessor, an advanced integrated circuit that became known as the Central Processing Unit (CPU) of the computer. The first microprocessor to hit the market was Intel’s 4004 processor, released on November 15, 1971. The 4004 held 2,300 transistors and had a modest clock speed of 740 kHz.
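To put that 740 kHz figure in perspective, a quick back-of-envelope calculation gives the time per clock cycle, and (using a 3 GHz chip as a stand-in for a modern processor, purely for illustration) the rough ratio between the 4004's clock and a modern one:

```python
# Cycle time of the Intel 4004 at its 740 kHz clock.
clock_hz = 740_000
cycle_time_us = 1 / clock_hz * 1e6          # microseconds per cycle
print(f"{cycle_time_us:.2f} us per cycle")  # ~1.35 us

# Compared against an assumed 3 GHz modern clock:
modern_hz = 3_000_000_000
print(f"{modern_hz / clock_hz:,.0f}x faster clock")  # ~4,054x
```

In other words, each 4004 clock tick lasted over a microsecond, an eternity by today's standards, yet it was enough to run a complete, programmable computer on one chip.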
This evolution began to bring the world of computing into the average John Doe’s hands. In 1974, Arthur Robinson wrote an article on the shift of computing from the “maxi-level to the micro-level.” At that stage the microprocessor was still a relatively new device, yet he made several predictions about its far-reaching implications. At that point in time, the only real encounter the average person had with the world of computing was through the first hand-held calculators, which were just beginning to be sold (most notably by Texas Instruments).
However, Robinson saw other uses for the microprocessor, including factory machinery, computerized cash registers, and computer terminals. What is important to notice is his expectations for the future of computing. While he does not outright describe an abstract personal computer, he gets close to the idea. His idea is that the main users of the microprocessor and the microcomputer would be OEMs (original equipment manufacturers). OEMs would develop unique machines with microprocessors embedded in them to carry out individual jobs, each one finely tailored toward a specific goal or task. For example, he discusses supermarket terminals. These terminals would be used to track inventory, authorize credit cards, and look up prices. However, he stipulates that these terminals would be linked to a central computer, much as the terminals of first-generation computers had been.
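The architecture Robinson describes can be sketched as follows. This is a minimal, hypothetical model (the item codes, prices, and function names are invented for illustration): the terminal itself stores nothing and simply forwards each request to the central computer, which holds the price list and inventory.

```python
# State held only on the "central computer":
central_prices = {"1001": 0.59, "1002": 1.25}   # item code -> price
central_inventory = {"1001": 40, "1002": 12}    # item code -> stock on hand

def terminal_lookup(item_code):
    """A terminal-side request: price look-up plus an inventory decrement.

    The terminal keeps no local data; every sale is a round trip
    to the central machine, as in first-generation terminal setups.
    """
    price = central_prices.get(item_code)
    if price is None:
        return None                      # unknown item
    central_inventory[item_code] -= 1    # record the sale centrally
    return price

print(terminal_lookup("1001"))    # 0.59
print(central_inventory["1001"])  # 39
```

The design choice here mirrors the point the article makes: the intelligence in such a system lives in the central computer, while the microprocessor-driven terminal is a cheap, task-specific front end.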
Final installment to follow…