FROM ANDREW’S HTS 3083 TERM PAPER — The 20th century has seen giant leaps in several technologies, stemming from a massive increase in research and development on all fronts.
With the driving force of two world wars and the Cold War, it became ever more important for countries to spend billions of dollars on developing new technologies that would drive their societies, both militarily and culturally, toward dominance.
The most important of these technologies is without a doubt the computer and its permeation into the personal market. The computer’s evolution was never defined by a single invention, but rather by a push toward a singular goal: the personal computer. Closer examination of the computer’s history reveals an important trend in which affordable and powerful personal computers were the first step in a digital revolution that would drive society, both technologically and culturally, into the 21st century.
It is difficult to reduce the computer to a single invention that can be analyzed on its own. The computer is not a singular technology created overnight, but rather the major basis of an electronic and computing system. This system is not separate from human society and culture; it would instead ingrain itself into the very fabric of society. Therefore, this paper will focus on the introduction of two key inventions that led to the evolution of this computing system: the transistor and the microprocessor, two major components that define the computer and electronic era.
The first step in understanding the evolution of the computer is the study of engineering and technological efforts after the Second World War. With inventions such as radio and radar becoming increasingly valuable in the defense market, governments began to view technology as a vital player in the international marketplace. Government spending on research and development increased dramatically and drew the private sector into the game. It also became clear that scientists would play a key role in the post-WWII era.
One of the main hotbeds for this research was Bell Laboratories, an offshoot of AT&T and Western Electric’s research laboratories. Immediately after the war, M. J. Kelly, then director of research at Bell Labs, decided to investigate the role of semiconductors, materials that had already seen several uses; they were common components in crystal radios before the electron tube was invented. The main goal of this research was to “gain a deeper understanding of the physics of these substances (semiconductors).”
The researchers began to realize the potential of such materials and decided to take another look at a device that could control the flow of electrons in solids. This research eventually led to the invention of the transistor by Walter Brattain and John Bardeen. In an article published in the Scientific Monthly, Ralph Bown, Bell Labs’ director of research the year the transistor was invented, stated that when the laboratory publicly announced the transistor, it remained quiet about its hopes and dreams and instead announced only what it could certifiably state as fact.
In retrospect, Bown states that announcing only the tasks the transistor had already accomplished led to misunderstanding and a lack of interest on the part of the news media reporting on the invention.