Jack Kilby, the man whose invention of the integrated circuit in 1958 led to the power of today’s microchip, died Monday at age 81. Shortly after being hired at Texas Instruments, Kilby designed a wafer-thin crystal platform (the chip) that integrated components such as transistors, capacitors, and resistors, previously connected by individual wires, into a single processing unit. This allowed for greater processing speeds and, most importantly, mass production of microchips.
While Kilby claimed over 60 patents in his 25-year career with Texas Instruments, his first, the invention of the microchip, has by far had the greatest impact on modern life. Kilby also invented the handheld calculator and was one of the inventors of thermal printing, which was used by millions of office workers in early fax machines.
Kilby was not alone in his microelectronic fascination. Within a year, another innovator, Robert Noyce, then at Fairchild Semiconductor and later a co-founder of Intel, had also created a microchip. Noyce’s innovation was to use silicon as the basis of the chip, along with pioneering the planar chip printing process still used to connect components today. In 1965, Noyce’s colleague Gordon Moore, who would later co-found Intel with him, noted that engineers were able to double the number of transistors on a circuit (microchip) approximately every 24 months. This observation continues to hold true and has come to be known as Moore’s Law.
The invention of the integrated circuit was the foundation of the information age. Practically everything we make and consume, from cell phones to agribusiness, has been affected by the microchip. Over the past 50 years, science and technology have served to exponentially improve our knowledge of the universe and of ourselves. The biggest barrier had always been the time required to process information. Kilby broke that barrier, and for that accomplishment he was awarded the Nobel Prize in Physics in 2000.