Computing: Base of the Internet
At rare moments in history, certain ideas overturn conventional wisdom and lead to innovations that improve the quality of the lives we lead. Several such ideas have led to great discoveries and inventions in physics, communication and computer technologies, and have paved the way for the emergence of the Internet as we use it today.
The impact of such critical ideas, often ignored or even ridiculed at first, is rarely felt immediately. When many of these inventions were made, it was impossible to envisage how they would morph into innovative technologies and what uses they would support. A telling comment on the nature of such epoch-making inventions came from Heinrich Hertz (1857-1894), who detected radio waves in 1888. He is reported to have said, “I do not think that the wireless waves I have discovered will have any practical application”! This shows how inventions are first perceived, then refined and put to use over generations as the state of the art changes. But that does not diminish the value of these strokes of genius. Several such innovative ideas, inventions and discoveries have contributed to the technologies underlying the Internet.
The emergence of computers is often taken as the starting point for telling the story of the Internet. Computers, which set the base for the development of the Internet, evolved from counting devices. Even though the first counting device can be traced back to the Tigris-Euphrates valley 5,000 years ago, systematic attempts at accurate mechanical calculation appeared only in the 17th century, in Europe and later in America. The pioneers were individuals whose work, often bordering on the eccentric, hardly attracted a wide following.
The manipulation of numbers was followed by processing of instructions – a transformation made possible by the path-breaking invention of the transistor, the integrated circuit and the microprocessor. Processing called for software (essentially operating instructions), which played a key role in the evolution of computers.
Blaise Pascal in France designed and built the first workable, automatic calculating machine (1642). He was followed by Leibniz in Berlin (1666), who is credited with the first general-purpose calculating device to meet the needs of mathematicians and bookkeepers. It had all the elements of a modern keyboard, but it could not have been constructed economically; paper and pencil would have been cheaper for the job.
Less successful was Charles Babbage (1791-1871) in England, who from 1833 onwards made thousands of detailed drawings that projected the fundamentals of today’s computers. This computing pioneer designed the first automatic computing device, known as the Difference Engine, and a programmable Analytical Engine, though he failed to build either. A Swedish inventor completed work on the Difference Engine in 1854. Babbage was far ahead of his time. His Analytical Engine was interpreted in print by Lady Ada Lovelace, now considered to be the first programmer; she even corrected Babbage’s calculations. The first complete Babbage engine was built in London in 2002.
Expanding on Leibniz’s work, the English mathematician George Boole laid the groundwork for today’s information theory in 1854. Boolean algebra could later be applied to problems of telephone switching, and the theory built on it led to reliable communications with low error rates.
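The link between Boole’s two-valued algebra and switching circuits (made explicit decades later by Claude Shannon) can be sketched in a few lines. The function names here are illustrative, not from any historical source:

```python
# A sketch: Boolean algebra models a switching circuit.
# Two switches wired in series behave like AND; in parallel, like OR.

def series(a: bool, b: bool) -> bool:
    return a and b   # current flows only if both switches are closed

def parallel(a: bool, b: bool) -> bool:
    return a or b    # current flows if either switch is closed

# De Morgan's law, one of Boole's identities, checked exhaustively:
for a in (False, True):
    for b in (False, True):
        assert (not series(a, b)) == parallel(not a, not b)
```

Because each variable takes only two values, any claimed identity can be verified by exhaustively checking every combination, which is exactly what makes the algebra so well suited to engineering.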
Almost a century later, the ideas of Babbage were extended by Alan Turing (1912-1954) in England and John von Neumann (1903-1957) in the United States. Their efforts led to the development of the modern electronic digital computer.
Turing published a landmark paper in 1936, “On Computable Numbers”, which described an abstract universal computing machine. The code-breaking machines he went on to design helped crack the German secret codes during the Second World War. The machine was remarkably fast; for instance, in 15 minutes it could do a calculation that would, if done by hand, fill half a million sheets of foolscap paper. Turing’s ideas were a source of inspiration to American computer scientists in later years.
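The abstract machine of the 1936 paper — a read/write head stepping along an unbounded tape under a finite table of rules — can be sketched in modern code. This is a hedged illustration with invented names and encoding, not Turing’s own formulation:

```python
# A minimal Turing machine sketch. A rule maps (state, symbol) to
# (symbol to write, head move "L"/"R", next state).

def run(tape, rules, state="start", halt="halt", blank="_"):
    cells = dict(enumerate(tape))   # sparse tape; unwritten cells are blank
    pos = 0
    while state != halt:
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example: a three-rule machine that flips every bit, then halts
# when it reaches the blank past the end of the input.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("1011", flip))  # prints "0100"
```

The striking point of the 1936 paper is that one fixed machine of this kind — fed another machine’s rule table as data on its tape — can simulate any of them, which is the idea behind the stored-program computer.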
It is now recognized that Alan Turing’s contribution went unacknowledged and unappreciated for a long time. Keeping his role secret for decades has in fact blurred Britain’s contribution to the advancement of computing. The Allied success in the Second World War would have been delayed considerably but for the machines Turing helped build to break the German codes. His contribution marked a major advance in computing.
Timeline of Inventions that led to the creation of the World Wide Web (WWW)
1642 – Blaise Pascal invents the first mechanical adding machine.
1666 – Isaac Newton studies the different colours in white light using a prism.
1752 – Benjamin Franklin shows that lightning is a form of electricity.
1768 – Leonhard Euler shows that the wavelength of light determines its colour.
1787 – Antoine Lavoisier proposes that silica contains a then-unknown element, later named silicon.
1801 – Joseph Jacquard uses punched cards to create patterns on fabric, a technique later adopted in early computers.
1838 – Samuel Morse demonstrates Morse code for telegraphy.
1840 – Charles Babbage designs the first general purpose programmable mechanical computer.
1843 – Ada Lovelace writes the first computer programme.
1848 – Kelvin develops his temperature scale.
1852 – Submarine telegraph cable laid under the English Channel.
1865 – James Clerk Maxwell describes the electromagnetic waves.
1866 – A telegraph cable spans the Atlantic Ocean.
1869 – Mendeleev formulates the Periodic Table of Elements.
1888 – Heinrich Hertz detects radio waves.
1900 – Max Planck proposes the quantum theory; Einstein later argued that light itself comes in bundles of energy called quanta, or photons.
1901 – First Transatlantic radio signal sent.
1904 – Fleming invents the vacuum tube diode.
1905 – Einstein explains the photoelectric effect.
1910 – Edison develops motion pictures with sound.
1911 – Ernest Rutherford discovers that an atom has a nucleus and orbiting electrons.
1913 – Niels Bohr proposes that electrons travel around the nucleus in fixed energy levels.
1923 – Vladimir Zworykin develops the iconoscope, a cathode-ray tube used in television.
1928 – John Logie Baird transmits the first clear television image.
1947 – John Bardeen, Walter Brattain and William Shockley invent the transistor.
1952 – Charles Townes, and independently Prokhorov and Basov, invent the maser.
1958 – Jack Kilby and Robert Noyce independently invent the integrated circuit.
1961 – Optical fibre transmits laser light signals.
1965 – Ted Nelson envisages using the computer to write in a non-linear format (hypertext) and to retrieve a document from a short quotation.
1969 – Ted Hoff develops the microprocessor.
1970 – Silicon chips used for making electronic circuits.
1973 – First cell phone used.
1975 – First personal computer developed.
1977 – Fibre-optic cables first carry live telephone traffic.
1978 – First Global Positioning System (GPS) satellite launched.
1983 – The ARPANET adopts TCP/IP, forming the basis of the Internet.
1991 – World Wide Web developed.