The World Wide Web was born twenty years ago last month, when Sir Tim Berners-Lee, who at the time was working at CERN, wrote a paper that bore the innocuous title “Information Management: A Proposal.” The foundations of the Web, however, go back as far as 1945, when a scientist named Vannevar Bush wrote an article in The Atlantic Monthly entitled “As We May Think.” Bush was concerned about the future of scientific endeavors as the world emerged from the Second World War. He noted that scientists had come together for the common good during the war, but as the war effort ended they were likely to head back to institutions where competition would replace cooperation. Bush was also concerned about how science was going to keep track of the vast amounts of new knowledge being created. He wrote:
“Science has provided the swiftest communication between individuals; it has provided a record of ideas and has enabled man to manipulate and to make extracts from that record so that knowledge evolves and endures throughout the life of a race rather than that of an individual. There is a growing mountain of research. But there is increased evidence that we are being bogged down today as specialization extends. The investigator is staggered by the findings and conclusions of thousands of other workers—conclusions which he cannot find time to grasp, much less to remember, as they appear. Yet specialization becomes increasingly necessary for progress, and the effort to bridge between disciplines is correspondingly superficial.”
Bush had good reason to be concerned about the loss of cross-discipline cooperation. As long-time readers of this blog know, I’m a fan of what Frans Johansson calls the Medici Effect — the burst of innovative ideas that occurs when people from different disciplines work together to solve problems. (For more on that subject see my post entitled The Medici Effect.) Bush also asserted that “methods of transmitting and reviewing the results of research are generations old and by now are totally inadequate for their purpose.” Since digital technologies had yet to be invented, Bush turned to photography, specifically microfilm, as a means of storing and retrieving vast amounts of information. “The Encyclopoedia Britannica could be reduced to the volume of a matchbox,” he wrote. “A library of a million volumes could be compressed into one end of a desk.” He also noted that indexing systems were wholly inadequate for storage and retrieval, and he anticipated the day when information would be retrieved by association rather than by index, the principle that hypertext would later embody. “Selection by association, rather than indexing,” he wrote, “may yet be mechanized.”
His solution was a photo-electrical-mechanical device called a Memex, for memory extension, which could make and follow links between documents on microfilm. As Bush explained it:
“Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, ‘memex’ will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.”
The next steps toward the creation of the World Wide Web were taken by J.C.R. Licklider, who worked for the Pentagon’s Advanced Research Projects Agency (ARPA, now called DARPA). He wrote two important papers, “Man-Computer Symbiosis” (1960) and “The Computer as a Communication Device” (1968, co-authored with Robert Taylor), and he was influential in setting the funding priorities that would lead to the Internet and to the invention of the mouse, windows, and hypertext. Licklider’s papers can be downloaded by clicking this link. Licklider wrote this summary of his first paper:
“Man-computer symbiosis is an expected development in cooperative interaction between men and electronic computers. It will involve very close coupling between the human and the electronic members of the partnership. The main aims are 1) to let computers facilitate formulative thinking as they now facilitate the solution of formulated problems, and 2) to enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs. In the anticipated symbiotic partnership, men will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluations. Computing machines will do the routinizable work that must be done to prepare the way for insights and decisions in technical and scientific thinking. Preliminary analyses indicate that the symbiotic partnership will perform intellectual operations much more effectively than man alone can perform them. Prerequisites for the achievement of the effective, cooperative association include developments in computer time sharing, in memory components, in memory organization, in programming languages, and in input and output equipment.”
Licklider and Taylor made the astounding claim that “in a few years, men will be able to communicate more effectively through a machine than face to face.” There were others making similar claims. In 1962, Douglas Engelbart wrote a paper entitled “Augmenting Human Intellect: A Conceptual Framework.” In his abstract, he wrote: “One of the tools that shows the greatest immediate promise is the computer, when it can be harnessed for direct on-line assistance, integrated with new concepts and methods.” This was probably one of the first uses of the term “on-line” in association with interactive computing. Three years later, Theodor (Ted) Nelson coined the term “hypertext” and developed the accompanying concept in a paper entitled “A File Structure for the Complex, the Changing, and the Indeterminate.” In 1968, Engelbart was touting the use of computers for online conferencing and collaboration and demonstrated his groupware, NLS (for oNLine System). NLS was among the first systems to implement working hypertext, supporting the creation, storage, and retrieval of electronic documents in what amounted to an early digital library.
All the pieces were then in place for the 1969 creation of ARPANET (which eventually morphed into the Internet). ARPANET grew out of Cold War concerns about the Soviet nuclear threat. The Air Force, in particular, wanted to ensure that it could maintain communication with its nuclear force commanders even in the aftermath of a nuclear attack, and beginning in 1962 it asked the RAND Corporation to study the problem. RAND researcher Paul Baran recommended the development of networks based on packet switching as the answer. ARPA took up that challenge and in 1968 awarded the ARPANET contract to BBN. The actual network was constructed a year later, linking four nodes: the University of California at Los Angeles, SRI (the Stanford Research Institute), the University of California at Santa Barbara, and the University of Utah. In 1971, Ray Tomlinson of BBN created a program that permitted the exchange of email across the network. Progress continued apace. Development soon began on a new internetworking protocol (later to be called TCP/IP), led by Vinton Cerf of Stanford and Bob Kahn of DARPA. The importance of this new protocol was that it allowed diverse computer networks to interconnect and communicate with each other. In 1974, Cerf and Kahn started using the term Internet. Ethernet (invented at Xerox PARC in 1973), TCP/IP, and the Internet were now all in place.
It was at this point that Berners-Lee entered the story. While consulting for CERN in 1980, he wrote a notebook program called ENQUIRE (named after the Victorian how-to book Enquire Within Upon Everything), which allowed links to be made between arbitrary nodes. In 1983, Paul Mockapetris introduced the Domain Name System (DNS). Berners-Lee returned to CERN in 1984, and five years later he wrote the proposal noted at the beginning of this post. He is now based at the Massachusetts Institute of Technology, where he runs the World Wide Web Consortium. Now, twenty years on, The Economist looks back to see how the world has changed as a result of the birth of the World Wide Web [“What’s the score,” 12 March 2009].
“The web, as everyone now knows, has found uses far beyond the original one of linking electronic documents about particle physics in laboratories around the world. But amid all the transformations it has wrought, from personal social networks to political campaigning to pornography, it has also transformed, as its inventor hoped it would, the business of doing science itself. As well as bringing the predictable benefits of allowing journals to be published online and links to be made from one paper to another, it has also, for example, permitted professional scientists to recruit thousands of amateurs to give them a hand. One such project, called GalaxyZoo, used this unpaid labour to classify 1m images of galaxies into various types (spiral, elliptical and irregular). … Another novel scientific application of the web is as an experimental laboratory in its own right. It is allowing social scientists, in particular, to do things that would previously have been impossible.”
The World Wide Web has dramatically increased the study and understanding of networks. Google has become a household word. MySpace, Facebook, and Twitter are connecting people in ways that were impossible 20 years ago. Berners-Lee has been deservedly recognized for his efforts (including a knighthood) and his place in history has been secured. Some people lament the growth of the pornography and gambling industries as a result of the World Wide Web (but a lot of people are paying a lot of money to keep them growing). There is, however, much more to praise about the World Wide Web than there is to criticize. It has fostered an electronic economy that has helped spread the benefits of globalization to remote parts of the Earth. It has helped shine the light of public scrutiny on the acts of despots. It has raised awareness of events occurring around the world that affect the everyday lives of billions of people. All in all, the World Wide Web’s twentieth birthday is an event worth celebrating.