The Dream Machine

Author

M. Mitchell Waldrop

Year
2001

Review

Given the importance and prevalence of digital products, it’s easy to forget that the history of the computer and the internet is relatively short. The same is true of product management. Apple’s Lisa was launched in 1983 - the history of digital products is so short it almost feels like you could learn it all.

The author touches on the important moments and people that made modern computing and networking possible. This book became a springboard for me to read more about the early days of computing and design.

It’s incredibly hard to summarise - so forgive me for the below!

Key Takeaways

The 20% that gave me 80% of the value.

  • Licklider set the vision for ARPA, which made the Arpanet, TCP/IP, the internet, the windows-icons-mouse interface and the creative explosion at Xerox PARC possible.
  • Feedback (making observations and corrections) is essential to control. Effective control required realtime computers.
  • Von Neumann wrote a design doc for the stored-program concept that created a clean split between problem-solving software and the hardware. It's arguably the great idea of the computer age.
  • Shannon from Bell Labs published A Mathematical Theory of Communication. He separated the information from the meaning; the engineer's job was just to get the message from A to B, without distortion. His theorem explained how error-correcting modems could transmit data at tens of thousands of bits per second over ordinary (and relatively noisy) telephone lines. Digital would become synonymous with data quality and reliability.
  • Von Neumann invented much of computer science as we know it today. In hardware he defined RAM; it was revolutionary that you wouldn't have to wait for a position on a drum. In software he defined how to program computers and rewrote algorithms in ways that were easy for machines to understand.
  • Licklider was a fan of human factors → he wanted to understand how machines and humans could work together as a system.
  • Licklider was all about execution: connecting to real radar systems early, to learn fast. They didn't stay in the world of theory.
  • In the 1950s the number of people who could code was the bottleneck to human progress…
    • To create the first realtime operating system (before any computer languages existed to code it in) they calculated it would take 2,000 programmers, which was roughly the number of programmers in the entire country at that point. SAGE had to get into education.
  • Lick noticed that 85% of his time went on clerical or mechanical tasks that could be outsourced to a machine. Doing so would give him more thinking time.
    • A partnership: thinking together, sharing, dividing the load. Computers would do algorithms and calculations, humans would focus on creative heuristics. Together a greater whole, a symbiosis, a man-machine partnership. Together they could think as no human brain has ever thought and process data in a way no machine could.
  • Lick laid out the future of computing and the roadmap to get there
    • The future: Man-Computer Symbiosis
      • 30 years before the internet, the ideas were already here: personal computers, graphics, symbiosis, time sharing, collaborative computers, networks and online communities, universal access to knowledge.
    • What we needed to solve first: Time Sharing + Networks + Input and Output + Storage
  • Doug Engelbart decided to dedicate his 5.5 million minutes of working life to augmenting the human intellect.
    • Augmenting Human Intellect: A Conceptual Framework was about increasing the capability of humans to approach a complex problem situation, to gain comprehension and to derive solutions to problems. Manipulating words seemed like the place to start, because that was how to manipulate ideas.
  • Lick proposed the 'Intergalactic Computer Network' in 1963. He wanted to link together the ARPA community so everyone could build on each other's work, proposing to link all the time-sharing computers into a national network. Software could float free of individual machines; programs and data would live on the net. It was the most significant document Lick would ever write: this memorandum to the Intergalactic Network became the direct inspiration for the Arpanet, which would evolve into today's Internet.
  • By 1963 the CTSS on-line environment was starting to look like the internet. The open design was a really big idea: there were rules to follow to make the interchange possible, but freedom beyond that. PCs, operating systems and the internet would all become open platforms.
  • Rather than trying to create a system which had to operate perfectly, they decided to build a fault-tolerant system that could react to the unexpected. They started to think about processor clocks to keep things moving and backups so data wasn't lost.
  • Douglas Engelbart would change the world by demonstrating the oN-Line System (NLS) in front of the community. He showed what an intellectual worker could do with their own interactive computer and what value it could bring. They demonstrated a fast-responding GUI, word processing, information management, hyperlinks, a programming language, online collaboration, email, video conferencing, and the mouse. Doug expanded the possibilities of computing by 10x.
  • 1970: too many different computer types and architectures meant no interoperability. They called a conference, drew a matrix of participants down the side and across the top, and started to get the protocols talking to each other. A major milestone in connectivity was reached: the Arpanet was up and running for real.
  • Xerox would invest in creating the office of the future, so it could stay relevant. Computers and the architecture of information. At ARPA it was human-computer symbiosis, at PARC it was the electronic office. Bill English's group were doing human-factors studies for the mouse and graphics.
    • Xerox PARC in the 70s is referred to as an event as much as a place; it was unique
    • In the space of about three years PARC pioneered all the basic technologies that have dominated the field ever since: the personal computer, graphics, GUIs (icons, windows, menus), the mouse, object-oriented programming, WYSIWYG word processing, laser printers, Ethernet and, if you count Bob Metcalfe’s participation in the TCP/IP seminars, the Internet.
    • Smalltalk was the sexiest piece of Alto software. Overlapping windows, icons, menus and a mouse pointer: it was the first user interface that would become familiar to everyone. The Smalltalk environment was like an artist's studio; tools were close to hand, and you didn't need to break your concentration.
    • Where Xerox went wrong:
      • PARC was creating technology but how it was going to get from lab to the marketplace? They weren't setup for that. Budgets started to get cut, the executives at Xerox weren't getting it.
      • “It had never occurred to us that people would buy crap,” declares Alan Kay. “What none of us was thinking was that there would be millions of people out there who would be perfectly happy with the McDonald’s hamburger approach: they didn’t know it wasn’t real food.”
        • When he saw scrolling text in the Smalltalk environment, Jobs famously shouted, “Why hasn’t this company brought this to market? What’s going on here? I don’t get it!”
      • Leadership didn’t understand computers, but they knew they had a great research lab. They figured the engineers would take care of it.
      • They had no spreadsheet program. In the business world spreadsheets were the kind of application a manager might use, and the result was another sale for Apple, not for Xerox.
  • That architecture of openness enabled explosive growth: in the 90s the network expanded from a handful of users to millions in 5 years.
  • Minicomputers blew time-sharing out of the water: individual autonomy, hands-on control, and freedom.
  • Bill Gates and Paul Allen created BASIC while emulating the Intel 8080 on a different machine. Allen quit his job, Gates dropped out of school, and they formed a little company called Microsoft to market it.
  • In 1976 startups realised that computers might sell better as products or appliances as opposed to self-serve kits.
  • Because of Berners-Lee’s hypertext browsing, users would finally begin to get it about the Internet.

Deep Summary

Longer form notes, typically condensed, reworded and de-duplicated.

About J.C.R. Licklider

Most computing pioneers were electrical engineers. Lick was different, he was more interested in the human mind and had studied neuroscience. In 1945 MIT was home to some of the brightest people and best technology (rivalled only by Los Alamos). Pioneers were meeting there to discuss the future fields of research (artificial systems, communication and control).

Norbert Wiener was an inspiration; he was interested in a machine that could automate computation. Vannevar Bush built a mechanical analogue computer that could solve differential equations in 1931, and followed that up with an electronic version, the Rockefeller Differential Analyser. In the 30s Bush imagined what condensing information onto film could enable. He imagined a Memex (a library within a desk) that anyone could use to improve their thinking. He described how people could arrange information in a way that made sense to them, creating links between different sources so that you could follow a train of thought. He imagined these trails being copyable and sharable with others, creating a web of knowledge. Although he imagined this being built with film, he was describing hypertext. He published his thinking in a famous article in the Atlantic entitled "As We May Think."

At the start of WW2, Bush lobbied FDR to create the National Defense Research Committee. The idea was to use technology to win the war. Wiener set out a plan: build computers capable of digital calculation, use binary maths for simplicity, and have magnetic tape for storage, theorising that they would be useful in many domains. The committee said no to computers and focused on radar, anti-aircraft control and sonar instead. Tragically, Bush would never give up on the analogue film Memex, and tried to create it later in life, long after it was obvious that it should be digital.

In the 1930s many individuals were working on solving small parts of the computer problem; Norbert Wiener started to put all the pieces together.

The importance of feedback to control

Feedback is a brilliantly simple concept once it's pointed out. In a world where no device is ever perfectly accurate or reliable, a self-correcting mechanism was essential for any kind of effective operation. Solving control would require a deep understanding of communication. You need to make observations and corrections. Wiener, Bigelow & Rosenblueth: through feedback, a mechanism could embody purpose.
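The loop is simple enough to sketch. Below is a minimal, illustrative proportional controller in Python (the function names and constants are mine, not the book's): observe the current value, compare it to a goal, apply a correction proportional to the error.

```python
# A minimal sketch of the feedback idea: observe, compare to a goal, correct.
# All names and constants here are illustrative, not from the book.

def feedback_step(target, reading, gain=0.5):
    """One observe-and-correct step of a proportional controller."""
    error = target - reading      # observation: how far off are we?
    return gain * error           # correction proportional to the error

# An imperfect "device" settling on a target value purely through feedback.
value = 0.0
for step in range(20):
    value += feedback_step(target=10.0, reading=value)
print(round(value, 3))  # converges toward 10.0 without ever being told how
```

Run repeatedly, the corrections shrink as the error shrinks, which is exactly the self-correcting behaviour the paragraph above describes.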

The first neural networks proposed

McCulloch mused: what if the brain was a vast electrical circuit, with neurons serving as wires and switches? If stimulation passed a certain threshold, a neuron would fire and send an output to dozens or hundreds more. They were describing a neural network. Their 1943 paper “A Logical Calculus of the Ideas Immanent in Nervous Activity” demonstrated that these neural networks were functionally equivalent to Turing machines. That is, any problem that a Turing machine could solve, an appropriately designed network could also solve.
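As a rough sketch of what McCulloch and Pitts proposed (a toy formulation, not their notation): a unit fires when its weighted inputs cross a threshold, and with the right weights and thresholds a single unit computes a logic gate.

```python
# A hedged sketch of a McCulloch-Pitts unit: weighted binary inputs, a
# threshold, binary output. Weights below are chosen to make logic gates.

def mp_neuron(inputs, weights, threshold):
    """Fire (1) iff the weighted sum of inputs meets the threshold."""
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mp_neuron([a],    [-1],   threshold=0)

print(AND(1, 1), OR(0, 1), NOT(1))  # -> 1 1 0
```

Because AND, OR and NOT compose into any Boolean circuit, networks of such units can implement any finite logical function, which is the core of the equivalence claim.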

Computer Architecture. Stored Program Concept.

Von Neumann wrote a design doc for one of the first digital computers (the EDVAC) in breathtaking clarity.

He envisioned a central controller going through an endless cycle. Fetch instructions or data from memory, execute the operation, return the result to memory. To this day most computers are still based on the serial, step-by-step “von Neumann” architecture.

The stored-program concept created a clean split between problem-solving software and the hardware. Computation thus became an abstract process that we now know as software: a series of commands encoded in a string of 1s and 0s and stored in memory.

By ignoring the engineering debates about hardware specifics, his design and contribution were timeless and more meaningful.

It was similar to Turing's idea of encoding instructions on tape instead of wiring them into the machine. It had implications for universality, in a stored-program computer, like a Turing machine, you could change the instructions without having to change the wiring. It's arguably the great idea of the computer age. Prior to this notion, computers were regarded just as adding machines. This was really the invention of digital computers.
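To make the cycle concrete, here is a toy sketch in Python (the four-instruction set is invented for illustration, not taken from the book). Note that instructions and data sit in the same memory, which is the stored-program idea: change the contents of memory and you change the program, no rewiring required.

```python
# An illustrative toy of the fetch-execute cycle: fetch from memory,
# execute, return the result to memory. The instruction set is invented.

memory = {
    0: ("LOAD", 100), 1: ("ADD", 101), 2: ("STORE", 102), 3: ("HALT", None),
    100: 2, 101: 3, 102: 0,   # data lives in the same memory as the code
}

acc, pc = 0, 0                     # accumulator and program counter
while True:
    op, addr = memory[pc]          # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]         # execute the operation...
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc         # ...and return the result to memory
    elif op == "HALT":
        break

print(memory[102])  # -> 5
```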

Fundamental Theorem of Information Theory

A modest Claude E. Shannon from Bell Labs published "A Mathematical Theory of Communication" in 1948. He imagined the communication process as being divided into five parts: an information source, a transmitter, a communication channel, a receiver and a destination. The information content of a message was 1s and 0s; more complicated messages would require more digits to encode. He separated the information from the meaning, leaving interpretation to the people sending and receiving the message. The engineer's job was just to get the message from A to B, without distortion.

He showed that even if a channel suffered from static and distortion, and the signal was faint, you could still send things with perfect fidelity. If the signal is faint, you have to send things many times and devise error-correcting codes so that corrupted parts can be reconstructed at the other end.

The theorem explained how error-correcting modems could transmit data at the rate of tens of thousands of bits per second over ordinary (and relatively noisy) telephone lines. Digital would become synonymous with data quality and reliability.
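A crude way to see the idea: add redundancy, then vote. The 3x repetition code below is far weaker than the codes real modems use, but it shows how sending things several times lets the receiver reconstruct corrupted parts of a message.

```python
# A toy version of Shannon's point: redundancy defeats noise. A 3x
# repetition code with majority voting; real modems use far cleverer codes.

import random

def encode(bits):           # send every bit three times
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits, flip_prob=0.05):
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):           # majority vote over each group of three
    return [int(sum(bits[i:i+3]) >= 2) for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
errors = sum(a != b for a, b in zip(message, received))
print(errors)  # typically well under the ~50 the raw channel would cause
```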

1946 Bell Labs researcher John Tukey coined the term 'bit'.

Von Neumann

Von Neumann invented much of computer science as we know it today. In hardware he defined RAM; it was revolutionary that you wouldn't have to wait for a position on a drum. In software he wrote 'Planning and Coding of Problems for an Electronic Computing Instrument', which defined how to program computers. He also rewrote algorithms in ways that were easy for machines to understand.

He thought about what a machine would need to reproduce itself: a description copier and the information that encodes itself. Much like DNA, which encodes the instructions for making all the enzymes and structural proteins that the cell needs in order to function, and acts as a repository of genetic data; the DNA double helix unwinds and makes a copy of itself every time the cell divides in two.

The Beginning of the Cold War

Russians tested a nuclear bomb in 1949. Von Neumann thought there was an argument for preemptive war: "If you say why not bomb them tomorrow, I say why not today?" Wiener was appalled and argued Game Theory didn't take into account that Player A might care what happened to Player B. Wiener thought of society as a complex adaptive system—a constantly evolving, endlessly surprising web of interacting players and overlapping feedback loops. “In the overwhelming majority of cases,” Wiener insisted, “when the number of players is large, the result is one of extreme indeterminacy and instability.” Wiener worried that the computer would devalue the work of the human brain, in the way the industrial revolution devalued the work of the human body.

The Transistor

Bell Labs announced the transistor in 1948. A new amplifying device, based on crystals and quantum mechanics.

Cybernetics

Wiener wrote Cybernetics and followed it with a more accessible book in 1950 called "The Human Use of Human Beings". Together they brought about the information age. Information was the unifying theory of communication, computation and control.

McCulloch and Pitts had shown that a network of neurons could compute anything computable, and in 1949 Hebb suggested that connections between neurons grew stronger with use, and that's how a network of neurons could learn.

Suddenly, the future of computing looked really exciting!

The Beginning of the Cold War, Part 2

In 1949 a U.S. reconnaissance plane detected radioactive debris drifting out of Siberia. The U.S. intelligence community was confused and in denial, insisting the Soviet Union was 5 years away from testing an atomic bomb of its own. Calls for a preventative war were heard. Truman wanted to counter by developing a more powerful hydrogen bomb (Los Alamos came back to life).

The air force contracted Bell and GE to upgrade radar, and in 1950 asked MIT to turn a vision for a computer-based air defense into a reality.

Realtime

Previously computers were batch-operated calculating machines that worked on one problem at a time. The applications the military had in mind would require realtime operation: constant monitoring of inputs and quick responses. The Navy thought $100k would do it (logistics, missile defence, air traffic control and task coordination); scientists said it would take 10-15 years and $1-2bn. The Navy gave them $1m to work on realtime computing, hoping solving that would solve all the issues. The Semi-Automatic Ground Environment (SAGE) project was born.

1951 Lick was asked to work on the air defence for the valley. He pushed to include human factors in the project. He wanted to understand how machines and humans could work together as a system. Convinced that psychologists should work with the engineers from the very beginning of the design process.

They connected to real radar systems early, to learn fast. They didn't stay in the world of theory. Engineers knew they could filter out much of the noise from the equipment, but how much? More of a question about human perception and pattern spotting. Invented the light pen.

The MIT Lincoln Laboratory was a little like the Manhattan Project: a state of controlled panic, as national safety was at risk. They connected radar to telephone lines, but signals were corrupted by noise and frequency shifts. They created the modem (modulator-demodulator) to convert from digital to analogue and back, solved the problem, and achieved transmission speeds of up to 1,300 bits per second.

Memory · Freedom and Autonomy

Memory was expensive, slow and volatile. Forrester created magnetic-core memory, which was faster, more reliable and cheaper. They couldn't have done this without buying a second computer just to test memory, and they couldn't have bought the second computer without the freedom researchers were given to make decisions and their autonomy over budget. “As long as [our decisions] were plausible and could be explained,” agreed Forrester, “we could carry other people with us.”

Realtime again: at Lincoln they were asked to create the first realtime operating system (before any computer languages existed to code it in). They calculated it would take 2,000 programmers, which was roughly the number of programmers in the entire country at that point. SAGE had to get into education. It went live in 1958 and would be operational for decades. The systems were really reliable too.

The money ploughed into the SAGE project helped bootstrap an entire computing industry and moved IBM onto the right path. Applications appeared in the commercial sector soon after, such as American Airlines' realtime booking system. It planted the idea that humans and computers were more powerful working together. A display, commands sent to a computer via keyboard and light gun, data sent via a digital communications link: the spiritual ancestor of the modern computer.

Turing

In 'Computing Machinery and Intelligence' (1950) Turing asked can a machine think?

He framed it as two elemental questions. First, what do we mean by machine? By machine we mean a digital computer. This was no real restriction, as a computer could simulate any other machine, presumably including the human mind. And second, what do we mean by think? He argued that if the interrogator could not tell that they were conversing with a machine, then one had to admit that the machine was really thinking, as at that point the interrogator would have precisely as much evidence for the computer’s thinking ability as for the human’s. He called this the Imitation Game, but it came to be known as the Turing Test.

The Beautiful Question and Answer Sequence from the Paper

Q: Please write me a sonnet on the subject of the Forth Bridge.
A: Count me out on this one. I never could write poetry.
Q: Add 34957 to 70764.
A: (Pause about 30 seconds and then give as answer) 105621.
Q: Do you play chess?
A: Yes.
Q: I have K at my K1, and no other pieces. You have only K at K6 and R at R1. It is your move. What do you play?
A: (After a pause of 15 seconds) R-R8 mate.

It's one of the most provocative assertions in all of modern science and is still discussed to this day. Turing moved on from machine intelligence to other things following the paper. Two years later he was arrested for homosexual acts with a teenage boy; he escaped jail but was ordered to have estrogen treatments. He died of cyanide poisoning in 1954.

In 1954 Von Neumann took a seat on the Atomic Energy Commission (the highest-ranking scientific position in government). He died 3 years later of bone cancer in hospital, surrounded by military chiefs of staff.

Seven, Plus or Minus Two

Research into human information-processing limits showed that humans can only distinguish reliably between about 7 different alternatives at once. We're great at distinguishing between two (salty or not salty), but add more and we start to falter. The number 7 turned up so consistently that Miller titled his 1956 paper “The Magical Number Seven, Plus or Minus Two”. It began with a famous line: “My problem is that I have been persecuted by an integer. For seven years this number has followed me around, has intruded in my most private data, and has assaulted me from the pages of our most public journals..."

Miller verified what Hayes had found: short-term memory limits are about the type of information, not just the amount. Memory seemed to work in 'chunks', meaningful clusters of items that you can remember as units. Sentences are easier than random words because you can remember them as phrases, or chunks of meaning. A 12-digit sequence is easier to remember as 3 memorable years.

Combinatoric Explosion & Heuristic Reasoning.

Heuristic reasoning helps us make decisions when the possible number of choices is incalculable (what should we do tomorrow?). We apply a “satisficing” strategy, following rules of thumb or constraints that help us whittle the choices down to something manageable. These are known as heuristics (from the Greek heuriskein, “to invent” or “to discover”). Heuristics can help prevent machines becoming overwhelmed, enabling them to make an acceptable choice when there isn't enough time to make the ideal choice.
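A satisficing rule is easy to sketch. In this toy Python version (the options and scores are invented for illustration), we take the first choice that clears a "good enough" bar rather than scoring every option.

```python
# A sketch of satisficing: accept the first option that clears a threshold
# instead of searching for the optimum. Options and scores are invented.

def satisfice(options, good_enough, score):
    for option in options:           # stop at the first acceptable choice
        if score(option) >= good_enough:
            return option
    return max(options, key=score)   # only exhaustively compare as a fallback

plans = ["stay in", "cinema", "hike", "road trip"]
fun = {"stay in": 3, "cinema": 6, "hike": 8, "road trip": 9}.get
print(satisfice(plans, good_enough=5, score=fun))  # -> "cinema", not "road trip"
```

The point is the early return: an acceptable answer found quickly, rather than the ideal answer found too late.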

Interactive computing.

Ken Olsen: computers should be interactive, exciting and fun. At the time, though, computers were batch systems; the architecture was designed to optimise for the efficiency of the computer, not the people who operated it. In the Lincoln Lab they wanted to explore what humans and machines could do together, and batch processing wasn't going to cut it.

An interactive computer meant computers were going to have to become smaller, faster and cheaper, so everyone could have their own to work on.

Transistors

Transistors started off being really delicate; touch one and it would burn out. IBM and others would quickly replace vacuum tubes with transistors.

Lick Time Study 1957

Lick studied his time and kept track of what he did during work hours. 85% of his thinking time was spent getting into position to think. These getting-into-position activities were essentially clerical or mechanical: searching, calculating, plotting, transforming, determining the logical or dynamic consequences of a set of assumptions or hypotheses, preparing the way for a decision or an insight. Embarrassingly, deciding what to attempt and what not to attempt was determined by clerical feasibility, not intellectual capability.

The epiphany being: our minds were slaves to mundane detail, and computers would be our salvation. A partnership: thinking together, sharing, dividing the load. Computers would do algorithms and calculations, humans would focus on creative heuristics. Together a greater whole, a symbiosis, a man-machine partnership. Together they could think as no human brain has ever thought and process data in a way no machine could.

Early computers were really disappointing to Lick; they weren't able to achieve this symbiosis. Not only did they need to get smaller, faster and cheaper; we also needed improvements in usability, understandability and configurability.

The PDP-1

The first computer that didn't disappoint Lick was the PDP-1 (Congress had banned buying new computers until existing ones were better utilised, so they called it a Programmed Data Processor). Price- and performance-wise it was a spectacular leap: it cost 2x more, but was 1000x faster. This almost never happens.

Lick started firing off proposals as he saw the power of the machine. It was in constant use, and Lick would return to it each evening to program.

McCarthy on AI and Time Sharing

Turing was the first person to make the case for using software to create AI. McCarthy didn't really get the software point when he read it for the first time. McCarthy established artificial intelligence as a field in its own right and gave it the name that has stuck.

As computers were too expensive for individuals, the obvious answer was to share a machine, sharing processing time and storage space; McCarthy took to calling his scheme time-sharing. In effect, he was proposing to optimise human time instead of machine time. Nobody at IBM had even imagined such a thing, and in 1955, at IBM, that kind of proposal sounded both naive and self-indulgent.

In 1957 Fortran, an early programming language, gave a huge boost to productivity; for the first time people could work in something that was human-readable. It enabled about 10x more people to code too.

The explosion of users caused resource problems at universities, which went back to batch processing, meaning students would have to wait 24 hours to get results from their code.

Minsky and McCarthy shared a conviction that AI was important, exciting and worth pursuing.

Lick the Leader

Lick set the vision and roadmap for the computing movement. He helped everyone envision what a computerised world would look like, and shared how to get there. He was a great integrator and synthesiser who collected ideas and people.

In his 1960 paper 'Man-Computer Symbiosis', Licklider essentially laid out the vision and the agenda for the next 25 years of computing research.

Symbiosis Opportunity

Symbiosis meant humans and computers working together in a partnership, each side doing what it did best, in realtime: humans setting the goals and formulating hypotheses, the machine working through simulations and calculations. Lick didn't want the computer to be a tool, but a colleague.

They would work better together: Lick referenced the 85% of his time spent doing drudge work that a computer could do.

Research Roadmap

Time sharing: to collaborate in this way with a computer you'd need immediate responses, so you'd need your own device (which was almost unthinkable) or you'd share a machine fast enough to serve many people. Networks to connect people and information: essentially the internet. Input and output were too slow: he proposed a console equipped with voice and handwriting recognition, something like a pencil that could write on the display, with the flexibility and convenience of the pencil and notepad. Lick also proposed cheap mass-produced memory (like CDs).

Lick also knew that AGI was a long way off, so for a long time to come, Lick wrote, humans and computers would have to work together. Man should focus on developing symbiosis, then using it.

Lick was convinced that the future of programming languages would be graphical, and use gestures, strokes and images. So intuitive you wouldn't need training. He would make them a major focus of his research from this point on.

1961: Libraries of the Future, Licklider (a book)

About managing information. He spoke about the difficulties of search and curation when you've got lots of information. Assimilation phase: getting the information you need, trying to extract the key points. Application phase: how you can use it to address a task at hand.

A reminder: at this point we were 30 years away from the internet, but the ideas were here: personal computers, graphics, symbiosis, time sharing, collaborative computers, networks and online communities, universal access to knowledge.

McCarthy imagined a world where computing would be a utility like power, cities would have a powerful computer, everyone would pay when they used it, there would be many services on offer. Bringing computer power to the people that way.

The Phenomena Surrounding Computers

In 1957 Sputnik was launched; it beeped as it circled the earth for 23 days. America's rockets were still blowing up on the launchpad, while the Russians had the capability to drop a bomb anywhere from space. This became known as 'the missile gap'. In 1957 Eisenhower promised to catch up; heavy investment followed, and space research was consolidated under a new agency: ARPA, the Advanced Research Projects Agency. In 1958 NASA (the National Aeronautics and Space Administration) was created for the non-military work. ARPA began to pick up computing projects to solve command and control problems.

Lick joined ARPA and was given autonomy and a $10 million research budget, Lick would use it to bring interactive computing to life. He believed a 2-4x boost in productivity would be possible.

Doug Engelbart decided to dedicate his 5.5 million minutes of working life to augmenting the human intellect. He reasoned that problems were getting more complicated, we weren't getting better at coping. Boosting our ability to deal with complex problems would be impactful. The best way to do that was to create an interactive general-purpose computer-powered information environment.

In 1962 Engelbart published "Augmenting Human Intellect: A Conceptual Framework", about increasing the capability of humans to approach a complex problem situation, to gain comprehension and to derive solutions to problems. Engelbart proposed bringing together humans' 'hunches and feel' with machines. Manipulating words seemed like the place to start, because that was how to manipulate ideas.

He imagined human concepts and symbols being manipulated and rearranged by a computer using a 3D colour display. Engelbart was inspired by 'As We May Think' by Bush.

Engelbart proposed funding a lab to bring about this vision; Lick gave him the funding.

Lick wanted MIT to come onboard and gave flexibility on projects, but there had to be AI, time-sharing, realtime human-computer input / interaction and graphics. Computers for communication not just calculation. MIT joined.

Lick proposed the 'Intergalactic Computer Network' in 1963. He wanted to link together the ARPA community so everyone could build on each other's work, proposing to link all the time-sharing computers into a national network. Software could float free of individual machines; programs and data would live on the net. It was the most significant document Lick would ever write: this memorandum to the Intergalactic Network became the direct inspiration for the Arpanet, which would evolve into today’s Internet. But 1963 was too early; each site was still struggling to get its computers running. Lick connected 3 computers at UCLA as an experiment, and arranged for UCLA, Berkeley and SRI to form a California network via phone line.

As a utility, information was different: it flowed both ways. Users could give back to the system and create new resources. “The system became the repository of the knowledge of the community. And that was a totally new thing.” This open-system quality allowed everyone to make the system their own thing, rather than something imposed on them.

Early commands that proved popular were word processing (TYPSET and RUNOFF) and email (MAIL). Project MAC became the intellectual centre of the project. People would talk, propose ideas, collaborate. 1963 the CTSS on-line environment was starting to look like the internet.

The open design was a really big idea. There were rules to follow to make the interchange possible, but freedom beyond that. PCs, operating systems and the internet would all become open platforms.

Openness required trust. They wanted people to store programs in memory, not on punch cards, so users had to believe that the system would be up and running when they needed it. Murphy’s Law: if anything can go wrong, it will, and at the worst possible moment.

Rather than trying to create a system which had to operate perfectly, they decided to build a fault-tolerant system that had the ability to react to the unexpected. Started to think about processor clocks to keep things moving, backups so data wasn't lost.

Users hated the introduction of Usernames and passwords and felt it was vaguely insulting. They realised that people needed a zone of privacy.

Engelbart began making progress when he got funding from NASA and ARPA. From 1964 they had a faster machine, and he and English built NLS (the oNLine System): a text editor (fonts, windows, outlining) with collaboration features (e-mail) and a bunch of other productivity tools (hyperlinks, a programming language, address books and calendars). G Suite before its time.

Just having the computer write text on a CRT display was innovative, creating the need for new interaction devices. They tested everything; Engelbart invented the mouse, and it tested best with participants.

There was some competition between IBM and GE to create a realtime OS that would enable interactive computing.

Lick considered graphics critical to human-computer symbiosis; our eyes are “a high-bandwidth data channel”. After time-sharing, Lick wanted ARPA to focus on graphics and networking. He funded the RAND Tablet, and wanted graphics applications to be included in Project MAC.

Lick had the patience to take the long view, he created a community of believers that guaranteed that his vision would live on after him. By funding research at universities he captured the next generation. Licklider and ARPA didn't just create a collection of bright people, they created a self organising community.

The intergalactic network

The LINC (built at MIT's Lincoln Lab in 1965) was the first minicomputer, good enough to suggest that the future of computing might be personal devices, not time-sharing.

Information-sharing communities emerged each time a network was established. Computer networks seem to be a medium that stimulates community.

Bob Taylor wanted to build a larger network but MIT, Berkeley and SDC didn't have enough compute and they thought more users would poach their cycles. Many saw networking as a distraction from the real work (like AI and graphics).

Taylor and Lick imagined what you could do with a large computer network. "The Computer as a Communication Device". Taylor got $1 million from ARPA to fund his network project in 20 minutes.

Early Networking

Early phone line experiments required a connection to be established each time a message was sent, causing 10 second delays. ARPA started leasing lines to leave computers always connected.

Continuous analogue signals work for voice; distortions aren't disruptive. When transmitting bits, though, distortion changing a single bit could ruin a message. So packets were invented to break up communication; check digits let you verify you got each packet correctly, and the receiving computer can request failed packets again. You then need to work out the routing of packets, though.

They adopted the telegraph's store-and-forward approach: send a message a sensible distance, store it, then transmit it further. Computers at each site would share routing responsibilities equally (rather than having a central addressing computer).

Computers read each packet (destination address, return address, error-checking codes) and would either accept it or forward it towards its destination. The network operated collectively, with little or no central control.
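A toy sketch of the scheme just described, with invented field names and a made-up three-node topology: each packet carries its addresses and a check digit, and each node either delivers it, forwards it to a neighbour, or asks for a resend.

```python
# A hedged sketch of packets plus store-and-forward routing. Field names,
# checksum scheme and the toy topology are invented for illustration.

def checksum(payload):
    return sum(payload.encode()) % 256      # a stand-in for real check digits

def make_packet(src, dst, payload):
    return {"src": src, "dst": dst, "payload": payload,
            "check": checksum(payload)}

# Each node only knows its next hop; there is no central addressing computer.
next_hop = {"UCLA": {"UTAH": "SRI"}, "SRI": {"UTAH": "UTAH"}}

def forward(node, packet):
    if checksum(packet["payload"]) != packet["check"]:
        return f"{node}: corrupt packet, requesting resend from {packet['src']}"
    if packet["dst"] == node:
        return f"{node}: delivered '{packet['payload']}'"
    nxt = next_hop[node][packet["dst"]]     # store, then send it onward
    return forward(nxt, packet)

print(forward("UCLA", make_packet("UCLA", "UTAH", "LO")))
```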

Engelbart volunteered SRI to be an early node, as he wanted to explore online collaboration.

The ARPA network would be based on small routing computers à la Wes Clark: Interface Message Processors, they came to be called, or IMPs.

Why did ARPA build the network?

The proposers knew it would be good for researchers to share files and collaborate, but that wouldn't have secured funding on its own. Funding was allocated because packet switching is more survivable, more robust under damage: in a strategic situation like a nuclear attack, communication could still be made to the missile fields.

The military wouldn't use it until it was working, so the researchers would dog-food it to learn.

Alan Kay proposed the Dynabook at a 1968 conference: a tiny computer carried like a notebook, with a screen you could write on like paper to display text and graphics, wireless radio connections to other computers, and a hard disk.

The four node network

UCLA was the first node; they'd take measurements on traffic. SRI would be second, then the University of California, Santa Barbara, and Utah, as Stanford and Berkeley were resistant. ARPA was creating the network, but the sites would have to learn how to use it.

They needed to create a set of protocols. Crocker started asking basic questions and the sites started agreeing things. They decided to start transcribing their decisions, and Crocker took the first set of notes. Because they feared stepping on toes, Crocker called the notes 'Requests for Comments'.

The Demo · ARPA's Woodstock

In 1968, at the Fall Joint Computer Conference in SF, standing room only, Douglas Engelbart would change the world. Doug demonstrated the oN-Line System (NLS) in front of a huge screen. He began, "The research program that I’m going to describe to you...", and started to show what an intellectual worker could do with their own interactive computer and what value it could bring. They demonstrated a fast-responding GUI, word processing, information management, hyperlinks, a programming language, online collaboration, email, video conferencing, and the mouse.

Engelbart hadn't been publishing papers, and a lot of people thought the work they were doing was silly, so the presentation turned that around. Doug was the showman, and they were showing what the NLS could do, not just talking about it.

They got a standing ovation. There was a huge amount of enthusiasm. An NLS installation at that time though cost $1 million, so it was still too expensive for people to seriously consider.

The presentation was a defining moment, the future that ARPA was funding was starting to take shape, and Doug just expanded the possibilities of computing by 10x.

Building the Network. The IMP.

The ARPA network to be installed by BBN would deliver packets, nothing else. All of the software would be the responsibility of the host computer, leaving a total vacuum on the protocols.

The first message was sent over the ARPANET in 1969, 9 months after the contract was awarded, through a fridge-sized IMP, by a computer sending a message to itself using packets. By 1970 all 4 sites were connected.

MAC

Lick led Project MAC, a mainframe computer that could support 30 simultaneous users. Lick was really bad at administrative work.

Minsky and others didn't see the point in symbiosis, we should just solve AI first.

Multics was the prototype of a decent OS: memory management, a hierarchical file system, security, etc. Lick was determined to connect his PDP-10 to the network. By 1969 more sites were getting added to the ARPANET, but the machines connected to it couldn't use it, as the protocols weren't there yet.

In 1970, too many different computer types and architectures meant no interoperability. They called a conference, drew a matrix of participants down the side and across the top, and started to get the protocols talking to each other. A major milestone in connectivity was reached: the Arpanet was up and running for real.

Early demonstration of the ARPANET and Tomlinson's email to an ARPA director was successful. The director proclaimed that all the sites should be connected and everyone should get a terminal. Email started to connect the community more strongly too.

They had a big demonstration of the ARPANET and oN-Line System at a conference. The ICCC demonstration made everyone realise packet-switching networks were the future. It was one of the great experiments in science. It changed the way things are going: commerce, government, industry, science, everything.

After the ICCC demonstration, the groups disbanded a little. A brand-new laboratory funded by Xerox was started. PARC: the Xerox Palo Alto Research Center.

LIVING IN THE FUTURE

Xerox had $1 billion in revenue, but feared their glory days would be short-lived. They wanted to move from copying to communication to create more diversified revenue.

It would invest in creating the office of the future, so it could stay relevant: computers and the architecture of information. It bought a computer manufacturer, SDS.

1947: The transistor

1959: The chip (developed by Fairchild)

Noyce and Moore broke out from Fairchild to create Intel. In 1969 they were asked to design twelve custom chips for a Japanese company. Intel engineer Ted Hoff suggested designing a single chip that could do all 12 functions. The 4004 was the first “microprocessor”: a full CPU on a single chip. Soon there would be complete computers on a single chip.

Taylor used Moore's law to estimate that we were 10 years away from affordable desktop computers. How many more for laptops, and one in your hand? Individual computers with graphics and networks could be the future. A new research centre might have just enough budget and manpower to make it work.

They invented the laser printer before it was economical to make one.

Bob didn't want people who had to be managed. PARC sometimes seemed to have no organisation whatsoever.

People need the freedom to create, but their creations had to add up to something. At ARPA it was human-computer symbiosis, at PARC it was the electronic office. Getting the maverick geniuses moving in the same direction, without forcing everyone to move in lockstep, was the challenge. Needed to give them purpose and cohesion, without crushing spontaneity and initiative.

PARC tactic: buy and build cutting-edge tech; be 5-10 years ahead of the curve. Moore's law will bring the cost down. Whatever you build, get everybody at PARC to use it, to see the problems and the possibilities. Use what they learn to build better technology.

Bill English's group were doing human-factors studies for the mouse and graphics.

Every newcomer was given a paper to read: 'Sketchpad: A man-machine graphical communication system'.

Kay: with enough processing power and the right kind of interface, a computer could simulate marks on paper, paint on canvas, the “motion” encoded in film and television—any medium of expression that humans had yet devised. But much more than that, the computer could be dynamic and responsive in a way that no other medium had ever been. It could execute programs. It could respond to questions and experiments. It could engage the user in a two-way dialogue.

Kay created object-oriented programming and came up with a clear vision of what a personal computer should be:

The Dynabook: no larger than a notebook; keyboard and stylus; wireless networking. The first prototype was cardboard, to get the size and shape right.

Smalltalk: an object-oriented programming language. A GUI. It described a 'paint' program.

Alan Kay invented the concept of windows to stop the GUIs getting too crowded.

By 1972 at PARC there was agreement that the office of the future was going to be networked computers and graphics. Graphics weren't possible on time-sharing machines, but Doug had shown that a screen could be as good as paper with one computer per person. So it became part of the vision to make computers smaller, so everyone could have one each. Was that even possible though?

Moore's law was going to make it possible, that was obvious in 1972, the personal computing era was coming.

They started to develop digital CRT bitmap displays that used pixels: one bit per pixel, 1 for white and 0 for black. Having enough memory to serve the displays (~64KB) became the blocker. Thanks to Intel, though, the cost of memory was falling fast.

You could argue that the LINC or TX-0 was the first PC, but the Alto was the first PC to look like a computer does today. For $10k you got an 11-inch 606 x 808 pixel bitmap display, keyboard, mouse and networking, all in a case small enough to sit on the floor.
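The display numbers line up with the memory blocker mentioned above; a quick back-of-envelope check:

```python
# Back-of-envelope arithmetic for the Alto's one-bit-per-pixel framebuffer,
# matching the ~64KB memory blocker mentioned earlier.

width, height = 606, 808
bits = width * height          # one bit per pixel: 1 = white, 0 = black
print(bits // 8)               # 61,206 bytes
print(round(bits / 8 / 1024))  # ~60 KB, close to the 64KB figure
```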

A prototype Alto came to life with the words 'Alto lives'; the first image it displayed was the Cookie Monster.

They developed Ethernet: unpowered local cables that carried data packets, and, like the ARPANET, nothing needed to do central control. Ethernet was short for Ether Network.

1973 Bob Kahn at ARPA wanted to connect the networks together, 'Internetworking' is what he called it, was shortened to Internet.

As long as you had an IMP and met the 1822 interface specification, you could plug into the ARPANET with any computer and any OS, and the bits would still flow. A network of networks was the same principle, just one level up: if you wanted to be part of the internet you could be, just follow the interface standard. That architecture of openness enabled explosive growth; in the 90s the internet expanded from a handful of users to millions in 5 years.

On defining the protocols:

The critical thing was to keep the packets flowing through all the different networks automatically, without making the poor things carry along special instructions for each one.

TCP (Transmission Control Protocol) would be resilient: if packets didn't make it, TCP would make sure that the source sent out replacements, quickly and automatically. There was also a much-improved addressing scheme (an Arpanet packet couldn't specify an address on another network, only on the Arpanet itself); the addressing component would be separately codified as IP.
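The retransmission idea can be sketched in a few lines. This is a toy, not real TCP: the lossy channel is simulated and every name is invented, but it shows the core promise of numbering each chunk and resending until it gets through.

```python
# A toy of TCP's core promise: number the packets, acknowledge arrivals,
# and resend anything lost. No real sockets; the lossy "network" is simulated.

import random

def lossy_send(packet, loss_prob=0.3):
    """Returns the packet, or None if the network dropped it."""
    return None if random.random() < loss_prob else packet

def reliable_transfer(chunks):
    received = []
    for seq, chunk in enumerate(chunks):
        while True:                           # retransmit until acknowledged
            delivered = lossy_send((seq, chunk))
            if delivered is not None:         # receiver got it and acks
                received.append(delivered[1])
                break
    return "".join(received)

print(reliable_transfer(["LO", "GIN"]))  # -> "LOGIN", despite the drops
```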

TCP/IP was defined in the 1974 paper “A Protocol for Packet Network Intercommunication”. This is why Kahn and Cerf are so often hailed today as the inventors of the Internet: this is where the Internet began.

They sold 1,500 Altos; they started at $18k, and the Alto II cost $12k. Everyone loved the Alto; the combination of graphics and the mouse transformed computing into something visible and tangible. Smalltalk was the sexiest piece of Alto software. Overlapping windows, icons, menus and a mouse pointer: it was the first user interface that would become familiar to everyone. The Smalltalk environment was like an artist's studio; tools were close to hand, and you didn't need to break your concentration.

Laurel was an easy-to-use e-mail editor that allowed for reading, filing, and composing messages. No Altos were ordered without Ethernet.

Xerox PARC in the 70s is referred to as an event as much as a place; it was unique. PARC was ARPA continued. The timing was perfect: the dawn of microchips, and the ARPA students coming of age. Alan Kay: “the most powerful work in any genre is done not by the adults who invent it, but by the first generation of kids who grow up in it.”

PARC was well funded, and its proprietors were patient. Its leaders Goldman, Pake and Taylor deeply understood the dynamics of innovation. It had brilliant minds, but the interactions between them were where the magic happened. They felt they could invent it all (hardware, software, AI, printing, networking), and Pake gave them the freedom and protection to do it.

In the space of about three years PARC pioneered all the basic technologies that have dominated the field ever since: the personal computer, graphics, GUIs (icons, windows, menus), the mouse, object-oriented programming, WYSIWYG word processing, laser printers, Ethernet and, if you count Bob Metcalfe’s participation in the TCP/IP seminars, the Internet. However, nothing fundamental about computing really changed between PARC’s 1970s golden age and the end of the 1990s, when everything exploded.

Kay started sketching out laptops in 1975, and built prototypes in 1978, using the NoteTaker on a plane at 35,000 ft.

In 1971 Wilson, the chairman of the board of Xerox and PARC's biggest supporter, died. PARC ended up shifting under an executive who used to be a finance man at Ford.

A new division of Xerox started competing with PARC; in 1974 PARC had the Alto, the Ethernet, and the laser printer in daily operation.

PARC was creating technology, but how was it going to get from the lab to the marketplace? They weren't set up for that. Budgets started to get cut; the executives at Xerox weren't getting it.

Lick also got a new boss, Heilmeier, who kept asking for a roadmap; Lick found this insulting.

At MIT nobody would tell you that you’re doing a good job. Lick understood the loneliness, and would tell people ‘you’re doing a great job.’

Lick envisioned the internet (although he called it the Multinet). WFH, collaboration, information sharing, online banking. It would function as a network of networks. The Multinet would be the worldwide embodiment of equality, community, and freedom. In 1979 when most of the components were in place, Lick still feared it wouldn't happen.

  • Many commercial operators didn't want to be open. Systems were incompatible, and there was no incentive to make them so. Vendors were targeting large corporations that wanted proprietary networks for in-house use and that were paranoid about security leaks and industrial espionage; isolation was actually a selling point for such customers.
  • ARPA was almost the only game in town making high-risk, high-payoff, long-term bets. Few were doing research like ARPA, with PARC, IBM, and Bell Labs being the exceptions. In the late 1970s ARPA funding was starting to be cut.

Lick remained optimistic, and still evangelised computing and networks to anyone who would listen.

Stewart Brand and Ted Nelson declared that “hypertext” would allow us to break free from linear thought and hierarchical power structures. The business world still leased machines from IBM; it was less scary than ownership.

DEC's PDP-8: designers studied white goods, looking at components and techniques. They built a small, light and cheap ($18k) computer, 1/6th the price of the PDP-1 and cheaper than anything in the IBM catalog. It shipped in 1965 and was really popular.

Its open architecture inspired an entire OEM industry: they would buy PDP-8s, add peripherals and software, and market it as a system. 50% were sold this way, and DEC didn't have to invest in software, end-user training, service, or maintenance. In the process, moreover, the company proved just how powerful the open-systems approach really was. By giving up absolute control, by letting users take charge of their own machines, DEC could benefit from the creativity and entrepreneurial energy of a far larger community than it ever could have mustered in-house. Before it was all over, DEC would sell some 50k PDP-8s. By 1977 they were doing $1bn a year in revenue.

The Peoples Operating System

Minicomputers blew time-sharing out of the water: individual autonomy, hands-on control, and freedom. In 1974, Unix came along, and suddenly you could get your hands on the software, too. Unix was the first really general-purpose operating system for minicomputers. It took off for two reasons: one, it was free; and two, Unix was the first operating system you could get source code (the full list of programming commands) for. You could hack it. Unix was only free because AT&T wasn't allowed to sell it, as it had a telephone monopoly.

Ritchie completely rewrote the Unix kernel in an elegant new computer language based on an earlier experimental language code-named “B”; Ritchie code-named his language “C.” C and Unix quickly killed off manufacturer-supplied operating systems.

HP launched calculators that got them onto every desktop in the US.

The first successful microcomputer

In 1975 Ed Roberts's Altair became the first commercially successful microcomputer, billed on the cover of Popular Electronics as the “World’s First Minicomputer Kit to Rival Commercial Models.” Inside, readers learned that the kits could be had from MITS for just $397. 10k orders came in.

The Altair was an open system: specifications for the data bus were made available so anyone could build an add-on card to fit in. That resulted in an explosion of entrepreneurial energy; start-ups sprang into being by the hundreds.

Bill Gates and Paul Allen created a BASIC for the Altair while emulating the Intel 8080 on a different machine. Allen quit his job, Gates dropped out of school, and they formed a little company called Microsoft to market it.

Going Mainstream.

In 1976 startups realised that computers might sell better as products or appliances as opposed to self-serve kits.

Apple was founded in 1976 by Homebrew Computer Club members Steve Wozniak and Steve Jobs. Their first computer was just parts assembled together. After some investment, they were able to make the Apple II in 1977, with a beautiful built-in keyboard and beige-coloured case. It was comparatively cheap ($1,195) and expandable, with empty slots for add-on cards. It was marketed as "The home computer that’s ready to work, play, and grow with you”. It was also good at video games.

They sold 130,000 Apple IIs, and the company proved to have a market value of $1.2 billion.

Gates was asked to write a “Disk Operating System” for the new IBM PC; they cloned a competitor's OS and made MS-DOS, which would soon dominate the market, and in turn created the market for software. They also agreed they could sell it to other device manufacturers.

Games became popular, as did office-oriented products such as WordStar (a word processor) and VisiCalc (a spreadsheet).

VisiCalc was the killer app, people were buying Apple II's just to run it.

By 1982, Time’s Man of the Year was not a human being at all but a machine: the computer.

“It had never occurred to us that people would buy crap,” declares Alan Kay. “What none of us was thinking was that there would be millions of people out there who would be perfectly happy with the McDonald’s hamburger approach: they didn’t know it wasn’t real food.”

Xerox would license Ethernet to any company for $1,000; all they had to do was abide by the specifications, which were now published freely.

Steve Jobs visited PARC in 1979: Apple's daylight raid. Apple was working on the Lisa at the time: graphics, and a windowing interface controlled by a mouse. Xerox was courting Apple; they wanted to license their technology and invest in the company. Xerox would invest $1 million in Apple, and in return Apple could use PARC's technology. They were interested in Smalltalk, which was a great example of a GUI.

Jobs and crew showed up in the PARC lobby with no advance warning, two days after their demo day, and asked to see the "good stuff, now". When he saw scrolling text in the Smalltalk environment, Jobs famously shouted, “Why hasn’t this company brought this to market? What’s going on here? I don’t get it!”

Jobs was a raving maniac by the time he left; he had seen the future. He ordered that the Lisa be reconfigured to match the Alto display. The visit didn’t have much direct impact on the Lisa: the partnership with Xerox fell apart quickly, and Bill Atkinson had to re-create most of what he’d seen on his own. But from then on Apple would follow the gospel according to Smalltalk. “Lisa must be fun to use,” declared a project design manifesto written a month or so after the show-and-tell. “Special attention must be paid to the friendliness of the user interaction and the subtleties that make using the Lisa rewarding and job-enriching.”

In 1981, Xerox introduced the Star. It was a sensation; many people hadn't seen a bit-mapped screen before. Apple watched every demo and then retreated into a corner to discuss what they had seen.

Four months after the Star’s introduction they knew the IBM PC would beat them on price.

Where Xerox went wrong:

Leadership didn’t understand computers, but they knew they had a great research lab. They figured the engineers would take care of it.

HQ understood marketing but not computers; people at PARC understood computers but not marketing. They'd got the vision right. They'd created something far superior to anyone else but couldn’t imagine that customers would want anything less. They got the customer wrong: it wasn't just big corporates, it was normal people.

They'd created too many features, hadn't created an open system, and had no spreadsheet program. Nobody at PARC thought to write one because they didn't need one. They created WYSIWYG word processing and printing for technical papers; in the business world, spreadsheets were the kind of application a manager might use. And the result was another sale for Apple, not for Xerox.

The Star was 10x more expensive than the Apple II, which made it hard for companies to try out the technology. You also had to buy it in a huge package from Xerox.

Xerox designers passed up several chances to do something simpler. “The option was open all through the nineteen-seventies,” says Lampson. “We could have shipped something much like the IBM PC, and a short time later shipped something very much like the original Alto.”

Apple’s Lisa debuted in 1983 audiences loved its Smalltalk-inspired interface but were turned off by its pricey hardware and slow performance. A year later though Apple’s Macintosh hit the sweet spot. It was compact, affordable, friendly, and cute. They flew out of stores and developed a fanatically loyal following that persists to the present day. In 1985, Microsoft shipped Smalltalk-inspired Windows 1.0 for IBM PC compatibles. Two massive revisions later, with the release of Windows 3.0, in 1990, Microsoft also hit the sweet spot, and famously began to make up for lost time. By the end of the 1990s, Windows would overwhelmingly dominate the operating-system market.

Steve Jobs missed the importance of Ethernet on their visit to PARC. Jobs absolutely despised the idea of networking. Legend has it that when Jobs was later asked why the Lisa didn’t have a networking port, he threw a floppy disk at the questioner and snarled, “There’s my fucking network.”

Back to networking:

NSFnet in 1986 could carry 56kb/s like the Arpanet; by 1987 it had collapsed from congestion. The network’s capacity was boosted by a factor of 30, to 1.5 megabits per second. In 1988 usage exploded again, soaring from seventy-five million packets a day toward a billion packets a day. In 1991 Wolff and company would have to boost the NSFnet’s capacity by yet another factor of 30, to “T3” speeds of 45 megabits per second.

Al Gore coined the phrase “Information Superhighway,” (his father helped create the Interstate Highway System in the 1950s). Asked the administration to study the possibility of networking the supercomputer centres with fiber optics.

“We just found it more useful to cooperate than to fight,” explains Chuck Brownstein, who was Gordon Bell’s deputy at the NSF computer directorate. Nobody could afford it otherwise. “The networking part of our programs kept getting bigger and bigger,” he says. “So finally the program people from ARPA, NSF, NASA, and the Energy Department got together and formed the Federal Networking Council. They started doing things like sharing the cost of overseas lines, or putting in exchanges to interconnect our networks.” Agencies consolidated their various networks around TCP/IP. The megabit speeds of the upgraded NSFnet had turned the 56-kilobit Arpanet into a dinosaur, and ARPA decommissioned its IMPs. By the beginning of the 1990s the de facto national research network was in place, and TCP/IP ruled: the Internet had expanded from its original base in the Defence Department to the research community as a whole.

Wolff tried to commercialise it, recognising that government funding could dry up. Telecoms companies didn't get it; every e-mail sent was a telephone call that wasn't made. As late as 1989, AT&T couldn't see the internet making any money.

The idea was to create a three-tiered structure. The lowest level was campus-scale networks, the mid level was new regional networks connecting the local ones, and the highest level would be a high-capacity nationwide backbone (like long-distance phone lines).

ISPs would create the regional networks and sell access to businesses.

Usenet newsgroups and ListServ mailing lists built community; you could always find someone on the Internet who shared your interests. Then, too, e-mail had been standardized and greatly streamlined.

CERN

In 1990, at CERN, Tim Berners-Lee finished the initial coding of a system in which Internet files could be linked via hypertext. He'd been working on it for 10 years. It introduced the notion of “browsing”: the program had a word-processor-like interface that displayed the links in a file as underlined text; the user would just click on a link with the mouse, and the program would automatically make the leap, display whatever files or pictures it found at the other end, and then be ready to leap again. Because of Berners-Lee’s hypertext browsing, users would finally begin to get it about the Internet. The initial version of the program would run only on a NeXT computer. The World Wide Web.

Licklider set the vision for ARPA; without ARPA there would be no Arpanet, no TCP/IP, and no Internet. There would have been no windows-icons-mouse interface à la Doug Engelbart. And there would have been no creative explosion at Xerox PARC.

Licklider had the vision, was given the opportunity, and he took it. He succeeded beyond anything he could have hoped for. Even now, people who never heard of “Man-Computer Symbiosis” or J. C. R. Licklider still fervently believe in his dream, because it is in the very air they breathe.