Electric Dreams: Computers in American Culture
The Many Creators of the Personal Computer
- The 1970s witnessed the rise of the personal computer (PC): a small, self-contained machine for individual use, affordable enough for consumers, that expanded public access to computing power and transformed American perceptions of computing.
- The history of the PC is not a linear progression but characterized by false starts, dead ends, and unexpected convergences.
- The Social Construction of Technology (SCOT) model describes technological development as an alternation of variation and selection, yielding a multidirectional model of change.
- Historical hindsight can flatten this multidirectional model into a linear one, but doing so obscures the fact that the stages that succeeded were never the only possibilities.
- This chapter examines the multidirectional development of the PC in the 1970s through various technological projects, including the ECHO home computer system, the emergence of minicomputers and time-sharing, the formulation of Moore's Law, the marketing of the microprocessor, and the invention of the Altair.
- Each project represents an intersection of multiple, often conflicting visions of computing.
- The PC is the result of struggles, alliances, and negotiations, which shaped contemporary notions of the personal computer.
ECHO: Making Room for a "Home Computer"
- The ECHO IV, created by Westinghouse engineer Jim Sutherland in 1965, was the first computer designed for home use, utilizing surplus hardware and memory.
- The computer's components were housed in four wooden cabinets weighing 800 pounds, located in Sutherland's basement.
- Input stations and output terminals were wired throughout the house, including a keyboard in the living room, a teletype in the kitchen, and binary displays in various locations.
- Westinghouse publicized Sutherland's project in 1966, featuring the family in newspaper and magazine articles.
- The ECHO project raised the question of how to computerize a home, including where to place the computer and what it would do.
- The ECHO was used for tasks like making shopping lists and filing recipes, representing early attempts to computerize the kitchen.
- The Honeywell Kitchen Computer, offered through the Neiman Marcus catalog in 1969, was another attempt at kitchen computerization: a 16-bit machine featuring a built-in cutting board.
- The idea of computerizing the kitchen has persisted, with recipe organization being among the first applications proposed for the Altair.
- Another vision for the home computer was as part of a living room entertainment center, controlling the TV antenna and automatically switching channels.
- While TVs became central to living rooms, PCs have remained peripheral, often relegated to a side desk.
- The "home office" has emerged as a distinct room category, reflecting the blurring boundaries between work and leisure and providing a space for the PC.
- The ECHO's ubiquity, with terminals and displays throughout the house, contrasts with the self-contained nature of later PCs.
- Microprocessors have proliferated within household products, accomplishing more than the ECHO's mainframe could.
- The idea of a networked, centrally controlled "smart house," with a PC running the lighting, audio/video systems, and security, has persisted in computer magazines and specialized publications despite failing to gain widespread consumer adoption.
- Bill Gates's home is an archetype of these domestic fantasies, with computers monitoring guests and adjusting settings accordingly.
- The smart house ideal is a vision of domestic space as a panopticon, with home surveillance technology seen as a solution for domestic issues.
- The computer that takes over the house has been a recurring theme in science fiction, exemplified by HAL in 2001: A Space Odyssey and the PC in the 1984 film Electric Dreams.
Minicomputers and Time-Sharing: "Mental Models" of Personal Computing
- Alongside home computer experiments, time-sharing emerged in university computing centers as a different form of "personal" computing in the 1960s.
- Early mainframes could only be operated by one user at a time, leading to the establishment of "batch processing."
- Batch processing required users to program instructions on punch cards and submit them to technicians, with results received after processing.
- The system left little room for interactive learning or immediate feedback; turnaround on a single job could take hours or even days.
- In the 1960s, minicomputers were developed by companies like Digital Equipment Corporation, offering smaller, less expensive alternatives to mainframes.
- MIT's hackers embraced DEC's PDP-1 minicomputer, which was designed for scientific inquiry and mathematical formulation.
- Minicomputers were more compact, required less air-conditioning, and were easier to operate.
- The PDP-1 retailed for around $120,000, making computer time far more accessible than it had been on mainframes.
- MIT's hackers valued the ability to interact directly with the computer and developed time-sharing systems that allowed multiple users to access a single computer simultaneously.
- Time-sharing created the illusion that each user had the full attention and resources of the computer, shaping a mental model of what computing could be (see the scheduling sketch at the end of this section).
- Organizations like the People's Computer Company brought time-sharing to local communities, providing low-priced computing access.
- Few envisioned that the time-sharing model would be popularized through individual "personal" devices rather than networks of terminals and mainframes.
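A minimal sketch of the round-robin time-slicing idea behind time-sharing, as a gloss on the bullets above. The user names and slice lengths are invented for illustration; the point is that the machine rotates through pending jobs so quickly that each user experiences steady progress, i.e., the "illusion" of exclusive access.

```python
from collections import deque

def run_time_shared(jobs, slice_ms=50):
    """Round-robin scheduler: give each job a short time slice in turn.

    jobs: dict mapping a user's name to the total ms of work they need.
    Short slices mean every user sees continuous progress, creating the
    illusion that each has the whole machine to themselves.
    """
    queue = deque(jobs.items())
    clock = 0
    while queue:
        user, remaining = queue.popleft()
        worked = min(slice_ms, remaining)
        clock += worked
        print(f"t={clock:4d}ms  {user} runs for {worked}ms")
        if remaining > worked:
            queue.append((user, remaining - worked))  # back of the line

run_time_shared({"alice": 120, "bob": 80, "carol": 60})
```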
Moore's Law: The Rush to Miniaturize
- The miniaturization of circuitry onto silicon chips made individually owned "personal" computers possible.
- A computer is essentially a series of on/off switches that represent logical operations and numerical values.
- The size of a computer depends on the size of its switches, its processing power on how quickly they switch, and its price on how cheaply they can be manufactured (see the logic-gate sketch at the end of this section).
- Charles Babbage's nineteenth-century difference engine was purely mechanical; early electromechanical computers such as the Harvard Mark I used relay switches; ENIAC used vacuum tubes, which switched far faster but were bulky and prone to failure.
- The invention of the semiconductor-based transistor in 1947 set off the miniaturization of electronics.
- Transistors are smaller, faster, more durable, and more energy efficient than vacuum tubes.
- Integrated circuits (ICs), which put multiple transistors on a single chip, further streamlined electronics design.
- Jack Kilby of Texas Instruments invented the IC in 1958, and Robert Noyce of Fairchild Semiconductor refined the manufacturing process.
- Noyce's photolithographic process allowed complex circuits to be mass-produced.
- Gordon Moore observed in 1965 that the number of transistors on a silicon chip doubled every year.
- Shrinking transistor size translated directly into greater speed, since smaller switches can switch faster.
- Moore predicted that this pattern would continue indefinitely, leading to exponential growth in computer processing power, which became known as "Moore's Law."
- Moore later revised his figure to a doubling every two years, and an eighteen-month period later became the commonly cited number (see the worked formula at the end of this section).
- Hard-drive storage capacity has grown even faster than microprocessor speed, doubling every year since 1997.
- Laptop battery life has not kept pace with these advances.
- Moore's Law has become the centerpiece of technological determinism in computer culture.
- It suggests that technological change is driven by the intrinsic properties of semiconductors rather than social factors.
- Moore's Law is not a "force" or "law" like the Second Law of Thermodynamics but a result of labor and capital investment.
- It has become a self-fulfilling prophecy, with chipmakers and analysts setting goals and forecasts based on it.
- Because the doubling is exponential, each generation of electronic devices improves by a larger absolute margin than the one before.
- This helps explain why observers in every era perceive themselves to be at the center of unprecedented change.
- Moore's Law provides a level of predictability to technological change, allowing executives, engineers, and futurists to extrapolate computer processing power into the future.
- A knowledge of Moore's Law can differentiate those in the computer industry from those outside it.
- Increased processing power does not necessarily imply greater creativity, efficiency, or ease of use; much of each new increment is absorbed by "bloatware."
- Faster models are consistently sold to consumers, even if the practical uses of that speed are not immediately apparent.
- Moore's Law is a paradigm for planned obsolescence, justifying the purchase of new versions of the same machines every few years.
- Software upgrades often force users to buy new computers as well.
- Semiconductor companies continuously build faster chips with the expectation that a market for them will follow.
- Great breakthroughs often precede the need for them.
- Some observers question whether the obsession with Moore's Law has blinded the industry to other priorities.
- Industry leader Intel has begun to de-emphasize "clock speed" and highlight other improvements, such as lower power requirements.
- However, competition may prevent any chip company from halting the processor-speed race.
- The "if you build it, they will come" confidence embodied in Moore's Law has also opened opportunities for unanticipated uses.
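As a concrete gloss on the earlier bullet that a computer is essentially a series of on/off switches, here is a sketch of my own (not the chapter's) showing how one switch arrangement, NAND, composes into the other logic gates and then into a half-adder that adds two one-bit numbers.

```python
# Every value is a "switch": 0 (off) or 1 (on).
def NAND(a, b):            # a single universal switch arrangement
    return 0 if (a and b) else 1

# Every other gate can be wired from NAND alone.
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

def half_adder(a, b):
    """Add two one-bit numbers; returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} ->", half_adder(a, b))
```

Stack enough of these switch arrangements together and you get addition, memory, and ultimately a programmable computer; shrinking the switches shrinks the machine.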
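The doubling claim itself can be written as a simple exponential; the formulation below is standard rather than quoted from the chapter, and it shows how much the choice of doubling period matters over a decade.

```latex
% Transistor count under a fixed doubling period T:
N(t) = N_0 \cdot 2^{\,t/T}
% Over one decade (t = 120 months):
%   T = 12 months (Moore, 1965):   2^{10}      \approx 1000\times
%   T = 24 months (revised, 1975): 2^{5}       = 32\times
%   T = 18 months (common usage):  2^{120/18}  \approx 100\times
```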
The Microprocessor
- Semiconductor companies manufactured a wide array of chips for calculators, clocks, and other electronic devices in the 1960s.
- Chips were traditionally custom-designed for each product.
- Intel developed a flexible chip that could be programmed to perform various tasks, resulting in the first microprocessor, the Intel 4004, in 1971.
- The microprocessor's introduction was a critical development that made the personal computer possible.
- The Altair, the first device resembling a personal computer, appeared in 1975.
- Early engineers described the 4004 as a "microprogrammable computer on a chip."
- Intel management viewed the microprocessor as a more flexible version of existing chips rather than the core of an actual computer.
- Established computer companies could have developed personal computers using Intel's chips in the early 1970s.
- Engineers at Hewlett-Packard and Digital lobbied to build low-cost microcomputers.
- Digital's David Ahl organized a pilot program and produced two prototypes, but the project was scuttled due to skepticism from the sales department and CEO Ken Olsen.
- Executives found it difficult to imagine a tiny chip could be the core of a computer.
- Ted Hoff, the microprocessor's lead developer, likened a failed microprocessor to a burned-out light bulb: cheap enough to replace rather than repair.
- Design engineers couldn't make the connection between room-sized computers and a sliver of silicon.
The Altair: Playing with the Future
- The dawn of the personal computer age is often dated to the January 1975 issue of Popular Electronics, which featured the Altair 8800 minicomputer kit.
- The Altair was created by Ed Roberts, president of MITS, and was available via mail order for $397.
- Roberts founded MITS to build electronic instruments for model rocketry hobbyists and later moved into calculator kits.
- When large companies started producing low-cost calculators, Roberts developed the idea of a low-cost digital computer kit.
- The Altair, built around Intel's 8080 chip, was not the first hobbyist microcomputer; the Scelbi 8-H, based on Intel's earlier 8008, had been advertised months before.
- Jonathan Titus published instructions for building the Mark-8, another 8008-based machine, in Radio-Electronics in 1974.
- The Altair was the first machine to develop a strong user base and support the development of software and peripherals, initiating the personal computer industry.
- Ed Roberts hoped to sell 400 machines in total; MITS received that many orders in a single afternoon.
- Newsletters and magazines for Altair users began to be published, and companies developed software for the Altair, including Microsoft.
- Competitors like the IMSAI 8080 and Processor Technology's SOL emerged, culminating in the Apple II in 1977.
- The Altair lacked a keyboard, long-term storage, and a monitor; users entered programs byte by byte via front-panel switches and read results from binary lights (see the hand-assembly sketch at the end of this section).
- It had an "open bus" architecture, allowing for easy expansion with plug-in cards.
- The Altair was marketed as a state-of-the-art minicomputer at a fraction of the cost of commercial units.
- Possible applications listed included accounting systems, navigation computers, time-shared computers, and intrusion systems.
- The language used to describe the uses of a home computer was still emerging.
- Applications like word processors and spreadsheets had not yet been invented.
- The appeal of the personal computer involved a tension between practical uses and long-range hopes and fantasies.
- Building the Altair was an opportunity to develop technical skills.
- Hobbies exist on the boundaries of work and leisure, allowing for experimentation and self-improvement.
- Readers of Popular Electronics were self-identified "hobbyists."
- Ads for technical education courses were prevalent, including courses on computer skills.
- Building a computer was a way to join an elite fraction of the professional-managerial class.
- It provided an opportunity to explore and understand a powerful technology and gain a sense of control and mastery.
- Many hobbyists saw their access to computing as part of a larger movement for the democratization of technology.
- Advocates of "computer liberation" emerged, especially in California's Bay Area.
- The Homebrew Computer Club became a meeting place for Altair enthusiasts and pioneers of Silicon Valley.
- Technophilia and radical politics intersected in publications like The Whole Earth Catalog and CoEvolution Quarterly.
- These magazines promoted the "appropriate technology" movement, advocating democratizing, environmentally friendly tools.
- The personal computer was a logical extension of this left-wing technotopianism.
- Ted Nelson explored this vision in his 1974 manifesto, Computer Lib/Dream Machines.
- Nelson was depicted as the Tom Paine of the personal-computer revolution.
- The enemy was "Central Processing" in all its manifestations: commercial, philosophical, political, and socio-economic.
- The libertarian vision of democratized technology differed from the communitarian time-sharing model.
- The time-sharing model was based on a shared public resource; hobbyists embraced an individualist vision of personal computers.
- This privatized concept of computing made it possible to imagine the computer as a commodity.
- The commodification of computing fostered decentralization but also stratified access by income level.
- The French Minitel system, by contrast, treated the computer as an extension of the government-backed phone system, reaching a larger percentage of the population.
- Only recently have computers penetrated the majority of American households.
- The commodification of computing established a vision of computing as an atomized, "personal" task, different from the time-sharing model.
- Despite the availability of modems, "connectivity" was initially seen as a peripheral function rather than central to the PC.
- The rise of the internet in the mid-1990s shifted this model, developing into "interpersonal" computing.
- The American interpersonal model remains rooted in the assumption that each user is anchored to a personal computer.
- In many parts of the world, users access the internet through publicly available computers.
- Even cybercafes in the United States typically offer WiFi access and expect users to bring their own laptops.
- Building an Altair was a way to participate in the making of the future.
- The era of the computer in every home was described as having arrived, casting the builders of the first Altairs as pioneers.
- The Altair was reportedly named after the star the Enterprise was traveling toward in an episode of Star Trek.
- The fascination in hobbyist texts was for the technology in itself rather than its possible consequences.
- This fascination is akin to the "sense of wonder" described by science fiction critics and to the "American technological sublime."
- It is the utopian impulse at its most basic and unfocused: a vague sense that the world can be better and a deep hope for fundamental change.
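To make the earlier front-panel bullet concrete, here is a hypothetical sketch of how an Altair owner would hand-assemble a tiny program and toggle it in byte by byte. The two 8080 opcodes shown (MVI A = 0x3E, HLT = 0x76) are from Intel's published instruction set; the program itself is invented for illustration.

```python
# A tiny Intel 8080 program, hand-assembled the way an Altair owner would:
#   MVI A, 05h   ; load the value 5 into register A  -> bytes 3E 05
#   HLT          ; halt the processor                -> byte  76
program = [0x3E, 0x05, 0x76]

for address, byte in enumerate(program):
    toggles = format(byte, "08b")  # one front-panel switch per bit
    print(f"address {address}: set switches {toggles}, press DEPOSIT (0x{byte:02X})")
```

Each printed line corresponds to setting eight data switches up or down and depositing the byte into the next memory address; results came back the same way, as patterns of lights.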