A History of Modern Computing


MIT Press


Recommendation

Paul E. Ceruzzi, a curator at the National Air and Space Museum, describes the development of computing, starting with its earliest history. He examines the beginnings of commercial computing from 1945 to 1956 and traces the history of computer hardware and software, dividing these developments into five- to 10-year periods. His book emphasizes technical development rather than personalities or business dynamics, a focus that contributes to its fairly dry, academic style. With this caveat, BooksInShort recommends the book primarily to those with a technological bent, such as professionals in operations and computer science, and academics in the field. If you are interested in the subject, however, you’ll love this. Ceruzzi provides an informative and comprehensive saga, including extensive footnotes and a bibliography that runs about 80 pages.

Take-Aways

  • Commercial computing began to emerge from 1945 to 1956.
  • UNIVAC was the first computer to store data on tape.
  • IBM took an early commercial lead in 1952 when it introduced the 701, a stored-program computer described as an electronic data-processing machine.
  • The next big step in computer history was the development of the transistor, which replaced vacuum-tube technology by 1959.
  • FORTRAN and COBOL were early programming languages in the ’50s and ’60s.
  • In the 1960s, computers were large mainframes and IBM had 70% of the market.
  • Faster transistors, more complex instruction codes and the new integrated circuit chip made minicomputers possible.
  • In the 1970s, Digital Equipment Corporation and Scientific Data Systems developed the first personal computers.
  • Intel’s development of the microprocessor reduced the cost of the personal computer.
  • In the mid-1970s, electronic hobbyists and enthusiasts created microprocessor applications that led to the personal computer’s commercialization.
 

Summary

1945-1956: The Birth of Computing

Before 1945, the word “computer” referred to a person who solved equations; the electronic digital computer was invented to take over that work, and the term only began to refer to the machinery itself around 1945. At that point, the computer’s development was spurred by the climate of prosperity and the strong US consumer market that emerged after World War II. The Cold War against the Soviet Union and policy-makers’ concern with building military strength drove the push for greater computing power, funding the ENIAC and other military projects and weapons systems. The military and other government agencies were the first customers for commercial computers.

Commercial computing emerged from 1945 to 1956. The early developers included the Eckert-Mauchly Computer Corporation, led by J. Presper Eckert and John Mauchly and soon absorbed as a division of Remington Rand. During World War II, Eckert and Mauchly had built the ENIAC, an electronic calculator, at the University of Pennsylvania’s Moore School of Electrical Engineering. It was created to calculate US Army firing tables, which required repeatedly solving complex mathematical equations.

“The word ‘computer’ originally meant a person who solved equations; it was only around 1945 that the name was carried over to machinery.”

ENIAC led to the development, in 1951, of the UNIVAC – the “Universal Automatic Computer.” The UNIVAC, which used vacuum-tube circuits, was designed to solve calculating problems for scientists, engineers and businesses. It used tape – rather than punched cards, each bearing information about a single entity – to store both its data and its operating instructions. This dual-storage breakthrough became a basic feature of nearly every subsequent computer. The UNIVAC needed fewer vacuum tubes than the ENIAC, and it led to the development of programming, and later software, as something apart from the design of computer hardware.

In 1951, Eckert and Mauchly, now part of Remington Rand, turned their first UNIVAC over to the US Census Bureau, in hopes that the Bureau’s successful use of the machine would attract other customers and generate sales. The UNIVAC had many features that soon became commonplace in computers, such as alphanumeric as well as numeric processing, magnetic tapes for bulk memory, and circuits called “buffers” that allowed high-speed data transfer from the fast delay-line memory to the slower tape storage units.

“As the minicomputer established its markets in the mid-1960s, most computer dollars continued to be spent on large mainframes sold by IBM and a few competitors.”

Soon UNIVAC was purchased by a growing number of private corporations, which used it primarily for inventory, logistics and other data-processing applications. The machine’s ability to use tape instead of cards was a big draw, as was its ability to scan a reel of tape, find the correct record or set of records, process it in some way and return the results to the tape. This process was much more efficient than the very labor-intensive punched-card operations.

In 1952, IBM responded to the UNIVAC with its 701, a stored-program computer that IBM called an “electronic data-processing machine.” The machine was used primarily by the US Defense Department and by military aerospace firms, though soon business customers were buying it, too. Together, the UNIVAC and the IBM 701 launched the era of commercial stored-program computing.

1956 to 1969: The Computer Comes of Age

Advances in circuit technology launched the next big step in the history of the computer. By 1959, the transistor had become reliable and cheap enough to replace vacuum tubes as the basic circuit element in all processors, resulting in greater reliability, less maintenance and lower operating costs. The new machines also featured innovative, reliable, high-capacity memory units constructed from magnetic cores.

“The force that drove the minicomputer was an improvement in its basic circuits, which began with the integrated circuit (IC) in 1959. The IC, or chip, replaced transistors, resistors and other discrete circuits in the processing units of computers; it also replaced cores for the memory units.”

This core memory consisted of small, doughnut-shaped pieces of magnetic material through which several fine wires were threaded. Current in the wires could magnetize each core in either direction – one direction representing a binary zero, the other a one. Binary information was stored in this way. The transistor and high-capacity memory units improved performance and made commercial application of the computer cost-effective.
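The bit-per-core idea can be sketched in a few lines of Python. This is purely an illustrative model, not anything from the book; the class and method names are invented for the example. One real characteristic it mimics is that reading a core was destructive – sensing its magnetization reset it to zero, so every read had to end with a restore cycle.

```python
class CorePlane:
    """Toy model of a magnetic-core memory plane: one bit per core."""

    def __init__(self, size):
        # Each core starts magnetized in the "zero" direction.
        self.cores = [0] * size

    def write(self, address, bit):
        # Driving current through a core's wires sets its magnetization,
        # storing a binary 0 or 1.
        self.cores[address] = bit

    def read(self, address):
        # Sensing a core flipped it to zero (destructive read), so the
        # hardware immediately rewrote the value it had just read.
        bit = self.cores[address]
        self.cores[address] = 0   # destructive sense
        self.write(address, bit)  # restore cycle
        return bit
```

The restore cycle is why core memory timing was quoted as a full read-rewrite cycle rather than a bare access time.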

Now other companies besides IBM and Remington Rand began developing and selling large computers. Minneapolis-based Honeywell was first, with the 1957 debut of the Datamatic 1000. General Electric (GE), then the leading US electronics firm, brought out the ERMA (Electronic Recording Method of Accounting) computer, which allowed banks to automate check clearing. RCA entered the field with its BIZMAC computer for business and data-processing applications, and then with several transistorized computers.

“Nothing seems to be on the horizon that will seriously challenge the silicon chip. Silicon integrated circuits, encased in rectangular black plastic packages, are soldered onto circuit boards, which in turn are plugged into a set of wires called a bus: This physical structure has been a standard since the 1970s.”

By the end of 1960, about 6,000 of these general-purpose electronic computers were in use in the US, where IBM dominated the industry. Yet while IBM’s stock soared and turned many of its employees into millionaires, many people criticized IBM for not being an innovator. Instead, they claimed, IBM let other companies develop new technology and then stepped in with its superior in-house manufacturing, marketing and sales capabilities.

Software improved along with hardware. Software development emerged between 1944 and 1951, when Grace Murray Hopper helped Howard Aiken program the Harvard Mark I for the US Navy. At the time, instruction sequences were punched onto decks of cards as subroutines; programs that assembled such subroutines came to be called “compilers.” Then, in 1954, J.H. Laning and N. Zierler developed the first modern compiler-based programming system for the Whirlwind computer at MIT. It was based on an algebraic notation that the system translated into machine codes the computer could understand. Gradually, as these systems improved, they came to be known as programming languages. At the time, computers were used primarily for sorting data: typically, a program processed a company’s information, updated its files and issued a series of regular printed reports in formats that people could review.

“During the Second World War, Eckert and Mauchly designed and built the ENIAC at the University of Pennsylvania’s Moore School of Electrical Engineering. The ENIAC was an electronic calculator that inaugurated the era of digital computing in the United States.”

Early programming languages included FORTRAN, developed in 1957, and COBOL, which followed a few years later. FORTRAN’s syntax was close to ordinary algebra, while COBOL was designed to provide a common business language. COBOL succeeded because it was one of the first standardized languages, allowing the same program to run on computers from different vendors. Though other languages – such as JOVIAL and ALGOL – were developed at this time, they never matched the success of FORTRAN and COBOL.

Meanwhile, a growing number of organizations were using mainframe computers. By the mid-1960s, the US Internal Revenue Service had embraced the computer. The IRS had one of the most sophisticated and complex IBM mainframe systems. A few big newspapers used mainframe computers, as did many US government agencies. The NASA-Ames Research Center used computers for high-speed aerodynamic research and the manned space program.

“The concept of storing both instructions and data in a common storage unit would become basic features of the UNIVAC and nearly every computer that followed.”

At the same time, a new minicomputer architecture was being developed, based on faster transistors and more complex instruction codes. The breakthrough that permitted commercial minicomputers came at Control Data Corporation, founded in 1957, where Seymour Cray designed a new method of handling input and output for the company’s 1604 computer. The resulting CDC 160 used a short word length of 12 bits and accessed memory in new ways, allowing engineers to build smaller, cheaper computers.

With a price of about $60,000, the machine opened new markets. In 1962, pioneering customer Jack Scantlin of Scantlin Electronics developed a way to feed online New York Stock Exchange quotations to brokers nationwide.

In the late 1950s, the newly founded Digital Equipment Corporation (DEC) built a growing market for its transistorized computers. In 1965, DEC introduced the PDP-8, which went on to sell more than 50,000 units. That success began the minicomputer explosion. These computers used a series of compact modules that plugged into a hinged chassis to create a system consisting of a processor, control panel and core memory.

“The UNIVAC and the IBM 701 inaugurated the era of commercial stored-program computing.”

Though programming still required tremendous skill, the simplicity of the minicomputer’s architecture led to the rise of independent original equipment manufacturers (OEMs), who bought minicomputers, added specialized input and output hardware, wrote their own system software and sold the machines at a high markup under their own labels. These specialized systems served purposes ranging from medical instrumentation and small-business record keeping to industrial control.

1961 to 1977: The Silicon Chip and the PC

From the mid-1960s through the 1970s, most computer spending went to large mainframes sold by IBM, which held 70% of the market, and a few smaller competitors: Sperry Rand, Control Data, Honeywell, Philco, RCA, GE and NCR. Some called this group “Snow White and the Seven Dwarfs.”

IBM was known for its System/360 mainframes, launched in 1964. IBM also introduced new tapes, disks and other items that supported this system. Since the mainframe was generally too expensive for a single user, several users generally shared its computation cycles or time simultaneously. IBM supported users with such services as systems analysis and programming.

“Core memory refers to small, doughnut-shaped pieces of material through which several fine wires are threaded to store information. The wires passing through the core can magnetize it in either direction: this direction, which another wire passing through can sense, is defined as a binary zero or one.”

While IBM dominated mainframes, a growing minicomputer market emerged, based on the innovative integrated circuit, now known as the “chip.” Debate continues about whether Jack Kilby or Robert Noyce invented the chip – both applied for patents in 1959 – and the courts gave each a share of the credit. Generally, experts credit Noyce, whose design exploited the “planar process” that enabled great progress in integrated circuits.

The US aerospace industry helped to provide the first big market for the integrated circuit (IC). By the late 1960s, some 100 new companies or divisions of established companies made commercial minicomputers, including Digital Equipment Corporation (DEC) and Data General, which was founded by three DEC engineers.

“In 1957 IBM marketed a device that would prove to be one of its most lasting: the spinning disk for random-access storage of large quantities of data.”

Precursors of the personal computer appeared in the early 1970s. DEC’s entry was the TOPS-10 time-sharing system of 1972, while Southern California’s Scientific Data Systems (SDS) launched the SDS 940. For the moment, however, the costs were too high for the general public.

The biggest change came when Intel developed the microprocessor, a chip that could call a subroutine stored in memory, execute it and return to the main program, making it possible to carry out complex operations. Then, in the mid-1970s, electronics hobbyists and enthusiasts began working out applications that made microprocessor-based systems practical.

These developments led to the personal computer. H. Edward Roberts, head of MITS, designed the Altair around Intel’s 8080 microprocessor. In 1975, MITS offered the Altair as a kit for less than $400 – about one-tenth the price of a minicomputer.

Although the Altair had many functional limitations and was hard to assemble, hobbyists flocked to the machine over the next few years.

“Storing data (numeric as well as non-numeric) dominated early commercial computing, and as late as 1973 was estimated to occupy 25% of all computer time.”

Then, in 1977, new software developments led to the success of the Apple and other personal computers. Gary Kildall, a key software developer, had created the CP/M operating system for the Intel 8080. In 1977, he isolated its machine-specific code into a common module called the BIOS (Basic Input/Output System), which could be rewritten to adapt CP/M to any new computer or disk drive.

This pioneering phase of computer development ended in 1977. Kildall’s company, Digital Research, licensed CP/M to a growing number of manufacturers in exchange for royalties. New industry publications, software companies and support groups emerged to introduce novices to computing. The computer age had begun and, after 1977, growing giants – Microsoft and Apple – dominated the marketplace by building on the technological developments that lit the way.

About the Author

Paul E. Ceruzzi is a curator in the Department of Space History at the National Air and Space Museum. He is the author of Beyond the Limits: Flight Enters the Computer Age.