
Published: October 4, 2011, 04:55 AM

Computer industry history

While the U.S. computer industry began as a direct result of large-scale Department of Defense spending on electronic digital computing research during and shortly after World War II, and of the vision of a small number of engineers and entrepreneurs who commercialized that research, it was in large part facilitated and extended by the technological and marketing capabilities built in the American office machine industry over the previous six decades. The U.S. office machine trade consisted of manufacturers of accounting machines, TYPEWRITERs, cash registers, tabulators, and other devices used to record, store, process, and retrieve information. America’s relative shortage of labor compared to European countries, coupled with its embrace of labor-saving technology, gave the United States strong international leadership in the production and use of OFFICE MACHINES from the late 19th century forward.

In the 1880s, Herman Hollerith, an engineer who worked at the U.S. Patent Office, invented a punched-card tabulating machine, and in doing so gave birth to electromechanical information processing. A subsequent version of Hollerith’s machine demonstrated major efficiencies after it won a competition to be used on the largest information processing task of its time, the 1890 U.S. Census. Its success on this application led Hollerith to form the Tabulating Machine Company in 1896, a firm that produced punched-card tabulation machines for U.S. and other censuses, various government agencies, and a small number of large corporations. In 1911, Hollerith sold this successful company, which after several MERGERS became the Computing-Tabulating-Recording Company (C-T-R). In 1924, C-T-R’s management changed the firm’s name to INTERNATIONAL BUSINESS MACHINES (IBM) to reflect its broadening line of office machine products and its growing international installations (both sales and leases). IBM, the global leader in tabulating machines, and other firms that had led particular office machine segments since the late 19th century, such as Burroughs Adding Machine, Remington Typewriter, and National Cash Register (NCR), would all become significant contributors to the U.S. computer industry.

During World War II, the army’s Ballistic Research Laboratory (BRL) was limited to using analog computers (such as Vannevar Bush’s differential analyzer) and other mechanical calculating machines to aid human calculators in solving the thousands of equations necessary to produce ballistic firing tables. These machines and methods proved wholly inadequate in both speed and accuracy. John Mauchly and J. Presper Eckert, both researchers at the University of Pennsylvania’s Moore School of Electrical Engineering, proposed developing an electronic digital computer to meet the BRL’s calculating needs. On the strength of their proposal and some fortuitous connections, the army provided a $400,000 contract for a project, begun in 1943, to build the Electronic Numerical Integrator and Computer (ENIAC), the first fully operational general-purpose electronic digital computer. The machine, powered by some 18,000 vacuum tubes, was not completed until early 1946.

Later that year, Eckert and Mauchly established the Electronic Control Company, soon renamed the Eckert-Mauchly Computer Corporation, to design and sell digital computers. Almost simultaneously, engineers who had worked as cryptographers for the U.S. Navy during the war established Engineering Research Associates (ERA) for the same purpose. Over the next half decade both firms produced a small number of expensive mainframe computers for government departments and agencies and a few corporations. In the early 1950s Remington Rand (a successor to Remington Typewriter) acquired the two pioneering computer firms and became the first of the office machine companies to enter the U.S. computer industry. Burroughs and NCR soon followed, acquiring relatively small start-up computer firms during the mid-1950s.

Man prepares a UNIVAC computer, 1959 (LIBRARY OF CONGRESS)

IBM, as a result of its lease structure and steady revenue from punched cards, fared far better than other office machine firms during the Great Depression. At the end of World War II, IBM was by far the most profitable office machine maker in the world. The company specialized in what almost immediately became the key input-output technology for computers (punched cards and tabulators), had an unparalleled sales and service organization, possessed a large and varied customer base, and during the late 1940s began to make substantial investments in electronics research. These factors placed it in a strong position to thrive in the computer industry during the succeeding decade. In the early 1950s, it won the primary computer contract for the Department of Defense’s Semi-Automatic Ground Environment (SAGE) air defense system, a networked computer project coordinated by engineers from MIT’s Lincoln Laboratory. While Remington Rand was selling its million-dollar UNIVAC computer to a modest number of customers in the early 1950s, IBM was furthering its already strong capabilities in anticipation of producing computers that could sell or lease in large volume.

In the mid-1950s, IBM came out with several lower-priced computers that leased for between $3,000 and $15,000 per month. In 1959, the company announced its 1401 computer, a machine that would achieve more than 10,000 installations during the 1960s and establish IBM’s leadership in the computer industry. The IBM 1401, like a small number of other computers of its time, took advantage of transistor technology, which had been perfected in the decade following the transistor’s invention by scientists at Bell Laboratories in 1947. Further innovations in semiconductor technology led to the integrated circuit (IC), invented in the late 1950s and commercialized during the first half of the 1960s. On the strength of transistorized machines such as the 1401, domestic computer installations grew from 240 in 1955 to 11,700 in 1963. This growth continued, marking a transition from the almost exclusively scientific computing market of the early 1950s to the adoption of computers for many business purposes by the end of that decade and into the 1960s.

The IBM System/360 series, announced in 1964, was a watershed for the firm and the industry. It consisted of a series of compatible computers with varying processing power and prices. The System/360 solidified IBM’s industry leadership and carried the company to a peak of around 70 percent of the domestic industry by 1970. IBM’s primary competitors were a combination of leading office machine producers, a couple of late-1950s start-up firms, and two electronics giants. These competitors developed some successful machines but provided only a modest challenge to IBM. IBM and its chief competitors (Burroughs, National Cash Register, Remington Rand/Sperry-Rand, Control Data, Digital Equipment Corporation, GENERAL ELECTRIC, and RCA) were frequently referred to in the business press as “IBM and the Seven Dwarfs” to emphasize the leader’s dominance. The two electronics firms, General Electric and RCA, showed only a partial commitment to the computing business during the 1960s, lost money in this area, and withdrew from the field at the beginning of the 1970s.

As IBM solidified its dominance, a growing number of firms sought to imitate its computers, as RCA did with its Spectra 70 series. Two new computer firms, both formed in 1957, took a different path and initiated divergent new segments of the computer industry: minicomputing (Digital Equipment Corporation) and supercomputing (Control Data Corporation).

Some of Sperry-Rand’s leading engineers and managers departed to form Control Data Corporation (CDC), building computers of unprecedented power for the smaller but still substantial and growing scientific computing market. Though IBM continued to sell to scientific users, the firm chose to concentrate its resources on the business computing field rather than enter the supercomputer business.

CDC would dominate supercomputing in the 1960s but would be displaced from this area as its focus shifted increasingly to computer peripherals and service businesses in the 1970s. Early in the 1970s, its star engineer, Seymour Cray, who had designed the advanced circuitry of the firm’s supercomputers, left to form Cray Research. This company soon became the supercomputing leader. From the first supercomputer, the CDC 6600, to a wave of new machines by Cray Research in the 1970s and early 1980s, supercomputing expanded the possibilities for modeling the Department of Defense’s nuclear war scenarios, weather forecasting, and other areas requiring extensive processing power.

Kenneth Olsen and Harlan Anderson, both MIT Lincoln Laboratory engineers who had helped oversee IBM’s SAGE contract, formed the Digital Equipment Corporation (DEC). Olsen, DEC’s longtime leader, recognized an opportunity to use advanced circuitry to make small computers of modest processing power at a significantly lower cost than mainframe computers. Along with Silicon Valley, which was emerging as a semiconductor center, Route 128 near Boston had defined itself as a leading electronics development region. DEC and other minicomputer firms would add greatly to the Boston area’s reputation as a fundamental center for particular sectors of electronics and computing.

Early DEC computers sold for around $100,000, but advances in circuit technology during the mid-1960s and DEC’s innovative designs allowed the firm to produce and sell its PDP-8 minicomputer for a mere $18,000. The PDP-8 made computers affordable to many previously excluded organizations, selling more than 40,000 units during its product life. Its success led a number of firms to enter the minicomputing field, from established companies such as Hewlett-Packard and IBM to new entities such as Data General (formed by DEC PDP-8 designer Edson de Castro in the late 1960s). During the 1970s, there were more than 100 producers of minicomputers, most of them small firms or small divisions of larger companies. Minicomputing not only extended the use of computers into hospitals, smaller laboratories, and mid-size firms, it also created a class of users who identified with operating their own machines. In size, cost, power, and user community, the minicomputer more nearly matched the personal computer of the late 1970s and early 1980s than it did the mainframes of the past.

Computers are useless without the programming that allows them to perform various types of calculation and data processing tasks. In the second half of the 1950s, Sperry-Rand (the outgrowth of Remington Rand’s 1955 merger with the Sperry Corporation), IBM, and other firms and organizations began developing programming languages, such as Fortran and Cobol, that eased the arduous task of programming computers. Much programming in the first decade and a half of digital computing was done by software developers at the mainframe manufacturers or by the sophisticated organizations purchasing or leasing these machines. Early in the 1960s a number of programming or software service firms emerged to produce one-of-a-kind software and systems for clients’ computers. A shortage of programmers developed as the number of computer installations expanded. This shortage, along with bugs and cost overruns, led to a real but media-exaggerated “software crisis.” Software products, or standardized systems and applications sold to many users, emerged in the second half of the 1960s to help address high programming costs and the shortage of programmers. The software products industry gained great momentum when IBM unbundled most software (pricing and selling it separately from hardware) beginning in 1970. Unbundling facilitated the growth of software products firms such as Informatics, Applied Data Research, Cincom, and Cullinane, many of which were later acquired by Computer Associates or other contemporary software giants. IBM and other mainframe producers also developed and sold or licensed a large number of significant software products.

The semiconductor, which gave rise to minicomputing, became a fundamental industry that grew alongside the computer. Fairchild Semiconductor became a virtual training center, producing top engineers and executives for new semiconductor companies in Silicon Valley. Three former Fairchild engineers, Robert Noyce, Gordon Moore, and Andrew Grove, launched one such firm, Intel, in 1968. In 1971, Intel introduced the microprocessor, in effect “a computer on a chip.” The microprocessor, or microchip, set off a further wave of advances in computing power, miniaturization, and cost-effectiveness. Much of the microchip’s history has lent credence to Gordon Moore’s simple observation, Moore’s Law, popularly stated as the claim that computer processing power doubles roughly every 18 months relative to cost. Intel’s microprocessor gave rise to the personal computer kits of the mid-1970s, the emergence of new personal computer manufacturing start-ups (such as Apple Computer) in the second half of the 1970s, and IBM’s entrance into the field in 1981 with the IBM PC.
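Taken at face value, the popularized 18-month doubling figure cited above implies exponential growth; the following back-of-the-envelope calculation (a rough illustration under that assumption, not a formula from Moore’s original 1965 article) shows what such a pace means over a single decade:

\[ P(t) = P_0 \cdot 2^{t/1.5}, \qquad \frac{P(10)}{P_0} = 2^{10/1.5} \approx 2^{6.7} \approx 100, \]

that is, roughly a hundredfold improvement in processing power per unit cost every ten years.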

In 1980, IBM sought a software industry partner to supply a disk operating system (DOS) for its soon-to-be-released PC. The first company it called on, Digital Research, balked at the opportunity (apparently it was unwilling to sign a nondisclosure agreement), and IBM went with a 32-person outfit called Microsoft, led by cofounders William (Bill) Gates and Paul Allen. Microsoft went on to produce MS-DOS for the PC and has maintained a near-monopoly on IBM PC and PC-compatible operating systems ever since. Only recently has Microsoft faced competitive challenges from open-source systems such as Linux. IBM, while continuing to manufacture mainframes, minicomputers, and personal computers in the 1980s and 1990s, increasingly shifted its focus to software and services as revenue generators in their own right, not just as tools to sell hardware.

Soon after IBM came out with the PC, firms such as Compaq and Hewlett-Packard built compatible systems (or clones) that used Intel microprocessors and Microsoft operating systems (initially MS-DOS and later MS Windows, a Microsoft operating system that mirrored certain graphic elements of Apple Computer’s Macintosh operating system). Because these clones drew their processors and operating systems from the same two suppliers, the term Wintel (Windows and Intel) came into common usage. It signified both the powerful position of these two firms in the computer industry and the fact that IBM clone “manufacturers” were mere assemblers, marketers, and deliverers of commodity products (or “boxes”).

Independent producers of software applications and recreational software (particularly computer games) tended to design products for the PC platform first and the Apple platform second, if at all. This challenge was a major factor in Apple’s personal computer market share dipping from double digits, at the height of its early years, to low single digits over the past 10 years. More recently, with the growth of Advanced Micro Devices and several other microprocessor competitors of Intel, Microsoft alone has been the focus of consumer and government scrutiny with regard to its domination of markets and its potential anticompetitive practices. This scrutiny stemmed from Microsoft’s dominance of operating systems, certain popular applications (MS Word for word processing and MS Excel for spreadsheets), and INTERNET browsers (Internet Explorer), and, most importantly, from its apparent efforts to link these products together to lock in customers and lock out competitors.

As with the mainframe and minicomputer, U.S. government funding proved critical to cultivating the underlying technology for the personal computer and for the networking that would help transform this machine into a ubiquitous communication technology. In response to cold war concerns over the Soviet Union’s scientific and technological accomplishments of the late 1950s and early 1960s (particularly Sputnik in 1957), the U.S. Department of Defense (DoD) became all the more interested in advancing U.S. science and technology, including computing and computer networking. In 1962, the DoD established the Information Processing Techniques Office (IPTO) as part of its Advanced Research Projects Agency (ARPA, later renamed DARPA). Over the following seven years the IPTO funded a Boston engineering firm, Bolt Beranek and Newman (BBN), and some key academic researchers to build a major computer network, the ARPANET, to allow scientists to communicate by computer and to provide a redundant computer communications network for defense purposes. The ARPANET became linked with other computer networks in the 1970s and early 1980s to form a network of networks known as the Internet.

In the early 1990s, Tim Berners-Lee, a scientist working at the CERN laboratory in Switzerland (and later based at MIT), developed the hypertext markup language (HTML). HTML provided a structure for publishing and linking text and graphics files and resulted in the World Wide Web. By early in the 21st century, the majority of Americans accessed the World Wide Web on a weekly if not daily basis. Network equipment demand expanded quickly, leading to rapid growth for CISCO Systems, the leading manufacturer of routers, devices that direct network traffic. Back in the 1960s and 1970s the IPTO had also funded significant graphics work. That work, along with subsequent inventions and innovations at Xerox PARC and SRI, such as windows, icons, pull-down menus, and the computer mouse, greatly advanced the possibilities for computer graphics and the ease of use of personal computers and the World Wide Web. Soon after Berners-Lee’s invention, several Internet browsers were developed to facilitate access to the growing information on the Internet. These included MOSAIC, Netscape’s Navigator, and, slightly later, Microsoft’s Internet Explorer. Microsoft made Internet Explorer a standard feature of its Windows operating system, which led to the “net wars” with Netscape and to the U.S. Department of Justice’s antitrust suit against Microsoft for bundling products to eliminate competition. In 2001, Microsoft settled the federal suit and became subject to a consent decree but still faces litigation from other jurisdictions.

Personal computers shown in a shopping center (GETTY IMAGES)

The World Wide Web has not only transformed the way in which many people communicate (e-mail instead of letters) and spend their leisure time (interactive games and chat rooms), but has also brought about both real and perceived changes in how people engage in business. Many e-commerce firms began operations in the last few years of the 20th century, only to fall victim to the dot-com collapse that began in early 2000. The overvaluation of e-commerce firms had a precursor in the run-up of software services and products companies’ stock during the late 1960s. As with this earlier high-technology stock market bubble and bust, a small number of industry leaders survived and thrived based on superior capabilities, first-mover advantages, established customer relationships, and a host of other factors. Today companies with dominant positions, such as eBay in electronic auctions, Google in commercial search engines, and Dell in personal computers (benefiting from its unparalleled supply management and service), have demonstrated that the financial and inventory excesses of an industry do not hit world-class innovators and executors to the same degree as other firms. Part of this trend toward innovation and efficiency is achieving excellence in using global resources to best serve a global marketplace. As in computer hardware manufacturing and many other manufacturing areas, a significant number of jobs have been sent overseas. In recent years, global outsourcing of programming and IT services has become increasingly common among U.S.-based information technology (IT) firms.

See also WATSON, THOMAS J.

Further reading

  • Campbell-Kelly, Martin. From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry. Cambridge, Mass.: MIT Press, 2003. 
  • Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996. 
  • Chandler, Alfred Dupont, Takashi Hikino, and Andrew Von Nordenflycht. Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Industries. New York: Free Press, 2001. 
  • Hiltzik, Michael A. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperBusiness, 1999. 

Jeffrey R. Yost

