The History of Personal Computing



Podcast Transcript

When computers were first created, they were enormous.

They would often take up the better part of a building, and they consumed large amounts of energy. 

Despite the size of these early computers, some people saw a future where computers would shrink down small enough that they could fit inside a person’s home. 

Some thought that idea was ridiculous. Not only did that prediction come true, but it changed everything. 

Learn more about the history of personal computing and how it developed on this episode of Everything Everywhere Daily. 


In hindsight, given the world we live in today, it is hard to believe that there was a time when the idea of a personal computer was considered ludicrous. 

Before the advent of personal computers, computing was the exclusive domain of governments, universities, and large corporations. The earliest electronic computers, such as ENIAC in 1945 and UNIVAC in 1951, were massive machines that filled entire rooms, required teams of specialists to operate, and cost millions of dollars. 

These mainframe computers were primarily used for complex scientific calculations, processing census data, and military applications. The concept of an individual owning a computer was absurd during this era.

Yet, even at this early stage, some people with foresight saw a future where computing devices were personal. 

One such person was Vannevar Bush. Bush was an American engineer, inventor, and science administrator who was the head of the U.S. Office of Scientific Research and Development during World War II.

He laid out a vision of a personal knowledge machine in 1945 with a device he called a “Memex,” a desk-sized device for filing, linking, and retrieving one’s own documents using associative “trails.” He did not describe a microprocessor or a GUI, but he nailed the idea of a privately owned interactive information appliance that augments an individual’s memory and creativity.

The real transition toward smaller, more accessible computers began in the 1960s with the development of minicomputers. Companies like Digital Equipment Corporation pioneered this movement with machines like the PDP-8, which was roughly the size of a refrigerator rather than a room. While still expensive and requiring technical expertise, minicomputers represented a significant step toward democratizing computing power.

During this time, Joseph Carl Robnett Licklider shifted the focus from information storage to active partnership. In his groundbreaking 1960 paper “Man-Computer Symbiosis” and later at ARPA, he argued that interactive computing would be personal, conversational, and networked. His vision of an “intergalactic network” implied many people at many small consoles, not a few mainframes serving batch jobs.

Douglas Engelbart turned this into a working preview. His 1962 program to “augment human intellect” and the 1968 “Mother of All Demos,” which you may recall from a previous episode, showcased a single user interacting with a high-resolution display, featuring a mouse, windows, real-time editing, hypertext links, and video collaboration. 

He demonstrated a personal workstation decades before the concept was adopted by the market at scale.

The true foundation of personal computing emerged in 1971 when Intel developed the Intel 4004, the first commercially available microprocessor. This single chip contained the processing power that previously required entire circuit boards. 

The 4004 was followed by more powerful processors, including the Intel 8008 and, crucially, the Intel 8080 in 1974, which became the heart of many early personal computers.

The microprocessor represented a paradigm shift because it made computing power both affordable and compact enough to be accessible to individuals and small organizations. This technological breakthrough set the stage for the personal computer revolution that would follow.

The first true personal computer is often considered to be the Altair 8800, introduced by Micro Instrumentation and Telemetry Systems, or MITS, in January 1975. Featured on the cover of Popular Electronics magazine, the Altair captured the imagination of electronics hobbyists despite its limitations. 

The Altair 8800 used an Intel 8080 CPU running at about 2 MHz, with a base memory of 256 bytes expandable up to 64 KB. 

It had no keyboard, no screen, and came as a kit requiring assembly. Users programmed it by flipping switches on the front panel and received output through blinking lights. Despite these constraints, the Altair 8800 sold thousands of units and demonstrated that there was genuine demand for personal computers.

The Altair’s success attracted the attention of two young programmers, Bill Gates and Paul Allen, who developed a version of the BASIC programming language for the machine. This marked the beginning of their company, whose name combined “microcomputer” and “software”: Microsoft. 

Their work on Altair BASIC demonstrated the critical importance of software, not just hardware, in making personal computers practical and accessible to non-technical users.

Around the same time, the Homebrew Computer Club formed in Silicon Valley, bringing together enthusiasts, engineers, and entrepreneurs who shared ideas and innovations. 

This informal gathering became a breeding ground for innovation in personal computing. Among its members were Steve Wozniak and Steve Jobs.

Steve Wozniak, while working at Hewlett-Packard, designed the Apple I computer in 1976, primarily for his own use and to impress fellow members of the Homebrew Computer Club. 

His friend Steve Jobs recognized the commercial potential and convinced Wozniak to start a company. The Apple I was sold as a fully assembled circuit board, though users still needed to provide their own case, power supply, keyboard, and display. The duo sold about 200 units from Jobs’s parents’ garage, establishing the Apple Computer Company.

The real breakthrough came with the Apple II. Introduced in April 1977, the Apple II was revolutionary because it was a complete, ready-to-use system with a plastic case, integrated keyboard, color graphics capability, and expansion slots for additional functionality. 

Wozniak’s engineering created a machine that was both powerful and user-friendly. The addition of VisiCalc, the first spreadsheet program, in 1979 transformed the Apple II from a hobbyist’s toy into a serious business tool. 

The Apple II became an enormous success, selling millions of units and establishing Apple as a major player in the industry.

The year 1977 also saw the introduction of two other significant personal computers that would define the first generation of mainstream machines. 

Besides the Apple II, the Commodore PET, or Personal Electronic Transactor, and the Tandy/Radio Shack TRS-80 both debuted. The Commodore PET was equipped with an integrated monitor, keyboard, and cassette tape drive in a distinctive all-in-one design. 

The TRS-80, sold through Radio Shack’s extensive retail network, became popular due to its relatively low price and widespread availability. These three machines, often referred to as the “1977 trinity,” brought personal computing to a broader audience and established the market’s viability.

It was Commodore that made the first personal computer I ever remember seeing, the VIC-20, in 1981. 

The VIC-20 sold for about $300, and my friend Tim had one in grade school. The computer was built into the keyboard and hooked up to a television. We purchased computer magazines that printed the code for games like Lode Runner. To play a game, we had to copy the code from the magazine without making a single error.

We could then save it on a cassette tape. 

The landscape of personal computing transformed dramatically when IBM entered the market in August 1981 with the IBM Personal Computer, commonly known as the IBM PC. 

IBM, the dominant force in mainframe computing, had initially dismissed personal computers as insignificant. However, recognizing the growing market, IBM assembled a team to quickly develop a personal computer. In an unprecedented move for IBM, the team decided to use off-the-shelf components and an open architecture, making design specifications publicly available.

IBM chose Intel’s 8088 processor for the CPU and, crucially, licensed an operating system from the small, previously mentioned company, Microsoft. Microsoft purchased an existing operating system called QDOS, which stood for Quick and Dirty Operating System, modified it, and licensed it to IBM as PC-DOS while retaining the rights to sell its own version, MS-DOS, to other manufacturers. 

This decision would prove to be one of the most consequential business moves in the history of computing.

The IBM PC’s open architecture meant that other manufacturers could legally create compatible machines, leading to the proliferation of “IBM PC clones” or “IBM compatibles.” 

Companies like Compaq, which reverse-engineered IBM’s BIOS to create fully compatible machines, led this charge. The standardization around the IBM PC architecture and MS-DOS created a massive software ecosystem, as developers could write programs that would run on any compatible machine. This network effect made the IBM PC platform increasingly dominant.

Commodore also introduced the Commodore 64 in 1982, which became the best-selling single computer model of all time, with estimates of 12 to 17 million units sold. The Commodore 64 was powerful, relatively inexpensive, and became especially popular for gaming and home use.

As IBM-compatible PCs proliferated, Microsoft’s MS-DOS became the standard operating system for personal computers. Microsoft maintained control over the operating system while hardware manufacturers competed on price and features, creating a highly profitable situation for the software company. Bill Gates’s vision of “a computer on every desk and in every home, running Microsoft software” was becoming a reality.

While the IBM PC was achieving market dominance through standardization, Apple was pursuing innovation in user interface design. In 1983, Apple introduced the Lisa, the first commercial personal computer with a graphical user interface featuring windows, icons, menus, and a mouse. 

It was the closest implementation to date of Douglas Engelbart’s vision in the “Mother of All Demos.”

The Lisa was revolutionary but failed commercially due to its high price of nearly $10,000 and slow performance.

However, Steve Jobs was simultaneously overseeing another project that would succeed where the Lisa failed. The Macintosh, introduced in January 1984 with a famous Super Bowl commercial directed by Ridley Scott, brought the graphical user interface to a broader market at a more affordable price. 

The Macintosh featured a mouse, a bitmap display, and an intuitive interface that made it far easier to use than DOS-based PCs. The Mac popularized concepts like drag-and-drop, point-and-click, and visual metaphors like the desktop and trash can.

In 1985, Microsoft introduced Windows 1.0, a graphical user interface that ran on top of MS-DOS. While initially crude and not particularly successful, Windows represented Microsoft’s answer to the graphical interfaces being pioneered by Apple. 

Subsequent versions, particularly Windows 3.0 in 1990 and Windows 3.1 in 1992, gained significant market adoption. However, it was Windows 95, released in August 1995, that truly revolutionized personal computing with its user-friendly interface, plug-and-play hardware support, and integration of DOS and Windows into a single product. 

Windows 95’s launch was a cultural phenomenon, with extensive marketing and millions of copies sold.

The launch of Windows 95, which I remember vividly, having been invited by Microsoft to one of their launch events, marked a transition in personal computing. 

With it, graphical user interfaces became the norm, and personal computers became even more mainstream, especially with the rising popularity of the internet. 

1995 was obviously not the end of personal computing, but I’m going to save the rest for a future possible episode. 

Notably, two of the earliest personal computer companies, both founded by pioneering enthusiasts, remain operational today and are among the largest companies in the world. 

The combined market capitalization of Apple and Microsoft is, as of the time I am recording this, $7.73 trillion.

Not bad for a couple of companies started in a garage. 

The original personal computers from the 1970s and early 1980s are now collector’s items. Original Altair 8800s sell for thousands of dollars, and an original Apple I recently sold for $475,000 at auction. 

While computers did a great deal of important work in the 50s through the 70s, it wasn’t until computers became personal that they truly revolutionized society.