ChangeThis

The Computer’s Early History: A Lesson in the Genius We Take for Granted

Peter Spitz

August 07, 2024

Peter Spitz reflects on how quickly life-changing innovations become a regular part of everyday life, tracing the computer’s evolution from its earliest precursors to the present day.

As an inventor, I’ve always marveled at how quickly ground-breaking, life-changing innovations become a regular part of our lives. Tracing the origins of amenities we now take for granted is a compelling way to imagine what comes next. It’s also a path of discovery that’s full of surprises.

Consider the computer, for instance. It was only in the latter part of the 20th century that we started acquiring personal computers and began sending emails. Soon, we couldn’t do without them—or a printer, for a time. So much for the typewriter and letter writing. While it wouldn’t be long before the smartphone (a computer in itself) took over some tasks, the computer has become indispensable. Few can imagine modern life without it. But it didn’t simply appear; it emerged through a long series of steps most of us have forgotten, some of which go back more than 100 years.

THE ABACUS

First, let’s go back to the invention of the abacus in China some 4000 years ago: essentially an analog adding machine, it used beads on metal rods within a wooden rack to carry out arithmetic computations.

Then jump to the early 17th century, when Scottish mathematician, physicist, and astronomer John Napier discovered logarithms, developed decimal point notation, and invented an abacus-like device that used ivory strips marked with numerals to multiply and divide. Flash forward to 1673, when philosopher and mathematician Gottfried Wilhelm Leibniz invented a digital mechanical calculator.
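It’s worth pausing on why logarithms were such a practical breakthrough for calculation: they turn multiplication into addition, since log(a × b) = log a + log b. For example, log₁₀(100 × 1,000) = 2 + 3 = 5, so 100 × 1,000 = 10⁵. With a printed table of logarithms, a tedious multiplication reduces to two lookups and an addition.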

THE DIFFERENCE ENGINE

Then came Charles Babbage, an English mathematician and mechanical engineer, who constructed a model of what he called a Difference Engine in 1822 to compute astronomical and mathematical tables. We can consider this a first step toward the digital programmable computer. It used the decimal system, was powered by a crank handle, and was considered so important that the British government gave Babbage 1,700 pounds sterling to work on his project. Their interest was in a machine that could produce mathematical tables, a task that was slow and error-prone when done by hand.

But implementation proved a lot more difficult than expected: the metalworking techniques available at the time could not produce parts in the quantity and with the precision required. An 1830 design described a machine that would weigh 4 tons and contain 25,000 parts.
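The principle the engine mechanized, the method of finite differences, is worth a quick illustration: because the n-th differences of a degree-n polynomial are constant, an entire table of its values can be produced by repeated addition alone—exactly the kind of repetitive operation gears and a crank can perform. Here is a minimal sketch in Python (the function name and example polynomial are illustrative, not Babbage’s):

```python
def tabulate(initial_differences, steps):
    # initial_differences: [f(0), Δf(0), Δ²f(0), ...] -- the starting value
    # and its finite differences; for a degree-n polynomial the n-th
    # difference is constant, so this list fully determines the table.
    row = list(initial_differences)
    values = [row[0]]
    for _ in range(steps):
        # Each new table entry needs only additions: f += Δf, Δf += Δ²f, ...
        for i in range(len(row) - 1):
            row[i] += row[i + 1]
        values.append(row[0])
    return values

# f(x) = x² + 2x + 1: f(0) = 1, Δf(0) = f(1) - f(0) = 3, Δ²f = 2 (constant).
print(tabulate([1, 3, 2], 5))  # [1, 4, 9, 16, 25, 36]
```

In essence, each cycle of the engine performed one such cascade of additions, yielding the next table entry.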

THE ANALYTICAL ENGINE

In 1832, Babbage and the engineer Joseph Clement did produce a small working model of the Difference Engine that could handle six-digit numbers, extract the roots of quadratic equations, and tabulate logarithms.

Next, Babbage designed a more ambitious machine, the Analytical Engine, which was to be fed with punch cards, could be programmed to solve general mathematical problems, and could store data. Like the Difference Engine, it was never completed in his lifetime. But his work laid the foundation for modern computers. All the essential ideas that created the computer can be traced to his designs.

PROGRAMS AND PUNCH CARDS

There were others at work as well. In 1843, Ada Lovelace, an English mathematician, wrote what is now considered the first computer program while translating a paper on Babbage’s Analytical Engine from French into English. Then, in 1853, Per Georg Scheutz and his son designed the first printing calculator. Herman Hollerith developed a punch card system to help tabulate the 1890 U.S. census, and founded the Tabulating Machine Company.

By 1907, it had been joined by two more companies: the International Time Recording Company and the Computing Scale Company. Then, in 1911, Charles Ranlett Flint, one of the era’s prominent businessmen—and an expert in company mergers—brought the three together into a single holding company, the Computing-Tabulating-Recording Company. It was headquartered in Endicott, New York, while the individual companies continued to operate in Dayton, Ohio; Detroit, Michigan; and Toronto, Canada. In 1924, the company was renamed International Business Machines (IBM).

THE DIFFERENTIAL ANALYZER

It’s startling enough to think of IBM as tracing back to 1924. But that pre-World War II era also brought other remarkable developments that hastened the computer’s progress.

In 1931, at MIT, Vannevar Bush (who would later help direct the atomic bomb effort) invented the Differential Analyzer, a general-purpose mechanical analog computer. Five years later, British scientist Alan Turing described his theoretical machine, “able to compute anything that is computable”; during World War II, Turing went on to help build the machines that decrypted German intelligence messages (his story was made into a movie in 2014, The Imitation Game). Then, in 1939, David Packard and Bill Hewlett founded the Hewlett-Packard Company in Palo Alto, California—and the region would eventually become known as Silicon Valley.

ENIAC AND THE RISE OF IBM

What came next? In 1945, John Mauchly and J. Presper Eckert at the University of Pennsylvania designed and built the Electronic Numerical Integrator and Computer (ENIAC), the first automatic, general-purpose, electronic, decimal, digital computer. IBM came to the fore in the ensuing decades, becoming the leading manufacturer of punch-card tabulating systems, electric typewriters, and electro-mechanical calculators.

By the 1960s and 1970s, the IBM mainframe, exemplified by the System/360, was the dominant computer platform, with IBM producing 70 percent of the world’s computers. The computer and its software kept improving over those years, shifting to “second generation” machines that used transistors instead of vacuum tubes, moved beyond programming by punch cards, and ran early high-level languages such as COBOL and FORTRAN in place of assembly language.

APPLE IS BORN

The personal computer market was born in the mid-1970s. That’s when Steve Jobs and Steve Wozniak started to develop what they called the first user-friendly personal computer—and yes, their tiny startup was indeed founded in a garage. Apple was founded on April 1, 1976, and its computers soon offered an altogether different kind of desktop: sold pre-assembled (unlike the kit machines of the time) and, with the Apple II, featuring an integrated keyboard and expansion slots for attaching disk drives and other components.

Keep in mind that this was still some 50 years ago, and the journey had just begun. Like so many of today’s groundbreaking technologies, such as AI or hybrid cars, the emergence of the modern computer was considered life-changing—and some people wanted nothing to do with it. But as we know, that was then. And just as we take the computer for granted, someday we’ll take today’s brilliant inventions for granted as well.

About the Author

Peter Spitz was born in Austria and received his education in Europe and in the United States. After graduating with a Master’s degree in Chemical Engineering from MIT, he worked for Esso Engineering and Scientific Design Company, earning nine patents and a leadership position in Process Development.
