Has The Innovators by Walter Isaacson been sitting on your reading list? Pick up the key ideas in the book with this quick summary.
Pop culture portrays genius as the domain of “lone wolves” who make great discoveries by shutting out the world and immersing themselves in theories and wild experiments.
Romantic as it is, this myth bears little resemblance to how innovation actually happens. Innovation is a child of collaboration. Even the most introverted innovators were encouraged and nurtured by a circle of friends and creative minds who helped them toward the discoveries that ensured their lasting legacies.
Indeed, without such support, those we hail as geniuses today might have instead been just a footnote in the history of innovation.
Whether in hacker clubs, corporate meetings, government think tanks or simple friendships, technology’s top talents more often than not made their groundbreaking discoveries through collaboration.
In this summary of The Innovators by Walter Isaacson, you’ll also discover:
- how pot-smoking nerds made some of the greatest breakthroughs in video game history;
- how refusing to patent the World Wide Web made it what it is today; and
- why we owe much of modern computation to one woman’s love for math and poetry.
The Innovators Key Idea #1: Ada Lovelace’s “poetic” mathematics provided an early vision of the role of modern computers.
It all started with Ada Lovelace (1815-1852), daughter of English poet Lord Byron. Although Byron was not involved in her upbringing, Lovelace nonetheless inherited his fiery artistic temperament.
At her mother’s behest, Lovelace began rigorously studying mathematics to discipline her rebellious mind. Over the course of her studies, she developed a passion for technology and machines, which, combined with her wild imagination, resulted in a uniquely “poetic” approach to mathematics.
At just 17, she began attending the weekly salons of science and math wizard Charles Babbage. These salons were a wonder, with lectures, mechanical dolls, telescopes trained on the stars and fascinating demonstrations of electrical and magnetic contrivances.
The centerpiece of these events, however, was Babbage’s Difference Engine, a large contraption that could make mechanical calculations.
Seeing Babbage’s work inspired her; in her now famous Notes, she would later set out ideas that combined her vast mathematical knowledge with her poetic imagination.
In 1834, Babbage took his ideas a step further with his Analytical Engine, a machine that could not only perform a single type of operation but also switch between operations – and even tell itself to do so.
Between 1842 and 1843, Lovelace translated from the French a transcript of Babbage’s presentation on his engine, to which she added her own copious and groundbreaking notes.
These notes – more than twice as long as the original article and, in the end, far more influential – described “computers” as devices that could process music, patterns and poetry.
Lovelace’s ideas were essentially a prophetic vision of computer functionality, far beyond the simple calculations performed by Babbage’s Analytical Engine.
Lovelace also pioneered computer programming by explaining how the Analytical Engine could be programmed with punch cards, thus greatly increasing its versatility and transforming it from a specialized contrivance into a general-purpose machine.
The Innovators Key Idea #2: Modern computing didn’t spring from a single mind, but was the result of many inspirations.
It took nearly 100 years following Babbage’s vision of a computer before technological advancements made it possible to build one.
In 1937, four key elements emerged to define what a modern computer would be. Its core would be built from electronic components; advances in circuits and switches meant it would be digital rather than analog; it would run on binary language (0s and 1s); and it would be a general-purpose machine, able to handle a range of different tasks.
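To make the “binary language” point concrete, here is a minimal sketch (my illustration, not anything from the book) of how any whole number a digital computer handles can be written purely in 0s and 1s:

```python
# Illustration only: any whole number can be expressed in binary (base 2),
# which is all a digital computer's circuits ultimately work with.

def to_binary(n: int) -> str:
    """Return the base-2 representation of a non-negative integer."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2                  # move on to the next power of two
    return "".join(reversed(bits))

print(to_binary(1945))  # '11110011001'
print(bin(1945))        # Python's built-in check: '0b11110011001'
```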
In November 1945, inventors J. Presper Eckert and John Mauchly unveiled their ENIAC (Electronic Numerical Integrator and Computer), the first computer to incorporate all these elements.
Unlike its precursors, the ENIAC was completely electronic as well as fast and powerful, performing up to 5,000 addition and subtraction computations in a second. The ENIAC’s functionality and composition thus proved to be the basis for all modern computing.
Yet legal disputes plagued the inventors’ attempts to patent their breakthrough – proving in the end how truly collaborative the invention of the modern computer really was.
Eckert and Mauchly were first granted a patent for the ENIAC in 1964. Yet technology company Honeywell challenged the patent, arguing that the concepts behind the ENIAC weren’t original to the inventing team.
During court proceedings, it came to light that Mauchly had visited physicist John Vincent Atanasoff in 1941 and had examined a computer Atanasoff had built. The judge determined that Eckert and Mauchly’s work was derived from Atanasoff’s and ruled the ENIAC patent invalid.
Mauchly and Eckert nevertheless deserve much of the credit for inventing the modern computer, mainly because of their ability to draw and integrate ideas from multiple sources.
This history illustrates how such complex ideas and inventions rarely come from the mind of just one individual. Instead, paradigm-shifting inventions are the product of a collaborative brainstorm.
The Innovators Key Idea #3: A computer’s software, or programming, was a key invention in creating multi-use machines.
You probably know at least one person who works as a computer programmer. But do you really know what it is that a programmer does?
In essence, programming is the act of storing a sequence of instructions inside a machine’s electronic memory.
A true computer, like the one envisioned by Ada Lovelace, should be able to perform any logical operation. To do so, we would need a machine that isn’t limited by its hardware, or its physical components, but controlled by its software, the instructions that tell it how to manage a computation.
British mathematician and philosopher Alan Turing laid out the concept of programming in 1948, writing that, rather than having many specialized machines doing different jobs, it would be better to have a single machine “programmed” to carry out all needed operations.
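As a rough illustration of Turing’s point (a toy sketch of mine, not anything from the book or from Turing), the snippet below treats a “program” as plain data: the same general-purpose machine does completely different jobs depending on which list of instructions it is given.

```python
# A toy "stored-program" machine: the program is just data in memory,
# so swapping the instructions changes what the machine does --
# no new hardware required.

def run(program, value):
    """Execute a list of (operation, operand) instructions on a starting value."""
    for op, operand in program:
        if op == "add":
            value += operand
        elif op == "mul":
            value *= operand
        elif op == "sub":
            value -= operand
        else:
            raise ValueError(f"unknown instruction: {op}")
    return value

# Two different "programs" for the same machine.
doubler = [("mul", 2)]
triple_then_add_one = [("mul", 3), ("add", 1)]

print(run(doubler, 21))              # 42
print(run(triple_then_add_one, 13))  # 40
```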
Interestingly, during World War II women played a critical role in the development of programming, led by programming pioneer and naval officer Grace Hopper.
Because she had been a math professor, when Hopper began working on the U.S. Navy’s digital computer, the Mark I, she was assigned to write what would become the world’s first computer programming manual.
Hopper approached programming in a methodical and collaborative fashion. She would give the computer precise instructions, while also involving her team in perfecting chunks of programming code for specific tasks.
By 1945, Hopper had transformed the Mark I into the most easily programmable large computer.
Hopper was not the only woman to have a huge impact on modern computation. In fact, women were typically at the forefront of the programming revolution.
As early programming was a highly repetitive task, consisting of switching cables and resetting switches, these seemingly menial jobs were often relegated to women. Yet it soon became apparent that a computer’s programming was just as important as its hardware.
The Innovators Key Idea #4: It took a trio of creative minds to create the first transistor, ushering in a new era of computing.
The invention of computers didn’t immediately spark the Digital Revolution. The first computers were enormous and costly, so widespread use was, at least at the time, out of the question.
In fact, the birth of our digital age didn’t happen until the advent of transistors, tiny semiconductors that allow highly complex programs to run on small devices.
The transistor was as important to the Digital Revolution as the steam engine was to the Industrial Revolution. Transistors made computers ubiquitous, allowing us to put serious processing power inside smaller computers, calculators and music players.
These revolutionary devices were made possible by a combination of diverse talents that all intersected at Bell Labs, based in New Jersey.
As a company, Bell Labs had a unique culture centered around sharing ideas. This collaborative environment allowed great innovations, as talented minds from various fields were brought together to exchange ideas and inspirations.
In 1939, physicist William Shockley at Bell Labs conceived of the idea of using semiconductors in place of cumbersome vacuum tubes, which until that point had been the standard building blocks of electronic circuits.
Shockley then gathered together a research team, including great minds such as Bell Labs colleagues John Bardeen and Walter Brattain, to help realize his vision.
Finally, on December 16, 1947 – after two years of collaborative experimentation and theorizing – Bardeen and Brattain managed to cram all of a semiconductor’s component parts into a smaller space, thus creating the first transistor.
For their efforts, the trio was awarded the Nobel Prize in 1956.
The Innovators Key Idea #5: Two different engineers came up with the idea for the microchip almost simultaneously.
With the tenth anniversary of the transistor came a new and growing problem: the tyranny of numbers.
One of the great advancements made by the transistor was that it allowed for more advanced circuitry. However, as the number of components in a circuit increased, the number of connections between them increased even more – and since circuitry was often crafted by hand, creating all these connections was nearly impossible.
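A rough back-of-the-envelope calculation (mine, not the book’s) shows why this became unmanageable: if any component might need a wire to any other, the number of possible connections grows roughly with the square of the number of components.

```python
# Rough illustration of the "tyranny of numbers": the number of possible
# pairwise connections grows far faster than the number of components.

def possible_connections(components: int) -> int:
    """Distinct pairs among n components: n * (n - 1) / 2."""
    return components * (components - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} components -> {possible_connections(n):>10,} possible connections")
# 10 components allow 45 connections; 10,000 components allow almost
# 50 million -- far too many to wire up by hand.
```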
The invention of the microchip solved this problem. Yet interestingly, two scientists independently came up with the concept at almost the same time.
In the summer of 1958, Jack Kilby at Texas Instruments was working on a project to build smaller electrical circuits when he had the idea of manufacturing all of a circuit’s component parts out of the same piece of semiconductor material, rather than assembling separate parts.
From this idea, the microchip was born – an achievement for which Kilby was awarded the Nobel Prize.
Only a few months after Kilby’s groundbreaking invention, Robert Noyce, co-founder of Fairchild Semiconductor and Intel, discovered that he could use printed copper lines to connect two or more transistors on the same piece of silicon. His more elegant design became the model for all future microchips.
Both discoveries meant that the process of building and connecting circuits could be automated, thus eliminating the bottleneck of hand-crafted circuitry and ending the tyranny of numbers.
Engineer Ted Hoff further expanded computational possibilities when he found a better way to design microchips with varying functions. In 1971, he created a general-purpose chip called a microprocessor, which could be programmed to perform a wide variety of applications.
Today, microprocessors are found in all kinds of smart devices, from coffeemakers to personal computers.
The Innovators Key Idea #6: Hippies and hackers in the 1960s and 1970s combined visions to create the personal computer.
Early visionaries imagined a “personal computer” as early as 1945. But it would take another three decades before computers would become a mass product rather than a tool for researchers.
The personal computer revolution began in earnest in the mid-1970s, when tech whizzes and counterculture entrepreneurs came together and got serious about their computer tinkering.
Yet the seed of experimentation was planted in the 1960s, when a potent mix of hippies and hackers in the San Francisco Bay Area began exploring the burgeoning world of technology. For these early hackers, “the hands-on imperative” ruled the day: to understand a thing, you need to take it apart with your own hands and then use that knowledge to create new, better things.
Steve Jobs combined these two worlds: he was a visionary counterculture enthusiast as well as a skilled hacker, and together with Steve Wozniak, went from building prank devices that could hack free long-distance calls to founding Apple Computer.
Jobs and Wozniak both attended Homebrew Computer Club meetings, a hobbyist group where “tech-nerds” could meet and exchange ideas and where the philosophies of counterculture and technological enthusiasm were a perfect match.
It was at the Homebrew Computer Club that the pair got their first glimpse of the Altair 8800.
Invented by Ed Roberts, the Altair 8800 was the first real working personal computer for home consumers. Roberts was neither a computer scientist nor a hacker, but a passionate hobbyist.
Using the new Intel 8080 microprocessor, he created a computer that, despite its large size, scant memory and lack of a keyboard or any other input device, any hobbyist could build and own.
When the Altair 8800 was featured on the cover of the January 1975 issue of Popular Electronics, people went crazy. The electronics company MITS, which produced the Altair 8800, was overwhelmed with orders for the computer kit, which cost $397.
You’ve seen how collaboration paved the way for modern computing. Next, you’ll discover how collaboration changed the ways in which we use computers.
The Innovators Key Idea #7: From tool to toy: the collaborative culture of video games helped mold the personal computer.
As devices became smaller and more powerful, the perception of what computers could and should do also changed. Computers were no longer just tools for work; they could also be used for play.
It was this shift in utility that helped pave the way for yet another revolution: video games.
Early video games actually predate the first personal computer, and in fact helped bring about some of its primary features. Thinking in terms of games helped developers cultivate the idea that computers should have an intuitive interface and enticing graphic displays; in short, be personal and interactive.
For instance, in 1962 the creation of the video game Spacewar gave people the opportunity to handle a computer and make it respond to commands in real time. The simple game was free and based on open-source software, and today stands as a testament to the power of collaborative effort.
Spearheaded by computer scientist Steve Russell, the creation of Spacewar was actually the product of a geeky student organization at Massachusetts Institute of Technology (MIT) called the Tech Model Railroad Club. This group would pioneer the “hands-on” hacker culture that would later become so important to innovation in our digital age.
Although video games influenced the design of personal computers, it was innovator Nolan Bushnell who turned video games into a real industry.
A huge fan of Spacewar, Bushnell invented a console for the game which he named Computer Space. The console sold 1,500 units and immediately acquired a cult following. Bushnell went on to found video game company Atari, where he created the simple, successful and iconic game Pong.
Atari was notorious for its pot-smoking parties and beer bashes, and represented the zeitgeist that defined Silicon Valley: admire nonconformity, question authority and nurture creativity.
The Innovators Key Idea #8: The constant competition between software producers helps software in general improve.
In 1974, Paul Allen and Bill Gates gazed at a magazine cover that featured the world’s first personal computer, and got scared. The idea of being left behind in the early computer revolution was a terrible thought for two computer fanatics.
Allen and Gates thus spent the next eight weeks in a code-writing frenzy. Their work would eventually become some of the staple software for personal computers.
Gates and Allen had always been fascinated by software code, but far less so by hardware. They envisioned a market where hardware was simply a set of interchangeable pieces and application software and operating systems would be a computer’s true selling point.
With this vision, they set out to write software that would enable hobbyists to create their very own programs on Altair computers, and in doing so, launched an industry for personal computer software.
By the 1990s, there were many competing models for software development, and the competition encouraged the constant improvement of each model.
Since then, three distinct approaches have emerged. The Microsoft approach advocates an operating system that is separate from the hardware it runs on; the Apple approach bundles hardware and software into a tightly closed system; and the Linux approach embraces free, open-source software.
Each of these approaches has its own distinct advantages. Apple’s makes for cohesive design and a seamless user experience; Microsoft’s gives users more choice and more options for customization; and the open-source approach lets any user modify the software as he or she sees fit.
So far, no single approach has dominated the industry. And importantly, with the continued existence of each of these models comes competition that inspires each to continually improve.
The Innovators Key Idea #9: The internet was built through a partnership of the military, universities and private corporations.
An innovation often bears the stamp of the environment that created it. This is particularly true for how the internet came to be.
The internet was created as a purely collaborative effort, in a partnership of the U.S. military, universities and private corporations.
It was professor Vannevar Bush, however, who brought these groups together. As dean of the MIT School of Engineering, a co-founder of the electronics company Raytheon and America’s top military science administrator during World War II, Bush had the unique insight and experience needed to inspire this collaborative effort.
In 1945, Bush published a report urging the government to fund basic research in partnership with universities and industry to ensure the military and economic security of the United States.
Following this report, the U.S. Congress established the National Science Foundation, an organization that brings together experts from diverse fields. This in turn helped spark the technological revolution that shaped much of the twentieth century.
Within this framework, academics and computer experts were able to share ideas freely. The most pivotal of these came from psychologist and technologist J.C.R. Licklider, a pioneer whose ideas became some of the internet’s foundational concepts.
Licklider envisioned decentralized networks that would enable information to flow to and from anywhere, as well as the interfaces that would support human-machine interaction in real time.
But it was Licklider’s colleague, Bob Taylor, who envisioned a network that could permit research centers to share computing resources and collaborate on projects. And it was yet another colleague, Larry Roberts, who helped build this network.
Thus ARPANET, launched in 1969, served both military and academic functions. It wasn’t until 1973 that various independent networks could be combined, thanks to the creation of a communications protocol called the Internet Protocol – giving rise to what we now know as the internet.
The Innovators Key Idea #10: Carefully crafted government policies led to the opening of the internet to the public.
Although the personal computer and the internet were conceived around the same time and in the same spirit, the development of each followed a different path.
One reason for this was that the internet was not initially a public resource, and only those connected to an educational or research institution had access. At this time, email and the first online communities (such as bulletin boards and newsgroups) were also developed.
Yet before there could be an online revolution, the public had to find a way to access the internet. This wasn’t possible until the advent of the modem, which allowed personal computers to connect to global networks over phone lines.
However, a thick tangle of laws and regulations still existed that prevented commercial companies, such as America Online (AOL), from easily getting customers connected.
Before he was elected vice president in 1992, Al Gore helped untangle this red tape as a senator by writing and advocating for the passage of the High Performance Computing Act of 1991, also known simply as the Gore Act.
The act lowered the barrier to entry in internet communications with the goal of establishing a “national information infrastructure,” which would allow commercial services to further develop connections to the internet and make it more widely available for public use.
The flood of new users who signed up for services like AOL in the following years completely transformed how the internet could and would be used, paving the way for an astonishing era of innovation.
Former Vice President Gore isn’t, as was often joked in the late 1990s, the “father of the internet.” Yet his foresight and determination to make the internet widely accessible is laudable.
His efforts led to the next phase of the Digital Revolution, where computers became tools for both personal creativity and collaboration.
The Innovators Key Idea #11: Tim Berners-Lee is the singular tech talent who alone conceived of the World Wide Web.
Although modems and online services made it possible for almost anyone to access the internet, early users nonetheless found themselves lost in a technical jungle with no way to navigate.
For most, the early internet was just too difficult to make sense of – until the advent of the World Wide Web, which came about largely through the efforts of Tim Berners-Lee.
The son of two computer scientists, Berners-Lee had already arrived at a fundamental insight as a child growing up in 1960s London: while a computer is good at crunching numbers and processing linear information, it can’t make random associations and link information in a clever fashion the way a creative human mind can.
By combining the internet and hypertext – words or phrases with special code that when clicked, send a user to another related piece of content – Berners-Lee aimed to realize his vision of an ever-growing, evolving web of easily accessible information.
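As a loose sketch of the hypertext idea (my illustration, not Berners-Lee’s actual design), you can think of each page as a piece of text plus named links pointing to other pages; “clicking” a link simply means jumping to the page it names.

```python
# Toy model of hypertext: pages carry text plus named links to other pages.
pages = {
    "home":     {"text": "Welcome to my homepage.",    "links": {"projects": "projects"}},
    "projects": {"text": "Things I am working on.",    "links": {"notes": "notes", "back": "home"}},
    "notes":    {"text": "Assorted notes and drafts.", "links": {"back": "projects"}},
}

def click(current: str, link_name: str) -> str:
    """Follow a named link on the current page and return the page it points to."""
    return pages[current]["links"][link_name]

page = "home"
page = click(page, "projects")
page = click(page, "notes")
print(pages[page]["text"])  # Assorted notes and drafts.
```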
Robert Cailliau, a Belgian engineer working at CERN, the European Organization for Nuclear Research in Geneva, helped Berners-Lee strengthen his funding proposal so that his concept could be turned into reality.
And while the administration at CERN had initially hoped for a patent, Berners-Lee believed instead that the World Wide Web should remain free and open, in order to spread and evolve.
Berners-Lee’s conviction that a freer Web is a better Web is what allowed it to become what he had envisioned: a platform for sharing information and collaboration.
Final summary
The key message in this book:
Our digital age is the result of the efforts of a long line of collaborators who shared and explored their great ideas together. From the visionary programming contributions of Ada Lovelace in the 1840s to the radical innovations of our times, all the greatest technological movements have been a product of collaborative energy.
Suggested further reading: Steve Jobs by Walter Isaacson
This book chronicles the audacious, adventurous life of Steve Jobs, the innovative entrepreneur and eccentric founder of Apple. Drawing from Jobs’s earliest experiences with spirituality and LSD to his pinnacle as worldwide tech icon, Steve Jobs describes the man’s successful ventures as well as the battles he fought along the way.