Has The Information by James Gleick been sitting on your reading list? Pick up the key ideas in the book with this quick summary.
What is information? Is it physical or intangible? Is it just sets of data, or is it a kind of language through which we communicate with others?
Information is anything that’s conveyed by a particular arrangement of things. As you’ll see in this pack, this definition covers everything from language to genetics to contemporary technologies. As a result, information is hugely influential. Changes in the way we speak and write language, for example, led to the creation of logic.
So information has radically changed the way we think, and therefore its history is an indirect history of human thought. In this pack, you’ll see the major developments and changes in the way we humans have used information and why the way we handle it in the twenty-first century is crucial to our lives.
In this summary of The Information by James Gleick, you’ll learn
- how Morse code changed the way we think about time;
- why making a comprehensive dictionary is impossible; and
- what memes tell us about the way information moves.
The Information Key Idea #1: Humans have been communicating information to each other from the beginning, and in the most unlikely of ways.
Information is a difficult term to define exactly. Commonly, information relates to facts, i.e., things that we can know. But information can be more broadly defined than that. In fact, information can be anything that is conveyed by an arrangement of things – objects, sounds, movements or symbols.
In later human history, we began to use information as a way to quantify and compare things, such as the difference in weight between a bag of rocks and a pot of water. Our early interest in information, however, was entirely concerned with communication.
All kinds of arrangements of things can be used to communicate information: letters convey words, dots and dashes convey Morse code, and even drum beats can convey meaning!
In fact, there are historical records of communities throughout Africa that used drums to literally talk to each other.
As early as 1730, scouts for English slave traders in sub-Saharan Africa noticed that drumming as a form of communication was quite prevalent. One scout, Francis Moore, spoke of how drums were used to signal the arrival of an enemy, but also, he suspected, to call for aid from nearby villages.
It would be nearly 200 years, however, before an English missionary called John F. Carrington made a concerted effort to understand and explain the “talking drums” of Africa to the rest of the world in 1914.
He found that drummers were doing more than signaling danger. They were actually talking through the drums, telling stories and even jokes. His discoveries were eventually published in 1949 in the book The Talking Drums of Africa.
Talking by drum was made possible by the fact that many African languages, unlike English, are tonal, so meaning is inferred by the different pitches of a word. African communities were able to mimic these tones with their drums, and thus convey information over great distances.
The Information Key Idea #2: Writing was not only important for communication – it changed the way we think.
Think about how much your life has changed since the dawn of the internet. Now think how much it would have changed with the first introduction of written language. When writing first developed, life changed drastically.
Writing not only helped us preserve spoken communication, it also changed the way we appreciate it.
Anthropological studies have revealed that people were already crudely writing at least 30,000 years ago, in the Paleolithic, or Old Stone, Age.
While these scratchings on clay and paintings on cave walls weren’t necessarily “writing,” they nonetheless serve as examples of early humans recording internal ideas with external media.
These pictographic representations (writing the image) progressed to ideographic (writing the idea) and then logographic (writing the word) representations. These all combined to create the alphabet, in which individual sounds are represented.
The invention of such a system completely changes the way that we understand and convey ideas. In fact, our very ability to reason logically came from written language.
Logic doesn’t exist independently of language, and couldn’t exist without it.
Take the syllogism, for instance, which is a form of logical argument in which a conclusion is derived from two premises. For example, the premises “All Simpsons are yellow” and “Homer is a Simpson” lead to the conclusion “therefore Homer is yellow.”
Sure, syllogisms can be spoken, but their very existence is the product of the written word, the ability to see what’s been said and compare it with the rest of the text.
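The two-premises-to-conclusion structure of a syllogism can be sketched in code. This is a hypothetical illustration of the form, not anything from the book; the function name and the tuple encoding are my own:

```python
# A syllogism derives a conclusion from two premises.
# Hypothetical sketch: the major premise links a category to a property,
# the minor premise places an individual in that category.
def syllogism(major, minor):
    """major: (category, property), e.g. ("Simpson", "yellow");
    minor: (individual, category), e.g. ("Homer", "Simpson")."""
    category, prop = major
    individual, individuals_category = minor
    if individuals_category == category:
        return f"{individual} is {prop}"
    return None  # the premises don't connect, so nothing follows

print(syllogism(("Simpson", "yellow"), ("Homer", "Simpson")))  # Homer is yellow
```

Note that the conclusion is derived purely from the arrangement of the premises, which is exactly what writing makes visible.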
Let’s look at another Homer, this time the Ancient Greek epic poet: his stories contain zero syllogisms. Why? Because, as the philologist Milman Parry demonstrated, Homer’s work was composed and performed entirely orally, i.e., without writing.
Homer’s spoken epic poems would have been arranged according to loose association and memory, not semantic logic. Only once these stories were put on the page, around 600 to 500 BCE, did his narrative structure begin to develop into sustained rational argument.
The Information Key Idea #3: Though attempts to record entire vocabularies have failed, they have nonetheless taught us a lot about information.
How many words do you think there are? One million? Two million? Well, the pioneers of the dictionary realized there were a few more than that.
The first attempt to map our language was the Table Alphabeticall, compiled by English teacher Robert Cawdrey in 1604. While it is considered to be the first English dictionary, its format and purpose differed greatly from the dictionaries of today.
Firstly, Cawdrey didn’t have much respect for spelling, writing both “wordes” and “words” on the dictionary’s title page. Cawdrey also didn’t intend to compile all the words in the English language. Rather, he compiled 2,500 words which were “unusual” or “hard to speak.”
Cawdrey wanted to explain unusual words in “plaine English,” both for the benefit of people unskilled in language and to combat the increasing number of words borrowed from other languages. Even by his time, English had become a thoroughly “corrupted” language: Viking invaders had brought egg, sky and anger, while cow and pig came from other Germanic languages.
Subsequent dictionaries tried more rigorously to catalogue the English language, but people soon realized that it was an impossible task!
The founders of the iconic Oxford English Dictionary (OED), first published in 1933, optimistically thought that the English-language lexicon was indeed large, yet finite. The number of books in existence, while unknown, wasn’t unlimited. Thus, they reasoned, the number of words appearing in those books must also be finite.
Lexicographers, people who compile dictionaries, eventually realized the futility of the OED’s quest: the problem isn’t just capturing the ever-changing slang or scientific jargon that enters the language, but getting a handle on the words we use most often.
Just think: the verb “make” alone has 98 distinct definitions – enough to fill a book on its own.
The Information Key Idea #4: The discovery of electricity accelerated the way information was, and could be, transmitted.
Communication wasn’t always as easy as picking up the phone, and our lives changed drastically when communication became instantaneous.
The first method of instantaneous, long-distance communication was the electric telegraph. This was preceded by the optical telegraph, invented by Claude Chappe in France in 1792. Optical telegraphs were towers resembling windmills with two sails, which were stationed within eyeshot of one another. Using ropes and levers, operators manipulated the two sails to form symbols against the sky, which were then transcribed into corresponding letters by the tower receiving the message.
While the towers were successful and spread throughout Europe, communication was slow, laborious and impossible in inclement weather.
Roughly half a century later, in 1844, electric telegraph lines were up and running in both England and North America. They were largely used to transmit Morse code, a signal created by interrupting the electric current between two places. These interruptions produced tones and flashes of light that could then be decoded into letters and words.
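The encoding step described above, mapping letters onto patterns of short and long pulses, can be sketched in a few lines. This is a minimal illustration using only a handful of letters (these particular dot-dash patterns follow standard International Morse, but the code itself is my own sketch, not from the book):

```python
# Morse code maps each letter to a pattern of short pulses ("dots")
# and long pulses ("dashes") of electric current.
# Hypothetical minimal alphabet for illustration:
MORSE = {"E": ".", "H": "....", "L": ".-..", "O": "---", "S": "...", "T": "-"}

def encode(text):
    """Encode text as Morse code, with letters separated by spaces."""
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

print(encode("SOS"))    # ... --- ...
print(encode("HELLO"))  # .... . .-.. .-.. ---
```

The receiving operator simply runs the mapping in reverse, turning timed interruptions of current back into language.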
The introduction of the electric telegraph lines meant that information could be transmitted at an unprecedented speed – as fast as electricity itself. This had huge implications for society.
For example, it was only when people living on the East Coast could communicate almost instantly with the West Coast that the difference in local time between the two regions became apparent in everyday life.
The electric telegraph also resulted in new perspectives on how information could be shared, understood and expressed.
Morse code converted language into a form that could be transmitted down a copper wire. To thinkers like the mathematician George Boole, this form of communication looked much more like math than spoken language. Boole took the idea further in his 1854 book The Laws of Thought, which showed that linguistic arguments could be encoded as mathematical formulas.
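Boole's move, treating an argument as a formula over truth values that can be checked mechanically, can be sketched as follows. This is my own illustration of the general idea, not Boole's notation; the function names are hypothetical:

```python
from itertools import product

# Boole's insight: a linguistic argument becomes a formula over truth
# values, and its validity can be checked by pure calculation.
def implies(p, q):
    """Material implication: 'if p then q'."""
    return (not p) or q

def valid(formula, num_vars=2):
    """An argument form is valid if it holds under every assignment."""
    return all(formula(*vals) for vals in product([False, True], repeat=num_vars))

# "If p then q; and p; therefore q" written as a single formula:
modus_ponens = lambda p, q: implies(implies(p, q) and p, q)
print(valid(modus_ponens))  # True
```

A valid argument form comes out true under every possible assignment, which is what lets logic be done by calculation rather than by intuition.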
You’ve seen how our understanding and use of information has changed over the years, as well as how it has changed us. In the following book summary, we’ll examine how that information is stored.
The Information Key Idea #5: The discovery of DNA changed the way we think of ourselves as people.
Information is so deeply ingrained in our experience that one form of information actually determines nearly everything about us: our genes. When scientists discovered that genes were part of our DNA, it gave them a new place to look for clues about the information that defines us.
Even before the discovery of deoxyribonucleic acid or DNA, biologists believed that, just as physicists have their atoms and chemists their molecules, there must be some basic unit that underpins all the biological phenomena they studied.
In 1909, Wilhelm Johannsen gave these units a name. He called them genes, the basic biological building blocks that determine which traits we inherit from our parents, e.g., blue or brown eyes.
But academics were flummoxed. If these tiny “genes” truly held all our information, then where were they?
The answer came with Francis Crick and James Watson’s groundbreaking 1953 discovery – genes exist within the iconic double-helix figure of DNA.
Differing views on genes have led to some startling revelations.
Molecular biologists, for example, examine the function and structure of proteins and nucleic acids that are essential to life. One central aim of molecular biology is to examine how our genes are reproduced, or how genetic information is passed from one generation to another. As molecular biologists’ knowledge of genes became more refined, they eventually proposed that we use DNA to pass on this information.
Compare this to evolutionary biologists, who instead look at the processes that produce life in all its variety.
The young zoologist Richard Dawkins challenged the molecular view. In his 1976 book The Selfish Gene, Dawkins claimed that the molecular biologists had got it backward: we don’t use genes to reproduce our traits. Instead, genes use us to reproduce themselves.
Dawkins’ analysis explains previously mysterious phenomena, such as why animals would risk their lives to preserve their offspring. In essence, organisms are hardwired to preserve their genes – their information – over themselves.
The Information Key Idea #6: There is also a source of non-genetic but equally pervasive information.
Scientists don’t just investigate an organism’s physical environment in their quest for understanding. They also look to the intangible.
Dawkins asserts that wherever there is life, there is replication. In his view, physical life began in what could be called a primordial soup – a rich mixture of organic compounds – where genes replicated and eventually became the organisms we see today.
Genes rely on chemical processes for replication, but Dawkins believed that a new kind of replicator had also emerged – a bodiless replicator he called a meme. The meme’s primordial soup was human culture; it transmitted itself through language and replicated itself within the human brain.
Memes are defined as any part of a culture or behavior that is non-genetic, yet is passed from one individual to another.
Ideas, for example, are memes. Ideas can pop up, permeate and evolve through the course of history. Just think of the flat earth theory (which some still hold to this day), eventually displaced by the idea of a spherical earth.
Catchphrases, too, are memes. Sayings such as “read my lips” or the now prolific “hashtag X” are propagated between individuals and cultures.
Memes can be seen as organisms in the sense that they are not elementary units, but rather the sums of many parts. Numbers, for instance, are not memes and neither are colors. Nor are objects memes – however, they are vehicles for them. As philosopher Daniel Dennett put it, a wagon with wheels carries not only things but also the idea of a wagon with wheels.
Yet while memes can be propagated just as well as genes, their effects can be far more devastating. Racist myths, for example, are memes, as is the idea that God will reward you for killing infidels, and these are two ideas that we could surely do without.
The Information Key Idea #7: Our modern technological age has created new ways in which we think about and store information.
Modern day devices and tools, such as Google or Wikipedia, function as constant supplies of data. They also reveal much about how we understand information.
Wikipedia, the online encyclopedia, showed us how hard it is to agree on what’s “relevant.” All the information stored on the site is generated by the general public, who are not paid to perform this service.
This free-for-all system quickly spawned edit wars, in which users debate endlessly over topics as specific and crucial as whether the human with whom a cat lives is its “owner,” “caregiver” or “human companion.”
These disputes gave rise to two warring ideological factions among Wikipedia editors: “inclusionists” and “deletionists.”
Inclusionists take the view that everything should be allowed on Wikipedia. In essence, it should strive to be the storehouse for all human knowledge.
Deletionists, on the other hand, believe that there is no place for trivia on Wikipedia, and actively try to delete it as it pops up. To them, trivia is anything that is too short, badly written or culturally or historically insignificant.
Even so, both factions play a crucial role in the way Wikipedia evolves, and luckily, each edit is kept in page histories. Some hope that Wikipedia will one day store all of humanity’s accumulated information, made possible by this new era of information permanence.
In the days of the Ancient Greek playwright Sophocles, people assumed that information produced and consumed by mankind would eventually just vanish.
However, technological advances mean that everything can be recorded and preserved today. It’s normal for people to carry a camera with them everywhere they go. In fact, 500 billion images were captured in 2010 alone. In the same year, YouTube streamed more than a billion videos a day.
We can only ask ourselves what this new age of sharing and information permanence will mean for us.
Final summary
The key message in this book:
The way that we share, consume and even understand information has changed drastically over the years with each new technological paradigm shift. As information becomes both more accessible and more permanent, we have to ask ourselves how this will affect the way we understand and consume information.
Suggested further reading: The Language Instinct by Steven Pinker
The Language Instinct provides an in-depth look into the origins and intricacies of language, offering both a crash course in linguistics and linguistic anthropology along the way. By examining our knack for language, the book makes the case that the propensity for language learning is actually hardwired into our brains.