The Language Instinct Summary and Review

by Steven Pinker

Has The Language Instinct by Steven Pinker been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

Why is it that you pick up your mother tongue so naturally, while trying to learn languages in adulthood sometimes feels like banging your head against a wall?

And why is it that, barring the common mix-ups and misunderstandings, we are so adept at communicating with each other almost effortlessly?

The answers to these questions lie in the nature of language and our innate ability to communicate with words: our language instinct.

In this book summary, you’ll learn how language is structured and why human beings are especially good at picking it up. In addition, you’ll learn all about the neuroscience behind our amazing linguistic skills.

In this summary of The Language Instinct by Steven Pinker, you’ll also discover:

  • how children use grammatical rules that nobody has taught them;
  • how you can intuitively differentiate one wug from two wugs; and
  • why Siri has such a hard time figuring out what you said.

The Language Instinct Key Idea #1: We are all born with a language instinct.

Think for a moment about how easy it is to turn the thoughts in your head into meaningful sentences. Where did this skill come from? While many people believe we learn grammar in the classroom, our knowledge of it is in place before we are even born!

Indeed, very young children have an innate understanding of grammatical structure that they couldn’t possibly have learned. The idea that grammatical rules are hardwired into the brain was first put forward by the famous linguist Noam Chomsky in his theory of Universal Grammar.

According to Chomsky, children don’t learn how to speak from their parents or anyone else, but rather by using their innate grammar skill. As a consequence, Chomsky reasoned, all languages have the same basic underlying structure.

One of Chomsky’s main arguments for this is the poverty of the stimulus, which shows that children command verb and noun structures they could not have learned from the speech they hear.

For example, to turn the phrase “a unicorn is in the garden” into a question, you must simply move the “is” to the front of the sentence. However, for the phrase “a unicorn that is eating a flower is in the garden,” you have to rearrange more than just the first “is” to turn the phrase into a question. To make a grammatically sound sentence, you have to move the second “is.”

Chomsky correctly claimed that children would never make the mistake of misapplying the first strategy for creating a question to the second, more complex sentence. In subsequent experiments, no children moved the wrong “is,” even with sentences they could never have heard before.
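To make the contrast concrete, here is a minimal sketch in Python (my own illustration, not from the book) of the naive “move the first is” rule versus the structure-sensitive rule children actually follow; the subject/predicate split in the second function is supplied by hand rather than computed by a real parser.

```python
# Toy illustration: why "move the first 'is'" fails, and why the rule must be
# sensitive to sentence structure rather than word order alone.

def naive_question(sentence: str) -> str:
    """Move the first 'is' to the front -- the rule children never try."""
    words = sentence.split()
    i = words.index("is")
    rest = words[:i] + words[i + 1:]
    return "is " + " ".join(rest) + "?"

def structural_question(subject: str, predicate: str) -> str:
    """Move the 'is' of the main clause, treating the whole subject as one unit."""
    return f"is {subject} {predicate}?"

simple = "a unicorn is in the garden"
tricky = "a unicorn that is eating a flower is in the garden"

print(naive_question(simple))   # is a unicorn in the garden?  -- fine
print(naive_question(tricky))   # is a unicorn that eating a flower is in the garden?  -- garbled
print(structural_question("a unicorn that is eating a flower", "in the garden"))
# is a unicorn that is eating a flower in the garden?  -- what children actually produce
```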

Furthermore, deaf children use correct grammar in their signs without ever studying it.

Psychologists studied a deaf boy named Simon, whose two deaf parents only learned sign language in adulthood, and thus made various grammatical mistakes.

Simon, on the other hand, didn’t make the same mistakes, despite only ever being exposed to his parents’ style of signing. The only way to account for this is that Simon had an innate knowledge of grammar that kept him from repeating his parents’ mistakes.

The Language Instinct Key Idea #2: The popular idea that our words affect our perception is false.

Despite its popularity, there’s no basis for what is known as linguistic relativity, i.e., the idea that the structure of our language influences the way we perceive and understand the world. Linguistic relativity is also called the Whorfian Hypothesis, after the linguist Benjamin Whorf.

Whorf was an amateur scholar of Native American languages, and made several claims that Native Americans viewed the world differently due to the structure and vocabulary of their language.

For example, “a dripping spring” translates literally as “whiteness moves downward” in one Apache dialect. According to Whorf, this discrepancy indicates that Apaches don’t perceive the world in terms of distinct objects or actions.

However, other psycholinguists were quick to point out that Whorf never actually studied Apaches in person. In fact, it’s not even clear that he ever met one!

He also translated sentences in ways that made them sound much more mystical than they actually were. But you can do the same with any language. For instance, the phrase “he walks in” could just as easily be modified to something mystical, like “as solitary masculinity, leggedness proceeds.”

By extension, some hold the view that people see colors differently according to their mother tongue. Some cultures, for instance, have only two color words: one for dark hues (“black”) and one for light hues (“white”).

But does this mean they see only two colors? Hardly! It would be preposterous to think that language could somehow reach into their eyeballs and modify their physiology.

Despite this, belief in linguistic relativity survives thanks to urban myths. The Great Eskimo Vocabulary Hoax, for example, shows just how flimsy the evidence behind it is.

The popular belief is that Eskimos have far more words for snow than are found in English. Experts say they actually have 12 – hardly a great discrepancy from English’s many variations on the word, like snow, sleet, slush, hail and so on.

The Language Instinct Key Idea #3: Language is based on two principles.

So how is it that we communicate with one another so effortlessly? Well, human language rests on two principles that make communication easy.

The first principle is the arbitrariness of the sign. This idea, first introduced by the Swiss linguist Ferdinand de Saussure, relates to the way in which we pair a sound with a meaning. For example, the word “dog” does not sound like a dog – it doesn’t bark like a dog, nor does it walk like a dog. The word has no inherent “dogness,” but nonetheless retains its meaning.

Why?

Well, English speakers all make the same association between the sound “dog” and man’s best friend through countless instances of rote learning.

The arbitrariness of the sign is a huge benefit for language communities, as it allows them to transfer ideas almost instantaneously, without anyone having to justify why a particular sound should pair with a particular meaning.

The second principle is that language makes infinite use of finite media. In layperson’s terms: we have a finite set of words that we can combine to create an infinite number of larger things, i.e., sentences.

We make sense of these infinite possible combinations by establishing rules that govern changes in word combinations. For example, what is the difference between “dog bites man” and “man bites dog”?

Apart from one being an unfortunate everyday occurrence and the other being newsworthy, the difference lies in the foundational grammar that governs meaning.

Each of the words in “dog bites man” has its own individual meaning that doesn’t depend on the complete sentence. Grammar is what allows us to arrange these words in specific combinations in order to evoke specific images and meanings.

There’s a finite number of words, but grammar gives us an infinite number of ways to combine them.
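As a rough illustration (mine, not Pinker’s), the Python sketch below captures both points: word order alone assigns who does what to whom, and one recursive rule lets a finite vocabulary build sentences without limit.

```python
# Toy sketch: the same three words mean different things depending on their order,
# and a recursive rule turns a finite vocabulary into unboundedly many sentences.

def meaning(sentence: str) -> tuple:
    """Read a 'subject verb object' sentence; word order alone fixes the roles."""
    subject, verb, obj = sentence.split()
    return (verb, subject, obj)

print(meaning("dog bites man"))  # ('bites', 'dog', 'man')
print(meaning("man bites dog"))  # ('bites', 'man', 'dog') -- same words, new meaning

def embed(sentence: str, n: int) -> str:
    """A recursive rule: any sentence can be wrapped in 'I know that ...'."""
    for _ in range(n):
        sentence = "I know that " + sentence
    return sentence

print(embed("dog bites man", 3))
# I know that I know that I know that dog bites man -- there is no longest sentence
```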

The Language Instinct Key Idea #4: Grammar might get all the attention, but words are interesting too.

Much as we are composed of cells, which themselves are composed of smaller particles, sentences and phrases are composed of words, which are made in turn from small bits of grammatical information called morphemes. These morphemes are governed by the rules of morphology.

Take the hypothetical word wug, for example. “Wug” is a morpheme. By adding the morpheme for pluralization, the suffix -s, to the end of “wug,” we end up with a group of wugs.

So it appears that there is a rule for creating plurals for nouns: adding the morpheme -s.

Amazingly, nobody teaches us this rule as children, yet we apply it anyway – as psycholinguist Jean Berko Gleason proved.

In an experiment, she showed preschool children a drawing and told them, “This is a wug.” She then showed them two wugs and asked, “Now we have two, so we have . . . ?”

The result? The children all added the suffix -s. There is no way a child could have learned the word “wugs” before, which indicates that we must have an innate ability to form plurals and that we carry mental rules for generating new words.
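A minimal sketch of the kind of mental rule the wug test probes might look like the following (my own simplification; real English plurals also involve sound changes that this code ignores).

```python
# Toy sketch: plurals come from a rule, not from a memorized list of forms,
# so the rule applies even to nouns nobody has ever heard before.

def pluralize(noun: str) -> str:
    """Apply the regular plural morpheme."""
    if noun.endswith(("s", "x", "z", "ch", "sh")):
        return noun + "es"   # glass -> glasses
    return noun + "s"        # wug -> wugs

for word in ["dog", "glass", "wug", "blicket"]:  # the last two are made-up words
    print(word, "->", pluralize(word))
# dog -> dogs, glass -> glasses, wug -> wugs, blicket -> blickets
```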

We can learn more about morphemes by looking at the differences between languages. English, for example, is often said to be simpler than German, but the difference is just morphological.

Or take the Tanzanian language Kivunjo. In terms of inflectional morphology, the language is quite sophisticated.

In Kivunjo, verbs can carry up to seven prefixes and suffixes – all of them morphemes – that change the verb’s meaning. The word “naikimlyiia,” for example, is an elaboration of the verb root “-lyi-,” meaning “to eat”; the surrounding letter combinations are all separate morphemes.

Contrast this with English, where most verbs have only four forms (e.g., quack, quacks, quacked, quacking).

However, what English lacks in inflection it makes up for with derivational morphology – the creation of new words from old. For example, by adding the suffix “-able” to the word “learn,” you create a new word: learnable.
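Here is a small sketch (again mine, heavily simplified) of derivation as an operation that stacks affixes onto a stem; real morphology also restricts which combinations are legal, which this code does not attempt to model.

```python
# Toy sketch of derivational morphology: affixes attach around a stem to build new words.

def derive(stem: str, affixes: list[str]) -> str:
    """Apply each affix in order: 'un-' style prefixes, '-able' style suffixes."""
    word = stem
    for affix in affixes:
        if affix.endswith("-"):      # prefix
            word = affix[:-1] + word
        elif affix.startswith("-"):  # suffix
            word = word + affix[1:]
    return word

print(derive("learn", ["-able"]))         # learnable   (verb -> adjective)
print(derive("learn", ["-able", "un-"]))  # unlearnable (adjective -> its negation)
print(derive("teach", ["-er"]))           # teacher     (verb -> noun)
```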

Now that you know more about the way languages are structured, the following key ideas will examine why exactly we find communicating with one another so easy.

The Language Instinct Key Idea #5: Our ability to understand speech is like a sixth sense.

How is it that we can put a man on the moon and yet be unable to build a computer that reliably recognizes what we say?

Speech, as opposed to written language, does not have any clearly demarcated breaks between words.

The seamless, fluid stream of uttered words is essentially a string of phonemes, the units of sound that make up morphemes. Phonemes roughly correspond to letters of the alphabet: think of the sounds you make when you spell out b-a-t – each one is a phoneme.

Each phoneme has its own unique acoustic signature. For example, the word “beat” is composed of three sounds (“b,” “ea” and “t”), each with its own unique sound wave. So couldn’t we simply program a computer to recognize these sound waves and recite the word “beat” back to us?

Unfortunately not, due to a phenomenon called coarticulation, the process whereby the sounds of each phoneme blend into each other as we speak.

When you say the word “beat,” the three sounds that make up the word are not distinct; each is influenced by the sounds uttered before and after it. Computers struggle to account for the radical variation that coarticulation causes in the acoustic signatures of phonemes, and therefore have a hard time transcribing our speech.

But why are we so good at it? As of yet, there is no clear answer. But we can be fairly certain that it isn’t due to top-down processing, that is, moving from a general to a specific analysis.

Some researchers believe that we understand the complex sounds of speech from context – for example, that when we talk about the environment, we expect someone to say “species” instead of “special.”

However, given the rapidity of normal speech, this seems unlikely. In most cases, it’s impossible for us to predict which word our conversation partner will say next. Moreover, if you call a friend and recite ten random words from the dictionary, he’ll understand them all despite the distinct lack of context.

The Language Instinct Key Idea #6: We understand written language because we are highly skilled “parsers.”

Up to this point, we’ve focused mostly on spoken language. But how exactly do we make sense of the strange symbols written upon the pages of a book?

We understand sentences by first parsing them, breaking them up into their component parts and referring to their grammatical roles in order to understand their meaning.

However, grammar itself is nothing more than the code for how language works, specifying only which sounds correspond to which meaning. The mind then parses this grammatical information, looking for the subject, verb, objects, and so forth, and groups them together to provide the meaning of the sentence.

Linguists believe that there are two kinds of parsing: breadth-first search and depth-first search.

A breadth-first search is a style of parsing that works through a sentence word by word while keeping alternatives open. During its analysis of individual words, the brain will entertain, however briefly, multiple and sometimes absurd meanings for ambiguous words (e.g., the word “bug” could be either an insect or a listening device used by spies).

A depth-first search instead considers the sentence as a whole, since there are sometimes simply too many word combinations to compute at one time. Here, the brain picks one likely meaning for the sentence and runs with it.

Sometimes, depth-first searches lead to confusion, especially with garden path sentences, so named because they lead you up a “garden path.” These sentences show how a parser can not only pick the wrong meaning for a sentence, but also relentlessly hold on to it.

Take the sentence “The man who hunts ducks out on weekends,” for example. Despite being perfectly grammatical, it confuses most people: halfway through, “ducks” must switch from the noun the man hunts to the verb “ducks out” (he slips away on weekends), but our brains get stuck on the original reading and can’t make sense of the rest.
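To make the two strategies concrete, here is a toy sketch (mine, nothing like a real parser) that treats “ducks” as lexically ambiguous: a depth-first strategy commits to one category per word and gets stuck, while a breadth-first strategy keeps both analyses alive.

```python
# Toy sketch: depth-first commits to one category per word; breadth-first keeps
# every alternative for ambiguous words in play until the rest of the sentence decides.

LEXICON = {"ducks": ["NOUN", "VERB"]}  # the ambiguous word in the garden-path sentence

def depth_first(words: list[str]) -> list[str]:
    """Commit to the first category for each word and never revisit the choice."""
    return [LEXICON.get(w, ["WORD"])[0] for w in words]

def breadth_first(words: list[str]) -> list[list[str]]:
    """Keep every combination of categories for ambiguous words alive."""
    analyses = [[]]
    for w in words:
        analyses = [a + [c] for a in analyses for c in LEXICON.get(w, ["WORD"])]
    return analyses

sentence = "the man who hunts ducks out on weekends".split()
print(depth_first(sentence))         # tags 'ducks' as NOUN -- the reading that dead-ends
print(len(breadth_first(sentence)))  # 2 analyses survive; the VERB reading is the right one
```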

Clearly, we’re quite skilled in the art of speech. But where did this language ability come from? The following key ideas aim to answer this question.

The Language Instinct Key Idea #7: Childhood is a critical period for developing our innate language skills.

As we’ve learned, we’re all born with the innate ability to acquire language. However, we still need a playground to hone our skills.

When they’re still young, children are essentially vacuum cleaners for words. The author estimates that an average six-year-old has an amazing vocabulary of around 13,000 words!

This is an astounding feat, as preliterate children only hear words through speech and have no opportunity to study them. Instead, they pick up a new word roughly every two waking hours, day after day.

This is especially impressive because the most effective methods for memorization, mnemonic devices, don’t help for individual words.

A mnemonic is a learning technique that transforms what we want to remember into something more memorable. For example, if you want to learn to read music, then an easy way to learn the lines on the treble clef (EGBDF) is to remember the sentence Every Good Boy Deserves Fudge.

But this doesn’t work with individual words. Given the shortage of easy ways to remember words, children’s brains must have an innate, powerful system for quickly mastering a language.

However, as we grow older, we begin to lose this amazing ability. Adults everywhere struggle when it comes to learning another language, as the skill seems to rust with age.

Elissa Newport is a psychologist who conducted a study on immigrants to America. She found that those who had arrived between the ages of three and seven were as skilled in English grammar as those born in the country. Those who immigrated between the ages of eight and 15, however, fared much worse.

The same can be seen when learning our first language. Throughout history, a tiny number of children have grown up without any human contact, usually due to neglect. They are known as “wolf children.” One of them, “Genie,” was a 13-year-old girl discovered in 1970; because she grew up without human contact, she was unable to form even basic grammatical sentences.

The Language Instinct Key Idea #8: Our language instinct could have come about through evolution.

We haven’t yet touched on the origins of the language instinct. Could it be possible that our natural ability for language was part of the evolutionary process?

Some, including Chomsky, doubt the language instinct’s compatibility with Darwinian evolution.

The modern take on Charles Darwin’s theory of evolution is that complex biological systems are created by the gradual accumulation of random genetic mutations over generations. Mutations that enhance an organism’s reproductive success are preserved, because they improve its chances of passing on its genes.

Traditionally, there are two arguments against language instinct as a product of evolution.

First, critics argue that language is more powerful and complex than survival requires, so its development couldn’t have been driven by reproductive success.

However, this critique is like saying a cheetah is faster than it “needs” to be. Over time, small advantages amount to big changes, and something as small as a one-percent reproductive advantage in growing one percent larger could, over a couple of thousand generations, lead a mouse to evolve to the size of an elephant.
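A quick back-of-the-envelope check of that compounding claim (the mouse and elephant masses below are my own rough assumptions, not figures from the book):

```python
# How many generations of 1% growth per generation does it take to go from
# mouse-sized to elephant-sized? (Masses are rough assumed values.)
import math

mouse_kg = 0.03               # assumed mass of a mouse
elephant_kg = 3000.0          # assumed mass of an elephant
growth_per_generation = 1.01  # each generation 1% larger than the last

generations = math.log(elephant_kg / mouse_kg) / math.log(growth_per_generation)
print(round(generations))     # about 1,157 generations -- "a couple of thousand" indeed
```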

Second, language is said to be incompatible with evolution because it is unique to humans – even our closest relatives, chimpanzees, don’t have language. Since chimps and humans evolved from a common ancestor, which in turn evolved from earlier primates, shouldn’t chimps and monkeys also have languages like ours?

Not necessarily!

Evolution doesn’t work as a linear hierarchy in which every organism climbs rung by rung toward humans.

Evolution isn’t a ladder, it’s a bush. Chimpanzees and humans branched off from a common ancestor that is now extinct, so it’s possible for us to have language without chimps ever having had it.

Our language instinct probably came about through natural selection, the process whereby slight differences between individuals give greater or lower chances for survival and reproduction.

Thus our ancestors likely benefited in some way from an ability to communicate with each other, which gave them the adaptive advantage necessary for surviving in their environment.

Our final key ideas will explore how we can use this knowledge of the origins of language and our propensity for learning to understand more about ourselves.

The Language Instinct Key Idea #9: Chill out about good grammar – it’s more arbitrary than you think.

Recent decades have seen a growing obsession with grammatical rules. Today’s “grammar Nazis” are quick to point out things like confusing “their” and “there,” or decry split infinitives as the mark of the uneducated. But is this fair?

In short: no, it’s not.

There is a vast difference between how we are “supposed” to talk and how we can or do talk. Consequently, people who actually study language have a different conception of grammar rules than the average person does.

Prescriptive rules are the ones we learn and struggle with in school, and they govern how we’re “supposed” to talk. These are the weapons of grammar Nazis.

In contrast, scientists deal with and attempt to isolate and explain descriptive rules, i.e., the ones that govern how people actually talk.

Scientists are more concerned with descriptive rules, because prescriptive rules alone are not enough to build a language.

For example, the prescriptive rule that you shouldn’t start a sentence with the word “because” wouldn’t make sense without the descriptive rules that define what a sentence is and categorize the word “because” as a conjunction.

Put in the best light, prescriptive rules are little more than decorations of descriptive rules. So it’s possible to speak grammatically (as in descriptively) while also speaking ungrammatically (non-prescriptively), just as a taxi can obey the laws of physics while simultaneously breaking the laws of California.

So who decides what constitutes “correct” English?

Well, that’s hard to say. Prescriptive rules come and go with changes in fads and politics.

For instance, the rule against splitting infinitives (putting words between “to” and a verb), which was so diligently drilled into us as children, doesn’t seem so grating when Jean-Luc Picard tells us he wants “to boldly go where no one has gone before.”

The rule itself has its roots in eighteenth-century England, when people wanted London English to take over from Latin as the prestige language of the upper class. A Latin infinitive is a single word that cannot be split, so the “rule” was simply copied into English.

The Language Instinct Key Idea #10: With the knowledge that language is a human instinct, we can understand more about how the brain works.

Recent advances in neuroscience, combined with our understanding of language as an instinct, could help unlock the mysteries of the brain.

For example, understanding that language is an instinct offers us insight into how the brain is structured.

Key areas of the brain have now been identified as being associated with language. For instance, the left perisylvian region is now considered to be the brain’s “language organ.” In 98 percent of brain damage cases resulting in language impairment, the left perisylvian area is affected.

While the relationship between brain structure and function is complex and not yet fully understood, it does appear that certain faculties are housed in specific places in the brain, called modules.

Different aspects of language, such as speech production, comprehension, etc., all involve areas of the brain that are located close to one another in the left hemisphere.

Our knowledge that we have a language instinct also allows us to speculate about other hardwired instincts we might have.

For example, just as we have a language instinct, we may also have “a biology instinct.” Anthropologist Brent Berlin put forth the idea that human beings have an innate folk biology. That is to say, people have an innate understanding that plants and animals belong to different species or groups – all without being taught.

The psychologist Elizabeth Spelke has demonstrated the legitimacy of folk biology in an experiment with children.

The kids were first shown a picture of a raccoon, which transformed to look like a skunk. They were then shown a coffee pot that transformed to look like a bird feeder.

The children accepted the coffee pot’s transformation, but couldn’t accept that a raccoon had turned into a skunk. It didn’t matter to them if an inanimate object altered its form, but a raccoon was a distinct being that couldn’t just turn into something else. This showed an intuitive understanding of the difference between natural and artificial things.

Our knack for language is deeply complex, but the more we learn about it, the more we discover about ourselves.

In Review: The Language Instinct Book Summary

The key message in this book:

We’re all born with a language instinct that is hardwired in our brains. Our knack for language is far deeper than the grammar we are taught in school, and is probably even one of the reasons for our continued survival as a species.

Suggested further reading: The Better Angels of Our Nature by Steven Pinker

The Better Angels of Our Nature takes a close look at the history of violence in human society, explaining both our motivations to use violence on certain occasions and the factors that increasingly restrain us from using it – and how these factors have resulted in massive reductions in violence.