The Death of Expertise Summary and Review

by Tom Nichols

Has The Death of Expertise by Tom Nichols been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

If you haven’t heard the phrase “fake news” lately, then you’ve probably been living in a remote yurt for the past year or so. We now live in an age of competing narratives of what constitutes the truth. The widening gap between people who trust established narratives and those who distrust them has become so great that it can often be difficult to know what – or whom – to believe.

But this didn’t all happen overnight. American society in particular has been moving toward its current predicament for a while, and it’s not just political discourse that’s at fault. Whether it’s grade inflation at the nation’s best universities, or celebrities using their stature to promote things they don’t know much about, the signs have been there for a while.

This book summary will show you how, and why, things have gotten this bad.

In this summary of The Death of Expertise by Tom Nichols, you’ll learn

  • why a college education is not as valuable as it once was;
  • how the advent of the internet has decreased the quality of news media; and
  • what confirmation bias is (and why it’s important).

The Death of Expertise Key Idea #1: Although disagreements about expertise are nothing new, they have been increasing in the internet age.

In the past, you could expect some misinformation coming from the tobacco or sugar industries, who were always ready to tell you how harmless their products were. But these days, it’s becoming harder to keep track of what’s real and what’s bogus.

Generally speaking, being able to challenge a government’s official message has always been a sign of a healthy democracy. In the cradle of democracy, ancient Athens around the fifth century BC, the general populace was heavily involved in discussions about social and political developments.

And even back then, there were the same two opposing sides we continue to see today: the intellectuals who believe most people are buffoons, and the laypeople who distrust anyone claiming to be an expert.

But things changed dramatically once the internet arrived in our homes, and the conflict between experts and laypeople got wildly out of hand.

The internet makes it possible to find a source to support any opinion under the sun, no matter how outrageously unscientific it might be, and it also has people feeling more empowered than ever to voice their opinions.

Once people start ganging up and attacking established knowledge, years of scientific progress are endangered and people’s lives can be put at risk.

This is certainly the case with the anti-vaccine movement. Despite consistent research and an overwhelming majority of doctors and scientists insisting that vaccines are safe and essential for protecting children against disease, a dangerous myth caught on with the public. Now, many people truly believe vaccines are harmful and can even cause autism.

What’s worse is that these movements can pick up celebrity endorsements, as was the case with Jim Carrey, who used his fame to promote this misinformation. Thousands of parents are now putting their own children and others at risk by refusing to vaccinate.

People who reject expert advice often use the logic that experts have been proven wrong in the past and can be wrong again. This is true, but it’s also true that science has gotten more exact and experienced experts are less likely to be wrong than ordinary citizens and movie stars, especially when it comes to their area of expertise.

The Death of Expertise Key Idea #2: We have human traits that can make us believe a false argument.

With an infinite amount of information just waiting to be scrolled through, countless people are joining debates on everything from Batman movies to theoretical science. Having no formal education in a subject does nothing to weaken people’s confidence that reading a few articles gives them a full grasp of it.

This makes many online conversations painful to read, but it’s important to note that both experts and laypeople are prone to many of the same biases inherent to human nature.

Take the Dunning-Kruger effect. In 1999, Cornell University psychologists David Dunning and Justin Kruger revealed that having less skill at a specific task can make someone less likely to recognize their own incompetence.

This is due to a lack of metacognition, which is the awareness of our own thought processes and the trait that allows someone to recognize their limitations.

A lack of metacognition could be why people are so adamant that they know what they’re talking about, even when it’s clear to others that their thinking is radically off course.

Another human trait that can steer us in the wrong direction is our tendency to only seek out and pay attention to information that agrees with what we already believe. This is known as confirmation bias.

If you grew up being told that left-handed people are all agents of evil, you could then find police reports of every southpaw who committed a crime and point to these documents as proof. Meanwhile, you could dismiss every account of empathetic or philanthropic lefties as an exception or part of a conspiracy to throw off the unsuspecting public.

This is how confirmation bias works; it’s a very human mind-set that can lead even the most talented experts astray.

Doctors, for instance, can easily get so focused on a certain diagnosis that they only see the symptoms that agree with their theory and miss the ones that point to the real condition.

The Death of Expertise Key Idea #3: Higher education has become a product, so graduates are no longer experts.

Education systems have seen significant change in the last century and they are far from perfect.

Before World War II, a college degree was a sign of expertise in a certain field. But nowadays, a college education largely serves to make a graduate overconfident in their belief that they’re just as smart as a professor with decades of experience under her belt.

A college diploma means less than it used to because of a trend that began decades ago, when colleges began to boost graduation rates as a way to appear more successful and justify their inflated tuition costs. As a result, students are being coddled and praised throughout their higher education, instead of being intellectually challenged.

Two professors conducted a study of 200 US colleges and universities, comparing historical grading data up to 2009. They found that the most frequently given grade was an A, while 80 percent of all grades given, across all subjects, were higher than a B minus. At Yale, nearly 60 percent of all grades were either an A minus or an A.

It’s fair to say a college education is now seen as a product someone buys, like a visit to a spa, where students are the customers; the days of campuses being places of true educational merit are fading.

Universities are now like any other business, competing for money from a teenage demographic and focusing on the experience rather than the content. The emphasis has shifted from educational excellence to which campus offers the best pizza in its cafeteria or the most luxurious dorm rooms. It’s even common to see grossly entitled students treating faculty members like paid staff who are there to serve them.

This might sound like an exaggeration, but this behavior isn’t exactly discouraged, since many institutions ask students to rate and critique their professors at the end of each semester. It’s no surprise that students feel superior despite receiving an inferior education.

The Death of Expertise Key Idea #4: Don’t believe everything you read on the internet.

There are some great benefits to having the internet be largely open and unregulated, but ensuring the accuracy of information isn’t one of them.

The internet is a tremendous tool for researchers and journalists, but it can easily lead you astray if you don’t know how to double-check your facts.

There are very few safeguards to prevent anyone from posting anything on the internet, which has caused it to become a place infested with inaccurate and fake information. As such, it can take a keen eye to separate fake news from accurate reporting.

In 2015, writer Allen West posted a supposed “scoop” for a conservative news site that claimed US troops were being forced to pray like Muslims during Ramadan. The piece even featured a photo of US soldiers praying on mats, under the clickbait headline: “Look what our troops are being FORCED to do.”

The story was completely fake, but that didn’t stop it from spreading like a virus through social media platforms and other news sites.

People familiar with the rules of research and how to verify a source have what it takes to sort through this mess and use the internet responsibly. But the majority of readers out there aren’t trained to recognize a phony article and can be easily misled.

This problem is compounded by the fact that so many people are influenced by their confirmation bias and use the internet to reinforce their preconceived notions. For many, the internet is not a tool for finding facts and seeking out the truth, it’s a web of lies that readers are happy to remain stuck in.

At this point, so many false news stories have been posted that they now function as “sources” for even more misleading articles to build on.

This is certainly the case with anti-vaccine stories: someone can easily go online and find articles that support the claims of supposed dangers while ignoring the real scientific studies.

It’s a minefield of misinformation out there, so it’s up to the reader to be diligent about not getting caught in a ruse.

The Death of Expertise Key Idea #5: Modern journalism gives readers a false impression of being informed.

If you feel like a lot of news articles are dumbing down their content these days, you’re not alone.

The introduction of the internet may have increased the quantity of available sources, but it has decreased the quality of the content.

Being a reporter used to imply a certain amount of experience and a specific set of journalistic standards, but the internet has given anyone with a computer the ability to launch a news site and build readership. Since the turn of the century, the number of news sources on the internet has steadily increased.

More news outlets also mean an increased demand for content – and for journalists to generate it – and being completely inexperienced isn’t exactly a deal breaker. So it should come as no surprise that a rise in inexperienced journalists has led to an overall drop in the quality of news on display.

The need for content has also resulted in outlets being flooded with fake news stories – but even a great deal of real news stories are being published with errors and inaccuracies.

In 2016, Time magazine listed the 100 greatest female writers of all time. Included among them was English writer Evelyn Waugh, who just so happens to be a man.

Fact-checking has taken a backseat to the demand for clickable content, so news websites are emphasizing material that readers want to read, instead of what they need to read.

On the web, revenue is determined by interactivity. So, to make a profit, websites publish stories that are designed to be clickable and shareable. And internet users have shown a clear preference for entertainment news and articles that confirm their beliefs.

As a result, many outlets aren’t offering insight as much as they’re offering entertainment and distractions. They publish stories designed to comfort rather than challenge, and purposely avoid distressing facts.

To help with that interactivity, news sites encourage readers to share and comment on social media, which opens the door for ordinary people to feel entitled and to start dissecting complicated matters without any of the necessary insight.

The Death of Expertise Key Idea #6: Experts can also be wrong.

Experts are as human as the rest of us and, from time to time, they’ll be proven wrong. Unfortunately, their failures can do more damage than most, since they lead to attacks on facts and established knowledge.

There are plenty of reasons why an expert might have an epic fail moment, and a common one is stepping beyond their field of expertise.

This was the case for the two-time Nobel Prize-winning chemist Linus Pauling. In the 1970s, he was convinced that vitamin C was a miracle drug. He consumed massive quantities on a daily basis and believed it could be used as a treatment for just about anything, from cancer to leprosy.

Scientists were rightfully skeptical of Pauling’s claims, and they tried to point out that their tests did not support his theories. But Pauling turned a deaf ear to the evidence.

We now know that it’s possible to overdose on vitamins, and that some of them can increase the risk of stroke and certain cancers. Though Pauling was a brilliant chemist, when it came to medicine he was out of his depth.

Another area where experts commonly stumble is making predictions.

A scientist's job essentially involves explaining things that have already taken place or are currently happening. But reporters and curious minds love to ask scientists to make predictions, often with the intent of enabling people to prepare for what’s to come.

But even the most educated expert can fail miserably when it comes to predicting the future.

Look at the recent US presidential election. In 2016, countless polls and political experts were predicting a win for Hillary Clinton, so everyone was rather shocked when Donald Trump emerged victorious.

Experts made a similar misjudgment before the United Kingdom’s “Brexit” referendum, whose result also left most of them stunned.

Like the rest of us human beings, experts make mistakes. But instead of responding with distrust or anger, we should continue working with experts to learn from these mistakes and figure out what in the world just happened. Our future may depend upon it!

In Review: The Death of Expertise Book Summary

The key message in this book:

There are many explanations for why we’re currently facing an abundance of misinformation, lies, confusion and distrust in established experts. The internet and modern news media have corrupted our relationship with experts, from doctors to university professors, while a college education can actually hurt public debate instead of enriching it. We need to recognize the cognitive biases that lead all of us, experts and laypeople, to make mistakes, and we must work together to learn from these mistakes.