This post is not directly about politics or COVID-19. But it should be, which is why I’ve tagged it for both.
When I was teaching at California State University, Sacramento in the late 1980s, the Cal State system was trying to increase the focus on classes that emphasized critical thinking.
If there was an official definition, I never saw it, but my department made it clear that the environmental studies law-and-public-policy classes I taught were exactly what they wanted.
I ran these classes not as lectures, but as discussions based on assigned readings, and my biggest goal was to challenge the students to think about the readings’ implications, rather than just taking them at face value.
One of my favorite moments was a discussion in which one of the students flipped whatever I was saying at the time back on itself and pointed out something I’d overlooked. “That’s what you taught us to do,” he said, when he realized how well he’d hoisted me by my own petard.
I don’t remember what grade he got for the course, but for that day, he definitely got an A+.
Since then, however, I’ve found that critical thinking is all too often replaced by shorthand substitutes.
Some people are taught that all you have to do is assess the speaker's or writer's self-interest and reject anything coming from someone with a stake—particularly a financial one—in the claim being correct.
But, while that’s a good reason for skepticism, reflexive cynicism and critical thinking are not the same thing.
Another shortcut is to reject anything that comes from a source with which you disagree.
But sources you disagree with may still get their facts right…and even your favorites can garble things.
What we need to do, and do now before it’s too late, is to start teaching and modeling how to do this better. Beyond avoiding the two shortcuts above, that means:
- Examining the information, as well as the source. Even people with a stake in the outcome can sometimes be right.
- Examining the information in light of everything you already know. Depending on the field, that could be a lot or a little, but it’s a good starting point…so long as what you think you know is actually accurate (more about that below).
- Tracing the information to its original source, to make sure something important didn’t get lost along the way.
- Fact checking bits and pieces of the information, as spot checks on its overall accuracy.
- Being aware that when you do a fact-check on Google, you are basically polling the audience. Look for fact-checking sources that have probably made at least some effort to do their own fact-checking (or that have good error-correction protocols, as Wikipedia does on technical issues).
- Fact checking your own facts, even the ones you think you know. There’s a maxim in journalism that it’s the thing you’re sure you know that will be the one that trips you up.
Finally, beware of the Dunning-Kruger effect.
Simply stated, it describes a complex relationship between a person’s actual competence in a field and the degree to which they think they’ve mastered it.
If you are utterly incompetent, you probably know it. If you have considerable expertise, you probably have a decent grasp of what you do and don’t know. But if you’re in the middle—knowing “just enough to be dangerous”—you may underestimate the complexities of the field and overestimate your mastery of it.
It’s pretty easy to point the finger at someone else and snidely call them a representative of the Dunning-Kruger effect in action.
But what’s really needed is to realize that this is a fallacy we ourselves can fall into.
The solution is to remember that that cute phrase about knowing just enough to be dangerous isn’t actually anything I’ve ever heard used as a derogatory comment about someone else. It’s always a cautionary self-assessment: “I know just enough about that to be dangerous.”
A big piece of critical thinking is knowing what you don’t know.