One of the first lessons I remember my books (and TV) teaching me was that the truth is uncomfortable. I don’t remember exactly when I stopped believing that, but for many years in between, that lesson sounded consistent with common rhetoric I came across.
The nature of modern competitive journalism, on average, favours whatever sounds terrible to read, because cynicism is tied to social status. Many people will say they don’t trust mainstream media for their news - and they’d be somewhat justified in doing so, if it weren’t so commonly an excuse for selection bias. But beyond that, the subconscious deference to meta-narratives in the media, like the idea that things are bad and getting worse, is harder to account for than simply picking and choosing what to take from individual articles.
But still, even if its source is flawed, that doesn’t mean the assumption that the world is becoming a harder place is unfounded, right? After all, climate change exists, the spectre of AI existential risk hangs over all of us, and wealth inequality increases yearly.
Except we never hear about what we’ve done right. We’re past the point of actively worrying about the ozone layer depleting. Setting aside the freak outlier that was 2020, every year beats its predecessors as the objectively greatest point in human history. Newspapers could have run the headline “170,000 people moved out of extreme poverty yesterday” every single day for the last decade. All of which are nice truths, and not ones you could have predicted if you were stuck with the preconception that truth, and life, is inherently unfair and uncomfortable to comprehend.
There’s more to be said about the nature of people: we indiscriminately grant an idea higher status in proportion to how cynical it sounds. We treat “maturity” as the moment someone begins to internalize the idea that the world is an unfair place. By treating it as a mark of sophistication, what we end up with is a majority who signal cynicism for status, even though cynicism is purported to be a means of reaching truth. Continue that trend for long enough, throw in an independent factor like evaporative cooling of group beliefs, and it isn’t hard to see how conspiracy theories are born.
One might argue at this point that nice truths are still uncomfortable to those whose biases lean toward the cynical, because they force them to introspect and change their beliefs. While that’s true, it’s not the entire story of nice truths - they’re the comfortable choice for a newcomer, and the comfortable belief to hold post-introspection. There are many people who hold cynicism (or adherence to whatever signals status) as a deeper ideal than their other beliefs, and so almost freely exchange their professed values for the internal satisfaction that comes from “changing your mind”. For them, the discomfort would come from that idea itself, not from the specific truths.
This whole argument can be viewed as an extension of the idea that reversed stupidity is not intelligence - being right isn’t as easy as believing what feels most uncomfortable, most cynical. But there are more people today (at least in my admittedly narrow sample space) claiming the world is becoming a worse place than there are people advocating meat-based diets because Hitler was a vegetarian.
Don’t mistake my point - most people are, on average, wrong about a great many things. Trying to change your mind about something is rarely comfortable (if it seems comfortable, you most likely either didn’t care about what you were wrong about, or you’re doing it wrong). It’s not good introspection if you don’t feel harrowed.
But whether or not an argument feels comfortable is no better a heuristic for truth than whether it feels more complex than the alternatives. Sometimes you’re right; sometimes the truth is the easy answer - if the truth had any universal characteristic like that, it would be far easier to find. So if you must have an aphorism, let it be that finding and recognizing the right answer is hard. A corollary is the fallacy fallacy, where people dismiss an argument because it contains a fallacy, even though its conclusion may turn out to be right in the end - through the benefit of more information to work from, the other arguments being more wrong, or just sheer luck. You could be bad at finding the truth and still be right. Not that it’s likely, because that would again make the whole thing easy.
Rules of thumb can approximate a certain level of accuracy, and this one may well be of use to a certain type of person. But they run on the representativeness heuristic, which is not a good model of how the world works - I prefer standard Bayesianism, where all the approximation lies in how good you are at it, rather than being baked into the system. And even that is an oversimplification, for the times when trying to be Bayesian fails.
On a closing note, a surprisingly affable quote from a singularly unexpected source:
If you look for truth, you may find comfort in the end; if you look for comfort you will not get either comfort or truth, only soft soap and wishful thinking to begin, and in the end, despair.
- C. S. Lewis