Duncan Watts’s 2011 book is aptly named. If you can read words upside down.

The book is entitled Everything Is Obvious, with an * after Obvious. If you turn the book upside down you can read the asterisk: “Once You Know the Answer.” The clever title gets at the book’s big idea. We tend to think there are simple answers to life’s most complex questions. We assume that solving major problems—whether in predicting human behavior, or in economics, or in government policy—is a matter of common sense. We believe everything is obvious. But, as Watts persuasively argues, most solutions to complex problems are anything but obvious. And if they are obvious, it’s only because we have the advantage of hindsight to see what worked.

Most likely, we are still at the front end of the coronavirus crisis. If the disease disappeared over the summer, never to return, we would still be dealing with COVID’s emotional and economic fallout. Doctors and economists and journalists and historians and epidemiologists will be writing about the virus for decades. At some point, it may become “obvious” that closing schools saved lives or that it was pointless. At some point, it may become obvious which countries and which leaders made the best decisions. At some point, it may be obvious in what ways we made a massive problem less deadly or made a serious crisis worse. But at the moment—in the fog of a pathogenic war—it takes only five minutes on Twitter to realize that the best way forward is not patently obvious.

That doesn’t mean some ideas aren’t better than others. I have my own opinions (informed, I hope) about which explanations and which policies are obviously correct. But as a pastor without expertise in medicine, epidemiology, or mathematical modeling, I want to be careful about issuing any assured conclusions about what we should or shouldn’t do.

I don’t agree with everything in Watts’s book, but it’s a helpful reminder about the many ways intuition can fail us. We think people will act in their situations the same way we would act, but they don’t. We think that every problem has a simple solution, but many problems are horribly complex and beyond our ability to manage or control. We think we can understand most issues with a little effort, but many issues take years of expertise to grasp (and even then, the experts don’t agree on what is obvious!).

Watts highlights a number of biases that cloud our ability to make good judgments.

Hindsight bias. We come up with explanations that make sense because they’ve already come true, but they don’t have any real predictive power. For example, he cites an article about the success of Harry Potter in which the author argues that it follows a simple formula: take a Cinderella plot, set it in a boarding school, add easy stereotypes illustrating common vices, combine these with moral statements about the value of courage and friendship and an overarching message about the power of love and sacrifice, and you’re bound to have a popular book. As Watts points out, such an argument amounts to little more than: Harry Potter worked because it had the characteristics of Harry Potter.

Sampling bias. In suggesting causal explanations, we often pay a lot of attention to what happens in special situations while ignoring what happens in normal situations. We may feel like we always get stuck at yellow lights, forgetting all the times we sail through green lights. We remember the basketball player’s clutch game-winning shots but forget all the times he missed in the final seconds. More seriously, we might note that the mass shooter was a young man who played video games, didn’t have a lot of friends, and could be moody, without stopping to think that such a description probably fits millions of people who never become mass shooters. In other words, we gravitate toward the examples that fit our hypothesis, while ignoring the ones that don’t.

Anecdote bias (my label). We are all prone to making sweeping judgments based on anecdotal evidence. You heard about a 104-year-old woman who recovered from COVID-19, and no one in your immediate circle has been to the hospital? It must not be a big deal. You saw two young moms on Facebook write about losing their husbands to the virus, and you have a doctor friend who had to make his own mask? It must be worse than everyone thinks. It’s easy to reach virtually unmovable conclusions based on a handful of personal experiences.

So what’s the takeaway when everything is not obvious?

Watts argues that we should rely on hard data and on as much measurable information as we can muster. That’s good advice, and certainly wise in a pandemic.

But I’d suggest one other lesson: epistemic humility. That’s just a fancy way of saying, let’s be mindful of what we truly know and of all the things we don’t really know. Along the same lines, let’s pray for our leaders to be men and women of wisdom and courage who want to do the right thing and the best thing, no matter what it is and no matter who gets the credit. And finally, admitting our finite knowledge should make us gracious toward those who want the same ends in the crisis but don’t reach all the same conclusions. At some point everything may be obvious, but at present we don’t know all the answers—or, likely, even all the questions to ask.