January 10, 2018

Understanding science

Why we’re not wired to think scientifically (and what can be done about it)

What is it about being human that conflicts with being scientific?

Read Time 8 minutes

Author’s note: This post was originally published in May 2014. It has been updated to reflect my current thinking on the topic. Perhaps the best addition, by popular demand, is Rik’s coffee recipe (see the first footnote).

§

In 2012, I was having dinner with a good friend, Rik Ganju, who is one of the smartest people I know. And one of the most talented, too—a brilliant engineer, a savant-like jazz musician, a comedic writer, and he makes the best coffee I’ve ever had.[1] I was whining to him about my frustration with what I perceived to be a lack of scientific literacy among people from whom I “expected more.” Why was it, I asked, that a reporter at a top-flight newspaper couldn’t understand the limitations of a study he was reporting on? Are they deliberately trying to mislead people, or do they really think that a study showing an association between such-and-such somehow implies X?

[1] Here is the coffee recipe, courtesy of Rik. I make this often, and the typical response is, “Why are you not making this for a living?” Look for Vietnamese cinnamon, also known as Saigon cinnamon; you need two big dashes, if that. You need real vanilla (be careful to avoid the cheap versions with added sugar); the best is vanilla dissolved in ethanol, and if that doesn’t work for you, get the dried stick and scrape the pods. Then find a spice store and get chicory root (I’m a bit lazy and get mine on Amazon). You’ll want to replace coffee beans with ~10% chicory on a dry-weight basis. If you’re on a budget, cut your coffee with Trader Joe’s organic Bolivian, but do use at least 50% of your favorite coffee by dry weight: 50-40-10 (50% your favorite, 40% TJ Bolivian, 10% other ingredients [chicory root, cinnamon, vanilla, amaretto for an evening coffee]) would be a good mix to start. Let it sit in a French press for 6 minutes, then drink it straight or with cream, but very little cream; the max is 1 tablespoon. The Rik original was made with “Ether” from Philz as the base.

Rik just looked at me, kind of smiled, and asked the question another way. “Peter, give me one good reason why scientific process, rigorous logic, and rational thought should be innate to our species?” I didn’t have an answer. So as I proceeded to eat my curry, Rik expanded on this idea. He offered two theses: one, the human brain is oriented to pleasure ahead of logic and reason; two, the human brain is oriented to imitation ahead of logic and reason. What follows is my attempt to recount the ideas we discussed that night, focusing on the second of Rik’s postulates—namely, that our brains are oriented to imitate rather than to reason from first principles or think scientifically.

One point before jumping in: this post is not meant to disparage those who don’t think scientifically. Rather, it’s meant to offer a plausible explanation for why such thinking doesn’t come naturally. If nothing else, it’s a way for me to capture an important lesson I need to remember in my own journey of life. I’m positive some will find a way to be offended by this, which is rarely my intention in writing, but nevertheless I think there is something to learn in telling this story.

The evolution of thinking

Two billion years ago, we were just cells acquiring a nucleus. A good first step, I suppose. Two million years ago, we left the trees for caves. Two hundred thousand years ago, we became modern man. No one can say exactly when language arrived, because its arrival left no artifacts, but the best available science suggests it showed up about 50,000 years ago.

I wanted to plot the major milestones on a graph, but even on a log scale the result is almost unreadable. The information is easier to see in a table:

Milestone                               Time before present
Cells acquire a nucleus                 ~2 billion years
Humans leave the trees                  ~2 million years
Anatomically modern humans              ~200,000 years
Language                                ~50,000 years
Formal logic (Aristotle)                ~2,500 years
Scientific method (Bacon)               ~400 years
Randomized clinical trial (Lind)        ~270 years
Statistical significance (Fisher)       ~100 years

Formal logic arrived with Aristotle 2,500 years ago; the scientific method was pioneered by Francis Bacon 400 years ago. Shortly after the codification of the scientific method—which defined exactly what “good” science meant—the Royal Society of London for Improving Natural Knowledge was formed. So not only did we know what “good” science was, but we had an organization that expanded its application, including through peer review, and existed to continually ask the question, “Is this good science?”

While the Old Testament makes reference to the earliest clinical trial—observing what happened to those who did or did not partake of the “King’s meat”—the process was codified further by 1025 AD in The Canon of Medicine, and formalized in the 18th century by James Lind, the Scottish physician who discovered, using randomization between groups, the curative properties of oranges and lemons—vitamin C, actually—in treating sailors with scurvy. Hence the expression “Limey.”

The concept of statistical significance is barely 100 years old, thanks to Ronald Fisher, the British statistician who popularized the use of the p-value and proposed a practical boundary between chance and significance (his famous p < 0.05 convention).
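To make the last two paragraphs concrete, here is a minimal sketch of how a modern reader might check whether the results of a tiny two-arm trial could plausibly be chance alone. The counts below are invented for illustration (they are not Lind’s actual data), and the test is Fisher’s exact test, fittingly named for the same Ronald Fisher:

```python
# Illustrative only: invented recovery counts for a tiny Lind-style trial.
from scipy.stats import fisher_exact

#                  [recovered, not recovered]
table = [[2, 0],   # citrus group: 2 of 2 sailors recovered
         [1, 9]]   # control group: 1 of 10 sailors recovered

odds_ratio, p_value = fisher_exact(table)
print(f"p = {p_value:.3f}")
# A small p-value says: if citrus did nothing, a split this lopsided would
# rarely arise by chance in a randomized trial of this size.
```

Nothing in the machinery is deep; what was hard, historically, was thinking to randomize the groups in the first place and then asking how often chance alone could produce the observed difference.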

The art of imitation

Consider that for 2 million years we have been evolving—making decisions, surviving, and interacting—but for only the last 2,500 years (2,500 ÷ 2,000,000, or 0.125% of that time) have we had “access” to formal logic, and for only 400 years (0.02% of that time) have we had “access” to scientific reason and an understanding of scientific methodologies.

Whatever a person was doing before modern science—however clever it may have been—it wasn’t actually science. And in the same vein, how many people were practicing logical thinking before logic itself was invented? Perhaps some were doing so prior to Aristotle, but it was certainly rare compared to the time following its codification.

Options for problem-solving are limited to the tools available. The arrival of logic was a major tool. So, too, was the arrival of the scientific method, clinical trials, and statistical analyses. Yet for the first 99.98% of our existence on this planet as humans—literally—we had to rely on other options, other tools if you will, for solving problems and making decisions.

So what were they?

We can make educated guesses. If it’s 3,000 BC and your tribemate Ugg never gets sick, all you can do to try not to get sick is hang out where he hangs out, wear similar colors, drink from the same well—replicate his every move. You are not going to figure out anything from first principles, because that isn’t an option, any more than traveling by jet across the Pacific Ocean is an option. Nothing is an option until it has been invented.

So we’ve had millions of years to evolve and refine the practice of:

Step 1: Identify a positive trait (e.g., access to food, access to mates),

Step 2: Mimic the behaviors of those possessing the trait(s),

Step 3: Repeat.
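As a toy illustration (my sketch, not something from the original essay), here is that loop in code. The catch built into the toy: the “success” everyone observes is pure luck, yet the copy-the-winner rule converges on a behavior anyway:

```python
# Toy model of the mimicry loop above. Observed "fitness" is random noise,
# so whatever behavior the tribe converges on reflects luck, not insight.
import random

random.seed(0)
wells = ["well A", "well B", "well C"]
tribe = [random.choice(wells) for _ in range(20)]

for generation in range(5):
    # Step 1: identify the (apparently) positive trait.
    observed = [(random.random(), behavior) for behavior in tribe]
    _, winner = max(observed)
    # Step 2: mimic the behavior of the one who possesses it.
    tribe = [winner if random.random() < 0.5 else kept for kept in tribe]
    # Step 3: repeat.
    print(f"generation {generation}: {tribe.count(winner)}/20 now copy {winner!r}")
```

Run it and the tribe piles onto a single well within a few generations. As a survival heuristic under uncertainty, that is not a bad bargain—which is exactly the point.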

Yet we’ve only had a minute fraction of that time to learn how to apply formal logic and scientific reason to our decision making and problem solving. In other words, evolution has hardwired us to be followers, copycats if you will, so we must go very far out of our way to unlearn those inborn (and highly refined) instincts before we can think logically and scientifically.

Recently, neuroscientists (thanks to the advent of functional MRI, or fMRI) have been asking questions about the impact of independent thinking (something I think we would all agree is “healthy”) on brain activity. This body of research is still in its infancy, but the results are suggestive, if somewhat provocative.

To quote the authors of this work, “if social conformity resulted from conscious decision-making, this would be associated with functional changes in prefrontal cortex, whereas if social conformity was more perceptually based, then activity changes would be seen in occipital and parietal regions.” Their study suggested that non-conformity produced an associated “pain of independence.” In the study subjects, the amygdala became most active in times of non-conformity, suggesting that non-conformity—doing exactly what we didn’t evolve to do—produces emotional distress.

From an evolutionary perspective, of course, this makes sense. I don’t know enough neuroscience to agree with their suggestion that this phenomenon should be titled the “pain of independence,” but the “emotional discomfort” from being different—i.e., not following or conforming—seems to be evolutionarily embedded in our brains.

Good solid thinking is really hard to do, as you no doubt realize. How much easier is it to economize on all this and just “copy & paste” what seemingly successful people are doing? Furthermore, we may be wired to experience emotional distress when we don’t copy our neighbor! And while there may have been only 2 or 3 Uggs in our tribe 5,000 years ago, as our societies evolved, so too did the number of potential Uggs (those worth mimicking). This would be great (more potentially good examples to mirror) if we were naturally good at thinking logically and scientifically, but we’ve already established that’s not the case. Amplifying the problem even further, the explosion of mass media has made it virtually, if not entirely, impossible to distinguish those truly worth mimicking from the charlatans and the merely lucky. Maybe it’s not so surprising that the one group of people we’d all hope could think critically—politicians—seems to be as useless at it as the rest of us.

So we have two problems:

  1. We are not genetically equipped to think logically or scientifically; such thinking is a very recent tool of our species that must be learned, and our older instincts must, with great effort, be “overwritten.” Furthermore, it’s likely that we are programmed to identify and replicate the behavior of others rather than think independently, and independent thought may actually cause emotional distress.
  2. The signal-to-noise ratio (where the signal is behavior truly worth mimicking and the noise is everything else) is so low today—virtually zero—that the folks who have not been able to “overwrite” their genetic tendency for problem-solving are doomed to confusion and, likely, poor decision making.

As I alluded to at the outset of this post, I often find myself getting frustrated at the lack of scientific literacy and independent, critical thought in the media and in the public arena more broadly. But is this any different from being upset that monarch butterflies are black and orange rather than yellow and red? Marcus Aurelius reminds us not to be surprised by buffoonery from buffoons: “You might as well resent a fig tree for secreting juice.”

While I’m not at all suggesting that people unable to think scientifically or logically are buffoons, I am suggesting that expecting this kind of thinking as people’s default behavior is tantamount to expecting rhinoceroses not to charge or dogs not to bark—sure, it can be taught with great patience and pain, but it won’t happen quickly.

Furthermore, I am not suggesting that anyone who disagrees with my views or my interpretations of data frustrates me. I have countless interactions with folks whom I respect greatly but who interpret data differently from me. That is not the point I am making, and those are not the experiences that frustrate me. Healthy debate is a wonderful contributor to scientific advancement. Blogging probably isn’t. My point is that critical thought, logical analysis, and an understanding of the scientific method are completely foreign to us; if we want to possess these skills, acquiring them requires deliberate action and time.

What can we do about it?

I’ve suggested that we aren’t wired to be good critical thinkers, and that this poses problems when it comes to our modern lives. The just-follow-your-peers-or-the-media-or-whatever-seems-to-work approach simply isn’t good enough anymore.

But is there a way to overcome this?

I don’t have a “global” (i.e., how to fix the world) solution for this problem, but the “local” (i.e., individual) solution is quite simple, provided one feature is in place: a desire to learn. I consider myself scientifically literate. Sure, I may never become one-tenth of a Richard Feynman, but I “get it” when it comes to understanding the scientific method, logic, and reason. Why? I certainly wasn’t born this way. Nor did medical school do a particularly great job of teaching it. I was, however, very lucky to be mentored by a brilliant scientist, Steve Rosenberg, both in medical school and during my post-doctoral fellowship. Whatever I have learned about thinking scientifically I learned from him initially, and eventually from many other influential thinkers. And I’m still learning, obviously. In other words, I was mentored in this way of thinking, just as every other person I know who thinks this way was mentored. One of my favorite questions to ask (good) scientists is who mentored them in their evolution of critical thinking.

Relevant aside: Take a few minutes to watch Feynman at his finest in this video—the entire video is remarkable, especially the point about “proof”—but the first minute is priceless and a spot-on explanation of how experimental science should work.

You may ask: is learning to think critically any different from learning to play an instrument? Learning a new language? Learning to be mindful? Learning a physical skill like tennis? I don’t think so. Sure, some folks may be predisposed to be better than others, even with equal training, but virtually anyone can get “good enough” at a skill if they are willing to put in the effort. The reason I can’t play golf is that I don’t want to, not that I lack some ability to learn it.

If you’re reading this, and you’re saying to yourself that you want to increase your mastery of critical thinking, I promise you this much—you can do it if you’re willing to do the following:

  1. Start reading (see starter list, below).
  2. Whenever confronted with a piece of media claiming to report on a scientific finding, read both the actual study and the media coverage, in that order. See if you can spot the mistakes in reporting (one classic trap is sketched in code after this list).
  3. Find other like-minded folks with whom to discuss scientific studies. I’m sure you’re rolling your eyes at the idea of a “journal club,” but it doesn’t need to be nearly that formal (though years of formal weekly journal clubs did teach me a lot). You just need a good group of peers who share your appetite for sharpening their critical thinking skills. In fact, we have a regularly occurring journal club on this site (starting in January 2018).
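On the second point, the most common reporting mistake is treating an association in observational data as if it implied causation. Here is a minimal sketch (my illustration, with made-up variables, not anything from a real study) of how a hidden confounder manufactures a strong correlation between two things that have no effect on each other:

```python
# Made-up example: "wealth" secretly drives both variables, so fancy-coffee
# drinking and good health end up correlated even though neither causes the other.
import random

random.seed(1)
n = 10_000
coffee, health = [], []
for _ in range(n):
    wealth = random.gauss(0, 1)                  # hidden confounder
    coffee.append(wealth + random.gauss(0, 1))   # "drinks fancy coffee"
    health.append(wealth + random.gauss(0, 1))   # "is in good health"

# Pearson correlation, computed by hand to avoid extra dependencies.
mc, mh = sum(coffee) / n, sum(health) / n
cov = sum((c - mc) * (h - mh) for c, h in zip(coffee, health)) / n
sc = (sum((c - mc) ** 2 for c in coffee) / n) ** 0.5
sh = (sum((h - mh) ** 2 for h in health) / n) ** 0.5
print(f"correlation = {cov / (sc * sh):.2f}")    # about 0.5, despite zero causation
```

A headline writer who sees that correlation and prints “fancy coffee linked to better health” has made exactly the mistake this exercise trains you to catch.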

I look forward to seeing the comments on this post, as I suspect many of you will have excellent suggestions for reading materials for those of us who want to get better in our critical thinking and reasoning. I’ll start the list with a few of my favorites, in no particular order:

  1. Anything by Richard Feynman (In college and med school, I would not date a girl unless she agreed to read “Surely You’re Joking, Mr. Feynman”)
  2. The Transformed Cell, by Steve Rosenberg
  3. Anything by Karl Popper
  4. Anything by Frederic Bastiat
  5. Bad Science, by Gary Taubes
  6. The Structure of Scientific Revolutions, by Thomas Kuhn
  7. Risk, Chance, and Causation, by Michael Bracken
  8. Mistakes Were Made (but not by me), by Carol Tavris and Elliot Aronson
  9. Thinking, Fast and Slow, by Daniel Kahneman
  10. “The Method of Multiple Working Hypotheses,” by T.C. Chamberlin

I’m looking forward to other recommendations.


Comments

  1. You might get something out of “Why Do Humans Reason? Arguments for an Argumentative Theory,” published in Behavioral and Brain Sciences, 2011.

  2. No membership required. JC will be posted monthly and available to everyone. Of course, if you sign up for the weekly email you won’t have to worry about when it comes out, as you’ll get notified.

    • Thanks! I had not heard anyone mention Karl Popper since college. I enjoyed your article.

  3. Another book for the reading list, in the same vein as this article: The Enigma of Reason, by H. Mercier and D. Sperber.

  4. I’m having an issue sharing this. Anyone else having trouble with Facebook rejecting it?

  5. For those who have kids (newborns to teenagers), an easy-to-read book is NurtureShock, by Po Bronson. It’s about the science of childhood development. However, it’s written in such a relaxed and simple style that even someone with a background in finance or the arts will come away having honed their scientific skepticism. The book has countless examples of researchers doing work on self-esteem, praise and feedback, sleep, reading to children, etc. It’s always followed by how the media then butchers their findings and how the general public lazily extrapolates from what were deliberately narrow and limited avenues of inquiry. The author’s journalism has been recognized by the AAAS.

  6. Agree with the article. I have a point to add, though. The writer of the article was likely conducting himself perfectly rationally if his goal was to maximize emotional reaction, likely leading to more sales. I think a lot of the time we are not aware that our assumed framing of a situation differs from that of the person we disagree with. If you are someone on the more idealistic side, desiring to popularize science and scientific thinking, it’s indefensible. If you are someone who wishes to popularize science and/or increase sales, and who believes that the public at large tends to be incapable of handling complex information that doesn’t clearly point in one direction, it’s a rational move that at least points the public in the right direction. The consequence of the first end goal might be that far fewer readers take an interest in the piece, but that there is a small nudge in the right direction as far as scientific thinking goes. The consequence of the second is that a lot more people take an interest in the topic, but scientific accuracy suffers. It’s hard to know whether the author was being rational or not judging only from a shitty piece.

    A couple more issues:
    – the lack of factual accuracy is obviously a general trend in news reporting and politics, and it’s not necessarily due to the culprits behind the bullshit not thinking rationally.
    – this is going down the rabbit hole, but I think it is dangerous when people start fancying themselves enlightened rationals, because it will become even harder for them to see through their own emotionally and intuitively based worldviews and opinions. This will be present at all times, especially when they have an emotional reaction towards lack of logical order and what they perceive to be sloppy thinking. I myself have this reaction. I find that these rationals often start resenting the very concept of emotions, emotionally motivated underlying assumptions, and arbitrary perspectives, and therefore have emotional resistance preventing them from catching themselves in the act.

    /psychologist

    • Indeed: an excellent point, for proof of which one merely has to reference the defacing of Wikipedia by self-proclaimed skeptics. The most famous example is probably Professor Richard Dawkins, whose “skepticism” frequently reflects a rationalist’s over-use of Occam’s Razor (a valuable tool, but inappropriate for analysing complex systems). Rationalism should not equate to a priori exclusion, a fault found in many statistical studies that fail to mention (or even consider) multiple potential causes for specific outcomes.

      The trouble is, human scientists appear to have been unable to avoid the trap of believing, at various periods, that we “know nearly everything”. This spectacular complacency was expressed at various times in the 18th, 19th and 20th centuries (shortly before the discovery of the quantum realm, for example). Such a belief is, I suggest, fundamentally incompatible with scientific reasoning. Fortunately, today’s nuclear scientists and astrophysicists appear less inclined to make such grandiose judgements and more inclined to acknowledge that we’ve only just started to appreciate the complexity of reality.

      Even if it is all just a gigantic simulation 😉

  7. Just because Aristotle codified logic does not mean that mankind was devoid of it prior to the existence of Plato’s greatest student. For example, the construction of the Egyptian pyramids, or even Stonehenge, required logic. The astronomical alignments alone argue for that, and the engineering of the edifices demands it. It may even be argued that the practice of agriculture requires some logical thought. This does not necessarily vitiate the argument, but it does push the first exercise of logic to a much earlier time, probably around the time agriculture began.

    • Agree. It most certainly took a logical progression of thought to decipher the fact that a seed will grow into a plant.

      Formal study of logic, a second-order pursuit (thinking about how we think), may have come about later, or may simply have been recorded later despite being recognized by the general populace much earlier. We only have what individuals chose to write down, and most of what they chose to write involved trade, not philosophy.

  8. I would recommend a number of books: (1) Ayn Rand’s Introduction to Objectivist Epistemology (a good general overview of her view of concepts in logic can be found in The Art of Reasoning, by David Kelley); (2) Harry Binswanger’s How We Know; and (3) David Harriman’s The Logical Leap: Induction in Physics. This last book is an important one because it offers a solution to the “problem of induction” (which underpins all science). If Harriman’s solution is correct (as far as it goes), and I believe it is, and if it is applied to biology, I think it would radically change the way we do scientific studies in that field.

  9. I really liked this blog article. With the pandemic, I find a certain alienation from friends and their beliefs. That alienation seems to go further than 6 feet. I suppose it is a blend of logic and modeling that brought me here. It seems that I modeled people who read lots of science, and I have some grad school credits in nutrition to stack that deck.
    I mean, a quick search on Google for the number of studies in PubMed puts it over 33 million, which certainly seems overwhelming. Also, I remember seeing a study (can’t find it now amidst the 33 million) that said about 86% of all studies never receive a reference (I could be a little off on the number, as this is from memory).
    I formulate nutrition products and have a small company. The copy-versus-logic dynamic this article describes has never been more obvious than in the nutrition game. I swear nobody reads research.
    As an example, there is an ingredient called D-aspartic acid. This ingredient is in a top-10 seller on Amazon for the respective category. There was some early research that showed some promise, and the craze began. However, the later research essentially put the kibosh on it. Yet products are still selling with this ingredient, and some people are Instagramming its power. The ingredient is perhaps not useless, but it is not going to move testosterone numbers anywhere except maybe in rats. And it can actually lower testosterone at doses higher than 3 g.
    Copy the product that sells… Don’t worry about the logic.
