Barrett's Laws for a Scientific Mindset

1. Only science is science

Not every claim is scientifically testable. People have personal experiences that can inform them and improve their lives. While there may be room for differences of opinion, there is no need to invoke scientific thinking until someone makes a claim that can be tested scientifically. For example, whether a given class was informative or helpful to me is a matter of personal opinion, which should not be challenged scientifically. Science only enters the conversation when someone claims that the class is (or is not) helpful to most people--a testable claim.

Possible violations:

  • Using scientific language to challenge personal experience.
  • Claiming something cannot be true unless it is objective and verifiable.

2. Human sciences are "fuzzy" sciences, but they are still sciences.

Some people--even some scientists--are inclined to say that some sciences are more scientific than others, but we all use the same scientific method. Some branches of psychology--especially early psychology--were less committed to the scientific method than others, but their legitimacy rests on the accuracy of their predictions. Since psychology is a young science, it is easy for critics to scoff at early psychologists, especially those who speculated far beyond their data. Modern psychologists check each other's findings very carefully through the peer review process, to keep things as close to the scientific method as possible.

Note: Part of the reason psychology remains fuzzy is that psychologists are not monsters. While a few infamous cases have crossed the line, the vast majority of psychologists view such experiments with disgust and anger. While the slow progress of psychology may be frustrating, keep in mind that we would prefer to study humanity without losing our own.

Possible violations:

  • Specific statements that a soft science (such as psychology) is not a science.
  • Leaving soft sciences off a list of sciences.
  • Condescending remarks to suggest that soft sciences are not as scientific.
  • While it's not a violation per se, I would also accept examples of why soft sciences remain soft--experiments (even proposals for them) that would be illegal/unethical.

3. Science requires common data.

"Data" means facts that are objective, measurable and predictable. The whole point of scientific observation, studies, and experiments is to produce reliable data. Your personal experiences do not constitute data--they come only from your perspective (which makes them subjective) and they rarely lead to exact measurements. Nor do your personal beliefs constitute data--there are many ways to reconcile the data with your beliefs, but to succeed in this class, you cannot merely ignore the data.

Possible violations:

  • Not citing a source.
  • Citing your life experience as data (this may also violate rule #5).
  • Citing dogmatic beliefs ("everyone knows...") as data.
  • Ignoring public/common data.

4. Science runs on careful criticism.

Sometimes people point to arguments between scientists as evidence that science is merely a matter of opinion. These people fail to notice that most of these arguments have rules, and are a critical part of the process that makes science work. We've all heard about the scientific method--observe, form hypotheses, test them, and draw conclusions--but most people don't realize what takes place after that: we share our conclusions and data with other scientists and invite them to find any mistakes. This search for mistakes has been formalized in the last century into a process called "peer review," and it's what keeps science on track. It's not perfect, nor is it pretty, but it's the best thing we've found so far, and it's far better than how scientists used to do this. Of course, you also need to criticize for the right reasons. Skepticism that is not based on data is just an opinion, and just as worthless to science as any other opinion. The blogosphere is full of this kind of rabid criticism, where people pick apart this or that conclusion without the benefit of careful reasoning. I expect better of all of you.

Possible violations:

  • A "rogue" or "persecuted" scientist claims that they are being criticized unfairly.
  • Citing an argument between scientists as evidence for one or both of them being incorrect.
  • Citing contrasting explanations or theories as evidence for a general weakness of the field of study.
  • Sloppy criticism--attacking without carefully thinking it through, or without a clear understanding of the topic. (May also be a violation of Law #5)

5. Fight fire with fire, and data with data.

Sometimes when people don't like a particular study, they try to outsource the work of fighting it to someone else. They say things like, "But there might be something wrong with that study," or "That guy was probably biased," or "Yeah, and next year someone else will say the opposite." In scientific terms, they want their hypothesis that a flaw exists to have the same weight as the data they are attacking, but it just doesn't work that way. While it's possible that this or that study was flawed somehow, you will need evidence of an error if you want to refute the data. Bear in mind: even the least controversial claims in your textbook have gone through a lot of peer review--lots of smart people picked through them to find any mistakes the scientists might have made. While it's quite possible that they missed something, you can't reject data on the grounds that someone, someday, might find a flaw in it.

Possible violations:

  • Hypothesizing potential flaws in a study without researching whether such flaws existed.
  • Predicting that a disliked theory will be discredited someday.
  • Rebuttal by reputation: claiming that a hypothesis or theory has more (or less) weight because of the chief proponent's credentials or standing in the scientific community, rather than because of the data.

6. There's power in precise terms.

Ever wonder why scientists use such big fancy words? It's because these words are more precise, and good scientists understand the power of precise terms. Terms that might be interchangeable to a layperson might mean vastly different things to a scientist. If you want to get ahead in this class, you would do well to learn the differences between "psychotic" and "psychopathic," between "sex" and "gender," between "normal" and "typical." These are only a few of the terms that have specific meanings within scientific circles. The better you know them, the better you will understand what you read, which will lead to success on your tests and essays, which will lead to a higher grade. One example of this that comes up frequently is the word "theory." Outside of scientific circles, people use the term "theory" to mean an idea based on observation, but the proper scientific term for that is "hypothesis." A theory is actually a hypothesis that has been thrown to the wolves of scientific criticism, and survived. The theories in your textbook are the greatest survivors of this process. If you disagree with something in the book, that's fine, but you'll need more than "it's only a theory" if you want to criticize it.

Repeat Offenders:

  • Theory vs. hypothesis
  • Gender vs. sex
  • Normal vs. typical
  • Psychotic vs. psychopathic
  • Nerve cells vs. nerve connections
  • Maturation vs. development

7. Correlation is not causation.

What if I told you that outbreaks of a certain disease went up as sales of a particular food went up? That a large number of the people afflicted caught the disease within 24 hours of eating this food? Such data would lead many people to conclude that the food was contaminated. Denials from the food industry might be seen as stonewalling, or even a cover-up. After all, we all know the facts, right? Actually, we only know some of the facts, and the ones we are missing are critical. For the record, this actually happened. Polio outbreaks in the 1940's and '50s strongly correlated with the sales of ice cream. Notice I said "correlated." That means that as one goes up, the other goes up, and vice versa. The danger of this is in assuming that one must be causing the other. For any two factors A and B, a strong correlation might mean A is causing B, B is causing A, or some unknown factor C is causing both. Far too often, we jump to conclusions. In the case of polio, the problem is that the polio virus targets children and is more active during the summer--when ice cream sales are high. Thus, despite the correlation between the two things, neither was causing the other. A third factor--the season--was causing both.
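A toy simulation can make the A/B/C pattern concrete. Everything below is invented for illustration (the coefficients and noise levels are made up, not real epidemiological data): a hidden third factor--warm weather--drives both ice cream sales and case counts, and the two end up strongly correlated even though neither causes the other.

```python
import random

random.seed(0)

# Hidden factor C: monthly "warmth" over four years.
# Months 5-8 of each year are warm; everything else is cool.
months = range(48)
warmth = [10 + 15 * ((m % 12) in (5, 6, 7, 8)) + random.gauss(0, 2)
          for m in months]

# A and B each depend on warmth, but NOT on each other.
ice_cream_sales = [50 + 3.0 * w + random.gauss(0, 5) for w in warmth]
case_counts = [5 + 0.8 * w + random.gauss(0, 2) for w in warmth]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(ice_cream_sales, case_counts)
print(f"correlation between sales and cases: {r:.2f}")
```

The correlation comes out strong even though, by construction, sales never touch case counts: the shared dependence on warmth is doing all the work.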

Possible violations:

  • Using the word "cause" without an experimental study to confirm causation.
  • Pointing to a link between two things and jumping to a conclusion that assumes cause.

8. More of something good isn't always better.

An extremely common assumption is that if something is good, it's always good, and more of it is always better. By that logic, since computers run on electricity, they should work better after being struck by lightning. There are actually very few things in real life that support this popular viewpoint. Even oxygen can quickly become toxic at levels only slightly higher than normal. This is a central issue in many class topics, such as over-stimulation of infants. Studies show that severe brain damage results from lack of stimulation, so companies have sprung up all over to make sure infants are stimulated. Many of them make claims to the effect that more stimulation will not just prevent brain damage, but actually foster brain growth. Recent studies have shown that nothing could be further from the truth: over-stimulation can be just as dangerous. This law is subject to a number of reversals: claims that X is bad, so more of it would be worse, or that eliminating it entirely would be better. These are equally false. Most biological systems work in a state of balance, so any given ingredient can be unhealthy if it goes too high or too low.

Possible violations:

  • Claims that something is "better for you" because it has more of a given nutrient.
  • Unhealthy food that tries to brand itself as healthy by claiming to be "sugar-free" or "fat-free."
  • Citing effects of deprivation as evidence for why you need more of something.

9. Beware the plausible, especially if it works.

There are few things more dangerous to critical thinking than the obvious answer. We are slow to test obvious answers, if we test them at all. This is how mankind's worst biases have been passed on for generation after generation, until some brave person realizes that they ought to be tested. This is especially problematic with methods that bring the desired results--methods that work. Suppose, for example, that a math student reduces the fraction 16/64 by crossing out the 6's to get 1/4. The answer is correct, but the method is flawed, and if the student continued to use this method, the results would be disastrous. This frequently shows up in product sales and promotion when the marketing talks about the amazing effect, but is sketchy on the mechanism. This is how we get tapeworm eggs sold as diet pills--will you lose weight? Sure. The question is why, and whether losing weight is always such a good thing (see the reversals in Law #8).
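The 16/64 trick is a nice case to test mechanically. The sketch below (plain Python, standard library only) enumerates every two-digit fraction of the form ab/bc and checks the "cross out the shared digit" method against real arithmetic. It happens to give the right answer in exactly four cases--which is why a lucky correct result tells you nothing about whether the method is sound.

```python
from fractions import Fraction

# The student's flawed method: if the numerator is "ab" and the
# denominator is "bc", cross out the shared digit b and call it a/c.
# Find every case where this accident actually gives the right value.
hits = []
for a in range(1, 10):
    for b in range(1, 10):
        for c in range(1, 10):
            num, den = 10 * a + b, 10 * b + c
            if num < den and Fraction(num, den) == Fraction(a, c):
                hits.append((num, den))

print(hits)  # -> [(16, 64), (19, 95), (26, 65), (49, 98)]
```

Every other fraction of this shape gives a wrong answer, so a student who trusts the method because it "worked" on 16/64 is being fooled by coincidence--exactly the trap this law warns about.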

Possible violations:

  • Citing the effect of a change without explaining the mechanism.
  • Encouragement to "try it yourself" as the best evidence that something works.
  • Personal testimonials offered as evidence that something works.

10. Lack of evidence doesn't mean it's false.

It's a lazy brain that sorts things into two groups: true and false. A scientist trains himself to sort them into three groups: proven true, proven false, and unproven. When we don't have enough evidence to prove something true, that's not enough to make it false. It remains unproven until we have enough evidence either way. This applies to many of our other rules here. For example, in the example from Law #7, there might actually be something bad in the ice cream. We just don't have enough data to say that yet. With Law #8, there are plenty of individual cases where more of something good is actually better. And in the case of #9, the plausible explanation might actually be true. We just don't know yet, which means we need to keep an open mind and collect more data.