BLA1332

== Barrett's Laws for a Scientific Mindset ==

Table of Contents:
1. Psychology is a "fuzzy" science, but it's still a science.
2. You are entitled to your own opinion, but not your own facts.
3. Science runs on careful criticism.
4. Fight fire with fire, and data with data.
5. There's power in precise terms.
6. Correlation is not causation.
7. More of something good isn't always better.
8. Beware the plausible, especially if it works.
9. Where there's smoke, there's smoke.
10. Lack of evidence doesn't mean it's false.

1. Psychology is a "fuzzy" science, but it's still a science.
Some people--even some scientists--are inclined to say that some sciences are more scientific than others, but we all use the same scientific method. Some branches of psychology--especially early psychology--were less committed to the scientific method than others, but a field's legitimacy rests on the accuracy of its predictions. Since psychology is a young science, it is easy for critics to scoff at early psychologists, especially those who speculated far beyond their data. Modern psychologists check each other's findings very carefully through the peer review process, to keep things as close to the scientific method as possible.

Note: Part of the reason psychology remains fuzzy is that psychologists are not monsters. While a few infamous cases have crossed the line, the vast majority of psychologists view such experiments with disgust and anger. While the slow progress of psychology may be frustrating, keep in mind that we would prefer to study humanity without losing our own. Example: [|Studies on Regeneration of limbs through spider research] A device in the lab shoots out a spider-web-like material that is caught on a spinning rod. The resulting tubular shapes are used to engineer blood vessels.

2. You are entitled to your own opinion, but not your own facts.
Science is all about data--facts that are objective, measurable, and predictable. The whole point of scientific observation, studies, and experiments is to produce reliable data. Your personal experiences do not constitute data--they come only from your perspective (which makes them subjective), and they rarely lead to exact measurements. Nor do your personal beliefs constitute data--there are many ways to reconcile the data with your beliefs, but to succeed in this class, you cannot merely ignore the data. Example: [|Beware of misleading information]

"Dental hygienists are educated professionals with the knowledge and ability to sort out the information available and reach an evidence-based conclusion. When confusion and disagreements in science arise, do not hesitate to weave the web for clarification, education, and even a bit more controversy and uncertainty. RDH"


 * [[image:http://www.rdhmag.com/content/dam/rdh/print-articles/Volume%2032/issue%207/lory-laughter1207rdh.jpg caption="LORY LAUGHTER"]] **LORY LAUGHTER**, RDH, BS, practices clinically in Napa, Calif. She is owner of Dental IQ, a business responsible for the Annual Napa Dental Experience. Lory combines her love for travel with speaking nationally on a variety of topics.

3. Science runs on careful criticism.
Sometimes people point to arguments between scientists as evidence that science is merely a matter of opinion. These people fail to notice that most of these arguments have rules, and are a critical part of the process that makes science work. We've all heard about the scientific method--observe, form hypotheses, test them, and draw conclusions--but most people don't realize what takes place after that: we share our conclusions and data with other scientists and invite them to find any mistakes. This search for mistakes has been formalized in the last century into a process called "peer review," and it's what keeps science on track. It's not perfect, nor is it pretty, but it's the best thing we've found so far, and it's far better than how scientists used to do this. Of course, you also need to criticize for the right reasons. Skepticism that is not based on data is just an opinion, and just as worthless to science as any other opinion. The blogosphere is full of this kind of rabid criticism, where people pick apart this or that conclusion without the benefit of careful reasoning. I expect better of all of you.

Example: "Critics of fracking often raise alarms about groundwater pollution, air pollution, and cancer risks, and there are still many uncertainties. But some of the claims have little — or nothing — to back them.

For example, reports that breast cancer rates rose in a region with heavy gas drilling are false, researchers told The Associated Press.

Fears that natural radioactivity in drilling waste could contaminate drinking water aren’t being confirmed by monitoring, either.

And concerns about air pollution from the industry often don’t acknowledge that natural gas is a far cleaner burning fuel than coal." [|Misleading Info]

4. Fight fire with fire, and data with data.
Sometimes when people don't like a particular study, they try to outsource the work of fighting it to someone else. They say things like, "But there might be something wrong with that study," or "That guy was probably biased," or "Yeah, and next year someone else will say the opposite." In scientific terms, they want their hypothesis that a flaw exists to have the same weight as the data they are attacking, but it just doesn't work that way. While it's possible that this or that study was flawed somehow, you will need evidence of an error if you want to refute the data. Bear in mind: even the least controversial claims in your textbook have gone through a lot of peer review--lots of smart people picked through them to find any mistakes the scientists might have made. While it's quite possible that they missed something, you can't reject data on the grounds that someone, someday, might find a flaw in it. Example: headlines declaring that dental hygiene has nothing to do with stroke, heart attack, or cardiovascular disease. [|Redorbit.com has a similar bold headline -- "Study shows no link between gum disease and heart attack or stroke." The title gives no indication of the definite association between periodontal disease and cardiovascular disease, leading the public to believe oral health is not related to overall health.]

5. There's power in precise terms.
Ever wonder why scientists use such big fancy words? It's because these words are more precise, and good scientists understand the power of precise terms. Terms that might be interchangeable to a layperson might mean vastly different things to a scientist. If you want to get ahead in this class, you would do well to learn the differences between "psychotic" and "psychopathic," between "sex" and "gender," between "normal" and "typical." These are only a few of the terms that have specific meanings within scientific circles. The better you know them, the better you will understand what you read, which will lead to success on your tests and essays, which will lead to a higher grade. One example of this that comes up frequently is the word "theory." Outside of scientific circles, people use the term "theory" to mean an idea based on observation, but the proper scientific term for that is "hypothesis." A theory is actually a hypothesis that has been thrown to the wolves of scientific criticism, and survived. The theories in your textbook are the greatest survivors of this process. If you disagree with something in the book, that's fine, but you'll need more than "it's only a theory" if you want to criticize it.

6. Correlation is not causation.
What if I told you that outbreaks of a certain disease went up as sales of a particular food went up? That a large number of the people afflicted caught the disease within 24 hours of eating this food? Such data would lead many people to conclude that the food was contaminated. Denials from the food industry might be seen as stonewalling, or even a cover-up. After all, we all know the facts, right? Actually, we only know some of the facts, and the ones we are missing are critical. For the record, this actually happened. Polio outbreaks in the 1940s and '50s strongly correlated with the sales of ice cream. Notice I said "correlated." That means that as one goes up, the other goes up, and vice versa. The danger lies in assuming that one must be causing the other. For any two factors A and B, a strong correlation might mean A is causing B, B is causing A, or some unknown factor C is causing both. Far too often, we jump to conclusions. In the case of polio, the problem is that the polio virus targets children and is more active during the summer--when ice cream sales are high. Thus, despite the correlation between the two things, neither was causing the other. A third factor--the season--is causing both.
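The ice cream and polio pattern can be sketched in a few lines of Python. This is a toy simulation with invented numbers, not the historical data: a hidden "summer" factor (our unknown factor C) drives both ice cream sales (A) and case counts (B), producing a strong correlation even though neither causes the other.

```python
import random

random.seed(0)

# Simulate 120 months. "summer" is the hidden third factor (C) that
# drives both ice cream sales (A) and case counts (B).
ice_cream, cases = [], []
for m in range(120):
    summer = 1 if m % 12 in (5, 6, 7) else 0
    ice_cream.append(100 + 80 * summer + random.gauss(0, 10))
    cases.append(5 + 20 * summer + random.gauss(0, 2))

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(ice_cream, cases)
print(round(r, 2))  # strongly positive -- yet neither variable causes the other
```

Note that the simulation never lets A influence B or vice versa; the correlation appears purely because both share the seasonal driver.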

7. More of something good isn't always better.
An extremely common assumption is that if something is good, it's always good, and more of it is always better. By that logic, since computers run on electricity, they should work better after being struck by lightning. There are actually very few things in real life that support this popular viewpoint. Even oxygen quickly becomes toxic at levels only slightly higher than normal. This is a central issue in many class topics, such as over-stimulation of infants. Studies show that severe brain damage results from lack of stimulation, so companies have sprung up all over to make sure infants are stimulated. Many of them claim that more stimulation will not just prevent brain damage, but actually foster brain growth. Recent studies have shown that nothing could be further from the truth: over-stimulation can be just as dangerous.

Example: Your Baby Can Read From all of us at Your Baby Can Read, Thank you for your loyal support and for being such great customers. For more than 6 years, Your Baby Can Read! has been enjoyed and appreciated by families world-wide as an innovative reading concept for babies and young children. Regretfully, the cost of fighting recent legal issues has left us with no option but to cease business operations. While we deny any wrongdoing, and strongly believe in our products, the fight has drained our resources to the point where we can no longer continue operating. To our thousands of loyal customers who have provided overwhelmingly positive feedback, and particularly to those who took the time to send written and video testimonials about the success stories of their children, we sincerely thank you for being such great champions of our products. If you have questions regarding an existing order, please contact us at ybcservice@yourbabycan.com. Until August 15, a customer service representative will be available to respond to your emails during business hours.

8. Beware the plausible, __especially__ if it works.
There are few things more dangerous to critical thinking than the obvious answer. We are slow to test the obvious answer, if we test it at all. This is how mankind's worst biases have been passed on for generation after generation, until some brave person realizes that they ought to be tested. This is especially problematic with methods that bring the desired results--methods that work. Suppose, for example, that a math student reduces the fraction 16/64 by crossing out the 6's to get 1/4. The answer is correct, but the method is flawed, and if the student continued to use this method, the results would be disastrous.
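We can put the 16/64 trick to the test by brute force. The sketch below (one way to check this; the pattern tested is cancelling the numerator's last digit against the denominator's first digit, as in 16/64) counts how often the bogus method "works" across all two-digit fractions:

```python
from fractions import Fraction

def bogus_cancel(a, b):
    """'Cancel' a shared digit: strike the numerator's units digit against
    the denominator's tens digit, e.g. 16/64 -> 1/4."""
    sa, sb = str(a), str(b)
    if sa[1] == sb[0]:
        return int(sa[0]), int(sb[1])
    return None

hits, misses = [], []
for a in range(10, 100):
    for b in range(a + 1, 100):          # proper fractions only
        c = bogus_cancel(a, b)
        if c and c[1] != 0:              # skip division by zero
            if Fraction(a, b) == Fraction(*c):
                hits.append((a, b))
            else:
                misses.append((a, b))

print(hits)          # the rare coincidences where the flawed method "works"
print(len(misses))   # the many cases where it fails
```

The method gives a right answer only for a handful of lucky coincidences (16/64 among them) and a wrong answer everywhere else, which is exactly the point: a method is not validated by the times it happens to work.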

9. Where there's smoke, there's smoke.
"Where there's smoke, there's fire" is a folk saying meant to express that if there's a rumor or suspicion of something, it must have started from a grain of truth. Scientists don't have the luxury of this assumption. We have to follow the data. Is it possible for a family to lose three children in a row to SIDS? Gossips will whisper what they want, but statistically speaking, unlikely things are bound to happen if you wait long enough. The police might suspect some kind of foul play, but if there's no evidence, it may be because there was no crime--just a tragic coincidence.
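The "unlikely things are bound to happen" point is just arithmetic. The probability below is invented purely for illustration; the principle is that the chance of a rare event striking *somebody*, 1 - (1 - p)^N, climbs toward certainty as the number of opportunities N grows:

```python
# Hypothetical figure, for illustration only: suppose some tragic
# coincidence has a one-in-a-million chance for any single family.
p = 1e-6

# The chance that it happens to at least one family somewhere grows
# quickly with the number of families you are watching.
for families in (1_000, 100_000, 10_000_000):
    chance_somewhere = 1 - (1 - p) ** families
    print(f"{families:>10,} families -> {chance_somewhere:.3f}")
```

With ten million families watched, the "one-in-a-million" coincidence is all but guaranteed to happen to someone. The family it happens to looks suspicious only if you ignore how many families were rolling the dice.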

10. Lack of evidence doesn't mean it's false.
It's a lazy brain that sorts things into two groups: true and false. A scientist trains himself to sort them into three groups: proven true, proven false, and unproven. When we don't have enough evidence to prove something true, that's not enough to make it false. It remains unproven until we have enough evidence either way. This applies to many of our other rules. For example, in the example from Law #6, there might actually be something bad in the ice cream; we just don't have enough data to say so. With Law #7, there are plenty of individual cases where more of something good really is better. And in the case of Law #9, the suspected explanation might actually be true. We just don't know yet, which means we need to keep an open mind and collect more data.