Why do clever osteopaths believe stupid things?
The International Academy of Osteopathy
http://www.osteopathie.eu/en
http://www.osteopathie.eu/en/publications
info@osteopathy.eu
Why do some clever osteopaths believe stupid things?
Luc Peeters, MSc.Ost.
Rationally thinking osteopaths who adhere to the concept of Evidence Based Practice
(EBP), and the profession itself, suffer from colleagues who believe in non-scientific
medical or pseudo-medical approaches to patients. Guru-type osteopaths and self-
declared specialists promote methods of their own, often giving them strange names
such as causamatics, osteosofie, morphologicum, biodynamics, silver bullets,
bioregulative medicine and others. Some osteopaths even think that osteopathy
amounts to life-coaching and relational therapy.
Enough scientific studies have been conducted to give us some insight into what
makes otherwise rational people adhere to irrational beliefs, and into what might be
done to prevent the growth of anti-science thinking.
Science advocates may fundamentally differ from pseudoscience advocates in the
way in which cognitive dissonance is reduced.
When faced with two conflicting ideas, the science advocate weighs the likelihood of
each idea against accepted knowledge and new information, then adopts the
most parsimonious position.
This requires the science advocate to change a previously held belief. Science
advocates reduce dissonance by accepting that they were wrong. Science
advocates can change their minds.
Pseudoscience advocates do not do this. When faced with unambiguous, conflicting
scientific evidence against their favored idea, they reject or rationalize the
disconfirming evidence as meaningless. They do this through the process of
"motivated reasoning" and hold on to their previously held, favored belief. In fact,
Festinger et al. showed in their work on cognitive dissonance that such a belief may
actually become reinforced after the disconfirming evidence has been rationalized
away.
Rationalization is the justification of controversial behaviour or feelings by explaining
them in a seemingly rational or logical manner, so as to avoid the true explanation. In
this way they are made consciously tolerable, even admirable and superior, by
plausible means.
The opposite of 'dissonance' is 'consonance'. If dissonance produces a deeply
negative emotional response, consonance produces an immediate positive emotional
response. We have all experienced this as well. We clap, we cheer, we nod in
approval when a speaker makes statements that are consonant with our beliefs.
It is known that people will embrace - with little or no scrutiny - arguments that agree
with their beliefs. That is, claims that produce cognitive 'consonance' are accepted at
face value alone.
Sometimes a research paper is proudly touted after a reading of no more than its title
and the abstract's conclusion. Rarely are the materials and methods of a consonant
study scrutinized for biases, errors and uncertainty.
On the other hand, we pick apart the details of studies that contradict our beliefs. We
just know that there must be an error, because our brains have already decided that
such studies must be wrong.
Most people consider themselves to be rational. We all like to think that our positions
on certain issues reflect the most reasonable stances to take based on the available
evidence. But when presented with evidence that conflicts with our positions
(especially those with which we self-identify in some way), it is natural to be extra
critical of the new data. We are prone to accept confirming evidence (confirmation
bias) with little scepticism. However, we become expert sceptics when faced with
data that may falsify our firmly held ideas.
The "backfire effect" is a term coined by Brendan Nyhan and Jason Reifler to
describe how some individuals, when confronted with evidence that conflicts with
their beliefs, come to hold their original position even more strongly. People tend to
dig in their heels and insist even more fervently that their position is true if they sense
that they are being attacked. They use motivated reasoning to preserve their belief
and reject the attack.
This explains why it seems impossible to change somebody's mind about a deeply
held belief simply by pointing to conflicting evidence. One cannot simply tell them
"You're wrong". That causes too much cognitive dissonance and "backfires".
It takes a different approach to avoid cognitive dissonance and the backfire effect.
For an adorable demonstration of these concepts in action, see this video. Aaliyah's
mother's solution is great. Enjoy.
(https://www.youtube.com/watch?v=7SdmT7NYJMg )
People do not usually adopt an unscientific or irrational concept in one leap. They get
there in small steps. Each step may produce some dissonance, which leads to
rationalization and reinforcement. Through each successive small, dissonance-
reducing step, they get deeper and deeper into the concept.
Cognitive dissonance is an integral part of our psychology. It is a barrier that prevents
individuals from recognizing when they are wrong. In the world of science, medicine
and osteopathy, recognizing our own errors and mistakes is vital. Patients' lives may
depend on it.
We are all prone to cognitive dissonance and to using motivated reasoning to
overcome it. Recognizing it at work in ourselves and being open to being wrong
are hallmarks of good, scientifically thinking osteopaths.
Are you one of us?
Luc Peeters, MSc.Ost.
Bibliography
Cognitive dissonance (video): https://www.youtube.com/watch?v=zuUPW86Nxo4
Festinger, Leon. A Theory of Cognitive Dissonance. Stanford University Press, 1957.
Rational and Irrational Beliefs: http://www.imd.inder.cu/adjuntos/article/535/Rational%20and%20Irrational%20Beliefs.pdf
Cognitive dissonance. The Skeptic's Dictionary, 2006. http://www.skepdic.com/cognitivedissonance.html
Nyhan, Brendan & Reifler, Jason. When Corrections Fail: http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf