The test of replicability: are you practicing fact or fiction?

In December, The New Yorker ran an article I found extremely interesting.  Titled "The Truth Wears Off: Is there something wrong with the scientific method?", it examined the difficulty scientists have in replicating well-established studies.  The author, Jonah Lehrer, highlights how the test of replicability, a foundation of modern research, is revealing that many accepted facts are losing their truth: scientists are struggling to repeat well-known studies and obtain similar results.
He cites a possible explanation drawn from research conducted in the 1930s by Joseph Rhine, a psychologist at Duke University.
Rhine was interested in extrasensory perception (ESP).  He used a deck of 25 cards, each printed with one of five different symbols, and asked subjects to guess which symbol was on each card.  Most of his subjects guessed about 20% of the cards correctly, exactly what chance predicts with five symbols.  One day, Rhine came across a subject named Adam Linzmayer who guessed nearly 50% of the cards correctly, at one point guessing nine in a row!  The odds of that streak happening by chance, without ESP, are roughly 1 in 2,000,000, and Linzmayer repeated the feat three times.
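The 1-in-2,000,000 figure is easy to check. A quick back-of-the-envelope sketch in Python (not from the article, and treating each guess as an independent 1-in-5 event, the usual simplification):

```python
# Probability of guessing 9 cards in a row correctly by pure chance,
# assuming five equally likely symbols and independent guesses.
p_single = 1 / 5                  # one chance in five per card
p_nine_in_a_row = p_single ** 9   # (1/5)^9

print(f"P(9 in a row) = {p_nine_in_a_row:.10f}")
print(f"Odds: about 1 in {round(1 / p_nine_in_a_row):,}")  # 1 in 1,953,125
```

So the exact odds are 1 in 1,953,125, which rounds to the article's "1 in 2,000,000."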
Rhine studied these results and was preparing an article on ESP for publication when he invited Linzmayer back to guess 1,000 more cards.  This time, Linzmayer's success rate dropped significantly; he appeared to be guessing correctly only at chance levels.  Rhine later noticed the same pattern in other individuals he believed had ESP, and he called it "the decline effect."  It was, simply, a failure of his data to replicate.
Now to some math…a failure of data to replicate most likely reflects regression toward the mean.  As an experiment is repeated, statistical flukes and extreme results are averaged out.  So in Rhine's case, the statistical fluke of apparent ESP was an illusion that vanished over time.
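Regression toward the mean is easy to demonstrate with a small simulation (a Python sketch of the idea, not anything from the article): screen many chance-level guessers, select the single best performer, and then retest that "star." The retest score falls back toward the expected 20%, just as Linzmayer's did.

```python
import random

random.seed(42)  # reproducible run

N_GUESSERS = 2000
N_CARDS = 25
P_CHANCE = 1 / 5  # five symbols, so 20% by chance

def run_session(n_cards=N_CARDS):
    """Score for one guesser with no ESP: each card is a 1-in-5 chance hit."""
    return sum(random.random() < P_CHANCE for _ in range(n_cards))

# Screening phase: test everyone once and pick the apparent "star."
first_scores = [run_session() for _ in range(N_GUESSERS)]
star = max(range(N_GUESSERS), key=lambda i: first_scores[i])
print(f"Star's screening score: {first_scores[star]}/{N_CARDS}")  # an extreme outlier

# Replication phase: retest only the star, many times.
retests = [run_session() for _ in range(1000)]
mean_retest = sum(retests) / len(retests)
print(f"Star's average on retest: {mean_retest:.1f}/{N_CARDS}")  # falls back to about 5 (20%)
```

The star looked special only because we selected the most extreme result out of 2,000; on retesting, performance regresses to the chance rate of about 5 correct out of 25.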
Another possible flaw in replication lies in the peer-review process.  When a new idea or paradigm is first proposed, peer review tends to be skewed toward, or entrenched in, positive results.  Over time the incentives shift, and the most notable results become those that disprove the once-exciting new theory.  This can be referred to as publication bias.  But while publication bias plays a role, it doesn't account for the researchers' findings themselves, which in many cases may be skewed by selective reporting of results.
After reading this article, which is not peer-reviewed, I am intrigued by the replicability of much of the research we currently follow as clinicians.  One article that particularly comes to mind was published in the January 2011 JOSPT: "Immediate Effects of Lumbar Spine Manipulation on the Resting and Contraction Thickness of Transversus Abdominis in Asymptomatic Individuals."  It examined the effects of lumbar spinal manipulation on the thickness of the transversus abdominis (TrA) in asymptomatic subjects.  In previous studies, including those behind the well-known clinical prediction rule for lumbar manipulation, examiners found that TrA thickness changed following a lumbar manipulation.  Because the subjects in those studies had low back pain (LBP), it could be speculated that the thickness change resulted from descending pain-inhibitory influences, which might have allowed greater relaxation.  In the current study, TrA thickness did not change with manipulation.
Could this study be an example of the decline effect, or of peer-reviewer eagerness to disprove a once-exciting theory? Possibly.  The results could also be correct and unflawed, and simply differ because the subjects examined had no LBP; the neurophysiological mechanisms affected by manipulation may differ between those with pain and those without.  In either case, we must continue to reevaluate and replicate our current research and techniques so that we can further define ourselves as effective clinicians.
Lehrer J. The Truth Wears Off: Is there something wrong with the scientific method? The New Yorker. December 13, 2010.
Puentedura EJ, Landers MR, Hurt K. Immediate Effects of Lumbar Spine Manipulation on the Resting and Contraction Thickness of Transversus Abdominis in Asymptomatic Individuals. JOSPT. 2011;41(1):13-21.

All Comments

  • Joe, that was a great article! Maybe it's just me, but I feel that our profession is trying very hard to validate itself as an independent profession at a doctoring level. The studies coming out are wonderful, ranging from prediction rules, effective combinations of treatments, and validated clinical tests to when to refer; the topics could go on and on. As evidence-based as we want to project ourselves to be, the APTA best represented our profession as "the science of healing and the art of caring." Unique to physical therapists is the act of caring about a patient and then using the best available science to help them. Interestingly, this slogan was meant, in my opinion, to represent one person and one physical therapist. We can care about many patients and attempt to group them homogeneously according to like symptoms, but then the art of caring for one individual diminishes. There are many verbal and nonverbal interactions that occur with our treatments; performing a manual therapy technique, or projecting to your patient that it is safe for them to move, are all difficult variables to capture when creating outcomes. If we were entirely able to categorize each patient, provide an intervention, and always have a 100% outcome, we would truly be machines and not human! The more we understand how the human body moves, the more we understand how much we do not know. Seeking the truth means accepting that the more we know, the more we know how much we do not know. Unfortunately, insurance companies do not want an answer like that. Whether we practice fact or fiction depends on the individual therapist-patient interaction and on whether the patient believes they have improved. Being a PHYSICAL therapist who wants the truth to validate what we do, we have to accept human variability in MENTAL processing.

    Francois Prizinski, March 23, 2011, 3:45 pm
    Check out Lorimer Moseley's explanation of how association of data is not causation. The video isn't the best, but you get the point. He's a genius when it comes to understanding pain but, in my opinion, also a genius when it comes to understanding the interpretation of research and research bias.

    joebrence9, March 24, 2011, 2:52 am
