Metacognition can be considered a synonym for reflection in applied learning theory. However, metacognition is a very complex phenomenon. It refers to the cognitive control and monitoring of all sorts of cognitive processes like perception, action, memory, reasoning or emoting.

A recent #DPTstudent tweet chat dealt with the concept of metacognition broadly (list of links), but more specifically discussed the need for critical thinking in education and clinical practice. Most agreed on the dire need for critical thinking skills. But many #DPTstudents felt they had no conceptual construct for how to develop, assess, and continually evolve thinking skills in a formal, structured manner. Many tweeted that they had never been exposed to the concept of metacognition or the specifics of critical thinking, although most stated that “critical thinking” and “clinical decision making” were commonly referenced.

What’s more important than improving mental skill sets?

Thinking is the foundation of conscious analysis. Yet, even with a keen focus on assessing and improving our thinking capacities, unconscious processes influence not only how and why we think, but what decisions we make, both in and out of the clinic. We are humans. Humans with biased minds. Brains that, by default, rationalize rather than think rationally…

Everyone thinks; it is our nature to do so. But much of our thinking, left to itself, is biased, distorted, partial, uninformed or down-right prejudiced. Yet the quality of our life and that of what we produce, make, or build depends precisely on the quality of our thought. Shoddy thinking is costly, both in money and in quality of life. Excellence in thought, however, must be systematically cultivated. – Via CriticalThinking.org

Need a Model

Mary Derrick observed previously in her post that the words critical thinking and clinical decision making are often referenced without much deeper discussion of what these two concepts entail or how to develop them. Students agree that critical thinking and sound clinical decision making are stressed in their pre-professional education. But all levels of education appear to grossly lack formalized courses and structured approaches. The words are presented, but rarely systematically defined. The actual skills are rarely practiced and subsequently refined. Students thus lack not only exposure to and didactic knowledge of metacognition, critical thinking, and decision making, but also experience in evolving these mental skills.


Critical Thinking Wheel via Diane Jacobs @dfjPT 

Need to teach how to think

Alan Besselink argued that a scientific inquiry model of patient care is, from his view, “the one approach.”

There is, in fact, one approach that provides a foundation for ALL treatment approaches: sound, science-based clinical reasoning and principles of assessment, combined with some sound logic and critical thinking.

One approach to all patients requires an ability to gather relevant information given the context of the patient scenario. This occurs via the clinician’s ability to ask the appropriate questions utilizing appropriate communication strategies. Sound critical thinking requires the clinician to hold their own reasoning processes to scrutiny in an attempt to minimize confirmation bias if at all possible. It also requires the clinician to have a firm regard for the nature of “normal” and the statistical variations that occur while adapting to the demands of life on planet earth.

Philosophically, I agree. It appears the question “what works?” has been overemphasized at the potential sacrifice of questions such as “how does this work?” “why do we do this?” and “why do we think this?”

Unfortunately, the construct of evidence-based practice assumes that the user applying the EBP model is well versed not only in research appraisal, but also in critical thinking. The structure of evidence-based practice over-relies on outcomes studies. It lacks a built-in process for integrating other sources of knowledge, as well as for asking the applicable question “does this work as theoretically proposed?” The evidence hierarchy is structured around, and concerned with, efficacy and effectiveness only. Many will be quick to point out that, from a standpoint of scientific rigor, the evidence hierarchy is structured this way because other forms of inquiry (basic physiology, animal models, case reports, case series, cohort studies, etc.) cannot truly answer the question “what works?” without significant bias. Robust conclusions about causation cannot be made from less controlled experiments. And this, of course, is true. In terms of assessing effectiveness and efficacy in isolation, the evidence hierarchy is appropriately structured.

But the evidence hierarchy does not consider knowledge from other fields or basic science, and thus by its structure explicitly ignores plausibility in both theory and practice. To be fair, plausibility does not necessarily support efficacy or effectiveness. So it is still imperative, and absolutely necessary, to learn the methodology of clinical science. Understanding how the design of investigations affects the questions they can truly answer precedes appropriate assessment and conclusions. Limits to the conclusions that can be drawn are thus explicitly addressed.
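To make the plausibility point concrete, here is a minimal sketch (not from the original post) of why prior plausibility matters when interpreting a “positive” trial. It applies Bayes’ theorem to a simple screening model; the alpha, power, and prior values are illustrative assumptions only, not estimates for any real intervention.

```python
# Hypothetical sketch: how prior plausibility changes what a single
# "statistically significant" trial result most likely means.
# Assumptions (illustrative only): alpha = false-positive rate of the test,
# power = probability of detecting a real effect.

def prob_effect_is_real(prior, alpha=0.05, power=0.80):
    """Posterior probability that an effect is real, given one
    statistically significant result (Bayes' theorem)."""
    true_positive = power * prior          # real effect, detected
    false_positive = alpha * (1 - prior)   # no effect, "significant" anyway
    return true_positive / (true_positive + false_positive)

for prior in (0.50, 0.10, 0.01):
    print(f"prior plausibility {prior:.0%} -> "
          f"posterior {prob_effect_is_real(prior):.0%}")
```

Under these assumed numbers, a significant result corresponds to roughly a 94% posterior when the mechanism is a coin flip (50% prior), about 64% at a 10% prior, and only about 14% at a 1% prior. The same outcome study carries far less weight when the underlying explanatory model is implausible, which is the sense in which “grand claims require grand evidence.”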

Need the Why

Because of the focus on evidence-based practice, which inherently (overly?) values randomized controlled trials and outcomes studies over basic science knowledge and prior plausibility, students continue to learn interventions and techniques while routinely asking “what works?” Questions such as “how did I decide what works?” “why do I think this works?” and “what else could explain this effect?” also need to be commonly addressed in the classroom, clinic, and research. Such questions require formalized critical thought processes and skills.

These questions are especially applicable to the profession of physical therapy, as many of its interventions have a questionable, or at least variable, theoretical and mechanistic basis, in conjunction with broad-ranging explanatory models. This is true regardless of effectiveness or efficacy. In fact, it is a separate issue. Physical therapy practice is prone to the observation of an effect followed by a theoretical construct (story) that attempts to explain the effect. A focus on outcomes-based research perpetuates these theoretical constructs even if the plausibility of the explanatory model is unlikely. In short, while our interventions may work, on the whole we are not quite sure why. @JasonSilvernail‘s post EBP, Deep Models, and Scientific Reasoning is a must-read on this topic.

The profession suffers from confirmation bias with regard to the constructs guiding our understanding of intervention effects. In addition, most, if not all, interventions physical therapists utilize will have a variety of non-specific effects. These two issues alone highlight the need for critical thought to ensure that our theoretical models, guiding constructs, and clinical processes evolve appropriately, and to facilitate appropriate interpretation of outcomes studies.

It is not “what works?” vs. “why does this work?” Instead, we need to integrate outcomes studies with knowledge of, and research into, why and how certain interventions may yield results. This requires broadening our “evidence” lens to include physiology, neuroscience, and psychology as foundational constructs in education and clinical care. Further, research agendas focused on mechanism-based investigations are important to evolving our explanatory models. Education, research, and ultimately clinical care require both approaches. Interpretation, integration, and application of research findings, be they outcomes or mechanistic, necessitate robust cognitive skills. But do we formally teach these concepts? Do we formally practice the mental skills?

So, now what?

There appears to be an obvious need for, and obvious value in, learning how to think. But that is just the start. Learning to think about thinking is required to improve the specific skill of critical thinking. The understanding and application of evidence-based practice needs more robust analysis. Growth in critical thinking and metacognition, together with an evolution from evidence-based to science-based practice, produces the foundation for strong clinical decision making. The call for evidence-based medicine to evolve into science-based medicine focuses on ensuring clinicians interpret outcomes studies more completely. It appears to put strong emphasis on increased critical thinking and knowledge integration.

Does Evidence Based Medicine undervalue basic science and overvalue Randomized Controlled Trials?

A difference between Sackett’s definition [Evidence Based Practice] and ours [Science Based Medicine] is that by “current best evidence” Sackett means the results of RCTs…A related issue is the definition of “science.” In common use the word has at least three, distinct meanings:

1. The scientific pursuit, including the collective institutions and individuals who “do” science;

2. The scientific method;

3. The body of knowledge that has emerged from that pursuit and method (I’ve called this “established knowledge”; Dr. Gorski has called it “settled science”).

I will argue that when EBM practitioners use the word “science,” they are overwhelmingly referring to a small subset of the second definition: RCTs conceived and interpreted by frequentist statistics. We at SBM use “science” to mean both definitions 2 and 3, as the phrase “cumulative scientific knowledge from all relevant disciplines” should make clear. That is the important distinction between SBM and EBM. “Settled science” refutes many highly implausible medical claims—that’s why they can be judged highly implausible. EBM, as we’ve shown and will show again here, mostly fails to acknowledge this fact.


What to do?

1. Learn how humans think by default: Biased
2. Learn the common tricks and shortcuts our minds make and take
3. Understand logical fallacies, cognitive biases, and the mechanics of disagreement
4. Meta-cognate: Think about your own thinking with new knowledge
5. Find a mentor or partner to critique your thought processes: Prove yourself wrong
6. Critique thought processes, lines of reasoning, and arguments formally and informally
7. Debate and discuss using a formalized structure
8. Think, reflect, question, and assess
9. Discuss & Disagree
10. Repeat

Science Based Practice…

1. Foundations in basic science: chemistry, physics, physiology, mathematics
2. Prior plausibility: “grand claims require grand evidence”
3. Research from other relevant disciplines from physics to psychology
4. Mechanics of science: research design and statistics
5. Evidence Based (outcomes) Hierarchy

Questions lead, naturally, to more questions. Inquiry breeds more inquiry. Disagreement forms the foundation of debate. And thus, Eric Robertson advocates for embracing ignorance:

Ignorance is not an end point. It’s not a static state. Ignorance isn’t permanent. Instead it’s the tool that enables one to learn. Ignorance is the spark that ignites scholarly inquiry.

Ignorance: the secret weapon of the expert.

Growth is rarely comfortable, but it’s necessary. And, that’s a lot to think about….

Resources

A Practical Guide to Critical Thinking
CriticalThinking.org
Logical Fallacies
Critical Thinking Structure
List of Fallacies
List of Cognitive Biases
Science Based Medicine
Clinical Decision Making Research (via Scott Morrison)
Clinical Decision Making Model
Thinking about Thinking: Metacognition Stanford University School of Education
Occam’s Razor
The PT Podcast: Science Series
Understanding Science via Tony Ingram of BBoyScience
I don’t get paid enough to think this hard by @RogerKerry1 (his blog is fantastic)