#DPTstudent You don’t need clinical experience…

You need deliberate practice. And don't be fooled: you still need a lot of it.

The three most dangerous words in medicine are “In my experience.” –Mark Crislip, MD via Science Based Medicine

Especially when you are unaware of its caveats and limitations. Following Malcolm Gladwell's Outliers, many advocate the 10,000-hour rule regarding the development of expertise. While this is a useful illustration of the sheer volume of practice necessary to develop mastery, it is likely overly simplistic for a concept as complex as expertise in a complicated craft.

Maria Popova comments that

The secret to continued improvement, it turns out, isn’t the amount of time invested but the quality of that time. It sounds simple and obvious enough, and yet so much of both our formal education and the informal ways in which we go about pursuing success in skill-based fields is built around the premise of sheer time investment. Instead, the factor Ericsson and other psychologists have identified as the main predictor of success is deliberate practice — persistent training to which you give your full concentration rather than just your time, often guided by a skilled expert, coach, or mentor. It’s a qualitative difference in how you pay attention, not a quantitative measure of clocking in the hours.

Clinical care, research, and critical thinking are no different. It is not experience, as a linear measurement of time, but rather the quality and volume of practice that matter in developing high-level skills. Left to their natural devices, our brains and psyches are stubbornly prone to bias and errors in rational thinking. Confirmation bias and improper associations such as post hoc ergo propter hoc (since event Y followed event X, event Y must have been caused by event X) are common and often unrecognized. Thus, practice must be reflective and critical. Practice must be varied and evolve over time.

The skills required (mental, psychomotor, interpersonal, and otherwise) are staggering. They involve knowledge of current research, research methods, critical thinking, connection of concepts, connection of knowledge, problem solving, listening, examination, hands-on techniques, clinical decision making, and patient interaction.

Evidence is more important than experience.
Evidence cannot replace experience.
You can’t have evidence based practice without experience.
Experience is meaningless without evidence.

Experience vs. Evidence. I observe these views that appear to sit at separate ends of a spectrum, that appear to be contradictory, but in reality are just different concepts. So, does experience matter? Yes, but experience as conceptualized and measured in years is both insufficient and incomplete. The focus should not be the mere acquisition of experience, but instead proper, focused practice with the appropriate processes required to develop the necessary skills for mastery. This is not to say volume is meaningless; even focused practice requires repetition and time to be effective. Quantity is ultimately meaningless without quality. Quality is meaningless if it can't be repeated and refined.

Adam Rufa and Joe Brence of Forward Thinking PT examine the concept of “clinical experience.” Their post series, The Experience Wall, assesses perceptions, memories, and interpretation within clinical care. Joe Brence highlights how experience may not result in linear increases in clinical skill and outlines a new definition of clinical experience:

I propose clinical expertise is not simply gained through practice. It is built through assessment of your ability to think, reason and apply scientifically plausible principles into practice. It requires peer-review. It requires your thoughts and ideas to be challenged. It requires a hint of uncertainty.


New grads should be armed with the latest research, and often tout that they are more "evidence based." Without "experience," they should rely mostly on didactic knowledge, research, and strong science-based logic, as they lack sufficient "experience," or meaningful practice volume. But clinicians with years and years of experience may claim that this research evidence alone is empty without time in the clinic. So, who is right? Well, both, of course.

Knowledge of research does not mean you can apply it. True.
Having experience does not mean you are providing "evidence based" interventions. True.
Proper knowledge and experience do not ensure best care. True.

Paradoxes exist, and hacking may be helpful for a broader, more accurate assessment of the hows and whys of clinical care. Appropriate "evidence base" and proper "experience" are separate but interacting components of developing into a high-level clinician. Ideally, these are synergistic principles that contribute to each other, instead of mutually exclusive entities developed in isolation. Neither "experience" nor "evidence" ensures accurate research interpretation and application. Knowledge of current literature, appraisal of research, application of science, translating understanding into practice, volume of clinical practice, and level of clinical ability (ranging from communication to therapeutic alliance to clinical decision making) are all differing skills. Of course, this is not an exhaustive list or conceptual framework. But, in essence, developing as a clinician, no matter our professional age, is more than simply evidence or experience.

Residencies and fellowships hold the potential to accelerate development when implemented effectively. The explicit curriculum, reflective practice, and mentorship can result in a more deliberate, critical, and self-reflective form of professional growth and clinical care. But, of course, residencies and fellowships do not guarantee proper mental skill acquisition or development, especially if founded on misguided assumptions or practice theories (though this is a wholly separate topic); nor are they the only means to foster such growth. I have encountered plenty of physical therapists with bachelor's degrees whose critical thinking, clinical decision making, and "evidence" base are quite staggering. And, conversely, I've interacted with many residency- and fellowship-trained physical therapists with doctor of physical therapy degrees whose reasoning and treatment skills were quite suspect. And yet, as Eric Robertson and Lauren Kealy discuss in their post series, The Bane of the New Professional remains significant. New grads are empowered and motivated to engage within the profession, yet some clinics appear to value experience. So, what gives?

Experience is bias.
Evidence is rigid.
Experience lacks rigor and control.
Evidence lacks the experience of applying it to the individual.

There are great janitors and bad rocket scientists. Hacking your education, regardless of your experience, is necessary. These concepts of experience and evidence require critical reappraisal: a reframing that recognizes the necessity of a synergistic relationship and, with regard to the myriad skills encompassed in clinical care, an understanding of the non-linear progression of each. It's much more than evidence vs. experience, evidence or experience, or even evidence and experience. And, it takes practice.

9 Replies to “#DPTstudent You don’t need clinical experience…”

  1. Thanks for this.

The distinction here is very valuable, looking at experience as unstructured exposure and comparing that to deliberate practice, which optimizes performance. That's one of the clear lessons of expert performance research in certain domains.

    When we engage a domain that has predictable rules and patterns, where we have normative criteria for performance, and where we can optimize performance by internalizing the predictable rules, consistent deliberate practice with good feedback over a long period of time builds a reliable increasing base of expertise. And we don’t necessarily build that base from exposure alone.

I want to argue a step further, though: in certain domains, not only does "mere experience" not reliably build decision-making competence, but even deliberate practice can fail us. The reason is that in some domains, although we have clear normative criteria for success, we do not also have clear feedback about the best option in any given situation, so the validity of the environment for learning from deliberate practice is relatively low. I suggest that's why "clinical experience" tends to fail us relative to actuarial decision-making methods in well-defined decision situations, but it would also tend to mean that deliberate practice would be problematic as well.

    To me the more promising approaches are based on naturalistic decision making models of expertise that are better suited to domains where there is novelty, ambiguous information, time pressure, and there may be no actual optimization of performance.

    There may however be domain general principles other than or in addition to deliberate practice that are suited to training methods like simulation. These principles I think fall under the umbrella of self and other metacognition, thinking about our own thinking and thinking about the thinking of other people. That’s how we correct for cognitive errors, and use strategies like making adaptive use of our pattern recognition by creating good mental models of situations as they unfold. This kind of competence I think improves further on what we can do with purely actuarial methods as well as getting past the limitations of unstructured “clinical experience.”

    This implies we should emphasize elicitation and analysis of how clinical exemplars make decisions and transfer their knowledge to simulations and technical displays. The methods for doing these things have evolved greatly in the decades since the early and largely inadequate rule-based “expert systems” and I think now can help identify leverage points between human and machines for making more effective decisions.

    Some examples of applying metacognitive strategies for competence in decision making:

    Cognitive Forcing Strategies for reducing errors

    Experience with simulation training

    An intriguing notion of medical expertise being scaffolded in predictable layers of adaptive expertise, introducing the idea of illness scripts.

    The value of thinking about thinking as a domain general strategy for high novelty and high ambiguity domains is covered well in this book.

    The overarching concept of macrocognition as a useful general way of thinking about thinking in complex decision making environments and distinguishing what experts do differently is introduced here.
