Over the week, I have sketched three attempts to answer the questions: What is an expert? and How does someone become an expert? Though I’ve glossed over many details, the accounts point roughly to the following features of expertise:
- Expertise involves extensive competence in a domain (including extensive tacit or explicit knowledge).
- Expert competence is acquired through specialized practice, such that expertise has both cognitive and performative dimensions.
- Specialized practice takes place within and requires social structures that make it possible to enhance competence to the level of expertise.
There are notable counterexamples to this consensus (see Hambrick et al. 2018). And one type of counterexample stands out as especially concerning: the person who acquires expertise in a domain where deliberate practice and guidance from other experts aren't much help.
Consider the expert political forecaster (Tetlock and Gardner 2015). Given that political events are highly context-dependent and largely unpredictable, it is surprising to learn that there are experts who outstrip chance in making predictions about them.
The background conditions for political events cannot be replicated in a laboratory or practice hall. And forecasting cannot be broken into constituent parts, so competence cannot be analyzed piecemeal, as the deliberate practice model requires. Nor is there a strong feedback mechanism for evaluating performance, apart from whether the predicted event occurs within the predicted time frame.
Kind and Wicked Learning Environments
Psychologist Robin Hogarth (2001; 2010) calls domains like forecasting "wicked learning environments": environments in which feedback is either delayed or not strongly correlated with competence. Other examples include firefighting and military command (see Klein 1998).
In contrast, expertise in domains like violin playing or ballet is acquired in what Hogarth calls “kind learning environments.” In kind learning environments, feedback is immediate and strongly correlated with competence.
Since wicked domains are not candidates for the deliberate practice model, there must be another explanation for how experts in those domains become experts. I think a plausible explanation involves expanding what psychologist Daniel Kahneman (2011) calls “System 1” and “System 2” thinking to describe the kinds of training involved in the two environments.
The Dual-Process Theory of Reasoning
Violin expertise is cultivated through continual attention to and practice with each aspect of finger placement and bow movement. Each mistake is a check on the process of play. Those mistakes engage the player's System 2 cognitive processes, which are slow, reflective processes of conscious thinking.
After enough System 2 training, playing becomes more fluid, tacit. At this point, System 1 cognitive processing (fast, reflexive thinking) can take over, leaving the player to focus attention on the more challenging aspects of play. The result is the appearance of effortless flow, though the player’s mind is always monitoring for errors and new challenges.
System 1 and System 2 Expertise
The idea is that training System 2 processes produces System 1-style (fast and precise) expert performance. I call this System 1 Expertise.
Now consider a domain like internal medicine, in which medicines affect patients differently, where diagnoses must be updated as the patient’s lab results change, and where multiple conditions must be managed simultaneously.
Internal medicine doctors rely heavily on System 1 Expertise (quick access to information about pharmacology, drug interactions, physiologic systems, diagnostic categories, etc.). But they don't receive enough consistent feedback from patient to patient to apply this expertise with System 1 efficiency. Attempting to turn internal medicine into System 1 Expertise invites cognitive errors (like anchoring and confirmation bias), which in turn lead to diagnostic and communication errors.
The key to training expertise in internal medicine—and other wicked environments—is to slow down System 1 Expertise to reflect on how to appropriately reason about new and idiosyncratic situations. This is often accomplished through residency programs, where students are continually challenged to give explicit reasons for their decisions and to consider alternatives.
Other strategies include cultivating competence with statistical reasoning (e.g., regular updating of probabilities; considering base rate probabilities before the probabilities of parts of an event) and enlisting a diversity of perspectives on a problem before acting (see Tetlock and Gardner 2015 for more).
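The value of "considering base rate probabilities" can be made concrete with a short Bayes'-theorem calculation. The sketch below is my own illustration, not an example from the post; the function name `posterior` and all of the numbers (sensitivity, false-positive rate, base rates) are hypothetical.

```python
# Illustrative sketch (hypothetical numbers): why starting from the base rate
# matters when interpreting a positive diagnostic test.

def posterior(base_rate, sensitivity, false_positive_rate):
    """P(condition | positive test), computed via Bayes' theorem."""
    true_positives = base_rate * sensitivity
    false_positives = (1 - base_rate) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Same test (90% sensitive, 5% false-positive rate), two different base rates:
common = posterior(0.20, 0.90, 0.05)  # common condition: posterior ~0.82
rare = posterior(0.01, 0.90, 0.05)    # rare condition: posterior ~0.15

print(f"common condition: {common:.2f}, rare condition: {rare:.2f}")
```

The same positive result warrants very different confidence depending on the prior, which is exactly the kind of adjustment System 1 Expertise tends to skip. "Regular updating" amounts to feeding each new posterior back in as the next base rate when further evidence arrives.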
The idea here is that training System 1 processes to attend to non-standard features of a case produces System 2-style (methodical and accurate) expert performance. I call this System 2 Expertise. And like System 1 Expertise, System 2 Expertise depends heavily on immersive social structures, like residency programs, to guide training.
The Cognitive Systems Account of Expertise
Together, these two paths to expertise constitute what I call the Cognitive Systems Account. Characterizing expertise this way has a number of benefits. It maintains the objective nature of expertise, fulfilling the central motivations for truth-based accounts while avoiding the problem of deciding who has enough true beliefs to count as an expert. It incorporates the vast empirical research supporting deliberate practice as a central path to expertise in kind environments. It incorporates growing empirical research on training expertise in wicked environments. And both paths explicitly acknowledge a debt to extensive social structures.
A Note of Thanks
I hope these five posts have piqued your interest in expertise studies. Thanks to The Brains Blog for the opportunity, and thanks to you for reading along. If you have thoughts or suggestions about anything I’ve written, don’t hesitate to leave a comment.
References
Hambrick, David Z., Guillermo Campitelli, and Brooke N. Macnamara (eds.) (2018). The Science of Expertise: Behavioral, Neural, and Genetic Approaches to Complex Skill, London: Routledge.
Hogarth, Robin (2001). Educating Intuition, Chicago: University of Chicago Press.
_____ (2010). “Intuition: A Challenge for Psychological Research on Decision Making.” Psychological Inquiry 21: 338-353.
Kahneman, Daniel (2011). Thinking, Fast and Slow, New York: Farrar, Straus, & Giroux.
Klein, Gary (1998). Sources of Power: How People Make Decisions, Cambridge, MA: The MIT Press.
Tetlock, Philip E. and Dan Gardner (2015). Superforecasting: The Art and Science of Prediction, New York: Crown.
Jamie Carlin Watson, PhD, is Assistant Professor of Medical Humanities and Bioethics at University of Arkansas for Medical Sciences, Little Rock, AR and author of Expertise: A Philosophical Introduction.