Introduction
Clinical reasoning can be defined as “the cognitive processes physicians use to diagnose and manage patients.”1(p.248) It involves the decision processes required for diagnosis and treatment planning, alongside influential contextual and situational factors.2 Over the past 40 years of research in human medicine,3 dramatic developments have occurred in the understanding of both the cognitive underpinnings of clinical reasoning in physicians and the practical demonstration of the skill as a health professional.
Clinical reasoning is also a fundamental skill for veterinary surgeons.4 In contrast to human medicine, there have been very few studies dedicated to understanding the process of veterinary clinical reasoning.5,6 As a result, veterinary educators are uncertain how to extrapolate medical research findings to their own field, and where differences between the disciplines affect decision making. These uncertainties, combined with the embedded nature of the skill within curricula, make developing clinical decision making a “formidable challenge to veterinary educators and their students.”7(p.200)
Studies into medical and veterinary undergraduate clinical reasoning development frequently examine the effect of a specific intervention on the reasoning skills of students, not the current reasoning development within an established curriculum. Although these interventions can have positive effects,8–12 graduating with competence in clinical reasoning undoubtedly depends on more than any single teaching activity. Evaluation of the contribution and effectiveness of all aspects of the curriculum to clinical reasoning development is needed to understand shortcomings and to identify the need for, and appropriate use of, these interventions.
Understanding veterinary student reasoning development has recently increased in urgency, as the Royal College of Veterinary Surgeons (RCVS) now includes clinical reasoning ability as a Day One Competency.4 The work of Tomlin et al.13,14 provides the greatest insight into veterinary undergraduate clinical reasoning, demonstrating that students' clinical decision-making methods and opinions can differ substantially from what their clinical teachers predict. This suggests that educators' assumptions about reasoning development within curricula are unreliable. However, that study provides only a snapshot of the process during a final-year examination, which is difficult to extrapolate to the whole program. Further information is needed to understand how veterinary students learn to make clinical decisions, what level of competence they achieve, and how this process can be optimized.
The aim of this study was to use the University of Nottingham School of Veterinary Medicine and Science (SVMS) as a case study to investigate the development of clinical reasoning among veterinary students. It was hoped that information gained from a detailed investigation of one veterinary curriculum in the United Kingdom would provide some insight into clinical reasoning development that could be generalized to other veterinary schools15 and contribute to a general understanding of the process.
The 5-year Veterinary Medicine and Science program at the SVMS is a vertically integrated spiral curriculum arranged into body system modules (e.g., cardiorespiratory system). Harden describes a spiral curriculum as “one in which there is an iterative revisiting of topics, subjects or themes throughout the course.”16(p.141) Importantly, each topic must be built upon with each encounter, increasing the skill of the student with time. The SVMS also uses a distributed model whereby the clinical practice modules that make up the final year of the program are taught offsite by university staff at associate veterinary practices. In addition to this practical experience, the RCVS requires all veterinary students to complete 26 weeks of clinical extramural studies (CEMS), consisting of workplace-based learning in private veterinary practices during holiday periods.
At the SVMS, clinical reasoning is considered an “embedded” topic, meaning it is integrated throughout all modules of the program17 within various teaching sessions (e.g., case-based learning [CBL]). There is also a dedicated lecture and a practical session explaining the concept and process of clinical reasoning to students in the third year of the program. Students are examined on their clinical reasoning ability in the fourth and final years of the program using case-based questions. This study aimed to clarify where and how decision-making expertise was developed.
Methods
We used Harden's conceptualization of a curriculum18 as a framework for analysis. This model presents three overlapping but separate components within a curriculum: (1) the information declared to be taught, (2) what is actually taught, and (3) what the student actually learns. As clinical reasoning is a topic integrated within many aspects of the SVMS curriculum, and thus difficult to isolate and access, structuring the study in this way allowed us to analyze the curriculum systematically, ensuring all perspectives and experiences were considered. Harden includes the hidden curriculum in his framework, embedded within the “learned” perspective.
We used a mixed-methods approach. The concurrent mixed design19 had two stages: (1) quantitative analysis of the declared curriculum through document content analysis and (2) qualitative analysis of the taught and learned curricula through staff and student/graduate perceptions, respectively. We integrated inferences from these two data sets after analysis. The SVMS Ethics Committee approved all components of the study.
Content Analysis of the Declared Curriculum
We analyzed the declared curriculum by conducting a document content analysis, a process that codes and quantitatively analyzes qualitative data.20 Method guidelines by Cohen et al.21 were modified by selectively coding only information that related to clinical reasoning. The inclusion and exclusion criteria for the coding are shown in Table 1.
We selected documents using a purposive sampling technique, whereby all documents describing the content of the SVMS curriculum were included. These were sourced from the Teaching, Learning, and Assessment department. As the SVMS has been operational for just 9 years, we found only 11 documents, most created for the purpose of accreditation. These included detailed learning objective records, student handbooks, self-evaluation reports, and program specifications. No documents were excluded.
In two of the documents, curriculum learning objectives were recorded next to the session type in which they were delivered (i.e., lecture, practical, self-directed learning [SDL], seminar, or CBL). In these documents, we noted the session type associated with each coded learning objective and calculated the percentage of codes (and therefore learning objectives relating to clinical reasoning) that appeared in each session type.
Thematic Analysis of the Taught and Learned Curricula
The taught and learned curricula were investigated qualitatively using the perceptions of SVMS staff, students, and recent graduates. Separate focus groups were held with SVMS staff (total of 16 participants) and students (total of 16 participants). Interviews were held with five recent SVMS graduates.
Focus Groups
Using a non-randomized purposive sampling technique, all staff involved in the teaching or planning of key curriculum areas were invited to participate in a focus group. Two focus groups were run with volunteer staff members, one with 8 participants and the other with 10.
A convenience sample of SVMS students was recruited via email. First-year students were not included, as they had very limited experience of SVMS teaching (data collection took place within the first 2 weeks of a new student intake). Two focus groups of eight students each were run, each containing two students from each year group (years 2–5).
Both staff and student focus groups used a semi-structured questioning approach and lasted approximately 90 minutes. The participants of all groups were provided with a definition of clinical reasoning. Questions focused on participant perceptions of clinical reasoning as a process and how they felt it develops during the SVMS curriculum.
Interviews
A convenience sample of SVMS graduates less than 2 years post-qualification were interviewed individually to determine their view of the learned curriculum and their experiences of clinical reasoning in their first job. Interviews were semi-structured and conducted both in person and by telephone, lasting between 45 and 60 minutes. Participants from small-animal, equine, and farm-animal practices were included. Questions focused on competence in clinical reasoning upon graduation and on perceptions of how the SVMS curriculum assisted or hindered development.
Analysis
Interviews and focus groups were audio recorded and transcribed. We combined transcriptions from all focus groups and interviews into one data set for ongoing analysis. Data collection ceased when (1) at least two transcripts had been collected for each cohort (staff/student/graduate) and (2) data saturation occurred. Thematic analysis was performed using guidelines developed by Braun and Clarke.22 Complete inductive code generation was performed and managed in NVivo (QSR, version 10). Codes were then interpreted and grouped to form subthemes and themes, which were iteratively revised and edited. A 10% selection of the data was coded by a second researcher and agreement reached to ensure a consistent approach. Once coding was complete, all themes were defined and explained.
Discussion
This study has highlighted the successes and the shortcomings of a veterinary curriculum in fostering clinical reasoning development in students. A mixed-methods approach was used to “draw from the strengths and minimize the weaknesses”23(p.14) of the qualitative and quantitative paradigms. This allowed methods to be chosen according to suitability, unrestricted by positivist or constructionist epistemologies.19 The study findings indicate that the SVMS is producing graduates who can function as veterinary surgeons and are confident in certain aspects of decision making, but who are by no means “skilled.” As a result, they may need to develop their reasoning ability significantly once in practice. Although new graduates are not expected to be expert clinical decision makers, their current shortfall is such that it may be increasing their stress burden. While the specific level of deficit depends on the individual, all graduates reported some clinical reasoning challenges for which they felt unprepared. This appears to contradict the opinions of surveyed graduates from other veterinary schools,24,25 who report a good grounding in clinical decision-making skills during their programs. However, survey data are limited, and further qualitative investigation in one study24 revealed a lack of confidence in new graduates similar to that reported here, despite high survey scores. As the RCVS has recently included clinical reasoning as a Day One Competency,4 more research is needed to clarify the competence of new graduates. This study demonstrates the benefits of performing a structured mixed-methods analysis to assist with this.
It can be argued that the reasoning shortfall experienced by SVMS graduates can be filled only once they are working alone in practice, and that it is impossible to produce a graduate who is fully competent in this skill. However, the theme of holistic decision making suggests methods, such as simulation, to try to fill this gap in experience and create a more “practice-ready” graduate. Simulation has been shown to improve clinical reasoning in other disciplines,26–29 but there are countless ways to implement it, meaning trials of specific interventions are needed in this area before curricular changes can be made. In veterinary medicine, one study has demonstrated the potential of contextualized simulation to improve decision-making skills.30 Although this research relies on student self-assessment data, and therefore lacks objective measurement, it provides good reason to investigate simulation further as a method of clinical reasoning development.
It is also apparent that the “real-life” aspects of decision making (e.g., clients, finances) need to be incorporated into teaching,30,31 as it seems veterinary reasoning has more dimensions than clinical knowledge alone.7 This corresponds to research in medicine demonstrating that decision accuracy is affected by context and interference,2 suggesting that these factors need to be integrated into teaching. It is interesting to note that direct efforts by the SVMS to teach students clinical reasoning, including lectures, practicals, and evidence-based medicine sessions, were not described by students as influencing their skill development. This may indicate that students do not associate the “classroom” version of decision making with the “consultation room” version.
Creating responsibility for decisions is a theme that emerged very strongly in this study, but such responsibility is extremely difficult to recreate. Owing to animal welfare concerns, students will never be able to have the “last say” on a case. This is detrimental to development, as graduates cite a lack of experience working with responsibility as a key factor making the transition to practice difficult.25 While innovations such as virtual patients are a potential way to give students decision-making power,8,32,33 they still have limitations. Students indicated that replacing medical responsibility with another high-stakes outcome, particularly embarrassment at poor performance in front of a client or clinician, might be an effective way to replicate pressure and improve performance. Further research comparing “true” responsibility with other motivators to perform well is needed, but this study corroborates research by Baillie et al.30 suggesting that using real or standardized clients during decision-making sessions to create this “performance pressure” may be beneficial.
The components identified as contributing to clinical reasoning development (critical thinking instruction, experience in practice, knowledge, and life skills) are similar to findings from studies examining individual interventions.12,34–37 The fact that knowledge is perceived by staff, students, and graduates as a key dimension of clinical reasoning may explain why the largest proportion of SVMS coded learning objectives is delivered in lectures. It is likely, however, that these perceptions are based on a lack of insight into the clinical reasoning development process, meaning that the use of lectures to “deliver” the skill may be misguided. As understanding of clinical reasoning grows, misconceptions about how best to teach the skill, particularly among staff designing curricula, must be addressed. It is clear that clinical reasoning tutelage needs to be based on evidence, not tradition.
The lack of student awareness of the concept of clinical reasoning, and the attitude that students should “assume” they should be learning it, is evident within the SVMS curriculum. This is likely detrimental to students, as it makes it difficult for them to track or reflect on their reasoning skill development. Curricular transparency is a wider issue for clinical curricula. Acceptance that much student learning occurs within informal interactions, rather than only in declared teaching sessions,38 has led to a call for greater accessibility of medical curricula generally.18 To make curricula more transparent, Harden18 advocates the use of curriculum mapping. This allows students to identify exactly where in the curriculum they receive opportunities to develop knowledge and skills, and it is being adopted by many medical schools.39 Currently the SVMS uses curriculum mapping purely as a management tool for accreditation purposes. Expanding this to include the mapping of embedded topics, and formatting it for use by students and staff, may, as Harden describes, “make explicit the implicit.”18(p.124)
Limitations
The SVMS has been used as a case study40 in this research. Although we investigated only a single institution, our research has a degree of generalizability15 to other veterinary curricula in which clinical reasoning is an embedded skill. Comparing this work with similar case studies from other veterinary schools, were they performed, would enhance our understanding of the subject and provide greater evidence for extrapolation of findings.
This study has not directly considered the effect of assessment on clinical reasoning development.41 It was clear from the student focus groups that students want to improve their reasoning skills to become competent veterinary surgeons, not because they see reasoning as necessary to pass exams. Consequently, this avenue was not explored further but could be expanded on in future work. In addition, this study did not take into consideration the opinions of employers when evaluating the clinical reasoning ability of graduates, because of our focus on the curriculum. Information of this kind could be used to triangulate the graduate interview findings.
When asking staff to critique their own curriculum, particularly in a focus group environment, it is possible that they may be either overly critical or defensive. Similarly, students may feel an affinity for the school that affects their perspectives. These factors, along with the fact that participants are self-reporting their clinical reasoning abilites, should be considered when interpreting the results of this study.