I’ve made mistakes in veterinary practice. Have you?

The ‘second victims’ of clinical error


In my first year in practice, I worked in a very busy clinic which also acted as an emergency centre after hours, at the time one of three large emergency clinics in Melbourne. Among the enormous number of cases, I made some great clinical decisions but also some howling errors. Two immediately spring to mind. The first was a white kitten bleeding from around a tooth, which I attributed to the loss of its deciduous teeth but which was actually due to its eating Ratsac (a vitamin K antagonist), a realisation that hit me about an hour later. I rang the owners, who brought the kitten back, and it recovered with treatment. The second was a cat brought in unable to move its hind limbs. I initially assumed it had been hit by a car, but it had a saddle thrombus secondary to cardiomyopathy. I remember noticing during the consult, in some distant corner of my mind, that its hind paws were cold, but it wasn’t until I was positioning it for a pelvic radiograph that my neurones connected the cold paws and the paralysis and I proceeded to work it up and treat it for thromboembolism and cardiomyopathy. The fact that I still cringe at my clinical blindness and recall minute details of both cases 15 years later illustrates the impact they had on my psyche. I was the ‘second victim’ of each error, a phrase coined by Professor Albert Wu1 to describe the impact of errors on clinicians, nurses and other support staff.

I have previously written about cognitive aspects of diagnostic error. Brian Goldman’s TED talk, ‘Doctors Make Mistakes. Can we talk about that?’2 (thanks to Jan Ehlers for the link) discusses errors from a clinician’s perspective and reminded me that my fear of errors contributed to my decision to move away from practice. I have only ever admitted that to others who I know share my anxiety because in some way I am still ashamed. I will never make those mistakes again.

After listing some of his own errors, Goldman, an emergency physician and well-known medical journalist in Canada, talks about striving to be perfect and resisting asking for help for fear of being seen as high maintenance. In the aftermath of an error, he recounts thinking, ‘make the voices (in his head) stop and don’t let me make another mistake’. I related completely.

Goldman feels that the culture of silence around error is a major contributor to the feelings of shame and isolation a clinician may feel after making a mistake. Albert Wu agrees, stating that ‘confession is discouraged, passively by the lack of appropriate forums for discussion and sometimes actively by risk managers’.1  This culture has been attributed to the ‘hidden curriculum’ in medical education – the messages and attitudes transmitted through day-to-day attitudes, actions and vocabularies.3

Failure to support the emotional needs of clinicians after an error can have significant consequences such as poorer patient care, depression, burnout, premature retirement4 or, in particularly tragic cases, even suicide5. Wu notes that ‘some of our most reflective and sensitive colleagues [are] perhaps most susceptible to injury from their own mistakes’1, a category I feel I fit into.

At a personal level, strategies to support colleagues after errors may include encouraging a description of what happened, affirming rather than minimising the importance of the mistake, and disclosing your own mistakes to help reduce the sense of isolation. Acknowledge the emotional impact of the mistake and ask how the colleague is feeling.1 At a broader level, Goldman suggests better systems to reduce errors and rewards for identifying errors and for coming forward after an error has occurred.2

In my experience, a similar culture of silence around clinical error exists in veterinary practice. I shared my mistakes with close friends, who were extremely supportive and, as new grads, also had their own stories. Systems for coping with errors were never mentioned during my veterinary education. There is now rightly a far greater focus on the mental health of veterinarians and much better support, for example through the Australian Veterinary Association, although I could find no mention of clinical error in their VetHealth section. I would be very interested to hear of any specific veterinary examples of support for veterinarians or veterinary nurses after an error has occurred, or of systems to help us learn from errors.

Where there are clinicians and medical and veterinary support staff there will always be errors. The second victim is also important.


  1. Wu AW. Medical error: the second victim. BMJ 2000; 320(7237): 726–727.
  2. Goldman B. Doctors Make Mistakes. Can We Talk About That? TED. http://www.ted.com/talks/brian_goldman_doctors_make_mistakes_can_we_talk_about_that.html. 2012. Retrieved 8 February 2014.
  3. Liao JM, Thomas EJ, Bell SK. Speaking up about the dangers of the hidden curriculum. Health Aff 2014; 33(1): 168–171.
  4. Waterman AD, Garbutt J, Hazel E, Dunagan WC, Levinson W, Fraser V, Gallagher TH. The emotional impact of medical errors on practicing physicians in the United States and Canada. Jt Comm J Qual Patient Saf 2007; 33(8): 467–476.
  5. NBC News. Nurse’s suicide highlights twin tragedies of medical errors. http://www.nbcnews.com/id/43529641/ns/health-health_care/t/nurses-suicide-highlights-twin-tragedies-medical-errors/#.UviuNfmSySo. 2011. Retrieved 10 February 2014.



Diagnostic biases


In a previous post I discussed the widespread, and, in my case, unwitting, use of heuristics in clinical decision making. The advantages of heuristics, such as managing complexity and improving efficiency, are well documented, but they can equally be a disadvantage by leading to faulty reasoning and conclusions1. One disadvantage is that they can contribute to cognitive bias in clinical decision making. Cognitive bias is defined as a pattern of deviation in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion2. Several important cognitive biases deriving from heuristics are discussed in Jerome Groopman’s book ‘How Doctors Think’3. Some of the more common are:

The availability bias – ‘the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind.’3 (p.64) For example, having seen several cases of kennel cough recently, a veterinarian may quickly recall the current outbreak and misdiagnose a cough due to heart disease as kennel cough.

The anchoring bias – the ‘tendency to perceptually fixate on to the salient features in the patient’s initial presentation at an early point of the diagnostic process and failing to adjust initial impressions even in the light of later information’4

The confirmation bias – related to the anchoring bias, this refers to the tendency to look for confirming evidence to support the diagnosis we are “anchoring” to, while downplaying or ignoring information that might contradict the initial diagnosis.

The satisfaction of search bias – the tendency to stop looking for alternate or even coexisting diagnoses when we have made one diagnosis. A classic example of this is not searching for a second fracture once “sufficiently satisfied” with finding the first fracture5.

Many vets, particularly more recent graduates, worry that they won’t have sufficient knowledge of clinical disease, but in fact the vast bulk of diagnostic errors arise from cognitive mistakes6. Reflection on the clinical evidence available is a recommended way of limiting errors due to faulty clinical reasoning7. Core questions include:

  • Why do I favour this diagnosis?
  • What else could it be?
  • Could two things be going on?
  • Have I considered the finding or lab result that doesn’t quite fit?7

Awareness of these potential errors is a great step towards reducing the chance of falling into the common traps.

Besides the scientific basis of clinical error, mistakes can have a significant impact, obviously on the patient but also on the clinician. Media coverage and complaints processes rightly concentrate on the former, while the failure to discuss the latter can be devastating to the people involved. As a clinician who has made mistakes, I will discuss the culture around clinical error in my next post.

1. Elstein, AS., 1999. Heuristics and Biases: Selected Errors in Clinical Reasoning. Academic medicine, 74 (7), 791-4

2. Cognitive bias. Wikipedia, the free encyclopedia. Available at: http://en.wikipedia.org/wiki/Cognitive_bias [Accessed February 10, 2014].

3. Groopman, J., 2007. How Doctors Think. Houghton Mifflin, Boston

4. Heuristics and Cognitive Biases in Decision Making During Clinical Emergencies. Emergency Medicine Blog, 2012. Available at: http://emergencymedic.blogspot.com.au/2012/01/heuristics-and-cognitive-biases-in.html [Accessed 30 August 2013].

5. Croskerry, P., 2009. A Universal Model of Diagnostic Reasoning. Academic Medicine, 84(8), pp.1022–1028.

6. Croskerry, P., 2003. Cognitive Forcing Strategies in Clinical Decision making. Annals of Emergency Medicine, 41(1), pp.110–120. Available at: http://home.comcast.net/~jasoncillo/Cognitive.pdf [Accessed February 10, 2014].

7. Misbah Keen University of Washington School of Medicine, 2013. Medical Decision Making in Clinical Care: Avoiding Common Errors. Available at: http://healthinfo.montana.edu/WWAMI Conference/Medical Decision Making 2013-04.pdf [Accessed February 10, 2014].


BEST network – ‘Revolutionising biomedical education’


What is the BEST network?

The BEST Network is a new community of medical schools and peak professional bodies in medicine and nursing embracing the potential of online teaching and learning by developing lessons to use and share. The tagline on the home page of the newly launched website states ‘We are a community of biomedical experts who share a vision: that every student and every teacher, wherever they are, will have access to the best healthcare education.’ Among the founding members are the University of New South Wales, the University of Melbourne, James Cook University, the University of Queensland, the Royal College of Pathologists of Australasia, the Australian College of Nursing and the Australian Orthopaedic Association. The project began at the University of NSW, and member institutions joined if they had a particular connection to the project and the availability to become involved in early lesson development, rather than through specific invitation.

Membership of the BEST community includes Smart Sparrow membership (see previous post) and the opportunity to author lessons. Lessons to be published on the BEST portal must undergo a still-developing peer review process but all lessons developed can be used by students and shared with chosen colleagues through Smart Sparrow.

Why establish BEST?

The BEST website gives six reasons for the network’s existence, including the sharing of knowledge, next-generation educational content and cutting-edge technologies. Developing quality resources is time-consuming and expensive, and sharing seems extremely logical in these days of straitened funding at universities.

What does BEST provide?

The three central features of BEST are:

Courseware – the resources built with Smart Sparrow software, available for all BEST member educators, regardless of whether they work for a member or non-member institution, to use as developed or to modify to suit their needs. Each course has a ‘Teach with it’ button, which adds the course to the user’s personal workspace and allows them to alter and deploy it.

Slice – a cloud-based, high-resolution biomedical image bank sourced from medical school collections. The website states that it adheres to legal standards of acquisition, privacy, ownership and publishing. Images allow zooming and can be annotated to aid student identification of key features. At this stage images cannot be embedded directly into other programs (e.g. PowerPoint) – only a link can be provided – but they can be included in lessons built with Smart Sparrow.

Community – the intention is to provide a platform for educators to network. This section of the site is still under development, awaiting feedback as to how potential users would like it to function.

How is the network funded?

The BEST network received $4.5 million of government funding to become established. This funding runs out in July 2014, giving the network nine months from launch to find a sustainable funding model. Clearly this is a critical issue; it was raised in the panel discussion at the launch and there is currently no clear answer.

The launch

In the week leading up to the launch there were two articles about the upcoming event: an opinion piece in the Higher Education section of The Australian by Professor Peter Smith, dean of UNSW Medicine (the lead faculty in the BEST Network), and Dr Dror Ben-Naim, chief executive of Smart Sparrow, and an article by Dr Ben-Naim in the Education section of the Financial Review.

There were over 150 people at the event, including representatives from both member and non-member institutions and health care organisations, as well as journalists (an interesting choice of moderator for the panel discussion was Tim Dodd, education editor of the Financial Review). Unfortunately I couldn’t make it to the introductory session, which included overviews of different aspects of the network, but I attended the panel discussion on BEST and the future of medical education, enjoyed the excellent lunch provided and had the chance to chat to other attendees, a number of whom had come from non-member institutions interstate at their own expense. The afternoon session was a ‘speed dating showcase’: a series of seven stations which we rotated through every ten minutes. Although this was a bit rushed at times, it was a good way to have a closer look at some of the courseware developed and at Slice in use. There were some excellent examples, but two that I particularly liked were the forearm and hand anatomy lesson (used as a review between lectures and a practical class) and the virtual oxygen electrode used as a practical class exercise.

The future

The BEST network is a great step forward in the collaboration and sharing of teaching resources and, given the strains on university funding and the improvements in technology, it makes perfect sense. As expected with such a collaborative effort, some questions were inevitably raised about intellectual property, but hopefully institutions can see that the benefits far outweigh the disadvantages. I sincerely hope sustainable funding can be found and that the biomedical community teaches and learns through the BEST network for many years to come.


BEST Network launch event – adaptive learning


I was very fortunate to be invited to attend the BEST (biomedical education, skills and training) network launch at Melbourne University on October 18th. The network is a group of members (University of New South Wales, University of Melbourne, James Cook University, University of Queensland and professional peak bodies in medicine and nursing) who are creating and pooling digital educational resources, including an image bank and virtual patients. These resources are built using the Smart Sparrow adaptive learning platform, developed at UNSW.

There were many interesting features of both the BEST Network and Smart Sparrow. Wearing my eLearning developer hat, I will concentrate in this post on the features of Smart Sparrow and the types of learning experience that can be created. Swapping to my resource-sharing and collaboration hat, I will discuss the BEST Network and its features in the next post.

What is adaptive eLearning and how does Smart Sparrow compare with other eLearning platforms?

Having used several different tools of widely varying sophistication (Dreamweaver 4.0 (way back!), PowerPoint, Articulate Studio, Articulate Storyline) over many years to develop eLearning, I was very interested in the features of Smart Sparrow and how they compare with other eLearning software packages. Articulate is a mass-market commercial eLearning development tool which is widely used, particularly in industry. Its website states that it is used by 19 of the 20 top-ranked universities in the US, which makes a comparison very relevant. I do not claim to be an expert in any of the tools above and there may be features of them I am not aware of. I am very happy to be corrected if this is the case.

My comments below are based on some features I learnt about at the launch through demonstrations of some courseware already developed using Smart Sparrow and information from the Smart Sparrow website.

There are three key ’adaptive’ features in the Smart Sparrow software:

1. Adaptive feedback

There are two facets to this feature:

- the feedback given when a student answers a question (e.g. a multiple-choice question) depends on the answer the student chooses. This is not revolutionary at all; it is a common feature of eLearning software and can even be done in PowerPoint by creating branching links.

- the student is given guidance when they have reached a set number of incorrect attempts. This is a great feature and one I saw in action in the demonstrations. It is apparently possible in Storyline but very complicated to set up, as it is not a generic feature of the software.
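As an illustration only, the two facets above can be sketched in a few lines of code. To be clear, this is not Smart Sparrow’s actual API or behaviour – every name here is invented – it simply shows the general rule of answer-specific feedback plus escalation to guidance after a set number of incorrect attempts:

```python
# Hypothetical sketch of adaptive feedback; all names are invented
# for illustration and are not Smart Sparrow's API.

def adaptive_feedback(answer, correct_answer, attempts, max_attempts=3):
    """Return feedback text based on the chosen answer and the attempt count."""
    if answer == correct_answer:
        return "Correct - well done."
    if attempts >= max_attempts:
        # After the set number of incorrect attempts, escalate from
        # generic "try again" messages to substantive guidance.
        return "Hint: revisit the key concept before answering again."
    # Answer-specific feedback, as in ordinary branching eLearning tools.
    return f"'{answer}' is not right - try again ({attempts} of {max_attempts} attempts used)."
```

The point of the sketch is simply that the second facet is one extra condition on top of ordinary branching, which is perhaps why its absence as a built-in feature elsewhere is so noticeable.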

2. Adaptive learning paths

The content shown to individual students can be varied depending on whether they demonstrate understanding by answering a set of questions. I didn’t see enough of this to fully assess it, but it sounds different from other software I have used, in which you can certainly branch learners based on a single answer but not, as far as I am aware, on multiple answers.

3. Adapting content

Content can be easily adapted by educators using the analytic and authoring tools built into Smart Sparrow, rather than needing to go back to developers to request changes. The newer generation of eLearning tools, such as Articulate, have been specifically designed to be relatively easy to master without a background in programming, so easy authoring didn’t particularly excite me. The learner analytics, though, really impressed me. For each question the educator can easily see the average grade, average time spent, average number of attempts and number of students attempting. For each student you can see time spent, number of attempts, answers given and score. Students also receive an email at the end of the activity with many of the same statistics, so they have a clear picture of their progress.
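The per-question statistics described above are conceptually just aggregations over a log of attempts. As a minimal sketch – the record format below is invented for illustration and is not Smart Sparrow’s data model – the same summary could be computed like this:

```python
# Hypothetical per-question learner analytics; the log format is
# invented for illustration, not Smart Sparrow's actual data model.
from statistics import mean

attempts_log = [
    # (student, question, seconds_spent, attempts, score)
    ("alice", "q1", 40, 1, 1.0),
    ("bob",   "q1", 90, 3, 0.5),
    ("carol", "q1", 65, 2, 1.0),
]

def question_summary(log, question):
    """Aggregate the statistics an educator would see for one question."""
    rows = [r for r in log if r[1] == question]
    return {
        "students_attempting": len({r[0] for r in rows}),
        "average_time": mean(r[2] for r in rows),
        "average_attempts": mean(r[3] for r in rows),
        "average_grade": mean(r[4] for r in rows),
    }
```

The value of the platform, of course, is not the arithmetic but that it collects the log automatically and surfaces these summaries to both educator and student without any extra work.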

Other features of Smart Sparrow

Besides the power of and easy access to the analytics, the other feature that really caught my attention was the virtual spaces, e.g. laboratories, that can be created. The most impressive was the virtual oxygen laboratory, in which students can pick up a pipette, add set amounts of different reagents and see the results of their actions. This level of sophistication would obviously require time and money on a scale unlikely to be possible in the veterinary sphere, but it is fantastic to see what can be created. There are also some virtual patient cases which I haven’t had the chance to assess in detail yet but which could have excellent potential to be replicated in a veterinary context.

Ease of authoring?

One aspect of Smart Sparrow I cannot assess at this stage is the authoring tool. Ease of authoring is a strength of many major commercial products, as is support not only from developers but also from communities of users. The Articulate community in particular is brilliant, and I have always found help within a few hours of posting a query, regardless of the time of day or night. Obviously building up such a community of knowledgeable users takes time, and no doubt the plan is that this will develop through the BEST Network, but it may be a challenge at this early stage.

Would I like the opportunity to try Smart Sparrow in the veterinary sphere? You bet I would.

My overall impression was that this platform has significant potential to improve student learning in veterinary science. It will not (and does not claim to) replace face-to-face learning but could be an extremely valuable adjunct. The power of the analytics was the highlight for me, both for teachers and students to get a better understanding of their progress and areas of challenge. I can’t wait to try it out, although I need to clarify whether I am able to do so, given that it is currently intended for medical rather than veterinary education. I will certainly post about the experience and any results.

NOTE: Readers from BEST Network member institutions (listed above or go to https://www.best.edu.au/) can access the courseware mentioned above and many other examples by joining the network. Those from non-member institutions can also join and use/adapt resources but are not able to create their own lessons.


Information overload in veterinary science response

Anne Fawcett, author of the fantastic Small Animal Talk blog, raised issues around learning and teaching in the age of information overload last week. My few quick comments in response ended up being a post in their own right. Anne’s original post is here; my response is below.

This was a fantastic post and a topic most worthy of discussion. I’d love to have time to construct a logical piece but I know it will never be finished, given my current workload, so here at least are a few ideas and thoughts, varying considerably in quality and relevance. An (admittedly extremely brief) glance at the veterinary literature did not reveal any recent published work in this area – perhaps it’s time to revisit the issue more formally.
Teaching metacognition (understanding of the process of learning) to students could have significant value.
In Stephen May’s recent article on clinical reasoning1, he mentions a study which demonstrated that individuals who could solve novel problems in unfamiliar domains more expertly than most had a greater understanding of their own thought processes2. A related point is that, from a purely learning viewpoint, there is a preclinical–clinical divide ‘at which students stop being taught backward, from textbook lists of diagnoses and start being taught how to work forward from problems toward answers’1. As a learner who only fully grasped this concept about half way through final year, I think having it explained to me earlier would have been extremely useful. I also remember chatting to two particular students, one in the year above and one in my year, during the preclinical years and realising that they were learning in a much more applied way than I was, still clinging as I was to rote learning rather than thinking about how I would need to apply the knowledge in the real world. Both those students were dux of their respective year levels, which I think was strongly related to the way they thought about how to study.
Introduction of streaming into veterinary education
This has been raised many times and is the current model at UC Davis, albeit in a limited form i.e. graduates choose a stream but still must find a way to cover other species in order to get a licence. I actually emailed Davis earlier this year to seek clarification of the system and got this reply from Jan Ilkiw, Professor and Associate Dean-Academic Programs:
‘While there has been talk of limited licensure in the US for 20-odd years, it is unlikely to eventuate. It came up in the NAMEC discussions when State boards and the National Board of Veterinary Medical Examiners were all in the room together. There was talk that perhaps the licensing exam might consist of a number of sections and that students would have to pass say 4 out of 6 sections, which would allow for some species selection – not heard anything more about this. All students have to pass a broad all-species national exam in order to practice in the US, so they are therefore licensed to practice on all species. For curricula like ours – and we have had tracking for 20 years – it does mean that the students are then responsible for getting the extra species material to pass the licensing exam. The general feeling was that veterinarians are professionals and take an oath and therefore they should regulate what species they practice on – i.e. only practice on species they are competent to practice on. If they don’t then they could be reported to state licensing boards if they end up doing things that are not considered standard of practice.’

Curation of information is key
In the past, information was relatively scarce: teachers told students facts and suggested texts and journals for further information, and clients came to veterinarians seeking ideas about possible diagnoses. Now anyone can find virtually any information online, and I have certainly had clients arrive with printouts on the condition they believe their pet may be suffering from. The roles of the vet and the vet educator now seem to be as much those of curators of information and information sources as of providers. To some extent this throws the problem of overload back on the educator rather than solving it.
Forms of assessment
With pedagogy now (rightly) a much more central focus, considerably more thought is being put into teaching strategies that will engage students and best assist their learning. I have noticed, at least in some cases, that the design of assessments has not kept pace and is still often in an older style which encourages rote learning. As so many students study for the test rather than for the real world, this may be a significant barrier to changing student attitudes about when they have learned something. Designing assessments to be as authentic as possible to real-world tasks (easier in some areas than others) would be of benefit.
Aims of veterinary education
Nigh on 20 years ago, Bushby3 stated that the Mississippi State University College of Veterinary Medicine had redefined the goals of the professional curriculum to include the following:
  • The graduate must be able to learn on his or her own.
  • The graduate must be able to access and use information in an interdisciplinary manner to solve problems.
Perhaps we need to renew the focus on these goals. They will not of course reduce the load but may better equip students to manage it.
1. May SA (2013) Clinical reasoning and case-based decision making: the fundamental challenge to veterinary educators. Journal of Veterinary Medical Education 40(3):200-209.
2. Brown AL et al (1983) Learning, remembering and understanding. In Mussen PH (ed), Handbook of Child Psychology, Vol 3: Cognitive Development. Oxford, Wiley.
3. Bushby P (1994) Tackling the knowledge explosion without overloading the student. Australian Veterinary Journal 71:372-374.

Information overload in veterinary science

Last week Anne Fawcett, author of the fantastic Small Animal Talk blog, raised issues around learning and teaching in the age of information overload. Anne is also a lecturer at Sydney Uni vet school, a practising veterinarian and a journalist, and somehow manages to keep up with it all. I responded a few days later with what was initially going to be a few quick comments but ended up being long enough to be a post in its own right. Anne’s original post is below and my response follows as a separate post.

As a lecturer I am frequently asked for study tips and strategies by veterinary students. However, earlier this year I was informed by one student that she had found the Holy Grail: a book entitled How to Study in Medical School by Armin Kamyab.
I critiqued this book (read the post here) on the grounds that I felt it placed far too much emphasis on memorising everything taught, leaving little room to acquire experience in what is an overwhelmingly hands-on field.
Kamyab’s system is based on several problematic assumptions, one of which is that everything taught in the medical curriculum is worthy of equal consideration, equal revision and hopefully absorption. Discipline, he argues, is the answer.
But if we extend this to the veterinary curriculum, my own experience tells me that discipline is not enough. What happens when we reach capacity, or information overload?
In 1980 – well before the internet, social media and any form of portable telecommunication – Anderson & Graham raised concerns about the impact of information overload on medical education. They performed a loose analysis of the amount of information taught in the medical curriculum and determined that undergraduate medical students had to assimilate 27,000 facts and 25,500 concepts in their clinical or senior years (roughly 9 facts or concepts per hour).
The authors concluded that there was a need to establish the “best and most efficient means of transmitting and assimilating information” (Anderson & Graham, 1980). That was over 30 years ago. Surveys of veterinary students at Murdoch University and the University of Queensland identified information overload as a major stressor (Williams et al., 2005; McLennan & Sutton, 2005). But it isn’t just undergraduates suffering under the weight of too much information (read the late Dr Lee Lipsenthal’s arguments in this post).
Graduates, too, are struggling with the onslaught of new research, new technologies, new techniques and aggressive marketing of veterinary continuing education (every week there seems to be a new provider springing up somewhere).
According to some, the rate of information overload is no longer relevant. Veterinary educators long ago recognised the inadequacies of the traditional assumptions underpinning undergraduate teaching (Bushby, 1994, see table 1).
Table 1. Rules defining traditional veterinary medical education (adapted from Bushby, 1994)
  • Assumption: There is a core of information that must be learned.
    Problem: No two teachers agree on the core; the core is difficult to refine (one attempt resulted in a program consisting of 216,000 objectives, or one fact every six minutes).
  • Assumption: Teachers will tell the student what they must know.
    Problem: Assumes that teachers know what needs to be learned; assumes students are passive, surface learners.
  • Assumption: Teachers must cover the material.
    Problem: If students don’t learn it is their fault; teachers are dissociated from the learning process.
  • Assumption: The teacher determines the organisation of material and method of instruction.
    Problem: The student plays no role in deciding how the material is organised or learned.
  • Assumption: With sufficient knowledge students will learn to think.
    Problem: Students do not feel in control of learning and are only exposed to critical thinking late in the course.
  • Assumption: The only valuable information is that which is stored in your head.
    Problem: Memorisation, superficial understanding and guessing are rewarded; exploration, inquiry, thoroughness and reliability are penalised.
The argument against the traditional model is compelling, yet my own experience as a veterinary undergraduate – and feedback from my students – tells me that we have significant attachment to this model.
The most compelling evidence, to me, was interviewing students last semester as part of my Graduate Certificate in Educational Studies (if you’re interested in this course visit here).
When I interviewed veterinary students last semester about learning, 100 per cent stated that they had learned something when they had memorised it – not when they could explain it, and with no mention even of applying that morsel of knowledge. On Saljo’s hierarchy of learning, these excellent students only reached the middle. And memorising something is far different from applying it.
One of my lecturers once said that "vet school is wasted on vet students": they have no context on which to hang the factoids they are learning, nor an appreciation of their relative importance, so it doesn't make sense until they graduate, by which time they've forgotten a large proportion of said factoids. I understood what he meant, but does it have to be that way? If we move away from the assumptions that to be a good vet you need to memorise every fact taught in vet school, and that good veterinary teachers teach good veterinary facts, what are we left with?
[Sorry for the cliffhanger, but this is a topic I will be posting on further. I am genuinely keen to hear from students, veterinary educators and vets about their take on the problem - is it really a problem? How do you learn? What is a good teacher and how can we teach veterinary science best? Send me an email or post a comment].
Anderson J and Graham A (1980) A problem in medical education: is there an information overload? Medical Education 14:4-7.
Bushby P (1994) Tackling the knowledge explosion without overloading the student. Australian Veterinary Journal 71:372-374.
McLennan MW and Sutton RH (2005) Stress in veterinary science students: a study at the University of Queensland. Journal of Veterinary Medical Education 32(2):213-218.
Trigwell K (2001) Judging university teaching. International Journal for Academic Development 6(1):65-73.
Viner B (2010) Success in Veterinary Practice: Maximising Clinical Outcomes and Personal Well-Being. Oxford: Wiley & Sons.
Williams SM, Arnold PK and Mills JN (2005) Coping with stress: a survey of Murdoch University veterinary students.

Heuristics in veterinary clinical decision making


I was recently made aware of the fascinating book ‘How Doctors Think’ by Jerome Groopman. Groopman, an oncologist in Boston, Chair of Medicine at Harvard Medical School and a staff writer for the New Yorker, is in an ideal position to both provide insight into the topic and write about it in an accessible way.

As always there are many interesting themes in the book, several of which are directly applicable to veterinary practice and education. This will be the first of (probably!) three posts, each covering different topics.

One of the ideas introduced early in the book is the concept of heuristics, defined in Wikipedia as 'experience-based techniques for problem solving, learning, and discovery that gives a solution which is not guaranteed to be optimal. Where the exhaustive search is impractical, heuristic methods are used to speed up the process of finding a satisfactory solution via mental shortcuts to ease the cognitive load of making a decision.'1 We all use these in everyday life, learning to ignore information that is extraneous to our needs at a given moment. For example, when waiting to cross the road we may pay no attention to the colour of a passing car, but we will be absolutely aware of when it has passed and we can safely cross. Heuristics have a clear application to clinical decision making, most obviously in an emergency but also in many other scenarios.

Heuristics can be personal and very specific to a diagnostic process in a particular environment, or general, with application across many disciplines. An example of the former: a young dog presented for coughing but generally bright and alert, recently in contact with other dogs and not yet fully vaccinated. As soon as I set eyes on that patient across the waiting room I have kennel cough in mind as a likely diagnosis, even though I have not fully examined the dog and have no objective data. Another name for this type of thinking is pattern recognition. A more general diagnostic heuristic is the oft-repeated phrase, 'If you hear hoof beats, think of horses, not zebras', i.e. look for a common disease process before chasing the possibility of a rare cause.

There is considerable evidence that, when used by experienced clinicians, heuristics are a powerful means of providing high-quality care efficiently in the face of clinical uncertainty2. Pat Croskerry, an emergency physician who studies clinician cognition, calls heuristics 'the essential tools of clinical medicine'3, and I certainly know I've employed them in many a case, despite having had no specific instruction in their development and use, a situation I share with Groopman, who notes in the book that such shortcuts were not taught to him in medical school. Clement McDonald, in his 1996 article, stated that 'the heuristics of medicine should be discussed, criticized, refined and then taught. More uniform use of explicit and better heuristics could lead to less practice variation and more efficient … care.'4

Wegwarth et al. assert that 'Today's medical students should learn and understand that heuristics are neither good nor bad per se, but that their reliability and usefulness interplays with environmental circumstances, such as the inherent uncertainty of a specific situation. To broaden students' knowledge of what kind of environmental circumstances can be exploited in what fashion by what heuristic mechanisms seems as crucial as to teach them the building blocks from which heuristics can be constructed and adjusted for other problems or populations.'5 I certainly agree that the concept should be introduced to veterinary students, and I'd be very interested to hear from any veterinary educators about whether and how you are teaching heuristics.

For all the advantages discussed above, there are significant caveats to using heuristics: somewhat ironically, they are also a major source of diagnostic error, the topic of my next post.

  1. Wikipedia, 'Heuristics'. http://en.wikipedia.org/wiki/Heuristics Retrieved 12 August 2013.
  2. Kempainen RR, Migeon MB and Wolf FM (2003) Understanding our mistakes: a primer on errors in clinical reasoning. Medical Teacher 25(2):177-181. Available at: https://childrenshospital.org/cfapps/research/data_admin/Site2275/securepages/Documents/ClinicalReasoning.pdf.
  3. Croskerry P (2003) Cognitive forcing strategies in clinical decision making. Annals of Emergency Medicine 41:110-120.
  4. McDonald C (1996) Medical heuristics: the silent adjudicators of clinical practice. Annals of Internal Medicine 124(1):56-62.
  5. Wegwarth O, Gaissmaier W and Gigerenzer G (2009) Smart strategies for doctors and doctors-in-training: heuristics in medicine. Medical Education 43:721-728.