‘Has Rex improved?’

The caregiver placebo effect in veterinary medicine

After a long break from blogging on professional topics (although not from blogging in general – if you would like access to the blog from our fantastic trip around the western part of Australia, please leave a comment to that effect), I have returned to work and am reading Ben Goldacre’s Bad Science, an eye-opening look at the sometimes questionable studies on which current treatments and accepted wisdom may be based. I had not previously heard of Goldacre, a UK-based doctor, epidemiologist, broadcaster and author who has an incisive mind and a lovely turn of phrase, my current favourite of which is ‘The plural of anecdotes is not data.’ As a bonus, researching this post also led me to discover The SkeptVet blog, whose author ‘takes a sceptical and science-based look at veterinary medicine.’

One of the many topics covered in Bad Science is the placebo effect. As a student, I had a somewhat primitive understanding of this concept: placebos were substances which had no effect in treating a condition and were most often used in clinical trials to provide a control against which to compare the medication being investigated. I assumed that the placebo effect, where beneficial effects of the substance were thought to be due to the patient’s belief in the treatment rather than the treatment itself, was not a significant issue in veterinary practice because placebos didn’t have the same psychological effect in animals as in people. Once graduated, I encountered some vets who occasionally gave sterile water or vitamin injections to patients to satisfy owner demands. This was my first experience of the caregiver placebo effect, also known as the placebo effect by proxy, of which more later.

Surprising research findings

Goldacre reports on several studies which investigated the placebo effect, some of which had very revealing results. Placebo tablets have been found to have a dose-response curve of the kind you would expect for active drugs (two tablets were perceived to be more effective than one), to be influenced by the colour of the tablets (pink tablets helped maintain concentration better than blue tablets) and to be more effective when injected than when taken in tablet form. In a more extreme form of placebo, patients who had pacemakers implanted but not switched on reported feeling better than before having the device implanted (although they did better still after the device was activated).

Fascinatingly, the doctor’s faith in a particular treatment can also have a significant effect on the outcome for a patient. In a creative study in the mid-1980s, two groups of doctors, blinded to which medication they were injecting, gave one of three possible treatments to patients. Of the three, only one was effective for the patients’ condition. One group was told that the treatments they were giving were ineffective; the other group was told the truth – that there was a chance their treatment could benefit the patient. Doctors were forbidden to tell patients the chance of their treatment being effective. No prizes for guessing that the patients of the second group did better. Apparently the manner of the doctor was enough to influence the outcome for the patient.

The caregiver placebo effect has significant influence

In veterinary medicine, the vet obviously communicates about the likely success of treatment with the caregiver rather than directly with the patient. The vet’s and the owner’s opinions on whether the patient has improved weigh heavily in judging the success or otherwise of treatment, both in day-to-day patient management and in clinical trials. The SkeptVet blog led me to a study1 which attempted to assess the impact of the caregiver placebo effect on outcomes for dogs with lameness due to arthritis. Using animals assigned to the placebo arm of a clinical trial, owner and veterinarian assessments of the dogs’ response to treatment (placebo) were compared with objective measurements of weight bearing in the affected limbs. A placebo effect was noted in over 55% of owner assessments and over 40% of veterinarian assessments.
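
To make that comparison concrete, here is a minimal sketch (plain Python, with invented example records and an assumed objective-improvement threshold, not the study’s actual scoring or force-platform methodology) of how a caregiver placebo effect rate could be tallied: the proportion of placebo-treated dogs rated ‘improved’ by the owner or vet despite no meaningful change in objective weight bearing.

```python
# Hypothetical illustration only: tallying a caregiver placebo effect rate for
# dogs in the placebo arm of a lameness trial. All values are invented and the
# threshold is an assumption, not the published study's methodology.

# Each record: subjective ratings plus the change in peak vertical ground
# reaction force (% body weight) measured for the affected limb.
placebo_dogs = [
    {"owner": "improved", "vet": "improved", "grf_change": 0.4},
    {"owner": "improved", "vet": "not improved", "grf_change": -1.1},
    {"owner": "not improved", "vet": "not improved", "grf_change": 0.2},
    {"owner": "improved", "vet": "improved", "grf_change": 0.9},
]

OBJECTIVE_THRESHOLD = 2.0  # assumed minimum change to count as real improvement


def placebo_effect_rate(dogs, rater):
    """Fraction of dogs rated 'improved' despite no objective improvement."""
    not_improved = [d for d in dogs if d["grf_change"] < OBJECTIVE_THRESHOLD]
    fooled = [d for d in not_improved if d[rater] == "improved"]
    return len(fooled) / len(not_improved)


print(f"Owner placebo effect rate: {placebo_effect_rate(placebo_dogs, 'owner'):.0%}")
print(f"Vet placebo effect rate:   {placebo_effect_rate(placebo_dogs, 'vet'):.0%}")
```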

In an ingenious attempt to isolate the impact of the caregiver placebo effect, a separate study2 on the effects of a non-steroidal anti-inflammatory drug (meloxicam) in cats with degenerative joint disease (DJD) divided patients into two groups. One group received meloxicam, well established as a treatment for DJD in cats, for three weeks and then placebo for a further three weeks, whilst the second group received only placebo throughout the trial. Owners did not know which group their cat was in. Cats were assessed by owner surveys prior to the study, after three weeks (i.e. at the time of cessation of meloxicam for those receiving it) and finally after six weeks. At the midpoint of the study, cats from both groups were perceived to have improved, to an extent not significantly different between the two groups. At the end of the trial, cats in the meloxicam treatment group were perceived to have deteriorated compared with their midpoint assessment, while those in the placebo group remained unchanged. The authors propose that the positive effect of the meloxicam in the first three weeks of the trial was masked by the caregiver placebo effect, but once the meloxicam was replaced by placebo, the return of clinical signs after withdrawal of active medication negated this effect.

Such studies illustrate the importance of the caregiver placebo effect in veterinary medicine, both for veterinarians’ assessments of their patients and for those of the patients’ owners. As veterinary educators, we should ensure that our students are forewarned of the phenomenon and remain mindful of it while working in clinical practice.

References

1 Conzemius M, Evans R. Caregiver placebo effect for dogs with lameness from osteoarthritis. J Am Vet Med Assoc 2012;241:1314–1319

2 Gruen ME, Griffith E, Thomson A et al. Detection of Clinically Relevant Pain Relief in Cats with Degenerative Joint Disease Associated Pain. J Vet Intern Med 2014;28:346–350

Education Day at AVA 2014 part 2

 

This is the second blog post covering Education Day at the 2014 AVA Conference in Perth. The first post covered sessions relating to teaching and learning communication skills, both for students and for professional development of educators. This final post covers the remaining 3 presentations and the poster session.

Developing online resources for clinical teachers

In his talk entitled ‘Developing online resources for clinical teachers’, Daniel Schull explained why (the difficulty of providing face-to-face training made online the best option) and how the University of Queensland is developing an online, self-directed induction resource for clinicians and staff instructing students on the clinic floor. After researching similar products used in other healthcare settings and completing a user survey to determine relevant content areas, the team is developing the content as a series of progress-tracked, bite-sized modules. The main aims of the resource are to raise participants’ awareness of their important teaching role, highlight useful background theories, offer practical tips to assist with the role, and provide a flexible delivery format to suit the busy and unpredictable nature of clinical work. The resource is currently being piloted by a user group comprising content experts from a range of medical, veterinary and veterinary nursing backgrounds, and veterinarians and veterinary nurses from university and external clinical practices. Feedback from this user group will be gathered via an online survey and semi-structured interviews. Post-launch evaluations will include a range of metrics such as user completion rates and feedback.

Clinical Problem Solving Exams

Digital resources in the form of clinical scenarios are also being used at Murdoch University in their Clinical Problem Solving Exams (CPSE). This form of assessment was introduced in 2010 to try to improve students’ observational, critical thinking, communication, analytical, intervention and re-evaluation skills. The majority of questions are short answer questions (SAQ), with some multiple choice questions (MCQ) and some extended matching set questions (EMSQ). Progressive disclosure is a key component of the exams, as it is more authentic to clinical practice and allows students to be redirected if they make an error early in a question. Unsurprisingly, students initially find this component particularly challenging as it prevents them from taking an information-dumping approach to answering the questions. With more clinical exposure and experience, students appear to recognise the validity of the CPSE style of examination with progressive disclosure.
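
As a rough illustration of the progressive disclosure idea (my own sketch in plain Python with an invented scenario, not Murdoch’s actual exam software), each stage of a question is only revealed after the previous answer has been submitted, so nothing later in the scenario can be mined for clues up front.

```python
# Minimal sketch of progressive disclosure in a clinical problem solving
# question: each stage is revealed only after the previous answer is submitted,
# which blocks an information-dumping approach. Illustrative only; the scenario
# text is invented and this is not the actual CPSE software.

stages = [
    "A 7-year-old Labrador presents with acute vomiting. List your initial differentials.",
    "Abdominal radiographs show a radio-opaque gastric foreign body. What is your next step?",
    "The owners ask about prognosis after surgical removal. What do you tell them?",
]


def run_question(stages, get_answer=input):
    """Reveal one stage at a time, collecting an answer before the next appears."""
    answers = []
    for prompt in stages:
        print(prompt)                 # later stages remain hidden until this point
        answers.append(get_answer("> "))
    return answers


if __name__ == "__main__":
    run_question(stages)
```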

In contrast to the other presentations, Sandra de Cat from James Cook University focused on ‘sheep week’, a week halfway through the course dedicated to small ruminants. Students and staff travel several hundred kilometres to visit sheep properties whose owners have taken different approaches to the industry. Hands-on learning is integrated with previous knowledge to help the students ‘bring it all together.’ With its highly integrated curriculum, this style of learning is a good fit at JCU and very popular with the students, some of whom are struggling with the dip in motivation which often occurs midway through the course. Requests for dedicated ‘weeks’ for other species have been a common feature in student feedback, an indicator of the value students feel they get from the week.

Poster presentations

An excellent poster session was held during the afternoon. In contrast to previous years, each poster was allocated a three-minute presentation in the meeting room rather than being presented with the audience standing around the posters, which was certainly more comfortable. For each poster, only the presenting author is named below. If you are interested in learning more about any poster, please leave a comment and I will send more information. Posters presented were (in no particular order):

Adele Feakes (University of Adelaide) – Re-shaping veterinary business curricula to improve graduate business skills: a shared resource for educators

Adele Feakes (University of Adelaide) – Career sector intentions and gender effect: a cross-sectional analysis across year levels in four Australian veterinary programs in 2012

Chris Riley (Massey University) – How effective is ‘Best Practice’ training in the prevention of horse-related injuries to students?

Elise Boller (University of Melbourne) – Developing a framework for teaching professional communication skills in the University of Melbourne

Eva King (University of Queensland) – Learning from the learners: final year veterinary students’ perceptions of what helps and what hinders their learning in clinical environments

John Inns (University of Melbourne) – Application of the proprietary, web-based curriculum mapping program Rubicon Atlas to map the Melbourne DVM curriculum

Liz Norman (Massey University) – Best practice in writing MCQs: why three options is enough

Liz Norman (Massey University) – Best practice in assessment: Using the SOLO taxonomy

Stuart Barber (University of Melbourne) – Collaborative development of virtual 4D farm systems for veterinary education

Susan Matthew (University of Sydney) – Evaluation of the effectiveness of models in teaching surgical skills

 

 


Education day at AVA 2014

Tuesday May 27th was Education Day at the 2014 AVA Conference. As in previous years, the program consisted of papers, posters, a dinner and lots of excellent networking opportunities. After being on the road for three months around the western half of Australia (if you would like to read our trip blog, please leave a comment to that effect and I’ll invite you), arriving at the conference was akin to landing on another planet, and it took me a while to feel I could hold a professional conversation, much less give a coherent presentation. Happily, I had reassimilated within a few hours and was able to get the best out of the experience.

Although there was quite a variety of topics, a recurring theme was teaching non-technical skills, particularly communication, in veterinary curricula. This first post will cover the presentations which related directly to communication, both in relation to teaching and to professional development.

Non-technical skills in the veterinary curriculum

Martin Cake’s presentation ‘Consensus and evidence for the importance of non-technical veterinary skills’ gave a thought-provoking overview of which non-technical skills are perceived to be important by students and veterinarians, and which skills have actually been shown through evidence to influence the success of veterinary graduates. Using the BEME (Best Evidence Medical Education) framework, meta-analysis of the literature showed evidence that four non-technical skills are of particular importance:

  • client trust/respect
  • awareness of limitations
  • communication skills
  • critical thinking/problem solving

When consensus on perceived importance is matched with evidence of importance, communication skills lead the field, validating the strong focus this area of training has received in veterinary curricula.

Teaching clinical communication skills

Two presentations described the ways in which different vet schools instil clinical communication skills in their students. Jenny Mills and Melinda Bell described the techniques used at Murdoch, which include video scenarios, skills rehearsals of challenging situations such as euthanasia consultations, and client simulations used at different stages in the curriculum. Particular focuses at that school are clinical empathy and intercultural competence. Assessment includes several reflective tasks, such as reflecting on the ‘Talk to the humans’ video scenarios developed at Murdoch, immediate informal feedback after simulation exercises, and students videoing their own consultations and using the recordings to review and self-assess their skills. OSCEs in final year form the ultimate summative assessment. Challenges at Murdoch include providing more opportunities for students to record their consultations, finding more time for communication training in a crowded curriculum, and extending scenarios to include large animal cases.

Susan Matthew from Sydney University described similar challenges with time and resources in teaching clinical communication skills, but also spoke about the additional concern of the attitudes that some students bring to the discipline. Perceptions such as a lack of relevance in comparison to core scientific subjects, and the belief that communication skills have already been acquired and can’t be developed further, can hinder the teaching and learning process. The requirement for active participation, e.g. in role-play scenarios, is challenging for many students, particularly the more introverted, and some take feedback on their performance as a personal affront. These concerns can lead to poor student evaluations of those sections of the curriculum.

Communication through social media 

Jason Coe from Ontario Veterinary College spoke about communication training of a different type, focusing on educating students about the benefits and risks of social media, particularly Facebook. Studies have shown that veterinary students have a high rate of disclosing personal information and of posting material classed as unprofessional on Facebook. Using clickers to gauge audience opinion, Jason took us through a series of Facebook posts and asked whether we thought the various posts were acceptable, in some cases comparing our responses with those of vets and vet students surveyed in Canada using the same scenarios. While some items were clearly not acceptable to the group, others produced a wider range of opinions.

In his curriculum for veterinary students, making students aware of the potential consequences of their actions using real cases is an important facet, as is introducing them to the four principles of ethical decision making in veterinary practice – non-maleficence, beneficence, autonomy and justice.

Twitter for teaching and professional development

Continuing the topic of social media but broadening it to include opportunities for professional development as well as teaching, I presented on the use of Twitter for veterinary educators, highlighting the features that make Twitter a useful tool and showing three examples of how it is currently being used: for disseminating ideas or research, for teaching (using #vetfinals as an example) and for creating a conference back channel.

A second post will cover the remaining presentations and the excellent poster session.

 


I’ve made mistakes in veterinary practice. Have you?

The ‘second victims’ of clinical error


In my first year in practice, I worked in a very busy clinic which also acted as an emergency centre after hours, at the time one of three large emergency clinics in Melbourne. Among the enormous number of cases, I made some great clinical decisions but also some howling errors. Two immediately spring to mind. The first was a white kitten bleeding from around a tooth, which I attributed to it losing its deciduous teeth but which was in fact due to eating Ratsac (a vitamin K antagonist), a realisation which hit me about an hour later. I rang the owners, who brought the kitten back, and it recovered with treatment. The second was a cat brought in unable to move its hind limbs. I initially assumed it had been hit by a car, but it had a saddle thrombus secondary to cardiomyopathy. I remember noticing during the consult, in some distant corner of my mind, that its hind paws were cold, but it wasn’t until I was positioning it for a pelvic radiograph that my neurones connected the cold paws and the paralysis and I proceeded to work it up and treat it for thromboembolism and cardiomyopathy. The fact that I still cringe at my clinical blindness and recall minute details of both cases 15 years later illustrates the impact they had on my psyche. I was the ‘second victim’ of the error, a phrase coined by Professor Albert Wu1 to describe the impact of errors on clinicians, nurses and other support staff.

I have previously written about cognitive aspects of diagnostic error. Brian Goldman’s TED talk, ‘Doctors Make Mistakes. Can we talk about that?’2 (thanks to Jan Ehlers for the link) discusses errors from a clinician’s perspective and reminded me that my fear of errors contributed to my decision to move away from practice. I have only ever admitted that to others who I know share my anxiety because in some way I am still ashamed. I will never make those mistakes again.

After listing some of his own errors, Goldman, an emergency physician and well-known medical journalist in Canada, talks about striving to be perfect and resisting asking for help for fear of being seen as high maintenance. In the aftermath of an error, he recounts thinking, ‘make the voices (in his head) stop and don’t let me make another mistake’. I related completely.

Goldman feels that the culture of silence around error is a major contributor to the feelings of shame and isolation a clinician may feel after making a mistake. Albert Wu agrees, stating that ‘confession is discouraged, passively by the lack of appropriate forums for discussion and sometimes actively by risk managers’.1  This culture has been attributed to the ‘hidden curriculum’ in medical education – the messages and attitudes transmitted through day-to-day attitudes, actions and vocabularies.3

Failure to support the emotional needs of clinicians after an error can have significant consequences such as poorer patient care, depression, burnout, premature retirement4 or, in particularly tragic cases, even suicide5. Wu notes that ‘some of our most reflective and sensitive colleagues [are] perhaps most susceptible to injury from their own mistakes’1, a category I feel I fit into.

At a personal level, strategies to support colleagues after errors may include encouraging a description of what happened, affirming rather than minimising the importance of the mistake, disclosing your own mistakes to help reduce the sense of isolation, acknowledging the emotional impact of the mistake and asking how the colleague is feeling.1 At a broader level, Goldman suggests better systems to reduce errors and rewards for identifying errors and for coming forward after an error has occurred.2

In my experience, a similar culture of silence around clinical error exists in veterinary practice. I shared my mistakes with close friends, who were extremely supportive and, as new grads, had their own stories. Systems for coping with errors were never mentioned during my veterinary education. There is now, rightly, a far greater focus on the mental health of veterinarians and much better support, for example through the Australian Veterinary Association, although I could find no mention of clinical error in their VetHealth section. I would be very interested to hear of any specific veterinary examples of support for veterinarians or veterinary nurses after an error has occurred, or of systems to help us learn from errors.

Where there are clinicians and medical and veterinary support staff, there will always be errors. The second victim is also important.

References

  1. Wu AW. Medical Error: the second victim. BMJ. 2000; 320(7237): 726–727.
  2. TED. Doctors Make Mistakes. Can We Talk About That? http://www.ted.com/talks/brian_goldman_doctors_make_mistakes_can_we_talk_about_that.html. 2012. Retrieved 8 February 2014.
  3. Liao  JM, Thomas EJ, Bell SK. Speaking up about the dangers of the hidden curriculum. Health Aff 2014; 33(1):168-171
  4. Waterman AD, Garbutt J, Hazel E, Dunagan WC, Levinson W, Fraser V, Gallagher TH. The Emotional Impact of Medical Errors on Practicing Physicians in the United States and Canada. Joint Comm J Qual Patient Saf 2007; 33(8):467-76
  5. NBC News. Nurse’s Suicide Highlights Twin Tragedies of Medical Errors http://www.nbcnews.com/id/43529641/ns/health-health_care/t/nurses-suicide-highlights-twin-tragedies-medical-errors/#.UviuNfmSySo. 2011. Retrieved 10 February 2014.

 


Diagnostic biases

 

In a previous post I discussed the widespread, and, in my case, unwitting, use of heuristics in clinical decision making. The advantages of heuristics, such as managing complexity and improving efficiency, are well documented, but equally they can be a disadvantage by leading to faulty reasoning and conclusions1. One such disadvantage is that they can contribute to cognitive bias in clinical decision making. Cognitive bias is defined as a pattern of deviation in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion2. Several important cognitive biases deriving from heuristics are discussed in Jerome Groopman’s book ‘How Doctors Think’3. Some of the more common are:

The availability bias – ‘the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind.’3 (p.64) For example, having seen several cases of kennel cough recently, a veterinarian may misdiagnose a cough due to heart disease as kennel cough because the current outbreak comes to mind so readily.

The anchoring bias – the ‘tendency to perceptually fixate on to the salient features in the patient’s initial presentation at an early point of the diagnostic process and failing to adjust initial impressions even in the light of later information’4.

The confirmation bias – related to the anchoring bias, this refers to the tendency to look for confirming evidence to support the diagnosis we are “anchoring” to, while downplaying or ignoring information that might contradict the initial diagnosis.

The satisfaction of search bias – the tendency to stop looking for alternate or even coexisting diagnoses when we have made one diagnosis. A classic example of this is not searching for a second fracture once “sufficiently satisfied” with finding the first fracture5.

Many vets, particularly more recent graduates, worry that they won’t have sufficient knowledge of clinical disease, but in fact the vast bulk of diagnostic errors stem from cognitive mistakes6. Reflection on the clinical evidence available is a recommended way of limiting errors due to faulty clinical reasoning7. Core questions include:

  • Why do I favour this diagnosis?
  • What else could it be?
  • Could two things be going on?
  • Have I considered the finding or lab result that doesn’t quite fit?7

Awareness of these potential errors is a great step towards reducing the chance of falling into the common traps.

Besides their scientific basis, clinical errors can have a significant impact, obviously on the patient, but also on the clinician. Media coverage and complaints processes rightly concentrate on the former, while the failure to discuss the latter can be devastating to the people involved. As a clinician who has made mistakes, I will discuss the culture around clinical error in my next post.

1. Elstein AS, 1999. Heuristics and Biases: Selected Errors in Clinical Reasoning. Academic Medicine, 74(7), 791–4

2. Cognitive bias – Wikipedia, the free encyclopedia, 2013. Available at: http://en.wikipedia.org/wiki/Cognitive_bias [Accessed February 10, 2014].

3. Groopman, J., 2007. How Doctors Think. Houghton Mifflin, Boston

4. Emergency Medicine Blog: Heuristics and Cognitive Biases in Decision Making During Clinical Emergencies, 2013. Available at: http://emergencymedic.blogspot.com.au/2012/01/heuristics-and-cognitive-biases-in.html [Accessed 30 August 2013].

5. Croskerry, P., 2009. A Universal Model of Diagnostic Reasoning. Academic Medicine, 84(8), pp.1022–1028.

6. Croskerry, P., 2003. Cognitive Forcing Strategies in Clinical Decision making. Annals of Emergency Medicine, 41(1), pp.110–120. Available at: http://home.comcast.net/~jasoncillo/Cognitive.pdf [Accessed February 10, 2014].

7. Keen M, University of Washington School of Medicine, 2013. Medical Decision Making in Clinical Care: Avoiding Common Errors. Available at: http://healthinfo.montana.edu/WWAMI Conference/Medical Decision Making 2013-04.pdf [Accessed February 10, 2014].


BEST network – ‘Revolutionising biomedical education’


What is the BEST network?

The BEST Network is a new community of medical schools and peak professional bodies in medicine and nursing embracing the potential of online teaching and learning by developing lessons to use and share. The tagline on the home page of the newly launched website states ‘We are a community of biomedical experts who share a vision: that every student and every teacher, wherever they are, will have access to the best healthcare education.’ Among the founding members are the University of New South Wales, the University of Melbourne, James Cook University, the University of Queensland, the Royal College of Pathologists of Australasia, the Australian College of Nursing and the Australian Orthopaedic Association. The project began at the University of NSW, and member institutions joined because they had a particular connection to the project and the capacity to become involved in early lesson development, rather than through specific invitation.

Membership of the BEST community includes Smart Sparrow membership (see previous post) and the opportunity to author lessons. Lessons to be published on the BEST portal must undergo a still-developing peer review process but all lessons developed can be used by students and shared with chosen colleagues through Smart Sparrow.

Why establish BEST?

The BEST website gives six reasons for the network’s existence, including sharing of knowledge, next-generation educational content and cutting-edge technologies. Developing quality resources is time-consuming and expensive, and sharing seems extremely logical in these days of straitened university funding.

What does BEST provide?

The three central features of BEST are:

Courseware – the resources built with Smart Sparrow software, available for all BEST member educators, regardless of whether they work for a member or non-member institution, to use as developed or to modify to suit their needs. Each course has a ‘Teach with it’ button, which adds the course to the user’s personal workspace and allows them to alter and deploy it.

Slice – a cloud-based, high-resolution biomedical image bank sourced from medical school collections. The website states that it adheres to legal standards of acquisition, privacy, ownership and publishing. Images allow zooming and can be annotated to aid student identification of key features. At this stage images cannot be embedded directly into other programs (e.g. PowerPoint) – only a link can be provided – but they can be included in lessons built with Smart Sparrow.

Community – the intention is to provide a platform for educators to network. This section of the site is still under development, awaiting feedback as to how potential users would like it to function.

How is the network funded?

The BEST network received $4.5 million of government funding to become established. This funding runs out in July 2014, giving the network nine months from launch to find a sustainable funding model. Clearly this is a critical issue; it was raised in the panel discussion at the launch and there is currently no clear answer.

The launch

In the week leading up to the launch there were two articles about the upcoming event: an opinion piece in the Higher Education section of The Australian by Professor Peter Smith, dean of UNSW Medicine (the lead faculty in the BEST Network), and Dr Dror Ben-Naim, chief executive of Smart Sparrow; and an article by Dr Dror Ben-Naim in the Education section of the Financial Review.

There were over 150 people at the event, including representatives from member and non-member institutions and health care organisations, as well as journalists (an interesting choice of moderator for the panel discussion was Tim Dodd, education editor of the Financial Review). Unfortunately I couldn’t make it to the introductory session, which included overviews of different aspects of the network, but I attended the panel discussion on BEST and the future of medical education and enjoyed the excellent lunch provided and the chance to chat to other attendees, a number of whom had come from non-member institutions interstate at their own expense. The afternoon session was a ‘speed dating showcase’: a series of seven stations which we rotated through every ten minutes. Although this was a bit rushed at times, it was a good way to have a closer look at some of the courseware developed and at Slice in use. There were some excellent examples, but two that I particularly liked were the forearm and hand anatomy (used as a review between lectures and a practical class) and the virtual oxygen electrode used as a practical class exercise.

The future

The BEST network is a great step forward in the collaboration and sharing of teaching resources and, given the strains on university funding and the improvements in technology, it makes perfect sense. As expected with such a collaborative effort, some questions were inevitably raised about intellectual property, but hopefully institutions can see that the benefits far outweigh the disadvantages. I sincerely hope sustainable funding can be found and that the biomedical community teaches and learns through the BEST network for many years to come.


BEST Network launch event – adaptive learning


I was very fortunate to be invited to attend the BEST (biomedical education, skills and training) network launch at Melbourne University on October 18th. The network is a group of members (University of New South Wales, University of Melbourne, James Cook University, University of Queensland and professional peak bodies in medicine and nursing) who are creating and pooling digital educational resources, including an image bank and virtual patients. These resources are built using the Smart Sparrow adaptive learning platform, developed at UNSW.

There were many interesting features of both the BEST Network and Smart Sparrow. Wearing my eLearning developer hat, this post will concentrate on the features of Smart Sparrow and the types of learning experiences able to be created. Swapping to my resource sharing and collaboration hat, the next post will discuss the BEST network and its features.

What is adaptive eLearning and how does Smart Sparrow compare with other eLearning platforms?

Having used several different tools of very varied levels of sophistication (Dreamweaver 4.0 (way back!), PowerPoint, Articulate Studio, Articulate Storyline) over many years to develop eLearning, I was very interested in the features of Smart Sparrow and how they compare with other eLearning software packages. Articulate is a mass-market commercial eLearning development tool which is widely used, in industry in particular. Its website states that it is used by 19 of the 20 top-ranked universities in the US, which makes a comparison very relevant. I do not claim to be an expert in any of the tools above and there may be features of them I am not aware of. I am very happy to be corrected if this is the case.

My comments below are based on some features I learnt about at the launch through demonstrations of some courseware already developed using Smart Sparrow and information from the Smart Sparrow website.

There are three key ‘adaptive’ features in the Smart Sparrow software:

1. Adaptive feedback

There are two facets to this feature:

- feedback given when a student answers a question depends on the answer chosen, e.g. for a multiple choice question. This is not revolutionary at all; it is a common feature of eLearning software and can even be done in PowerPoint by creating branching links.

- giving the student guidance when they have reached a set number of incorrect attempts. This is a great feature and one I saw in action in the demonstrations. It is apparently possible in Storyline but is very complicated to set up, not being a generic feature of the software; a rough sketch of the logic appears below.
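
The sketch below is my own illustration in plain Python, with an invented question, of what that logic might look like – answer-specific feedback first, then a hint once a set number of incorrect attempts has been reached. It is not Smart Sparrow’s implementation.

```python
# Illustrative sketch only: answer-specific feedback plus a hint after a set
# number of incorrect attempts. Question content is invented and this is not
# how Smart Sparrow is actually implemented.

HINT_AFTER_ATTEMPTS = 2

question = {
    "prompt": "Which drug class does meloxicam belong to?",
    "correct": "NSAID",
    "feedback": {
        "corticosteroid": "Not quite - meloxicam is not a steroid.",
        "opioid": "No - meloxicam does not act on opioid receptors.",
    },
    "hint": "Think about drugs that inhibit cyclooxygenase (COX) enzymes.",
}


def respond(question, answer, attempts_so_far):
    """Return feedback adapted to the answer given and the attempt count."""
    if answer == question["correct"]:
        return "Correct!"
    feedback = question["feedback"].get(answer, "That isn't right.")
    if attempts_so_far + 1 >= HINT_AFTER_ATTEMPTS:
        feedback += " Hint: " + question["hint"]
    return feedback


print(respond(question, "corticosteroid", attempts_so_far=0))
print(respond(question, "opioid", attempts_so_far=1))  # the hint appears here
```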

2. Adaptive learning paths

Content shown to individual students can be varied depending on whether they demonstrate understanding by answering a set of questions. I didn’t see enough of this to fully assess it, but it sounds different from other software I have used, in which you can certainly branch learners based on a single answer but not, as far as I am aware, on multiple answers.
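
My guess at the mechanics (and it is only a guess, not Smart Sparrow’s actual rules engine) is something like branching on a student’s performance across a set of questions rather than a single response:

```python
# Guess at how an adaptive learning path might branch on a set of answers
# rather than a single one. Illustrative only; the section names and pass mark
# are invented, and this is not Smart Sparrow's rules engine.

def next_section(results, pass_mark=0.8):
    """Choose the next content block from a dict of question name -> correct?"""
    score = sum(results.values()) / len(results)
    if score >= pass_mark:
        return "extension_cases"        # understanding demonstrated: go deeper
    if results.get("core_concept") is False:
        return "revision_core_concept"  # targeted remediation for a key gap
    return "guided_practice"            # otherwise, more scaffolded practice


print(next_section({"core_concept": True, "q2": True, "q3": False}))   # guided_practice
print(next_section({"core_concept": False, "q2": True, "q3": True}))   # revision_core_concept
```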

3. Adapting content

Content can be easily adapted by educators combining the analytics and authoring tools built into Smart Sparrow, rather than needing to go back to developers to request changes. The newer generation of eLearning tools, such as Articulate, have been specifically designed to be relatively easy to master without a background in programming, so the promise of easy authoring didn’t particularly excite me. The learner analytics, though, really impressed me. For each question the educator can easily see the average grade, average time spent, average number of attempts and number of students attempting. For each student you can see time spent, number of attempts, answers given and score. Students also receive an email at the end of the activity with many of the same statistics so they have a clear picture of their progress.
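
In essence, that kind of summary is an aggregation over attempt logs. The sketch below (plain Python with an invented log format, just to show the shape of the data, not the platform’s actual reporting) computes the per-question figures described above.

```python
# Sketch of the per-question analytics described above, computed from a list of
# attempt records. The log format and values are invented; Smart Sparrow's own
# reporting will differ.
from statistics import mean

attempts = [
    {"student": "s1", "question": "q1", "attempts": 1, "time_s": 40, "score": 1.0},
    {"student": "s2", "question": "q1", "attempts": 3, "time_s": 95, "score": 0.5},
    {"student": "s1", "question": "q2", "attempts": 2, "time_s": 60, "score": 1.0},
]


def question_summary(records, question):
    """Average grade, time, attempts and number of students for one question."""
    rows = [r for r in records if r["question"] == question]
    return {
        "students_attempting": len({r["student"] for r in rows}),
        "average_grade": mean(r["score"] for r in rows),
        "average_time_s": mean(r["time_s"] for r in rows),
        "average_attempts": mean(r["attempts"] for r in rows),
    }


print(question_summary(attempts, "q1"))
```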

Other features of Smart Sparrow

Besides the power of and easy access to the analytics, the other feature that really caught my attention was the virtual spaces, e.g. laboratories, that can be created. The most impressive was the virtual oxygen laboratory, in which students can pick up a pipette, add chosen amounts of different reagents and see the result of their actions. This level of sophistication would obviously require time and money at a level unlikely to be possible in the veterinary sphere, but it is fantastic to see what can be created. There are also some virtual patient cases which I haven’t had the chance to assess in detail yet but which could have excellent potential to be replicated in a veterinary context.

Ease of authoring?

One aspect of Smart Sparrow I cannot assess at this stage is the authoring tool. Ease of authoring is a strength of many major commercial products, as is support not only from the developers but also from communities of users. The Articulate community in particular is brilliant, and I have always found help within a few hours for any query I have posted, regardless of the time of night or day. Obviously building up that community of knowledgeable users takes time, and no doubt the plan is that this will develop through the BEST Network, but it may be a challenge at this early stage.

Would I like the opportunity to try Smart Sparrow in the veterinary sphere? You bet I would.

My overall impression was that this platform has significant potential to improve student learning in veterinary science. It will not (and does not claim to) replace face-to-face learning but could be an extremely valuable adjunct. The power of the analytics was the highlight for me, both for teachers and students to get a better understanding of their progress and areas of challenge. I can’t wait to try it out, although I need to clarify whether I am able to do so, given that it is currently intended for medical rather than veterinary education. I will certainly post about the experience and any results.

NOTE: Readers from BEST Network member institutions (listed above, or go to https://www.best.edu.au/) can access the courseware mentioned above and many other examples by joining the network. Those from non-member institutions can also join and use/adapt resources but are not able to create their own lessons.
