Does Having In-Person, Virtual, or No Collaboration Affect A... : The Hearing Journal


Clinical reasoning is defined as the thinking processes employed during clinical practice, and a couple of recent studies have called for improving clinical reasoning through instructional design; the results of one were published in November 2017 in the Journal of General Internal Medicine, and the results of another were published in June 2019 in Diagnosis. Additionally, a study published in a January 2017 issue of Academic Medicine identified three causes of diagnostic mistakes among health care professionals: cognitive bias, knowledge deficits, and dual-process thinking. The effects of these three factors on diagnosis can be lessened, or eliminated entirely, when virtual patients are involved. Virtual patients can also ease the time constraints and knowledge gaps involved in teaching clinical reasoning.

Another study set out to explore the topics of clinical reasoning, audiology instruction, and virtual learning; its results were published as “Examining Audiology Students’ Clinical Collaboration Skills When Using Virtual Audiology Cases Aided With No Collaboration, Live Collaboration, and Virtual Collaboration” in the March 2022 issue of the American Journal of Audiology, authored by Ramy Shaaban and Cynthia M. Richburg. The findings seem especially relevant in an age when work is increasingly done virtually and when there is frequent debate over how effective collaboration can be if none of the collaborators are physically in the same room.

The study included 38 participants, all students from audiology courses at a public university in Pennsylvania and another in Kansas; 36 were women and two were men, reflecting the gender imbalance traditionally seen in communication sciences and audiology programs. Participants were randomly sorted into three groups: a treatment group in which participants collaborated virtually, and two control groups in which participants either had no collaboration or collaborated in person. The study aimed to examine students’ collaboration skills in what is called a scaffolded environment. Scaffolding is a way of supporting students until they achieve specific learning goals, and the two main kinds are hard scaffolds (pre-scripted tools) and soft scaffolds (adaptive, dynamic tools provided to the student throughout learning). The nature of this study ensured the use of multiple scaffolding methods.

Two computer-based simulations of audiology cases were created to put participants’ clinical reasoning skills into action. The simulation program was created using Adobe Animate to design the interactions, Google Forms as a scaffolding and problem-based activity tool, and WordPress as a host for the simulation session. The program consisted of an interactive simulation of the clinical tools used to diagnose the two cases, an interactive problem-solving activity using branched questions with situations that changed based on the students’ choices, and a grading system that saved students’ grades in an Excel sheet.
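To make the branched-question design more concrete, here is a minimal Python sketch of how a branching clinical case with per-choice scoring and spreadsheet-style grade storage could be structured. It is only an illustration under stated assumptions: the study built its simulations with Adobe Animate, Google Forms, WordPress, and Excel, and the case content, node names, and point values below are hypothetical stand-ins rather than anything taken from the published cases.

```python
import csv

# Hypothetical branched case: each node presents a situation, and the student's
# choice determines both the points earned and the next situation shown.
# All prompts, choices, and point values here are illustrative only.
CASE_NODES = {
    "start": {
        "prompt": "Patient reports gradual hearing loss in one ear. First step?",
        "choices": {
            "otoscopy": {"points": 2, "next": "after_otoscopy"},
            "order_imaging": {"points": 0, "next": "after_otoscopy"},
        },
    },
    "after_otoscopy": {
        "prompt": "Otoscopy is unremarkable. Next step?",
        "choices": {
            "pure_tone_audiometry": {"points": 2, "next": None},
            "refer_without_testing": {"points": 0, "next": None},
        },
    },
}

def run_case(answers):
    """Walk the branched case using {node: chosen option} and total the score."""
    node, score = "start", 0
    while node is not None:
        choice = CASE_NODES[node]["choices"][answers[node]]
        score += choice["points"]
        node = choice["next"]
    return score

def save_grade(student_id, score, path="grades.csv"):
    """Append one student's grade to a spreadsheet-readable file."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([student_id, score])

if __name__ == "__main__":
    answers = {"start": "otoscopy", "after_otoscopy": "pure_tone_audiometry"}
    save_grade("student_01", run_case(answers))
```

The key design point the sketch captures is that every choice both contributes to the grade and redirects the scenario, so two students can see different situations depending on their earlier decisions.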

Each virtual case had the same steps and activities but different situations and case designs. Because students have differing levels of familiarity with computers and the internet, the first case involved what was called an easy diagnosis while the second involved a more difficult one. An easy diagnosis here meant that the needed clues were provided directly, aiming to lead to an accurate diagnosis with simple reasoning. The second case provided more indirect clues that required more careful thinking.

Participants in the in-person and virtual collaboration groups achieved markedly better results than participants in the no-collaboration group. The in-person group, consisting of 12 participants, achieved a total score of 97.9%, while the total score of the virtual group (also 12 participants) was 95.1% and the score of the no-collaboration group (14 participants) was 78.6%.
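For a quick sense of scale, the reported group scores can be combined into a weighted overall average and compared against the no-collaboration group. The short Python sketch below does only that arithmetic on the figures quoted above; it is illustrative and is not part of the study's own analysis.

```python
# Group sizes and scores as reported for the three conditions
groups = {
    "in_person": (12, 97.9),
    "virtual": (12, 95.1),
    "no_collaboration": (14, 78.6),
}

# Weighted overall average across all 38 participants
total_n = sum(n for n, _ in groups.values())
overall = sum(n * score for n, score in groups.values()) / total_n
print(f"Overall average score: {overall:.1f}%")  # ~89.9%

# Gap between each condition and the no-collaboration group
baseline = groups["no_collaboration"][1]
for name, (_, score) in groups.items():
    print(f"{name}: {score - baseline:+.1f} points vs. no collaboration")
```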

The results suggest a significant difference in clinical reasoning skills among the three groups, though the scores of the in-person and virtual groups were close. The results also imply that lower scores were linked to students receiving more instructor-designed content and higher scores to students receiving less. Participants who received more scaffolds along with the collaborations might have exhibited better decision-making outside this exercise than participants who did not receive them, but it is important to note that lower scores do not necessarily equal lower skill, just different paths toward total expertise.

Excerpts from “Examining Audiology Students’ Clinical Collaboration Skills When Using Virtual Audiology Cases Aided With No Collaboration, Live Collaboration, and Virtual Collaboration” were included in this column. The Hearing Journal would like to acknowledge that the original study was published in its entirety on Jan. 19, 2022, at https://bit.ly/3LGt4Ir (DOI: 10.1044/2021_AJA-21-00052).
