
Using Eye-Tracking Technology to Capture the Visual Attention of Nurses During Interpretation of Patient Monitoring Scenarios from a Computer Simulated Bedside Monitor

Currie, Jonathan, Bond, Raymond R, McCullagh, P. J., Black, Pauline, Finlay, Dewar and Peace, Aaron (2016) Using Eye-Tracking Technology to Capture the Visual Attention of Nurses During Interpretation of Patient Monitoring Scenarios from a Computer Simulated Bedside Monitor. In: International Society for Computerized Electrocardiology, Arizona. ISCE. 1 pp. [Conference contribution]

Text - Accepted Version (54kB)
Text - Supplemental Material (98kB) - Indefinitely restricted to Repository staff only.

URL: http://c.ymcdn.com/sites/www.isce.org/resource/resmgr/2016Conference/ISCE_Program_2016_March_22.pdf

Abstract

Introduction: This study analysed the utility of eye-tracking technology for gaining insight into the decision-making processes of nurses during their interpretation of patient scenarios and vital signs.

Methods: Five patient monitoring scenarios (vignette, vital signs [ECG, BP, etc.] and scoring criteria) were designed and validated by critical care experts. Participants were asked to interpret these scenarios whilst 'thinking aloud'. Visual attention was measured using infrared light-based eye-tracking technology. Each interpretation was scored out of 10. Subjects comprised students (n=36) and qualified nurses (n=11). Scores and self-rated confidence (where 1=low, 10=high) are presented as mean±SD. Significance testing was performed using a t-test and ANOVA where appropriate (α=0.05). Multivariate regression was performed to determine whether a machine could use eye-gaze features to accurately predict competency (dependent variable=score). Eye-gaze variables were entered into the regression models only if they correlated statistically significantly (p<0.05) with the score.

Results: Scores across all scenarios were calculated (students=4.58±1.13 vs. qualified=6.85±0.82), with a statistically significant difference between groups (p<0.01). Mean self-rated confidence was also calculated (students=5.79±1.05 vs. qualified=7.49±1.00, p<0.01). There was a weak positive correlation between confidence and score amongst students (r=0.323, p=0.06), but no meaningful correlation for qualified nurses (r=-0.099, p=0.77). However, across all participants there was a moderate correlation between confidence and score (r=0.592, p<0.01). The fitness of the regression models for predicting competency from eye-gaze features alone was as follows:

• Scenario 1: R²=0.407, Std Error=1.243 (p=0.09)
• Scenario 2: R²=0.746, Std Error=1.439 (p=0.01)
• Scenario 3: R²=0.385, Std Error=1.564 (p=0.03)
• Scenario 4: R²=0.687, Std Error=1.340 (p=0.44)
• Scenario 5: R²=0.766, Std Error=0.960 (p=0.02)

A table in the poster also shows where subjects fixated most and least among the different vital signs on the bedside monitor (note the lower fixation duration on the ECG by students in comparison to qualified nurses).

Conclusion: The study has shown that eye-tracking measurements can provide insight into the decision-making of nurses and can be used to predict competency.

Item Type: Conference contribution (Poster)
Keywords: Eye tracking, healthcare training, simulation based training, competency, patient safety, clinical decision making, bedside monitoring, vital signs
Faculties and Schools: Faculty of Computing & Engineering
Faculty of Computing & Engineering > School of Computing and Mathematics
Faculty of Computing & Engineering > School of Engineering
Faculty of Life and Health Sciences > School of Nursing
Faculty of Life and Health Sciences
Research Institutes and Groups: Engineering Research Institute
Engineering Research Institute > Nanotechnology & Integrated BioEngineering Centre (NIBEC)
Computer Science Research Institute > Smart Environments
Computer Science Research Institute
ID Code: 36088
Deposited By: Dr Raymond Bond
Deposited On: 18 Oct 2016 11:54
Last Modified: 18 Oct 2016 11:54
