
A Metric to Quantify if a Software Application Meets the User’s Expectation for Completing a Representative Task

Bond, Raymond, Van Dam, Eelco, Van Dam, Peter, Finlay, Dewar and Guldenring, Daniel (2015) A Metric to Quantify if a Software Application Meets the User’s Expectation for Completing a Representative Task. In: Irish Human Computer Interaction Conference (iHCI), Dublin. 1 pp. [Conference contribution]

Full text not available from this repository.

URL: https://ihci2015.wordpress.com/poster-presentations/

Abstract

A usability test was conducted to evaluate medical software (ECGSim). A total of 112 tasks were recorded (14 users each attempted 8 tasks). We wanted to evaluate two metrics that could be used to determine whether the software meets the user’s expectation when carrying out a representative task. Each user answered two questions before completing a task: (Q1) how difficult do you expect this task to be on a scale of 1-10? (where 10 = most difficult) and (Q2) how long do you expect this task to take? After attempting the task, the user was asked to rate how difficult the task had been. Two metrics were then calculated for each user task: (Metric 1) the Δ between the user’s expected task-completion time and the actual task-completion time (derived from time-stamps in a screencast), and (Metric 2) the Δ between the pre-task and post-task difficulty ratings. A paired t-test (p<0.05) was used to test for significance when comparing the users’ expected task-completion times with actual task-completion times, and likewise when comparing pre- and post-task difficulty ratings. Results indicate that only Metric 2 reached statistical significance, identifying three tasks that did not meet the users’ expectations. As validation, two of the identified tasks also had the lowest task-completion rates. In conclusion, comparing pre-task estimates of task-completion time with actual task-completion times does not yield statistically significant differences; however, comparing pre- and post-task difficulty ratings does.
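For illustration only, a minimal sketch of how the two metrics and the paired t-tests described above could be computed is given below. This is not the authors' analysis code: the per-user values are placeholder numbers, not data from the study, and the scipy library is assumed to be available.

# Sketch of the per-task analysis described in the abstract (hypothetical data).
from scipy import stats

# Per-user values for one representative task (placeholder numbers only).
expected_time = [60, 45, 90, 30, 120, 75, 50, 40]   # seconds, pre-task estimate (Q2)
actual_time   = [80, 70, 95, 55, 150, 60, 65, 70]   # seconds, from screencast time-stamps

pre_difficulty  = [3, 4, 2, 5, 3, 4, 2, 3]          # 1-10 rating before the task (Q1)
post_difficulty = [6, 5, 4, 7, 5, 6, 3, 5]          # 1-10 rating after the task

# Metric 1: delta between actual and expected task-completion time.
metric1 = [a - e for a, e in zip(actual_time, expected_time)]
# Metric 2: delta between post-task and pre-task difficulty ratings.
metric2 = [post - pre for post, pre in zip(post_difficulty, pre_difficulty)]

# Paired t-tests (alpha = 0.05) on the two paired samples.
t1, p1 = stats.ttest_rel(actual_time, expected_time)
t2, p2 = stats.ttest_rel(post_difficulty, pre_difficulty)

print(f"Metric 1 (time): mean delta = {sum(metric1) / len(metric1):.1f} s, p = {p1:.3f}")
print(f"Metric 2 (difficulty): mean delta = {sum(metric2) / len(metric2):.1f}, p = {p2:.3f}")

Under this sketch, a task would be flagged as not meeting the users' expectation when the paired t-test on the relevant metric is significant at p<0.05.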

Item Type: Conference contribution (Poster)
Keywords: Usability, Human Computer Interaction, UX
Faculties and Schools: Faculty of Computing & Engineering
Faculty of Computing & Engineering > School of Computing and Mathematics
Faculty of Computing & Engineering > School of Engineering
Research Institutes and Groups: Engineering Research Institute
Engineering Research Institute > Nanotechnology & Integrated BioEngineering Centre (NIBEC)
Computer Science Research Institute > Smart Environments
Computer Science Research Institute
ID Code: 32534
Deposited By: Dr Raymond Bond
Deposited On: 03 Nov 2015 12:36
Last Modified: 03 Nov 2015 12:36
