Evaluation Strategies for HCI Toolkit Research

David Ledo, Steven Houben, Jo Vermeulen, Nicolai Marquardt, Lora Oehlberg, Saul Greenberg
In CHI Conference on Human Factors in Computing Systems (CHI 2018), Montreal, QC, Canada. Conference paper.

Abstract

Toolkit research plays an important role in the field of HCI, as it can heavily influence both the design and implementation of interactive systems. For publication, the HCI community typically expects toolkit research to include an evaluation component. The problem is that toolkit evaluation is challenging, as it is often unclear what 'evaluating' a toolkit means and what methods are appropriate. To address this problem, we analyzed 68 published toolkit papers. From our analysis, we provide an overview of, reflection on, and discussion of evaluation methods for toolkit contributions. We identify and discuss the value of four toolkit evaluation strategies, including the associated techniques that each employs. We offer a categorization of evaluation strategies for toolkit researchers, along with a discussion of the value, potential limitations, and trade-offs associated with each strategy.