The History of Evaluation in HCI


[Header image: visualization. Image credit: Adobe XD]

Introduction

Evaluating interactive systems has always been central to Human-Computer Interaction (HCI) and has been a dominant research topic for decades. Not surprisingly, evaluation is listed as a core activity in nearly every design model, with many placing it at the center of the design process. Evaluation efforts have been shaped by continuous adaptation to technological and cultural change. To see how evaluation has changed through the decades, our timeline starts in 1940, with the advent of the modern computer, and ends by examining how evaluation has evolved into its current form.

Material

The timeline was built with TimelineJS from KnightLab, which renders events from a Google spreadsheet. I followed the directions in their tutorial video, linked below; a sample embed snippet is shown after the links.

http://timeline.knightlab.com/#preview-embed

Google spreadsheet
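
For reference, a minimal version of the TimelineJS embed code looks like the sketch below. The sheet ID in the source URL is a placeholder, not the actual spreadsheet behind this timeline; the generator on the KnightLab page produces the real embed URL once the Google spreadsheet has been published to the web.

    <!-- YOUR_SHEET_ID is a placeholder for the ID of the published Google Sheet -->
    <iframe src="https://cdn.knightlab.com/libs/timeline3/latest/embed/index.html?source=YOUR_SHEET_ID&font=Default&lang=en&initial_zoom=2&height=650"
            width="100%" height="650" frameborder="0" allowfullscreen>
    </iframe>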

Results

In the earliest stages of computing, evaluations focused on the development of the technology itself. As technology advanced and computers became more capable, you can see how evaluations shifted their focus onto users and their needs, and away from the performance of the technology. The timeline is divided into five phases, ranging from the 1940s to the present, drawing on MacDonald and Atwood (2013). Each phase represents a cornerstone in computer development and a new way of evaluating effectiveness:

  • System reliability phase
  • System performance phase
  • User performance phase
  • Usability phase
  • User experience phase
View complete timeline

Reflections

We have witnessed how evaluation started out measuring the technology's performance and has shifted towards examining people's needs and emotions when using their devices. Technology will continue to evolve, and so must our methods of evaluation, in order to validate and ensure that technology works for us.

References

Craig M. MacDonald and Michael E. Atwood. 2013. Changing perspectives on evaluation in HCI: past, present, and future. In CHI ’13 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’13). Association for Computing Machinery, New York, NY, USA, 1969–1978. DOI:https://doi.org/10.1145/2468356.2468714

Christian Remy, Oliver Bates, Alan Dix, Vanessa Thomas, Mike Hazas, Adrian Friday, and Elaine M. Huang. 2018. Evaluation Beyond Usability: Validating Sustainable HCI Research. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). Association for Computing Machinery, New York, NY, USA, Paper 216, 1–14. DOI:https://doi.org/10.1145/3173574.3173790