Crowdsourcing for Information Visualization: Promises and Pitfalls
2017 (English). In: Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments / [ed] Daniel Archambault, Helen Purchase, and Tobias Hoßfeld, Springer, 2017, p. 96-138. Chapter in book (Refereed)
Abstract [en]
Crowdsourcing offers great potential to overcome the limitations of controlled lab studies. To guide future designs of crowdsourcing-based studies for visualization, we review visualization research that has attempted to leverage crowdsourcing for empirical evaluations of visualizations. We discuss six core aspects for successful employment of crowdsourcing in empirical studies for visualization – participants, study design, study procedure, data, tasks, and metrics & measures. We then present four case studies, discussing potential mechanisms to overcome common pitfalls. This chapter will help the visualization community understand how to effectively and efficiently take advantage of the exciting potential crowdsourcing has to offer to support empirical visualization research.
Place, publisher, year, edition, pages
Springer, 2017. p. 96-138
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 10264
Keywords [en]
crowdsourcing, information visualization, user studies, visualization, evaluation
National Category
Computer Sciences; Human Computer Interaction
Research subject
Computer Science, Information and software visualization
Identifiers
URN: urn:nbn:se:lnu:diva-68184
DOI: 10.1007/978-3-319-66435-4_5
Scopus ID: 2-s2.0-85031501419
ISBN: 978-3-319-66434-7 (print)
ISBN: 978-3-319-66435-4 (electronic)
OAI: oai:DiVA.org:lnu-68184
DiVA, id: diva2:1146786
Available from: 2017-10-04. Created: 2017-10-04. Last updated: 2020-12-15. Bibliographically approved.