Supporting the design and evaluation of
GLAM and academic crowdsourcing websites
Donelle McKinley, PhD candidate, Information Systems
Supervisors: Dr Sydney Shep and Dr Brenda Chawner
• GLAM (galleries, libraries, archives, and museums) and academic crowdsourcing
involves online volunteers creating or enhancing data and digital resources
(Carletti et al., 2013)
• Crowdsourcing is still in an experimental phase (Lascarides, 2012), and not all
projects have been cost-effective (Dunn & Hedges, 2012)
• Key project objectives are sufficient online participation and
high-quality contributions (Oomen & Aroyo, 2011)
• Meeting these objectives requires an understanding of contextual factors,
effective project and system design, and evaluation and refinements to
achieve optimal performance (Brabham, 2013)
• Common project constraints include limited time, resources and expertise
(Causer & Wallace, 2012; Holley, 2010; Vershbow, 2013)
• Guidelines in use are based on case studies and do not focus on system
design and evaluation (Holley, 2009; Lascarides, 2012; Romeo & Blaser, 2011)
• Scarcity of empirically based guidance to inform the design and evaluation of GLAM and
academic crowdsourcing projects and systems (Shirk et al., 2012; Wiggins, 2012; Zhao &
Zhu, 2012)
• No research to date has focused on the design or evaluation of GLAM and academic
crowdsourcing websites
Research strategy: Responds to calls for human-computer interaction (HCI) research to help
meet project objectives (Bernstein et al., 2011; Oomen & Aroyo, 2011; Pan & Blevis, 2011).
Research aim: Develop a set of specialized heuristics that support the design and evaluation
of GLAM and academic crowdsourcing websites
Research question: How can a set of specialized heuristics support the design and
evaluation of GLAM and academic crowdsourcing websites?
1. What are the components of a set of specialized heuristics?
2. How can these specialized heuristics be structured and presented to support their
application in practice?
Research method: Action Design Research (ADR)
• A design research method that draws on Action Research
• ADR "conceptualizes the research process as containing the inseparable and
inherently interwoven activities of building the IT artifact, intervening in the
organization, and evaluating it concurrently" (Sein et al., 2011)
• Four stages of ADR (Sein et al., 2011)
• Research setting extends internationally across multiple institutions
• Dynamic pool of potential participants
• A priori criteria approach to purposive sampling uses the researcher’s
conceptual framework for GLAM and academic crowdsourcing
• Multiple data collection techniques
• Iterative approach to heuristics development
Expected contributions:
• Contribute to IS theory for design and action (Gregor, 2006) and design
research by generating design principles for crowdsourcing website design
and specialized heuristics development
• Support crowdsourcing website design and evaluation practice with a
new, robust and flexible tool selected and customized for the context of use
• Contribute to crowdsourcing literature by presenting a focused study on
crowdsourcing website design and evaluation
• Contribute to HCI literature by presenting a study that incorporates and
builds on existing theory and recent research
Bernstein, M., Chi, E. H., Chilton, L., Hartmann, B., Kittur, A., & Miller, R. C. (2011). Crowdsourcing and human computation: systems, studies and platforms. In
Proceedings of the 2011 annual conference extended abstracts on Human factors in computing systems (pp. 53–56). New York, NY, USA: ACM.
Brabham, D. C. (2013). Crowdsourcing. Cambridge, MA: MIT Press.
Carletti, L., Giannachi, G., Price, D., & McAuley, D. (2013). Digital Humanities and Crowdsourcing: an Exploration. Presented at the Museums and the Web,
Portland, Oregon. Retrieved from http://mw2013.museumsandtheweb.com/paper/digital-humanities-and-crowdsourcing-an-exploration-4/
Dunn, S., & Hedges, M. (2012). Crowd-Sourcing Scoping Study: Engaging the Crowd with Humanities Research. London: Centre for e-Research, Department of
Digital Humanities, King’s College London. Retrieved from http://crowds.cerch.kcl.ac.uk/wp-uploads/2012/12/Crowdsourcing-connected-communities.pdf
Gregor, S. (2006). The Nature of Theory in Information Systems. Management Information Systems Quarterly, 30(3).
Holley, R. (2009). Many Hands Make Light Work: Public Collaborative OCR Text Correction in Australian Historic Newspapers. Australia: National Library of
Australia. Retrieved from http://www.nla.gov.au/openpublish/index.php/nlasp/article/view/1406/1688
Holley, R. (2010). Crowdsourcing: How and Why Should Libraries Do It? D-Lib Magazine, 16(3/4). doi:10.1045/march2010-holley
Howe, J. (2008). Crowdsourcing: why the power of the crowd is driving the future of business. New York: Three Rivers Press.
Lascarides, M. (2012). Next-Gen Library Design. Chicago: ALA Tech Source.
Oomen, J., & Aroyo, L. (2011). Crowdsourcing in the cultural heritage domain: opportunities and challenges. In C&T ’11 Proceedings of the 5th International
Conference on Communities and Technologies (pp. 138–149). doi:10.1145/2103354.2103373
Pan, Y., & Blevis, E. (2011). A survey of crowdsourcing as a means of collaboration and the implications of crowdsourcing for interaction design. In 2011
International Conference on Collaboration Technologies and Systems (CTS) (pp. 397–403). Presented at the 2011 International Conference on Collaboration
Technologies and Systems (CTS). doi:10.1109/CTS.2011.5928716
Petrie, H., & Power, C. (2012). What do users really care about?: a comparison of usability problems found by users and experts on highly interactive
websites. In CHI ’12 Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems (pp. 2107–2116).
Sein, M. K., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action Design Research. MIS Quarterly, 35(1), 37–56.
Shirk, J. L., et al. (2012). Public Participation in Scientific Research: a Framework for Deliberate Design. Ecology and Society, 17(2), Article 29.
Vershbow, B. (2013). NYPL Labs: Hacking the Library. Journal of Library Administration, 53(1), 79–96.
Wiggins, A. (2012). Crowdsourcing scientific work: A comparative study of technologies, processes, and outcomes in citizen science (Ph.D.). Syracuse
University, United States -- New York.
Zhao, Y., & Zhu, Q. (2012). Evaluation on crowdsourcing research: Current status and future direction. Information Systems Frontiers.