Prototype Evaluation as a Strategic Tool

"The Connect Gallery: Using prototype evaluation as a strategic tool in the design process of interactives", presented at the VSG Summer School Conference in Edinburgh, June 2007 & a VSG Seminar in Cardiff, November 2007


The Connect gallery – using prototype evaluation as a strategic tool in the design process of interactives

Introduction

This paper introduces the prototyping process and its benefits in the design process of interactives. Through a case study from National Museums Scotland, it shows how going to the trouble of prototyping can really pay off, by not ending up with an interactive that no one likes or understands.

Background to Connect

Connect is a permanent interactive science and technology gallery, situated on the ground floor of the National Museum of Scotland. The overall theme of the gallery is Creativity, Discovery and Innovation, inspired by the science and technology icons selected for the gallery. These objects are key examples of innovation and creativity in their time, and provide inspiration for discovery and innovation in the future. One such example is Dolly the Sheep, a key icon of twenty-first-century scientific achievement.

Within the gallery there are five sub-topics: genetics (‘Me2’), robotics (‘Robots’), space (‘Blast Off’), energy and power (‘Power Up’) and transport (‘Move It’). A series of interactives is co-located with the relevant objects for each topic, providing visitors with opportunities to interact physically, emotionally and mentally through challenges, puzzles and physical activities. The objects serve as a focus for the interactives.

There are a total of 19 interactives in the gallery, of which about half are mechanical and half are ICT interactives, as well as touch-screen information stations around the gallery. Seven of the ICT interactives and the information stations, all of which were specially designed for the gallery, underwent prototyping during the design process.

Figure 1: Connect gallery floor plan showing the five sub-topics
The Prototyping Process

First, the ICT developers produced the first prototypes. These mainly just contained the ‘core interaction’ of the interactives, designed to get the main messages and navigation across. Most graphics were rough ‘placeholders’ in case substantial redesign was needed, and paper storyboards alone were not acceptable.

The next step was to test the prototypes in the museum to see how visitors responded to them. I’ll say a bit more about this step in a moment, but basically it consisted of two parts – observation, followed by a short interview.

The responses from the testing were then analysed and fed back to the developers. This consisted of a written report and a face-to-face meeting. The reports detailed problems with the functionality of the prototypes and suggestions for improvements, and also covered any content or design issues that needed attention.

Finally, the developers made the relevant changes to the prototypes according to the feedback. Then the process started again.

Figure 2: The Prototyping Process
Each interactive went through three phases of evaluation before final delivery. The second round of prototypes incorporated recommendations from the earlier feedback, and most sections incorporated final graphics or graphic styles so the museum could confirm it was happy with them. The final version was slightly different, in that the testing-with-visitors step was replaced with robustness testing, where basically all available staff tested the interactives for bugs, typos and other weaknesses.

We worked to a very tight schedule. We had one week per prototype interactive for the testing, and a further week to gather the results, with the feedback meetings with the developers taking place two weeks after submission of the relevant prototypes. As I said before, seven interactives and the information stations were tested and evaluated, making a total of eight test subjects. We worked in three four-week blocks, testing two prototype interactives per week. The ICT developers were paid by the museum in stages, which tied into the delivery phases of the prototypes for evaluation.

Testing the Prototype Interactives

I want to say a little more about the step where we tested the prototypes with visitors. As I said, it consisted of two parts. After approaching visitors in the museum and inviting them to take part in testing, we observed them trying out the prototype interactives to find out what seemed to be working or not working. Specific things the observers were looking out for included whether visitors seemed to have any problems with the instructions or with understanding what they were meant to do, as well as difficulties with the navigation.

The observation was followed by a short interview with the relevant visitor. If a group of visitors was being observed, then just one member of the group was interviewed. The sample size for each prototype interactive was 20 groups/interviewees.
Questions asked in the interviews included:

- What did you think the interactive was about?
- Did you find out anything new you didn’t know before?
- How would you describe this interactive to a friend?
- What did you like most/least about the interactive?

as well as practical things such as how easy or difficult they found the language to understand, and their opinion on the length of the interactives.

1. Invite visitors to take part in testing
2. Observe visitors using the prototype interactives
3. Short interview with visitors to get responses/reactions to the prototype interactives

Figure 3: Steps in testing the prototype interactives
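The responses gathered in these steps were then analysed into the percentage summaries reported later in this paper. Purely as a hypothetical illustration of that kind of tally (the record fields and criteria below are invented for the sketch, not the museum's actual analysis tooling), in Python:

```python
# Hypothetical sketch of how interview responses from one round of
# prototype testing could be tallied into percentage figures for the
# feedback reports. The field names are illustrative only.

def tally(responses, criterion):
    """Percentage of respondents for whom `criterion` holds, rounded."""
    hits = sum(1 for r in responses if criterion(r))
    return round(100 * hits / len(responses))

# One record per interviewee; a real round had 20 groups/interviewees.
responses = [
    {"mentioned_gm": True, "language_difficult": False, "had_problem": True},
    {"mentioned_gm": False, "language_difficult": True, "had_problem": True},
    # ... 18 more records in a full testing round
]

summary = {
    "mentioned GM or genetic modification": tally(responses, lambda r: r["mentioned_gm"]),
    "found the language difficult": tally(responses, lambda r: r["language_difficult"]),
    "had a problem using the interactive": tally(responses, lambda r: r["had_problem"]),
}
print(summary)
```

With a full round of 20 records, percentages of this kind correspond to the figures quoted in the feedback tables below.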
Why Prototype?

So what are the benefits of prototyping? Well, it gives developers a chance to experiment with new and different designs, and it formalises the process by which the museum feeds back to the developers – feedback is sent a fixed number of times, and in a written document everyone agrees on.

More importantly, the museum can ensure that visitors
- understand the interactive (understanding)
- are able to operate the interactive (ergonomics), and
- enjoy the interactives (enjoyment).

Case Study: GM Foods

So, to end with, I have a case study from our testing for the Connect gallery that exemplifies how going to the trouble of prototyping can really pay off, by not ending up with an interactive that no one likes or understands.

The ICT interactive I would like to present to you focuses on the GM food debate. It has a 17-inch touch screen; the prototype, however, was operated by mouse. The interactive has an introduction explaining GM, followed by a game-play-based challenge, and then an opportunity to vote on some topical GM issues at the end. The key messages the interactive is intended to convey are the many different purposes of GM technology, and that society must decide how best to use it in the future – i.e. a focus on the ‘pros and cons’ of GM and the surrounding debate.

The target audience for the interactive is schools and families with children aged 11 and above, as well as independent adults. I would like to focus on the middle part of the interactive, the game-play-based challenge.[1]

In the first prototype, visitors were given three scenarios to choose from – to create foods that contain medicine, crops resistant to weedkiller, or foods that last longer on the supermarket shelf. On choosing a scenario, visitors got some more information about that scenario and were then invited to accept their challenge. On accepting their challenge, visitors then had to choose a plant to modify.
When individual plants were selected, three characteristics popped up for each plant, which visitors had to ‘accept’ or ‘decline’ based on the information they had been given on the previous screen. Once visitors had identified the correct plant for their challenge, they could then modify it.

Feedback from the testing of the first prototype

So, did it work? Overall, visitors seemed to like the interactive and enjoy the experience, but that seemed to be mainly restricted to the colourful graphics, and perhaps the novelty of being involved in the testing and getting a special preview. The understanding and ergonomics showed a different picture.

[1] Please note that the screenshots from the case study prototypes referred to in the original presentation at the VSG Summer School were not available for inclusion in this paper.
As I already said, most of the visitors seemed to enjoy themselves, although half of them found something they disliked. In terms of understanding of the key messages, only 55% mentioned GM or genetic modification, and no one mentioned anything about choice or debate. 40% found the language either quite difficult or very difficult, and overall only 30% said the interactive had taught them a lot.

The most significant result was in terms of ergonomics – 60% had a problem using the interactive, ranging from initial difficulties in understanding the challenge to not really understanding the interactive at all, or why they were getting answers right or wrong.

Enjoyment: 50% found something they disliked
Understanding: 55% mentioned GM or genetic modification; 0% mentioned choice or debate; 40% found the language quite or very difficult; 30% said it taught them a lot
Ergonomics: 60% had a problem using the interactive

Figure 4: Feedback from the testing of the first prototype

Prototype redesign

We decided to go back to the drawing board and completely revisit the game-play of this interactive, concentrating more on ‘choice’ and ‘debate’.

The new prototype showed a card game after the introductory screens. It gives the visitor 10 pairs of picture cards, which they have to turn over two at a time and try to match. On correctly matching a pair, a fact about GM food appears on screen relating to the matched picture. The picture is then assigned to either a ‘Pro’ or ‘Contra’ column (unfortunately, this screenshot does not show the columns, as no cards have been turned over yet).

When we tested this new prototype, we found a great improvement in both understanding and ergonomics.

Only 45% found something they disliked, slightly fewer than before. In terms of understanding, however, the results differed significantly. All visitors mentioned GM or genetic engineering, and half of them mentioned it was about the pros and cons of GM food.
Only 10% found the language quite difficult, a figure that included foreign visitors whose first language was not English. No one found the language very difficult. A total of 80% said they had learned something new. In terms of ergonomics, the results also differed significantly – only 10% had any difficulties using the interactive.

Enjoyment:
  Prototype 1: 50% found something they disliked
  Prototype 2: 45% found something they disliked
Understanding:
  Prototype 1: 55% mentioned GM or genetic modification; 0% mentioned choice or debate; 40% found the language quite or very difficult; 30% said it taught them a lot
  Prototype 2: 100% mentioned GM or genetic modification; 50% mentioned it was about the pros and cons of GM foods; 10% found the language quite difficult; 80% said they had learned something new
Ergonomics:
  Prototype 1: 60% had a problem using the interactive
  Prototype 2: 10% had a problem using the interactive

Figure 5: Feedback from the testing of the second prototype

Conclusion

However, we still hadn’t got it quite right. Half of the visitors felt the game was too long, and half of them clicked on the cards before the cards were ready to turn over. We therefore recommended to the developers that the number of card pairs be reduced, and that the cards be made to react more quickly.

The final version has 8 pairs of picture cards instead of 10, and you can also see the ‘pros and cons’ columns in this screenshot, which we renamed ‘for and against’. The cards also respond much faster, which I can’t show you here – but if you’re ever in Edinburgh, I recommend you visit the Connect gallery and try it out for yourself. Thank you.

Jenni Fuchs
Visitor Studies Officer
National Museums Scotland
June 2007