Anarchi report

Human-Computer Interaction: final report

Group 1: AnarCHI
http://anarchi11.blogspot.com/
http://apps.facebook.com/datamineralpha/

Sam Agten, 1st master computer sciences
Bart Gerard, 1st master computer sciences
Tanguy Monheim, 1st master computer sciences
Steven Vermeeren, 1st master computer sciences

Abstract

This is the final report for the course on human-computer interaction by group 1. We have developed a social news application called "Dataminer". The most important challenges were finding a decent concept, attracting enough users and creating a GUI that is easy to use. From this project we have learned the importance of using an iterative process while developing a user interface and the importance of listening to user feedback rather than just pursuing our own ideas.
Contents

1 Concept
2 Storyboard
  2.1 Final Storyboard
  2.2 Alternatives
  2.3 Evolution
  2.4 Thoughts
3 Screen-transition-diagram
  3.1 Final screen-transition-diagram
  3.2 Evolution and Alternatives
4 Iterations
  4.1 Paper iteration 1
    4.1.1 Goals
    4.1.2 Instruments, results and conclusions
  4.2 Paper iteration 2
    4.2.1 Goals
    4.2.2 Instruments, results and conclusions
  4.3 Paper iteration 3
    4.3.1 Goals
    4.3.2 Instruments, results and conclusions
  4.4 Digital iteration 1: 09/04 - 24/04
    4.4.1 Goals
    4.4.2 Instruments
    4.4.3 Results and observations
    4.4.4 Conclusion
  4.5 Digital iteration 2: 25/04 - 03/05
    4.5.1 Goals
    4.5.2 Instruments
    4.5.3 Results and observations
    4.5.4 Conclusion
  4.6 Digital iteration 3: 04/05 - 09/05
    4.6.1 Goals
    4.6.2 Instruments
    4.6.3 Results and observations
    4.6.4 Conclusion
  4.7 Digital iteration 4: 10/05 - 23/05
    4.7.1 Goals
    4.7.2 Instruments
    4.7.3 Results and observations
    4.7.4 Conclusion
5 Result
6 Conclusion
7 Appendix
  7.1 Answers to questionnaires
  7.2 Usage statistics
  7.3 Amount of viewers per day
  7.4 Time spent
1 Concept

What is Dataminer? We have developed an application called "Dataminer". Dataminer is a Facebook application in which the user is represented by an avatar and can mine minerals containing news articles. The hardness of the mineral represents the value of the news article behind it (e.g. a diamond mineral contains a more valuable news article than a talc mineral). Every news article is grouped in one or several categories (e.g. sport, politics, etc.). The value of a news article is determined by the number of friends who liked articles from the categories the article hails from. This means that a player has the ability to rate an article (like/dislike/meh) after mining it.

Why Dataminer? We think Dataminer offers a fun and new way to explore the news. The user can choose to mine articles liked by his friends (hard minerals) or can deviate and mine articles disliked by his friends. Dataminer also offers a reward system (badges and high scores) to keep the user interested.

How does Dataminer relate to other applications? Dataminer could be compared to other mining games such as Dig It [1]. The only application we found that even comes close to our idea is HearSay [2], which is still being developed. HearSay will also let people rate articles, show them which articles their friends liked, and use a badge system. The difference is that HearSay does not use a "mining" concept.

Which alternatives have we considered? We also considered making a more generic social news application, which would allow users to rate and comment on news articles gathered from different sources. We decided to go with Dataminer instead because we believed it to be more exciting to work on and more original.

Strong suits/weak suits? Dataminer's biggest flaw is also its greatest strength. It is an original idea and, by extension, a bit of a gamble. Whether people like digging for their news or would rather just read it without all the hassle remains to be seen.

[1] http://appadvice.com/appnn/2009/08/review-i-dig-it/
[2] http://www.newsgaming.de/2010/10/a-social-news-game/
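To make the value idea above concrete, here is a minimal sketch of how an article's value and its mineral could be derived from friends' likes. All names and the Mohs-style hardness list are assumptions made for this illustration; this is not the actual Dataminer code.

```python
# Minimal sketch of the article-value idea described above.
# All names and the hardness scale are assumptions, not the Dataminer implementation.

HARDNESS_SCALE = ["talc", "gypsum", "calcite", "fluorite", "apatite",
                  "feldspar", "quartz", "topaz", "corundum", "diamond"]

def article_value(article_categories, friend_likes_by_category):
    """Value = number of likes the user's friends gave to the article's categories."""
    return sum(friend_likes_by_category.get(cat, 0) for cat in article_categories)

def mineral_for(value, max_value):
    """Map an article value onto the talc..diamond scale (harder = more valuable)."""
    if max_value <= 0:
        return HARDNESS_SCALE[0]
    index = round((value / max_value) * (len(HARDNESS_SCALE) - 1))
    return HARDNESS_SCALE[min(index, len(HARDNESS_SCALE) - 1)]

# Example: friends liked 3 sport articles and 1 politics article.
likes = {"sport": 3, "politics": 1}
value = article_value(["sport", "politics"], likes)  # -> 4
print(mineral_for(value, max_value=4))               # -> "diamond"
```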
2 Storyboard

2.1 Final Storyboard

Fig. 1: Game screen
Fig. 2: Article screen

To play DataMiner, the user has to log in on Facebook and select the DataMiner application. The user enters the game screen (Figure 1). He can move across the different layers in search of new minerals. When a mineral is found, an article from a category that is as valuable as the mineral is shown (Figure 2). The user can like, dislike or meh the article. This will result in a different value for this category.

Fig. 3: Statistics screen
Fig. 4: Badges screen

The value of the different articles can be viewed by opening the statistics screen (Figure 3). The user sees how much he likes the different categories, and the combined interests of the user and his friends are also shown. By pressing on a category, the user is able to view more refined statistics for that category. To motivate users to return to our application again and again, a reward system has been built. When selecting the badge icon from the menu, the user gets an overview of all his achievements so far (Figure 4). By hovering over a badge, the user gets information about this reward.
2.2 Alternatives

The friend menu from an earlier storyboard did not make the final release because of our lack of time. When a user selects the friend symbol, he now gets a Facebook invitation screen instead of a friend overview page.

2.3 Evolution

Through the different iterations the mini game has changed quite a lot, from an initial Super Mario idea to the current game. Early changes towards a simplified concept were made out of fear for the difficulties in implementation, but later on we changed back to a less safe concept to broaden our horizon. The older storyboards were rough and unrefined, just to get a quick idea. More recent storyboards got more detail to fill in the final gaps.

2.4 Thoughts

A storyboard is a very powerful tool to visualize the different screens of an application. It makes an idea more tangible for the different members of the group. Initial differences can be found quickly and reorienting the different team members towards the same realization becomes possible early on. Still, there are some disadvantages. Once a storyboard is made, it becomes really difficult to deviate from it. This can be bad if the placement of certain components isn't that good and the developer does not want to return to the drawing table. Another disadvantage is the fact that not everything can be expressed with a storyboard. For example, the workflow between the different screens is really difficult to capture without the use of additional text or a screen-transition diagram. It would also be difficult to simulate a highly interactive application using a storyboard (such as a fully fledged game).
3 Screen-transition-diagram

3.1 Final screen-transition-diagram

Fig. 5: Screen-transition-diagram

Figure 5 shows the screen-transition diagram of our last iteration. Central in our application is the game element. When starting up the application, the user enters this state. He is able to move around either with a single mouse click or with the arrow keys. When pushing the refresh button, the underground is regenerated and the user can start fresh. If the miner touches a mineral, the user enters the article state. Here he can read the abstract of an article and, if desired, the full article. The user can then rate the article or close the article screen. At all times the user is able to use the side menu. When pushing the help button, the user gets an explanation of the application and a legend of the minerals. When desired, the user can check his badges by selecting the badge button. Some valuable statistics about the user's interests can also be checked; by selecting a certain category, the user is guided to detailed statistics about its subcategories. From the menu it is also possible to select the friend button, after which a default Facebook screen is shown. After selecting the friends he wants to invite, the user returns to the previously visited page. At all times the user can close a screen, returning him to the game screen.

3.2 Evolution and Alternatives

In the previous diagram we wanted to show an explanation pop-up with the adjustments in statistics every time an article was rated or closed. Because this could become rather annoying, it was banned from the final iteration. In this iteration we also had to let go of the friend menu because of the lack of time. This menu was meant for the social interaction between the different users and would have put the emphasis on the "social" part of the news game.
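As a side note, the transitions in Figure 5 can be summarised as a small transition table. The sketch below is only an illustration of the diagram; the screen and event names are assumptions made for the example and are not taken from the Dataminer code.

```python
# Screen-transition table sketched from Figure 5 (names assumed, illustrative only).

# Side-menu buttons are available from every screen and open their own screen.
MENU_EVENTS = {"help": "help", "badges": "badges",
               "statistics": "statistics", "friends": "add_friends"}

# Screen-specific transitions: (current screen, event) -> next screen.
TRANSITIONS = {
    ("game", "touch mineral"): "article",
    ("game", "refresh"): "game",                  # regenerate the underground
    ("article", "like"): "game",
    ("article", "dislike"): "game",
    ("article", "meh"): "game",
    ("statistics", "select category"): "detailed_statistics",
    ("add_friends", "send request"): "game",      # simplified: the app returns to the previous page
}

def next_screen(current, event):
    if event in MENU_EVENTS:        # the side menu can be used at all times
        return MENU_EVENTS[event]
    if event == "close":            # closing any sub-screen returns to the game
        return "game"
    return TRANSITIONS.get((current, event), current)

print(next_screen("game", "touch mineral"))  # -> article
print(next_screen("badges", "close"))        # -> game
```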
4 Iterations

In the paper iterations we did not couple our conclusions one-on-one to our goals. Afterwards we learned we should have done this. However, we want to show the data as we used it to make our conclusions, so for the paper iterations we will not indicate which conclusion matches which goal. In the digital iterations we did link our conclusions to our goals; there we will always use matching numbers to indicate which conclusion matches which goal.

4.1 Paper iteration 1

When we made our first paper prototype, the concept of the game was still different from what it would end up being. Concerning the interface, however, there are many similarities, so the results of the iteration were still used in further iterations.

4.1.1 Goals

Due to the experimental nature of paper prototypes, we decided that for a first prototype any idea that seemed good would be used and tested. As a result, the goals for testing were rather general. We wanted to see if:
  • those "first ideas" were indeed good enough for users to use them in the intended way
  • the concept had any potential.

4.1.2 Instruments, results and conclusions

We tested the prototype with only three classmates, but managed to get some results. The most important conclusions were:
  • About the interface:
    – Navigating around in the game was unclear: people tried many different ways (such as drag-and-drop), while we wanted them to click on a mining shaft to move there.
    – The fact that a user could earn money made them believe they would also be able to buy things, which was not the case.
    – The use of tabs confused people; they were looking for a button to close windows.
  • About the concept:
    – This was met with a lot of negative criticism. People found it unclear what the goal and motivation for playing were. The concept also neglected any "value" of news items, which made it look like it shouldn't include news at all.
4.2 Paper iteration 2

For our second paper prototype we revisited our concept and came up with our final concept. Figure 6 shows what this paper prototype looked like. The new concept solved a few of the problems with the previous interface by itself:
  • We made the mining more interactive, giving the user several different options to navigate around the mine.
  • We abandoned the scoring system, so the money problem was gone.
  • The issue with the tabs remained. We decided not to change this yet, because we felt it might be an issue only due to the nature of paper prototypes.

Fig. 6: Second paper prototype

4.2.1 Goals

Once again, we didn't have very specific goals. We mainly wanted to:
  • check if the new concept was more attractive than the old one
  • see which problems users experienced with the updated interface
4.2.2 Instruments, results and conclusions

We performed tests with five computer science students in their early twenties. Although this is not our target audience, they were the easiest for us to get access to for performing the tests. These were the results:
  • About the concept:
    – The concept was still vague when people first started.
    – People wanted to be able to exclude friends from being taken into account when calculating the value of articles.
  • About the interface:
    – It was not clear what the button to change the settings of the statistics page meant.
    – It was unclear what rating an article did; the user got redirected to the game without any further information.
    – We provided an option to change the avatar of the miner, but people did not easily find it.
4.3 Paper iteration 3

With the results of the previous iteration in mind, we made the following changes:
  • We added an introduction screen with a brief explanation, to be displayed the first time a user starts the game.
  • We added a button in the friends list to be able to turn friends on and off.
  • We changed the settings button to have a traditional settings icon.
  • We added a screen that, after rating an article, showed its impact on the statistics.
  • We did not change the avatar option, but rather decided to keep it as a bonus for more experienced players.

Additionally, we abandoned the tabs and added close buttons to all but the main application screen. The resulting prototype can be seen in figure 7.

Fig. 7: Third paper prototype

4.3.1 Goals

We wanted to test if:
  • the performed changes had had their intended effect
  • there were any other, up to now unknown, problems
4.3.2 Instruments, results and conclusions

This iteration was tested with housemates of the team members, which gave us our first input from non-computer scientists. A total of 8 people tested the prototype, with ages varying from 18 to 23 and with mixed computer and Facebook experience. These were the most important conclusions:
  • Changes:
    – The text in the introduction screen was not clear enough.
    – The button to indicate which friends are taken into account for the value of articles – a green/red "stoplight" – was causing confusion. It had to be explicitly explained before people understood its purpose.
    – One person thought the closing button for a sub-screen would close the entire application.
  • Further issues:
    – The button to rate an article as neutral was not clear.
    – Using the terms like/dislike for rating articles caused wrong usage: interesting articles about bad events were being rated negative, while for our system they should be rated positive.
    – Some people wanted more specific statistics available.
4.4 Digital iteration 1: 09/04 - 24/04

In this first digital iteration we stripped down our application to the bare-bones essentials: the mining and rating of articles. The point was to release as soon as possible, a baptism of fire. More specifically, we used dummy sprites to represent the miner, left out the sidebar and used a set amount of dummy articles instead of real-life ones. This way we were able to test the absolute core of our application. Unfortunately we don't have screenshots of how our application looked in the first iteration.

4.4.1 Goals

In this iteration we expected to find out:
  1. Whether the user could easily navigate the miner (by clicking somewhere on the map, or using the keyboard).
  2. Whether the user mined and rated articles (How many articles does a user mine? Do they mine articles at all, or just avoid the minerals? Do users just click dislike all the time? etc.).
  3. What the overall feeling of the users was when using our application. Is it fun? Were they appealed by the idea or rather appalled?

4.4.2 Instruments

We first tested our application offline on 6 users, randomly chosen people we know. Afterwards we released the application on Facebook. None of the users had to go through a scenario: since we didn't have control over who tests our application, we didn't put constraints on this. We benchmarked our goals as follows (numbers coincide with the goal numbers):
  1. We expected users to navigate between articles using the mouse and/or keyboard. To track this, we measured the average number of mouse clicks and keystrokes between articles. We expected an average of two mouse clicks between articles and an average of 25 keystrokes. We chose these numbers because it is possible to reach an article in one mouse click and the average distance between articles is 15 spaces. We give users a 70% error margin and round up.
  2. We expected an even distribution among the three possible ratings. To measure this we kept track of how many times a user clicked like/meh/dislike after mining an article.
  3. To test the overall feeling the users had when using the application we used a questionnaire based on the CSUQ test. Scores on the questionnaire range from 1 to 5. We want 50% of the users to rate 4 or 5 on questions related to whether they like the application.

4.4.3 Results and observations

Results for the goals are listed below, with numbers coinciding with the previous sections:
  1. Results show an average of 2.45 mouse clicks. The keyboard was severely underused, so it is hard to speak of an average number. This is visible in figure 8.
  2. The number of times users clicked on one of the four buttons displayed when rating an article (like/meh/dislike/close) is shown by the graphs in figure 9. These ratings seem to be evenly spread. We did notice that more users added the application on Facebook than mined at least one article, which means not all users mined an article. Later on we found out this was because Safari didn't submit the results to our database, so it wasn't a problem visible to the users.
  3. General feedback and the questionnaire showed us that:
    - The users weren't pleased with the graphics.
    - Users asked for a legend of the different minerals.
    + Most users were moderately satisfied with the application and hope to see more soon.
    +/- The opinions about the concept are still divided.
  We refer to appendix 7.1 for a view of how the application scored on all the different questions and a more detailed view of the user input.

Although this is not linked to one of our goals, we did notice something strange: some users allowed access to our application but didn't use it afterwards. Later on we found out this was a bug in our tracking system: the Safari browser didn't submit data to our database. As this only affects a small group of users, fixing this isn't one of our priorities.

Fig. 8: Amount of clicks and keystrokes each user needs to reach an article
Fig. 9: What users selected after reading an article ((a) absolute amounts per user, (b) overview of the totals)
4.4.4 Conclusion

We had a small number of test subjects, so we will continue tracking what we described in section 4.4.2. We will only mention these measurements in the following iterations when the findings there differ from our conclusions here. The conclusions for each observation can be found below (numbers coincide with the previous sections):
  1. We feel an average of 2.45 mouse clicks is acceptable. The fact that the keyboard was underused isn't really problematic. We won't remove the keyboard controls because we still think of them as a nice extra.
  2. The ratings seem rather evenly distributed, but no hard conclusions can be drawn with only 9 unique users on Facebook. Since users appear to be able to rate, and thus mine, articles, we don't think it is necessary to change the interface.
  3. People were generally displeased with the look of the application, which didn't really come as a surprise as we hadn't put effort into the graphics yet. We will improve these.
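To make the instrumentation from section 4.4.2 more tangible, here is a sketch of how the click/keystroke averages and the rating distribution could be computed from a simple event log. The event names and log format are assumptions made for this illustration; this is not the tracking code we actually ran.

```python
# Sketch of the tracking described in section 4.4.2 (assumed event names and
# log format, not the actual Dataminer instrumentation).
from collections import Counter
from statistics import mean

def input_per_article(events):
    """Average mouse clicks and keystrokes a user needed between mined articles."""
    clicks = keys = 0
    clicks_per_article, keys_per_article = [], []
    for event in events:
        if event == "click":
            clicks += 1
        elif event == "keystroke":
            keys += 1
        elif event == "article_mined":
            clicks_per_article.append(clicks)
            keys_per_article.append(keys)
            clicks = keys = 0
    return mean(clicks_per_article), mean(keys_per_article)

def rating_distribution(ratings):
    """How often users chose like / meh / dislike / close after reading an article."""
    return Counter(ratings)

# Illustrative log for one session.
log = ["click", "click", "article_mined", "keystroke"] * 3 + ["click", "article_mined"]
print(input_per_article(log))                                  # -> (1.75, 0.5)
print(rating_distribution(["like", "meh", "dislike", "like"]))
```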
4.5 Digital iteration 2: 25/04 - 03/05

In this iteration we released new functionality and improved the looks of the application to try to solve the problems we saw in iteration 1. Unfortunately we don't have a picture of how the application looked in this iteration.

4.5.1 Goals

  1. The number of users in the previous iteration was problematic. We will try to attract more users by improving the graphics (replace the dummy sprite, add better graphics for minerals, etc.) and by adding a badge reward system. The combination of both will hopefully attract more users.
  2. Some users commented in the previous iteration that they didn't really understand which mineral stands for which article. We will try to solve this by adding a legend.

4.5.2 Instruments

Again we released on Facebook. We hope that improving the application will get us enough users, so we won't use additional ways of gathering users. We benchmarked our goals as follows (numbers coincide with the goal numbers):
  1. We will simply measure success by counting the number of users that used our application. We would like at least 25 users for this iteration.
  2. We will monitor whether or not there is improvement by using the same questionnaire from the previous iteration and seeing if we made any progress. We want 50% of the users to rate 4 or 5 out of 5 on how easy it is to find information.

4.5.3 Results and observations

Results for the goals are listed below, with numbers coinciding with the previous sections:
  1. For this iteration we again had a total of 9 unique users.
  2. 40% of the users rated 4 or 5 out of 5 for how easy it is to find information.

Fig. 10: Comparison between how easy users found information in iterations 1 and 2
4.5.4 Conclusion

The numbers of our conclusions coincide with the previous sections:
  1. Nine users indicates no improvement whatsoever. We will have to find a better way to attract more users.
  2. The achieved 40% is not the 50% we wanted, but it is good enough. The other users rated the help function very low; in other words, the opinions about our new help function are divided between extremes. We would like to know the concerns of the people who gave us a low score for the help function. In any case, as indicated by the people who did like the new help function, we did manage to help at least a good portion of our users. In that respect we consider the new help function an improvement. Further improvement of the help function is a possibility for a next iteration, but certainly not a top priority. For now we will leave the help function as it is.
4.6 Digital iteration 3: 04/05 - 09/05

In this iteration we again tried to attract more users by expanding on the existing application. You can see how the application looked in figure 11.

Fig. 11: How the application looked in iteration 3

4.6.1 Goals

  1. We will try to attract more users. We will do this by further improving the graphics and by giving the users insight into how we provide their articles by showing them statistics. This idea came from the results of Section 4.3.2.
  2. We will try to make users stay longer. The improvements of 1. should also improve this.

4.6.2 Instruments

We will release another version of our application on Facebook. To help with finding users, we also asked professor Erik Duval to tweet about our application, besides the other ways of recruiting people we were already using (word-of-mouth advertising, posting on Facebook, etc.). We benchmarked our goals as follows (numbers coincide with the goal numbers):
  1. We will measure success by measuring the number of users. We expect at least 25 users.
  2. We will use Google Analytics to monitor how long users are on our site. We want them to stay 2 minutes on average on our application; we feel this is the time it takes to read approximately 2 articles.
4.6.3 Results and observations

Results for the goals are listed below, with numbers coinciding with the previous sections:
  1. For this iteration we had a total of 7 unique users.
  2. Users stayed an average of 7.5 minutes on our application. This varied a lot per day, as seen in figure 12.

Fig. 12: Average time users spend on our site per day

4.6.4 Conclusion

Again, the numbers of our conclusions coincide with the numbers above:
  1. 7 users is in fact a decline from our previous iteration. Even though this was a shorter iteration, we will have to drastically increase this number in our next iteration. We will do this by making it possible for users to invite their friends.
  2. We were very pleased with the result of 7.5 minutes. The users that do use our application certainly like it.
4.7 Digital iteration 4: 10/05 - 23/05

In this iteration we again tried to attract more users for our application, this time by letting our users invite more users. The layout of our application didn't change compared to the previous release; we only made the add-friend button work. This can be seen in figure 13.

Fig. 13: How the application looked in iteration 4

4.7.1 Goals

  1. We will add functionality to invite friends. This will hopefully provide us with more users.
  2. Due to the low number of users in the previous iteration, we will check whether problems occurred with the interface. We want to be certain users can mine and rate articles. If there is a problem here, this could be a cause of the low number of users.

4.7.2 Instruments

Again we release our new version on Facebook. We benchmarked our goals as follows (numbers coincide with the goal numbers):
  1. We will measure the number of users; we expect at least 25 users.
  2. We will again measure the average number of mouse clicks, the average number of keystrokes and the number of times a user clicked like/dislike/meh/close. We will use the same benchmark as mentioned in Section 4.4.2 and monitor whether there is any significant change.

4.7.3 Results and observations

Results for the goals are listed below, with numbers coinciding with the previous sections:
  1. We finally managed to achieve our goal: we had 29 users using our application.
  2. The basic controls are working nicely. All users manage to read and rate articles, and the ratings are evenly spread. They do use a few more clicks, but all of them manage to read articles. These results can be found in figures 14 and 15. The user with lots of clicks is the same one as in iteration 3; he probably just likes walking through the underground without reading articles.
Fig. 14: Average amount of clicks and keystrokes each user needs to reach an article
Fig. 15: What users selected after reading an article ((a) absolute amounts per user, (b) overview of the totals)

4.7.4 Conclusion

We are pleased with the results of this iteration. Everything worked as we wanted, and since we achieved our pre-set goals, no changes are required.

There do seem to be problems left, but these are not related to our goals. If we look at the questionnaires, users don't find our application effective for finding articles and they don't become more interested in the news. This is partially because the functionality to select articles isn't completely working as planned. We didn't invest time in this earlier on, because there were more urgent matters to solve. If we had more time, we could also try to improve those parts of the application in a new iteration.
5 Result

The resulting application somewhat differs from our initial idea. We didn't get to add the friend functionality in the way we really wanted, or even implement the actual relation between the minerals and the ratings given by users, which seemed like "core functionality" at the time. Instead we added what users wanted. We are quite pleased with the end result. We ended up with 29 users who all played our game for a decent amount of time; in total we had 49 users, excluding ourselves. Some of these people are now starting to ask for more: more badges, the addition of high scores, sound effects, etc. It didn't really occur to us that some people might actually enjoy mining articles without an added agenda. These people seem to enjoy clicking on articles and liking or disliking them even if this doesn't really contribute to anything (cf. the anecdote mentioned in class about the application where users were clicking on a rubber ducky to make it take a dive). They seem to like the simple pleasure of earning badges and chopping away at nothing in particular. Even though we might not have achieved our ultimate goal (making people more interested in news), we did succeed in creating a small game that offered these people a simple (and short) respite from their daily lives. Between all the statistics and questionnaires, and after all is said and done, that seems like a victory in itself.

6 Conclusion

Personally we are quite satisfied with our Dataminer application. A continuous problem was the low number of users: during all of our iterations (except the last) we had just a small number of users. We should have tried more aggressive techniques to attract users from the start. Working on Dataminer has taught us the importance of using an iterative process when developing a user interface. Starting with the bare-bones essentials and adding features based on user requests, and not just on what you have planned, seems like the ideal way to work on an application (however time-consuming it may be). This course has also taught us the importance of a decent user interface (in which "decent" is judged by the users, not the creators) and how small changes (like adding a button to invite friends) can have a major impact, while big changes (adding graphs and badges) might not. One of our fellow students commented that by blogging about our graphics we might have set expectations a little too high. This seems like a valid point.
7 Appendix

7.1 Answers to questionnaires

Below are the answers users gave on our questionnaires. In the first iteration 10 people answered the questionnaire, in the second iteration 5 people filled it in, in the third iteration only 2 people answered, and in the fourth iteration 11 people answered. We didn't draw any conclusions from the questionnaire in the third iteration, and since not that much changed between the third and fourth iteration, we take these 13 people together as one group.

In the graphs we want to show the evolution of what people think about our program. To be able to compare the different iterations, we have chosen to show what percentage of the users gave a certain answer, instead of absolute numbers. Each graph shows how many users answered 1 (worst rating), 2, 3, 4 or 5 (best rating) on our questions. Iteration 1 is shown in red, iteration 2 in green and iteration 3 in blue. The graphs can be found in figures 16 and 17.

7.2 Usage statistics

We have monitored how many clicks and keystrokes each user needs to mine an article. We also monitored which option they select after reading the article. In the figures below you can find this for each iteration.

In iteration 2 we left out a user that constantly clicked dislike. He read lots of articles, but disliked all of them. This made the percentage of dislikes seem very high, but it was only caused by one user.

All the graphs can be found in figures 18 to 25.
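The percentage normalization used for figures 16 and 17 (described in section 7.1) amounts to dividing each answer count by the number of respondents in that iteration. The sketch below illustrates this with made-up answers; it is not the real questionnaire data.

```python
# Illustration of the per-iteration percentage normalization from section 7.1
# (made-up answers, not the real questionnaire data).
from collections import Counter

def answer_percentages(answers, scale=(1, 2, 3, 4, 5)):
    """Share of respondents per rating, so iterations of different sizes compare fairly."""
    counts = Counter(answers)
    return {rating: 100.0 * counts[rating] / len(answers) for rating in scale}

iteration1 = [3, 4, 2, 5, 3, 4, 1, 3, 4, 5]   # 10 respondents
iteration2 = [4, 5, 3, 4, 2]                  # 5 respondents
print(answer_percentages(iteration1))
print(answer_percentages(iteration2))
```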
Fig. 16: Answers to our questionnaires part 1
Fig. 17: Answers to our questionnaires part 2
Fig. 18: Amount of clicks and keystrokes users need to reach an article in iteration 1
Fig. 19: What users selected after reading an article in iteration 1 ((a) absolute amounts per user, (b) overview of the totals)
Fig. 20: Amount of clicks and keystrokes users need to reach an article in iteration 2
Fig. 21: What users selected after reading an article in iteration 2 ((a) absolute amounts per user, (b) overview of the totals)
Fig. 22: Amount of clicks and keystrokes users need to reach an article in iteration 3
Fig. 23: What users selected after reading an article in iteration 3 ((a) absolute amounts per user, (b) overview of the totals)
Fig. 24: Amount of clicks and keystrokes users need to reach an article in iteration 4
Fig. 25: What users selected after reading an article in iteration 4 ((a) absolute amounts per user, (b) overview of the totals)
7.3 Amount of viewers per day

Since 4 May we have been tracking how many unique users visit our application per day. This can be seen in Figure 26.

Fig. 26: Amount of unique users per day from 04/05 till 23/05

7.4 Time spent

Task                                 Sam Agten   Bart Gerard   Tanguy Monheim   Steven Vermeeren   Total
Refining concept                         20          20              20                20            80
Storyboard                               10          15              10                10            45
Screen-transition-diagram                 0           2               2                 0             4
Implementation paper prototypes           5           5              12                 5            27
Evaluation paper prototypes               5           4               2                 5            16
Implementation digital iteration 1        1           5              20                35            61
Evaluation digital iteration 1            1           2               2                 1             6
Implementation digital iteration 2       20          10              40                15            85
Evaluation digital iteration 2            1           2               2                 2             7
Implementation digital iteration 3        5          10              10                 5            30
Evaluation digital iteration 3            1           2               2                 2             7
Implementation digital iteration 4        1           2               1                 1             5
Evaluation digital iteration 4            1           1               2                 1             5
Writing reports                          10          10              15                 8            43
Listening to presentations               20          20              20                20            80
Total                                   101         110             160               130           501

Tab. 1: Time investment
