Rise presentation-users-2012-01


Presentation on user reactions to RISE given at RISE celebration event on 24 January 2012 by Elizabeth Mallett

  • So, to go back to the original project hypothesis: we had to create a strategy to quickly evaluate the prototype search and recommender system we had built. In a short project like RISE you can only carry out a limited range of evaluation activities.
  • We planned three types of evaluation: a survey, interviews and analytics. We ran an online survey, linked from the interface using Survey Monkey and advertised on the library website under Library News. We carried out some testing with individual users, partly during the One Stop search evaluation and partly specifically for RISE, and we analysed web analytics data.
  • For the survey we asked people several questions about what they thought of the recommendations they were shown. The survey only had a small number of respondents (26), so the results are an indication only. This slide shows what users thought about the value of the recommendation ‘these resources may be related to others you’ve viewed recently’: two thirds of people thought they were Very or Quite Useful.
  • Course recommendations came out as being less useful: just under half saw them as very or quite useful. That’s something we picked up on in the face-to-face interviews.
  • The third type of recommendation was based on articles viewed by other people using the same search terms as you. Again the responses from users came in at about two thirds saying they were very or quite useful. There is, though, a significant number in some of the responses saying the recommendations aren’t useful.
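  • The idea behind this third recommendation type (articles viewed by other people who used the same search terms) can be sketched as a simple co-occurrence lookup. This is only an illustration in Python; the log structure, names and ranking here are assumptions, and the actual RISE implementation (linked from the project blog) may differ.

```python
from collections import Counter, defaultdict

# Hypothetical click log: (search terms, article id) pairs recorded as
# users open articles from their search results. Illustrative data only.
click_log = [
    ("climate change", "A1"),
    ("climate change", "A2"),
    ("climate change", "A1"),
    ("global warming", "A3"),
]

# Index each search-terms string to a count of the articles viewed after it.
views_by_terms = defaultdict(Counter)
for terms, article in click_log:
    views_by_terms[terms][article] += 1

def recommend(terms, exclude=(), n=2):
    """Return the most-viewed articles for a search, minus excluded ones."""
    counts = views_by_terms.get(terms, Counter())
    return [a for a, _ in counts.most_common() if a not in exclude][:n]

print(recommend("climate change"))  # → ['A1', 'A2']
```

  As more people use the system, the counts become more discriminating, which matches the point below that recommendation quality should improve as more data is collected.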
  • We also asked users what they thought of the relevance of the recommendations. Given that RISE was quite short term, we certainly felt there would be scope to increase the quality of the recommendations as more people used the system and it collected more data.
  • Finally we asked users whether they thought recommendations would be useful, and saw a similar two-thirds/one-third split.
  • As the One Stop discovery system was being evaluated in parallel with the RISE project, we were able to ask some questions in their focus groups about the use of recommender systems. Participants were particularly interested in being able to relate recommendations to the module a student had done, and suggested that they would also like to know how high a mark the student had got in their module. This suggests that some students are very focussed on achievement and saw that recommendations and ratings could help them. Undergraduates showed more interest in the system, but both groups wanted some assurance about the provenance of the recommendations. Postgraduates in particular put a lot of importance on the provenance of the recommendations, and they were interested in module-specific recommendations about which databases were best and which search terms might get the best results. So we explored that in more detail in a series of face-to-face interviews.
  • 1,000 students were emailed. 102 volunteered, of whom 52 were only able to take part remotely; these were directed to the online survey. 50 students volunteered to be interviewed, and 11 were selected based on level of study and the subject matter of the course, as we wanted a range. I asked the students to log in to the MyRecommendations web page. If they were on a course they were immediately presented with some course-related recommendations, and we discussed their first impressions of these. The students were then asked to conduct a couple of searches to find something of interest to their studies. As well as the lists of search results from EDS, they were presented with search-related recommendations and asked to comment on their relevance, usefulness and so on. When all the different types of recommendation had been explored, we discussed which type they would find most useful. Interestingly, unlike the survey respondents, the face-to-face group preferred course-related recommendations.
  • Generally, there is support for the concept amongst most users: it is useful for students lacking in confidence, and it is good to see what others are looking for.
  • But there is some urge for caution, which ties in with users wanting to understand where the recommendations are coming from.
  • Pretty much all of the students so far said they would find the course-related recommendations most useful. One caveat is a lack of trust in, or respect for, co-students: one undergraduate philosophy student said he would normally go for course-related recommendations, but in this instance definitely not, because the people on his course were “complete fruit cakes”.
  • And there were a few good suggestions for improvements. Among them: make the recommendations more obvious on the page, and indicate the popularity of a recommendation (e.g. “X% of people on your module recommended A” or “10 people viewed this and 5 found it useful”).
  • The third type of evaluation used Google Analytics to track the use of the tools. As part of this work the developer created a custom report to try to track which recommendations were being used, so we can see how many times each type of recommendation is used. Bear in mind, though, that not everyone gets every type of recommendation every time.
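  • The kind of tally the custom report produces can be sketched offline. The event structure below is an assumption for illustration; the real figures came from a Google Analytics custom report, not a local log like this.

```python
from collections import Counter

# Illustrative click events, one per recommendation followed, recording the
# recommendation type and its on-screen position. Made-up example data.
events = [
    {"type": "search", "position": 1},
    {"type": "course", "position": 2},
    {"type": "search", "position": 1},
    {"type": "relationship", "position": 3},
    {"type": "course", "position": 1},
]

# Tally usage by recommendation type and by on-screen position.
by_type = Counter(e["type"] for e in events)
by_position = Counter(e["position"] for e in events)

total = sum(by_type.values())
for rec_type, n in by_type.most_common():
    print(f"{rec_type}: {n / total:.0%}")
```

  The same position tally is what supports the observation below that the first two recommendations on the screen attract most of the clicks.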
  • Although in comments users have suggested that we show a longer list of recommendations, the analytics clearly show that the first two recommendations on the screen are much more likely to be viewed than any of the others. That seems to indicate that users expect some relevance ranking and rating to be taking place. Selecting the first two also reinforces their value as recommendations.

    1. RISE User Evaluation. Liz Mallett. http://www.open.ac.uk/blogs/rise
    2. Recommendations Improve the Search Experience? “That recommender systems can enhance the student experience in new generation e-resource discovery services”
    3. Evaluation: Online Survey; Face to Face interviews; Review of web analytics. http://www.flickr.com/photos/42505898@N00/305205950/sizes/m/in/photostream/
    4. Survey results 1: Related to records you have viewed. Very useful 45%, Quite useful 22%, Slightly useful 0%, Not useful 22%, Not sure 11%.
    5. Survey results 2: People on your course viewed. Very useful 31%, Quite useful 15%, Slightly useful 15%, Not useful 39%, Not sure 0%.
    6. Survey results 3: Search terms. Very useful 47%, Quite useful 20%, Slightly useful 0%, Not useful 33%, Not sure 0%.
    7. Survey results 4: How relevant were the recommendations? Very relevant 17%, Quite relevant 31%, Slightly relevant 13%, Not relevant 35%, Not sure 0%, Not used 4%.
    8. 8. Survey results
    9. Focus Groups. Undergraduates: like ratings and reviews from other students (‘other people’s experiences valuable’); feed to module website; want synonyms. Postgraduates: citation as a recommendation; wary of provenance; which module studied? how high a mark? trust repository.
    10. Face to face interviews: first impressions of (course-related) recommendations; asked to enter a search term; results and recommendations explored; asked about relevance; asked about preference for type of recommendation.
    11. Should we have a recommender system? “I think it would be a very good useful feature. It would be definitely very very useful” (postgraduate Maths student). “So it would be interesting to see what other people are looking at. Yes, I would definitely use that because my limited knowledge of the library might mean that other people were using slightly different ways of searching and getting different results.” (undergraduate English Literature student). “I have just had a go, it was good with suggested papers that I had already found (which shows potential in my view) through Google.”
    12. Should we have a recommender system? “I’m afraid my first reaction is to be a bit sceptical - it presumably doesn’t tell you if fellow students found the information/article useful or relevant to what they were looking for. I would hate to waste time following unproductive links laid down by others who might be failing students or think that any "lazy" students might develop poor practice by relying on what others had looked at. It sounds like a good idea but I think caution needs to be exercised.”
    13. Why they prefer course-related recommendations. “I can’t be bothered with knowing what everybody else is interested in. I take a really operational view you know, I’m on here, I want to get the references for this particular piece of work, and those are the people that are most likely to be doing a similar thing that I can use.” (H800 student). “I suppose if I wasn’t so sure on an assignment it would perhaps be quite useful to see what other people were looking at to know if I was thinking along the right lines.” (undergraduate literature student)
    14. Suggestions for improvement. “Maybe include a date. It would be interesting to know when a resource was last looked at” (postgraduate political philosophy student). “If somebody used a similar search but three years ago, is that going to carry the same weight?” (postgraduate maths student). Include course drop-down choice: “I would be looking at that and saying ‘which of my courses does it refer to?’”
    15. Recommendations usage: Search 40%, Course 36%, Relationship 24%.
    16. Recommendations usage. [Bar chart: number of uses by on-screen position (1-4) for Relationship, Course and Search recommendations.]
    17. Findings and lessons learnt: users like recommendations ‘in principle’; recommendations provenance matters; interest in the search tools.
    18. Blog: www.open.ac.uk/blogs/RISE Code: http://code.google.com/p/rise-project/source/browse/trunk/rise/ Questions? http://www.flickr.com/photos/rmgimages/4660272978/in/photostream/
    19. RISE User Evaluation. Liz Mallett. http://www.open.ac.uk/blogs/rise