SharePoint Saturday Belgium 2013 Intranet search fail

  1. Intranet Search #fail (#SPSBE07). Ben van Mol, Ventigrate. April 26th, 2014
  2. Thanks to our sponsors! (Gold and Silver sponsors)
  3. Who am I? @vanmobe, ben.vanmol@ventigrate.be. Ventigrate: facebook.com/ventigrate, @ventigrate, linkedin.com/company/ventigrate, info@ventigrate.be. Veldkant 33A, BE-2550 Kontich. TEL: +32 (0)3 450 80 30, FAX: +32 (0)3 450 80 39
  4. Agenda
  5. Thanks to Google (and Bing, of course) • Search technology is widespread • Search is highly adopted • Search is perceived as an easy tool to explore and find relevant information
  6. False expectations: "All I want is ...", "How complicated can it be?!"
  7. Enterprise data is complex. The information needs within an organization span a wide variety of information types, sources, formats, ... Popularity and the number of referrals are less important in enterprise search than in Internet search. Google builds its metadata from millions of users searching for content; the enterprise is a much smaller case. In an enterprise, lots of people create content with little attention paid to information governance.
  8. The user is complex • “This is a huge change to the overall user experience. It transforms the way we think and opens opportunities to use search in a disruptive fashion. I love it!” • “Personally, I think people will get annoyed with it. The interface itself isn’t anything new, and it’s an outdated concept. When you think about state-of-the-art search, it should be less about searching and more about finding.”
  9. How would you take a picture?
  10. The user: how would you take a picture? Expertise significantly impacts how we seek information online. The effects on search are determined by • Domain knowledge • Technical knowledge (http://bit.ly/1pQe5dv)
  11. Novices orienteer, experts teleport
  12. Serialists versus holists. Serialists concentrate on the individual parts rather than the whole; holists focus on the cohesive whole rather than on its components. Rod-and-Frame test (Witkin & Asch): draw a vertical line inside the rectangle. Serialists versus holists: spend 50% more time, visit twice as many pages, and are more likely to use the browser’s back button. BUT: the performance gap vanishes if technical expertise is equally high. Source: Kim, K. (2001). Information seeking on the Web: Effects of user and task variables. Library & Information Science Research, 23, 233–255.
  13. Sources: Paivio, A. (1971). Imagery and Verbal Processes. New York: Holt, Rinehart and Winston. Mayer, R., & Sims, V. K. (1994). For whom is a picture worth a thousand words? Extensions of a dual-coding theory of multimedia learning. Journal of Educational Psychology, 86, 389–401.
  14. Who needs search? • No, thanks… • Yes, please!
  15. Different schools
  16. (image slide)
  17. Search is a continuous improvement process • Small iterations with PDCA cycles (Plan, Do, Check, Act) • Requires management buy-in • End-user involvement • Good communication • Means to contribute
  18. Recall. Definition: RECALL is the ratio of the number of relevant records retrieved to the total number of relevant records in the database. It is usually expressed as a percentage.
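The same definition in symbols (a standard restatement, not from the slides; "relevant" and "retrieved" denote the sets of relevant records and retrieved records):

```latex
\mathrm{Recall} = \frac{|\mathrm{relevant} \cap \mathrm{retrieved}|}{|\mathrm{relevant}|} \times 100\%
```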
  19. Quick Relevancy Test
  20. Precision. Definition: PRECISION is the ratio of the number of relevant records retrieved to the total number of records retrieved (relevant and irrelevant). It is usually expressed as a percentage.
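And the companion formula for precision (again a standard restatement of the definition above):

```latex
\mathrm{Precision} = \frac{|\mathrm{relevant} \cap \mathrm{retrieved}|}{|\mathrm{retrieved}|} \times 100\%
```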
  21. Quick Precision Test
  22. Search Performance Metrics
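As a concrete illustration of these two metrics, here is a minimal Python sketch that computes recall and precision from sets of document IDs; the IDs and relevance judgments are hypothetical, not data from the talk:

```python
def recall_and_precision(retrieved: set, relevant: set) -> tuple:
    """Return (recall %, precision %) for one query, given ID sets."""
    hits = retrieved & relevant  # relevant records that were actually retrieved
    recall = 100 * len(hits) / len(relevant) if relevant else 0.0
    precision = 100 * len(hits) / len(retrieved) if retrieved else 0.0
    return recall, precision

# Hypothetical judgment data for a single query.
retrieved = {"doc1", "doc2", "doc3", "doc4"}
relevant = {"doc2", "doc4", "doc7", "doc9", "doc11"}
print(recall_and_precision(retrieved, relevant))  # (40.0, 50.0)
```

Note the trade-off the two definitions imply: returning more records tends to raise recall while lowering precision, which is why the two are tracked together.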
  23. How to start? Capture the user requirements using traditional analysis techniques, or mine the existing data to assess search performance and behavior. "Work like Google" is not a requirement!
  24. Where to start?
  25. Backwards-oriented behaviour • Autosuggest: help users express specific terms and suggest other users’ queries (see the sketch below) • Related searches: stimulate novices to explore related searches • Avoid zero results: use spelling correction, query expansion, and query reformulation • Breadcrumbs: navigate back to a previous query if one is unsuccessful
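A minimal sketch of the autosuggest idea from the first bullet, assuming you have a plain log of past user queries; the query strings and function names are illustrative, not SharePoint APIs:

```python
from collections import Counter

# Hypothetical log of past user queries.
query_log = ["vacation policy", "vacation days", "expense report",
             "expense report template", "vacation policy", "vpn access"]
query_counts = Counter(q.lower().strip() for q in query_log)

def autosuggest(prefix: str, limit: int = 5) -> list:
    """Suggest earlier queries that start with the typed prefix, popular first."""
    prefix = prefix.lower().strip()
    matches = [(q, n) for q, n in query_counts.items() if q.startswith(prefix)]
    matches.sort(key=lambda item: -item[1])  # most frequent queries first
    return [q for q, _ in matches[:limit]]

print(autosuggest("vac"))  # ['vacation policy', 'vacation days']
```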
  26. Design for learnability. Designing search user interfaces that are easy to learn can help bridge the gap between novice and expert serialists, progressively training them how to use the application. • Descriptive text in the search box • Contextual popovers • Guidance • Full-screen overlays. Source: Spool, J. (2005). What makes design seem “intuitive”? User Interface Engineering. Retrieved June 8, 2012, from http://www.uie.com/articles/design_intuitive/
  27. Source: blog.comperiosearch.com
  28. When information scent is strong, users are confident that they’re headed in the right direction. When it’s weak, users may be uncertain of what to do next, or they may abandon their search altogether.
  29. Find common usage patterns, trends, and outliers • Start with queries and their relative frequency counts • Eliminate search-log “junk” (meaningless queries) as best you can to improve your analysis (see the sketch below). Examples: • Tonal patterns (swine flu <> h1n1) • Synonym patterns (mail <> email) • Time-based patterns (traffic at end of day) • Question patterns (categorization) • Answer patterns (content types)
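A minimal sketch of the frequency-count step with a junk filter, under the assumption that you can export raw query strings from your search log; the entries and the junk heuristic are illustrative:

```python
from collections import Counter

# Hypothetical raw search-log entries; real ones come from your engine's logs.
raw_queries = ["holiday request", "", "asdf", "holiday request",
               "HOLIDAY REQUEST", "canteen menu", "x"]

def is_junk(query: str) -> bool:
    """Drop obviously meaningless entries before analysis (crude heuristic)."""
    return len(query) < 3 or not any(c.isalpha() for c in query)

cleaned = (q.lower().strip() for q in raw_queries)
counts = Counter(q for q in cleaned if not is_junk(q))
for query, freq in counts.most_common(10):
    print(f"{freq:4d}  {query}")  # relative frequency counts, highest first
```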
  30. Examples • Tonal patterns (swine flu <> h1n1) • Synonym patterns (mail <> email) • Time-based patterns (traffic at end of day) • Question patterns (categorization) • Answer patterns (content types)
  31. Examples • Tonal patterns (swine flu <> h1n1) • Synonym patterns (mail <> email) • Time-based patterns (traffic at end of day) • Question patterns (categorization) • Answer patterns (content types) • Try to understand what people are looking for based on the query cluster; this works best when done by multiple people
  32. Examples • Tonal patterns (swine flu <> h1n1) • Synonym patterns (mail <> email) • Time-based patterns (traffic at end of day) • Question patterns (categorization) • Answer patterns (content types). Try to determine what type of content users expect to find, in order to identify potential content types.
  33. Diagnose problems and determine what to fix or improve for your site’s searchers: • You aren’t offering the content that your searchers want. • You offer it, but the search engine isn’t finding it. • A difference exists between how you and your searchers describe the same content.
  34. Example: http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=nike%20sneakers%20ruskie • Don’t be afraid to say you did not understand, to prevent thrashing (changing the query without resolving the problem) • Focus on providing a way out: make sure every control on the page does something productive to help resolve the no-search-results condition • Focus on the customer’s goal: provide the most relevant recovery content first, while staying as close as possible to the customer’s original intent (see the sketch below)
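One hedged sketch of the “say you did not understand” bullet: instead of a bare empty result page, offer close spellings from the index vocabulary. The vocabulary list is hypothetical; difflib is part of the Python standard library:

```python
import difflib

# Hypothetical index vocabulary; in practice, extract it from your search index.
vocabulary = ["sneakers", "sweaters", "speakers", "nike", "running shoes"]

def did_you_mean(term: str) -> list:
    """Offer close spellings instead of a dead-end 'no results' page."""
    return difflib.get_close_matches(term.lower(), vocabulary, n=3, cutoff=0.6)

print(did_you_mean("sneekers"))  # e.g. ['sneakers', 'speakers']
```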
  35. If you have access to information about who searched for what and when on your site, conducting session analysis will help you gain deeper insight into what searchers do and how their needs change over a short period of time.
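A minimal sketch of session analysis under a common assumption (a new session starts after 30 minutes of inactivity); the log rows are illustrative:

```python
from datetime import datetime, timedelta

# Hypothetical (user, timestamp, query) log rows, sorted by time.
log = [
    ("alice", datetime(2014, 4, 26, 9, 0), "travel policy"),
    ("alice", datetime(2014, 4, 26, 9, 2), "travel expense form"),
    ("alice", datetime(2014, 4, 26, 14, 0), "meeting rooms"),
]

SESSION_GAP = timedelta(minutes=30)  # assumed session boundary

def sessions_for(user: str) -> list:
    """Split one user's queries into sessions on idle gaps."""
    sessions, last_time = [], None
    for u, t, q in log:
        if u != user:
            continue
        if last_time is None or t - last_time > SESSION_GAP:
            sessions.append([])  # start a new session
        sessions[-1].append(q)
        last_time = t
    return sessions

print(sessions_for("alice"))
# [['travel policy', 'travel expense form'], ['meeting rooms']]
```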
  36. Audience analysis will help you better understand how information needs and searching experiences differ between audience segments. Challenge the assumption that your users are all alike. Audience analysis can beef up your personas or boost your organization’s existing segmentation analysis.
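One way such a segment comparison could look, assuming the query log carries a department field; all names and counts are made up for illustration:

```python
from collections import defaultdict

# Hypothetical per-query records: (department, query, number of results).
records = [
    ("sales", "price list", 12), ("sales", "crm login", 0),
    ("hr", "vacation policy", 4), ("hr", "payroll calendar", 0),
    ("hr", "org chart", 7),
]

zero, total = defaultdict(int), defaultdict(int)
for dept, _query, n_results in records:
    total[dept] += 1
    zero[dept] += (n_results == 0)  # True counts as 1

for dept in total:
    print(f"{dept}: {100 * zero[dept] / total[dept]:.0f}% zero-result queries")
```

A segment whose zero-result rate sits far above the rest is a concrete signal that its vocabulary or content needs are not being served.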
  37. Measure User Satisfaction
  38. Thank you!
