8 better practices from information architecture
By: Lou Rosenfeld

Slide notes (image credits):
  • http://xkcd.com/773/
  • http://www.semanticreview.com/images/semantic-data.jpg
  • Amazing drawings by Eva-Lotta Lamm: www.evalotta.net
  • Funnel: http://www.orionweb.net/wp-content/uploads/conversion-funnel.png
  • Sitemap: http://www.peacockvaughninsurance.com/images/SiteMap.bmp
  • Onion courtesy Eva-Lotta Lamm

    1. 8 better practices from information architecture. Lou Rosenfeld
    2. Hello, my name is Lou. www.louisrosenfeld.com | www.rosenfeldmedia.com
    4. The state of contemporary findability
    5. Some questions that you probably can’t answer:
       • Who are your content’s primary audiences?
       • What are the five major tasks and needs each has?
       • Are you satisfying those tasks and needs?
       • What data support your thinking?
       • How do you measure success?
    6-12. Why can’t we get findability right?
       • We don’t know how to diagnose
       • We don’t know how to measure
       • Siloed organizations
       • Ill-equipped decision-makers
       • Short-term thinking
       • Semantic illiteracy
    13. Data is binary. Information isn’t.
    14. Information architecture: 8 better practices for findability
       1. Diagnose the important problems
       2. Balance your evidence
       3. Advocate for the long term
       4. Measure engagement
       5. Support contextual navigation
       6. Improve search across silos
       7. Combine design approaches effectively
       8. Tune your design over time
    15. #1: Diagnose the important problems
    16-19. [Query-frequency chart] Not all queries are distributed equally, nor do they diminish gradually; the 80/20 rule isn’t quite accurate.
    20-24. [Long Tail chart] The Long Tail is much longer than you’d suspect.
    25. Zipf distribution in text
    26. It’s Zipf’s world; we just live in it. A little...
       • queries
       • tasks
       • ways to navigate
       • features
       • documents
       ...goes a long way
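The Zipf point is easy to see in miniature. Below is a minimal Python sketch with invented query counts (every query string and number is hypothetical, not data from the talk): a small head of queries follows the roughly 1/rank frequency curve, while a long tail of one-offs makes up most of the distinct vocabulary.

```python
# A minimal illustration of the Zipf pattern in a site-search log.
# All query strings and counts are invented for this sketch.
from collections import Counter

# Pretend each list element is one submitted query pulled from the log.
log = (["jobs"] * 120 + ["campus map"] * 60 + ["library hours"] * 40 +
       ["parking"] * 30 + ["tuition"] * 24 +
       [f"rare query {i}" for i in range(200)])  # the long tail: 200 one-offs

counts = Counter(log)
total = sum(counts.values())

# Under Zipf's law, frequency falls off roughly as 1/rank, so a small
# head of queries accounts for a large share of all searches.
for rank, (query, n) in enumerate(counts.most_common(5), start=1):
    print(f"rank {rank}: {query!r} -> {n} searches ({n / total:.1%})")

singletons = sum(1 for n in counts.values() if n == 1)
print(f"{singletons} of {len(counts)} distinct queries occurred exactly once")
```

Here the five head queries cover well over half of all searches, which is the “a little goes a long way” point: tuning a handful of queries, tasks, and navigation paths improves most visits.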
    27. UNVERIFIED RUMOR: 90% of Microsoft.com content has never been accessed... not even once. TAKEAWAY: FOCUS ON THE STUFF THAT MATTERS!
    28. To do: Continually prioritize what’s important...
    29. To do: ...and continually fix (with an IA report card)
    30. #2: Balance your evidence
    31-32. [Research-methods quadrant chart] Balanced research leads to true insight and new opportunities (from Christian Rohrer: http://is.gd/95HSQ2)
    33. Lou’s TABLE OF OVERGENERALIZED DICHOTOMIES:
       |                                | Web Analytics                                                    | User Experience                                                                     |
       | What they analyze              | Users’ behaviors (what’s happening)                              | Users’ intentions and motives (why those things happen)                             |
       | What methods they employ       | Quantitative methods to determine what’s happening               | Qualitative methods for explaining why things happen                                |
       | What they’re trying to achieve | Helps the organization meet goals (expressed as KPIs)            | Helps users achieve goals (expressed as tasks or topics of interest)                |
       | How they use data              | Measure performance (goal-driven analysis)                       | Uncover patterns and surprises (emergent analysis)                                  |
       | What kind of data they use     | Statistical data (“real” data in large volumes, full of errors)  | Descriptive data (in small volumes, generated in a lab environment, full of errors) |
    34. Balance over time: from projects to processes. Example: the rolling content inventory
    35. To do: Develop a research regimen balanced by time and quadrant. Each week, for example...
       • Analyze analytics for trends (Behavioral + Quantitative)
       • Task analysis of common needs (Behavioral + Qualitative)
       Each month...
       • User survey (Attitudinal + Quantitative)
       • Exploratory analysis of analytics data (Behavioral + Qualitative)
       Each quarter...
       • Field study (Behavioral/Attitudinal + Qualitative)
       • Card sorting (Attitudinal + Qualitative/Quantitative)
    36. #3: Advocate for the long term
    37. [Diagram] Typical design focus vs. the stuff that gets ignored: mission, vision, charter, goals, KPIs, objectives
    38. To do: For starters, develop your project’s elevator pitch. Read Gamestorming (Gray, Brown, Macanufo; O’Reilly, 2010): http://amzn.to/nnpERG
    39. #4: Measure engagement
    40-41. Measuring conversions? No problem... measuring anything else? Good luck!
    42. The missing metrics of in-betweenness:
       • Orientation (“What can I do here?”)
       • Engagement (“I like this; do you?”)
       • Connection/cross-promotion (“What goes with this?”)
       • Authority (“I trust this”)
       • and many more...
    43. To do: Use a gradual engagement model to isolate and measure tasks. Example: adoption of features; can you measure movement between layers?
       • Layer 0: User visits the site (unauthenticated; no cookies, no nothing)
       • Layer 1: User asks the site a question (for example, a search query)
       • Layer 2: Site asks the user a question (would you like to save this product to a wish list?)
       • Layer 3: Site suggests something to the user (you might enjoy these products ordered by people like you)
       • Layer 4: Site acts on the user’s behalf (we’ve gone ahead and saved these products to your account’s list of frequently-ordered items)
       More on gradual engagement: http://bit.ly/9hPqyx
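To make “measure movement between layers” concrete, here is a rough sketch that rolls an event log up into a per-layer funnel. The event names and log format are assumptions for illustration, not anything prescribed in the talk; map them to whatever your analytics pipeline actually records.

```python
# A sketch of measuring movement between gradual-engagement layers.
# Event names and the (user_id, event) log format are hypothetical.

# Layer definitions from the slide: 0 = visit, 1 = user asks the site a
# question, 2 = site asks the user a question, 3 = site suggests something,
# 4 = site acts on the user's behalf.
EVENT_TO_LAYER = {
    "visit": 0,
    "search": 1,
    "wishlist_prompt_answered": 2,
    "recommendation_clicked": 3,
    "autosaved_list_used": 4,
}

events = [  # invented sample data
    ("u1", "visit"), ("u1", "search"),
    ("u2", "visit"), ("u2", "search"), ("u2", "recommendation_clicked"),
    ("u3", "visit"),
]

# Deepest layer each user reached.
deepest = {}
for user, event in events:
    deepest[user] = max(deepest.get(user, 0), EVENT_TO_LAYER[event])

# Funnel: how many users reached at least each layer?
for layer in range(5):
    reached = sum(1 for d in deepest.values() if d >= layer)
    print(f"layer {layer}: {reached} of {len(deepest)} users")
```

The drop-off between adjacent layers is the metric: a steep fall from layer 1 to layer 2, say, tells you users ask questions but decline to answer yours.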
    44. #5: Support contextual navigation
    45. Contextual navigation: your site’s desire lines. Determine them through content modeling and site search analytics. Deep navigation requires content modeling: a better approach to deep IA and content structuring.
    46. Important content objects emerge from content modeling (example: BBC). Content that matters most: concert calendar, album pages, artist descriptions, TV listings, album reviews, discography, artist bios.
    47. Important metadata attributes emerge from content modeling: the metadata that matters most.
    48-49. To do: Make content modeling a participatory design exercise
       • Provide subjects with “de-oriented” samples of content types... and common tasks
       • Have them draw “desire lines” and starting points, and identify gaps in content types
       • Learn from “think out loud” and by identifying common patterns
       • More info: Atherton et al.’s “domain modeling” presentation: http://slidesha.re/fzChQB
    50. #6: Improve search across silos
    51. Reconsidering the search UI...
    52-57. ...by contextualizing “advanced” features, focusing on revision. Search session patterns (examples):
       • solar energy → how solar energy works
       • solar energy → energy
       • solar energy → solar energy charts
       • solar energy → explain solar energy
       • solar energy → solar energy news
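One way to operationalize these revision patterns is to compare the word sets of consecutive queries in a session and label the move. This is only a sketch: real session analysis would also handle stemming, word order, and misspellings.

```python
# A rough sketch of classifying query revisions within a search session.
# Only lowercase word sets are compared; stemming and spelling variants
# are deliberately ignored here.

def classify_revision(first: str, second: str) -> str:
    a, b = set(first.lower().split()), set(second.lower().split())
    if a < b:
        return "specialization"  # terms added: "solar energy" -> "how solar energy works"
    if b < a:
        return "generalization"  # terms dropped: "solar energy" -> "energy"
    if a & b:
        return "reformulation"   # overlap, neither contains the other: "solar energy" -> "solar power"
    return "new topic"           # no shared terms at all

sessions = [  # the example pairs from the slides
    ("solar energy", "how solar energy works"),
    ("solar energy", "energy"),
    ("solar energy", "solar energy charts"),
    ("solar energy", "explain solar energy"),
    ("solar energy", "solar energy news"),
]
for first, second in sessions:
    print(f"{first!r} -> {second!r}: {classify_revision(first, second)}")
```

Counting these labels across many sessions shows which “advanced” feature to surface in context: heavy specialization suggests offering scoping filters; heavy generalization suggests the first results were too narrow.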
    58-62. Recognizing specialized queries (e.g., proper nouns, dates, unique ID#s). Search patterns (examples):
       • TA292761
       • regulations March 2011
       • regulations Owens
       • regulations Caterpillar
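Here is a sketch of how such specialized queries might be recognized with simple patterns. The ID format (two letters plus six digits), the month-plus-year rule, and the crude proper-noun test are assumptions modeled on the examples above; real logs would need their own rules.

```python
# A sketch of routing queries to specialized handling via simple patterns.
# All three rules below are illustrative assumptions, not a general solution.
import re

MONTHS = (r"January|February|March|April|May|June|July|August|"
          r"September|October|November|December")

def classify_query(q: str) -> str:
    if re.fullmatch(r"[A-Z]{2}\d{6}", q.strip()):
        return "unique ID"        # e.g., "TA292761"
    if re.search(rf"\b({MONTHS})\b\s+\d{{4}}", q):
        return "date-scoped"      # e.g., "regulations March 2011"
    # Crude proper-noun test: a capitalized word that isn't the first word.
    if any(w[0].isupper() for w in q.split()[1:]):
        return "proper noun"      # e.g., "regulations Owens"
    return "general"

for q in ["TA292761", "regulations March 2011",
          "regulations Owens", "regulations Caterpillar", "solar energy"]:
    print(f"{q!r}: {classify_query(q)}")
```

Each label can then trigger a specialized result design: an ID goes straight to the matching record, a date-scoped query gets a date-filtered result set, and so on.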
    63-66. ...and designing specialized search results. [Screenshot contrasting the poor results returned by the search engine with content objects drawn from the product content model]
    67. To do: Read a book chapter on session analysis. You’ll find one in my book Search Analytics for Your Site: http://bit.ly/quFxdz
    68. #7: Combine design approaches effectively
    69-72. [Screenshots] From narrow, deep content access... to editorially rich content
    73-75. [Search results screenshot] Manually selected results... complement raw results
    76-77. To do: Treat your content like an onion. Each layer is cumulative; the most important content is at the core.
       | layer | information architecture                  | usability                                    | content strategy                |
       | 0     | indexed by search engine                  | leave it alone                               | leave it alone                  |
       | 1     | tagged by users                           | squeaky wheel issues addressed               | refresh annually                |
       | 2     | tagged by experts (non-topical tags)      | test with a service (e.g., UserTesting.com)  | refresh monthly                 |
       | 3     | tagged by experts (topical tags)          | “traditional” lab-based user testing         | titled according to guidelines  |
       | 4     | content models for contextual navigation | A/B testing                                  | structured according to schema  |
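Since each layer is cumulative, the onion can be encoded as a simple lookup: an item at layer N receives the treatments of every layer up through N. A sketch, with treatment strings abbreviated from the table above (the data structure itself is my invention, not from the talk):

```python
# The content onion as data: layers are cumulative, so a layer-N item
# gets every treatment from layers 0 through N. Treatments abbreviated
# from the slide's table.
TREATMENTS = {
    0: ["indexed by search engine"],
    1: ["tagged by users", "squeaky-wheel fixes", "refresh annually"],
    2: ["expert non-topical tags", "remote usability testing", "refresh monthly"],
    3: ["expert topical tags", "lab-based user testing", "titled per guidelines"],
    4: ["content models for contextual navigation", "A/B testing", "structured per schema"],
}

def treatments_for(layer: int) -> list[str]:
    """Everything owed to a content item at the given layer."""
    return [t for n in range(layer + 1) for t in TREATMENTS[n]]

print(treatments_for(2))  # a layer-2 item gets layer 0, 1, and 2 treatments
```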
    78. #8: Tune your design over time
    79. Your site is a moving target built on moving targets
    80-81. [Query trend chart] Interest in the football team: going... going... gone. Time to study!
    82-85. [Query trend charts] Before Tax Day... and after Tax Day
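The Tax Day charts suggest a check you can automate: compare average query volume before and after a seasonal date, and tune navigation when interest collapses. A sketch with invented daily counts (the date, query, and 50% threshold are all arbitrary choices for illustration):

```python
# A sketch of detecting a seasonal drop in query volume. The counts are
# invented; in practice you'd pull daily query totals from search analytics.
from datetime import date

daily = [  # (day, searches for "tax forms"), hypothetical
    (date(2011, 4, 10), 950), (date(2011, 4, 14), 1400),
    (date(2011, 4, 16), 300), (date(2011, 4, 20), 120),
]

tax_day = date(2011, 4, 15)
before = [n for d, n in daily if d < tax_day]
after = [n for d, n in daily if d > tax_day]

avg_before = sum(before) / len(before)
avg_after = sum(after) / len(after)
print(f"avg before Tax Day: {avg_before:.0f}, after: {avg_after:.0f}")

# A large drop suggests demoting tax content from prime navigation
# until interest returns next spring.
if avg_after < 0.5 * avg_before:
    print("seasonal interest has passed; tune the design accordingly")
```

This is the “moving target” point in practice: what deserves prime placement changes with the calendar, so tuning has to be an ongoing process, not a one-time project.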
    86. To do: Move from time-boxed projects to ongoing processes. Example: the rolling content inventory
    87. Summary: 8 IA better practices
       1. Diagnose the important problems
       2. Balance your evidence
       3. Advocate for the long term
       4. Measure engagement
       5. Support contextual navigation
       6. Improve search across silos
       7. Combine design approaches effectively
       8. Tune your design over time
    88. Let’s stop boiling the ocean
    89. Say hello: Lou Rosenfeld, lou@louisrosenfeld.com, www.louisrosenfeld.com | @louisrosenfeld. Rosenfeld Media: www.rosenfeldmedia.com | @rosenfeldmedia
