
Team of Rivals: UX, SEO, Content & Dev UXDC 2015

The search engine landscape has changed dramatically and now relies heavily on user experience signals to influence rank in search results. In this presentation, I explore search engine methods for evaluating UX in a machine readable fashion and present a framework for successful cross-discipline collaboration.



  3. This presentation was supposed to be much easier until an exchange on Twitter. The webinar in question was promoted on July 10, 2015. The irony is that Eric Enge is a long-time SEO with little if any user experience expertise, as evidenced by his webinar. http://www.mediapost.com/publications/article/252961/yelp-study-criticizes-google-for-degrading-searc.html (Slide text: % satisfaction, Related concepts, Personas)
  4. http://www.semrush.com/blog/user-experience-its-not-what-you-think/
  6. Hilltop Algorithm: quality of links more important than quantity of links; segmentation of the corpus into broad topics; selection of authority sources within these topic areas. Hilltop was one of the first algorithms to introduce the concept of machine-mediated “authority” to combat human manipulation of results for commercial gain (link blast services, viral distribution of misleading links). It is used by all of the search engines in some way, shape, or form. The beauty of Hilltop is that, unlike PageRank, it is query-specific and reinforces the relationship between the authority and the user’s query. You don’t have to be big or have a thousand links from auto parts sites to be an “authority.” Google’s 2003 Florida update, rumored to contain Hilltop reasoning, caused a lot of sites with extraneous links to fall from their previously lofty placements. Photo: Hohenzollern Castle, near Stuttgart
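The expert-link idea at Hilltop’s core can be sketched in a few lines: restrict scoring to “expert” pages on the query’s topic, then rank targets by how many distinct experts link to them. The topic labels, expert pages, and links below are invented for illustration, not drawn from any real index.

```python
# Toy Hilltop-style scoring: authority is the number of distinct
# topic experts linking to a target page. All data here is made up.
EXPERTS = {  # topic -> {expert page: its outlinks}
    "autos": {"experts-cars.example": ["shop-a", "shop-b"],
              "car-guide.example":    ["shop-b", "shop-c"]},
}

def hilltop_score(query_topic, target):
    """Count distinct experts on the query's topic linking to target."""
    experts = EXPERTS.get(query_topic, {})
    return sum(1 for outlinks in experts.values() if target in outlinks)
```

Because the expert set is chosen per topic, a page’s score changes with the query, which is exactly the query-specific property the notes contrast with PageRank.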
  7. Topic-Sensitive Ranking (2004): a consolidation of Hypertext Induced Topic Selection (HITS) and PageRank. Pre-query calculation of factors based on a subset of the corpus: context of term use in the document, in the history of queries, and by the user submitting the query. Computes PageRank based on a set of representational topics (augments PageRank with content analysis); topics are derived from the Open Directory Project. Uses a set of ranking vectors: pre-query selection of topics plus at-query comparison of the similarity of the query to the topics. Its creator is now a senior engineer at Google.
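The topic-biased PageRank idea can be illustrated with a toy power iteration in which the teleport vector points only at pages tagged with the query’s topic. The graph, damping factor, and topic set below are assumptions for the sketch, not the published implementation.

```python
# Topic-sensitive PageRank sketch: teleportation is biased toward
# pages belonging to a topic instead of being uniform.
def topic_pagerank(links, topic_pages, damping=0.85, iters=50):
    """links: {page: [outlinks]}; topic_pages: teleport targets."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    teleport = {p: (1.0 / len(topic_pages) if p in topic_pages else 0.0)
                for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) * teleport[p] for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling node: redistribute via the teleport vector
                for q in pages:
                    new[q] += damping * rank[p] * teleport[q]
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
scores = topic_pagerank(links, topic_pages={"a"})
```

With a different `topic_pages` set, the same graph yields a different ranking, which is the “set of ranking vectors” idea in the notes.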
  8. About content: quality and freshness. About agile: frequent iterations and small fixes. About UX: or so it seems (Vanessa Fox/Eric Enge: click-through, bounce rate, conversion).
  9. http://www.seobythesea.com/2013/09/google-hummingbird-patent/ Comparison of the search query to general-population search behavior around the query terms. Revises the query and submits both to the search index: confidence score, relationship threshold, adjacent context, floating context; the results are a consolidation of both queries. Entity = anything that can be tagged as being associated with certain documents, e.g. stores, news sources, product models, authors, artists, people, places, things. Query logs (this is why they took away keyword data; they do not want us to reverse-engineer it as we have in the past). User behavior information: user profile, access to documents seen as related to the original document, amount of time on a domain associated with one or more entities, whole or partial conversions that took place.
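The revise-and-consolidate pattern described above can be sketched roughly as follows. The synonym table, confidence scores, threshold, and tiny corpus are all invented for the example; the real patent logic is far richer.

```python
# Sketch of query revision with a confidence threshold: a substitute
# term is applied only when its score clears the threshold, and the
# original and revised queries' results are consolidated (union here).
SYNONYMS = {("place", "restaurant"): 0.9, ("place", "location"): 0.4}

def revise(query, threshold=0.6):
    revised = []
    for t in query.split():
        best = t
        for (orig, sub), score in SYNONYMS.items():
            if orig == t and score >= threshold:
                best = sub
        revised.append(best)
    return " ".join(revised)

def merged_results(query, search):
    """Run the original and revised queries; union the result sets."""
    rq = revise(query)
    return rq, search(query) | search(rq)

corpus = {"doc1": "best pizza restaurant", "doc2": "pizza place map"}
def search(q):
    return {d for d, text in corpus.items()
            if any(t in text.split() for t in q.split())}

revised, hits = merged_results("pizza place", search)
```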
  10. http://www.hmtweb.com/marketing-blog/phantom2-google-update-april-may-2015/ Content quality:
      • Signal to noise – too many links on the page, thin content (click-bait)
      • WP tag pages (lists of links)
      • Related tag pages (spider trap)
      • Stacked video pages
      • Too many “ads” above the fold
      • Auto-start videos
      • Duplicate content
      • 404 errors
      • Spammy comments (Google considers these part of the content)
      • Design choices (Google now indexes CSS and JS files)
      How-to sites were impacted.
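A crude version of the “signal to noise” check in the first bullet might look like this; the word-count and link-density thresholds are invented for illustration and are not Google’s.

```python
# Toy content-quality audit: flag thin pages and pages whose visible
# text is dwarfed by links. Thresholds are hypothetical.
def content_quality_flags(text, link_count,
                          min_words=250, max_links_per_100_words=10):
    words = len(text.split())
    flags = []
    if words < min_words:
        flags.append("thin content")
    if words and link_count / (words / 100) > max_links_per_100_words:
        flags.append("link-heavy")
    return flags
```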
  11. Assign each word a set of vectors that define its position in a theoretical “meaning space,” or cloud. A sentence is a path between the vectors, distilled down to numbers that become its own vector. Irony is a problem: the system must master the literal first.
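A minimal sketch of the idea, assuming made-up 2-d vectors in place of a learned “meaning space”: a sentence is collapsed to a single vector (here a simple average of its word vectors) and sentences are compared by cosine similarity.

```python
# Word vectors in a toy 2-d "meaning space"; values are invented.
import math

VECS = {"dog": (1.0, 0.1), "puppy": (0.9, 0.2), "car": (0.0, 1.0)}

def sentence_vec(words):
    """Collapse a sentence to one vector by averaging its word vectors."""
    vecs = [VECS[w] for w in words if w in VECS]
    return tuple(sum(comp) / len(vecs) for comp in zip(*vecs))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)
```

Nearby meanings (“dog”, “puppy”) end up with a higher cosine than distant ones (“dog”, “car”), which is all the literal layer the notes say must come before irony.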
  12. An alternative to the SEO hired guns and the religion of keywords and links: figure out something that really works by working with UX.
  13. UX, Content Strategy, IA
  16. When the traffic dries up as they drop off page 1 in search results.
  18. In 2003, Google acquired personalization technology company Kaltix and its founder Sep Kamvar, who has been head of Google personalization since. He defines personalization as a “product that can use information given by the user to provide a tailored, more individualized experience.”
      Query refinement: the system adds terms based on past information searches; computes similarity between the query and the user model; synonym replacement; dynamic query suggestions displayed as the searcher enters the query.
      Results re-ranking: sorted by user model; sorted by seen/not seen.
      Personalization of the results set calculates information from three sources. User: previous search patterns. Domain: countries, cultures, personalities. Geo-personalization: location-based results.
      Metrics used for probability modeling of future searches. Active: user actions in time. Passive: user toolbar information (bookmarks), desktop information (files), IP location, cookies.
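The two re-ranking steps above (sort by user model, then by seen/not seen) can be sketched as follows; the user model, result list, and scoring are illustrative assumptions, not Kaltix’s or Google’s actual method.

```python
# Toy personalized re-ranking: unseen results come first, then results
# are ordered by term overlap with the user's past-search terms.
def rerank(results, user_terms, seen):
    def key(item):
        overlap = len(user_terms & set(item["title"].lower().split()))
        return (item["url"] not in seen, overlap)  # unseen first, then overlap
    return sorted(results, key=key, reverse=True)

results = [
    {"url": "a.com", "title": "python tutorial"},
    {"url": "b.com", "title": "snake care guide"},
    {"url": "c.com", "title": "python snake facts"},
]
user_model = {"python", "programming"}
ordered = rerank(results, user_model, seen={"a.com"})
```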
  19. Implicit collection tools: software agents, enhanced proxy servers, cookies, session IDs. Data is gathered from behavior without user awareness; query context and profile are inferred; less accurate; requires a lot of data. Maximum precision: 58%. Advantages: more data, better data (easier for the system to consume and rationalize). Disadvantage: the user has no control over what is collected.
      Explicit collection tools: HTML forms, explicit user-feedback interaction (early Google personalization with “More Like This”). Provided by the user knowingly; more accurate, as the user shares more about query intent and interests. Maximum precision: 63%. Advantage: the user has more control over personal and private information. Disadvantages: compliance, users have a hard time expressing interests, filling out forms is burdensome, false information from users.
      Resource: Jaime Teevan, Microsoft Research (http://courses.ischool.berkeley.edu/i141/f07/lectures/teevan_personalization.pdf)
  20. http://uk.reputation.com/wp-content/uploads/2015/03/TheEvolutionofGoogleSearchResultsPagesandTheirEffectonUserBehaviour.pdf Google eye-tracking study: the golden triangle is now vertical (Knowledge Graph, local, mobile); users look at more results in less time, spend less than 2 seconds viewing individual search results, and positions 2–4 are getting more click activity.
  21. Resource: Pew Internet Trust study of search engine behavior http://www.pewinternet.org/Reports/2012/Search-Engine-Use-2012/Summary-of-findings.aspx
  22. How to search: 56% constructed poor queries; 55% selected irrelevant results one or more times. Getting lost in data: 33% had difficulty navigating/orienting within search results; 28% had difficulty maintaining orientation on a website. Discernment: 36% did not go beyond the first 3 search results; 91% did not go beyond the first page of search results. Resource: Using the Internet: Skill Related Problems in Users’ Online Behavior; van Deursen & van Dijk; 2009
  23. http://www.wsj.com/articles/SB11064341213388534269604581077241146083956 Harvard Business School/Yelp data team study. The authors focused on searches for local services such as restaurants or hotels, the largest single category of search requests. They randomly displayed one of two sets of search-result screenshots to more than 2,500 Internet users. One set of users saw a page reflecting results currently displayed by Google, while the other set saw a page that ranked third-party review sites such as Yelp and TripAdvisor based on their relevance, using Google’s own algorithm. The survey found that 32% of users would click on Google’s current local results, while 47% clicked on the alternative merit-based results. That nearly 50% increase in the click rate is “immense in the modern Web industry,” the authors wrote.
  24. http://www.slate.com/blogs/future_tense/2015/06/30/google_s_image_recognition_software_returns_some_surprisingly_racist_results.html Google image-recognition software mistakenly labeled a user-uploaded image of a black woman as a gorilla.
  26. Selection: do they pick you from the results? Engagement: do they do anything once they get to your page that would indicate it is relevant to their query (information need)? Content: is the content of high quality? Links (baked-in legacy relevance): are they contextually relevant? From authority resources? Earned, not purchased?
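One way to make the four signal families concrete is a weighted blend; the weights, caps, and normalization below are entirely hypothetical and exist only to show how such signals could combine into a single score.

```python
# Hypothetical blend of selection (CTR), engagement (dwell time),
# content quality, and link authority into a 0..1 relevance score.
WEIGHTS = {"selection": 0.3, "engagement": 0.3, "content": 0.25, "links": 0.15}

def relevance_score(ctr, dwell_seconds, content_score, link_score):
    signals = {
        "selection": min(ctr / 0.2, 1.0),            # cap CTR at 20%
        "engagement": min(dwell_seconds / 120.0, 1.0),  # cap dwell at 2 min
        "content": content_score,                    # already 0..1
        "links": link_score,                         # already 0..1
    }
    return sum(WEIGHTS[k] * v for k, v in signals.items())
```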
  27. Missouri S&T study, Eyes Don’t Lie: Understanding Users’ First Impressions on Website Design Using Eye Tracking http://scholarsmine.mst.edu/cgi/viewcontent.cgi?article=6127&context=masters_theses Cognitive bias: the presence of a positive first impression may negate negative issues encountered later. Selective attention: the mechanism that turns looking into seeing; information is selectively processed, prioritizing some aspects while ignoring others by focusing on a certain location or aspect of the visual scene. Visual hierarchy: the main menu and body text win out over the bottom of the page.
  28. VISUAL COMPLEXITY & PROTOTYPICALITY The results show that both visual complexity and prototypicality play crucial roles in the process of forming an aesthetic judgment. It happens within incredibly short timeframes, between 17 and 50 milliseconds. By comparison, the average blink of an eye takes 100 to 400 milliseconds. In other words, users strongly prefer website designs that look both simple (low complexity) and familiar (high prototypicality). That means if you’re designing a website, you’ll want to consider both factors. Designs that contradict what users typically expect of a website may hurt users’ first impressions and damage their expectations. August 2012. Resource: http://googleresearch.blogspot.com/2012/08/users-love-simple-and-familiar-designs.html
  31. This is an actual notification from a real Google Webmaster account. The algorithms have determined that the content quality on this site is low. You do not want to get one of these, because by the time you get it, you’ve already dropped a few PAGES in search results.
  33. This client invests a lot of time and effort in their News & Events directory. Customers, however, are viewing the utility pages (Contact, etc.) and the product justification/ROI section.
  34. “As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content ‘above-the-fold’ can be affected by this change.” http://googlewebmastercentral.blogspot.com/2012/01/page-layout-algorithm-improvement.html
      If you’ll recall, this is the Google update that specifically looks at how much content a page has above the fold. The idea is that you don’t want your site’s content to be pushed down or dwarfed by ads and other non-content material. “If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.” http://www.webpronews.com/google-updated-the-page-layout-algorithm-last-week-2014-02
      Resources:
      http://searchenginewatch.com/article/2328573/Google-Refreshes-Page-Layout-Algorithm
      http://www.seobythesea.com/2011/12/10-most-important-seo-patents-part-3-classifying-web-blocks-with-linguistic-features/
      http://www.seobythesea.com/2008/03/the-importance-of-page-layout-in-seo/
      http://searchenginewatch.com/article/2140407/Googles-New-Page-Layout-Update-Targets-Sites-With-Too-Many-Ads
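The above-the-fold idea can be made concrete with a toy layout check; the fold height, block geometry, and the notion of scoring by pixel fraction are assumptions for illustration, not the actual page-layout algorithm.

```python
# Toy page-layout check: what fraction of the first screenful is ads?
FOLD_HEIGHT = 600  # px, assumed viewport height

def ad_fraction_above_fold(blocks):
    """blocks: list of (kind, top, height) in px from the page top."""
    ad_px = content_px = 0
    for kind, top, height in blocks:
        # Portion of this block visible above the fold.
        visible = max(0, min(top + height, FOLD_HEIGHT) - max(top, 0))
        if kind == "ad":
            ad_px += visible
        else:
            content_px += visible
    total = ad_px + content_px
    return ad_px / total if total else 0.0

page = [("ad", 0, 300), ("content", 300, 200), ("ad", 500, 400)]
frac = ad_fraction_above_fold(page)
```

A page like this hypothetical one, with two-thirds of its first screenful in ads, is exactly the pattern the quoted update says “may not rank as highly going forward.”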
  36. This is what users really think, or don’t, when looking for information.
  39. Soft Systems Methodology (SSM). SSM grew out of the failure of systems engineering, excellent in technically defined problem situations, to cope with the complexities of human affairs, including management situations. As systems engineering failed, practitioners were naturally interested in discovering what kind of approach could cope with the problems of managing. SSM is the logic-based (engineering) stream incorporating cultural and political streams to make judgments between conflicting interests, by setting up criteria for what is significant and how to judge it. Model the purposeful activity of the users to define the transformations to take place. Resource: Soft Systems Methodology: A Thirty Year Retrospective; Peter Checkland (2000)
  42. IR systems have built-in functional complexity to accommodate multiple aggregators and actors that are opaque to users (a black box). The information system has the role of supplying knowledge and is not always the sole supplier of output; the role of knowledge is the support of specific action. Information system/soft system: information as socially constructed. Information engineering: information as a concrete phenomenon. Problem solving encompasses system, cultural, and strategic concerns. SSM incorporates systems learning and experiential learning and applies them to problem solving.
  44. Discover client worldviews, environment, and cultural and political influences. Design for people, not technology. Reveal interacting systems. Define users’ purposeful activities (what problems are we trying to solve?).
  47. Navigation dominance
  48. A true “home” page: start here and navigate to where you want to be.
  49. Broadband comes into its own, and BIG pictures make a splash.
  50. Why Google started moving away from link-based relevance
  60. The interacting systems within organizations (the client’s and ours)
  61. Answered for us and the client: would this become the first deliverable after signing? Would it precipitate the client questionnaire? Outcomes: roadmap.
  62. Users’ purposeful activities (what problems are we trying to solve?)
  63. Infrastructure model: workflow process, resources. Outcomes: use-case scenarios, technical spec document.
      Business model: Why: existing brand and messaging framework. Who: customer data. Where: competitive landscape review. What: SWOT.
      UX/IA model: Outcomes: UX/CRO audit of the existing site, site structure map. Exercise: I Like, I Want, I Wish.
      Design model: review the style guide; discuss design inspirations and why. Outcomes: creative brief, page layout sketches. Exercise: sketch an idealized homepage.
  68. Because great architecture is not enough. This is Fallingwater by Frank Lloyd Wright. The joke is that it lives up to its name, with cantilevered terraces (projections that extend well beyond their vertical support). A structural engineer found that “after more than 60 years, Fallingwater was still moving.” One side of the living room terrace, he reported, had sagged almost seven inches. http://www.wsj.com/articles/SB106134872871015900 Sounds like it worked for Wright more than for the client. All the engineers that have gone in there have said, “Yikes.” The house went through a multimillion-dollar shoring up, which had to be hidden because the building is so beautiful. Beautiful that doesn’t work is vapid and annoying.
  69. Because we are ALL user experience professionals now
