Watching websites
Presentation on monitoring the web, including synthetic, UEUM, web analytics, interaction analysis. Given at www.meshconference.com/meshu on May 20, 2008

Transcript

  • 1. Watching websites
  • 2. What we’re going to cover
    • Why watch your website?
    • How to do it (well & affordably)
  • 3. STARTUP 101
  • 4. (Flowchart) New idea → Execution → Feedback → Success? → End; otherwise Money left? decides whether to loop or End. Risk hangs over the cycle.
  • 5. Feedback, the old way.
  • 6. Are people doing what we want?
  • 7. Are we doing dumb stuff?
  • 8. Do we understand our users? http://flickr.com/photos/ikhlasulamal/2443194039/
  • 9. Is it easy and intuitive? http://flickr.com/photos/jmecelab/2323995433/
  • 10. Slow, unmeasured trial & error.
  • 11. The Internet lets us make mistakes faster.
  • 12. The weird part: Mistakes are good
    • Most startups don’t succeed doing what they set out to do
      • Amazon was just a bookstore
      • eBay sold Pez
      • F5 made hurricane modeling software
    • But mistake speed is critical
  • 13. (Flowchart) The same cycle, repeated: each time Success? fails and there is Money left?, a New idea goes back through Execution and Feedback — until either success or the money runs out (End).
  • 14. We do this by watching the web.
  • 15. First: What business are you in?
  • 16. Media
  • 17. Transactions
  • 18. Collaboration
  • 19. Applications
  • 20. Then: Know what we want to happen
  • 21. Users do what we wanted
    • Enrolment: They sign up
    • Purchases: They buy stuff
    • Invitations: They tell their friends
    • Stickiness: They stay for longer
    • Loyalty: They come back
    • Contribution: They add content
    • Renewal: They buy another subscription
    • Upselling: They increase their payments
  • 22. The app is fast & reliable
    • Uptime: It’s reachable and reliable
    • Latency: It responds to usage fast
    • SLAs: Its reliability meets contractual goals
    • Correctness: It functions as intended
    • Well maintained: Problems are found & fixed
  • 23. We understand our visitors
    • Intentions: We know what they want to do
    • Motivations: We know why they came to us
  • 24. The app is easy to use
    • Easy: Easy to learn, fast to use
    • Clear: Less confusion
    • Intuitive: Right stuff in the right place
  • 25. Our victim…
  • 26. Our eyes… (Diagram: the browser, data center, and server, watched by a synthetic tester, analytics receiver, proxy, survey site, client-side interpreter, and passive capture)
  • 27. The four big questions What did they do? Could they do it? Why did they do it? How did they do it?
  • 28. The four big questions Web analytics What did they do? Could they do it? Why did they do it? How did they do it?
  • 29. What matters in analytics — Acquisition: Where did they come from? What attracts them best? Usage: Did they do what we wanted? Where did they drop out? Referral: What appealed most? Where did we send them?
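The Usage questions (did they do what we wanted, and where did they drop out?) are usually answered with a funnel report. A minimal sketch in Python — the step names and counts are made up for illustration:

```python
def funnel_report(step_counts):
    """For each funnel step, report the visitor count and the fraction
    retained from the previous step (1.0 for the first step)."""
    report = []
    previous = None
    for step, count in step_counts:
        retained = count / previous if previous else 1.0
        report.append((step, count, retained))
        previous = count
    return report

# Hypothetical sign-up funnel:
steps = [("landing page", 1000), ("sign-up form", 400), ("enrolled", 120)]
```

Here `funnel_report(steps)` shows the landing page loses 60% of arrivals before the form, and the form loses another 70% — the two drop-out points worth fixing first.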
  • 30. Data center Browser IT Analytics receiver Server
  • 31. Data center Browser Client-side interpreter Server Analytics receiver
  • 43. The web analytics maturity model (adapted from Stephane Hamel and Bill Gassman):
    • Level 1: Table stakes (IT-driven, “feel good” information, few decisions) — page views, visits, visitors; top ten lists; demographics; technographics; top entry/exit pages
    • Level 2: Fix the site (business driven, working on metrics, accuracy and process) — performance; capacity; security; path analysis; funnel reports; A/B testing
    • Level 3: Improve traffic (optimize the channel) — KPIs; dashboards; merchandising; segmentation; SEO; campaign optimization; personas
    • Level 4: Complete view (330° view of customer, 30° privacy) — KPI alerts; multichannel aggregation; cost-shifting analysis; lifetime value; personalization; analytics-based content serving
    • Level 5: MBA (strategic web business) — process analysis; multichannel sales reporting; activity-based costing; balanced scorecards; strategic planning; predictive analytics; integrated user experience
  • 44. Analytics pros & cons
    • Pros
    • Trivial to implement with <script> tags
    • Cheap to get started
    • Absolutely essential
    • Cons
    • Requires connection to the Internet (for hosted ones)
    • Can slow page load
    • Tagging pages with content data is time-consuming
    • May incite privacy riots
    • Won’t work when not interpreted
      • RSS feeds
      • XML data
      • Mobile devices
      • Clicked through before loading
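Page-tag analytics works by having the `<script>` tag request a tiny beacon with data packed into the query string, which the collector then parses. A sketch of the receiving side; the parameter names (`utm_*`, `page`) follow common campaign-tagging conventions and are illustrative, not any particular vendor's API:

```python
from urllib.parse import urlparse, parse_qs

def parse_beacon(beacon_url):
    """Extract the page and campaign fields from a page-tag beacon request.
    The utm_* names mirror widespread campaign-tagging conventions; other
    parameters (cache-busters etc.) are ignored."""
    query = parse_qs(urlparse(beacon_url).query)
    wanted = ("page", "utm_source", "utm_medium", "utm_campaign")
    return {k: v[0] for k, v in query.items() if k in wanted}
```

A hit like `/b?page=%2Fpricing&utm_source=newsletter&utm_campaign=may08` is enough to credit the visit to the right campaign in the acquisition reports.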
  • 45. The four big questions Web analytics User Experience Management What did they do? Could they do it? Why did they do it? How did they do it?
  • 46. What matters in UEM? — Reachability: Could they get to the site? From everywhere? What regions were worst? Reliability: What was their experience like? Did the app break? Latency: Was it fast enough? What things were slowest?
  • 47. The trivial web transaction (browser ↔ server):
    • TCP SYN (“let’s talk”) → TCP SYN ACK (“Agreed: let’s talk”) → TCP ACK (“OK, we’re talking”) — bump, bump
    • SSL (“Someone might be listening!”) → SSL (“Here’s a decoder ring”)
    • HTTP GET / (“Can I have your home page?”) — server thinks a bit
    • HTTP 200 OK (“Sure!”), [index.html] (“Here it is!”), [img js css] (“Have this too!”) — browser renders furiously
    • TCP FIN (“Thanks! I’m done now.”) → TCP FIN ACK (“You’re welcome. Have a nice day.”)
  • 48. What could possibly go wrong?
    • Performance
    • DNS: Find the site
    • IP: Route the packets
    • TCP: Establish a connection
    • CDN: Redirect/overlay the traffic
    • SSL: Negotiate encryption
    • HTTP: Request content
    • Host delay: Generate the content
    • Network delay: Deliver the content
    • Packet loss: Recover from errors
    • Browser: Parse the page, get more content
    • RIAs: Execute code on the client
    • Browser, OS: Render the page
    • Availability
    • Client: Bad requests
    • Server: Capacity, availability, OS
    • Network: Packet loss, TCP timeouts, traffic shaping
    • Application: Logic, dependencies
    • Navigation: Looping, session timeouts
    • Content: Broken, unrenderable
    • RIA: Plugin, memory, OS
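Most of the performance failure points above are phases of a single request, so they can be timed separately. A sketch with raw sockets, assuming a plain-HTTP target (SSL negotiation, CDN redirection, and rendering are omitted; the function and field names are ours):

```python
import socket
import time

def time_request(host, port=80, path="/"):
    """Break one HTTP request into the phases where delay can hide:
    DNS lookup, TCP handshake, and time-to-first-byte."""
    timings = {}
    t0 = time.monotonic()
    ip = socket.gethostbyname(host)               # DNS: find the site
    timings["dns"] = time.monotonic() - t0

    t1 = time.monotonic()
    sock = socket.create_connection((ip, port), timeout=10)  # TCP: SYN/SYN-ACK/ACK
    timings["tcp_connect"] = time.monotonic() - t1

    t2 = time.monotonic()
    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())                # HTTP: request content
    first_byte = sock.recv(1)                     # host delay + network delay
    timings["ttfb"] = time.monotonic() - t2

    body = first_byte
    while chunk := sock.recv(65536):              # network delay: deliver the rest
        body += chunk
    sock.close()
    return timings, body
```

A jump in `dns` versus `ttfb` points at very different culprits — the resolver versus the application tier — which is exactly why the phase breakdown matters.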
  • 49. 2 complementary technologies — synthetic testing of key functions from around the Internet (“Was it working?”) and user monitoring of every transaction (“Was it broken?”)
  • 50. Synthetic testing Synthetic testing of key functions from around the Internet User monitoring of every transaction Was it working ? Was it broken ?
  • 51. Data center Browser Synthetic tester Synthetic tester … Server Synthetic tester
  • 57. http://www.gomez.com/info_center/instant_test.php
  • 61. Synthetic pros & cons
    • Pros
    • Easy to set up
    • Only way to test without traffic
    • Can compare to competitors
    • Easy baseline establishment
    • Cons
    • Brittle
    • Detects macro outages, not user events
    • Good geographic & network coverage costs money, generates load
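At its core, a synthetic tester is just a scheduled fetch of a key URL with a verdict on availability and speed. A minimal sketch, assuming a single test location and an invented latency budget:

```python
import time
import urllib.request
import urllib.error

def synthetic_check(url, latency_budget=2.0):
    """One synthetic test: fetch a key URL and judge whether the site is
    up and whether it answered within the (assumed) latency budget."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            up = resp.status == 200
    except (urllib.error.URLError, OSError):
        up = False
    latency = time.monotonic() - start
    return {"up": up, "latency": latency,
            "within_sla": up and latency <= latency_budget}
```

Run it every few minutes from several networks and the "brittle" and "coverage costs money" cons above become obvious: every location is another machine making this call.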
  • 62. User experience monitoring Synthetic testing of key functions from around the Internet User monitoring of every transaction Was it working? Was it broken?
  • 63. Data center Browser Server Passive capture
  • 71. EUEM pros & cons
    • Pros
    • Maximum visibility for all protocols
    • No load on client
    • No extra network traffic
    • Forensics on hand
    • Cons
    • Expensive!
    • Requires physical access *
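The cheap end of EUEM is parsing the server's own logs. A sketch that summarizes error rates and slow pages; it assumes the Apache "combined" format with the response time in microseconds (the `%D` directive) appended — that extra field is our assumption, not a default:

```python
import re
from collections import Counter

# Combined log format plus a trailing response-time-in-microseconds field.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "[^"]*" (?P<micros>\d+)'
)

def summarize(lines):
    """Passive-style monitoring from logs: count status codes, and count
    hits per URL that took longer than one second."""
    statuses, slow = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue                              # skip lines in other formats
        statuses[m.group("status")] += 1
        if int(m.group("micros")) > 1_000_000:    # slower than one second
            slow[m.group("path")] += 1
    return statuses, slow
```

This is the "log parsers" entry from the cheat sheet in miniature: no client load, no extra traffic, but you only see what the server chose to record.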
  • 72. The four big questions Web analytics Voice of the Customer User Experience Management What did they do? Could they do it? Why did they do it? How did they do it?
  • 74. What matters in VoC? — Motivation: Why did they visit? Success: Did they accomplish it? Reasons: Why or why not?
  • 75. Data center Browser Client-side interpreter Server Random selection Survey site
  • 78. VoC pros & cons
    • Pros
    • Only way to know what they really wanted
    • Good qualitative data complements quantitative
    • Subjective insights
    • Cons
    • Depends on survey “goodwill” (1-5% of respondents)
    • Popunder techniques needed
    • Privacy and personal data collection concerns
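The "random selection" box in the diagram above only needs to invite a small slice of visitors (the 1–5% who'll answer anyway). A sketch of one way to do it, hashing the visitor ID so the same visitor always gets the same decision and is never re-prompted; the function name and sample rate are ours:

```python
import hashlib

def should_invite(visitor_id, sample_rate=0.02):
    """Deterministic sampling: hash the visitor ID into [0, 1) and invite
    only the IDs that land below the sample rate."""
    digest = hashlib.sha256(visitor_id.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    return bucket < sample_rate
```

Hash-based selection beats a coin flip here precisely because it is repeatable: the pop-under fires at most once per visitor, which softens both the goodwill and the privacy concerns above.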
  • 79. The four big questions Web analytics Voice of the Customer User Experience Management Web Interaction Analytics What did they do? Could they do it? Why did they do it? How did they do it?
  • 80. What matters in WIA? — Navigation: Where did they go? Interaction: How did they use the pages? Usability: What did they do wrong?
  • 81. Data center Browser Operator display The stage Mouse/key capture Server Analytics receiver
  • 85. So how do I see what the user saw?
  • 86. Data center Browser Display The stage Sample sessions’ stored pages Mouse/key & page capture Server Analytics receiver
  • 90. How do I reduce client burden?
  • 91. Data center Browser Display The stage All sessions stored pages Mouse/key capture Server Passive capture
  • 93. WIA pros & cons
    • Pros
    • Realtime usability analysis
    • Easy(ish) to implement
    • Great for A/B testing of layouts
      • Lousy for long-term dynamic sites
    • Cons
    • Doing it right requires dedicated equipment
    • Using the client to record the stage is error-prone
    • SSL and secure content concerns
    • Embedded elements (Flash) mislead the recorder
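Behind a click heatmap, the mouse/key capture above boils down to binning recorded coordinates into a coarse grid. A sketch under our own naming, with an assumed 50-pixel cell size:

```python
from collections import Counter

def click_heatmap(clicks, cell=50):
    """Roll up recorded (x, y) click coordinates into cell-by-cell pixel
    squares -- the aggregation behind a click heatmap overlay."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid
```

The grid is what makes the data comparable across visitors with different screens, and it is also why long-term dynamic sites are lousy subjects: if the layout moves, yesterday's cells no longer mean anything.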
  • 94. (Just one more) Proxy communications
  • 95. Browser Data center Proxy Client-side interpreter Server
  • 100. Proxy pros & cons
    • Pros
    • Easy way to aggregate several things
    • If proxy is a search engine, may inform SEO
    • Cons
    • No longer master of your own domain
    • May introduce downtime or delay you’re not aware of
  • 101. Recap: Four big questions What did they do? Could they do it? Why did they do it? How did they do it?
  • 103. NO SITE IS AN ISLAND
  • 106. NON-HTML COMPONENTS
  • 108. DYNAMIC PAGE NAMES
  • 110. POP-UPS AND SITE DESIGN
  • 113. PERFORMANCE WITH LOAD IN MIND
  • 115. PRIVATE SITES
  • 116. Privacy limits tools Data center Browser Synthetic tester Analytics receiver Proxy Survey site Server Private agents Passive capture OS agents
  • 117. CLOUD COMPUTING PLATFORMS
  • 118. Cloud limits server access Data center Browser Synthetic tester Analytics receiver Proxy Survey site Server Passive capture
  • 119. AVERAGES LIE
  • 120. The 80th percentile spikes only once, for a legitimate slow-down (20% of users affected). The average varies wildly, making it hard to set a threshold properly or to see a real slow-down. Setting a useful threshold on percentiles gives fewer false positives and more real alerts.
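The slide's point in numbers — a sketch using the nearest-rank percentile and made-up latencies, showing how a few outliers drag the average while the 80th percentile stays honest:

```python
def percentile(values, pct):
    """Nearest-rank percentile: the value at or below which pct percent
    of the sorted samples fall."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# 95% of users see 0.2 s pages; 5% hit a pathological 30 s.
latencies = [0.2] * 95 + [30.0] * 5
mean = sum(latencies) / len(latencies)   # ~1.69 s: alarming, but misleading
p80 = percentile(latencies, 80)          # 0.2 s: most users are fine
```

Alert on the mean and this dataset pages someone at 3 a.m.; alert on the 80th percentile and it stays quiet until a slow-down actually reaches a fifth of the users.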
  • 121. GETTING WHAT YOU PAY FOR
  • 123. STREAMING (COMET/BAYEUX, ADOBE)
  • 124. How realtime web protocols work (browser ↔ COMET server; API in framework):
    • Browser: “Here’s a channel to send me updates” → HTTP 200 OK (“Sure!”)
    • Browser: Subscribe to CSCO → HTTP 200 OK (“Added!”); server pushes CSCO: $21 … CSCO: $23
    • Browser: Subscribe to GOOG → server pushes CSCO: $23, GOOG: $450
    • Browser: Remove CSCO → HTTP 200 OK (“Removed!”); browser: “Got it”
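The subscribe/push pattern above can be modeled in a few lines. A toy sketch (our own class, not any Comet/Bayeux library): the browser's held-open request is the blocking `poll`, and the server pushes updates only for symbols the client subscribed to:

```python
import queue

class CometChannel:
    """Toy model of a Comet-style channel: the client holds a request
    open (a blocking get) while the server pushes subscribed updates."""

    def __init__(self):
        self.updates = queue.Queue()
        self.subscriptions = set()

    def subscribe(self, symbol):
        self.subscriptions.add(symbol)

    def unsubscribe(self, symbol):
        self.subscriptions.discard(symbol)

    def publish(self, symbol, price):
        # The server side: only subscribed symbols reach the channel.
        if symbol in self.subscriptions:
            self.updates.put((symbol, price))

    def poll(self, timeout=1.0):
        # The client side: what the held-open HTTP response would deliver.
        return self.updates.get(timeout=timeout)
```

Monitoring-wise this is the rub: there is no page view per update, so a synthetic GET or a page tag sees one long request where the user saw dozens of price ticks.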
  • 125. WATCHING BECOMES THE PROBLEM
  • 126. Connections to load Bitcurrent
    • Connection 0 - www.bitcurrent.com (67.205.65.12)
    • Connection 1 - www.bitcurrent.com (67.205.65.12)
    • Connection 2 - 4qinvite.4q.iperceptions.com (64.18.71.70)
    • Connection 3 - static.slideshare.net (66.114.49.24)
    • Connection 4 - static.slideshare.net (66.114.49.24)
    • Connection 5 - www.feedburner.com (66.150.96.123)
    • Connection 6 - static.getclicky.com (204.13.8.18)
    • Connection 7 - cetrk.com (208.67.183.100)
    • Connection 8 - in.getclicky.com (204.13.8.18)
    • Connection 9 - crazyegg.com (208.67.180.236)
    • Connection 10 - www.google-analytics.com (72.14.223.147)
    • Connection 11 - www.apture.com (67.192.46.19)
    • Connection 12 - static.apture.com (67.192.46.25)
    • Connection 13 - s.clicktale.net (66.114.49.24)
    • Connection 14 - www.clicktale.net (75.125.82.70)
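Every connection above is a monitoring tag making its own request. A quick way to audit that cost from a page's resource URLs — the helper name is ours, and the hostnames in the usage are drawn from the list above:

```python
from urllib.parse import urlparse

def third_party_hosts(own_domain, resource_urls):
    """List the distinct hosts a page pulls resources from that aren't
    part of your own domain -- each is a tag that can slow the page."""
    hosts = {urlparse(u).netloc for u in resource_urls}
    return sorted(h for h in hosts
                  if h != own_domain and not h.endswith("." + own_domain))
```

Fed the fifteen connections above, it would report that only two belong to bitcurrent.com — the other thirteen are analytics, survey, and replay tags, which is how watching becomes the problem.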
  • 127. Leftovers (other places to watch)
  • 134. Cheat sheet
    • What they did
      • Free: Google Analytics, Clicky
      • Premium: Omniture, Mint
    • Could they do it (Synth)?
      • Free: Webperform, Alertsite
      • Premium: Gomez, Keynote, Webmetrics
    • Could they do it (EUEM)?
      • Free: Log parsers, Wireshark/Ethereal
      • Premium: Coradiant, Tealeaf, CA, HP
    • Why did they do it?
      • Wufoo or Google Forms with Javascript
      • iPerceptions, Opinionlabs
    • How did they do it?
      • Free: Crazyegg, Clicktale, Robotreplay, Tapefailure, GA overlay view
      • Premium: Tealeaf, Clicktale Pro
    • Extra credit
      • Feedburner for RSS
      • Google Alerts, Serph for content monitoring
      • Stake out: Facebook, Drop.io, Twitter
      • Site cred: Compete.com, Technorati, Pagerank
  • 135. Join the conversation
  • 136. Questions? (alistair at bitcurrent.com)