
Government Web Analytics

Great presentation by SSA's Tim Evans on Government Web Analytics

Transcript

  • 1. Web Analytics in Government
    Tim Evans, Social Security Administration
    Co-Chair, Federal Metrics Sub-Council; Chair, Web Analytics Association Public Sector SIG
  • 2. Setting Expectations
    ◦ Some overall, introductory, Dutch-Uncle stuff
    ◦ Nuts and bolts of collecting web analytics
    ◦ Why Analytics is so hard in Government
    ◦ Building a culture of Analytics
    ◦ Web Analytics Resources
  • 3. Why Web Analytics?
    ◦ Better Web sites based on data about site visitors
    ◦ Enable site decisions based on analysis of empirical visitor data, not Highly Paid Persons’ opinions
    ◦ Identify and fix site problems
    ◦ Help site users succeed
  • 4. Web Analytics Data
    ◦ Visitor Behavior: what they do on your website (often called “clickstream”)
    ◦ Visitor Outcomes: how successful they are
    ◦ Visitor Experience: how happy they are about it
    ◦ Quality/Integrity data: broken/missing links, SEO, etc.
    ◦ Social Media metrics and “buzz”
  • 5. Web Analytics Data Is Aggregate Data
    ◦ By definition, our data is about all visitors, or interesting segments of visitors, not individuals
    ◦ OMB policies emphasize this
    ◦ No place for PII here
    ◦ Tools that track individual behavior should be limited to a customer-service context (that is, casework)
  • 6. How Not to Track Success
    ◦ Vocabulary alert: banish “hits”
    ◦ Resist requests for bragging-rights numbers
  • 7. Every Site Is Different; So Are Its Goals; So Are Its Key Metrics
    ◦ Every site has a purpose; its goals should be identified before starting an Analytics project
    ◦ Key metrics for your site must be based on your site’s goals
    ◦ One-size-fits-all data reporting can’t possibly meet your project’s needs
    ◦ You must find the one, true set of metrics for your site
  • 8. Tools: Just 10% of the Job
    ◦ Plenty of tools will collect masses of analytics data on a site
    ◦ Highly competitive market; most have the same general range of capabilities
    ◦ Some very expensive, some free
    ◦ Regardless of choice, none will meet your needs out of the box
    ◦ Your work is required in implementing the tools, then analyzing the data
  • 9. Behavioral Data
    ◦ What they do on your website
      - What pages, how many?
      - Where did they enter, come from?
      - How long did they stay?
      - What did they search for?
    ◦ What did we learn about them?
      - Top Tasks
      - Bounce rates
      - Where did they quit?
    ◦ What’s relevant to your site’s unique goals?
  • 10. Sources of Behavioral Data
    ◦ Web server log files
    ◦ Specialized logs created by JavaScript “page tags” embedded in web pages
    ◦ Passive, on-wire network sniffers that log web traffic
    ◦ Hybrids combining two or more of the above
  • 11. Web Server Log Files
    ◦ Traditional source of data, kept by the web server itself as it serves pages
    ◦ Logs all activity with date stamps, IP address, resource (page) accessed, time-to-serve, and much else
    ◦ Web Analytics tools parse the logs to create reports
    ◦ Major shortcomings; some pros
  • 12. Web Server Logs: Cons
    ◦ Record all site activity, including that of spiders, bots, door-knob-rattlers, and other non-humans (overcount)
    ◦ Do not record site activity served from ISP caching servers (undercount)
    ◦ Dependent on central IT for setup and support, which may also control the Analytics tool
  • 13. Web Server Logs: Pros
    ◦ Some technical data (e.g., error messages, bandwidth use) collected that is not in page-tag data
    ◦ Especially good for capacity-planning functions (costly Analytics tools may be overkill for this purpose)
    ◦ On-site search terms automatically logged
    ◦ Logs can be re-analyzed after the fact (see the parsing sketch below)
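Because server logs are plain text that can be re-run at will, even a small script can produce a basic report. Below is a minimal, hypothetical sketch (not from the presentation; the access.log file name and the Common Log Format pattern are assumptions) that tallies successful page views per URL with Node.js:

    // Hypothetical sketch: count page views per URL from an Apache
    // Common Log Format file.
    const fs = require('fs');

    // CLF: host ident authuser [date] "METHOD /path HTTP/x.x" status bytes
    const LINE = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)/;

    const counts = {};
    for (const line of fs.readFileSync('access.log', 'utf8').split('\n')) {
      const m = LINE.exec(line);
      if (!m) continue; // skip malformed lines
      const [, , , method, path, status] = m;
      if (method !== 'GET' || status !== '200') continue; // successful views only
      counts[path] = (counts[path] || 0) + 1;
    }

    // Print the ten most-requested resources
    Object.entries(counts)
      .sort((a, b) => b[1] - a[1])
      .slice(0, 10)
      .forEach(([path, n]) => console.log(`${n}\t${path}`));

Note that this counts requests, not humans: as slide 12 warns, spiders and bots are in here too.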
  • 14. Page Tags: Introduction
    ◦ Bits of JavaScript (usually) embedded in web pages
    ◦ Code executed by visitors’ browsers; the web server is not involved
    ◦ Browsers “phone home” to a site data collector (local or vendor-hosted), exchanging session data
    ◦ Data collectors log the session info; Analytics tools run against this data
  • 15. A WebTrends page tag:

    <script src="/includes/wtinit.js" type="text/javascript"></script>
    <script src="/includes/wtbase.js" type="text/javascript"></script>
    <script type="text/javascript">
    //<![CDATA[
    var _tag = new WebTrends();
    //]]>
    </script>
    <script type="text/javascript">
    //<![CDATA[
    // Add custom parameters here.
    //_tag.DCSext.param_name=param_value;
    _tag.dcsCollect();
    //]]>
    </script>
    <noscript>
      <div><img alt="DCSIMG" id="DCSIMG" width="1" height="1"
        src="https://stats.ssa.gov/dcs5w0txb10000wocrvqy1nqm_6n1p/njs.gif?dcsuri=/nojavascript&amp;WT.js=No&amp;DCS.dcscfg=1&amp;WT.tv=8.6.2"/></div>
    </noscript>

  • 16. Page Tags: Pros
    ◦ Easy to implement (theoretically)
    ◦ Controlled, and configurable, by business users, not IT
    ◦ Data collected in real time, immediately accessible for analysis
    ◦ Ignores spiders/bots/non-humans (they don’t execute JavaScript)
    ◦ Busts ISP caches: every page view triggers the tag, regardless of its source
    ◦ All vendor innovation in Analytics is here
  • 17. Page Tags: Cons
    ◦ Tag must be in every page
    ◦ Increases page size (+/- 200KB)
    ◦ ~2% of users disable JavaScript
    ◦ Capacity-planning data not collected
    ◦ Since data is collected in real time, botched/missing tags collect no or incomplete data and cannot be re-played
    ◦ Tag changes may render prior data invalid
    ◦ Mixing vendor tags may or may not create problems
  • 18. Bold Moves Me: Log Files vs. Page Tags

                              Log Files   Page Tags
    Spiders/Bots/Non-Human    Yes         No
    Busts ISP Caches          No          Yes
    Tech Data/Error Msgs      Yes         No
    IT Support Req’d          Yes         Not if I have anything to say about it
    Search Terms              Yes         Maybe
    Re-run Data               Yes         No
    Who Controls              IT          You Do
    Real-Time Collection      No          Yes
    Touch Every Page          No          Your Job
    Increase Page Size        No          Yes

  • 19. Various Workarounds
    ◦ Most vendors support hybrid data collection (logs + page-tag data) and can merge them
    ◦ GA is the exception; it requires a local Urchin install
    ◦ Sniffer appliances capture/log web traffic by on-wire packet inspection (no tags may be involved)
    ◦ On-the-fly insertion of the page tag at the site exit point to ensure consistency
    ◦ Some movement toward “universal” tags, partly a result of concern about multiple tags
  • 20. Run-Through: Behavioral Metrics
    ◦ Always based on your site’s goals
    ◦ Basics (a sketch deriving several of these follows below):
      - Visits/Visitors
      - Page Views
      - Referrers
      - Search terms
      - Entry/Exit Pages
      - Single-Page Visits (Bounces)
      - More derived from these/with time dimensions
    ◦ Web Analytics Association definitions: http://tinyurl.com/28fkkq2
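To make the basics concrete, here is a hypothetical sketch (the session data shape and values are invented, not from the presentation) that derives visits, page views, and bounce rate:

    // Hypothetical sketch: basic behavioral metrics from session data.
    // A "session" here is simply the ordered list of pages one visit viewed.
    const sessions = [
      ['/home', '/benefits', '/apply'],
      ['/home'], // a single-page visit: a bounce
      ['/faq', '/contact'],
      ['/home'],
    ];

    const visits = sessions.length;
    const pageViews = sessions.reduce((sum, s) => sum + s.length, 0);
    const bounces = sessions.filter((s) => s.length === 1).length;

    console.log(`Visits: ${visits}`);        // 4
    console.log(`Page views: ${pageViews}`); // 7
    console.log(`Bounce rate: ${(100 * bounces / visits).toFixed(1)}%`); // 50.0%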
  • 21. Which Are My Key Metrics?
    ◦ By themselves, probably few of them
      - Without your site goals, none can address “success”
      - Does “a lot” of any of these tell you much?
      - Is “more” always “better”?
      - It’s not a competition
    ◦ Context adds meaning
      - Trends over time
      - Pre/post site redesign
      - Pages rising/falling
      - Marketing
  • 22. This Is Too Much Work; What’s Low-Hanging?
    ◦ Most-viewed pages tell you visitors’ Top Tasks: Are they what you thought they were? If not, what does that do to your thinking?
    ◦ Search terms tell you what visitors looked for: Did they find it? Did they not find it?
    ◦ Referrers tell you where visitors come from: Is your marketing succeeding? Where else should you be marketing?
  • 23. Behavioral Detour: Cookies
    ◦ Persistent cookies identify returning visitors
    ◦ New OMB policy (6/10) removes the prior prohibition
    ◦ Segmenting new and returning visitors is key; otherwise all visits are new ones
    ◦ Absent cookies, reported numbers of new/returning visitors are inaccurate
    ◦ Cookies also enable easy login and site customization (“Remember Me”)
    ◦ http://challenge.gov/privacy#cookies
    ◦ A minimal new-vs.-returning sketch follows below
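Here is a minimal, hypothetical sketch of the mechanism (the cookie name rv and the one-year lifetime are assumptions, and sendToCollector is a made-up stand-in for a real tag's reporting call):

    // Hypothetical sketch: classify a visitor as new or returning
    // with a persistent first-party cookie named "rv".
    function isReturningVisitor() {
      const returning = document.cookie.split('; ').some((c) => c.startsWith('rv='));
      if (!returning) {
        // First visit: set a cookie that persists for one year.
        const expires = new Date(Date.now() + 365 * 24 * 60 * 60 * 1000);
        document.cookie = `rv=1; expires=${expires.toUTCString()}; path=/`;
      }
      return returning;
    }

    // A page tag could then report the segment with each page view:
    // sendToCollector({ segment: isReturningVisitor() ? 'returning' : 'new' });

Without such a cookie, every visit looks like a first visit, which is exactly the inaccuracy the slide warns about.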
  • 24. Behavioral Detour: Google Analytics
    ◦ Attractive, powerful, free Web Analytics tool, with federal terms of service (http://apps.gov/)
    ◦ Hosted service (i.e., Google’s data center)
    ◦ Uses page tags and persistent cookies
    ◦ Possible issues with data ownership, location, retention, large sites, PII
    ◦ Geolocation data, but no IP addresses
    ◦ The tag itself is sketched below
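For comparison with the WebTrends tag on slide 15, this is the classic asynchronous ga.js snippet of the same era (not shown in the presentation); UA-XXXXX-X is a placeholder for a real account ID:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-X']); // your GA account ID here
      _gaq.push(['_trackPageview']);

      // Load ga.js asynchronously so it does not block page rendering.
      (function() {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>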
  • 25. Outcomes Data
    ◦ Measures of success depend on your site goals
    ◦ Task completion/conversions
    ◦ Views/downloads of pages you wanted them to see
    ◦ Successful searches
    ◦ Time on site (maybe)
    ◦ Bounce rate (maybe)
    ◦ Engagement
  • 26. Sources of Outcomes Data
    ◦ Web Analytics tool
      - Files (you wanted) viewed/downloaded
      - Funnel analysis on tasks pinpoints failure points
      - Site registrations
    ◦ Other places
      - Mailing-list sign-ups
      - Call center activities
      - Traditional MI: tasks completed
      - Specific outcomes questions on surveys
      - Session “replay” applications
    ◦ Converging multiple-source data is an issue
  • 27. Simple Conversion Funnel
  • 28. Complex Conversion Funnel
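Slides 27 and 28 were funnel charts; the arithmetic behind any funnel report is small. A hypothetical sketch (step names and counts are invented) showing step-to-step drop-off and overall conversion:

    // Hypothetical sketch: drop-off and conversion rates for a task funnel.
    const funnel = [
      { step: 'Start application',  visitors: 1000 },
      { step: 'Identity verified',  visitors: 620 },
      { step: 'Form completed',     visitors: 410 },
      { step: 'Confirmation shown', visitors: 385 },
    ];

    funnel.forEach((f, i) => {
      const prev = i === 0 ? f.visitors : funnel[i - 1].visitors;
      const kept = (100 * f.visitors / prev).toFixed(1);
      console.log(`${f.step}: ${f.visitors} (${kept}% of previous step)`);
    });

    const last = funnel[funnel.length - 1].visitors;
    console.log(`Overall conversion: ${(100 * last / funnel[0].visitors).toFixed(1)}%`); // 38.5%

The step with the worst retention is the failure point funnel analysis is meant to expose.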
  • 29. Experience Data: Introduction
    ◦ How visitors feel about their experience on your site
    ◦ Customer Satisfaction
      - Overall satisfaction
      - Ratings on aspects of your site
      - Future behaviors
      - Satisfaction with the agency overall (clicks & mortar)
    ◦ Questions related to your site goals
      - Why did you come to our site?
      - Did you succeed?
  • 30. Experience Data: Sources
    ◦ Surveys (online, on the phone, in person)
    ◦ Web site quality/integrity testing
    ◦ Usability testing/assessments
    ◦ Social media “buzz”
  • 31. Experience Data: Surveys
    ◦ On-web “pop-up” surveys
    ◦ Ratings for satisfaction and major site elements (navigation, search)
    ◦ “Likely-to” questions
    ◦ Custom questions
    ◦ Open ends
    ◦ ForeSee Results, iPerceptions, 4Q (free), others
  • 32. ForeSee (FSR) Survey Summary
  • 33. FSR: Reporting Portal
  • 34. FSR Priority Map
  • 35. Segmenting Data Reveals

    Main Reason for Visit         % of All Visitors   Failure Rate (% of Segment Not Successful)   Satisfaction
    Plan Retirement               13                  5                                            47
    Apply for Benefits            12                  25                                           29
    Estimate My Future Benefits   11                  12                                           19
    Get Disability Info           9                   17                                           31
    See if I Qualify              8                   17                                           48
    Aggregate                     53                  15                                           33
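A useful sanity check on a table like this: the aggregate failure rate should be roughly the visitor-weighted mean of the segment rates. A small sketch (the data structure is mine; the figures are from the slide above):

    // Check: aggregate failure rate vs. visitor-weighted mean of segments.
    const segments = [
      { reason: 'Plan Retirement',             pct: 13, failure: 5 },
      { reason: 'Apply for Benefits',          pct: 12, failure: 25 },
      { reason: 'Estimate My Future Benefits', pct: 11, failure: 12 },
      { reason: 'Get Disability Info',         pct: 9,  failure: 17 },
      { reason: 'See if I Qualify',            pct: 8,  failure: 17 },
    ];

    const totalPct = segments.reduce((s, x) => s + x.pct, 0); // 53
    const weighted = segments.reduce((s, x) => s + x.pct * x.failure, 0) / totalPct;
    console.log(weighted.toFixed(1)); // ~14.8, matching the reported aggregate of 15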
  • 36. Info about Federal FSR Use
    ◦ Franchised through Interior’s National Business Center for fed-wide use
    ◦ No procurement; inter-agency agreement
    ◦ Pre-cleared by OMB for Paperwork Reduction Act purposes
    ◦ Cost: $25-30K per survey per year; more with add-on features
    ◦ Info: http://fcg.nbc.gov/
  • 37. Some Other Survey Tools
    ◦ 4Q (site survey): http://www.4qsurvey.com/; no cost; just four questions
    ◦ iPerceptions (site survey): http://www.iperceptions.com/; owns 4Q, other products
    ◦ Net Promoter (site survey): http://www.netpromoter.com/; just one question
    ◦ Kampyle (page-level survey): http://www.kampyle.com/; user-selected, at every page on the site
    ◦ Remember OMB PRA requirements!
  • 38. Phone/Other Surveys
    ◦ Many call center software packages can incorporate surveys
    ◦ ForeSee Results conducts phone surveys on overall government satisfaction
    ◦ SSA frequently surveys recent claimants about their experience, by phone and mail
    ◦ Incorporating this data with other experience data is a challenge, especially in attributing conversions
  • 39. Site Quality/Integrity Data
    ◦ Assessment of site quality and integrity aspects: find and fix broken stuff before it affects visitor experience
    ◦ Broken links, misspellings, etc.
    ◦ Section 508 compliance
    ◦ Missing metadata, analytics page tags
    ◦ Page weights and proximity
    ◦ SEO
    ◦ More (a bare-bones link-check sketch follows below)
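The tools on the next slides automate all of this at scale, but the core of a link check is tiny. A hypothetical sketch (the URL list is an assumption) using the fetch API built into Node 18+:

    // Hypothetical sketch: flag links that do not return a 2xx status.
    const urls = [
      'https://www.socialsecurity.gov/',
      'https://www.example.gov/no-such-page',
    ];

    async function checkLinks(list) {
      for (const url of list) {
        try {
          const res = await fetch(url, { method: 'HEAD' }); // headers only
          if (!res.ok) console.log(`BROKEN (${res.status}): ${url}`);
        } catch (err) {
          console.log(`UNREACHABLE: ${url} (${err.message})`);
        }
      }
    }

    checkLinks(urls);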
  • 40. Accenture Digital Diagnostics
  • 41. Info about ADD
    ◦ Franchised through Interior’s National Business Center for fed-wide use
    ◦ No procurement; inter-agency agreement
    ◦ Cost: ~$5-10K initial purchase, plus annual maintenance
    ◦ Hosted (costs more) or on-site service (your hardware; your work)
    ◦ Also does web server log file analysis
    ◦ Info: http://fcg.nbc.gov/
  • 42. Other Quality/Integrity Tools
    ◦ W3C Quality Assurance Tools: http://www.w3.org/QA/Tools/
    ◦ Xenu Link Sleuth: http://home.snafu.de/tilman/xenulink.html
    ◦ Google Website Optimizer: http://www.google.com/analytics/siteopt/
    ◦ Many, many SEO “consultants” out there; beware
  • 43. Usability Testing
    ◦ As with quality/integrity, test your site to head off problems
    ◦ In-house usability testing may suffer from being too close to things
    ◦ PRA may limit use of actual site visitors for testing (though there is new OMB policy here)
    ◦ Third-party vendors offer a way around PRA, with professional testing focused on industry best practices
  • 44. Example: FSR Usability Audit
  • 45. Example: FSR Usability Audit
  • 46. Social Media Analytics
    ◦ Traditional Referrer metrics: who’s sending visitors to your site
    ◦ Number of friends/likes/followers and comments on your social media pages
    ◦ “Buzz” monitoring/response
    ◦ Vendors rushing into this space
    ◦ Reference: Jim Sterne, Social Media Metrics
  • 47. Social Media: Buzz/Response
    ◦ Search and aggregate mentions in social/traditional media
    ◦ Searchable, indexable, trendable
    ◦ Automated reports
    ◦ Influencer and sentiment analysis (a naive sketch follows below)
      - Importance of poster
      - Lexical analysis: meaning of posts
    ◦ Response workflow tools
      - Assign/manage responses to posts
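Vendors do this with far more sophistication (influence weighting, real lexical analysis), but a naive lexicon-based score shows the idea. A hypothetical sketch; the word lists and posts are invented:

    // Hypothetical sketch: naive lexicon-based sentiment scoring.
    const POSITIVE = new Set(['great', 'easy', 'fast', 'helpful']);
    const NEGATIVE = new Set(['broken', 'slow', 'confusing', 'useless']);

    function sentiment(post) {
      let score = 0;
      for (const word of post.toLowerCase().match(/[a-z']+/g) || []) {
        if (POSITIVE.has(word)) score += 1;
        if (NEGATIVE.has(word)) score -= 1;
      }
      return score;
    }

    const posts = [
      'The new benefits estimator is great and easy to use',
      'Site search is slow and the results are confusing',
    ];
    posts.forEach((p) => console.log(sentiment(p), p)); // 2 ..., then -2 ...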
  • 48. Forrester: Listening Platforms
  • 49. Why is Analytics so Hard in Gov’t?
  • 50. Reason #1: We don’t Make Money
  • 51. Reason #2: Not My Job (slide graphic: a “You are Here” marker surrounded by boxes labeled “NMJ”)
  • 52. ROI: Gov’t Web Analytics
    ◦ Cost savings: you can measure them (a back-of-the-envelope sketch follows below)
      - FTE and infrastructure savings from online services
      - ~40% of SSA FAQ users say they found what they wanted and won’t contact SSA via a high-cost channel; double-digit FTE savings
    ◦ Citizen time savings
      - Time on the phone, travel, waits at a government office
      - Time spent on paper forms vis-à-vis online (PRA data provides hints)
      - Soft data; may have to ask citizens (ACSI?)
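The cost-savings argument reduces to simple arithmetic. A hypothetical sketch; every number below is invented for illustration and is not SSA data:

    // Hypothetical sketch: savings from contacts deflected to the web.
    const faqVisits = 2000000;    // annual FAQ visits (invented)
    const deflectionRate = 0.40;  // share who found their answer and won't call
    const costPerCall = 8.00;     // fully loaded cost of a phone contact (invented)
    const costPerWebVisit = 0.10; // marginal cost of a self-service visit (invented)

    const callsAvoided = faqVisits * deflectionRate;
    const savings = callsAvoided * (costPerCall - costPerWebVisit);
    console.log(`Calls avoided: ${callsAvoided.toLocaleString()}`); // 800,000
    console.log(`Estimated savings: $${savings.toLocaleString()}`); // $6,320,000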
  • 53. NMJ: The Real Problem
    ◦ IT has hardware, software, wiring, security, monitoring, storage, capacity planning, and beaucoup other folks; all have a piece, but it’s none of their jobs
    ◦ Content owners manage your Web Analytics page tags, but that’s not their job
    ◦ Business users may have Analytics “ownership,” but need IT and Content folks (again, stuff that isn’t their jobs)
  • 54. NMJ: Some Solutions
    ◦ Build cross-component relationships
    ◦ Convert your boss
    ◦ Look for small successes within your grasp/control to gain the confidence of others
    ◦ Above all, find an executive to be your Analytics Champion
  • 55. Analytics Culture: Objectives
    ◦ Objectives we stated earlier:
      - Better Web sites based on data about site visitors
      - Site decisions based on analysis of empirical visitor data, not Highly Paid Persons’ opinions
      - Identify and fix site problems
      - Help site users succeed
  • 56. Analytics Culture: Tactics
    ◦ Don’t “spew” data: the 200-page out-of-the-box report from a WA tool is not usually of much value
    ◦ Start small, with Outcomes data about something you can control
    ◦ Find champions, heroes, role models
    ◦ Buy doughnuts or pizza; invite the NMJs
    ◦ Deliver reports that drive action by connecting data, insight, and outcomes
    ◦ Answer this: What’s the point?
    ◦ http://www.kaushik.net/avinash/
  • 57. Recommended Reading
    ◦ Jim Sterne, Web Metrics (oldie but goodie)
    ◦ Eric Peterson, Web Analytics Demystified (also old)
    ◦ Jason Burby and Shane Atchison, Actionable Web Analytics
    ◦ Bryan Eisenberg and Jeffrey Eisenberg, Call to Action
    ◦ Bryan Eisenberg and Jeffrey Eisenberg, Waiting for Your Cat to Bark
    ◦ Avinash Kaushik, Web Analytics: An Hour a Day
    ◦ Bryan Eisenberg and John Quarto-vonTivadar, with Lisa T. Davis, Always Be Testing
    ◦ Avinash Kaushik, Web Analytics 2.0
    ◦ Brian Clifton, Advanced Web Metrics with Google Analytics
    ◦ Jim Sterne, Social Media Metrics
  • 58. Web Analytics Resources
    ◦ Web Analytics Association (gov’t discount!): http://www.webanalyticsassociation.org/
    ◦ eMetrics conferences: http://www.emetrics.org/
    ◦ UBC online program (WAA discount!): http://www.tech.ubc.ca/webanalytics/
    ◦ Federal Web Managers Council Metrics: http://forum.webcontent.gov/members/group.asp?id=31682
    ◦ Yahoo Web Analytics Forum: http://tech.groups.yahoo.com/group/webanalytics/
    ◦ Web Analytics Demystified: http://blog.webanalyticsdemystified.com/
    ◦ http://google.com/search?q=web+analytics+blog
  • 59. Review
    ◦ Nuts and bolts of collecting web analytics
      - Behavioral, Outcomes, and Experience data
    ◦ Why Analytics is so hard in Government
      - Hard to prove ROI when we don’t sell
      - NMJ
    ◦ Building a culture of Analytics
      - Objectives
      - Tactics
    ◦ Web Analytics Resources
  • 60. Contact
    Tim Evans
    Social Security Administration
    http://www.socialsecurity.gov/
    tim.evans@ssa.gov
    tkevans@tkevans.com
    (410) 965-4217
    (443) 618-0351