MeasureWorks - Velocity Conference Europe 2012 - a Web Performance dashboard final

Prepared for the Velocity Conference Europe 2012 workshop day, this presentation covers the essentials of creating and building a web performance dashboard, with the ultimate goal of giving the audience a framework for designing and building one. The session will cover the following three items:
Design guidelines: What defines a web performance dashboard? How do you make sure it is actionable and that people actually use it on a day-to-day basis?
Data collection: Why performance data? The various ways to collect it (e.g. synthetic versus RUM data, WebPagetest, mobile) and how to correlate the different types of data and tools.
Building the dashboard: How to build the actual dashboard, with an overview of the tools and techniques used.

At the end of the workshop you will be able to design and build your own dashboard based on the framework provided, or to optimize the current dashboards within your organization.

Presentation Transcript

    • @jeroentjepkema | Performance :: Analytics :: Optimization
    • A Web Performance Dashboard, up & running in 90 minutes. Key concepts for measuring and presenting performance data. Velocity Conference Europe, October 2, 2012
    • Dashboard design principles | Collecting performance data | Building the Web Performance dashboard
    • Dashboard, FTW!
    • [Chart: visits and users per hour across a 24-hour day]
    • [Same chart, annotated with alert statuses ("Alert status", "Be Careful") at specific hours]
    • # Payment orders | Login process | Payment process | Savings
    • January 25, 2007, 9:05am: # Payment orders | Login process | Payment process | Savings
    • http://www.nrc.nl/apps/bigboard/
    • Part 1: Five Design principles for a Web Performance Dashboard
    • #1 - Data for all, not for everything
    • Your goal is to be understood... How technical is your audience? How will they use it? How fast can they execute?
    • 3 levels of performance data...
    • Three levels, side by side:
      Real Time (now): Is everything working? Is it me or the internet? How do we solve it? Keep it simple.
      Service Levels (yesterday): How did we perform? How did we do versus targets? Historical context.
      Trending (future): Big picture, actionable, capacity, optimization.
      Time axis: 0 hours, 24 hours, 7 days, 1 quarter. Slide annotation: "Data for dashboard".
    • #2 - Focus on End User Experience
    • Why? Customer loyalty!
    • Just a 1-second delay: 7% loss in conversion, 11% fewer page views, 16% decrease in customer satisfaction. @jeroentjepkema, MeasureWorks
    • Study conducted by MeasureWorks & idr1 (Emerce, 2012) across the Dutch eRetail100, eTravel30 and eFinance50: 4%, 6% and 12% of social media mentions relate to performance. Social media mentions = performance KPI
    • Speed @jeroentjepkema, MeasureWorks
    • Shopping for a holiday... Globe, Neckermann, Sunweb, D-reizen, Arke @jeroentjepkema, MeasureWorks
    • www.vrijuit.nl | Page size: 1201 KB. Experience fast? Try... http://bit.ly/SHYZpR | http://blog.dynatrace.com/2012/09/26/why-page-size-matters-even-more-for-mobile-web-apps/ @jeroentjepkema, MeasureWorks
    • Reliability @jeroentjepkema, MeasureWorks
    • Functional errors @jeroentjepkema, MeasureWorks
    • Technical errors @jeroentjepkema, MeasureWorks
    • @jeroentjepkema, MeasureWorks
    • And there’s marketing...
    • Good campaign gone BAD!
    • #3 - Provide context
    • What does it mean? Why is it important? What is the next best action?
    • #4 - Functionality first
    • What do you sell to the end users?
    • Creates common language between business and IT...
    • Metric: Availability. 99% versus 99 min... with understandable targets
    • #5 - Make it visually attractive
    • Group items
    • Use colours to display status
    • Make it move...
    • Design with the end user in mind
    • And the dashboard is... Real Time, Focused on the End User, Providing context, Functionality first, Visually attractive
    • All that matters... Show them where it hurts
    • Part 2: Collecting Performance Data
    • End User Experience. Responsiveness: how end users perceive (front-end) performance. Technology: how we build, measure & optimize performance. @jeroentjepkema, MeasureWorks
    • Complexity of a typical web transactionSource: Compuware Gomez @jeroentjepkema, MeasureWorks
    • Web Performance Delivery Chain: the (cloud) data center (storage, DB servers, web servers, application servers, middleware, mainframe, load balancers) serving internal users, then the internet (major ISPs, local ISPs, mobile carriers, content delivery networks, third-party/cloud services) on the way to customers. This is what you control... versus what you're blamed for. @jeroentjepkema, MeasureWorks
    • Measuring End User Experience is not easy... You need diagnostic details for the things you can control and/or change, and you need insight into the things you cannot control but that do impact your bottom line. @jeroentjepkema, MeasureWorks
    • Measure End User Experience? Outside-in, from the browser perspective... @jeroentjepkema, MeasureWorks
    • Performance Measurement Toolkit
    • Lots of different user scenarios @jeroentjepkema, MeasureWorks
    • @jeroentjepkema, MeasureWorks
    • Changing bandwidth: DSL, Mobile, WiFi, etc.
    • Real Devices
    • Real usage...
    • Consistency in measurements @jeroentjepkema, MeasureWorks
    • Correlating data for root cause analysis @jeroentjepkema, MeasureWorks
    • Performance Data collection @jeroentjepkema, MeasureWorks
    • Performance data collection: Synthetic monitoring | Real User Monitoring | Real User benchmarking @jeroentjepkema, MeasureWorks
    • Simulate business transactions
    • Via multiple devices & browsers
    • From multiple locations...
    • Used for error detection & Root Cause Analysis
    • Synthetic monitoring. Pros (metrics collected): easy collection of data; heartbeat collection of data over time; testing without actual visitor traffic; testing from multiple locations; object-level details; detailed alerting via text, mail or SNMP; root cause analysis data with error codes, screenshots, source code and waterfall data. Cons: no real user data and unlimited bandwidth while testing; combination with CDN; detects macro outages, not user events; detects only what you script; no measurement of traffic volume; places load on the site under test; maintenance of tests takes time.
    • Synthetic monitoring vendors: Alertsite, Gomez, Keynote, Catchpoint, Siteconfidence, IP Label, Websitepulse, Pingdom, etc. For native apps: Gomez, Keynote. When selecting: ease of use with recording scripts; check the API functionality; level of detail in alert messages; global versus local coverage; reliability of sending alert messages.
    • Ultimately, synthetic monitoring shows you if your site's working or not...
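A minimal sketch of the heartbeat idea behind synthetic monitoring, independent of any vendor: poll a few key pages on a schedule, time them, and alert on errors or slow responses. The URLs, thresholds and five-minute interval below are illustrative assumptions, not part of the talk.

```typescript
// Synthetic "heartbeat" sketch: check key pages on a schedule, record status
// and response time, and alert when a check fails or exceeds its threshold.

interface Check {
  name: string;
  url: string;
  maxMillis: number; // alerting threshold for response time
}

const checks: Check[] = [
  { name: "Homepage", url: "https://www.example.com/", maxMillis: 4000 },
  { name: "Search", url: "https://www.example.com/search?q=book", maxMillis: 4000 },
];

async function runCheck(check: Check): Promise<void> {
  const start = Date.now();
  try {
    const response = await fetch(check.url);
    const elapsed = Date.now() - start;
    if (!response.ok) {
      console.error(`[ALERT] ${check.name}: HTTP ${response.status}`);
    } else if (elapsed > check.maxMillis) {
      console.warn(`[SLOW] ${check.name}: ${elapsed} ms (limit ${check.maxMillis} ms)`);
    } else {
      console.log(`[OK] ${check.name}: ${elapsed} ms`);
    }
  } catch (err) {
    console.error(`[ALERT] ${check.name}: request failed`, err);
  }
}

// Heartbeat: run every check every five minutes, with or without real traffic.
setInterval(() => checks.forEach(runCheck), 5 * 60 * 1000);
```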
    • But, synthetic isn't enough... (synthetic heartbeat vs. real users)
    • Performance data collection: Real User Monitoring | Real User benchmarking @jeroentjepkema, MeasureWorks
    • Two ways of measuring: Navigation timing and Browser RUM. Disclaimer: there is also a third category, Datacenter RUM, which is not covered in this presentation. Contact me if you want details.
    • Navigation timing. Some background info: http://www.w3.org/TR/navigation-timing/ | http://66.7percentangel.com/2011/12/breaking-down-onload-event-performance-bookmarklet/ | http://www.html5rocks.com/en/tutorials/webperformance/basics/ | http://www.w3.org/TR/2011/CR-navigation-timing-20110315/#nt-dom-content-event-start
    • http://caniuse.com/nav-timing
    • Browser RUM, in four steps: (1) insert a tag (.js file) into the (mobile) web pages; (2) pages are requested from the browser/device; (3) as pages execute, the tag collects detailed performance metrics; (4) after onload, the tag sends a report for further analysis. (tag.js)
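A minimal browser-side sketch of that four-step flow, using the Navigation Timing API that the next slides reference. The /rum-beacon collection endpoint is a made-up placeholder, and real RUM tags (boomerang.js, vendor tags) collect far more detail than this.

```typescript
// Minimal Browser RUM tag: after onload, read Navigation Timing marks and
// beacon them to a collection endpoint (hypothetical) for later analysis.

window.addEventListener("load", () => {
  // Wait one tick so loadEventEnd is populated.
  setTimeout(() => {
    const t = performance.timing;
    const metrics = {
      page: location.pathname,
      dns: t.domainLookupEnd - t.domainLookupStart,
      connect: t.connectEnd - t.connectStart,
      ttfb: t.responseStart - t.navigationStart,
      domContentLoaded: t.domContentLoadedEventEnd - t.navigationStart,
      loadTime: t.loadEventEnd - t.navigationStart,
      userAgent: navigator.userAgent,
    };
    // Fire-and-forget report to the (hypothetical) RUM collector.
    navigator.sendBeacon("/rum-beacon", JSON.stringify(metrics));
  }, 0);
});
```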
    • Gomez
    • Lognormal / Soasta
    • Torbit
    • Google Analytics: relies on the navigation timing API; custom variables can be added
    • New Relic
    • Real User Monitoring. Pros (metrics collected): measures ALL pages from ALL users; measures traffic as well as performance; quantitative performance data to analyse user satisfaction; data can be directly correlated with web analytics; great for trending and creating the big picture. Cons: no traffic means no data; needs a change to application code; may require physical installation of data storage or a data reporter; sample rate.
    • Real User Monitoring vendors: Lognormal, Gomez, Keynote, Torbit, Google Analytics, Boomerang.js, Oracle, etc. For native apps: Google Analytics, Gomez, Localytics. When selecting: do I need to build my own data storage? Check the API functionality; how long raw data is stored; the interface; mobile support.
    • Ultimately, Real User Monitoring shows you how many users are affected by bad performance...
    • Performance data collection: Real User benchmarking @jeroentjepkema, MeasureWorks
    • Real usage: first view 11.349 s, repeat view 4.357 s
    • Average page load time per bandwidth (seconds) for the Dutch eTravel30: 1.5 Mbps: 8.8 s; 10 Mbps: 3.9 s; 20 Mbps: 3.4 s. (The slide also shows bandwidth shares of 56% and 13%.)
    • Average page load time per browser (seconds), Dutch eTravel30: IE7: 6.5 s; IE8: 3.9 s; IE9: 3.6 s. (The slide also shows device shares of 40% and 8%.)
    • Benchmark competitors: arke.nl 5.50 sec, sunweb.nl 3.91 sec, vakantiexperts.nl 2.15 sec, dreizen.nl 6.13 sec, thomascook.nl 1.79 sec. Source: Webpagetest.org, IE9, 10 Mb up / 2 Mb down
    • Webpagetest.org
    • Webpagetest.org: object-level detail and optimization tips
    • Webpagetest.org
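WebPagetest also exposes an HTTP API for kicking off runs programmatically, which is handy for feeding a dashboard. A rough sketch below, assuming you have an API key; the response fields are to the best of my knowledge, so verify parameter and field names against the current WebPagetest API docs.

```typescript
// Sketch: trigger a WebPagetest run and poll until the result is ready.

const WPT = "https://www.webpagetest.org";
const API_KEY = "YOUR_API_KEY"; // placeholder

async function runTest(url: string): Promise<void> {
  // Kick off the test; f=json asks for a JSON response with result URLs.
  const start = await fetch(
    `${WPT}/runtest.php?url=${encodeURIComponent(url)}&k=${API_KEY}&f=json`
  ).then((r) => r.json());

  const resultUrl: string = start.data.jsonUrl;

  // Poll until the test has finished (statusCode 200).
  for (;;) {
    const result = await fetch(resultUrl).then((r) => r.json());
    if (result.statusCode === 200) {
      const firstView = result.data.runs["1"].firstView;
      console.log(`${url}: load ${firstView.loadTime} ms, fully loaded ${firstView.fullyLoaded} ms`);
      return;
    }
    await new Promise((resolve) => setTimeout(resolve, 10_000));
  }
}

runTest("https://www.example.com/");
```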
    • Mobitest / Akamai
    • Real User benchmarking. Pros (metrics collected): variety of real browsers and real devices available for testing; repetitive collection of real usage scenarios; collects optimization metrics, waterfall and page speed score; great for trending and creating the big picture. Cons: can be difficult to set up; requires physical installation; scripting (is difficult).
    • Ultimately, Real User benchmarking gives you periodic insight into real usage scenarios...
    • Performance data collection: Synthetic monitoring | Real User Monitoring | Real User benchmarking @jeroentjepkema, MeasureWorks
    • Performance data collection, compared:
      Synthetic monitoring. Used for: heartbeat, runs without traffic; testing specific customer journeys; object-level detail; collecting detailed alerts, including root cause analysis. Desktop/mobile site: Gomez, Keynote, Watchmouse, Alertsite. Mobile apps: Gomez, Keynote.
      Real User Monitoring. Used for: real usage information from all users!!; trending/optimization; business impact. Desktop/mobile site: Gomez, LogNormal, Torbit, Google Analytics. Mobile apps: Gomez, Localytics, Google Analytics.
      Real User benchmarking. Used for: periodic testing of user scenarios with real devices and bandwidth; optimization details; competitive scan. Desktop/mobile site: Webpagetest. Mobile apps: Perfecto Mobile, Device Anywhere.
      @jeroentjepkema, MeasureWorks
    • Synthetic | Browser RUM | Competition: synthetic vs. RUM, and you vs. the competition
    • Raw data
    • Part 3: Building the dashboard
    • Start with your web analytics tool...
    • Every website has goals http://www.flickr.com/photos/itsgreg/446061432/
    • Goals = customer journeys. [Funnel diagram: organic search, campaigns and an ad network drive visitors to the transactional site; they move through reach, offer, purchase step 1, purchase step 2, conversion and enrolment, with upselling, mailings/alerts/promotions, abandonment and disengagement shown as positive or negative impacts on the site.]
    • Select your top customer journeys
    • Map the customer journeys with your ownweb application delivery chain...
    • Synthetic transactions: map Journey 1, Journey 2 and Journey 3 onto the delivery tiers (Tier 1: web servers; Tier 2: application servers and database; Tier 3: backends). Preferably one transaction per backend used; for proper alerting, eliminate doubles.
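One way to keep that journey-to-tier mapping explicit, and to spot the doubles, is to describe it as data. A minimal sketch follows; every journey, tier and backend name in it is invented for illustration.

```typescript
// Describe each customer journey by the tiers and backends it touches, then
// derive the synthetic transactions to script: one per backend, doubles removed.

interface Journey {
  name: string;
  tiers: { webServers: string[]; appServers: string[]; backends: string[] };
}

const journeys: Journey[] = [
  {
    name: "Search and purchase",
    tiers: { webServers: ["web1"], appServers: ["app1"], backends: ["catalog", "payment"] },
  },
  {
    name: "Login and account",
    tiers: { webServers: ["web2"], appServers: ["app2"], backends: ["identity", "payment"] },
  },
];

// Preferably one synthetic transaction per backend used; eliminating doubles
// avoids duplicate alerts when a shared backend (here: "payment") fails.
const backendsToMonitor = new Set(journeys.flatMap((j) => j.tiers.backends));
console.log([...backendsToMonitor]); // ["catalog", "payment", "identity"]
```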
    • Don’t forget third parties
    • Defining service levels & thresholds...
    • First thing is to establish a baseline: a pre-defined set of metrics that describes normal behavior, in order to detect variances and to be comparable within a historic context.
    • Example...
    • Purchasing a book (customer journey) must be completed (metric: speed), where every page loads under 4 sec. (target: seconds), using IE8 and higher (user scenario), from any location in the Netherlands (user locations), for 95% of all users (percentile), every day between 6am and 12pm (window), measured with Real User Monitoring (collection type). Source: Metrics 101, Velocity Conf 2010
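That sentence maps naturally onto a structured service-level definition that a dashboard can evaluate. A minimal sketch, with field names of my own choosing rather than anything from the talk:

```typescript
// The example service level from the slide, expressed as data.

interface ServiceLevel {
  journey: string;                        // customer journey
  metric: "speed" | "availability";
  targetSeconds: number;                  // every page loads under this many seconds
  userScenario: string;                   // e.g. browser/device constraints
  userLocations: string[];
  percentile: number;                     // share of users the target must hold for
  window: string;                         // measurement window
  collection: "RUM" | "synthetic" | "benchmark";
}

const purchaseBook: ServiceLevel = {
  journey: "Purchasing a book",
  metric: "speed",
  targetSeconds: 4,
  userScenario: "IE8 and higher",
  userLocations: ["any location in the Netherlands"],
  percentile: 95,
  window: "every day between 6am and 12pm",
  collection: "RUM",
};
```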
    • Repeat this for every customer journey defined...
    • Next, group the content...
    • KISS
    • Layered approach (online brand Bol.com), from big picture to detailed level of performance: overall performance of the brand; detailed performance per product (books, music, toys, etc.); root cause within the services per product (search, reviews, purchase, discounts).
    • Mixing it all together...
    • Displayed in the dashboard. Group information: functionality, service levels, business context, customer journey, competition, real-time performance, delivery chain. Metrics: thresholds, real users versus synthetic, page views, bounce rate, affected user sessions, speed, application errors, content errors, third-party errors, downtime.
    • Designing the infrastructure...
    • [Architecture diagram] Data sources (WPT, synthetic, RUM, analytics) feed a data warehouse behind an authentication layer, which in turn serves the performance dashboard, a mobile application and custom reports (SLA/trending). (1) Vendor APIs are used for both real-time information and raw data download; (2) an API uploads data to the mobile app / dashboard; (3) HTML alert mails / alert messages are imported into the data warehouse; (4) the authentication mechanism covers both login and selective data transfer; (5) remote access to the data warehouse provides SLA or trend reporting, based on the same data as the real-time dashboard.
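A rough sketch of ingestion path (1): poll a monitoring vendor's API and load the measurements into the data warehouse that feeds the dashboard. The endpoint, response shape and storage call are all hypothetical placeholders, not a real vendor's API.

```typescript
// Pull recent measurements from a (hypothetical) vendor API and store them
// for the dashboard, keeping the real-time view roughly five minutes fresh.

interface Measurement {
  journey: string;
  timestamp: string;
  loadTimeMs: number;
  errorCode: number | null;
}

async function pullFromVendor(sinceIso: string): Promise<Measurement[]> {
  const response = await fetch(
    `https://api.monitoring-vendor.example/v1/measurements?since=${encodeURIComponent(sinceIso)}`,
    { headers: { Authorization: "Bearer YOUR_TOKEN" } } // placeholder credentials
  );
  if (!response.ok) throw new Error(`Vendor API returned ${response.status}`);
  return (await response.json()) as Measurement[];
}

async function storeInWarehouse(rows: Measurement[]): Promise<void> {
  // Replace with your warehouse client (SQL insert, time-series write, ...).
  console.log(`Would insert ${rows.length} rows into the measurements table`);
}

// Run every 5 minutes so the dashboard stays near real time.
setInterval(async () => {
  const since = new Date(Date.now() - 5 * 60 * 1000).toISOString();
  const rows = await pullFromVendor(since);
  await storeInWarehouse(rows);
}, 5 * 60 * 1000);
```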
    • Synthetic | RUM
    • Data model RUM
    • It all looks the same...
    • ...but still so different
    • ...but still so different: synthetic monitoring
    • ...but still so different http://bit.ly/MW-namingconvention
    • Authentication layer: http://github.com/symfony/symfony | http://www.symfony-project.org
    • Visuals: https://www.html5rocks.com/en/tutorials/canvas/performance | https://www.highcharts.com | http://zqi.me/vizd3
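As an example of the visuals layer, a small sketch of a Highcharts line chart fed with page load times and the 4-second service-level target drawn as a plot line. The container id and sample data are made up, and the call style follows recent Highcharts versions, so check the docs for the version you actually use.

```typescript
import Highcharts from "highcharts";

// Page load time per 5-minute bucket, with the service-level target marked.

Highcharts.chart("speed-chart", {
  title: { text: "Page load time (RUM, 5-minute average)" },
  xAxis: { type: "datetime" },
  yAxis: {
    title: { text: "Seconds" },
    plotLines: [{ value: 4, color: "red", width: 2 }], // 4-second target
  },
  series: [
    {
      type: "line",
      name: "Purchase journey",
      data: [
        [Date.UTC(2012, 9, 2, 9, 0), 3.2],
        [Date.UTC(2012, 9, 2, 9, 5), 3.6],
        [Date.UTC(2012, 9, 2, 9, 10), 4.4],
      ],
    },
  ],
});
```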
    • Demo time
    • Designing the interface...
    • Navigation | Technical metrics | Business metrics ("This is the part that moves")
    • Demo: URL: https://app.measureworks.nl | UID: demo@measureworks.nl | Password: performance
    • Metric: Page speed | Data collection: RUM | Threshold: historically based or fixed | Timeframe: 5-min average
    • Metric: Uptime | Data collection: Synthetic monitoring | Threshold: page errors (4xx, 5xx) | Timeframe: 5-min average
    • Metric: Third-party uptime | Data collection: Synthetic monitoring | Threshold: error objects/domain (4xx, 5xx) | Timeframe: 5-min average
    • Metric: Application error | Data collection: RUM & Synthetic monitoring | Threshold: # object errors per funnel (4xx, 5xx) | Timeframe: 5-min average
    • Metric: Content error | Data collection: Synthetic monitoring | Threshold: object size, context match, user transaction failure | Timeframe: 5-min average
    • Metric: Pageviews impacted | Data collection: RUM | Threshold: % of pageviews affected vs. total pageviews | Timeframe: 5-min average
    • Metric: Users affected | Data collection: RUM & Navigation timing | Threshold: % of browser sessions vs. bounce rate | Timeframe: 5-min average
    • Metric: Competition ranking | Data collection: Webpagetest | Threshold: % of competitors faster, based on current average page speed | Timeframe: 60-min average
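A small sketch of how metric definitions like these can drive the dashboard's status colours: evaluate the last five-minute window against the threshold and map it to green, orange or red. The specific cut-off numbers and status rules are illustrative, not the presentation's actual implementation.

```typescript
// Evaluate a 5-minute window of values against a metric definition and map
// the result to a dashboard colour.

type Status = "green" | "orange" | "red";

interface MetricDefinition {
  name: string;
  collection: string;
  evaluate: (windowValues: number[]) => Status;
}

const average = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / Math.max(xs.length, 1);

const pageSpeed: MetricDefinition = {
  name: "Page speed",
  collection: "RUM",
  evaluate: (loadTimesSec) => {
    const avg = average(loadTimesSec);
    if (avg <= 4) return "green";   // within the 4-second target
    if (avg <= 6) return "orange";  // degraded but usable (illustrative cut-off)
    return "red";                   // breaching the service level
  },
};

const pageviewsImpacted: MetricDefinition = {
  name: "Pageviews impacted",
  collection: "RUM",
  evaluate: (affectedSharePct) => {
    const pct = average(affectedSharePct); // % of pageviews affected vs. total
    return pct < 1 ? "green" : pct < 5 ? "orange" : "red";
  },
};

// Example: last 5 minutes of RUM load times for the purchase journey.
console.log(pageSpeed.evaluate([3.1, 3.8, 4.2, 3.5]));    // "green"
console.log(pageviewsImpacted.evaluate([0.4, 0.7, 1.2])); // "green"
```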
    • URL: https://app.measureworks.nl UID: velocity@measureworks.nl PW: performance
    • Become a dashboard rockstar...
    • 1. Look into your web analytics; 2. Define a performance baseline; 3. Start with synthetic monitoring; 4. Design your dashboard; 5. Build a report based on your design and collect feedback; 6. Build a dashboard; 7. Add RUM and other data sources
    • A couple of "one more things"...
    • Start with a functional design
    • Organisation: create a good team @jeroentjepkema, MeasureWorks
    • Design with your end users in mind
    • Collect feedback as early as possible
    • The art of deleting
    • Be careful with using APIs
    • Combining datasources... WTF?
    • Recommended...
    • Recommended... Data visualization: Noah Iliinsky / Julie Steele, Designing Data Visualizations - http://oreil.ly/SryQyV. Real User Monitoring: Wednesday Oct 3, 11.20am - "The 3.5s dash for attention and other stuff we found in RUM". WebPagetest: Thursday Oct 4, 15.30 - "WebPagetest, beyond the basics". Join a Web Performance meetup: go to www.meetup.com and search for web performance; for the Netherlands go to http://www.meetup.com/Dutch-Web-Operations-Meetup/
    • Questions?
    • Thanks! More questions? M: jtjepkema@measureworks.nl | T: @jeroentjepkema | W: www.measureworks.nl