A case study of the 2015 Grand National horse race in England.
Perfecto Mobile and Intechnica describe the results of monitoring 34 million mobile bets during the 2015 Grand National and share best practices for mobile app performance.
Mobile Monitoring for Peak Events
1. Grand National 2015
Mobile application performance insights
How to get monitoring right with millions of users
2. Perfecto Mobile
The world leader in continuous quality for mobile applications,
helping the world’s biggest brands to deliver better apps faster
“Perfecto Mobile currently has the strongest 3rd-party position in the market” [Thomas Murphy]
Intechnica
• Vendor/Technology Independent
• Enable Performance by Design
• Provide Performance Management Services
• Specialists in Agile Performance
• Enable and deliver Performance Best Practice
3. Your presenters:
Larry Haig
Senior Performance Consultant, Intechnica
larry.haig@intechnica.co.uk
+44 (0)845 054 2381
Contributing author:
The Art of Application Performance Testing
[Molyneaux, 2nd Ed 2014 Pub O’Reilly]
Amir Rozenberg
Director, Product Management, Perfecto Mobile
amirr@perfectomobile.com
5. Grand National 2015 – fast facts
• OpenBet record 19,300,000 bets within 24 hours
– Peak 45,000 bets per minute
– 54,000,000 account transactions [UK population c63m (2011)]
• + 24% y-o-y
– Average bet size is £8.00
– 63% of remote bets via mobile [up from 54% in 2014]
• UK Transaction volume +12.6% y-o-y
– (+83.6% vs typical April trading day)
• UK Transaction value +20.3% y-o-y
Refs: OpenBet; Worldpay
6. Survey & caveats
• Focus on Mobile perspectives, end user/client side issues
– Android and iOS
– Complementary testing:
• Mobile emulation / public carrier (4G) / webkit browser (m. sites)
• Real device testing (native apps)
• Raceday & 2 day buildup
• High vs low traffic
• Performance and Quality of Service
• Caveats:
– Snapshot testing – indicative, not absolute / continuous
– High level monitoring only
• No device metric correlation
– Real device test redundancy limited
• Native App availability not quoted
Example – race day errors log
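The snapshot testing described above boils down to a timed fetch with error capture, repeated against each channel. A minimal sketch in Python; the `fetch` callable and the 10-second budget are assumptions for illustration, not the tooling actually used in the survey:

```python
import time

def snapshot_check(fetch, timeout_s=10.0):
    """Run one synthetic availability/response-time check.

    `fetch` is any zero-argument callable that performs the request
    (hypothetical: swap in an HTTP client call against the real m.site
    or native-app endpoint). Returns (elapsed_seconds, error_or_None);
    non-None errors would feed a race-day errors log like the example.
    """
    start = time.perf_counter()
    try:
        fetch()
        error = None
    except Exception as exc:
        error = repr(exc)  # capture the failure for the errors log
    elapsed = time.perf_counter() - start
    if error is None and elapsed > timeout_s:
        error = f"slow: {elapsed:.2f}s exceeds {timeout_s}s budget"
    return elapsed, error
```

Scheduling this periodically from several geographies/carriers gives the indicative (not continuous) snapshot picture the caveats describe.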
7. Summary of key findings
• Overall performance improving y-o-y, BUT
• Still evidence of traffic related stress
• All m. sites (and some native apps) showed some evidence of Quality of Service issues
– 85%+ likely to be revenue relevant
– Brand impact
• m.sites:
– iPad2 Tablet slower than Smartphone
– 3rd party content source of failure
– Lack of traffic resilience
– Transaction payload highly correlated with performance
11. Native App Response Times by Device – The Good
[Chart: native app response times per device – 4.23s, 2.59s, 2.91s, 8.06s, 5.16s, 9.32s, 5.67s, 1.48s]
Web Emulation = 1.15s
12. Poll
Impact of 5 second delay in a revenue transaction
• I would call customer support
• I would complain on social networks
• I would transition to competitor application
• I would try again
13. Poll Results
Impact of 5 second delay in a revenue transaction:
• I would call customer support – 4%
• I would complain on social networks – 5%
• I would transition to a competitor application – 56%
• I would try again – 35%
14. Native mobile applications – maximized revenue?
• Mobile advertising on app launch takes user attention
• Application behavior in the absence of network – an opportunity to optimize the experience
22. Quality of Experience – Adopt the user “glass”
[Diagram: Server Architecture → Delivery Chain (incl. 3rd party) → Devices & User]
Typical Application Launch Time:
Device      Emulation   UX Timer (App-level)   CPU Peak
iPhone 6    0.9s        1.15s                  67%
iPhone 4S   0.9s        8.9s                   85%
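One practical use of launch-time data like this is to flag devices whose real measurements diverge from the emulated baseline (emulation shows 0.9s on both devices, while app-level launch is 1.15s on iPhone 6 but 8.9s on iPhone 4S). A sketch using the slide's figures; the 2x divergence threshold is an assumed policy, not one stated in the deck:

```python
# Launch-time measurements from the slide: emulated baseline vs
# app-level time measured on the real device, plus peak CPU load.
measurements = {
    "iPhone 6":  {"emulation_s": 0.9, "app_level_s": 1.15, "cpu_peak": 0.67},
    "iPhone 4S": {"emulation_s": 0.9, "app_level_s": 8.9,  "cpu_peak": 0.85},
}

def divergent_devices(m, threshold=2.0):
    """Return devices whose real launch time exceeds the emulated
    baseline by more than `threshold`x (threshold is an assumption)."""
    return [name for name, d in m.items()
            if d["app_level_s"] / d["emulation_s"] > threshold]

print(divergent_devices(measurements))  # the iPhone 4S is ~10x slower
```

This is why the deck argues emulation alone is not enough: only the real-device "glass" exposes the iPhone 4S gap.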
Recommendation 1
24. Recommendation 3
Alerting drives early awareness (MTTK) and correction (MTTR)
Real device coverage: Geos / Scenarios / Carriers
Remain in control: know early, resolve early, and keep winning!
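A simple form of the alerting recommended here compares the latest monitoring sample against a quiet-period baseline. A minimal sketch; the 3-sigma policy and the sample values are assumptions for illustration:

```python
from statistics import mean, stdev

def should_alert(baseline, latest, k=3.0):
    """Alert when the latest response time exceeds the baseline mean
    by more than k standard deviations (k=3 is an assumed policy)."""
    mu, sigma = mean(baseline), stdev(baseline)
    return latest > mu + k * sigma

# Hypothetical quiet-period response times (seconds) from a typical day.
baseline = [1.1, 1.3, 1.2, 1.4, 1.2, 1.3]

print(should_alert(baseline, 1.35))  # False: within normal variation
print(should_alert(baseline, 4.8))   # True: race-day spike, page the team
```

Firing this on every monitoring cycle is what shortens mean time to know (MTTK) and, with it, mean time to resolve (MTTR).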
25. Key suggestions for monitoring
1. Understand customer devices, locations, trends
2. Monitor appropriately and manage all digital delivery channels
– PC, m. sites
– Native mobile applications
– Other (in store kiosks, etc.)
3. Lean is fast – manage page weights in m. sites
4. Audit & optimise client side/application components
5. Explicitly monitor object rendering
6. Understand device specific effects
7. Manage 3rd party inclusions
– SLAs, monitoring
8. Plan ahead
– Understand traffic load effects ahead of time
– Baseline key metrics
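Baselining key metrics (suggestion 8) typically means recording percentiles from a normal trading day so race-day readings have something to be compared against. A sketch with hypothetical samples; the choice of p50/p95 is an assumption, not specified in the deck:

```python
from statistics import quantiles

# Hypothetical response-time samples (seconds) from a typical April
# trading day -- the "+83.6% vs typical day" comparison needs these.
samples = [1.0, 1.2, 1.1, 1.5, 1.3, 2.0, 1.4, 1.2, 1.6, 1.1]

# quantiles(n=20) yields 19 cut points at 5% intervals:
# index 9 is the median (p50), index 18 the 95th percentile (p95).
cuts = quantiles(samples, n=20, method="inclusive")
p50, p95 = cuts[9], cuts[18]
print(f"baseline p50={p50:.2f}s  p95={p95:.2f}s")
```

Stored ahead of the event, these percentiles become the thresholds that traffic-load planning and alerting are measured against.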