Oslo Schibsted Performance Gathering


  1. Performance Optimization Gathering – Oslo, June 2011
  2. What is this talk about?
      • How do we do performance testing with SCRUM at InfoJobs?
      • Real User eXperience performance monitoring
  3. Who are we? We are the leading employment job board where professionals and companies converge to satisfy their hiring and employment needs. Since our beginnings in 1998 we have stayed ahead of the market. More than 40% of the time spent on Internet job search in Spain is spent on Infojobs.net (*). (*) Source: Nielsen Netratings 2010
  4. Our people and vocation: 200 employees with a shared objective: “To make it easy for everyone to find the best possible job” – InfoJobs Team
  5. During these 30 minutes…
      • 51 people will be hired using InfoJobs (1)
      • 5 companies will post 58 job ads (2)
      • 1,000,000 requests will be processed by our Real User eXperience monitoring systems
      (1) Contracts signed during 2009 in Spain through InfoJobs, according to an independent study by Salvetti & Llombart
      (2) Data from InfoJobs (Jan–Apr 2011)
  6. In the last month…
      • 8 out of 10 big Spanish companies used InfoJobs (1)
      • 3,000 companies signed up at InfoJobs to post job offers (1)
      • 33.5 million job offer searches were executed (2)
      (1) Source: InfoJobs.net (April 2011)
      (2) Source: InfoJobs.net (February 2011)
  7. Software development
      • 6 SCRUM teams working together on the employment and training sites
      • 1 integrated production release every 2 weeks
      [Diagram: each team consists of a Product Owner, a Team Lead, a Scrum Master, developers, testers, a UX designer and a front-end developer]
  8. Software development: Where do we need performance testing?
  9. But what about real users? We want to measure how InfoJobs performs for every real user request.
  10. QA Performance Testing – Performance testing at InfoJobs
      • Performance tests of the code generated by the dev teams.
      • Provide standards for external developments.
      • Test infrastructure changes before they go into production.
      • Keep track of the performance evolution of the application and the systems.
      • Keep track of the Real User Experience, including 3rd-party services such as banners.
  11. QA Performance Testing – Performance testing at InfoJobs
      • Tests run in an isolated environment similar to production, sharing the networking infrastructure.
      • Load tests with 150 virtual users generating 8 transactions per second, which is the peak load on a front end on a Monday morning (a minimal rate sketch follows below).
      • Benchmarks of the database, application server … generated by the same monitoring tools as in production.
      • Tests are run for every release before going online.
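The slides do not say which load-testing tool drives those 150 virtual users, so the following is only a rough sketch of the idea of holding a constant 8 transactions per second and collecting response times. The target URL, duration and rate constants are assumptions for illustration, not InfoJobs' actual setup.

```typescript
// Minimal constant-rate load sketch (illustrative only; not InfoJobs' real tooling).
// TARGET_URL is a hypothetical endpoint; 8 req/s mirrors the figure on the slide.
const TARGET_URL = "https://example.com/es/home/index.xhtml";
const REQUESTS_PER_SECOND = 8;
const DURATION_SECONDS = 60;

async function timedRequest(): Promise<number> {
  const start = Date.now();
  try {
    await fetch(TARGET_URL); // response body is ignored; we only time the request
  } catch {
    return -1; // count as an error (one of the client-side KPIs listed later)
  }
  return Date.now() - start;
}

async function run(): Promise<void> {
  const results: Promise<number>[] = [];
  for (let second = 0; second < DURATION_SECONDS; second++) {
    for (let i = 0; i < REQUESTS_PER_SECOND; i++) {
      results.push(timedRequest()); // fire without awaiting to keep the rate roughly constant
    }
    await new Promise((resolve) => setTimeout(resolve, 1000)); // wait for the next 1-second tick
  }
  const times = (await Promise.all(results)).filter((t) => t >= 0);
  const avg = times.reduce((a, b) => a + b, 0) / times.length;
  console.log(`completed=${times.length} avgResponseMs=${avg.toFixed(1)}`);
}

run();
```

In practice a dedicated load-testing tool handles ramp-up, think time and reporting; the sketch only shows the constant-rate principle the slide describes.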
  12. QA Performance Testing – Key Performance Indicators: client side
      • Response time
      • Number of transactions
      • Errors
      • Recovery capacity
  13. QA Performance Testing – Key Performance Indicators: server side
      • CPU
      • SQL/second
      • JBoss Cache
      • Connection pools
      • Exceptions
  14. QA Performance Testing – RUX
      • Google has announced that performance will be considered in its SEO algorithms.
      • RUX (Real User eXperience) is the part of the Syslog application that provides information about a user's session time in InfoJobs.
      • The metrics are adjusted so they fit the ones provided by Google.
  15. QA Performance Testing – RUX
      • InfoJobs tracks every request that arrives at our systems.
      • Capability of immediate response to a lack of availability.
      • Running A/B tests on the front end.
  16. QA Performance Testing – RUX: what does it measure?
      • On the web site: every page includes in its footer a pixel, timer.jpeg, that is requested after the browser's window load event (a minimal beacon sketch follows below).
      • On the server: when the timer.jpeg request arrives at the web server, the elapsed time is calculated.
      • Basic diagram of what is measured: user → Internet → Apache → JBoss (busy) → sending → browser rendering → timer.jpeg → Syslog (RUX).
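The slides name the timer.jpeg pixel and the window load event but not the actual snippet. Below is a minimal sketch of how such a beacon could work in the browser; the /timer.jpeg path and its query parameters are assumptions, not InfoJobs' real API.

```typescript
// Sketch of a RUX-style beacon: after the page has fully loaded, request a tiny
// image so the server side can compute "page request -> window load" time.
// The /timer.jpeg path and query parameters are hypothetical.
window.addEventListener("load", () => {
  const beacon = new Image();
  const pageName = window.location.pathname; // lets measurements be filtered by page
  const loadedAt = Date.now();               // client-side timestamp at window load
  beacon.src = `/timer.jpeg?page=${encodeURIComponent(pageName)}&t=${loadedAt}`;
});
```

The server correlates this beacon request with the original page request (for example via the session) and logs the difference to Syslog, which is where the RUX figures in the later slides come from.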
  17. QA Performance Testing – RUX: median and quartiles
      • Not all requests are measured: only those that load timer.jpeg, and only the values that fall within a reasonable range calculated from the median and the 1st and 3rd quartiles (a filtering sketch follows below).
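The slide does not give the exact outlier rule, only that it is derived from the median and quartiles. One common interpretation, sketched below under that assumption, keeps only timings within a 1.5 × IQR fence around the quartiles.

```typescript
// Sketch of quartile-based filtering of RUX timings. The 1.5 * IQR fence is an
// assumption; the slide only says the range comes from the median and quartiles.
function quantile(sorted: number[], q: number): number {
  const pos = (sorted.length - 1) * q;
  const lo = Math.floor(pos);
  const hi = Math.ceil(pos);
  return sorted[lo] + (sorted[hi] - sorted[lo]) * (pos - lo); // linear interpolation
}

function filterReasonable(timingsMs: number[]): number[] {
  const sorted = [...timingsMs].sort((a, b) => a - b);
  const q1 = quantile(sorted, 0.25);
  const q3 = quantile(sorted, 0.75);
  const iqr = q3 - q1;
  const lower = q1 - 1.5 * iqr;
  const upper = q3 + 1.5 * iqr;
  return timingsMs.filter((t) => t >= lower && t <= upper);
}

// Example: a handful of page-load times with one obvious outlier.
console.log(filterReasonable([850, 920, 1010, 980, 870, 15000])); // 15000 ms is dropped
```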
  18. QA Performance Testing – RUX: application
      • Weekly measurements for all servers and all requests.
      • Up-to-the-minute measurements are available, filtered by page and front end.
      • Accurate measurement of the InfoJobs user experience.
  19. QA Performance Testing – RUX: monthly reports
      • Page views
      • Slowest pages
      • Slowest pages / view
      • Comparison reports
      [Chart: “Páginas más vistas España” (most viewed pages in Spain) – page views per URL for /es/jobsearch/search-results/list.xhtml, /es/oferta.empleo, /es/candidate/application/index.xhtml, /es/candidate/applications/list.xhtml and /es/home/index.xhtml]
  20. QA Performance Testing – RUX: monthly reports
      • Page views
      • Slowest pages
      • Slowest pages / view
      • Comparison reports
      [Chart: “Páginas más lentas” (slowest pages) – load time in seconds for /es/candidate/registration/index.xhtml, /es/candidate/channel/calculation-job-search.xhtml, /es/home/index.xhtml, /it/candidate/channel/calculation-job-search.xhtml, /it/home/index.xhtml and /it/candidate/registration/index.xhtml]
  21. QA Performance Testing – RUX: monthly reports
      • Page views
      • Slowest pages
      • Slowest pages / view
      • Comparison reports
      [Chart: “Páginas más vistas + lentas” (most viewed + slowest pages), 01/01/2011 – load time in seconds for /es/home/index.xhtml, /es/jobsearch/search-results/list.xhtml, /it/jobsearch/search-results/list.xhtml, /es/ver-oferta.xhtml, /it/oferta.empleo, /es/candidate/application/ok.xhtml and /es/oferta.empleo]
  22. QA Performance Testing – Virtual users vs real users
      • RUX
      • Load tests
      • Production vs pre-production
      • Comparison reports
  23. QA Performance Testing – Goals: Operations – QA – Dev
      • Cooperation among departments: keep up-to-date information about the systems and the application.
      • Maintenance of the systems (DB, application servers) for functional testing.
  24. Challenge: SCRUM and performance
      • 6 Scrum teams, 1 performance environment
      • Load tests vs real load
      • Production vs pre-production
      • Comparison reports not available
