exploratory & context-driven
some unstructured gyaan (musings) by Younus Poonawala
 $840mn invested in building the site healthcare.gov
 6 users registered on the first day
 The site took 2 months to recover technically
 And the budget shot to ~$2 bn towards the end of the campaign
 Research has shown that over the past 10 years, 94% of large federal information technology projects were unsuccessful: more than 50% were delayed, over budget, or didn't meet expectations, and 41.4% were judged to be complete failures.
 YES, we are pessimistic
 YES, we are picky
 YES, we know what sucks and why
 BUT, we don’t know what would happen if we don’t test
 Web Based Applications
 Triangle App – Case Study
 Testing is not checking
 Different Hats of a Tester
 Critical Path Analysis
 Risk Based Testing
 Mind Mapping
 Q&A
 Triangle App by Michael Bolton
 Let's think of some tests for this one (a small sketch follows)
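The Triangle App exercise takes three side lengths and names the kind of triangle. A minimal sketch of what a first round of scripted checks could look like, using a hypothetical classify_triangle() stand-in rather than the real app:

```python
# A minimal sketch of scripted checks for a hypothetical triangle classifier.
# classify_triangle() is an illustrative stand-in, not the actual Triangle App.

def classify_triangle(a: float, b: float, c: float) -> str:
    """Classify three side lengths as a triangle type (illustrative only)."""
    if a <= 0 or b <= 0 or c <= 0:
        return "invalid"
    # Triangle inequality: the two shorter sides must together exceed the longest.
    x, y, z = sorted((a, b, c))
    if x + y <= z:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# A few example checks; exploration would go far beyond these.
assert classify_triangle(3, 3, 3) == "equilateral"
assert classify_triangle(3, 3, 5) == "isosceles"
assert classify_triangle(3, 4, 5) == "scalene"
assert classify_triangle(1, 2, 3) == "not a triangle"   # degenerate (collinear) case
assert classify_triangle(0, 4, 5) == "invalid"          # zero-length side
assert classify_triangle(-3, 4, 5) == "invalid"         # negative side
```

Notice how quickly such checks run out of imagination: exploratory testing would also probe huge values, decimals, non-numeric input, copy-paste, and the UI around the function, which is exactly the gap between checking and testing.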
 Why was the site built?
 What’s the objective behind this?
 Who is it targeted to?
 Is the site complete? Is it ready to be tested?
 Are the original requirements reflected in the site as it stands now?
 What is the critical path?
 What is at Stake?
Analogy of Checks & Tests
 Functionality
 Usability
 Compatibility/Responsiveness
 Performance
 Security
 Localisation
 508 compliance
Context Driven Plan
 Clarify your mission
 Analyse your product
 Analyse risks
 Define Strategy
 Plan Logistics
 Share the Plan
Things To Test
• Structure
• Function
• Data
• Interfaces
• Platforms
• Operation
• Time
Roadmap
• Smoke/Sanity
• Learn/Explore
• Retest/Regress
• Report
• Repeat
 Understanding Critical Path
 State Transition Testing
 CPM (Critical Path Method) Diagram (a small calculation sketch follows)
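A minimal Critical Path Method sketch, assuming a made-up set of test activities and durations; the real inputs would come from your own plan:

```python
# A minimal Critical Path Method (CPM) sketch. The tasks and durations are
# illustrative; a real project plan supplies the actual activities.

tasks = {
    # task: (duration_in_days, [predecessors])
    "spec review":      (2, []),
    "test environment": (3, []),
    "test design":      (4, ["spec review"]),
    "test execution":   (5, ["test design", "test environment"]),
    "report":           (1, ["test execution"]),
}

def critical_path(tasks):
    finish = {}          # earliest finish time per task
    def ef(name):        # recursive earliest-finish calculation
        if name not in finish:
            dur, preds = tasks[name]
            finish[name] = dur + max((ef(p) for p in preds), default=0)
        return finish[name]
    for t in tasks:
        ef(t)
    # Walk back from the latest-finishing task along its latest-finishing predecessors.
    path = [max(finish, key=finish.get)]
    while tasks[path[-1]][1]:
        path.append(max(tasks[path[-1]][1], key=finish.get))
    return list(reversed(path)), max(finish.values())

path, total = critical_path(tasks)
print("critical path:", " -> ".join(path), "| total:", total, "days")
# critical path: spec review -> test design -> test execution -> report | total: 12 days
```

The longest chain of dependent activities is the critical path; anything on it that slips pushes out the whole effort.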
 Make a list of risks.
 Product Risks
 Project Risks
 Risk catalogs/watch lists/matrices
 Risk Prioritization (a small scoring sketch follows this list)
 Perform testing that explores each risk.
 As risks evaporate and new ones emerge, adjust your test effort to stay focused on the current crop.
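A minimal risk-prioritization sketch, assuming a simple likelihood x impact scoring over an illustrative list of product risks:

```python
# A minimal risk-prioritization sketch: score each risk as likelihood x impact
# and test the highest-scoring items first. The risks listed are illustrative.

risks = [
    # (risk, likelihood 1-5, impact 1-5)
    ("payment gateway times out under load",  3, 5),
    ("date formats break for non-US locales", 4, 3),
    ("admin pages reachable without login",   2, 5),
    ("layout collapses on small screens",     4, 2),
    ("broken links in footer",                5, 1),
]

prioritized = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for name, likelihood, impact in prioritized:
    print(f"{likelihood * impact:>2}  {name}")
```

Re-run the scoring as risks evaporate and new ones emerge so the test effort stays focused on the current crop.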
 FEATURE MAPS
 COVERAGE MAPS
 TEST DESIGN MAPS
 Does your site need compatibility testing?
 Is your site built on a responsive (mobile-friendly) framework?
 If yes, does it actually behave responsively?
 How do we test it for compatibility? (a small responsive-check sketch follows this list)
 How do we choose which browsers to cover?
 Is your site's performance optimized for mobile and low bandwidth?
 If yes, does it actually perform that way?
 How do we test for AMP?
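One way to smoke-check responsiveness is to drive the site at a few breakpoints and look for horizontal overflow. A minimal sketch assuming Selenium, with an illustrative URL and breakpoint list; it does not replace testing on real browsers and devices:

```python
# A minimal responsiveness smoke check using Selenium. The URL and
# breakpoints below are illustrative placeholders.
from selenium import webdriver

BREAKPOINTS = [(375, 812), (768, 1024), (1366, 768)]   # phone, tablet, laptop
URL = "https://example.com"                            # hypothetical site under test

driver = webdriver.Chrome()
try:
    for width, height in BREAKPOINTS:
        driver.set_window_size(width, height)
        driver.get(URL)
        # A page that truly adapts should not overflow horizontally:
        # content wider than the viewport usually means a broken breakpoint.
        content = driver.execute_script("return document.documentElement.scrollWidth")
        viewport = driver.execute_script("return document.documentElement.clientWidth")
        status = "OK" if content <= viewport else "HORIZONTAL OVERFLOW"
        print(f"{width}x{height}: content={content}px viewport={viewport}px -> {status}")
finally:
    driver.quit()
```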
 We all know JMeter, but have we understood our application's performance needs yet?
 Is the application built around high-speed response, or will that be done later?
 What is concurrency and how do I calculate it? (a small worked example follows this list)
 I want to know the single-page performance of a web page
 I want to understand my server's performance
 I want to make sure my website is not penalized by Google for its shabby performance
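One common way to estimate concurrency is Little's Law: concurrent users ≈ throughput x (response time + think time). A minimal worked example with made-up numbers:

```python
# A back-of-the-envelope concurrency estimate using Little's Law:
# concurrent users ~= throughput x (response time + think time).
# All numbers below are illustrative.

requests_per_hour = 36_000                 # expected peak load
throughput = requests_per_hour / 3600      # = 10 requests per second
response_time = 0.8                        # seconds the server takes per request
think_time = 5.0                           # seconds a user pauses between requests

concurrency = throughput * (response_time + think_time)
print(f"~{concurrency:.0f} concurrent users")   # ~58 concurrent users
```

An estimate like this could feed the thread count of a JMeter thread group; single-page speed and Google's performance penalties are separate, front-end concerns measured with different tools.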
 What is the ultimate goal of security testing?
 AAA (a small check sketch follows this list)
 Authentication
 Authorization
 Availability
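A minimal sketch of what first authentication and authorization checks might look like, assuming the requests library and entirely hypothetical endpoints and tokens:

```python
# A minimal authentication/authorization check sketch using the requests
# library. The base URL, endpoint, and token are hypothetical placeholders.
import requests

BASE = "https://example.com/api"          # hypothetical API under test
USER_TOKEN = "token-of-ordinary-user"     # placeholder low-privilege credential
ADMIN_ONLY = f"{BASE}/admin/users"

# Authentication: a protected resource should reject anonymous requests.
r = requests.get(ADMIN_ONLY)
assert r.status_code in (401, 403), f"expected rejection, got {r.status_code}"

# Authorization: a valid but low-privilege token should still be refused.
r = requests.get(ADMIN_ONLY, headers={"Authorization": f"Bearer {USER_TOKEN}"})
assert r.status_code == 403, f"expected 403 for non-admin, got {r.status_code}"

# Availability is usually checked differently: uptime monitoring, load tests,
# and failover drills rather than a single scripted request.
```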
 Ever realized the importance of localisation?
 Ever wondered how to test for i18n? (a pseudo-localisation sketch follows this list)
 How to start?
 Where are the caveats?
 What is the reward of testing for i18n?
 Case Study
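One place to start is pseudo-localisation: run the UI with accented, padded strings and see what clips, garbles, or stays stubbornly in English. A minimal sketch with an illustrative transform:

```python
# A minimal pseudo-localisation sketch: replace ASCII vowels with accented
# look-alikes and pad the string to mimic longer translations. Text that
# survives this (no clipping, no mojibake, no hard-coded English left over)
# is usually in good shape for real translations. The mapping is illustrative.

ACCENTED = str.maketrans("aeiouAEIOU", "àéîöüÀÉÎÖÜ")

def pseudo_localize(text: str, expansion: float = 0.3) -> str:
    """Accent the vowels and pad ~30% to mimic longer languages (e.g. German)."""
    padded = text + "·" * max(1, int(len(text) * expansion))
    return "[" + padded.translate(ACCENTED) + "]"

print(pseudo_localize("Add to cart"))   # [Àdd tö càrt···]
print(pseudo_localize("Checkout"))      # [Chécköüt··]
```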
 Know what 508 compliance testing is
 Know the need for compliance
 Know how it can be tested (a small alt-text check follows this list)
 A few examples
 Case study
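A minimal Section 508 spot check, assuming the BeautifulSoup library and an illustrative HTML snippet: every img element should carry an alt attribute (an empty alt is fine for purely decorative images). Real audits combine tools such as axe or WAVE with manual keyboard and screen-reader testing.

```python
# A tiny accessibility spot check: flag <img> elements with no alt attribute.
# The HTML below is illustrative; in practice you would fetch real pages.
from bs4 import BeautifulSoup

html = """
<img src="logo.png" alt="Company logo">
<img src="divider.png" alt="">
<img src="chart.png">
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    if img.get("alt") is None:
        print("Missing alt text:", img.get("src"))
# Missing alt text: chart.png
```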
 James Bach: www.satisfice.com
 Michael Bolton: www.developsense.com
 Ministry of Testing: http://www.ministryoftesting.com/resources/exploratory-testing/
 Presentations on Mind Maps and Exploratory Testing by
 Ajay Balamurugadas
 Meeta Prakash
 Lots of stats available on Wikipedia and leading online publishers
 12 years of overall working experience in the core industry
 Worked in the Education, Finance & Travel sectors
 Switched roles between developer, QA engineer, Business Analyst, and many more
 Now an entrepreneur running a web development firm and a test consultancy
 Easy to connect with and loves to travel
 Call/Whatsapp - 9920996021
 Email : younus@rethinkingweb.com