A/B Testing 101
The Basics & Best Practices
A/B Testing: Why?
• User empathy: collect, analyze & solve user pain points
• Increased ROI
• Reduced bounce rate
• Take intelligent risks
• Data-driven decision making
• Introduce & scale change intelligently
Problem statements:
• Increased unqualified leads? Lower engagement? Higher bounce rate?
Answer: A/B Testing
A/B Testing: Scope
• Web pages: headlines, images, call-to-action text/buttons, mentions, badges, etc.
• API endpoints: Platform-as-a-Service, Data-as-a-Service, or Software-as-a-Service APIs (testing which APIs in the developer network convert at a higher rate)
A/B Testing: Definition
A/B Testing = showing 2 variants to different target personas (at the same time) + comparing the results to determine which variant delivers a higher ROI
Example:
Landing page changes to determine which change results in higher conversion metrics.
Examples of conversion metrics:
• # of products purchased
• # of leads generated
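As a minimal sketch of what "comparing the results" can look like in practice (not part of the original deck): the visitor and purchase counts below are hypothetical, and a hand-rolled two-proportion z-test is used to check whether the observed difference between variants is likely more than noise.

from math import sqrt, erf

def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who completed the conversion goal."""
    return conversions / visitors

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # p-value from the normal CDF

# Hypothetical numbers: variant A (control) vs variant B (challenger)
visitors_a, purchases_a = 10_000, 420      # 4.20% conversion
visitors_b, purchases_b = 10_000, 465      # 4.65% conversion

print(f"A: {conversion_rate(purchases_a, visitors_a):.2%}")
print(f"B: {conversion_rate(purchases_b, visitors_b):.2%}")
print(f"p-value: {two_proportion_z_test(purchases_a, visitors_a, purchases_b, visitors_b):.3f}")

A small p-value (commonly below 0.05) suggests the lift is real; otherwise the test needs more traffic or a longer duration before declaring a winner.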
A/B Testing: How does it work?
• Research: collect data on visitor usage => track metrics => identify problems => root-cause analysis of the problem
• Hypothesis: build a hypothesis based on the findings from the research
• Alternatives: build alternatives to the existing solution to validate the hypothesis
• Validate: test the alternatives in parallel for a defined duration. The test duration can be based on: existing state vs future state, # of alternatives, total # of personas/users, and % of personas/users to be tested (see the sizing sketch after this list)
• Implement: analyze the test results from the validation; measure, iterate & improve the alternatives until the best outcome is achieved, then implement the winning alternative
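As a rough illustration of the sizing question in the Validate step (this sketch is not from the deck): estimate how many visitors each variant needs and how long the test would run. The baseline conversion rate, target lift, and daily traffic below are hypothetical, and the standard approximate sample-size formula for comparing two proportions at 95% confidence and 80% power is used.

from math import ceil, sqrt

def sample_size_per_variant(baseline: float, lift: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed per variant to detect a relative lift
    over the baseline conversion rate (95% confidence, 80% power)."""
    p1 = baseline
    p2 = baseline * (1 + lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar)) +
                 z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Hypothetical inputs: 4% baseline conversion, detect a 10% relative lift,
# 2 variants, 5,000 eligible visitors per day, 50% of traffic in the test.
per_variant = sample_size_per_variant(baseline=0.04, lift=0.10)
variants = 2
daily_visitors_in_test = 5_000 * 0.50

days = ceil(per_variant * variants / daily_visitors_in_test)
print(f"Visitors per variant: {per_variant}")
print(f"Estimated test duration: {days} days")

Smaller baselines or smaller target lifts push the required sample size up quickly, which is why the % of users included in the test directly drives how long it must run.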
A/B Testing: Best practices
• Plan for optimization: measure, iterate, improve & scale intelligently
• Prioritize & focus while testing alternatives (quantity vs quality)
• Use "smart statistics" (data science) to eliminate bias when drawing outcomes from a test (probability, duration & comparison of improvements)
• Control vs challenger: compare the variant built from the hypothesis against the existing experience without it
• Build test samples intelligently (see the bucketing sketch after this list)
• Set a goal for your test
• Choose the best tool for the job
• Collect feedback from users during the test
• Consider external factors
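One common way to "build test samples intelligently" and keep the control vs challenger split stable is deterministic bucketing: hash a user ID so the same visitor always sees the same variant. A minimal sketch under those assumptions; the experiment name and 50/50 split below are hypothetical, not from the deck.

import hashlib

def assign_variant(user_id: str, experiment: str, challenger_share: float = 0.5) -> str:
    """Deterministically assign a user to 'control' or 'challenger'.
    The same user_id always lands in the same bucket for a given experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF   # map the hash to a value in [0, 1]
    return "challenger" if bucket < challenger_share else "control"

# Hypothetical usage: 50/50 split for a landing-page headline test
for uid in ["user-001", "user-002", "user-003"]:
    print(uid, assign_variant(uid, experiment="landing-headline-v2"))

Seeding the hash with the experiment name keeps assignments independent across concurrent tests, so one test's split does not bias another's.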
Examples of A/B testing by domain
• Media: engagement & growth for online users
• B2B: # of qualified leads, # of trials, # of conversions
• Search: search modals, search relevance, search results
• eCommerce: # of orders (successful vs cancelled)
• Platform-as-a-Service, Data-as-a-Service or Software-as-a-Service: # of successful API calls, engagement & growth
A/B Testing Tools
Criterion | Convert Experiences | Google Analytics | Instapage
Pricing | Paid: higher price (trial) | Freemium | Paid: lower price (trial)
Product direction | Strong | Available | Strong
Support | High-quality support | Low | Meets expectations
Ease of use | Can be better | Meets expectations | Great
Variation testing | Strong | Meets expectations | Meets expectations
Reporting | Strong | Meets expectations | Meets expectations
Personalization | Strong | Meets expectations | Meets expectations
Industry | SMB, mid-market | SMB, mid-market, enterprise | SMB, mid-market
** Disclaimer: based on my experience using these tools and on information available on the internet.
References
• https://www.g2crowd.com/categories/a-b-testing
• https://vwo.com/ab-testing
• https://offers.hubspot.com/an-introduction-to-ab-testing
