22NTC Fundraising Session: 123 Testing…

  1. Let Us Know: How Often Are You Testing At Your Organization? Regularly. Occasionally. Not enough! Maybe too much?
  2. Digital Tests That Worked (And Some That Didn’t) 123 Testing
  3. Session Objectives This rapid-fire session will feature results from a variety of digital tests for email, landing pages, mobile, and digital ads. We’ll learn which tests helped boost fundraising revenue and engagement, which tests didn’t work, and how to develop a culture of testing at your nonprofit. You’ll walk away from this session with new ideas of what to test and the encouragement you need to try new things. 3
  4. Brought to you by: Caitlin Richard Senior Email Marketing Manager Heifer International she/her/hers Caitlin.Richard@heifer.org Cathy Whitlock Associate Vice President of Online Communications Parkinson’s Foundation she/her/hers cwhitlock@parkinson.org Megan Buchheit Digital Account Manager Lautman Maska Neill & Co. she/her/hers mbuchheit@lautmandc.com
  5. Creating a Culture of Testing Be open. Nothing is precious. Test your assumptions. You know your organization best. Some tests are going to lose – and that’s a good thing. 5
  6. How to Approach Testing Make a plan and build on past test results for future testing. Don’t test more than one thing at a time (in most cases). Test … and then retest! Work to optimize results in advance of important campaigns. Consider capacity at your organization and be realistic about what you can and can’t do. 6
  7. Making a Testing Plan: Landing Page Examples Removing extra fields • 1 page versus 2 pages • Ask string • Adding mobile payment options 7
  8. How to Evaluate Test Results • Look for statistical significance (see the significance-check sketch after this slide list). • Focus on the metrics that matter. For fundraising, we usually focus on response/conversion rate and page completion rate. 8
  9. What If Test Results Aren’t What You Hoped? • Consider your organizational priorities. You don’t always have to roll out with the winner of the test if you have different priorities as an organization. • Sometimes the test just helps you better understand and contextualize results moving forward. • Getting your audience used to something different can take time. 9
  10. Parkinson’s Foundation Rebrand Example 10 vs
  11. Types of Tests Email Landing Pages Sustainer Mobile Ads 11
  12. Email 12
  13. Email Testing…123 Sender • Subject line • Button color • Button copy • Send time • Messaging and simple copy tests 13
  14. Renew vs. Donate Button 14 Control Test Winner! Increased page completion rate and raised over 2X the revenue!
  15. Ask Amount: $20 vs. $40 15 Winner! Over 2x as many gifts!
  16. Email Send Time Control: 9:00 a.m. Test: 3:00 p.m. 16 Winner! Increased page completion rate and response rate!
  17. Personalizing Art With Donor’s Name 17 Control Test Winner! Higher revenue and increased click through rate!
  18. Personalizing with Name (Non-Fundraising) 18 Tie! The test didn’t make any difference.
  19. Personalizing Email Ask by Past Gift 19 Tie! The test didn’t make any difference.
  20. Recent Gift Amount vs. Highest Previous Gift 20 Winner! More gifts and higher average gift!
  21. Invoice Email vs. Plain Text Email Winner! More than doubled response rate and revenue!
  22. Landing Pages 22
  23. Landing Page Testing 123 Ask string • Regular vs. reverse ask string • Number of ask amounts (ex. 3 asks vs. 5 asks) • Higher vs. lower ask string Removing/adding fields • Do you need everything you currently ask for on your donation form? 23
  24. Mobile Payment Options 24 Control Test Winner! Increased page completion rate and average gift.
  25. Covering the Transaction Fee 25 Control Test Tie! No difference between control and test
  26. Landing Page Design Control Test Winner! Had a higher average gift and 2x the revenue
  27. Sustainers 27
  28. Sustainer Testing 123 Monthly ask string • Adding a dual 1x and monthly ask to emails • Using a monthly-first ask for active donors 28
  29. Sustainer Ask String 29 Control Test Tie! No difference between control and test.
  30. Sustainer First Ask 30 Control: Test: Winner! Increased sustainers but lowered page completion rate.
  31. Naming a Sustainer Program 31 Control Test Winner! Test had a higher response rate and average gift.
  32. Naming a Sustainer Program: Part 2 32 Control Test Winner! Had a higher response rate and average gift.
  33. Sustainer Segmentation • Who are you asking to become a sustainer? You might consider looking at: • Length of time on email file • Length of time since most recent gift • Length of time that they’ve been a donor • Previous giving level • First gift channel or program (see the segmentation sketch after this slide list) 33
  34. Mobile 34
  35. Mobile Testing 123 SMS vs. MMS • Send time • Many of the tests that work for email and landing pages also apply to mobile. Mobile is also a great opportunity to make your testing multi-channel and to continue tests that you may be doing in email. 35
  36. Art Design for Mobile Messages 36 Control Test Winner! The control won on response rate.
  37. Premium Offer vs. Non-Premium Offer 37 Control Test Winner! More than doubled donations and had nearly 2x the response rate!
  38. Ads 38
  39. Digital Ads Testing 123 Call to action button copy • Call to action buttons on ad creative • Copy & messaging • Including emojis in ad headlines 🚨❗🛑 • Static vs. gif ads • Text treatment on ads • Channels for ads • Like mobile, many of the email and landing page tests also apply here. 39
  40. Ad Landing Page Control Test Winner! Higher page completion rate and better cost/result
  41. One Time vs. Monthly Giving 41 Control Test Winner! Higher ROI on ad spend
  42. Audience Testing in Social Media Growth Campaign 42
  43. Ad Language Control Test Winner! Higher conversion rate and lower cost/result
  44. As we wrap up, remember… Start small. Have a plan of action. Don’t be afraid to retest. Things can get stale or change as technology changes! Have a procedure for keeping track of your test results. Make sure you are implementing your winning tests! Share your learnings with the rest of your team. Don’t limit yourself. There are so many tests beyond the ones we mentioned today. 44
  45. Questions? Please reach out to us at any time! MBuchheit@lautmandc.com | Caitlin.Richard@heifer.org | CWhitlock@parkinson.org 45
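
The sketch below illustrates the “look for statistical significance” step from the How to Evaluate Test Results slide. It is not part of the original deck: the send and gift counts are made-up placeholders, and it assumes a simple two-proportion z-test (which the presenters do not prescribe) as one common way to check whether a difference in response/conversion rates is more than random chance.

from math import sqrt, erf

# Two-proportion z-test: are two response/conversion rates different
# beyond what random chance would explain?
def two_proportion_z_test(gifts_a, sends_a, gifts_b, sends_b):
    rate_a = gifts_a / sends_a
    rate_b = gifts_b / sends_b
    pooled = (gifts_a + gifts_b) / (sends_a + sends_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_a - rate_b) / std_err
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical 50/50 email split (placeholder numbers, not session results).
z, p = two_proportion_z_test(gifts_a=130, sends_a=50_000,
                             gifts_b=175, sends_b=50_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below ~0.05 is a common significance bar

A “Tie!” result like the ones called out in the deck is simply a case where this kind of check (or your platform’s built-in equivalent) finds no meaningful difference.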
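
A minimal sketch of the Sustainer Segmentation slide’s criteria follows. The Donor fields mirror the factors listed on that slide, but the field names and the specific thresholds are illustrative assumptions, not recommendations from the session.

from dataclasses import dataclass
from datetime import date

@dataclass
class Donor:
    email_signup_date: date       # length of time on the email file
    first_gift_date: date         # how long they have been a donor
    last_gift_date: date          # time since most recent gift
    highest_previous_gift: float  # previous giving level
    first_gift_channel: str       # first gift channel or program

def sustainer_ask_candidates(donors, as_of=None):
    """Pick donors who look like reasonable targets for a monthly-giving ask."""
    as_of = as_of or date.today()
    picked = []
    for d in donors:
        on_file_days = (as_of - d.email_signup_date).days
        tenure_days = (as_of - d.first_gift_date).days
        recency_days = (as_of - d.last_gift_date).days
        if (on_file_days >= 180             # on the email file ~6+ months
                and tenure_days >= 365      # a donor for a year or more
                and recency_days <= 365     # gave within the last year
                and d.highest_previous_gift >= 10):  # modest giving-level floor
            picked.append(d)
    return picked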

Editor's Notes

  1. Megan How does your organization approach testing? Do you test regularly? Is there a lot of resistance to it?
  2. Megan
  3. Megan
  4. Caitlin / Cathy / Megan Caitlin and Cathy intro your organizations
  5. Megan
  6. Cathy Keep a testing log to keep track of results
  7. Megan Think about the problem you’re trying to solve
  8. Megan Statistical significance = the results are not happening by chance/they aren’t random
  9. Caitlin – Accelerate Getting your audience used to something different can take time
  10. Cathy
  11. Megan
  12. Megan Introduce the idea of “123 testing”
  13. Caitlin Address Apple Mail privacy changes
  14. Megan Tried this test over the course of 2 emails; not a true renewal program but we know that using renew can make a big difference.
  15. Caitlin This may seem like such a small thing to test: 1 food basket vs. 2 food baskets, a difference of $20. This was a 50/50 split with these static asks (see the 50/50 split sketch after these notes). Click rate was slightly higher for 1 basket, but the difference in donor rate, page completion, and number of gifts was huge! Both drove to the same landing page.
  16. Megan You should always retest this kind of test – be aware of time zones. Actors Fund 02 2021 – tested over 3 emails. Increased page completion rate by 6% and response rate by 0.06%!
  17. Cathy Winner: Increased click through rate, almost double the revenue, higher average gift
  18. Caitlin Testing this multiple emails, and made no difference in click through
  19. Caitlin Heifer – adding the previous catalog item to the email did not make a difference. CONTROL: 50% of the file received an email with a GIF showing the different gifts you could give. TEST: 50% of the file received an email that reflected their most recent catalog gift, if that gift contained one of the 5 test versions (alpaca, bees, chicks, goats, GWNM); if their most recent gift didn’t contain any of those, they were shown the GIF. No significant difference in click rate, click to open, or unsubscribe rate, and the exact same donor rate. Personalization is usually touted as a best practice, but when it doesn’t move the needle, it may not be worth the additional effort to create the personalized content.
  20. Caitlin Click through and Click to open only marginally favored MRC, but MRC generated more gifts and a higher avg gift.
  21. Megan HRC April 2021 Renewal Response rate more than doubled (0.13% vs. 0.27%). Invoices do well across the board – they don’t need to be flashy.
  22. Megan Check for questions from the previous section
  23. Cathy Consider what your pages look like on mobile
  24. Megan PayPal, done similar testing for Apple Pay and other payment options
  25. Cathy This is a situation where we didn’t want to see a difference, so we could feel comfortable rolling out the test without it hurting page completion rate. PF 05 2019 – no difference. TF 02 2020 – no difference.
  26. Megan RIF 05 2021 Landing Page Test: Spring Book Drive vs. Ask String Buckets with Images. Mention that in the test image, the amounts reflect the ask string amounts.
  27. Megan Check for questions from the previous section
  28. Caitlin
  29. Cathy Continued to use the control ask string; wanted to see if a longer ask string would increase response rate. Control had a slightly higher response rate/average gift.
  30. Megan While the test won, we recommended using this strategically since it also lowered page completion rate – a good one to retest in the future. There are trade-offs in testing.
  31. Cathy
  32. Megan The sustainer program had more name recognition and was also paired with a very emotional story.
  33. Caitlin
  34. Megan Check for questions from the previous section
  35. Megan
  36. Megan This was a multi-channel test. The test had a higher click-through rate.
  37. Megan
  38. Megan Check for questions from the previous section
  39. Cathy Calls to action – do you want to include one? What do you want it to say?
  40. Megan RIF In terms of ROAS and page completion, the test page (RIF.org/give-today) performed slightly better than the control for non-donors, while for donors, the control page performed best for page completion and cost/result. Facebook optimized toward the control. We therefore decided to continue to use the control page, since it is best for gift sourcing.
  41. Cathy One-time vs. monthly giving ask in year-end fundraising campaign. Result: one-time won by far at this time of year. Right test, wrong time
  42. Cathy Creating lookalike lists of caregivers, patients, etc., and testing cost per follow, etc.
  43. PCRM – Vegan vs. Plant Based. Control: 63 conversions at $30 per conversion. Test: 2 conversions at $72 per conversion. (See the metrics sketch after these notes.)
  44. Megan – Caitlin and Cathy
  45. Megan
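
A minimal sketch of the 50/50 split mentioned in note 15. It assumes the email file is available as a plain list of addresses; the fixed seed only keeps the random assignment reproducible.

import random

def split_fifty_fifty(email_file, seed=2022):
    """Randomly assign each address to a control panel or a test panel."""
    shuffled = list(email_file)
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]  # (control, test)

control, test = split_fifty_fifty(["a@example.org", "b@example.org",
                                   "c@example.org", "d@example.org"])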
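
A minimal sketch of the ad metrics behind notes 40–43 (cost per result and return on ad spend). Only the formulas are implied by the deck; the spend, result, and revenue figures below are hypothetical placeholders, not campaign data from the session.

def cost_per_result(spend, results):
    """Cost/result: what each conversion (gift, lead, follow) cost to buy."""
    return spend / results if results else float("inf")

def return_on_ad_spend(revenue, spend):
    """ROAS: revenue generated per dollar of ad spend."""
    return revenue / spend if spend else 0.0

# Hypothetical control vs. test ad sets (placeholder numbers).
for name, spend, results, revenue in [("control", 1_800.0, 60, 3_600.0),
                                      ("test", 1_800.0, 45, 2_250.0)]:
    print(f"{name}: cost/result = ${cost_per_result(spend, results):.2f}, "
          f"ROAS = {return_on_ad_spend(revenue, spend):.2f}")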