Evolution Not Revolution: Why Optimizing Beats Redesigning

Redesigning a web site without data and testing is like cooking in the dark. Someone is probably going to end up burned, and it's likely to be you.

How do you keep the CEO from designing the site him or herself? How do you hold back the committee of people who want home page real estate for their pet projects? And if the answer is through analytics, how do you make good aesthetic decisions while paying attention to data?

Kate O'Neill addresses these questions, speaking from her experience managing data-driven incremental redesigns of sites like Netflix, Magazines.com, and many others. She is Founder and CEO of [meta]marketer, an optimization-focused marketing agency that helps clients maximize the value in their online presence.

(Presented at BarCamp Nashville 2009, also known as bcn09 or #bcn09.)

Published in: Technology, Design

Evolution Not Revolution: Why Optimizing Beats Redesigning

  1. Evolution, not Revolution. Why Optimizing Beats Redesigning. [presented by] Kate O’Neill of [meta]marketer Image credit: funfonix.com
  2. Who am I to make these claims? And who are you to listen?
  3. Why I’m a Believer • Helped lead the data-informed rollout of an interactive redesign at Netflix that is still standard for e-commerce • Oversaw many multivariate tests at Magazines.com that resulted in a 40% lift in conversion rate YOY • Now own a company that focuses on this kind of optimization for clients, with great results
  4. Why You May Choose to Believe • Working in a company torn by politics or conflicting interests in web design or features? • Designing sites freelance and always encountering the same resistance? • Running personal web sites and blogs you’d like to see perform better?
  5. How are most design decisions made?
  6. HPPO • Highest Paid Person’s Opinion • Not irrelevant, but not comprehensive.
  7. By Committee • “I need my department to be featured on the home page!” • “Can we use something other than red for the Buy buttons?”
  8. Designer’s Aesthetic • Flash! • Prettiness over performance
  9. Finger in the Wind • What seems to be trendy • What someone mentioned at a party • What’s reported in the news
  10. In other words... in a vacuum.
  11. How should design decisions be made?
  12. You need user input. You are not your audience. Even if you are.
  13. You need data. Users lie. Aggregate data doesn’t.
  14. Data trumps opinions. Even highly paid ones. HPPO: Use conversion-related metrics to determine executive-relevant strengths and weaknesses of the site.
  15. If it isn’t interesting to the user, ditch it. Committee: Use engagement metrics to determine what keeps users on the site.
  16. Think users like your design? Prove it. Design for Design’s Sake: If your success can’t be measured, you can’t defend it. And your input won’t be appreciated.
  17. “Great idea! I’ll add it to the testing roadmap.” Finger in the Wind: Trendy ideas are worth knowing about. But they may not work in your environment.
  18. Balance objective and subjective input: what people think or feel versus what people do
  19. Balance qualitative and quantitative input: what you can intuit versus what you can measure
  20. How to Balance Input Gathering [chart: methods arranged along subjective-objective and qualitative-quantitative axes: analytics data, A/B or MVT results, surveys, focus groups, usability studies / interviews]
  21. What to do? ‣ Balance subjective & objective testing (And know that you may get it wrong) ‣ Find the story behind the story (But know that you may get it wrong) ‣ Look for a narrative in onsite testing (But know that you may get it wrong) ‣ Look for the unobvious AND the obvious (And know that you may get it wrong)
  22. If you’re still going to get it wrong, why test? Because you can not only measure lift when you’re right... (Woo hoo!) you can also contain risk when you’re wrong. (And it just might save your job.)
  23. Test your way to greatness.
  24. How a Redesign Can Go Wrong: HPPO Edition
  25. Netflix Branding Gaffe, circa 2000: TV?! Static?! Totally wrong customer experience emphasis.
  26. Improved: Playing off of the movie theater experience.
  27. How a Redesign Can Go Wrong: Usability Edition
  28. Look Inside: Setup
  29. Look Inside: Results • Visitors more likely to click on the Preview offer (4.89% average) vs. Explore (3.85%) • Both variations resulted in a drop in conversion • Drop in conversion only slightly greater with Preview (-0.87%) than with Explore (-0.76%) • Overall, more established titles (Time, Sports Illustrated, People) had the lowest Click % as well as a below-average drop in conversion of all viewer-enabled titles • Somewhat lesser-known titles (SI Kids, Health, StyleWatch, Money) had the highest Click % and a higher-than-average drop in conversion of all viewer-enabled titles
  30. What to do? (Once more with feeling!) ‣ Balance subjective & objective testing (And know that you may get it wrong) ‣ Find the story behind the story (But know that you may get it wrong) ‣ Look for a narrative in onsite testing (But know that you may get it wrong) ‣ Look for the unobvious AND the obvious (And know that you may get it wrong)
  31. If all else fails: [meta]marketer can help. Thank you! Kate O’Neill, Founder / CEO kate@metamarketer.com 615-852-META twitter: @kateo / @metamarketer
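Slide 22's argument, that testing both measures lift when you're right and contains risk when you're wrong, and the Look Inside results in slide 29 both come down to comparing two conversion rates. The sketch below shows one common way to do that comparison: relative lift plus a two-proportion z-test. It is a minimal, stdlib-only illustration; every count in it is made up and none of the numbers come from the Netflix or Magazines.com tests.

```python
import math

def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors

def lift(control_rate, variant_rate):
    """Relative change of the variant over the control (e.g. -0.125 = -12.5%)."""
    return (variant_rate - control_rate) / control_rate

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal tail probability via erf, doubled for a two-sided test.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical test: control converts at 4.0%, variant at 3.5%.
control = (400, 10_000)   # (conversions, visitors)
variant = (350, 10_000)

rate_c = conversion_rate(*control)
rate_v = conversion_rate(*variant)
print(f"lift: {lift(rate_c, rate_v):+.1%}")   # prints "lift: -12.5%"
print(f"p-value: {two_proportion_z(*control, *variant):.3f}")
```

A large p-value means the observed change could be noise; either way, the variant's downside is bounded by the traffic you routed to it, which is exactly the risk-containment point the deck makes.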