Data-Driven PR Metrics: Share of Voice, Competitive Benchmarking, Correlations

Presentation given at eMetrics Summit in San Francisco on April 1, 2015. Covers PR measurement through three specific tactics - share of voice, competitive benchmarking and correlations.


  1. DATA-DRIVEN PR MEASUREMENT
     Sandra Fathi, President, Affect
     @sandrafathi | web: affect.com | blog: techaffect.com | email: sfathi@affect.com
     eMetrics Summit, San Francisco, March 29 - April 2, 2015
     Slides: www.slideshare.net/sfathi
  2. ABOUT ME
     • Sandra Fathi
     • President, Affect
     • Public Relations, Social Media, Marketing
     • Council of PR Firms
     • PRSA Past Positions:
       – Tri-State Chair
       – NY Chapter President
       – Technology Section Chair
  3. ABOUT AFFECT
     Technology, Healthcare, Professional Services
  4. PR MEASUREMENT
     Three concepts for discussion:
     • Share of Voice
     • Competitive Benchmarking
     • Correlations
  5. PART I: SHARE OF VOICE
  6. DEFINITION
     Share of Voice: comparing your crucial performance metrics against those of competitors or the market.
     • You have to measure something
     • What you measure needs to be analyzed proportionately against competitor data (or market data) to establish market share
  7. THE FORMULA
     (Number of Conversations Including Your Company / Total Conversations on a Topic) * 100 = % SOV
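     A minimal sketch of this calculation in Python (the function name and sample counts are illustrative, not from the deck):

         def share_of_voice(company_mentions, total_conversations):
             """Return share of voice as a percentage of all conversations on a topic."""
             if total_conversations == 0:
                 return 0.0
             return company_mentions / total_conversations * 100

         # Illustrative numbers only
         print(round(share_of_voice(300, 589), 1))  # -> 50.9 (% SOV)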
  8. ADVERTISING CONCEPT
     25% SOV vs. 75% SOV
  9. SHARE OF VOICE I
     Pie chart: of all conversations, 28% talk about me; the other 72% do not.
  10. SHARE OF VOICE II
      Stacked chart: quarterly (Q1-Q4) share of voice for Our Company vs. Competitors A, B and C, scaled to 100%.
  11. SHARE OF VOICE III
      • 589 industry articles in total
      • 300 articles mention my company
      • 145 articles mention a competitor
      • 87 articles mention both
      • 51% SOV in the industry
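      Reading the slide's figures, the 51% appears to come from dividing the company's 300 articles by the 589 total industry articles: 300 / 589 ≈ 0.509, i.e. roughly 51% SOV.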
  12. KEEP IN MIND
      • Share of voice should be defined for a period of time (finite start and end).
      • Share of voice is often most useful when limited to a single platform or medium, for example business press coverage or Twitter.
      • Share of voice can be overwhelming if trying to look at too large a segment or industry. Try choosing SOV among top competitors or in key interest areas.
  13. SOV: SOCIAL MEDIA ANALYTICS PLATFORMS
  14. SOCIAL MENTION
  15. SIMPLE EXCEL FORMULA
  16. ONLY PART OF THE STORY
      • Doesn't consider sentiment
      • Doesn't consider sources (exclude self-produced/owned media)
      • Doesn't consider quality, only quantity (is a NYT blog the same as an obscure geek's tweet?)
      • Don't accept the data blindly – human verification is required with any tool
  17. OTHER APPLICATIONS & CONSIDERATIONS
      Considerations:
      • Apply sentiment or tonal filters (positive/negative)
      • Apply qualitative measures (by tier or by type)
      Applications:
      • Industry trends/hot topics (e.g. SOV on cloud security)
      • Specific products or services
      • Broken down by geographic or demographic parameters (e.g. SOV in the 18-25 market)
  18. PART II: COMPETITIVE BENCHMARKING
  19. DEFINITION
      Competitive Benchmarking: the continuous practice of comparing a company's practices and performance metrics against the most successful competitors in the industry.
      • You measure processes and results
      • You must identify a 'benchmark', or indicator, that serves as the unit of measure for comparison
      • The desired outcome is to understand which processes lead to greater success (best practices) in order to improve your company's performance
  20. COMPETITIVE BENCHMARKING
      • Identify my competitive set for comparison
      • Choose my units of measure: press coverage
      • Set parameters: top 20 business and trade
      • Define a time period: 6 months
      • Choose a tool (news monitoring service) or begin manual research
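      A minimal sketch of such a benchmarking tally in Python (the outlets, dates and article records below are made up for illustration; only the competitor names come from the deck):

          from collections import Counter
          from datetime import date

          # Each record: (publication date, outlet, company mentioned)
          articles = [
              (date(2014, 7, 12), "NetworkWorld", "Radware"),
              (date(2014, 8, 3),  "eWeek",        "F5"),
              (date(2014, 9, 21), "NetworkWorld", "Citrix"),
          ]

          TOP_OUTLETS = {"NetworkWorld", "eWeek"}             # parameter: top business/trade outlets
          START, END = date(2014, 7, 1), date(2014, 12, 31)   # parameter: 6-month window

          coverage = Counter(
              company
              for published, outlet, company in articles
              if START <= published <= END and outlet in TOP_OUTLETS
          )
          print(coverage.most_common())  # articles per competitor within the window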
  21. EXAMPLE: RADWARE
      Objective:
      • Build & Maintain Radware's Position as a Thought Leader on ADC and Security
      • Maximize Radware's Overall Public Relations Results
      Strategy:
      • Compare and Contrast Radware's Press Release Output with Top 3 ADC and Security Competitors
      • Compare and Contrast Radware's Coverage with Top 3 ADC and Security Competitors
      • Analyze Results
      • Apply Best Practices and Lessons Learned to Radware to Improve Overall Performance
  22. EXAMPLE: RADWARE
      Application Delivery | Network Security
  23. METHODOLOGY
      • Analysis of press release strategy and resulting coverage over a 6-month period
      • Specifically as it relates to relevant products or business units
      • Only in top 20 business and industry/sector publications
  24. METHODOLOGY II
  25. RADWARE PRESS RELEASES
      Press releases by topic (share): Security 43%, ADC 27%, Both* 12%, Other* 18%
      Press releases by topic (count): Security 14, ADC 9, Both 2, Other 6
      * 'Both' includes releases related to both security and ADC; 'Other' includes non-product releases (e.g. company news, financial announcements, etc.)
  26. ADC COMPETITORS: SECURITY & ADC
      Press releases vs. number of articles:
                    Press Releases   Articles
      Radware             11             46
      A10                 11             39
      Citrix               2             31
      F5                  15             92
      * 'Both' includes releases related to both Security and ADC
  27. ADC COMPETITORS: ADC ONLY
      Press releases vs. number of articles:
                    Press Releases   Articles
      Radware              9             44
      A10                 11             16
      Citrix               2             31
      F5                   7             38
  28. ADC COVERAGE BY TYPE
                Product  Customer  Partner  Acquisition  Commentary  Report  Other
      Radware       7        2        1          0            1        34      1
      A10           1        1       10          0            1         0     26
      Citrix       27        0        0          0            2         0      2
      F5           24        0        3          0           12         0     37
  29. COVERAGE QUALITY (ADC): FEATURE VS. MENTION
                Features   Mentions
      Radware      74%        26%
      A10          10%        90%
      Citrix       23%        77%
      F5           50%        50%
  30. ADC CONCLUSIONS
      • Number of press releases did not correlate to number of articles
      • Radware was leading in SOV on the key topic (ADC) amongst competitors, and the quality of coverage by comparison was significant (Validation!)
      • The overwhelming majority of Radware's ADC coverage was generated by reports (Validation!), with product and customer news trailing far behind
      • Competitors were leading with product news and capturing media attention (Opportunity!)
      • No one was successfully telling the customer story (Opportunity!)
  31. SECURITY COMPETITORS
      Press releases vs. number of articles:
                    Press Releases   Articles
      Radware             14             84
      Arbor               23            164
      Imperva             28             68
      Prolexic            10             68
  32. SECURITY COVERAGE BY TYPE
                 Product  Customer  Partner  Acquisition  Attack  Commentary  Report  Other
      Radware        8        9        6          0          28        22        11      2
      Arbor         19        1        1         18          29        44        52      0
      Imperva        2        1        1          0           8        18        19     19
      Prolexic       0        0        0          0          43         2        14      9
  33. COVERAGE BY QUALITY (SECURITY): FEATURE VS. MENTION
                 Features   Mentions
      Radware       35%        65%
      Arbor         31%        69%
      Imperva       34%        66%
      Prolexic      54%        46%
  34. SECURITY CONCLUSIONS
      • Radware is #2 in overall SOV but the quality is not as strong (more mentions vs. features)
      • Leading customer and partner conversations (Validation)
      • Good job at story hijacking (responding to security hacks) but room for improvement (Validation)
      • Competitors winning at report coverage and commentary (Opportunity!)
  35. CONSIDERATIONS
      • Good for understanding what worked but not necessarily 'how' it worked
      • Costs for research may outweigh benefits of insights
      • Once you've identified the 'best practices' you may or may not be able to replicate them
      • Consider non-competitor companies to benchmark
      • Do you want to 'emulate' or 'innovate'?
  36. PART III: CORRELATIONS
  37. DEFINITION
      Correlation: a mutual relationship, or interdependence, between two or more things.
      • In the absence of being able to prove 'causality', you may be able to demonstrate a 'correlation' that shows the impact of a PR or marketing program
      • A correlation is positive when the values increase together
      • A correlation is negative when one value increases as the other decreases
  38. TYPES OF CORRELATION
      Source: MathisFun.com
  39. THE FORMULA
      Pearson's Correlation:
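      The formula image is not reproduced in the transcript; Pearson's correlation coefficient for paired samples (x_i, y_i) is:

          r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\;\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}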
  40. FUNCTION IN EXCEL
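      The screenshot is not reproduced here; Excel computes Pearson's r directly with the built-in CORREL function, e.g. =CORREL(B2:B13, C2:C13), where the two ranges (illustrative here) hold the paired data series.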
  41. CORRELATION IN EXCEL
  42. FUNCTION IN EXCEL
  43. SCATTER CHART
  44. LINE CHART
      (chart annotation: Acquisition)
  45. 3-D LINE CHART
  46. MULTIPLE DATA SETS
      Chart: Sales, Web Traffic and Press Coverage plotted together across Q1-Q4.
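      A minimal sketch of correlating two such series in Python (the quarterly figures below are made up for illustration, not the deck's data):

          import numpy as np

          # Hypothetical quarterly figures (Q1-Q4); replace with real data
          press_coverage = np.array([12, 18, 25, 31])          # articles per quarter
          sales          = np.array([900, 1400, 2100, 2600])   # units per quarter

          # Pearson's r between the two series (equivalent to Excel's CORREL)
          r = np.corrcoef(press_coverage, sales)[0, 1]
          print(f"Pearson's r = {r:.2f}")  # close to +1 indicates a strong positive correlation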
  47. SPURIOUS CORRELATION
      Source: TylerVigen.com
  48. SPURIOUS CORRELATION
      Source: TylerVigen.com
  49. CONSIDERATIONS
      • Use correlations cautiously and don't trust the math blindly
      • The visuals often tell a story as well
      • Remember that correlation is not causality; it can only help as an indicator or potentially predict probability
      • Data is still better than your opinion
  50. FINAL THOUGHTS
      • In measurement, speak the language of the C-Suite
      • Excel is still the best dashboard for data visualization
      • Don't be afraid to learn that you are wrong
      • Don't be afraid to change direction
      • Use the data to gain executive support:
        – Strategy
        – Resources
        – Headcount
        – Budget
  51. THANK YOU
      CONTACT:
      Sandra Fathi, President, Affect
      @sandrafathi | web: affect.com | blog: techaffect.com | email: sfathi@affect.com
      Slides: www.slideshare.net/sfathi
