
Toys R Us marketing database RFP responses, Ph I, Charles de Gruchy

I was hired by Eleanor Hong, VP of Marketing, to run the marketing database RFP project. The project, with multiple revisions, ran from late 2007 through most of 2008, with significant changes to the financials as the numbers came in. Not surprisingly, the price was too high, and they ended up staying with Harte-Hanks.



  1. RESTRUCTURED RFP RESPONSE EVALUATION FOR MARKETING DATABASE MANAGEMENT, MARKETING AUTOMATION AND ANALYTICS SERVICES, BASED ON: 1. STRATEGY 2. DATA MANAGEMENT 3. ANALYTICS. Prepared by Charles de Gruchy, June 16, 2008
  2. CONTENTS 1. Background: 1.1 The restructured RFP 1.2 Objectives 1.3 Purpose of the restructured RFP 1.4 Gaps within the RFP as a result of the restructuring 1.5 Revised evaluation stages and process 1.6 Restructured RFP content 2. Evaluation: 2.1 Point of view behind the participant scoring 2.2 The scoring system 2.3 Participants' overall scoring: 2.3.1 Strategy 2.3.2 Data Management 2.3.3 Analytics 2.4 Short list recommendation based on the restructuring 2.5 Short list recommendation 2.6 Short list detailed performance 2.7 Harte-Hanks recommendation 3. Next Steps Appendix (A) Evaluation approach
  3. BACKGROUND
  4. 1.1 THE RESTRUCTURED RFP The RFP content and submissions have been restructured into three categories of response: 1. Strategy and account management 2. Data management 3. Analytics
  5. 1.2 OBJECTIVES The revised objectives for development of a category-based RFP are: 1. To identify ‘best of breed’ within each functional category 2. To understand capabilities 3. To award TRU/BRU marketing database management based on category expertise
  6. 1.3 PURPOSE OF THE RESTRUCTURED RFP The purpose of restructuring is to: 1. Summarize the response submissions against a more tightly defined set of performance criteria, 2. Apply those criteria to identify strengths and weaknesses, 3. Identify ‘best of breed’ within each performance category, and 4. Evaluate vendor capabilities with the intent of offering the business to more than one vendor by category of expertise.
  7. 1.4 GAPS WITHIN THE RFP AS A RESULT OF THE RESTRUCTURING As a result of the restructuring of the RFP, the analytics category fails to deliver an in-depth exploration of vendor capabilities or a complete review of industry ‘best of breed’ practitioners. Recommendations are provided to build out this RFP category to meet the review objectives.
  8. 1.5 REVISED EVALUATION STAGES AND PROCESS Stage 1 – Completed initial scoring of the 7 participants and recommended a short list of three. Stage 2 – Restructure the initial scoring by category and make recommendations regarding: A. participants B. RFP questions and focus by category. Stage 3 – Identify and select additional vendors to receive the enhanced category RFP.
  9. 1.6 RESTRUCTURED RFP CONTENT The restructured RFP is divided into 14 sections, grouped by category as follows: STRATEGY – 2.4.5 Customer Segmentation and Modeling; 2.4.6 Strategy and Enablement; 2.7 Client Service; 2.9 Training methodologies. DATA MANAGEMENT – 2.3 Data warehouse management; 2.4 Marketing automation services and reporting; 2.4.1 Campaign management; 2.4.2 Reporting, software installation and maintenance; 2.4.7 Facilities and data safeguarding; 2.5 Technology; 2.6 Implementation & migration; 2.8 Testing methodology. ANALYTICS – 2.4.3 Reporting; 2.4.4 Analytical leadership.
  10. 1.4 SUBMISSIONS Seven participants responded: Acxiom, Allant, Epsilon, Equifax, Harte-Hanks, Merkle and Rapp Collins. The Forrester Wave, Database Marketing Service Providers review (November 2, 2007), included all of the TRU/BRU RFP participants in its evaluation. Forrester results are referenced in the following slides.
  11. RESTRUCTURING THE CURRENT EVALUATION
  12. 2.1 POINT OF VIEW BEHIND THE PARTICIPANT SCORING The participant scoring rewards participants who delivered on the following: added value; an integrated service offering; an integrated channel view; a flexible account structure; and a flexible services organization. Participant performance levels (ranking and scoring within each of the RFP categories) reflect the degree to which the participant answered the following questions in the body of their response: 1. How well did the answer address the stated needs of TRU/BRU? 2. Did the participant answer the question? 3. Was the content directly, or indirectly, relevant to the question asked? 4. Did the answer meet or exceed the standard set by the other participants?
  13. 2.2 THE SCORING SYSTEM The design of the RFP questions provides for two types of answers: the first type answers the question “what”, and the second is more “open ended”*. The scoring system is the same for each type of question and is based on a three-part score of 1-3-9. For “what” questions: a score of ‘9’ for high or added-value performance; a score of ‘3’ for medium, i.e. met the minimum performance standard; a score of ‘1’ for low, i.e. below the relative standard established by the other participants. For the “open ended” questions: a score of ‘9’ means the answer exceeded the requirement; a score of ‘3’ means partial, either incomplete or unclear; a score of ‘1’ means either no answer was given or the answer was not relevant. * See appendix (A) for examples. (A worked sketch of this scoring arithmetic appears after the slide transcript.)
  14. 2.3 PARTICIPANTS' OVERALL SCORING RESULTS While all participants performed strongly, Allant, Equifax and Merkle emerged as the three strongest contenders. Despite their overall leadership, each firm's response raises further questions that need exploration: migration process; client service integration with analytics services; integrated analytics services (on- and offline); workflow management and TRU/BRU resources; ability to scale to service a business the size of TRU/BRU; questions regarding geography and travel. Vendor performance, all sections combined (vs target, score): Acxiom 72% (771); Allant 82% (870); Epsilon 73% (776); Equifax 89% (950); Harte-Hanks 74% (787); Merkle 88% (932); Rapp Collins 75% (798). * See scoring detail document for more information
  15. 2.4 SHORT LIST RECOMMENDATION Allant, Equifax and Merkle emerged as the three strongest contenders among the seven participants, based on a consistently high level of understanding of the processes and steps/stages required to meet TRU/BRU's stated objectives and service delivery levels. It is recommended that they form the short list. Out of a total potential of 1,065 points, all three achieved over 80%: Allant 82% (870); Equifax 89% (950); Merkle 88% (932).
  16. 2.5 SHORT LIST RECOMMENDATION AND FORRESTER The selected contenders – Allant, Equifax and Merkle – were also identified by Forrester as “leaders” within their evaluation. Forrester's conclusions are consistent with this evaluation in noting the following gaps even among the leaders: poor project management; limited integration of on- and offline capabilities; varying degrees of proactive service. Merkle, alone among the evaluated companies, achieved scores over 80% on 7 measures, notably including account management and analytical services. See the chart on the following slide.
  17. 2.5 SHORT LIST RECOMMENDATION AND FORRESTER – FORRESTER WAVE EVALUATION, November 2007. Current offering scores, target 5 per criterion; each cell shows % vs target and weighted score.
      Criterion                                Allant          Equifax         Merkle
      Account management & service delivery   77%  3.85       78%  3.88       84%  4.20
      Strategy services                        80%  4.00       70%  3.50       73%  3.63
      Data and data sourcing services          62%  3.10       66%  3.28       87%  4.35
      Database management                      56%  2.80       62%  3.10       70%  3.50
      Data processing                           0%  0.00        0%  0.00        0%  0.00
      Analytical services                      91%  4.55       53%  2.65       93%  4.65
      Creative services                         0%  0.00       20%  1.00       73%  3.65
      Execution                                51%  2.55       14%  0.70       84%  4.20
      Measurement                               0%  0.00        0%  0.00        0%  0.00
      Technology capabilities                  76%  3.80       88%  4.40       96%  4.80
      Integrated services                      65%  3.25       75%  3.75       90%  4.50
      Other capabilities                       62%  3.10       56%  2.80       62%  3.10
      Midmarket capabilities                   48%  2.40       88%  4.40       36%  1.80
      Industry capabilities                    49%  2.43       57%  2.85       88%  4.40
      Sales channel capabilities               49%  2.43        0%  0.00        0%  0.00
      Contracts and pricing                     0%  0.00        0%  0.00        0%  0.00
      TOTAL (target 80)                        48% 38.26       45% 36.31       58% 46.78
      (An arithmetic check of these totals appears after the slide transcript.)
  18. 2.6 SHORT LIST DETAILED PERFORMANCE, CASE STUDIES RFP 2.2 Case studies – The case studies were relevant to the TRU/BRU RFP focus and illustrated how the contenders would add value to the TRU/BRU business(es). Allant's TWEEN BRANDS case specifically addressed points relevant to the TRU/BRU business, for example data quality, timely information and access to data. Scores (target 99): Allant 100% (99); Equifax 100% (99); Merkle 82% (81).
  19. 2.6 SHORT LIST DETAILED PERFORMANCE, DATA WAREHOUSE RFP 2.3 Data Warehouse Management – All three presented strong, detailed and believable data management cases and descriptions. In addition, each participant presented a flexible, “we'll work with you” point of view. Scores (target 315): Allant 78% (247); Equifax 84% (265); Merkle 82% (257).
  20. 2.6 SHORT LIST DETAILED PERFORMANCE, AUTOMATION RFP 2.4 Marketing Automation – Allant articulated the issues related to the integrated TRU/BRU marketing environment best, followed by Equifax and Merkle. Each response was a positive set-up for the following section, campaign management. Scores (target 9): Allant 100% (9); Equifax 100% (9); Merkle 100% (9).
  21. 2.6 SHORT LIST DETAILED PERFORMANCE, CAMPAIGN MANAGEMENT RFP 2.4.1 Campaign Management – While Allant is clearly capable of delivering a high level of support, their response was not as well articulated as those of Equifax and Merkle, and the overall impression created was not as strong. Scores (target 117): Allant 57% (67); Equifax 88% (103); Merkle 88% (103).
  22. 2.6 SHORT LIST DETAILED PERFORMANCE, SOFTWARE RFP 2.4.2 Software – Equifax presented the strongest integration story with current TRU/BRU technologies. The understanding of upgrade costs and ongoing upgrade needs must be built more clearly into the go-forward plan. Scores (target 72): Allant 75% (54); Equifax 92% (66); Merkle 89% (64).
  23. 2.6 SHORT LIST DETAILED PERFORMANCE, REPORTING RFP 2.4.3 Reporting – While the staffing models outlined by each participant are clear, the report creation process will need leadership that none of the participants is offering to provide. Allant, Equifax and Merkle are looking for leadership from TRU/BRU and the identification of a project leader for the migration. Equifax and Merkle offer the most flexible staffing approach. Scores (target 27): Allant 70% (19); Equifax 100% (27); Merkle 100% (27).
  24. 2.6 SHORT LIST DETAILED PERFORMANCE, ANALYTICS RFP 2.4.4 Analytical Leadership – Analytics services are offered on a project basis, with senior analytics staff committed to the business on a permanent basis. Staffing needs to be clarified within the scope of work, and specific individuals identified together with their allocation. A key question is how this function will be coordinated between TRU and BRU. Each participant has expressed concerns regarding workflow management. Scores (target 72): Allant 100% (72); Equifax 92% (66); Merkle 81% (58).
  25. 2.6 SHORT LIST DETAILED PERFORMANCE, SEGMENTATION RFP 2.4.5 Segmentation – It is unclear how analytics is built into the staffing model relative to project work. Merkle presented the clearest structure and options, but all contenders need to be more specific, e.g. who will lead, how analytics will be integrated into the account function, and what the workflow issues are. Scores (target 27): Allant 78% (21); Equifax 78% (21); Merkle 100% (27).
  26. 2.6 SHORT LIST DETAILED PERFORMANCE, STRATEGY RFP 2.4.6 Strategy and Enablement – The Equifax response did not present a confident description of how strategic services, including analytics, would be enabled; the example provided did not help. Allant and Merkle provided a stronger staffing story, with Merkle leading. Scores (target 18): Allant 67% (12); Equifax 67% (12); Merkle 100% (18).
  27. 2.6 SHORT LIST DETAILED PERFORMANCE, FACILITIES RFP 2.4.7 Facilities – All of the participants provided security solutions within acceptable frameworks. Merkle provided added security options not offered by the others. Scores (target 99): Allant 94% (93); Equifax 94% (93); Merkle 100% (99).
  28. 2.6 SHORT LIST DETAILED PERFORMANCE, TECHNOLOGY RFP 2.5 Technology – All participants recommended a dedicated T1 line for communications management, and all anticipate large data transfer volumes. A key question not answered clearly by Merkle is how they will integrate the full scope of the TRU/BRU business while maintaining the stated levels of service and support. Scores (target 63): Allant 86% (54); Equifax 86% (54); Merkle 76% (48).
  29. 2.6 SHORT LIST DETAILED PERFORMANCE, IMPLEMENTATION RFP 2.6 Implementation – The issues related to a potential transition appear to be best understood by these three contenders. More specifics need to be provided on how the transition will be managed, together with examples of successful transitions of the scale under consideration. Scores (target 36): Allant 100% (36); Equifax 100% (36); Merkle 83% (30).
  30. 2.6 SHORT LIST DETAILED PERFORMANCE, CLIENT SERVICE RFP 2.7 Client Service – Merkle presented the most coherent client service case, although all participants did poorly in describing the migration strategy. Further detail needs to be provided on the client service structure and day-to-day operating practice. Scores (target 93): Allant 85% (79); Equifax 94% (87); Merkle 100% (93).
  31. 2.6 SHORT LIST DETAILED PERFORMANCE, TESTING RFP 2.8 Testing – Testing protocols are consistent across all three contenders. Scores (target 153): Allant 100% (153); Equifax 88% (135); Merkle 92% (141).
  32. 2.6 SHORT LIST DETAILED PERFORMANCE, TRAINING RFP 2.9 Training – Training methodologies are not all equal, with Merkle presenting the most customized and flexible point of view; e.g. they will work with TRU/BRU to develop the optimal program and will provide individual training sessions, which the others did not mention. Scores (target 45): Allant 60% (27); Equifax 87% (39); Merkle 100% (45).
  33. 2.7 HARTE-HANKS RECOMMENDATION Harte-Hanks' ranking relative to the other participants was surprisingly low in light of their long tenure on the business. Their presentation was weak in the following areas: a) Migration process – not clearly defined b) Staffing – e.g. how the proposed structure will address current deficiencies c) Client support structure – e.g. what will be different from the current structure (integration issues) d) Training – a self-service option was not defined. Given their incumbency, it is recommended that they be included in the short list, contingent on addressing the gaps above.
  34. NEXT STEPS
  35. 3. NEXT STEPS 1. TRU/BRU internal review of this evaluation. 2. TRU/BRU confirmation (or change) of short list candidates. Meanwhile, the consultant will: 1. Complete a detailed side-by-side cost comparison of the short list candidates. (A top-line review indicates that the contenders' approaches are consistent with an “all in” approach to fees.) 2. Develop business problems/questions for in-person presentation by the short list contenders.
  36. APPENDIX
  37. (Appendix A) EVALUATION APPROACH The questionnaire design presents limitations to the application of a single evaluation approach. Example 1: Provide a client success story that best highlights your ability to handle requirements (section 2.2). Comment: Because the question focus is broad, answers from the participants range from a marketing problem/solution (Harte-Hanks) to more specific database marketing examples (Merkle). As a result, participant performance is rated on both the relevance and the strength of the case study.
  38. (Appendix A) EVALUATION APPROACH Example 2: Describe the process to prepare data for specific uses by the marketing automation tools (section 2.3.14). Comment: The question did not clearly indicate whether an answer was required or whether it was acceptable to defer an answer to the post-hire discovery stage. As a result, some participants deferred (Allant, Epsilon), while others clearly described the process (Epsilon, Equifax, Merkle) or addressed a specific step within the process (Harte-Hanks, Rapp Collins). Non-responders, or those not addressing the broader process issues, were penalized in the review.
  39. (Appendix A) EVALUATION APPROACH Example 3: Describe in detail the assignment of a household key based on the assigned individual key (section 2.3.17). Comment: The questions in this section were specific and well defined, so the participants had a clear framework in which to specify their answers. Evaluation of the outcomes was straightforward, based on the degree to which the participant detailed the process and the outcome. For example, on appending key demographic/lifestyle data, Acxiom provided the most detailed description with a clearly defined outcome; the other participants described the process but did not add further value. Acxiom was rated high and the other participants medium in performance.
  40. (Appendix A) EVALUATION APPROACH Example 4: Describe how your organization will provide the same customer data for domestic customers and international customers. Comment: Because this question was very broad, it left too much discretion to the participants to define the outcome. The result was that Acxiom focused on their credentials without “describing” and scored medium; Allant, Epsilon and Harte-Hanks provided very literal answers and scored high; Merkle and Rapp Collins did not address the question completely and were ranked low.
  41. 1.3 CONSIDERATIONS • What is a successful database marketing service vendor today? To be successful, marketing database service providers must exceed price-of-entry levels of service and performance: A. Price of entry: The design, build and management of marketing databases is no longer the performance differentiator it was five years ago. B. Point of difference: With the trend toward integration of the on- and offline sales channels, together with the rapid growth and diversification of alternative media driving retail, marketing database providers must deliver: high-value service; flexibility; proactive service; integrated delivery*. *Source: Forrester Wave, Database Marketing Service Providers, 11/02/07
  42. 1.3 CONSIDERATIONS • What is the optimal service combination? All providers deliver a similar suite of services, including: strategy and planning; list and data sourcing; database management and processing; analytics; measurement and insight. The service suites themselves are clear; it is how they are offered that causes confusion and mixed expectations. *Source: Forrester Wave, Database Marketing Service Providers, 11/02/07
  43. 1.3 CONSIDERATIONS • How should those services be structured? An integrated offering (including strategy and analytics) is preferred by marketers and, given the multi-channel/multi-brand structure of the TRU/BRU business model, it is the only approach that will meet marketing objectives and service support requirements. *Source: Forrester Wave, Database Marketing Service Providers, 11/02/07
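Note on the scoring arithmetic (referenced from slide 13): the sketch below is illustrative only; the ratings are hypothetical and not taken from the actual scoring workbook. It shows how the 1-3-9 scale rolls up into a section total and the “vs target” percentage used throughout the evaluation, where the target assumes every answer is rated ‘9’.

    # Illustrative roll-up of the 1-3-9 scoring scale (hypothetical ratings).
    SCALE = {"low": 1, "medium": 3, "high": 9}   # three-part score from section 2.2

    def section_score(ratings):
        """Return (earned points, target points, % vs target) for one RFP section."""
        earned = sum(SCALE[r] for r in ratings)
        target = 9 * len(ratings)                # target = every answer rated 'high'
        return earned, target, round(100 * earned / target)

    # A hypothetical five-question section scored for one vendor:
    print(section_score(["high", "high", "medium", "high", "low"]))  # (31, 45, 69)

The same earned-over-target ratio drives the combined results: Allant's 870 points against the 1,065-point potential on slide 15, for example, is the 82% reported there.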
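Note on the Forrester figures (referenced from slide 17): each of the 16 current-offering criteria carries a target of 5, the weighted score is the stated percentage applied to that target, and the vendor totals are summed against an 80-point ceiling. The quick check below recomputes the Allant column from the weighted scores transcribed from the slide.

    # Recompute the Allant totals from the per-criterion weighted scores on slide 17
    # (16 criteria, each scored out of a target of 5, so the ceiling is 80 points).
    allant = [3.85, 4.00, 3.10, 2.80, 0.00, 4.55, 0.00, 2.55,
              0.00, 3.80, 3.25, 3.10, 2.40, 2.43, 2.43, 0.00]

    total = round(sum(allant), 2)              # 38.26, matching the slide total
    vs_target = round(100 * total / (16 * 5))  # 48% of the 80-point target
    print(total, vs_target)

This is the same vs-target convention used for the RFP section scores earlier in the deck.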
