Vendor Performance Management

Report created by the Canadian Association of Management Consultants for the Ministry of Government Services for the Province of Ontario.



  1. CMC-Canada, February 2012
     Vendor Performance Management Study

     TABLE OF CONTENTS

     1 INTRODUCTION
     1.1 Background
     1.2 Approach
     1.3 Research Limitations
     2 RESPONDENT PROFILE
     2.1 Respondent Characteristics
     2.2 Existing Vendor Performance Management Systems
     3 MEASUREMENT
     4 PERFORMANCE ELEMENTS
     5 IMPLICATIONS
     6 FAIRNESS
     7 SUMMARY AND CONCLUSIONS
     7.1 Key Elements
     7.2 Design Considerations
     7.3 Discussion Points
     7.4 Implementation Considerations
     8 BIBLIOGRAPHY
     9 APPENDIX A

     Prepared by: R.A. Malatest & Associates Ltd. & Gerald Ford, Ontario Advocacy Committee Member, CAMC
  2. EXECUTIVE SUMMARY

     1 INTRODUCTION

     1.1 Background

     The Ontario government advised vendors through a White Paper in spring 2010 of its intention to develop and implement a vendor performance management program for consulting services. This intention was formalized in notices as part of Requests for Proposals for new Vendor of Record (VOR) arrangements for consulting services in spring 2011. CMC-Ontario provided initial input through its submission to the Ontario government in July 2010 on the "Modernization of Ontario's Consulting Services Vendor of Record (VOR) Program." After posting of the VOR Requests for Proposals, CMC-Ontario commissioned a research project to assist the Association in advising the government in the task of developing a robust, fair and transparent framework for working with vendors, to manage risk, and to assist vendors in understanding the expectations of government clients.

     Although the purpose of implementing such programs differs across jurisdictions and services, vendor performance evaluation programs generally aim to monitor the performance of vendors, ensure the management of contracts, cut costs and mitigate risks, foster better communication, and enhance the value of such services by providing timely and structured feedback to vendors (Ontario Realty Corporation, 2010; Office of the Procurement Ombudsman, 2010; The Department of Housing, 2006; Survey Analytics, 2011; Weber, 1996). Vendor performance management programs establish a mutually beneficial relationship between vendor and client and promote the continuous improvement of the quality of goods and services (The Department of Housing, 2006; Office of the Procurement Ombudsman, 2010).
     In addition, vendor management programs increase the vendor's competitive advantage, improve stakeholder satisfaction, and increase performance visibility (Survey Analytics, 2011).

     The objective of the current research project was to provide CMC-Ontario (the Ontario Institute of the Canadian Association of Management Consultants) with information and recommendations that would enable the organization to articulate a position on a performance management approach for management consulting services in the public sector. The intention is for CMC-Ontario to table this paper with the Ontario government as input to the government's plan to implement a vendor performance management program for consulting services within the next six to twelve months.

     1.2 Approach

     The Vendor Performance Management Study was a mixed-mode project consisting of a survey, interviews, and a literature review. The objective of the survey was to collect representative data from sector stakeholders in a statistically valid way. An electronic link to an online survey was sent to potential respondents. A total of 184 respondents accessed the survey, of whom 119 completed it entirely. Of the remaining 65 incomplete cases, the results from nine valid cases were added to the base of 119 completed cases, and the other 56 invalid cases were discarded. The current report contains the valid responses for each survey question.

     Three telephone interviews were also completed with knowledgeable sector stakeholders. The objective of these interviews was to discuss the procurement and management of management consulting contracts, the use of existing vendor performance management systems, and the features and best practices of potential vendor performance management systems. These interviews were also intended to inform the development of the survey instrument.
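The case accounting above can be sketched as simple arithmetic (a sketch, assuming 184 is the total number of survey accesses, which is the only reading consistent with the reported case counts):

```python
# Survey case accounting (figures from the Approach section)
accessed = 184
completed = 119

incomplete = accessed - completed               # cases accessed but not completed
valid_incomplete = 9                            # incomplete cases retained
invalid_incomplete = incomplete - valid_incomplete  # incomplete cases discarded
valid_total = completed + valid_incomplete      # total valid cases analyzed

print(incomplete)          # 65
print(invalid_incomplete)  # 56
print(valid_total)         # 128
```

The 128 valid cases match the respondent total cited in the Research Limitations subsection.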
  3. A literature review was also conducted at the beginning of the project to understand current vendor performance management systems and to collect baseline data on best practices in the field. Over 20 sources were consulted during the development of this review.

     1.3 Research Limitations

     Effort was made during the research project to gather information that was both representative and reliable. However, as with all research endeavours, some considerations should be noted.

     The total sample size limits the level of detail in the data analysis. The survey used a census approach, which means that the resulting data is drawn from a sample of convenience: those who were aware of the study and who opted to participate. A total of 128 respondents provided input, but these respondents were not randomly selected.

     Since the data was collected from a non-random sample, no margin of sampling error can be calculated. Furthermore, an analysis of subgroups is not possible, with the exception of a distinction between buyers (clients) and providers (vendors) of management consulting services, although one question was cut by sector.

     2 RESPONDENT PROFILE

     Respondents were asked to provide some background on their work and professional experiences relative to management consulting. This section provides details on those respondent characteristics.
  4. 2.1 Respondent Characteristics

     Figure 2.1.0 Respondent Characteristics (n=119; QA1: Are you a buyer or provider of management consulting services?): Provider (Consultant) 44%, Buyer (Client) 43%, Neither 13%.

     Management consulting services refers to both the industry and the practice of helping organizations improve their performance, primarily through the analysis of existing organizational problems and the development of plans for improvement.

     The North American Industry Classification System description of management consulting is: "Establishments primarily engaged in providing advice and assistance to other organizations on management issues, such as strategic and organizational planning; financial planning and budgeting; marketing objectives and policies; human resource policies, practices and planning; and production scheduling and control planning."

     Buyers (clients) were defined as users of management consulting services, while providers (vendors) were defined as providers of management consulting services. Relatively equal proportions of providers (44%) and buyers (43%) responded to the survey; 13% of respondents were neither providers nor buyers of management consulting services.
  5. Figure 2.1.1 Previous Management Consulting Experience (Provider n=68, Buyer n=67; QA1a: Have you ever been a Client (Buyer) of management consulting services? QA1b: Have you ever been a Vendor (Provider) of management consulting services?): Providers — Yes 62%, No 38%; Buyers — Yes 18%, No 82%.

     Both providers and buyers were asked whether they had ever bought or provided management consulting services, respectively. A much greater proportion of providers had previously been in a buyer role, with 62% of providers responding that they had previously been buyers of management consulting services. Conversely, 82% of buyers had never been in a position where they had provided management consulting services.
  6. Figure 2.1.2 Respondent Fields and Specialties (n=119; QA2: What is your field or specialty?)

     Respondents were asked to identify their field or specialty. Half of all respondents are in supply chain management, while 23% are in management consulting. Smaller proportions are from the human resources (2%), project management (6%) and information technology (4%) fields.
  7. Figure 2.1.4 Respondent Sectors (n=249, multiple response permitted; QA2a: What sectors are you involved with?)

     Respondents were asked to identify the sectors that they are involved with. There was a dispersion of answers to this question, with broad representation across industry sectors: 22% are involved with private companies, and relatively equal proportions are involved with the municipal government (12%), health care (11%) and academic (i.e., university/college) (10%) sectors. 16% of respondents are involved with the broader public sector, while 14% are involved with the federal and provincial government sectors. Smaller proportions are involved with school board (7%) and other (8%) sectors.
  8. Figure 2.1.5 Years of Respondent Experience (n=119; QA3: How many years have you been active in this field?): categories 1 to 5 years, 6 to 10 years, 11 to 20 years, 20 to 29 years, 30 years or more.

     Respondents were asked to indicate how many years they had been involved in their field. Respondents were relatively experienced, with 30% having 11 to 20 years of experience in their field and 25% having 20 to 29 years of experience. Only 10% had less than five years' experience. Respondents had an average of 18.2 years of experience.
  9. Figure 2.1.6 Respondent Memberships or Professional Designations (n=180, multiple response permitted; QA4: Do you possess any of the following memberships or designations?)

     Respondents were asked to identify their memberships in professional associations or their professional designations. Equal proportions of respondents have Ontario Institute of the Purchasing Management Association of Canada (OIPMAC) (17%) and Canadian Association of Management Consultants (CMC) (17%) designations. 13% have membership in the Ontario Public Buyers Association (OPBA) and 10% have membership in the National Institute of Governmental Purchasing (NIGP). Smaller proportions of respondents were members of, or possessed designations from, the Canadian Public Procurement Council (CPPC) (3%), the Ontario Educational Collaborative Marketplace (OECM) (3%), the Healthcare Supply Chain Network (HSCN) (2%), or a construction association (1%). 14% of respondents stated that this question was not applicable to their situation.
  10. Figure 2.1.7 Degree of Involvement with Management Consulting Contracts (n=119; QF1: How many management consulting contracts have you been a part of in the last year?): none 18%, 1 to 5 contracts 50%, 6 to 10 contracts 14%, 11 to 15 contracts 12%, 16 or more contracts 6%.

     Respondents were asked to indicate how many management consulting contracts they had been a part of in the last year. Half of respondents had been a part of between one and five contracts in the last year, while 18% had not been involved in any management consulting contracts. Smaller proportions were involved in 6 to 10 contracts (14%), 11 to 15 contracts (12%) or 16 or more contracts (6%).
  11. Figure 2.1.9 Number of Performance Management Evaluations (n=119; QF2: During the course of a typical year, how many performance management evaluations are performed at your company or organization?): none 43%, 1 to 10 34%, 11 to 25 10%, 26 to 49 6%, 50 or more 8%.

     Respondents were asked to indicate how many vendor performance evaluations are performed during the course of a typical year at their company or organization. On average, companies and organizations complete 18 performance management evaluations in a typical year. 43% of respondents stated that no vendor performance evaluations are typically done, while 34% stated that between 1 and 10 vendor performance evaluations are typically done at their company or organization. Smaller proportions indicated that their organizations typically perform 11 to 25 vendor evaluations (10%), 26 to 49 vendor evaluations (6%) or 50 or more vendor evaluations (8%).

     The survey also asked respondents to state the average value of these contracts. The mean value was $379,844.
  12. Figure 2.1.10 Importance of Vendor Performance Management (n=119; G2: Overall, do you see the management of vendor performance as an important activity to measure, evaluate and improve the performance of vendors?): Yes 92%, No 8%.

     Respondents were asked whether they saw the management of vendor performance as an important activity to measure, evaluate and improve the performance of vendors. 92% of respondents stated that they saw it as an important activity, while 8% stated that they did not.
  13. 2.2 Existing Vendor Performance Management Systems

     Figure 2.2.1 Existing Vendor Performance Management Policies (n=67; QA5: Does your company or organization have a vendor performance management policy for outside consultants?): Yes 25%, No 75%.

     Buyers of management consulting services were asked if their company or organization had a vendor performance management policy for outside consultants. While three-quarters of buyers do not have a vendor performance management policy, 25% do. The remainder of this subsection pertains to this 25% and the existing vendor performance management policies possessed by those companies or organizations.

     Figure 2.2.2 Utilization of Performance Contractual Clauses (n=17, caution: small base; QA7: Does your company utilize performance contractual clauses for managing vendor performance?): Yes 94%, No 6%.

     Of companies and organizations with existing vendor performance management policies, 94% utilize performance contractual clauses for managing vendor performance. This and subsequent figures must be interpreted with caution because of the small base sizes.
  14. Figure 2.2.3 Utilization of Performance Incentives (n=17, caution: small base; QA8: Does your company or organization use positive and negative performance incentives for managing vendor performance?): Yes, both positive and negative 65%; Yes, positive only 12%; Yes, negative only 6%; Neither 18%.

     Of companies and organizations with existing vendor performance management policies, 65% use both positive and negative performance incentives for managing vendor performance. 12% use only positive performance incentives, while 6% use only negative performance incentives. 18% of companies and organizations use neither positive nor negative performance incentives.
  15. Figure 2.2.4 Utilization of Tools and Resources (n=17, caution: small base; QA10: Which of the following tools and resources are available to your organization for managing vendor performance?): forms 76%, templates 76%, progress meetings 71%, performance documentation 65%, checklists 41%, third party verification 35%, user guides and manuals 29%.

     Companies and organizations with existing vendor performance management policies were asked which tools and resources were available to them for managing vendor performance. Just over three-quarters said they used forms (76%) and templates (76%), 71% said they used progress meetings, and 65% used performance documentation. Lesser proportions used checklists (41%), third party verification (35%) or user guides and manuals (29%).
  16. Figure 2.2.5 Automation of Forms and Templates (Forms n=19, Templates n=17; caution: small bases; QA10b and QA10c: Please specify your applications): Forms — MS Word 42%, MS Excel 47%, other 11%; Templates — MS Word 35%, MS Excel 47%, other 18%.

     MS Excel was the most commonly used application for managing vendor performance forms and templates: 47% of those with existing vendor performance management systems used MS Excel. 42% used MS Word as their application for managing vendor performance forms, while 35% used MS Word for managing vendor performance templates. 11% of those with existing vendor performance management systems used other software applications for managing vendor performance forms, including CMIC and an SAP contracts database. 18% used other software applications for managing vendor performance templates, including MS Access, Cordys software, and online surveys.

     Figure 2.2.6 Evaluation Triggers for Existing Vendor Performance Management Systems
  17. Evaluation triggers (n=138; QF2a: What triggers an evaluation?): contract performance issues 32%, contract value 24%, complex projects 12%, periodic triggers 9%, client dissatisfaction 6%, project profile 6%, new vendor 3%, repeat vendor 3%, length of engagement 3%, every project 3%.

     Companies and organizations with existing vendor performance management policies were asked what triggers a vendor evaluation. Issues with contract performance (32%) and contract value (24%) were the leading evaluation triggers. Complex projects trigger vendor evaluations at 12% of these companies and organizations, while periodic triggers occur at 9%. Client dissatisfaction and project profiles were triggers at 6% each, while new vendors, repeat vendors, and the length of engagement each triggered evaluations at 3% of the companies and organizations. Only 3% of companies and organizations with existing vendor performance management policies trigger evaluations on every project.
  18. 3 MEASUREMENT

     This section contains findings on respondent preferences as to when and how vendor performance should be measured.

     Figure 3.0.1 Vendor Performance Evaluation on Every Project? (n=119; QC1: In your opinion, should vendor performance be measured on every project?): Yes — Provider 83%, Buyer 78%, Neither 69%, Total 79%.

     Providers (83%) were more likely than buyers (78%) and respondents who were neither buyers nor providers (69%) to state that vendor performance should be measured on every project. Overall, approximately four out of five respondents (79%) stated that vendor performance should be measured on every project.
  19. Figure 3.0.2 Vendor Performance Evaluation Triggers (n=34; QC1: If no, what do you think should trigger a formal evaluation?)

     Respondents who said that performance should not be measured on every project were asked what should trigger a formal evaluation. 32% of respondents said that formal evaluations should be triggered by poor performance, while 16% said that formal evaluations should be periodic. 14% said that formal evaluations should occur at the end of a contract, while 11% said formal evaluations should occur upon request. 5% qualified their answer by stating that evaluations should occur on every project, while 3% said that the contract value should be the trigger. 2% said that they did not know, and 11% said this question was not applicable to their situation.
  20. Figure 3.0.3 Vendor Performance Evaluation Timelines (n=153; QC2: When should vendor performance be evaluated?)

     Respondents were asked when vendor performance should be evaluated. 43% said that vendor performance should be evaluated at the completion of each stage or phase of the project. 16% of respondents stated that vendor performance should be evaluated at the end of the contract, while 8% said it should be evaluated on a quarterly basis. Lesser proportions were seen for other periodic evaluations, including monthly (1%), biannually (4%), annually (7%), or every other year (1%). Other respondents stated that it would depend on the length (5%), value (2%) or complexity (1%) of the project/contract. 3% stated that vendor evaluations should occur at times specified in the contract.
  21. Figure 3.0.4 Evaluation Report Duration (n=119; QC3: How long should an evaluation report take to complete?): under 10 minutes 7%, 10 minutes to half an hour 71%, between half an hour and an hour 18%, longer than an hour 4%.

     Respondents were asked to state how long, in minutes, an evaluation report should take to complete. 71% of respondents said that an evaluation report should take between 10 and 30 minutes to complete, while 18% stated that it should take between 30 minutes and an hour. Lesser proportions stated that evaluation reports should take less than 10 minutes (7%) or longer than an hour (4%). The median for this question was 30 minutes, while the mean was 32.5 minutes.
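The gap between the two summary figures reported above (30 versus 32.5 minutes) is typical of duration data, where a few long responses pull the mean above the median. A minimal sketch with synthetic completion times (illustration only, not the study's raw data):

```python
from statistics import mean, median

# Hypothetical evaluation-report completion times in minutes
# (synthetic, for illustration only; not the study's raw data).
times = [15, 20, 20, 25, 30, 30, 35, 45, 60, 90]

print(median(times))  # middle of the distribution
print(mean(times))    # pulled upward by the long tail of slow reports
```

With this synthetic sample the median is 30 while the mean is 37, mirroring the pattern in the survey results.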
  22. Figure 3.0.5 Vendor Performance Evaluation Measurement Scales (n=125; QC4: Please indicate your preference with respect to the following scales used to measure satisfaction with vendor performance.)

     Respondents were asked to indicate their preferences among several types of scales used to measure satisfaction with vendor performance. Providers were most likely to state that an expectation scale (exceeded, met, or fell short of expectations) should be used to measure vendor performance (65%), with smaller proportions of providers preferring numeric (19%) and qualitative (i.e., very good to very poor) scales (7%). Buyers were equally divided between the expectation (43%) and numeric (43%) scales, with only 11% preferring the qualitative scale. Respondents who were neither buyers nor providers were more likely to prefer the numeric scale (47%) over the expectation (41%) or qualitative (6%) scale. Overall, 52% of respondents preferred the expectation scale, while lesser proportions preferred the numeric (33%) and qualitative (9%) scales.
  23. 4 PERFORMANCE ELEMENTS

     This section contains findings on vendor performance elements.

     Figure 4.0.1 Vendor Performance Element Ratings (QB1: In general, please indicate how important the following elements are for evaluating vendor performance.) Values are top box % / average (mean).

                                                          Overall (n=128)  Providers (n=56)  Buyers (n=55)  Neither (n=17)
     A. Effective communication throughout engagement     59.2% / 9.2      69.6% / 9.5       58.2% / 9.3    29.4% / 8.8
     B. Quality of resources                              45.0% / 8.9      53.6% / 9.3       41.8% / 9.0    29.4% / 8.1
     C. Availability of resources to carry out contract   46.7% / 9.0      51.8% / 9.1       47.3% / 9.1    23.5% / 8.7
     D. Quality of the final deliverables                 69.7% / 9.5      66.1% / 9.6       74.5% / 9.6    58.8% / 9.4
     E. Providing value added services                    22.7% / 8.1      21.7% / 8.2       25.5% / 8.1    11.8% / 7.8
     F. Maintaining timelines/deadlines                   44.1% / 9.1      33.9% / 9.0       56.4% / 9.3    41.2% / 9.3
     G. Budget/cost control                               47.9% / 9.1      39.3% / 9.0       50.9% / 9.3    58.8% / 9.3
     H. Having a vendor contact for dispute resolution    30.8% / 8.1      25.0% / 8.1       38.2% / 8.4    17.6% / 7.8

     Overall, respondents rank the quality of the final deliverables (69.7%) and effective communication throughout the engagement (59.2%) as the most important elements for measuring vendor performance, as measured by the top box score. Budget/cost control (47.9%), availability of resources to carry out the contract (46.7%), quality of resources (45%) and maintaining timelines/deadlines (44.1%) followed with similar overall average and top box scores. Having a vendor contact for dispute resolution (30.8%) and providing value added services (22.7%) are considered the least important elements for evaluating vendor performance.
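A "top box" score is simply the share of respondents giving the highest rating. A minimal sketch of the computation on synthetic ratings (the 10-point scale and the top-point cutoff are assumptions, since the report does not state the exact rating scale used):

```python
def top_box_pct(ratings, cutoff=10):
    """Share of ratings at or above the cutoff, as a percentage."""
    return 100.0 * sum(r >= cutoff for r in ratings) / len(ratings)

def mean_rating(ratings):
    """Simple arithmetic mean of the ratings."""
    return sum(ratings) / len(ratings)

# Synthetic importance ratings for one element (illustration only)
ratings = [10, 10, 9, 8, 10, 7, 9, 10]

print(top_box_pct(ratings))  # 50.0 -- four of eight respondents gave a 10
print(mean_rating(ratings))  # 9.125
```

Reporting both figures, as Figure 4.0.1 does, is useful because two elements can share a similar mean while differing sharply in how many respondents consider them critically important.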
  24. Figure 4.0.2 Vendor Performance Evaluation Rankings (QB2: Please rank the three most important elements for evaluating vendor performance in general.) Values are the share of respondents ranking each element 1st / 2nd / 3rd.

                                                          Overall (n=126)  Providers (n=54)  Buyers (n=55)   Neither (n=17)
     A. Effective communication throughout engagement     26% / 19% / 14%  30% / 20% / 9%    18% / 20% / 17%  24% / 19% / 19%
     B. Quality of resources                              9% / 18% / 11%   9% / 25% / 9%     11% / 7% / 9%    0% / 13% / 19%
     C. Availability of resources to carry out contract   11% / 8% / 8%    7% / 5% / 11%     13% / 9% / 7%    12% / 6% / 6%
     D. Quality of the final deliverables                 38% / 23% / 14%  43% / 16% / 18%   33% / 26% / 15%  41% / 31% / 6%
     E. Providing value added services                    2% / 5% / 8%     2% / 11% / 9%     2% / 0% / 6%     0% / 0% / 13%
     F. Maintaining timelines/deadlines                   3% / 12% / 25%   0% / 9% / 25%     4% / 19% / 26%   6% / 6% / 19%
     G. Budget/cost control                               8% / 13% / 16%   4% / 5% / 18%     16% / 19% / 17%  12% / 25% / 6%
     H. Having a vendor contact for dispute resolution    3% / 2% / 1%     4% / 4% / 0%      0% / 0% / 0%     6% / 0% / 6%
     I. Other (specify)                                   1% / 2% / 3%     0% / 4% / 0%      2% / 0% / 4%     0% / 0% / 6%

     Respondents were asked to rank the three most important elements for evaluating vendor performance in general. As with the top box scores, the quality of the final deliverables (38.3%) and effective communication throughout the contract (25.8%) were ranked first by the greatest number of respondents overall. The third most common first-ranked element was the availability of resources to carry out the contract (10.8%). The quality of the final deliverables (22.7%) and effective communication throughout the engagement (18.5%) were also the most common second-ranked elements, although the quality of resources (17.6%), budget and cost control (13.4%) and maintaining timelines/deadlines (11.8%) were still important to a considerable number of respondents overall.
     Maintaining timelines/deadlines (25.2%) and budget/cost control (16.0%) were the most common third-ranked elements, while the quality of the final deliverables (14.3%) and effective communication throughout the contract (14.3%) were tied as the next most common third-ranked elements. Examining the overall results, it can be concluded that while maintaining timelines/deadlines and budget/cost control are important elements, they do not figure as prominently in evaluating vendor performance as the quality of the final deliverables or effective communication throughout the engagement. Moreover, the fact that the shares for maintaining timelines/deadlines and budget/cost control increase from the first to the third rank suggests that they are widely regarded as important, though secondary, considerations.

     Comparing results across those who are providers, buyers and neither providers nor buyers of management consulting services, it is clear that providers rank effective communication throughout the engagement more highly than buyers. Providers are also more likely than buyers to rank the quality of resources as an important element (second rank), and to rank maintaining timelines/deadlines and
budget/cost control as less important elements relative to buyers. 59% of both buyers and providers rank the quality of the final deliverables either first or second out of all performance elements.

5 IMPLICATIONS

This section contains findings on respondent preferences for how vendor performance measurement systems should be used.

Figure 5.0.1 Performance Incentive Preferences

Response | Provider (Consultant) | Buyer (Client) | Neither | Total
Yes — positive and negative | 37% | 51% | 50% | 45%
No — neither | 56% | 39% | 31% | 45%
Yes — positive only / negative only | small shares in every group (5% each overall)

N=119
QD1: Should a vendor performance measurement system use positive and negative performance incentives (e.g., bonuses and/or financial penalties)?

Overall, respondents were divided as to whether vendor performance measurement systems should use positive and negative performance incentives, with 45% stating that they should not be used and an equal proportion stating that both positive and negative performance incentives should be used. Buyers and respondents who were neither providers nor buyers were more likely to state that positive and negative performance incentives should be used, with 51% and 50%, respectively, stating that they should be used. Among providers, 56% think that performance incentives should not be used and 37% think that both positive and negative performance incentives should be used. Much smaller proportions of respondents overall thought that only negative (5%) or only positive (5%) performance incentives should be used.
Figure 5.0.2 Implications for Receiving Subsequent RFPs and RFSs

Response | Provider (Consultant) | Buyer (Client) | Neither | Total
Yes | 75% | 76% | 88% | 77%
No | 25% | 24% | 13% | 23%

N=119
QD2: Should a vendor performance measurement system be used to determine which vendors receive subsequent RFPs and RFSs?

There were high levels of consensus that vendor performance measurement systems should be used to determine which vendors receive subsequent RFPs and RFSs, with relatively equal proportions of providers (75%) and buyers (76%) in agreement that this constitutes a good business practice. Respondents who were neither providers nor buyers were even more likely (88%) to state that vendor performance measurement systems should be used for this purpose.
Figure 5.0.3 Probation for Vendors with a Poor Performance Record

Response | Provider (Consultant) | Buyer (Client) | Neither | Total
Yes | 71% | 86% | 94% | 81%
No | 29% | 14% | 6% | 19%

N=119
QD3: Should a vendor with a poor performance record be put on probation for a defined period of time?

Buyers and providers were somewhat at odds over whether vendors with poor performance records should be put on probation for a defined period of time, with 86% of buyers but only 71% of providers in agreement that this constitutes a good business practice. Respondents who were neither providers nor buyers were the most likely to support probation, with 94% in agreement. Overall, approximately four in five respondents (81%) agreed that probationary periods should be used to respond to poor vendor performance.
6 FAIRNESS

Figure 6.0.1 Preferences with Respect to Dispute Resolution

Type of dispute resolution | Provider (Consultant) | Buyer (Client) | Neither | Total
Client/vendor meetings | 42% | 45% | 50% | 44%
Debriefing on evaluation report | 29% | 32% | 35% | 31%
Bring in third party | 18% | 13% | 8% | 15%
Designated ombudsman | 7% | 8% | 4% | 7%
Other | 4% | 1% | 3% | 4%

N=239 (multiple response permitted)
QE1: What types of dispute resolution should be available?

Relatively similar responses were received from the respondent groups on the types of dispute resolution that should be available. This was a multiple-response question, so percentages should be read as degrees of agreement rather than as proportions of respondents. Client/vendor meetings (44% overall) and debriefings on the vendor evaluation report (31% overall) are seen as the leading types of dispute resolution that should be available. Relatively similar proportions across the respondent groups agreed that there should be a designated ombudsman in the client organization. The respondent groups were most misaligned on whether a third party should be brought in to facilitate, mediate and arbitrate in the event of a dispute between the provider and buyer of management consulting services: 18% of providers thought that this type of dispute resolution should be available, compared to 13% of buyers and just 8% of respondents who were neither providers nor buyers.
Figure 6.0.2 Additional Remarks
[Bar chart of open-ended remark categories (23%, 20%, 18%, 16%, 11%, 5%, 5%, 2%); category labels are given in the text below.]
N=44
QG3: Is there anything else that you'd like to add in regards to vendor performance management?

At the end of the survey, respondents were asked whether there was anything else they would like to add regarding vendor performance management. This question received a range of open-ended remarks. 18% of respondents to this question stated that a standardized approach is needed across sectors and fields with respect to the management of vendor performance. 20% cautioned that vendor performance management can become resource intensive and/or bureaucratic if improperly designed and implemented. 16% cautioned that an unbiased approach is needed so that vendor performance evaluations are a fair and effective management tool. 11% stated that vendor performance management is needed as a business practice for relationship building. 5% suggested that training is needed for users of any given vendor performance management system, and a similar proportion (presumably of providers) suggested that buyers also need to be managed. 2% cautioned that the public and private sectors are different, and that an approach used in one sector may not necessarily be effective in another.
7 SUMMARY AND CONCLUSIONS

7.1 Key Elements

As previously mentioned, the research for this project included interviews and a literature review in addition to the survey research. These supplementary activities provided data that is useful for informing, validating and confirming the survey findings contained in this report. While clear direction is provided by the majority of the survey findings, this summary and the following conclusions also draw upon that supplementary material.

Respondents agree that the management of vendor performance is an important activity for measuring, evaluating and improving the performance of vendors. Any vendor performance management system will include not only the tools with which vendor performance can be measured, but also the processes and knowledge with which such measurements can be integrated into the operations and business practices of an organization. The fact that only a quarter of buyers represented in the survey have a vendor performance management policy represents an opportunity for the development of vendor performance management systems.

Existing performance management systems characteristically utilize contractual performance clauses for managing vendor performance. Such clauses are used as a framework against which performance can be measured. Approximately two-thirds of companies and organizations with existing performance management policies utilize both positive and negative performance incentives, suggesting that performance incentives (both positive and negative) are a best practice for managing vendor performance. Forms, templates, progress meetings and performance documentation are the most commonly utilized tools and resources.
It should be noted, however, that other survey findings and the interviews have cautioned that user guides and manuals (and system training) ought to be considered part of an effective vendor performance management system.

Among companies and organizations with existing vendor performance management systems, poor contract performance, budget thresholds and complex projects were generally the most prevalent vendor evaluation triggers. While providers and buyers are generally in agreement that vendor performance evaluations should be applied uniformly on every project, those who suggested that vendor performance evaluations should not be utilized on every project were most likely to state that poor vendor performance should be the evaluation trigger. This finding suggests that vendor performance evaluations could become punitive in the absence of a uniform trigger system. Indeed, in our review of the literature, vendor performance programs generally used a complaint-based system in which clients only complete evaluation forms if they have had a negative experience and wish to submit a complaint. Informants noted that performance evaluations should be ongoing, because performance cannot be improved at the end of the contract. Mid-contract evaluations were also seen as opportunities for the vendor to 'make good' on their contractual obligations. Indeed, respondents were most likely to state that vendor performance should be evaluated at the completion of each stage or phase of the project.

The current study identified a mean value of $379,844 as the average value of respondents' contracts for which performance evaluations were completed. Since budget thresholds are a more objective trigger mechanism than poor performance, a specific dollar figure ought to be considered for vendor performance evaluation systems that are triggered by a budget threshold.
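To make the threshold-trigger idea concrete, the following minimal sketch shows how an automated system might flag contracts for evaluation by value. All names are hypothetical, and the threshold used here is an illustrative assumption, not a figure mandated by any system described in this report.

```python
# Illustrative sketch of a budget-threshold evaluation trigger.
# The threshold and all identifiers are assumptions for this example.

EVALUATION_THRESHOLD = 100_000  # dollars; configurable per organization


def requires_evaluation(contract_value: float,
                        threshold: float = EVALUATION_THRESHOLD) -> bool:
    """Return True when a contract's value triggers a vendor evaluation."""
    return contract_value >= threshold


# Hypothetical contract register: contract ID -> contract value in dollars.
contracts = {"C-001": 379_844, "C-002": 45_000, "C-003": 120_500}

# Flag every contract whose value meets or exceeds the threshold.
flagged = [cid for cid, value in contracts.items() if requires_evaluation(value)]
print(flagged)  # → ['C-001', 'C-003']
```

An objective rule of this kind removes evaluator discretion from the decision of whether to evaluate at all, which is the property that makes a budget threshold less punitive than a poor-performance trigger.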
In reviewing the literature, we found that many broader public sector organizations utilize budget thresholds to determine which procurement opportunities ought to be made publicly available to all suppliers. The Agreement on Internal Trade (AIT), for instance, uses a figure of $100,000 as the threshold above which goods and services procurement opportunities must be made available on an electronic tendering system readily accessible by all suppliers across Canada. Given the need to automate vendor performance measurement systems, and in the interests of
developing synergies between the current initiative and public sector regulations, a budget threshold of $100,000 is similarly recommended as the trigger for vendor performance evaluations in the management consulting sector.

Respondents are generally in agreement that a vendor performance evaluation should take, on average, approximately 30 minutes to complete. Respondents were more divided on the scales used to measure performance, with many (52%) suggesting that an expectation scale ought to be used. Nevertheless, qualitative comments and data from the interviews suggested that an expectation-based scale will not capture differences between vendors because of the central tendency (i.e., most vendors will 'meet expectations'). A number of qualitative comments and interview data suggested that a numeric scale (e.g., 1–4) would be best in order to avoid that central tendency. Indeed, a Vendor Performance Tracking Report produced by the State of Florida's MyFloridaMarketPlace demonstrates (Appendix A, Figure 9.0.1) that approximately two-thirds of all vendors fall between 2.81 and 3.20 on the five-point scale used by that jurisdiction. Greater dispersion and avoidance of the central tendency would be the main benefits of a four-point numeric scale.

Also recurring across jurisdictions is the use of a numeric rating scale to evaluate vendor performance. For instance, the Department of Housing (2006) in Atlanta, Georgia uses a numeric performance rating scale from 0 to 4 (0 representing unsatisfactory performance and 4 representing excellent performance), including an "N/A" option. Each numeric rating is defined in a legend describing the levels of performance.
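A scale-with-legend of this kind might be represented as follows. This is a minimal sketch: only the 0–4 range, the N/A option, and the unsatisfactory-to-excellent endpoints come from the programs cited; the wording of the intermediate legend entries is an assumption for illustration.

```python
# Sketch of a 0-4 numeric rating scale with an N/A option and a legend.
# Intermediate legend wording is illustrative, not any jurisdiction's text.

RATING_LEGEND = {
    0: "Unsatisfactory performance",
    1: "Poor performance",
    2: "Fair performance",
    3: "Good performance",
    4: "Excellent performance",
    None: "N/A - criterion not applicable to this engagement",
}


def describe(rating):
    """Map a numeric rating (or None for N/A) to its legend text."""
    if rating not in RATING_LEGEND:
        raise ValueError(f"rating must be 0-4 or None, got {rating!r}")
    return RATING_LEGEND[rating]


print(describe(4))     # Excellent performance
print(describe(None))  # N/A - criterion not applicable to this engagement
```

Defining each score in a legend, as the programs above do, is what makes ratings comparable across evaluators rather than a matter of individual interpretation.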
The same technique is used by MyFloridaMarketPlace.com (2005) and the Government of Tasmania (2001), and by the Ontario Realty Corporation (2010), which uses a numeric scorecard and scorecard guide to obtain a Vendor Performance Rating score.

There were some clear findings regarding the performance elements of the vendor performance management system. Quality of the final deliverables and effective communication throughout the engagement were identified as priorities, suggesting that these two elements should contain the most 'attributes' or question dimensions in order to reflect their importance with respect to vendor performance. Budget/cost control, availability of resources to carry out the contract, quality of resources, and maintaining deadlines/timelines are also important elements. Value-added services and having a vendor contact for dispute resolution were considered the least important elements. Regardless of the specific elements, the performance criteria should always be disclosed to vendors in advance, and the criteria must be simple and easily applied (Kestenbaum & Straight, 1995). Informants also noted that performance management systems require clear definitions of each score item.

Informants noted during the interviews that the recognition of a need for expertise not available in house was generally the instigation for the procurement of management consulting contracts. It is generally thought that the 'misunderstandings' that may lead to performance issues begin with the development of the scope of work. Because the scope of work is used to define client expectations, problems introduced there generally surface only at the end of the contract, when the budget has been used up. It is at this point that clients may recognize a gap between their expectations and the consultant's deliverables, when there is no time or money left to rectify the performance issue.
A high rating in the current survey for "effective communication throughout the engagement" suggests that the gap between client expectations and the consultant's deliverables may be addressed by managing client expectations through effective communication. One informant noted, additionally, that management consultants often function with a vague scope of work. This issue also highlights the importance of communication for defining service, work and deliverable parameters. Within this context (and particularly when clients have not defined their need), there may be an opportunity to introduce performance criteria associated with innovation and creativity with respect to
how the consultant produces the desired outcome. The emphasis here, again, is on the quality of the final deliverables.

The overarching performance criteria among vendor performance programs reviewed in the literature tend to concern the quality of performance, the partnership between client and vendor, the delivery of services, and cost. For example, the Ontario Realty Corporation (2010) utilizes a performance scorecard in which vendors are rated on a scale of 1–5 based on quality, partnership, and value for money. A similar scorecard is used by the Department of Housing (2006) in Georgia, which rates vendors on satisfaction, quality, business relations and timeliness. Of all performance measures, quality of service is the most difficult to evaluate according to Kestenbaum and Straight (1995), and it is the factor that generally receives the most weighting overall (Stueland, 2004). An analysis of the consequences of contract administration problems for contracted services revealed that poor performance was the leading cause of contract delays of more than 10 days (18.4%) and a leading cause of contract termination (17.7%), among contract administration problems including wrong products, delays, definitions of acceptance, change orders, conflicts, risks, subcontracts and costs (Davison & Sebastian, 2009). For professional services contracts, poor performance was the third leading cause of contract delays of less than 10 days (16.7%) and the leading cause of contract termination (13.8%) (Davison & Sebastian, 2009).
Figure 7.1.2 Performance Incentive Preferences by Sector
[Stacked bar chart: the proportion preferring no performance incentives was 56% for government, 45% for institutions, and 40% for private companies; most remaining respondents in each sector favoured both positive and negative incentives, with small shares favouring only one type.]
N=240 (multiple responses permitted)
QD1: Should a vendor performance measurement system use positive and negative performance incentives (e.g., bonuses and/or financial penalties)?

Because respondents were equally divided (45% / 45%) as to whether a vendor performance management system ought to include performance incentives, the researchers examined the question against categories formed by aggregating the respondent sectors. In this context, respondents who worked in government sectors (municipal, provincial, federal and the broader public sector) were more likely to suggest that vendor performance measurement systems should not use positive and negative performance incentives (56%) than respondents in the institutional (academic, school board, health care; 45%) and private company (40%) sectors. However, respondents are generally in agreement that a vendor performance management system should be integrated into procurement operations so as to determine which vendors do or do not receive subsequent RFPs and RFSs. There were also high levels of agreement that vendors with a poor performance record should be put on probation for a defined period of time.

Respondents agree that client/vendor meetings and debriefings on the vendor evaluation report ought to be available as dispute resolution techniques.
Providers are more likely to seek third-party involvement, while smaller numbers agree that there should be a designated ombudsman in the client organization for dispute resolution purposes.

7.2 Design Considerations

Qualitative comments received through the survey and interviews suggest that vendor performance ought to be measured against clauses in the contract. Generally these clauses are provided with tender/RFP/RFS packages to prospective vendors. An informant revealed, for instance, that "a proactive
approach to vendor performance is required. Essentially, it needs to be disclosed that the successful bidder will be subject to a performance evaluation in the bid documents, along with a disclosure of the tools that will be available to the client in assessing the vendor. And then, a consistent application of the proper contract administration functions is important throughout the life of the project."

Comments received, however, indicate that publicly available performance policies are not common practice with respect to management consulting services, and very few companies or organizations represented in the survey have publicly available performance policies (via the internet). However, the integration of performance policies, evaluation processes and metrics into service agreements, as mentioned above, is a widespread best practice.

The majority of programs reviewed in the literature, including those of the Ontario Realty Corporation (2010), the Department of Housing (2006), MyFloridaMarketPlace.com (2005), and the Ministry of Transportation (MTO) (2007), utilize a performance scorecard and scoring guide, a performance rating system, and a weighting technique. The scorecard is used by clients to evaluate a vendor's performance by applying evaluation criteria that are aligned with the various performance components. The scoring guide assists clients with completing the scorecard by outlining the criteria used to evaluate those performance components. Benchmarking techniques are also used as tools to assign weighting to performance components. Weighting is a common measurement technique used to evaluate the importance of each performance criterion relative to the others in order to provide vendors with a total score.
The weighting scales include percentage-per-criterion and numerical values (Stueland, 2004).

Other comments point to the existence of other performance measurement tools and resources, such as the Better Business Bureau, financial stability, level of responsiveness to bid invitations, project team member skills, safety records, and references, as components of an overall vendor performance management system. It is not clear from these comments whether such tools and resources are formally or only informally part of the vendor performance management systems at play within the respondent companies and organizations.

There are several options with respect to the use and implications of various metrics and measurement systems. The Ontario Realty Corporation (2010), for example, uses the average of all of a particular vendor's scorecards over a three-year period to derive its Vendor Performance Rating (VPR). The VPR can be applied in a Request for Qualification, Tender, Proposal, or Services (Ontario Realty Corporation, 2010). The Ministry of Transportation (MTO) (2007) employs a similar approach in its Consultant Performance and Selection System (CPSS), measuring past performance through a Corporate Performance Rating (CPR), which is the weighted average of the consultant's appraisals over three years. Informants also expressed the view that performance management systems ought to include the capacity to reflect trends.

The Office of the Procurement Ombudsman (2010) found in its study of vendor performance that a few organizations, such as Public Works and Government Services Canada – Real Property Branch (PWGSC–RPB) and the National Capital Commission (NCC), use key performance indicators (KPIs) as measures of vendor performance.
They define a KPI as "a key measure of performance for a specific activity that is pre-identified by the organization, and is used for determining the success of the vendor in meeting its contractual obligations" (Office of the Procurement Ombudsman, 2010, p. 9).
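The scorecard, weighting, and multi-year averaging mechanics described in this subsection can be sketched as follows. This is an illustrative example only: the criteria are patterned loosely on the ORC's quality/partnership/value-for-money scorecard, but the specific weights and ratings are assumptions, not any jurisdiction's actual formula.

```python
# Minimal sketch of weighted scorecard scoring and a rolling vendor rating.
# Criteria, weights, and ratings below are illustrative assumptions.

CRITERION_WEIGHTS = {        # percentage-per-criterion style; weights sum to 1.0
    "quality": 0.40,
    "partnership": 0.30,
    "value_for_money": 0.30,
}


def scorecard_total(ratings: dict) -> float:
    """Weighted total of per-criterion ratings (e.g., on a 1-5 scale)."""
    return sum(CRITERION_WEIGHTS[c] * r for c, r in ratings.items())


def vendor_performance_rating(scorecards: list) -> float:
    """VPR-style rating: average of a vendor's scorecard totals over a period."""
    return sum(scorecards) / len(scorecards)


# Two annual scorecards for one hypothetical vendor.
year1 = scorecard_total({"quality": 4, "partnership": 3, "value_for_money": 5})
year2 = scorecard_total({"quality": 5, "partnership": 4, "value_for_money": 4})
print(round(vendor_performance_rating([year1, year2]), 2))
```

Averaging several scorecards, as the ORC and MTO approaches do, smooths out single-project anomalies and gives the trend-reflecting capacity that informants asked for.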
7.3 Discussion Points

DISCUSSION POINT ONE: CMC-Ontario should assist the Ministry of Government Services Supply Chain Management Division in developing a vendor performance management system that can be used by the broader public sector and its suppliers, including the buyers (clients) represented in the current report. 75% of buyers represented in the current report do not have an existing vendor performance management system, and 92% of respondents in the current survey agree that the management of vendor performance is an important activity for measuring, evaluating and improving the performance of vendors. In the words of one informant, a standardized approach to the measurement of vendor performance is long overdue.

DISCUSSION POINT TWO: The vendor performance measurement system ought to utilize contractual performance clauses that are made available to vendors within the bid documents.

DISCUSSION POINT THREE: Given the opportunity to develop the vendor performance measurement system for the broader public sector, it ought not to include positive and negative performance incentives. Instead, vendors with a poor performance record ought to be put on probation for a defined period of time. The vendor performance management system also ought to be used to determine which vendors do or do not receive subsequent RFPs.

DISCUSSION POINT FOUR: Vendor evaluations ought to be triggered by contract values in excess of $100,000, and ought to be undertaken at the completion of each stage or phase of the project. This will guard against punitive evaluations and place the emphasis on continuous improvement.

DISCUSSION POINT FIVE: Vendor performance evaluation systems should be designed in such a way that individual vendor evaluations take approximately 30 minutes to complete.
Vendor evaluation reports ought to use a four-point numeric scale for measuring performance in order to avoid the 'central tendency.'

DISCUSSION POINT SIX: "The quality of the final deliverables" and "effective communication throughout the contract" ought to include the most attributes or question dimensions in the evaluation form in order to reflect the importance of these performance elements.

DISCUSSION POINT SEVEN: Vendors and clients ought to have recourse to client/vendor meetings and debriefings on the vendor evaluation report as dispute resolution techniques. Evaluators ought to be trained in the use of the vendor performance management system, and evaluations ought to be signed.
7.4 Implementation Considerations

In the open-ended comments, respondents noted that various personnel were responsible for undertaking vendor performance evaluations at companies and organizations with existing vendor performance management policies for outside consultants. The responsibility resided with project managers, procurement/purchasing, business services, corporate managers and even presidents at those companies and organizations. Informants noted that vendor performance management evaluations should be a business practice or "collaborative tool which will form the basis of conversation on how to improve both the vendor and the organization's ability to manage the particular contract." Informants envisioned a system where business and finance arms, as well as procurement and the procuring department, would have access to and retain custody of performance evaluations. One informant noted that performance evaluations should be signed.

Respondents in both the interviews and surveys noted that any personnel involved in undertaking vendor performance evaluations should be trained on how the system works at all levels. Respondents cautioned that personnel should be aware of the implications of their vendor evaluations and of how evaluation results are used within various areas of the company or organization. Personnel should also be trained on the use of all performance management tools and resources to ensure consistent application from one project to the next. Likewise, vendors ought to be aware of the performance metrics and schedule, as well as the resources available to them, and be made aware of such resources through clauses in their formal contract.
According to informants, training should focus on and emphasize the need for consistency with respect to both use and application.

Although corrective action measures are not well documented in the literature, the Ministry of Transportation (MTO) (2007) states that infraction reports are only issued for serious contract breaches, such as failure to comply with the terms and conditions of the agreement, failure to provide adequate organization, co-operation, personnel or equipment, failure to comply with standards and legislation, and delayed delivery or failure to complete the project in a timely manner. Stueland (2004) suggests that for a vendor performance program to be successful, a vendor performance policy must be in place, enforced, and available publicly.

The Ontario Realty Corporation (2010) and the Government of Tasmania (2001) state that if a vendor management program is to be effective, the program must be standardized, streamlined, and consistent, and it is fundamental to the process that the information be timely, accurate, and a true reflection of performance. The Office of the Procurement Ombudsman (2010) also states that the use of automated systems is a best practice among many organizations, such as DCC and the Government of Newfoundland and Labrador. The use of an automated system makes it manageable to control vendor performance, as many organizations deal with a large number of contracts at any given time (Office of the Procurement Ombudsman, 2010). Very few companies and organizations with existing performance measurement systems represented in the current survey had automated forms (11%) or automated templates (18%) for conducting vendor evaluations.
8 BIBLIOGRAPHY

Aberdeen Group, Inc. (2002). The Supplier Performance Measurement Benchmarking Report: Measuring Supply Chain Success. iSource.

Government of Tasmania. (2001). Performance Reports for Prequalified Contractors and Consultants. Retrieved from The Government Purchasing Information Gateway: http://www.purchasing.tas.gov.au/buyingforgovernment/getpage.jsp?uid=4C1F9B61B1F4F980CA256C9400148B03

Kestenbaum, M. I., & Straight, R. L. (1995). Procurement performance: Measuring quality, effectiveness, and efficiency. In Public Productivity & Management Review (pp. 200–215). Armonk: M.E. Sharpe, Inc.

Ministry of Transportation (MTO). (2007, October). RAQS Consultant: Consultant Performance & Selection System (CPSS). Retrieved from Ministry of Transportation: https://www.raqsa.mto.gov.on.ca/login/raqs.nsf/english/Text/RAQSPages/B.+Consultant+Heading+-+F.+Consultant+Performance+and+Selection+System+(CPSS)?OpenDocument

MyFloridaMarketPlace.com. (2005). Contract Administrators Meeting. Tallahassee: State of Florida.

MyFloridaMarketPlace.com. (2011, April). 04_01_2011 vendor performance tracking report. Retrieved from Department of Management Services – State of Florida: http://www.dms.myflorida.com/business_operations/state_purchasing/vendor_information/vendor_performance_tracking_vpt/vpt_tracking_reports/04_01_2011_vendor_performance_tracking_report.pdf

Office of the Procurement Ombudsman. (2010). Study on a management approach to vendor performance. In Chapter 6: Procurement practices review (pp. 9–14). Ottawa: Office of the Procurement Ombudsman.

Ontario Realty Corporation. (2010, June 2). Vendor Performance Program. Retrieved from Ontario Realty Corporation: http://www.ontariorealty.ca/Doing-Business-With-Us/Strategic-Sourcing----Bid-Opportunities/Vendor-Performance-Program.htm

Shirouyehzad, H. (2011, April). Efficiency and ranking measurement of vendors by data envelopment analysis.
International Business Research, 4(2), 137–146.

Stueland, V. J. (2004). Supplier evaluations: Best practices and creating or improving your own evaluation. ISM's 89th Annual International Supply Management Conference. San Antonio: Wells Fargo Services Company.

Survey Analytics. (2011). Vendor performance management. Retrieved from Survey Analytics enterprise research platform: http://www.surveyanalytics.com/
  38. 38. CMC-Canada February 2012Vendor Performance Management StudyThe Department of Housing. (2006). Improving vendor performance: Vendor performance reports in contract administration. Georgia: Georgia Institute of Technology.Weber, C. A. (1996). A data envelope analysis approach to measuring vendor performance. Supply Chain Management, 1(1), 28-39.Prepared by: P a g e | 37R.A. Malatest & Associates Ltd. & Gerald Ford Ontario Advacocy Committee Member CAMC
9 APPENDIX A

Figure 9.0.1: The Central Tendency of the Overall Rating

Overall Rating Band      Percent of Vendors
>= 1.00 to <= 2.80        3%
>= 2.81 to <= 3.20       66%
>= 3.21 to <= 5.00       31%

N = 5,105
Source: State of Florida MyFloridaMarketPlace Vendor Performance Tracking Report
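The banding used in Figure 9.0.1 is straightforward to reproduce: each overall rating on the 1.00-5.00 scale falls into exactly one of three bands, and the figure reports the share of vendors per band. A minimal sketch of that calculation (the band boundaries come from the figure; the sample ratings below are invented for illustration):

```python
# Sketch: distribute overall vendor ratings (1.00-5.00 scale) into the
# three bands shown in Figure 9.0.1 and report each band's share.
# The sample ratings are hypothetical, not the Florida VPT data.

def bucket_ratings(ratings):
    """Return the percentage of ratings falling in each band."""
    bands = {
        ">= 1.00 to <= 2.80": lambda r: 1.00 <= r <= 2.80,
        ">= 2.81 to <= 3.20": lambda r: 2.81 <= r <= 3.20,
        ">= 3.21 to <= 5.00": lambda r: 3.21 <= r <= 5.00,
    }
    n = len(ratings)
    return {label: round(100 * sum(test(r) for r in ratings) / n, 1)
            for label, test in bands.items()}

sample = [3.0, 3.0, 2.9, 3.1, 4.2, 2.5, 3.0, 3.15, 4.8, 3.05]
print(bucket_ratings(sample))
# → {'>= 1.00 to <= 2.80': 10.0, '>= 2.81 to <= 3.20': 70.0, '>= 3.21 to <= 5.00': 20.0}
```

Note that the middle band is narrow (2.81-3.20), so a large share there indicates ratings clustering tightly around the scale midpoint, which is the "central tendency" the figure illustrates.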
Figure 9.0.2: Consequences of Contract Administration Problems for Contracted Services
(Percent of responses. Columns: No Effect; Contract Delay < 10 days; Contract Delay > 10 days; Increased Contract Cost < 10%; Increased Contract Cost > 10%; Contract Termination.)

Contract Administration Problem   No Effect  Delay<10d  Delay>10d  Cost<10%  Cost>10%  Termination
Wrong Product                       48.8%      23.1%      7.9%       8.7%      4.1%      7.4%
Delays                              30.8%      29.7%     18.1%      10.9%      4.7%      5.8%
Definition of Acceptance            38.7%      22.7%     16.4%       9.8%      5.5%      7.0%
Change Order                        31.8%      17.8%     12.5%      20.8%     10.6%      6.4%
Conflict                            31.3%      25.7%     17.3%       8.1%      7.7%      9.9%
Other Sources                       48.5%      17.0%     12.4%      11.2%      7.9%      2.9%
Poor Performance                    26.7%      18.1%     18.4%      10.8%      8.3%     17.7%
Risk of Failure/Termination         33.0%      21.9%     13.0%       8.9%      8.1%     15.2%
Subcontractors                      41.5%      19.0%     11.7%      12.5%      8.1%      7.3%
Costs                               29.2%      14.4%     12.5%      22.9%     12.2%      8.9%
All Consequences                    35.6%      21.0%     14.2%      12.5%      7.8%      9.0%

N = 2,228
Source: Davison, B., & Sebastian, R. J. (2009). An Analysis of the Consequences of Contract Administration Problems for Contract Types.
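Each row in Figure 9.0.2 is a row-normalized distribution: the raw count of responses for each consequence is divided by the row total, so the six percentages in a row sum to roughly 100% (subject to rounding). A minimal sketch of that normalization, using invented counts chosen so the output reproduces the figure's Wrong Product row:

```python
# Sketch: convert raw consequence counts for one contract administration
# problem into the row-percentage layout of Figure 9.0.2.
# The counts below are hypothetical, scaled to mirror the Wrong Product row.

def row_percentages(counts):
    """Convert a row of raw counts into percentages of the row total."""
    total = sum(counts)
    return [round(100 * c / total, 1) for c in counts]

# Columns: No Effect, Delay <10d, Delay >10d, Cost <10%, Cost >10%, Termination
wrong_product_counts = [118, 56, 19, 21, 10, 18]
print(row_percentages(wrong_product_counts))
# → [48.8, 23.1, 7.9, 8.7, 4.1, 7.4]
```

Reading the table this way, the rows with the largest "Contract Termination" shares (Poor Performance at 17.7% and Risk of Failure/Termination at 15.2%) are the problems most likely to end a contract outright.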
