Alignment, Impact and Measurement with the A-model


  1. Alignment, Impact and Measurement with the A-model. Bruce C. Aaron, ametrico. 2012 Users Conference, New Orleans, March 20-23. Copyright © 1995-2012 Questionmark Corporation and/or Questionmark Computing Limited, known collectively as Questionmark. All rights reserved. Questionmark is a registered trademark of Questionmark Computing Limited. All other trademarks are acknowledged.
  2. Culture and Methods. Culture: a state of responsibility for demonstrating accepted outcomes; operates at the organization, group, and individual levels. Methods: the tools and methods used to evaluate progress toward outcomes, including assessment, evaluation, performance measurement, and the A-model.
  3. Talent Management, Human Capital Management, Knowledge Management (KM), Diversity, Competency Management, HRD, Learning & Training, Blended Learning, Social Learning, 70/20/10, Performance Management, Performance Improvement.
  4. Issues in Accountability. Some common problems: (1) inconsistent evaluation approaches; (2) a "post" evaluation mentality; (3) weak support for program improvement; (4) ambiguous relationship of programs to strategy or the business problem; (5) "scrap" measurement; (6) no forecasting of outcomes. A-model benefits: (1) a unified decision framework; (2) evaluation as part of planning and strategy; (3) clear identification of performance drivers; (4) links to strategy and the business problem must be validated; (5) efficient measurement; (6) forecasting across outcomes ("predictive analytics").
  5. Session Goals. In today's session, we will: review the fundamental concepts of the A-model; explore case study examples; discuss evaluation tips and techniques; invite Q&A. As a result, you should be able to: describe the A-model; generate new ideas for assessment and evaluation in your organization; apply your new ideas to your work.
  6. The Systemic View.
  7. The A-model (SM). Diagram: the Analysis & Design Phase and the Assessment & Evaluation Phase, arranged over the Program, Enablement, Performance, and Problem stages, with the detail and sequence of each stage.
  8. The A-model. Diagram: the Analysis & Design Phase and the Assessment & Evaluation (A&E) Phase, with stages Program (Delivery A&E), Enablement (Enablement A&E), Performance (Performance A&E), and Problem (Impact A&E). Legend: Purpose, Program Delivery, Causal Path.
  9. The A-model: Problem. The Problem definition includes important objectives relating to quantity, quality, cost, revenue, time to market, engagement, satisfaction, etc.
  10. The A-model: Performance. Performance requirements are the specific behaviors or performances required to solve the problem. These are defined through causal analysis of the human performance issues relating to the Performance Problem. Creating the new Performance addresses the Problem.
  11. The A-model: Enablement. Enablement requirements identify what must be in the "heads, hearts, or hands" of participants to bring about the desired change. These are identified as gaps in Knowledge/Skill, Incentive/Motivation, Information, Performance Support, or Work Processes. Filling these gaps enables the Performance.
  12. The A-model: Program. Program requirements identify what the solution or intervention must be and do as it is delivered to participants in order to be successful and produce the Enablement identified in the previous stage. This stage specifies the conditions that ensure successful implementation of the solution.
  13. The A-model: measures at each stage. Program: delivery or operations metrics, penetration of the target audience, quality standards for design or delivery, participant reaction and intent to apply. Enablement: learning, skill gain, attitude change, performance support, process support. Performance: target behaviors, application on the job, individual and organizational capability. Problem: quantity, quality, cost, revenue, time savings, client/employee satisfaction.
  14. A&E Questions. Program: Is the program implemented as planned? Do participants perceive the value of the solution? Enablement: Did they learn? Can they use the performance support? Are they motivated to apply the solution? Performance: Did performance change? Problem: What was the business impact, in dollars and ROI?
  15. Business Impact and Return on Investment. Impact: What is the status of the "pain" (problem) or "gain" (opportunity) in measures of time, quality, production, cost, sales, etc.? ROI: What is the ratio of that monetary benefit to the total cost of the solution? Organizational Impact and ROI: ROI = Net Gain / Cost. What is the "pain" (problem) or "gain" (opportunity)? Where does this hit the bottom line? What brings us to the table? What is the gap?
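A minimal sketch of the ROI arithmetic from this slide (ROI = Net Gain / Cost). All figures are hypothetical and purely illustrative; the model itself prescribes the ratio, not the numbers.

```python
# Hypothetical figures for illustration only.
program_cost = 250_000       # total cost of the solution
monetary_benefit = 400_000   # benefit attributed to the program (time, quality, sales, etc.)

net_gain = monetary_benefit - program_cost
roi = net_gain / program_cost                 # ROI = Net Gain / Cost

print(f"Net gain: ${net_gain:,}")             # $150,000
print(f"ROI: {roi:.0%}")                      # 60%, i.e., $0.60 returned per $1 invested
```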
  16. Case Example: GlobeCom Inc. The leadership of GlobeCom Inc. has reason to believe that their company is offering superior value and service in the marketplace. However, they are plagued by lagging performance in several important metrics that directly affect revenue, including shortfalls in market share and win rates on sales opportunities. After conducting analysis on these issues, GlobeCom concludes that they are not being "out-solutioned" in the marketplace, but that they are being "out-sold". GlobeCom's needs analysis supported several hypotheses about gaps in the sales-related performances of their business developers. The company lacked a consistent approach to sales. Based on the performance analysis, an intervention was designed, implemented, and evaluated. The solution comprised a comprehensive training program in consultative sales targeted at the company's business developers, and changes in several work processes, tools, and organizational roles relating to the sales function. Try to locate each of the following elements of the GlobeCom case within the A-model.
  17. Locate each element in the A-model (see Appendix A for solutions). 1. Increase market share, which is stagnant and below target. 2. GlobeCom business developers need to present solutions to clients persuasively. 3. Apply a systemic and systematic sales methodology across the organization. 4. Train GlobeCom business developers in a new sales methodology. 5. The change in ability of GlobeCom's program participants to present solutions to clients persuasively. 6. The application of the new sales methodology by GlobeCom's program participants. 7. The increase in sales that is attributable to the new program. 8. (Sales and savings attributed to the program - cost of the program) / cost of the program. Diagram positions: A = Problem, B = Performance, C = Enablement, D = Program on the Analysis & Design side; E = Delivery, F = Enablement, G = Performance, H = Impact on the Assessment & Evaluation side.
  18. A Comprehensive Framework. Analysis & Design side: Instructional Design, Needs Assessment, Performance Analysis, Strategy, Goals. Assessment & Evaluation side: Testing, Certification, Research Design, the Causal Path, Performance A&E, Impact A&E, ROI.
  19. Data Collection in the A-model. Analysis & Design side: Needs Assessment, Performance Analysis, Job/Task Analysis, Employee Surveys, Client/Customer Surveys. Assessment & Evaluation side: Delivery Evaluation, Tests, Performance Evaluation, Impact Evaluation.
  20. Evaluation Strategies in the A-model: the 3 Cs, the Evaluation Cube, and the Causal Path.
  21. Evaluation Strategies in the A-model: the 3 Cs, the Evaluation Cube, and the Causal Path.
  22. The Basis of Evaluation, the 3 Cs (Comparisons): (1) Compare the individual or group to itself over time: change, increase/decrease, baseline, trend and variation of the time series. (2) Compare the individual or group to defined others: norms, rank, percentiles, the mean and variation of the distribution. (3) Compare the individual or group to a defined standard: benchmarks, performance targets, mastery scores.
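To make the three comparisons concrete, here is a small Python sketch applying them to a single assessment score. The score history, norm group, and mastery cutoff are invented for illustration; only the comparison logic follows the slide.

```python
from statistics import mean, stdev

score = 78                            # hypothetical current score for a group
history = [61, 64, 70, 72]            # the same group's earlier scores
norm_group = [55, 66, 68, 74, 80]     # scores of defined others
mastery_cutoff = 75                   # defined standard (benchmark / mastery score)

# 1. Compare to self over time: change from baseline and trend.
print("Change from baseline:", score - history[0])

# 2. Compare to defined others: position relative to the norm distribution.
print("Distance from norm mean:", round(score - mean(norm_group), 1),
      "points; norm SD =", round(stdev(norm_group), 1))

# 3. Compare to a defined standard: benchmark, target, or mastery score.
print("Meets mastery criterion:", score >= mastery_cutoff)
```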
  23. Evaluation Strategies in the A-model: the 3 Cs, the Evaluation Cube, and the Causal Path.
  24. The Evaluation Cube: a tool for generating ideas about how to assess and evaluate. It aids both divergent thinking (generating the alternatives) and convergent thinking (selecting the best approach based on your situation), and it allows you to systematically consider the assessment and evaluation options.
  25. The Evaluation Cube: Pathways for Assessment & Evaluation. The cube has three dimensions: the 3 Ps (judge these), the 3 Cs of Self, Others (norm), and Criterion (by comparing to these), and the 3 sources of evidence (with data from these).
  26. Evaluation Cube Example A. A company requires certification exams of its engineers to verify mastery of software development skills. P: judging Performance (Enablement); C: comparing to a mastery criterion; Source: with data collected from the individual.
  27. Evaluation Cube Example B. A new performance support system and training are provided to customer service representatives. Six months after training, supervisors of the reps are surveyed to determine the extent to which participants are using the new system to meet customers' needs. P: judging the Performance domain (performance and enablement); C: comparing to a criterion (performance objectives); Source: with data collected from observers.
  28. The Evaluation Cube: Summary. Use the Cube to generate and select alternatives for assessment. (1) First, identify the evaluation domain (Program, Performance, Problem). (2) Then consider the three possible comparisons (Self, Others, Criterion) and the three data sources, testing each combination for: A. Accuracy: will this path provide data that are reliable and valid? B. Utility: are these data sources credible to stakeholders, and does the source provide data within the necessary timelines? C. Feasibility: can the data be collected cost-effectively, and are the data practical, credible, and politically viable? D. Propriety: are the data useful given any necessary protections for participants (e.g., data privacy, anonymity, local considerations)?
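Because the Cube is essentially a 3 x 3 x 3 grid, the divergent step can be mocked up as simple enumeration. The domain and comparison labels come from the slides; the three source-of-evidence labels are assumptions for illustration (the deck's examples name the individual and observers as sources), and the accuracy, utility, feasibility, and propriety screening remains a human judgment.

```python
from itertools import product

domains = ["Program", "Performance", "Problem"]        # the 3 Ps
comparisons = ["Self", "Others (norm)", "Criterion"]   # the 3 Cs
sources = ["Individual", "Observers", "Records"]       # assumed example labels

# Divergent thinking: list every candidate pathway through the Cube.
pathways = list(product(domains, comparisons, sources))
print(len(pathways), "candidate pathways")             # 27

# Convergent thinking would then screen each pathway for accuracy, utility,
# feasibility, and propriety; here we just print a few candidates to review.
for domain, comparison, source in pathways[:3]:
    print(f"Judge {domain}, comparing to {comparison}, with data from {source}")
```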
  29. Evaluation Strategies in the A-model: the 3 Cs, the Evaluation Cube, and the Causal Path.
  30. The A-model Causal Path. Why collect data at each stage of the Assessment & Evaluation (A&E) Phase? When impact is less than expected, the causal path identifies the specific domain that needs to be addressed. When impact meets or exceeds expectations, it provides evidence of the program's direct influence on results. The A-model establishes a natural, logical correlation between data collected along the Causal Path.
  31. Applying the Causal Path: Example A. Result at each stage (dichotomized): Delivery, Enablement, Performance, Impact. In this example there is no transfer of new knowledge, skills, or attitudes to the workplace. Identify issues in the Performance domain (e.g., incentive to apply, support from management, performance support, etc.).
  32. Applying the Causal Path: Examples. Result at each stage (dichotomized): the logical pattern of results for a successful implementation, with each stage from Delivery through Impact showing the expected result. This pattern is necessary to attribute impact to the program.
  33. Applying the Causal Path: Examples. Result at each stage (dichotomized): an illogical pattern of results. Attributing impact to the program is not justified. This can result from the A-model not being applied in the A&D phase, misspecification of delivery metrics, and/or extraneous factors affecting the targeted Impact results.
  34. Applying the Causal Path: Examples. Result at each stage (dichotomized): the typical pattern of data used to make decisions about programs, with no measurement of Performance or Impact. Successful delivery and achievement of enablement objectives are interpreted as success, and decisions about further investment are at risk.
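These dichotomized patterns lend themselves to a simple decision rule: impact can only be attributed to the program when every upstream stage also shows its expected result. The sketch below encodes one reading of slides 31 through 34; the stage names follow the deck, while the function itself is an illustrative interpretation rather than part of the A-model.

```python
CAUSAL_PATH = ["Delivery", "Enablement", "Performance", "Impact"]

def interpret(results):
    """results maps each stage to True/False, or None if the stage was never measured."""
    if any(results.get(stage) is None for stage in CAUSAL_PATH):
        return "Incomplete measurement: decisions about further investment are at risk."
    if all(results[stage] for stage in CAUSAL_PATH):
        return "Logical pattern: attributing impact to the program is justified."
    if results["Impact"]:
        return "Illogical pattern: impact without upstream results; look for extraneous factors."
    first_gap = next(stage for stage in CAUSAL_PATH if not results[stage])
    return f"Address the {first_gap} domain before expecting downstream impact."

# Example A: enablement achieved, but no transfer to the workplace.
print(interpret({"Delivery": True, "Enablement": True,
                 "Performance": False, "Impact": False}))

# Typical pattern: performance and impact never measured.
print(interpret({"Delivery": True, "Enablement": True,
                 "Performance": None, "Impact": None}))
```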
  35. Case example I: Knowledge Management. ABC Inc. implements a KM System designed to consolidate and distribute information useful for work. Basic functions relate to harvesting, coding, and storing information. They believe "crowd-sourcing", electronic libraries, expert networks, and online CoPs (Communities of Practice) are important, so these initiatives (i.e., Programs) are also funded and implemented. The system collects metrics and generates reports relating to traffic (hits, visits, the number of contributions in the repositories, the number of people in CoPs, etc.), and these are tracked and reported for accountability. The economic environment tightens, the company reorganizes, and the KM initiatives are challenged to justify current and future funding.
  36. Case example I: Knowledge Management, measures at each stage. System Status: KM System operational status; evaluation of whether systems are performing their basic functions (e.g., harvesting, coding, and storing information) as designed, independent of interaction with users; system "health" indicators. Access System: the first level of interaction between the KM System and users; metrics relate to penetration (person access), hits/visits (content access), usability, and navigation. Locate Information: "locate" metrics indicate whether the right information gets to the right person(s) at the right time, including the effectiveness of search and browse strategies. Performance: what our people do in the work setting with the knowledge created or found through the KM system in the service of our clients. Business Results & ROI: the business outcome or result of applying the knowledge created or found through the KM system, and the ratio of quantifiable business results to the cost of the program (ROI, the net gain relative to cost).
  37. Case example II: Knowledge Management. SysTech is implementing a KM System designed to increase collaboration between first-tier customer support engineers and higher-tier, more expensive engineers. The goal is to reduce the amount of time spent by high-tier engineers re-solving identical problems from various customers as those issues are escalated beyond Tier 1 Support. The system should harvest, code, and store information about these issues and solutions to make it available for re-use. The system has inherent metrics and reports relating to traffic (hits, visits, the number of contributions in the repositories, and ratings and recognition for successful solutions). SysTech has a set of metrics relating to the amount of time it takes to solve a customer issue. They are investing much of their project time and energy on these metrics and are basing their business case and evaluation on decreasing these metrics (i.e., to show time savings).
  38. Case example II: Knowledge Management, CURRENT vs. PROPOSED measurement. SysTech's KM System is designed to increase collaboration and reduce the amount of time spent by high-tier engineers re-solving identical problems. CURRENT measurement sits in the Program domain (System Status, Access System, Locate Information): system traffic metrics, counts of contributions in the repositories, and ratings and recognition for successful solutions. PROPOSED measurement extends to the Performance and Business Results & ROI stages, using metrics relating to the amount of time it takes to solve a customer issue.
  39. Invoke Tukey's Axiom: "An approximate answer to the right problem is worth a good deal more than an exact answer to an approximate problem." -- J. W. Tukey
  40. Questions?
  41. Appendix A. Suggested solutions for the GlobeCom case: A: 1. Increase market share, which is stagnant and below target. B: 2. GlobeCom business developers need to present solutions to clients persuasively. B: 3. Apply a systemic and systematic sales methodology across the organization. D: 4. Train GlobeCom business developers in a new sales methodology. F: 5. The change in ability of GlobeCom's program participants to present solutions to clients persuasively. G: 6. The application of the new sales methodology by GlobeCom's program participants. H: 7. The increase in sales that is attributable to the new program. H: 8. (Sales and savings attributed to the program - cost of the program) / cost of the program.
