Challenges in Business Performance Measurement: The Case of a Corporate IT Function
Transcript

  • 1. Challenges in Business Performance Measurement: The Case of a Corporate IT Function. Stephen Corea & Andy Watters (Warwick University, UK)
  • 2. Presentation Outline
    • Research Motivation
    • Theoretical Review
    • Methodology
    • Case Study: GITS
    • Findings & Discussion
    • Conclusion
  • 3. Research Background
    • Business performance measurement (PM) – presenting relevant information to management staff for assessing the organization's progress towards achieving strategic/operational aims
      • Several major PM frameworks proposed recently: field dominated by prescriptive, top-down perspective (formal derivation from strategy)
    • RESEARCH AIM: exploratory study to understand the challenges in mounting dashboard-based PM practices in a corporate IS department
      • Significant no. of PM initiatives fail to take root or adequately deliver expected benefits (70% according to McCunn, 1998)
      • Need for greater understanding of the on-the-ground challenges involved, from a bottom-up perspective
  • 4. Theory: PM Implementation
    • Robust PM system should take a ‘balanced’ approach (Kaplan & Norton, 1992)
    • Identification/population of useful performance measures (or metrics) to capture progress towards goal attainment is a key but not easily satisfied criterion (Neely et al., 1997)
    • Key proposed principles of PM design
      • Should be grounded in strategy: performance metrics must be derived from predefined strategy/objectives (De Toni & Tonchia, 2001)
      • Should incorporate a high proportion of ‘leading’ indicators relative to ‘lagging’ indicators (Eckerson, 2006); a small sketch of this distinction follows this slide
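As a minimal sketch of the leading vs. lagging distinction referenced above: a lagging indicator reports an outcome after the fact, while a leading indicator is derived from a current driver that can still be influenced. The measure names, data and ratio below are hypothetical and are not taken from the case.

```python
# Hypothetical illustration of lagging vs. leading indicators.
monthly_incidents = [42, 45, 51, 48, 55, 60]   # past outcomes (a lagging view)
open_change_requests = 130                      # a current driver (assumed)
incidents_per_change = 0.35                     # assumed historical ratio

def lagging_indicator(history):
    """Report last period's outcome: useful for monitoring, not prediction."""
    return history[-1]

def leading_indicator(driver, ratio):
    """Project next period's outcome from a driver that can still be influenced."""
    return driver * ratio

print("Incidents last month (lagging):", lagging_indicator(monthly_incidents))
print("Projected incidents next month (leading):",
      leading_indicator(open_change_requests, incidents_per_change))
```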
  • 5. Theory: PM Design
    • 22 most-cited recommendations for designing measures (Neely et al., 1997); a small illustrative sketch of how such a specification might be recorded follows this list
    • Provide fast feedback
    • Have an explicit purpose
    • Be based on explicitly defined formula and source of data
    • Employ ratios rather than absolute numbers
    • Use data which are automatically collected as part of a process
    • Be reported in a simple consistent format
    • Be based on trends rather than snapshots
    • Provide information
    • Be precise – be exact about what is being measured
    • Be objective – not based on opinion
    • Be derived from strategy
    • Be simple to understand
    • Provide timely and accurate feedback
    • Be actionable: based on quantities that can be influenced or controlled
    • Reflect the business process: both customer and supplier should be involved in defining
    • Relate to specific targets
    • Be relevant
    • Be predictive: part of a closed management loop
    • Be clearly defined
    • Have visual impact
    • Should focus on improvement
    • Be consistent (i.e. should maintain significance as time goes by)
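Several of the recommendations above (explicit purpose, explicitly defined formula and data source, ratios rather than absolute numbers, specific targets, clear definition) amount to recording each measure as a structured specification rather than a bare number. The sketch below is a minimal illustration; the field names and the example measure are hypothetical, not drawn from Neely et al. or from the case.

```python
from dataclasses import dataclass

@dataclass
class MeasureSpec:
    """A performance measure recorded as an explicit specification
    (hypothetical fields, loosely following the recommendations above)."""
    name: str          # clearly defined, simple to understand
    purpose: str       # explicit purpose
    formula: str       # explicitly defined formula
    data_source: str   # explicitly defined source of data
    target: float      # relates to a specific target
    frequency: str     # supports fast, timely feedback
    owner: str         # actionable: someone can influence the result

# Invented example for illustration only:
first_time_fix = MeasureSpec(
    name="First-time fix rate",
    purpose="Track service-desk effectiveness",
    formula="tickets resolved at first contact / total tickets",  # a ratio, not an absolute
    data_source="service-desk ticketing system (automatically collected)",
    target=0.80,
    frequency="weekly",
    owner="Service Desk manager",
)
print(first_time_fix)
```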
  • 6. Theory: PM Implementation
    • Systemic aspect of effective PM: process rationalisation, shared understanding & staff commitment, IT support (data capture/collection, processing & presentation)
    • Dashboards: visual impact, data quality and timeliness (Few, 2005; Dixon et al., 1990)
    • PM for the IT Function: need for spread of measures across 3 categories (Stanwick & Stanwick, 2005) – (i) efficiency; (ii) effectiveness; (iii) productivity; a small sketch of such a spread follows this slide
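As a minimal sketch of the three-category spread suggested above: each measure is tagged with one category and the set is checked for coverage. The example measures are hypothetical and are not taken from the GITS dashboards.

```python
from collections import Counter

# Hypothetical IT-function measures tagged with the three categories.
measures = {
    "Cost per managed desktop": "efficiency",
    "Server utilisation rate": "efficiency",
    "User satisfaction score": "effectiveness",
    "SLA targets met (%)": "effectiveness",
    "Projects delivered per FTE": "productivity",
}

# Check that the measure set spans all three categories.
spread = Counter(measures.values())
missing = {"efficiency", "effectiveness", "productivity"} - set(spread)
print("Measures per category:", dict(spread))
print("Uncovered categories:", missing or "none")
```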
  • 7. Research Method
    • Interpretive case study method (Walsham, 1995): in-depth single-site case study, aimed at theoretical generalisation
    • Case organisation/unit: GITS (Group IT Services), the corporate IT function of Multicorp (a pseudonym), a multi-national manufacturer of tobacco-based products
    • Data Gathering & Analysis
      • multiple site visits: June to August 2006
      • 27 semi-structured interviews, dashboards/documents review, informal conversations
      • inductive analysis (Glaser & Strauss, 1967): identifying patterned regularities (common themes, issues or dilemmas)
  • 8. Case Study: Multicorp/GITS
    • GITS (Group IT Services)
      • corporate IT function, formed in 2004
      • transform supply of IT support across Multicorp’s worldwide business units from traditional geographically-localised model into a centralised, ‘shared services’ model
      • to achieve cost-savings of £100 million by 2009
    • Major manufacturer of cigarettes & tobacco-based products
      • 300+ brands sold in 180 ‘end-markets’ (i.e. country-specific regions); factories in 54 countries
      • stable industry competitive conditions: strategy is strongly focused on efficiencies (cost-savings) across business units (esp. support)
  • 9. Case Study: GITS
    • Strategy & structure
      • vision: irresistible value
      • client-facing units & management team
      • critical need to monitor performance
      • 3 dashboards in use
      • “Two years ago GITS had less than 100 staff, the management team could sit in a room and discuss in detail operational issues throughout the department. Now we’re more than 500 strong, and are doing far more things. We haven’t a clue what is going on out there, and don’t know what operational things we should be looking at.” (Manager)
    [Strategy diagram: World Class People · Service Quality · Irresistible Value · Cost Savings · SDS 26 · Volume · £100m · CSS 4.5 · 100% · Supply Side]
  • 10. GITS: Leadership Dashboard
  • 11. GITS: Application Services Dashboard
  • 12. GITS: Technical Services Dashboard
  • 13. Case: Findings
    • Inadequacies in dashboard population and scope of measurement
      • difficulties obtaining timely & accurate data
      • areas of performance left untracked (in scope & time)
      • deficient in leading indicators (heavily lag-oriented): lack of predictive capacity to take proactive interventions
        • “I’ve no idea what drives the numbers. I’m not sure if anyone has.” (manager)
  • 14. Case: Findings
    • lack of clarity or common understanding regarding definition of certain measures (a small illustrative sketch follows this slide), e.g.
      • Constitution of measures: e.g. managed volume
        • (i) “We count managed volume against our target only when services have been transferred to GITS, and the first invoice sent to the end-market”;
        • (ii) “Managed volume is just that: services which we (GITS) manage. It doesn’t matter if we haven’t billed the customer yet.”
      • Progress towards targets: e.g. cost savings
        • (i) “We claim that we have achieved a cost saving when we sign a contract with an outsource provider to provide the service at a cost lower next year than our current deal”
        • (ii) “Cost savings are claimed when we release next year’s price list to the end markets in May, with confirmation in early December.”
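The ambiguity above is essentially a disagreement over the counting rule. As a minimal sketch (invented service records and field names, not GITS data), writing each interpretation down as an explicit rule makes the difference visible and forces a choice.

```python
# Two competing counting rules for "managed volume", made explicit.
# The records and field names below are invented for illustration.
services = [
    {"name": "Email hosting",   "volume": 40, "transferred": True,  "first_invoice_sent": True},
    {"name": "ERP support",     "volume": 35, "transferred": True,  "first_invoice_sent": False},
    {"name": "Desktop refresh", "volume": 25, "transferred": False, "first_invoice_sent": False},
]

def managed_volume_strict(services):
    """Interpretation (i): count only services transferred AND first-invoiced."""
    return sum(s["volume"] for s in services
               if s["transferred"] and s["first_invoice_sent"])

def managed_volume_broad(services):
    """Interpretation (ii): count every transferred service, billed or not."""
    return sum(s["volume"] for s in services if s["transferred"])

print("Managed volume, strict rule:", managed_volume_strict(services))  # 40
print("Managed volume, broad rule:", managed_volume_broad(services))    # 75
```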
  • 15. Case: Findings
    • Relation of measurement to strategy: difference between Leadership & AS/TS dashboards
      • Application Services & Technical Services dashboards reported self-chosen operational targets beyond existing strategy
      • non-strategy-derived measures were seen by managers as useful aspects for monitoring operational health
        • “There are quite a few measures which don’t directly relate to strategy or targets, but we think it is worthwhile to keep track of them. It helps to know the operational health of the business.” (Application Services manager)
      • political value or ‘signaling’
        • “We benchmark the charge rates of our project managers against external consultancy providers; we’re less expensive, and the difference is classed as Cost Avoidance. It helps us demonstrate our value to the business.” (Technical Services manager)
  • 16. Case: Discussion
    • Lack of systemisation in data collection and measurement (cum dashboard) design
      • no top-down mandate or formal programme/framework guiding the implementation of these practices: need for process rationalisation and information systems infrastructure (Bourne et al., 2003)
      • difficulty identifying leading indicators (Neely et al., 2000; Eckerson, 2006)
    • Need for re-orientation of fundamental aim/focus
      • from a tool for simply monitoring/reporting to one of learning what factors drive results (i.e. to be able to influence/control)
  • 17. Case: Discussion
    • Re-thinking major PM tenets/principles
      • Notion of ‘balance’ in balanced measurement
        • financial vs. non-financial ‘lever’
        • reporting vs. prediction/learning ‘lever’ (lagging vs. leading indicators)
      • Existing strategy as the source for deriving measures: a case for de-coupling strategy from measurement?
        • cost-focussed strategies promote financial and discourage non-financial indicators
        • ‘de-politicizing’ of measurement
        • promote transformation of PM towards a learning rather than simply a monitoring tool
  • 18. Conclusion
    • Results of this exploratory study suggest a need for further research & theoretical development to extend & deepen understanding of the complex nature of PM
      • What does ‘balanced’ measurement imply?
      • Relationship between strategy and measurement
    • Questions?
    • Thank you
  • 19. END OF PRESENTATION
  • 20. References
    • Kaplan, R. and Norton, D. (1992) The balanced scorecard – measures that drive performance. Harvard Business Review, January-February, 71-79.
    • McCunn, P. (1998) The Balanced Scorecard: the eleventh commandment. Management Accounting 34-36.
    • Neely, A., Richards, H., Mills, J., Platts, K. and Bourne, M. (1997) Designing performance measures: a structured approach. International Journal of Operations & Production Management 17, 1131-1152.
    • De Toni, A. and Tonchia, S. (2001) Performance measurement systems – models, characteristics and measures. International Journal of Production and Operations Management 1, 347-354.
    • Eckerson, W. (2006) Performance Dashboards: Measuring, Monitoring, and Managing Your Business. New Jersey: John Wiley & Sons.
    • Few, S. (2005) Dashboard Design: Beyond Meters, Gauges, and Traffic Lights. Business Intelligence Journal 10, 18-24.
    • Dixon, J.R., Nanni, A.J. and Vollmann, T.E. (1990) The new performance challenge: Measuring operations for world-class competition, Homewood, IL: Dow Jones-Irwin.
    • Stanwick, P. and Stanwick, S. (2005) IT Performance: How Do You Measure a Moving Target? The Journal of Corporate Accounting and Finance 13, 19-24.
    • Walsham, G. (1995). Interpretive case studies in IS research: nature and method. European Journal of Information Systems, 4:2, 74-81.
    • Glaser, B. and Strauss, A. (1967) The discovery of grounded theory, Aldine, Chicago.
    • Bourne, M., Franco, M. and Wilkes, J. (2003) Corporate Performance Management. Measuring Business Excellence 7, 15-21.