
TBR 2Q11 Corporate IT Buying Behavior & Customer Satisfaction Study

Technology Business Research is a different kind of research company. Our bottoms-up approach provides a look at the technology industry unlike anything you’ve seen before. We analyze company performance in professional services, networking and mobility, computing and hardware, and software on a quarterly basis, leveraging our data to create industry benchmarks and landscapes that provide a business perspective on leaders and laggards and their business plans. We are experts in the business of technology.

The TBR Computing research team compiled information from the second quarter of 2011 into this Corporate IT Buying Behavior & Customer Satisfaction Study. These supporting slides include information regarding internal support organizations, Dell Services, IBM Global/Lenovo Services, and HP Services. TBR provides insight on topics such as competitive placement, performance differentiation shifts, server support, desktop/notebook support, critical metrics, the historical record, and its own Watch List.

More in: Technology , Business
  • Full Name Full Name Comment goes here.
    Are you sure you want to
    Your message goes here
    Be the first to comment
    Be the first to like this
No Downloads

Views

Total Views
494
On Slideshare
0
From Embeds
0
Number of Embeds
1

Actions

Shares
Downloads
14
Comments
0
Likes
0

Embeds 0

No embeds

Report content

Flagged as inappropriate Flag as inappropriate
Flag as inappropriate

Select your reason for flagging this presentation as inappropriate.

Cancel
    No notes for slide

Transcript

  • 1. Technology Business Research: Accelerating Customer Success Through Business Research (TBR, Technology Business Research, Inc.)
  • 2. Service & Support: Corporate IT Buying Behavior & Customer Satisfaction Study, Second Quarter 2011

    2Q11 rank, TBR/WSI score and strength/weakness points by support segment:

    Support Provider                  Overall Support Services   x86 Server Support     Desktop/Notebook Support
    Internal Support Organizations    Rank 1, 85.9, +16          Rank 1, 85.9, +14      Rank 1, 85.8, +14
    Dell Services                     Rank 2, 82.8, +2           Rank 3, 82.0, 0        Rank 2, 83.7, +3
    IBM Global/Lenovo Services        Rank 2, 82.4, +1           Rank 2, 83.0, +5       Rank 3, 81.7, -1
    HP Services                       Rank 2, 81.5, -1           Rank 3, 81.9, 0        Rank 3, 81.1, -1

    Publication date: Sept. 23, 2011. Author: Julie Perron. ©2011 Technology Business Research Inc.
  • 3. Content: TBR Slides and Modules
    - 2Q11 Corporate Service & Support Satisfaction At A Glance
    - 2Q11 Competitive Placement Summary & Insights
    - Key Findings
    - The Score in 2Q11
    - Most Noteworthy Events: Performance Differentiation Shifts
    - Server Support: Segment Analysis
    - Desktop/Notebook Support: Segment Analysis
    - Critical Metrics Summary
    - TBR's Watch List
    - Historical Record
  • 4. TBR 2Q11 Corporate Service & Support Satisfaction At A Glance
  • 5. 2Q11 Corporate Service & Support Satisfaction at a Glance: IBM and Dell regain honors in their respective server and desktop/notebook support satisfaction segments; in-house support continues to assert itself as the overall model of maintenance efficiency.

    [Chart: 2Q11 weighted scores and ranking by support segment. Server support: Internal Support Organizations 85.9, IGS/Lenovo Services 83.0, Dell Services 82.0, HP Services 81.9. Desktop/notebook support: Internal Support Organizations 85.8, Dell Services 83.7, IGS/Lenovo Services 81.7, HP Services 81.1. Source: TBR.]

    IBM holds the server support leadership position for the sixth straight reporting period:
    • IBM outpaced its OEM competitors by excelling across key areas that touch on each aspect of the support experience: on-site support, phone support, replacement parts availability and the perception of services value.
    • Dell and HP Services shared the No. 3 ranking position. Both contenders lacked the differentiation exhibited by IBM across the areas identified above.

    Dell Services maintains the advantage for desktop/notebook support for its third straight period:
    • Dell Services defended its top ranking position by outperforming competitors across most categories.
    • At a considerable distance from Dell, Lenovo Services and HP Services shared the No. 3 ranking position. HP was cited with a competitive warning for technical expertise, while Lenovo Services was cited with a warning for online support, both areas in which Dell excelled and earned competitive strengths.
    • The internal support teams continued to substantially outperform OEM support providers across seven of the nine categories in both segments.
  • 6. 2Q11 Corporate Service & Support Satisfaction at a Glance: In-house support is again cited as the ideal experience, over a muddled field of OEM support providers with a few specific strengths and challenges.

    [Scorecard: TBR Service Provider 2Q11 Scorecard, overall results. Each provider (Internal Support, Dell Services, IGS/Lenovo Services, HP Services) is rated as a strength, neutral, weakness or warning, with some determinations marginal, across nine categories: break/fix services, on-site technical expertise, on-site response time/commitment, telephone/helpdesk support, online support, remotely managed support, replacement parts availability, support services pricing/value and hardware installation/configuration. Numeric value: 16, 2, 1, -1. Weighted satisfaction score: 85.9, 82.8, 82.4, 81.5. Ranking: 1, 2, 2, 2. Ranking, OEM support providers only: 1, 1, 1. The overall results combine the server and desktop/notebook results into one, with sample sizes of 250 or more per group.]

    2Q11 Key Takeaways:
    • The internal support group defended its historical position as the model against which we measure OEM support providers.
    • Conservative budgeting of internal resources ensured the continued standing of self support as the best source for managing IT infrastructures.
    • The three OEM support providers shared the No. 2 ranking position, with their WSI ratings at a considerable distance from that of internal support.
      o Dell Services' position improved, gaining a new competitive strength for online support.
      o IGS' position strengthened as well, recovering from its previous warning for on-site response time.
      o The general strengthening of Dell's and IGS' scores was mitigated somewhat by Dell giving up its previous strength for on-site response and IGS receiving a new warning for online support.
      o HPS' performance improved modestly, yet a previous challenge, on-site technical expertise, returned.
  • 7. 2Q11 Corporate Service & Support Satisfaction at a Glance: IBM Support outshines competitors in x86 server-related support services for the sixth straight period due to broadly differentiated services.

    [Scorecard: TBR x86 Server Service Provider 2Q11 Scorecard (Internal Support, IBM Services, Dell Services, HP Services), rating each provider across the nine support categories. Numeric value: 14, 5, 0, 0. Weighted satisfaction score: 85.9, 83.0, 82.0, 81.9. Ranking: 1, 2, 3, 3. Ranking, OEM support providers only: 1, 2, 2. The server support results are based on the views of IT managers/directors that primarily support x86-based servers, with a sample size of 125 or more per group.]

    2Q11 Key Takeaways:
    • The internal support group held firm, substantially outperforming OEM support providers in all categories but parts availability and phone support.
    • Internal support gave up its previous (and typically repetitive) competitive advantage for phone support, as its 2Q11 score was no longer significantly higher than that of IBM.
    • IBM Support earned its sixth consecutive top ranking, driven by repeated competitive strengths across five key categories: break/fix services, on-site expertise, phone support, parts availability and support services value.
    • Dell Services and HP Services continued to share a ranking position subordinate to IBM, with neither exhibiting the differentiation perception that IBM has continued to own.
    • With a full slate of neutral performances, only one category exhibited significant differences between the scores of Dell and HP Services: hardware deployment.
  • 8. 2Q11 Corporate Service & Support Satisfaction at a Glance: Dell Services continues to exert its leadership status in desktop/notebook support.

    [Scorecard: TBR Desktop/Notebook Service Provider 2Q11 Scorecard (Internal Support, Dell Services, Lenovo Services, HP Services), rating each provider across the nine support categories. Numeric value: 14, 3, -1, -1. Weighted satisfaction score: 85.8, 83.7, 81.7, 81.1. Ranking: 1, 2, 3, 3. Ranking, OEM support providers only: 1, 2, 2. The desktop/notebook results are based on the views of IT managers/directors that primarily support desktop and laptop PCs, with a sample size of 125 or more per group.]

    2Q11 Key Takeaways:
    • The internal support group held firm, substantially outperforming OEM support providers in all but two categories: break/fix services and parts availability.
    • Internal support gave up its competitive advantage for break/fix services, an area in which the group has typically dominated the competition each reporting period.
    • Dell Services maintained its top ranking position as a result of continued performance differentiation through its on-site technical expertise rating. A new online support strength was added, at the cost of giving up the on-site response time strength that had been present in the previous two periods. Dell's substantially higher-than-average WSI rating was enhanced by marginal advantages over competitors across most of the remaining categories.
    • HP Services was reissued a previously rescinded competitive warning for on-site expertise.
    • Lenovo Services shared the No. 3 ranking position with HPS, cited with a new competitive warning for online support. Yet Lenovo Services was the most improved performer in this study segment, recovering from its previously issued warning for on-site response time.
  • 9. 2Q11 Corporate Service & Support Satisfaction at a Glance: Satisfaction with support services gently rises in 2Q11.

    [Chart: Percent changes in mean satisfaction positions, 2Q11 vs. 1Q11, by category (support services value, break/fix services, online support, overall satisfaction, on-site technical expertise, parts availability, phone support, on-site response time, hardware deployment, remotely managed support) for Dell Services, HP Services, IGS/Lenovo Services and internal support organizations. Shifts of 3% or greater indicate significant change between reporting periods. Source: TBR.]

    • Satisfaction with support services has increased at a modest rate during the past two reporting periods.
    • The overall satisfaction rating serves as a leading indicator. The more solid level of improvement within this metric suggests satisfaction levels across the individual categories could move more substantially in the near future.
    • TBR observed modest weakening on the part of the internal support teams, affecting perceptions of phone support, remotely managed support, hardware deployment and overall services value. These shifts, however, were inconsequential both in magnitude and in contrast to the improving overall satisfaction rating.
  • 10. 2Q11 Corporate Service & Support Satisfaction at a Glance: The score corrections of late 2010 appear to have been short-lived; satisfaction levels begin a slow recovery toward 3Q10 high points.

    [Chart: OEM support provider satisfaction, past four calendar quarters (Jul-Sep 10, Oct-Dec 10, Jan-Mar 11, Apr-Jun 11), mean ratings by category. The mean satisfaction ratings in the graph are based on the study results of discrete calendar quarters and not the "reporting periods" (comprising two calendar quarters) TBR generally reports. The graph shows average ratings across the three OEM support providers: Dell Services, HP Services and IGS/Lenovo Services. Source: TBR.]

    • Satisfaction with support services spiked in 3Q10, followed by a snap-back during the fourth quarter of last year.
    • The 3Q10 burst of enthusiasm (also evident in 2Q10) was driven by a combination of new product purchases with fresh warranties and resumed IT staff hiring, where enthusiasm with new hardware spilled over into perceptions of services.
    • The full correction occurred during 4Q10, when most satisfaction positions returned to their first-quarter levels.
    • The results from 1H11 show customer satisfaction beginning to slowly rebuild and, in most cases, establishing sustainable patterns indicative of more stable attitudes. Note the overall satisfaction rating is currently very close to its 3Q10 position.
  • 11. TBR 2Q11 Competitive Placement Summary & Insights
  • 12. Key Findings: Overall Study. OVERALL RESULTS: Internal support organizations continue their exemplary performances; Dell and IGS performances are enhanced by successes within various segments.

    [Chart: Service & support satisfaction means analysis, mean ratings by category for Internal Support Organizations, Dell Services, HP Services and IGS/Lenovo Services, with markers for competitive strengths and competitive weaknesses or warnings issued in 2Q11. Source: TBR.]

    Factors Driving Rankings:
    • Internal support's No. 1 ranking was driven by an inspiring set of performances, substantially outpacing the OEM support providers in all but one category (parts availability).
    • TBR noted insufficient performance differences across the three OEM support providers to assign separate ranking positions.
      o Dell Services outpaced its OEM competition in two areas, online support and hardware deployment, while failing to deliver on its previous advantage for on-site response time satisfaction (present in the two previous periods).
      o IGS also earned two strengths (break/fix, phone support), their effects mitigated somewhat by a warning for online support.
      o HPS was cited with one competitive warning, for on-site expertise.
  • 13. Key Findings: x86 Server Support. x86 SERVER SUPPORT RESULTS: In-house support remains No. 1; IBM outperforms Dell and HP Services for the fifth straight period.

    [Chart: TBR satisfaction with server support, 1Q09 to 2Q11, WSI trend lines for Dell Services, HP Services, IGS/IBM Services and internal support. Source: TBR.]

    The Context:
    • Customer satisfaction with x86-based server support services took a hit in 2009 as a result of the spending cuts caused by the Great Recession. WSI ratings progressively declined throughout the year, leaving no competitor (not even the in-house teams) immune to the trend.
    • By 1Q10, however, the customer satisfaction score slides had halted, and scores improved in IBM's case. In 2Q10, the real excitement started: customer satisfaction ratings surged across all groups, resulting in a split between No. 1-ranked internal support and IBM over No. 2-ranked Dell and HP Services.
    • In 3Q10, the internal support organizations resumed their place alone at the top, while IBM established a ranking position advantage over Dell and HPS.
    • In 4Q10 and into 1Q11, satisfaction scores corrected, returning to positions held prior to the ebullience of the previous several periods. Ranking positions remained constant.

    2Q11 Developments:
    • Satisfaction positions are beginning to slowly improve, suggesting some customers may have undertaken recent product refreshes, which ordinarily include fresh warranties.
    • IBM maintained its status as the top-ranked OEM support provider, while Dell and HP Services' scores remained interlocked. 2Q11 was the fifth consecutive reporting period in which the companies were so aligned. IBM's No. 1 rankings include a sixth period (1Q10), when TBR ranked all three OEM support providers at the No. 1 spot.
  • 14. Key Findings: x86 Server Support. x86 SERVER SUPPORT RESULTS: Performance differentiation examples remain plentiful, favoring internal support and IBM.

    [Chart: Mean customer satisfaction by support offering, servers/storage only, for Dell Services, HP Services, IGS (IBM) Services and internal support, with markers for competitive strengths and competitive weaknesses or warnings issued in 2Q11. Source: TBR.]

    Factors Driving Rankings:
    • Internal support's No. 1 ranking was the result of consistently outperforming OEM competitors across all categories but parts availability and (new to 2Q11) phone support.
    • IBM's sole No. 2 ranking was delivered through solid performances across break/fix, technical expertise, phone support, parts availability and support services value. These results entirely mirrored those of the previous reporting period.
    • While Dell and HP Services remained in a shared No. 3 ranking, it was not the result of specific warnings or weaknesses, as all were lifted in 1Q11 and remained in check into 2Q11. Dell and HP Services simply lacked the competitive strength of several of IBM's performances.

    IBM holds a consistent leadership position, as evidenced by its record of competitive strength wins:
    • Break/fix services: 6 of the past 7 reporting periods
    • On-site expertise: 5 straight wins
    • Phone support: 2 straight wins
    • Replacement parts availability: 2 straight wins
    • Support services value: 3 straight wins
  • 15. Key Findings: Desktop/Notebook Support. DESKTOP/NOTEBOOK SUPPORT RESULTS: In-house support remains No. 1; Dell Services reiterates its performance edge over Lenovo and HP Services.

    [Chart: Satisfaction with desktop/notebook support, 1Q09 to 2Q11, WSI trend lines for Dell Services, HP Services, IGS/Lenovo Services and internal support. Source: TBR.]

    The Context:
    • Customer satisfaction with desktop and notebook systems support began to decline as far back as mid-2008 but accelerated during the recession of 2009.
    • By 1Q10, customer satisfaction scores for all competitors either stabilized or improved. Dell Services' improvement was substantial enough to deliver a sole No. 1 ranking.
    • In 2Q10, ranking positions held steady, with Dell Services as the singular No. 1-ranked player, internal support and Lenovo Services sharing No. 2 and HPS ranked No. 3.
    • In the succeeding periods, Dell and Lenovo Services switched positions, with Lenovo taking No. 1 in 3Q10 and Dell regaining the lead in 4Q10. Dell kept its leadership position in 1Q11.

    2Q11 Developments:
    • Satisfaction scores began to show signs of recovery for Dell and Lenovo Services, while HPS' scores held fairly steady. In-house support scores, however, continued to weaken.
    • Dell earned its third consecutive win. Dell has placed at the top of the OEM provider rankings in five of the past six reporting periods.
    • Lenovo Services, while remaining in a ranking position subordinate to Dell, was the most improved performer.
    • HP Services remained in a shared ranking position with Lenovo.
  • 16. Key Findings: Desktop/Notebook Support. DESKTOP/NOTEBOOK SUPPORT RESULTS: Dell Services edges the competition with its online support and technical expertise ratings.

    [Chart: Mean customer satisfaction by support offering, desktops/notebooks only, for Dell Services, HP Services, Lenovo Services and internal support, with markers for competitive strengths and competitive weaknesses or warnings issued in 2Q11. Source: TBR.]

    Factors Driving Rankings:
    • Internal support's No. 1 ranking was the result of consistently outperforming OEM competitors in all categories but parts availability and (new in 2Q11) break/fix.
    • Dell Services' No. 2 ranking, behind the in-house group, was the result of two key performance differentiators, where Dell earned competitive strengths: on-site expertise and online support.
    • Dell Services has earned competitive strengths for on-site technical expertise for two consecutive reporting periods. However, its on-site response time rating did not quite hold up to past successes, with TBR lifting the competitive strength held during the previous two periods.
    • No. 3-ranked Lenovo Services and HPS were positioned below Dell Services across most categories (overall value excepted), particularly with respect to two key challenges: online support for Lenovo Services and on-site expertise for HP Services.
  • 17. The Score in 2Q11: Internal organizations validate themselves as the ideal support experience; performance differentiation across the OEM support providers is diminished.

    The overriding trend in 2Q11 was gently rising satisfaction following two previous periods of considerable score weakening.

    [Chart: 2Q11 versus 1Q11 weighted satisfaction ratings and ranks. 1Q11: Internal Support Organizations 86.3, Dell Services 82.2, IGS/Lenovo Services 81.8, HP Services 81.2. 2Q11: Internal Support Organizations 85.9, Dell Services 82.8, IGS/Lenovo Services 82.4, HP Services 81.5. Source: TBR.]

    • Dell Services' WSI exhibited a 0.8% improvement. Improving scores were led by online support (+1.6%); there were no weakening positions.
    • HPS' WSI increased by 0.5%. Satisfaction improvement was led by on-site support response time (+1.7%); there were no weakening positions.
    • IGS' WSI advanced by 0.8%. On-site response time (+2%) led the list of improving positions; there were no examples of score declines.
    • Internal support's WSI dipped by 0.5%. There were no examples of significantly shifting support scores.

    2Q11 satisfaction score improvement was driven by:
    • Server support satisfaction scores increased, particularly within the Dell and HP groups, and particularly relative to on-site response time.
    • Desktop/notebook support satisfaction scores were less energized and, in some cases, weakened. Dell and HP scores were basically constant, while those of Lenovo improved. Internal support scores receded.
  • 18. The Score in 2Q11: TBR's Competitive Strength & Weakness determinations reinforce the 2Q11 ranking position placement decisions. These determinations are based on two-pronged results: statistical significance tests (three tests) and GAP analysis (two tests).

    [Table: TBR 2Q11 Service Provider Strengths and Weaknesses Summary, with changes in performance differentiation since 1Q11 noted per category. Constant: break/fix services, telephone/helpdesk support, remotely managed support, replacement parts availability, support services pricing/value, hardware installation/configuration. Expanding: on-site technical expertise, online support. Contracting: on-site response time/commitment. Yellow boxes indicate areas where strength/weakness determinations were downgraded from the previous reporting period; blue boxes indicate determinations that mark an upgrade. Numeric value: Internal Support 16, Dell Services 2, IGS/Lenovo Services 1, HP Services -1. Weighted satisfaction score: 85.9, 82.8, 82.4, 81.5. Ranking: 1, 2, 2, 2. Adjusted ranking (third-party providers only): 1, 1, 1. Source: TBR.]

    The principal developments in 2Q11 involved a merging of customer perceptions around on-site response time and the emergence of online support as a new performance differentiator.

    • The singular No. 1 ranking position held by the internal support group was enhanced by its receiving competitive strengths in eight of the nine categories.
    • In addition to the close proximity of their WSI scores, the three OEM support providers shared the No. 2 ranking in 2Q11 as a result of a mixture of developments, all modest in terms of overall effects.
    • Dell Services earned two marginal strengths, yet within relatively low-priority areas (online support and hardware deployment). Dell was unable to carry forward a previous win in a higher-priority category: on-site response time.
    • IGS earned two marginal strengths, one of which was in a top category: break/fix services. These strengths were offset by a new warning for online support.
    • HP was cited with one warning, for technical expertise.
    • In the end, the weights, calculated against the satisfaction scores, delivered three WSI ratings that, again, were too close for TBR to separate the rankings.
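
    TBR does not publish the exact arithmetic behind its weighted satisfaction index or its strength/weakness tests, so the sketch below is only an illustration of the kind of calculation described above: category means are folded through importance weights into a 0-100-style index, and a category is flagged as a candidate strength or warning when a provider's ratings differ significantly from the pooled competitor ratings (a plain two-sample t-test standing in for whatever tests TBR actually applies). All weights, sample ratings and function names are hypothetical.

```python
# Illustrative sketch only: TBR's real WSI weighting and its three significance
# tests / two GAP tests are not disclosed. Weights and ratings here are made up.
import numpy as np
from scipy import stats

CATEGORIES = [
    "break_fix", "onsite_expertise", "onsite_response", "phone_support",
    "online_support", "remote_managed", "parts_availability",
    "services_value", "hardware_deployment",
]

# Hypothetical importance weights (summing to 1.0); high-priority categories weigh more.
WEIGHTS = np.array([0.16, 0.14, 0.14, 0.12, 0.07, 0.07, 0.12, 0.12, 0.06])


def weighted_satisfaction_index(category_means: np.ndarray) -> float:
    """Fold 1-7 category means into a 0-100-style index via importance weights."""
    return float(np.dot(category_means, WEIGHTS) / 7.0 * 100.0)


def flag_category(provider: np.ndarray, competitors: np.ndarray,
                  alpha: float = 0.05) -> str:
    """Candidate 'strength' or 'warning' when the provider's ratings for a
    category differ significantly from the pooled competitor ratings."""
    t_stat, p_value = stats.ttest_ind(provider, competitors, equal_var=False)
    if p_value >= alpha:
        return "neutral"
    return "strength" if t_stat > 0 else "warning"


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake 1-7 survey ratings: one provider vs. pooled competitors for one category,
    # with sample sizes of 125 or more per group, as in the study.
    provider_ratings = rng.integers(5, 8, size=125).astype(float)
    competitor_ratings = rng.integers(4, 8, size=375).astype(float)
    print(flag_category(provider_ratings, competitor_ratings))
    print(round(weighted_satisfaction_index(np.full(9, 6.0)), 1))  # 85.7
```

    With every category mean at 6.0 on the 7-point scale, the toy index lands in the mid-80s, roughly the range of the WSI scores reported in this study; the actual placement of each provider, of course, depends on weights and tests TBR has not published.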
  • 19. Most Noteworthy Events - Performance Differentiation Shifts: On-site response time drops off the list of experiences separating OEM support providers.

    In 1Q11, Dell Services' on-site support response time satisfaction rating was significantly higher than those of HPS and IGS, leading TBR to assign Dell its second consecutive strength. IGS' score was significantly lower than average, resulting in a competitive warning. As expected, the in-house support group's rating was in a range all its own.

    By 2Q11, scores for HPS and IGS nudged up by a two-to-one greater factor than that of Dell, resulting in scores that could not be differentiated via statistical significance test. The in-house group continued to dominate the category.

    [Charts: Satisfaction with on-site response time by ratings category (<5, 5, 6, 7), 1Q11 and 2Q11, for Dell Services, HPS, IGS/Lenovo Services and in-house support. Source: TBR.]

    Between 1Q11 and 2Q11, IGS and HPS traded in many of their previously level-5 scores for level-6 scores. This largely evened the score against Dell. It should be noted, however, that Dell Services continued to bring in the greatest number of Perfect 7 ratings and fewer level 5s. The difference from competitors, however, was just not enough for Dell to earn the competitive strength for the third straight period; scores were too spread out for the significance test to turn positive. These results were driven by developments on the desktop/notebook support side, where Lenovo's mean score increased by nearly 3% against more static competitors' ratings. TBR lifted the competitive warning against Lenovo in that study segment, along with Dell's previous competitive strength.
  • 20. Most Noteworthy Events - Performance Differentiation Shifts: On-site technical expertise flips to differentiate the OEM support providers.

    In 1Q11, scores declined from the previous period at varying magnitudes, essentially evening the score across the three OEM providers. Perceptions of on-site expertise merged. In 2Q11, scores shifted again, enough to force a return of the category as a performance differentiator. Dell Services' and IGS' scores increased modestly, while that of HPS was flat. HPS landed in a place that was significantly lower than average, and TBR reissued a competitive warning.

    [Charts: Satisfaction with on-site technical expertise by ratings category (<5, 5, 6, 7), 1Q11 and 2Q11, for Dell Services, HPS, IGS/Lenovo Services and in-house support. Source: TBR.]

    HPS' distribution curve shifted slightly to the left against its competitors in 2Q11. What is not clearly shown here is that the shift was driven entirely by the desktop/notebook satisfaction results, where HPS' mean score declined by nearly 2% against more stable competitors. TBR cited HPS with a competitive warning for on-site technical expertise in the desktop/notebook segment.

    *These overall results may be misleading: IBM led the competition in the server support segment for technical expertise satisfaction; Dell Services led in the desktop/notebook segment.
  • 21. Most Noteworthy Events - Performance Differentiation Shifts: Online support re-emerges as a performance differentiator, favoring Dell over Lenovo Services.

    In 1Q11, TBR observed no significant differences across the online support satisfaction scores for Dell, HPS or IGS. The in-house group continued to dominate the category on its own. In 2Q11, Dell's score increased against flat competitors' ratings. This resulted in a placement where Dell's mean score was significantly higher than average, spurring TBR to award Dell a competitive strength. As the developments occurred within the desktop/notebook study segment, TBR awarded Dell the strength only for desktop/notebook support, against a new competitive warning for Lenovo, in that its score remained static against improving Dell and HPS ratings. It should be duly noted, however, that Dell's rating in the server support segment was also higher than average, just not significantly so.

    [Charts: Satisfaction with online support by ratings category (<5, 5, 6, 7), 1Q11 and 2Q11, for Dell Services, HPS, IGS/Lenovo Services and in-house support. Source: TBR.]

    Dell Services earned the competitive strength as a result of earning more Perfect 7 ratings and fewer disappointed ratings than competitors. The difference, however, is more evident within the desktop/notebook study segment. Lenovo Services' warning was the result of a distribution curve heavily weighted toward the fifth level of the scale and clearly lacking in Perfect 7 ratings.
  • 22. Server Support - Detailed Segment Analysis: TBR's Competitive Strength and Weakness determinations enhance the 2Q11 server support ranking position placement decisions. These determinations are based on two-pronged results: statistical significance tests (three tests) and GAP analysis (two tests).

    [Scorecard: TBR x86 Server Service Provider 2Q11 Scorecard (Internal Support, IBM Services, Dell Services, HP Services). Numeric value: 14, 5, 0, 0. Weighted satisfaction score: 85.9, 83.0, 82.0, 81.9. Ranking: 1, 2, 3, 3. Ranking, OEM support providers only: 1, 2, 2.]

    • The internal support group's continued No. 1 ranking was substantiated by its continued earning of strengths in all but two categories: parts availability (standard) and phone support (new).
    • IBM repeated its No. 2 ranking behind the in-house group and ahead of its OEM support provider competition. This was enhanced by five continuing competitive strengths.
    • Dell Services' No. 3 ranking behind IBM Support was the result of failing to narrow performance gaps across the five categories in which IBM is differentiated in the minds of its customers.
    • HPS' situation was similar to Dell Services', allowing IBM to take a total of five competitive strength categories.

    What Changed in 2Q11: Only one change occurred between 1Q11 and 2Q11; the general weakening of the in-house group resulted in a failure to repeat its competitive strength advantage for phone support. This is an unusual development, as the group generally dominates this category. Its mean score weakened on the desktop/notebook support side as well and might be indicative of internal stresses around resource availability for handling internal helpdesk calls as a result of recent cutbacks.
  • 23. Server Support - Detailed Segment Analysis: IBM earns a solid No. 1 ranking over Dell and HP Services due to the contributions of several key competitive advantages.

    [Chart: Mean customer satisfaction by support offering, servers/storage only, for Dell Services, HP Services and IGS (IBM) Services. Source: TBR.]

    • Server support customers attribute relatively high importance to most categories, the exceptions being remotely managed and online support as well as hardware deployment services.
    • IBM Support established substantial performance advantages over competitors across five categories. As high-importance areas, each of these categories carries significant weight toward the WSI score.
    • Across most highly weighted categories, the performances of Dell and HP Services were comparable, yet HP held a narrow advantage over Dell for support services value.

    2Q11 server support satisfaction & rankings (OEM support providers): IBM Services 83.0, rank 1; Dell Services 82.0, rank 2; HP Services 81.9, rank 2.

    TBR splits responses based on respondents' primary responsibilities. Each study participant is asked to identify the support area with which they are most involved (servers/storage or desktop/notebook) and is then asked to rate those experiences exclusively. For details on server/storage versus desktop/notebook support by support provider, please refer to Appendix G.
  • 24. Server Support - Detailed Segment Analysis: In 2Q11, server support satisfaction ratings began to rebound.

    [Chart: 1Q11 to 2Q11 satisfaction shifts by category, server/storage support, for Dell Services, HP Services and IGS (IBM) Services. Source: TBR.]

    • Satisfaction with server support services increased in 2Q11, yet Dell Services' score for online support was the only example of an improvement of significant magnitude.
    • Overall, Dell and HP had the most to gain in 2Q11, with their scores on average improving at higher magnitudes than those of IBM and subsequently narrowing IBM's leadership margin.
    • Regardless of the momentum evidenced within the ratings of Dell and HPS, IBM held firm its competitive advantages across the five categories where competitive strengths were awarded in the last reporting period as well.
    • Dell Services' ratings improved across most of the high-importance categories, parts availability and support services value excepted. Note that Dell's online support rating improved significantly while competitors' scores gently receded, offering Dell a potential opportunity for dominance in this category.
    • HPS' scores improved most notably in the on-site support categories as well as in parts availability.
    • IBM's momentum was muted against competitors' gains, with the exception of phone support. Regardless, as mentioned above, IBM gave up nothing in terms of dominance within the five key areas that defined its continued No. 1 ranking.

    Levels of improvement in server satisfaction, 1Q11 to 2Q11 (% change, WSI score): Dell Services +1.3%, HP Services +1.1%, IBM Services +0.7%.
  • 25. Server Support - Detailed Segment Analysis: GAP scores point to some specific challenges for OEM support providers.

    [Chart: Standard GAP scores, server/storage support, by category (break/fix services, on-site response time, on-site expertise, phone support, online support, remotely managed support, hardware deployment, parts availability, support services value) for IGS (IBM), HP Services and Dell Services, with the unacceptable GAP range marked. Source: TBR.]

    While most GAP positions have shifted to within the acceptable range for the score (up to -5%), examples remain where there is ample room for improvement. These examples include online support for IBM, on-site response time for IBM and HPS, and phone support and parts availability for Dell Services.
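
    The deck does not spell out how a standard GAP score is computed. For illustration only, the sketch below treats it as the percentage by which mean satisfaction falls short of (or exceeds) mean customer expectation for a category, with anything below the -5% line from the slide flagged as falling in the unacceptable range. The class name, field names and sample numbers are hypothetical.

```python
# Illustrative sketch: the study does not publish its GAP formula. One common
# form compares mean satisfaction with mean expectation per category; values
# below the slide's -5% line are treated here as the "unacceptable" range.
from dataclasses import dataclass

ACCEPTABLE_FLOOR = -0.05  # -5%, per the slide's acceptable range


@dataclass
class CategoryScores:
    name: str
    satisfaction: float  # mean 1-7 satisfaction rating
    expectation: float   # mean 1-7 expectation rating


def standard_gap(scores: CategoryScores) -> float:
    """Fraction by which satisfaction exceeds (positive) or trails (negative) expectation."""
    return (scores.satisfaction - scores.expectation) / scores.expectation


def flag_gaps(categories: list[CategoryScores]) -> list[str]:
    """Return the category names whose gap falls below the acceptable floor."""
    return [c.name for c in categories if standard_gap(c) < ACCEPTABLE_FLOOR]


if __name__ == "__main__":
    # Hypothetical numbers for one provider; not TBR data.
    sample = [
        CategoryScores("online_support", 5.4, 5.9),
        CategoryScores("on_site_response_time", 5.8, 6.0),
        CategoryScores("break_fix", 6.1, 6.0),
    ]
    for c in sample:
        print(f"{c.name}: {standard_gap(c):+.1%}")
    print("below acceptable floor:", flag_gaps(sample))
```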
  • 26. Desktop/Notebook Support - Detailed Segment Analysis: TBR's Competitive Strength and Weakness determinations enhance the 2Q11 desktop/notebook support ranking position placement decisions. These determinations are based on two-pronged results: statistical significance tests (three tests) and GAP analysis (two tests).

    [Scorecard: TBR Desktop/Notebook Service Provider 2Q11 Scorecard (Internal Support, Dell Services, Lenovo Services, HP Services). Numeric value: 14, 3, -1, -1. Weighted satisfaction score: 85.8, 83.7, 81.7, 81.1. Ranking: 1, 2, 3, 3. Ranking, OEM support providers only: 1, 2, 2.]

    • The internal support group's No. 1 ranking was substantially enhanced by its continuing domination across all but the parts availability (standard) and break/fix (new) categories.
    • Dell Services maintained its No. 2 ranking for the third reporting period by carrying over its on-site expertise competitive strength and adding a new one for online support. Note Dell's on-site expertise strength (a relatively high-importance category) was a full-value strength, while its online support win was marginal. Dell did not carry over its previous strength for on-site response time.
    • Lenovo Services remained in the No. 3 ranking behind Dell, the decision enhanced by its competitive warning in the online support category, where Dell currently dominates. This new warning was offset by a recovery from Lenovo's previous warning for on-site support response time.
    • HPS remained in a shared No. 3 ranking with Lenovo, also cited with one competitive warning: a newly issued warning for on-site technical expertise.

    What Changed in 2Q11:
    • The internal support teams failed to continue to dominate the break/fix services category, primarily as a result of improvements on the part of both Dell and Lenovo.
    • Dell Services gave up its strength for on-site response time due to significantly improving competitors. Its online support strength was gained as a result of steady improvement over the past several reporting periods.
    • Lenovo Services recovered from its previous response time warning through a nearly 3% improvement in its mean rating. Its online support warning came about due to a static rating against improving competitors.
    • HPS' on-site technical support warning was the result of a weakened rating against static or improving competitors.
  • 27. Desktop/Notebook Support - Detailed Segment Analysis: Dell Services' No. 1 ranking is driven by on-site expertise and online support advantages.

    [Chart: Mean customer satisfaction by support offering, desktops/notebooks only, for Dell Services, HP Services and Lenovo Services. Source: TBR.]

    • Dell Services' win was primarily the result of its substantial performance advantages in the relatively high-importance area of on-site technical expertise, where a full competitive strength was awarded. This was supplemented by a marginal strength within a lower-importance area: online support.
    • The fact that Dell's wins occurred within areas where competitors were challenged accentuated the power of its No. 1 ranking.

    2Q11 desktop/notebook support satisfaction & rankings (OEM support providers): Dell Services 83.7, rank 1; Lenovo Services 81.7, rank 2; HP Services 81.1, rank 2.

    TBR splits responses based on respondents' primary responsibilities. Each study participant is asked to identify the support area with which they are most involved (servers/storage or desktop/notebook) and is then asked to rate those experiences exclusively. For details on server/storage versus desktop/notebook support by support provider, please refer to Appendix G.
  • 28. Desktop/Notebook Support - Detailed Segment Analysis: Desktop/notebook support satisfaction ratings shift within a narrow range of modestly changed positions.

    [Chart: 1Q11 to 2Q11 satisfaction shifts by category, desktop/notebook support, for Dell Services, HP Services and Lenovo Services. Source: TBR.]

    • WSI positions were essentially constant for Dell and HP Services, while Lenovo Services made a modest level of improvement.
    • Regardless of Lenovo's improvement, the magnitude of Dell Services' lead was only modestly reduced.
    • In the area of on-site technical expertise, HPS' score receded against comparatively flat competitors. This resulted in HPS gaining a new competitive warning, while Dell held its competitive strength for a second consecutive win.
    • Lenovo's on-site response time rating improved by nearly 3% against a flat Dell rating. Consequently, Dell lost hold of its competitive strengths of the previous two periods, while Lenovo recovered from its warnings of the same time periods.

    Levels of improvement in desktop/notebook satisfaction, 1Q11 to 2Q11 (% change, WSI score): Dell Services +0.2%, HP Services -0.2%, Lenovo Services +1.0%.
  • 29. Desktop/Notebook Support - Detailed Segment Analysis: Some borderline GAP positions remain in place, primarily affecting HPS' performances.

    [Chart: Standard GAP scores, desktop/notebook support, by category for IGS (Lenovo), HP Services and Dell Services, with the unacceptable GAP range marked. Source: TBR.]

    While most GAP positions have shifted to within the acceptable range for the score (up to -5%), a few borderline GAP positions remained in place in 2Q11. HPS shows additional room for better meeting customer expectations in the phone support and on-site expertise categories as well as for replacement parts availability. Lenovo needs to consider its on-site response time performance; while TBR lifted its previous competitive warning, customers continue to express high expectations.
  • 30. Critical Metrics Summary. Vital Statistics: 2Q11 Technology Services Satisfaction Competition.

    Dell Services: 2Q11 ranking 2 (No. 1 among OEM support providers); rank change vs. 1Q11: 0; 2Q11 WSI 82.8 (+0.8% vs. 1Q11). Rationale for ranking position: WSI placement and proximity to OEM competitors. Competitive strengths: online support (new; marginal) and hardware deployment (continuing; marginal); on-site support response strength rescinded. Competitive weaknesses/warnings: none. Significant movement (3% or greater shifts), 2Q11 vs. 1Q11: no significant movement; online support advanced by 1.6%.

    IGS/Lenovo Services: 2Q11 ranking 2 (No. 1 among OEM support providers); rank change vs. 1Q11: 0; 2Q11 WSI 82.4 (+0.8%). Rationale: WSI placement and proximity to OEM competitors. Competitive strengths: break/fix services (continuing; marginal) and phone support (continuing; marginal). Competitive weaknesses/warnings: online support (new; warning); recovered from the on-site support response time warning. Significant movement: no significant movement; on-site response time improved by 2%.

    HP/PSG Services: 2Q11 ranking 2 (No. 1 among OEM support providers); rank change vs. 1Q11: 0; 2Q11 WSI 81.5 (+0.5%). Rationale: WSI placement and proximity to OEM competitors. Competitive strengths: none. Competitive weaknesses/warnings: on-site technical expertise (new; warning). Significant movement: no significant movement; on-site response time advanced by 1.7%.

    Internal Support: 2Q11 ranking 1; rank change vs. 1Q11: 0; 2Q11 WSI 85.9 (-0.5%). Rationale: WSI placement and proximity to OEM competitors; multiple competitive strengths. Competitive strengths: all categories except parts availability (continuing), all full competitive strengths, consistent with 1Q11 results. Competitive weaknesses/warnings: none. Significant movement: no significant movement.
  • 31. Critical Metrics Summary. Vital Statistics: 2Q11 Technology Services Satisfaction Competition (continued).

    Dell Services: server support WSI & ranking 82.0, No. 3; desktop/notebook support WSI & ranking 83.7, No. 2. Server support competitive profile: neutral across the board. Desktop/notebook competitive profile: two strengths, on-site technical expertise (continuing; full) and online support (new; marginal); on-site response time strength rescinded. Significant movement, server segment, 2Q11 vs. 1Q11: [WSI +1.3%] no significant movement; online support +2.5%, on-site response time +2.1%. Significant movement, desktop/notebook segment: [WSI +0.2%] no major changes.

    IGS/Lenovo Services: server support WSI & ranking 83.0, No. 2; desktop/notebook support WSI & ranking 81.7, No. 3. Server support competitive profile: five strengths, break/fix, on-site expertise, phone support, parts availability and support services value. Desktop/notebook competitive profile: one warning, online support (new); on-site response time warning rescinded. Significant movement, server segment: [WSI +0.7%] no significant movement; phone support +1.6%. Significant movement, desktop/notebook segment: [WSI +1%] on-site response time +2.9%.

    HP/PSG Services: server support WSI & ranking 81.9, No. 3; desktop/notebook support WSI & ranking 81.1, No. 3. Server support competitive profile: neutral across the board. Desktop/notebook competitive profile: one warning, on-site technical expertise (new). Significant movement, server segment: [WSI +1.1%] no significant movement; on-site response time +2.1%. Significant movement, desktop/notebook segment: [WSI -0.2%] no significant movement; remotely managed support -1.9%, on-site expertise -1.7%.

    Internal Support: server support WSI & ranking 85.9, No. 1; desktop/notebook support WSI & ranking 85.8, No. 1. Server support competitive profile: strengths across all categories except parts availability (continuing) and phone support (new). Desktop/notebook competitive profile: strengths across all categories except parts availability (continuing) and break/fix services (new). Significant movement, server segment: [WSI +0.4%] no significant movement; on-site expertise +2%. Significant movement, desktop/notebook segment: [WSI -1.6%] support services value -3.2%, remotely managed support -3%.
  • 32. Critical Metrics Summary. Vital Statistics: 2Q11 Technology Services Satisfaction Competition (summary and bottom line).

    Summary, Dell Services: The revolving pattern of Dell and Lenovo Services trading wins in the desktop/notebook support segment has been broken. Dell secured its third straight (and fifth of the past six reporting periods) No. 1 ranking. In 2Q11, Dell differentiated itself through its competitive advantages for on-site expertise and online support specifically, yet its scores trended higher than competitors' across all but the support services value category. Dell customers also continued to attribute the highest levels of satisfaction to parts replacement handled by Dell on-site technicians. In the server support segment, Dell continued to be out-differentiated by IBM for the fifth consecutive reporting period.

    Summary, IGS/Lenovo Services: IBM Support pulled in its sixth straight No. 1 ranking for server support satisfaction, and its fifth straight singular (non-shared) top ranking. Among the records IBM holds are competitive advantages for break/fix support satisfaction in six of the past seven periods, five straight wins for on-site technical expertise, and three straight wins for support services value satisfaction. Neither of its competitors has been able to differentiate its performance through competitive strength wins over these past five reporting periods. Lenovo Services continues to be outpaced by Dell in desktop/notebook satisfaction despite being the most improved competitor in the 2Q11 reporting period. Currently, Lenovo is challenged in meeting customer expectations for online support and was outpaced by Dell in the on-site technical expertise category.

    Summary, HP/PSG Services: HP Services has established far fewer No. 1 rankings than competitors in either study segment (server support, desktop/notebook support). Its last top ranking in the server support segment was a three-way tie with IBM and Dell in 1Q10. While HPS' server support satisfaction scores improved to a greater extent than competitors' in the categories of parts availability and on-site response time, additional momentum will be required to break IBM's spell of six consecutive wins in the segment. HPS won a string of three No. 1 rankings in the desktop/notebook segment from 4Q08 through 2Q09. Subsequently, however, Lenovo and now Dell have been more likely to secure top rankings. In the 2Q11 competition, HPS' scores in the desktop/notebook support segment were stagnant amid a renewed challenge in the area of on-site technical expertise.

    Summary, Internal Support: The internal support group remained in the position to which it was ascribed at the start of TBR's study design more than a decade ago: the ideal against which we measure the OEM support providers. The group carried competitive strengths in all categories except parts availability in the 2Q11 overall results. TBR did observe some weakening of performances, however, that resulted in the group giving up its competitive strength standing for phone support in the server segment and for break/fix services in the desktop/notebook segment. It is possible internal support organizations may be feeling a bit of the pinch of overextending existing resources in the face of budget cutbacks.

    Bottom Line: During the past year, TBR has observed many patterns in these study results, beginning with the exuberance of large corporate refreshes with fresh systems warranties that introduce minimal fuss in the support department. This was followed by the natural order of time, where some systems develop issues that need to be dealt with either internally, through OEM support contacts, or both. The results of these changes were the extreme ups and downs TBR observed in the satisfaction numbers. The data appears to be on a return course to business as usual, with 1H11 satisfaction scores gently rising, leaving 4Q10 as the end of the satisfaction score corrections. The rebound appears to be led by server support experiences, where satisfaction levels generally improved in 2Q11, while desktop/notebook support scores stalled.
  • 33. TBR’s Watch List TBRTBR’s Watch List differs from the Competitive Strength andWeakness AnalysisDifferences:• The analysis looks backward and forward.• Items placed on the Watch List are often not areas where the vendor has underperformed the marketplace or a specific competitor.• Included are areas in which a vendor may have recently excelled; however, the competitive field has shifted during the current reporting period.TBR takes the following factors into consideration in determining items on the Watch List:• Results of the Improvements GAP Analysis are based on a vendor’s expectation fulfillment for a category against its overall expectation fulfillment across all measured attributes.• Competitive positioning based on results of statistical significance tests• Results of the Standard GAP Analysis for the vendor against its competitors’ positions• Decline in satisfaction in the past two reporting periods• Segments (server support versus desktop/notebook support) influencing declines in satisfaction during past two reporting periods• Loss of competitive strength or addition of competitive weakness• Disappointment/Delight meter – proportions of dissatisfied versus delighted customers• Items are removed from the Watch List when a vendor has recovered its competitive position from past, recent reporting periods. 33 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 34. TBR’s Watch List: Dell Services TBRDell Services faces challenges in meeting customer expectations forserver support; divided customer perceptions need to be addressed Segments Strength/ Disappointment/ Improvements % Change Long-termCitation Placement Affected, Weakness Notes GAP versus 1Q11 Trends Delight Meter 2Q11 StatusPhone Significantly Below Average Up by 1.5%, Trailing IBM Server Remaining Disappointment Dell’s phone supportSupport below IBM at 95% comparable by Support neutral steady at 8% and satisfaction scores confidence in to substantial while IBM remains worst in continue to exhibit server segment competitors’ margins for retains class, vs. best-in- volatile patterns and average in the past four competitive class performance wide opinion spread, server reporting strength for (20%) for with an unacceptably segment periods second customer delight high number of straight disappointed scores. period Today, the issue remains largely on the server support side, where IBM continues to defend its exceptional record.Support Significantly lower Just Below Up by a Unable to Server Remaining Disappointment An issue of divergingServices than IBM at 95% Average modest 0.7%, recover from Support neutral steady at 8% and views among DellValue confidence in comparable substantial while IBM remaining worst in customers sampled server segment; to drop in successfully class; customer again; Dell is most also trending competitors’ 4Q10; defends delight steady at challenged in the lower than HP average in trailing IBM competitive 22% and best in server segment, server server significantly strength for class where both segment for the past the third competitors fared four periods straight better. Dell has not period earned a competitive strength for services value since mid-2009. 34 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 35. TBR’s Watch List: Dell Services TBRDell Services faces challenges in meeting customer expectations forserver support; divided customer perceptions must be addressed (cont.) Segments Strength/ Disappointment/ Improvements % Change Long-termCitation Placement Affected, Weakness Notes GAP versus 1Q11 Trends Delight Meter 2Q11 StatusBreak/Fix Significantly Above Average Up by 1.4% Beginning to Server IBM Disappointment IBM continues toServices below IBM and recover from Support continues to steady at 6% and dominate the in server comparable substantial win worst in class; break/fix segment, to weakening in competitive customer delight, at satisfaction 95% competitors’ previous two strength, in 27%, up slightly and category, bringing confidence average in periods; six of the past remaining best in in its sixth server consistently seven periods class competitive segment trailing IBM strength during the since mid-2009 past seven periods. Dell customer opinions remain very divided, suggesting variability of experience, perhaps drawing a dividing line between server and desktop/notebook customers, and possibly between premium-level and basic-support contract holders. Additional Observations: Dell Services’ on-site response time satisfaction ratings continued to trend higher than average, but divided opinions have affected this metric as well, preventing Dell from continuing to significantly outperform competitors in 2Q11. Dell brought in the worst-in-class performance for customer disappointment, against the strongest record for customer delight. TBR subsequently lifted Dell’s competitive strength standing that had existed during the previous two periods. Meanwhile, competitors have recovered from their various warnings in this category during the past several reporting periods.35 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 36. TBR’s Watch List: HPS TBRHP challenges remain due to competitive pressures in both study segments % Change Segments Strength/ Disappointment/ Improvements Long-termCitation Placement versus Affected, Weakness Notes GAP Trends Delight Meter 1Q11 2Q11 StatusOn-site Significantly Average Up by 1.5% No signs of Both Warning reissued Disappointment, at HPS faces toughTechnical lower than IBM in server recovery; segments in desktop/ 5%, was worst in competition in thisExpertise in server segment, trailing notebook class; customer category from both segment; comparable competitors segment delight, at 19%, sides – Dell earns significantly to by wide following short within range of the strengths in the lower than competitors’ margins for reprieve, while competitors desktop/notebook competitors’ average; the past Dell earns its segment and IBM in average in down by four periods second straight the server segment. desktop/ 1.7% in competitive HPS subsequently notebook desktop/ strength; suffers from a lack segment notebook remaining of perceived segment, vs. neutral in server differentiation. competitors’ segment yet IBM average +1% continues to dominate the categoryBreak/Fix Significantly Above Average Up by 1.6% Beginning Both Remaining Disappointment HPS has improvedServices lower than IBM and to recover segments neutral yet IBM steady at 4%; its positioning in the in server comparable from earns customer delight up server segment, yet segment; to substantial competitive from 15% in 1Q11 clearly lacks the trending lower competitors’ weakening strengths in six of to 22% in 2Q11 sustainable than average in of previous the past seven differentiation competitors’ server two periods periods in the established by IBM average in segment; no but server segment in the server desktop/ change in consistently segment. notebook desktop/ trailing both Competitors’ scores segment notebook competitors improved in the segment for the past desktop/notebook against four periods segment while those competitors’ of HPS remained average flat. +1.5%36 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 37. TBR’s Watch List: HPS TBRHP challenges remain due to competitive pressuresin both study segments(cont.) Segments Strength/ Disappointment/ Improvements % Change Long-termCitation Placement Affected, Weakness Notes GAP versus 1Q11 Trends Delight Meter 2Q11 StatusOnline Significantly Below Average No change in Remaining flat Both Last Disappointment Dell Services’ scoreSupport lower than server during the past Segments competitive worst in class at was significantly Dell in both segment, while three periods warning 9% above average and study Dell improves and dropping issued in HPS held an segments by 2.5%; up by significantly 1Q10; inordinately high just 0.6% in below Dell in remaining number of desktop/ 2Q11 as Dell neutral disappointed scores. notebook has improved while Dell Lenovo Services is segment, during the past picks up the similarly challenged. comparable to two periods competitive that of Dell strength in the desktop/ notebook segment Additional Observations: TBR removed on-site support response time from HPS’ Watch List in 2Q11 due to the competitive field converging in the server support segment. In the desktop/notebook segment, HPS scored between Dell Services and Lenovo Services, trending only slightly (but not significantly) lower than Dell. HPS saw a 50% increase in customer delight and a 50% reduction in customer disappointment for the category between 1Q11 and 2Q11.37 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 38. TBR’s Watch List: IGS TBRIBM support continues to exhibit few vulnerabilities; Lenovo Servicesmust focus on regaining past competitive advantages Segments Strength/ Disappointment/ Improvements % Change Long-term Citation Placement Affected, Weakness Notes GAP versus 4Q10 Trends Delight Meter 1Q11 Status On-site Marginally Well Below Up by 2.9% in Trailing Dell Desktop/ 1Q11 Disappointment With Lenovo Services’ Response below Average desktop/ for past Notebook competitive held in check at score improving Time competitors’ notebook three warning 3%; customer against a flat Dell average (90% segment but reporting lifted due to delight worst in Services, Dell lost its confidence) not enough to periods, HP improvement class (13%) vs. competitive strength and close the for past two; in 2Q11 competition standing of the significantly performance had led the previous two below Dell gap against field briefly reporting periods and (95%) in Dell in 3Q10 Lenovo recovered desktop/ from its warnings of notebook the past two. Yet, the segment category remains on the Watch List, as Lenovo continues to trail Dell, setting up a seemingly unbreakable pattern. NEW – Significantly Well Below Down gently, Has been Desktop/ Competitive Disappointment in IGS earned a paltry Online lower than Average -0.5% in server down for the Notebook warning check at 5%; number of perfect 7s, Support Dell in server segment past three issued in customer delight, half those earned by segment; against Dell’s periods and 2Q11 while at 8%, worst in competitors. IGS had substantially +2.5% trailing Dell earns class vs. earned three lower than improvement; competitors’ the strength competitors’ consecutive competitors’ no change in averages for in desktop/ average 16% competitive strengths average and desktop/ the past two notebook in this category, from Dell in notebook segment 1Q10 through 3Q10. desktop/ segment vs. Perceptions have notebook competitors’ shifted considerably segment average +0.6% since then. 38 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 39. Historical Record TBR
Dell Services holds the record for wins since the study's inception, though IGS holds the record for wins in the past three years
• Since the study's inception in 4Q00, Dell Services has been ranked as a No. 1 support provider for 34 of 44 reporting periods.
• Dell Services' No. 1 ranking in 2Q08 was its first since 4Q07 and did not carry over into 2H08. Dell Services regained its No. 1 status three reporting periods later, in 1Q09, and held that distinction for the next four periods.
• Dell's wins have reappeared in the past three reporting periods.
Note: The 3Q00 and 4Q00 iterations were experimental; the methodology differed from that established with the 1Q01 study.
Ranking Determinations Among Third-party Support Providers, Past Three Years (TBR)
                      1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11  Total #
Dell Services           2    1    2    3    1    1    1    1    1    2    2    1    1    1       9
HP Services             2    2    2    2    1    3    2    3    3    3    2    2    1    1       3
IGS/Lenovo Services     1    1    1    1    1    2    1    2    2    1    1    1    1    1      11
SOURCE: TBR
Until 2Q09, IGS held the record for the number of successive wins in the previous 14 reporting periods. IGS regained its No. 1 status in 3Q09, making for 19 wins during the last 22 reporting periods up to the current reporting period.
• Half of HPS' 14 No. 1-ranking determinations have occurred since 2Q05. HPS achieved five consecutive No. 1 rankings from 1Q06 through 1Q07, with its 1Q09 win the company's first after an absence of nearly two years. Competitive pressures contributed to HPS' drop to the No. 3 spot in 2Q09, followed by a series of second- and third-place rankings until the current reporting periods, in which it returned to No. 1 in both 1Q11 and 2Q11.
• Of the 24 incidences in which IGS has been a No. 1-ranked player, 14 were consecutive wins (4Q05 to 1Q09). During the past three years, IGS has earned a total of 11 No. 1 rankings, outnumbering Dell Services' nine wins. IGS also holds the record for the number of consecutive wins in recent periods, earning No. 1 ranking status for the past five straight periods.
[Chart: SUPPORT PROVIDER RANKING HISTORY, based on the 43-reporting-period history beginning 3Q00. Total No. 1 finishes: Dell Services 34, HP Services 14, IGS/Lenovo Services 24. SOURCE: TBR]
39 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 40. Historical Record TBR
The x86 server support satisfaction competition history shows Dell Services yielding its title to IBM
• During the period in which TBR has separated the service and support study results by segment, Dell Services has earned 10 No. 1 rankings to IBM's 11 in the x86 server support segment.
• However, one-half of Dell's wins occurred before 3Q08, with its most recent No. 1 ranking taking place as part of a string of five wins from 1Q09 through 1Q10.
• IBM has ranked No. 1 steadily for the past six reporting periods.
• HPS has earned one of its five No. 1 rankings since 3Q08.
[Chart: x86 SUPPORT PROVIDER RANKING HISTORY, based on the 18-quarter history since TBR separated the results by segment. Total No. 1 finishes: Dell Services 10, HP Services 5, IGS/IBM Support 11. SOURCE: TBR]
Ranking Determinations Among x86 SERVER Third-party Support Providers, Past 12 Reporting Periods (TBR)
                    3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11  Total # Wins
Dell Services         2    2    1    1    1    1    1    2    2    2    2    2        5
HP Services           2    3    2    3    3    2    1    2    2    2    2    2        1
IGS/IBM Services      1    1    2    2    2    2    1    1    1    1    1    1        8
SOURCE: TBR
40 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 41. Historical Record TBRWhile Lenovo Services has stacked up the greatest number of wins indesktop/notebook support, Dell Services’ success has been most current• Since TBR separated the service and DESKTOP/NOTEBOOK SUPPORT PROVIDER support study results by segment, Lenovo RANKING HISTORY Services has earned 13 No. 1 rankings to TBR (Based on 18-quarter History since TBR separated Dell’s eight in the desktop/notebook results) support segment. 100% 90%• Yet, since 3Q08, Dell and Lenovo Services 80% have each earned a total of 7 No. 1 70% 60% rankings, with Dell taking the leadership 50% position for the past three straight 40% 13 30% periods. 20% 8 10% 3• Lenovo Services’ No. 1 rankings were more 0% predictably awarded before 2010. Dell Services HP Services IGS/Lenovo Services• HPS has earned its only No. 1 rankings as a SOURCE: TBR No. 1 No. 2 No. 3+ string of three wins between 4Q08 and 2Q09. Ranking Determinations Among DESKTOP/NOTEBOOK Third-party TBR Support Providers, Past 12 Reporting Periods Total # 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Wins Dell Services 3 2 1 1 3 2 1 1 2 1 1 1 7 HP Services 2 1 1 1 2 3 3 3 2 2 2 2 3 IGS/Lenovo Services 1 1 1 1 1 1 2 2 1 2 2 2 7 SOURCE: TBR41 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 42. Historical Record TBRCases of differentiation dwindled in 2008, reasserting themselvessince 2009TBR Strength & Weakness Performance History - 4Q05 to 2Q11 • The years 2007, 2009 and 2010 were 4Q05 1Q06 2Q06 3Q06 4Q06 1Q07 2Q07 3Q07 4Q07 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11SERVICES PRICING/VALUE marked by a substantial number ofDell ServicesHP Services *   *   * performanceIGS/Lenovo Services REPLACEMENT PARTS AVAILABILITY *   differentiators, compared to tighterDell Services        * competitive fields during theHP Services   IGS/Lenovo Services   remaining years since 2005.BREAK/FIX SERVICESDell Services       • Some noteworthy patterns ofHP Services    IGS/Lenovo Services *       *    * consistency since 2009 include:ON-SITE SUPPORT RESPONSEDell Services  * *     *  o Eight consecutive strengths forHP Services           IGS/Lenovo Services * *    break/fix services for IGSTECHNICAL EXPERTISEDell Services    * *      o Four straight strengths for on-HP Services       IGS/Lenovo Services *   *   *   site response time for DellPHONE SUPPORTDell Services     Services from 2Q09 throughHP ServicesIGS/Lenovo Services  *      *             * * *   1Q10, returning in 4Q10 & 1Q11 o Warnings or weaknesses in fiveONLINE SUPPORTDell Services    *HP ServicesIGS/Lenovo Services * *       * *    of the past seven periods forHARDWARE DEPLOYMENTDell Services *   * *   HPS for on-site support responseHP Services timeIGS/Lenovo Services * *       Ke y: We a kne s s ;  S tre ngth; Ne utra l. Wa rning; no t c ite d a s a c o m pe titive we a kne s s this qua rte r due to la c k o f c o rro bo ra ting e vide nc e . o A recurring pattern of scattered* M e a ns tha t the s tre ngth is bo rde rline .SOURCE: TBR wins for phone support for IGSPerceptions of support services value point to a distinctive lack of o A lack of predictability withdifferentiation overall. There have been few incidences where TBR has respect to online support; Delldetermined competitive strengths or warnings to OEM support providers in and IGS trading strengths andthis category. Clearly, in-house support is perceived as the most effective warnings on a rotating basismeans for controlling support costs. 42 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 43. TBR Appendix A: Analytical Graph & Tables43 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 44. Understanding the 2Q11 Ranking Positions TBR Dell Services outperforms HPS in the areas of hardware installation and on-site technical expertiseTBR DELL TO HP MEAN SATISFACTION DISTANCES, 2Q11 VS.1Q11 Dell Services continued to outperform HPS 5% in the area of hardware installation as well Dell 3% Advantage as gained a significant advantage for on-site 1% Areas technical expertise and gained a slight -1% HP advantage in online support in 2Q11. Advantage -3% Areas Dell to HP Distance 1Q11 Dell to HP Distance 2Q11 -5% Telephone/Helpdesk Support Remotely Managed Support Hardware Installation/Configuration On-site Technical Expertise Overall Satisfaction Support Services Pricing/Value Online Support Replacement Parts Availability Break/Fix Time/Commitment On-site Response TBR PERCENT CHANGES IN MEAN SATISFACTION POSITIONS,SOURCE: TBR FOR DELL & HP SERVICES 2Q11 VS. 1Q11 3% Dell Services HP Services Many areas exhibited similar magnitudes of 2% rising mean ratings between Dell Services and HPS. The exceptions included remotely managed 1% support and support services value, where Dell Services continued to outperform HPS. Dell 0% gained competitive advantages over HPS in 2Q11 for both online support and on-site expertise -1% On-site Technical Expertise Support Services Value On-site Response Time Break/Fix Services Overall Satisfaction Parts Availability Hardware Deployment Online Support Remotely Managed Support Phone Support due to its improved scores in both categories against HPS flat ratings. SOURCE: TBR 44 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 45. Understanding the 2Q11 Ranking Positions TBR
Dell Services gains and loses competitive advantages over IGS
[Chart: DELL TO IGS MEAN SATISFACTION DISTANCES, 2Q11 VS. 1Q11, by attribute; positive values are Dell advantage areas, negative values are IGS advantage areas. SOURCE: TBR]
Dell Services marginally outperformed IGS in the area of on-site response time, and significantly outperformed IGS in the areas of online support and hardware installation.
In 2Q11, IGS lost the slight competitive advantage in overall satisfaction it had held over Dell Services in 1Q11, but gained a slight competitive advantage in the break/fix category.
[Chart: PERCENT CHANGES IN MEAN SATISFACTION POSITIONS FOR DELL & IGS SERVICES, 2Q11 VS. 1Q11, by attribute. SOURCE: TBR]
IGS' mean rating shifted up for on-site response time and remotely managed support, and down for online support and hardware installation, leading to many of the performance differences indicated above.
45 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 46. Understanding the 2Q11 Ranking Positions TBR IGS continues to generally outperform HPS HP TO IGS MEAN SATISFACTION DISTANCES, 2Q11 VS.1Q11 TBR 2% IGS continued to outperform HPS by 1% HPS Advantage significant margins across the areas of 0% -1% Areas break/fix services, on-site expertise -2% -3% IGS and phone support while remaining -4% Advantage Areas significantly ahead for overall -5% -6% HP to IGS Distance 1Q11 HP to IGS Distance 2Q11 satisfaction. On-site Technical Expertise Break/Fix Online Support Overall Satisfaction On-site Response Time/Commitment Remotely Managed Support Telephone/Helpdesk Support Hardware Installation/Configuration Support Services Pricing/Value Replacement Parts AvailabilitySOURCE: TBR PERCENT CHANGES IN MEAN SATISFACTION POSITIONS FOR HP & IGS SERVICES, 2Q11 VS. TBR 1Q11 3% HP Services IGS/Lenovo Services HPS’ mean satisfaction rating for phone 2% support declined by a significantly 1% greater magnitude than IGS – hence the compelling performance gap. 0% -1% On-site Technical Expertise Support Services Value Break/Fix Services Overall Satisfaction On-site Response Time Parts Availability Remotely Managed Support Hardware Deployment Online Support Phone Support SOURCE: TBR 46 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 47. Tracking the Satisfaction Indices TBR
Service and support satisfaction positions continue an expected correction to pre-recession levels
[Chart: SERVICE & SUPPORT WEIGHTED SATISFACTION SCORES, 3Q08 through 2Q11, for Internal Support Organizations, Dell Services, HP Services and IGS/Lenovo Services. SOURCE: TBR]
• Through the end of 2008, TBR observed generally predictable outcomes, with the in-house support group earning its reputation as the yardstick against which we measure the OEM support providers. During these periods, IGS was most consistent at earning top scores in the competition.
• In 2009, steadily declining satisfaction scores were the rule to which no competitor was immune, defined by a close competition between IGS and Dell Services, with HPS considerably more challenged.
• Satisfaction positions hit rock bottom in 4Q09, exhibiting hints of a recovery in 1Q10 that transitioned into a full recovery for all players in 2Q10.
• Scores collectively improved by substantial magnitudes in 2Q10 and 3Q10, resulting in new record highs being established by all four competitors by 3Q10.
• As expected, and following the patterns of TBR's product-related studies, satisfaction scores have now stabilized due to corrections from 1Q11, primarily affecting the OEM support providers.
Ranking Determinations Among Third-party Support Providers, Past 12 Reporting Periods (TBR)
                      2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11
Dell Services           1    2    3    1    1    1    1    1    2    2    1    1    1
HP Services             2    2    2    1    3    2    3    3    3    2    2    1    1
IGS/Lenovo Services     1    1    1    1    2    1    2    2    1    1    1    1    1
SOURCE: TBR
Note: The ranking positions in the table have been adjusted to represent the placement of OEM support providers, excluding the presence of the internal support organizations.
47 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
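The weighted satisfaction scores charted above roll the individual attribute ratings up into a single index per support provider. The weighting scheme is TBR's own and is not disclosed in these slides; the sketch below shows one plausible construction, assuming importance ratings act as the weights and the 7-point means are rescaled to a 100-point index. The function name, weights and scaling are assumptions, not TBR's published method.

```python
# One plausible weighted satisfaction index (WSI): importance-weighted mean of
# 7-point attribute satisfaction scores, rescaled to a 0-100 index.
# This construction is an assumption for illustration, not TBR's published scheme.

def weighted_satisfaction_index(satisfaction: dict, weights: dict) -> float:
    total_weight = sum(weights[a] for a in satisfaction)
    weighted_mean = sum(satisfaction[a] * weights[a] for a in satisfaction) / total_weight
    return round(weighted_mean / 7.0 * 100, 1)  # rescale the 7-point mean to a 100-point index

# Hypothetical 2Q11-style inputs for a single provider:
sat = {"Break/Fix": 6.07, "On-site Expertise": 5.88, "Phone Support": 5.83, "Parts": 5.91}
imp = {"Break/Fix": 4.6, "On-site Expertise": 4.3, "Phone Support": 3.9, "Parts": 4.5}
print(weighted_satisfaction_index(sat, imp))  # roughly 84.7 on the 100-point scale
```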
  • 48. Tracking the Satisfaction Indices TBR
The long-term trend line shows a diminution of performance differences
[Chart: SERVICE & SUPPORT WEIGHTED SATISFACTION SCORES LONG TERM, 1Q06 THROUGH 2Q11, for Internal Support Organizations, Dell Services, HP Services and IGS/Lenovo Services. SOURCE: TBR]
[Chart: SERVICE & SUPPORT WEIGHTED SATISFACTION SCORES LONG TERM, 1Q06 THROUGH 2Q11, WITH MOVING AVERAGES (two-period moving averages for each of the four support groups). SOURCE: TBR]
• The principal contributor to narrowing performance gaps involved the perspective of the internal support organizations, where stressed resources led to significantly declining satisfaction scores. Throughout most of the recessionary year of 2009, the group no longer represented the utopia of support capability against which TBR compares the OEM-provided support groups. Customer satisfaction with support services declined sharply throughout 2009 for all groups.
• Positions began to stabilize by 1Q10, setting the stage for the broad-based and substantial recovery of the 2Q10 reporting period. In 3Q10, the internal support organizations returned to the top ranking position for the first time since 1Q09.
• Since 1Q11, internal support satisfaction has been trending downward, suggesting a possible return to economically stressed times that stretch internal resources and budgets.
48 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
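The second chart referenced above smooths each provider's quarterly weighted scores with a two-period moving average. A minimal sketch of that smoothing, using a simple trailing window and hypothetical quarterly scores:

```python
# Two-period trailing moving average, as used to smooth the quarterly weighted scores.
def moving_average(series, window=2):
    return [round(sum(series[i - window + 1:i + 1]) / window, 2)
            for i in range(window - 1, len(series))]

# Hypothetical quarterly weighted scores for one provider, 3Q10 through 2Q11:
scores = [86.1, 84.9, 83.0, 82.8]
print(moving_average(scores))  # [85.5, 83.95, 82.9]
```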
  • 49. GAP Analyses: Tracking Expectation Fulfillment TBR
Dell Services is the only competitor to consistently meet customer expectations for services value, yet the picture is clearly changing
[Chart: SUPPORT SERVICES PRICING/VALUE ANALYSIS FOR DELL SERVICES: satisfaction versus importance trend lines with polynomial fits. SOURCE: TBR]
[Chart: SUPPORT SERVICES PRICING/VALUE ANALYSIS FOR IGS/LENOVO SERVICES: satisfaction versus importance trend lines with polynomial fits. SOURCE: TBR]
[Chart: SUPPORT SERVICES PRICING/VALUE ANALYSIS FOR HP SERVICES: satisfaction versus importance trend lines with polynomial fits. SOURCE: TBR]
Satisfaction versus importance data points have remained interlocked throughout the timeline for Dell Services. Competitors, particularly HPS, have historically been unable to sustain closed GAPs.
2Q11 Developments:
• Historically speaking, Dell Services has been the only competitor to consistently keep pace with customer expectations for services value; the satisfaction trend line continues to steadily increase over time.
• In 2Q11, satisfaction scores for all three OEMs remained below importance levels.
• Importance ratings for Dell Services and HPS fell in 1Q11, against a static satisfaction, while IGS' importance scores rose.
49 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
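Each GAP analysis in this series overlays a satisfaction trend line on an importance (expectation) trend line; a closed GAP indicates the vendor is keeping pace with customer expectations. A minimal sketch of the underlying comparison, with illustrative per-quarter means standing in for the survey data:

```python
# GAP between customer expectations (importance) and experience (satisfaction).
# A positive gap means expectations exceed satisfaction (unmet expectations);
# a gap near zero corresponds to the "interlocked" trend lines described above.

def expectation_gaps(importance, satisfaction):
    return [round(i - s, 2) for i, s in zip(importance, satisfaction)]

# Illustrative services pricing/value means for one vendor, 3Q10 through 2Q11:
importance   = [6.22, 6.05, 5.95, 5.90]
satisfaction = [6.20, 5.85, 5.79, 5.78]
print(expectation_gaps(importance, satisfaction))  # [0.02, 0.2, 0.16, 0.12]
```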
  • 50. GAP Analyses: Tracking Expectation Fulfillment TBRSatisfaction ratings for support services response rise across the board,yet only Dell Services consistently meets customer expectations TBR SUPPORT SERVICES RESPONSE ANALYSIS SUPPORT SERVICES RESPONSE ANALYSIS FOR DELL SERVICES TBR FOR IGS/LENOVO SERVICES6.50 6.406.30 6.206.10 6.005.90 5.805.70 5.605.50 5.405.30 5.20 Satisfaction Importance Poly. (Satisfaction) Poly. (Importance) Satisfaction Importance Poly. (Satisfaction) Poly. (Importance) SOURCE: TBR SOURCE: TBR TBR SUPPORT SERVICES RESPONSE ANALYSIS 2Q11 Developments: 6.60 FOR HP SERVICES • Dell satisfaction scores rose in 2Q11, creating a smaller 6.40 gap between importance and satisfaction. HPS and IGS’ 6.20 gap between importance and satisfaction remains wide 6.00 for the third consecutive quarter. 5.80 5.60 • Dell Services fared the best of the three competitors, 5.40 with its satisfaction and importance at similar levels while 5.20 the gap between satisfaction and importance ratings for 5.00 HPS and IGS/Lenovo’s support services response remained wide. Satisfaction Importance Poly. (Satisfaction) Poly. (Importance) SOURCE: TBR50 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 51. GAP Analyses: Tracking Expectation Fulfillment TBR
IGS/Lenovo Services and HPS fail to meet customer expectations for on-site technical expertise
[Chart: DELL SERVICES SATISFACTION VERSUS IMPORTANCE FOR ON-SITE TECHNICAL EXPERTISE. SOURCE: TBR]
[Chart: IGS/LENOVO SERVICES SATISFACTION VERSUS IMPORTANCE FOR ON-SITE TECHNICAL EXPERTISE. SOURCE: TBR]
[Chart: HP SERVICES SATISFACTION VERSUS IMPORTANCE FOR ON-SITE TECHNICAL EXPERTISE. SOURCE: TBR]
Satisfaction around perceived technical expertise was the hardest hit of all categories during 2009. Satisfaction levels in 2010, however, represented a full recovery.
2Q11 Developments:
Dell Services was the only vendor to meet customer expectations in 2Q11, aided by relaxed customer expectations over the last three periods. HPS and IGS have been unable to close the gap between expectation and satisfaction for the past three reporting periods.
51 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 52. GAP Analyses: Tracking Expectation Fulfillment TBR
Satisfaction levels declined from 2H10 to 1Q11, but expectations are now beginning to relax
[Chart: SUPPORT SERVICES BREAK/FIX ANALYSIS FOR DELL SERVICES. SOURCE: TBR]
[Chart: SUPPORT SERVICES BREAK/FIX ANALYSIS FOR IGS/LENOVO SERVICES. SOURCE: TBR]
[Chart: SUPPORT SERVICES BREAK/FIX ANALYSIS FOR HP SERVICES. SOURCE: TBR]
While GAPs had closed by late 2009 due to relaxing expectations, 1Q10 saw a sudden increase in customer requirements, which continued to build into 2Q10, then taper off. The break/fix category refers to customer experiences with basic hardware maintenance services, not with premium-level contracts.
2Q11 Developments:
• Customer expectations for basic break/fix services continued to relax for all OEMs in 2Q11.
• While IGS has consistently outperformed competitors for satisfaction with break/fix services, note that all three OEMs have successfully met expectations.
52 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 53. GAP Analyses: Tracking Expectation Fulfillment TBR
Customer expectations for phone support have placed pressure on all OEMs for the past three periods
[Chart: PHONE SUPPORT ANALYSIS FOR DELL SERVICES. SOURCE: TBR]
[Chart: PHONE SUPPORT ANALYSIS FOR IGS/LENOVO SERVICES. SOURCE: TBR]
[Chart: PHONE SUPPORT ANALYSIS FOR HP SERVICES. SOURCE: TBR]
Historically, particularly throughout 2007 and 2008, Dell Services and HPS have struggled to meet customer expectations for phone support, predominantly falling far short of that goal. Meanwhile, IGS has consistently maintained very small GAP positions.
2Q11 Developments:
Customer expectations for phone support fell by varying degrees, while satisfaction positions leveled off (HPS) or rose (IGS/Lenovo Services and Dell Services), creating smaller gaps between importance and satisfaction. IGS has been unable to maintain closed gaps for the past three periods.
53 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 54. GAP Analyses: Tracking Expectation Fulfillment TBRExpectations and satisfaction for online support continue to fluctuate;trend lines point to improvement for IGS and HPS against static DellTBR DELL SERVICES SATISFACTION VERSUS TBR IGS/LENOVO SERVICES SATISFACTION VERSUS IMPORTANCE FOR ONLINE SUPPORT IMPORTANCE FOR ONLINE SUPPORT6.00 6.005.80 5.805.60 5.60 5.405.40 5.205.20 5.00 Satisfaction Importance Satisfaction Importance Poly. (Satisfaction) Poly. (Importance) Poly. (Satisfaction) Poly. (Importance)SOURCE: TBR SOURCE: TBR TBR HP SERVICES SATISFACTION VERSUS IMPORTANCE FOR ONLINE SUPPORT 2Q11 Developments: 5.90 • IGS’ importance rating for online support exceeded its 5.70 satisfaction ratings in 2Q11 while HPS and Dell Services fell below. 5.50 • Dell Services’ importance rating continued to rise 5.30 significantly, while satisfaction remained stagnant over the same sequential compare, leading to a widening gap. 5.10 • IGS/Lenovo Services’ satisfaction and importance ratings flattened in 2Q11, leading to continued unmet customer expectations. Satisfaction Importance Poly. (Satisfaction) Poly. (Importance) SOURCE: TBR 54 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 55. GAP Analyses: Tracking Expectation Fulfillment TBRReplacement parts availability is a critical element of the supportexperience across the board for customersTBR DELL SERVICES SATISFACTION VERSUS IMPORTANCE TBR IGS/LENOVO SERVICES SATISFACTION VERSUS FOR REPLACEMENT PARTS AVAILABILITY IMPORTANCE FOR REPLACEMENT PARTS AVAILABILITY6.50 6.60 6.406.30 6.206.10 6.005.90 5.80 5.605.70 5.405.50 5.20 Satisfaction Importance Poly. (Satisfaction) Poly. (Importance) Satisfaction Importance Poly. (Satisfaction) Poly. (Importance)SOURCE: TBR SOURCE: TBRTBR HP SERVICES SATISFACTION VERSUS IMPORTANCE FOR REPLACEMENT PARTS AVAILABILITY6.50 2Q11 Developments:6.306.10 Importance and satisfaction ratings for parts availability5.90 remained stable for all three vendors in 2Q11, with satisfaction5.70 ratings remaining at levels below those of importance,5.50 indicating unmet customer expectations by the three vendors.5.30 Satisfaction Importance Poly. (Satisfaction) Poly. (Importance)SOURCE: TBR55 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 56. Trends of the Reporting Period TBRAnalysis of the Past Four Reporting PeriodsDell Services’ 2Q11 scores are generally very close to those of 1Q11, with some modestevidence of improvement TBR DELL SERVICE & SUPPORT CUSTOMER SATISFACTION TREND ANALYSIS • Dell Services’ satisfaction 3Q10 TO 2Q11 positions were generally at 6.6 their highest levels in 3Q10 6.4 and their lowest in 1Q11. 6.2 6.0 • Dell Services’ remotely 5.8 managed support position has 5.6 remained stagnant for the 5.4 past four reporting 5.2 periods, indicating a level of 5.0 homeostasis between On-site Technical Expertise Support Services Value Replacement Parts Availability On-site Response Time Hardware Installation/Configuration Telephone/Helpdesk Support Break/Fix Services Remotely Managed Support Overall Satisfaction Online/Web Support satisfaction and importance. SOURCE: TBR 3Q10 4Q10 1Q11 2Q11 WSI Rating Shift, 1Q11 to 2Q11: 0.8% • Led by increasing overall satisfaction • Comparatively stable positions include replacement parts availability, support services value and hardware deployment.56 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 57. Trends of the Reporting Period TBR
Analysis of the Past Four Reporting Periods
HPS' performances remain at similar levels over a sequential compare
[Chart: HP SERVICE & SUPPORT CUSTOMER SATISFACTION TREND ANALYSIS, 3Q10 TO 2Q11, by attribute. SOURCE: TBR]
• HPS' satisfaction positions were generally at their highest levels in 3Q10 and their lowest in 1Q11.
• 2Q11 positions remained very similar to 1Q11 positions, with the exception of increases in the areas of on-site response time and overall satisfaction.
• Across the board, satisfaction positions remained at the same level or rose only slightly past their 1Q11 positions, showing few signs of improvement.
WSI Rating Shift, 1Q11 to 2Q11: 0.45%
Comparatively stable positions included all categories except for on-site response time and overall satisfaction.
57 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 58. Trends of the Reporting Period TBRAnalysis of the Past Four Reporting PeriodsIGS’ 2Q11 scores are generally very close to those of 1Q11, with some modest evidence ofimprovementTBR IGS/LENOVO SERVICE & SUPPORT CUSTOMER SATISFACTION TREND ANALYSIS 3Q10 TO 2Q11 • IGS’ satisfaction positions were 6.8 generally at their highest levels 6.6 in 3Q10 and their lowest in 6.4 1Q11. 6.2 • Scores in 2Q11 rose gently 6.0 from their 1Q11 positions but 5.8 remained mostly below those 5.6 of 4Q10. 5.4 5.2 • IGS’ phone support position On-site Technical Expertise On-site Response Time Break/Fix Services Support Services Value Replacement Parts Availability Overall Satisfaction Online/Web Support Remotely Managed Support Telephone/Helpdesk Support Hardware Installation/Configuration has remained relatively stagnant over the past four reporting periods. 3Q10 4Q10 1Q11 2Q11SOURCE: TBR WSI Rating Shift, 1Q11 to 2Q11: 0.8% • Led by rising break/fix services, on-site expertise and response time satisfaction levels • Comparatively stable positions included phone support, online support, support services value and remotely managed support.58 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 59. Improvements GAP Analyses TBRRecommended areas for improvements for Dell Services include theinitial contact areas of phone and online support• Primary Area Requiring Improvement Efforts: None• Secondary Areas Requiring Improvement Efforts: On-site response time, phone support and online support• Areas of Competency: Hardware installation TBR SUGGESTED AREAS OF IMPROVEMENT FOR DELL SERVICES 2Q11 40 60 Hold Back/ Exploit Recommended 80 Actions 100 Maintain 120 140 Target Improvements 160 SOURCE: TBR 59 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 60. Improvements GAP Analyses TBRHP Services’ analysis points to target improvement programs aroundon-site response time, phone and online support• Secondary Areas Requiring Improvement Efforts: On-site response time, on-site technical expertise, phone support• Areas of Competency: Break/fix services TBR SUGGESTED AREAS OF IMPROVEMENT FOR HP SERVICES 2Q11 40 Hold Back/ 60 Exploit Recommended Actions 80 Maintain 100 120 Target Improvements 140 160 SOURCE: TBR 60 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 61. Improvements GAP Analyses TBRIGS must focus on perceptions of on-site response time andonline support• Primary Areas Requiring Improvement Efforts: On-site response time, online support• Areas of Competency: Break/fix Services SUGGESTED AREAS OF IMPROVEMENT FOR IGS/LENOVO SERVICES 2Q11 TBR 40 Hold Back/ 60 Exploit Recommended 80 Actions 100 Maintain 120 140 Target Improvements 160 SOURCE: TBR 61 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 62. Improvements GAP Analyses TBR
The in-house group must focus on improving its ability to work with OEMs to procure replacement or spare parts
• Primary Area Requiring Improvement Efforts: Replacement parts availability
• Secondary Areas Requiring Improvement Efforts: Phone support
• Areas of Competency: The competitive strength in server support was removed in 2Q11
62 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 63. Selection Criteria – Stated TBRParts availability, break/fix services, and support services value driveservice and support experience evaluationsRemote support methods (phone, web and automated support) are gainingin utilizationTBR SERVICE & SUPPORT IMPORTANCE RATINGS BY CUSTOMER GROUP4.8 • Critical: Break/fix services,4.6 Dell HP IBM InHouse parts availability4.44.2 • Also Important: Support4.0 services value, on-site3.8 expertise and response time3.63.4 • Somewhat Important: Phone3.2 support, online support3.0 • Less Important: Hardware Support Services Value On-site Response Time On-site Technical Expertise Installation/Configuration Break/Fix Services Online/Web Support Remotely Managed Support Telephone/Helpdesk Support Replacement Parts Availability deployment, remotely managed support Hardware SOURCE: TBR Customer expectations within the IGS group remain significantly higher than average overall, creating a special situation in which IGS was forced to perform that much better in the satisfaction ratings to rank No. 1 in this reporting wave. While this was largely driven by the IBM Support (server) side of the equation, Lenovo Services customers were also more focused than competitors’ customers on break/fix services and technical expertise.63 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 64. Scoring Summary – Significance Tests TBRStatistical significance test No. 1 points to performance differentiationlargely favoring Internal Support, somewhat favoring IGS and Dell ServicesTest compares each player’s performances against the sum of competitors’using the standard test Results of the Standard t-Test 2Q11 Developments:TBR • The internal support groups returned to their IGS/LENOVO INTERNAL DELL SVCS HP SVCS SVCS SUPPORT historical position as the standard-Basic Break/Fix Services    setter, outperforming industry averages acrossOn-site Technical Expertise   most categories – with parts availability theOn-site Response Time/Commitment   single exception.Telephone/Helpdesk Support    • IGS/Lenovo Services’ results were a mixture ofOnline Support    positives and negatives, outperformingRemotely Managed Support Replacement Parts Availability competitors in break/fix services and phoneSupport Services Pricing/Value   support while underperforming in onlineHardware Installation/Configuration    support.Overall Satisfaction    • HPS’ scores were either below or comparableGrand Mean    to the industry averages. A verage sco re; t-test is null;ñ t-Test is significantly higher than average o f co mpetito rs;  t-test issignificantly lo wer than average o f co mpetito rs. Smaller arro ws represent significant differences at the 0.06 to 0.10 • Dell Services outperformed the competition inco nfidence levels. online support and hardware installation.SOURCE :TBR 64 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
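The standard test described above compares each provider's respondent-level ratings for an attribute against the pooled ratings of its competitors. TBR's exact implementation is not shown in these slides; the sketch below illustrates the comparison with an independent two-sample (Welch) t-test and simulated 1-7 ratings in place of the survey data (sample sizes loosely follow the survey counts in Appendix B).

```python
# Sketch of the "standard test": each vendor's respondent ratings for one attribute
# versus the pooled ratings of the other support groups (Welch's two-sample t-test).
# The ratings below are simulated 1-7 scores, not TBR survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratings = {
    "Dell":     rng.normal(5.94, 1.0, 250).clip(1, 7),
    "HP":       rng.normal(5.90, 1.0, 248).clip(1, 7),
    "IGS":      rng.normal(6.07, 1.0, 253).clip(1, 7),
    "Internal": rng.normal(6.14, 1.0, 499).clip(1, 7),
}

for vendor, scores in ratings.items():
    pooled = np.concatenate([v for k, v in ratings.items() if k != vendor])
    t, p = stats.ttest_ind(scores, pooled, equal_var=False)
    flag = "higher" if t > 0 else "lower"
    print(f"{vendor}: t={t:.2f}, p={p:.3f}"
          + (f" -> significantly {flag} than competitors" if p < 0.05 else " -> null"))
```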
  • 65. Scoring Summary – Significance Tests TBRPerformance differentiation in the segments points to IBM as favoredfor server support; Dell Services for desktop/notebook supportTests compare each player’s performances against the sum of competitors’ using thestandard test TBR Results of the Standard t-Test - x86 SERVER SUPPORT TBR Results of the Standard t-Test - DESKTOP/NOTEBOOK SUPPORT INTERNAL INTERNAL DELL SVCS HP SVCS IBM SVCS SUPPORT DELL SVCS HP SVCS LENOVO SVCS SUPPORTBasic Break/Fix Services   Basic Break/Fix ServicesOn-site Technical Expertise    On-site Technical Expertise   On-site Response Time/Commitment  On-site Response Time/Commitment   Telephone/Helpdesk Support   Telephone/Helpdesk Support  Online Support  Online Support   Remotely Managed Support  Remotely Managed Support  Replacement Parts Availability  Replacement Parts Availability Support Services Pricing/Value  Support Services Pricing/Value Hardware Installation/Configuration  Hardware Installation/Configuration  Overall Satisfaction   Overall Satisfaction Grand Mean   Grand Mean   A verage sco re; t-test is null;ñ t-Test is significantly higher than average o f co mpetito rs;  t-test is  A verage sco re; t-test is null;ñ t-Test is significantly higher than average o f co mpetito rs;  t-test issignificantly lo wer than average o f co mpetito rs. Smaller arro ws represent significant differences at the 0.06 to 0.10 significantly lo wer than average o f co mpetito rs. Smaller arro ws represent significant differences at the 0.06 to 0.10co nfidence levels. co nfidence levels.SOURCE: TBR SOURCE: TBR The key performance differentiators in the server The key performance differentiators in the support segment were break/fix services, on-site desktop/notebook support space were on-site expertise, phone support, and parts availability – with technical expertise, on-site response time, online all favoring IGS over HPS and Dell Services. support and hardware installation, where Dell Services outperformed the industry average while HPs and Lenovo underperformed.65 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 66. Scoring Summary – Significance Tests TBRStatistical significance test No. 2 elaborates on the findings of test No. 1These are paired comparisons using the standard test Results of the Pair-wise t-Tests, Vendor Comparisons TBR Highlighted performance differentiation IGS/LENOVO involving the OEM support providers: DELL SVCS VS. HP SVCS VS. SVCS VS. • IGS significantly outperformed both IGS/ DELL IGS/ DELL PAIR-WISE T-TESTS HPS LENOVO SVCS LENOVO SVCS HPS competitors for break/fix services. Break/Fix Services     • Dell Services outperformed both On-site Technical Expertise     competitors for hardware installation On-site Response Time/Commitment   and online support – all at significant Telephone/Helpdesk Support   levels. Online Support     Remotely Managed Support • HPS’ underperformed IGS in break/fix Replacement Parts Availability services and overall satisfaction, Support Services Pricing/Value underperformed Dell Services in Hardware Installation/Configuration     hardware installation, and Overall Satisfaction   Grand Mean     underperformed both competitors in  t-Test is significantly higher than the average o f co mpetito rs; t-Test is significantly lo wer than average o f co mpetito rs. on-site technical expertise. Smaller arro ws represent significant differences at the 0.06 to 0.1 co nfidence levels. 0 SOURCE: TBR66 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 67. Scoring Summary – Significance Tests TBRStatistical significance test No. 2 elaborates on the findings of test No. 1These are paired comparisons using the standard test TBR Results of the Pair-wise t-Tests, Internal Support vs. Vendor-provided Support • The in-house groups outperformed all three OEM support providers across every category INTERNAL SUPPORT ORGANIZATIONS VS. with the single exception of parts availability. DELL IGS/ • These performance differences were confirmed PAIR-WISE T-TESTS SVCS HPS LNV at very high levels of statistical confidence. Break/Fix Services    On-site Technical Expertise    On-site Response Time/Commitment    Telephone/Helpdesk Support    Online Support    Remotely Managed Support    Replacement Parts Availability Support Services Pricing/Value    Hardware Installation/Configuration    Overall Satisfaction    Grand Mean     t-Test is significantly higher than the average o f co mpetito rs; t-Test is significantly lo wer than average o f co mpetito rs. Smaller arro ws represent significant differences at the 0.06 to 0.1 co nfidence levels. 0 SOURCE: TB R67 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 68. Statistical Significance Tests TBRDespite the tough test, several performance differentiatorsare corroborated by statistical test No. 3The Bonferroni correction is the most stringent of TBR’s applied tests Differences Between the Vendors According to Bonferroni Correction TBR Significant Differences Cited by Bonferroni Attribute Correction In-house Dell HPS IGS/Lenovo Basic Break/Fix Services Internal over Dell, HP; IGS over HP 3 -1 -2 1 On-site Technical Expertise Internal over ALL 3 -1 -1 -1 On-site Response Time/Commitment Internal over ALL 3 -1 -1 -1 Telephone/Helpdesk Support Internal over HP 1 0 -1 0 Online Support Internal over ALL; Dell over IGS 3 1 -1 -2 Remotely Managed Support Internal over ALL 3 -1 -1 -1 Replacement Parts Availability None at the 0.05 significance level 0 0 0 0 Support Services Pricing/Value Internal over Dell, HP 2 -1 -1 0 Hardware Installation/Configuration Internal over ALL; Dell over HP, IGS 3 1 -2 -2 Overall Satisfaction Internal over ALL 3 -1 -1 -1 Total Points 24 -4 -11 -7 SOURCE: TB R The Bonferroni correction, the most stringent statistical significance test used by TBR, confirmed many of the tests cited by the standard test. Most of the confirmed differences were in comparisons of in-house support against the OEM support providers. Additional confirmed performance differences included break/fix services (IGS over HPS), online support (Dell over IGS), and hardware installation (Dell over HPS and IGS).68 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
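The Bonferroni correction guards against false positives when many comparisons are evaluated at once by tightening the per-comparison significance threshold. A minimal sketch of the adjustment, assuming a family-wise alpha of 0.05 spread across all pairwise comparisons; the p-values below are placeholders, not TBR results.

```python
# Bonferroni correction: divide the family-wise alpha by the number of comparisons,
# so only very strong differences survive (hence the "most stringent" test applied).

def bonferroni_significant(p_values: dict, family_alpha: float = 0.05) -> dict:
    per_test_alpha = family_alpha / len(p_values)
    return {pair: p < per_test_alpha for pair, p in p_values.items()}

# Placeholder p-values for one attribute's pairwise comparisons (not TBR data):
p_values = {
    ("Internal", "Dell"): 0.0004,
    ("Internal", "HP"):   0.0011,
    ("Internal", "IGS"):  0.0032,
    ("Dell", "HP"):       0.0418,  # significant at 0.05 on its own, but not after correction
    ("Dell", "IGS"):      0.0090,
    ("HP", "IGS"):        0.2600,
}
print(bonferroni_significant(p_values))  # per-test alpha = 0.05 / 6, roughly 0.0083
```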
  • 69. Statistical Significance Tests TBRDespite the tough test, several performance differentiatorsare corroborated by statistical test No. 3The Bonferroni correction is the most stringent of TBR’s applied tests In the x86 server support segment,TBR Differences Between the Vendors According to Bonferroni Correction - x86 SERVER SUPPORT the internal support organizations Significant Differences Cited by Bonferroni were confirmed as havingAttributeBasic Break/Fix Services Correction Internal over Dell, HP In-house 2 Dell -1 HPS -1 IBM 0 outperformed various competitorsOn-site Technical Expertise Internal over Dell, HP 2 -1 -1 0 across all categories but partsOn-site Response Time/CommitmentTelephone/Helpdesk Support Internal over Dell, HP IBM over Dell 2 0 -1 -1 -1 0 0 1 availability and phone support,Online Support Internal over HP, IBM 2 0 -1 -1 designated by the previous tests. InRemotely Managed SupportReplacement Parts Availability Internal over ALL None at the 0.05 significance level 3 0 -1 0 -1 0 -1 0 addition, IBM outperformed DellSupport Services Pricing/Value Internal over Dell, HP; IBM over Dell 2 -2 -1 1 for phone support and supportHardware Installation/Configuration Internal over HP, IBM; Dell over HPOverall Satisfaction Internal over ALL 2 3 1 -1 -2 -1 -1 -1 services value. IBM also benefitedTotal Points 18 -7 -9 -2 by not placing significantly lowerSOURCE: TB R than in-house support in several categories, while competitors were Differences Between the Vendors According to Bonferroni Correction - not so fortunate.TBR DESKTOP/NOTEBOOK SUPPORTAttribute Significant Differences Cited by Bonferroni Correction In-house Dell HPS Lenovo In the desktop/notebook supportBasic Break/Fix Services None at the 0.05 significance level 0 0 0 0 segment, the internal supportOn-site Technical ExpertiseOn-site Response Time/Commitment Internal over HP, Lenovo; Dell over HP Internal over ALL 2 3 1 -1 -2 -1 -1 -1 organizations outperformedTelephone/Helpdesk Support Internal over HP 1 0 -1 0 competitors in all but the break/fixOnline SupportRemotely Managed Support Internal over HP, Lenovo; Dell over Lenovo Internal over HP, Lenovo 2 2 1 0 -1 -1 -2 -1 and parts availability categories, asReplacement Parts Availability None at the 0.05 significance level 0 0 0 0 designated in the previous tests. ItSupport Services Pricing/ValueHardware Installation/Configuration None at the 0.05 significance level Internal over HP, Lenovo 0 2 0 0 0 -1 0 -1 was also confirmed that DellOverall Satisfaction Internal over ALL 3 -1 -1 -1 outperformed HP for on-siteTotal Points 15 0 -8 -7SOURCE: TB R expertise and Lenovo for online support.69 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 70. Competitive GAP Analysis TBR
The Competitive GAP Analysis confirms the in-house support performance difference premises set by the statistical significance tests
[Chart: SERVICE & SUPPORT COMPETITIVE GAP ANALYSIS 2Q11 for Internal Support Organizations, Dell Services, HP Services and IGS/Lenovo Services, plotted on a scale running from 40 (Exceeds) through 100 (Fully Meets) to 160 (Short of Expectation Fulfillment). SOURCE: TBR]
• The competitive GAP scores support TBR's decisions regarding the on-site response time competitive strength and weakness citations for the 2Q11 reporting period.
• The internal support group's scores were so high, with the exception of parts availability, that they skewed the remainder of the analysis, making it difficult for OEM support providers to earn scores above the 100-point marker and leading scores to trail toward the lower end of the meeting-expectations range.
70 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
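The Competitive GAP Analysis above places each provider on a scale where roughly 100 points means expectations are fully met, lower readings exceed expectations and higher readings fall short, per the chart's zone labels. TBR does not disclose the exact construction of the index; the sketch below assumes it is simply the ratio of mean importance to mean satisfaction scaled to 100, purely for illustration.

```python
# Assumed construction of a Competitive GAP-style index: importance relative to satisfaction,
# scaled so that 100 = expectations fully met, <100 = exceeds, >100 = falls short.
# The ratio form is an assumption; TBR's published index may be built differently.

def competitive_gap_index(satisfaction: float, importance: float) -> float:
    return round(100 * importance / satisfaction, 1)

# Hypothetical category means (7-point scale):
print(competitive_gap_index(satisfaction=6.18, importance=6.00))  # ~97.1 -> exceeds/meets
print(competitive_gap_index(satisfaction=5.59, importance=6.10))  # ~109.1 -> short of fulfillment
```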
  • 71. Buying Behavior TBRMost customers utilize a mix of self-replacement and on-site supportfor replacing/repairing failed parts METHODS OF REPLACING/REPAIRING FAILED PARTS TBR 100% on-site Primarily on-site;self replace some parts About 50/50 self replacement/on- site Primarily self replacement/on-site for some parts 100% self replacement 0% 5% 10% 15% 20% 25% 30% 35% SOURCE: TBR Desktops/Notebooks Servers• The majority of desktop/notebook customers utilize an approximate 50/50 mix between self-replacement and on-site support by an OEM or partner.• TBR found that the majority of server customers preferred primarily self-replacing the parts while utilizing third parties for some specific parts that may require more expertise.• This pattern has largely remained constant in the past year, with an average of 24% of respondents indicating either 100% on-site support or 100% self-replacement, while the rest leverage a mix of the two. 71 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 72. Buying Behavior TBR
Customers are most satisfied with self-replacement or a mixture of self-replacement and on-site support
[Chart: PARTS REPAIR METHOD WITH HIGHEST SATISFACTION (Respondents Select One), for servers and desktops/notebooks: self replacement; on-site repair visit from systems manufacturer/authorized partner; on-site repair visit from third party; mix of self replacement and on-site repair. SOURCE: TBR]
• Server customers are most satisfied with a mixture of self-replacement and on-site support, whereas desktop/notebook customers are most satisfied replacing the parts in-house.
• Customers are least satisfied with on-site support provided by a third party, which only 7% of respondents selected as the method with which they are most satisfied.
• This finding strongly suggests OEM support providers must find the optimum balance of self-replaceable versus on-site repair parts. To complicate matters, this balance may vary greatly by customer.
72 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 73. Buying Behavior TBRCustomers face many challenges in replacing failed parts in-house, led byavailability of parts and the challenge of replacing more difficult parts PRINCIPAL CHALLENGES IN REPLACING FAILED PARTS IN HOUSE TBR Forced to self replace due to contract terms/cost Lack of training/in-house expertise Issues with difficulty of replacing parts Replacement parts availability Limited staff resources 0% 10% 20% 30% 40% 50% 60% SOURCE: TBR Desktops/Notebooks Servers• The variety of challenges organizations face in replacing failed parts themselves could be at the root of an increase in requirements for on-site support. This premise is supported by the finding that between 30% and 45% of respondents reported issues with the difficulty of replacing parts, which was cited as a leading challenge. This strongly suggests a growing requirement for on-site support expertise from outside the organization.• IBM customers are less challenged than Dell and HP customers with staff resource issues, but are slightly more challenged when facing parts availability and difficulties with replacing some parts.• In terms of having issues replacing difficult parts, desktop/notebook customers found this as more of an issue than server customers.73 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 74. Buying Behavior TBRPremium support contracts and extended warranties are more commonfor server support than desktop/notebook TYPES OF x86 SERVER SUPPORT CONTRACTS PURCHASED TYPES OF DESKTOP/NOTEBOOK SUPPORTTBR TBR CONTRACTS PURCHASED 80% 90.00% 70% 80.00% 70.00% 60% 60.00% 50% 50.00% 40% 40.00% 30% 30.00% 20% 20.00% 10% 10.00% 0% 0.00% Dell Services HP Services IGS/IBM Services Dell Services HP Services IGS/Lenovo Services Critical/Premium Level Standard Level Critical/Premium Level Standard LevelSOURCE: TBR Acquired at Time of Hardware Purchase Extended Warranty Acquired at Time of Hardware Purchase Extended Warranty SOURCE: TBR On the server side, Dell and HP customers were IGS/Lenovo Services’ customers were the most likely more likely to purchase support contracts at the to purchase critical/premium and extended warranty time of the hardware sale, while IBM customers contracts. HP Services’ customers were the most were evenly spread across the board. likely of the vendors to purchase support contracts at the time of the desktop or notebook sale and standard level support contracts.74 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 75. TBR Appendix B: Support Provider Satisfaction Scores – 1Q08 Through 2Q1175 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 76. Support Provider Customer Satisfaction Scores TBR1Q08 Through 2Q11 BREAK/FIX SERVICES 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.90 5.94 5.89 5.88 5.96 6.06 5.91 5.80 5.92 6.25 6.47 6.14 5.88 5.94 HP Services & Partners 5.93 5.89 5.88 5.94 5.98 5.94 5.91 5.91 5.94 6.24 6.34 6.04 5.85 5.90 IGS & Partners 6.06 5.94 5.96 6.03 5.99 6.10 6.09 6.07 6.09 6.35 6.58 6.23 5.99 6.07 Internal Support Organizations 6.11 6.06 6.10 6.11 6.08 5.96 5.92 5.74 5.75 6.12 6.57 6.47 6.16 6.14 ON-SITE TECHNICAL EXPERTISE 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.96 6.03 5.84 5.74 5.85 5.81 5.65 5.54 5.52 5.95 6.31 6.00 5.84 5.89 HP Services & Partners 5.95 5.88 5.91 5.92 5.99 5.87 5.65 5.29 5.24 5.86 6.20 5.88 5.74 5.73 IGS & Partners 6.00 5.91 5.98 5.97 5.89 5.79 5.59 5.34 5.38 6.02 6.45 6.04 5.82 5.88 Internal Support Organizations 6.09 6.07 6.10 6.11 6.07 5.96 5.85 5.50 5.47 5.88 6.27 6.20 6.05 6.10 ON-SITE RESPONSE TIME 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.79 5.87 5.81 5.73 5.85 6.07 5.81 5.62 5.63 5.84 6.12 5.85 5.68 5.74 HP Services & Partners 5.69 5.63 5.76 5.73 5.73 5.78 5.61 5.40 5.15 5.56 5.96 5.63 5.57 5.67 IGS & Partners 5.83 5.77 5.88 5.88 5.84 5.90 5.67 5.46 5.45 5.85 6.22 5.71 5.48 5.59 Internal Support Organizations 6.21 6.22 6.36 6.29 6.18 6.14 5.98 5.74 5.76 6.12 6.37 6.30 6.21 6.18 TELEPHONE / HELPDESK SUPPORT 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.68 5.77 5.60 5.69 5.83 5.75 5.56 5.51 5.64 5.84 5.81 5.62 5.67 5.74 HP Services & Partners 5.58 5.49 5.55 5.68 5.72 5.59 5.45 5.31 5.28 5.64 5.89 5.72 5.67 5.67 IGS & Partners 5.81 5.83 5.86 5.83 5.71 5.66 5.46 5.29 5.48 5.83 5.92 5.77 5.80 5.83 Internal Support Organizations 5.95 6.06 6.18 6.13 6.00 5.77 5.66 5.44 5.48 5.92 6.10 5.98 5.98 5.90 ONLINE / WEB SUPPORT 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.71 5.71 5.56 5.58 5.74 5.69 5.50 5.46 5.50 5.77 5.76 5.54 5.63 5.72 HP Services & Partners 5.64 5.51 5.38 5.55 5.62 5.55 5.47 5.35 5.34 5.74 5.86 5.57 5.57 5.58 IGS & Partners 5.51 5.59 5.70 5.83 5.77 5.67 5.58 5.47 5.60 5.98 5.94 5.63 5.51 5.49 Internal Support Organizations 5.68 5.70 5.69 5.63 5.63 5.57 5.48 5.42 5.58 5.93 6.01 5.91 5.94 5.96 REPLACEMENT PARTS AVAILABILITY 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 6.04 6.08 5.97 5.95 6.04 5.94 5.81 5.65 5.63 5.92 6.24 6.07 5.85 5.86 HP Services & Partners 5.87 5.78 5.87 5.89 5.84 5.84 5.67 5.39 5.53 5.91 6.19 6.00 5.76 5.80 IGS & Partners 5.94 5.82 5.97 5.99 5.84 5.80 5.68 5.58 5.69 5.95 6.28 6.10 5.86 5.91 Internal Support Organizations 5.41 5.32 5.48 5.41 5.50 5.51 5.41 5.25 5.23 5.71 6.29 6.15 5.84 5.8776 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 77. Support Provider Customer Satisfaction Scores TBR1Q08 Through 2Q11 SUPPORT SERVICES VALUE 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.72 5.78 5.71 5.76 5.83 5.89 5.80 5.69 5.77 6.17 6.20 5.85 5.79 5.78 HP Services & Partners 5.58 5.71 5.66 5.67 5.71 5.73 5.70 5.59 5.63 6.06 6.24 5.90 5.82 5.81 IGS & Partners 5.74 5.63 5.65 5.73 5.64 5.68 5.71 5.69 5.79 6.20 6.32 6.02 5.90 5.90 Internal Support Organizations 5.89 5.92 6.08 6.09 5.99 5.87 5.77 5.56 5.65 6.04 6.30 6.24 6.12 6.03 HARDWARE INSTALLATION / CONFIGURATION 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.57 5.56 5.45 5.65 5.79 5.59 5.47 5.42 5.40 5.67 5.85 5.71 5.69 5.74 HP Services & Partners 5.80 5.79 5.67 5.73 5.87 5.57 5.31 5.14 5.30 5.73 5.84 5.54 5.46 5.51 IGS & Partners 5.92 5.72 5.64 5.60 5.73 5.78 5.52 5.27 5.35 5.62 5.84 5.63 5.46 5.53 Internal Support Organizations 6.02 6.05 6.18 6.12 6.12 5.86 5.57 5.36 5.52 5.97 6.15 6.09 6.04 5.95 AUTOMATION / INSTANT SUPPORT 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.43 5.42 5.21 5.31 5.46 5.51 5.33 5.26 5.43 5.54 5.46 5.43 5.55 5.55 HP Services & Partners 5.59 5.57 5.52 5.56 5.58 5.46 5.32 5.21 5.26 5.53 5.68 5.45 5.44 5.42 IGS & Partners 5.54 5.40 5.48 5.69 5.65 5.63 5.47 5.28 5.39 5.64 5.59 5.37 5.39 5.47 Internal Support Organizations 5.40 5.45 5.62 5.68 5.62 5.64 5.67 5.55 5.56 5.85 5.87 5.85 5.98 5.89 OVERALL SATISFACTION 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 5.73 5.82 5.79 5.72 5.81 6.00 5.94 5.78 5.77 6.09 6.26 5.96 5.81 5.98 HP Services & Partners 5.86 5.88 5.86 5.94 5.98 5.88 5.79 5.74 5.70 5.97 6.25 5.96 5.76 5.86 IGS & Partners 5.98 5.87 5.82 5.93 5.88 5.82 5.82 5.83 5.92 6.17 6.28 6.04 5.92 6.01 Internal Support Organizations 5.99 5.99 6.18 6.14 6.02 5.91 5.81 5.66 5.70 6.02 6.25 6.19 6.16 6.29 Survey Counts 1Q08 2Q08 3Q08 4Q08 1Q09 2Q09 3Q09 4Q09 1Q10 2Q10 3Q10 4Q10 1Q11 2Q11 Dell Services & Partners 160 161 161 160 185 239 234 199 186 192 227 252 253 251 HP Services & Partners 160 160 160 159 175 235 239 201 199 210 233 252 254 248 IGS & Partners 160 159 159 161 186 240 235 201 199 204 227 254 263 253 Internal Support Organizations 160 167 169 169 168 219 242 220 225 212 244 404 510 49977 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 78. TBR Appendix C: Historical Strength & Weakness Analysis for Selected Attributes78 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 79. Historical Strength & Weakness Analysis TBR | Historical Accumulation of Strength & Weakness Determinations, 2Q01 through 2Q11, for Dell, HP and IGS across eight attributes: Services Pricing/Value, Parts Availability, Break/Fix Services, On-site Response Time, Phone Support, Online Support, Technical Expertise and Hardware Install/Configure. Key: each quarterly cell is marked as a Weakness, a Strength or Neutral; Warning means the attribute was not cited as a competitive weakness this quarter due to lack of corroborating evidence; * means that the strength is borderline. SOURCE: TBR 79 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 80. TBR Appendix D: Satisfaction Trends for Key Service & Support Satisfaction Attributes80 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 81. Satisfaction Trends TBR | On-site Break/Fix Services | TBR HISTORICAL SATISFACTION TRENDLINE FOR BREAK/FIX SERVICES, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 81 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 82. Satisfaction Trends TBR | On-site Technical Expertise | TBR HISTORICAL SATISFACTION TRENDLINE FOR ON-SITE EXPERTISE, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 82 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 83. Satisfaction Trends TBR | On-site Response Time | TBR HISTORICAL SATISFACTION TRENDLINE FOR ON-SITE RESPONSE TIME, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 83 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 84. Satisfaction Trends TBR | Phone Support | TBR HISTORICAL SATISFACTION TRENDLINE FOR PHONE SUPPORT, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 84 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 85. Satisfaction Trends TBR | Online Support | TBR HISTORICAL SATISFACTION TRENDLINE FOR ONLINE SUPPORT, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 85 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 86. Satisfaction Trends TBR | Replacement Parts Availability | TBR HISTORICAL SATISFACTION TRENDLINE FOR REPLACEMENT PARTS AVAILABILITY, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 86 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 87. Satisfaction Trends TBR | Support Services Pricing/Value | TBR HISTORICAL SATISFACTION TRENDLINE FOR SUPPORT SERVICES VALUE, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 87 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 88. Satisfaction Trends TBR | Hardware Deployment/Installation/Configuration | TBR HISTORICAL SATISFACTION TRENDLINE FOR HARDWARE DEPLOYMENT, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 88 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 89. Satisfaction Trends TBR | Automated Support (Remotely Managed by Support Provider) | TBR HISTORICAL SATISFACTION TRENDLINE FOR REMOTELY MANAGED SUPPORT, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 89 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 90. Satisfaction Trends TBR | Overall Satisfaction | TBR OVERALL SATISFACTION, 1Q08 through 2Q11 (Dell Services & Partners, HP Services & Partners, IGS & Partners, Internal Support Organizations) SOURCE: TBR 90 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 91. TBR Appendix E: Confidence Interval Graphs91 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 92. Confidence Interval Graphs TBR | Break/Fix Services, 1Q11 and 2Q11 | 92 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 93. Confidence Interval Graphs TBR | On-site Technical Expertise, 1Q11 and 2Q11 | 93 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 94. Confidence Interval Graphs TBR | On-site Response Time, 1Q11 and 2Q11 | 94 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 95. Confidence Interval Graphs TBR | Phone Support, 1Q11 and 2Q11 | 95 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 96. Confidence Interval Graphs TBR | Online Support, 1Q11 and 2Q11 | 96 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 97. Confidence Interval Graphs TBR | Replacement Parts Availability, 1Q11 and 2Q11 | 97 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 98. Confidence Interval Graphs TBR | Support Services Value, 1Q11 and 2Q11 | 98 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 99. Confidence Interval Graphs TBR | Hardware Deployment/Installation/Configuration Services, 1Q11 and 2Q11 | 99 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 100. Confidence Interval Graphs TBR | Automated Support (Remotely Managed by Support Provider), 1Q11 and 2Q11 | 100 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 101. Confidence Interval Graphs TBR | Overall Satisfaction with Technical Support Services, 1Q11 and 2Q11 | 101 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 102. TBR Appendix F: Categorical Responses102 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 103. Category Graphs TBR | Break/Fix Services | SATISFACTION WITH BREAK/FIX BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 103 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 104. Category Graphs TBR | On-site Technical Expertise | SATISFACTION WITH ON-SITE TECHNICAL EXPERTISE BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 104 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 105. Category Graphs TBR | On-site Response Time | SATISFACTION WITH ON-SITE RESPONSE TIME BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 105 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 106. Category Graphs TBR | Phone Support | SATISFACTION WITH PHONE SUPPORT BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 106 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 107. Category Graphs TBR | Online Support | SATISFACTION WITH ONLINE SUPPORT BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 107 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 108. Category Graphs TBR | Replacement Parts Availability | SATISFACTION WITH PARTS AVAILABILITY BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 108 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 109. Category Graphs TBR | Support Services Pricing/Value | SATISFACTION WITH SUPPORT SERVICES BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 109 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 110. Category Graphs TBR | Hardware Deployment | SATISFACTION WITH HARDWARE DEPLOYMENT SERVICES BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 110 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 111. Category Graphs TBR | Automated Support (Remotely Managed by Support Provider) | SATISFACTION WITH REMOTELY MANAGED SUPPORT BY RATINGS CATEGORY, 1Q11 and 2Q11 (ratings of <5, 5, 6 and 7; Dell Services, HPS, IGS/Lenovo Services, In House) SOURCE: TBR 111 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 112. TBR Appendix G: Server/Storage versus Desktop/Notebook Support by Support Provider112 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 113. Satisfaction Trends TBR | Dell Services 2Q11 | TBR DELL SERVICES SATISFACTION, PAST FOUR CALENDAR QUARTERS (Jul-Sep 10, Oct-Dec 10, Jan-Mar 11, Apr-Jun 11) SOURCE: TBR 113 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 114. Satisfaction Trends TBR | HP Services 2Q11 | TBR IGS/LENOVO SERVICES SATISFACTION, PAST FOUR CALENDAR QUARTERS (Jul-Sep 10, Oct-Dec 10, Jan-Mar 11, Apr-Jun 11) SOURCE: TBR 114 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 115. Satisfaction Trends TBR | IBM Global Services 2Q11 | TBR HP SERVICES SATISFACTION, PAST FOUR CALENDAR QUARTERS (Jul-Sep 10, Oct-Dec 10, Jan-Mar 11, Apr-Jun 11) SOURCE: TBR 115 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 116. Satisfaction Trends TBR | Internal Support Organizations 2Q11 | TBR IN-HOUSE SERVICES SATISFACTION, PAST FOUR CALENDAR QUARTERS (Jul-Sep 10, Oct-Dec 10, Jan-Mar 11, Apr-Jun 11) SOURCE: TBR 116 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 117. TBR Appendix H: Study Design & Methodology117 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 118. Study Design & Methodology TBR | TBR's Corporate IT Service & Support Customer Satisfaction Study is based on the views of those who manage in-house support services and/or work with OEM-provided support.
Companies interviewed for TBR's Corporate IT Service & Support Satisfaction Study are required to have a minimum of 200 PCs (combined total servers, desktops and notebooks) installed. In contrast, TBR's product-related satisfaction studies require a minimum of 500 PCs for most covered brands. This makes the Service & Support study a tool best suited for evaluating the experiences of midsized corporations, whereas the product-related studies extend to the experiences of enterprise customers. The reason for the differing criteria is that larger organizations tend to rely more fully (sometimes entirely) on their internal support staff. With this in mind, study subscribers should not expect the results of this study to mirror TBR's product-related satisfaction studies, including the x86-based Server, Corporate Notebook and Corporate Desktop Customer Satisfaction studies.
Throughout this report, TBR refers to two types of support providers:
INTERNAL SUPPORT ORGANIZATIONS: Companies with in-house technical support staff (systems manufacturers often refer to these customers as "self-maintainers"); TBR's study focuses primarily on internal support organizations that perform a number of support functions with their own staff, supplemented by OEM-provided support as needed.
OEM SUPPORT PROVIDERS: Dell Services, HP Services, IBM Global Services and Lenovo Services perform repairs and basic maintenance for customers based on support service portfolio offerings.
• Dell Services and its authorized service partners provide technical support to Dell customer sites for servers, notebooks and/or desktop PCs.
• HP Services encompasses services for the Industry Standard Server group as well as for the Personal Systems Group (desktops and notebooks).
• IGS comprises support services for IBM server customers as well as for Lenovo desktop and notebook PC customers. Lenovo customers are serviced by IGS and Lenovo Services, in addition to a network of third-party service delivery partners.
Additional Screening Criteria for the Corporate IT Service & Support Satisfaction Study: 1. Has your company utilized any on-site, phone or web support for Dell, HP, IBM or Lenovo for desktops, servers or notebooks in the past three months? 2. Is your company utilizing in-house technical support? 3. Are you personally involved in evaluating, recommending or purchasing support services for desktops, servers and notebooks at your company or site? Or, if your site uses internal support teams only, are you involved with the supervision of these teams?
118 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 119. Study Design & Methodology TBR | Reporting Structure Defined
TBR generally reports on the combined results of server, notebook and desktop support; report sections break up the study results by segment wherever referenced (server/storage support, desktop/notebook support).
Combined Study Results: Sample size = approximately 250 interviews per group. Covers satisfaction with x86-based server as well as desktop/notebook support delivered by: 1. Dell Services; 2. HP Services (includes both TSS and PSG groups); 3. IGS (includes both IBM server support and Lenovo desktop/notebook support); 4. Internal Support Organizations.
x86 Server/Storage Support, wherever referenced: Sample size = approximately 125 interviews per group. Covers satisfaction with x86-based server support delivered by: 1. Dell Services (Enterprise Support); 2. HP Services (TSS); 3. IBM/IGS Services; 4. Internal Support Organizations.
Desktop/Notebook Support, wherever referenced: Sample size = approximately 125 interviews per group. Covers satisfaction with desktop/notebook support delivered by: 1. Dell Services (Client Support); 2. HP Services (PSG); 3. Lenovo Services; 4. Internal Support Organizations.
119 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 120. Study Design & Methodology TBR | 2Q11 Sample Overview
• TBR's 2Q11 Corporate IT Service & Support Satisfaction Study is based on interviews with qualified respondents at 642 medium and large U.S. and Canadian establishments, primarily MIS/IT, systems management and purchasing managers.
• A number of the respondents are responsible for purchasing services from multiple support providers for their company or site, and thus were interviewed twice (once for each brand). Most respondents rated, at the very least, their internal support organization and one third-party provider.
• Consequently, 1,011 interviews were completed for the reporting period. This number has increased over previous reporting periods because TBR boosted the number of required interviews to better represent the stated experiences of customers receiving server-related versus desktop/notebook-related support events.
• Because many of the larger companies rely exclusively on their internal support teams, the requirements for this study differ from TBR's x86-based server, notebook and desktop satisfaction studies. The minimum requirement is an installed base of 200 systems for the Service & Support Study (versus 500 for the standard studies). Respondents are screened to include only those who recommend or evaluate OEM support services for their organization and also manage an internal support staff.
• The service and support interviews for the reporting period were distributed as follows: 251 Dell Services customer interviews; 248 HP Services customer interviews; 253 IBM Global Services customer interviews; and 259 internal support organization interviews. Interviews were conducted between January 1, 2011 and June 30, 2011.
Methodology & Sample | TBR Standard Error at 95% Confidence Level per Segment, Average Measurements Across All Attributes (Service & Support / Sample Size / Standard Error): All Providers 1,011, 1.00%; Dell & Partners 251, 1.99%; HP & Partners 248, 2.01%; IGS & Partners 253, 1.57%; Internal Support Organizations 259, 1.74%. SOURCE: TBR
120 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
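As a rough illustration only (TBR's exact calculation is not published in this appendix), a margin of error like those in the table above can be approximated as the 95% margin of error of the mean rating, expressed as a percentage of the 7-point scale. The function name, scaling choice and sample data below are assumptions, not TBR's method or data:

```python
# Minimal sketch: 95% margin of error of a group's mean rating, as a percent of the scale.
import math

def standard_error_pct(ratings, scale_max=7.0, z=1.96):
    n = len(ratings)
    mean = sum(ratings) / n
    sample_var = sum((r - mean) ** 2 for r in ratings) / (n - 1)   # sample variance
    margin = z * math.sqrt(sample_var / n)                          # 95% margin of error of the mean
    return margin / scale_max * 100                                 # expressed as % of the 7-point scale

ratings = [6, 5, 7, 6, 6, 5, 6, 7, 5, 6] * 25    # 250 hypothetical 7-point ratings for one group
print(f"{standard_error_pct(ratings):.2f}%")      # prints a value on the order of the table above
```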
  • 121. Study Design & Methodology TBR | Number of Employees
TBR Average Number of Employees at the Companies Surveyed (Number of Employees / Percentage of Respondents): <500: 31.6%; 500–1,000: 18.4%; 1,000–4,999: 22.4%; 5,000–9,999: 9.9%; 10,000–14,999: 6.6%; 15,000–19,999: 4.1%; 20,000–49,999: 4.1%; 50,000–74,999: 2.2%; 75,000–99,999: 0.3%; 100,000+: 0.5%. Average Number of Employees: 6,685. SOURCE: TBR
121 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 122. Study Design & Methodology TBR | Type of Business
TBR Types of Businesses Represented in the Study (Type of Business / Percentage of Respondents): Agriculture, Forestry, Fishing, Hunting 3%; Pharmaceuticals 3%; Transportation Service 4%; Public Utilities 4%; Mining, Construction 5%; Wholesale Trade 5%; Education 7%; Finance, Insurance, Real Estate 7%; Retail Trade 7%; Information Service (including software development) 7%; Manufacturing - Discrete (products, machinery, computers, furniture, etc.) 7%; Healthcare 7%; Government 8%; Other Services 9%; Manufacturing - Process (materials) 10%. SOURCE: TBR
122 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 123. Study Design & Methodology TBRJob Titles/Responsibilities123 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 124. Study Design & Methodology TBR | Purchasing – Past, Present & Future
Units Installed and Planned for Purchase by Form Factor (TBR)
Installed Base (Desktops / x86-Based Servers / Notebooks): Enterprise Sum 1,013,259 / 127,905 / 433,287; Enterprise Mean 1,593 / 201 / 681; Division Sum 3,770 / 595 / 4,114; Division Mean 628 / 99 / 686.
Purchase Intent (Desktops / x86-Based Servers / Notebooks): Enterprise Sum 140,078 / 13,587 / 77,797; Enterprise Mean 220 / 21 / 122; Division Sum 508 / 112 / 691; Division Mean 85 / 19 / 115.
Percent of Installed Base Replaced (Desktops / x86-Based Servers / Notebooks): Enterprise 13.82% / 10.62% / 17.96%; Division 13.47% / 18.82% / 16.80%. SOURCE: TBR
The 2Q11 study sample represents 1.6 million units (servers, desktops, notebooks) installed and a purchase intent for an additional 233,000 units during the next 12 months.
124 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
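The "Percent of Installed Base Replaced" row follows directly from dividing purchase intent by installed base. A quick arithmetic check using the Enterprise Sum figures from the table above:

```python
# Verifies the Enterprise "Percent of Installed Base Replaced" values shown in the table.
installed = {"desktops": 1_013_259, "servers": 127_905, "notebooks": 433_287}
intent    = {"desktops":   140_078, "servers":  13_587, "notebooks":  77_797}

for form_factor in installed:
    pct = intent[form_factor] / installed[form_factor] * 100
    print(f"{form_factor}: {pct:.2f}% of the installed base planned for replacement")
# Matches the Enterprise row: desktops 13.82%, servers 10.62%, notebooks 17.96%.
```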
  • 125. TBR Appendix I: Analytical Procedures125 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 126. Analytical Model TBR | Satisfaction Ratings
• The customer satisfaction analysis was based on several lines of questioning. Respondents were asked to grade their vendor across a series of attributes (listed below) for each brand the surveyed corporations purchased in the most recent buying cycle. At the conclusion of the attribute testing, respondents were asked to provide a rating based on a 7-point Likert scale: 1 = Failure (Totally Dissatisfied), 2 = Very Poor, 3 = Poor, 4 = Fair (Mediocre), 5 = Good, 6 = Very Good, 7 = Excellent (Totally Satisfied).
• Respondents were also asked to indicate the relative importance of each of the attributes in choosing their brand. These responses were given on a 1- to 5-point scale, with 1 meaning not at all important and 5 meaning very important. These ratings determined the gap between vendor satisfaction and importance, or how well the vendor manages expectations.
• Respondents were then asked to indicate on a 1- to 5-point scale the degree of their loyalty toward their primary vendor(s). Finally, respondents were asked whether their corporation switched from one vendor to another during the past 12 months, and if so, which vendors were involved and why a change was made.
126 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 127. Analytical Model TBR | Measured Attributes
Customer satisfaction and relative importance were measured for each of the following attributes. Proportions of customers utilizing each service (based on percentage responding) are also indicated.
Service / % Responding: On-Site Break/Fix Services 87.98%; On-Site Technical Expertise 86.02%; On-Site Response Time/Commitment 86.28%; Telephone/Help Desk Support 87.28%; Online Support 85.63%; Replacement Parts Availability 86.32%; Support Services Pricing/Value 87.33%; Hardware Installation/Configuration 72.81%; Automated Diagnostics 65.08%; Overall Satisfaction 87.98%.
127 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 128. Analytical Procedures TBR | Satisfaction Statistics
• A table of satisfaction statistics (including mean, standard deviation, standard error, the range around the mean representing the 95% confidence interval and the standard t-Test) describes customer satisfaction for each vendor in each attribute area, with special emphasis on overall satisfaction. A series of t-Tests was performed on each vendor against the sum of its competitors, and the attribute areas where significant differences in score were indicated are marked. The t-Test compares two means to determine whether one mean is significantly different from the other, taking variability of response into consideration. The purpose of these tests is to determine whether the observed differences between group means (a group being the set of customers of one vendor) can be entirely explained by random or natural variation within the sampled groups of customers; if they cannot, the observed differences are real. TBR uses an independent sample t-Test assuming unequal variances, or the standard student's t-Test. Those attributes with an alpha level of 0.05 or less are cited, indicating there is a 95% chance that concluding the two means are different is correct. A t-Test of the grand mean (the mean of all scores for all attributes combined) serves to determine whether any of the vendors' overall scores tend to run higher or lower than competitors' scores.
• As a backup to the above tests, an alternate test is used for confirmation purposes: a one-way analysis of variance (ANOVA) followed by the Bonferroni correction. The variation within a group of customers is first determined in these one-way ANOVA tests. These variations are then compared to the variability between the groups (e.g., between Dell, HP and IBM customers). The between-group variation is measured by the sum of the squared differences between the sample mean of each group and the grand mean, which is then weighted by the sample size in each group. The between-group variation will be larger than the within-group variation (variation within each specific customer group) if there are meaningful differences between the means. The attributes that pass this additional test are also cited in the report. While the one-way ANOVA identifies which attributes are affected by differing means according to customer group, further tests, such as the Bonferroni correction, identify exactly which means differ from one another.
128 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
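For readers who want to see the mechanics, a minimal sketch of the unequal-variance (Welch's) t-Test described above, using SciPy and hypothetical rating lists rather than TBR survey data:

```python
# Minimal sketch of the independent-sample, unequal-variance t-Test on 7-point ratings.
from scipy import stats

# Hypothetical 7-point satisfaction ratings for one attribute.
vendor_scores = [6, 7, 5, 6, 6, 7, 5, 6, 7, 6]           # customers of the vendor under test
competitor_scores = [5, 6, 5, 6, 5, 7, 5, 5, 6, 6, 5]    # customers of all other vendors combined

# equal_var=False gives Welch's t-Test, i.e., unequal variances as described above.
t_stat, p_value = stats.ttest_ind(vendor_scores, competitor_scores, equal_var=False)

# An attribute is flagged when the p-value is 0.05 or less.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant: {p_value <= 0.05}")
```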
  • 129. Analytical Procedures TBR | GAP Analysis
• The competitive GAP analysis measures the gap between a vendor's customer satisfaction for each attribute area and the expectations (importance ratings) of the market (all respondents). The standard against which each vendor is measured is the average size of that gap for all server vendors. The GAP analysis compares vendor satisfaction per attribute against importance per attribute among the vendor's customer base, relative to overall satisfaction for all vendors per attribute against overall importance for all vendors per attribute. The formula for each attribute area independently is as follows:
GAP = [Vendor Importance * (7 - Vendor Satisfaction)] / [Grand Mean Importance * (7 - Grand Mean Satisfaction)] * 100
• The result of the above is graphed on a scale where values between 40 and 80 indicate where the vendor exceeds customer expectation; values between 81 and 120 show where the vendor fully meets expectation; and values greater than 120 indicate where the vendor falls short of expectation.
• A second GAP analysis (the standard GAP analysis) considers how each systems vendor manages the expectations of its own customer base. For each vendor independently and for each attribute area, the mean satisfaction rating is graphed next to the mean importance rating (adjusted from a 5-point scale to the 12-point scale used for customer satisfaction). There are three possible outcomes: satisfaction meets customer expectation (bar graphs are equal or within a range where the gap is not significant); satisfaction falls short of expectation (indicating areas where the systems vendor may want to consider focusing greater efforts on raising satisfaction); and satisfaction exceeds expectation (indicating attribute areas where the systems vendor may be focusing more than is necessary).
• Another GAP analysis (the Improvements GAP analysis) is focused on determining the areas where the vendors need to set up improvement programs and areas where vendors may be able to pull back resources. It uses a formula similar to the competitive GAP analysis; however, the denominator becomes the grand mean importance and satisfaction for the vendor across all of the attributes. In this test, TBR compares the gaps for each of the individual attributes against the average gap for the vendor. Areas where the gaps measure wider than the average are areas where the vendor most urgently needs to focus its improvement efforts.
129 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
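A minimal sketch of the competitive GAP formula above; the function name and the importance/satisfaction values are hypothetical illustrations, not TBR data:

```python
# Minimal sketch of the competitive GAP index for a single attribute (7-point satisfaction scale).
def competitive_gap(vendor_importance, vendor_satisfaction,
                    grand_importance, grand_satisfaction):
    vendor_gap = vendor_importance * (7 - vendor_satisfaction)     # shortfall weighted by importance
    market_gap = grand_importance * (7 - grand_satisfaction)       # same measure for the whole market
    return vendor_gap / market_gap * 100

# Hypothetical attribute: importance on the 1-5 scale, satisfaction on the 7-point scale.
gap = competitive_gap(vendor_importance=4.2, vendor_satisfaction=6.1,
                      grand_importance=4.3, grand_satisfaction=5.8)

# Per the report's bands: 40-80 exceeds expectation, 81-120 meets it, >120 falls short.
if gap <= 80:
    verdict = "exceeds expectation"
elif gap <= 120:
    verdict = "meets expectation"
else:
    verdict = "falls short of expectation"
print(f"GAP = {gap:.0f} ({verdict})")
```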
  • 130. Analytical Procedures TBR | Trend Analysis
A trend analysis compares each vendor's customer satisfaction scores for the current reporting period separately against those from both the preceding reporting period and the reporting period prior to that. By comparing against both reporting periods, TBR is able to determine whether any changes are indicative of a real change in historical pattern. This graph uses a 95% confidence-interval technique: each vendor's score is plotted with the mean indicated in the middle, from which lines extend in both directions by the standard error around the mean. This analysis is used to determine the reasons a vendor may move up or down in the rankings from previous reporting periods: is it because the vendor improved, or because the competition declined in customer satisfaction? The analysis is also used to pinpoint potential problem areas or areas where marked improvement is evident.
130 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
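A small sketch of how the plotted mean and its standard-error bars could be derived for one vendor in one reporting period; the helper name and the ratings list are hypothetical:

```python
# Minimal sketch: mean and standard error of the mean for a set of 7-point ratings.
import math

def mean_and_error_bar(scores):
    """Return (mean, standard error of the mean) for a list of ratings."""
    n = len(scores)
    mean = sum(scores) / n
    variance = sum((s - mean) ** 2 for s in scores) / (n - 1)   # sample variance
    return mean, math.sqrt(variance / n)

quarter_scores = [6, 5, 7, 6, 6, 5, 6, 7, 5, 6]   # hypothetical ratings for one quarter
mean, se = mean_and_error_bar(quarter_scores)
print(f"plot point: {mean:.2f}, bar from {mean - se:.2f} to {mean + se:.2f}")
```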
  • 131. Analytical Procedures TBR | Numeric Weighting Model: support provider segment = 10. 131 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 132. Analytical Procedures TBR | Support Provider Ranking Positions
Vendor ranking positions are determined primarily by the average weighted satisfaction index positions, with a minimum distance of 1.0% generally required for TBR to assign separate ranking positions to any two vendors. The determination of ranking positions does not end here, however; additional factors, such as the number of competitive strengths versus weaknesses, also play into the final decision, which is a team effort by TBR principals. Consequently, two vendors' weighted satisfaction index positions can sit less than 1.0% apart, yet the vendors may still be assigned separate ranking positions based on the additional factors stated above.
Competitive Strength & Weakness Table
A competitive strength and weakness table is the final result of all the above analysis. The table points to the attribute areas that are definite strengths or weaknesses for each vendor. Areas of neutrality are those attributes where the vendor's customer satisfaction performance is about average. The formula used for the determinations is: each attribute receives a score of 0 for neutrality, +1 for a positive and -1 for a negative. Three analyses are reviewed: the t-Test analysis (0 for null, +1 for significantly higher scores and -1 for significantly lower scores); the competitive GAP analysis (0 for meeting expectation, +1 for exceeding and -1 for falling short); and the vendor GAP analysis. The standard t-Test results are compared to those of the more stringent Bonferroni analysis, and those passing both tests are given an extra point. The three scores for each attribute are then summed. Any attribute with a total score of +2 or -2 is cited as a strength or weakness; total scores between these values are cited as neutral areas. Those with scores of +4 or -4 are areas of particularly pronounced strength or weakness. Marginal determinations (warnings or marginal strengths) come about when the determination is borderline (i.e., only the first t-Test was passed, or the t-Test was passed as a potential area of strength but a poor GAP rating negated it).
132 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
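A rough sketch of the summation logic described above, assuming each component analysis has already been reduced to -1, 0 or +1; the helper name, signature and example inputs are hypothetical, not TBR's implementation:

```python
# Minimal sketch of the strength/weakness scoring: three -1/0/+1 component scores,
# plus an extra point in the t-Test direction when the Bonferroni check also passes.
def strength_weakness(t_test, competitive_gap, vendor_gap, passed_bonferroni=False):
    """Each score argument is -1, 0 or +1; returns (total, determination)."""
    total = t_test + competitive_gap + vendor_gap
    if passed_bonferroni and t_test != 0:
        total += t_test                          # bonus point in the direction of the t-Test result
    if total >= 2:
        label = "strength" if total < 4 else "pronounced strength"
    elif total <= -2:
        label = "weakness" if total > -4 else "pronounced weakness"
    else:
        label = "neutral"
    return total, label

# Hypothetical attribute: higher t-Test score confirmed by Bonferroni, exceeds the
# competitive GAP expectation, neutral on the vendor GAP.
print(strength_weakness(t_test=1, competitive_gap=1, vendor_gap=0, passed_bonferroni=True))
# -> (3, 'strength')
```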
  • 133. TBR Appendix J: Survey Instrument133 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 134. Survey Instrument TBR2Q11 Survey Instrument SCREENERS134 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 135. Survey Instrument TBR2Q11 Survey Instrument135 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 136. Survey Instrument TBR2Q11 Survey Instrument136 Service & Support Customer Satisfaction | Second Calendar Quarter 2011 ©2011 Technology Business Research Inc.
  • 137. TBRTechnology Business ResearchTechnology Business Research is a different kind of research company. Our bottoms-up approach provides a lookat the technology industry unlike anything you’ve seen before. We analyze company performance in professionalservices, networking and mobility, computing and hardware, and software on a quarterly basis, leveraging ourdata to create industry benchmarks and landscapes that provide a business perspective on leaders and laggardsand their business plans. We are experts in the business of technology. “I never go into a negotiation with a vendor until I have reviewed TBR’s quarterly reports. Understanding a vendor’s profit margin by business unit gives me an information edge in formulating my negotiation strategy and has saved my organization countless dollars!” – Telecom End User “We are using Technology Business Research’s operational metrics and management consulting taxonomy to drive our growth strategy and resources for our management consulting business…” - Top 5 Global Technology Company ©2012 Technology Business Research Inc.
  • 138. TBR For more information on accessing new TBR reports, please contact James McIlroy at mcilroy@tbri.com or at 603-758-1813. Follow our analysts on @TBRinc. Read our analysts' commentaries at @TBRincNewsroom. Watch our recorded webinars at http://www.youtube.com/user/TBRIChannel?feature=mhee ©2012 Technology Business Research Inc.