Project Examples

  1. Excel PowerPivot Based Dashboard
     Problem: The client was spending significant amounts on manual monthly analysis of marketing campaign data and results, but had only a small budget for IT development.
     Solution: Create a PowerPivot-based dashboard combining data from multiple sources, including SQL databases, data cubes, SharePoint lists, and Excel spreadsheets.
     Results: An aesthetically pleasing, semi-automated dashboard that dramatically decreased monthly reporting costs.
  2. Excel PowerPivot Based Dashboard
  3. Partner Segmentation Tool
     Problem: The prior year's partner segmentation tool (narrowing 500,000 partners to 20,000 managed partners) cost roughly $2 million and did not deliver as expected.
     Solution: Create an easy-to-use PowerPivot tool combining data from 8 sources and applying strategy-based selection criteria through roughly 100 DAX expressions. (A rough illustrative sketch of this approach appears after the slide list.)
     Results: A well-received PowerPivot-based tool, delivered for 5% of the prior year's cost, that provided more and better data than was previously available.
  4. Partner Category Recommendation Sample
  5. BI Solution
     Problem: Pulling data for analysis as the project's launch manager, analyst, and strategist grew to consume 75% of each month.
     Solution: Learn SQL and Microsoft Reporting Services to automate the monthly reporting. (A rough sketch of the automation idea appears after the slide list.)
     Results: The monthly reporting cycle was reduced from 3 weeks to 3 days with better data, and teams asked “which BI Team had developed the solution”.
  6. Sample Reporting Services Report
  7. Sample Reporting Services Report
  8. Sample Reporting Services Report
  9. Measuring Customer Experience
     Problem: The client's market share is under threat from several companies that offer a more intimate customer experience.
     Solution: Benchmark the customer experience of 8 targets in various industries to identify best practices.
     Results: CONFIDENTIAL
  10. Benchmarking: We Benchmarked 9 Companies to Assess Industry Best Practices Around Specific Customer Interactions
      [Chart: "Customer Experience Landscape", plotting the benchmarked companies on a Reactive-to-Proactive and Transaction Oriented-to-Relationship Oriented grid, with Attractors and Deflectors regions.]
      Experience Takeaways:
      • Proactive + Relationship Oriented = Customer Centric
      • Loyalty and higher Lifetime Value (LTV) = Approachability
      • 56% of the companies benchmarked have a 360-degree view but aren't fully utilizing it
      • Complexity of business is not an excuse for bad customer experience
      • A good product does not equal a good customer experience
      The Fine Print:
      • The number of company interactions is not statistically significant, and scoring may vary with further interactions
      • The relative location of each company is a good representation of how "Joe Customer" interpreted his interactions
  11. Methodology
      • Define: Identified a Customer Service Life Cycle* encompassing the customer experience from pre-purchase to disposition.
      • Develop: Created realistic customer scenarios to drive interactions along the Customer Service Life Cycle*.
      • Interact: Used all available channels (in-store, web, phone, and chat) to research, purchase goods and services, create and resolve support issues, and return products or cancel services with the target companies.
      • Measure: Developed 70+ measurement attributes within the phases of the Customer Service Life Cycle* to allow comparison.
      • Identify: Performed gap analysis to extract best practices for inclusion in the business case.
      Primary research: the scenario-driven interactions above. Secondary research: utilized Forrester and Gartner and performed extensive internet research.
      Target Companies: Microsoft, Apple, Dell, Amazon, Best Buy, Google, Comcast, T-Mobile, and Fidelity.
      *"The Customer Service Life Cycle: A Framework for Improving Customer Service through Information Technology", Gabriele Piccoli, Ph.D., 6/01/2001
  12. Customer Service Lifecycle* Phases
      These lifecycle phases were used to identify and score interactions between the customer and the target companies:
      1. Establish Need
      2. Determine attributes
      3. Source selection
      4. Ordering
      5. Authorization & Payment
      6. Acquisition
      7. Testing & Acceptance
      8. Integration
      9. Usage monitoring
      10. Upgrading
      11. Maintain
      12. Transfer & Disposal
      13. Auditing & Accounting
      *"The Customer Service Life Cycle: A Framework for Improving Customer Service through Information Technology", Gabriele Piccoli, Ph.D., 6/01/2001
  13. Customer Interaction Scenarios and Measurement Attributes
      Example Customer Interaction Scenarios:
      • Software Company #1: Utilized the pre-purchase selection group, attempted 1 chat support incident, and completed 2 technical support calls, plus prior experience and system knowledge.
      • Software Company #2: Purchased and returned 2 laptops, 2 WiFi routers, and 1 music player; applied for a credit card; utilized multiple free selection services; trialed several pay services; and called for support 5 times.
      Example attributes used to measure each scenario for each benchmarked company (over 70 were used), each scored on a -2 to +2 scale:
      • Attribute 66: Does the company provide access to communities for additional information? Scored from "No community" at the low end, through "Yes, some form of information sharing", up to "Extensive forums or discussion groups with ratings on participants, moderators, and other methods to judge the value of the information" at the high end.
      • Attribute 31: Does the customer register ownership of the product? Scored from "Forced registration with no customer benefits; company unaware of the owner of the product even with registration" at the low end, through "No registration" and "Customer registers with minimal information", up to "Registration encouraged but not required for support and offers tangible benefits to the customer" at the high end.
      (A rough sketch of rolling such scores up by company and lifecycle phase appears after the slide list.)
  14. Meet Joe Customer
      The part of Joe Customer was played by Joe Dion.
      Joe Customer is a particularly savvy customer pretending to be dense and needy. Joe Customer kept his identity hidden as much as possible throughout the interactions (unless, of course, that would create legal issues or impede completion of the scenario); even when his real name was revealed, he claimed to have never heard of his consulting firm and to have no knowledge of this project. In fact, if asked, he was a freelance writer (yes, several targets did ask during the needs assessment stage…). [Yes, I even removed my badge before entering stores!]
      Joe Customer acted within defined scenarios to fully test the Customer Service Lifecycle, from before the customer identifies a need, through purchase and ownership, to retirement and disposal. Joe Customer also took liberties with the scenarios when circumstances allowed an even deeper interaction. Joe Customer took every opportunity to ask extra questions, particularly dumb questions to demonstrate his naivety, providing even greater opportunities for the targets to provide assistance.
      Joe Customer challenged any cancellation and restocking fees, but was cordial in his challenge (the experience may have differed had he become enraged). Where relationships already existed, new relationships were established to avoid historical taint (new email accounts, new affinity accounts, new buyer accounts, etc.).
      Joe Customer kept extensive notes of his experience, including screenshots, which can be seen at <SHAREPOINT site here>.
      We hope you enjoy Joe Customer's interactions…we certainly did.
  15. Current State Competitive Customer Experience: Joe Customer© buys a computer
      Approachable and Resolution Oriented = Positive Customer Experience
  16. Current State Competitive Customer Experience: Joe Customer© buys a computer
      Approachable and Resolution Oriented = Positive Customer Experience
  17. Measuring "Confusion"
      Problem: Competitors are intentionally creating confusion about the benefits of open source software and software based on open standards in order to compete against the client in selling to the public sector.
      Solution: A methodology to measure "confusion" over the terms open source and open standards, to determine whether the "confusion" is leading to open source preferences in legislative activities.
      Results: A correlation was found to exist between "confusion" about the benefits of open source software and government procurement mandates; the client developed strategies to counteract the "confusion" and neutralize the procurement preferences.
  18. Measuring Confusion
      • Define: Read a sample of legislative documents to identify an approach to measuring confusion over the terms open source and open standards.
      • Develop: Identified a set of confusion points (statements which attributed the value of open standards to open source) and a body of measurement rules to allow objective treatment of all documents.
      • Measure: Read documents (49 from 25 countries) to identify confusion points in their native language (English, French, Spanish, Italian, and Portuguese) and verified understanding using translation software.
      • Validate: Provided the client's leading Open Standards/Open Source law firm with the documents and measurement system for comparative analysis. Reconciled all differences.
      • Analyze: Analyzed the results of all documents and identified correlation between confusion and preference, as well as the areas of greatest confusion to be targeted with PR and training efforts. (A rough sketch of this tabulation appears after the slide list.)
      Open Standard: A standard that is developed or ratified in a consensus-based process, which is available to implementers, and which has the goal of requiring that the IP necessary to implement the standard be licensed to implementers on reasonable and non-discriminatory terms.
      Open Source: Software that is distributed according to a licensing model that grants access to the source code of a program and requires that the source be made available to third parties for further distribution and modification.
  19. Greater Confusion = Hard Preference
      [Chart: "Preference by Confusion Points". Percentage of documents with a Hard Preference, Soft Preference, or Neutral stance, by number of confusion points found (0 through 6); documents per confusion-point count: 8, 10, 7, 6, 7, 4, 1.]
  20. Measuring "Awareness & Understanding"
      Problem: The client faces billions of dollars in risk if employees violate the intellectual property of other parties. The client has developed intellectual property policies and expends substantial training effort to reduce the risk of intellectual property violations, but had no way of knowing whether employees knew or understood the policies.
      Solution: A methodology to measure awareness and understanding of intellectual property among employees, using a scenario-driven quiz delivered as a survey, with a 60% response rate among those sampled.
      Results: Employees were found to be more knowledgeable than expected; the results were nonetheless used to increase training efforts in problem areas, and attorneys utilized the information in working with their internal clients.
  21. Demonstrating Understanding
      Scenario: Marc Zimmerman was just hired by Microsoft from a competing company. He was hired primarily because he led a group at his prior employer that was developing the same types of products he will now be working on at Microsoft. Marc developed many of the ideas that formed the basis of those products and still has copies of schematics, software code, and other items. To speed up the development process, Marc is considering making this information available to his team. (A realistic scenario developed in conjunction with attorney subject matter experts.)
      Marc should (select one answer):
      • Freely provide the information to everyone in his group
      • Not use this to create products, but can distribute it to show what the competition is doing
      • Say he came up with the ideas after leaving his last job and freely distribute them
      • Not provide or use the information from his prior employer in his Microsoft projects
      • Have an LCA attorney present when he discloses information from his prior employer
      (Survey respondents identified what they believed to be the acceptable action in the given scenario.)
      Why? (select one answer):
      • It's difficult to prove where ideas came from, and he was hired by Microsoft for his prior experience
      • This is good competitive intelligence and usable as long as it doesn't go into Microsoft products
      • Marc's previous employer can keep the products Marc developed, but not the ideas he developed
      • An attorney present creates an attorney-client privilege and protects Microsoft from legal problems
      • This may violate employment agreements with both companies and violate trade secret law
      (Next, the respondent selected an appropriate justification. Each possible action in the first question had a logically paired reason to demonstrate policy understanding in the second question.)
      Completion was anonymous and voluntary, yet designed to be entertaining and challenging; the survey elicited responses from 60% of the targeted population. (A rough sketch of scoring such paired answers appears after the slide list.)
  22. Different Categories of Awareness = Different Training Challenges
      [Chart: share of respondents classified as Intuitive, Aware, Confused, or Unaware for two trade secret scenarios (Trade Secret 1 and Trade Secret 2), broken out by role: BizDev, Developer, PM, Marketing, Tester, and All.]
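
Illustrative Sketches

The partner segmentation tool (slides 3-4) combined data from 8 sources and applied strategy-based selection criteria through roughly 100 DAX expressions in PowerPivot. The sketch below is a minimal Python/pandas analogue of that idea, not the original tool: the file names, column names, and thresholds are all invented for illustration.

```python
import pandas as pd

# Hypothetical extracts standing in for the real sources (SQL databases,
# data cubes, SharePoint lists, Excel); names and columns are invented.
partners = pd.read_csv("partners.csv")        # partner_id, region, tier
revenue = pd.read_csv("revenue.csv")          # partner_id, trailing_12m_revenue
certs = pd.read_csv("certifications.csv")     # partner_id, cert_count

# Combine the sources into a single model, much as PowerPivot relates tables.
model = (partners
         .merge(revenue, on="partner_id", how="left")
         .merge(certs, on="partner_id", how="left")
         .fillna({"trailing_12m_revenue": 0, "cert_count": 0}))

# Strategy-based selection criteria; the real tool expressed these as DAX
# measures, and the thresholds here are placeholders.
model["managed"] = (
    (model["trailing_12m_revenue"] >= 250_000)
    | ((model["cert_count"] >= 3) & (model["tier"] == "Gold"))
)

managed = model[model["managed"]]
print(f"{len(managed)} of {len(model)} partners selected for management")
```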
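
The BI solution (slides 5-8) automated monthly reporting with SQL and Microsoft Reporting Services. The sketch below is a rough Python analogue of the automation idea only (query once, aggregate, write a report file); the database schema, table names, and file names are assumptions, not the original implementation.

```python
import sqlite3

import pandas as pd

# Stand-in database; the original solution queried SQL Server and rendered
# the output with Reporting Services.
conn = sqlite3.connect("launch_metrics.db")

# One scheduled query replaces the manual monthly data pull
# (table and column names are hypothetical).
monthly = pd.read_sql_query(
    """
    SELECT region, product, SUM(units) AS units, SUM(revenue) AS revenue
    FROM launch_results
    WHERE report_month = ?
    GROUP BY region, product
    """,
    conn,
    params=("2011-06",),
)

# Write the aggregated report for distribution.
monthly.to_csv("launch_report_2011-06.csv", index=False)
print(monthly.head())
```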
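
The customer experience benchmark (slides 10-13) scored each company on 70+ attributes, each on a -2 to +2 scale, within the phases of the Customer Service Life Cycle. Below is a minimal sketch of rolling such scores up by company and by phase; every score shown is a placeholder, not the study's data.

```python
from collections import defaultdict

# (company, lifecycle_phase, attribute_id, score) with scores on the -2..+2
# scale; all values below are invented placeholders.
scores = [
    ("Company A", "Establish Need", 66, 1),
    ("Company A", "Usage monitoring", 31, -1),
    ("Company B", "Establish Need", 66, 2),
    ("Company B", "Usage monitoring", 31, 0),
]

by_company = defaultdict(list)
by_phase = defaultdict(list)
for company, phase, _attr, score in scores:
    by_company[company].append(score)
    by_phase[(company, phase)].append(score)

# Overall position of each company in the experience landscape...
for company, vals in sorted(by_company.items()):
    print(company, round(sum(vals) / len(vals), 2))

# ...and per lifecycle phase, which is what a gap analysis compares.
for (company, phase), vals in sorted(by_phase.items()):
    print(company, phase, round(sum(vals) / len(vals), 2))
```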
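
The confusion study (slides 17-19) counted predefined confusion points in each legislative document and related the counts to whether the document expressed a hard, soft, or neutral open source preference. The sketch below tabulates and correlates such data; the document list is a placeholder, not the study's 49-document sample.

```python
from collections import Counter
from statistics import correlation  # Python 3.10+

# (confusion_point_count, preference) per document; placeholder values only.
documents = [
    (0, "Neutral"), (1, "Neutral"), (2, "Soft"), (2, "Neutral"),
    (3, "Soft"), (4, "Hard"), (5, "Hard"), (6, "Hard"),
]

# Tabulate preference type by number of confusion points found.
table = Counter(documents)
for n in sorted({points for points, _ in documents}):
    row = {pref: table[(n, pref)] for pref in ("Neutral", "Soft", "Hard")}
    print(n, row)

# Treat preference as ordinal to check whether more confusion points
# coincide with harder preferences.
rank = {"Neutral": 0, "Soft": 1, "Hard": 2}
xs = [points for points, _ in documents]
ys = [rank[pref] for _, pref in documents]
print("correlation:", round(correlation(xs, ys), 2))
```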
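
The awareness survey (slides 20-22) paired each possible action with a logically matching reason and then grouped respondents into Intuitive, Aware, Confused, and Unaware. The rubric below is an assumption made for illustration; the deck does not spell out the actual scoring key, and the answer identifiers are invented.

```python
# Hypothetical answer key: the acceptable action and its logically paired
# reason (the real key and wording are assumed, not taken from the deck).
CORRECT_ACTION = "do_not_use_prior_employer_info"
PAIRED_REASON = "may_violate_agreements_and_trade_secret_law"

def categorize(action: str, reason: str) -> str:
    """Assumed rubric: both correct -> Intuitive, action only -> Aware,
    reason only -> Confused, neither -> Unaware."""
    if action == CORRECT_ACTION and reason == PAIRED_REASON:
        return "Intuitive"
    if action == CORRECT_ACTION:
        return "Aware"
    if reason == PAIRED_REASON:
        return "Confused"
    return "Unaware"

# Invented responses to show the classification in use.
responses = [
    ("do_not_use_prior_employer_info", "may_violate_agreements_and_trade_secret_law"),
    ("do_not_use_prior_employer_info", "hard_to_prove_where_ideas_came_from"),
    ("have_lca_attorney_present", "attorney_client_privilege"),
]
for action, reason in responses:
    print(action, "->", categorize(action, reason))
```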
