3. Challenges to the effective delivery of software. A history of cost overruns, schedule slips, and quality issues. "Business leaders and CIOs are under pressure to enable their teams to become catalysts for change. However, change is outpacing their ability to deliver. They are reorienting their teams to focus on ROI and quantified business outcomes, and to mitigate risk and reduce costs." (IBM CIO Survey, 2008.) Growing focus on business outcomes: 49% of budgets suffer overruns and 62% of projects fail to meet their schedules; 30% of project costs are due to rework and poor execution of requirements; 50% of outsourced projects are expected to underperform; only 22% of executives felt that their IT and business strategy were tightly integrated; only 34% of software projects are deemed successful, costing $300B annually.
4. Software delivery is a business process that can be continuously improved. Source: Gartner, "Making the Difference: The 2008 CIO Agenda," Jan. 2008. Requirements, Design, Implementation, Verification, Deployment. CIOs' top priority over the last three years: "Improving Business Processes."
6. Software Measurement Status Today
Fortune 500 companies with productivity measures: 30%
Fortune 500 companies with quality measures: 45%
Fortune 500 companies with complete measures: 15%
Fortune 500 companies with missing measures: 85%
Number of software measurement personnel: 5,500
Number of software projects measured: 160,000
Number of software projects not measured: 50,000,000
Source: Capers Jones (2009)
7. Does It Help to Measure?
Companies that measure: on-time projects 75%; late projects 20%; cancelled projects 5%; defect removal > 95%; cost estimates accurate; user satisfaction high; software status high; staff morale high.
Companies that don't: on-time projects 45%; late projects 40%; cancelled projects 15%; defect removal unknown; cost estimates optimistic; user satisfaction low; software status low; staff morale low.
Source: Capers Jones (2009)
11. Leverage a Control Framework to Manage to Expected Business Results Operational Level Practice Level Business Level Process Enactment / Governance Enforcement / Process Awareness Jazz Platform Business Objectives Process Definition / Practices Rational Method Composer Operational Objectives feedback feedback feedback Feedback Performance Measurement (IBM Rational Insight) Value Metrics e.g., ROI, ROA for SSD Operational Effectiveness Metrics e.g., Time to market, productivity Practice Adoption/Maturity Subjective IBM Rational Self-Check Practice Artifacts Objective
13. Leverage MCIF Assets to Incrementally Define the Control Framework Operational Level Practice Level Business Level Process Enactment / Governance Enforcement / Process Awareness Jazz Platform Business Objectives Process Definition / Practices Rational Method Composer Operational Objectives Phase 4: Rational Insight Phase 3-4: Self-Check Phase 3: Rational Method Composer Phase 2: Health Assessment Phase 1: Exec Business Value Workshop Phase 3: Rapid Deployment Package NEW ENHANCED feedback feedback feedback feedback Performance Measurement (IBM Rational Insight) Value Metrics e.g., ROI, ROA for SSD Operational Effectiveness Metrics e.g., Time to market, productivity Practice Adoption/Maturity Subjective IBM Rational Self-Check Practice Artifacts Objective
14. Unifying Platform - Jazz Operationalize capability improvement Define Business and Operational Objectives Define Practices Operationalize processes, accountability, and policies across the organization Measure, monitor and report on practice adoption, and business and operational results Phase 4 Phase 3 Phase 1/2 Client Integrations Server Integrations Future IBM Capabilities 3rd-Party Jazz Capabilities Deliver Enduring Quality Accelerate Change & Delivery Ensure Security & Compliance Manage Architecture Manage Evolving Requirements Improve Project Success Query Storage Collaboration Discovery Data Warehousing Administration: Users, projects, process Best Practices JAZZ SERVICES
19. Rational Insight - SSD Performance Management. Open Lifecycle Service Integrations. Best Practice Guidance. ClearCase, MS Project, Quality Manager, Partners. Cognos-based report server provides a consolidated view. ClearQuest, RequisitePro, Team Concert, Insight. Measured Capability Improvement Framework (MCIF). Powered by Cognos. 3rd-Party Integrations. Rational Insight Data Warehouse. ODBC/JDBC, XML.
20. Rational Insight. Project Management, Build Management, Architecture Management, Requirements Management, Change Management, Configuration Management, Portfolio Management, Quality Management. CxO, Project Manager, Process Lead. Collaborate across disparate development artifacts in the context of business objectives.
22. Multiple Roles Come With Multiple Usage Patterns... View dashboards and reports online... View reports on mobile devices... Embed charts and reports in spreadsheets / presentations... Search for reports using standard paradigms... Easily create and customize graphical dashboards...
23. Extending the Insight platform with Cognos Go! Increase business intelligence user adoption rates within your organization by letting users view and consume reports, scorecards, and other BI content using familiar applications or devices, such as a BlackBerry®, search engine, MS Office application, or Web browser. IBM Cognos 8 Go! Dashboard lets users build and edit dynamic and interactive dashboards using a drag-and-drop Flash interface. IBM Cognos 8 Go! Mobile lets users securely receive and interact with reports and analysis through their BlackBerry® handheld devices. IBM Cognos 8 Go! Search lets users find existing reports in IBM Cognos 8 Business Intelligence more quickly. It also lets you integrate BI content into enterprise search applications such as those provided by Autonomy, Fast, Google, and IBM. IBM Cognos 8 Go! Office lets your users view, interact with, and refresh BI content within MS Office applications such as Microsoft® Excel®, PowerPoint®, and Word®.
59. This next set of slides showcases the executive performance measurement dashboard we use to govern Rational's development teams, built on Rational Insight with MCIF-style dashboards. Note that some data has been added, changed, or removed to avoid revealing confidential information. This top-level dashboard shows schedule variance, project health, size, and % complete for each project; drill down on a project for more detail. Project health is also shown by region; drill down for regional information, including headcount and cost-related data. Dashboards are organized around business objectives to keep the focus on potential problem areas preventing us from reaching stated business objectives.
60. Project Health and Practice Control Metrics Iterative Development 2-Level Planning Continuous Integration
Sources: (1) CHAOS Chronicles v12.3.9, The Standish Group, June 30, 2008; Airbus from BusinessWeek -- http://yahoo.businessweek.com/globalbiz/content/oct2006/gb20061005_846432.htm (2) BusinessWeek, January 12, 2004, "Shifting Work Offshore? Outsourcer Beware" -- based on a Gartner survey of 219 clients who outsourced projects offshore and domestically; half are expected to fail to deliver anticipated savings. (3) "Corporate Software Development Fails To Satisfy On Speed Or Quality," Forrester Research, Inc., 2005.
The business process of software delivery is non-trivial. It has a lot of moving parts. Making things worse, it deals with innovation, frequently one-of-a-kind work, and artifacts whose value or quality is hard to measure. How do you determine how good a design is, for example? We know from Gartner studies that the top priority of CIOs over the last three years has been to improve business processes. But how can you improve your business process if you do not know what works and what does not? The reality is that in most organizations we have very little data on which to base insightful decisions about what works and what does not, so that we can improve. This means that most organizations are guessing what changes to make to get more business value from IT delivery. Many organizations spend hundreds of millions or billions of dollars on IT delivery, yet we are primarily guessing at how to use that money wisely. If we want to increase the business value of software delivery, we need to take the guessing out of it.
We can learn a lot from other industries. The biggest influence on software development right now comes from the principles of Lean Manufacturing, which derive from the Toyota Production System. Toyota in 2007 became the world's largest car manufacturer. Even more impressive, Toyota's profit is bigger than that of all other car companies combined, so it seems they are doing something right. Toyota revolutionized the manufacturing process, and many of its key concepts can be leveraged effectively in the software industry. They changed the notion that quality must cost: if you build with high quality from the start, you can actually reduce cost, and the same is true for software development. They introduced the notion of Kaizen, continuous process improvement, where many small improvements add up over time. They involved team members in the change process; in the 70s it was pretty revolutionary for a blue-collar worker to stop the manufacturing line because they saw a defect. And they focused heavily on measures:
No work without process.
No process without metrics.
No metrics without measurement.
No measurement without analysis.
No analysis without improvement.
MCIF brings many of these key ideas to the domain of software delivery.
Talk about the wrong incentive: we may all chuckle here, but this is more the norm than the exception. I'd like to talk about changing this conversation, too!
This fundamental notion of types of measures is a de facto standard in other industries, but not that common in software; maybe we can learn something. You are typically looking for a certain outcome, and you need to measure whether you are getting it or not. This is done through so-called outcome measures. To get the outcome, you need to change a few things; that is, you need to turn a few knobs. You then want to measure that the system has properly responded to those knobs, through so-called control measures.
Here is an example of three levels of measures. For most audiences, the top two levels are outcome measures and the bottom level consists of control measures. Let's say you want to increase your profit. You need to measure profit to know whether you are achieving that business objective. To improve profit, you decide that you need to increase productivity and improve quality, so you now need to measure whether you are achieving those operational objectives. To improve productivity and quality, you decide to introduce or improve the practices Test Management, Iterative Development, and Continuous Integration. For each of these practices you can now introduce control measures to assess how well the organization has adopted them. For Test Management you can analyze defect density and its distribution across components and severity levels. For Iterative Development, you can track velocity and iteration burndown. For Continuous Integration you can track build stability and build frequency, and so on.
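To make the control-measure level concrete, here is a minimal sketch of two of the measures named above, computed from hypothetical raw data. This is illustrative only, not Rational Insight's actual data model; the function names and input numbers are made up.

```python
# Illustrative sketch of two control measures; the data below is hypothetical.

def velocity(completed_points_per_iteration):
    """Iterative Development control measure: average work completed per iteration."""
    return sum(completed_points_per_iteration) / len(completed_points_per_iteration)

def build_stability(build_results):
    """Continuous Integration control measure: fraction of builds that succeeded."""
    return sum(1 for ok in build_results if ok) / len(build_results)

print(velocity([21, 18, 24, 17]))                  # 20.0 points per iteration
print(build_stability([True, True, False, True]))  # 0.75
```

Each of these is a "knob" measure: if a team adopts Continuous Integration properly, build stability should trend upward, which in turn should show up later in the outcome measures above it.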
The Measured Capability Improvement Framework (MCIF) is a measurable approach to continuous capability improvement. It captures 10+ years of Rational experience in incremental adoption; key aspects have been used in agile transformations of 80+ IBM internal projects; and it is process independent, used in conjunction with RUP, XP, Scrum, and other processes. It is a systematic approach for improving your business using four phases. Phase 1: Elicit and set business value objectives (reduce time-to-market, improve quality, increase innovation, ...). Phase 2: Determine the right set of practices and tooling to drive the desired business objectives, leveraging assessments and out-of-the-box mappings from business objectives to practices. Phase 3: Accelerate and monitor the adoption of the technology: effectively deploy well-governed practices; understand which aspects of which tools to adopt to effectively adopt the practices; process guidance, training courses, enablement material, etc. Phase 4: Measure the business results realized: understand whether target practices are successfully adopted; understand whether desired business outcomes are achieved; take corrective actions.
I wanted to wrap this presentation up with a short discussion of our future direction as it relates to providing the industry's only platform for operationalizing capability improvements. But before I talk about the future, let me talk about what is here today. You can define practices in RMC and make them available in the context of Jazz-based tools. You can capture best practices and policies in the Jazz platform, and the platform will support and enforce their usage. We have beta capabilities that let you measure how you are doing within Jazz and non-Jazz-based products. Jazz is a middleware platform for software delivery; that means you can get access to all of the above without having to force each team to retool. Your team can, e.g., stay on Visual Studio while benefiting from the above, which lowers TCO and lets you decide when and how to evolve each individual team's development environment while still operationalizing your ability to improve the capabilities of your software and systems delivery organization. The direction we are moving in is to further strengthen these capabilities. We want to fully integrate our process definition capabilities with the Jazz process capabilities, so that Rational Method Composer becomes a process definition tool for the Jazz platform. We want to extend the capabilities of the Jazz platform in areas such as RACI specifications and policies. We want to deliver a comprehensive measurement solution, allowing you to monitor the business process of software delivery. The sum of the above is that we believe we have the industry-leading solution in this area today, and that we will continue evolving it to address your needs.
We all know that when driving a car there are inherent blind spots. If you don't adjust your mirrors to get proper visibility into what is happening all around you, the results can be significant. There are blind spots within your software development processes as well. Team members may not have visibility into the overall business goals and operational initiatives that are driving their deliverables. Management may not have relevant, timely visibility into whether development activities are meeting current business and operational initiatives. There is often an overload of "data," and it is difficult to aggregate it into useful, actionable information. This disconnect can lead to making key business decisions without the required accurate and up-to-date information.
To help narrow those blind spots, a systematic approach to measuring and managing the performance of your software delivery teams is required. Statistics show that organizations that leverage world-class performance management practices see 2.4 times the market return of companies that do not. That is a huge differentiation! Leveraging the principles of the Measured Capability Improvement Framework that Per presented earlier with the new IBM Rational Insight product, organizations can: measure process and project outcomes using real-time intelligence based on 20 years of IBM's best-practice metrics, dashboards, and models; make informed decisions based on objective, accurate information, where alerts and automation let you focus on key issues early in the integrated lifecycle, reducing the cost of issues by a factor of 10, 100, or even 1,000; and enable real-time action and collaboration based on aligned and relevant data.
The IBM Rational Insight product is based on the industry-leading Cognos 8 BI reporting platform and enables real-time and historical views across projects, products, and geographies. Additional statistics show that leveraging automated project and process measurement can improve team productivity by at least 15%; Per showed us some detailed metrics demonstrating this performance gain earlier. Insight is an integral part of the overall Rational brand strategy, enabling collaboration across disparate development artifacts in the context of business and organizational objectives. Teams can continue to leverage the collaboration facilities built into their development environments; Insight extends the reach of that collaboration to multiple levels of management as well. It automates reporting and measurement, leveraging Jazz-compliant open REST interfaces and the Cognos 8 BI reporting engine, and reports on relevant data with a built-in library of industry best-practice metrics and dashboards. I will demonstrate some of these in a few minutes. Although very early in its lifecycle, Insight is already helping organizations realize some of these benefits. Insight enabled Unisys to automate some of their manual metrics gathering and simplify the presentation, giving management a bird's-eye view of all stages of project performance. Insight has been deployed internally at IBM as well, and one team is already saving over 40 hours/month through automated data gathering and report generation. In addition, they can now offer on-demand dashboards which were very labor- and time-intensive before they started using Insight.
As I indicated earlier, Insight leverages an open REST integration framework, just like the Jazz Open Services for Lifecycle Collaboration. There are out-of-the-box integrations in Insight V1.0 with Rational Team Concert, Quality Manager, ClearCase, ClearQuest, RequisitePro, and TestManager, as well as with MS Project. The platform may be further extended to connect to almost any ODBC, JDBC, or XML-based data source, and we have Business Partners working on additional integrations. The IBM teams will continue to develop Insight integrations for most or all Rational/Telelogic products, to be delivered over the next few years. Data from these disparate and distributed sources can then be included either in on-demand, live operational reports, or sent through our Cognos-based Extract, Transform, and Load engine and stored in the Insight data warehouse for trending and analytical reporting. The Cognos Framework Manager enables Insight to deliver a metadata model that insulates the business consumer and the metrics/report engineer from the complexities of the underlying data warehouse or point-product schema. Only the necessary objects and relationships are exposed, allowing a wide range of consumers to define and author new KPIs, reports, and dashboards. And finally, Insight includes best-practice guidance in the form of the Measured Capability Improvement Framework (MCIF) and a newly developed Performance Measurement practice. More on that in a few minutes.
Each of these consumers may have one or more usage models for their reports, and the underlying Cognos reporting platform can be extended to support many report delivery methods. Users can log into the Rational Insight web site to view role- and user-based dashboards and reports, with full drill-down, drill-up, and drill-through analytics. Report generation can be automatically scheduled with various delivery options; I will cover this in more detail during the Insight demonstration. Managers may wish to take advantage of the Cognos Go! Mobile features, enabling reports to be delivered to mobile devices such as a BlackBerry or PDA. Cognos enables a single report template to support both online viewing through the web site and offline viewing on mobile devices. Other users may wish to embed reports, or portions of reports (charts, tables, etc.), within Microsoft Word, PowerPoint, or Excel documents; the Cognos Go! Office integration enables this as well. The Cognos Go! Dashboard application gives users even more flexibility over the look and feel of their dashboards and reports. Using Go! Dashboard, users can dynamically personalize the look, feel, and presentation style of any Insight report element, again leveraging the same report templates deployed on the Insight web site. Finally, the Cognos Go! Search feature enables IT shops to integrate the Cognos search engine with their corporate search applications, such as Google. From within a Google search, users have the additional benefit of being able to see details on when a report was created or generated; they can even visualize a snapshot of the report right there in the search results window. All of this is possible through our integration with the powerful Cognos 8 BI reporting platform.
Use a work items list to coordinate the team and prioritize work. The work items list represents the development team's to-do list. It is similar to the product backlog used in the Scrum methodology and contains all proposed and scheduled work for the project. Work items may be created to fix a defect, to add requirements, to address an enhancement request, or to capture other actions that must be addressed. The high-priority work items are addressed first. Work items may be decomposed into smaller work items as needed to add detail, or to split the work into smaller pieces so it can be completed within the bounds of an iteration. Work items are the primary planning and tracking elements on the project.
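The practice above can be sketched as a tiny data model: prioritized items, highest priority first, with optional decomposition into smaller children. This is a hypothetical illustration, not Rational Team Concert's actual work-item schema; all item names and priorities are invented.

```python
# Hypothetical minimal model of a prioritized work items list (backlog).
from dataclasses import dataclass, field

@dataclass
class WorkItem:
    summary: str
    priority: int                 # lower number = higher priority
    children: list = field(default_factory=list)

    def decompose(self, *summaries):
        """Split this item into smaller items that inherit its priority."""
        self.children = [WorkItem(s, self.priority) for s in summaries]
        return self.children

backlog = [
    WorkItem("Fix login defect", priority=1),
    WorkItem("Add export feature", priority=3),
    WorkItem("Address enhancement request", priority=2),
]
backlog.sort(key=lambda wi: wi.priority)   # high-priority work is taken first
next_item = backlog[0].summary             # "Fix login defect"
```

Decomposing an item (e.g. `backlog[0].decompose("Reproduce defect", "Write fix")`) mirrors how a large work item is split so each piece fits within one iteration.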
Next, I would like to switch gears and take you through a demonstration of the Insight Version 1.0 product. In this demonstration, I will walk you through many of the features and capabilities of the Cognos-based Insight user interface. We will be holding additional webcasts to cover advanced Insight topics such as report authoring and data administration.
After logging into Insight, the initial display is the welcome screen. From here, there is a wealth of information at your fingertips. The Insight "Tour" is an 8-minute Flash tour of all the Insight components. There are links to tutorials, help pages on the web, and even a link to the RMC Performance Management practice delivered with Rational Insight; I will cover that in a bit more detail later. The "Visit my home" link will take me to my personalized start page.
This is the landing or home page configured for my project. Like much of the Insight interface, it is highly customizable and can easily be configured with information relevant to your projects and programs. Controls that can be placed on the portal include a web page viewer (in this case showing the home page for the 2009 Rational Software Conference), an RSS feed portlet, and a quick-access URL list; clicking a link in either of the latter two portlets displays the result in the main HTML viewer on this page. Across the top, I have tabs configured to display additional project information: one tab for my executive-level business value dashboards; one tab with my project-specific KPIs and dashboards; a reports tab with views into project- and program-level KPIs and reports; and a Practices tab containing best-practice guidance in the form of the newly published MCIF Performance Measurement practice. First, we will explore the executive dashboards configured for my projects.
My initial dashboard on the executive dashboard tab is an overall program scorecard. It contains KPIs measuring the overall status of the three projects my team is working on: Classics CD, Online Auction, and Smarter Living. This includes a Schedule Performance Index, a Cost Performance Index, an overall quality indicator, and a KPI measuring on-time project deliveries. The map indicator shows the geographic distribution of my workforce, with an indicator of whether my staffing profile is at or below planned or expected values.
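The Schedule and Cost Performance Index KPIs follow the standard earned-value definitions (the dashboard's exact formulas are not shown in the deck, so treat this as an illustrative sketch): SPI = EV / PV and CPI = EV / AC, where values below 1.0 mean behind schedule or over budget. The dollar figures below are hypothetical.

```python
# Standard earned-value indexes; EV = earned value, PV = planned value, AC = actual cost.

def spi(earned_value, planned_value):
    """Schedule Performance Index: < 1.0 means behind schedule."""
    return earned_value / planned_value

def cpi(earned_value, actual_cost):
    """Cost Performance Index: < 1.0 means over budget."""
    return earned_value / actual_cost

# Hypothetical project: $90K of work completed, $100K planned to date, $120K spent.
print(spi(90_000, 100_000))  # 0.9  -> slightly behind schedule
print(cpi(90_000, 120_000))  # 0.75 -> over budget
```

A scorecard like the one described would simply color each project's cell by thresholds on these two ratios.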
If I hover over one of the geographies, I am presented with additional detail pertaining to that region. I can also drill into one of the regions to display a detailed breakdown of that geography's staffing profile.
This drill-through report shows the staffing profile for my three projects in the United States. The cross-tab in this report can be configured to sort on any column and include summaries across a number of dimensions. The report is also context aware and shows the details for the selected region. Now, let's look at some of the additional features available on this report.
Clicking on the "Add This Report" button enables me to add this report to either my personal folder of reports or my personal set of report bookmarks. I can then easily access the report later on.
I can also display the report in various formats: HTML for viewing on the web; PDF for viewing on the web or sending via email; XML for exporting to other tools; and various MS Excel and CSV file formats. The Cognos platform enables a single report template to be used for any of these output formats.
Finally, I have a number of options available for saving the report. A snapshot of the report can be saved, which includes the report template as well as the data being displayed. A "view" of the report can be saved, which preserves any parameters used to generate it. Or the report can be sent via email to one or more recipients.
From this view, I can see that the overall status KPI of the Online Auction project is not green. I will now bring up the overall status dashboard to look at the projects in a bit more detail.
This dashboard displays a number of KPIs related to my project's overall status, including cost measurements as well as several KPIs related to customer satisfaction. The Budget chart shows that we have been successful in implementing some cost-control initiatives and that my actual expenses are tracking very close to my budget. The Delivery Schedule chart tracks how many releases have been delivered on time, within two weeks of customer commitment, or more than two weeks past the committed release date; this is a key leading indicator of customer satisfaction on my projects. I can see that most of my releases have been on time or within two weeks, but we might want to look at the projects delivering late. The one thing I immediately notice is that my defect count KPI is showing a problem. I will click on my Quality Dashboard to get additional information on my project quality KPIs.
The quality dashboard shows a number of practice- or project-level measures that feed into our overall status KPIs. Here we are tracking KPIs like the number of pre- and post-ship defects on all three of my projects. I can quickly see that the post-ship defect count for my Online Auction project is much higher than for my other projects. I can drill down on this report to view additional detail. Click on the right bar of the post-ship defect report.
The drill-through connection from metric to metric, or report to report, is completely customizable. In this case, we are taken to a chart comparing project size (measured using ClearCase) to defect count (using RTC and ClearQuest). This is easily accomplished using the linkage established in the Insight data warehouse. I am interested in seeing whether that higher defect count is related to project size (where a much larger code base could contribute to the larger defect count). However, this report indicates that all three projects are about the same size. Let me close this report and examine some of the other KPIs on my quality dashboard. Click on the X to close this report.
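The comparison this chart makes amounts to normalizing defect counts by code size, so that a larger code base does not masquerade as lower quality. A minimal sketch, with entirely hypothetical defect and size figures (the project names are reused from the demo, but these are not its numbers):

```python
# Post-ship defect density: defects per thousand lines of code (KLOC).
# All counts and sizes below are invented for illustration.
projects = {
    "Classics CD":    {"defects": 18, "kloc": 210},
    "Online Auction": {"defects": 54, "kloc": 205},
    "Smarter Living": {"defects": 16, "kloc": 215},
}

densities = {name: p["defects"] / p["kloc"] for name, p in projects.items()}

for name, d in sorted(densities.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {d:.2f} post-ship defects/KLOC")
```

With roughly equal sizes, the per-KLOC view confirms the raw-count impression: the outlier project really does have worse quality, not just more code.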
Now that it is apparent that my Online Auction project requires more attention, I am going to use one of my Project Management views of the Online Auction project to look at some of the project-specific KPIs. Click on the Project Dashboards tab.
My Online Auction project dashboard has a number of KPIs configured, showing information from the various tools being used to develop and manage this project. The Work KPIs show project velocity and burndown; this data comes from Rational Team Concert and ClearQuest. The Build Health chart shows how stable my project builds have been. I can see that an inconsistent number of project builds was being performed. A Continuous Build practice was recently deployed as part of an overall quality improvement operational initiative, and we can quickly see that we are starting to satisfy our objective. The enhancement request backlog chart shows that our enhancement requests are not being implemented as quickly as we would like. I will drill into this report for additional detail. Click on the Enhancement Request Backlog chart.
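The burndown behind the Work KPIs is typically just remaining work per day plotted against an ideal straight line to zero. A sketch under that assumption, with invented numbers (this is not RTC's actual computation):

```python
# Iteration burndown: actual remaining work vs. an ideal linear burn to zero.

def ideal_burndown(total_work, days):
    """Ideal remaining-work line: linear from total_work down to 0 over `days` days."""
    return [total_work - total_work * d / days for d in range(days + 1)]

# Hypothetical 10-day iteration starting with 100 units of work.
actual_remaining = [100, 95, 88, 80, 70, 58, 45, 30, 18, 8, 0]
ideal = ideal_burndown(100, 10)

# A day where actual > ideal means the iteration is behind its ideal pace.
behind_days = sum(1 for a, i in zip(actual_remaining, ideal) if a > i)
```

A dashboard would flag an iteration whose actual line stays above the ideal line late in the iteration, since that predicts work spilling past the iteration boundary.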
One of the drill-through paths for this chart is a requirements churn trend. We would normally expect changes in requirements to stabilize as the project moves through its iterations and sprints. In this case, it appears that requirements continue to change throughout the lifecycle, indicating that we may want to introduce and measure some additional requirements management practices on this project. I would like to add a comment to this report so that one of my team leads can investigate this trend. To do this, I can click on the "add comment" button to start collaborating on this report. Click on the Add Comment button.
From here, I can enter who authored the comment and add some text. I will ask Per, my lead analyst, to look into the trend. Click the X to close this dialog.
I can now send this report to Per via email so that he is immediately notified that there is an issue needing his attention. The email form allows me to select or enter one or more recipients as well as some text to accompany the message. I can optionally include a link to this report or even embed the report in the email message. Click on the X to close this dialog.
However, requirements churn alone does not explain the high post-ship defect count. Let me drill into the Enhancement Request report again, this time looking at how well these requirements are being tested. Click on the enhancement request chart again.
OK, from here I can see a leading indicator of an issue. It appears that the requirements being implemented (measured from Rational RequisitePro and Rational Team Concert) are not being properly covered by test cases (measured from Rational Quality Manager and Rational Test Manager). This is a clear indicator that a more formal testing methodology needs to be implemented and measured. Having this historical data stored in the Insight data warehouse will enable us to measure the effectiveness of a Test-Driven Development practice once implemented, regardless of which testing or test management tool is leveraged. If I need additional information on the requirements that are not being tested, I can click on the Not Covered slice to bring up a tabular list of those requirements. Click on the Not Covered slice.
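The coverage measure behind this pie chart is the fraction of implemented requirements with at least one linked test case. A sketch with hypothetical requirement IDs and test-case links (these are not RequisitePro or RQM data structures):

```python
# Requirements-to-test coverage: a requirement is "covered" if at least one
# test case is linked to it. IDs below are invented for illustration.
requirements = ["REQ-1", "REQ-2", "REQ-3", "REQ-4", "REQ-5"]
test_links = {"REQ-1": ["TC-10"], "REQ-3": ["TC-11", "TC-12"]}

covered = [r for r in requirements if test_links.get(r)]
not_covered = [r for r in requirements if not test_links.get(r)]

coverage = len(covered) / len(requirements)   # 0.4 -> 40% of requirements covered
```

The "Not Covered" slice of the chart corresponds to `not_covered`, and drilling into it would list exactly those requirement rows.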
This report shows the detailed information for all of the non-tested requirements listed on the requirements coverage chart. As you can see, this report has a number of parameters that are automatically populated as we drill through from chart to chart, or report to report, enabling one report like this to work for multiple configurations. Click the X to close this report.
Beyond the executive and project-level dashboards, which are useful for visualizing business and operational measures and status, we can view a number of reports, either pre-configured for a project or parameterized to let the user select the program or project to display. Click on the "reports" tab.
My reports view has a number of controls defined. On the left, I can view a number of reports configured for my Online Auction project. There is also a categorized report hierarchy, enabling me to group reports into multiple categories. Finally, I have a search control that lets me search the repository for reports using a number of simple and advanced search criteria. I will expand the "Most Popular" link at the top of the list to look at some of my commonly viewed reports. Click Most Popular.
One key report that we use on this project is the Requirements Traceability report… Click on the Requirements Traceability report…
This report displays information on the requirements being implemented on this project. The color coding quickly identifies any use cases that are not currently traced to a high-level project feature. We are also able to look at any defects (CQ, RTC, etc.) that have been filed against each requirement… the related source-code activity against each requirement (CC, RTC, etc.)… and any test cases that have been written to test each requirement… I would like to leverage some of the automation features of Rational Insight to generate this report periodically… To do this, I will first maximize the report selector… Click on the maximize button on the report selector…
Next, I will click on the schedule icon next to the report I want to automatically generate… Click on the schedule icon…
This dialog enables me to set up the scheduling parameters… I can have the report generated daily, weekly, monthly, or via a pre-defined trigger. The report format (HTML, PDF, etc.) as well as report parameters may be defined; otherwise the default values for this report will be used. Finally, the delivery method for the generated report can be specified: send the generated report via email, save a report snapshot for later viewing, or send the report to a defined printer. Click the X to close the scheduling dialog…
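The scheduling options just described (frequency, output format, parameters, delivery) can be summarized in a small data structure. This is a hypothetical sketch for illustration only; `ReportSchedule` and its fields are invented names, not the product's configuration model:

```python
# Illustrative only: the shape of a report schedule as described in the
# demo. Not Rational Insight code.
from dataclasses import dataclass, field

@dataclass
class ReportSchedule:
    report: str
    frequency: str = "weekly"     # "daily", "weekly", "monthly", "trigger"
    output_format: str = "HTML"   # "HTML", "PDF", ...
    # Report parameters; when empty, the report's defaults apply.
    parameters: dict = field(default_factory=dict)
    delivery: str = "email"       # "email", "snapshot", "printer"

    def describe(self):
        return (f"{self.report}: {self.frequency} as {self.output_format} "
                f"via {self.delivery}")

sched = ReportSchedule("Requirements Traceability", frequency="monthly",
                       output_format="PDF", delivery="snapshot")
print(sched.describe())
```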
One of the predefined report hierarchies configured OOTB maps to a set of Rational Method Composer practices. Click on the Practice-Oriented Hierarchy link…
The Test Execution Status report, which is part of the Test Management RMC practice, can display additional information on whether our test cases (RQM, TestManager, etc.) are passing or failing… As we can see, the majority of our test cases have been passing. However, we may want to spend more time later looking into why there has been a rise in test-case failures in iteration 4… What we have seen so far is an OOTB set of dashboards and reports implementing a performance management system. As I indicated earlier, we have also developed and delivered an RMC practice on Performance Management. Click on the Practices tab…
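The kind of trend analysis mentioned above (spotting the rise in failures in iteration 4) can be sketched generically. This is an invented illustration with made-up data, not the report's actual implementation:

```python
# Hypothetical sketch: compute per-iteration failure rates and flag
# iterations where the failure rate rose relative to the previous one.

def failure_rates(results_by_iteration):
    """results_by_iteration: dict iteration -> (passed, failed) counts."""
    rates = {}
    for it, (passed, failed) in results_by_iteration.items():
        total = passed + failed
        rates[it] = failed / total if total else 0.0
    return rates

def iterations_with_rising_failures(rates):
    ordered = sorted(rates)
    return [b for a, b in zip(ordered, ordered[1:]) if rates[b] > rates[a]]

# Made-up results: iteration 4 shows a jump in failures.
data = {1: (40, 2), 2: (45, 3), 3: (50, 3), 4: (38, 12)}
rates = failure_rates(data)
flagged = iterations_with_rising_failures(rates)
```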
The Performance Measurement practice gives users an immense amount of guidance on establishing, implementing, and measuring a performance management system. This includes an overview of performance measurement and multiple process diagrams… Click on the Sample Metrics tree node…
There are also a number of sample metrics and measures defined that correspond to Business, Operational, and Practice or Implementation initiatives. For each measure, there is information on the purpose of the metric, its implementation method, and measurement analysis. Click on Iteration Velocity…
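As a concrete example of such a measure, an "Iteration Velocity" style metric is commonly computed as completed work per iteration, averaged for forecasting. The sketch below is illustrative only, with invented function names and data; it is not the practice's prescribed formula:

```python
import math

# Hypothetical sketch of an iteration-velocity measure: average completed
# work (e.g. story points) per past iteration, used to forecast how many
# more iterations the remaining work will take.

def iteration_velocity(completed_points):
    """completed_points: points completed in each past iteration."""
    if not completed_points:
        return 0.0
    return sum(completed_points) / len(completed_points)

def iterations_to_finish(remaining_points, velocity):
    if not velocity:
        return None
    return math.ceil(remaining_points / velocity)

history = [21, 18, 24, 19]              # made-up past iterations
v = iteration_velocity(history)         # average velocity
needed = iterations_to_finish(82, v)    # iterations left at current pace
```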
You will find that most of the metrics defined here have been implemented in reports and dashboards delivered with Rational Insight…
There is also a mapping that shows which RMC practices each metric can be used to measure. As you can see, the Test Coverage measure is utilized by various best practices… Click on the scroll bar…
There is also a set of "Value-Traceability Trees" defined. These enable you to visualize how a high-level business objective can drive one or more operational initiatives, which are in turn implemented in various practices and implementations. There is also a set of Rational Insight "Tool Mentors" defined, covering various tasks within the Rational Insight product. Click on the Welcome tab…
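The objective-to-initiative-to-practice structure of a value-traceability tree can be sketched as nested data. The tree contents and the helper `practices_for` below are invented for illustration and do not reflect any shipped tree:

```python
# Hypothetical value-traceability tree: one business objective drives
# operational initiatives, each implemented by one or more practices.
tree = {
    "objective": "Reduce time to market",
    "initiatives": [
        {"name": "Improve build/test cycle time",
         "practices": ["Continuous Integration", "Test Management"]},
        {"name": "Catch defects earlier",
         "practices": ["Test-Driven Development"]},
    ],
}

def practices_for(tree):
    """Flatten the tree: the set of practices tracing to the objective."""
    return sorted({p for ini in tree["initiatives"]
                     for p in ini["practices"]})
```

Walking the tree top-down answers "which practices serve this business objective"; walking it bottom-up answers "why are we doing this practice".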
So, as you have seen, there are a number of Executive and Business value dashboards defined OOTB with Rational Insight, and many more can be configured to map to individual business drivers. For any business driver, there can be multiple operational initiatives implemented. Insight implements many KPIs and reports that measure the implementation, progress, and performance of these operational initiatives, and it can feed that information into business value dashboards and scorecards. Furthermore, there is a wealth of user assistance, tutorials, best-practice guidance, and tool mentors built directly into the tool… Thank you for watching this Rational Insight demo.
Now I would like to go back to Per Kroll for a quick wrap-up of this webcast.
Objective, iterative, and easy-to-deploy lifecycle performance management:
- Baseline alignment, efficacy, and efficiency
- Prescriptive, best-practice measures and models
- Program roadmaps
- Root-cause correlation models
- Skill, practice, and tool maps
- Targeted best practices
- Increased confidence of delivering incremental improvement
- Iterative and incremental process, in bite-sized chunks: realign, redirect
- Find out what is working and what is not; measure progress continuously; modify the approach