'Dashboards: Real-time Test Information For Managers And Teams' by Michael Bolton

As a tester, how do you effectively report your work in a way that is most meaningful to management? One way is to provide a story that prompts conversation and questions. Managers claim that they want to know about the progress of testing--but is that what they really want? Perhaps they wouldn't care much about the state of testing if they knew the state of the product. At the very least, they'll want to know about both. More than anything else, managers want to know about threats to the project schedule.



One complication is that product status is highly multivariate. In particular, each product area can be in a different state. The important variables for each area include:

- the amount of testing effort being expended currently;
- the amount of test coverage obtained so far; and
- information about threats to the schedule.



However, the good news is that one of the most basic forms of data display--a table--can easily show this information in a way that is clear, concise, and compelling.
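The claim above--that one row per area, with columns for effort, coverage, and threats, is enough--can be sketched in a few lines of code. The following is my own illustrative sketch, not part of the presentation; the area names, field names, and values are hypothetical examples.

```python
from dataclasses import dataclass

@dataclass
class AreaStatus:
    area: str      # product area under test
    effort: str    # current testing effort (e.g. None, Start, Low, High)
    coverage: str  # coverage level obtained so far (e.g. 0, 1, 1+, 2, 2+, 3)
    comments: str  # threats to the schedule: bug IDs, blockers, delays

def render_dashboard(rows):
    """Return the dashboard as a plain-text table, one line per area."""
    header = f"{'Area':<12} {'Effort':<8} {'Cov':<4} Comments"
    lines = [header, "-" * len(header)]
    for r in rows:
        lines.append(f"{r.area:<12} {r.effort:<8} {r.coverage:<4} {r.comments}")
    return "\n".join(lines)

rows = [
    AreaStatus("File/edit", "High", "1", ""),
    AreaStatus("Tools", "Blocked", "1", "crashes: bugs 1407, 1423"),
    AreaStatus("Install", "Start", "0", "starting 20/3"),
]
print(render_dashboard(rows))
```

The point of the sketch is that no chart machinery is needed: a fixed-width table already makes the three variables per area scannable at a glance.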



In this presentation, Michael Bolton takes you on a detailed tour of a straightforward, easily-maintained testing dashboard that is designed to keep the entire project team informed of product status and testing activity, to foster productive conversation, and to help prompt focused questions from management.

  • Sometimes we run into trouble when the report doesn’t match the reporting mission. Your client may be expecting a very detailed answer; if so, a sentence or two might not provide enough information to be satisfactory. Your client may want a practical answer to a specific problem, where a detailed answer would include information that they would consider excessively elaborate. A practical answer might be too much when your client is in a hurry. One quick way to address this issue is to ask if your client wants a quick, a practical, or a deep answer. Practice giving all three.
  • Managers claim that they want to know about the progress of testing. We're skeptical that that's what they really want. We reckon that they wouldn't care much about the state of testing if they knew the state of the product. At the very least, they'll want to know about both. More than anything else, in our experience, they want to know about threats to the project schedule. One complication is that product status is highly multivariate. In particular, each product area can be in a different state with respect to testing. The variables for each state include the amount of effort being expended currently; the amount of test coverage obtained so far; and information about threats to the schedule. However, the good news is that a table can easily display this data. You might like to present this in Excel or as a Web page, but we recommend a prominent dashboard placed in a highly visible location. The Agile community often talks about Big Visible Charts. Well, they're often tables rather than charts, but what of that? It's the Big Visibility that's important. This one's a table.
  • This is the Low-Tech Testing Dashboard. It's designed to be displayed in a highly visible way in some kind of high-traffic area. The dashboard tracks three dimensions of each product area over time. Effort is how much testing focus a given area is receiving at the moment. Coverage represents the amount of information that we have about a given area at a given time. Quality, in this context, means one thing and one thing only: what we know about problems in the product that threaten the release schedule. Think of the dashboard in your car: you're traveling at 130 km/h. That's illegal on the highway in Ontario, supremely dangerous in a school zone, and just nicely at the limit on the Interstate in Arizona. Your engine is overheating; do you pause and pour in some coolant (again), or do you finally decide that enough is enough and get the thing fixed? You've traveled a certain distance; are you nearing your destination, or have you overshot it? Do you need an oil change, or is it time to sell the car? The data from the dashboard informs those decisions. Dashboard designers make choices about the way information is displayed to draw your attention quickly to certain urgent and important things. We'll show some of the reasons behind our choices. The dashboard is not designed to be a complete status report. It's not designed to tell you whether your product is good, or whether your project is running well. It's designed specifically to start conversations by giving management just enough information to prompt questions about the things that concern them most.
  • Use red to denote significant problems or stoppages, as in blocked, none, or pause. Color ship green once the final tests are complete and everything else on that row is green. Use a neutral color (such as black or blue, but pick only one) for the others, as in start, low, or high.
  • Color the coverage number green if the test coverage level is acceptable to inform a shipping decision; otherwise color it black. Level 1 and 1+ focus on functional requirements and capabilities. Can this product work at all? Well enough to be tested? Level 2 focuses on capability, the common, the core, the critical, the essentials, the happy path. Can this product work in ideal or ordinary conditions? Level 2+ and 3 focus on information to judge performance, reliability, compatibility, and other “ilities”. Will this product work under realistic or extreme usage? Level 3 or 3+ looks at the harsh, the extreme, the challenging, the corner cases. Level 3 coverage implies “if there were a bad bug in this area, we would probably know about it by now.”
  • Again, in the Q column, the Q stands for Quality of exactly one kind: Are there problems in the product that threaten the successful, on-time delivery of the product? Testers don’t know the answer, and to a great degree can’t know the answer. Only the product manager (program manager, project manager, etc.) has the ultimate authority over whether to fix known problems, ignore them, change the schedule date, bring new programmers on to the project, spend more money on tools, and so forth. Thus the testers should not be the ones filling in this field.
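The color rules in the notes above reduce to two small decision functions. This is a hedged sketch of my own, not code from the presentation; the function names, the `row_all_green` flag, and the `acceptable_for_ship` flag are assumptions introduced for illustration.

```python
# Red for significant problems or stoppages; green for "ship" only once the
# rest of the row is green; a single neutral color for everything else.
RED_EFFORT = {"blocked", "none", "pause"}

def effort_color(effort, row_all_green=False):
    """Color for an Effort cell, per the rules described in the notes."""
    e = effort.lower()
    if e in RED_EFFORT:
        return "red"
    if e == "ship" and row_all_green:
        return "green"
    return "black"  # one neutral color for start/low/high

def coverage_color(level, acceptable_for_ship):
    """Color a Coverage number green only if that level of coverage is
    acceptable to inform a shipping decision; otherwise black."""
    return "green" if acceptable_for_ship else "black"

print(effort_color("Blocked"))                   # -> red
print(effort_color("Ship", row_all_green=True))  # -> green
```

Keeping the rules in one place like this also makes the key point explicit: color carries exactly one meaning per column, so the board stays readable from across the room.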

Presentation Transcript

  • KEY IDEA
  • Reporting Considerations. Reporter safety: What will they think if I made no progress? Client: Who am I reporting to, and how do I relate to them? Rules: What rules and traditions are there for reporting here? Significance of report: How will my report influence events? Subject of report: On what am I reporting? Other agents reporting: How do other reports affect mine? Medium: How will my report be seen, heard, and touched? Precision and confidence levels: What distinctions make a difference? Take responsibility for the communication.
  • KEY IDEA
  • The Dashboard Concept: project conference room; large dedicated whiteboard; "Do Not Erase"; project status meeting.
  • The Low-Tech Testing Dashboard (updated 21/2, build 38):

        Area           Effort      C    Q  Comments
        File/edit      High        1
        View           Low         1+      1345, 1363, 1401
        Insert         Low         2
        Format         Low         2+      automation broken
        Tools          Blocked     1       crashes: bugs 1407, 1423
        Slideshow      Low         2       animation memory leak
        Online help    Blocked     0       new files not delivered
        Clip art       Pause       1       need help to test
        Connectors     None        1       need help to test
        Install        Start 20/3  0
        Compatibility  Start 13/3  0       compatibility lab time scheduled
        General GUI    Low         3
  • Product Area: 15-30 areas (keep it simple). Avoid sub-areas: they're confusing. Areas should have roughly equal value. Areas together should be inclusive of everything reasonably testable. "Product areas" can include tasks or risks, but put them at the end. Minimize overlap between areas. Areas must "make sense" to your clients, or they won't use the board. Example areas: file/edit, view, insert, format, tools, slideshow, online help, clipart, converters, install, compatibility, general GUI.
  • Test Effort: How much testing focus is each area getting right now? None: not testing; not planning to test. Start: no testing yet, but expecting to start soon. Low: regression or spot testing only; maintaining coverage. High: focused testing effort; increasing coverage. Pause: temporarily ceased testing, though area is testable. Blocked: can't effectively test, due to blocking problem. Ship: going through final tests and wrap-up procedure.
  • Test Coverage: How much information do we have about each area so far? 0: we don't have good information about this area. 1: sanity check; major functions & simple data. 1+: more than sanity, but many functions not tested. 2: common & critical; all functions touched, common & critical tests executed. 2+: some data, state, or error coverage beyond level 2. 3: complex cases; strong data, state, error, or stress testing.
  • Quality Assessment: Does management see threats to the ship date? Green: "We know of no problems in this area that threaten to stop ship or interrupt testing, nor do we have any definite suspicions about any." Yellow: "We know of problems that are possible showstoppers, or we suspect that there could be important problems not yet discovered." Red: "We know of problems in this area that definitely stop ship or interrupt testing."
  • Comments: Use the comment field to explain anything colored red, or any non-green quality indicator: problem ID numbers; reasons for pausing, or a delayed start; nature of blocking problems; why an area is unstaffed.
  • Using the Dashboard. Updates: 2-5 per week, or at each build, or prior to each project meeting. Progress: set expectations about the duration of the "Testing Clock" and how new builds reset it. Justification: be ready to justify the contents of any cell in the dashboard; the authority of the board depends upon meaningful, actionable content. Going high-tech: sure, you can put this on the web, but will anyone actually look at it? A big visible chart gets attention without being asked.
  • Visualizing Test Progress
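One practical wrinkle in the coverage scale above (0, 1, 1+, 2, 2+, 3, 3+) is that it is ordinal but not numeric, so comparing a cell against a threshold takes a small parser. The sketch below is my own illustration, not part of the presentation; the function names are hypothetical.

```python
# The coverage labels, in ascending order of coverage obtained.
LEVELS = ["0", "1", "1+", "2", "2+", "3", "3+"]

def coverage_rank(level):
    """Map a coverage label to its position on the ordinal scale."""
    return LEVELS.index(level)

def at_least(level, threshold):
    """True if `level` meets or exceeds `threshold` on the scale."""
    return coverage_rank(level) >= coverage_rank(threshold)

print(at_least("2+", "2"))   # -> True
print(at_least("1+", "2"))   # -> False
```

A comparison like this is what a spreadsheet or web version of the dashboard would use to decide, for example, which coverage cells qualify for green under the "acceptable to inform a shipping decision" rule.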