Collaboration Techniques: Forgotten Wisdom and New Approaches

In our increasingly agile world, the new buzzword is collaboration—so easy to preach but difficult to do well. Testers are challenged to work directly and productively with customers, programmers, business analysts, writers, trainers, and pretty much everyone in the business value chain. Testers and managers have many touch points of collaboration: grooming stories with customers, sprint planning with team members, reviewing user interaction with customers, troubleshooting bugs with developers, whiteboarding with peers, and buddy checking. Rob Sabourin and Dot Graham describe how collaboration worked on several agile projects, giving critiques of what worked well, where problems could arise, and additional aspects to consider. Join Rob and Dot to look at examples from agile projects and how forgotten but proven “ancient” techniques can be applied to your own collaboration, such as entry and exit criteria, role diversity, risk-based objectives, checklists, cross-checking, and root cause analysis. Bring your own stories of collaboration—good and bad—and see how forgotten wisdom can help improve today’s practices.

Document Transcript

TN PM Tutorial, 10/1/2013, 1:00:00 PM
"Collaboration Techniques: Forgotten Wisdom and New Approaches"
Presented by: Rob Sabourin, AmiBug.com, and Dorothy Graham, Software Test Consultant
Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Dorothy Graham, Software Test Consultant
With more than thirty years in testing, Dorothy Graham is coauthor of four books—Software Inspection, Software Test Automation, Foundations of Software Testing, and Experiences of Test Automation: Case Studies of Software Test Automation. Dot was a founding member of the ISEB Software Testing Board, a member of the working party that developed the first ISTQB Foundation Syllabus, and served on the boards of conferences and publications in software testing. A popular and entertaining speaker at conferences and seminars worldwide, she has been coming to STAR conferences since the first one in 1992. Dot holds the European Excellence Award in Software Testing and is the first recipient of the ISTQB Excellence Award. Learn more about Dot at DorothyGraham.co.uk.

Robert Sabourin, AmiBug.com
Rob Sabourin, P. Eng., has more than thirty years of management experience leading teams of software development professionals. A well-respected member of the software engineering community, Rob has managed, trained, mentored, and coached hundreds of top professionals in the field. He frequently speaks at conferences and writes on software engineering, SQA, testing, management, and internationalization. Rob wrote I am a Bug!, the popular software testing children's book; works as an adjunct professor of software engineering at McGill University; and serves as the principal consultant (and president/janitor) of AmiBug.Com, Inc. Contact Rob at rsabourin@amibug.com.
Collaboration Techniques: Forgotten Wisdom and New Approaches
Star West 2013, Anaheim, CA
Dorothy Graham, info@dorothygraham.co.uk, www.DorothyGraham.co.uk
Robert Sabourin, rsabourin@amibug.com, www.AmiBugShare.com
© Dorothy Graham, Robert Sabourin 2013
Twitter: @DorothyGraham, @RobertASabourin

Collaboration: forgotten wisdom
• In our increasingly agile world, the new buzzword is collaboration - so easy to preach but difficult to do well.
• Testers are challenged to work directly and productively with customers, programmers, business analysts, writers, trainers - everyone in the business value chain.
• Examples: grooming stories with customers, sprint planning, reviewing user interaction, troubleshooting bugs, whiteboarding, and buddy checking.
• Rob and Dot describe how collaboration worked on several agile projects, giving critiques of what worked well, where problems could arise, and additional aspects to consider.
• Look at examples from agile projects and how forgotten but proven “ancient” techniques can be applied to your own collaboration, such as entry and exit criteria, role diversity, risk-based objectives, checklists, cross-checking, and root cause analysis.
• Bring your own stories of collaboration—good and bad—and see how forgotten wisdom can help improve today’s practices.
Contents
• Collaboration
  – defined, benefits, risks, problems
• Some collaboration stories
  – Software reviews at Sun Soft
  – Buddy checks at Med Soft
• Exercise
• Another story (if time)
  – Story grooming at Maple Solutions
• Concluding discussion

Dictionary
• collaboration
  1. the act of working with another or others on a joint project
  2. something created by working jointly with another or others
  3. the act of cooperating as a traitor, esp with an enemy occupying one's own country
  (www.thefreedictionary.com/collaboration)
Benefits of collaboration - 1
• two heads are better than one
  – synergy is like a 3rd person
• diverse experiences drive better problem solving
  – opportunity to challenge, discuss, discover
  – a way to get “unstuck”
  – identify problems sooner
• better communication
  – direct and efficient, no long loops

Benefits of collaboration - 2
• more confidence in what you produce
• trouble-shooting more effective
• blend bug isolation with debugging
  – e.g. tester and developer working together
• learn things from the other
  – fill in the blanks for each other
• more productive
  – better quality produced more quickly
• more fun!
Risks of collaboration
• lowest level of tester independence
• groupthink
  – limited view may miss major things
  – don’t look outside the team
• focus on visible problems or less important ones
• no record of decision making
  – technical or project-related
  – no one has/takes full responsibility
• personality clashes – work against each other instead of with each other (see problems)

Problems in collaboration - 1
• people aren’t cogs – don’t always work well together
  – arbitrary assignment (group ≠ team)
• good “chemistry” can’t be forced
  – arbitrary disbanding of good teams
  – introvert / extravert
    • 2 i’s don’t communicate, 2 e’s don’t listen, i steam-rollered by e
  – placate: do what’s polite, not the correct or best thing
  – conflict: arguments, not progress
Problems in collaboration - 2
• cultural differences between team members
• different goals
  – e.g. support wants to resolve customer issues quickly, developers want to fix technical problems
• distributed team logistics
• collaborators at different levels in the organizational / decision-making hierarchy
  – might not have same level of autonomy
• people “taking criticism personally”
Objectives for the experience stories
• Share what happened
• Seek lessons from the past
• Encourage wise practices

Software Reviews at Sun Soft
Business Context
• Desktop product development
• Merged from dozens of diverse sources
Technical Context
• Blend new & legacy technologies
• Frail code, huge regression risks
Development Lifecycle
• 24 concurrent synchronized Scrum teams
• 2 major releases per year
Objectives and reviewers at SunSoft
• Clear objectives
  – break legacy code?
  – features using modified legacy code?
  – focus regression testing
  – conform to standards?
• All changed code reviewed by architect team
  – know the history and evolution of the code base
• Good objectives – critical for review success ✔
  – use static analysis tools to check vs standards? (see the sketch below)
• Choice of reviewers
  – role diversity is good
  – experts in this system – best choice ✔
  – dependence on individuals?
    • time / availability
    • bus syndrome?

Author’s role at Sun Soft
• Author did not participate in the review
  – not familiar enough with the legacy code
  – author corrects any defects identified
  – changes re-reviewed
• Excluding the author is not good (in general)
  – often done for the wrong reasons
  – author has the most to learn
  – author may also find the most defects
• Recommended more involvement of authors
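The SunSoft slide above asks whether static analysis tools could check conformance to coding standards. As Rob notes later in the article, running an analyzer over whole legacy files would flag too many pre-existing violations, so a practical variant is to check only the lines a change adds. The following is a minimal, hypothetical Python sketch of that idea; the base branch name and the two style rules are assumptions for illustration, not Sunsoft's actual standards or tooling.

```python
"""Diff-aware style check: flag violations only on lines added by the
current change, so pre-existing problems in legacy code are left alone.

Hypothetical sketch; branch name and rules are illustrative assumptions.
"""
import re
import subprocess
import sys

BASE = "origin/main"   # assumed integration branch
MAX_LINE = 100         # assumed maximum line length

def added_lines(diff_text):
    """Yield (path, line_number, text) for each line added by the diff."""
    path, line_no = None, 0
    for raw in diff_text.splitlines():
        if raw.startswith("+++ b/"):
            path = raw[len("+++ b/"):]
        elif raw.startswith("@@"):
            match = re.search(r"\+(\d+)", raw)   # "@@ -a,b +c,d @@" -> c
            line_no = int(match.group(1)) if match else 0
        elif raw.startswith("+") and not raw.startswith("+++"):
            yield path, line_no, raw[1:]
            line_no += 1

def main():
    diff = subprocess.run(
        ["git", "diff", "-U0", f"{BASE}...HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout
    problems = []
    for path, number, text in added_lines(diff):
        if len(text) > MAX_LINE:
            problems.append(f"{path}:{number}: line longer than {MAX_LINE} characters")
        if text != text.rstrip():
            problems.append(f"{path}:{number}: trailing whitespace")
    for problem in problems:
        print(problem)
    sys.exit(1 if problems else 0)   # non-zero exit fails the check

if __name__ == "__main__":
    main()
```

Because only added lines are inspected, the untouched legacy code generates no noise, which matches the team's goal of minimizing change.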
Other aspects at Sun Soft
• Quick feedback (hours)
  – suggested improvements
  – specific risks identified
• Very successful
  – minimized risk of changes destabilizing legacy code
    • especially by new developers not familiar with the legacy system
• Considered “anti-agile”
• Feedback quickly for greatest benefit ✔
• Reviews have always been one of the most successful quality techniques ✔
  – and still are
  – applied for your current context
  – even in agile!
Buddy Checks at Med Soft
Business Context
• Software for diagnostic medicine
• Used by clinicians at the point of care
Technical Context
• Code base primarily C++ and Delphi
• Supports multiple database engines
Development Lifecycle
• 2 Scrum teams
• Blends support and new development

Buddy Checks at Med Soft 1
• On check-in
  – author invites peer to review
• Who to review?
  – knowledgeable & friendly
  – author needs to be ready & happy with reviewer
  – enough expertise?
  – “preference war”
  – “pair-think”
• Approach
  – author explains context
  – highlights and walks through changes ✔
  – compare to relevant story or bug report
  – strongest feature: compare to other artifacts (cross-checking) ✔
  – author bias? less likely to find problems
Buddy Checks at Med Soft 2
• Changes
  – required changes are acted upon on the spot
• Change now? ✔
  – good: done done
  – but: “sleep on it” can help to see other things
• Identify risks
  – changes done well?
  – does artifact suit its purpose?
  – any superfluous changes made?
• Risks ✔
  – prioritize by risk
  – focus the review on important stuff
  – take a meta-view to improve more than just this artifact

Buddy Checks at Med Soft 3
• Ask questions
  – how does this change solve the problem?
  – does it introduce new concerns?
• Discuss alternatives
  – what other approaches were considered?
  – why was this chosen?
• Build process
  – build breaks if peer review was not done (see the sketch below)
• Questions & alternatives ✔
  – asking questions gives focus, finds different bugs
  – “challenging” mentality
  – what else might be affected?
• Required
  – part of the process
  – beware “lip service” ✔
• Missing?
  – capture the best ideas, learn from what’s missed
  – root cause analysis
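The "build breaks if peer review was not done" rule above can be automated in many ways. One simple convention, assumed here purely for illustration and not necessarily Med Soft's actual mechanism, is to require a "Reviewed-by:" trailer on every commit and have a CI step fail the build when it is missing. A minimal Python sketch:

```python
"""Fail the build when any commit on the branch lacks a peer-review record.

Hypothetical sketch: the 'Reviewed-by:' trailer convention and the base
branch name are assumptions for illustration.
"""
import subprocess
import sys

BASE = "origin/main"   # assumed integration branch

def commits_since(base):
    """SHAs of commits on HEAD that are not yet on the base branch."""
    out = subprocess.run(
        ["git", "rev-list", f"{base}..HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.split()

def names_a_reviewer(sha):
    """True if the commit message carries a 'Reviewed-by:' trailer."""
    message = subprocess.run(
        ["git", "show", "-s", "--format=%B", sha],
        capture_output=True, text=True, check=True,
    ).stdout
    return any(line.lower().startswith("reviewed-by:") for line in message.splitlines())

def main():
    unreviewed = [sha for sha in commits_since(BASE) if not names_a_reviewer(sha)]
    for sha in unreviewed:
        print(f"FAIL: commit {sha[:10]} does not name a reviewer")
    if unreviewed:
        sys.exit(1)   # breaks the build, as the buddy-check process requires
    print("Every commit names a reviewer.")

if __name__ == "__main__":
    main()
```

The point of such a gate is only to catch forgotten buddy checks; as the slide warns, it cannot prevent "lip service" reviews.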
Exercise
• four parts
  – part 1: in pairs: write one or more user stories
  – part 2: pairs combine into groups of 4: review each other’s stories and improve them
  – part 3a and 3b: use checklists to review stories
  – part 4: evaluate the review process
• pages handed out for each part, including a page to write your user stories and comments
Story Grooming at Maple Solutions
Business Context
• Pioneer in virtual technologies
• Small technology-driven venture
Technical Context
• All the latest technologies
• No legacy code
• Robust automated regression testing
Development Lifecycle
• One Scrum team
• Blends support and new development
Story Grooming at Maple 1
• Goal: improve stories
  – high priority stories
  – groomed before implementation
• Goal is good ✔
  – timing of session
• Time-boxed
  – fixed amount of time
  – ok to not cover all stories in one session
• Time spent ✔
  – do most important stories first (in case not all covered)
• Check readiness before
  – entry criteria ✔
  – lack of information kills
• Get team buy-in
  – all participate
  – BA facilitates meeting ✔
• Team participation good
  – dual role for BA, both needing full attention?
  – independent facilitator?

Story Grooming at Maple 2
• Understand, clarify
  – clear up ambiguities
  – clarify description
  – compare existing function
  – defined questions
  – explore alternative interpretations
• How to do it ✔
  – compare to existing functionality – gives objectivity
  – alternatives: potential misunderstandings
  – what about other stories, previous changes, affected areas?
• Elicit acceptance tests (see the sketch below)
  – examples from product owner
  – normal, alternative and error paths
• Acceptance tests ✔
  – also help to clarify & understand
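The acceptance tests elicited during grooming can be captured directly as executable checks covering the normal, alternative, and error paths the product owner describes. The sketch below is purely illustrative: the "discount code at checkout" story and the apply_discount function are invented for the example and are not from the Maple Solutions project.

```python
"""Acceptance checks derived from product-owner examples for a groomed story,
covering a normal, an alternative, and an error path.

Hypothetical sketch; the story and function are invented for illustration.
"""
import pytest

def apply_discount(total, code):
    """Toy implementation so the example runs end to end."""
    if code == "SAVE10":
        return round(total * 0.9, 2)      # valid code: 10% off
    if code == "EXPIRED":
        raise ValueError("discount code has expired")
    return total                          # unknown codes leave the total unchanged

def test_valid_code_reduces_total():
    # Normal path: the product owner's main example.
    assert apply_discount(100.00, "SAVE10") == 90.00

def test_unknown_code_leaves_total_unchanged():
    # Alternative path: a mistyped code is ignored rather than blocking checkout.
    assert apply_discount(100.00, "TYPO") == 100.00

def test_expired_code_is_reported_as_an_error():
    # Error path: an expired code raises a clear error.
    with pytest.raises(ValueError):
        apply_discount(100.00, "EXPIRED")
```

Run with pytest; each failing case points back to a product-owner example that the implementation does not yet satisfy, which also helps clarify and understand the story itself.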
Story Grooming at Maple 3
• Estimation
  – team sizes the story/stories
  – decide whether to postpone or pull a story
• Estimation problems?
  – all necessary information for estimation?
  – right people there?
• Team building
  – team works to all 4 objectives
  – team buy-in
  – good team spirit is worth a lot
  – have fun! ✔
• Teamwork ✔
  – all work is considered
  – changes implemented at the time
  – look back at the changes later (“sleep on it”)?

Concluding remarks
In Agile, collaboration is key
Wisdom from ancient techniques
Collaboration is all about people
Summary: good collaboration
• compatible people
  – mutual respect, but ok to challenge
  – work at the relationship
  – equal effort and equal recognition
• goals/objectives for the resulting product
• timely (time-boxed)
• look outside as well (cross-check, checklists)
• risk-driven tasks

Thank you
• Questions …
• Comments …
• Concerns …
• Share your collaboration stories …
info@dorothygraham.co.uk, www.DorothyGraham.co.uk
rsabourin@amibug.com, www.AmiBugShare.com
Better Software, January/February 2013 (www.TechWell.com)
This past summer, Dot Graham and I, Rob Sabourin, were working together in the hallowed offices of Grove House in the picturesque hamlet of Macclesfield, England. I was sharing some recent research and task-analysis experiences in the area of agile project collaboration. In many of the successful agile teams I work with, testers collaborate frequently and directly with programmers, business analysts, customers, and other team members. I collected many examples by interviewing team members and by observing collaboration in action, and I documented each collaboration story. My goal was to build resources to help teach collaboration, over and above the generic warm and fuzzy team-building approaches, and to look deeply into how collaboration actually takes place.

As I started to share collaboration stories, Dot observed that several aspects of agile collaboration bore a striking similarity to a team-based collaboration technique known as “software inspection.” In software inspection, a small group of individuals works together to identify defects, weaknesses, and potential improvements in any software development work product. Software inspections have been used since the late 1970s and are implemented independently of the lifecycle model in use. I have been implementing software inspections since the 1980s, and Dot’s book Software Inspection, co-authored with Tom Gilb, has been an important inspiration and guide. Dot has taught software inspection techniques to inspection moderators and inspectors over many years, focusing in later years on a lean version of the process called “agile inspection.”

However, interest in inspection seems to have declined in recent years, or at least people aren’t admitting to doing it. It is no longer an attractive topic at conferences, nor is it discussed much in blogs, magazines, or forums. We both feel it is a real shame that these extremely effective techniques have been abandoned. What is the reason? Have the techniques just “gone out of fashion,” or have they stopped working?

The agile story in this article is just one example. The story is real, with the company and context sanitized to protect the innocent. Note that the practices described are imperfect and include adaptations that may vary from recommended agile practices or strict adherence to the Agile Manifesto or its guiding principles. I present the story below, interleaved with our comments. Dot comments on similarities to recommendations from the “ancient wisdom” but also highlights possible dangers—lessons learned from the past that can help you in the future.

A Software Review Story

Sunsoft is the world leader in its product market. It has built this leadership position through rich product innovations and many strategic corporate acquisitions. Its products run on high-end workstations and desktops, and typical development projects involve adding new capabilities to existing product families. Sunsoft products have been on the market for more than ten years. They are based on a large amount of legacy code that has an eclectic history and poor documentation and is very difficult to maintain. Frequently, small changes to this code introduce regression bugs that are difficult to identify during development sprints. Implementing new features often requires the modification or complete refactoring of code. Sunsoft does not have automated regression testing of legacy features.
Automated unit and story tests are being created for new features but not for legacy enhancements.

Sunsoft implements a variation of Scrum. Each product has several feature teams. Each feature team includes a ScrumMaster, designer, test lead, development lead, and documentation lead, along with a mix of developers, testers, and writers. Team size does not exceed ten members. Sprints are three weeks long. Products go through a beta release cycle of about two months before commercial deployment. There is one major and one minor product release to the market per year; patch releases are made to correct urgent field-reported concerns. Major and minor releases take place at fixed dates and are synchronized between all product teams.

When a story’s coding and unit testing are complete, all new or changed code is submitted to a formal review process. An automated review-management service governs and regulates the review workflow; only reviewed code can be checked into the source control system (a sketch of such a gate appears after the review objectives below). A small team of architects and senior developers does the reviews. The author does not participate in the review.

Dot: A small team with a variety of participants is a good idea. Having domain experts involved is also a good idea. However, excluding the author may not be a good idea. Authors are normally excluded to “protect” them from review comments that may be perceived as aggressive or threatening. But, if the author is not present, she misses out on the opportunity to hear detailed technical discussions that could be of immense benefit to her future work. There are better (and very important) ways to protect the author while including her in the review meeting. Another surprising thing is that the author often has the most to contribute to the review as well, due to her more detailed understanding of the artifact being reviewed. You would think that the author has already said everything she knows, but the discussions and questions that come up in the review can trigger deeper understanding and identify more significant problems when the author is present.

Rob: Good point. In this case, the author was excluded from the review because the author was not familiar with the legacy code. My recommendation in this area was to involve the author a bit more in the review process to improve knowledge sharing and make sure reviewers were aware of the intent of the changes to legacy code.

Code is reviewed in order to:
1. Determine if there is a risk of breaking existing legacy features
2. Check conformance to coding standards
3. Identify features that use any modified legacy code
4. Focus regression testing
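As a concrete illustration of the gate mentioned above, a server-side Git pre-receive hook could refuse any push containing commits the review workflow has not approved. This is a hedged sketch only: the plain-text file of approved commit SHAs stands in for whatever interface Sunsoft’s actual review-management service exposed.

```python
#!/usr/bin/env python3
"""Server-side gate: reject pushes containing commits not yet approved by
the review workflow, so only reviewed code reaches source control.

Hypothetical sketch; the approved-SHAs file is an illustrative assumption.
"""
import subprocess
import sys

APPROVED_FILE = "/srv/review-service/approved-shas.txt"   # assumed location
ZERO = "0" * 40   # Git's "no such ref" SHA

def approved_shas():
    try:
        with open(APPROVED_FILE) as handle:
            return {line.strip() for line in handle if line.strip()}
    except FileNotFoundError:
        return set()

def pushed_commits(old, new):
    """Commits introduced by this push (whole history for a brand-new ref)."""
    spec = new if old == ZERO else f"{old}..{new}"
    out = subprocess.run(
        ["git", "rev-list", spec],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.split()

def main():
    approved = approved_shas()
    rejected = []
    for line in sys.stdin:                 # pre-receive stdin: "<old> <new> <ref>"
        if not line.strip():
            continue
        old, new, _ref = line.split()
        if new == ZERO:                    # ref deletion: nothing to review
            continue
        rejected += [sha for sha in pushed_commits(old, new) if sha not in approved]
    for sha in rejected:
        print(f"rejected: commit {sha[:10]} has not passed review", file=sys.stderr)
    sys.exit(1 if rejected else 0)         # non-zero exit makes Git refuse the push

if __name__ == "__main__":
    main()
```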
Dot: Having clear objectives for the review is critical to success. These are good objectives. However, it may be possible to check conformance to coding standards using static-analysis tools, which could be run prior to the review.

Rob: Static-analysis tools might do the trick for new code, but from what I saw, a lot of the legacy code did not conform to the coding standards. I suspect static-analysis tools would have flagged too many violations in the original code. The team was trying to minimize change.

Developers must correct any defects identified. Changed code is re-reviewed prior to check-in.

Dot: Correction of defects outside of the review meeting gives the developer a chance to take a step back and think about the defect from a wider perspective. It also gives her an opportunity to look for and fix other areas of the code where a similar defect could occur. Re-reviewing changes is a good idea, where at least one other person confirms that the changes are correct (but not necessarily in another full review meeting).

Reviewers include senior architects and developers who are familiar with the history and evolution of the code base. A review can be completed within a few hours of code submission.

Dot: Having senior architects and developers as reviewers is good, as their knowledge will contribute to the quality of defect identification and fixing. However, there is a danger of relying too much on individual expertise. If these individuals were to leave, their knowledge may leave with them. It would be better to include other reviewers who are not as experienced so that they can begin to pick up some of this knowledge. It may also be useful to encapsulate some of the experts’ knowledge in checklists that could later be used by less-experienced people if the experts were not available for any reason. It is excellent to give developers quick feedback within a few hours of submitting the new or changed code. The sooner feedback is given, the fresher the ideas are in their minds and the more effective the learning experience. However, this assumes that the reviewers and experts are available at short notice and that they are all willing and able to drop everything in order to do the review.

Rob: Good point about the dependency on experts, who may be very busy themselves. The experts were always available for reviews and were seemingly dedicated to the review process Sunsoft established. I can imagine that a backlog could develop if several teams modified legacy code at the same time, but the review workflow seemed quite fluid.

I spent several months performing a detailed task analysis of all testing and development projects at Sunsoft. Although several process changes were proposed, it was determined that the Sunsoft review process effectively minimized regression risk due to modifications and enhancements to legacy code, especially those made by new developers unfamiliar with the legacy code base. Note that Sunsoft’s automated unit and story tests effectively exercised new features but did not help identify regression bugs in legacy code.

I consulted some independent agile transition experts about the Sunsoft review process. These agile development experts suggested that the Sunsoft review method was “anti-agile.”

Dot: It’s great to hear that this review process was so successful. They followed many of the principles of software inspection—a process that still works!
However, this is a surprising reaction from agile experts! It seems almost as though agile theory is more important than practices that work.

Sunsoft code reviews were implemented as a process bridge during the company’s agile transition. At the time of this task analysis, Scrum had been in use for just under two years. It is anticipated that, eventually, automated regression testing at the unit and story levels will mitigate the need for such a comprehensive review of all code changes at Sunsoft.

Dot: It would make a very interesting study to see what regression bugs (if any) creep in after reviews are abandoned at Sunsoft!

Some Concluding Comments

In this story, we have seen how a “traditional” review process was used on changes to legacy code in an organization using agile development. The tried-and-true techniques are shown to work well at Sunsoft, regardless of how some agile experts categorize them. There are many great lessons that we can apply from reviews and inspections to the many challenges that agile teams face in collaborating today. One of the most important lessons is that we can find problems early with minimum effort by getting a group of people with diverse skills and experience to work well together.

The “old” technique, rather than being “past its sell-by date,” actually has some powerful advice for today’s development. Of course, most “ancient texts” do need some interpretation for the current day. If you read Software Inspection, you will find things that don’t apply today (e.g., heavy process and documented planning), but you will also find many nuggets of information and tips for collaborating better that are eminently applicable now and in the future.

These “ancient” techniques can help you improve collaboration on agile teams. We hope that the tips we have outlined in this article will help you do better reviewing and receive more benefits from reviews, however they are done in your workplace. {end}

rsabourin@amibug.com
dot@dorothygraham.co.uk

For more on the following topic, go to www.StickyMinds.com/bettersoftware: Further reading.