In our increasingly agile world, the new buzzword is collaboration—so easy to preach but difficult to do well. Testers are challenged to work directly and productively with customers, programmers, business analysts, writers, trainers, and pretty much everyone in the business value chain. Testers and managers have many touch points of collaboration: grooming stories with customers, sprint planning with team members, reviewing user interaction with customers, troubleshooting bugs with developers, whiteboarding with peers, and buddy checking. Rob Sabourin and Dot Graham describe how collaboration worked on several agile projects, giving critiques of what worked well, where problems could arise, and additional aspects to consider. Join Rob and Dot to look at examples from agile projects and how forgotten but proven “ancient” techniques can be applied to your own collaboration, such as entry and exit criteria, role diversity, risk-based objectives, checklists, cross-checking, and root cause analysis. Bring your own stories of collaboration—good and bad—and see how forgotten wisdom can help improve today’s practices.
Collaboration Techniques: Forgotten Wisdom and New Approaches
TN PM Tutorial
10/1/2013 1:00:00 PM
"Collaboration Techniques:
Forgotten Wisdom and New
Approaches"
Presented by:
Rob Sabourin, AmiBug.com
Dorothy Graham, Software Test Consultant
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Dorothy Graham
Software Test Consultant
In testing for more than thirty years, Dorothy Graham is coauthor of four books—Software
Inspection, Software Test Automation, Foundations of Software Testing, and Experiences of
Test Automation: Case Studies of Software Test Automation. Dot was a founding member of the
ISEB Software Testing Board, a member of the working party that developed the first ISTQB
Foundation Syllabus, and served on the boards of conferences and publications in software
testing. A popular and entertaining speaker at conferences and seminars worldwide, she has
been coming to STAR conferences since the first one in 1992. Dot holds the European
Excellence Award in Software Testing and is the first recipient of the ISTQB Excellence Award.
Learn more about Dot at DorothyGraham.co.uk.
Robert Sabourin
AmiBug.com
Rob Sabourin, P. Eng., has more than thirty years of management experience leading teams of
software development professionals. A well-respected member of the software engineering
community, Rob has managed, trained, mentored, and coached hundreds of top professionals
in the field. He frequently speaks at conferences and writes on software engineering, SQA,
testing, management, and internationalization. Rob wrote I am a Bug!, the popular software
testing children's book; works as an adjunct professor of software engineering at McGill
University; and serves as the principal consultant (and president/janitor) of AmiBug.Com, Inc.
Contact Rob at rsabourin@amibug.com.
This past summer, Dot Graham and I, Rob Sabourin, were
working together in the hallowed offices of Grove House in
the picturesque hamlet of Macclesfield, England. I was sharing
some recent research and task-analysis experiences in the area
of agile project collaboration. In many of the successful agile
teams I work with, testers collaborate frequently and directly
with programmers, business analysts, customers, and other
team members.
I collected many examples by interviewing team members
and by observing collaboration in action. I documented each
collaboration story. My goal was to build resources to help
teach collaboration over and above the generic warm-and-fuzzy
team-building approaches. I wanted to look deeply into how collaboration actually takes place.
As I started to share collaboration stories, Dot observed
that there were several aspects of agile collaboration that bore
a striking similarity to a team-based collaboration technique
known as “software inspection.” In software inspection, a
small group of individuals work together to identify defects,
weaknesses, and potential improvements in any software development work product. Software inspections have been used
since the late 1970s and are implemented independent of the
lifecycle model in use. I have been implementing software inspections since the 1980s, and Dot’s book, Software Inspection, co-authored with Tom Gilb, has been an important inspiration and guide.
Dot has taught software inspection techniques to inspection
moderators and inspectors over many years, focusing in later
years on a lean version of the process called “agile inspection.”
However, interest in inspection seems to have declined in recent
years, or at least people aren’t admitting to doing it. It is no
longer an attractive topic at conferences or discussed much in
blogs, magazines, or forums.
We both feel that it is a real shame that these techniques,
which are extremely effective, have been abandoned. What is
the reason for this? Have these techniques just “gone out of
fashion,” or have they stopped working?
The agile story in this article is just one example. The story
is real, with the company and context sanitized to protect the
innocent. Note that the practices described are imperfect and
include adaptations that may vary from recommended agile
practices or strict adherence to the Agile Manifesto or agile
guiding principles.
I present the story with our comments in italics. Dot comments on similarities to recommendations from the “ancient
wisdom” but also highlights possible dangers—lessons learned
from the past that can help you in the future.
A Software Review Story
Sunsoft is the world leader in its product market. It has built
this leadership position through rich product innovations and
many strategic corporate acquisitions. Its products run on high-end workstations and desktops, and typical development projects involve adding new capabilities to existing product families.
Sunsoft products have been on the market for more than
ten years. They are based on a large amount of legacy code
that has an eclectic history and poor documentation, and is very
difficult to maintain. Frequently, small changes to this code
introduce regression bugs that are difficult to identify during
development sprints. Implementing new features often requires
the modification or complete refactoring of code. Sunsoft does
not have automated regression testing of legacy features. Automated unit and story tests are being created for new features
but not for legacy enhancements.
Sunsoft implements a variation of Scrum. Each product
has several feature teams. Each feature team includes a ScrumMaster, designer, test lead, development lead, and documentation lead. Each team also includes a mix of developers, testers,
and writers. Team size does not exceed ten members.
Sprints are three weeks long. Products experience a beta
release cycle of about two months before commercial deployment. There is one major and one minor product release to
the market per year. Patch releases are made to correct urgent
field-reported concerns. Major and minor releases take place
at fixed dates and are synchronized between all product teams.
When a story’s coding and unit testing are complete, all new
or changed code is submitted to a formal review process. An
automated review-management service governs and regulates
the review workflow. Only reviewed code can be checked into
the source control system.
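The story does not say how Sunsoft's review-management service was implemented, but the rule it enforces, that nothing reaches source control without a completed review, maps naturally onto a server-side hook. What follows is a minimal sketch in Python, assuming a Git server and a hypothetical review service at review.example.com; the endpoint, its response shape, and the hook details are illustrative, not Sunsoft's actual tooling.

#!/usr/bin/env python3
"""Server-side pre-receive hook: reject pushes containing unreviewed commits.

A minimal sketch only. The review service URL and its JSON response
shape ({"approved": true}) are hypothetical stand-ins for whatever
review-management service governs the team's workflow.
"""
import json
import subprocess
import sys
import urllib.request

REVIEW_SERVICE = "https://review.example.com/api/reviews/"  # hypothetical
ZERO_SHA = "0" * 40  # Git reports this sha for ref creation/deletion

def is_approved(sha: str) -> bool:
    """Ask the review service whether this commit has an approved review."""
    try:
        with urllib.request.urlopen(REVIEW_SERVICE + sha, timeout=10) as resp:
            return json.load(resp).get("approved", False)
    except (OSError, ValueError):
        return False  # fail closed: no answer means no check-in

def commits_in_push(old: str, new: str) -> list[str]:
    """List the commits this push would add to the ref."""
    spec = new if old == ZERO_SHA else f"{old}..{new}"
    out = subprocess.run(["git", "rev-list", spec],
                         capture_output=True, text=True, check=True)
    return out.stdout.split()

def main() -> int:
    # Git feeds the hook one "<old-sha> <new-sha> <ref-name>" line per ref.
    for line in sys.stdin:
        old, new, _ref = line.split()
        if new == ZERO_SHA:
            continue  # ref deletion: nothing new to review
        for sha in commits_in_push(old, new):
            if not is_approved(sha):
                print(f"rejected: commit {sha} has no approved review")
                return 1  # non-zero exit makes Git refuse the push
    return 0

if __name__ == "__main__":
    sys.exit(main())

Failing closed when the service cannot answer keeps the guarantee intact: an unreviewed commit is never checked in by accident.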
A small team of architects and senior developers does the
reviews. The author does not participate in the review.
Dot: A small team with a variety of participants is a good
idea. Having domain experts involved is also a good idea.
However, excluding the author may not be a good idea. Authors are normally excluded to “protect” them from review
comments that may be perceived as aggressive or threatening.
But, if the author is not present, she misses out on the opportunity to hear detailed technical discussions that could be of
immense benefit to her future work. There are better (and very
important) ways to protect the author while including her in
the review meeting.
Another surprising thing is that the author often has the
most to contribute to the review as well, due to her more-detailed understanding of the artifact being reviewed. You would
think that the author has already said everything she knows,
but the discussions and questions that come up in the review
can trigger deeper understanding and identify more significant
problems when the author is present.
Rob: Good point. In this case, the author was excluded
from the review because the author was not familiar with the
legacy code. My recommendation in this area was to involve
the author a bit more in the review process to improve knowledge sharing and make sure reviewers were aware of the intent
of the changes to legacy code.
Code is reviewed in order to:
1. Determine if there is a risk of breaking existing legacy
features
2. Check conformance to coding standards
3. Identify features that use any modified legacy code
4. Focus regression testing
Dot: Having clear objectives for the review is critical to success. These are good objectives. However, it may be possible
to check conformance to coding standards using static-analysis
tools, which could be run prior to the review.
Rob: Static-analysis tools might do the trick for new code,
but from what I saw, a lot of the legacy code did not conform
to the coding standards. I suspect static-analysis tools would
have flagged too many violations in the original code. The
team was trying to minimize change.
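There is a practical middle ground behind Rob's objection: run the standards checker only over the files a change touches, so violations in untouched legacy code are never flagged. Here is a minimal sketch, assuming a Git repository, an integration branch named origin/main, C sources, and a hypothetical lint command that exits non-zero on violations; a stricter variant would filter the checker's report down to the changed lines themselves.

#!/usr/bin/env python3
"""Lint only the files a change touches, leaving legacy code unflagged.

A sketch: the integration branch name, the C file extensions, and the
"lint" command itself are assumptions, not Sunsoft's actual setup.
"""
import subprocess
import sys

BASE = "origin/main"  # assumed integration branch

def changed_files(base: str = BASE) -> list[str]:
    """Source files modified between the base branch and HEAD."""
    out = subprocess.run(["git", "diff", "--name-only", f"{base}...HEAD"],
                         capture_output=True, text=True, check=True)
    return [f for f in out.stdout.split() if f.endswith((".c", ".h"))]

def main() -> int:
    files = changed_files()
    if not files:
        print("no source files changed; nothing to lint")
        return 0
    # Hypothetical checker: exits non-zero when the coding standard is violated.
    return subprocess.run(["lint", *files]).returncode

if __name__ == "__main__":
    sys.exit(main())

Run before code is submitted for review, a filter like this gives reviewers the head start Dot describes without pushing the team to rework code it never meant to change.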
Developers must correct any defects identified. Changed
code is re-reviewed prior to check-in.
Dot: Correction of defects outside of the review meeting
gives the developer a chance to take a step back and think
about the defect from a wider perspective. It also gives her an
opportunity to look for and fix other areas of the code where a
similar defect could occur.
Re-reviewing changes is a good idea, where at least one
other person confirms that the changes are correct (but not
necessarily another full review meeting).
Reviewers include senior architects and developers who are familiar with the history and evolution of the code
base. A review can be completed within a few hours of code
submission.
Dot: Having senior architects and developers as reviewers
is good, as their knowledge will contribute to the quality of
defect identification and fixing.
However, there is a danger of relying too much on individual expertise. If these individuals were to leave, their knowledge would leave with them. It would be better to include other
reviewers who were not as experienced so that they can begin
to pick up some of this knowledge. It may also be useful to try
to encapsulate some of their knowledge in checklists that could
later be used by less-experienced people if the experts were not
available for any reason.
It’s excellent to have quick feedback to developers within a
few hours of submitting the new or changed code. The sooner
feedback is given, the fresher the ideas are in their minds and
the more effective the learning experience.
However, this assumes that the reviewers and experts are
available at short notice for the review and that they are all
willing and able to drop everything in order to do the review.
Rob: Good point about the dependency on experts who
may be very busy themselves. The experts were always available for reviews and were seemingly dedicated to the review
process Sunsoft established. I can imagine that a backlog could
develop if several teams modified legacy code at the same time,
but the review workflow seemed quite fluid.
I spent several months performing a detailed task analysis
of all testing and development projects at Sunsoft. Although
several process changes were proposed, it was determined that
the Sunsoft review process effectively minimized regression risk
due to modifications and enhancements to legacy code, especially those made by new developers unfamiliar with the legacy
code base. Note that Sunsoft’s automated unit and story tests
effectively exercised new features but did not help identify regression bugs in legacy code.
I consulted some independent agile transition experts about
the Sunsoft review process. These agile development experts
suggested that the Sunsoft review method was “anti-agile.”
Dot: It’s great to hear that this review process was so successful. They followed many of the principles of software inspection—a process that still works!
However, this is a surprising reaction from agile experts! It
seems almost as though agile theory is more important than
practices that work.
Sunsoft code reviews were implemented as a process bridge
during the company’s agile transition. At the time of this task
analysis, Scrum had been in use for just under two years. It is
anticipated that, eventually, automated regression testing at the
unit and story levels will reduce the need for such a comprehensive review of all code changes at Sunsoft.
Dot: It would make a very interesting study to see what regression bugs (if any) creep in after reviews are abandoned at
Sunsoft!
Some Concluding Comments
In this story, we have seen how a “traditional” review process was used on changes to legacy code in an organization
using agile development. The tried-and-true techniques are
shown to work well at Sunsoft, regardless of how some agile
experts categorize them.
There are many great lessons that we can apply from reviews and inspections to the many challenges that agile teams
face in collaborating today. One of the most important lessons
is that we can find problems early with minimum effort by getting a group of people with diverse skills and experience to
work well together.
The “old” technique, rather than being “past its sell-by
date,” actually has some powerful advice for today’s development. Of course, most “ancient texts” do need some interpretation for the current day. If you read Software Inspection, you
will find things that don’t apply today (e.g., heavy process and
documented planning), but you will find many nuggets of information and tips for collaborating better that are eminently
applicable now and in the future. These “ancient” techniques
can help you improve collaboration on agile teams. We hope
that the tips we have outlined in this article will help you do
better reviewing and receive more benefits from reviews, however they are done in your workplace.
rsabourin@amibug.com
dot@dorothygraham.co.uk
For more on the following topic, go to www.StickyMinds.com/bettersoftware:
Further reading