Odd-e: Acceptance Test Driven Development in Practice
Presentation Transcript

  • Acceptance Test Driven Development in Practice
    • Steven Mak 麥天志
    • [email_address]
  • What are we up to now?
    • Lost in translation
    • Requirements do not explain why
    • Gaps discovered only after coding has started
    • Cumulative effects of small misunderstandings
    • Inadequate and essentially flawed requirements and specifications
  • Failing to meet actual needs
    • Are obvious things really obvious?
    • Fulfilling specifications does not guarantee success
    • Imperative requirements
  • Meeting the needs with Acceptance TDD
    • Drive implementation of a requirement through a set of automated, executable acceptance tests
    (Diagram: Requirement → Acceptance Test → Implementation → Feedback cycle)
  • ATDD in a Nutshell
    • Real-world examples to build a shared understanding of the domain
    • Select a set of these examples to be a specification and an acceptance test suite
    • Automate the verification of acceptance tests
    • Focus the software development effort on the acceptance tests
    • Use the set of acceptance tests to facilitate discussion about future change requests.
  • The ATDD cycle (diagram: customer documentation feeds an A-TDD workshop with developers, testers, the product owner, an architect, and technical writers; the workshop produces example tests that guide coding, testing, architecture, and other activities until the feature is done)
  • Benefits of ATDD
    • Comprehensible examples over complex formulas
    • Close Collaboration
    • Definition of Done
    • Co-operative Work
    • Trust and Commitment
    • Testing on system level
  • Variations - an escape?
    • Behaviour-driven development
    • Example-driven development
    • Executable specifications
    • Names do not matter, but underlying practices matter
    • Worthwhile to try if your business people do not like “testing”
  • Ideal candidate to work with
    • Shared interest in success
    • Authority to make decision
    • Ability to understand implications
    • Ability to explain the domain
  • Specification by Examples
    • Use realistic examples to illustrate the different possibilities instead of abstract requirements
    • Write specifications down as tables
    • Workflows:
      • Preconditions
      • Processing steps
      • Verifications
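    To make the preconditions / processing steps / verifications structure concrete, here is a minimal sketch of one such example turned into an executable acceptance test (JUnit shown; the domain, class names, and numbers are invented for illustration):

        import org.junit.Test;
        import static org.junit.Assert.assertEquals;

        // A customer example ("a 100 EUR order with a 10% member discount costs 90 EUR")
        // written directly as an executable acceptance test. The Order class is a tiny
        // stand-in so the sketch is self-contained.
        public class OrderDiscountAcceptanceTest {

            static class Order {
                private double total;
                Order(double total) { this.total = total; }
                void applyMemberDiscount(double rate) { total = total * (1 - rate); }
                double total() { return total; }
            }

            @Test
            public void memberDiscountIsAppliedToOrderTotal() {
                Order order = new Order(100.00);            // precondition from the example
                order.applyMemberDiscount(0.10);            // processing step
                assertEquals(90.00, order.total(), 0.001);  // verification
            }
        }

    The same example could equally be written as a table row (order total, discount rate, expected price) for a table-based tool such as FIT, discussed below.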
  • Examples, tests, and specs can become elaborate
  • Specification workshop
    • Ask the domain experts
    • Developers and testers should suggest examples of edges or important issues for discussion
    • Ubiquitous language
    • Organise feedback to ensure shared understanding
    • Use facilitator to stay focused if needed
  • Acceptance criteria
    • Write tests collaboratively
    • Examples in a form close to what your automation tool can understand
    • Keep tests in a form that is human-readable
    • Specification over scripting: describe WHAT, not HOW
    • Acceptance tests are there to prevent defects, not to discover them
    • Acceptance tests do not necessarily have to be automated
    • Acceptance tests only work when we can discuss them
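    As a hypothetical illustration of “describe WHAT, not HOW”: a specification-style example states the business rule directly, for instance

        buyer type | order total | price charged
        member     | 100         | 90

    whereas a script-style test spells out UI mechanics (open the checkout page, type 100 into the amount field, click Submit), which is harder for business people to read and breaks whenever the UI changes.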
  • Some considerations
    • User Interface
      • Easy?
      • Fragile?
      • Performance issues?
    • Boundary of Stub
      • Sufficiently close
      • Simulators?
    • Business logic
      • Not from the developer's perspective
  • Acceptance Test smells
    • Long tests
    • Parameters of calculation tests that always have the same value
    • Similar tests with minor differences
    • Tests that reflect the way code was written
    • Tests that fail intermittently even though you didn’t change any code
    • Interdependent tests, e.g. setup for others
  • Change
    • Use existing acceptance tests to discuss future changes
    • When a test fails, seek advice from the customer to determine whether it specifies obsolete functionality
    • Automate periodic execution of regression tests with CI
    • Keep tests in the same version control as code
  • Tools
    • Table-based frameworks
      • FIT, http://fit.c2.com
      • RobotFramework, http://robotframework.org
    • Text-based frameworks
      • Exactor, http://exactor.sourceforge.net
      • TextTest, http://texttest.carmen.se
  • FIT
    • FIT stands for “Framework for Integrated Tests”
    • The most popular framework in use
    • Table-based
    • Supports languages such as Java, C#, Python, and Ruby
  • FIT in practice
    • The customer writes a test document containing examples (a test doc with tables)
    • Technical staff enhance the tables in the doc (a test doc with sanitised tables)
    • Technical staff implement fixture classes (the test doc plus backing code, e.g. Java)
    • The result is an executable test
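    As a sketch of the backing-code step, here is a fixture in the classic FIT column-fixture style (the class name, columns, and values are illustrative): FIT fills the public fields from the input columns of each table row and compares the value returned by a method such as quotient() against the expected value in the row.

        import fit.ColumnFixture;

        // Illustrative FIT column fixture: public fields receive the input
        // columns of each row; quotient() provides the calculated column that
        // FIT verifies against the table.
        public class Division extends ColumnFixture {
            public double numerator;    // input column
            public double denominator;  // input column

            public double quotient() {  // calculated column verified by FIT
                return numerator / denominator;
            }
        }

    The matching table in the customer's test document would then look something like:

        Division
        numerator | denominator | quotient()
        1000      | 10          | 100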
  • Robot Framework
    • Python-based, keyword-driven test automation framework
    • Test libraries implemented either in Python or Java
    • Test cases are written in tabular format, saved in HTML or TSV files
    • Syntax similar to natural language
    • Users can create new keywords from existing ones and contribute to the project
  • Preparing test cases: the first column gives the test case name, the second the test procedure using keywords, and the remaining columns the keyword arguments

        Test Case     | Action              | Argument
        Valid Login   | Open Login Page     |
                      | Input Name          | demo
                      | Input Password      | mode
                      | Submit Credentials  |
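    A minimal sketch of a test library that could back the keywords above. The slide notes that libraries can be implemented in Python or Java; this sketch uses Java (usable when Robot Framework runs on Jython). Robot Framework maps public method names to keywords, so inputName(...) is called for the keyword “Input Name”; the login behaviour itself is invented for illustration.

        // Hypothetical Robot Framework test library: each public method becomes
        // a keyword ("Open Login Page", "Input Name", ...). A real library would
        // drive the application under test instead of holding state in fields.
        public class LoginLibrary {
            private String name;
            private String password;
            private boolean pageOpen;

            public void openLoginPage() {
                pageOpen = true;
            }

            public void inputName(String name) {
                this.name = name;
            }

            public void inputPassword(String password) {
                this.password = password;
            }

            public void submitCredentials() {
                if (!pageOpen || !"demo".equals(name) || !"mode".equals(password)) {
                    // a thrown exception fails the keyword and the test case
                    throw new RuntimeException("Login failed");
                }
            }
        }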
  • Data-driven test cases
    • Define a keyword that takes the input data as arguments, then prepare a table of test cases that call it with different data (see the sketch below)
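    For example (keyword name and data invented for illustration), a user keyword that takes the username and password as arguments lets each data-driven test case be a single row:

        Test Case        | Action                   | Argument | Argument
        Valid Login      | Login With Credentials   | demo     | mode
        Wrong Password   | Login With Credentials   | demo     | wrong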
  • Test case organisation
    • Simple way: Single HTML file containing all test cases
    • Test case tagging
  • Execution
    • Gathering test cases, reading and setting variables
    • Executing all actions for every test case
    • Providing global statistics
    • Writing the output in XML format
    • Generating report and log in HTML format
  • Sample execution result
  • Sample Test Report
    • In HTML format, showing all actions executed up to the failing action, with the failure message
  • Tested application Interface
    • Command line
      • OperatingSystem
      • SSHLibrary
      • Telnet library
    • Web
      • Robot Selenium
    • GUI
      • Swing GUI library
  • Adoption
    • Sense of achievement
    • Integrity
    • Openness
    • Right timing
  • Facilitate Adoption
    • Evangelise
    • Lower the bar
    • Train and educate
    • Share and infect
    • Coach and facilitate
    • Involve others by giving them roles
  • Organisational Challenges
    • Business Analysts
    • Testers
    • Developers
  • References
    • Bridging the Communication Gap
      • Gojko Adzic
    • Test Driven: Practical TDD and Acceptance TDD for Java Developers
      • Lasse Koskela
    • Agile Testing
      • Lisa Crispin and Janet Gregory
  • Thank you! 多謝!
    • Steven Mak 麥天志
    • Email: [email_address]
    • Twitter: http://twitter.com/stevenmak