12 Rational Solo Pruebas 2009
Rational's presentation at Solo Pruebas 2009
Transcript of "12 Rational Solo Pruebas 2009"

Slide 1. Model Driven Testing (MDT): Los modelos al servicio de las pruebas (models at the service of testing)
Francisco J. López Minaya, Rational Technical Sales Specialist
Slide 2. Agenda
- Model Driven Development
- Model Driven Testing
- Summary
Slide 3. Today’s Systems are complex…
Slide 15. Development Process is Evolving…
[Timeline chart: development moves from ASM and C toward Model Driven Development (MDD), 1960-2010]
Slide 16. Advantages of Model Driven Development - Proven!
Embedded Market Forecasters, documented in “What Do You Do When the Horse You’re Riding Drops Dead? Why Model Driven Design is Emerging as a Preferred Best Practice”, March 2007
Slide 17. MDD Needs To Be Extended…
- By adopting MDD, developers increased their productivity to offer more complex and more competitive products in less time…
- Testing is still stuck at the code level of the ’80s!
- Development moved to Modeling… Should Testing move too?
[Chart: Design Productivity vs. Testing Productivity over time, from Functional Decomposition to MDD - we can Design much “faster” than we can Test!]
Slide 18. “Too Many Defects Being Introduced”: Improve the Development Process!
- Clear, traceable and testable Requirements
- Good Architectural Design
- Effective Communications
- Automatic Documentation
- Graphical Design Reviews
- Evolving/Iterative Prototypes
- Executable Designs
Slide 19. Design Level Debugging
- Push-Button Execution to quickly debug the model
  - The best way to minimize defects early!
  - Debugging at the design level!
Slide 20. “Defects Being Detected Too Late”: Improve the Testing Process!
- Test Early; Test Often!
- Make your design executable, because… You Can’t Test What You Can’t Execute!
- Automation; Automation; Automation!
Slide 21. When To Start Testing?
- After the functionality is developed but before shipping to the customer… [wrong!]
  - For example, Waterfall (requirements analysis, design, implementation, testing (validation), integration, and maintenance)
  - Defects enjoy a long life!
  - Undetected defects lead to further wrong decisions!
  - The testing phase gets squeezed to compensate for other project delays!
- When the project starts!
  - As in extreme programming and the agile software development process
  - It is a continuous process until the project finishes
Slide 22. Agenda
- Model Driven Development
- Model Driven Testing
- Summary
Slide 23. Model Driven Testing to the rescue!
- Wikipedia, the free encyclopedia, refers to model-driven testing as “software testing where test cases are derived in whole or in part from a model that describes some (if not all) aspects of the system under test (SUT)”
- UML is the language for specifying the models; the derived tests are described using the UML Testing Profile (UTP)
- UTP can be considered a modeling language for visualizing, specifying, analyzing, constructing, and documenting the artifacts of a test system
Slide 24. Model Driven Testing To The Rescue!
- UML 2 Testing Profile extends UML with test-specific concepts:
  - Test Behavior: the observations and activities during the test
  - Test Architecture: the elements and their relationships involved in a test
Slide 25. Test Architecture: SUT & Test Components
- SUT: The System Under Test (SUT) is a part: the system, subsystem, or component being tested. An SUT can consist of several objects. The SUT is exercised by the test components via its public interface operations and signals; no further information can be obtained from the SUT, as it is a black box
- Test Component: A test component is a class of a test system. Test component objects realize the behavior of a test case. A test component has a set of interfaces through which it communicates, via connections, with other test components or with the SUT
- Test Configuration: The collection of test component objects and of the connections between those objects and to the SUT
- Test Context: A collection of test cases together with a test configuration on the basis of which the test cases are executed (a minimal code sketch of these roles follows below)
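To make these roles concrete, here is a minimal hand-written C++ sketch, assuming a hypothetical Thermostat as the SUT: the SUT is driven only through its public operations, a stub sensor plays the test component, and a small test context owns the configuration that connects them. All names are illustrative; they are not UML Testing Profile classes nor Rhapsody-generated code (Rhapsody generates the equivalent structure automatically).

```cpp
// Illustrative only: hypothetical names, not Rhapsody-generated code.
#include <iostream>

// --- SUT side: exercised only via its public operations (black box) ---
class ITemperatureSensor {                 // interface the SUT depends on
public:
    virtual ~ITemperatureSensor() = default;
    virtual double readCelsius() = 0;
};

class Thermostat {                         // the System Under Test
public:
    explicit Thermostat(ITemperatureSensor& sensor) : sensor_(sensor) {}
    void setTarget(double celsius) { target_ = celsius; }
    bool heaterOn() { return sensor_.readCelsius() < target_; }  // public operation
private:
    ITemperatureSensor& sensor_;
    double target_ = 20.0;
};

// --- Test component: realizes part of the test behavior (here, a stub) ---
class FakeSensor : public ITemperatureSensor {
public:
    void inject(double celsius) { value_ = celsius; }   // stimulus from the test
    double readCelsius() override { return value_; }
private:
    double value_ = 0.0;
};

// --- Test context: owns the test configuration (components + connections + SUT) ---
class ThermostatTestContext {
public:
    ThermostatTestContext() : sut(sensor) {}            // connection: stub -> SUT
    FakeSensor sensor;                                   // test component instance
    Thermostat  sut;                                     // SUT instance
};

int main() {
    ThermostatTestContext ctx;                           // one test configuration
    ctx.sut.setTarget(21.0);
    ctx.sensor.inject(18.5);
    std::cout << "heaterOn=" << ctx.sut.heaterOn() << "\n";  // observe via public API
    return 0;
}
```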
Slide 26. Test Behavior in Detail
- A Test Case is a specification of one case to test the system, including what to test, with which input, which result, and under which conditions. It is a complete technical specification of how the SUT should be tested for a given test objective
  - It is defined in terms of sequences, alternatives, loops and defaults of stimuli to, and observations from, the SUT
  - It uses an arbiter to evaluate the outcome of its test behavior (see the verdict sketch below). It is an operation specifying how a set of cooperating test components interacting with a system under test realizes a test objective
  - It is a property of a Test Context. Both the SUT and the various test components are parts of the test context to which the test case belongs
  - It may invoke other test cases; it implements a test objective
- Test Objective: A test objective is a named element describing what should be tested. It is associated with a test case
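The arbiter can be pictured as a small verdict-merging object. The sketch below uses hypothetical classes; only the verdict names (pass, inconclusive, fail, error) come from the UML Testing Profile. It shows the usual rule that a verdict can only get worse, so one failed observation fails the whole test case even if later checks pass.

```cpp
// Illustrative arbiter sketch; the classes themselves are hypothetical.
#include <iostream>

enum class Verdict { Pass = 0, Inconclusive = 1, Fail = 2, Error = 3 };

class Arbiter {
public:
    // Merge rule: the overall verdict can only get worse, never better.
    void report(Verdict v) { if (v > verdict_) verdict_ = v; }
    Verdict verdict() const { return verdict_; }
private:
    Verdict verdict_ = Verdict::Pass;
};

// A test case drives the SUT and reports each observation to the arbiter.
void tc_heater_switches_on(Arbiter& arbiter) {
    // Test objective: "heater turns on when temperature drops below target".
    bool observed = true;                        // stand-in for a real observation
    arbiter.report(observed ? Verdict::Pass : Verdict::Fail);
    arbiter.report(Verdict::Pass);               // a later check cannot undo a Fail
}

int main() {
    Arbiter arbiter;
    tc_heater_switches_on(arbiter);
    std::cout << "verdict=" << static_cast<int>(arbiter.verdict()) << "\n";
    return 0;
}
```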
Slide 27. Rhapsody Testing Profile
- The Rhapsody UML Testing Profile is based on the official UML Testing Profile. It contains new terms and stereotypes that can be used to model testing artifacts in Rhapsody
- Some of the elements defined in the UML Testing Profile are presently not part of the Rhapsody Testing Profile. However, the profile includes supplementary elements that are not part of the UML Testing Profile
  - Stubbing, for example, is one of these additional elements, used for test activities not addressed by the UML Testing Profile
Slide 28. Design and Test Processes Fully Integrated
- Common Browser
- Requirements linked to test cases
- Easy navigation between Design and Test artifacts
- Design and Test - always in sync
- Automatically generated test execution reports
[Screenshot labels: Design Artifacts, Test Artifacts, Test Execution Reports]
Slide 29. Typical Testing Process
- Identify testable requirements and develop a Test Plan with traceability to those requirements (a toy traceability sketch follows this list), addressing the following aspects of testing:
  - Unit testing
  - Integration testing
  - Subsystem/System testing
  - Performance testing
  - Acceptance testing
  - Others as needed, such as usability, workflow or conformance testing
  - Building a growing suite of regression tests, and deciding when to run them
- We will now discuss the basic process of authoring and executing a test case
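As a rough illustration of the traceability the test plan asks for, the toy sketch below maps requirement IDs to the test cases that claim to cover them and flags gaps. The IDs and names are made up; in the actual tooling this mapping lives in the model and is reported automatically.

```cpp
// Toy requirement-to-test-case traceability (hypothetical IDs and names).
#include <iostream>
#include <map>
#include <string>
#include <vector>

int main() {
    // Which test cases cover which testable requirement.
    std::map<std::string, std::vector<std::string>> coverage = {
        {"REQ-001 heater control", {"tc_heater_switches_on", "tc_heater_switches_off"}},
        {"REQ-002 sensor timeout", {"tc_sensor_timeout"}},
        {"REQ-003 target range",   {}},            // not yet covered
    };

    for (const auto& [req, tests] : coverage) {
        std::cout << req << ": ";
        if (tests.empty()) {
            std::cout << "NO TEST CASES (gap in the test plan)";
        } else {
            for (const auto& t : tests) std::cout << t << " ";
        }
        std::cout << "\n";
    }
    return 0;
}
```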
Slide 30. Basic Testing Process: Create Test Architecture
- Create a Test Architecture
  - Fully Automated
  - Auto-Updated
  - Manual
- A Graphical Test Architecture
  - Is architecture only; no test cases yet
Slide 31. Basic Testing Process: Authoring Test Cases
- Create a Test Architecture
- Develop the Test Cases as Sequence Diagrams
  - Manually
    - Based on: textual requirements, previous experience, domain expertise
    - Defines both: SUT expected behaviour and stub behaviours
Slide 32. Basic Testing Process: Authoring Test Cases
- Create a Test Architecture
- Develop the Test Cases as Sequence Diagrams
  - Manually
  - Recorded from Simulations (animated sequence diagrams)
Slide 33. Basic Testing Process: Authoring Test Cases
- Create a Test Architecture
- Develop the Test Cases as Sequence Diagrams
  - Manually
  - Recorded from Simulations
  - Automatically generated with ATG
    - Coverage driven
    - Often finds test cases that no one ever considered before…
Slide 34. Basic Testing Process: Authoring Test Cases
- Create a Test Architecture
- Develop Test Cases as Sequence Diagrams
  - Manually
  - Recorded from Simulations
  - Automatically generated with ATG
- Develop Test Cases as Flowcharts
  - “Graphical programming”
  - Intuitive, powerful and easy to use
Slide 35. Basic Testing Process: Authoring Test Cases
- Create a Test Architecture
- Develop Test Cases as Sequence Diagrams
  - Manually
  - Recorded from Simulations
  - Automatically generated with ATG
- Develop Test Cases as Flowcharts
  - “Graphical programming”
  - Intuitive, powerful and easy to use
- Develop Test Cases as Code
  - Test case behaviour can be captured by coding it directly (see the sketch below)
  - Remember that the Test Architecture was already generated
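For the "Develop Test Cases as Code" option, only the test case body is hand-written; the surrounding test bench already exists. The tool-neutral C++ sketch below shows what such a hand-coded test case can look like. TestArchitecture, stimulate and expect are hypothetical stand-ins for whatever the generated architecture actually provides, not Rhapsody API.

```cpp
// Hand-coded test case sketch against a (hypothetical) generated test bench.
#include <iostream>
#include <string>

// Stand-in for the generated test architecture: a stimulus entry point,
// a simplistic SUT reaction, and an expectation helper.
struct TestArchitecture {
    int  sutOutput = 0;
    bool passed = true;

    void stimulate(int input) { sutOutput = input * 2; }   // toy SUT behaviour
    void expect(int expected, const std::string& what) {
        if (sutOutput != expected) {
            passed = false;
            std::cout << "FAIL: " << what << " expected " << expected
                      << " got " << sutOutput << "\n";
        }
    }
};

// The only part the tester writes by hand: the test case behaviour.
void tc_doubles_its_input(TestArchitecture& bench) {
    bench.stimulate(21);                                // stimulus to the SUT
    bench.expect(42, "output after stimulate(21)");     // observation + check
}

int main() {
    TestArchitecture bench;
    tc_doubles_its_input(bench);
    std::cout << (bench.passed ? "tc_doubles_its_input: PASS"
                               : "tc_doubles_its_input: FAIL") << "\n";
    return 0;
}
```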
Slide 36. Basic Testing Process: Execution & Reporting
- Create a Test Architecture
- Develop Test Cases as Sequence Diagrams
- Develop Test Cases as Flowcharts
- Develop Test Cases as Code
- Execute/Report on Test Execution
  - Inputs to the SUT and stub behaviours are played out automatically
  - Unexpected behaviours are highlighted (a minimal runner/report sketch follows below)
  - The SUT can execute on Host and/or Target
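As a rough picture of the execution and reporting step, the sketch below runs a list of registered test cases and prints a report in which unexpected behaviour stands out. It is a generic illustration with placeholder test bodies, not the Rhapsody test execution engine or its report format.

```cpp
// Generic test runner/report sketch (not Rhapsody's execution engine).
#include <functional>
#include <iostream>
#include <string>
#include <vector>

struct TestCase {
    std::string name;
    std::function<bool()> run;   // returns true if the observed behaviour matched
};

int main() {
    std::vector<TestCase> suite = {
        {"tc_heater_switches_on",  [] { return true;  }},   // placeholder body
        {"tc_heater_switches_off", [] { return false; }},   // simulated unexpected behaviour
    };

    int failures = 0;
    for (const auto& tc : suite) {
        bool ok = tc.run();
        // Unexpected behaviour is highlighted in the report.
        std::cout << (ok ? "  PASS  " : ">> FAIL  ") << tc.name << "\n";
        failures += ok ? 0 : 1;
    }
    std::cout << failures << " of " << suite.size() << " test cases failed\n";
    return failures == 0 ? 0 : 1;
}
```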
Slides 37-40. Basic Testing Process: Code vs. Model
[Diagram, built up over four slides, comparing the two workflows]
Code Level Testing:
- Manually define a TestArchitecture (aka TestBench) for each class/SUT
- Develop TestCases manually by writing code
- Execute Test Cases on Host (code level)
- Execute Test Cases on Target (code level)
Model Driven Testing:
- Automatically generate a TestArchitecture (aka TestBench) for each class/SUT
- Develop TestCases interactively, as Sequence Diagrams, Activity Diagrams/Flow Charts or Code, from requirements or from animated sequence diagrams recorded during simulation
- Let ATG automatically generate TestCases; hand-written and auto-generated TestCases build up a regression suite
- Execute Test Cases on Host (code/model level)
- Execute Test Cases on Target (code/model level)
Slide 41. Basic Testing Process: Code vs. Model

Aspect | Code-Driven Testing | Model-Driven Testing
Test Case authoring | Scripts, code | Code, Flowcharts, Sequence Diagrams
Porting test cases to a new platform/OS | Review and rewrite all appropriate code | Change configuration parameters
Requirement Based Testing | Code; hard to do; done late | Models; easy to do; done very early
Competitiveness | Very Negative | Very Positive
Communicating defects | Code and text | Model (defect sequence diagrams)
Traceability to Requirements | Requires external tools | Part of the Model
Automatic Test Case Generation | No | Yes, as Sequence Diagrams
Configuration managed artifacts | Source code | Models (and optionally source code)
Typical Requirements Coverage | Very low; measured late | Very high, very early
Slide 42. Rhapsody Testing Solution
- Model Driven Development and Model Driven Testing fully integrated
  - Test cases are accessible from the Rhapsody browser, making them part of the model: they can be configuration-managed, included in reports, etc.
  - Traceability between requirements and test cases
  - Testing can be done early, and as often as a design can be executed
- Supports unit testing, white-box and black-box testing
  - Following the UML Testing Profile
  - Automatically generate graphical Test Architectures
  - Automatically generate test cases to increase test coverage
  - Test case behavior captured graphically, as Sequence Diagrams and Flowcharts, and textually, as plain code
- Higher Requirements/Model/Code coverage
- Higher quality at lower cost and in less time
Slide 43. Agenda
- Model Driven Development
- Model Driven Testing
- Summary
Slide 44. Major Reasons to Adopt Model Driven Testing
- Being able to…
  - Integrate Testing into the Design process, allowing for frequent and early testing
  - Automatically create/update Test Architectures
    - Instead of writing test benches
  - Automatically create test cases as sequence diagrams
  - Use Sequence Diagrams as graphical test scripts
    - Much more powerful than writing textual scripts
    - A very easy and intuitive way to associate behaviors with test components (stubs)
  - Use animated sequence diagrams as test scripts
    - Simulating a scenario is in effect a recording of a test case (as an SD)
  - Perform feature-driven testing, while driving structural testing
- Results in more fun… and
  - Higher Requirements/Model/Code coverage
  - Higher quality at lower cost and in less time
Slide 45. Adopting MDD and MDT: Not All-or-Nothing
- Model Driven Development is a proven and powerful solution for developers
- Adopting Model Driven Development is NOT all-or-nothing!
  - Code-based development can co-exist with model-driven development
  - Code visualization and Reverse Engineering allow for adoption of MDD at any time
- Model Driven Testing is a must to remain in business
- Adopting Model Driven Testing is NOT all-or-nothing!
  - Test cases can be described at many levels of abstraction, from code to Sequence Diagrams
  - Reuse of legacy test cases allows for adoption of MDT at any time
Slide 46.
© Copyright IBM Corporation 2008. All rights reserved. The information contained in these materials is provided for informational purposes only, and is provided AS IS without warranty of any kind, express or implied. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, these materials. Nothing contained in these materials is intended to, nor shall have the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering the terms and conditions of the applicable license agreement governing the use of IBM software. References in these materials to IBM products, programs, or services do not imply that they will be available in all countries in which IBM operates. Product release dates and/or capabilities referenced in these materials may change at any time at IBM’s sole discretion based on market opportunities or other factors, and are not intended to be a commitment to future product or feature availability in any way. IBM, the IBM logo, Rational, the Rational logo, Telelogic, the Telelogic logo, and other IBM products and services are trademarks of the International Business Machines Corporation, in the United States, other countries or both. Other company, product, or service names may be trademarks or service marks of others.
Francisco J. López Minaya, Rational Technical Sales Specialist, [email_address]