Continuous Change-Driven Build Verification (ChDT™)
Marc Hornbeek
Manager, SCM Systems
Spirent Communications

June 2011
INTRODUCTION

Manager, SCM Systems, Spirent Communications
 •  Network Test and Measurement Systems
 •  Responsible for SCM Systems, including automated build verification (BV)
 •  Practical “How-to” guides for Engineering management professionals

Perforce Environment
 •  1.6 TB, 36 MLOC, 9 apps, 15 platforms, 18 releases/year, 100 builds/day
 •  3,000 tests/build; Build Test Cycle reduced from 23 hours to 4 hours
ChDT™ GAME-CHANGER FOR LARGE SCALE AGILE

 •  Agile Manifesto:
     –  Early and continuous delivery of valuable software
     –  Deliver working software frequently
     –  Working software is the primary measure of progress
 •  Functional testing is the standard of proof.
 •  Testing for large-scale systems is a bottleneck.
 •  By dynamically testing code changes, ChDT™ resolves this bottleneck.
ChDT™ Methodology
TYPICAL CODE / TEST FLOW


[Figure: typical code/test flow. Code changes flow from SCM change control
into a software build; a pre-defined test suite drawn from the test case
database runs on the automated test harness and produces a test report
(e.g. 95% pass / 4% fail / 1% inconclusive).]

Automation allows the tests to be run automatically …right?… yes, but:
 §  time to run all the tests?
 §  equipment to run enough tests in parallel?
 §  time and effort to analyze all the results?
ChDT™ CODE / TEST FLOW


[Figure: ChDT™ code/test flow. The Change-Driven Test (ChDT™) module sits
between SCM change control and the automated test harness; a generated
test suite replaces the pre-defined one, and a targeted test report is
produced.]

The Change-Driven Test (ChDT™) Methodology introduces a CODTEF™ Matrix
into the code/test flow. The matrix is used by the ChDT™ system to
automatically drive both test selection and test result report generation
according to continuously adjusting code change-to-test-to-results
correlation factors in the CODTEF™ Matrix.

ChDT and CODTEF are trademarks of Managecycle. (c) 2011 All rights reserved.
TEST SELECTION METHODS COMPARED

 “Empirical Study of Regression Test Selection Techniques” (ACM Transactions):
 1.  Retest all – test everything regardless of changes
 2.  Hunches – human intuition
 3.  Randomly select tests
 4.  Minimalist – minimal code coverage
 5.  Dataflow technique – data interactions
 6.  Safe techniques – cover each code segment deleted and added

             Efficient?       Yes               No
             Risk?            Low               High
             Complexity?      Low               High
             Platform?        Independent       Dependent

             Change-driven: select tests based on specific code-change
                 information correlated to the prior failure history of the
                 tests correlated to the changed modules
DYNAMIC ADJUSTMENT


 •  Code-to-test CODTEF™ correlation factors adjust automatically:
    –  Past test results automatically affect future test selections
    –  Automatic adjustment of test timings
    –  Automatic adjustment of test reports
    –  Automatic adjustment of code and test performance data

 [Figure: feedback loop — the output of each build/test process feeds an
 automatic adjustment step whose result becomes input to the next cycle.]
IMPLEMENTATION FRAMEWORK
PRE-REQUISITES FOR ChDT IMPLEMENTATION

                            Technology:
                            •  Source code management system
                            •  Automated test framework

                            Process:
                            •  Standalone automated test cases**

                            People:
                            •  Tools developers (SCM framework tools, DB design)

                 ** Practical reference **
         TSCRIPT™: A Practical System for Realizing
             Exemplary Automated Test Scripts
ChDT DATA FRAMEWORK


  ChDT is a system of many variables:

  •    Code variables
  •    Test variables
  •    Results variables
  •    System variables
  •    Computed variables

  Framing the system around the product itself provides a logical framework.
  There are choices:
  1)   A complete, processable definition of the product
  2)   Attribute tags that define characteristics important to test
  3)   The tests themselves define the product behavior
  4)   Groups of modules and tests define major blocks of product behavior
  5)   The code itself describes the product behavior
DEVELOPMENT PHASES

 Phase 1 – Code/Test Group
        Break up the test suite into groups. Identify code group – test
        group relationships.
        Tools: automatically identify group code changes, launch specific
        test groups, report results according to group owners.

 Phase 2 – Refined Sub-Groups
        Identify code subgroups and initial test correlations. Identify
        and design specific tools changes.
        Tools: implement the remaining tools listed on the prior chart,
        except the advanced metrics.

 Phase 3 – Advanced Metrics
        Analysis and design: determine which advanced metrics will be
        used by the organization.
        Tools: create metric tools using the data collected during the
        Change-Driven loops.
TOOLS
ChDT™ ARCHITECTURE

[Figure: ChDT™ architecture. User input feeds a command processor; the SCM
system feeds a code change analyzer that records per-module change rates
(e.g. Code Module 01: 0%, Module 02: 2%, Module 03: 3%, Module 04: 0.1%,
Module 05: 0%); the test case selector and test suite generator consult the
CODTEF™ Matrix in the test case DB; the test harness runs the generated
suite; the test result analyzer adjusts the matrix and the report generator
emits targeted test reports.]

Code Change Analyzer:
 •  Code Change DB
 •  Code deltas

Test Selector:
 •  Test Case Results DB
 •  CODTEF™ Matrix
 •  Test Case Selector

System Infrastructure:
 •  User inputs
 •  Resource DB
 •  Adjust CODTEF™ values

Reports and Metrics:
 •  Report generators
 •  Metrics calculators

        ** Practical reference **
   TCHANGE™: A Practical System For
    Implementing Change-Driven Test
              Methodology
CODTEF™ MATRIX

  CODTEF™: a Code/Test correlation Factor for each code/test combination

  [Figure: the CODTEF™ Matrix — rows are code modules, columns are test
  cases, and cells hold correlation factors such as 0.0, 0.2, 0.5, 0.7,
  0.8, 1.0.]

  CODTEF™ Value            Meaning
  Blank                    unknown / not set
  0.0                      no correlation
  0.5                      some correlation
  1.0                      direct correlation

  INITIALIZING CODTEF™

  1)  Manual: coders/testers fill in the CODTEF values in the matrix
        –  Groups of code/tests
        –  Individual tests
  2)  Automatic: set all CODTEF factors to the average value of all
      CODTEF factors (or 0.5 initially, until an average has been
      established)
  3)  Semi-automatic: set some CODTEF factors using method 1, then let
      the system complete the rest using method 2
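The initialization options above can be sketched in a few lines. This is an illustrative sketch only, not the talk's tooling; the function and variable names (`init_codtef`, `manual`, the module/test labels) are assumptions.

```python
# Sketch of semi-automatic CODTEF(TM) matrix initialization: factors set
# manually (method 1) are kept, and the rest default to the average of the
# known values, or 0.5 until an average exists (method 2).

def init_codtef(modules, tests, manual=None):
    """Return a {(module, test): factor} CODTEF matrix."""
    manual = manual or {}
    known = list(manual.values())
    default = sum(known) / len(known) if known else 0.5
    return {(m, t): manual.get((m, t), default)
            for m in modules for t in tests}

# Hypothetical example: two modules, two tests, two manually set factors.
matrix = init_codtef(
    ["mod_a", "mod_b"], ["test_01", "test_02"],
    manual={("mod_a", "test_01"): 1.0, ("mod_b", "test_02"): 0.0},
)
```

With no manual entries at all, every cell starts at 0.5 ("some correlation"), matching the slide's bootstrap rule.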
CODTEF™ ALGORITHM

Adjust CODTEF after a Pass/Fail/Inconclusive verdict is established for
each test:

      1. INCONCLUSIVE: do not adjust CODTEF.

      2. FAIL (CODTEF moves up toward 1.0):
           New CODTEF = Prior CODTEF + δF(1 − Prior CODTEF), maximum = 1
         where δF = average test failure rate (e.g. 0.03).

      3. PASS (CODTEF moves down toward 0.0):
           New CODTEF = Prior CODTEF − δP(1 − Prior CODTEF), minimum = 0
         where δP is the change rate of CODTEF for PASS that matches the
         change rate for FAIL if the system performs at the average level
         over time: δP = δF(δF / (1 − δF))
         (e.g. 0.03 × (0.03 / (1 − 0.03)) ≈ 0.001).
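The update rule above is compact enough to state directly in code. A minimal sketch, transcribing the slide's formulas as written (the function name and verdict strings are assumptions):

```python
# CODTEF(TM) adjustment per the slide: FAIL pulls the factor toward 1.0,
# PASS pulls it toward 0.0 at the much slower rate delta_P, and an
# inconclusive verdict leaves it unchanged. delta_F is the average test
# failure rate.

def adjust_codtef(codtef, verdict, delta_f=0.03):
    # delta_P chosen so PASS and FAIL drift cancel at the average failure
    # rate: delta_P = delta_F * (delta_F / (1 - delta_F)) ~= 0.001
    delta_p = delta_f * (delta_f / (1 - delta_f))
    if verdict == "FAIL":
        return min(1.0, codtef + delta_f * (1 - codtef))
    if verdict == "PASS":
        return max(0.0, codtef - delta_p * (1 - codtef))
    return codtef  # inconclusive: no adjustment
```

Note the asymmetry: one failure raises a 0.5 factor by 0.015, while a pass lowers it by only about 0.0005, so correlations build quickly on failure and decay slowly on success.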
SYSTEM DATA




              R  : Resources available for test
              T  : Time available for test
              Q  : Quality level = minimum CODTEF™ value
              M  : Must-do tests
              TE : Test Exclusion guard band
                     •  % of tests allowed to be excluded from each
                        level if T does not allow selection of all of them
TEST SELECTION ALGORITHM

Test Selection = TS1 + TS2 + TS3 + TS4, where:

TS1 = Test Selection Level 1 = Must-do tests
    •  Regardless of CODTEF™

TS2 = Test Selection Level 2 = Recently failed tests
    •  Regardless of CODTEF™

TS3 = Test Selection Level 3 = Best CODTEF™ tests
    •  CODTEF™ > Q

TS4 = Test Selection Level 4 = Additional tests, if time allows
    •  CODTEF™ ≤ Q
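The four selection levels can be sketched as a single pass over the test pool. This is an illustrative sketch under assumptions the slides leave open (how durations are tracked against the time budget T, and how ties are broken); all names are hypothetical:

```python
# Four-level change-driven test selection: must-do and recently-failed
# tests are taken regardless of CODTEF(TM); then tests whose factor
# exceeds the quality threshold Q, best first; then lower-factor tests
# while the time budget T still allows.

def select_tests(tests, codtef, must_do, recently_failed, q, time_budget):
    """tests: {name: duration}; codtef: {name: factor}. Returns a list."""
    selected, used = [], 0.0

    def take(name):
        nonlocal used
        if name not in selected and used + tests[name] <= time_budget:
            selected.append(name)
            used += tests[name]

    for name in tests:                        # Level 1: must-do
        if name in must_do:
            take(name)
    for name in tests:                        # Level 2: recently failed
        if name in recently_failed:
            take(name)
    by_factor = sorted(tests, key=lambda n: -codtef.get(n, 0.0))
    for name in by_factor:                    # Level 3: CODTEF > Q
        if codtef.get(name, 0.0) > q:
            take(name)
    for name in by_factor:                    # Level 4: if time allows
        if codtef.get(name, 0.0) <= q:
            take(name)
    return selected
```

A fuller version would also honor the R (resources) and TE (exclusion guard band) parameters from the previous slide; they are omitted here to keep the level structure visible.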
RESULTS AND METRICS
METRICS DERIVED FROM CODTEF™ VALUES

 Testers
     –  High-yield tests to be re-factored (high average CODTEF™)
     –  Low-yield tests to be pruned (low average CODTEF™)

 Code Owners
     –  Report test results only where CODTEF™ is Q or higher
         •  Eliminates redundant reports of test results to irrelevant owners
     –  Code reliability: re-factor error-prone code modules (modules with
        high average CODTEF™ values)

 QA Managers
     –  Increasing average CODTEF™ values indicate quality erosion
RETURN ON INVESTMENT (ROI)


                   250   developers
             3,150,000   LOC
                 4,550   test cases
                    36   hours to run all tests
                     4   hours average test cycle with ChDT
                 3,800   bugs fixed per year

 BENEFIT
                     4   hours to fix a bug found during the ChDT cycle
                    20   hours to fix a bug found after the ChDT cycle
                   30%   fewer bugs escape integration after ChDT is implemented
              $130,000   loaded labor rate $/year =
                $62.50   $/hour
           $14,250,000   cost of bugs before ChDT (3 years)
           $10,830,000   cost of bugs after ChDT (3 years)
            $3,420,000   cost savings of ChDT from the lower cost of bugs (3 years)

 COST
              $120,000   ChDT-specific tools labor (one time)
               $21,000   reconfigure existing tests (one time)
                $7,500   ChDT server (one-time capital expense)
               $21,000   annual maintenance
              $211,500   3-year cost

 ROI
               16 to 1   return
            $3,208,500   net cash equivalent returned




                                          ** Practical reference **
                              TCASE™: A Practical System for Creating $uccessful
                                       Test Automation Business Cases
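The slide's dollar figures follow directly from its stated assumptions, and the arithmetic can be reproduced in a few lines (variable names are mine; the numbers are the slide's):

```python
# Reproducing the ROI arithmetic: 3,800 bugs/year at $62.50/hour loaded
# labor over 3 years; every bug costs 20 hours when found after the ChDT
# cycle, 4 hours when found during it; with ChDT, 30% of bugs are caught
# in-cycle.

bugs, rate, years = 3_800, 62.50, 3
before = bugs * 20 * rate * years                  # all bugs fixed post-cycle
after = bugs * (0.30 * 4 + 0.70 * 20) * rate * years
savings = before - after
cost = 120_000 + 21_000 + 7_500 + 21_000 * years   # tools, rework, server, maint.
print(before, after, savings, cost, savings - cost, savings / cost)
```

This recovers the slide's $14,250,000 before, $10,830,000 after, $3,420,000 savings against a $211,500 three-year cost, a net of $3,208,500 and roughly a 16-to-1 return.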



EXAMPLE SYSTEM
SPIRENT BV SYSTEM



                            EXAMPLE SSCC RESULT FILE

                           <?xml version="1.0" ?>
                           <Report branch="mainline" startbuild="3.50.4330" endbuild="3.50.4810">
                             <FileGroup name="IPS" owner="Marc Hornbeek">
                               <FilePath name="PortConfigs" changes="0" />
                               <FilePath name="Rfc2544Back2Back" changes="0" />
                               <FilePath name="Rfc2544Common" changes="0" />
                               <FilePath name="Rfc2544FrameLoss" changes="0" />
                               <FilePath name="Rfc2544Latency" changes="0" />
                               <FilePath name="Rfc2544Throughput" changes="0" />
                             </FileGroup>
                             <FileGroup name="Routing" owner="Owner Name">
                               <FilePath name="bfd" changes="0" />
                               <FilePath name="bfd" changes="8" />
                               <FilePath name="eoam" changes="0" />
                             </FileGroup>
                           </Report>

      SSCC = Smartest Source Code Counting tool
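A consumer of such a result file only needs the groups that actually changed, so the right owners' test groups can be launched. A minimal sketch, not Spirent's actual tooling, using a shortened copy of the report above:

```python
# Parse an SSCC-style result file and return the file groups that contain
# at least one changed path, keyed to their owners.
import xml.etree.ElementTree as ET

SSCC = """<?xml version="1.0" ?>
<Report branch="mainline" startbuild="3.50.4330" endbuild="3.50.4810">
  <FileGroup name="IPS" owner="Marc Hornbeek">
    <FilePath name="PortConfigs" changes="0" />
  </FileGroup>
  <FileGroup name="Routing" owner="Owner Name">
    <FilePath name="bfd" changes="8" />
    <FilePath name="eoam" changes="0" />
  </FileGroup>
</Report>"""

def changed_groups(xml_text):
    """Return {group name: owner} for groups with any changed file path."""
    root = ET.fromstring(xml_text)
    return {g.get("name"): g.get("owner")
            for g in root.iter("FileGroup")
            if any(int(p.get("changes", "0")) > 0 for p in g.iter("FilePath"))}

print(changed_groups(SSCC))  # only the "Routing" group has changes
```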
SUMMARY

   The Continuous Change-Driven Build Verification (ChDT™)
     methodology, using the CODTEF™ matrix, is a practical,
     platform-independent, efficient, and scalable solution to the
     build/verification loop dilemma.




RELATED READING

  •  TCHANGE™: A Practical System For Implementing Change-Driven Test
     Methodology; Marc Hornbeek, 2010
  •  A Systematic Review on Regression Test Selection Techniques;
     Emelie Engström, Per Runeson, Mats Skoglund; Department of Computer
     Science, Lund University, SE-221 00 Lund, Sweden
  •  An Optimized Change-Driven Regression Testing Selection Strategy for
     Binary Java Applications; Sheng Huang, Yang Chen, Jun Zhu, Zhong Jie
     Li, Hua Fang Tan; IBM China Research Laboratories, and Department of
     Computer Science, Tsinghua University, Beijing
  •  An Empirical Study of Regression Test Selection Techniques;
     Todd L. Graves, Mary Jean Harrold, Jung-Min Kim, Adam Porter, Gregg
     Rothermel; Ohio State University, Oregon State University, and
     University of Maryland
  •  Market Survey of Device Software Testing Trends and Quality Concerns
     in the Embedded Industry; Wind River, June 2010



THANK-YOU !
           Thank-you for attending this session.

                    Marc Hornbeek

           •  Please fill out an evaluation form.

Follow-up comments, questions and suggestions?
         Marc.Hornbeek@Spirent.com

                     The End
