Evaluating code-based test-generator tools

In recent years several tools have been developed to automatically select relevant test inputs from the source code of the system under test. However, each of these tools has different advantages, and there is little detailed feedback available on the actual capabilities of the various tools. We developed a framework to compare and evaluate such tools.


  1. Budapest, 26-28 October 2016
     EVALUATING CODE-BASED TEST INPUT GENERATOR TOOLS
     Presented by Zoltan Micskei
     © All rights reserved
  2. Code-based test generation
     int fun1(int a, int b){
         if (a == 0){
             printf(ERROR_MSG);      /* statement 1 */
             return -1;              /* statement 2 */
         }
         if (b > a)
             return b*a + 5;         /* statement 3 */
         else
             return (a+b) / 2;       /* statement 4 */
     }
     Sample inputs and the statements they cover:
     #   a      b        stm
     1   0      *        1, 2
     2   a!=0   b > a    3
     3   a!=0   b <= a   4
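     For illustration only, a minimal sketch of the kind of unit test such a tool might generate from these inputs (hypothetical; each tool uses its own output format and target language):

         /* Hypothetical generated tests for fun1; links against the
            definition shown above. */
         #include <assert.h>

         int fun1(int a, int b);

         int main(void) {
             assert(fun1(0, 7) == -1);           /* a == 0  -> statements 1, 2 */
             assert(fun1(1, 2) == 2*1 + 5);      /* b > a   -> statement 3 */
             assert(fun1(4, 2) == (4 + 2) / 2);  /* b <= a  -> statement 4 */
             return 0;
         }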
  3. Code-based test generator tools
     Detailed list of tools: http://mit.bme.hu/~micskeiz/pages/cbtg.html
  4. Motivation
     How can the different test input generator tools be compared and evaluated?
  5. APPROACH
  6. (figure only; no transcript text)
  7. Core features
     • Basic: Types, Operators; Conditionals, Loops; Arrays, Functions
     • Structures: Structure usage; Nested structures; …
     • Objects: Object usage; Inheritance, interfaces; …
     • Generics: Generic functions; Generic objects; …
     • Library: Arithmetic, Strings; Collections; …
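     As a hypothetical illustration (not taken from the actual snippet set), a "Basic" core snippet could combine a conditional, a loop and an array, so a generator must reason about all three to reach every statement:

         /* Hypothetical core snippet: full coverage needs an array with a
            positive element, one with a non-positive element, and the
            loop body taken at least once. */
         int count_positive(const int arr[], int len) {
             int count = 0;
             for (int i = 0; i < len; i++) {
                 if (arr[i] > 0)
                     count++;
             }
             return count;
         }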
  8. Extra features
     • Environment: Stdin, Properties; Files, Sockets
     • Multi-threading: Threads, Locks; Indeterminacy
     • Reflection: Classes, Methods
     • Native code: Native functions
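     Again as a hypothetical illustration, an "Environment" extra snippet could branch on file contents, so a generator has to model, stub or sandbox the file system before it can control both outcomes:

         /* Hypothetical extra snippet: the result depends on the file
            system, not only on the explicit parameter. */
         #include <stdio.h>

         int first_byte_is_zero(const char *path) {
             FILE *f = fopen(path, "rb");
             if (f == NULL)
                 return -1;      /* missing or unreadable file */
             int c = fgetc(f);
             fclose(f);
             return c == 0;
         }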
  9. Snippets
     300 core snippets, 63 extra snippets, manual sample inputs
  10. Execution (SETTE framework)
  11. RESULTS
  12. Experiments
      • Tools: CATG, EvoSuite, IntelliTest, jPET, SPF, Randoop
      • Status: N/A, EX, T/M, NC, C
      • Measures: Coverage, Size, Duration, Mutation score
      • Setup: 30 s limit, Repeat 10x, Variable time
  13. Overview of capabilities (Status)
  14. Overview of capabilities
  15. Summary of insights
      • Duration: 20 min – 200 min
        • Randoop and EvoSuite use all available time
      • Size: 700 (manual) < 1000 (IntelliTest) << 270 000 (Randoop)
      • Extra snippets: EvoSuite has a custom sandbox; otherwise a significant challenge
  16. Summary
      All code and results are available: http://sette-testing.github.io
