Anton Semenchenko
Comparative Analysis of Desktop AUT Automation Tools
Agenda, part 1 (general)
1. Problem
2. Solutions 2016
Agenda, part 2 (tools and criteria)
1. Tools to be compared (15)
2. How & why we selected this list of tools?
3. Comparison criteria types (3)
4. Stakeholders oriented comparison criteria (7)
5. Mixed comparison criteria (7)
6. Tech stuff oriented comparison criteria (8)
7. How & why we selected these lists of criteria?
8. How to select proper criteria for your project
Agenda, part 3 (comparison analyses)
1. Mixed comparison criteria
2. Tech stuff oriented comparison criteria
3. Stakeholders oriented comparison criteria
4. Define our “standard” context
5. Summarized scores
6. How to calculate scores
7. How to use scores / presentation
8. 4 summarized tables
Agenda, part 4 (tools, “how to” and examples)
1. How to define a proper tool based on selected criteria
2. How to link information from presentation to QA Automation metrics
3. How to link information from presentation to Project Health Check
4. How to link information from presentation to QA Automation ROI
5. Tools tiny overview
6. Tools overview structure
7. Example of tool usage structure
Agenda, part 5 (trends, science and “what’s next”)
1. Define a Trend! Is it possible ..?
2. Trend – an option
3. Why so?
4. What’s next
Problem
• There is a de facto leader for Web automation
Problem
• It’s not that simple when it comes to desktop apps
Tools to be compared
• TestComplete Desktop
• Unified Functional Testing (UFT)
• Ranorex
• Telerik Test Studio
• Zeenyx AscentialTest
• MS VS Coded UI
• CUIT
• AUTOIT
• Sikuli
• Jubula
• Robot Framework
• Winium
• WinAppDriver
• QTWebDriver
• PyWinAuto
How & why we selected this list of tools?
Comparison criteria types
1. Stakeholders oriented
2. Tech stuff oriented
3. Mixed
Stakeholders oriented comparison criteria
1. Approximate complexity of auto-test development
2. Approximate complexity of auto-test support
3. Approximate “entrance” level
4. Required technical skills level
5. Tests readability
6. How fast tests run
7. Ability to re-use "Business-Logic" layer in other technical context
Mixed comparison criteria
1. Supported platforms
2. Supported technologies
3. Licensing
4. Maturity
5. Record-Play system support
6. Standard actions pack
Tech stuff oriented comparison criteria
1. Programming languages support
2. Availability of tools for mapping
3. Custom (self-made) architecture support
4. Data-Driven testing support
5. Test-Driven development support
6. Keyword-driven support
7. Behavior Driven Development support
8. Continuous integration system support
How & why we selected these lists of criteria?
How to select proper criteria for your project
Mixed comparison criteria
Supported platforms – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Supported platforms
Tool Platforms Mark
TestComplete Desktop Windows
Unified Functional Testing Windows
Ranorex Windows
Telerik Test Studio Windows
Zeenyx AscentialTest Windows
MS VS Coded UI ; CUIT Windows
AUTOIT Windows
Sikuli Windows, Unix-like Good
Jubula Windows, Unix-like Good
Robot Framework Windows, Unix-like Good
Winium / WinAppDriver ; QTWebDriver Windows / Windows ; Cross-Platform / ; Good
PyWinAuto Windows
Supported technologies – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10.Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Supported technologies
Tool Technologies Mark
TestComplete Desktop C/C++, WinForms, WPF, Java, Qt
Unified Functional Testing WinForms, WPF, Java, SAP
Ranorex WinForms, WPF, Java, Qt, SAP
Telerik Test Studio WPF Bad
Zeenyx AscentialTest Win Forms, WPF, Java Bad
MS VS Coded UI ; CUIT Win Forms (partial), WPF Bad
AUTOIT OS level Good
Sikuli Image recognition based Good
Jubula WinForms, WPF, Java Bad
Robot Framework Uses AutoIT (and co inside) Good
Winium / WinAppDriver ; QTWebDriver WinForms, WPF / Any ; QT Bad
PyWinAuto Win32 API, WinForms (partial, Win32 API based) Bad
Licensing – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Licensing
Tool License Mark
TestComplete Desktop Paid Bad
Unified Functional Testing Paid Bad
Ranorex Paid Bad
Telerik Test Studio Paid Bad
Zeenyx AscentialTest Paid Bad
MS VS Coded UI ; CUIT Paid Bad
AUTOIT Free
Sikuli Open source Good
Jubula Open source Good
Robot Framework Open source Good
Winium / WinAppDriver ; QTWebDriver Open source Good
PyWinAuto Open source Good
Maturity – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10.Jubula
11.Robot Framework
12.Winium
13.WinAppDriver
14.QTWebDriver
15.PyWinAuto
Maturity
Tool Maturity
TestComplete Desktop Good
Unified Functional Testing Good
Ranorex Good
Telerik Test Studio Good
Zeenyx AscentialTest
MS VS Coded UI ; CUIT Good
AUTOIT
Sikuli
Jubula
Robot Framework
Winium / WinAppDriver ; QTWebDriver Bad
PyWinAuto
Record-Play support – do we really need it?
Record-Play support
Tool Record-Play Mark
TestComplete Desktop Yes Good
Unified Functional Testing Yes Good
Ranorex Yes Good
Telerik Test Studio Yes Good
Zeenyx AscentialTest No
MS VS Coded UI ; CUIT No
AUTOIT No
Sikuli No
Jubula No
Robot Framework No
Winium / WinAppDriver ; QTWebDriver No
PyWinAuto No
Standard actions pack – do we really need it?
Standard actions pack
Tool STD actions Mark
TestComplete Desktop No
Unified Functional Testing No
Ranorex No
Telerik Test Studio No
Zeenyx AscentialTest Yes Good
MS VS Coded UI ; CUIT No
AUTOIT No
Sikuli Yes Good
Jubula Yes Good
Robot Framework No
Winium / WinAppDriver ; QTWebDriver No
PyWinAuto Yes / No (via SWAPY)
Tech stuff oriented comparison criteria
Programming languages – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Programming languages support
Tool Language Mark
TestComplete Desktop Python, C#Script, JScript, C++Script, VBScript, DelphiScript Good
Unified Functional Testing VBScript Bad
Ranorex C#, VB.Net
Telerik Test Studio C#, VB.Net
Zeenyx AscentialTest Own DSL Bad
MS VS Coded UI ; CUIT C#, VB.Net
AUTOIT Own Basic-like language Bad
Sikuli Jython, Java
Jubula -
Robot Framework Own DSL, Java, Python
Winium / WinAppDriver ; QTWebDriver Java, JavaScript, PHP, Python, Ruby, C# Good
PyWinAuto CPython
Tools for mapping – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Tools for mapping
Tool Tools for mapping Mark
TestComplete Desktop Yes Good
Unified Functional Testing Yes Good
Ranorex Yes Good
Telerik Test Studio Yes Good
Zeenyx AscentialTest Yes / No Good
MS VS Coded UI ; CUIT No
AUTOIT No
Sikuli Yes / No
Jubula Yes Good
Robot Framework No
Winium / WinAppDriver ; QTWebDriver No
PyWinAuto No
Custom architecture – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Custom architecture
Tool Custom architecture Mark
TestComplete Desktop Yes / No
Unified Functional Testing Yes / No
Ranorex Yes / No
Telerik Test Studio Yes / No
Zeenyx AscentialTest No Bad
MS VS Coded UI ; CUIT Yes Good
AUTOIT No Bad
Sikuli Yes Good
Jubula No / Yes
Robot Framework Yes Good
Winium / WinAppDriver ; QTWebDriver Yes Good
PyWinAuto Yes Good
DDT support – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
DDT support
Tool DDT support Mark
TestComplete Desktop Yes Good
Unified Functional Testing Yes Good
Ranorex Yes Good
Telerik Test Studio Yes Good
Zeenyx AscentialTest Yes Good
MS VS Coded UI ; CUIT Yes Good
AUTOIT No Bad
Sikuli Yes / No Good
Jubula Yes Good
Robot Framework Yes Good
Winium / WinAppDriver ; QTWebDriver Yes Good
PyWinAuto Yes Good
TDD support – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10.Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
TDD support
Tool TDD Mark
TestComplete Desktop Yes / No Good
Unified Functional Testing Yes / No Good
Ranorex Yes / No Good
Telerik Test Studio Yes / No Good
Zeenyx AscentialTest No Bad
MS VS Coded UI ; CUIT Yes Good
AUTOIT No Bad
Sikuli Yes / No Good
Jubula Yes Good
Robot Framework Yes Good
Winium / WinAppDriver ; QTWebDriver Yes Good
PyWinAuto Yes Good
Key-word driven – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Key-word driven support
Tool Key-word Mark
TestComplete Desktop No Bad
Unified Functional Testing No Bad
Ranorex Yes / No
Telerik Test Studio Yes / No
Zeenyx AscentialTest Yes Good
MS VS Coded UI ; CUIT Yes / No
AUTOIT No Bad
Sikuli Yes / No
Jubula No Bad
Robot Framework Yes Good
Winium / WinAppDriver ; QTWebDriver Yes / No
PyWinAuto Yes / No
BDD support – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
BDD support
Tool BDD Mark
TestComplete Desktop No Bad
Unified Functional Testing No Bad
Ranorex Yes Good
Telerik Test Studio Yes Good
Zeenyx AscentialTest No Bad
MS VS Coded UI ; CUIT Yes Good
AUTOIT No Bad
Sikuli Yes Good
Jubula No Bad
Robot Framework Yes / No
Winium / WinAppDriver ; QTWebDriver Yes Good
PyWinAuto Yes Good
CI support – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
CI support
Tool CI Mark
TestComplete Desktop Automated Build Studio
Unified Functional Testing Jenkins plugin
Ranorex Jenkins
Telerik Test Studio Bamboo
Zeenyx AscentialTest Test Execution Management
MS VS Coded UI ; CUIT Any Good
AUTOIT - / Any
Sikuli - / Any Java-compatible
Jubula No Bad
Robot Framework Jenkins plugin
Winium / WinAppDriver ; QTWebDriver Any Good
PyWinAuto Any Good
Stakeholders oriented comparison criteria
Define our “standard” context
Approximate complexity of auto-test development
Tool Development Mark
TestComplete Desktop ~3h
Unified Functional Testing ~3h
Ranorex ~2h Good
Telerik Test Studio ~2h Good
Zeenyx AscentialTest ~2h Good
MS VS Coded UI ; CUIT ~3h ; 2h ; Good
AUTOIT ~1h Good
Sikuli ~2h Good
Jubula ~2h Good
Robot Framework ~4h
Winium / WinAppDriver ; QTWebDriver ~3h / 6h -> 2h / Bad -> Good
PyWinAuto ~1h Good
Approximate complexity of auto-test support (per year)
Tool Support Mark
TestComplete Desktop ~3h Bad
Unified Functional Testing ~3h Bad
Ranorex ~2h Good
Telerik Test Studio ~2h Good
Zeenyx AscentialTest ~3h Bad
MS VS Coded UI ; CUIT ~2h ; 1h Good
AUTOIT ~4h Bad
Sikuli ~5h Bad
Jubula ~2h Good
Robot Framework ~1h Good
Winium / WinAppDriver ; QTWebDriver ~2h / 10h -> 1h Good / Bad -> Good
PyWinAuto ~2h Good
Approximate “entrance” level – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10.Jubula
11.Robot Framework
12.Winium
13.WinAppDriver
14.QTWebDriver
15.PyWinAuto
Approximate “entrance” level
Tool Level
TestComplete Desktop High
Unified Functional Testing High
Ranorex
Telerik Test Studio
Zeenyx AscentialTest
MS VS Coded UI ; CUIT High
AUTOIT Low
Sikuli Low
Jubula
Robot Framework High
Winium / WinAppDriver ; QTWebDriver High ->
PyWinAuto
Required “technical skills” level – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Required “technical skills” level
Tool Level
TestComplete Desktop
Unified Functional Testing
Ranorex
Telerik Test Studio
Zeenyx AscentialTest Low
MS VS Coded UI ; CUIT High ;
AUTOIT Low
Sikuli Low
Jubula Low
Robot Framework High
Winium / WinAppDriver ; QTWebDriver High ->
PyWinAuto Low
Test readability – “the worst” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
Test readability
Tool Level
TestComplete Desktop
Unified Functional Testing
Ranorex
Telerik Test Studio
Zeenyx AscentialTest High
MS VS Coded UI ; CUIT
AUTOIT Low
Sikuli High
Jubula High
Robot Framework -> High
Winium / WinAppDriver ; QTWebDriver -> High
PyWinAuto High
How fast tests run – “the best” tool?
1. TestComplete Desktop
2. Unified Functional Testing (UFT)
3. Ranorex
4. Telerik Test Studio
5. Zeenyx AscentialTest
6. MS VS Coded UI
7. CUIT
8. AUTOIT
9. Sikuli
10. Jubula
11. Robot Framework
12. Winium
13. WinAppDriver
14. QTWebDriver
15. PyWinAuto
How fast tests run
Tool Level
TestComplete Desktop Bad
Unified Functional Testing Bad
Ranorex
Telerik Test Studio
Zeenyx AscentialTest
MS VS Coded UI ; CUIT Good
AUTOIT Good
Sikuli Bad
Jubula Bad
Robot Framework Good
Winium / WinAppDriver ; QTWebDriver Good
PyWinAuto Good
Ability to re-use "Business-Logic" layer
Tool “BDD” Mark
TestComplete Desktop No Bad
Unified Functional Testing No Bad
Ranorex Yes Good
Telerik Test Studio Yes Good
Zeenyx AscentialTest No Bad
MS VS Coded UI ; CUIT Yes Good
AUTOIT No Bad
Sikuli Yes Good
Jubula No Bad
Robot Framework Yes Good
Winium / WinAppDriver ; QTWebDriver Yes Good
PyWinAuto Yes Good
Summarized scores
How to calculate scores
How to use scores
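The summarized scores below can be reproduced with simple arithmetic. A minimal sketch, assuming equal weights and the mapping Good = +1, Bad = -1, no mark = 0; the criterion names and marks in the example are illustrative only, not the exact data from the tables:

```python
# Minimal scoring sketch: Good = +1, Bad = -1, no mark = 0 (equal weights assumed).
MARK_VALUE = {"Good": 1, "Bad": -1, None: 0}

def score(marks):
    """Roll up per-criterion marks into a single score for one tool."""
    return sum(MARK_VALUE[mark] for mark in marks.values())

# Illustrative marks for one tool; names and values are assumptions.
example_marks = {
    "Licensing": "Good",
    "Supported platforms": None,
    "DDT support": "Good",
    "Maturity": "Bad",
}
print(score(example_marks))  # -> 1
```

Per-category scores (stakeholders, mixed, tech) are such sums over the criteria in that category, and the summarized score is their total.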
Stakeholders oriented score
Tool Score
TestComplete Desktop -2
Unified Functional Testing -2
Ranorex +3
Telerik Test Studio +3
Zeenyx AscentialTest +1
MS VS Coded UI ; CUIT +1
AUTOIT +1
Sikuli +3
Jubula +2
Robot Framework +2
Winium / WinAppDriver ; QTWebDriver +2
PyWinAuto +6
Mixed score
Tool Score
TestComplete Desktop +1
Unified Functional Testing +1
Ranorex +1
Telerik Test Studio 0
Zeenyx AscentialTest -1
MS VS Coded UI ; CUIT -1
AUTOIT +1
Sikuli +4
Jubula +1
Robot Framework +2
Winium / WinAppDriver ; QTWebDriver -2
PyWinAuto -1
Tech stuff oriented score
Tool Score
TestComplete Desktop +2
Unified Functional Testing 0
Ranorex +4
Telerik Test Studio +4
Zeenyx AscentialTest -1
MS VS Coded UI ; CUIT +4
AUTOIT -6
Sikuli +4
Jubula +1
Robot Framework +4
Winium / WinAppDriver ; QTWebDriver +6
PyWinAuto +5
Summarized score
Tool Score
TestComplete Desktop +1
Unified Functional Testing -1
Ranorex +8
Telerik Test Studio +7
Zeenyx AscentialTest -1
MS VS Coded UI ; CUIT +4
AUTOIT -4
Sikuli +11
Jubula +4
Robot Framework +8
Winium / WinAppDriver ; QTWebDriver +6
PyWinAuto +10
How to define a proper tool based on selected criteria
How to
1. link information from presentation to QA Automation metrics
2. link information from presentation to Project Health Check
3. link information from presentation to QA Automation ROI
Tools tiny overview
Tools overview structure
1. Pros
2. Cons
3. What kind of project / product / problem / situation a certain tool could be used for
Example of tool usage structure
1. Plus several examples of each tool’s usage
— Example structure:
• Values:
— Value the individual
— Act as a team
— Strive for excellence
— Focus on customer
— Act with integrity
• Prisms:
— Technology
— Delivery
— Leadership
Project A
Test Complete Desktop
1. Pros
• Low entrance level
• High level of test scripts’ flexibility
• Huge knowledge base (roughly at MSDN level)
• Wide choice of scripting languages that resemble common languages
2. Cons
• Very expensive license
• Very specific proprietary scripting languages
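Since TestComplete supports Python as one of its scripting languages, a test script can look like ordinary Python. A minimal sketch that only runs inside TestComplete (which provides the Sys and Log objects); the process and window names are assumptions for illustration:

```python
# Runs inside a TestComplete Python script unit; Sys and Log are TestComplete globals.
def notepad_smoke_test():
    # The process and window names below are assumptions (a plain Notepad instance).
    notepad = Sys.Process("notepad")
    main_window = notepad.Window("Notepad", "*")
    main_window.Activate()
    Log.Message("Main window exists: " + str(main_window.Exists))
```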
Unified Functional Testing
1. Pros
• Low “entrance” level
• High level of test scripts’ flexibility
• Good tech support
2. Cons
• Tight coupling with other HP solutions
• Very specific own DSL
Ranorex
1. Pros
• Low “entrance” level
• Script tests are written in common languages (C#, VB.Net)
• Good tech support
2. Cons
• Paid license
Telerik Test Studio (Desktop)
1. Pros
• Low “entrance” level
• Great parameterization of Keyword tests
• DDT support using common formats (CSV, XLS, DB)
• Converting tests to common languages (C#, VB.NET)
2. Cons
• WPF applications only
Zeenyx
1. Pros
• Supports complex logic
• Great organization of DDT
• Support for standard .NET libraries
2. Cons
• Takes time to learn
• Specific own DSL
MS VS Coded UI
1. Pros
• “Native” for Windows
• Supports a huge set of UI technologies
• Generated UI Map
• Ready-to-go infrastructure
• Good documentation and support
2. Cons
• License cost
• Relatively “low level” API
MS VS Coded UI + CUIT
1. Pros
• The same as for MS VS Coded UI
• Elegant “High level” API
2. Cons
• The same as for MS VS Coded UI
AutoIT
1. Pros
• Easy
• Universal
• Free
2. Cons
• There are no ready-to-use verification instruments
• Test = exe file
• There are no ready-to-use reports
Sikuli
1. Pros
• IDE is easy to learn and use
• Standard actions pack
• Supports writing tests in common languages (Java, Python)
• Can work on different platforms and with any application
• Free
2. Cons
• Low test reliability
• Slow test execution
• No ability to work with text
• Tests are complicated to maintain
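A Sikuli script is plain Jython plus image-based actions, which is why the entrance level is low but maintenance is costly: every UI change can invalidate the screenshots. A minimal sketch for the Sikuli IDE; the .png names are assumptions referring to screenshots captured beforehand:

```python
# Runs in the Sikuli IDE (Jython); wait/click/type and Key are Sikuli built-ins.
# The image file names are assumptions: screenshots of the target controls.
wait("app_icon.png", 10)        # wait up to 10 s for the application icon
click("app_icon.png")           # launch the application
wait("file_name_field.png", 15)
click("file_name_field.png")
type("report.txt" + Key.ENTER)  # type into the focused control and confirm
```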
Jubula
1. Pros
• IDE is easy to use
• Supports working from a requirements base
• Integrated DB for storing test data and results
• Free
2. Cons
• Lacks the flexibility that is inherent to scripted tests
• No CI support
Robot Framework
1. Pros
• Its own simple, easy-to-read keyword-based language
• Plugins for different IDEs
• Works with different OSes
• Support for different programming languages
• Tools for creating your own libraries
• Free
2. Cons
• High entrance level
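Robot Framework keywords are usually backed by plain Python libraries; this is also how desktop actions (via AutoIt, pywinauto and similar) are typically wired in. A minimal sketch of a user-defined keyword library; the library and keyword names are assumptions:

```python
# DesktopLibrary.py -- a hypothetical Robot Framework keyword library.
# Every public method becomes a keyword, e.g. "Start Application".
import subprocess

class DesktopLibrary:
    ROBOT_LIBRARY_SCOPE = "TEST SUITE"

    def __init__(self):
        self._process = None

    def start_application(self, path):
        """Launch the application under test."""
        self._process = subprocess.Popen([path])

    def application_should_be_running(self):
        """Fail the current test if the application has exited."""
        if self._process is None or self._process.poll() is not None:
            raise AssertionError("Application is not running")
```

In a .robot suite the file would be imported with "Library    DesktopLibrary.py" and the methods called as keywords.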
Winium
1. Pros
• Familiar syntax and API
• Supports all the languages that are supported by Selenium WebDriver
• Free
2. Cons
• “Immature” testing tool
• Limited ways of locating elements
• A lack of documentation
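Winium.Desktop speaks the WebDriver protocol, so a Selenium 3-era Python client can drive it directly. A minimal sketch, assuming Winium.Desktop.Driver.exe is already running locally on its default port 9999; the application path and element name are assumptions:

```python
# Requires a Selenium 3.x Python client and a running Winium.Desktop.Driver.exe.
from selenium import webdriver

driver = webdriver.Remote(
    command_executor="http://localhost:9999",
    desired_capabilities={"app": r"C:\Windows\System32\calc.exe"},
)
try:
    driver.find_element_by_name("Seven").click()  # locate a button by its accessible name
finally:
    driver.quit()
```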
WinAppDriver
1. Pros
• Familiar syntax and API
• “Native” for Windows
• Free
2. Cons
• “Immature” testing tool
• Complicated in special-case usage
• A lack of documentation
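WinAppDriver also exposes a WebDriver endpoint (by default http://127.0.0.1:4723) and is usually driven through the Appium Python client. A minimal sketch in the spirit of Microsoft's Calculator samples; an older Appium/Selenium 3-era client is assumed:

```python
# Requires WinAppDriver.exe running locally and an Appium Python client (Selenium 3 era).
from appium import webdriver

caps = {
    "app": "Microsoft.WindowsCalculator_8wekyb3d8bbwe!App",
    "platformName": "Windows",
    "deviceName": "WindowsPC",
}
driver = webdriver.Remote("http://127.0.0.1:4723", caps)
try:
    driver.find_element_by_name("One").click()
    driver.find_element_by_name("Plus").click()
    driver.find_element_by_name("Two").click()
    driver.find_element_by_name("Equals").click()
finally:
    driver.quit()
```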
QTWebDriver
1. Pros
• Familiar syntax and API
• QT Applications oriented / “Native” (unique tool)
• Free
2. Cons
• “Immature” testing tool
• Complicated in special-case usage
• A lack of documentation
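QTWebDriver likewise exposes a WebDriver endpoint for the Qt widget tree, so the same Selenium client style applies. A rough sketch only: the default port, the capability and the XPath locator below are assumptions and depend on how the WD server is embedded into the AUT:

```python
# Assumes the AUT embeds/starts the qtwebdriver WD server (default port 9517 assumed)
# and a Selenium 3-era Python client is installed.
from selenium import webdriver

driver = webdriver.Remote(
    command_executor="http://localhost:9517",
    desired_capabilities={"browserStartWindow": "*"},  # attach to the open window (assumption)
)
try:
    driver.find_element_by_xpath("//QPushButton[@id='okButton']").click()
finally:
    driver.quit()
```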
PyWinAuto
1. Pros
• Extremely simple to use
• Easy to support
• Free
2. Cons
• Does not support all popular UI technologies
• CPython only
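A minimal pywinauto sketch, assuming Windows with a standard notepad.exe; the control names correspond to the English Notepad UI and may differ by locale:

```python
from pywinauto.application import Application

# Launch Notepad with the Win32 backend (assumes notepad.exe is on PATH).
app = Application(backend="win32").start("notepad.exe")
main = app.window(title_re=".*Notepad")
main.wait("visible", timeout=10)
main.Edit.type_keys("Hello from pywinauto", with_spaces=True)
app.kill()  # close the application under test
```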
Define a Trend! Is it possible ..?
Trend
1. There is a potential leader for Desktop Automation
Why so?
“Scientific” technical proof of the Trend
• Hegel’s dialectics
• The mathematical apparatus of bifurcations (Bifurcation Theory)
• Sedov’s law of hierarchical compensation
• Panov–Snooks Vertical
• Big History
Why so?
Non-technical scientific proof of the Trend
• Peter Drucker “Management. Challenges for the 21st Century”
Note: this is a topic for a whole separate conversation, and I’m sure we’re going to
get back to it, but not today…
How to
1. use this presentation on different project phases
2. use this presentation based on main project roles
What’s next (just a possible way)
• Shu
1. Use Presentation
1. Please follow the recommendations
a) “How to select proper criteria for your project”
b) “How to define a proper tool based on selected criteria”
c) “How to link information from presentation to QA Automation metrics”
d) “How to link information from presentation to Project Health Check”
e) “How to link information from presentation to QA Automation ROI”
f) “How to use this presentation on different project phases”
g) “How to use this presentation based on main project roles”
What’s next
• Ha
1. Update the set of criteria
2. Update the set of tools
3. Update the Presentation
4. Read the “Scientific” proof of the Trend
What’s next
• Ri
1. Re-read the “Scientific” proof of the Trend
2. Update the set of criteria
3. Update the set of tools
4. Update the Presentation
5. Predict the “Trend”
6. Manage the “Trend”
Next iteration
• Move from static (Presentation) to dynamic (Application)
• For example, “https://telescope.epam.com”
CONTACT ME
Anton_Semenchenko@epam.com
semenchenko_anton_v
https://www.linkedin.com/in/anton-semenchenko-612a926b
https://www.facebook.com/semenchenko.anton.v
https://twitter.com/comaqa
Thanks for your attention
Anton Semenchenko
DPI.Solutions
EPAM Systems
www.comaqa.by
www.corehard.by

More Related Content

What's hot

QA/Test Engineering Perspectives
QA/Test Engineering PerspectivesQA/Test Engineering Perspectives
QA/Test Engineering Perspectives
Roopesh Kohad
 
Shift left as first transformation step into Quality Assurance
Shift left as first transformation step into Quality AssuranceShift left as first transformation step into Quality Assurance
Shift left as first transformation step into Quality Assurance
Zbyszek Mockun
 
Build & Release Engineering
Build & Release Engineering Build & Release Engineering
Build & Release Engineering
Pranesh Vittal
 
Agile Testing by Example
Agile Testing by ExampleAgile Testing by Example
Agile Testing by Example
Mikalai Alimenkou
 
Team Foundation Server Process Templates For Effective Project Management
Team Foundation Server Process Templates For Effective Project ManagementTeam Foundation Server Process Templates For Effective Project Management
Team Foundation Server Process Templates For Effective Project Management
Aaron Bjork
 
Teamwork and agile methodologies
Teamwork and agile methodologiesTeamwork and agile methodologies
Teamwork and agile methodologies
Stefano Paluello
 
Manual testing1
Manual testing1Manual testing1
Manual testing1
Raghu Sirka
 
xUnit and TDD: Why and How in Enterprise Software, August 2012
xUnit and TDD: Why and How in Enterprise Software, August 2012xUnit and TDD: Why and How in Enterprise Software, August 2012
xUnit and TDD: Why and How in Enterprise Software, August 2012
Justin Gordon
 
Extreme programming (xp)
Extreme programming (xp)Extreme programming (xp)
Extreme programming (xp)
Mohamed Abdelrahman
 
Tech talks #1- Unit testing and TDD
Tech talks #1- Unit testing and TDDTech talks #1- Unit testing and TDD
Tech talks #1- Unit testing and TDD
DUONG Trong Tan
 
agile vs. traditional methodologies
agile vs. traditional methodologies agile vs. traditional methodologies
agile vs. traditional methodologies
SWE department, Bogazici university
 
Introduction to the Agile Methods
Introduction to the Agile MethodsIntroduction to the Agile Methods
Introduction to the Agile Methods
softwareacademy
 
Agile Engineering Practices
Agile Engineering PracticesAgile Engineering Practices
Agile Engineering Practices
Vernon Stinebaker
 
Testing in Agile Projects
Testing in Agile ProjectsTesting in Agile Projects
Testing in Agile Projects
sriks7
 
Saurav_kumar
Saurav_kumarSaurav_kumar
Saurav_kumar
Saurav Kumar
 
Agile Testing
Agile TestingAgile Testing
Agile Testing
Naresh Jain
 
Software development lifecycle_release_management
Software development lifecycle_release_managementSoftware development lifecycle_release_management
Software development lifecycle_release_management
netdbncku
 
Saurav_Kumar
Saurav_KumarSaurav_Kumar
Saurav_Kumar
Saurav Kumar
 
Sandeep Kamath Resume
Sandeep Kamath ResumeSandeep Kamath Resume
Sandeep Kamath Resume
Sandeep Kamath
 
Extreme Programming
Extreme ProgrammingExtreme Programming
Extreme Programming
Knoldus Inc.
 

What's hot (20)

QA/Test Engineering Perspectives
QA/Test Engineering PerspectivesQA/Test Engineering Perspectives
QA/Test Engineering Perspectives
 
Shift left as first transformation step into Quality Assurance
Shift left as first transformation step into Quality AssuranceShift left as first transformation step into Quality Assurance
Shift left as first transformation step into Quality Assurance
 
Build & Release Engineering
Build & Release Engineering Build & Release Engineering
Build & Release Engineering
 
Agile Testing by Example
Agile Testing by ExampleAgile Testing by Example
Agile Testing by Example
 
Team Foundation Server Process Templates For Effective Project Management
Team Foundation Server Process Templates For Effective Project ManagementTeam Foundation Server Process Templates For Effective Project Management
Team Foundation Server Process Templates For Effective Project Management
 
Teamwork and agile methodologies
Teamwork and agile methodologiesTeamwork and agile methodologies
Teamwork and agile methodologies
 
Manual testing1
Manual testing1Manual testing1
Manual testing1
 
xUnit and TDD: Why and How in Enterprise Software, August 2012
xUnit and TDD: Why and How in Enterprise Software, August 2012xUnit and TDD: Why and How in Enterprise Software, August 2012
xUnit and TDD: Why and How in Enterprise Software, August 2012
 
Extreme programming (xp)
Extreme programming (xp)Extreme programming (xp)
Extreme programming (xp)
 
Tech talks #1- Unit testing and TDD
Tech talks #1- Unit testing and TDDTech talks #1- Unit testing and TDD
Tech talks #1- Unit testing and TDD
 
agile vs. traditional methodologies
agile vs. traditional methodologies agile vs. traditional methodologies
agile vs. traditional methodologies
 
Introduction to the Agile Methods
Introduction to the Agile MethodsIntroduction to the Agile Methods
Introduction to the Agile Methods
 
Agile Engineering Practices
Agile Engineering PracticesAgile Engineering Practices
Agile Engineering Practices
 
Testing in Agile Projects
Testing in Agile ProjectsTesting in Agile Projects
Testing in Agile Projects
 
Saurav_kumar
Saurav_kumarSaurav_kumar
Saurav_kumar
 
Agile Testing
Agile TestingAgile Testing
Agile Testing
 
Software development lifecycle_release_management
Software development lifecycle_release_managementSoftware development lifecycle_release_management
Software development lifecycle_release_management
 
Saurav_Kumar
Saurav_KumarSaurav_Kumar
Saurav_Kumar
 
Sandeep Kamath Resume
Sandeep Kamath ResumeSandeep Kamath Resume
Sandeep Kamath Resume
 
Extreme Programming
Extreme ProgrammingExtreme Programming
Extreme Programming
 

Viewers also liked

Вячеслав Черников (Binwell) | Xamarin на практике
Вячеслав Черников (Binwell) | Xamarin на практике Вячеслав Черников (Binwell) | Xamarin на практике
Вячеслав Черников (Binwell) | Xamarin на практике
RIF-Technology
 
Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...
Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...
Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...
RIF-Technology
 
Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...
Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...
Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...
RIF-Technology
 
Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера
Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера
Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера
RIF-Technology
 
Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...
Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...
Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...
RIF-Technology
 
I, For One, Welcome Our New Robot Overlords
I, For One, Welcome Our New Robot OverlordsI, For One, Welcome Our New Robot Overlords
I, For One, Welcome Our New Robot Overlords
Steve Malsam
 
Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...
Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...
Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...
RIF-Technology
 
Continuous Deployment pipeline demonstration spiced with Robot Framework and ...
Continuous Deployment pipeline demonstration spiced with Robot Framework and ...Continuous Deployment pipeline demonstration spiced with Robot Framework and ...
Continuous Deployment pipeline demonstration spiced with Robot Framework and ...
Antti Pohjonen
 
Automation patterns on practice
Automation patterns on practiceAutomation patterns on practice
Automation patterns on practice
automated-testing.info
 
Robot framework
Robot frameworkRobot framework
Robot framework
Prayoch Rujira
 
Taming robotframework
Taming robotframeworkTaming robotframework
Taming robotframework
泰 増田
 
Test automation within a scrum process
Test automation within a scrum processTest automation within a scrum process
Test automation within a scrum process
Kushan Shalindra Amarasiri - Technical QE Specialist
 
Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...
Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...
Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...
Bhaskara Reddy Sannapureddy
 
Sphinx + robot framework = documentation as result of functional testing
Sphinx + robot framework = documentation as result of functional testingSphinx + robot framework = documentation as result of functional testing
Sphinx + robot framework = documentation as result of functional testing
plewicki
 
Robot Framework
Robot FrameworkRobot Framework
Robot Framework
Onur Baskirt
 
Test Cases are dead, long live Checklists!
Test Cases are dead, long live Checklists!Test Cases are dead, long live Checklists!
Test Cases are dead, long live Checklists!
SQALab
 
Functional Tests Automation with Robot Framework
Functional Tests Automation with Robot FrameworkFunctional Tests Automation with Robot Framework
Functional Tests Automation with Robot Framework
laurent bristiel
 
「うねり」を産んだ言葉と 自分なりの越境のやり方
「うねり」を産んだ言葉と 自分なりの越境のやり方「うねり」を産んだ言葉と 自分なりの越境のやり方
「うねり」を産んだ言葉と 自分なりの越境のやり方
Yoh Nakamura
 
ATAGTR2017 Expanding test horizons with Robot Framework
ATAGTR2017 Expanding test horizons with Robot FrameworkATAGTR2017 Expanding test horizons with Robot Framework
ATAGTR2017 Expanding test horizons with Robot Framework
Agile Testing Alliance
 
Robot Framework Introduction & Sauce Labs Integration
Robot Framework Introduction & Sauce Labs IntegrationRobot Framework Introduction & Sauce Labs Integration
Robot Framework Introduction & Sauce Labs Integration
Sauce Labs
 

Viewers also liked (20)

Вячеслав Черников (Binwell) | Xamarin на практике
Вячеслав Черников (Binwell) | Xamarin на практике Вячеслав Черников (Binwell) | Xamarin на практике
Вячеслав Черников (Binwell) | Xamarin на практике
 
Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...
Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...
Сергей Смирнов (Altair Engineering Inc.) | Организация работы распределенной ...
 
Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...
Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...
Ксения Стернина | (Mail.Ru Group)Gamer Experience Research на различных этапа...
 
Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера
Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера
Юрий Буянов | (Одноклассники)Нюансы разработки мобильного мессенджера
 
Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...
Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...
Алексей Раменский (Тэглайн) | Обзор рынка российской заказной веб-разработки ...
 
I, For One, Welcome Our New Robot Overlords
I, For One, Welcome Our New Robot OverlordsI, For One, Welcome Our New Robot Overlords
I, For One, Welcome Our New Robot Overlords
 
Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...
Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...
Олег Бунин (Онтико) | Менеджмент и бизнес-процессы в разработке highload-прое...
 
Continuous Deployment pipeline demonstration spiced with Robot Framework and ...
Continuous Deployment pipeline demonstration spiced with Robot Framework and ...Continuous Deployment pipeline demonstration spiced with Robot Framework and ...
Continuous Deployment pipeline demonstration spiced with Robot Framework and ...
 
Automation patterns on practice
Automation patterns on practiceAutomation patterns on practice
Automation patterns on practice
 
Robot framework
Robot frameworkRobot framework
Robot framework
 
Taming robotframework
Taming robotframeworkTaming robotframework
Taming robotframework
 
Test automation within a scrum process
Test automation within a scrum processTest automation within a scrum process
Test automation within a scrum process
 
Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...
Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...
Groovyscriptingformanualandautomationtestingusingrobotframework 141221014703-...
 
Sphinx + robot framework = documentation as result of functional testing
Sphinx + robot framework = documentation as result of functional testingSphinx + robot framework = documentation as result of functional testing
Sphinx + robot framework = documentation as result of functional testing
 
Robot Framework
Robot FrameworkRobot Framework
Robot Framework
 
Test Cases are dead, long live Checklists!
Test Cases are dead, long live Checklists!Test Cases are dead, long live Checklists!
Test Cases are dead, long live Checklists!
 
Functional Tests Automation with Robot Framework
Functional Tests Automation with Robot FrameworkFunctional Tests Automation with Robot Framework
Functional Tests Automation with Robot Framework
 
「うねり」を産んだ言葉と 自分なりの越境のやり方
「うねり」を産んだ言葉と 自分なりの越境のやり方「うねり」を産んだ言葉と 自分なりの越境のやり方
「うねり」を産んだ言葉と 自分なりの越境のやり方
 
ATAGTR2017 Expanding test horizons with Robot Framework
ATAGTR2017 Expanding test horizons with Robot FrameworkATAGTR2017 Expanding test horizons with Robot Framework
ATAGTR2017 Expanding test horizons with Robot Framework
 
Robot Framework Introduction & Sauce Labs Integration
Robot Framework Introduction & Sauce Labs IntegrationRobot Framework Introduction & Sauce Labs Integration
Robot Framework Introduction & Sauce Labs Integration
 

Similar to Антон Семенченко | (EPAM Systems, DPI.Solutions )Сравнительный анализ инструментов Desktop-ной автоматизации

summary
summarysummary
summary
ANSHU GOYAL
 
Continuous testing at scale
Continuous testing at scaleContinuous testing at scale
Continuous testing at scale
Gergely Orosz
 
Selenium Demo
Selenium DemoSelenium Demo
Overview and Analysis of Automated Testing Tools: Ranorex, Test Complete, Se...
Overview and Analysis of Automated Testing Tools:  Ranorex, Test Complete, Se...Overview and Analysis of Automated Testing Tools:  Ranorex, Test Complete, Se...
Overview and Analysis of Automated Testing Tools: Ranorex, Test Complete, Se...
IRJET Journal
 
Automation Open Source tools
Automation Open Source toolsAutomation Open Source tools
Automation Open Source tools
QA Club Kiev
 
Stepin evening presented
Stepin evening presentedStepin evening presented
Stepin evening presented
Vijayan Reddy
 
Abhiram_Bharadwaj_Resume -Both
Abhiram_Bharadwaj_Resume -BothAbhiram_Bharadwaj_Resume -Both
Abhiram_Bharadwaj_Resume -Both
Abhiram Bharadwaj
 
Top 10 Automation Testing Tools in 2020
Top 10 Automation Testing Tools in 2020Top 10 Automation Testing Tools in 2020
Top 10 Automation Testing Tools in 2020
Marianne Harness
 
The Best Automation Testing Tools To Use In 2022 | BMN Infotech
The Best Automation Testing Tools To Use In 2022 | BMN InfotechThe Best Automation Testing Tools To Use In 2022 | BMN Infotech
The Best Automation Testing Tools To Use In 2022 | BMN Infotech
BMN Infotech
 
7 automated visual testing tools for you
7 automated visual testing tools for you7 automated visual testing tools for you
7 automated visual testing tools for you
OpenSense Labs
 
Reliable application tests for ui5 apps
Reliable application tests for ui5 appsReliable application tests for ui5 apps
Reliable application tests for ui5 apps
Maxim Naidenov
 
Mobitop
MobitopMobitop
Mobitop
MobitopMobitop
Mobitop
MobitopMobitop
Mobitop
MobitopMobitop
Automation testing
Automation testingAutomation testing
Automation testing
Mona M. Abd El-Rahman
 
STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...
STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...
STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...
Anna Russo
 
Qtp (2)
Qtp (2)Qtp (2)
Qtp (2)
soujanya k
 
2.Android App Development_ Types of Automated Unit Tests.pdf
2.Android App Development_ Types of Automated Unit Tests.pdf2.Android App Development_ Types of Automated Unit Tests.pdf
2.Android App Development_ Types of Automated Unit Tests.pdf
Belayet Hossain
 
Testing Android Application, Droidcon Torino
Testing Android Application, Droidcon TorinoTesting Android Application, Droidcon Torino
Testing Android Application, Droidcon Torino
Pietro Alberto Rossi
 

Similar to Антон Семенченко | (EPAM Systems, DPI.Solutions )Сравнительный анализ инструментов Desktop-ной автоматизации (20)

summary
summarysummary
summary
 
Continuous testing at scale
Continuous testing at scaleContinuous testing at scale
Continuous testing at scale
 
Selenium Demo
Selenium DemoSelenium Demo
Selenium Demo
 
Overview and Analysis of Automated Testing Tools: Ranorex, Test Complete, Se...
Overview and Analysis of Automated Testing Tools:  Ranorex, Test Complete, Se...Overview and Analysis of Automated Testing Tools:  Ranorex, Test Complete, Se...
Overview and Analysis of Automated Testing Tools: Ranorex, Test Complete, Se...
 
Automation Open Source tools
Automation Open Source toolsAutomation Open Source tools
Automation Open Source tools
 
Stepin evening presented
Stepin evening presentedStepin evening presented
Stepin evening presented
 
Abhiram_Bharadwaj_Resume -Both
Abhiram_Bharadwaj_Resume -BothAbhiram_Bharadwaj_Resume -Both
Abhiram_Bharadwaj_Resume -Both
 
Top 10 Automation Testing Tools in 2020
Top 10 Automation Testing Tools in 2020Top 10 Automation Testing Tools in 2020
Top 10 Automation Testing Tools in 2020
 
The Best Automation Testing Tools To Use In 2022 | BMN Infotech
The Best Automation Testing Tools To Use In 2022 | BMN InfotechThe Best Automation Testing Tools To Use In 2022 | BMN Infotech
The Best Automation Testing Tools To Use In 2022 | BMN Infotech
 
7 automated visual testing tools for you
7 automated visual testing tools for you7 automated visual testing tools for you
7 automated visual testing tools for you
 
Reliable application tests for ui5 apps
Reliable application tests for ui5 appsReliable application tests for ui5 apps
Reliable application tests for ui5 apps
 
Mobitop
MobitopMobitop
Mobitop
 
Mobitop
MobitopMobitop
Mobitop
 
Mobitop
MobitopMobitop
Mobitop
 
Mobitop
MobitopMobitop
Mobitop
 
Automation testing
Automation testingAutomation testing
Automation testing
 
STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...
STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...
STARWEST 2010 - 7 Steps To Improving Software Quality using Microsoft Test Ma...
 
Qtp (2)
Qtp (2)Qtp (2)
Qtp (2)
 
2.Android App Development_ Types of Automated Unit Tests.pdf
2.Android App Development_ Types of Automated Unit Tests.pdf2.Android App Development_ Types of Automated Unit Tests.pdf
2.Android App Development_ Types of Automated Unit Tests.pdf
 
Testing Android Application, Droidcon Torino
Testing Android Application, Droidcon TorinoTesting Android Application, Droidcon Torino
Testing Android Application, Droidcon Torino
 

Recently uploaded

PCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase TeamPCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase Team
ControlCase
 
“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”
Claudio Di Ciccio
 
Full-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalizationFull-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalization
Zilliz
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
Neo4j
 
Microsoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdfMicrosoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdf
Uni Systems S.M.S.A.
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
Ana-Maria Mihalceanu
 
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
Neo4j
 
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
SOFTTECHHUB
 
How to use Firebase Data Connect For Flutter
How to use Firebase Data Connect For FlutterHow to use Firebase Data Connect For Flutter
How to use Firebase Data Connect For Flutter
Daiki Mogmet Ito
 
Pushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 daysPushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 days
Adtran
 
Uni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems Copilot event_05062024_C.Vlachos.pdfUni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems S.M.S.A.
 
Building RAG with self-deployed Milvus vector database and Snowpark Container...
Building RAG with self-deployed Milvus vector database and Snowpark Container...Building RAG with self-deployed Milvus vector database and Snowpark Container...
Building RAG with self-deployed Milvus vector database and Snowpark Container...
Zilliz
 
20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
Matthew Sinclair
 
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
Edge AI and Vision Alliance
 
Video Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the FutureVideo Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the Future
Alpen-Adria-Universität
 
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
SOFTTECHHUB
 
National Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practicesNational Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practices
Quotidiano Piemontese
 
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Speck&Tech
 
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
James Anderson
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Albert Hoitingh
 

Recently uploaded (20)

PCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase TeamPCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase Team
 
“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”“I’m still / I’m still / Chaining from the Block”
“I’m still / I’m still / Chaining from the Block”
 
Full-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalizationFull-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalization
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
 
Microsoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdfMicrosoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdf
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
 
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
 
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
 
How to use Firebase Data Connect For Flutter
How to use Firebase Data Connect For FlutterHow to use Firebase Data Connect For Flutter
How to use Firebase Data Connect For Flutter
 
Pushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 daysPushing the limits of ePRTC: 100ns holdover for 100 days
Pushing the limits of ePRTC: 100ns holdover for 100 days
 
Uni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems Copilot event_05062024_C.Vlachos.pdfUni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems Copilot event_05062024_C.Vlachos.pdf
 
Building RAG with self-deployed Milvus vector database and Snowpark Container...
Building RAG with self-deployed Milvus vector database and Snowpark Container...Building RAG with self-deployed Milvus vector database and Snowpark Container...
Building RAG with self-deployed Milvus vector database and Snowpark Container...
 
20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
 
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
 
Video Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the FutureVideo Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the Future
 
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
 
National Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practicesNational Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practices
 
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
 
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
 

Антон Семенченко | (EPAM Systems, DPI.Solutions )Сравнительный анализ инструментов Desktop-ной автоматизации

  • 2. Agenda, part 1 (general) 1. Problem 2. Solutions 2016
  • 3. Agenda, part 2 (tools and criteria's) 1. Tools to be compared (15) 2. How why we selected this list of tools? 3. Comparison criteria types (3) 4. Stakeholders oriented comparison criteria (7) 5. Mixed comparison criteria (7) 6. Tech stuff oriented comparison criteria (8) 7. How why we selected these lists of criteria's? 8. How to select proper criteria's for your project
  • 4. Agenda, part 3 (comparison analyses) 1. Mixed comparison criteria 2. Tech stuff oriented comparison criteria 3. Stakeholders oriented comparison criteria 4. Define our “standard” context 5. Summarized scores 6. How to calculate scores 7. How to use scores / presentation 8. 4 summarized tables
  • 5. Agenda, part 4 (tools, “how to” and examples) 1. How to define proper tool based on selected criteria's 2. How to link information from presentation to QA Automation metrics 3. How to link information from presentation to Project Health Check 4. How to link information from presentation to QA Automation ROI 5. Tools tiny overview 6. Tools overview structure 7. Example of tool usage structure
  • 6. Agenda, part 5 (trends, science and “what’s next”) 1. Define a Trend! Is it possible ..? 2. Trend – an option 3. Why so? 4. What’s next
  • 7. Problem • There is an implicit leader for Web automation
  • 8. Problem • It’s not that simple if to talk about desktop apps
  • 9. Tools to be compared • TestComplete Desktop • Unified Functional Testing (UFT) • Ranorex • Telerik Test Studio • Zeenyx AscentialTest • MS VS Coded UI • CUIT • AUTOIT • Sikuli • Jubula • Robot Framework • Winium • WinAppDriver • QTWebDriver • PyWinAuto
  • 10. How why we selected this list of tools?
  • 11. Comparison criteria types 1. Stakeholders oriented 2. Tech stuff oriented 3. Mixed
  • 12. Stakeholders oriented comparison criteria 1. Approximate complexity of auto-test development 2. Approximate complexity of auto-test support 3. Approximate “entrance” level 4. Required technical skills level 5. Tests readability 6. How fast tests run 7. Ability to re-use "Business-Logic" layer in other technical context
  • 13. Mixed comparison criteria 1. Supported platforms 2. Supported technologies 3. Licensing 4. Maturity 5. Record-Play system support 6. Standard actions pack
  • 14. Tech stuff oriented comparison criteria 1. Programming languages support 2. Have tools for mapping 3. Self-Made architecture support 4. Data-Driven testing support 5. Test-Driven development support 6. Key-word driven 7. Behavior Driven Development support 8. Continues integration system support
  • 15. How why we selected these lists of criteria's?
  • 16. How to select proper criteria's for your project
  • 18. Supported platforms – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 19. Supported platforms Tool Platforms Mark TestComplete Desktop Windows Unified Functional Testing Windows Ranorex Windows Telerik Test Studio Windows Zeenyx AscentialTest Windows MS VS Coded UI ; CUIT Windows AUTOIT Windows Sikuli Windows, Unix-like Good Jubula Windows, Unix-like Good Robot Framework Windows, Unix-like Good Winium / WinAppDriver ; QTWebDriver Windows / Windows; Cross-Platform / ; Good PyWinAuto Windows
  • 20. Supported technologies – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10.Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 21. Supported technologies Tool Technologies Mark TestComplete Desktop C/C++, WinForms, WPF, Java, Qt Unified Functional Testing WinForms, WPF, Java, SAP Ranorex WinForms, WPF, Java, Qt, SAP Telerik Test Studio WPF Bad Zeenyx AscentialTest Win Forms, WPF, Java Bad MS VS Coded UI ; CUIT Win Forms (partial), WPF Bad AUTOIT OS level Good Sikuli Image recognition based Good Jubula WinForms, WPF, Java Bad Robot Framework Uses AutoIT (and co inside) Good Winium / WinAppDriver ; QTWebDriver WinForms, WPF / Any ; QT Bad PyWinAuto Win32 API, WinForms (partial, Win32 API bases) Bad
  • 22. Licensing – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 23. Licensing. TestComplete Desktop: paid (Bad); Unified Functional Testing: paid (Bad); Ranorex: paid (Bad); Telerik Test Studio: paid (Bad); Zeenyx AscentialTest: paid (Bad); MS VS Coded UI / CUIT: paid (Bad); AUTOIT: free; Sikuli: open source (Good); Jubula: open source (Good); Robot Framework: open source (Good); Winium / WinAppDriver / QTWebDriver: open source (Good); PyWinAuto: open source (Good)
  • 24. Maturity – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 25. Maturity. TestComplete Desktop: Good; Unified Functional Testing: Good; Ranorex: Good; Telerik Test Studio: Good; Zeenyx AscentialTest: (not rated); MS VS Coded UI / CUIT: Good; AUTOIT: (not rated); Sikuli: (not rated); Jubula: (not rated); Robot Framework: (not rated); Winium / WinAppDriver / QTWebDriver: Bad; PyWinAuto: (not rated)
  • 26. Record-Play support – do we really need it?
  • 27. Record-Play support. TestComplete Desktop: yes (Good); Unified Functional Testing: yes (Good); Ranorex: yes (Good); Telerik Test Studio: yes (Good); Zeenyx AscentialTest: no; MS VS Coded UI / CUIT: no; AUTOIT: no; Sikuli: no; Jubula: no; Robot Framework: no; Winium / WinAppDriver / QTWebDriver: no; PyWinAuto: no
  • 28. Standard actions pack – do we really need it?
  • 29. Standard actions pack. TestComplete Desktop: no; Unified Functional Testing: no; Ranorex: no; Telerik Test Studio: no; Zeenyx AscentialTest: yes (Good); MS VS Coded UI / CUIT: no; AUTOIT: no; Sikuli: yes (Good); Jubula: yes (Good); Robot Framework: no; Winium / WinAppDriver / QTWebDriver: no; PyWinAuto: yes / no (via SWAPY)
  • 30. Tech stuff oriented comparison criteria
  • 31. Programming languages – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 32. Programming languages support. TestComplete Desktop: Python, C#Script, JScript, C++Script, VBScript, DelphiScript (Good); Unified Functional Testing: VBScript (Bad); Ranorex: C#, VB.NET; Telerik Test Studio: C#, VB.NET; Zeenyx AscentialTest: own DSL (Bad); MS VS Coded UI / CUIT: C#, VB.NET; AUTOIT: own Basic-like language (Bad); Sikuli: Jython, Java; Jubula: -; Robot Framework: own DSL, Java, Python; Winium / WinAppDriver / QTWebDriver: Java, JavaScript, PHP, Python, Ruby, C# (Good); PyWinAuto: CPython
  • 33. Tools for mapping – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 34. Tools for mapping. TestComplete Desktop: yes (Good); Unified Functional Testing: yes (Good); Ranorex: yes (Good); Telerik Test Studio: yes (Good); Zeenyx AscentialTest: yes / no (Good); MS VS Coded UI / CUIT: no; AUTOIT: no; Sikuli: yes / no; Jubula: yes (Good); Robot Framework: no; Winium / WinAppDriver / QTWebDriver: no; PyWinAuto: no
  • 35. Custom architecture – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 36. Custom architecture. TestComplete Desktop: yes / no; Unified Functional Testing: yes / no; Ranorex: yes / no; Telerik Test Studio: yes / no; Zeenyx AscentialTest: no (Bad); MS VS Coded UI / CUIT: yes (Good); AUTOIT: no (Bad); Sikuli: yes (Good); Jubula: no / yes; Robot Framework: yes (Good); Winium / WinAppDriver / QTWebDriver: yes (Good); PyWinAuto: yes (Good)
  • 37. DDT support – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 38. DDT support. TestComplete Desktop: yes (Good); Unified Functional Testing: yes (Good); Ranorex: yes (Good); Telerik Test Studio: yes (Good); Zeenyx AscentialTest: yes (Good); MS VS Coded UI / CUIT: yes (Good); AUTOIT: no (Bad); Sikuli: yes / no (Good); Jubula: yes (Good); Robot Framework: yes (Good); Winium / WinAppDriver / QTWebDriver: yes (Good); PyWinAuto: yes (Good)
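To make the DDT criterion concrete: data-driven testing simply means the same test steps are replayed for every row of an external data set. A minimal, tool-agnostic sketch in Python; the CSV name, its columns and the login() helper are hypothetical placeholders for whatever API the selected tool exposes:

```python
import csv

def login(username, password):
    # hypothetical helper: in a real suite this would drive the AUT
    # through whichever tool from the comparison was selected
    return "ok"

def run_login_test(username, password, expected):
    result = login(username, password)
    assert result == expected, f"{username}: expected {expected}, got {result}"

# every row of the external data file becomes one run of the same steps -
# that is all "DDT support" means in the table above
with open("login_data.csv", newline="") as data:
    for row in csv.DictReader(data):
        run_login_test(row["username"], row["password"], row["expected"])
```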
  • 39. TDD support – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 40. TDD support. TestComplete Desktop: yes / no (Good); Unified Functional Testing: yes / no (Good); Ranorex: yes / no (Good); Telerik Test Studio: yes / no (Good); Zeenyx AscentialTest: no (Bad); MS VS Coded UI / CUIT: yes (Good); AUTOIT: no (Bad); Sikuli: yes / no (Good); Jubula: yes (Good); Robot Framework: yes (Good); Winium / WinAppDriver / QTWebDriver: yes (Good); PyWinAuto: yes (Good)
  • 41. Key-word driven – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 42. Key-word driven support. TestComplete Desktop: no (Bad); Unified Functional Testing: no (Bad); Ranorex: yes / no; Telerik Test Studio: yes / no; Zeenyx AscentialTest: yes (Good); MS VS Coded UI / CUIT: yes / no; AUTOIT: no (Bad); Sikuli: yes / no; Jubula: no (Bad); Robot Framework: yes (Good); Winium / WinAppDriver / QTWebDriver: yes / no; PyWinAuto: yes / no
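For readers unfamiliar with the keyword-driven criterion: test cases are plain tables of keywords plus arguments, and only the keyword implementations touch the tool's API. A minimal sketch in Python; the keyword names and the sample test case are invented for illustration:

```python
# keywords are small functions; test cases are plain data that a
# non-programmer could maintain in Excel/CSV
def open_app(path):
    print(f"starting {path}")          # stand-in for the real tool call

def type_text(control, text):
    print(f"typing {text!r} into {control}")

def click(control):
    print(f"clicking {control}")

KEYWORDS = {"Open App": open_app, "Type Text": type_text, "Click": click}

TEST_CASE = [                          # invented example test case
    ("Open App", ["calc.exe"]),
    ("Type Text", ["display", "2+2"]),
    ("Click", ["equals"]),
]

for keyword, args in TEST_CASE:
    KEYWORDS[keyword](*args)
```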
  • 43. BDD support – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 44. BDD support. TestComplete Desktop: no (Bad); Unified Functional Testing: no (Bad); Ranorex: yes (Good); Telerik Test Studio: yes (Good); Zeenyx AscentialTest: no (Bad); MS VS Coded UI / CUIT: yes (Good); AUTOIT: no (Bad); Sikuli: yes (Good); Jubula: no (Bad); Robot Framework: yes / no; Winium / WinAppDriver / QTWebDriver: yes (Good); PyWinAuto: yes (Good)
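BDD support here means the tool (or the language it is driven from) can be wired into a Given/When/Then runner. As a hedged illustration, step definitions for the Python `behave` runner might look like this; the scenario text and the in-memory "calculator" are placeholders, and a real step would drive the AUT through the chosen tool:

```python
# steps/calculator_steps.py - step definitions for a Gherkin scenario such as:
#   Given the calculator is open
#   When I add 2 and 2
#   Then the result is 4
from behave import given, when, then

@given("the calculator is open")
def step_open(context):
    context.result = None          # a real step would launch the AUT here

@when("I add {a:d} and {b:d}")
def step_add(context, a, b):
    context.result = a + b         # a real step would drive the UI instead

@then("the result is {expected:d}")
def step_check(context, expected):
    assert context.result == expected
```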
  • 45. CI support – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 46. CI support. TestComplete Desktop: Automated Build Studio; Unified Functional Testing: Jenkins plugin; Ranorex: Jenkins; Telerik Test Studio: Bamboo; Zeenyx AscentialTest: Test Execution Management; MS VS Coded UI / CUIT: any (Good); AUTOIT: - / any; Sikuli: - / any Java-compatible; Jubula: no (Bad); Robot Framework: Jenkins plugin; Winium / WinAppDriver / QTWebDriver: any (Good); PyWinAuto: any (Good)
  • 49. Approximate complexity of auto-test development. TestComplete Desktop: ~3h; Unified Functional Testing: ~3h; Ranorex: ~2h (Good); Telerik Test Studio: ~2h (Good); Zeenyx AscentialTest: ~2h (Good); MS VS Coded UI: ~3h; CUIT: ~2h (Good); AUTOIT: ~1h (Good); Sikuli: ~2h (Good); Jubula: ~2h (Good); Robot Framework: ~4h; Winium: ~3h; WinAppDriver / QTWebDriver: ~6h -> ~2h (Bad -> Good); PyWinAuto: ~1h (Good)
  • 50. Approximate complexity of auto-test support (per year). TestComplete Desktop: ~3h (Bad); Unified Functional Testing: ~3h (Bad); Ranorex: ~2h (Good); Telerik Test Studio: ~2h (Good); Zeenyx AscentialTest: ~3h (Bad); MS VS Coded UI: ~2h; CUIT: ~1h (Good); AUTOIT: ~4h (Bad); Sikuli: ~5h (Bad); Jubula: ~2h (Good); Robot Framework: ~1h (Good); Winium: ~2h (Good); WinAppDriver / QTWebDriver: ~10h -> ~1h (Bad -> Good); PyWinAuto: ~2h (Good)
  • 51. Approximate “entrance” level – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 52. Approximate “entrance” level. TestComplete Desktop: high; Unified Functional Testing: high; Ranorex: (not rated); Telerik Test Studio: (not rated); Zeenyx AscentialTest: (not rated); MS VS Coded UI / CUIT: high; AUTOIT: low; Sikuli: low; Jubula: (not rated); Robot Framework: high; Winium / WinAppDriver / QTWebDriver: high ->; PyWinAuto: (not rated)
  • 53. Required “technical skills” level – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 54. Required “technical skills” level. TestComplete Desktop: (not rated); Unified Functional Testing: (not rated); Ranorex: (not rated); Telerik Test Studio: (not rated); Zeenyx AscentialTest: low; MS VS Coded UI: high; CUIT: (not rated); AUTOIT: low; Sikuli: low; Jubula: low; Robot Framework: high; Winium / WinAppDriver / QTWebDriver: high ->; PyWinAuto: low
  • 55. Test readability – “the worst” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 56. Test readability. TestComplete Desktop: (not rated); Unified Functional Testing: (not rated); Ranorex: (not rated); Telerik Test Studio: (not rated); Zeenyx AscentialTest: high; MS VS Coded UI / CUIT: (not rated); AUTOIT: low; Sikuli: high; Jubula: high; Robot Framework: -> high; Winium / WinAppDriver / QTWebDriver: -> high; PyWinAuto: high
  • 57. How fast tests run – “the best” tool? 1. TestComplete Desktop 2. Unified Functional Testing (UFT) 3. Ranorex 4. Telerik Test Studio 5. Zeenyx AscentialTest 6. MS VS Coded UI 7. CUIT 8. AUTOIT 9. Sikuli 10. Jubula 11. Robot Framework 12. Winium 13. WinAppDriver 14. QTWebDriver 15. PyWinAuto
  • 58. How fast tests run. TestComplete Desktop: Bad; Unified Functional Testing: Bad; Ranorex: (not rated); Telerik Test Studio: (not rated); Zeenyx AscentialTest: (not rated); MS VS Coded UI / CUIT: Good; AUTOIT: Good; Sikuli: Bad; Jubula: Bad; Robot Framework: Good; Winium / WinAppDriver / QTWebDriver: Good; PyWinAuto: Good
  • 59. Ability to re-use the "Business-Logic" layer. TestComplete Desktop: no (Bad); Unified Functional Testing: no (Bad); Ranorex: yes (Good); Telerik Test Studio: yes (Good); Zeenyx AscentialTest: no (Bad); MS VS Coded UI / CUIT: yes (Good); AUTOIT: no (Bad); Sikuli: yes (Good); Jubula: no (Bad); Robot Framework: yes (Good); Winium / WinAppDriver / QTWebDriver: yes (Good); PyWinAuto: yes (Good)
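This criterion rewards tools whose tests can keep business flows separate from the UI driver, so the same flow can later run in another technical context. A rough sketch of that separation in Python; the class and control names are invented, and FakeDesktopUi stands in for an adapter around any of the tools above:

```python
# the business flow only talks to an abstract UI driver, so the same flow
# can be re-used over a desktop tool, a web driver, or even an API client
class OrderFlow:
    def __init__(self, ui):
        self.ui = ui                                   # needs type/click/read

    def place_order(self, item, qty):
        self.ui.type("item_field", item)
        self.ui.type("qty_field", str(qty))
        self.ui.click("submit_button")
        return self.ui.read("status_label")

class FakeDesktopUi:
    """Hypothetical adapter; a real one wraps the selected tool's API."""
    def __init__(self):
        self.state = {}
    def type(self, control, text):
        self.state[control] = text
    def click(self, control):
        self.state["status_label"] = "order placed"
    def read(self, control):
        return self.state.get(control, "")

print(OrderFlow(FakeDesktopUi()).place_order("book", 2))   # -> order placed
```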
  • 62. How to use scores 
  • 63. Stakeholders oriented score. TestComplete Desktop: -2; Unified Functional Testing: -2; Ranorex: +3; Telerik Test Studio: +3; Zeenyx AscentialTest: +1; MS VS Coded UI / CUIT: +1; AUTOIT: +1; Sikuli: +3; Jubula: +2; Robot Framework: +2; Winium / WinAppDriver / QTWebDriver: +2; PyWinAuto: +6
  • 64. Mixed score. TestComplete Desktop: +1; Unified Functional Testing: +1; Ranorex: +1; Telerik Test Studio: 0; Zeenyx AscentialTest: -1; MS VS Coded UI / CUIT: -1; AUTOIT: +1; Sikuli: +4; Jubula: +1; Robot Framework: +2; Winium / WinAppDriver / QTWebDriver: -2; PyWinAuto: -1
  • 65. Tech stuff oriented score. TestComplete Desktop: +2; Unified Functional Testing: 0; Ranorex: +4; Telerik Test Studio: +4; Zeenyx AscentialTest: -1; MS VS Coded UI / CUIT: +4; AUTOIT: -6; Sikuli: +4; Jubula: +1; Robot Framework: +4; Winium / WinAppDriver / QTWebDriver: +6; PyWinAuto: +5
  • 66. Summarized score. TestComplete Desktop: +1; Unified Functional Testing: -1; Ranorex: +8; Telerik Test Studio: +7; Zeenyx AscentialTest: -1; MS VS Coded UI / CUIT: +4; AUTOIT: -4; Sikuli: +11; Jubula: +4; Robot Framework: +8; Winium / WinAppDriver / QTWebDriver: +6; PyWinAuto: +10
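The scores above are reproducible by hand: judging from the tables, each "Good" mark appears to count as +1, each "Bad" as -1 and an empty cell as 0, and the summarized score is simply the sum of the three per-category scores. A small Python sketch of that arithmetic:

```python
# judging from the tables: "Good" = +1, "Bad" = -1, an empty cell = 0
def category_score(marks):
    return sum(+1 if m == "Good" else -1 if m == "Bad" else 0 for m in marks)

# e.g. AUTOIT's tech stuff oriented marks (slides 32-46): six "Bad" cells
print(category_score(["Bad", None, "Bad", "Bad", "Bad", "Bad", "Bad", None]))  # -6

# the summarized score is the sum of the three category scores,
# e.g. Sikuli: +3 (stakeholders) +4 (mixed) +4 (tech) = +11
print(+3 + 4 + 4)  # 11
```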
  • 67. How to define the proper tool based on selected criteria
  • 68. How to 1. link information from presentation to QA Automation metrics 2. link information from presentation to Project Health Check 3. link information from presentation to QA Automation ROI
  • 70. Tools overview structure 1. Pros 2. Cons 3. What kind of project / product / problem / situation a certain tool could be used for
  • 71. Example of tool usage structure 1. Plus several examples of each tool’s usage. Example structure: • Values: Value the individual; Act as a team; Strive for excellence; Focus on customer; Act with integrity • Prisms: Technology; Delivery; Leadership
  • 75. TestComplete Desktop 1. Pros • Low “entrance” level • High level of test-script flexibility • Huge knowledge base (comparable to MSDN) • Wide choice of scripting languages that resemble common languages 2. Cons • Very expensive license • Very specific own scripting languages
  • 76. Unified Functional Testing 1. Pros • Low “entrance” level • High level of test-script flexibility • Good tech support 2. Cons • Tight coupling with other HP solutions • Very specific own DSL
  • 77. Ranorex 1. Pros • Low “entrance” level • Scripted tests are written in common languages (C#, VB.NET) • Good tech support 2. Cons • Paid license
  • 78. Telerik Test Studio (Desktop) 1. Pros • Low “entrance” level • Great parameterization of keyword tests • DDT support using common formats (CSV, XLS, DB) • Converts tests to common languages (C#, VB.NET) 2. Cons • WPF applications only
  • 79. Zeenyx AscentialTest 1. Pros • Support for complex logic • Great organization of DDT • Support for standard .NET libraries 2. Cons • Takes time to learn • Its own specific DSL
  • 80. MS VS Coded UI 1. Pros • “Native” for Windows • Supports a huge set of UI technologies • Generated UI Map • Ready-to-go infrastructure • Good documentation and support 2. Cons • License cost • Relatively “low-level” API
  • 81. MS VS Coded UI + CUIT 1. Pros • The same as for MS VS Coded UI • Elegant “High level” API 2. Cons • The same as for MS VS Coded UI
  • 82. AutoIT 1. Pros • Easy • Universal • Free 2. Cons • No ready-to-use verification instruments • A test is a compiled .exe file • No ready-to-use reports
  • 83. Sikuli 1. Pros • IDE is easy to learn and use • Standard actions pack • Tests can be written in common languages (Java, Python) • Works on different platforms and with any application • Free 2. Cons • Low test reliability • Slow test execution • No ability to work with text • Tests are complicated to maintain
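To give a feel for the image-recognition approach (and why the slides flag slow execution and fragile tests), a minimal SikuliX-style Jython script run from the SikuliX IDE might look like this; the .png names are placeholders for screenshots captured in the IDE:

```python
# SikuliX IDE script (Jython): everything is located by screenshots, which
# is what makes the tool technology-agnostic but also slow and fragile;
# the .png names below are placeholders for images captured in the IDE
wait("app_main_window.png", 10)            # wait up to 10 s for the window
click("settings_button.png")               # click whatever matches the image
type("user_name_field.png", "demo_user")   # click the field, then type
if exists(Pattern("ok_button.png").similar(0.8)):
    click("ok_button.png")
```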
  • 84. Jubula 1. Pros • IDE is easy to use • Supports working from a requirements base • Integrated DB for storing test data and results • Free 2. Cons • Lacks the flexibility inherent in scripted tests • No CI support
  • 85. Robot Framework 1. Pros • Its own simple, easy-to-read keyword-based language • Plugins for different IDEs • Works on different OSes • Support for different programming languages • Tools for creating user-defined libraries • Free 2. Cons • High entrance level
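The "tools for creating user-defined libraries" pro deserves a quick illustration: any Python class can be dropped in as a Robot Framework keyword library, with its public methods becoming keywords. A minimal sketch; the library, keyword and test names are invented:

```python
# CalculatorLibrary.py - every public method becomes a Robot Framework
# keyword, usable from a .robot file such as:
#   *** Settings ***
#   Library    CalculatorLibrary.py
#   *** Test Cases ***
#   Addition Works
#       Open Calculator
#       Add Numbers    2    2
#       Result Should Be    4
class CalculatorLibrary:
    def open_calculator(self):
        self.result = 0                     # stand-in for starting the AUT

    def add_numbers(self, a, b):
        self.result = int(a) + int(b)       # Robot passes arguments as strings

    def result_should_be(self, expected):
        assert self.result == int(expected), \
            f"expected {expected}, got {self.result}"
```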
  • 86. Winium 1. Pros • Familiar syntax and API • Supports all the languages supported by Selenium WebDriver • Free 2. Cons • “Immature” testing tool • Limited ways of locating elements • Lack of documentation
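Because Winium re-uses the Selenium WebDriver protocol, a test is just a remote WebDriver session. A hedged sketch, assuming Winium.Desktop.Driver.exe is already running on its default port 9999 and an older Selenium 3.x Python client is installed:

```python
# assumes Winium.Desktop.Driver.exe is already running on its default
# port 9999 and that an older Selenium 3.x Python client is installed
from selenium import webdriver

driver = webdriver.Remote(
    command_executor="http://localhost:9999",
    desired_capabilities={"app": r"C:\Windows\System32\notepad.exe"},
)
driver.find_element_by_class_name("Edit").send_keys("Hello from Winium")
driver.quit()
```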
  • 87. WinAppDriver 1. Pros • Familiar syntax and API • “Native” for Windows • Free 2. Cons • “Immature” testing tool • Complicated in special-case usage • Lack of documentation
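WinAppDriver is driven the same way, typically through the Appium Python client. A sketch under the usual assumptions: WinAppDriver.exe listening on its default 127.0.0.1:4723 endpoint, Appium 1.x-style desired capabilities:

```python
# assumes WinAppDriver.exe is listening on its default 127.0.0.1:4723
# endpoint and that the Appium 1.x Python client is installed
from appium import webdriver

capabilities = {
    "app": r"C:\Windows\System32\notepad.exe",
    "platformName": "Windows",
    "deviceName": "WindowsPC",
}
driver = webdriver.Remote("http://127.0.0.1:4723", capabilities)
driver.find_element_by_class_name("Edit").send_keys("Hello from WinAppDriver")
driver.quit()
```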
  • 88. QTWebDriver 1. Pros • Familiar syntax and API • Qt-application oriented / “native” (a unique tool in this niche) • Free 2. Cons • “Immature” testing tool • Complicated in special-case usage • Lack of documentation
  • 89. PyWinAuto 1. Pros • Extremely simple to use • Easy to support • Free 2. Cons • Does not support all popular UI technologies • CPython only
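For comparison with the heavier tools, a PyWinAuto test is a few lines of plain CPython; the snippet below is close to the library's own getting-started example, and the window and control names depend on the Windows version and locale:

```python
# close to the library's own getting-started example; window and control
# names ("UntitledNotepad", "Edit") depend on Windows version and locale
from pywinauto.application import Application

app = Application().start("notepad.exe")
app.UntitledNotepad.menu_select("Help->About Notepad")
app.AboutNotepad.OK.click()
app.UntitledNotepad.Edit.type_keys("Hello from PyWinAuto", with_spaces=True)
```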
  • 90. Define a Trend! Is it possible ..?
  • 91. Trend 1. There is a potential leader for Desktop Automation
  • 92. Why so? “Scientific” technical proof of the Trend • Hegel’s dialectics • The mathematical apparatus of bifurcations (Bifurcation Theory) • Sedov’s law of hierarchical compensation • The Panov-Snooks Vertical • Big History
  • 93. Why so? Non-technical scientific proof of the Trend • Peter Drucker, “Management Challenges for the 21st Century” Note: it’s a topic for a whole separate conversation, and I’m sure we’ll get back to it, but not today…
  • 94. How to 1. use this presentation on different project phases 2. use this presentation based on main project roles
  • 95. What’s next (just a possible way) • Shu 1. Use the presentation 2. Follow the recommendations: a) “How to select proper criteria for your project” b) “How to define the proper tool based on selected criteria” c) “How to link information from the presentation to QA Automation metrics” d) “How to link information from the presentation to Project Health Check” e) “How to link information from the presentation to QA Automation ROI” f) “How to use this presentation on different project phases” g) “How to use this presentation based on main project roles”
  • 96. What’s next • Ha 1. Update the set of criteria 2. Update the set of tools 3. Update the presentation 4. Read the “scientific” proof of the Trend
  • 97. What’s next • Ri 1. Re-read the “scientific” proof of the Trend 2. Update the set of criteria 3. Update the set of tools 4. Update the presentation 5. Predict the “Trend” 6. Manage the “Trend”
  • 98. Next iteration  • Move from static (Presentation) to dynamic (Application) • For example, “https://telescope.epam.com”