8 years in IT 
Solutions Architect 
Automation Expert (Consultant, Trainer) 
PhD in IT (automation testing) 
orem@softserveinc.com 
WHO IS IT?
WHAT IS SOFTWARE ARCHITECTURE?
IT STARTS WITH ONE GEEKY GUY…
THEN IT MIGHT END UP LIKE THIS…
SO, WHAT IS SOFTWARE ARCHITECTURE?
WHAT FORMS SOFTWARE ARCHITECTURE?
BUSINESS
USERS
SYSTEM
WHAT CAN BE TESTED HERE?
THIS???
SEI 
THEY LIKE NON-FUNCTIONAL REQUIREMENTS
THEY CALL THEM QUALITY ATTRIBUTES 
Category            Quality attributes
Design Qualities    Conceptual Integrity, Maintainability, Reusability
Run-time Qualities  Availability, Interoperability, Manageability, Performance, Reliability, Scalability, Security
System Qualities    Supportability, Testability
User Qualities      Usability
WHAT IS A QUALITY ATTRIBUTE? 
ABILITY
WHAT QUALITY ATTRIBUTES DO YOU SEE?
QUALITY ATTRIBUTE WORKSHOP
S.M.A.R.T. 
Specific (why/what/how) 
Measurable 
Achievable 
Result-focused 
Time-bound
ARCHITECTURE SCENARIOS – SAMPLE UTILITY TREE
ACCEPTANCE: A SOFTWARE ERROR OCCURS AT HIGH VEHICLE SPEED. REBOOT WITHIN 50 MSEC. 
Source: System 
Stimulus: Software error 
Artifact: System 
Environment: High vehicle speed 
Response: Reboot occurs 
Response Measure: Time to reboot up to 50 msec.
EXPLAINING ACCEPTANCE: GHERKIN 
Given (Environment): High vehicle speed 
Artifact: System 
As (Source): System 
When (Stimulus): Software error 
Then (Response): Reboot occurs 
Response Measure: Time to reboot up to 50 msec.
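The mapping above can be written down as a Gherkin feature. A minimal sketch; the exact step wording and feature name are illustrative, not taken from the deck:

```gherkin
# Hypothetical feature file derived from the reboot acceptance scenario.
Feature: System availability at high vehicle speed

  Scenario: Reboot after a software error
    Given the vehicle is moving at high speed
    When a software error occurs in the system
    Then the system reboots
    And the reboot completes within 50 milliseconds
```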
IMPLEMENTATION 
Gherkin stubs 
Best practices 
Base class reuse 
Implementation monitoring 
…
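The generated Gherkin stubs eventually need step implementations. A minimal sketch in Python, with a fake system driver standing in for the real system under test; the class and method names here are assumptions for illustration, not from the deck:

```python
import time

class FakeSystem:
    """Stand-in driver for the system under test; a real suite would talk to the SUT."""

    def __init__(self):
        self.speed_kmh = 0
        self.running = True
        self.last_reboot_ms = None

    def inject_software_error(self):
        """Simulate the stimulus: the error crashes the system, which then reboots."""
        self.running = False
        start = time.perf_counter()
        self.running = True  # real reboot logic would run here
        self.last_reboot_ms = (time.perf_counter() - start) * 1000.0

def test_reboot_within_50_msec():
    sut = FakeSystem()
    sut.speed_kmh = 160                # Given: high vehicle speed
    sut.inject_software_error()        # When: software error
    assert sut.running                 # Then: reboot occurs
    assert sut.last_reboot_ms <= 50.0  # Response measure: up to 50 msec
```

Keeping the response measure as an assertion makes the scenario a pass/fail acceptance check rather than a manual observation.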
TRACEABILITY: ACCEPTANCE TREE 
1. Implementation is directly linked to a quality attribute 
2. The quality attribute is linked to a business driver 
3. The priority of a scenario may be the starting point for its automation
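These traceability links can be kept as plain data so scenario priorities drive the automation order. A sketch; the attribute names, drivers, and (business importance, technical risk) priority scheme are illustrative assumptions in the spirit of a utility tree:

```python
# Hypothetical traceability records: test -> quality attribute -> business driver.
# Priority is a (business importance, technical risk) pair, as in a utility tree.
TRACE = {
    "test_reboot_within_50_msec": {
        "quality_attribute": "Availability",
        "business_driver": "Safe operation at high vehicle speed",
        "priority": ("H", "H"),
    },
    "test_config_reload": {
        "quality_attribute": "Manageability",
        "business_driver": "Low operations cost",
        "priority": ("M", "L"),
    },
}

def automation_order(trace):
    """Return test names sorted so the highest-priority scenarios are automated first."""
    rank = {"H": 0, "M": 1, "L": 2}
    return sorted(trace, key=lambda name: tuple(rank[p] for p in trace[name]["priority"]))
```

Here `automation_order(TRACE)` puts the high-priority reboot scenario first, which is the "starting point for automation" the slide refers to.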
REAL USE CASE 
Project specifics: disk-level data encryption system 
Goals: 
• Form acceptance criteria for the system 
• Support a large number of operating systems 
• Test in the cloud 
Test definitions: none 
Technology: Amazon + LAMP 
Resources: one senior part-time and one junior full-time developer
HOW IT MIGHT LOOK IN IDE
DIRECT JUMP TO CODE
GETTING ANALYSIS DATA TOGETHER 
Work with stakeholders 
  QAW-based: it's better to have everybody in one location, at least for a short period 
  Classic analysis: you can hold several separate sessions with technical representatives only 
Documentation 
  QAW-based: should be part of the stakeholders' knowledge only 
  Classic analysis: might be outdated 
Domain knowledge 
  QAW-based: deep knowledge not required, participants are the knowledge holders 
  Classic analysis: required 
Rough estimate 
  QAW-based: 2-3 days 
  Classic analysis: 3-5 days
RESULT 
Acceptance test suite definition (~30 tests) 
Mini-framework implementation 
Jenkins integration 
The project release was DECLINED because the automation tests did NOT pass
RECAP: PROCESS DEFINITION 
Architecture 
• Define business goals 
• Gather quality attributes 
• Generate scenarios 
Form acceptance 
• Break scenarios into steps 
• Define steps in QC language 
Test solution 
• Generate test stubs 
• Prioritize 
• Build timeline 
• Implement 
Trace and update
QUESTIONS?

Quality attributes testing. From Architecture to test acceptance
