2. ▪ FIWARE works in production environments
• the FIWARE platform must scale reliably under real workload conditions
• FIWARE GEs must deliver an adequate level of quality, reliability and performance
▪ Provide FIWARE users with high-quality support
• installation, configuration and usage of the FIWARE technology
• improve the FIWARE user experience
▪ A practical approach to improving quality and transparency
• a light, agile and highly operational methodology
Why to test FIWARE: the motivation
[Diagram: "Why to test FIWARE" drivers: Guarantee, Improvement, Standard, Satisfaction, Reliability, Business]
3. ▪ Check the completeness of the documentation, the specification, the implementation and the installation of the GEris
▪ Three sections to assess FIWARE GEs:
1. functional testing
2. non-functional testing
3. documentation testing
▪ How these assessments are highlighted:
• Quality Assurance labelling
Why to test FIWARE: the scope
• Functional Testing: verification of the GE specification, validating the APIs
• Non-Functional Testing: assessment of performance, stability and scalability, finding the limits of the components
• Documentation Testing: integrity of GE documentation, completeness and soundness
4. ▪ Analyze and check the results to:
• provide a label summarizing the quality level (A, B, C, D) of the GE
• send feedback through JIRA issues for each GE
Quality Assurance in FIWARE: the approach
[Workflow diagram: any GEi goes through Functional, Non-Functional and Documentation Testing; the results of each feed the Labelling]
5. The Labelling approach
▪ A quick, at-a-glance mechanism for checking the assessed GEs' quality
▪ Following the EU energy label system
▪ A sub-label for each tested aspect:
• Completeness, Usability, Tests Passed, Scalability, Performance, Stability
▪ All results are available in the catalogue
https://github.com/FIWARE/catalogue
6. FIWARE Testing: functional testing
▪ Functional and Non-Functional Testing
• improve the service (reliability, scalability, performance, stability)
▪ FIWARE Testing approach
• automate the test cases with a tool (JMeter) to
□ run functional tests under load and measure performance, and
□ provide test results
▪ What is Functional Testing?
• testing of the REST APIs (GET, POST, PUT, DELETE)
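The idea of driving each REST verb through a test case and counting passes and failures can be sketched in Python as below. This is an illustration only: the real FIWARE tests are JMeter scripts, and the resource paths and status codes here are hypothetical, not taken from any GE's actual API.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Tuple

@dataclass
class ApiCase:
    method: str          # REST verb: GET, POST, PUT or DELETE
    path: str            # resource path under test (illustrative)
    expected_status: int # HTTP status the API should return

def run_functional_tests(cases: Iterable[ApiCase],
                         send: Callable[[str, str], int]) -> Tuple[int, int]:
    """Run every case through `send(method, path) -> HTTP status` and
    return (tests_executed, tests_failed), the TE/TF base measures."""
    executed = failed = 0
    for case in cases:
        executed += 1
        if send(case.method, case.path) != case.expected_status:
            failed += 1
    return executed, failed

# Illustrative CRUD cases against a hypothetical entity resource
CASES = [
    ApiCase("POST",   "/entities",      201),
    ApiCase("GET",    "/entities/Room", 200),
    ApiCase("PUT",    "/entities/Room", 204),
    ApiCase("DELETE", "/entities/Room", 204),
]
```

Passing `send` as a parameter keeps the checker independent of the HTTP client, so the same loop can run against a live deployment or a stub.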
7. The methodology for Documentation Testing
▪ Documentation tests merge different sources of information:
• installation manuals (step-by-step, Docker)
• user and administrator manuals
• academy entries
• catalogue entries
[Workflow diagram: for a new GEi version, the catalogue entry, installation manuals, user/admin manuals and training courses are checked; results are collected, labels updated, and issues notified via Jira (Labelling)]
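The checklist above can be reduced to a tiny completeness measure: how many of the expected documentation sources exist for a GEi. This is a toy sketch; the source names and the equal weighting are illustrative, not the project's actual scoring formula.

```python
# Expected documentation sources for a GEi (from the slide's checklist)
DOC_SOURCES = ("installation manual", "user manual",
               "administrator manual", "academy entry", "catalogue entry")

def doc_completeness(available: set) -> float:
    """Fraction of the expected documentation sources that are present."""
    return sum(src in available for src in DOC_SOURCES) / len(DOC_SOURCES)
```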
8. The methodology for Functional Testing
▪ Functional tests are:
• developed as JMeter scripts
• described together with their test environment and versioned in GitHub
• stored together with their results
• used to compute part of the label
• reported to the GEi owner via Jira (if needed)
[Workflow diagram: API validation for a new GEi version: run/fix scripts or develop new scripts, collect results, update labels, notify issues via Jira (Labelling)]
9. ▪ Non-Functional tests are:
• developed as JMeter scripts
• run in a dedicated testing environment, isolated from noise and external interactions
• uploaded to GitHub and reported in a detailed analysis report (graphs, tables, etc.)
• used to label each GE version on three aspects: performance, stability and scalability
• reported to the GEi owner via Jira (if needed)
The methodology for Non-Functional Testing
[Workflow diagram: for a new GEi version, define the metrics to test (different for each type of GE), define or re-use test cases, develop test scripts, install the GE, set up the testing environment, run the tests, collect results, analyse and report, then update labels and notify issues via Jira (Labelling)]
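The "collect results, then analyse and report" steps above boil down to summarizing the samples a load run produces. A minimal sketch of the kind of figures such an analysis report could contain, assuming response times in milliseconds; the metric names and the nearest-rank percentile choice are illustrative, not the project's defined metrics:

```python
import statistics

def summarize_latencies(samples_ms):
    """Summarize response times (ms) from one load-test run: mean,
    nearest-rank 95th percentile and max, the kind of figures a
    performance/stability report would contain."""
    ordered = sorted(samples_ms)
    rank = max(1, round(0.95 * len(ordered)))  # nearest-rank method
    return {
        "mean_ms": statistics.fmean(ordered),
        "p95_ms": ordered[rank - 1],
        "max_ms": ordered[-1],
    }
```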
10. Functional Test: in detail
▪ All tests are committed in GitHub repository:
• https://github.com/FIWARE/test.Functional/tree/master/API.test
▪ Naming convention
• <CHAPTER_NAME>.<GE_NAME>/<GE_VERSION>
where
□ CHAPTER_NAME - apps, data, i2nd, iot, security
□ GE_NAME - ApplicationMashup, ContextBroker, Idm, etc.
□ GE_VERSION - e.g. 1.12.0, 1.8.0, 7.0.2, 5.4.3
11. Results for each GE version
▪ a results sub-folder containing CSV files
• <GE_name>-<GE_version>_<datetime>.csv
▪ <GE name>-<version>.jmx
▪ README.md (how to run the test)
▪ Any additional files (if necessary)
[Example results CSV: orion_context_broker-2.0.0_2018-10-17_094847.csv, reporting Tests Executed (TE) and Tests Failed (TF)]
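The results-file convention <GE_name>-<GE_version>_<datetime>.csv can likewise be sketched as a helper; the timestamp layout below is inferred from the Orion example filename on the slide.

```python
from datetime import datetime

def results_filename(ge_name: str, ge_version: str, when: datetime) -> str:
    """Name a results CSV as <GE_name>-<GE_version>_<datetime>.csv,
    with the datetime formatted as YYYY-MM-DD_HHMMSS."""
    return f"{ge_name}-{ge_version}_{when:%Y-%m-%d_%H%M%S}.csv"
```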
12. JIRA Ticketing: add a WorkItem
▪ Prefix: FIWARE.WorkItem.QA
▪ Fields: Chapter, GE name, GE version
▪ Postfix: functional
13. JIRA Ticketing: add a Bug
▪ Prefix: FIWARE.Bug
▪ Fields: Chapter, GE name
▪ Postfix: short bug detail
▪ GE owner
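The ticket-naming schemes on the two slides above could be sketched as follows. The dot separators and the exact joining of fields are assumptions made for illustration; the slides give only the field order, not the literal format.

```python
def workitem_summary(chapter: str, ge_name: str, ge_version: str) -> str:
    # prefix FIWARE.WorkItem.QA, then chapter, GE name, GE version,
    # postfix "functional"; the dot separators are an assumption
    return ".".join(["FIWARE.WorkItem.QA", chapter, ge_name,
                     ge_version, "functional"])

def bug_summary(chapter: str, ge_name: str, short_detail: str) -> str:
    # prefix FIWARE.Bug, then chapter and GE name, with the short bug
    # detail as postfix; the separators are an assumption
    return ".".join(["FIWARE.Bug", chapter, ge_name]) + ": " + short_detail
```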
14. Functional Test: The Labelling
▪ Labelling process evaluation
Base measures: total number of Tests Failed (TF) and total number of Test cases Executed (TE); formula: TF/TE

Label   TF/TE
A+++    < 0.10
A++     0.10 - 0.25
A+      0.26 - 0.40
A       0.41 - 0.55
B       0.56 - 0.70
C       0.71 - 0.85
D       > 0.85
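The label table maps directly onto a small function. Treating each upper bound as inclusive is an interpretation of the printed intervals (e.g. "0.26 - 0.40" for A+); the thresholds themselves are the slide's.

```python
def quality_label(tests_failed: int, tests_executed: int) -> str:
    """Map the failure ratio TF/TE to a quality label (A+++ down to D)."""
    ratio = tests_failed / tests_executed
    if ratio < 0.10:
        return "A+++"
    for upper, label in [(0.25, "A++"), (0.40, "A+"), (0.55, "A"),
                         (0.70, "B"), (0.85, "C")]:
        if ratio <= upper:
            return label
    return "D"
```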
15. Functional Test Results
▪ Results in FI-NEXT (2 years)
Functional Testing in numbers:
• Total number of GEs tested: 26
• Total scripts executed: 58
• Total bugs detected: 119

Labelling distribution: A+++: 50 (86%), A++: 7 (12%), A+: 0, A: 1 (2%), B: 0, C: 0, D: 0
16. The Future
▪ Building on the continuation of the activities presented above…
▪ The FIWARE Foundation is taking care of QA activities:
• extending and systematizing the tests to all GEs, existing and incubated
• automating the assessment as much as possible:
□ label assignment and updating
□ launching verification and performance tests for every new version of a GE in the Catalogue
□ requiring incubated GEs to pass the tests before becoming part of the Catalogue