CODE QUALITY IMPROVEMENTS
ZOLTAN ISZLAI, CLIENT ENGAGEMENT PLATFORM,
NOVEMBER 2019
AGENDA
 Introduction
 Challenges
 Approach
 The why
 QA tools used
 Maintaining quality
 SonarQube results
 Lessons learned
 Further reading
INTRODUCTION
DESCRIPTION
The Client Engagement Platform will be used for automatic calculation of credit information, used for further analysis by wholesale banking specialists.
CHALLENGES
 Improve the project's quality
 Fix current critical issues
 Add new features for the MVP and later for the PROD release
 Get to Production from an unstable Pilot III phase
CHALLENGES
ENGINE
The evaluation of deals is done using a risk and commercial engine based on over 50 inputs
CONTINUOUS INTEGRATION
Integrate with Azure DevOps
Automate testing
Make tests visible to the business
FEATURES
Deal Flow
Authentication and authorization
Notification and audit system
COMPLY WITH POLICIES
Audit policy
Security audit
Credit and Risk policy
APPROACH
INCREASING ENGINE STABILITY
Moving away from manual testing (1000+ tests) to automated testing by:
 Creating functional tests within a test automation framework that simulates the UI, using Selenium and Cucumber (a minimal sketch follows this slide)
 Creating unit tests to increase engine coverage above 80%
INCREASE QUALITY
Using QA automation tools and continuous integration to:
 Automate bug detection and measure the impact of changes
 Automate code quality checking (with Azure DevOps)
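To make the first bullet concrete, here is a minimal sketch of a Cucumber step definition that drives the UI through Selenium. It assumes the cucumber-java and selenium-java libraries; the URL, element ids, and class names are hypothetical, not taken from the actual project.

import static org.junit.Assert.assertTrue;

import io.cucumber.java.After;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Hypothetical step definitions for a "deal flow" scenario
public class DealFlowSteps {

    private final WebDriver driver = new ChromeDriver();

    @Given("a specialist opens the deal overview")
    public void openDealOverview() {
        driver.get("https://cep.example/deals"); // hypothetical URL
    }

    @When("the specialist submits a new deal")
    public void submitNewDeal() {
        driver.findElement(By.id("new-deal-button")).click(); // hypothetical element id
    }

    @Then("the deal appears in the deal flow")
    public void dealIsListed() {
        assertTrue(driver.findElement(By.id("deal-list")).getText().contains("New deal"));
    }

    @After
    public void closeBrowser() {
        driver.quit(); // shut the browser down after each scenario
    }
}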
THE WHY #1
THE COST OF BUGS
With early detection, PROD issues can be prevented.
 Automated local code verification can be run
 Automated functional UI tests can be run
 Reputation loss can be avoided
THE WHY #2
LENGTH OF THE FEEDBACK CYCLE
By adding continuous integration to our SDLC, we can contain the rising cost of finding bugs late.
 Using TDD, we also prevent issues (a small sketch follows this slide)
 Design defects can be prevented early
 With active stakeholder participation, early requirements defects can be avoided
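As a small illustration of the TDD point, a sketch assuming JUnit 4; the calculator, its 2% margin, and the test values are hypothetical, not taken from the real engine.

import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical example: the test was written first and drives the implementation
public class MarginCalculatorTest {

    // Minimal implementation, added only to make the failing test pass
    static double marginFor(double exposure) {
        return exposure * 0.02;
    }

    @Test
    public void appliesTwoPercentMarginToExposure() {
        assertEquals(20.0, marginFor(1000.0), 1e-9); // red first, then green
    }
}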
QA TOOLS
CheckStyle
Enforces a coding standard
Improves code readability
Checks for JavaDoc issues
PMD
Static code analyzer
Finds common programming flaws
Includes CPD, a copy-paste detector, to find duplicate code
SpotBugs
Finds bugs in Java programs
Successor of FindBugs
SonarQube
Continuous code inspection
Application security
Technical debt
CHECKSTYLE
CHALLENGES
The code style didn't conform to any standard
Not all JavaDoc comments were correct
Cons: configuring the rules took over an hour
SOLUTIONS
 Checkstyle runs on every build to validate the code
 We used IntelliJ IDEA's default code style (autocorrection worked)
 JavaDoc comments were also validated
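For illustration, a hypothetical class that common Checkstyle rules would flag, assuming the ConstantName and JavadocMethod checks are enabled:

// Hypothetical class that common Checkstyle rules would flag
public class RateCalculator {

    // ConstantName check: constants are expected in UPPER_SNAKE_CASE
    static final double default_margin = 0.015;

    // JavadocMethod check: this public method is missing a Javadoc comment
    public double applyMargin(double rate) {
        return rate + default_margin;
    }
}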
PMD & CPD
CHALLENGES
PMD is very tough to configure, but it runs very fast
Cons: Configuring the rules took many hours, some where ignored
Some rules were biased and not compatible with each other
 The report was very large
 The 80/20 rule was applied for prioritization of issues
 The tool did find bugs and we where able to fix them before PROD
SOLUTIONS
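As an example of the kind of flaw PMD reports, a hypothetical instance of its EmptyCatchBlock rule, shown here already fixed:

import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical parser showing PMD's EmptyCatchBlock rule in action
public class AmountParser {

    private static final Logger LOG = Logger.getLogger(AmountParser.class.getName());

    public int parse(String raw) {
        try {
            return Integer.parseInt(raw);
        } catch (NumberFormatException e) {
            // Leaving this block empty would be flagged by PMD; log instead
            LOG.log(Level.WARNING, "Not a valid amount: " + raw, e);
            return 0;
        }
    }
}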
SPOTBUGS
CHALLENGES
SpotBugs was the most accurate of all the tools
Configuration was simple and straightforward
SOLUTIONS
 The tool found dead code, bad programming practices, and NPEs
 The tool did find bugs, and we were able to fix them before PROD
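For illustration, a hypothetical null-pointer defect of the kind SpotBugs reports (bug pattern NP_NULL_ON_SOME_PATHS); the lookup logic is invented:

// Hypothetical lookup showing the NP_NULL_ON_SOME_PATHS bug pattern
public class RiskRatingLookup {

    public String ratingFor(String clientId) {
        String rating = null;
        if (clientId != null && !clientId.isEmpty()) {
            rating = lookup(clientId);
        }
        // SpotBugs: "rating" stays null when clientId is null or empty,
        // so this dereference can throw a NullPointerException
        return rating.toUpperCase();
    }

    private String lookup(String clientId) {
        return "BBB"; // stand-in for the real data source
    }
}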
SONARQUBE
CHALLENGES
SonarQube was the most complete QA tool
Cons: setup and connection to Azure DevOps was time-consuming
The rules defined by the client were bank-specific
SOLUTIONS
 The tool categorizes issues as bugs, code smells, and vulnerabilities
 It offers multiple filters, and priority levels are assigned automatically
 It also measures new code additions automatically
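As one example, a hypothetical finding that SonarQube's standard Java rules flag as security-sensitive (hard-coded credentials):

// Hypothetical client showing a finding SonarQube marks as security-sensitive
public class AuditServiceClient {

    // Flagged: hard-coded credentials (standard SonarQube Java rule)
    private static final String PASSWORD = "s3cret!";

    // Safer variant: resolve the secret from the environment at runtime
    private final String password = System.getenv("AUDIT_SERVICE_PASSWORD");
}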
SONARQUBE RESULTS
LESSONS LEARNED
GOLDEN HAMMER
If all you have is a hammer, everything looks like a nail. Not every flagged rule is a real flaw.
DESIGN FLAWS
Check for architecture flaws: cyclomatic and cognitive complexity (a short refactoring sketch follows this slide).
SILVER BULLET
There is no one-size-fits-all solution for every project, budget, and timeline.
CONSISTENCY
Use the Pareto principle (the 80/20 rule) to determine what to test. Create unit tests for new code.
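To illustrate the complexity point, a small hypothetical refactoring that lowers cyclomatic and cognitive complexity by replacing nested conditionals with guard clauses:

// Hypothetical check, before and after the refactoring
public class LimitCheck {

    // Before: every nesting level adds a branch to reason about
    boolean withinLimitNested(double amount, double limit, boolean approved) {
        if (approved) {
            if (amount > 0) {
                if (amount <= limit) {
                    return true;
                }
            }
        }
        return false;
    }

    // After: guard clauses keep each path flat and easy to read
    boolean withinLimit(double amount, double limit, boolean approved) {
        if (!approved || amount <= 0) {
            return false;
        }
        return amount <= limit;
    }
}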
FURTHER READING
PITEST
Real-world mutation testing: runs unit tests against modified versions of the application code (a minimal sketch follows this slide)
COST OF BUGS
Capers Jones: Applied Software Measurement: Assuring Productivity and Quality
FEEDBACK CYCLE
Scott Ambler: Why Agile Software Development Techniques Work: Improved Feedback
PARETO PRINCIPLE
Vilfredo Pareto: 80% of the effects come from 20% of the causes
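As a taste of PIT, a minimal hypothetical sketch of how a mutant is killed, assuming JUnit 4; PIT's conditionals-boundary mutator would turn the >= below into >, and the boundary test fails on that mutant:

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

// Hypothetical eligibility rule and the test that kills the boundary mutant
public class EligibilityTest {

    static boolean eligible(int score) {
        return score >= 600; // PIT's conditionals-boundary mutator targets this
    }

    @Test
    public void boundaryScoreStaysEligible() {
        assertTrue(eligible(600)); // fails on the mutated code, killing the mutant
        assertFalse(eligible(599));
    }
}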
THANK YOU!
