Empirical Studies of Performance Bugs &
Performance Analysis Approaches for Large
Scale Software Systems
Shahed Zaman
Supervisor: Dr. Ahmed E. Hassan
Software Analysis and Intelligence Lab (SAIL)
School of Computing
Queen’s University
Performance
How fast and efficiently a system can
perform
Performance Bug
Any bug related to performance
Problem / Improvement expectation
2
Bugs have a high impact on
companies
482 bugs/week
Costly
Affects reputation
3
Software Performance
• An important non-functional characteristic in a competitive market
• Considered a high priority for testing in practice
4
Most research treats all bugs equally.
Does this make sense?
5
Performance / Security / Other bugs
6
Research Hypothesis
Performance bugs have different
characteristics than other bugs and should be
treated differently in software maintenance
research and practice.
7
Quantitative Study
Qualitative Study
User-Centric Performance Analysis Study
8
Quantitative Study
Qualitative Study
User-Centric Performance Analysis Study
9
Performance, Security, and Other bugs
Firefox: 295,198 bugs in total (7,603 performance bugs, 847 security bugs)
Chrome®: 44,997 bugs in total (510 performance bugs, 327 security bugs)
10
Quantitative Study Dimensions
• Fix Time
• Code Changes
• People
11
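The editor's notes for slides #12 and #13 spell out the metrics behind these dimensions: triage time is complemented by the number of tossings, fix time by the number of reopenings, and developer experience is measured by previously fixed bugs and by experience in days. The sketch below shows how such metrics might be derived from bug-tracker records; the Bug fields and function names are hypothetical illustrations, not the thesis' actual tooling.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

# Hypothetical bug-tracker record; field names are illustrative only.
@dataclass
class Bug:
    bug_id: int
    reported: datetime   # when the bug was filed
    assigned: datetime   # when it was triaged to its final owner
    fixed: datetime      # when the fix landed
    tossings: int        # times the bug was reassigned before fixing
    reopenings: int      # times the bug was reopened after a fix
    fixer: str           # developer who committed the final fix

def triage_time_days(bug: Bug) -> float:
    """Triage time: reporting until assignment (complemented by the # of tossings)."""
    return (bug.assigned - bug.reported).total_seconds() / 86400.0

def fix_time_days(bug: Bug) -> float:
    """Fix time: assignment until the final fix (complemented by the # of reopenings)."""
    return (bug.fixed - bug.assigned).total_seconds() / 86400.0

def developer_experience(all_bugs: List[Bug], bug: Bug) -> dict:
    """The two experience metrics from the notes: previously fixed bugs and experience in days."""
    earlier = [b for b in all_bugs if b.fixer == bug.fixer and b.fixed < bug.fixed]
    first_fix = min((b.fixed for b in earlier), default=bug.fixed)
    return {
        "previously_fixed_bugs": len(earlier),
        "experience_in_days": (bug.fixed - first_fix).days,
    }
```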
Performance bugs are fixed by more experienced developers
Performance bugs take the longest time to fix
12
Quantitative Study Findings
• Performance bugs show different
characteristics.
• Findings are not consistent across
projects.
13
Quantitative Study
Qualitative Study
User-centric performance analysis Study
14
Qualitative Study
100 performance bugs + 100 non-performance bugs = 200 bugs per project
Two projects: 400 sampled bug reports in total
15
Dimensions Used in this study
• Impact on the stakeholders
• Available context of the bug
• The bug fix
• Fix validation
16
Measurements Used
• Regression
• Blocking
• WFM (Works For Me) after a long time
• People talking about switching
• Has test cases
• Contains stacktrace
• Has reproducible info
• Problem in reproducing
• Reported by a project member
• Duplicate bug
• Problem discussion in comments
• Depends on other bugs
• Blocking other bugs
• Reporter provides some hint about the fix
• Patch uploaded by the reporter
• Discussion about the patch
• Review
• Super-review
Findings
(a mark next to a measurement denotes a statistically significant difference between perf. and non-perf. bugs)
• Performance bugs are different
• Findings are not consistent across projects
17
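Each per-measurement comparison behind the "statistically significant difference" legend implies a hypothesis test over the two samples of bugs. The slides do not name the tests, so the sketch below assumes a Mann-Whitney U test for a numeric metric (e.g., fix time) and Fisher's exact test for a categorical measurement (e.g., "has test cases"); all values and names are illustrative.

```python
# Hedged sketch: comparing performance vs. non-performance bug samples.
# The choice of tests is an assumption, not the slides' stated method.
from scipy.stats import mannwhitneyu, fisher_exact

# Numeric metric, e.g., fix time in days for each sampled bug (illustrative values).
perf_fix_days = [12.0, 40.5, 7.2, 90.1, 33.3]
nonperf_fix_days = [3.1, 8.4, 15.0, 2.2, 6.7]
stat, p = mannwhitneyu(perf_fix_days, nonperf_fix_days, alternative="two-sided")
print(f"fix time: U={stat:.1f}, p={p:.3f}")  # flag a difference if p < 0.05

# Categorical measurement, e.g., "has test cases":
# [[perf yes, perf no], [non-perf yes, non-perf no]] (illustrative counts).
table = [[18, 82], [42, 58]]
odds, p = fisher_exact(table)
print(f"has test cases: odds ratio={odds:.2f}, p={p:.3f}")
```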
Findings for Firefox performance bugs
Quantitative Study: Require more time to fix
Qualitative Study:
  1. Problem in reproducing
  2. More dependencies between bugs
  3. Collaborative root-cause analysis process
  4. WFM/Fixed/Won'tFix after a long time

Quantitative Study: Fixed by more experienced developers
Qualitative Study:
  1. More release blocking
  2. People switch to other software systems
18
Findings
• Impact on the stakeholders: performance bugs have a high impact on stakeholders
• Available context of the bug: performance bug reports contain more context about the bug
• The bug fix: performance bug fixing requires more collaborative effort
19
Quantitative Study
Qualitative Study
User-Centric Performance Analysis Study
20
Users threaten to switch to
a competing product
21
User-Centric vs. Scenario-Centric Performance Analysis
22
Problem with current practice
[Diagram: users each send 10 requests to the software system; of the 1,000 requests in total, 10 requests have a bad response time]
23
Scenario-Centric View
[Diagram: 10 of the 1,000 requests have a bad response time, i.e., 1% bad request instances for the scenario]
24
User-Centric View
[Diagram: the same 10 bad requests out of 1,000 are 1% bad request instances from the system's perspective, but from the users' perspective individual users see 0% or 50% bad request instances]
25
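To make the two views concrete, here is a minimal sketch (threshold, field names, and data are illustrative assumptions, not the thesis' implementation) that computes the percentage of bad request instances once over the whole scenario and once per user, echoing the slide's point that a low system-wide percentage can hide users who personally experience a much higher one.

```python
from collections import defaultdict

BAD_THRESHOLD_MS = 500  # illustrative response-time threshold for a "bad" request

# Each log entry: (user_id, response_time_ms); values are made up for the example.
requests = [("u1", 80), ("u1", 90), ("u2", 700), ("u2", 95), ("u2", 650), ("u3", 120)]

def is_bad(rt_ms):
    return rt_ms > BAD_THRESHOLD_MS

# Scenario-centric view: one percentage over all request instances.
scenario_pct = 100.0 * sum(is_bad(rt) for _, rt in requests) / len(requests)

# User-centric view: a percentage per user, then summarized over users.
per_user = defaultdict(list)
for user, rt in requests:
    per_user[user].append(rt)
user_pct = {u: 100.0 * sum(map(is_bad, rts)) / len(rts) for u, rts in per_user.items()}

print(f"scenario-centric: {scenario_pct:.1f}% bad instances")
print("user-centric:", user_pct)  # some users at 0%, others far above the scenario average
```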
Data used in this study
• 3 systems
• 13 use-case scenarios
Factor                  | Enterprise System 1 | Enterprise System 2 | Dell DVD store
Functionality           | Telecommunications  | Telecommunications  | E-commerce
Vendor's Business Model | Commercial          | Commercial          | Open-source
Size                    | Ultra Large         | Large               | Small
Complexity              | Complex             | Complex             | Simple
26
Our Study Dimensions
• Overall Performance
• Performance Trend
• Performance Consistency
(scenario-centric vs. user-centric view for each dimension)
27
Performance Trend Over Time
[Plots: Scenario-Centric View (response time vs. running time) and User-Centric View (response time vs. instance # for a user), each comparing the old and new versions]
28
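The trend comparison behind these plots can be approximated by fitting response time against running time for the scenario-centric view and against the per-user instance number for the user-centric view. The sketch below assumes SciPy's linregress and made-up samples; it illustrates the idea rather than reproducing the thesis' analysis.

```python
# Hedged sketch: response-time trends, scenario-centric and user-centric.
from scipy.stats import linregress

# Made-up samples: (running_time_min, response_time_ms) for the whole scenario.
scenario = [(0, 35), (15, 38), (30, 41), (45, 43), (60, 46)]
slope, intercept, r, p, stderr = linregress([t for t, _ in scenario],
                                            [rt for _, rt in scenario])
print(f"scenario-centric trend: {slope:.2f} ms per minute of running time")

# Made-up samples for one user: (instance_number, response_time_ms).
one_user = [(1, 60), (2, 72), (3, 85), (4, 99), (5, 118)]
slope_u, *_ = linregress([i for i, _ in one_user], [rt for _, rt in one_user])
print(f"user-centric trend for this user: {slope_u:.2f} ms per request instance")
```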
Findings
• Overall Performance: 10 out of 13 use-case scenarios showed a different, complementary view
• Performance Trend: 8 out of 13 use-case scenarios showed a different, complementary view
• Performance Consistency: all 13 use-case scenarios showed a different, complementary view
29
User-Centric Performance Analysis Study Finding
The user-centric view is a useful, complementary view.
30
Major Contributions
• The first empirical study of performance bugs
• Developed a taxonomy for the qualitative
analysis of performance bug reports
• Proposed a new approach to analyze the
performance of software systems
31
32
Publications: MSR 2011, MSR 2012, ICST 2012
Editor's Notes

• #3 "Fast" can be measured by processing time or response time; efficiency means how efficiently the resources are utilized.
• #12 Describe: triage time (the # of tossings is used to evaluate triage time) and fix time (the # of reopenings is used to evaluate fix time).
• #13 We used two metrics for developer experience: 1. the number of bugs previously fixed by the developer; 2. experience in days, i.e., the number of days from the developer's first bug fix to the current bug's fix date.
• #24, #25, #26 The test is done to ensure that an optimal configuration is chosen and that no change to hardware or software in the system has degraded the system's performance.
• #29 Mean = 44.82 vs. 42.25; median = 46 vs. 32; % of bad = 7.19 vs. 0.37.