Pragmatic Design Quality Assessment - (Tutorial at ICSE 2008)
 


This set of slides was used for the tutorial given by Tudor Girba, Michele Lanza and Radu Marinescu at the International Conference on Software Engineering (ICSE) 2008.

    Presentation Transcript

    • Pragmatic Design Quality Assessment Tudor Gîrba University of Bern, Switzerland Michele Lanza University of Lugano, Switzerland Radu Marinescu Politehnica University of Timisoara, Romania
    • 1946
    • 1951
    • 1951 2008
    • ? 1951 2008
    • Software is complex. 29% Succeeded 18% Failed 53% Challenged The Standish Group, 2004
    • How large is your project?
    • How large is your project? 1’000’000 lines of code
    • How large is your project? 1’000’000 lines of code * 2 = 2’000’000 seconds
    • How large is your project? 1’000’000 lines of code * 2 = 2’000’000 seconds / 3600 = 560 hours
    • How large is your project? 1’000’000 lines of code * 2 = 2’000’000 seconds / 3600 = 560 hours / 8 = 70 days
    • How large is your project? 1’000’000 lines of code * 2 = 2’000’000 seconds / 3600 = 560 hours / 8 = 70 days / 20 = 3 months
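The slide's back-of-envelope arithmetic can be checked in a few lines. The constants (2 seconds per line, 8-hour days, 20 working days per month) are the slide's assumptions, and `reading_time_months` is a hypothetical helper name:

```python
# Back-of-envelope estimate of how long it takes just to READ a system,
# using the slide's assumptions: ~2 s per line, 8 h/day, 20 days/month.

def reading_time_months(lines_of_code, secs_per_line=2,
                        hours_per_day=8, days_per_month=20):
    seconds = lines_of_code * secs_per_line   # 1'000'000 LOC -> 2'000'000 s
    hours = seconds / 3600                    # -> ~560 hours
    days = hours / hours_per_day              # -> ~70 days
    return days / days_per_month              # -> ~3.5 months

print(round(reading_time_months(1_000_000), 1))  # ~3.5 months of non-stop reading
```

The slide rounds the last step down to "3 months"; the exact quotient is about 3.5.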
    • But, code is for the computer. Why would we ever read it?
    • [diagram] forward engineering
    • [diagram] forward engineering → actual development
    • What is the current state? What should we do? Where to start? How to proceed? (forward engineering → actual development)
    • [diagram] forward engineering and reverse engineering around actual development
    • Reverse engineering is analyzing a subject system to: identify components and their relationships, and create more abstract representations. Chikofsky & Cross, 90
    • A large system contains lots of details.
    • A large system contains lots of details. How to judge its quality?
    • http://moose.unibe.ch http://loose.upt.ro/incode
    • 1 Software in numbers, 2 Software in pictures, 3 Software in time, 4 Software in tools
    • Software in numbers 1
    • You cannot control what you cannot measure. Tom DeMarco
    • Metrics are functions that assign numbers to products, processes and resources.
    • Software metrics are measurements which relate to software systems, processes or related documents.
    • Metrics compress system traits into numbers.
    • Let’s see some examples...
    • Examples of size metrics NOM - number of methods NOA - number of attributes LOC - number of lines of code NOS - number of statements NOC - number of children Lorenz, Kidd, 1994 Chidamber, Kemerer, 1994
    • McCabe cyclomatic complexity (CYCLO) counts the number of independent paths through the code of a function. McCabe, 1977. + It reveals the minimum number of tests to write. - Interpretation can't directly lead to improvement action.
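The definition above (independent paths = decision points + 1) can be sketched over Python's own syntax tree. This is a simplified illustration, not a full CYCLO implementation; real tools also count boolean operators, case labels, and so on:

```python
import ast

def cyclo(src):
    """Simplified McCabe cyclomatic complexity of a code snippet:
    1 plus the number of decision points (if/for/while/except)."""
    tree = ast.parse(src)
    decisions = sum(isinstance(n, (ast.If, ast.For, ast.While,
                                   ast.ExceptHandler))
                    for n in ast.walk(tree))
    return 1 + decisions

src = """
def classify(x):
    if x < 0:
        return 'negative'
    for d in range(2, x):
        if x % d == 0:
            return 'composite'
    return 'other'
"""
print(cyclo(src))  # 4: one base path + if, for, nested if
```

As the slide notes, the number itself suggests how many tests to write, but not what to change.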
    • Weighted Method Count (WMC) sums up the complexity of a class's methods (measured by the metric of your choice; usually CYCLO). Chidamber, Kemerer, 1994. + It is configurable, thus adaptable to our precise needs. - Interpretation can't directly lead to improvement action.
    • Depth of Inheritance Tree (DIT) is the (maximum) depth level of a class in a class hierarchy. Chidamber, Kemerer, 1994. + Inheritance is measured. - Only the potential and not the real impact is quantified.
    • Coupling Between Objects (CBO) shows the number of classes from which methods or attributes are used. Chidamber, Kemerer, 1994. + It takes into account real dependencies, not just declared ones. - No differentiation of types and/or intensity of coupling.
    • Tight Class Cohesion (TCC) counts the relative number of method pairs that access attributes of the class in common. Bieman, Kang, 1995. Example: TCC = 2 / 10 = 0.2. + Interpretation can lead to improvement action. + Ratio values allow comparison between systems.
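TCC can be sketched directly from its definition. The `attr_usage` input format (method name to set of class attributes it uses) is an assumption for illustration; real tools extract this from the source code:

```python
from itertools import combinations

def tcc(attr_usage):
    """Tight Class Cohesion: fraction of method pairs that access at
    least one attribute of the class in common."""
    pairs = list(combinations(attr_usage, 2))
    if not pairs:
        return 0.0
    connected = sum(bool(attr_usage[a] & attr_usage[b]) for a, b in pairs)
    return connected / len(pairs)

# Mirrors the slide's example: 5 methods -> 10 pairs, 2 connected -> 0.2
usage = {'m1': {'x'}, 'm2': {'x'}, 'm3': {'y'}, 'm4': {'y'}, 'm5': set()}
print(tcc(usage))  # 0.2
```

Because TCC is a ratio in [0, 1], its values can be compared across classes and systems, which is exactly the advantage the slide highlights.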
    • ...
    • McCall, 1977
    • Metrics Assess and Improve Quality!
    • Metrics Assess and Improve Quality! Really?
    • McCall, 1977
    • Problem 1: metrics granularity. Metrics capture symptoms, not causes of problems; in isolation, they don't lead to improvement solutions.
    • Problem 2: implicit mapping. We don't reason in terms of metrics, but in terms of design principles.
    • 2 big obstacles in using metrics: thresholds make metrics hard to interpret; granularity makes metrics hard to use in isolation.
    • Can metrics help me in what I really care for? :)
    • [diagram] forward engineering → actual development
    • How do I understand code? How do I improve code? How do I improve myself?
    • I want nothing to do with metrics!
    • How to get an initial understanding of a system?
    • Metric   Value
      LOC      35175
      NOM      3618
      NOC      384
      CYCLO    5579
      NOP      19
      CALLS    15128
      FANOUT   8590
      AHH      0.12
      ANDC     0.31
    • And now what?
    • We need means to compare.
    • hierarchies? coupling?
    • The Overview Pyramid provides a metrics overview. Lanza, Marinescu 2006
      Inheritance:   ANDC 0.31, AHH 0.12
      Size:          NOP 19, NOC 384, NOM 3618, LOC 35175, CYCLO 5579
                     (proportions: NOC/NOP 20.21, NOM/NOC 9.42, LOC/NOM 9.72, CYCLO/LOC 0.15)
      Communication: CALLS 15128, FANOUT 8590
                     (proportions: CALLS/NOM 4.18, FANOUT/CALLS 0.56)
    •               Java                 C++
                LOW   AVG   HIGH    LOW   AVG   HIGH
      CYCLO/LOC 0.16  0.20  0.24    0.20  0.25  0.30
      LOC/NOM   7     10    13      5     10    16
      NOM/NOC   4     7     10      4     9     15
      ...
    • Read against these thresholds, this system's proportions are: NOM/NOC close to high, LOC/NOM close to average, CYCLO/LOC close to low. Lanza, Marinescu 2006
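The "close to high / average / low" reading can be sketched as a comparison of each derived proportion against the Java thresholds from the slides. The `classify` helper is an assumption; real Overview Pyramid tools color the values instead:

```python
# Java thresholds (LOW, AVG, HIGH) and example metric values from the slides.
JAVA = {'CYCLO/LOC': (0.16, 0.20, 0.24),
        'LOC/NOM':   (7, 10, 13),
        'NOM/NOC':   (4, 7, 10)}

metrics = {'LOC': 35175, 'NOM': 3618, 'NOC': 384, 'CYCLO': 5579}

def classify(name):
    """Compute a proportion like 'LOC/NOM' and label it with the
    nearest reference point (low / average / high)."""
    num, den = name.split('/')
    value = metrics[num] / metrics[den]
    low, avg, high = JAVA[name]
    label = min([('low', low), ('average', avg), ('high', high)],
                key=lambda t: abs(value - t[1]))[0]
    return round(value, 2), label

for name in JAVA:
    print(name, *classify(name))
# NOM/NOC 9.42 -> high, LOC/NOM 9.72 -> average, CYCLO/LOC 0.16 -> low
```

This reproduces the slide's reading: classes are unusually large in methods, methods are average in length, and the code has comparatively few conditionals per line.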
    • How do I understand code? How do I improve code? How do I improve myself?
    • I want nothing to do with metrics!
    • How do I improve code?
    • Quality is more than 0 bugs. Breaking design principles, rules and best practices deteriorates the code; it leads to design problems.
    • Imagine changing just a small design fragment...
    • ...and 33% of all classes would require changes
    • Design problems are expensive, frequent, unavoidable.
    • How to detect and eliminate them?
    • God Classes tend to centralize the intelligence of the system, to do everything and to use data from small data-classes. Riel, 1996
    • God Classes are complex, are not cohesive, and access external data.
    • Compose metrics into queries using logical operators!
    • Detection Strategies are metric-based queries to detect design flaws. Lanza, Marinescu 2006: (Rule 1: METRIC 1 > Threshold 1) AND (Rule 2: METRIC 2 < Threshold 2) → Quality problem
    • [diagram] The disharmonies are correlated. Lanza, Marinescu 2006: Identity Disharmonies (God Class, Brain Class, Brain Method, Significant Duplication, Feature Envy, Data Class), Collaboration Disharmonies (Intensive Coupling, Extensive Coupling, Shotgun Surgery), Classification Disharmonies (Refused Parent Bequest, Tradition Breaker, Futile Hierarchy)
    • A God Class centralizes too much intelligence in the system. Lanza, Marinescu 2006
      GodClass = (ATFD > FEW: class uses directly more than a few attributes of other classes)
             AND (WMC ≥ VERY HIGH: functional complexity of the class is very high)
             AND (TCC < ONE THIRD: class cohesion is low)
    • An Envious Method is more interested in data from a handful of classes. Lanza, Marinescu 2006
      FeatureEnvy = (ATFD > FEW: method uses directly more than a few attributes of other classes)
                AND (LAA < ONE THIRD: method uses far more attributes of other classes than its own)
                AND (FDP ≤ FEW: the used "foreign" attributes belong to very few other classes)
    • Data Classes are dumb data holders. Lanza, Marinescu 2006
      DataClass = (WOC < ONE THIRD: interface of class reveals data rather than offering services)
              AND (class reveals many attributes and is not complex)
    • The second term expands to:
      (NOAP + NOAM > FEW: more than a few public data, AND WMC < HIGH: complexity of class is not high)
      OR (NOAP + NOAM > MANY: class has many public data, AND WMC < VERY HIGH: complexity of class is not very high)
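A detection strategy is just a boolean composition of metric comparisons, so the God Class rule above can be sketched in a few lines. The threshold values here (FEW, VERY_HIGH, ONE_THIRD) are illustrative placeholders; Lanza and Marinescu derive the actual values statistically from benchmark systems:

```python
# Placeholder thresholds -- the real values come from statistical
# analysis of many systems, not from this sketch.
FEW, VERY_HIGH, ONE_THIRD = 3, 47, 1 / 3

def is_god_class(m):
    """Metric-based query for the God Class detection strategy.
    `m` holds the class's ATFD, WMC and TCC values."""
    return (m['ATFD'] > FEW            # accesses many foreign attributes
            and m['WMC'] >= VERY_HIGH  # very high functional complexity
            and m['TCC'] < ONE_THIRD)  # low cohesion

suspect  = {'ATFD': 12, 'WMC': 75, 'TCC': 0.10}
harmless = {'ATFD': 2,  'WMC': 15, 'TCC': 0.60}
print(is_god_class(suspect), is_god_class(harmless))  # True False
```

The same shape (comparisons joined by AND/OR) covers Feature Envy and Data Class by swapping in the corresponding metrics.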
    • [diagram] forward engineering → actual development
    • How do I understand code? How do I improve code? How do I improve myself?
    • How do I improve myself?
    • Follow a clear and repeatable process
    • Don't reason about quality in terms of numbers!
    • QA is part of the Development Process http://loose.upt.ro/incode
    • Can we understand the beauty of a painting by measuring its frame or counting its colors?
    • 1 Software in numbers, 2 Software in pictures, 3 Software in time, 4 Software in tools
    • Software in pictures 2
    • Software is beautiful
    • 1854, London, cholera epidemic
    • 1812, Napoleon’s Campaign in Russia
    • Numbers... ANDC 0.31, AHH 0.12, NOP 19, NOC 384, NOM 3618, LOC 35175, CYCLO 5579, CALLS 15128, FANOUT 8590
    • Visualization compresses the system into pictures.
    • A picture is worth a thousand words... (anonymous) ...depends on the picture (Lanza)
    • Software visualization is more than UML
    • We are visual beings ... ... and we’re good at spotting patterns
    • How many groups do you see?
    • Gestalt principles proximity similarity enclosure connectivity
    • More Gestalt principles closure continuity
    • We do not see with our eyes, but with our brain. Our brain works like a computer, with 3 types of memory: iconic memory (the visual sensory register), short-term memory (the working memory), and long-term memory. Sensation (physical process) → perception (cognitive process): stimulus → sensory organ → perceptual organ → brain (iconic memory → short-term memory → long-term memory)
    • Iconic memory: < 1 second, very fast, automatic, subconscious, preattentive. Short-term memory: a couple of seconds, 3-9 chunks.
    • Categorizing Preattentive Attributes
      Form:    orientation, line length, line width, size, shape, curvature, added marks, enclosure
      Color:   hue, intensity
      Spatial: 2D position
      Motion:  flicker, direction
    • Attributes of Form: orientation, line length, line width, size, shape, curvature, added marks, enclosure (each demonstrated in turn)
    • Exemplifying Preattentive Processing: find the 3s in
      8789364082376403128764532984732984732094873290845
      389274-0329874-32874-23198475098340983409832409832
      049823-0984903281453209481-0839393947896587436598
      (once the 3s are highlighted, they pop out preattentively)
    • 70% of all external inputs come through the eyes
    • Software visualization is the use of the crafts of typography, graphic design, animation, and cinematography with modern human-computer interaction and computer graphics technology to facilitate both the human understanding and effective use of computer software. Price, Becker, Small
    • Static Visualization
    • Dynamic Visualization
    • No silver bullet
    • Software is complex
    • A picture is worth a thousand words.
    • UML took it literally :)
    • Example: what is ?
    • Polymetric Views show up to 5 metrics. Lanza, 2003 Width metric Height metric Position metrics Color metric
    • A simple & powerful concept: metrics such as LOC, NOS, parameters, and lines are mapped onto node dimensions
    • System Complexity shows class hierarchies. Lanza, Ducasse, 2003 (node width: attributes, height: methods, color: lines)
    • Class Blueprint shows class internals. Lanza, Ducasse, 2005 (layers: Initialize, Interface, Internal, Accessor, Attribute; edges show invocation and access direction)
    • Class Blueprint has a rich vocabulary: accesses (internal access, external access, attribute access), invocations (regular, overriding, extending, delegating), methods (constant, abstract, setter, getter), node metrics (invocations, lines)
    • Class Blueprint reveals patterns. twin classes schizophrenic class
    • Distribution Map shows properties over structure. Ducasse et al., 2006: 31 parts, 394 elements and 9 properties
    • Softwarenaut explores the package structure. Lungu et al., 2006
    • Code City shows where your code lives. Wettel, Lanza, 2007 classes are buildings grouped in quarters of packages
    • Jmol - The Time Machine
    • Software is beautiful
    • 1 Software in numbers, 2 Software in pictures, 3 Software in time, 4 Software in tools
    • Software in time 3
    • [diagram] forward engineering and reverse engineering around actual development
    • [diagram] the reengineering loop: reverse engineering ↔ actual development ↔ forward engineering
    • A large system contains lots of details.
    • The history of a large system contains even more details.
    • Most often, time is put on the horizontal and a property on the vertical axis. Lehman et al., 2001
    • Spectrographs show change activity over commit time. Wu et al., 2004
    • Evolution Matrix shows changes in classes. Lanza, Ducasse, 2002 Idle class Pulsar class Supernova class White dwarf class
    • History can be measured.
    • What changed? When did it change? (NOM histories of five classes)
      2 4 3 5 7
      2 2 3 4 9
      2 2 1 2 3
      2 2 2 2 2
      1 5 3 4 4
    • Evolution of Number of Methods: ENOM(C) = Σ |NOMi(C) - NOMi-1(C)|. For the history 1 5 3 4 4: ENOM(C) = 4 + 2 + 1 + 0 = 7
    • Latest Evolution of Number of Methods: LENOM(C) = Σ |NOMi(C) - NOMi-1(C)| · 2^(i-n)
      Earliest Evolution of Number of Methods: EENOM(C) = Σ |NOMi(C) - NOMi-1(C)| · 2^(2-i)
      For the history 1 5 3 4 4:
      LENOM(C) = 4·2^(-3) + 2·2^(-2) + 1·2^(-1) + 0·2^0 = 1.5
      EENOM(C) = 4·2^0 + 2·2^(-1) + 1·2^(-2) + 0·2^(-3) = 5.25
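The three history measurements can be sketched directly from the formulas above; the function names are illustrative, and the example history is the slides' own:

```python
def deltas(history):
    """Absolute change of the metric between consecutive versions."""
    return [abs(b - a) for a, b in zip(history, history[1:])]

def enom(history):
    """Evolution of NOM: total amount of change."""
    return sum(deltas(history))

def lenom(history):
    """Latest Evolution of NOM: weight 2^(i-n) emphasizes recent changes
    (version i = 2..n, where n is the number of versions)."""
    n = len(history)
    return sum(d * 2 ** (i - n) for i, d in enumerate(deltas(history), start=2))

def eenom(history):
    """Earliest Evolution of NOM: weight 2^(2-i) emphasizes early changes."""
    return sum(d * 2 ** (2 - i) for i, d in enumerate(deltas(history), start=2))

h = [1, 5, 3, 4, 4]                  # the slides' example NOM history
print(enom(h), lenom(h), eenom(h))   # 7 1.5 5.25
```

Note how the same total change (ENOM = 7) splits very differently: a low LENOM with a high EENOM marks an early changer, and vice versa a late changer.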
    • History      ENOM  LENOM  EENOM
      2 4 3 5 7     7     3.5    3.25
      2 2 3 4 9     7     5.75   1.37
      2 2 1 2 3     3     1      2
      2 2 2 2 2     0     0      0
      1 5 3 4 4     7     1.25   5.25
    •                    ENOM  LENOM  EENOM
      balanced changer    7     3.5    3.25
      late changer        7     5.75   1.37
                          3     1      2
      dead stable         0     0      0
      early changer       7     1.25   5.25
      History can be measured.
    • History can be measured in many ways: combine a measurement (Evolution, Stability, Historical Max, Growth Trend, ...) with a property (Number of Methods, Number of Lines of Code, Cyclomatic Complexity, Number of Modules, ...)
    • The recently changed parts are likely to change in the near future. Common wisdom
    • Are they really? (Common wisdom)
    • 30% 90%
    • [diagram] the history is split at the present: past to the left, future to the right
    • YesterdayWeatherHit(present):
        past := histories.topLENOM(start, present)
        future := histories.topEENOM(present, end)
        past.intersectWith(future).notEmpty()
      A non-empty intersection counts as a prediction hit.
    • Yesterday's Weather shows the localization of changes in time. Girba et al., 2004. Example: 3 hits out of 8 gives YW = 3 / 8 = 37%; 7 hits out of 8 gives YW = 7 / 8 = 87%
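The YesterdayWeatherHit check can be sketched as follows. The `histories` data, the slicing convention, and the top-1 selection are illustrative assumptions (the original approach parameterizes how many top-ranked classes to compare):

```python
def lenom(h):
    """Recency-weighted change of a history (weight 2^(i-n))."""
    n = len(h)
    return sum(abs(b - a) * 2 ** (i - n)
               for i, (a, b) in enumerate(zip(h, h[1:]), start=2))

def eenom(h):
    """Earliness-weighted change of a history (weight 2^(2-i))."""
    return sum(abs(b - a) * 2 ** (2 - i)
               for i, (a, b) in enumerate(zip(h, h[1:]), start=2))

def yw_hit(histories, present, top=1):
    """Hit if a class changed most recently before `present` also
    changes most right after it."""
    past = sorted(histories, key=lambda c: lenom(histories[c][:present]),
                  reverse=True)[:top]
    future = sorted(histories, key=lambda c: eenom(histories[c][present - 1:]),
                    reverse=True)[:top]
    return bool(set(past) & set(future))

# Illustrative NOM histories for three classes.
histories = {'A': [2, 4, 3, 5, 7], 'B': [2, 2, 2, 2, 2], 'C': [1, 1, 5, 5, 5]}
hits = [yw_hit(histories, t) for t in range(2, 5)]
print(sum(hits) / len(hits))  # YW = fraction of versions where the prediction hit
```

A high YW value means recent change really does predict near-future change in that system; a low one means the common wisdom fails there.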
    • A God Class centralizes too much intelligence in the system: (ATFD > FEW) AND (WMC ≥ VERY HIGH) AND (TCC < ONE THIRD). But, what if it is stable?
    • History-based Detection Strategies take evolution into account. Ratiu et al., 2004: Harmless God Class = (God Class in the last version: isGodClass(last)) AND (stable throughout the history: Stability > 90%)
    • What happens with inheritance? Tracking a hierarchy from ver. 1 to ver. 5: A is persistent, B is stable, C was removed, E is newborn ...
    • Hierarchy Evolution encapsulates time. Girba et al., 2005 (node metrics: changed methods, changed age, lines). A is persistent, B is stable, C was removed, E is newborn ...
    • Hierarchy Evolution reveals patterns. Girba et al., 2005
    • Co-change analysis recovers hidden dependencies. Time is the lines. Gall et al., 2003
    • Evolution Radar shows co-change relationships. D’Ambros, Lanza 2006 one package and its co-change relationships
    • Software is developed by people.
    • CVS shows activity.
    • Who is responsible for this?
    • Alphabetical order is no order.
    • Ownership Map reveals development patterns. Girba et al., 2006
    • JEdit
    • Ant
    • Who copied from whom?
      (john 23.06.03)   public boolean stillValid (ToDoItem i, Designer dsgr) {
      (bill 09.01.05)     if (!isActive()) {
      (bill 09.01.05)       return false
      (bill 09.01.05)     }
      (steve 16.02.05)    List offs = i.getOffenders();
      (john 23.06.03)     Object dm = offs.firstElement();
      (steve 16.02.05)    ListSet newOffs = computeOffenders(dm);
      (john 23.06.03)     boolean res = offs.equals(newOffs);
      (john 23.06.03)     return res;
      (george 13.02.05) public boolean stillValid (ToDoItem i, Designer dsgr) {
      (bill 11.13.05)     if (!isActive()) {
      (bill 11.13.05)       return false
      (bill 11.13.05)     }
      (steve 16.02.05)    List offs = i.getOffenders();
      (george 13.02.05)   Object dm = offs.firstElement();
      (steve 16.02.05)    ListSet newOffs = computeOffenders(dm);
      (george 13.02.05)   boolean res = offs.equals(newOffs);
      (george 13.02.05)   return res;
    • What is useless?
      (john 23.06.03)   public boolean stillValid (ToDoItem i, Designer dsgr) {
      (bill 09.01.05)     if (!isActive()) {
      (bill 09.01.05)       return false
      (bill 09.01.05)     }
      (steve 16.02.05)    List offs = i.getOffenders();
      (john 23.06.03)     Object dm = offs.firstElement();
      (steve 16.02.05)    ListSet newOffs = computeOffenders(dm);
      (john 23.06.03)     boolean res = offs.equals(newOffs);
      (john 23.06.03)     return res;
      (george 13.02.05) public boolean stillValid (ToDoItem i, Designer dsgr) {
      (bill 11.13.05)     if (!isActive()) {
      (bill 11.13.05)       return false
      (bill 11.13.05)     }
      (steve 16.02.05)    List offs = i.getOffenders();
      (george 13.02.05)   Object dm = offs.firstElement();
      (steve 16.02.05)    ListSet newOffs = computeOffenders(dm);
      (george 13.02.05)   boolean res = offs.equals(newOffs);
      (george 13.02.05)   return res;
    • When did changes happen?
      23.06.03  public boolean stillValid (ToDoItem i, Designer dsgr) {
      09.01.05    if (!isActive()) {
      09.01.05      return false
      09.01.05    }
      16.02.05    List offs = i.getOffenders();
      23.06.03    Object dm = offs.firstElement();
      16.02.05    ListSet newOffs = computeOffenders(dm);
      23.06.03    boolean res = offs.equals(newOffs);
      23.06.03    return res;
      13.02.05  public boolean stillValid (ToDoItem i, Designer dsgr) {
      11.13.05    if (!isActive()) {
      11.13.05      return false
      11.13.05    }
      16.02.05    List offs = i.getOffenders();
      13.02.05    Object dm = offs.firstElement();
      16.02.05    ListSet newOffs = computeOffenders(dm);
      13.02.05    boolean res = offs.equals(newOffs);
      13.02.05    return res;
    • Clone Evolution shows how developers copy. Balint et al., 2006
    • [diagram] the reengineering loop: reverse engineering ↔ actual development ↔ forward engineering
    • 1 Software in numbers, 2 Software in pictures, 3 Software in time, 4 Software in tools
    • Software in tools 4
    • http://moose.unibe.ch http://loose.upt.ro/incode http://www.inf.unisi.ch/phd/wettel/codecity.html
    • Pragmatic Design Quality Assessment Tudor Gîrba University of Bern, Switzerland Michele Lanza University of Lugano, Switzerland Radu Marinescu Politehnica University of Timisoara, Romania
    • Tudor Gîrba, Michele Lanza, Radu Marinescu http://creativecommons.org/licenses/by/3.0/