Static ABAP code analyzers


My slides from the session at sitHH on 12 May 2012 about static ABAP code analysis tools and my experience with them. Apart from the tools, I share my personal lessons learned for establishing a code profiling process.


  1. Static ABAP Code Analysis: A Comparison of Tools, with some kind of field report
  2. Disclaimer • First talk at a SIT • Not a native English speaker • This presentation represents my personal opinion and is neither related to any company nor the single godly truth
  3. Agenda • Motivation • Tools for static ABAP code analysis • CAST Application Intelligence Platform • Virtual Forge CodeProfiler • SAP Code Inspector • Sonar ABAP Plug-in • Lessons Learned • Summary
  4. Who am I? • Markus Theilen • Enterprise Architect at EWE ENERGIE AG • before that, Software Architect at BTC AG • responsible for Customer Care and Billing for Utilities
  5. Motivation
  6. Background • EWE does not use SAP IS-U for Customer Care and Billing, but develops its own solution: easy+ • Since 1995 this solution has been built and maintained by BTC AG on behalf of EWE ENERGIE AG
  7. easy+ • In productive use since 1997 • Pure ABAP coding • Today, about 100 people in development and maintenance plus 20 people in support • Used by EWE and about 10 public services companies
  8. easy+: A few facts • over 25 million invoices billed so far • 8.8 TB data volume • 8.2 million lines of code • 700 packages, 8,000 reports, 6,000 classes • 8,000 tables
  9. Problems • Team size and fluctuation lead to a very heterogeneous knowledge and skill set • Maintenance is getting harder with each iteration • Too much code for manual review of coding guidelines
  10. Problems • With code size, complexity can grow exponentially • Code that looks locally OK can lead to problems when seen in its context and call hierarchy • Complexity is too high for manual checks
  11. Problems • No factual statements about code quality possible • No direct indicators for architects/management to decide where to spend time and money to correct the most urgent problems first
  12. But the #1 problem is: You do not know what your problems are until you measure your code!
  13. Use of static analysis • Gives insight and leads you to your problems • Gives the possibility to concentrate on hot spots • Base decisions on facts, not myths and rumours!
  14. Use of static analysis • Tool-based analysis is cheaper than manual reviews, but it comes at a price and is far from perfect! • false positives, missed violations • Expect no solution for your problems; tools just help to find and pinpoint them! • Only static information is examined; dynamic aspects are mostly not covered!
  15. What a tool should offer • a reliable rule engine • definition of exceptions / false positives • explanations of rules • reasoning, good/bad examples • seamless integration into the development cycle
  16. The Tools
  17. CAST Application Intelligence Platform (in production)
  18. What is it? • developed by CAST, headquarters located in France • „world-wide leader in automated application intelligence“ • not just a simple scanner-and-rules engine, but an application metadata knowledge base
  19. How it works • external scanning and analysis engine, written in C++ • transfer of source information via extraction report and files • analysis of the source code, mapping to a common meta model, creation of relations between objects (calls, uses, etc.) • results are shown in a dashboard web application and in a fat client for architecture analysis
  20. How it works • CAST uses a customisable hierarchy of result aggregation • health factors like robustness, performance • quality indicators like complexity, programming practices, documentation • quality metrics (basic rules)
  21. [Diagram] The aggregation hierarchy: Metrics (Metric1 … Metric15) roll up into Quality Indicators (Complexity, Architecture, Prog. Practice, Conventions, Documentation), which roll up into Health Factors (Performance, Robustness, Security, Transferability, Changeability)
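As a toy illustration of this layered, weighted aggregation (my own sketch in ABAP; CAST's actual weighting model is more elaborate and configurable), each layer is essentially a weighted average of the scores one level below:

     REPORT zweighted_aggregation_demo.

     TYPES: BEGIN OF ty_score,
              name   TYPE string,
              score  TYPE f,          " e.g. 1.0 (worst) .. 4.0 (best)
              weight TYPE f,
            END OF ty_score.
     TYPES ty_scores TYPE STANDARD TABLE OF ty_score WITH DEFAULT KEY.

     " Weighted average of one aggregation layer.
     FORM weighted_avg USING    it_scores TYPE ty_scores
                       CHANGING cv_result TYPE f.
       DATA: lv_sum TYPE f, lv_weights TYPE f.
       FIELD-SYMBOLS <s> TYPE ty_score.
       LOOP AT it_scores ASSIGNING <s>.
         lv_sum     = lv_sum     + <s>-score * <s>-weight.
         lv_weights = lv_weights + <s>-weight.
       ENDLOOP.
       IF lv_weights > 0. cv_result = lv_sum / lv_weights. ENDIF.
     ENDFORM.

     START-OF-SELECTION.
       DATA: lt_metrics    TYPE ty_scores,
             ls            TYPE ty_score,
             lv_complexity TYPE f.
       " Metric scores roll up into the quality indicator "Complexity" ...
       ls-name = 'High cyclomatic complexity'. ls-score = '2.1'. ls-weight = 2. APPEND ls TO lt_metrics.
       ls-name = 'Complex SELECT clause'.      ls-score = '3.4'. ls-weight = 1. APPEND ls TO lt_metrics.
       PERFORM weighted_avg USING lt_metrics CHANGING lv_complexity.
       " ... and indicator scores roll up the same way into a health factor.
       WRITE: / 'Complexity indicator:', lv_complexity.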
  22. CAST Application View
  23. CAST Violation View
  24. CAST Investigation View
  25. CAST rule details
  26. CAST Portfolio View
  27. CAST Management Studio I
  28. Example: the CAST monthly flyer ("Results April 2012", 27 April 2012)

     Top 10 violations (violations / weighted / diff; the weighted value drives the ranking):
       Avoid unchecked return code (SY-SUBRC) after OPEN SQL…    15,384 / 246,144 / 2,192
       Avoid undocumented Methods                                28,923 / 144,615 / 1,545
       Avoid „SELECT *“ or „SELECT SINGLE *“ queries             15,662 / 125,296 /   528
       Avoid using literals in assignments (hardcoded values)    23,646 / 118,230 /   750
       Avoid unreferenced Methods                                23,340 /  93,360 / 2,500
       Avoid Methods with a very low comment/code ratio          23,479 /  70,437 /   528
       Avoid using LOOP INTO, use LOOP ASSIGNING instead         11,518 /  69,108 /   492
       Avoid missing WHEN OTHERS in CASE statements               6,511 /  52,088 /   360
       Avoid Artifacts with a Complex SELECT Clause               7,545 /  45,270 /   162
       Avoid Artifacts with High Cyclomatic Complexity            5,028 /  40,224 / 2,624

     Violations per kLOC by topic area (critical / diff / total / diff); a critical value of 3 means that a thousand lines of code contain on average 3 critical rule violations:
       Abrechnen 2.97 / -0.01 / 30.61 / 0.00 · Accounting 2.34 / -0.01 / 47.34 / -0.08 · Architektur 2.91 / -0.01 / 42.44 / 0.03 · BusinessWarehouse 5.20 / 0.01 / 33.28 / 0.03 · CustomerCare 2.62 / -0.04 / 45.68 / 0.28 · Messen 2.93 / 0.14 / 40.13 / 0.29 · MPK 1.88 / 0.01 / 50.43 / 0.18 · OutputManagement 3.29 / 0.00 / 31.18 / -0.01 · Statistik 3.44 / -0.01 / 34.10 / -0.22 · easy+ overall 2.69 / 0.00 / 40.85 / 0.08

     Weighted easy+ quality value and acceptance status (quality gate for the acceptance and system integration test; green if the current value is less than or equal to the start value, otherwise red): start 1,388,953, current 1,402,761, diff 13,808 (0.99 %).

     CAST Biggest Loser ranking (start weight / current / diff %; the quality value per 1,000 lines of source code per topic area): 1 Statistik 220.13 / 218.92 / -0.55 · 2 Accounting 258.62 / 258.10 / -0.20 · 3 Abrechnen 210.84 / 210.72 / -0.06 · 4 OutputManagement 218.02 / 217.98 / -0.02 · 5 Architektur 251.44 / 251.46 / 0.01 · 6 CustomerCare 270.70 / 270.97 / 0.10 · 7 BusinessWarehouse 244.40 / 244.74 / 0.14 · 8 MPK 262.63 / 263.25 / 0.24 · 9 Messen 252.94 / 255.03 / 0.82 · easy+ overall 243.70 / 243.90 / 0.08

     „Everyone wake up, please!“ Because of the development-light start of the release, the last CAST evaluation is by now almost four weeks old. This longer pause will remain the exception, and from now on you will again be informed about the current status every two weeks. Despite the comparatively small amount of development in these first weeks, various rule violations have crept in again, already exceeding the quality baseline by almost a full percent. At the latest with this flyer, all developers should again take care to follow the CAST rules in their development and to remove existing violations.

     „The red-light district is moving.“ With the first comparable Biggest Loser numbers since release 35/2 it is now certain that topic area BusinessWarehouse has to look for new room lighting: the red lantern moves desks to the colleagues of topic area Messen.
  29. Unique selling points • common meta model for development objects and their relations • change impact analysis, support for cost estimation, path finder along call stacks • cross-technology analysis (Java, C#, C++, ABAP, COBOL, ...) • management dashboard and long-term evaluation • layered, weighted aggregation of results
  30. Drawbacks • speed of ABAP analysis (easy+: 14-17 h) • no sound ABAP know-how up to now • some unstable, heuristic rules • big initial and ongoing investment • license, maintenance, education, administration • no integration into the ABAP development cycle • sluggish support and information policy
  31. Links • Vendor homepage: http://www.castsoftware.com • Product homepage: http://www.castsoftware.com/products/cast-application-intelligence-platform
  32. Virtual Forge CodeProfiler (in examination)
  33. What is it? • developed by Virtual Forge GmbH • THE ABAP security experts • scans ABAP code, checks it against rules and presents the results • concentrates on ABAP analysis in security, compliance, performance and robustness
  34. How it works • external scanning and analysis engine, written in Java • transfer of source information directly via RFC or file-based • results are generated as PDF or shown in SAP (transaction „Finding Manager“) • uses SAP BI for management views
  35. [Screenshot] Result display in SAP: the Finding Manager
  36. Example PDF report, Executive Summary: "The ABAP code has been analyzed with 100 test cases. 55 of those test cases yielded findings, totaling 28,825 findings. 1,535 of them have been rated as critical."
     Findings by test domain (critical / total / analyzed test cases; ME = Mitigation Effort, N/A = not completely configured):
       Security                186 /  3,493 / 48
       Compliance              292 /  2,069 /  7
       Performance             938 /  9,148 / 19
       Maintainability           0 / 10,593 / 11
       Robustness              119 /  3,522 / 10
       Data-Loss-Prevention      0 /      0 /  5
     Also, 0 countermeasures have been detected that have prevented additional security findings, and 0 findings have been manually suppressed by developers. Some test cases are used for informational purposes only; these yielded 856 findings as a basis for further analysis by experts.
  37. Data / control flow analysis: the CodeProfiler tracks tainted input from request->get_form_field( ) across variables and method calls down to the dangerous output call out->print_string( ):

     METHOD read.
       DATA: request TYPE REF TO if_http_request.
       DATA: s_html  TYPE string.
       DATA: event   TYPE string.
       " (1) Input is stored in a variable
       s_html = request->get_form_field( 'mydata' ).
       " (2) Passed on to another method and variable (s_html -> s_data)
       CALL METHOD me->process
         EXPORTING
           s_data = s_html.
       RETURN.
     ENDMETHOD.

     METHOD process.
       DATA: s_out TYPE string.
       DATA: out   TYPE REF TO if_bsp_writer.
       " (3) Modified and copied to another variable (s_data -> s_out)
       CONCATENATE `<b>` s_data `</b>` INTO s_out.
       out = me->get_previous_out( ).
       " (4) Passed on to a dangerous function: unescaped output
       out->print_string( s_out ).
     ENDMETHOD.
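A hedged counterpart (my addition, not from the slides): a finding like this is typically mitigated by encoding the user-controlled value before it reaches the output sink. A minimal sketch, assuming an ABAP release that offers the built-in ESCAPE function (older systems could use CL_HTTP_UTILITY=>ESCAPE_HTML instead):

     METHOD process.
       DATA: s_out TYPE string.
       DATA: out   TYPE REF TO if_bsp_writer.
       " HTML-encode the tainted value so the data flow no longer
       " ends in an unescaped sink.
       s_out = `<b>`
            && escape( val    = s_data
                       format = cl_abap_format=>e_html_text )
            && `</b>`.
       out = me->get_previous_out( ).
       out->print_string( s_out ).
     ENDMETHOD.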
  38. Unique selling points • speed of analysis (easy+: 1.5 h) • data and control flow analysis • automated corrections possible • superb rules in the security domain • sound integration into the development cycle • SE80, TMS, ChaRM, CTS(+)
  39. Drawbacks • not that strong in code quality rules yet • performance, maintenance, robustness • no support for languages other than ABAP
  40. Links • Vendor homepage: http://virtualforge.com/ • Product homepage: http://virtualforge.com/index.php/en/portfolio/codeprofiler.html • Product review from KuppingerCole
  41. SAP Code Inspector (in production)
  42. What is it? • developed by SAP, in ABAP OO • scans ABAP code, checks it against rules and presents the results as a tree or list • integrated into every AS ABAP • transactions SCI and SCII
  43. How it works • internal scanning and rule checking, implemented in ABAP OO • no need to transfer development objects; everything stays in the system • results can be analyzed in transaction SCI
  44. How it works • an Inspection references an Object set (what?) and a Check variant (how?) • running the Inspection produces the Results
  49. SAP CI Tx SCI
  50. SAP CI Result Details
  51. CI reports you should know about • RS_CI_EMAIL: sends emails with inspection results to the developers that own the violating objects • RS_CI_EMAILTEMPLATE: template for this email • RS_CI_INSPECTOR: plans inspections as background jobs
  52. CI reports you should know about • RS_CI_DIFF: diff between two versions of an inspection, sends the diff per email • RS_CI_COMPARE: diff between two inspections
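The background-job planning that RS_CI_INSPECTOR offers can also be wired up by hand. A minimal sketch using the standard JOB_OPEN / JOB_CLOSE function modules; the report variant name 'NIGHTLY' is a made-up placeholder, since the slides do not show the report's selection screen:

     DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'CI_NIGHTLY',
           lv_jobcount TYPE tbtcjob-jobcount.

     " Open a background job ...
     CALL FUNCTION 'JOB_OPEN'
       EXPORTING
         jobname  = lv_jobname
       IMPORTING
         jobcount = lv_jobcount.

     " ... run the inspection report in it, reusing a saved variant ...
     SUBMIT rs_ci_inspector
       USING SELECTION-SET 'NIGHTLY'       " hypothetical variant name
       VIA JOB lv_jobname NUMBER lv_jobcount
       AND RETURN.

     " ... and release the job for immediate execution.
     CALL FUNCTION 'JOB_CLOSE'
       EXPORTING
         jobname   = lv_jobname
         jobcount  = lv_jobcount
         strtimmed = abap_true.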
  53. Unique selling points • built by the same vendor as AS ABAP • integrated into the ABAP system • no additional hardware or software needed • API to call in custom code and to extend with own rules (see the sketch below)
  54. Unique selling points • no additional license and maintenance costs • strong rules in the performance domain • good integration into the development cycle • stable rules, seldom false positives
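The "extend with own rules" point deserves a sketch. Custom checks are typically implemented by subclassing the scan-based check class CL_CI_TEST_SCAN; the member names below (get( ), inform( ), ref_scan, myname, get_include( ) and the c_* constants) follow the pattern described in the Code Inspector literature, but exact signatures vary by release, so treat this as an approximation and verify the API on your own system:

     CLASS zcl_ci_test_demo DEFINITION
         INHERITING FROM cl_ci_test_scan.
       PUBLIC SECTION.
         METHODS run REDEFINITION.
     ENDCLASS.

     CLASS zcl_ci_test_demo IMPLEMENTATION.
       METHOD run.
         FIELD-SYMBOLS: <stmt>  TYPE sstmnt,
                        <token> TYPE stokes.
         " Load the tokenized source of the object under inspection.
         IF ref_scan IS INITIAL AND get( ) <> 'X'.
           RETURN.
         ENDIF.
         " Example rule: flag every BREAK-POINT left in productive code.
         LOOP AT ref_scan->statements ASSIGNING <stmt>.
           READ TABLE ref_scan->tokens ASSIGNING <token> INDEX <stmt>-from.
           IF sy-subrc = 0 AND <token>-str = 'BREAK-POINT'.
             inform( p_sub_obj_type = c_type_include   " assumed constant
                     p_sub_obj_name = get_include( p_level = <stmt>-level )
                     p_line         = <token>-row
                     p_kind         = c_error
                     p_test         = myname
                     p_code         = '0001' ).
           ENDIF.
         ENDLOOP.
       ENDMETHOD.
     ENDCLASS.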
  55. Drawbacks • no dashboard or BI integration • very few cross-object rules • performance: some rules consume a lot of memory, and some checks are very slow
  56. Links • Book: „Praxishandbuch Code Inspector“
  57. Sonar ABAP Plug-in (in examination)
  58. What is it? • developed by SonarSource and Obeo • scans ABAP code, checks it against 50+ rules and presents the results in a nice dashboard • integrated into the inspection platform Sonar
  59. How it works • scanning and analysing are implemented in Java; needs a JRE/JDK and an RDBMS • the code of the objects needs to be exported into files; the folder structure defines the result structure • results are presented in Sonar dashboards
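Since nothing extracts the sources for you (see the drawbacks slide below), a hand-rolled extractor is the usual workaround. A minimal sketch that dumps all reports of one package to a server directory for Sonar to pick up; the package name and target path are made-up examples, and other object types (classes, function groups) would need their own handling:

     REPORT zsonar_extract.

     PARAMETERS: p_devc TYPE devclass DEFAULT 'ZEASY',     " example package
                 p_dir  TYPE c LENGTH 128 LOWER CASE
                          DEFAULT '/tmp/sonar/src/'.       " assumed path

     DATA: lt_progs TYPE STANDARD TABLE OF tadir-obj_name,
           lv_prog  TYPE tadir-obj_name,
           lt_src   TYPE STANDARD TABLE OF string,
           lv_line  TYPE string,
           lv_file  TYPE string.

     START-OF-SELECTION.
       " All executable programs owned by the package.
       SELECT obj_name FROM tadir INTO TABLE lt_progs
         WHERE pgmid    = 'R3TR'
           AND object   = 'PROG'
           AND devclass = p_devc.

       LOOP AT lt_progs INTO lv_prog.
         READ REPORT lv_prog INTO lt_src.   " load the ABAP source
         CHECK sy-subrc = 0.
         CONCATENATE p_dir lv_prog '.abap' INTO lv_file.
         " One file per program; the folder layout drives Sonar's
         " result structure.
         OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
         LOOP AT lt_src INTO lv_line.
           TRANSFER lv_line TO lv_file.
         ENDLOOP.
         CLOSE DATASET lv_file.
       ENDLOOP.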
  60. Sonar Dashboard
  61. Sonar Hotspots
  62. Sonar Components
  63. Sonar Source View
  64. Sonar Time Machine
  65. Unique selling points • ease of installation and administration • moderate costs • entry to the fabulous Sonar platform • plugins like Views, SQUALE etc. • configurable dashboards with myriads of views • extensibility
  66. Drawbacks • no source code extractor out of the box • trying to change this • small rule base yet (V1.1) • integration of Code Inspector results still in examination • no real in-house ABAP know-how • analysis runs still break without proper error documentation in the current version
  67. Links • Vendor homepages: http://www.sonarsource.com/ http://www.obeo.fr/ • Product homepages: http://www.sonarsource.com/products/plugins/languages/abap/ http://www.sonarsource.com/products/software/sonar/
  68. Lessons learned so far...
  69. Things to know before establishing static code analysis • Not everyone is very fond of transparency! • Talk to your workers' council early, if there is one! • Be aware of „benchmark optimisations“: correcting „for the tool“ can have a negative impact • Be aware of the impact of false positives: trust in tools fades with each of them
  70. Things to do when establishing static code analysis • start small, grow large • activate one check after the other • start with new code, then spread by packages • integrate analysis results into the developers' daily routine (IDE, TMS) • exclude generated ABAP coding
  71. Things to do when establishing static code analysis • have a working QA process established before starting the tool integration • integrate analysis results into SLAs • integrate analysis results into managers' targets • make them pay for not giving you the space to build great software!
  72. Things to do when establishing static code analysis • try to keep the whole process fun and entertaining for developers • do not overload the change process
  79. Keep it entertaining: the „Biggest Loser“ • At the start of a new release, the sum of weighted violations per 1,000 code lines is measured per development team • With every snapshot this rating is recalculated • At the end of the release, the best team gets an award
  80. Keep it entertaining: „The Red Lantern“ • With every snapshot there is a rating of the development teams („Biggest Loser“) • The team with the highest degradation since the baseline gets the Red Lantern on its team leader's desk
  81. Summary
  82. A fool with a tool...
  83. A fool with a tool... • the best working tool for static ABAP code analysis is you, the ABAP expert! • Integrate tools when code and team size grow beyond manual review capabilities
  84. My personal, biased advice • Want to get into tool-based code analysis, no money to spend on external tools: => SAP Code Inspector • Substantial code bases in technologies other than ABAP, cross-technology analysis a must, more than a rules engine needed, lots of money to spend: => CAST AIP
  85. My personal, biased advice • Die-hard ABAP development, security and compliance a big concern, results close to the developers a must, a little bit of money on the bench: => Virtual Forge CodeProfiler • Lots of Java code and a little bit of ABAP, small budget, no need for deep ABAP coverage now: => keep an eye on the Sonar ABAP plug-in
  86. Comparison (part 1):
     Criteria                             CAST AIP   VirtualForge CodeProfiler   SAP Code Inspector   Sonar ABAP Plug-In
     ease of administration               --         +                           ++                   +
     management dashboards                ++         +                           --                   +
     costs                                --         -                           ++                   O-+
     overall technical quality            O          +                           +                    O
     support for other languages          ++         --                          --                   ++
     analysis performance                 --         ++                          -                    --
     long-time evaluation, trends         ++         +                           --                   ++
     integration in development process   --         +                           +                    --
  87. Comparison (part 2):
     Criteria                             CAST AIP   VirtualForge CodeProfiler   SAP Code Inspector   Sonar ABAP Plug-In
     hardware requirements                --         O                           ++                   O
     extendable with own rules            yes/O      no                          yes/+                no
     rule documentation                   +          ++                          O                    O
  88. Questions?
  89. Thanks for listening! • Contact information • Markus.Theilen@ewe.de • Twitter: @therealtier
  90. Backup
  91. CAST Custom Action Plan Viewer
  92. CAST Acceptance View
  93. CAST Management Studio II
  94. CAST Action Plan
  95. CAST Assessment View
  96. CAST Compliance View
  97. CAST Development View
  98. CAST Enhancement View
  99. CAST Evolution View
  100. CAST Project View
  101. SAP CI Result Tree
  102. SAP CI Tx SCII
  103. SAP CI Inspection
  104. SAP CI Object Set
  105. SAP CI Check Variant
  106. Exclude generated maintenance views • In the object set you can exclude the function groups generated for maintenance views • These function groups cause a lot of violations that you should not bother about
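One way to collect those generated function groups programmatically (a sketch; it assumes the standard view-maintenance directory table TVDIR, whose AREA field holds the function group of the generated maintenance dialog):

     " Collect all function groups of generated view-maintenance
     " dialogs as candidates for the object-set exclusion in SCI.
     DATA lt_areas TYPE STANDARD TABLE OF tvdir-area.

     SELECT DISTINCT area
       FROM tvdir
       INTO TABLE lt_areas
       WHERE area <> space.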
  107. Sonar Action Plan
  108. Sonar Clouds
  109. Sonar Start Page
  110. Sonar Reviews
  111. Sonar Tree Map
  112. Sonar Violation Drilldown
  113. Sonar manual violation
