A Pragmatic Perspective on Software Visualization

Slides of the keynote presentation at the 5th International IEEE/ACM Symposium on Software Visualization, SoftVis 2010. Salt Lake City, USA, October 2010.

1. A Pragmatic Perspective on Software Visualization. Arie van Deursen, Delft University of Technology.
2. Acknowledgements
   • SoftVis organizers: Alexandru Telea, Carsten Görg, Steven Reiss
   • TU Delft co-workers: Felienne Hermans, Martin Pinzger
   • The Chisel Group, Victoria: Margaret-Anne Storey
3. Outline
   1. Questions & introduction: software visualization reflections
   2. Zooming in: visualization for end-user programmers
   3. Zooming out: software visualization reflections revisited
   4. Discussion (but not just at the end)
4. Proc. WCRE 2000; Science of Comp. Progr., 2006.
5. Proc. ICSM 1999; Sw. Practice & Experience, 2004. Software Improvement Group, www.sig.eu.
6. (image-only slide)
7. Ali Mesbah, Arie van Deursen, Danny Roest. Invariant-based automated testing of modern web applications. ICSE’09, TSE subm.
8. Bas Cornelissen, Andy Zaidman, Arie van Deursen. A Controlled Experiment for Program Comprehension through Trace Visualization. IEEE Transactions on Software Engineering, 2010.
9. A. Zaidman, B. van Rompaey, A. van Deursen, and S. Demeyer. Studying the co-evolution of production and test code in open source and industrial developer test processes through repository mining. Empirical Software Engineering, 2010.
10. A. Hindle, M. Godfrey, and R. Holt. Software Process Recovery using Recovered Unified Process Views. ICSM 2010.
11. Observations
   1. My visualizations leave room for improvement…
   2. Some very cool results are never applied
   3. Software visualizations in context can be successful
   4. Simpler might be more effective
   5. What is our perspective on evaluation?
12. What is “Exciting” in an Engineering Field? (A. Finkelstein, A. Wolf)
   1. Invention of wholly new ideas and directions
   2. Work of promise that illuminates #1
   3. Early application of #2 showing clear prospect of benefit
   4. Substantial exploitation of #3 yielding measurable societal benefits
   5. Maturing of #4 with widespread adoption by practitioners
13. What Can We Learn From The Social Sciences?
   Paradigms shaping the practice of research:
   • Post-positivism
   • Social constructivism
   • Participatory / advocacy
   • Pragmatism
14. Post-positivism
   • Conjectures and Refutations: The Growth of Scientific Knowledge
   • Testing of hypotheses
   • A priori use of theory
15. Pragmatism
   • Clarify meanings of intellectual concepts by tracing out their “conceivable practical consequences” (Charles Peirce, 1905)
   • Do not insist upon antecedent phenomena, but upon consequent phenomena; not upon the precedents but upon the possibilities of action (John Dewey, 1931)
16. Pragmatic Considerations
   • Not every belief that is “true” is to be acted upon
   • Not committed to single research method
   • Research occurs in social (and technological) context
   • Research builds up “group knowledge”
   C. Cherryholmes. Notes on Pragmatism and Scientific Realism. Educational Researcher, 1992;21(6):13-17.
17. The Qualitative Research Palette
   • Measuring applicability?
   • The outcome as a narrative
   • Multi-facetted validity
   Methods: case studies, ethnography, participant observation, grounded theory, phenomenology, narrative studies, participative inquiry, interviewing, document analysis, …
   C. B. Seaman. Qualitative methods in empirical studies of software engineering. IEEE TSE, 1999.
18. Part II: Zooming In
   Felienne Hermans, Martin Pinzger, Arie van Deursen. Supporting Professional Spreadsheet Users by Generating Leveled Dataflow Diagrams. Techn. Rep. TUD-SERG-2010-036, Delft University of Technology. Submitted.
19. Corporate Spreadsheets
   • Decision making
   • Financial reporting
   • Forecasting
   • Business data
20. Spreadsheet Research
   • Spreadsheet Risks Interest Groups
     – Managing & identifying spreadsheet errors
   • “End Users Shaping Effective Software” (2005…)
     – Spreadsheet corpus, testing, debugging, surveys
     – ICSE, TOSEM, TSE, Comp. Surveys, VL/HCC, CHI, …
     – Nebraska, CMU, Oregon State, Washington, …
   F. Hermans, M. Pinzger, and A. van Deursen. Automatically Extracting Class Diagrams from Spreadsheets. ECOOP 2010.
21. The case company (Robeco)
   • 130 billion Euro in “assets under management”
   • 1600+ employees
   • Excel #1 software system
   • 3 hours per day
   • On average > 5 years old
   • On average 13 users each
22. Objectives and Approach
   Objective:
   • Assist end-user programmers in spreadsheet comprehension
   Approach:
   • Collect information needs in interviews
   • Provide tool addressing key information needs
   • Evaluate tool strengths and weaknesses in concrete Robeco setting
23. Information Need Identification
   Interview 27 people:
   • Bring a typical spreadsheet
   • Maximize variance in knowledge, experience, departments
   Qualitative data collection:
   • Discover needs through open-ended questions
   • “Tell us about your spreadsheet”
24. Grounded Theory
   • Systematic procedure to discover theory from (qualitative) data
   • Theoretical sensitivity
   • Theoretical sampling
   • Theoretical coding: open coding, selective coding
   • Constant comparative method
   • Memoing
   S. Adolph, W. Hall, Ph. Kruchten. Using Grounded Theory to study the experience of software development. Emp. Sw. Eng., Jan. 2011.
   B. Glaser and J. Holton. Remodeling grounded theory. Forum Qualitative Res., 2004.
25. [Intermezzo: Eclipse Testing] http://the-eclipse-study.blogspot.com/
26. Result I: Transfer Scenarios
   • S1: Transfer to colleague (50%)
     – New colleague; employee leaves; additional users
   • S2: Check by auditor (25%)
     – Assess risks, design, documentation
   • S3: To IT department (25%)
     – Replace by custom software
     – Increased importance / complexity, multiple people, …
27. Result II: Information Needs
   • N1: How are worksheets related? (45%)
   • N2: Where do formulas refer to? (40%)
   • N3: What cells are meant for input? (20%)
   • N4: What cells are meant for output? (20%)
28. Observation: the top information needs relate to the “flow of data” through the spreadsheet.
   Research question: how can we leverage dataflow diagrams to address these information needs?
29. Extraction pipeline: Cell Classification → Data Block Identification → Name Resolution → Graph Construction → Grouping & Replacement. (A code sketch of these steps follows below.)
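
To make the five pipeline steps concrete, here is a minimal sketch of their shape in Python. All names (Cell, classify_cell, find_data_blocks, resolve_names, build_graph, group_and_replace) and the simplifications (heuristic cell classification, one data block per worksheet) are illustrative assumptions, not the actual GyroSAT implementation described in the Hermans et al. report.

```python
# Hypothetical sketch of the five steps on slide 29; names and heuristics
# are illustrative assumptions, not the real GyroSAT code.
import re
from dataclasses import dataclass

@dataclass
class Cell:
    sheet: str
    address: str   # e.g. "B7"
    content: str   # "=SUM(B2:B6)" for formulas, plain text or a number otherwise

def classify_cell(cell: Cell) -> str:
    """Step 1, cell classification: label, data, or formula (crude heuristic)."""
    if cell.content.startswith("="):
        return "formula"
    return "data" if cell.content.replace(".", "", 1).isdigit() else "label"

def find_data_blocks(cells: list[Cell]) -> dict[str, list[Cell]]:
    """Step 2, data block identification. Real blocks are rectangular regions
    of connected cells; here, crudely, one block per worksheet."""
    blocks: dict[str, list[Cell]] = {}
    for c in cells:
        blocks.setdefault(c.sheet, []).append(c)
    return blocks

def resolve_names(blocks: dict[str, list[Cell]]) -> dict[str, str]:
    """Step 3, name resolution: pick a readable name per block
    (first label cell, falling back to the sheet name)."""
    names = {}
    for sheet, block in blocks.items():
        labels = [c.content for c in block if classify_cell(c) == "label"]
        names[sheet] = labels[0] if labels else sheet
    return names

REF = re.compile(r"(?:(\w+)!)?([A-Z]+\d+)")   # matches A1 or Sheet!A1 references

def build_graph(cells: list[Cell]) -> set[tuple[str, str]]:
    """Step 4, graph construction: one edge per formula reference, pointing
    from the referenced block to the block holding the formula."""
    edges = set()
    for c in cells:
        if classify_cell(c) == "formula":
            for sheet, _addr in REF.findall(c.content):
                edges.add((sheet or c.sheet, c.sheet))
    return edges

def group_and_replace(edges, names):
    """Step 5, grouping & replacement: collapse cell-level edges into edges
    between named blocks and drop self-loops."""
    return {(names[s], names[t]) for s, t in edges if s != t}

if __name__ == "__main__":
    cells = [
        Cell("Input", "A1", "Hours"), Cell("Input", "B1", "40"),
        Cell("Report", "A1", "Total"), Cell("Report", "B1", "=Input!B1*12"),
    ]
    blocks = find_data_blocks(cells)
    names = resolve_names(blocks)
    print(group_and_replace(build_graph(cells), names))   # {('Hours', 'Total')}
```

Running the example prints {('Hours', 'Total')}: the block named after the "Total" label depends on the block named after the "Hours" label, which is the kind of leveled, named dataflow view the slides describe.
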
30. Directed Graph Markup Language (DGML)
   • Visual Studio 2010 DGML graph browser
   • Zooming
   • Collapsing / expanding levels
   • Grouping of multiple arrows
   • Butterfly mode / slicing
   • Graph editing (deletion, coloring, leveling)
   Create prototype (GyroSAT) aimed at collecting initial user feedback. (A DGML export sketch follows below.)
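
The DGML read by the Visual Studio 2010 graph browser is plain XML built from Node and Link elements; group nodes and containment links give the collapsible levels listed on the slide. The sketch below continues the pipeline sketch above and writes such a file. The to_dgml function and its parameters are hypothetical; the element and attribute names follow standard DGML, but how GyroSAT actually generates its files is not shown in the slides.

```python
# Hypothetical sketch: write the block-level graph as a .dgml file for the
# Visual Studio 2010 graph browser. The export function is an assumption.
import xml.etree.ElementTree as ET

DGML_NS = "http://schemas.microsoft.com/vs/2009/dgml"

def to_dgml(edges: set[tuple[str, str]], groups: dict[str, str], path: str) -> None:
    """edges: (source, target) pairs of block names; groups: block -> worksheet."""
    root = ET.Element("DirectedGraph", xmlns=DGML_NS)
    nodes = ET.SubElement(root, "Nodes")
    links = ET.SubElement(root, "Links")
    seen: set[str] = set()

    def add_node(name: str, **attrs: str) -> None:
        if name not in seen:
            ET.SubElement(nodes, "Node", Id=name, Label=name, **attrs)
            seen.add(name)

    # One expandable group per worksheet, so levels can be collapsed/expanded.
    for sheet in set(groups.values()):
        add_node(sheet, Group="Expanded")
    # Nest each block inside its worksheet group via a containment link.
    for block, sheet in groups.items():
        add_node(block)
        ET.SubElement(links, "Link", Source=sheet, Target=block, Category="Contains")
    # Ordinary dataflow edges between blocks.
    for src, tgt in sorted(edges):
        add_node(src)
        add_node(tgt)
        ET.SubElement(links, "Link", Source=src, Target=tgt)

    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

# Example, reusing the pipeline sketch above:
# to_dgml({("Hours", "Total")}, {"Hours": "Input", "Total": "Report"}, "flow.dgml")
```

Opening the resulting .dgml file in the Visual Studio graph browser then provides the zooming, level collapsing/expanding, and butterfly/slicing views mentioned on the slide.
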
31. Example Grading Sheet
32.–35. (image-only slides)
36. Neighborhood Browse Mode
37. Evaluation
   • Is this actually useful for spreadsheet professionals?
   • Evaluation I: interviews
   • Evaluation II: actual transfer tasks
   McGrath: “Credible empirical knowledge requires consistency or convergence of evidence across studies based on different methods.”
38. 27 Interviews
39. (image-only slide)
40. “They really outperform the Excel Audit Toolbar, because they show the entire view of the sheet.”
41. “As analysts we are used to thinking in processes and therefore this kind of diagrams is very natural to us.”
42. “This diagram is very complex, I’m not sure it can help me.”
43. “I would prefer to have the visualization inside Excel.”
44. Case Studies: Experimental Design
   • Monitor 9 actual transfer tasks:
     – 3 for each category
   • Each task involved:
     – Two participants: expert to receiver
     – ~1 hour
     – Laptop with GyroSAT; desktop with Excel
     – Participant observation & reflective questions
   • Only helped if participants got stuck
45. Spreadsheets Involved
46. Spreadsheets Involved
47. Case Studies S1a,b,c: Transfer to Colleague
   • All: Surprised by (visualized) complexity of own spreadsheets
   • S1a: Visualization gives expert a story line
   • S1c: Visualization helps receiver to understand overview
   • S1a: Missed support for VBA
48. Case Studies S2a,b,c: Audit
   • S2a: Picture leads me straight to the fishy parts
   • S2b: More helpful than old approach (click all cells)
   • S2b: Helps to spot errors on a new level
   • S2a/S2b: Missing connections with environment
49. Case Studies S3a,b,c: Transfer to IT Dept
   • All: Receivers understood experts much better with the use of dataflow diagrams
   • S3b: Top-level diagram basis for architecture
   • All: Storyline, zooming into details
   • All: Used node names from diagrams to explain Excel sheet
   • S3a: Multiple calculations in sheets not separated
50. Spreadsheet Visualization
   • Threats to validity:
     – Tradeoff realism versus repeatability
     – Robeco spreadsheets only
     – Non-random group of participants
   • Simple but effective:
     – Helps to tell the spreadsheet story
     – Works for complex, realistic spreadsheets
     – VBA + Excel integration high on wish list
51. Part III: Zooming Out
52. Methodological Pride!
   • What are our knowledge claims?
   • What are the corresponding research methods?
   Mariam Sensalire, Patrick Ogao, and Alexandru Telea. Evaluation of Software Visualization Tools: Lessons Learned. In Proc. VISSOFT. IEEE, 2009.
53. Design Science
   Type              Conditions of practice  Control of context  Example     Users        Goals
   Illustration      no                      yes                 small       designer     illustration
   Opinion           imagined                yes                 any         stakeholder  support
   Lab demo          no                      yes                 realistic   designer     knowledge
   Lab experiment    no                      yes                 artificial  subjects     knowledge
   Benchmark         no                      yes                 standard    designer     knowledge
   Field trial       yes                     yes                 realistic   designer     knowledge
   Field experiment  yes                     yes                 realistic   stakeholder  knowledge
   Action case       yes                     no                  real        designer     knowledge and change
   Pilot project     yes                     no                  realistic   stakeholder  knowledge and change
   Case study        yes                     no                  real        stakeholder  knowledge and change
   Roel Wieringa. Design Science Methodology. Presentation for Deutsche Telekom, 2010 (also ICSE, RE tutorial).
54. A Priori Engagement with Users
   • Understand existing way of working
   • Identify problems
   • Embed solutions
55. Software = Peopleware
   Evaluations are:
   • qualitative
   • incomplete
   • subjective
   Evidence must:
   • grow
   • and be criticized
56. Visualization = Communication
   • Beyond individual comprehension
   • Evaluate team interaction & collaboration
57. End-User Programming
58. Start a Company!
59. (image-only slide)
