Publish or Perish: Questioning the Impact of Our Research on the Software Developer

(Video for this talk can be found here: https://www.youtube.com/watch?time_continue=379&v=DvRdBb9TEUI)

Abstract: How often do we pause to consider how we, as a community, decide which developer problems we address, or how well we are doing at evaluating our solutions within real development contexts? Many of our research contributions in software engineering can be considered as purely technical. Yet somewhere, at some time, a software developer may be impacted by our research. In this talk, I invite the community to question the impact of our research on software developer productivity. To guide the discussion, I first paint a picture of the modern-day developer and the challenges they experience. I then present 4+1 views of software engineering research --- views that concern research context, method choice, research paradigms, theoretical knowledge and real-world impact. I demonstrate how these views can be used to design, communicate and distinguish individual studies, but also how they can be used to compose a critical perspective of our research at a community level. To conclude, I propose structural changes to our collective research and publishing activities --- changes to provoke a more expeditious consideration of the many challenges facing today's software developer.

(Thanks to Brynn Hawker for slide design and proposed new badges. brynn@hawker.me)

  1. Publish or Perish: Questioning the Impact of Our Research on the Software Developer. Margaret-Anne Storey, @margaretstorey. Special thanks: Jo Atlee, Brynn Hawker, Cassandra Petrachenko, my husband and kids.
  2. Lion Man
  3. Developers
  4. What do we make?
  5. Do we need developers? “How to Stay Relevant as a Software Developer in the Age of AI” – Matt Warcholinski
  6. (image-only slide)
  7. Developer Study: #1 Distracting work environment (41.8%), #2 Meetings (36.6%), #3 Non-development work (36.5%)
  8. Software Engineering Design Space: Context – Human / Social Aspects, Technical Aspects, Socio-Technical Aspects
  9. Productivity Paradox: Human / Social, Technical, Socio-Technical
  10. Joint Optimization – Code Review (Human / Social, Technical, Socio-Technical). CodeFlow: “CodeFlow: Improving the Code Review Process at Microsoft”, Czerwonka et al., 2018.
  11. (image-only slide)
  12. Software Engineering Design Space: Human / Social Aspects, Technical Aspects, Socio-Technical Aspects
  13. Software Engineering Research Space? Human / Social Aspects, Technical Aspects, Socio-Technical Aspects
  14. Background • Questioning Our Impact: Paradigms, Contributions, Methods • Improving Our Impact
  15. Research Collaborators: Per Runeson, Emelie Engström, Martin Höst, Elizabeth Bjarnason, and Teresa Baldassarre (Bari), Arie van Deursen (Delft), ...; Jacek Czerwonka, Brendan Murphy, Tom Zimmermann, Chris Bird, Kim Herzig; Laura MacLeod, Elena Voyloshnikova, Carly Lebeuf, Courtney Williams, Eirini Kalliamvakou, Neil Ernst, Daniel German, Alexey Zagalsky, the CHISEL Group, ...
  16. Background • Questioning Our Impact: Paradigms, Contributions, Methods • Improving Our Impact
  17. “A paradigm is a shared world view that represents the beliefs and values in a discipline and that guides how problems are solved.” – Schwandt, 2001
  18. Paradigms – Postpositivism: scientific method; evidence-based reality; theory verification and falsification; quantitative over qualitative
  19. Paradigms – Constructivism: reality is subjective and experiential; theory generation; biases are expected and made explicit; qualitative over quantitative
  20. Paradigms – Advocacy / Participatory: change oriented; collaborative; shaped by political and social lenses; qualitative and quantitative
  21. Paradigms: Postpositivism, Constructivism, Advocacy / Participatory – with example researchers Tim Menzies, Carolyn Seaman, and Margaret Burnett positioned within this space
  22. Activity: Postpositivism, Constructivism, Advocacy / Participatory. Go to: menti.com, enter code: 48 84 23
  23. Paradigms – Pragmatism: problem centered • real-world practice oriented. “I am the publish or perish, whatever works guy.” (alongside Postpositivism, Constructivism, Advocacy / Participatory)
  24. Background • Questioning Our Impact: Paradigms, Contributions, Methods • Improving Our Impact
  25. Types of Contributions (Empirical Research): Formal Sciences – philosophical and mathematical foundations; Design Sciences – medical treatments, engineering solutions; Explanatory Sciences – descriptive and predictive theories
  26. Lund University
  27. Design Science — Hevner (2007): the Relevance Cycle connects the Environment to Design Science, the Rigor Cycle connects Design Science to the Knowledge Base, and the Design Cycle iterates within Design Science itself
  28. Design Science — Our View: at the Theory level, Problem Constructs and Solution Constructs; at the Practice level, Problem Instance(s) and Solution Instance(s); connected by Problem Characterization, Analytical Validation, Instantiation or Abstraction, and Empirical Validation
  29. Design Science — Our View (Simplified): Problem Instance and Solution, linked by Problem Understanding, Requirements Validation, and Solution Evaluation, and generalized as a Technological Rule
  30. Technological Rules (a technological rule is a theory fragment). Example: to reduce errors in open source projects, use continuous integration. General form: to achieve an effect in a given context, use/do an intervention (the slide labels these slots with the variables 𝑥, 𝑦, 𝑧).
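To make the template concrete, here is a minimal LaTeX sketch of the rule schema and its instantiation from this slide. The mapping of 𝑥 to the intervention, 𝑦 to the effect, and 𝑧 to the context follows van Aken's (2004) formulation and is an assumption about the slide's notation:

    % Minimal sketch of the technological-rule template (after van Aken 2004).
    % Assumed variable roles: #1 = intervention (x), #2 = effect (y), #3 = context (z).
    \documentclass{article}
    \newcommand{\techrule}[3]{To achieve \emph{#2} in \emph{#3}, use/do \emph{#1}.}
    \begin{document}
    % General form: "To achieve effect y in context z, use/do intervention x."
    \techrule{$x$}{$y$}{$z$}

    % Instantiation from this slide:
    \techrule{continuous integration}{fewer errors}{open source projects}
    % Renders as: "To achieve fewer errors in open source projects,
    % use/do continuous integration."
    \end{document}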
  31. Evaluation Criteria, annotated on the simplified view: Problem Instance, Solution, Problem Understanding, Requirements Validation, Solution Evaluation
  32. Evaluation Criteria: 🅐 Relevance
  33. Evaluation Criteria: 🅐 Relevance, 🅑 Rigor
  34. Evaluation Criteria: 🅐 Relevance, 🅑 Rigor, 🅒 Novelty (of the Technological Rule)
  35. Design Science Visual Abstract Template: Technological Rule; Problem Instance and Solution; Problem Understanding, Requirements Validation, Solution Evaluation; criteria 🅐 Relevance, 🅑 Rigor, 🅒 Novelty
  36. Review of ICSE Distinguished Papers from 2014 to 2018: 38 papers
  37. Applying Visual Abstracts
  38. How were papers clustered? On axes of Theory (Problem Constructs, Design Constructs) and Practice (Problem Instance(s), Design Instance(s)): ⬤ Descriptive (8), ⬤ Problem Solution (7), ⬤ Solution Validation (7), ⬤ Solution Design (13), ⬤ Meta (3)
  39. Results for ICSE Distinguished Papers from 2014 to 2018 – Design Science Criteria: Rigor, Novelty, Relevance, graded from A+ and A down to F
  40. Consider Stakeholders – relevance to stakeholders? ⬤ Descriptive 5/8, ⬤ Problem Solution 2/7, ⬤ Solution Validation 0/7, ⬤ Solution Design 6/13; 13/35 overall
  41. Background • Questioning Our Impact: Paradigms, Contributions, Methods • Improving Our Impact
  42. Socio-Technical Research Framework – Empirical: Field, Data, Respondent, Lab; Non-Empirical: Meta, Formal Theory
  43. Research Methods and Tradeoffs: Field (experiments, studies), Data (in-silico, retrospective), Respondent (surveys, interviews), Lab (experiments, studies); tradeoffs among Realism, Generalizability, Control (human actors), and Precision (data measurements)
  44. The Methods We Chose: Lebeuf, Voyloshnikova, Herzig & Storey, “Understanding, Debugging, and Optimizing Distributed Software Builds: A Design Study”, ICSME 2018 – methods #1 and #2 plotted across Field, Data, Respondent, and Lab, trading off Realism and Control (human actors)
  45. The Methods We Chose: Gousios, Storey & Bacchelli, “Work Practices and Challenges in Pull-Based Development: The Contributor’s Perspective”, ICSE 2016 – methods #1 and #2 plotted across Field, Data, Respondent, and Lab, trading off Generalizability and Precision
  46. Categorizing ICSE Paper Research Methods: 253 Technical Track papers, 2015 to 2017
  47. Categorizing ICSE Paper Research Methods: Data 195, Field 36, Respondent 22, Lab 26, with Meta and Formal Theory accounting for the remaining 13 and 7
  48. Categorizing Research Methods: Data 195, Field 36, Respondent 22, Lab 26, plotted against Control (human actors), Precision (data measurements), Realism, and Generalizability
  49. 37 Data papers used triangulation: overlaps of 4, 8, 12, 6, and 7 papers (4 + 8 + 12 + 6 + 7 = 37) with the other method categories, shown against the base counts (Data 195, Field 36, Respondent 22, Lab 26, Meta and Formal Theory 13 and 7)
  50. Data-only papers: 158. By contribution type: Solution 136, Descriptive 22
  51. According to the authors: 110 of the 158 Data-only papers (which had no human research subjects) mention developers – 70%. And yet...
  52. Solution study implications: “Our results provide initial evidence that several assumptions made by automated debugging techniques do not hold in practice.” – Parnin & Orso, ISSTA 2011
  53. “You are smarter than your data. Data do not understand causes and effects; humans do.” – Pearl and Mackenzie, The Book of Why
  54. Questioning Our Impact – recap: Paradigms (Postpositivism, Constructivism, Advocacy / Participatory); Contributions (stakeholder relevance: 5/8, 2/7, 0/7, 6/13); Methods (Data 195, Field 36, Respondent 22, Lab 26)
  55. Background • Questioning Our Impact • Improving Our Impact: Paradigms, Methods, Contributions
  56. Creating Silos: INDUSTRY PROGRAM, SEIP, TECHNICAL TRACK
  57. Conference Structures: 300+ papers
  58. Assigning Reviewers
  59. ICSE Paper Reviewing Criteria – Current: Significance (novel and adds to existing knowledge), Soundness (rigor of appropriate research methods), Verifiability (supports independent verification or replication). Future: stakeholder involvement, scales to industry, triangulation of realism, generalizability, control of humans, audit trails, member checking, biases & reactivity
  60. Background • Questioning Our Impact • Improving Our Impact: Paradigms, Methods, Contributions
  61. Why these methods? “We also would have conducted a field experiment […], but we didn’t have subjects readily available.” “We took the standard approach that would typically be reported in a [topic] conference.”
  62. Diverse Collaboration ↔ Diverse Methods: across Field, Data, Respondent, and Lab, methods bring different strengths – actionable, study adopters and non-adopters, fast, quantitative, perceptive, human factors, hypotheticals, controlled, scalable, repeatable, unobtrusive
  63. The Power of Diverse Methods: Interviews / Observations, Surveys, and Telemetry (data and system)
  64. Background • Questioning Our Impact • Improving Our Impact: Paradigms, Methods, Contributions
  65. Badges
  66. Some New Badges: Technological Rule, Developer Tested, Industry Collaboration, Triangulation (badge designs by Brynn Hawker, @bnhwkr)
  67. Write less, think hard, imagine more. Margaret-Anne Storey, @margaretstorey
  68. Key references: Storey, Engström, Höst, Runeson, Bjarnason, “Using a visual abstract as a lens for communicating and promoting design science research in software engineering”, ESEM 2017, http://chisel.cs.uvic.ca/pubs/storey-ESEM2017.pdf • Engström, Storey, Runeson, Höst, Baldassarre, “A review of software engineering research from a design science perspective”, arXiv 2019, http://arxiv.org/abs/1904.12742 • Williams, Storey, Ernst, Zagalsky, Kalliamvakou, “Methodology Matters: How We Study Socio-Technical Aspects in Software Engineering”, arXiv 2019 (forthcoming). Special thanks to Brynn Hawker @bnhwkr for slide and graphic design!
  69. Bibliography:
      Zelkowitz & Wallace, “Experimental Models for Validating Technology,” 1998
      Shaw, “Writing good software engineering research papers,” 2003
      Vessey, Ramesh & Glass, “A unified classification system for research in the computing disciplines,” 2005
      Šmite, Wohlin, Gorschek & Feldt, “Empirical evidence in global software engineering: a systematic review,” 2010
      Wohlin & Aurum, “Towards a decision-making structure for selecting a research design in empirical software engineering,” 2015
      Stol, Ralph & Fitzgerald, “Grounded theory in software engineering research: A critical review and guidelines,” 2016
      Runeson & Höst, “Guidelines for conducting and reporting case study research in software engineering,” 2008
      Feldt & Magazinius, “Validity Threats in Empirical Software Engineering Research: An Initial Survey,” 2010
      Siegmund, Siegmund & Apel, “Views on internal and external validity in empirical software engineering,” 2015
      Wohlin et al., “Experimentation in software engineering,” 2012
      Sjøberg et al., “Building theories in software engineering,” 2008
      Stol & Fitzgerald, “Uncovering theories in software engineering,” 2013
      Ralph, “Possible core theories for software engineering,” 2013
      Shneiderman, “Twin-Win Model: A human-centered approach to research success,” 2018
      Easterbrook, Singer, Storey & Damian, “Selecting empirical methods for software engineering research,” 2008
      Shaw, “What makes good research in software engineering,” 2002
      Creswell, “A Concise Introduction to Mixed Methods Research,” 2014
      Hevner, “A three cycle view of design science research,” 2007
      Van Aken, “Management Research Based on the Paradigm of the Design Sciences: The Quest for Field-Tested and Grounded Technological Rules,” 2004
