Research industry panel review

A summary of all the presentations from the ICSE 2011 panel "What Industry Wants from Research", presented as an invited talk at the CSER meeting in Toronto, November 2011.

Presentation Transcript

• Highlights from an ICSE 2011 Panel: What Industry Wants from Research. Panel Organizers: Jorge Aranda, Daniela Damian, Marian Petre, Margaret-Anne Storey, Greg Wilson
• A conversation is started…
• Questions we asked… What is your perception of software engineering research? Conferences, journals, collaborations… What counts as evidence? How do you find out about results? What topics should software engineering research focus on?
• Perceptions of software research: focused on organizations that don’t compare to us; results may fail to scale; piecemeal improvements; irrelevant, dated, ignoring cutting-edge problems. http://tinyurl.com/icse2011panel
• “[I’m afraid] that industrial software engineers will think that I’m now doing academic software engineering and then not listen to me. (…) if I start talking to them and claim that I’m doing software engineering research, after they stop laughing, they’re gonna stop listening to me. Because it’s been so long since anything actually relevant to what practitioners do has come out of that environment, or at least the percentage of things that are useful that come out of that environment is so small.”
• What counts as evidence? Preference for quantitative data and statistical significance: “managers are coin operated in some sense. If you can’t quantify it in terms of time or in terms of money, it doesn’t make much difference to them. (…) I think there does need to be some notion of a numeric or at least an objective measure.” Association of qualitative data with ‘mere anecdote’. And yet… influential opinions are persuasive: “I trust the people I hire”
• Difficulty in applying our findings: resistance to change for small potential gains. “(…) it would depend in part of how cumbersome your techniques are; how much retraining I’m going to have to do on my staff. (…) I might decide that even if you’re legit and you actually do come up with 15%, that that’s not enough to justify it.”
• Dissemination of results: results are not getting out there! Suggestions: researchers should be attending practitioner conferences; need to distill research results for practitioners
• Research topics of interest: developer issues, tool issues, code issues, user issues, evaluation issues, management issues
• Reactions to our blog? http://tinyurl.com/icse2011panel
• Broadening this conversation
• Questions we posed to the panelists… What perception does industry have of academic research? What kind of evidence is compelling? Which research questions should be addressed? Most useful empirical finding? Success story? How to improve dissemination of research results? How to engage industry in research?
• Panel participants: David Weiss, Iowa State University; John Penix, Google; Lionel Briand, Simula Research Laboratory; Peri Tarr, IBM Thomas J. Watson Research Center; Tatsuhiro Nishioka, Corporate Software Engineering Center, Toshiba Corporation; Wolfram Schulte, Microsoft Research
• Panel Highlights: David Weiss, perspective from Avaya
• Panel Highlights: John Penix, Google
  – Striving for continuous improvement of useful tools: “We can’t improve what we can’t measure”; “Our goal: make the tools disappear from the workflow”
  – In terms of scale… 5000 devs in 40 offices, 2000 active projects, single monolithic code tree with mixed languages, 20+ code changes per minute, 50,000 builds per day, 50 million test cases run per day…
• Panel Highlights: John Penix (2)
  – Problems he cares about… How do developers work and collaborate? Team productivity vs. individual productivity
  – All MS students should know how to do a user study
  – Faculty advice: maintain contact with your students (good links to industry)
• Panel Highlights: Lionel Briand, Simula Research Labs
  – Perceptions from industry: disconnect, engage researchers in industry; scalability (heuristics versus exact methods); applicability (context factors, constraints), realistic conditions (human factors)
  – Fundamental research questions haven’t changed…
  – Relevant empirical findings? (Model-based) testing and empirical research at Microsoft; inspections
• Panel Highlights: Lionel Briand (2)
• Panel Highlights: Peri Tarr, IBM Research
  – So you want to marry an industrial? Success comes at a high price! Huge time commitment, opportunity cost; you become a development resource, for better and for worse; effort in developing trust, what you do matters to them, problems of today
  – Industry balances strategies and considers ROI across portfolios: Do you add risk? Evidence that customers can use it, want it, will pay for it?
• Panel Highlights: Peri Tarr (2)
• Panel Highlights: Peri Tarr (3): find someone with a forward-looking perspective; build strong, trusting, long-term partner relationships
• Panel Highlights: Tatsuhiro Nishioka, Toshiba
• Panel Highlights: Wolfram Schulte, Microsoft
• Panel Highlights: Wolfram Schulte (2)
• Discussion Points
  – Software development is a wicked problem… how to compete with snake-oil salesmen with our modest solutions?
  – How to deal with the kinds of evidence accepted… many kinds of important software dev problems are not amenable to controlled experimentation?
  – How to become a better storyteller? How to deal with the apparent disconnect?
  – How to do industrially relevant research without hurting one’s academic career?
• ICSE Panel Slides: http://catenary.wordpress.com/2011/06/08/icse-2011-panel-slides-and-recap/