Social barriers at


Presentation of social considerations group at Harvard Attribution Workshop


  1. Social Considerations
     Liz Allen, Monica Bradford, Alexa McCray, Alyssa Goodman, Elizabeth Knoll, Cameron Neylon, Judith Singer, Hideaka Takeda, Anita de Waard, Simeon Warner
  2. Questions to the group:
     1. In a perfect world, how would you like to be evaluated for your work?
     2. Can you list a small number (2-5) of barriers in the real world that prevent you and others from evaluating your work in the ideal way?
     3. What general action could be taken to overcome a barrier you have identified?
     4. What specific action could you take, within the scope of your own work, to help overcome the barriers you have identified?
  3. 1. How we would like to be evaluated:
     1. People in our group want the diversity of their contributions (domain, media, types of output) recognized.
     2. They want to provide input on which aspects of their work are representative of their contribution.
     3. They want to be evaluated on their influence and wider impact.
     4. They want evaluations to be appropriate and fit for purpose.
     5. They want to be able to trace the chain of their influence.
     6. They would like to see teamwork and leadership evaluated and rewarded.
  4. 2. Challenges
     • Discrepancy between the aspects that are important and the things that can be measured
     • We need to start by agreeing on what it is that we value in people.
     • A first pass:
  5. Some valued characteristics
     • Creativity (of people)
     • Networking skills
     • Influence on peers in the same field
     • Influence on peers in other fields
     • Translational influence on policy, practice, and the public
     • Persuasiveness
     • Good taste in choosing research questions (exciting? novel?)
     • Knowledge of the subject area
     • Craftsmanship
     • Leadership
     • Generativity
     • Originality vs. curation
     • Perseverance
     • Productivity
  6. Some issues with using these:
     • Not all aspects can be quantified, or even identified (even by the person themselves).
     • Lack of clarity around the reasons for assessment and the appropriate metrics: a different set of values is needed for different roles, institutions, career stages, etc.
     • Predictive value is hard to establish: some of these are purely retrospective (verbal GREs/height!).
  7. 3. Actions
     • Flesh out this table and see which aspects can be tracked/measured.
     • Give everyone their ORCID iD and tell them to start using it: track all your activities, thoughts, blogs, etc. under this ID, and collect it.
     • Publishers: allow a space in the author details for this ID – add microattribution?
     • Use the ID in workflow and code-sharing systems.
     • To improve buy-in with e.g. provosts, build a few real exemplars showing that these values are an improvement: who did we miss? How could we have found them?
  8. 4. Open questions
     • Is the way people are assessed (small groups of senior, entitled ‘judges’ sitting in a room) still a valid method of evaluation?
     • Data point: the UK involved a ‘random public’ – with very positive responses.
     • Is it valid to evaluate people, rather than ‘work’, ‘group’, etc.? Is this person-focused evaluation the best way to support science?