Snowball Metrics aim to become the international standard that is endorsed by research-intensive universities so that they can build and monitor the most effective strategies. It is a “bottom-up”, or sector-led, initiative in which universities themselves agree a single method to calculate metrics about their own performance so that they can compare themselves against each other in an apples-to-apples way. This enables them to benchmark and understand their strengths and weaknesses to help to decide in which areas to invest, and in which to divest.
Snowball Metrics as Standard Information Agreements - Anna Clements and Peter Darroch
1. Snowball Metrics and CASRAI project
Global Standards for Institutional Benchmarking
Peter Darroch, Elsevier
Senior Product Manager Research Metrics
Co-Chair, CASRAI-Snowball
Anna Clements, University of St Andrews, UK
Board member, euroCRIS
Chair, CASRAI-UK Data Management Plans
Co-Chair, CASRAI-Snowball
Member, ORCiD Business Steering Group
2. Snowball Metrics addresses university-driven benchmarking
Snowball Metrics UK Project Partners
Universities need standard metrics to benchmark themselves: to know their position relative to peers, so that they can strategically align resources to their strengths and weaknesses
3. Snowball Metrics approach
Vision: Snowball Metrics enable benchmarking by driving quality and efficiency across higher education’s research and enterprise activities, regardless of system and supplier
• Bottom-up initiative: universities define and endorse metrics to
generate a strategic dashboard. The community is their guardian
• Draw on all data: university, commercial and public
• Ensure that the metrics are system- and tool-agnostic
• Build on existing definitions and standards where possible and sensible
4. Main roles and responsibilities
• Everyone covers their own costs
• Universities
– Agree the metrics to be endorsed as Snowball Metrics
– Determine methodologies to generate the metrics in a commonly
understood manner to enable benchmarking, regardless of systems
• Elsevier
– Ensures that the methodologies are feasible
– Distributes the outputs using global communications networks
– Provides day-to-day project management of the global program
• Outside the remit of the Snowball Metrics program
– Nature and quality of data sources used to generate Snowball Metrics
– Provision of tools to enable the generation and use of Snowball Metrics
5. Globalizing Snowball Metrics
US
• University of Michigan
• University of Minnesota
• Northwestern University
• University of Illinois at Urbana-Champaign
• Arizona State University
• MD Anderson Cancer Center
• Kansas State University
Australia / New Zealand
• University of Queensland
• University of Western Australia
• University of Auckland
• University of Wollongong
• University of Tasmania
• Massey University
• The University of Canberra
• Charles Darwin University
Interest and support from:
• Japan RU11 metrics group
• Association of Pacific Rim Universities (APRU)
• European Commission for H2020
• Fundação para a Ciência e a Tecnologia (FCT) in Portugal
6. The output of Snowball Metrics
www.snowballmetrics.com/metrics
“Recipes” – agreed and tested metric methodologies – are the output of Snowball Metrics
From the Statement of Intent:
• Agreed and tested methodologies… are and will continue to be shared free-of-charge
• None of the project partners will at any stage apply any charges for the methodologies
• Any organization can use these methodologies for their own purposes, public service or commercial
Statement of Intent available at http://www.snowballmetrics.com/wp-content/uploads/Snowball-Metrics-Letter-of-Intent.pdf
9. Snowball Metrics are feasible
Note: this pilot was built for the UK project partners, and is not available more widely.
10. Metrics can be size-normalized
11. Metrics can be “sliced and diced”
12. Trusted comparison of metrics on a robust standard (comparing apples to apples)
Methods (recipes) are not proprietary. They are agnostic to systems or suppliers – anyone can use them for their own purposes
Ability to choose and control with whom one shares/benchmarks (the crossroad/traffic-light model)
Ability to benchmark nationally and internationally
Benefits for universities
13. Key deliverables for 2015
• Recipe book – the final recipe book will be produced, aiming to complete the metrics matrix with the finished recipes for postgraduate education, which will then be shared with the community
• CASRAI profiles for all recipes
• The Snowball Metrics Exchange API will be completed
• euroCRIS: review/approval of Snowball CERIFication work
• Barcelona membership meeting: 9–11 November 2015
• Indicators & CERIF Task Groups
15. In more depth: Success Rates
• More competition for research funding
• Are we doing better, worse or about the same as our peer institutions?
• We want to compare apples to apples.
16. In more depth: Success Rates
• Agreement reached by the expert working group over 12 months
• What do we count?
– Record three states: success / pending / rejection
– Use the requested price rather than the awarded value: pragmatic, since almost all partners can provide this; recommend revisiting when there are better linkages between application and award systems
• When do we count it?
– Record against the year of award: rates could change retrospectively
• How do we cope with ‘no shows’?
– Agree a 12-month write-off across all funders
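The counting rules above can be sketched in code. This is a minimal illustration only, not the published recipe: the record fields are hypothetical, and interpreting the 12-month write-off as a 365-day cut-off is an assumption.

```python
from datetime import date

# Hypothetical application records; field names are illustrative only.
applications = [
    {"requested": 250_000, "status": "success",   "submitted": date(2013, 6, 1)},
    {"requested": 400_000, "status": "rejection", "submitted": date(2013, 9, 1)},
    {"requested": 150_000, "status": "pending",   "submitted": date(2013, 2, 1)},
]

def success_rate(apps, today):
    """Success rate by value, using requested price rather than awarded value.

    Pending applications older than 12 months are written off as
    rejections, per the agreed cross-funder write-off rule.
    """
    won = total = 0
    for app in apps:
        status = app["status"]
        if status == "pending":
            if (today - app["submitted"]).days > 365:
                status = "rejection"  # 12-month write-off ("no show")
            else:
                continue              # genuinely pending: not yet counted
        total += app["requested"]     # requested price, not awarded value
        if status == "success":
            won += app["requested"]
    return won / total if total else 0.0
```

With the sample data and `today = date(2014, 6, 1)`, the pending application is written off, giving a rate of 250,000 / 800,000 = 0.3125.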
18. CASRAI Snowball Working Group
AIMS:
• Extend community participation
• Codify metrics in CASRAI dictionary as an international standard
19. CASRAI Snowball Working Group
AIMS:
• Extend community participation
– Commercial: Thomson Reuters
– Research organisations: US, Canada & UK
– Funders: Wellcome Trust & MRC
– Standards / existing data collections: euroCRIS, HESA, STAR Metrics
20. CASRAI Snowball Working Group
AIMS:
• Codify metrics in CASRAI dictionary as an international exchange standard
– Terms, objects and fields
– Show “what’s under the hood” => adds value
– Wider review circle => Snowball SG
Deliverables
• Standard exchange agreement per Metric [24]
• Update/extension of CASRAI dictionary
• Agreed streamlined process for new Metrics
23. CASRAI Snowball Working Group
Snowball metric, Institution, time period, funder type, number
24. CASRAI Snowball Working Group
“Under the hood”
To determine: Snowball metric, Institution, time period, funder type, number
for:
Income Volume
To calculate, need to know institutional information, such as:
• Institutional financial year
• Funding awards – identification and classification
– HESA cost centre of Principal Investigator
• Funding organisations – identification and classification
– HESA funder type
• Funding awards – relevant dates
– Date spent
• Funding awards – relevant values
– Amount spent
25. CASRAI Snowball Working Group
“Under the hood”
To determine: Snowball metric, Institution, time period, funder type, number
for:
Income Volume
Awards Volume
To calculate, need to know institutional information, such as:
• Institutional financial year
• Funding awards – identification and classification
– HESA cost centre of Principal Investigator
– Supplementary award
• Funding organisations – identification and classification
– HESA funder type
• Funding awards – relevant dates
– Date spent
– Date entered into system
• Funding awards – relevant values
– Amount spent
– Amount awarded
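The exchanged tuple (Snowball metric, Institution, time period, funder type, number) maps naturally onto a simple record type. A minimal sketch, assuming hypothetical class and field names; the actual CASRAI profile defines its own terms, objects and fields:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SnowballMetricRecord:
    """One exchanged data point: metric, institution, time period, funder type, number."""
    metric: str        # e.g. "Income Volume" or "Awards Volume"
    institution: str   # the benchmarking partner reporting the value
    time_period: str   # e.g. an institutional financial year, "2014/15"
    funder_type: str   # HESA funder type classification
    number: float      # the metric value itself

record = SnowballMetricRecord(
    metric="Awards Volume",
    institution="University of St Andrews",
    time_period="2014/15",
    funder_type="UK research councils",
    number=12_500_000.0,
)
```

A frozen dataclass keeps each exchanged data point immutable, which suits a benchmarking exchange where records are shared, not edited.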
26. CASRAI Snowball Working Group
Progress update:
• Metrics to be published to the dictionary in draft form mid-November
• Feedback from review group / Snowball Experts WG
• Snowball Steering Group – late November
• Dissemination and engagement plan
– Webinar(s)
– Workshop
2016:
• Plus CERIF-XML
• Plus Snowball Metrics Exchange API
Streamlined process:
• Use a 3-month sprint model for new metrics
27. Peter Darroch
Elsevier, Senior Product Manager, Research Metrics
p.darroch@elsevier.com
Anna Clements
University of St Andrews, UK
akc@st-andrews.ac.uk
Twitter: @annakclements
Snowball Metrics http://www.snowballmetrics.com/