Effective PRFS:
principles and practice
Paul Hubbard
www.metodika.reformy-msmt.cz
Introducing myself…
• Head of research policy at HEFCE, 2003 to 2013: advising on the design of
assessment systems and their use in allocating “block grant” funding to
universities in England
• Assisted in the design of RAE 1992; RAE Manager 1996; led HEFCE input to a
major review of the UK assessment approach leading up to REF 2014 and
chaired the REF steering committee
Effective PRFS: principles and
practice
• Why should we assess quality, and why link performance to funding?
• Principles: what makes a strong assessment system? and choosing our
assessment approach
• Practice: disciplines and research units
• Practice: information and indicators
• Practice: management and communication
• Assessment and funding
• A few thoughts to conclude
Why assess quality?
By assessing quality we can “look back to look forward”, and make plans
based on a good understanding of where we are starting from
Effective quality assessment can:
•tell us how good a national system, institution or research group is -
providing public information and accountability for funding
•identify strengths and weaknesses
•encourage and inform action to raise standards
Why link performance to
funding?
Assessment linked to funding does all of these and in addition it provides:
•a much stronger incentive to push up standards, and strengthens the
hand of research leaders and managers
•better targeting of funding following policy aims
•stronger accountability for money spent
What makes a strong assessment
system?
• Objectivity and transparency: seen to be fair and well informed
• Consistency: all disciplines are assessed against similar criteria and datasets,
and held to common standards of excellence
• Fitness for purpose: the system is well designed to achieve our policy goals
• Fit to our national system and culture: there is no one size that fits all and
the best system for one country may not be the best for another
• Timeliness and repetition: evaluation updated periodically, and learning
from experience to refine our approach
• Credibility: The funders, the researchers and others accept that the system is
producing good outcomes
Choosing our assessment and
funding approach (1)
What do we want to know and what are we trying to achieve?
•A clear overview of strengths and weaknesses will help in deciding policy for
improvement and possibly for targeted funding
•Identifying areas of strength, and paying particular attention to identifying
the societal impacts of the research submitted for assessment, can help to
ensure that the research base is fully supporting the national economy
Choosing our assessment and
funding approach (2)
The main available assessment approaches are:
•Self evaluation
•Assessment using statistics (“indicators” or “metrics”)
•Peer review
Self evaluation
• Enables researchers and units to ensure that their strengths are seen and
appreciated in full
• Encourages reflection and strategy-building in research units
• But if used as a major element in assessment or funding, it raises inevitable
questions of credibility and reliability
Assessment using statistics
• Is objective and based on facts, not easily open to manipulation or bias
• Is relatively simple and cheap to operate and can be repeated at will
• But works better for some disciplines than others and better for larger
units or groups than for smaller ones
• And quantitative information cannot tell us everything about a research
unit, and the chosen statistical approach and interpretation of the data
will be hotly contested.
Peer review
• A well established method, largely accepted by the research community
• Peer judgments are only as good as the information on which they are
based and the expertise of the reviewers. Good system design, and
validation against international standards of excellence, are essential to
success.
• Securing a strong input from international assessors helps to ensure that
the outcomes are validated against international standards and to keep
possible conflicts of interest to a minimum
So - choosing our assessment
approach
• The best approach to producing well rounded quality judgements across
all disciplines, and at the level of individual institutions or units, is peer
review informed by carefully selected and interpreted contextual and
statistical information. This must include giving research organisations
and units the opportunity to say what they think are their significant
strengths.
• An assessment approach combining clear, broad criteria for excellence
with ratings against several assessment criteria, rather than only a single
“score”, can best capture the full diversity of research methods and
outcomes across the academic spectrum.
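The idea of rating against several criteria rather than a single score can be pictured as a quality “profile”. A minimal sketch in Python, where the criteria names, weights and star levels are invented for illustration and do not come from any particular national system:

```python
# Illustrative sketch (hypothetical criteria, weights and data): an
# assessment outcome expressed as a profile across several criteria
# rather than a single collapsed score.

def overall_profile(ratings: dict[str, dict[int, float]],
                    weights: dict[str, float]) -> dict[int, float]:
    """Combine per-criterion star profiles (share of work at each star
    level, 1-4) into a weighted overall profile."""
    overall = {star: 0.0 for star in (1, 2, 3, 4)}
    for criterion, profile in ratings.items():
        w = weights[criterion]
        for star, share in profile.items():
            overall[star] += w * share
    return overall

# Invented unit: share of work at each star level, per criterion
unit = {
    "outputs":         {4: 0.30, 3: 0.45, 2: 0.20, 1: 0.05},
    "environment":     {4: 0.20, 3: 0.50, 2: 0.25, 1: 0.05},
    "societal_impact": {4: 0.40, 3: 0.40, 2: 0.15, 1: 0.05},
}
weights = {"outputs": 0.6, "environment": 0.15, "societal_impact": 0.25}

print(overall_profile(unit, weights))
# → {1: 0.05, 2: 0.195, 3: 0.445, 4: 0.31}
```

The per-criterion profiles can be published alongside the weighted overall profile, so that diversity of strengths is visible rather than flattened into one number.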
Practice: disciplines and
research units (1)
• How to split the work between discipline units, and how to define a
research unit on the ground, are key decisions
• We need to ensure that the evaluation is conducted at the right level of
detail to recognise differing approaches across disciplines and also to
inform policy decisions
Practice: disciplines and research
units (2)
• Panels should be seen to assess research groups and units, not individuals
• Information about a specific, recognisable field or sub-field is more useful to managers
and planners than broader judgments
• And we need to define fields to reflect practice on the ground - how are
researchers grouped and managed within our institutions?
• A panel of workable size must not cover so wide a field that its judgments
lose credibility
• The size of the system is relevant - panel workload should be manageable.
• Grouping field panels into broad discipline groups can be helpful in ensuring common
standards
• The system must be seen to deal well with interdisciplinary research
Practice: information and
indicators
• We must collect enough information to give panels the full picture
• At the same time we must not swamp panels with information that they
do not need or struggle to understand
• It is good to have a mixture of input and output data. Input data tell us
what resources a unit has to work with; output data tell us what it has
shown it can do with those resources. Both are good indicators of its
capacity and potential to produce excellent research in the future.
• Quantitative indicators should be used to inform academic judgement:
they should not simply feed through directly into the quality ratings
Indicators
Ideally we need a basket of indicators that will help the assessment panel
to understand:
•what resources a unit has (showing its capacity and the commitment of its
funders)
•what external recognition (funding and other) its achievements have won
•the full range and impact of its achievements
There can be a place for indicators of productivity, especially if this is a
policy concern, though volume of work alone may not be an indicator of
quality
Indicators: citation indices
(bibliometrics)
Citation indices are a strong indicator of research quality:
•in disciplines where publication in journals/ proceedings is the
established norm
•in relation to reasonably substantial bodies of work and within fields of
closely related activity
•where the indices are used with a clear understanding of context
(comparisons are strongest within a subfield where citation behaviour
does not vary greatly)
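One common way to respect these caveats is to normalise citation counts against a baseline for each output's field and year before comparing units. A sketch with invented baseline figures (real exercises would draw baselines from a bibliometric database):

```python
# Sketch of field-normalized citation impact (hypothetical data).
# Each output's citation count is divided by the average for its field
# and publication year, so scores become comparable across subfields
# with very different citation behaviour. Baselines here are invented.

FIELD_BASELINE = {  # assumed average citations per paper, by (field, year)
    ("condensed_matter", 2012): 14.0,
    ("pure_maths", 2012): 3.5,
}

def normalized_impact(outputs):
    """Mean of citations / field-year baseline over a unit's outputs."""
    scores = [c / FIELD_BASELINE[(field, year)] for field, year, c in outputs]
    return sum(scores) / len(scores)

unit_outputs = [
    ("condensed_matter", 2012, 28),  # twice its field average
    ("pure_maths", 2012, 7),         # also twice its field average
]
print(normalized_impact(unit_outputs))  # → 2.0
```

Raw counts would rank the physics paper far above the mathematics paper; normalisation shows both sitting at twice their field norm.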
Indicators: citation indices
(bibliometrics)
Citation indices can also be valuable to assist peer review, including at the
level of individual outputs, where the assessors understand the context
and the publication and citation behaviours in a discipline.
Information: best outputs
Asking units to identify their best outputs for review gives us a good
picture of what they can do: the strength, depth and breadth of their work
•It keeps the reviewers’ workload to a manageable level
•It focusses attention on the truly excellent work
•It is important to recognise the diversity of forms of research output within
and between disciplines
Practice: management and
communication (1)
Assessment and funding systems must be transparent and seen to be fair.
It is helpful if:
•The assessment process and criteria for excellence are published in some
detail in advance
•Research organisations feel that they have been allowed to demonstrate
their strengths fully
•Evaluation panels are well supported by a secretariat and work together to
ensure common standards
•Panels give summative and developmental feedback identifying both
strengths and weaknesses
Practice: management and
communication (2)
We should also do what we can to discourage over-preparation and
“games playing” by submitting research units. We should aim to build
confidence that the assessment system will find excellence wherever it
exists.
Assessment and funding (1)
It is often best to conduct the assessment before finally deciding funding
parameters:
•this reduces the pressure on the assessment panels
•it reduces the scope for games playing in submissions
•it makes it possible to tailor funding arrangements to the pattern of
excellence that has been found
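Tailoring funding to the pattern of excellence can mean, for example, sharing a budget in proportion to quality-weighted research volume. A sketch with hypothetical units and weights; the steep weighting (funding only the top ratings) is an illustrative policy assumption, not a recommendation:

```python
# Illustrative funding allocation (hypothetical weights and units):
# once quality profiles and research volumes are known, the budget is
# shared in proportion to each unit's quality-weighted volume.

QUALITY_WEIGHTS = {4: 4.0, 3: 1.0, 2: 0.0, 1: 0.0}  # set after assessment

def allocate(budget: float, units: dict) -> dict:
    """units maps name -> (staff_volume, {star: share_of_work})."""
    weighted = {
        name: volume * sum(QUALITY_WEIGHTS[s] * share
                           for s, share in profile.items())
        for name, (volume, profile) in units.items()
    }
    total = sum(weighted.values())
    return {name: budget * w / total for name, w in weighted.items()}

units = {
    "unit_a": (40, {4: 0.30, 3: 0.50, 2: 0.20, 1: 0.0}),
    "unit_b": (25, {4: 0.10, 3: 0.40, 2: 0.50, 1: 0.0}),
}
print(allocate(1_000_000, units))
```

Because the weights are chosen after the assessment, they can be tuned to the pattern of excellence actually found - which is one reason for settling funding parameters last.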
Assessment and funding (2)
• It is highly desirable to have a clear view as to what volumes of research
activity are being assessed, and funded, in individual research units and
organisations.
• This requires some form of count of active researchers. A “feedback
loop” can be created between funding and assessment to discourage
over-counting.
• More direct measures of output volume are not available reliably across
all disciplines.
A few thoughts to conclude
• PRFS is a strong tool for delivering improved quality in the research base in keeping
with policy aims and supporting research managers
• Periodic repetition of assessment allows the approach to be tuned to national needs
and circumstances. There is no single best approach for all research systems.
• If the assessment approach appears complex and burdensome for submitting units,
we might ask whether they are doing more to prepare than is really necessary
• A well designed system will not destabilise strong research units. The pace of any
change will largely be determined by funding decisions.
• Transparency and good communication with researchers at all stages are essential
• There will be criticism and some measure of “games playing”. If the system is well
designed and managed this does not invalidate the results: indeed, some measure of
continuing debate is helpful.
Thank you for your
attention!
www.metodika.reformy-msmt.cz
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxheathfieldcps1
 

Recently uploaded (20)

Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 
Staff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSDStaff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSD
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 

Effective PRFS: principles and practice

Effective PRFS: principles and practice
Paul Hubbard
www.metodika.reformy-msmt.cz

Introducing myself…
• Head of research policy at HEFCE, 2003 to 2013: advising on the design of assessment systems and their use in allocating “block grant” funding to universities in England
• Assisted in the design of RAE 1992; RAE Manager 1996; led HEFCE input to a major review of the UK assessment approach leading up to REF 2014 and chaired the RAE steering committee

Effective PRFS: principles and practice
• Why should we assess quality, and why link performance to funding?
• Principles: what makes a strong assessment system, and choosing our assessment approach
• Practice: disciplines and research units
• Practice: information and indicators
• Practice: management and communication
• Assessment and funding
• A few thoughts to conclude

Why assess quality?
By assessing quality we can “look back to look forward”, and make plans based on a good understanding of where we are starting from.
Effective quality assessment can:
• tell us how good a national system, institution or research group is, providing public information and accountability for funding
• identify strengths and weaknesses
• encourage and inform action to raise standards

Why link performance to funding?
Assessment linked to funding does all of these, and in addition it provides:
• a much stronger incentive to push up standards, strengthening the hand of research leaders and managers
• better targeting of funding to follow policy aims
• stronger accountability for money spent

What makes a strong assessment system?
• Objectivity and transparency: seen to be fair and well informed
• Consistency: all disciplines are assessed on similar criteria and datasets, and against common standards of excellence
• Fitness for purpose: the system is well designed to achieve our policy goals
• Fit to our national system and culture: there is no one size that fits all, and the best system for one country may not be the best for another
• Timeliness and repetition: evaluation is updated periodically, and we learn from experience to refine our approach
• Credibility: the funders, the researchers and others accept that the system is producing good outcomes
Choosing our assessment and funding approach (1)
What do we want to know, and what are we trying to achieve?
• A clear overview of strengths and weaknesses will help in deciding policy for improvement, and possibly for targeted funding
• Identifying areas of strength, and paying particular attention to the societal impacts of the research submitted for assessment, can help to ensure that the research base fully supports the national economy

Choosing our assessment and funding approach (2)
The main available assessment approaches are:
• self evaluation
• assessment using statistics (“indicators” or “metrics”)
• peer review

Self evaluation
• Enables researchers and units to ensure that their strengths are seen and appreciated in full
• Encourages reflection and strategy-building in research units
• But if used as a major element in assessment or funding, it raises inevitable questions of credibility and reliability

Assessment using statistics
• Is objective and based on facts, and not easily open to manipulation or bias
• Is relatively simple and cheap to operate, and can be repeated at will
• But works better for some disciplines than others, and better for larger units or groups than for smaller ones
• And quantitative information cannot tell us everything about a research unit; the chosen statistical approach and the interpretation of the data will be hotly contested

Peer review
• A well established method, largely accepted by the research community
• Peer judgments are only as good as the information on which they are based and the expertise of the reviewers; good system design, and validation against international standards of excellence, are essential to success
• Securing a strong input from international assessors helps by validating the outcomes against international standards and by keeping possible conflicts of interest to a minimum

So - choosing our assessment approach
• The best approach to producing well rounded quality judgements across all disciplines, and at the level of individual institutions or units, is peer review informed by carefully selected and interpreted contextual and statistical information. This must include giving research organisations and units the opportunity to say what they think are their significant strengths.
• An assessment approach combining clear, broad criteria for excellence with ratings against several assessment criteria, rather than only a single “score”, can best capture the full diversity of research methods and outcomes across the academic spectrum.
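As a hedged illustration of rating against several criteria rather than a single score, the sketch below combines per-criterion quality profiles into one overall profile. The criterion names, weights and star levels are illustrative assumptions (loosely modelled on REF-style profiles), not a description of any particular national system.

```python
# Sketch: combining ratings on several assessment criteria into an overall
# quality profile, rather than reducing a unit to a single score.
# Criteria names, weights and the 4*..1* levels are illustrative assumptions.

CRITERIA_WEIGHTS = {"outputs": 0.6, "impact": 0.2, "environment": 0.2}

def overall_profile(sub_profiles):
    """Each sub-profile gives the % of activity judged at each quality
    level (4* best ... 1*); the overall profile is the weighted mix."""
    levels = ["4*", "3*", "2*", "1*"]
    return {
        level: round(sum(CRITERIA_WEIGHTS[c] * profile[level]
                         for c, profile in sub_profiles.items()), 1)
        for level in levels
    }

# Hypothetical unit: strong impact, solid outputs
unit = {
    "outputs":     {"4*": 30, "3*": 40, "2*": 25, "1*": 5},
    "impact":      {"4*": 50, "3*": 30, "2*": 20, "1*": 0},
    "environment": {"4*": 40, "3*": 40, "2*": 20, "1*": 0},
}
print(overall_profile(unit))  # {'4*': 36.0, '3*': 38.0, '2*': 23.0, '1*': 3.0}
```

Publishing the profile, rather than a single collapsed score, preserves the diversity the slide describes: two units with the same average can have very different shares of truly excellent work.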
Practice: disciplines and research units (1)
• How to split the work between discipline units, and how to define a research unit on the ground, are key decisions
• We need to ensure that the evaluation is conducted at the right level of detail, both to recognise differing approaches across disciplines and to inform policy decisions

Practice: disciplines and research units (2)
• Panels should be seen to assess research groups and units, not individuals
• Information about a specific, recognisable field or sub-field is more useful to managers and planners than broader judgments
• We need to define discipline fields to reflect practice on the ground: how are researchers grouped and managed within our institutions?
• It is important that an assessment panel of workable size does not cover too wide a field to be credible
• The size of the system is relevant: panel workload should be manageable
• Grouping field panels into broad discipline groups can help to ensure common standards
• The system must be seen to deal well with interdisciplinary research

Practice: information and indicators
• We must collect enough information to give panels the full picture
• At the same time, we must not swamp panels with information that they do not need or struggle to understand
• It is good to have a mixture of input and output data. Input data tell us what resources a unit has to work with; output data tell us what it has shown it can do with those resources. Both are good indicators of its capacity and potential to produce excellent research in the future.
• Quantitative indicators should be used to inform academic judgement: they should not simply feed through directly into the quality ratings

Indicators
Ideally we need a basket of indicators that will help the assessment panel to understand:
• what resources a unit has (showing its capacity and the commitment of its funders)
• what external recognition (funding and other) its achievements have won
• the full range and impact of its achievements
There can be a place for indicators of productivity, especially if this is a policy concern, though volume of work alone may not be an indicator of quality.
Indicators: citation indices (bibliometrics)
Citation indices are a strong indicator of research quality:
• in disciplines where publication in journals/proceedings is the established norm
• in relation to reasonably substantial bodies of work, and within fields of closely related activity
• where the indices are used with a clear understanding of context (comparisons within a subfield in which citation behaviour does not vary greatly are strong)

Citation indices can also be valuable in assisting peer review, including at the level of individual outputs, where the assessors understand the context and the publication and citation behaviours in a discipline.
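A minimal sketch of the context-aware use described above: normalizing raw citation counts against a subfield-and-year baseline, so that comparisons are only made within comparable citation cultures. The field names and baseline figures are invented for illustration.

```python
# Sketch: field-normalized citation impact. A paper's raw citation count
# is divided by the average for its subfield and publication year, so a
# mathematics paper is never compared against cell-biology citation norms.
# Baseline figures below are illustrative assumptions, not real data.

FIELD_BASELINES = {             # mean citations per paper, same field & year
    ("cell biology", 2015): 24.0,
    ("pure mathematics", 2015): 3.5,
}

def normalized_impact(citations, field, year):
    """Citations relative to the subfield/year average (1.0 = average)."""
    return citations / FIELD_BASELINES[(field, year)]

def unit_indicator(papers):
    """Average the per-paper normalized scores over a reasonably
    substantial body of work; unreliable for very small sets."""
    scores = [normalized_impact(c, f, y) for c, f, y in papers]
    return sum(scores) / len(scores)

# 12 citations is below average in cell biology but strong in mathematics
print(normalized_impact(12, "cell biology", 2015))      # 0.5
print(normalized_impact(12, "pure mathematics", 2015))  # ~3.43
```

This is why the slides stress substantial bodies of work and closely related fields: the baseline is only meaningful when the comparison set shares a citation culture, and averages over a handful of papers are dominated by outliers.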
Information: best outputs
Asking units to identify their best outputs for review gives us a good picture of what they can do: the strength, depth and breadth of their work.
• It keeps the reviewers’ workload to a manageable level
• It focusses attention on the truly excellent work
• It is important to recognise the diversity of forms of research output within and between disciplines

Practice: management and communication (1)
Assessment and funding systems must be transparent and seen to be fair. It is helpful if:
• the assessment process and criteria for excellence are published in some detail in advance
• research organisations feel that they have been allowed to demonstrate their strengths fully
• evaluation panels are well supported by a secretariat and work together to ensure common standards
• panels give summative and developmental feedback identifying both strengths and weaknesses

Practice: management and communication (2)
We should also do what we can to discourage over-preparation and “games playing” by submitting research units. We should aim to build confidence that the assessment system will find excellence wherever it exists.

Assessment and funding (1)
It is often best to conduct the assessment before finally deciding funding parameters:
• this reduces the pressure on the assessment panels
• it reduces the scope for games playing in submissions
• it makes it possible to tailor funding arrangements to the pattern of excellence that has been found
Assessment and funding (2)
• It is highly desirable to have a clear view of what volumes of research activity are being assessed, and funded, in individual research units and organisations
• This requires some form of count of active researchers. A “feedback loop” can be created between funding and assessment to discourage over-counting.
• More direct measures of output volume are not reliably available across all disciplines
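One common way to couple the assessed quality profile to a researcher count is to allocate a fixed pot in proportion to quality-weighted volume. The sketch below is a hedged illustration: the quality weights, star levels, unit names and headcounts are all assumptions, not any actual funding formula.

```python
# Sketch: allocating a fixed funding pot in proportion to quality-weighted
# volume (active-researcher headcount x quality weight). Zero weights for
# lower levels concentrate funding on excellence. All figures illustrative.

QUALITY_WEIGHTS = {"4*": 4.0, "3*": 1.0, "2*": 0.0, "1*": 0.0}

def allocate(pot, units):
    """units: {name: (headcount, profile)}, where profile gives the % of
    activity at each quality level. Returns {name: funding}."""
    weighted = {
        name: headcount * sum(QUALITY_WEIGHTS[level] * pct / 100
                              for level, pct in profile.items())
        for name, (headcount, profile) in units.items()
    }
    total = sum(weighted.values())
    return {name: round(pot * w / total, 2) for name, w in weighted.items()}

units = {
    "Unit A": (20, {"4*": 50, "3*": 40, "2*": 10, "1*": 0}),  # small, excellent
    "Unit B": (40, {"4*": 10, "3*": 50, "2*": 30, "1*": 10}),  # large, mixed
}
print(allocate(1_000_000, units))
```

Because the headcount multiplies the quality weight, over-counting researchers dilutes the unit's profile in the next assessment: that is the "feedback loop" the slide mentions.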
A few thoughts to conclude
• A PRFS is a strong tool for delivering improved quality in the research base, in keeping with policy aims and supporting research managers
• Periodic repetition of assessment allows the approach to be tuned to national needs and circumstances. There is no single best approach for all research systems.
• If the assessment approach appears complex and burdensome for submitting units, we might ask whether they are doing more to prepare than is really necessary
• A well designed system will not destabilise strong research units. The pace of any change will largely be determined by funding decisions.
• Transparency and good communication with researchers at all stages are essential
• There will be criticism and some measure of “games playing”. If the system is well designed and managed, this does not invalidate the results: indeed, some measure of continuing debate is helpful.
Thank you for your attention!
www.metodika.reformy-msmt.cz