Innovation in Education: Tools and methods for success (Session 1)
Innovation in Education
Tools and methods for success
Session 1: Concepts and Models
Joint Pearson and ELIG workshop

  • At Pearson, our ambition has always been to help people make progress in their lives through learning. We know how big a difference education can make to individual learners and whole economies across the world, so we want to make sure that the part we play in improving learner outcomes – alongside governments, institutions, educators, parents and pupils – has the biggest possible impact. Efficacy is our way of achieving this.
  • The Efficacy Framework is divided into four sections: Outcomes, to define what we are trying to achieve; Evidence, to make sure that our assessment of and ambitions for the product/service are grounded in real experience and results; Planning and implementation, to outline how we intend to achieve our goals; and Capacity to deliver, to check that we have the people, knowledge and skills we need to reach those goals.
  • There is clearly big demand for high-quality education, and in this context there are three important factors that have made us realise that now is our best chance to make a difference: the recognition that education can drive personal, economic and societal growth; the subsequent increase in global investment in education; and the advancement of technology that gives us access to real-time data on how well a product or service is helping a learner reach their goals. Together, these factors create a unique opportunity to work with others to transform education, and in doing so, the lives of learners across the world.
  • In line with the recent public commitment to efficacy (http://efficacy.pearson.com/), the learnshop is seen as an appropriate showcase of the application of the Efficacy Framework for a wide variety of interested parties. Inside and outside Pearson, “efficacy” has different meanings. At Pearson we have agreed on a definition: efficacy is “a measurable impact on improving someone’s life through learning.” We need to be able to identify the specific impact for a learner. Efficacy has direct and obvious applications for those who are designing and delivering products, services and solutions to learners. The Efficacy Framework was developed by Sir Michael Barber (Chief Education Advisor) and his team. It draws on best practices about delivery from Pearson and the public and private sectors. The Efficacy Framework has two purposes: to understand whether we are delivering efficacy, and to identify a path to improve efficacy. This is outlined below, with the four key questions asked as part of the framework and a set of ratings for identification.
  • ADAM: Let’s look at the user stories many of you provided of how our customers may think about efficacy. As a learner, I want to know I’m using a product… As an ELT teacher in a state school, I want to show my headmaster… As an academic coordinator at a PLS, I want to show prospective…
  • So far you have learnt about the basics of efficacy, what our plans are, and our progress to date. Now we want to talk about what you can do: now and after today.
  • I hope you have found this useful and interesting, and that you feel our efficacy focus will help you and your students/pupils to achieve your goals. For more information, please visit efficacy.pearson.com, where you will find lots more detail about our approach, and be able to hear from some of our leadership team and external education experts about what it could help to achieve.

Innovation in Education: Tools and methods for success (Session 1) Presentation Transcript

  • 1. Innovation in Education. Tools and methods for success. Session 1: Concepts and Models. 8 April 2014, Escola Superior de Educação do Porto - Auditório Central da Biblioteca, R. Dr. Roberto Frias, 602 | GPS 41.177252,-8.599733. Mr. Kelwyn Looi, Analyst, Office of Sir Michael Barber, Chief Education Advisor, Pearson UK. Dr. Andreas Meiszner, Senior Advisor, European Learning Industry Group
  • 2. Workshop supported by the HOTEL project (HOlistic approach to Technology Enhanced Learning), an FP7 support action. Scope: a model that will help innovators to move • from point A (idea, research, early prototype, small-scale innovative practice) • to point B (innovation, advanced prototype, exploitable product, large-scale innovative practice) ‒ making significant progress, faster and in a consistent way ‒ taking a holistic approach (e.g. technical, theoretical, educational, relational, social, business, etc.)
  • 3. TEL innovation ecosystems (WHO)
  • 4. Possible origins of TEL innovations:  Technology- and industry-led, via the availability of a new technology normally not specifically designed for learning;  research-led, in which learning theories search for and find application in experimental learning settings;  practice-led, spontaneous bottom-up innovation emerging from individuals or communities of teachers and learners who find original ways of using technology to materialise new ideas about learning and teaching, and are able to demonstrate their effectiveness in new contexts of use;  policy-led innovation, materialised by the many national programmes launched since the 1980s to diffuse ICT and its use in classrooms. [Diagram: TEL innovation drivers – Technology, Theory, Practice – combined in a holistic approach]
  • 5. Difficulty of adoption and scale (WHAT). Systemic changes in one of these innovation types can introduce changes or innovations in the other three types as well. Not linear, single-rooted, or independent, BUT systemic: several converging and often competing technologies, complex interactions of many players, holistic solutions. Hence the need for supply–demand integration.
  • 6. Kamtsiou 2013: Technological framework (according to the nature of innovation). Technology assessment: technology readiness, gaps/SWOT analysis and further developments. Stable technology platforms: evolution based on trends. Functional logic of implementation (supply–demand chains). Disruptive: technology foresight. Linear/incremental: technology forecasting. Emerging technologies and their possible commercialisation. Replacement of existing practices, products, technologies: opportunities and threats. Stable areas of technology development. Systemic: adoption and change management, co-innovation value blueprints.
  • 7. Innovation or co-innovation – can you tell?
  • 8. Possible strands for TEL innovations; lessons from the STELLAR & TELMAP projects:  Personal Learning Environments (PLE)  Open Educational Practices (OEP)  Personal Learning Networks (PLN)  Open Educational Resources (OER)  Massive Open Online Courses (MOOC)  Massive Collaboration
  • 9. Questions: • What are appropriate analytical frameworks to classify innovations and properly understand their advantage/contribution within a pedagogical context? • How can we help innovators to: properly formulate their ideas in a way that aids introduction; develop indicators to improve their diffusion/adoption; understand how their innovation fits current learning practices? • OVERALL: How can we accelerate the speed of innovation cycles in TEL? The HOTEL project has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement number ICT-318530.
  • 10. Innovation in Education: Tools and methods for success. Session 1, Innovation in education: Concepts and models. 8 April 2014, Porto, Portugal. “A measurable impact on improving someone’s life through learning.” Kelwyn Looi, Office of the Chief Education Advisor, Pearson. Dr. Andreas Meiszner, European Learning Industry Group (ELIG)
  • 11. Agenda: 1. An introduction to the Efficacy Framework at Pearson 2. Discussion: the Efficacy Framework as a: a) tool that supports / enhances existing innovation support models b) tool that can be used in entrepreneurship / start-ups c) tool that can be used in the Portuguese higher and adult education context 3. How can the Efficacy Framework be applied in your line of work? 4. Q&A
  • 12. What we aim to achieve: • An understanding of the Efficacy Framework and its use as a tool to transform the way the company operates • Introducing the Efficacy Framework as a tool to support innovation / entrepreneurship in education • Recommendations around the Efficacy Framework and its applicability to the Portuguese education system • Identifying where efficacy can be embedded in your work. Watch out for this icon – it signals an activity!
  • 13. An introduction to the session… This session aims at evaluating how analytical tools, such as the Pearson Efficacy Framework, could enhance already established innovation support models, structures and processes. The session will provide an introduction to the Pearson Efficacy Framework and subsequently open the floor to a discussion of its applicability within the Portuguese higher and adult education context. Pearson’s Efficacy Framework will be tested as a [e.g. stand-alone] means to support stakeholders to innovate in TEL / education. Key objectives – for attendees the session provides the opportunity: 1. To acquaint yourself with the Efficacy Framework as a tool to engender learning-focused discussions when assessing and evaluating prospective innovations, or for use in entrepreneurship activities 2. To examine the rigour of the framework, suggest improvements based on academic research, and assess its applicability to the local context
  • 14. The Efficacy Framework | Discussion: Efficacy as a tool | Exercise | Key takeaways and useful resources
  • 15. The path to Efficacy: why? • As the world’s leading learning company, we feel we have a responsibility and an opportunity to help people make progress in their lives through learning • We have aligned our activities around the principle of Efficacy to achieve this
  • 16. Pearson’s definition of efficacy. Efficacy (dictionary definition): the ability to produce the intended result. Efficiency (dictionary definition): achieving maximum productivity with minimum wasted effort. Efficacy (Pearson’s definition): a measurable impact on improving someone’s life through learning. Put simply… it’s all about products that improve results and measurable outcomes for learners.
  • 17. Why now? • There is a shared understanding that high-quality education drives personal, economic and societal growth • Governments, individuals, employers and institutions recognise the need to deliver high-quality learning • New technology makes it increasingly possible to see what works and what doesn’t in helping learners to achieve their goals
  • 18. An Efficacy Framework: predicting the likelihood of impacting learner outcomes. Four criteria: 1. What learner outcomes are we trying to achieve? 2. What evidence do we have to believe it is possible to achieve these outcomes? 3. What plans are in place to build and deploy a solution that will impact these outcomes? 4. What capacity exists to achieve these outcomes?
  • 19. The Efficacy Framework has already been examined as a tool to support innovation • At Online Educa Berlin (OEB) last year, Pearson partnered with the European Learning Industry Group (ELIG) to deliver an interactive learnshop • This involved applying the Efficacy Framework to selected case studies that demonstrated innovations to support technology-enhanced learning (TEL). Excerpt from Online Educa Berlin.
  • 20. The Efficacy Framework (criteria area, rating, rationale summary): Outcomes – intended outcomes; overall design; value for money. Evidence – comprehensiveness of evidence; quality of evidence; application of evidence. Planning and implementation – action plan; governance; monitoring and reporting. Capacity to deliver – internal capacity and culture; user capacity and culture; stakeholder relationships. Together these roll up to an overall Efficacy rating. Key: Green – requires a small number of minor actions. Amber/green – requires some actions (some urgent and some non-urgent). Amber/red – requires a large number of urgent actions. Red – highly problematic, requiring a substantial number of urgent actions.
  • 21. An Efficacy Framework: an explanation of ratings. Good – requires slight refinement, but on track. Mixed – some aspects require attention, some are solid. Problematic – requires substantial attention; some aspects require urgent rectification. Off-track – requires urgent action and problem solving. Ratings are not grades on performance; ratings prompt discussions that lead to actions; ratings prioritise and suggest a timeline.
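[The framework’s rubric structure lends itself to a simple machine-readable encoding, for example when logging review results. The following minimal sketch is not part of the original deck: it is written in Python, all class and variable names are hypothetical, and only the scale from slide 21 and the criteria areas from slide 20 are taken from the source.]

```python
from dataclasses import dataclass
from enum import Enum


class Rating(Enum):
    """Four-point scale from slide 21; colours follow the key on slide 20."""
    GOOD = "green"             # requires slight refinement, but on track
    MIXED = "amber/green"      # some aspects require attention, some solid
    PROBLEMATIC = "amber/red"  # requires substantial attention
    OFF_TRACK = "red"          # requires urgent action and problem solving


# The four criteria areas and their sub-criteria, as listed on slide 20.
FRAMEWORK = {
    "Outcomes": ["Intended outcomes", "Overall design", "Value for money"],
    "Evidence": ["Comprehensiveness of evidence", "Quality of evidence",
                 "Application of evidence"],
    "Planning and implementation": ["Action plan", "Governance",
                                    "Monitoring and reporting"],
    "Capacity to deliver": ["Internal capacity and culture",
                            "User capacity and culture",
                            "Stakeholder relationships"],
}


@dataclass
class Assessment:
    """One rated sub-criterion. The rationale matters more than the rating:
    ratings are not grades, they prompt discussion and actions (slide 21)."""
    area: str
    criterion: str
    rating: Rating
    rationale: str
```

[As a usage illustration only, a warm-up answer from slide 23 might be captured as Assessment("Evidence", "Quality of evidence", Rating.MIXED, "Much of the evidence pre-dates the Sept 2012 enhancement…"); the deck itself deliberately leaves the exercise ratings open.]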
  • 22. Efficacy Framework: Evidence – why should anyone believe us? Comprehensiveness of evidence, ask yourself: How well do we understand what our users need and value? Is the vision for the innovation supported by research (internal or external)? Are we leveraging proven approaches from other innovations? Quality of evidence, ask yourself: How rigorous is our evidence? Is the rigour appropriate for the innovation? How recent and relevant is the research? Application of evidence, ask yourself: How evidence-based is the innovation design? Has the design been tested? Does evidence demonstrate that the innovation can be replicated globally?
  • 23. Warm-up (5 mins): rate the evidence for Projects 1-3. Exercise: Evidence. Key: Good – requires a small number of minor actions; Mixed – requires some actions (some urgent and some non-urgent); Problematic – requires a large number of urgent actions; Off track – highly problematic, requiring a substantial number of urgent actions. Comprehensiveness of evidence (rating: ?) – Project 1: • Strong set of academic evidence and expertise underpinning the innovation. • Comprehensive teacher focus groups across markets and regions, focusing on the right questions (usability, price point etc.); some concerns about the scope of potential customers canvassed (see below). • Traditional competitors are tracked, but not non-traditional competitors who offer tests. Quality of evidence (rating: ?) – Project 2: • There is a good mix of quantitative and qualitative evidence as well as unbiased samples from the survey. Nonetheless, a large amount of the available evidence is based on the previous innovation (before the Sept 2012 enhancement) and, therefore, is not fully applicable to the current innovation. Also, there is little documented evidence coming from the students, as to date there has been more focus on teacher rather than student outcomes. Effective use of evidence (rating: ?) – Project 3: • The external evidence that has been collated is not known or accessible to all members of the team. There may be additional evidence within the business that could be exploited. The use of [this capability] in the innovation design is not yet articulated, and it is essential that any major decisions be underpinned by research. Pilot products must be timed so information can feed back into the design [of this capability].
  • 24. How did you do? Solutions: Evidence. Key: Good – requires a small number of minor actions; Mixed – requires some actions (some urgent and some non-urgent); Problematic – requires a large number of urgent actions; Off track – highly problematic, requiring a substantial number of urgent actions. [The solution ratings were shown as colours on the original slide and are not recoverable here.] Comprehensiveness of evidence – Project 1: • Strong set of academic evidence and expertise underpinning the innovation. • Comprehensive teacher focus groups across markets and regions, focusing on the right questions (usability, price point etc.); some concerns about the scope of potential customers canvassed. • Traditional competitors are tracked, but not non-traditional competitors who offer tests. Quality of evidence – Project 2: • There is a good mix of quantitative and qualitative evidence as well as unbiased samples from the survey. Nonetheless, a large amount of the available evidence is based on the previous innovation (before the Sept 2012 enhancement) and, therefore, is not fully applicable to the current innovation. Also, there is little documented evidence coming from the students, as to date there has been more focus on teacher rather than student outcomes. Application of evidence – Project 3: • The external evidence that has been collated is not known or accessible to all members of the team. There may be additional evidence within the business that could be exploited. The use of [this capability] in the innovation design is not yet articulated, and it is essential that any major decisions be underpinned by research. Pilot products must be timed so information can feed back into the design [of this capability].
  • 25. An Efficacy Framework: driving improvement. For each framework area, an initial review plus 3-month and 6-month estimates are recorded. Outcomes (intended outcomes; overall design; value for money): after 6 months, outcomes and metrics will be clear and will influence design; value-for-money intelligence will be drawn from pilots. Strength of evidence base (comprehensiveness of evidence; quality of evidence; application of evidence): after 6 months, the plan to develop the forward evidence base will be finalised and initiated. Quality of planning and implementation (action plan; governance; monitoring and reporting): after 6 months, long-term plans and reporting structures will be in place and governance agreed; reporting will be at an early stage. Capacity to deliver (Pearson capacity and culture; customer capacity and culture; stakeholder relationships): after 6 months, capacity issues will be clear, pilots delivered and lessons learned and applied; stakeholder relationship plans will be launched and gathering feedback.
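[Slide 25 describes repeated reviews at fixed milestones. Continuing the hypothetical encoding above (it reuses the Rating enum from the previous sketch), one way such review points might be tracked; the colour-coded cells of the original slide did not survive extraction, so all ratings here default to "not yet assessed".]

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ReviewCycle:
    """Ratings for one framework area across the review points on slide 25.

    Rating is the enum defined in the previous sketch. The original
    slide's colour-coded cells are unrecoverable, hence the None defaults.
    """
    area: str
    initial_review: Optional[Rating] = None
    estimate_3_month: Optional[Rating] = None
    estimate_6_month: Optional[Rating] = None
    comment: str = ""


# Example row, with the comment text taken from the slide itself.
outcomes = ReviewCycle(
    area="Outcomes",
    comment="After 6 months, outcomes and metrics will be clear and will "
            "influence design. Value-for-money intelligence will be drawn "
            "from pilots.",
)
```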
  • 26. How can efficacy be applied to your work? What outcomes are you trying to achieve? • Set clear efficacy goals • Give your people the incentive to focus on outcomes. What’s the evidence? • Develop innovations underpinned by research • Build and use effective data systems. What’s the plan? • Make delivering outcomes a core part of your strategy • Take an open approach • Employ iterative and agile processes. What’s the capacity to deliver? • Talk to your users and understand their students’ needs • Train your students, teachers or others to use your innovation effectively • Shape the debate with influential stakeholders
  • 27. The Efficacy Framework | Discussion: Efficacy as a tool | Exercise | Key takeaways and useful resources
  • 28. A thought-leadership paper that applies the Efficacy Framework in a digital K-12 context…
  • 29. The Efficacy Framework can be adapted… Application to innovation: • To make transformative system improvements we need to know, with precision and clarity, what the learning goals are • Digital technologies that do not align with what is to be learned will likely not translate into learning enhancement
  • 30. Key questions – Outcomes: How clearly are the learning outcomes of the innovation defined? Does this innovation have the ability to scale system-wide? Are there overall cost savings realised by the innovation? Does the technology incorporate the latest design principles for user experience? Is the innovation of sufficient value, demonstrated by learning outcomes, to justify change? What is the quality of the case model design?
  • 31. Key questions – Evidence: What is the quality of the assessment platform? Is it adaptive, and does it include an optimal amount of detail? Does the pedagogy reflect the latest global research, including real-world examples? Is the technology integrated and seamless? Is it clear how the outcomes will be measured? How does the learner use the assessment system to monitor and motivate his or her own learning? Is 24/7 access and learning enabled?
  • 32. Key questions – Planning & implementation: Is there a mechanism to ensure the pedagogy is updated? Is the technology adaptable and highly connective? Is the assessment system integrated into the pedagogy and learning curriculum? Is there a plan for scale based on world-leading change knowledge? How is the innovation implemented across the whole system?
  • 33. Key questions – Capacity to deliver: Is the clarity of the case outcome(s) shared by all stakeholders? What is the nature of the implementation support provided? What support is provided to ensure the technology functions (for all parts, including software, hardware and maintenance)? What is the quality of the user experience? Is it engaging, efficient and intuitive? Is the support based on a culture of learning, risk-taking and learning from mistakes? Does the innovation include user training and professional development? Are user development goals explicit?
  • 34. How this should be used:
  • 35. Frameworks are frameworks! Works in progress – no one has the correct answers!
  • 36. The Efficacy Framework | Discussion: Efficacy as a tool | Exercise | Key takeaways and useful resources
  • 37. You can apply the Efficacy Framework to support innovation in education… (1 hour) • Think through the four areas that we talk about when we measure efficacy – outcomes, evidence, planning, and capacity • Consider these 3 fields of innovation: MOOCs, educational games, learning analytics • Using the Efficacy Framework and the Outcomes and Evidence criteria, examine the innovation potential of these 3 fields • We will go through Outcomes and Evidence today
  • 38. Exercise: 1. Framing (5 mins) 2. Rate for the Outcomes part of the framework and discuss the rationale in the group (20 mins) 3. Rate for the Evidence part of the framework and discuss the rationale in the group (20 mins) 4. Discussion of the results and the usefulness of the exercise (15 mins)
  • 39. The Efficacy Framework | Discussion: Efficacy as a tool | Efficacy in your line of work | Key takeaways and useful resources
  • 40. Reference material: Efficacy website. On November 15th, Pearson launched a dedicated website, http://efficacy.pearson.com, outlining the company’s focus on efficacy and commitment to put the learner at the heart of the global strategy. An interactive version of the Efficacy Framework also features on the website.
  • 41. Recap. What is efficacy? • Definition: a measurable impact on improving someone’s life through learning. Pearson as the efficacy company: • We want to be able to prove that our products and services have a measurable impact • By 2018 we are committed to demonstrating the progress we have made in improving people’s lives through learning. Efficacy activities: • A tool that supports / enhances existing innovation • A tool that can be used in entrepreneurship • A tool that can be used in Portugal (HE market). Tools / what you can do: • Join the debate on the website • Blog about improving learning outcomes • Complete the Survey Monkey
  • 42. Reference material: Efficacy publications. Identifying dialogue and collaboration with the wider education community as crucial to accelerating progress, Pearson has also published two reports: • The first, Asking More: The Path to Efficacy, sets out the imperative for measuring and improving learning outcomes worldwide • The second, The Incomplete Guide to Delivering Learning Outcomes, shares in detail our new approach to contributing to that goal and the progress made so far
  • 43. Reference material: Alive in the Swamp. “The future will belong not to those who focus on the technology alone but to those who place it in this wider context and see it as one element of a wider system transformation.” – Michael Barber, Chief Education Advisor, Pearson
  • 44. How can I find out more? You can visit efficacy.pearson.com to: • Find more information about our approach • Use the online interactive efficacy tool • Read up on the role of efficacy in education in two publications: Asking More and The Incomplete Guide • Find out more on LinkedIn (Open for Learning) and Twitter (@PearsonPLC) • Contact efficacy.global@pearson.com if you want to collaborate with us • Contact: kelwyn.looi@pearson.com
  • 45. The Efficacy Framework (criteria area, rating, rationale summary): Outcomes – intended outcomes; overall design; value for money. Evidence – comprehensiveness of evidence; quality of evidence; application of evidence. Planning and implementation – action plan; governance; monitoring and reporting. Capacity to deliver – internal capacity and culture; user capacity and culture; stakeholder relationships. Together these roll up to an overall Efficacy rating. Key: Green – requires a small number of minor actions. Amber/green – requires some actions (some urgent and some non-urgent). Amber/red – requires a large number of urgent actions. Red – highly problematic, requiring a substantial number of urgent actions.
  • 46. Efficacy Framework: Outcomes. Intended outcomes • Have you identified specific outcomes for your target group? • Do you have a way to measure the intended outcomes? • Do you have ambitious and measurable targets in place, and deadlines for achieving them? • Are your intended outcomes clearly documented and understood by the relevant people within and outside your innovation? Overall design • Is the innovation designed in a way that will most effectively help your target group reach their goals? • Does the design allow you to automatically collect evidence of your progress? • Have you adapted the design based on feedback from users? • Could the design be used by others? Value for money • Do you understand the benefits of your innovation to your target group, relative to other options? • Is the cost of the innovation competitive, considering the benefits it would deliver? Example of green rating: • All outcomes are specific and clearly documented. • People within and outside my innovation understand the intended outcomes and are able to communicate them clearly. • Future targets are ambitious and achievable. • Outcomes can be regularly measured against set targets. • Design is superior to other options/competitors, with features focused on delivering outcomes. • Real-time evidence is generated. • The design can be adapted and developed. • Others could use this design, and it has been shared with them. • Feedback/research has allowed me to identify what benefits the innovation needs to deliver to users. • Feedback and return-on-investment research shows that the cost of the innovation reflects the benefits delivered. Example of red rating: • Outcomes are not documented or specific. • People within and outside my innovation do not understand the intended outcomes or communicate them in the same way. • Targets do not exist to measure outcomes against. • Outcomes are only defined at a high level. • No feedback from users exists (either formal or informal), and the benefits of using this innovation are unclear to our team and our users. • Perceptions of value for money and user experience are poor. • The design does not meet target group expectations and is difficult to use. • The design does not reflect intended outcomes. • The design does not allow for the collection of feedback. • The design is specific to a local situation and cannot be replicated.
  • 47. Efficacy Framework: Evidence. Comprehensiveness of evidence • Do you collect evidence using a range of methods (quantitative, qualitative, internal and external, for example)? • Do you collect evidence for all stages of your innovation (from early conception to design and then to implementation)? • Do you have evidence from all users of your innovation? Quality of evidence • Does the evidence you have collected link directly to what you are trying to achieve? • Is the evidence you have collected unbiased, applicable to your innovation and recent, and does it measure success over a period of time? • Is the evidence you have collected relevant, representative and, where possible, at an individual level? Application of evidence • Is the evidence stored and accessible to relevant people? Is it available in an electronic and searchable format? • Has the evidence you have collected been analysed to help inform the design of your innovation? • Has the evidence you have collected been analysed to help inform other decisions about your innovation? Example of green rating: • A wide range of evidence has been collected via internal/external and quantitative/qualitative methods. • Evidence relates to all stages of my innovation. • Evidence exists from all users. • Evidence collected effectively proves how well we are meeting our objectives. • Rigorous research methods have been used. • Evidence relates to the specific and relevant use of the innovation. • Evidence was gathered over a period of time. • All evidence is readily accessible and searchable. • The evidence is used regularly to inform the design of my innovation. • Collected evidence is also used to inform non-design decisions. Example of red rating: • Evidence is collected via a limited range of methods and does not balance qualitative and quantitative sources. • Evidence is mainly anecdotal and patchy, and does not take into account the innovation’s lifecycle, features or users. • Of the evidence that does exist, it is not linked directly to what I am trying to achieve. • The evidence that exists is biased, not from a relevant use of the innovation, or out of date. • The evidence is not representative of how a learner would use this innovation. • The evidence that exists cannot be accessed quickly via electronic means. • The design of my innovation has not been changed as the result of evidence. • Major decisions about my innovation are not underpinned by evidence.
  • 48. Efficacy Framework: Planning & implementation. Action plan • Do you have a plan in place to achieve your outcomes, including milestones, actions, responsibilities and timelines? • Does your plan include short- and long-term priorities? • Have you identified any potential risks and included actions to mitigate these in your plan? • Do you regularly update your plan and communicate changes to relevant people/institutions? Governance • Do people within and outside your team understand who is responsible for decision-making regarding your innovation? • Have you documented who is responsible for the work, and who should be consulted and informed? Do the relevant people understand this? • Have you identified the key processes required to implement your innovation, and are these clearly documented? Monitoring and reporting • Do you update your plan based on progress, adapt it where necessary and communicate this to your stakeholders? • Do you have access to real-time feedback from your users? • Do you identify issues early, discuss these honestly and find solutions? • Do you have tools and routines in place to monitor progress (such as emails, calls, document-sharing)? Example of green rating: • An electronic plan exists with clearly identified steps, responsibilities and deadlines. • The plan includes short- and long-term priorities. • The plan is regularly updated and all relevant parties are aware of the changes. • Team members know who makes decisions, and each member of the team (within and outside my innovation) is clear about their role. • The processes we have in place are documented and well understood, and new members are fully briefed. • Data is collected in real time and analysed to provide feedback. • Monitoring of the innovation alerts me to issues in real time. • Tools and routines are in place to identify and solve problems. Example of red rating: • No electronic plan exists. • The plan is informal, with actions, responsibilities and timelines unclear. • Milestones lack clarity and are either too ambitious or not stretching enough. • Potential risks have not been formally identified or planned for. • Our action plan has not been updated and adapted. • Where feedback exists, it is delayed. • Our team is unaware of issues or fails to act on them. • Team routines are informal and not focused on monitoring progress. • Team members do not know who makes key decisions. • Roles for people outside the core team are poorly defined. • New team members are unclear about key processes and do not have documentation to refer to.
  • 49. Efficacy Framework: Capacity to deliver. Internal capacity and culture • Does your innovation have the right number of people, with the right skillsets, to enable you to deliver your desired outcomes? • Does your innovation have a culture focused on delivering outcomes, and is it collaborative and innovative? • Do leaders within your innovation support your work, and are there opportunities to work with others across the innovation? User capacity and culture • Does the target group understand the objectives and their roles in achieving them? • Does the innovation reflect the user’s skillset and available resources? • Do users have the people, skills, time or resources to achieve their goals? • Have you put measures in place to build users’ skills? Stakeholder relationships • Have you identified who your key stakeholders are, and do you understand their needs and concerns? • Do you regularly communicate with your stakeholders? • Is there a culture of partnership and collaboration between your innovation and your stakeholders? Example of green rating: • The team has the right number of people with the appropriate skillset and experience. • The culture is focused on delivering outcomes and is collaborative and innovative. • The team has an appropriate budget. • The target group understand the objectives and their roles. • The innovation takes the user’s skillset into account, and there are mechanisms in place to build skills. • Users have the appropriate resources to achieve their goals. • We meet with stakeholders frequently, and have formal and informal conversations. • Conversations with stakeholders have led to a culture of trust and partnership over a sustained period of time. Example of red rating: • Our team lacks the appropriate skills and resources to deliver the desired outcomes. • Our culture feels negative, traditional and not focused on outcomes. • The target group and existing users are not aware of what the innovation should help them achieve and what they need to do to get there. • The innovation is ill-suited to the user, and attempts to build users’ skills are ineffective. • Users do not have the resources and skills to meet their goals. • The team and stakeholders have uncertain relationships. • Miscommunication occurs frequently, and solving problems jointly is difficult.
  • 50. Thank you very much for your attention! (Muito obrigado pela sua atenção!) Contact: kelwyn.looi@pearson.com
  • 51. Appendix
  • 52. Self-evaluation exercise: In small groups of 2-3, evaluate the following matrices aligned to your innovations, and discuss where and how efficacy can be embedded. • What are the advantages of embedding efficacy into common innovation design processes? • What are the challenges of embedding efficacy into common innovation design processes? • What questions do you have about the integration of the two?
  • 53. The Efficacy Framework (criteria area, rating, rationale summary, actions): Outcomes – intended outcomes; overall design; value for money. Evidence – comprehensiveness of evidence; quality of evidence; application of evidence. Planning and implementation – action plan; governance; monitoring and reporting. Capacity to deliver – internal capacity and culture; user capacity and culture; stakeholder relationships.
  • 54. Self-evaluation: Outcomes (rating, rationale summary, actions). What are we trying to achieve? Intended outcomes: What is the vision for the outcomes the user of the innovation wants to see? What is the end impact on learning from the innovation? Overall design: Does the design of the innovation fit their culture in a way that will eventually impact student achievement? Value for money: Can the user achieve the same goals by investing in an alternative innovation for a lesser investment?
  • 55. Self-evaluation: Evidence (rating, rationale summary, actions). Why do we believe we can achieve it? Comprehensiveness of evidence: What is different about the proposed innovation compared with an existing or competing innovation? Quality of evidence: How consistent is the evidence, and is it quantitatively validated? Application of evidence: Do we have a case study of another innovation with similar characteristics to highlight the evidence for our innovation?
  • 56. Self-evaluation: Planning (rating, rationale summary, actions). How would we achieve it? Action plan: Who is responsible for running point on the implementation and creating the roll-out plan (including timeline and metrics for success)? Governance: Who are the individuals responsible for guiding, monitoring and revising implementation once adopted? Monitoring and reporting: What routines or protocols will be used to gather feedback, ensure quality of implementation, and report on success as well as areas of need?
  • 57. Self-evaluation: Capacity (rating, rationale summary, actions). Do we have the capacity to deliver? Internal capacity and culture: Do I have the right people, resources and teams to fit and fully support the innovation? User capacity and culture: What have we communicated to the user of the innovation regarding what resources are needed, and what has the user agreed to allocate? Relationships with other stakeholders: What are the potential risks, the commonly agreed plans to mitigate them, and our mutual willingness and trust to re-evaluate what actions are necessary to achieve the desired outcomes?