Innovation in Education: Tools and methods for success (Session 2)
Session 2: Tools and techniques
Joint Pearson and ELIG workshop at the Escola Superior de Educação do Porto.

Speaker Notes

  • At Pearson, our ambition has always been to help people make progress in their lives through learning. We know how big a difference education can make to individual learners and whole economies across the world, so we want to make sure that the part we play in improving learner outcomes – alongside governments, institutions, educators, parents and pupils – has the biggest possible impact. Efficacy is our way of achieving this.
  • The Efficacy Framework is divided into four sections. Outcomes: to define what we are trying to achieve. Evidence: to make sure that our assessment of and ambitions for the product/service are grounded in real experience and results. Planning and implementation: to outline how we intend to achieve our goals. Capacity to deliver: to check that we have the people, knowledge and skills we need to reach those goals.
  • There is clearly strong demand for high-quality education, and in this context three important factors have made us realise that now is our best chance to make a difference: the recognition that education can drive personal, economic and societal growth; the subsequent increase in global investment in education; and the advancement of technology that gives us access to real-time data on how well a product or service is helping a learner reach their goals. Together, these factors create a unique opportunity to work with others to transform education, and in doing so, the lives of learners across the world.
  • In short, big disruptive changes are happening in education. We need to help our customers and learners do more (and better) with less; we need to stay ahead of the digital disruption of the publishing industry (in all its forms) by leading that transformation and engaging more directly with our customers; and we need to measure our success not just by how we perform against our traditional competitors but by how we engage with a new generation of innovators and entrepreneurs. [Alternatively, ask the audience to name some of the big changes in education.] In terms of efficacy specifically, we are noticing two big trends. First, there is greater demand for measurable learning outcomes: people are facing immense challenges that education can help to solve; people are judiciously evaluating their own spend on education; people have lower trust in private companies, especially in education; and Pearson's competitors are starting to prove outcomes. Second, it is getting easier to prove outcomes, thanks to greater research into teaching, learning and assessment (so we understand what works) and a move from single products to integrated solutions and services, and from print to digital, which makes it easier to capture results.
  • When we talk about efficacy and improving "learner outcomes", what we are really talking about is how we can help many hundreds of millions of our fellow citizens change their lives for the better, and how we can apply new digital platforms to help them acquire the education and the skills they need to make progress in their lives.
  • Efficacy happens at Pearson through three activities that bring efficacy into every nook and cranny of our business, from product idea to happy customer. 1. Efficacy Reviews: this activity bakes efficacy deep into the design, planning and delivery of future products, or products being revised. Reviews have been going on around Pearson for the past year or so; they are simply a process we have developed so that our Pearson product and service teams can evaluate their own work and processes to ensure they are maximizing what they do. 2. Efficacy Studies, which are what I want to focus on today: here we work with customers to build evidence of how our products are performing in real situations. Efficacy studies allow us to gather EVIDENCE that our products are doing what we promise they will do, and doing it well. Studies are where you have the opportunity to directly participate in Pearson's efficacy mission, and they are the activities that can and will positively impact your sales. 3. Efficacy (or Learner) Analytics: here we mine learner (and teacher) data from live or prototype products to look for evidence of how to improve the product or service. These are the "Big Data" projects you have been hearing about lately, or will hear about. For example, we can look at submissions from millions of Top Notch student learners to discover information about their learning and our content. Now, let's focus on efficacy studies, the digital sales boosters. [CLICK]
  • In line with the recent public commitment to efficacy (http://efficacy.pearson.com/), the learnshop is seen as an appropriate showcase for applying the efficacy framework with a wide variety of interested parties. Inside and outside Pearson, "efficacy" has different meanings; at Pearson we have agreed on a definition: "A measurable impact on improving someone's life through learning." We need to be able to identify the specific impact for a learner. Efficacy has direct and obvious applications for those who are designing and delivering products, services and solutions to learners. The Efficacy Framework was developed by Sir Michael Barber (Chief Education Advisor) and his team, and draws on best practices about delivery from Pearson and from the public and private sectors. The framework has two purposes: to understand whether we are delivering efficacy, and to identify a path to improving it. It is outlined below, with the four key questions asked as part of the framework and a set of ratings for identification.
  • [ADAM] Let's look at the user stories many of you provided of how our customers may think about efficacy. As a learner, I want to know I'm using a product… As an ELT teacher in a state school, I want to show my headmaster… As an academic coordinator at a PLS, I want to show prospective…
  • So far you have learnt about the basics of efficacy, our plans, and our progress to date. Now we want to talk about what you can do, both now and after today.
  • I hope you have found this useful and interesting, and that you feel our efficacy focus will help you and your students/pupils to achieve your goals. For more information, please visit efficacy.pearson.com, where you will find much more detail about our approach and can hear from some of our leadership team and external education experts about what it could help to achieve.

Innovation in Education: Tools and methods for success (Session 2) Presentation Transcript

  • 1. Innovation in Education: Tools and methods for success. Session 2: Tools and techniques. 8 April 2014, Porto, Portugal. Kelwyn Looi, Office of the Chief Education Advisor, Pearson PLC. A measurable impact on improving someone's life through learning.
  • 2. Agenda: 1. An introduction to Efficacy at Pearson. 2. Efficacy in Practice. 3. What is the measurable impact on learning outcomes (efficacy) of your case? 4. Q&A.
  • 3. What we aim to achieve: (1) an understanding of Efficacy at Pearson and the evolution of its implementation in the company; (2) an introduction to the Efficacy Framework as a tool to support innovation in education; (3) examples of applying the Efficacy Framework in a generic case; (4) applying efficacy to your innovations and identifying where efficacy can be embedded. Watch out for this icon: it signals an activity!
  • 4. An introduction to the session… This session will look at tools and techniques that can support the development and structuring of an innovation, taking into account the context in which it operates, its objectives, expected results, likely indicators of success and expected impact. Pearson's Efficacy Framework will be tested as a [e.g. stand-alone] means to support stakeholders to innovate in TEL / education. Key objectives: for attendees, the learnshop provides the opportunity 1. to acquaint yourself with the efficacy framework as a tool to engender learning-focused discussions when assessing and evaluating prospective innovations; and 2. to examine a specific case through the lens of a rigorous and structured framework, providing key takeaways at both the transversal and individual innovation level.
  • 5. Efficacy at Pearson | Efficacy in Practice | Exercise | Key takeaways and useful resources
  • 6. The path to Efficacy: why? As the world's leading learning company, we feel we have a responsibility and an opportunity to help people make progress in their lives through learning. We have aligned our activities around the principle of Efficacy to achieve this.
  • 7. Pearson's definition of efficacy. Efficacy (dictionary definition): the ability to produce the intended result. Efficiency (dictionary definition): achieving maximum productivity with minimum wasted effort. Efficacy (Pearson's definition): a measurable impact on improving someone's life through learning. Put simply, it's all about products that improve results and measurable outcomes for learners.
  • 8. Why now? There is a shared understanding that high-quality education drives personal, economic and societal growth. Governments, individuals, employers and institutions recognise the need to deliver high-quality learning. New technology makes it increasingly possible to see what works and what doesn't in helping learners to achieve their goals.
  • 9. Progress to December 2013: in the past year, we've taken major steps towards putting efficacy at the heart of Pearson's work. Strategic accomplishments: launched publicly, generating commentary from media and influential education leaders around the world; made efficacy and learner outcomes central to Pearson's global education strategy and organizational design. Product and service accomplishments: a rigorous process in place for conducting efficacy reviews at varying depths; 123 efficacy reviews completed to date.
  • 10. Our previous business map… [Diagram: a 2x2 grid of existing/new offerings against existing/new Pearson impact, populated only with inputs and products.]
  • 11. Our new business map… [Diagram: the same existing/new grid, now populated with inputs, products, services and outcomes, including program solutions and new digital content.]
  • 12. Our commitment to reporting our impact on learner outcomes. How are we helping millions to improve their lives through learning? "Pearson's purpose is to help people make progress in their lives through learning. So, we better be sure that we can demonstrate [our products deliver a] measurable impact on learning outcomes. [By 2018] we will, in a rigorous and externally audited way… [publicly] report on [how our products and services improve] learner outcomes." John Fallon, Nov 15, 2013.
  • 13. The response to the external communication of efficacy has been largely positive. Our commitment: to report audited learning outcomes measures and targets alongside financial accounts, covering the whole business by 2018. Feedback from education leaders (Daniel Willingham, Michael Feldstein): "That's an enormous commitment and if they really follow through, it gives me some confidence that this is not merely a marketing ploy." "Pearson's brightest possible future is not as a company that designs educationally effective products, but as one that facilitates conversation and research about efficacy within the broader academic community." "At face value, it's a bold move because it moves the content of education from being critically reviewed by experts to an empirical review… It's the transition from 'This is a great book' to 'This material really works.'"
  • 14. The Efficacy Framework has already been examined as a tool to support innovation. At Online Educa Berlin (OEB) last year, Pearson partnered with the European Learning Industry Group (ELIG) to deliver an interactive learnshop. This involved applying the efficacy framework to selected case studies that demonstrated innovations to support technology-enhanced learning (TEL). Opening up the conversation to external input is vital to the success of efficacy. [Excerpt from Online Educa Berlin.]
  • 15. Efficacy goals: does what it says on the label. A 3-step process: 1. Ensure our products and services deliver the best learner outcomes (Efficacy Reviews). 2. Build comprehensive and rigorous evidence to support our claims (Efficacy Studies / Selling). 3. Continually learn from learners to innovate and improve our products and services (Efficacy Analytics).
  • 16. Our path to efficacy: three complementary activities for improving learner outcomes, mapped across the product lifecycle (educational research / market analysis; product/service design; development; marketing; deployment and sales; customer use). 1. Efficacy Reviews, pre-launch: use evidence. 2a. Efficacy Studies, post-launch: generate leads, build evidence. 2b. Efficacy Selling: use evidence to sell and support the best customer experience. 3. Efficacy Analytics, ongoing: analyze evidence.
  • 17. An Efficacy Framework: predicting the likelihood of impacting learner outcomes. Four criteria: 1. What learner outcomes are we trying to achieve? 2. What evidence do we have to believe it is possible to achieve these outcomes? 3. What plans are in place to build and deploy a solution that will impact these outcomes? 4. What capacity exists to achieve these outcomes?
  • 18. The Efficacy Framework: each criteria area is rated, with a rationale summary. Outcomes: intended outcomes; overall design; value for money. Evidence: comprehensiveness of evidence; quality of evidence; application of evidence. Planning and implementation: action plan; governance; monitoring and reporting. Capacity to deliver: internal capacity and culture; user capacity and culture; stakeholder relationships. Rating key: Green = requires a small number of minor actions. Amber/green = requires some actions (some urgent and some non-urgent). Amber/red = requires a large number of urgent actions. Red = highly problematic, requiring a substantial number of urgent actions.
  • 19. An Efficacy Framework: an explanation of ratings. Good: requires slight refinement, but on track. Mixed: some aspects require attention, some are solid. Problematic: requires substantial attention; some aspects require urgent rectification. Off-track: requires urgent action and problem solving. Ratings are not grades on performance; ratings prompt discussions that lead to actions; ratings prioritise and suggest timelines. (A minimal data-model sketch of this rating scheme follows below.)
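The rating scheme above lends itself to a simple data model. Purely as an illustrative sketch (the class and field names below are invented for this example, not part of any Pearson tooling), here is one way the four-point scale and per-criterion rationales might be recorded in Python:

```python
from dataclasses import dataclass, field
from enum import Enum

class Rating(Enum):
    """The four-point scale described on the slide."""
    GOOD = "requires slight refinement, but on track"
    MIXED = "some aspects require attention, some solid"
    PROBLEMATIC = "requires substantial attention, some urgent rectification"
    OFF_TRACK = "requires urgent action and problem solving"

@dataclass
class Criterion:
    """One sub-criterion (e.g. 'Quality of evidence') and its review result."""
    name: str
    rating: Rating
    rationale: str                               # ratings prompt discussion, so keep the 'why'
    actions: list = field(default_factory=list)  # ratings lead to actions

@dataclass
class EfficacyReview:
    """A review groups rated criteria under the framework's four areas."""
    product: str
    areas: dict  # area name -> list of Criterion

# Hypothetical example, loosely based on 'Project 2' in the warm-up below
review = EfficacyReview(
    product="Project 2",
    areas={"Evidence": [Criterion(
        name="Quality of evidence",
        rating=Rating.MIXED,
        rationale="Good quant/qual mix, but much of the evidence predates "
                  "the Sept 2012 enhancement and little comes from students.",
        actions=["Gather student-level evidence on the current version"],
    )]},
)
print(review.areas["Evidence"][0].rating.name)  # -> MIXED
```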
  • 20. Efficacy Framework, Evidence: why should anyone believe us? Comprehensiveness of evidence, ask yourself: How well do we understand what our users need and value? Is the vision for the innovation supported by research (internal or external)? Are we leveraging proven approaches from other innovations? Quality of evidence, ask yourself: How rigorous is our evidence? Is the rigour appropriate for the innovation? How recent and relevant is the research? Application of evidence, ask yourself: How evidence-based is the innovation design? Has the design been tested? Does evidence demonstrate that the innovation can be replicated globally?
  • 21. Warm-up (5 mins): rate the evidence for Projects 1-3. Exercise: Evidence.
    Comprehensiveness of evidence (Project 1, rating ?): A strong set of academic evidence and expertise underpins the innovation. Comprehensive teacher focus groups across markets and regions, focusing on the right questions (usability, price point etc.); some concerns about the scope of potential customers canvassed (see below). Traditional competitors are tracked, but not non-traditional competitors who offer tests.
    Quality of evidence (Project 2, rating ?): There is a good mix of quantitative and qualitative evidence, as well as unbiased samples from the survey. Nonetheless, a large amount of the available evidence is based on the previous innovation (before the Sept 2012 enhancement) and is therefore not fully applicable to the current innovation. Also, there is little documented evidence coming from students, as to date there has been more focus on teacher rather than student outcomes.
    Effective use of evidence (Project 3, rating ?): The external evidence that has been collated is not known or accessible to all members of the team. There may be additional evidence within the business that could be exploited. The use of [this capability] in the innovation design is not yet articulated, and it is essential that any major decisions be underpinned by research. Pilot innovations must be timed so information can feed back into the design [of this capability].
    Key: Good = requires a small number of minor actions. Mixed = requires some actions (some urgent and some non-urgent). Problematic = requires a large number of urgent actions. Off track = highly problematic, requiring a substantial number of urgent actions.
  • 22. Solutions: Evidence. How did you do? [This slide repeats the three Project rationales and the rating key from the warm-up, with the rating icons filled in; the icons do not survive in the text transcript.]
  • 23. An Efficacy Framework: driving improvement. For each framework area the review records an initial rating plus 3-month and 6-month estimates (the rating icons do not survive in the transcript), with a comment on the expected trajectory.
    Outcomes (intended outcomes; overall design; value for money): after 6 months, outcomes and metrics will be clear and will influence design; value-for-money intelligence will be drawn from pilots.
    Strength of evidence base (comprehensiveness of evidence; quality of evidence; application of evidence): after 6 months, the plan to develop the forward evidence base will be finalised and initiated.
    Quality of planning and implementation (action plan; governance; monitoring and reporting): after 6 months, long-term plans and reporting structures will be in place and governance agreed; reporting will be at an early stage.
    Capacity to deliver (Pearson capacity and culture; customer capacity and culture; stakeholder relationships): after 6 months, capacity issues will be clear, pilots delivered, and lessons learned and applied; stakeholder-relationship plans will be launched and gathering feedback.
  • 24. Efficacy Studies: holistic, long-term studies with specific learners, teachers, and institutions.
  • 25. MyEnglishLab Efficacy study. MyEnglishLab makes a difference: the Efficacy study illustrated real-world situations in which MEL is positively impacting classrooms. Studies uncover proven best practices from customers who are USING MEL in their own classrooms. This means a long-term customer!
  • 26. Efficacy Analytics and insights from Big Data: identify common learner difficulties; personalise learning (by L1); optimise learning by L1; research learner behaviours that lead to success (machine learning); improve learner engagement (activity design); predict learners who will fail, for early intervention (predictive algorithms).
  • 27. Identifying learners at risk: ignoring assessment and taking a (random) walk. Patent awarded 2013. [Charts: net score plotted against responses submitted over a course. A student who will succeed traces a smooth walk (fractal D = 1.60); a student who will fail or not complete traces a noisy walk (fractal D = 1.94), even though both students have the same net score. A fractal alert prompts the teacher and learner to intervene.] (A generic sketch of this idea follows below.)
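The patented method itself is not spelled out on the slide, so the following is only a generic sketch of the idea it describes: treat a learner's net-score trajectory as a walk, estimate its fractal dimension, and flag noisy walks for intervention. The Higuchi estimator and the 1.8 cut-off below are assumptions made for illustration, not Pearson's implementation:

```python
import numpy as np

def higuchi_fractal_dimension(series, k_max=8):
    """Estimate the fractal dimension of a 1-D series (Higuchi's method).

    Smooth trajectories give D near 1; noise-dominated ones approach 2.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    log_inv_k, log_len = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):                         # k interleaved sub-series
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)  # Higuchi normalisation
            lengths.append(dist * norm / k)
        log_inv_k.append(np.log(1.0 / k))
        log_len.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_inv_k, log_len, 1)   # slope estimates D
    return slope

def flag_at_risk(net_scores, threshold=1.8):
    """Flag a trajectory whose walk looks noisy (assumed cut-off of 1.8,
    roughly midway between the slide's D = 1.60 and D = 1.94 examples)."""
    d = higuchi_fractal_dimension(net_scores)
    return d, d > threshold

# Toy check: a smooth riser vs. a noisy walk ending at the same net score
rng = np.random.default_rng(0)
smooth = np.linspace(0, 10, 600)
noisy = smooth + rng.normal(scale=2.0, size=600)
noisy[-1] = smooth[-1]                             # same final net score
for label, s in [("smooth", smooth), ("noisy", noisy)]:
    d, at_risk = flag_at_risk(s)
    print(f"{label}: D = {d:.2f}, at risk: {at_risk}")
```

On synthetic data like this, the noisy trajectory's estimated dimension sits near 2 while the smooth one sits near 1, which is the separation the slide's fractal alert exploits.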
  • 28. Efficacy at Pearson | Efficacy in Practice | Exercise | Key takeaways and useful resources
  • 29. User stories: our end goals for efficacy. "As a…, I want to…, so that I can…" Example, as a learner: "As a potential learner of an English Language course, I want evidence of how the course will help me improve my English, so that I can make an informed decision about which provider to choose." Fill out the table; many answers are possible.
  • 30. Activity 1: User stories for an ELT course (10 min). Complete "I want to…" and "so that I can…" for each role: as a student; as a parent; as a teacher; as a school; as a … (you).
  • 31. Activity 1: User stories for an ELT course (10 min), sample answers. As a student, I want to know the course method works for real students like me, so that I can reach my exam (educational) goals. As a parent, I want to know the course is improving my child's English, so that I can feel comfortable my child is getting the right level of education. As a teacher, I want to use data to understand my students' preparedness, so that our classes are productive. As a school, I want to know that this is the most appropriate course on the market, so that I can best serve my students. As a … (you), I want to demonstrate to potential clients that the ELT course method works, so that I can address their learning and financial concerns.
  • 32. How can efficacy be applied to your work? What outcomes are you trying to achieve? Set clear efficacy goals; give your people the incentive to focus on outcomes. What's the evidence? Develop innovations underpinned by research; build and use effective data systems. What's the plan? Make delivering outcomes a core part of your strategy; take an open approach; employ iterative and agile processes. What's the capacity to deliver? Talk to your users and understand their students' needs; train your students, teachers or others to use your innovation effectively; shape the debate with influential stakeholders.
  • 33. Global Efficacy Strategy | Efficacy in Practice | Exercise | Key takeaways and useful resources
  • 34. You can apply the Efficacy Framework to support innovation in education… (1 hour) Think through the four areas that we talk about when we measure efficacy: outcomes, evidence, planning, and capacity. Consider the question: what is the measurable impact on learning outcomes (efficacy) of the case? We will go through Outcomes and Capacity today.
  • 35. Exercise: 1. Spend some time reading the case that you have in front of you (15 mins). 2. Rate the Outcomes part of the framework and discuss the rationale in the group (20 mins). 3. Rate the Capacity part of the framework and discuss the rationale in the group (15 mins). 4. Discuss the results and the usefulness of the exercise (10 mins).
  • 36. Efficacy at Pearson | Efficacy in Practice | Exercise | Key takeaways and useful resources
  • 37. Reference material: Efficacy website. On November 15th, Pearson launched a dedicated website, http://efficacy.pearson.com, outlining the company's focus on efficacy and its commitment to put the learner at the heart of the global strategy. An interactive version of the efficacy framework also features on the website.
  • 38. Recap. What is efficacy? Definition: a measurable impact on improving someone's life through learning. Pearson as the efficacy company: we want to be able to prove that our products and services have a measurable impact; by 2018 we are committed to demonstrating the progress we have made in improving people's lives through learning. Efficacy activities: Efficacy Studies, Efficacy Reviews, Efficacy Analytics. Tools / what you can do: join the debate on the website; blog about improving learning outcomes; complete the Survey Monkey.
  • 39. Reference material: Efficacy publications. Having identified dialogue and collaboration with the wider education community as crucial to accelerating progress, Pearson has also published two reports. The first, Asking More: The Path to Efficacy, sets out the imperative for measuring and improving learning outcomes worldwide. The second, The Incomplete Guide to Delivering Learning Outcomes, shares in detail our new approach to contributing to that goal and the progress it has made so far.
  • 40. Reference material: Alive in the Swamp. "The future will belong not to those who focus on the technology alone but to those who place it in this wider context and see it as one element of a wider system transformation." Quote from Michael Barber, Chief Academic Advisor, Pearson.
  • 41. How can I find out more? Visit efficacy.pearson.com to: find more information about our approach; use the online interactive efficacy tool; read up on the role of efficacy in education in two publications, Asking More and The Incomplete Guide; find out more on LinkedIn (Open for Learning) and Twitter (@PearsonPLC). Contact efficacy.global@pearson.com if you are interested in collaborating with us, or kelwyn.looi@pearson.com.
  • 42. Efficacy Framework: Outcomes.
    Intended outcomes: Have you identified specific outcomes for your target group? Do you have a way to measure the intended outcomes? Do you have ambitious and measurable targets in place, and deadlines for achieving them? Are your intended outcomes clearly documented and understood by the relevant people within and outside your innovation?
    Overall design: Is the innovation designed in a way that will most effectively help your target group reach their goals? Does the design allow you to automatically collect evidence of your progress? Have you adapted the design based on feedback from users? Could the design be used by others?
    Value for money: Do you understand the benefits of your innovation to your target group, relative to other options? Is the cost of the innovation competitive, considering the benefits it would deliver?
    Example of a green rating: All outcomes are specific and clearly documented. People within and outside my innovation understand the intended outcomes and are able to communicate them clearly. Future targets are ambitious and achievable. Outcomes can be regularly measured against set targets. The design is superior to other options/competitors, with features focused on delivering outcomes. Real-time evidence is generated. The design can be adapted and developed. Others could use this design, and it has been shared with them. Feedback/research has allowed me to identify what benefits the innovation needs to deliver to users. Feedback and return-on-investment research shows that the cost of the innovation reflects the benefits delivered.
    Example of a red rating: Outcomes are not documented or specific. People within and outside my innovation do not understand the intended outcomes or communicate them in the same way. Targets do not exist to measure outcomes against. Outcomes are only defined at a high level. No feedback from users exists (either formal or informal), and the benefits of using this innovation are unclear to our team and our users. Perceptions of value for money and user experience are poor. The design does not meet target-group expectations and is difficult to use. The design does not reflect intended outcomes. The design does not allow for the collection of feedback. The design is specific to a local situation and cannot be replicated.
  • 43. Efficacy Framework: Evidence.
    Comprehensiveness of evidence: Do you collect evidence using a range of methods (quantitative, qualitative, internal and external, for example)? Do you collect evidence for all stages of your innovation (from early conception to design and then to implementation)? Do you have evidence from all users of your innovation?
    Quality of evidence: Does the evidence you have collected link directly to what you are trying to achieve? Is the evidence unbiased, applicable to your innovation and recent, and does it measure success over a period of time? Is the evidence relevant, representative and, where possible, at an individual level?
    Application of evidence: Is the evidence stored and accessible to relevant people? Is it available in an electronic and searchable format? Has the evidence been analysed to help inform the design of your innovation? Has it been analysed to help inform other decisions about your innovation?
    Example of a green rating: A wide range of evidence has been collected via internal/external and quantitative/qualitative methods. Evidence relates to all stages of my innovation. Evidence exists from all users. The evidence collected effectively proves how well we are meeting our objectives. Rigorous research methods have been used. Evidence relates to the specific and relevant use of the innovation. Evidence was gathered over a period of time. All evidence is readily accessible and searchable. The evidence is used regularly to inform the design of my innovation. Collected evidence is also used to inform non-design decisions.
    Example of a red rating: Evidence is collected via a limited range of methods and does not balance qualitative and quantitative sources. Evidence is mainly anecdotal and patchy, and does not take into account the innovation's lifecycle, features, or users. Of the evidence that does exist, it is not linked directly to what I am trying to achieve. The evidence that exists is biased, not from a relevant use of the innovation, or out of date. The evidence is not representative of how a learner would use this innovation. The evidence cannot be accessed quickly via electronic means. The design of my innovation has not been changed as the result of evidence. Major decisions about my innovation are not underpinned by evidence.
  • 44. Efficacy Framework: Planning and Implementation.
    Action plan: Do you have a plan in place to achieve your outcomes, including milestones, actions, responsibilities and timelines? Does your plan include short- and long-term priorities? Have you identified any potential risks and included actions to mitigate them in your plan? Do you regularly update your plan and communicate changes to relevant people/institutions?
    Governance: Do people within and outside your team understand who is responsible for decision-making regarding your innovation? Have you documented who is responsible for the work, and who should be consulted and informed, and do the relevant people understand this? Have you identified the key processes required to implement your innovation, and are these clearly documented?
    Monitoring and reporting: Do you update your plan based on progress, adapt it where necessary and communicate this to your stakeholders? Do you have access to real-time feedback from your users? Do you identify issues early, discuss them honestly and find solutions? Do you have tools and routines in place to monitor progress (such as emails, calls, document-sharing)?
    Example of a green rating: An electronic plan exists with clearly identified steps, responsibilities and deadlines. The plan includes short- and long-term priorities. The plan is regularly updated and all relevant parties are aware of the changes. Team members know who makes decisions, and each member of the team (within and outside my innovation) is clear about their role. The processes we have in place are documented and well understood, and new members are fully briefed. Data is collected in real time and analysed to provide feedback. Monitoring of the innovation alerts me to issues in real time. Tools and routines are in place to identify and solve problems.
    Example of a red rating: No electronic plan exists. The plan is informal, with actions, responsibilities and timelines unclear. Milestones lack clarity and are either too ambitious or not stretching enough. Potential risks have not been formally identified or planned for. Our action plan has not been updated and adapted. Where feedback exists, it is delayed. Our team is unaware of issues or fails to act on them. Team routines are informal and not focused on monitoring progress. Team members do not know who makes key decisions. Roles for people outside the core team are poorly defined. New team members are unclear about key processes and do not have documentation to refer to.
  • 45. Efficacy Framework: Capacity to deliver.
    Internal capacity and culture: Does your innovation have the right number of people, with the right skillsets, to enable you to deliver your desired outcomes? Does your innovation have a culture focused on delivering outcomes, and is it collaborative and innovative? Do leaders within your innovation support your work, and are there opportunities to work with others across the innovation?
    User capacity and culture: Does the target group understand the objectives and their roles in achieving them? Does the innovation reflect the users' skillset and available resources? Do users have the people, skills, time and resources to achieve their goals? Have you put measures in place to build users' skills?
    Stakeholder relationships: Have you identified who your key stakeholders are, and do you understand their needs and concerns? Do you regularly communicate with your stakeholders? Is there a culture of partnership and collaboration between your innovation and your stakeholders?
    Example of a green rating: The team has the right number of people with the appropriate skillset and experience. The culture is focused on delivering outcomes and is collaborative and innovative. The team has an appropriate budget. The target group understand the objectives and their roles. The innovation takes the users' skillset into account, and there are mechanisms in place to build skills. Users have the appropriate resources to achieve their goals. We meet with stakeholders frequently and have formal and informal conversations. Conversations with stakeholders have led to a culture of trust and partnership over a sustained period of time.
    Example of a red rating: Our team lacks the appropriate skills and resources to deliver the desired outcomes. Our culture feels negative, traditional and not focused on outcomes. The target group and existing users are not aware of what the innovation should help them to achieve and what they need to do to get there. The innovation is ill-suited to the user, and attempts to build users' skills are ineffective. Users do not have the resources and skills to meet their goals. The team and stakeholders have uncertain relationships. Miscommunication occurs frequently, and solving problems jointly is difficult.
  • 46. Contact: kelwyn.looi@pearson.com. Thank you very much for your attention!
  • 47. Appendix
  • 48. Self-evaluation exercise: in small groups of 2-3, evaluate the following matrices, aligned to your innovations, and discuss where and how efficacy can be embedded. What are the advantages of embedding efficacy into common innovation design processes? What are the challenges? What questions do you have about the integration of the two?
  • 49. The Efficacy Framework: each criteria area is rated, with a rationale summary and actions. Outcomes: intended outcomes; overall design; value for money. Evidence: comprehensiveness of evidence; quality of evidence; application of evidence. Planning and implementation: action plan; governance; monitoring and reporting. Capacity to deliver: internal capacity and culture; user capacity and culture; stakeholder relationships.
  • 50. Self-evaluation: Outcomes. What are we trying to achieve? For each sub-criterion, record a rating, rationale summary and actions. Intended outcomes: What is the vision for the outcomes the user of the innovation wants to see? What is the end impact on learning from the innovation? Overall design: Does the design of the innovation fit the users' culture in a way that will eventually impact student achievement? Value for money: Can the user achieve the same goals by investing in an alternative innovation for a lesser investment?
  • 51. Self-evaluation: Evidence. Why do we believe we can achieve this? Comprehensiveness of evidence: How is the proposed innovation different from an existing or competing innovation? Quality of evidence: How consistent is the evidence, and is it quantitatively validated? Application of evidence: Do we have a case study of another innovation with similar characteristics to highlight the evidence for our innovation?
  • 52. Self-evaluation: Planning. How would we achieve it? Action plan: Who is responsible for running point on the implementation and creating the roll-out plan (including timeline and metrics for success)? Governance: Who are the individuals responsible for guiding, monitoring and revising implementation once adopted? Monitoring and reporting: What routines or protocols will be used to gather feedback, ensure quality of implementation, and report on successes as well as areas of need?
  • 53. Self-evaluation: Capacity. Do we have the capacity to deliver? Internal capacity and culture: Do I have the right people, resources and teams to fit and fully support the innovation? User capacity and culture: What have we communicated to the user of the innovation regarding what resources are needed, and what has the user agreed to allocate? Relationships with other stakeholders: What are the potential risks, the commonly agreed plans to mitigate them, and our mutual willingness and trust to re-evaluate what actions are necessary to achieve the desired outcomes?