iNACOL Symposium 2013 - Highlights of Michigan Virtual Learning Research Institute Projects


Freidhoff, J., DeBruler, K., Stimson, R., Kennedy, K., Barbour, M. K., Clark, T., & Winter, P. (2013, October). Highlights of Michigan Virtual Learning Research Institute projects. Presentation at the International Association for K-12 Online Learning (iNACOL) Blended and Online Learning Symposium, Orlando, FL.

Notes for slides:

Slide 1: This study represents a collaboration between two of the Institute's fellows, Michael Barbour and Tom Clark, Kristen DeBruler of the Institute, and other Institute staff. This last issue, growth outpacing research on quality, is critical. The iNACOL Research Agenda Survey asked about what's working and about the gaps we see in knowledge in the field. The gap in quality research is critical: evidence-based quality standards cannot be developed until such evidence is in place. We'll talk more about this later.

Slide 2: Cyber schools AND course providers … criteria by which they should be monitored and evaluated.

Slide 3: The purpose of the study is to gather and analyze the information needed to develop such monitoring and evaluation criteria. We looked at the 50 states through a multi-stage process, including a review of background literature, online surveys, and telephone follow-up with states that had potential as case studies. We found a wide range of policies and practices, which we illustrated through six in-depth profiles. Each state has a unique policy and practice environment, which those recommending potential models and guidelines must keep in mind. The first question is about course-level quality: How do states evaluate the quality of online learning courses? The other two questions are about program-level quality: How do states initially evaluate the quality of online learning programs? How do states ensure the quality of online learning programs on an ongoing basis?

Slide 4: As we gathered the evidence, patterns emerged that eventually became the six dimensions.
  • LEVEL OF EVALUATION & APPROVAL — Provider level: approval based on evaluation and determination of quality of the online provider or program (example: Georgia). Course level: approval required for every online course offered, regardless of provider approval (examples: California, Maryland).
  • APPROVAL REQUIREMENT — Optional: approval not mandated by the state but recognized and required by higher education institutions (example: California). Required: approval mandated by the state, sometimes tied to funding (example: Kansas).
  • GEOGRAPHIC REACH — Multi-district: specific approval requirements for providers enrolling more than a threshold percentage of students outside their district (example: Washington). Single-district: specific approval requirements for providers enrolling students only in their district, or outside their district under a certain threshold. Both: identical approval processes for multi-district and single-district providers.
  • MODE OF INSTRUCTION — Fully online: specific approval requirements for fully online courses. Blended: specific approval requirements for courses delivered via a blend of online and face-to-face instruction.
  • EVALUATION & APPROVAL TIMEFRAME (PROCEDURES) — Initial: initial approval is the sole requirement for online providers (example: Colorado). Ongoing: providers are not required to undergo initial approval but must submit annual reports or undergo annual audits (example: Arkansas). Both: providers must be approved before offering any courses and must undergo annual performance evaluations beyond standard public school reporting (examples: Arizona, Michigan).
  • INSTRUCTION — Full-time online: specific approval processes required only for full-time providers, such as virtual charter schools (example: New Jersey). Supplemental: specific approval processes for providers offering supplemental courses.

Slide 5: Ensuring quality in online courses is a common evaluation approach. The course approval processes of the California Learning Resource Network (CLRN) and the Maryland Virtual Learning Opportunities Program are good examples to consider. Quality Matters, the only validated approach to certifying online course quality of which we are aware, has seen limited adoption at the K-12 level. An evidence-based validation process for widely accepted course standards, such as those of iNACOL, might help states justify their use in evaluating the quality of K-12 online courses.

Slide 6: The education of students participating in full-time online learning programs, as opposed to taking a course or two online, should be of special concern to states. Effective processes for evaluating the quality of online programs are needed. Many states evaluate the quality of full-time providers and programs through an initial approval process. About half of the states approve full-time providers, while nearly two thirds approve programs. In some cases, a provider may offer multiple programs.

Slide 7: Of course, all public schools must annually report key data to their state. Full-time online and blended schools that serve as a student's school of record must do this as well. Our focus was on which states went beyond this; we identified five. Interestingly, one state, Colorado, has removed the additional ongoing reporting requirements it put in place earlier.

Slide 8: Based on analysis of the evidence, we arrived at specific conclusions and recommendations. First, we believe the input-focused initial evaluations of full-time schools should continue, but we encourage states to re-examine their processes as evidence emerges about which components are important to ensuring student success.

Slide 9: Blended is a major trend. To study relative effectiveness, we need to be able to distinguish between blended, online, and traditional, both in terms of courses and programs. The method for doing so needs to be simple enough to apply in state contexts. Here is one way of doing it: define blended schools as those with a significant online learning component.

Slide 10: We believe that periodic external program audits by dedicated teams of experts, as found in British Columbia, can play a valuable role in ensuring program quality and can provide a mechanism for initiating program shutdown when absolutely needed. Audits can also provide an avenue for helping programs remediate quality problems. Another valuable practice to consider is a state review after two years of operation, as in Colorado, either as part of an audit process or as a less intensive paper review performed across all emerging programs in states where there are many. The State of Washington provides a good example of how to differentiate reviews of full-time and part-time providers.

Slide 11: There are now state-level examples of ways to present student growth comparisons across schools to the public in a way that is meaningful and transparent. For example, the State of Colorado now allows visitors to its online student growth data site to generate visual images depicting the level of student achievement against the level of student academic growth. Based on data from 2011-2012, these images show that, relative to other schools, fully online schools tend to have students who are lower achieving, and these students also tend to have lower growth trajectories. The difference is most pronounced in math and least pronounced in reading.

Slide 15: We would like to encourage online and blended programs to collaborate actively with researchers to obtain evidence on "what works" in terms of course and program quality. Evidence-based quality standards for K-12 online and blended learning cannot be developed until such evidence is in place. In higher education, Quality Matters has created evidence-based standards grounded in "what works" in published educational research.

Slide 16: In higher education, Quality Matters has created evidence-based standards grounded in "what works" in published educational research. This grounding has facilitated development of a nationally recognized certification process that provides third-party validation of the quality of courses and online components at over 700 participating colleges and universities. We would like to see similar processes developed at the K-12 level: evidence-based, third-party validation of course and program quality. In the meantime, external reviews and audits can help states validate the quality of courses and programs. This would involve a collaboration of schools and providers with professional and state associations, colleges, and universities.
Transcript:

    1. Overview • Growth in K-12 online & blended learning programs & enrollments, in MI & U.S. • MI Legislature lifts ban on cyber charters (PA 227, 2010) • Removes restrictions, creates pro-growth policies (PA 129, 2012) • Growth is outpacing research on quality in K-12 online learning
    2. Overview • MVU tasked to develop COLRI (PA 201, 2012) • Provide leadership for MI online & blended learning • Key COLRI task: research, develop, and recommend annually to the department criteria by which cyber schools and online course providers should be monitored and evaluated to ensure a quality education for their pupils (p. 44)
    3. Methodology • Purpose: To examine existing policies and practices related to the evaluation and approval of K-12 online learning in the U.S. – RQ1: How do states evaluate the quality of online learning courses? – RQ2: How do states initially evaluate the quality of online learning programs? – RQ3: How do states ensure the quality of online learning programs on an ongoing basis?
    4. Six Dimensions of Consideration • Evaluation & Approval: Level (Provider / Course) • Approval Requirement (Optional / Required) • Geographic Reach (Multi-District / Single-District) • Modes of Instruction (Fully Online / Blended) • Evaluation & Approval: Timeframe (Front-End / Ongoing) • Instruction (Full-time / Supplemental)
    5. Findings • RQ1: How do states evaluate the quality of online learning courses? • States typically focus either at course or provider level • Some do both (GA, for example) • 11 states evaluate course quality • MD's MVLO and CA's CLRN: good prescriptive & optional review examples
    6. Findings • RQ2: How do states initially evaluate the quality of online learning programs? • 24 states require initial approval of F-T providers; approval process ranges from simple to complex • 33 states require initial approval of F-T programs (usually as charters) • Example: GA
    7. Findings • RQ3: How do states ensure the quality of online learning programs on an ongoing basis? • All states that permit F-T online public schools require them to report like other public schools • At least 5 states require ongoing additional reporting or audits, beyond standard reporting • Examples: AZ, MI • One state (CO) recently removed ongoing evaluation, now has only initial approval
    8. Recommendations • Continue input-focused evaluation and approval processes for F-T online schools – Seek to ensure they meet basic quality standards during development & startup – Consider elimination of input processes not supported by research or evidence of student impact
    9. Recommendations • Define blended schools with a significant online learning component, and track their results – For example, define blended as 30%-80% of instructional time online – Track results separately from F-T online (over 80%) – Track separately from supplemental use in traditional schools (under 30% online)
    10. Recommendations • Consider adopting an intensive state review process for F-T online schools – After two years of operation or on a periodic basis as funding permits – BC: good external audit model – WA: good P-T vs. F-T differential review model
    11. Recommendations • Adopt a student growth model for K-12 student performance data analysis – Provide public online access to comparative analyses of data – Facilitate comparison of F-T online, blended, and traditional school results
    12. Recommendations • Collaborate actively with educational researchers to help build the evidence base for what works in K-12 online and blended learning
    13. Recommendations • Adopt processes across states for evidence-based, third-party external validation of K-12 online course and program quality – Work in collaboration with professional associations, associations of states, online learning providers, and post-secondary institutions
    14. Thank you! • Michael Barbour, MVLRI Fellow • Tom Clark, MVLRI Fellow • Kristen DeBruler & MVLRI colleagues