The document outlines American Public University System's program review process, which takes an interdisciplinary approach to ensure quality and accountability. The program review process involves collecting data from various sources, conducting an external review, and benchmarking programs against similar programs. Program findings are summarized, and recommendations are made for a three-year strategic plan. Data from course evaluations and fact books are used to assess programs and drive decisions as part of a continuous improvement cycle.
Scaling the mountain of faculty governance in building institutional online c... (Dr. Kristin Palmer)
Panel discussion for the OLC Innovate Conference in Denver, Colorado in April 2019, looking at common elements of governance around online learning, with strategies and tools.
The American Public University System (APUS) will discuss strategies to engage faculty and administrators in ongoing assessment processes at the fully online distance learning institution. As a result of these strategies, faculty and administrators are increasingly integrating assessment information into daily decision-making processes to improve academic courses and programs.
This leadership webinar focuses on current practices for evaluating program effectiveness, the evaluation tools in use, and how state online schools analyze multiple sources of data to understand program success. Presenters will lead a discussion of important considerations around the ongoing formative data collected to inform teachers and administrators about what contributes to student success in online courses. The panelists will explore how their programs approach the collection of data and what methodology they use to organize and present data for school or district leaders.
Presented by Pat Marshall, Deputy Commissioner for Academic Affairs & Student Success, and Christine Williams, Director of Strategic Initiatives for Academic Affairs & Student Success, at the June 20, 2017 meeting of the Massachusetts Board of Higher Education.
HEIR conference 8-9 September 2014: Forsyth and Stubbs (Rachel Forsyth)
Rewriting the Rules: Institutional procedural change based on analysis of student feedback
As part of a large JISC-supported institutional project on assessment and feedback, two different types of institutional data were analysed to identify potential changes to assessment procedures and practice. Comments from institutional student survey data were analysed to identify 10,000 comments relating to assessment. Coding of these comments enabled the project team to identify a series of areas for change which were common across the institution, rather than just using the survey data for course-level changes, which had happened in the past. This led to the production of new institutional assessment procedures designed to improve the student experience. Institutional records about assignment types, which had been produced simply to support course validation, were then analysed to discover the ten most common types of assignment in use across the institution. Detailed guidance on implementing the new procedures was then developed for these ten assignment types, which accounted for two-thirds of the total number of assignments being taken by students. The combination of data from different parts of the institution has enabled change to be made and supported in a way novel to the university.
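The frequency analysis described above, finding the most common assignment types and checking what share of all assignments they cover, can be sketched as a simple tally. This is an illustrative sketch only: the record values and counts below are hypothetical, not the project's actual data.

```python
from collections import Counter

# Hypothetical institutional records: one entry per assignment instance,
# each tagged with its validated assignment type.
records = (
    ["essay"] * 300 + ["exam"] * 250 + ["report"] * 180 +
    ["presentation"] * 120 + ["portfolio"] * 90 + ["practical"] * 60
)

counts = Counter(records)
total = sum(counts.values())

# The ten most common assignment types across the institution.
top_ten = counts.most_common(10)

# Share of all assignments those types account for; in the project this
# figure (two-thirds) justified writing detailed guidance for them first.
coverage = sum(n for _, n in top_ten) / total
print(top_ten[0], round(coverage, 2))
```

The same tally applies directly to coded survey comments: replace assignment types with theme codes and the most common codes surface the institution-wide areas for change.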
How Student Data and Analytics can be used to Target Intervention and Improve... (Kelly Rennie)
Presentation delivered by John Gledhill at EUNIS 2015, showing how student analytics can be used to help universities better target intervention strategies at those students most in need and how to improve outcomes for students.
Two Balls, One Bat: The SACSCOC Fifth-Year Review as a Motivator For Institut... (Jon Ernstberger)
In this presentation given at the 2016 GICA (Georgia Independent Colleges Association) SACSCOC Workshare Conference, I describe a framework and mindset to use SACSCOC distance education guidelines and best practices to generate institutional improvement.
Presented by Chris Gabrieli, chair of the Massachusetts Board of Higher Education, at the Massachusetts Early College Initiative launch event on March 23, 2017. #ecil17
Event sponsors: Massachusetts Executive Office of Education, Department of Higher Education, Department of Elementary & Secondary Education
Event partners: MassINC, Massachusetts Business Roundtable, Rennie Center, Jobs for the Future
Presented to the Board of Higher Education and Board of Elementary and Secondary Education at the joint meeting on January 26, 2016 at Roxbury Community College.
In an effort to increase graduate student retention and graduation rates, the University of North Texas is in the process of developing academic support services for graduate students outside the classroom. Based on data gathered as part of a larger needs-assessment, new programs include research and statistical support, expanded tutoring options, and individual academic coaching. Participants in this session will learn about the process of developing, implementing, and marketing these programs along with information on future plans for the programs including expansion, refining, and formal assessment.
Rider University Ed.D. in Educational Leadership Launch (Jamie Mitchell)
This Power Point was shared by Dr. Len Goduto at the reception to celebrate the launch of Rider University's Ed.D. in Educational Leadership program. Learn more at www.rider.edu/edd
This presentation investigates the characteristics of an online graduate degree program in library and information studies (LIS), and its unusual success in retaining students to degree conferral. It has been reported for more than a dozen years that attrition rates for distance education programs are higher than for those programs in which instruction is delivered face to face. In the present study, an online master's degree program in LIS that has maintained an overall retention rate higher than 90 percent over five successive entering classes is examined for explanatory characteristics. These characteristics are described and compared with attributes that the literature relates to retention success. Mapping the characteristics of our LIS program to the factors for retention requires description of specific implementations of the program design. We detail the factors and activities recommended for student retention and provide a summary of the activities inherent in the implementation of our successful LIS program. Additional questions for investigation are identified.
Quality frameworks for e-learning (SIEAD 2018, Brazil) (Jon Rosewell)
A contribution to INTERNATIONAL SEMINAR ON OPEN AND DISTANCE EDUCATION (SIEAD-BR 2018) 22nd October 2018.
"Contributions from Open and Distance Education to Higher Education Quality: present and future"
In this presentation I will suggest using a quality framework to help you think about and improve the quality of e-learning. I start with some general observations about quality and the need for quality frameworks. I then discuss two specific frameworks: the well-established E-xcellence benchmarks for e-learning, and the OpenupEd framework, which has been specifically aligned with MOOCs. Finally I return to some more practical advice, particularly about thinking about the learning design of a course at an early stage.
The San Jose State University (SJSU) School of Information (iSchool) hosts online and in-person open house events. Find out more about the iSchool's lifelong learning solutions in this presentation, originally given at the Santa Clara County Library District in Campbell, CA on September 29, 2015.
Developing Extension Pre-Service Training Programs for Sub-Saharan Countries (MEAS)
Extension agents are critical to the success of any extension program, and one of the weakest yet most critical resources for strengthening extension is its staff.
Field agents often have little knowledge of or experience in extension education and need competence in planning, delivery, evaluation, communication, and teaching methods.
Purpose:
Agricultural college personnel determine ways to incorporate extension education into the master's curricula
Objectives:
1. Investigate the potential for extension education programs
2. Identify skills/competencies needed by agriculture graduates
3. Propose an extension education program model
The landscape of competency-based education (CBE) has progressed tremendously over the past few years in higher education. This session will provide a brief overview of CBE regulations, design models, and learner experience considerations, focusing on how CBE can be successfully delivered using quality frameworks. Participants will identify, discuss, and explore key inputs supporting an engaged faculty and student experience. Join three CBE leaders as they connect how their own institutions have instituted, continuously improved, and evolved various CBE models in this interactive presentation.
Session Objectives:
Recognize the key features and value of competency-based education for learners, workforce, and educational institutions.
Explain the foundations of operationalizing CBE and the key elements of the engaged faculty and learner experience.
Connect C-BEN's quality frameworks to how CBE is implemented at three leading institutions.
1. Program Review Process: An Interdisciplinary Approach to Ensure Quality and Accountability Higher Learning Commission Conference | Chicago, IL | April 2009
2. Overview • Introductions • APUS Mission and Demographics • Goals of the Program Review Process • Program Review Process, Tools, and Resources • Use of Continuous Improvement in Business Program • Use of Fact Books to Drive Decisions • Conclusion and Dialogue
5. Our Mission To educate the nation's military and public service communities by providing respected, relevant, affordable, and student-focused online programs which prepare them for service and leadership in a diverse, global society.
6. Who We Are • Founded in 1991 as American Military University to provide affordable and convenient distance learning to service members • In 2002, expanded reach to national security, public safety and other professionals seeking affordable education and online convenience, establishing American Public University System with member institutions: American Military University and American Public University • We are a 100% online for-profit school that serves 49,000+ students who study online from 100 countries.
8. Student Body Profile: Graduate 24%, Undergraduate 76% (charts: % of students by degree program; % of students by school)
10. Paper-Free Process, Electronic Repository of Information • Library and Learning Resources: course books, electronic resources, learning strategies • Curriculum Assessment: student learning outcomes, instructional strategies, evaluation procedures, academic rigor • External Reviewer Feedback: expert reviewer report, Industry Advisory Council report
11. Paper-Free Process, Electronic Repository of Information • Faculty: analysis of faculty credentials and expertise to ensure breadth and diversity • Students: student demographic information, enrollment history, growth trends • Learning Outcomes Assessment: curricular mapping, assessment measures, fact book
12. Paper-Free Process, Electronic Repository of Information • Program Benchmarking: benchmarking with similar programs and institutions • Program Director Summary: evaluation of findings, program recommendations, three-year proposed strategic plan • Program Review Findings: Dean's observations, meeting minutes
Harbert - 2 min. Our curriculum aligns with our mission: educating those who serve. Specialty programs match professional requirements, alongside traditional educational options for students in that community who may wish to pursue careers in other areas of service.
Harbert - 2 min. A diverse student body of 31,000 civilian and military students. The School of Education is a new school; except for that, you can see that the students are fairly evenly distributed.
The ongoing program review process at APUS consists of 6 distinct phases:
1 - Data Collection: Each PD is responsible for this phase, with assistance from their faculty and administrative staff. The PD reviews different aspects of the program and prepares data and information for others to review (this information is outlined in the next few slides).
2 - External Review: The program is reviewed by an evaluator from outside of the APUS system and community.
3 - Analysis: The school Dean and PD review all relevant data. After they complete an analysis of the program, they prepare a report of the findings. Once all of this information is compiled, it becomes available to individuals within the APUS community to read and review.
4 - Program Review Meeting: After people have had a chance to review the material from the previous 3 phases, there is a formal 2-hour review meeting led by our Academic Dean to engage stakeholders from across the university. Stakeholders include the PD, school Dean, enrollment management, student services, transfer credit staff, the registrar's office, faculty development, library resources, learning outcomes assessment, marketing, and the Provost. These meetings provide an opportunity for stakeholders to openly and honestly discuss the data presented by the PD; all parties are involved in the discussion.
5 - Three-Year Plan: Each PD is required to develop a 3-year plan of program development as part of the review. Following the review meeting, this ensures that the PD is held accountable. Given that programs are on a 3-year cycle of review, this document also becomes a source of analysis for the subsequent reviews.
6 - Follow-Up: To monitor progress, the 3-year plan is followed up by the school Dean to ensure that it is on target. The follow-up takes place during the annual strategic planning cycle so each program can prepare for the coming year.
The analysis of data in the program review is a key part of the process, enabling reviewers to evaluate all aspects of the program during each phase. The primary tool used to house the data is an electronic storage repository that has been developed specifically for this purpose: a paper-free repository that allows members from across the university to access program information on a shared network drive. There are 9 different dimensions of the program that are examined in the review:
Library and Learning Resources: How frequently is the library used by students in the program? What electronic resources are used? Are the textbooks relevant and up to date?
Curriculum Assessment: What are the program-level objectives? Are they current and relevant? Are the course-level objectives aligned with the program- and institutional-level objectives? What instructional and evaluation strategies are used? Are they varied, and do they accommodate different learning styles? To accomplish this review, all course syllabi are reviewed as part of this process.
External Review Feedback: The program is reviewed by an evaluator from outside of the APUS system, typically a recognized academic leader in the program area who teaches or is an administrator at another university. The reviewer writes a report that includes an assessment of the program and recommendations for change. Some programs have industry advisory councils, groups of individuals who meet a few times a year to provide guidance on the development of the program.
Faculty: Who are the faculty in the program area? What are their credentials? What is the student/faculty ratio?
Students: Who are the students, and where do they come from? Enrollment history and program growth are examined. What draws students to this program, and why? Are students seeking specific program content that is not covered? What is unique about this program from a student's perspective, and how does this uniqueness serve our mission?
Learning Outcomes Assessment: PDs create curricular maps to ensure that course objectives are aligned with program and institutional objectives. The OOP also creates fact books for each of the programs. Fact books contain information such as student and faculty demographic data, course completion and withdrawal rates, grade distributions for courses, and results from direct and indirect measures of assessment. They are typically 15-20 page booklets that display all data available for a particular program.
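Two of the fact book metrics named above, course completion/withdrawal rates and grade distributions, can be computed from raw enrollment records with a short aggregation. This is a minimal sketch: the course codes, record layout, and values below are hypothetical, not APUS data.

```python
from collections import defaultdict

# Hypothetical enrollment records: (course, outcome), where the outcome is
# a final grade or "W" for a withdrawal.
enrollments = [
    ("BUSN101", "A"), ("BUSN101", "B"), ("BUSN101", "W"), ("BUSN101", "C"),
    ("MGMT200", "A"), ("MGMT200", "W"),
]

by_course = defaultdict(list)
for course, outcome in enrollments:
    by_course[course].append(outcome)

for course, outcomes in sorted(by_course.items()):
    # Completion rate: share of enrollments that did not end in withdrawal.
    completion_rate = 1 - outcomes.count("W") / len(outcomes)
    # Grade distribution over completed enrollments only.
    grades = [g for g in outcomes if g != "W"]
    distribution = {g: grades.count(g) / len(grades) for g in sorted(set(grades))}
    print(course, round(completion_rate, 2), distribution)
```

A fact book page would tabulate these per-course figures alongside demographic data and assessment results for the program under review.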
Benchmarking: How does the program compare with similar programs at other accredited academic institutions? The courses offered, number of credit hours, cost of the program, student population, and other related data are analyzed in a detailed spreadsheet.
PD Summary: The PD prepares a report that identifies the strengths and challenges of the program. This report includes an evaluation of findings, program recommendations, and a 3-year proposed strategic plan for continuous improvement.
Review Findings: This information is compiled after the program review meeting and includes the Dean's observations and meeting minutes.
Chad will now talk about how the program review process has been used for continuous improvement in the Business Administration program.
Introduction – Courses included in the Bachelor of Business Administration program are spread across three other departments: management, marketing, and math. Although these core and major courses are owned by different department chairs, they must be included in BBA Learning Outcomes Assessment because they reflect the students' overall program experience. The program data will stay intact.