This presentation summarizes a program evaluation project I conducted on CSC230 Database for Web Applications, a course I teach online to community college students.
The document discusses different views and approaches to evaluation including objective vs subjective views, utilitarian vs intuitionist-pluralist approaches, and quantitative vs qualitative methods. It also identifies factors that contribute to alternative views of evaluation such as philosophical beliefs, methodological preferences, and practical experience. Finally, it presents a classification schema that identifies six main approaches to evaluation.
This document discusses various approaches to program evaluation including objective-oriented, expertise-oriented, participant-oriented, and consumer-oriented approaches. It provides examples of each approach and how they may be applied. Strengths and weaknesses of each approach are considered. The document also discusses evaluation methods such as surveys, interviews, and mixed methods. References are provided on related research and examples of evaluation studies.
This document discusses two approaches to consumer-oriented evaluation: summative and formative. It introduces Consumers Union, an independent nonprofit organization founded in the 1930s to assist consumers. Consumers Union publishes the Consumer Reports magazine and website to evaluate products. The document also profiles Michael Scriven, considered a major contributor to consumer-oriented evaluation, and his extensive checklist for evaluating products. Both the checklist and Consumers Union aim to provide consumers with independent and thorough evaluations for making informed purchasing decisions.
This document discusses different approaches to program evaluation. It defines program evaluation as the systematic gathering of information to make decisions about improving and assessing the effectiveness of a curriculum. There are four main approaches discussed: product-oriented, which focuses on achieving goals and objectives; static-characteristic, which uses outside experts; process-oriented, which questions the worth of goals; and decision-facilitation, which gathers information to help administrators make judgments. The document also covers dimensions that shape evaluation perspectives, including the purpose (formative or summative), type of information (process or product), and type of data and analysis (quantitative or qualitative).
The document discusses several evaluation models: expertise approach, consumer-oriented approach, program-oriented approach, participant-oriented approach, and decision-oriented approach. The consumer-oriented approach developed by Scriven in the 1960s focuses on evaluating products by determining important criteria, establishing standards, and examining performance against criteria to provide information to consumers. This approach advocates for consumer education and independent reviews of products. The document provides advantages and disadvantages of each approach.
Expertise, Consumer-Oriented, and Program-Oriented Evaluation Approaches (dctrcurry)
All information referenced from: Fitzpatrick, J., Sanders, J., & Worthen, B. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, N.J.: Pearson Education.
What is program evaluation lecture 100207 [compatibility mode] (Jennifer Morrow)
The document discusses what program evaluation is, including defining it as the systematic collection of information about program activities, characteristics, and outcomes to improve effectiveness and inform decision making. It also outlines the types and purposes of evaluation, how to prepare for and conduct an evaluation by developing a logic model and methodology, and important considerations around data collection, analysis, and ethics.
This presentation covers programme evaluation: planning, requirements and purpose, the steps involved, uses of evaluation, stakeholders and their roles, finding and analysing results, standards of effective evaluation, and utilization of findings.
This document discusses approaches to program evaluation. It defines program evaluation as the systematic gathering of information to make decisions about a program. There are four main approaches discussed: product-oriented, which focuses on achieving goals and objectives; static-characteristic, which uses outside experts to determine effectiveness; process-oriented, which questions the worth of program goals; and decision-facilitation, which gathers information to help administrators make judgments. The document also outlines dimensions that shape evaluation perspectives, including formative vs. summative purposes, process vs. product focuses, and quantitative vs. qualitative data types.
The basic steps to program evaluation are to first define the purpose and objectives of the evaluation by identifying stakeholders, budget, timeline and intended outcomes. Next, a plan is created which determines the evaluation questions and selects a model to collect both qualitative and quantitative data from sources like questionnaires and interviews. Finally, the data is analyzed and findings are reported in a final evaluation report.
Dr. G. Sawarkar: Qualitative & Quantitative Evaluation Methods (Gaurav Sawarkar)
This document discusses quantitative and qualitative methods for evaluation. Both methods provide important information but are rarely used alone and generally provide the best overview when combined. Quantitative data answers questions like "how many" through surveys and statistics, while qualitative data answers questions like "why" or "how" through interviews, observations, and focus groups. Both have strengths like precision for quantitative and context for qualitative, but also limitations such as generalizability for qualitative and complexity for quantitative. Using both methods together can provide the fullest picture of a project for evaluation purposes.
The presentation is a systematic and comprehensive formative evaluation plan to investigate the implementation of Social Studies Education for Democratic Citizenship (SSEDC) in its mature stage. The lead evaluator will select a team to guide and conduct key actions throughout the evaluation process. The plan begins with the Grades K-6 program description, followed by the theoretical framework, including the research questions that will guide the project over a 12-week period. The methodology will be a mixed-methods survey design, using multiple methods to collect quantitative and qualitative data. The sampled target group will include various stakeholders in the school community, including the implementers and others as the need arises. Content and descriptive data analyses are the suggested methods to extract themes and concepts and to highlight possible findings influenced by (a) teachers' understanding of the SSEDC goals; (b) methods used by teachers; and (c) problems teachers are experiencing during implementation. The evidence will form the basis for findings and conclusions, and for recommending strategies for improving SSEDC. The evaluation team will put in place measures to promote accurate results and efficient reporting procedures respected by the internal stakeholders, the designers and implementers.
The document discusses the process of evaluation in healthcare education. It defines evaluation as gathering data to determine the success of an action. The key aspects of evaluation include having a clear focus, design, data collection process, analysis and interpretation of results, reporting, and utilizing results. Different types of evaluation include process, content, outcome, impact, and total program evaluation. Proper evaluation requires selecting appropriate methods, instruments, and addressing potential barriers.
Program evaluation and outdoor education: An overview (James Neill)
This document provides an overview of program evaluation in outdoor education. It discusses what program evaluation is, why it's important to do, and different evaluation methods and tools. The presentation considers example evaluation studies and allows time to workshop individual program needs. Program evaluation aims to systematically determine a program's value by answering questions about needs, feasibility, process, outcomes, costs and generalizability. Common data collection methods include questionnaires, interviews, documentation review, observation, and focus groups. The evaluation process involves defining the purpose and audience, identifying objectives and stakeholders, gathering and analyzing data, and reporting/disseminating results.
This document outlines a presentation on evaluating a national health programme. It discusses key topics such as monitoring versus evaluation; the history and purpose of evaluation; and different types of evaluation, including formative, summative, and participatory evaluation. It details the evaluation process, from planning evaluations and gathering baseline data to implementing evaluations and using their results, and covers standards for effective evaluation: ensuring the utility, feasibility, propriety, and accuracy of evaluations. Overall, the document presents best practices for conducting program evaluations of national health initiatives.
The document discusses improving course evaluation methods at universities. It notes low survey response rates are a key issue and that in-class administration can help increase participation. Students want feedback that directly benefits them and surveys should engage students and clearly communicate the actions taken in response. A consistent, centralized approach to surveys is recommended to allow benchmarking, but individual departments should have flexibility to include bespoke questions. Effective course evaluation provides evidence of a university's value to students.
This document discusses management-oriented evaluation approaches. It begins by stating that these approaches aim to serve decision makers by providing evaluation information that supports good decision making. It describes the CIPP model created by Stufflebeam, which evaluates programs based on Context, Input, Process, and Product. The document also discusses other early evaluation models, such as the UCLA model. It notes strengths of the management approach, including focusing evaluations and linking them to decision making. Potential limitations include the evaluator becoming too aligned with management or evaluations becoming too complex.
Overall, assessments are used either as programmatic assessments or as learning assessments. One of the most familiar learning assessments is the multiple-choice test, which reflects the traditional pen-and-paper classroom test (Popham, 2006). However, such tests are difficult to construct in a way that ensures validity, owing to unclear directions, ambiguous statements, unintended clues, complicated syntax, and difficult vocabulary (Popham, 2006). Other learning assessments with construct validity, such as the essay and the reflective journal, tend to reflect student-centered pedagogy. These assessments are ideal for assessing individual learning outcomes and increase students' personal responsibility for their own learning. This reading document provides a brief summary of assessment tools available for both programmatic and learning assessment.
Program evaluation is a systematic process to determine if a program achieved its intended outcomes. It involves defining goals and measurable objectives, designing an evaluation plan to collect relevant data, gathering both quantitative and qualitative data according to the plan, analyzing the results, and reporting findings to stakeholders. The overall process helps assess program effectiveness and inform future planning and implementation.
This presentation continues the first part, which covered the basics of program evaluation. It contains slides describing impact evaluation in detail and explains the logical framework with practical examples.
N.B.: Please go through it in slide view to see the animation effects.
A presentation for my Ed.D. degree program on program evaluation models: developers of the management-oriented evaluation approach and their contributions; how the approach has been used; its strengths and limitations; other references; and questions for discussion.
Program Evaluation: Forms and Approaches (Helen A. Casimiro)
This document discusses different forms and approaches to program evaluation. It describes five forms of evaluation: 1) Proactive Evaluation which occurs before program design to synthesize knowledge for decisions, 2) Clarificative Evaluation which occurs early in a program to document essential dimensions, 3) Participatory/Interactive Evaluation which occurs during delivery to involve stakeholders, 4) Monitoring Evaluation which occurs over the life of an established program to check progress, and 5) Impact Evaluation which assesses the effects of a settled program. It also outlines several evaluation approaches including behavioral objectives, four-level training outcomes, responsive, goal-free, and utilization-focused evaluations.
Evaluation is a critical component of public policy and other forms of policy. These slides give a short overview of the relevance of evaluation in every capacity.
The document discusses four purposes of evaluation:
1) Assessment of merit and worth, also called summative evaluation, to determine the overall impact of a program.
2) Program and organizational improvement, also called formative evaluation, to provide feedback for ongoing improvement.
3) Oversight and compliance to ensure proper implementation and adherence to regulations.
4) Knowledge generation to contribute to theory and build understanding.
This document discusses the evolution of programmatic assessment in UK medical training over the past 30 years. It outlines how assessment has shifted from high-stakes exit exams to integrated programs that use workplace-based assessments like mini-CEX, DOPS, and CbD. Key organizations like the GMC, PMETB, and foundation program have developed principles of good assessment including assessing multiple competencies through various methods. The foundation program initially piloted four assessment tools but has since refined these to better provide feedback and identify trainees needing support. Overall, the document traces the progression towards valid programmatic assessment across medical education in the UK.
This document proposes a model for programmatic assessment that optimizes assessment for learning while arriving at robust decisions about learner progress. The model distinguishes between learning activities, assessment activities, and learner support activities throughout an ongoing curriculum. Individual assessments are designed to be maximally informative for learning, while a longitudinal program of various assessment methods contributes to certification decisions. The principles discussed include ensuring validity in standardized and non-standardized assessments, using both quantitative and qualitative data, and relying on expert judgement at various evaluation points. An example is provided of how this model could be applied to a blended TeleGeriatrics Nurse Training Course.
The purpose of this Programme Exit Survey (PES) was to provide data to gauge perceptions of various aspects of the programmes and services offered and to identify areas where improvements may be needed in the Department of Electronic Engineering (Computer) JKE, Politeknik Kota Kinabalu (PKK). The PES was conducted on 21 final-semester students graduating from the Diploma in Electronic Engineering (Computer) (DTK); they were the second cohort, whose intake was in December 2010. The survey questionnaire had five main sections: respondents' profile; assessment of overall quality; assessment of skills and knowledge; assessment of lecturers and the academic advisor; and assessment of academic resources and facilities. All the data were analysed using the Statistical Product and Service Solutions (SPSS) software, version IBM SPSS Statistics 19.0. For the assessment of overall quality, the teaching and learning experience attribute was rated 100% "excellent", "very good", or "good". The skills and knowledge section was evaluated by relating the statements to nine items stated in the Programme Learning Outcomes (PLO); all the PLOs were rated at least "good" by 98% of the students. Lecturers and the academic advisor were rated "excellent" by 33.3% of respondents and "very good" by 57.1%. In terms of academic resources and facilities, access to Wi-Fi drew the greatest dissatisfaction, with 28.6% of respondents rating the item as "weak".
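The percentage figures in an exit survey like this are simple frequency tabulations over the respondent pool. As a rough illustration of the kind of analysis the study ran in SPSS, the same tabulation can be sketched in a few lines of Python; the rating counts below are hypothetical stand-ins chosen to match the reported 33.3% / 57.1% split for 21 respondents, not the study's actual data.

```python
from collections import Counter

# Hypothetical ratings from 21 respondents (stand-in data, not the
# actual PES responses): 7 "excellent", 12 "very good", 1 "good", 1 "weak".
ratings = (["excellent"] * 7 + ["very good"] * 12
           + ["good"] * 1 + ["weak"] * 1)

counts = Counter(ratings)          # frequency of each rating category
total = len(ratings)               # 21 respondents

# Percentage of respondents per category, rounded to one decimal place
# as exit-survey reports typically present them.
percentages = {cat: round(100 * n / total, 1) for cat, n in counts.items()}
print(percentages)
# 7/21 -> 33.3 ("excellent"), 12/21 -> 57.1 ("very good")
```

With these stand-in counts, the computed shares reproduce the reported lecturer/advisor figures, which is the same arithmetic SPSS's frequency procedure performs on the real responses.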
Online Assessment | Klaus Dieter Rossade (OUUK), EADTU
This document summarizes the key points from a presentation on designing online assessment solutions. It discusses the transition from in-person to online exams, principles for future assessment including being authentic, accessible, appropriately automated and continuous. Features of future assessment are outlined, emphasizing assessment being embedded in the learning process. Good practice examples from universities demonstrate flexibility and choice in assessment. The document concludes with references for further reading.
Online Assessment | Klaus Dieter Rossade (OUUK)EADTU
This document summarizes the key points from a presentation on designing online assessment solutions. It discusses the transition from in-person to online exams, principles for future assessment including being authentic, accessible, appropriately automated and continuous. Features of future assessment are outlined, emphasizing assessment being embedded in the learning process. Good practice examples from universities demonstrate flexibility and choice in assessment. The document concludes with references for further reading.
The First Year Experience - Lisa Curran - RMIT UniversityBlackboard APAC
This presentation will provide a case study and overview of our findings to demonstrate how supporting and building staff capacity in instructional design through the application of Quality Matters standards and the use of Blackboard and digital tools, can enhance the First Year Experience of students in large first year business courses across transnational delivery locations.
Academic Planning and Strategies Faculty Development Model - Competency-Based...Becky Lopanec
This document outlines plans and strategies for an academic program called Accelerated Programmer Training (APT) at Austin Community College. It discusses problems the program aims to address like high demand for IT jobs but low completers of computer science/IT programs. The document details plans to recruit and retain students through competency-based education, career tracks, and partnerships. It discusses building the program team, developing the curriculum by mapping competencies, and recognizing student achievements through stackable credentials. The goal is to accelerate training of unemployed/underemployed individuals to meet local IT workforce needs.
This document outlines the criteria and weightages for NBA accreditation (Tier II) of engineering programs. It discusses 12 criteria for evaluation including vision, mission and program objectives, program outcomes, curriculum, student performance, faculty contributions, facilities, teaching-learning processes, governance and finances. Maximum points are allocated to each criterion and minimum qualifying points are also specified. Guidelines for a 5-year accreditation require a minimum of 750 total points including minimum scores in mandatory criteria. A 2-year accreditation requires 600 total points and minimum scores in mandatory criteria. The document provides details on the evaluation process and points allocation for each criterion.
This document provides a summary of a project to develop a distance learning design and model for the North Carolina Community College System (NCCCS). Key points:
1. The project was funded to address distance education needs across the NCCCS and involved developing training materials, models, resources and marketing.
2. A needs assessment survey found most faculty and staff want distance education training and prefer training on developing internet-based courses.
3. The report outlines training modules on basic and advanced internet skills and using the internet for instruction to prepare faculty to develop and teach online courses.
4. Recommendations include establishing a distance education training center, grants for collaborative online course development, and competency guidelines for
The 3-year online Bachelor of Computer Applications (BCA) degree program from Chandigarh University provides students with both theoretical knowledge and practical skills needed for a career in the digital world. It is accredited by UGC and NAAC and ranked highly among Indian universities. The program uses an advanced learning management system to deliver live sessions, industry talks, case studies and Harvard Business Publishing modules. It aims to bridge the gap between classroom learning and real-world applications. Students receive a degree equivalent to the conventional BCA degree and are supported with career services and scholarships to make the program affordable.
Evaluating Distance Education: Focus on Online Course EvaluationJulia Parra
The document discusses quality assurance processes for online courses, focusing on the Quality Matters Rubric. The Quality Matters Rubric is a peer-review process used to certify and improve the quality of online courses. It consists of forty specific elements across eight standards. Courses must meet all essential standards and earn over 72 points to pass review. The document also discusses additional ways the university evaluates its online courses, such as surveys and student practicums assessed with the Quality Matters Rubric.
Enhancing Student Employability Through The Peer Review Of Professional Onlin...Thomas Lancaster
This research talk discusses the peer review process used at Birmingham City University for the Computing module Research and Professional Practice UG2. The module requires students to create a Professional Online Presence and provide a positive view of themselves to employers. The slides, originally presented at RESCON 2015, assess the use of a peer review process to allow students to assess the contributions of one another and receive valuable feedback.
Presentation delivered by Robin McGregor, Director of Learning Enhancement at North East Scotland College, at the Scottish Learning Technology Network meeting on the 16th of March, 2018.
University Recommendation Support System using ML AlgorithmsIRJET Journal
This document presents a university recommendation system that uses machine learning algorithms like KNN and SVM to analyze student profile data and recommend top universities with the highest chance of admission. The system collects data on student attributes and admission outcomes from 45 universities on the edulix.com forum. It cleans, pre-processes and selects important features from the data. Models are trained using KNN and SVM classification and used to suggest a top 10 university list customized for new student profiles to maximize chances of acceptance. The system aims to help students struggling with the complex university selection process.
Quality is like a Box of Chocolates: Developing National QA Guidelines for Di...Mark Brown
The document discusses quality assurance guidelines for digital education. It begins by noting the varied interpretations of quality and outlines insights from an OECD study that found most countries lack specific standards for digital education. It then discusses the development of national QA guidelines in Ireland, including feedback received during consultation. Lastly, it examines lessons for updating the European Standards and Guidelines, such as including new indicators on digital learning design, resources, and staff training.
The document discusses various aspects of evaluating online learning programs, including comparing the accreditation process to program reviews, defining distance education and correspondence courses, seeking approval from accrediting bodies for offering online programs, applying quality standards to online course design and teaching, and analyzing metrics like student satisfaction, learning outcomes, completion rates, and employment outcomes. It also provides examples of how to structure an on-campus program review that could inform the online program review process.
New Trends in Assessment and Evaluation_ Online Examination, Computer- Based ...trinayandutta700
This is presentation done for Sibsagar College of Teachers Education. It mainly focus on Online Exam and Computer Based Exam its advantage and disadvantage , Role of teacher , Government initiative, why this form of examination is important. I hope everyone like it - From Trinayan Dutta
Speakers:
David Lewis, senior analytics consultant, Jisc
Mike Hughes, IT director, City University, London
An opportunity to find out about how an institution has been implementing learning analytics to support the student journey with and opportunity to discuss issues and possibilities that the use of learning analytics may create.
Melissa Cline-Douthitt has extensive experience in instructional design and developing online content for higher education institutions. She has created numerous online training modules, workshops, and orientations during her career as an instructional designer and student development coordinator. Her skills include instructional design, online content development, presentation skills, and experience advising and counseling students. She currently works as a freelance instructional designer developing online trainings and modules for faculty, staff, and students.
The document summarizes recent and upcoming work from Jisc Data Analytics to support higher education providers. Recent work includes dashboards on topics like international student impacts and postgraduate recruitment. Upcoming products include dashboards tracking Welsh HE performance and graduate outcomes. The document also summarizes findings from Jisc's 2020 student digital experience survey, including requests for more online content, technology support, and consistency in teaching methods during the pandemic. Finally, it previews Jisc's work to help universities address challenges from the pandemic like building digital skills and embracing blended learning models.
Similar to EDLD808 Program Evaluation Final Project - Online Education (20)
Towards An Understanding of Online Collaborative Learning Theory Paul Gruhn
This presentation was a requirement for my EDLD813 Theory Class, working towards an Ed.D. in Educational Leadership. As we explore various learning theories, we were required to construct our own learning theory, as it will apply to our future research. This is a work in progress.
FIVE LENSES INTO THE WORK OF LEV SEMENOVICH VYGOTSKYPaul Gruhn
FIVE LENSES INTO THE WORK OF LEV SEMENOVICH VYGOTSKY A group project presented by Stefan Carretero, Casey Cummings, Kim Csapo-Ebert, Paul Gruhn, Jonathan Lake
March 2017 - EDLD 813 – Theory in Education Research
Integration Community Of Practice - 100th monkey - Paul Gruhn - Percha Kucha Paul Gruhn
This is a Percha Kucha presentation I did for Yale University, ITS. The message was to stimulate interest in a newly forming iCoP - Integration Community of Practice. The premise; Change may take time, will you be the 100th monkey?
EDLD804 Constitutional Law Chapter 1 PresentationPaul Gruhn
As a part of the EDLD808 Law in Education course at University of Bridgeport, each student was required to present on a chapter in Alexander, Kern, & Alexander (2011) Educational Law textbook. I did chapter one. This is the presentation,
EDLD813 Paul Gruhn - My Research AutobiographyPaul Gruhn
This document provides an autobiographical sketch of Paul Gruhn, a researcher seeking a theoretical stance to inform his research design. It outlines his background and experiences that have shaped him into a pragmatist. It describes his worldview as a pragmatist that sees multiple realities and values both quantitative and qualitative data. It proposes a mixed methods sequential explanatory design using surveys, focus groups, interviews, and narratives. The goal is to improve professional development for online educators by understanding perspectives from both insiders and outsiders.
Paul Gruhn Faculty-Research-Day Student-Poster Program EvalutionPaul Gruhn
On March 24, 2017, I submit this poster at the University of Bridgeport, Faculty Research Day poster presentations. This is a summary of a program evaluation project I completed in Dr. Linda Paslov's EDLD 808 Program Evaluation Course.
EDLD808 Program Evaluation Final Project Final Paper - Online EducationPaul Gruhn
This the complete research for program evaluation project I performed on the CSC230 Database for Web Applications course, which I teach online, to Community College Students.
How to Add Chatter in the odoo 17 ERP ModuleCeline George
In Odoo, the chatter is like a chat tool that helps you work together on records. You can leave notes and track things, making it easier to talk with your team and partners. Inside chatter, all communication history, activity, and changes will be displayed.
A review of the growth of the Israel Genealogy Research Association Database Collection for the last 12 months. Our collection is now passed the 3 million mark and still growing. See which archives have contributed the most. See the different types of records we have, and which years have had records added. You can also see what we have for the future.
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...PECB
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: ISO/IEC 27001 Information Security Management System - EN | PECB
ISO/IEC 42001 Artificial Intelligence Management System - EN | PECB
General Data Protection Regulation (GDPR) - Training Courses - EN | PECB
Webinars: https://pecb.com/webinars
Article: https://pecb.com/article
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
How to Build a Module in Odoo 17 Using the Scaffold MethodCeline George
Odoo provides an option for creating a module by using a single line command. By using this command the user can make a whole structure of a module. It is very easy for a beginner to make a module. There is no need to make each file manually. This slide will show how to create a module using the scaffold method.
it describes the bony anatomy including the femoral head , acetabulum, labrum . also discusses the capsule , ligaments . muscle that act on the hip joint and the range of motion are outlined. factors affecting hip joint stability and weight transmission through the joint are summarized.
How to Setup Warehouse & Location in Odoo 17 InventoryCeline George
In this slide, we'll explore how to set up warehouses and locations in Odoo 17 Inventory. This will help us manage our stock effectively, track inventory levels, and streamline warehouse operations.
Pride Month Slides 2024 David Douglas School District
EDLD808 Program Evaluation Final Project - Online Education
1. Program Evaluation Final Project
CSC230 Database for Web Applications
EDLD 808 Program Evaluation and Human Resources
Final Project
Paul Gruhn, Ed.D. Student – University of Bridgeport, Bridgeport, CT
2. Acknowledgements
To all the people who helped me ‘do my homework’ – Thank you.
Dr. Linda Paslov, for the class learning experience and final project
assignment with the flexibility to apply this to my context and
academic/professional goals.
My EDLD 808 fellow students, it’s been a fun class.
Prof. Richard Gnall, for allowing me to use my actual CSC230 class for this
project.
Tim Boto, for doing an honest professional level evaluation of the CSC230
course design.
Mostly, to all the past CSC230 students who participated in providing honest
feedback and continue to challenge me to be the best teacher I can be.
3. Disclaimer
Paul Gruhn, the author of this document, is a doctoral student in the Educational
Leadership doctoral program at the University of Bridgeport in Bridgeport,
Connecticut.
This paper is in partial fulfillment of the EDLD808 Program Evaluation course
he took in the spring 2016 semester. Paul is also an Adjunct Professor at
Manchester Community College (MCC), Manchester, Connecticut.
The program being evaluated by Paul is a course he has taught for the past
three years in his role as adjunct professor at MCC. While the data,
research, and evaluation are as accurate as possible, this paper is to be
viewed as student work and, due to the limited timeframe, should be
considered incomplete and still a work in progress.
Neither the University of Bridgeport nor Manchester Community College
endorses this document.
4. Definitions
Blackboard – Refers to an online learning management system (LMS) provided by
www.blackboard.com, this is the LMS, which Manchester Community College currently uses.
Credibility – Judgments of quality to determine the level of truth (Patton, 2002).
Internal Validity – To show that the results achieved were due to the implemented program, and that
other factors did not impact the final outcomes (Posavac, 2015).
Mixed-Methods Research – Research that includes both statistical quantitative and thematic
qualitative data, as well as incorporating both quantitative and qualitative methodologies in
gathering and analyzing the data (Teddlie & Tashakkori, 2009).
Outcomes – The realized results/outputs achieved because of the implementation of the program
being evaluated (Posavac, 2015).
Program Evaluation – “Is a collection of methods, skills, and sensitivities necessary to determine
whether a human service is needed and likely to be used, whether the service is sufficiently
intensive to meet the unmet needs identified, and whether the service is offered as planned, and
whether the service actually does help people in need at a reasonable cost without unacceptable
side effects” (Posavac, 2015).
Qualitative data – The use of words, narrative, and story to measure results in research and
evaluation.
Quantitative data – The use of numbers to measure results in research and evaluation.
Structured Query Language (SQL) – The ANSI (American National Standards Institute) standard
computer language for communicating with a database; it is the standard for most relational
database management systems.
Triangulation – The use of multiple sources, and methodologies to create coherent justification of
the presented research themes and results (Creswell, 2014).
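Since the SQL definition above is central to the CSC230 course, a minimal runnable illustration may help. This is only a sketch using Python’s built-in sqlite3 module; the table, columns, and data are hypothetical and are not taken from the course materials.

```python
import sqlite3

# Hypothetical example: the table name and data are illustrative only.
conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE students (name TEXT, grade TEXT)")
conn.executemany(
    "INSERT INTO students (name, grade) VALUES (?, ?)",
    [("Alice", "A"), ("Bob", "B"), ("Carol", "A")],
)

# Standard SQL: count students per grade
rows = conn.execute(
    "SELECT grade, COUNT(*) FROM students GROUP BY grade ORDER BY grade"
).fetchall()
print(rows)  # [('A', 2), ('B', 1)]
conn.close()
```

The same SELECT/GROUP BY syntax works in MySQL, the database system used in the course, which is what makes SQL "the standard" across relational database management systems.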
6. Program Description: Academic Setting
Manchester Community College, located in Manchester, Connecticut, was founded in 1963
and serves over 15,000 students a year. MCC offers associate degrees in over 40
disciplines, in both arts and sciences, as well as various certificate programs.
The program evaluated in this document, the “CSC230 Database for Web Applications”
(CSC230) course, is one of the required three-credit courses in the web certificate
program offered by MCC.
More information about MCC can be found on their website,
http://catalog.mcc.commnet.edu/index.php?catoid=6 .
7. Program Description: Target Population
The target population for both the web certificate and the CSC230 course is people seeking
web development skills, either to advance in their current job or to enter a new career in
the web technology field.
“Positions for web developers are expected to increase by about 38 percent through 2016
because of the increasing necessity for the internet and internet applications … Web
designers typically have a background in computer languages in addition to an
understanding of computers, computer programming and computer development … “
8. Program Description: Context
The online course CSC230 – Database Concepts with Web Applications is a component of
the larger academic offerings provided by Manchester Community College (MCC), listed in
the figure below.
CSC230 is a required course for the following program offerings at MCC:
• Prerequisite for CST258 Internet Programming course. (Appendix C)
• Web Certificate Approval Form (Appendix B), Requirements Form (Appendix D)
• Computer Programming Technology A.S. Degree (See Appendix E)
• Computer Programming Technology Certificate (See Appendix F)
9. Evaluation Questions: Student Outcomes
1. Did CSC230 adequately prepare students to take the CST 258 Internet Programming
course?
2. Did students earn the MCC offered Web Certificate?
3. Did students earn the MCC offered Computer Programming Technology A.S. degree?
4. Did students earn the MCC offered Computer Programming Technology certificate?
5. Did students’ skills grow in using SQL-related database software?
6. Did students advance in their current job settings?
7. Did students acquire a new job because of the CSC230 course?
10. Evaluation Questions: Course Evaluation
1. Is the online mode of education effective for the CSC230 course?
2. Is the current course design effective in presenting the information that facilitates
student learning and success?
3. What are the strengths of the current CSC230 online course?
4. What are the weaknesses of the current CSC230 online course?
5. What are the attitudes and responses to this 3-credit, 15-week course being
delivered in an accelerated 8-week time frame?
11. Evaluation Design
A Mixed-Methods Methodology

Evaluation Design Methodology Components:
· Program Records
· External Expert Evaluation
· Quantitative: Survey to Past Students
· Qualitative: A Case Study, Semi-Structured Interview with a Past Student
· Self-Report: Professor Narrative

Multi-Group Evaluation – Multiple Data Sources: Three different participant groups.
· Students from the Spring 2014 semester
· Students from the Spring 2015 semester
· Students from the Spring 2016 semester

Evaluation Design Data Types Components – Multiple Data Types:
· Quantitative
· Qualitative: Narrative, Self-report
12. Evaluation Design
Overall Approach
To ensure evaluation and overall research credibility, as well as internal validity, a
mixed-methods approach was seen as the best methodological choice for this evaluation. As shown
in the figure above, triangulation was achieved by implementing multiple methodologies,
multiple data types, and multiple program participant groups.
13. Evaluation Design
Program Records: The first leg of this evaluation was the use of various program records consisting of student
evaluations, student grades, emails from the students, and academic graduation records.
External Expert Evaluation: The second leg of this evaluation was the use of an external subject matter expert.
Tim Boto (name used with permission) is the Assistant Director of Educational Technology & Distance Learning at
MCC.
Quantitative: Survey to Past Students: The third leg of this evaluation was a quantitative survey sent to 77 past
CSC230 students, of which 12 percent responded. This survey was also used to solicit potential focus group
members. Due to limited time and response, a focus group session was not held.
Qualitative: A Case Study, Semi-Structured Interview with Past Student: In lieu of the planned focus group session,
a past student provided an insightful, qualitative semi-structured interview, yielding a case study narrative of her
experiences and journey in both CSC230 and the web certificate program.
Qualitative: Student responses to a quick email request: A question was asked in an email sent by the professor
to students who had already taken CSC230, and were taking CST258 at the time. “Are you graduating, and what are
your plans?”
Self-Report: Professor Narrative: As already stated, I am both the evaluator and the professor of the CSC230
course. Self-report narratives can pose a threat to internal validity in any study. At the same time, however, a
self-report can provide rich, detailed information, since the professor is closest to the course details,
providing a history across all three semesters being evaluated.
14. Evaluation Findings: Student Grades
Measure                    Spring 2014   Spring 2015   Spring 2016
Enrolled                   24            25            28
Withdrawals                3 (13%)       5 (20%)       4 (14%)
A- or better               18 (75%)      17 (68%)      14 (50%)
B- to B+                   2 (8%)        1 (4%)        7 (25%)
C- to C+                   1 (4%)        0             1 (4%)
D- to D+                   0             0             0
F                          0             2 (8%)        2 (8%)
Took CST258                TBD           TBD           TBD
Earned Web Certificate     TBD           TBD           TBD
Earned Associates Degree   TBD           TBD           TBD
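As a sanity check, the percentages in the grade table can be recomputed from the raw counts. The small Python sketch below uses the enrollment, withdrawal, and A-or-better counts from the table; note that the reported 13% withdrawal rate for Spring 2014 corresponds to 3 of 24, i.e. 12.5%, rounded up.

```python
# Recompute the percentages reported in the grade table above
# from the raw counts (values taken from the table).
enrolled = {"Spring 2014": 24, "Spring 2015": 25, "Spring 2016": 28}
withdrawals = {"Spring 2014": 3, "Spring 2015": 5, "Spring 2016": 4}
a_or_better = {"Spring 2014": 18, "Spring 2015": 17, "Spring 2016": 14}

def rates(counts, totals):
    """Percentage of enrolled students, to one decimal place."""
    return {t: round(100 * counts[t] / totals[t], 1) for t in totals}

print(rates(withdrawals, enrolled))  # {'Spring 2014': 12.5, 'Spring 2015': 20.0, 'Spring 2016': 14.3}
print(rates(a_or_better, enrolled))  # {'Spring 2014': 75.0, 'Spring 2015': 68.0, 'Spring 2016': 50.0}
```

The A-or-better rates (75%, 68%, 50%) match the table exactly, which supports reading the table's percentages as percentages of enrolled students rather than of course completers.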
16. Evaluation Findings: What did you like?
Students like the learning materials
• The textbook, that the entire book was covered and was easy to follow
• Layout of Murach’s book: one page of reading, then on the opposite page a shortened outline version.
• The online videos; I loved the tutorial videos that the Professor made; I’m a visual learner so the videos helped me immensely.
• The various online tools & supporting technologies
• The option to Skype with the professor
• Course requirements were stated clearly in the syllabus.
Students like the Professor
• Professor was timely in providing feedback and returning assignments, was very helpful
• Teaching Style
• Professor Gruhn was a great professor. He was constantly updating us, making videos, and just making sure he stayed connected with his students.
• Professor’s availability & flexibility (really, thank you)
Students like the course content
• Working with SQL commands/skills, learning about databases and the back-end
• Liked that it was difficult (liked to an extent).
• The information learned and how thorough the course was.
• The Content (it’s seriously cool to have learned another facet of the Structured Query Language)
Students like the course structure & design
• The technology used in the course supported the goals of the course.
• Instructional technology allowed me to achieve my goals.
• The interaction
• I loved getting the chance to learn by trial and error. The hands on approach really helped and I could work at my own pace.
• The final project was structured but left open ended, which gave me some creativity on what type of database to create
• It was straight and to the point.
• Structure of the course
17. Evaluation Findings: What did you dislike?
Students dislike the learning materials
• Some book confusion
• I disliked the database administration portion of Murach’s book. Unlike the previous chapters, where everything was written clearly, the last couple of chapters were very confusing.
• Uploaded files not being correct
• The textbook was sometimes not very helpful.
• Sometimes the exercises were not clear enough.
• The videos were unhelpful; they were very hard to view and read.
Students dislike the course structure & design
• How to present the work. Screenshots are very tedious
• The lack of ability to interact
• The discussion boards are hard to interact on without direction; this is not specific to only this class.
• Taking screenshots and putting them in word documents (screenshots were fine, but it was a pain to copy/paste them).
Students dislike the limited amount of time
• The last week of class was crammed and was not a full week.
• Too much to do at the end. Was expecting the last week to be a full week.
• Class is very fast paced because lot of learning materials got covered in small amount of time.
• I disliked the 8-week accelerated time frame for this class. Even though the 1st semester HTML/CSS course was extremely hard, I was able to get through with a lot of hard work and not feel rushed. I thought this semester, even though the work was extremely hard, I felt super rushed and wished we had more time to take certain chapters a bit more slowly.
18. Evaluation Findings: What would you improve?
Learning materials suggestions
• Resolution on videos, it was hard to read the code being written sometimes
• The videos were hard to read. Perhaps a different resolution?
• To be honest, choose a different textbook (if it’s in your power, that is).
Course structure & design suggestions
• The syllabus encouraged discussions on the class discussion board, but I find it hard to interact without a reason. Finding more ways to encourage the discussions would benefit the class participation aspect.
• More interaction within groups of students and professor.
• At the beginning of the course, make sure the homework sheets align with the book.
Course length suggestions
• My only suggestion would be to lengthen from 8 weeks to maybe a 10-week course. That way we can explore this book at a slower pace, because there’s just way too much info to be absorbed. And for those of us with no or very little database experience, the extra 2 weeks can make a world of difference.
• The class could be extended for a few more weeks to enhance the learning experience.
No suggestions
• I honestly don’t have anything that I truly disliked about this course. Unlike my previous course in Database Design, this went more in depth. So where in a previous course I had learned about giving, say, aliases to rows, I was used to doing it using MS SQL, not MySQL. But seriously, I didn’t dislike anything.
• This is one of the best books I have had for a class, and the videos are awesome. Great job!
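The aliasing one student mentions (MS SQL vs. MySQL) uses the same `AS` keyword in standard SQL; a minimal, illustrative sketch using Python’s built-in sqlite3 module (the table and data here are hypothetical, not from the course):

```python
# Illustrative only: column and table aliasing with AS, as covered in
# the course readings. Uses Python's built-in sqlite3; MySQL syntax is
# identical for this query. Table name and rows are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (first_name TEXT, grade INTEGER)")
conn.execute("INSERT INTO students VALUES ('Diane', 95), ('Tim', 88)")

# AS gives the column and the table friendlier names in the result set;
# the alias can then be reused, e.g. in ORDER BY.
rows = conn.execute(
    "SELECT s.first_name AS name, s.grade AS final_grade "
    "FROM students AS s ORDER BY final_grade DESC"
).fetchall()
print(rows)  # -> [('Diane', 95), ('Tim', 88)]
```

The main dialect difference the student would have hit is elsewhere (e.g. MS SQL’s `[bracketed]` identifiers vs. MySQL’s backticks), not in `AS` itself.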
19. Evaluation Findings: In one sentence
19
• This class was pretty difficult but very rewarding.
• If you apply yourself in this course you will truly gain A LOT of knowledge about databases
• Tough but fair
• I would describe this class as interesting and quite challenging.
• I’ve taken a few web courses, and this one was a little challenging.
• This is a class where reading and keeping up with the textbook and lectures really makes the most difference.
• This class was fun (most of the time) and very difficult, but enlightening and important.
• Very hard working and fast paced.
• This class is for anyone who is looking to dive into SQL and enhance their own websites, or just to learn how to work with databases.
• This class was an eye opener to another facet of what my job will be like in the future (going to be a DBA, hopefully).
• I learned a lot; obviously I will have to practice way more to really get it to stick.
• This class was insanely hard!!!
20. Evaluation Findings: In one word
20
• Time
• Interesting and Quite challenging
• Difficult
• Busy
• Essential (for me to have taken. I really learned a lot!)
• Thought-provoking
• Unexpected
• Interesting
• Challenging
• Spectacular
21. Evaluation Findings: Outside Expert
21
Tim’s Comments
“I am not seeing anything that is wrong, it is good overall.”
“Overall this course looks nice.”
(Appendix K1) “You use the ‘Getting Started’ section as your own version, not just the canned version presented by the school.”
(Appendix K2) “You use announcements a lot” [this is good]
“You use embedded video.”
“You use embedded links.”
(Appendix K3) “Syllabus looks good, and provides the actual links to needed files for the course, and is high up on the syllabus.”
(Appendix K4) Course Content
“Has a roadmap in the front”
“You have videos in each module”
Laid out well, and consistent.
(Appendix K5) My Grades Section, he liked how they were organized.
22. Evaluation Findings: Outside Expert
22
Tim’s Suggestions
• Switch from Content Folders to Learning Modules
• Consider incorporating additional tools provided by Blackboard
• Tests & Quizzes
• Rubrics
• Might fit well with Final Project
• Not used a lot by my peers
• Journals
• If this were a full-semester course, and not an 8-week accelerated course, he would suggest using the ‘Availability Settings’ for modules.
When asked, “On a scale of 1 to 5 stars, what would you give this course?”
Tim replied, “4 ½ stars, based on my suggestions listed above.”
23. QUAN Survey
Q3 - Did you earn your "Web Certificate" from MCC?
23
Answer % Count
Yes 31% 5
No 0% 0
Still in process 31% 5
Not a goal 38% 6
Total 100% 16
24. QUAN Survey
Q4 - Did you earn your Associates degree from MCC after taking this course?
24
Answer % Count
Yes 19% 3
No 0% 0
Still in process 44% 7
Not a goal 38% 6
Total 100% 16
25. QUAN Survey
Q8 - Did you attend another college or university after taking the CSC230 course?
25
Answer % Count
No 94% 15
Yes - What school and program? 6% 1
Total 100% 16
- Central Connecticut State University
- 1 student is also planning on going to UConn
26. QUAN Survey
Q14 - Did your experience and training in CSC230 help you to get a job in a related career?
26
I am currently a database programmer
Working with MySQL helped me by getting familiar with working with databases; I had to use a Sybase database at one point
I finished the Certificate Program in Summer 2014; I got a job as a web assistant in October 2014
I use database/web programming concepts weekly for programming custom Visual Basic applications and data mining at work
Answer % Count
No 75% 12
Yes - Describe 25% 4
Total 100% 16
27. QUAN Survey
Q10 - Were you able to use your CSC230 Database training for any personal or other database
projects?
27
Planning on using this to help with a personal business project
Working on a database at work, but I'm not involved in programming it. At least I understand how the parts all come
together.
Frequently use SQL and databases in some of my programs.
Created a trouble ticket application that allows me to keep track of clients and computers I do computer repairs for
I am a Geek Squad manager and we work with clients who have databases
Answer % Count
Yes - Describe 38% 6
No 63% 10
Total 100% 16
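The percentage columns in these QUAN tables can be reproduced from the raw counts. A minimal sketch, assuming the slides round half-up (which matches 10/16 → 63% in the Q10 table, where Python’s default banker’s rounding would give 62%):

```python
# Recompute a survey percentage column from raw counts.
# Assumption: the slides round half-up, so we use floor(x + 0.5)
# instead of Python's round(), which rounds halves to even.
import math

def pct(count: int, total: int) -> int:
    """Percentage of total, rounded half-up to a whole number."""
    return math.floor(100 * count / total + 0.5)

# Q10 answers from the slide: counts out of 16 respondents.
q10 = {"Yes - Describe": 6, "No": 10}
total = sum(q10.values())
for answer, count in q10.items():
    print(f"{answer}: {pct(count, total)}% ({count})")
```

One caveat worth noting on the slide itself: independently rounded percentages need not sum to 100 (here 38% + 63% = 101%), even though the Total row reads 100%.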
28. QUAL Survey
28
I took this course as part of my Associate's Degree program and hope to use that degree and the information learned
from this course and others to break into the software development field.
I have a life-long love of learning so there was a positive impact from taking this course, or any other course for that
matter.
Having just finished the course, I am not sure about the long term impacts. I do see the potential for this course to
assist me in my future goals.
I enjoyed the classes I took with you, Professor Gruhn. I earned my degree in Programming, but haven't found a job
programming.
I ended up going in a completely different direction in regard to career, but I'm glad I have these skills should a new
opportunity arise that I wish to pursue.
Gained a better understanding of database creation and organization.
I learned new concepts
Great professor and course learned everything I needed to know about SQL!
Part of my current job deals with an SQL database. I'm sure that taking this course and getting the MCC Certificate
helped me get the job
This course was excellent in teaching programming concepts with databases. It allows me to easily pick up other
programming languages and start writing them faster.
The class was designed perfectly to help me understand the basics in creating databases and I found that applying that
knowledge to my business has helped with assisting clients more effectively.
29. QUAL Survey
29
It was a very good class, well above average for online for-credit courses I've taken thus far.
It was a great educational experience and I look forward to the next session.
This was a very practical course. I have a strong understanding of databases.
I found out recently that I qualify for the Web Certificate, so I plan on applying for that even though I received my
degree in 2014.
Enjoyed the course, was challenged, and learned a lot while Prof Gruhn made the course enjoyable.
Thanks for the great experience!
I thought it was a great course and the Professor and Class made it a good experience. I learned a lot of things that
help me in my current job.
I think this was an excellent course. I was particularly fond of all the web development courses at MCC, and the
professor who taught this course made the learning process engaging, not stressful, and made it easy to work with the
material. The only thing I wish is that the web certification program offered an associate’s degree rather than a
certificate. Seeing that more and more applications/services are migrating to the web, I feel as though this program
could be expanded into a full-blown degree program.
I have had great experiences not only with Python and MySQL, but overall the experience I had with Professor Gruhn
was very fruitful, and his way of teaching actually got me to like programming!
30. QUAL Survey: Meet Diane
30
My goal is to get the web development & design certification here at MCC, and
this course is a part of the curriculum. So, I went into this course not knowing
anything at all, with zero knowledge of MySQL. So, in the fall I took Web Dev 1 and
Web Dev 2, and I rolled into your course.
I started taking general education courses in 2012 towards a degree, any degree.
It didn’t really matter to me; I was doing one course a semester. Then I had a
need in my current position I work at UConn. In my department there is a job
opportunity within the next probably year, for a publicity and marketing
manager. I am currently an executive assistant, and our publicity manager has
been mentoring me with the hopes I roll into it. So, when I heard about the
certificate at MCC I was really excited because it is helpful in my current
position. So, I kind of side tracked from just Gen Ed classes to an actual
certification. So, when I am done with this in May I will go back to my courses
toward a general studies associate’s degree.
31. QUAL Survey: Email – Will you graduate?
31
Response: Student A
“I am going for a web tech certificate. I may however take a little break, not sure though and probably
not but a break feels nice to think about, as I only have the project management after this to do.”
Response: Student B
“I'll be graduating this semester with an associate’s degree in Computer Programming Technology (god
willing).”
Response: Student C
“Thanks for your email. I have finished all my course work for my Programming Certificate and I am
only one course (Project Management) away from the Web Certificate. I have to talk to advising to
make sure I can graduate in August with both certificates. If I can I will take Project Management this
summer.
I am undecided about what I am planning in the future. I want to continue teaching for now, but I may
eventually want to pursue a career in web development or programming. If you have any advice on
next steps for either self-study or job paths, I would appreciate it. I really love programming and have
learned so much at MCC.”
33. QUAL Survey: Conclusion
33
Items to implement before next CSC 230 Course Offering, Spring 2017
• I would like to have one of my Quinnipiac University online instructional designers,
who are all trained in the “Quality Matters” online course design standards, review both
my CSC230 & CST258 courses. I wanted to include this in this evaluation, but due
to time limitations I could not.
• Update all the CSC230 videos
• Use current textbook
• Use improved software to improve video quality to High Definition
• Add tags to the videos, so students can easily jump to certain sections
• Review and update all homework assignment sheets to ensure they are current and
accurate.
• Change Blackboard structure from ‘folder based’ to ‘module based.’
• Devise a better methodology to communicate to the students the amount of effort
and time they will need to complete the course.
• Develop assignment rubrics to improve the quality and consistency of grading.
34. QUAL Survey: Self-Report!
34
Open Questions for Continued Discussion
• Some students do not like the accelerated eight-week format, yet completing
the Web Certificate program in one year is important to students. One
student suggested a 10 week option, which could be considered.
• We need to adhere better to this prerequisite requirement. Students from both
MCC and Asnuntuck Community College (ACC) who have entered CST258
Internet Programming without CSC230 have struggled with the CST258
requirements. The problem especially arose with ACC students coming to MCC
to meet their CST258 requirement.
• How can we assure continuity of skill sets across all courses in the web
certificate program, and improve communication among the various faculty
and program leaders? As a teacher in this program, I feel alone.
• Should rolling start dates for the certificate program be considered? This
would require offering the courses two times a year instead of one.
• Can we standardize on online course design for all the courses in the web
certificate program?
• Students respected the Murach book. We could have a discussion on
standardizing textbooks.
35. QUAL Survey: Self-Report!
35
Future Research Plans
• Work with the MCC reporting team to compare course/certificate attendees to actual
graduation rates. Did the actuals meet the goals set out in January 2009?
(Appendix B)
• Due to time constraints, the two semi-structured interviews were not transcribed.
This will be done in the future, and then coded for themes. The findings will then
be added to a future revision of this evaluation.
• Expand the evaluation to include the entire MCC Web Certificate program, not just
this one course. Did the Web Certificate meet its goals as set forth in 2009? What
are the Web Certificate program goals for the next 1, 3, and 5 years? Is the current
program in alignment with the web technologies in use today?
• Include more of the MCC Web Certificate stakeholders in future research and
evaluations.
• Can/should the web technologies play a stronger role in the MCC A.S. terminal
degree?
• Can I use the MCC Web Certificate program for my thesis? What question(s) need
answers?
36. One letter says it all.
Hello Professor Gruhn,
Don’t know if you remember me. I took both CSC 230 and 258 in the spring of 2014. After that I took Project Management
and completed the MCC Certificate Program in the summer of 2014. I was able to get a job in October 2014 as a web
assistant. It was the first paying job I had since I stopped working in 1998 to be a stay at home mom. I volunteered in
non-profit activities (creating and managing websites) when not working. Now I work for a NYC based art company in a
small satellite office that is about 15 minutes from my home. I know that completing the MCC Certificate Program helped
me get the job.
I completed the survey and would be interested in the focus group if I were available. Do you have an idea of which
Saturday it would be?
I have been at my current job for more than a year and like what I am doing. I manage an ecommerce website, work with
a database (Counterpoint SQL), manage a blog in WordPress, prepare and send out email blasts with mail chimp and do a
host of other things. I got a great evaluation from my boss after just 3 months. However, the company is not great about
compensation so I may be looking at other options where there is more potential for growth. If you hear of any
opportunities or have suggestions, I would welcome them.
Good luck with your Doctorate endeavors. I really appreciated the CSC230 course and enjoyed the challenges it gave
me. I have no doubt that working on the certificate program helped me get back in the workforce. It was the first job I
applied for (I know I was really lucky!).
Thanks for all you did to make the course interesting and challenging. Let me know what you decide to do about the
focus group.
Best regards,
36
37. Thank you
37
Paul Gruhn
Final Project
EDLD808 – Program Evaluation
University of Bridgeport, CT
https://www.linkedin.com/in/paulgruhn