Using Research to Create Effective On-Line Professional Development Courses

Speaker Notes
  • Surveys - These are typically paper-and-pencil instruments, but on-line surveys are becoming more popular and are available through FCPS resources. Internal Expert Judgments - For this evaluation we will use the guidelines set out in the NSDC Standards for Staff Development and the five dimensions of on-line professional development identified by Goals 2000 through the US Department of Education. Using a rubric, volunteers from within the department and from the FCPS Professional Development SITE Facilitators will evaluate each course against the guidelines, with 80% identified as acceptable. MSA Scores - The school where each participant is employed will be identified, and the 2003 (if available) and 2004 MSA scores from that school will be examined as the measure of student achievement.
  • Scantron Survey - This survey is given at the conclusion of each course, so the date it is disseminated depends on when the course is offered; courses run in the summer, fall, and spring semesters each year. Data for 2002-2003 and for the summer and fall semesters of 2003-2004 have already been collected; data for spring 2004 will be collected during April and May. A preliminary formative report, based on the findings of the Scantron survey and the extended responses, will be issued in April 2004. E-mail invitations to complete the on-line survey will be sent out in early June 2004. This date was chosen so that the survey would not be confused with the Scantron survey that spring 2004 participants are completing in April and May. Additionally, the survey will be sent to FCPS e-mail addresses, and if it is sent later than mid-June many participants may not respond because they do not check their school e-mail during summer non-working days. Expert judgments will take place in June/July 2004; experts will be recruited during the summer of 2004 because they are typically not as busy during the summer months and would have more time available to volunteer for this evaluation. MSA data from 2003 has already been collected and will be compiled by school in May/June 2004. This timeframe was chosen because the evaluator will need to research each participant's school, and a new searchable employee information database being released in mid-May should save time in collecting this data. MSA data from 2004 will not be released by the state until sometime in June 2004. The final report is scheduled for August 2004. This date was chosen because beginning July 1 we will be hiring a new director for the Office of Instructional Technology; the interviewing process won't begin until June, with the hope of finding a director to begin July 1. Given this timeframe, and allowing some time for the new director to acclimate to the job, an August report seemed appropriate.
  • Survey statements, each rated Agree Strongly / Agree Somewhat / Undecided / Disagree Somewhat / Disagree Strongly: Accomplished its listed objectives; Required active participant involvement; Was practical; Should continue to be offered; Will help me in my job.
  • My overall evaluation is: Outstanding / Very Good / Satisfactory / Marginally Satisfactory / Unsatisfactory
  • The organization of the course was: Poor / Adequate / Good / Excellent
  • The goals of the course were: Vague / Somewhat Evident / Evident / Clearly Evident
  • The ideas and activities presented were: Not Useful / Somewhat Useful / Useful / Very Useful
  • Use of course learning will promote student achievement: Disagree / Somewhat Agree / Agree / Strongly Agree
  • I am interested in further developing my professional knowledge in this area: Disagree / Somewhat Agree / Agree / Strongly Agree
  • Overall, I consider this course: Poor / Adequate / Good / Excellent
  • Positive Comments: "The online course was well organized and practical." "Feedback from instructors was useful and made it feel not like an online course. Support was there if needed." "The content was great. I found many classroom ideas." "I loved the convenience of the on-line course. It made it possible for me to really pursue my interests as I was investigating sites, etc." Negative Comments: "The amount of time required was excessive. I spent as much time on this course as I did on a 3-credit graduate course, far more than the 15 hours that equal 1 MSDE credit." "Directions were not clear enough on some assignments. I wasn't sure if I was to do the answers in both search engines or only one." "When passing out the syllabus, try to explain how to do some things. Maybe splitting the group into those who have and haven't had any classes before would benefit people more."
  • Positive Comment: The websites were interesting. Negative Comment: There were several technology glitches, which made it difficult to complete some assignments.
  • This date was chosen because beginning July 1 we will be hiring a new director for the Office of Instructional Technology; the interviewing process won't begin until June, with the hope of finding a director to begin July 1. Given this timeframe, and allowing some time for the new director to acclimate to the job, an August report seemed appropriate.
  Transcript

    1. Using Research To Create Effective On-Line Professional Development Courses. Karen Kroll, April 28th, 2004, JHU Course #893.601
    2. Background Information. A. What We Have: • Six on-line courses, some of which have been run for five consecutive years, three times a year. • The courses are approved by the Maryland State Department of Education (MSDE), and participants can earn 1 MSDE credit upon successful completion of a course. • The state does not evaluate the courses. • Courses are approved for five consecutive years, after which counties must reapply for state approval if they want to continue offering a course. • FCPS does complete a general evaluation on all MSDE courses; however, the Office of Technology Instruction completes no formal evaluations.
    3. B. The Problem(s)… • It is very difficult to ascertain whether participants are finding the courses relevant to the classroom. • The courses do not appear to be designed with elements that support highly qualified professional development. • The mandates of the No Child Left Behind (NCLB) legislation, Title II, Part A, state that professional development programs will be "regularly evaluated for their impact on increased teacher effectiveness and improved student academic achievement, with the findings of the evaluations used to improve the quality of professional development."
    4. C. Why Evaluate? • Is the program or activity leading to the results that were intended? • Is it better than what was done in the past? • Is it better than another, competing activity? • Is it worth the costs?
    5. D. Elements To Consider When Constructing An On-Line Program. The Goals 2000 research conducted by the US Department of Education recommends ten basic principles to ensure that "the mission of the professional development is to prepare and support educators to help all students achieve to high standards of learning and development." The Program… 1. focuses on teachers as central to student learning, yet includes all other members of the school community. 2. focuses on individual, collegial, and organizational improvement. 3. respects and nurtures the intellectual and leadership capacity of teachers, principals, and others in the school community. 4. reflects best available research and practice in teaching, learning, and leadership.
    6. Elements (cont.) 5. enables teachers to develop further expertise in subject content, teaching strategies, uses of technologies, and other essential elements in teaching to high standards. 6. promotes continuous inquiry and improvement embedded in the daily life of schools. 7. is planned collaboratively by those who will participate in and facilitate that development. 8. requires substantial time and other resources. 9. is driven by a coherent long-term plan. 10. is evaluated ultimately on the basis of its impact on teacher effectiveness and student learning; and this assessment guides subsequent professional development efforts.
    7. E. Five dimensions that contribute to effective on-line learning: 1. Relevant and challenging assignments 2. Coordinated on-line learning environments 3. Adequate and timely feedback: teacher-student interaction 4. Rich environments for student-to-student interaction 5. Fostering flexibility in teaching and learning (Levin et al., 2001)
    8. Purpose of This Evaluation. Are the current FCPS on-line professional development courses effective? And, if the courses are not effective, what elements could be added to make them more effective, or should they be abandoned in favor of new courses that would be?
    9. Hypothesis. The FCPS on-line professional development courses are not effective as they are written and presented. They should be abandoned in favor of new, more rigorous courses that follow appropriate on-line professional development design and meet the criteria for highly qualified professional development.
    10. What Does The Research Tell Us? National Center For Education Statistics • Survey of professional development and training in 2000 – Teachers are most likely to spend less than one day on a single PD subject – The majority of teachers felt PD did not improve their teaching – Collaboration usually occurs during teaming, not during PD activities – Those who participated in mentoring felt it did improve their teaching
    11. What Does The Research Tell Us? University of Strathclyde, Professional Development On-Line. Online PD provides: • Opportunities to review information • Opportunities to discuss key issues and concepts with peers • Opportunities for coaching and feedback • Access to "expert" systems • Opportunities to make thinking processes more visible and explicit • Rich context environments for learning via visual media
    12. What Does The Research Tell Us? What Does A High-Quality On-Line Professional Development Program Look Like? Two studies: Ferguson, Mohamed, Wier and Wilson at the University of Strathclyde (2001) and Dozier, T. (1999) • participants of PD need to engage in a community in which they learn from and explore with other participants • the tools used to develop the Web community must match the requirements and needs of the participants (the world wide web, chat rooms, bulletin boards, e-mail, and search facilities) • PD must be an on-going, sustained experience that is supported by the community and focused on outcomes
    13. What Does The Research Tell Us? Indiana University, Students' Distress With A Web-Based Distance Education Course • Two Foci of Student Distress – Technological problems, including lack of access to technical support – Course content and instructor support, including ill-managed communication, confusion over instructions, feedback that isn't prompt, and ambiguous instructions and web messages (part of this is due to student reluctance to express anxieties/frustration, believed to be caused by the weak social cues of asynchronous text-based communication)
    14. Evaluation Design. The evaluation will consist of a combination of paper/Scantron surveys, an on-line survey delivered via e-mail, internal expert judgments, and state test results. • Surveys are measures of personal opinion, beliefs, or values. • Internal expert judgments provide observations, opinions, or information about the program. Experts usually use a set of standards or guidelines as the basis for their judgments. • Maryland School Assessment (MSA) Reading and Math scores will also be examined. Passage standards are determined each year by the state.
    15. What Questions Will The Evaluation Tools Focus On? 1. Has the professional development program impacted student achievement? 2. How have students been impacted by the professional development program? 3. How have teachers been impacted by the professional development program? 4. What components of the professional development program have been most influential in impacting student achievement? 5. Has the professional development program changed teachers' instructional practices? 6. Is the professional development program being implemented accurately?
    16. Evaluation Instruments
        Question | Information To Collect | Formative or Summative? | The data will be collected by…
        Has the professional development program impacted student achievement? | Changes in student academic achievement | Summative | MSA scores of schools where course participants teach
        How have students been impacted by the professional development program? | Perceived changes in student behavior, attitude | Formative | On-line survey of participants
        How have teachers been impacted by the professional development program? | Perceived changes in teacher achievement, behavior, attitude | Formative and Summative | Scantron survey taken at the completion of the course, and extended responses
        What components of the professional development program have been most influential in impacting student achievement? | Weight of components of the professional development course | Formative | Expert judgments
        Has the professional development program changed teachers' instructional practices? | Change in teacher instructional practice | Formative and Summative | Scantron survey taken at the completion of the course, extended responses, and on-line survey of participants
        Is the professional development program being implemented accurately? | Comparison between what was planned and implemented and best practices | Formative | Expert judgments
    17. Sample. The study will be limited to those teachers who have participated in the five FCPS Technology Services on-line professional development courses. These courses are: • Integrating the Web in the K-8 classroom • Navigating the Net 1: Setting Sail • Navigating the Net 2: Information Anchors • Navigating the Net 3: Uncharted Waters • Navigating the Net 4: Chart Your Own Course
    18. Sample (cont.) • The sample will include only participants in courses offered during the 2002-2003 and 2003-2004 school years, across all three semesters (summer, fall, spring). • Enrollment was generally limited to a maximum of ten students per course; however, some exceptions did occur. Approximately 150 teachers participated in at least one course. Some participants enrolled in more than one course, and not all courses were full or offered each semester due to low enrollment.
    19. Time Frame
        Prior to April 2004: A | April 2004: B | May 2004: E | June 2004: C, D, E | July 2004: D | August 2004: F
        A. Scantron Survey and Extended Responses | B. Formative Report | C. On-line Survey | D. Expert Judgments | E. MSA Data | F. Report on Evaluation Results
    20. Analytic Strategies • A.1. Scantron Survey - The number and percentage of participants that answer Good or Excellent to the six questions. • A.2. Extended Responses - Teacher attitudes and behaviors will be identified using the following descriptive word rubric: 2 - great, excellent, I really liked, valuable; 1 - didn't like, poor, inadequate, difficult. • A listing of suggestions for improvement will be compiled and compared to suggestions/recommendations from the expert judgments.
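
    As a rough illustration of strategy A, here is a minimal Python sketch of how the Scantron tallies and the descriptive-word coding could be computed. The response data, question keys, and rating labels are hypothetical, since the deck does not describe the Scantron export format:

        from collections import Counter

        # Hypothetical response data: one list of ratings per survey question.
        # The real labels and layout would come from the Scantron export.
        responses = {
            "Q1": ["Good", "Excellent", "Adequate", "Excellent", "Good", "Poor"],
            "Q2": ["Excellent", "Good", "Good", "Adequate", "Good", "Good"],
        }

        def pct_good_or_excellent(ratings):
            # Share of respondents choosing Good or Excellent, as a percentage.
            hits = sum(1 for r in ratings if r in ("Good", "Excellent"))
            return 100.0 * hits / len(ratings)

        for question, ratings in sorted(responses.items()):
            print(f"{question}: {pct_good_or_excellent(ratings):.1f}% Good/Excellent")

        # Descriptive-word rubric from this slide: code each extended response
        # 2 if it contains a positive keyword, 1 if it contains a negative one.
        POSITIVE = ("great", "excellent", "i really liked", "valuable")
        NEGATIVE = ("didn't like", "poor", "inadequate", "difficult")

        def code_comment(comment):
            text = comment.lower()
            if any(word in text for word in POSITIVE):
                return 2
            if any(word in text for word in NEGATIVE):
                return 1
            return None  # no rubric keyword; flag for manual review

        comments = ["The content was great.", "Directions were poor and unclear."]
        print(Counter(code_comment(c) for c in comments))  # Counter({2: 1, 1: 1})

    Note that this sketch codes a comment containing keywords from both lists as positive; a real coding pass would likely resolve such mixed comments by hand.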
    21. Analytic Strategies (cont.) • On-line Survey - Database Rubric: 3 - Exceeds expectations; 2 - Meets expectations; 1 - Does not meet expectations. • Expert Judgments - Scoring rubric developed by the experts following the guidelines set up by the NSDC Standards for Staff Development and the five dimensions of on-line professional development set by Goals 2000 through the US Department of Education. • MSA Data - State acceptable rate of passage.
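
    To show how the expert-judgment scores could be rolled up against the 80% acceptability threshold described in the speaker notes, a minimal sketch follows. The checklist items are hypothetical placeholders; the actual rubric will be developed by the experts from the NSDC and Goals 2000 criteria:

        # Placeholder checklist standing in for the experts' eventual rubric.
        GUIDELINES = [
            "Relevant and challenging assignments",
            "Coordinated on-line learning environment",
            "Adequate and timely teacher-student feedback",
            "Rich student-to-student interaction",
            "Flexibility in teaching and learning",
        ]

        ACCEPTABLE = 0.80  # a course meeting 80% of the guidelines is acceptable

        def course_meets_standard(ratings):
            # ratings maps each guideline to True/False as judged by the panel.
            met = sum(1 for g in GUIDELINES if ratings.get(g, False))
            share = met / len(GUIDELINES)
            return share >= ACCEPTABLE, share

        # Example: a course judged to meet 4 of the 5 items (80%) is acceptable.
        example = {g: True for g in GUIDELINES[:4]}
        ok, share = course_meets_standard(example)
        print(f"met {share:.0%} of guidelines -> {'acceptable' if ok else 'not acceptable'}")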
    22. Preliminary Results: Scantron Survey, Year 1, Questions 1-5 [bar chart; y-axis: Number of Respondents; responses: Agree Strongly, Agree Somewhat, Undecided, Disagree Somewhat, Disagree Strongly; one series per question]
    23. Preliminary Results: Scantron Survey, Year 1, Question 6 [bar chart; y-axis: Number of Respondents; responses: Outstanding, Very Good, Satisfactory, Marginally Satisfactory, Unsatisfactory]
    24. Preliminary Results: Scantron Survey, Year 2, Question 1 [bar chart; responses: Poor, Adequate, Good, Excellent]
    25. Preliminary Results: Scantron Survey, Year 2, Question 2 [bar chart; responses: Vague, Somewhat Evident, Evident, Clearly Evident]
    26. Preliminary Results: Scantron Survey, Year 2, Question 3 [bar chart; responses: Not Useful, Somewhat Useful, Useful, Very Useful]
    27. Preliminary Results: Scantron Survey, Year 2, Question 4 [bar chart; responses: Disagree, Somewhat Agree, Agree, Strongly Agree]
    28. Preliminary Results: Scantron Survey, Year 2, Question 5 [bar chart; responses: Disagree, Somewhat Agree, Agree, Strongly Agree]
    29. Preliminary Results: Scantron Survey, Year 2, Question 6 [bar chart; responses: Poor, Adequate, Good, Excellent]
    30. What We've Learned … So Far • Generally, participants are satisfied with the courses as written. • Most participants believe the course will improve student achievement. • Most participants believe the assignments were useful. • Most participants believe the courses met the stated goals/objectives. • Most believe the organization of the courses was good. • Most participants believe that taking the course required active participation on the part of the student. • Most participants believe that the course information was practical. • Most participants believe that the courses should continue to be offered.
    31. What Have We Learned From Participants' Comments? Year 1 • 34 comments were given. • 79% of the respondents left the class feeling it was a valuable learning experience. • 21% of the participants who commented left the course with negative feelings.
    32. Comments - Year 2 1. The most useful components of this course were: 75% of the respondents made positive comments. 2. Components of the course I would change are… 100% of the respondents made negative comments. 3. I would like to learn more about the following… Only one respondent commented on this question: • "I respect the idea of making it all very applicable, but felt I could have been pushed more to do more varied tasks."
    33. How Does The Data Match With The Research? 1. Has the professional development program impacted student achievement? No data has yet been collected to answer this question. 2. How have students been impacted by the professional development program? No data has yet been collected to answer this question; however, participants taking the course feel it will impact student achievement. 3. How have teachers been impacted by the professional development program? Generally, teachers feel positive about the experience and felt it was worthwhile.
    34. How Does The Data Match With The Research? 4. What components of the professional development program have been most influential in impacting student achievement? No data has been collected to answer this question. 5. Has the professional development program changed teachers' instructional practices? Some teachers felt that taking the course helped them better prepare their lessons and provided resources for designing technology-rich lessons. 6. Is the professional development program being implemented accurately? Many teachers were complimentary of the organization of the courses, the research that was put into the content, and the timely responses of the instructors; HOWEVER, many felt that they did not have enough guidance and that the amount of work required was excessive. Some commented that the links did not always work and that they couldn't use sites as instructed.
    35. How Does The Data Support The 5 Dimensions Of An On-Line Course? 1. Relevant and challenging assignments: Most participants felt the assignments were relevant and challenging. 2. Coordinated on-line learning environments: Some participants had difficulty working in an on-line environment; however, most were successful, and most felt the courses were well organized. 3. Adequate and timely feedback (teacher-student interaction): Some participants commented positively on the timeliness of teacher comments and help, but many felt lost and didn't know where to go to get help.
    36. 4. Rich environments for student-to-student interaction: No comments were given on this. The courses do not allow for student-to-student interaction. 5. Fostering flexibility in teaching and learning: The courses did provide opportunities for the participants to learn new information and to incorporate new ideas into lessons. Many commented that they left the course with new ideas for teaching their students.
    37. Dissemination • Report to FCPS Director of Library Media and Instructional Technology in August 2004 • Report will be included in the 2004-2005 Annual Technology Report that addresses the technology goals in the FCPS Master Plan • Report to the FCPS Teacher Specialist for Technology assigned to developing MSDE Professional Development Courses for Instructional Technology
