Video Lecture Capture Initiative - Summer Pilot Report
Submitted by:
Dr. Naomi M. Hall
Assistant Professor of Psychology
Behavioral Sciences and Social Work
Winston-Salem State University


8/11/2009

Introduction

The lecture-capture initiative was started to address the growing problem at Winston-Salem State University (WSSU) of students dropping, failing, or withdrawing (DFW) from classes each semester. Institutional research studies have shown a steady decline in retention each semester as a result of this problem; presently, the retention rate is roughly 14%, which is alarmingly low.

A number of different approaches have been taken across the academy to address the issue of high DFW rates, including modifying tutoring services, increasing mentorship initiatives, following up with at-risk students at shorter intervals during the semester to monitor their progress, and providing remediation, among other efforts. Since students' academic success centers on what happens in the classroom, the lecture-capture approach seemed a promising way to enhance student performance. Lectures captured on digital video and made available as streamed video allow students to review, at their own pace, the information the instructor explained. They can also view any associated visual aids and search for concepts and resources relating to those aids. Lectures captured in this way may help students overcome weaknesses in areas such as taking notes, paying attention over extended periods, or coping with language barriers. Another very important aspect is the ability to make remediation sessions available so that students who lack certain skills can remediate without spending class time doing so.
With this in mind, the Center for Excellence in Teaching and Learning (CETL) explored video lecture-capture solutions, which, in essence, place at students' disposal the core of what affects their academic success: the professor's perspective on the course material. This report details the results of the pilot implementation of the Video Lecture-Capture (VLC) Initiative undertaken by CETL during the 2009 summer session II.

Methodology

Participants

The participant pool (N=72) for this pilot initiative was overwhelmingly female (81.9%). When students were asked "in which range does your age fall?", 41.7% indicated they were less than 20 years of age, 16.7% between 21 and 24 years, and 41.7% over the age of 25. The student classification was a little more diverse, with the sample consisting of 7% Freshmen, 23% Sophomores, 32% Juniors, 10% Seniors, and 28% pursuing a second degree. In terms of course of study, the majority of students (60%) were Nursing majors. There were 12 different majors indicated (not including those who were 'undeclared'); Nursing represented the largest percentage of students, with Exercise Science second at 18%. The other ten majors collectively represented 28% of the pool.

Although five instructors participated in the pilot, students of only three instructors participated in the data collection portion of the pilot. The students who provided input on the project were taught by Drs. David Kump (48.6%), Daniel Williams (36.1%), and Frank Ingram (15.3%). Both Dr. Williams and Dr. Kump taught Biology 2312 (Anatomy & Physiology II), and Dr. Ingram taught Math 1311 (College Algebra).

Measures

Students were asked to complete three measures: 1) the Initial Student Assessment, 2) the Individual Video Lecture Capture (VLC) Assessment, and 3) the Final Student Assessment.
The Initial Student Assessment was given to all students (N=72) at the beginning of each course and asked primarily for demographic information such as gender, student classification, age, major, instructor, and prior science and math classes taken. The Individual VLC Assessment was an ongoing survey that asked students to provide information on the video(s) they viewed, along with their attitudes toward and experience using the software. According to the assessment data, forty-six percent (33) of the students viewed at least one video. Finally, students were asked to complete a Final Student Assessment to ascertain their attitudes and experiences viewing the videos and using the software, and how these may have affected their performance in the class. This assessment also asked students for suggestions on improving the videos and software and on implementing the initiative in other classes. Seventy-one percent (51) of the students who completed the Initial Student Assessment also completed the Final Student Assessment. Faculty who participated in the pilot initiative were asked to complete a Faculty Assessment Form to gather feedback about their perceptions of using the software, how they thought student performance in class was affected, and any suggestions for future use.

Procedures

The videos were captured and uploaded using Panopto, lecture-capture software that makes recording highly portable. It allowed faculty to record their lectures from anywhere, at any time, using video and audio, and captured information from PowerPoint slides or other text sources. From the metadata created, students had the opportunity to search using any of the text or PowerPoint thumbnails. During a search, all data (video, text, audio) are synchronized at the point of delivering the search results.
All students were instructed to sign up for a Panopto account so that they could access the videos.

After a few trials to be sure that Panopto would not pose any problems with its integration with Blackboard, the software was fully implemented for use by any faculty member. Five faculty members, mainly in Life Sciences and Math, recorded their lectures live and made the contents available for viewing by their students. Additionally, faculty were provided with the measures from which the following data were collected.

Students and faculty completed all assessment forms via SurveyMonkey by accessing the provided link. Faculty provided the link to the three assessment tools to all students via email. Students clicked on the link and were taken to the appropriate survey. Faculty were also given a link to use for completion of the Faculty Assessment Form.

Results (Student Perceptions)

Initial Student Assessment

To ascertain students' prior knowledge, they were asked, "Prior to this course, how many courses have you taken in each of the following disciplines: Life Science, Chemistry, Physics, Computer Science, and Math?" At least 85% of the students had taken at least one Life Science course prior to enrolling in their current class. More than half of the students (54%) reported taking at least one Chemistry course, with one-fourth (24%) indicating they had taken 'two or three' Chemistry courses. The majority of students (73%) had not yet taken Physics, and about one-third (31%) had not completed a Computer Science class; however, almost half of the students (49%) said they had taken one Computer Science course. Only 3% of students reported having no prior Math classes, and 69% had taken two or more Math classes prior to the current class.
Overall, it appears that students had more experience with Life Science, Chemistry, and Math classes than with Physics and Computer Science.

Students were asked to "Please indicate which science courses you have taken (you may choose multiple options)." This question was asked because the instructors felt it was important to know not only whether students had taken math and science prior to the current course, but specifically which classes were taken. Figures 1 and 2 show the percentage of students who had taken each course.

Due to the high DFW rates of the classes chosen for this pilot initiative, it was important to gather information on the grades students received in some of the prerequisite and complementary classes offered at WSSU. Students were asked to indicate their grade, if applicable, in the following classes: Biology 1301 (General Biology), Biology 2311 (Anatomy & Physiology I), Math 1311 (College Algebra), and Chemistry 2311 (General Chemistry). In both of the Biology classes, more than eight out of ten students indicated they received a 'C' or higher: Biology 1301 (90%) and Biology 2311 (89%). Data for students who had taken Biology 1301 indicate that 23% received an 'A,' 43% a 'B,' and 24% a 'C.' Approximately one-third of the students indicated they received an 'A' in Biology 2311 (31%), with the same number receiving a 'B' and 27% receiving a 'C.' Both of these courses are prerequisites for the Biology 2312 (Anatomy & Physiology II) course. The grade distributions for Math 1311 and Chemistry 2311 were very similar, with more than half of students receiving an 'A' or 'B.'

Individual VLC Assessment

The Individual VLC Assessment asked students to provide information on the video(s) they viewed, along with their attitudes toward and experience using the software. Due to limitations in data collection, it was difficult to distinguish between Dr. Williams's and Dr. Kump's classes, and between each of the sections (each professor had two sections).
The data presented are aggregated, and very few correlations with specific instructors can be made.

Students were asked to indicate which videos they reviewed during the course. The chapters in Biology 2312 with the highest number of students reviewing them were chapters 14, 16, 17, and 28; the chapters with the lowest number were chapters 23, 24, 35, and 29. When asked how many videos they viewed and for how long, students provided the following information:
    • Fifty-eight percent of the students who viewed one or more of the videos indicated they watched the entire video.
    • Of the 42% who did not view an entire video, the majority indicated they did not have time to watch the whole video.
    • Sixty-five percent of the students who viewed at least one video watched at least 10 minutes of it.
    • Of those students who viewed one or more of the videos, 93% watched alone.
It was important to gather information on the reasons students viewed any of the videos, and how the video(s) may or may not have contributed to their success in class. The overall tone of the students was that the videos were good for review and clarification. When asked why they viewed the videos, one student said, "to make sure I had an understanding of what was going on so I am prepared for my test. After listening to the videos it made everything clear to me." Another student indicated it was important to "review and fill in parts of my notes where I did not get everything written down in class." Students were asked if the video(s) were helpful in providing the information they needed for the class, and 85% indicated they were. Interestingly, one student who did not review the video(s) indicated that "if I would have viewed the videos I wouldn't have paid as close attention in class."

Even though the majority of the videos were not pre-recorded by the instructors, it was essential to learn whether students acquired any new information by watching them. Three out of five students (61%) acquired new information, and one student said that he or she was able to get "something that I may did not get in class…I understood it better after listening to the video." Additionally, another student said that "in class, discussion over diagrams were perhaps too fast for me to absorb all the information. Listening to those slides again helped."

We asked students if the video(s) were helpful to them, and 88% said "yes." When asked how they were helpful, three responses stood out:
    • “Good review, clearer understanding second time hearing it”
    • “I really really enjoy the video lectures I think every class and every university should have access to such a wonderful tool. It is an excellent guide to enforcing better study habits.”
    • “The slides went along with the video that helped out A TON!”
On a scale from 0 to 10 (10 being the highest), students were asked, "How beneficial was the lecture-capture video you just watched to your understanding of concepts relating to this class?" Fifty-two percent rated the video(s) between an 8 and 10; 36% between a 5 and 7; and only 12% said the video(s) were not very beneficial (between a 0 and 4).

We wanted to see the grade point averages (GPAs) of those who viewed the videos, to determine whether those with higher GPAs were more or less likely to use the software. Of the students who used the software to view one or more videos, 76% had a GPA of 3.0 or better. The video(s) were viewed most by those with a GPA between 3.1 and 3.5. Additionally, most of those who viewed the video(s) were employed (64%) and worked at least 21 hours per week (71%).

This was the first time that CETL used the Panopto software, so students were asked to rate the technical quality of the video. As Figure 3 shows, 73% rated the technical quality of the video(s) as good or excellent.

Final Student Assessment

Seventy-one percent (51) of the students who completed the Initial Student Assessment also completed the Final Student Assessment. The overwhelming majority of these students were from Dr. David Kump's classes (96%). Students were asked to indicate the number of videos watched over the summer school course: one in four students (24%) viewed five or more videos, 31% did not view any videos, and the remaining 45% viewed between one and four of the available videos.

Of the 31% who did not view any videos, 51% said it was because "it was not required for the course." Unfortunately, 33% did not view any of the videos because they had technical difficulties accessing Panopto; 12% said they "didn't see the benefit;" 8% said the videos were "too difficult to access;" and 4% did not know about the videos.
Students had the option of choosing multiple options, so the total does not equal 100%.

As indicated earlier, most of the videos were not pre-recorded but taped during a live class session. There appears to be a difference in the number of students who viewed the videos taped in their own class versus those taped in the other class (65% vs. 35%, respectively). In keeping with the Individual VLC Assessment, students overwhelmingly viewed the videos alone (73%). When asked about their video lecture viewing habits, students indicated that they either watched the entire video once, or fast-forwarded to certain parts and watched those multiple times (see Figure 4).

Students were asked how beneficial the videos were in understanding the concepts related to the course, and how valuable the videos were to the course. Both questions were on a 0-10 scale, with '0' being the lowest ('not at all') point on the scale. The responses were exactly the same for both questions: 51% rated the benefit and value of the videos between an 8 and 10, 25% between a 5 and 7, and 45% between a 0 and 4. Although a number of students rated the benefit and value fairly low, 96% said the videos were "very beneficial" or "somewhat beneficial" for student learning in the course. We were interested in any impact the initiative might have on student attendance. One of the concerns voiced was that students might not show up to class because the videos were available to them. This was unfounded in this study, as 96% of the students said the video(s) had "no effect" on their attendance.

We asked students a series of eight questions designed to gather their attitudes, opinions, and experiences with using the software and videos. The results of the eight questions are in Figure 5.
One can see that, overall, the videos were deemed a complementary tool to the lecture and text for most students.

The last two questions asked students to provide any suggestions to other students on how to use the videos to their benefit, and any suggestions on how to improve use of the videos in the future. Student suggestions on how to use the videos to their benefit included "using them if you miss a class, or something in class," "if you can't reach your teacher and you have a question," and "filling in the blanks." It appears that most students who viewed the video(s) used them because it may have been difficult to catch everything in class, so they could look at the video to make sure they understood what was said. There were a number of suggestions on how to improve the videos, and their use, in the future:
    • “Improve the sound quality…sometimes hard to hear”
    • “Get rid of the extra password, we already have one for Banner.”
    • “These should be downloadable because we don’t all have the Internet at home.”
    • “Work on the camera positions in class.”
    • “Make them available to everyone, not just those in this class.”
    • “Make the videos in smaller segments, it is hard to watch and listen to all of that.”
Results (Faculty Perceptions)

Faculty who participated in the pilot initiative were asked to complete a Faculty Assessment Form to gather feedback about their perceptions of using the software, how they thought student performance in class was affected, and any suggestions for future use. Many of these questions paralleled those in the Final Student Assessment, just re-worded. Three of the five faculty members completed the assessment. Two of the faculty (66.7%) recorded five or more lectures, while the other faculty member (33.3%) recorded three or four lectures. All three faculty indicated they spent more than 30 minutes on each lecture, informed all of their students about the videos, and recorded the lectures in class. Two of the three faculty members indicated they actually showed students how to access the lectures. One faculty member provided extra credit to students for using the videos ("5 extra points on the exam"), while two provided no incentives for students. One faculty member elaborated on his/her decision not to provide any extra motivation for using the videos: "As I stated many times before, if one of our outcomes is increased grades we cannot inflate the grades of the experimental group by giving points or incentives. The incentive has to be their want to do better."

Instructors were asked how beneficial they thought the videos were to students' understanding of the concepts related to the course, and how valuable the videos were to the students. Both questions were on a 0-10 scale, with '0' being the lowest ('not at all') point on the scale. Surprisingly, the faculty members all rated the benefit and value of the videos less than 5. Although the faculty rated the benefit and value fairly low, 66.7% (2) said the videos were "somewhat beneficial," and one member (33.3%) said they were "beneficial" for student learning in the course.
Two faculty members felt that the videos had "no effect" on student attendance, but one professor felt that having the videos available "reduced student attendance."

We also asked the faculty members a similar series of eight questions designed to gather their attitudes, opinions, and experiences with using the software and videos. The results of the eight questions are in Figure 6. One can see that, overall, the videos were not viewed as positively by faculty as they were by students.

Faculty were asked, "In your opinion, were there any apparent learning/performance differences between students who viewed certain topics versus those who did not?" All three faculty members said "no." Two professors qualified their answers as follows:
    • “This is very difficult to state. I have not yet evaluated the data, but I know that many of the students who were using the lectures were diligent students that were already spending a tremendous amount of time to master the material. I am interested to see who actually viewed the videos, for how long, and for which lectures.”
    • “I think the A students did not need the videos which was 4 out of my 15 students. The students that struggled with the material had trouble with the videos they expressed to me that they could not stop and ask me the question when they did not understand a topic in the video lecture.”
Even though all three instructors said they did not see any apparent learning/performance differences, two recommended the continued and expanded use of Panopto at WSSU.

Finally, faculty were asked to "please provide any additional comments you would like us to consider relating to the use of Panopto at WSSU." One participant answered the question, providing the following comment: "Looks like simply attending class and taking good notes, in other words, the old fashioned, tried and true methods, work best."

Discussion

Overall, it appears that students who utilized Panopto overwhelmingly thought it was beneficial to the class in which they were enrolled. Although this was a pilot study with only 72 students, almost half of the students who completed the assessments reviewed at least one video. The most common reasons to view the videos were to review for a quiz/test, to get clarity on a topic discussed in class, and to reinforce good study habits. The students who reviewed the videos most frequently, and for the longest periods, appear to be those whose GPAs hover around the 3.1-3.5 range. This may be interpreted in a number of ways: 1) students with lower GPAs may not have well-developed study habits to begin with, so they were not accessing the videos as much or for as long; 2) students with very high GPAs (3.6 and higher) may not feel the need to review the videos because they have developed a system of notetaking and studying that does not necessitate additional strategies; and/or 3) the majority of students enrolled in the class have a GPA between 3.1 and 3.5, so they naturally would be the ones reviewing the videos. As indicated in the Results section, 85% of students found the videos helpful in providing information relevant to the course.
It appears that it would be both advantageous and helpful to students if CETL continued using Panopto in classes with high DFW rates.

It appears that two of the main factors in students' decisions whether to access the videos were class requirements and time. Of those students who did not review any videos, over half said they did not because it was not required for the course. As with many classes, students are focused on requirements for the course, so it is important to make sure that reviewing at least one or two of the videos is somehow integrated into faculty syllabi. Students are more likely to utilize the software if there is an obvious and smooth incorporation into student expectations for the course. Students also indicated they did not have enough time to review the videos. It is unclear whether students believed they needed to review entire videos, or whether it was made clear to them that they had options. The summer sessions are very fast-paced and intense, and most of the students attending classes (per assessment data) also worked at least 20 hours per week. Students were honest in saying they had to prioritize what they could and could not do in this shortened course. This again speaks to making Panopto a requirement so that students will not have to make those types of study and review choices in the future.

Although only three faculty members provided feedback on their experience with Panopto, two were teaching the Biology 2312 class, which has one of the highest DFW rates at WSSU. The reviews from faculty were mixed, to say the least. Faculty did not see the benefit of the videos to the students: the videos were not seen as convenient, or as helpful in student preparation for quizzes, exams, or discussions. This may be a direct reflection of the fact that all videos during the pilot were recorded during an actual class session.
One of the ideas originally presented to faculty was for them to pre-record certain modules or sections; this would have allowed students to review information focused on hard-to-learn concepts rather than re-watch a class they may have already sat through. Pre-recording would let faculty teach a concept/theory/phenomenon in a virtual one-on-one session with PowerPoint slides, instead of moving around the classroom. None of the faculty agreed to pre-record, so this is something that will be explored in the fall 2009 sessions.

Limitations

No study, especially a quasi-experimental one, is without limitations. The first limitation is that only three professors provided data, and not everyone viewed the videos. Even though the data were aggregated, one professor had the majority of students using the software, and it is unclear whether accessing/using the software is positively correlated with the incentive given by that professor. Secondly, students were self-reporting and may have exaggerated their software usage out of a desire to obtain the incentive. It is also not known whether the data are skewed because most of the students were Nursing majors and Biology 2312 is a prerequisite for that program; students may theoretically be more motivated to do well in this particular class, because not doing so would mean denied entry into a specialized program. Finally, the summer session is very short and often attracts a hodgepodge of students from WSSU and other surrounding schools, many of whom are not traditional undergraduate students. It is uncertain whether the results from this session would match those of a traditional 15-week course taught to traditional students.