<li>Of the 42% who did not view an entire video, the majority indicated they didn’t have time to watch the whole video.</li>
<li>65% of the students who viewed at least one video watched at least 10 minutes of the video.</li>
<li>Of those students who viewed one or more of the videos, 93% watched them alone.</li></ul>It was important to gather information on the reasons students viewed any of the videos, and how the video(s) may or may not have contributed to their success in class. The overall tone of the student responses was that the videos were good for review and clarification. When asked why they viewed the video, one student said “to make sure I had an understanding of what was going on so I am prepared for my test. After listening to the videos it made everything clear to me.” Another student indicated it was important to “review and fill in parts of my notes where I did not get everything written down in class.” Students were asked if the video(s) provided the information they needed for the class, and 85% indicated they did. Interestingly, one student who did not review the video(s) indicated that “if I would have viewed the videos I wouldn't have paid as close attention in class.”<br />Even though the majority of the videos were not pre-recorded by the instructors, it was essential to learn whether students acquired any new information by watching them. Three out of five students (61%) acquired new information, and one student said that he or she was able to get “something that I may did not get in class…I understood it better after listening to the video.” Additionally, another student said that “in class, discussion over diagrams were perhaps too fast for me to absorb all the information. Listening to those slides again helped.”<br />We asked students if the video(s) were helpful to them, and 88% said “yes.” When asked how they were helpful, three responses stood out:<br /><ul><li>“Good review, clearer understanding second time hearing it”</li>
<li>“I really really enjoy the video lectures I think every class and every university should have access to such a wonderful tool. It is an excellent guide to enforcing better study habits.”</li>
<li>“The slides went along with the video that helped out A TON!”</li></ul>On a scale from 0 to 10 (10 being the highest), students were asked how beneficial the lecture-capture video they had just watched was to their understanding of concepts related to the class. Fifty-two percent rated the video(s) between an 8 and 10; 36% between a 5 and 7; and only 12% said the video(s) were not very beneficial (between a 0 and 4).<br />We wanted to see the grade point average (GPA) of those who viewed the videos to determine whether those with higher GPAs were more or less likely to utilize the software. Of the students who used the software to view one or more videos, 76% had a GPA of 3.0 or better. The video(s) were viewed most by those who had a GPA between 3.1 and 3.5. Additionally, most of those who viewed the video(s) were employed (64%) and worked at least 21 hours per week (71%). <br />This was the first time that CETL used the Panopto software, so the students were asked to rate the technical quality of the video. As shown in Figure 3, 73% rated the technical quality of the video(s) as good or excellent. <br />Final Student Assessment<br />Seventy-one percent (51) of the students who completed the Initial Student Assessment also completed the Final Student Assessment. The overwhelming majority of these students were from Dr. David Kump’s classes (96%). The students were asked to indicate the number of videos watched during the summer course, and one in four students (24%) viewed five or more videos. Thirty-one percent of the students did not view any videos, and the remaining 45% viewed between one and four of the videos available. 
<br />Of those 31% who did not view any videos, 51% said they did not because “it was not required for the course.” Unfortunately, 33% did not view any of the videos because they had technical difficulties accessing Panopto; 12% said they “didn’t see the benefit;” 8% said the videos were “too difficult to access;” and 4% did not know about the videos. Students had the option of choosing multiple options, so the total does not equal 100%. <br />As indicated earlier, most of the videos were not pre-recorded but taped during a live class session. There appears to be a difference in the number of students who viewed the videos that were taped in their own class versus those taped in the other class (65% vs. 35%, respectively). In keeping with the Individual VLC Assessment, students overwhelmingly viewed the videos alone (73%). When asked about their video lecture viewing habits, students indicated that they either watched the entire video once, or fast-forwarded to certain parts and watched those multiple times (see Figure 4).<br />Students were asked how beneficial the videos they viewed were in understanding the concepts related to the course, and how valuable those videos were to the course. Both questions were on a 0-10 scale, with ‘0’ being the lowest (‘not at all’) point on the scale. The responses were exactly the same for both questions: 51% rated the benefit and value of the videos between an 8 and 10; 25% between a 5 and 7; and 45% between a 0 and 4. Although a number of students rated the benefit and value fairly low, 96% said the videos were “very beneficial” or “somewhat beneficial” for student learning in the course. We were interested in any impact the initiative may have had on student attendance. One of the concerns voiced was that students might not show up to class because the videos are available to them. 
This was unfounded in this study, as 96% of the students said the video(s) had “no effect” on their attendance.<br />We asked students a series of eight questions designed to gauge their attitudes, opinions, and experiences with using the software and videos. The results of the eight questions are in Figure 5. One can see that, overall, videos were deemed a complementary tool to the lecture and text for most students.<br /> <br />The last two questions asked students to provide any suggestions to other students on how to use the videos to their benefit, and any suggestions on how to improve the use of the videos in the future. Student suggestions on how to use the videos to their benefit included “using them if you miss a class, or something in class,” “if you can’t reach your teacher and you have a question,” and “filling in the blanks.” It appears that most students who viewed the video(s) used them because it may have been difficult to catch everything in class, so they could look at the video to make sure they understood what was said. There were a number of suggestions on how to improve the videos, and their use, in the future: <br /><ul><li>“Improve the sound quality…sometimes hard to hear”</li>
<li>“Get rid of the extra password, we already have one for Banner.”</li>
<li>“These should be downloadable because we don’t all have the Internet at home.”</li>
<li>“Work on the camera positions in class.”</li>
<li>“Make them available to everyone, not just those in this class.”</li>
<li>“Make the videos in smaller segments, it is hard to watch and listen to all of that.”</li></ul>Results (Faculty Perceptions)<br />Faculty who participated in the pilot initiative were asked to complete a Faculty Assessment Form to gather feedback about their perceptions of using the software, how they thought student performance in class was impacted, and any suggestions for future use. Many of these questions were the same as those in the Final Student Assessment, just re-worded. Three of the five faculty members completed the assessment. Two of the faculty (66.7%) recorded five or more lectures, while the other faculty member (33.3%) recorded three or four lectures. All three faculty indicated they spent more than 30 minutes on each lecture, informed all of their students about the videos, and recorded the lectures in class. Two of the three faculty members indicated they actually showed students how to access the lectures. One faculty member provided extra credit to students for using the videos (“5 extra points on the exam”), while two provided no incentives. One faculty member elaborated on his/her decision not to provide any extra motivation for using the videos: “As I stated many times before, if one of our outcomes is increased grades we cannot inflate the grades of the experimental group by giving points or incentives. The incentive has to be their want to do better.” <br />Instructors were asked how beneficial they thought the videos were in students’ understanding of the concepts related to the course, and how valuable the videos were to the students. Both questions were on a 0-10 scale, with ‘0’ being the lowest (‘not at all’) point on the scale. Surprisingly, the faculty members all rated the benefit and value of the videos below 5. 
Although the faculty rated the benefit and value fairly low, 66.7% (2) said the videos were “somewhat beneficial,” and one member (33.3%) said they were “beneficial” for student learning in the course. Two faculty members felt that the videos had “no effect” on student attendance, but one professor felt that having the videos available “reduced student attendance.”<br />We also asked the faculty members a similar series of eight questions designed to gauge their attitudes, opinions, and experiences with using the software and videos. The results of the eight questions are in Figure 6. One can see that, overall, the videos were not viewed as positively by faculty as they were by students. <br />Faculty were asked, “In your opinion, were there any apparent learning/performance differences between students who viewed certain topics versus those who did not?” All three faculty members said “no.” Two professors qualified their answers as follows: <br /><ul><li>“This is very difficult to state. I have not yet evaluated the data, but I know that many of the students who were using the lectures were diligent students that were already spending a tremendous amount of time to master the material. I am interested to see who actually viewed the videos, for how long, and for which lectures.”</li>
<li>“I think the A students did not need the videos which was 4 out of my 15 students. The students that struggled with the material had trouble with the videos they expressed to me that they could not stop and ask me the question when they did not understand a topic in the video lecture.”</li></ul>Even though all three instructors said they did not see any apparent learning/performance differences, two recommended the continued and expanded use of Panopto at WSSU. <br />Finally, faculty were asked to “please provide any additional comments you would like us to consider relating to the use of Panopto at WSSU.” One participant answered the question and provided the following comment: “Looks like simply attending class and taking good notes, in other words, the old fashioned, tried and true methods, work best.”<br />Discussion<br />Overall, it appears that students who utilized Panopto overwhelmingly thought it was beneficial to the class in which they were enrolled. Although this was a pilot study with only 72 students, almost half of the students who completed the assessments reviewed at least one video. The most common reasons to view the videos were to review for a quiz/test, to get clarity on a topic discussed in class, and to reinforce good study habits. The students who reviewed the videos more frequently, and for longer periods of time, appear to be those whose GPAs hover around the 3.1-3.5 range. 
This may be interpreted in a number of ways: 1) students who have lower GPAs may not have well-developed study habits to begin with, so they were not accessing the videos as much or over a sustained period of time; 2) students with very high GPAs (3.6 and higher) may not feel the need to review the videos because they have developed a system of notetaking and studying that does not necessitate additional strategies; and/or 3) the majority of students enrolled in the class have a GPA between 3.1 and 3.5, so they naturally would be the ones reviewing the videos. As indicated in the Results section, 85% of students found the videos helpful in providing information relevant to the course. It appears that it would be both advantageous and helpful to students if CETL continued using Panopto in classes with high DFW rates.<br />It appears that two of the main factors that helped students decide whether or not to access the videos were class requirements and time. Of those students who did not review any videos, over half said they did not because it was not required for the course. As with many classes, students are focused on requirements for the course, and it is important to make sure that reviewing at least one or two of the videos is somehow integrated into faculty syllabi. Students are more likely to utilize the software if there is an obvious and smooth incorporation into student expectations for the course. Students also indicated they did not have enough time to review the videos. It is unclear if students believed that they needed to review the entire video, or if it was made clear to them that they had options. The summer sessions are very fast and intense, and most of the students who were attending classes (per Assessment data) also worked at least 20 hours per week. Students were honest in saying they had to prioritize what they could and could not do in this shortened course. 
This again speaks to making Panopto a requirement so that students will not have to make those types of study and review choices in the future. <br />Although only three faculty members provided feedback on their experience with Panopto, two were teaching the Biology 2312 class, which has one of the highest DFW rates at WSSU. The reviews from faculty were mixed to say the least. Faculty did not see the benefit of the videos to the students. The videos were not seen as convenient, or as helpful in student preparation for quizzes and exams or in discussions. This may be a direct reflection of the fact that all videos during the pilot were recorded during an actual class session. One of the ideas originally presented to faculty was for them to pre-record certain modules or sections; this would have allowed students to review information focused on hard-to-learn concepts rather than view a class they may have already sat through. Pre-recording has the potential to let faculty teach a concept/theory/phenomenon in a virtual one-on-one session with PowerPoint slides, instead of moving around the classroom. None of the faculty agreed to pre-record, so this is something that will be explored in the fall 2009 sessions.<br />Limitations<br />No study, especially a quasi-experimental one, is without limitations. The first limitation is that only three professors provided data, and not every student viewed the videos. Even though the data were aggregated, one professor had the majority of students using the software. It is unclear if accessing/using the software is positively correlated with the incentive given by that professor. Secondly, students were self-reporting and may have exaggerated their software usage out of a desire to obtain the incentive. It is also not known if the pilot data are skewed because most of the students were Nursing majors, and Biology 2312 is a prerequisite for that program. 
This could mean that students were theoretically more motivated to do well in this particular class, because not doing so would mean being denied entry into a specialized program. Finally, the summer session is a very short session, and often attracts a hodgepodge of students from WSSU and other surrounding schools, many of whom are not traditional undergraduate students. It is uncertain whether the results from this session would match those of a traditional 15-week course taught to traditional students. <br />