Features, Technical Problems and Technical Support in Wimba Classroom
 

Presented at the 2010 Creating Futures Through Technology Conference about a research study conducted to evaluate the perceptions of the use of Wimba Classroom.


Presentation Transcript

  • FEATURES, TECHNICAL PROBLEMS AND TECHNICAL SUPPORT IN WIMBA CLASSROOM
    Mary Nell McNeese, Ph.D. | Amy Thornton, M.S. | Jalynn Roberts, Ph.D.
    The University of Southern Mississippi
  • Wimba Classroom
    • Fully featured live virtual classroom that supports:
      • Audio
      • Video
      • Application sharing
      • Content display
      • Interactive polling
      • Breakout rooms
      • E-board
      • Downloadable archives (.mp3 and .mp4)
  • History
    • Pilot study
      • Spring 2007
      • Small sample size (USM only)
    • Extended research
      • April 2009
      • 2 universities and 2 community colleges
  • Wimba Classroom: an Emerging Technology
    • Very little research has been done to date.
    • The studies that have been conducted are individual case studies of how Wimba was used in specific classrooms (McIntosh, Braul, & Chao, 2003; Curtis & Lawson, 2001; Mallory, Ramage, Snow, & Coyle, 2009; Conrad, 2002; O’Regan, 2003; King, 2002; Jones, Johnson-Yale, Millermaier, & Perez, 2008).
  • Wimba Classroom: an Emerging Technology
    • Collaboration software use has positive effect on students.
    • Students need time to acclimate to unfamiliar collaborative technology.
    • Graduate students, in particular, experience more anxiety related to the use of Wimba Classroom.
    • Many students reported frustrations when trying to use Wimba Classroom features online.
  • The research question for this study
    • Is there a statistically significant difference in the rating of the features used in Wimba Classroom, the level of technical problems with using Wimba Classroom, the rating of the technical support provided, and the number of completely online classes taken, based on whether participants were underclassmen, upperclassmen, or graduate students?
  • Increase of online courses
    • More higher education courses are online, which enhances flexibility, expands enrollment, and, in turn, increases revenue (Allen & Seaman, 2008; Larreamendy-Joerns & Leinhardt, 2006).
    • More undergraduates than graduate students report taking online courses and having greater experience with online technology (Artino & Stephens, 2009).
  • Methodology
    • Approximately 300 students (ages ranging from 18 to 88) were invited to participate, and 142 responded (approximately 47%).
    • Students were enrolled in online classes using Wimba Classroom software.
    • Students were sampled from two community colleges and two public universities.
    • The institutions were selected based on convenience and expressed interest in participating in the study.
  • Survey Instrument
    • The researchers constructed an online questionnaire based on the research literature on distance learning.
    • Questionnaire was field-tested and evaluated by a panel of experts.
    • The questionnaire was placed online using the university’s survey software tool.
  • Survey Instrument Questions
    • Collected basic demographic information: classification, major, and institution.
    • The questionnaire also collected the following data:
      • The level of student satisfaction using Wimba Classroom
      • The number of fully or partially online classes previously taken
      • Students’ previous experiences using online collaboration software
      • Types of collaboration software previously used
      • Overall satisfaction with and usefulness of Wimba Classroom
      • Types of features used in Wimba Classroom
      • Usefulness of features used in Wimba Classroom
  • Survey Instrument Questions
      • Additional information collected by the questionnaire:
      • Type of Internet connection used
      • Level of technical difficulties experienced while using Wimba Classroom
      • Types of technical support received and their usefulness in resolving problems
      • An open-ended question allowed students to share any additional information or feedback
  • Likert-Scale Items
    • Items measured by Likert Scales included:
      • Usefulness of features used in Wimba Classroom (5-point), computed as the average rating across all Wimba features.
      • The level of technical problems with using Wimba Classroom (5-point)
      • Usefulness of technical support provided (5-point)
    • Likert-scale items were determined reliable by a Cronbach’s alpha analysis (α = .77); a computational sketch follows.
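For readers who want to reproduce the reliability check, here is a minimal Python sketch of Cronbach's alpha. The study itself used SPSS; pandas and the item column names below are illustrative assumptions, not the authors' code.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are Likert items."""
    items = items.dropna()                     # listwise deletion of missing data
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    scale_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / scale_var)

# Hypothetical item columns; the slides report alpha = .77 for their scales.
# alpha = cronbach_alpha(df[["feature_usefulness", "tech_problems", "tech_support"]])
```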
  • Procedures
    • Researchers obtained IRB approval and permission from the chief academic officer at participating institutions.
    • Researchers then emailed the questionnaire link to the distance learning coordinators, who forwarded it to the appropriate faculty, who in turn distributed it to their students in April 2009. The email also included an informed consent statement.
    • The questionnaire took approximately 10-15 minutes to complete. The survey window closed at the end of June 2009.
    • Researchers downloaded data from the online questionnaire into SPSS software for storage and analysis.
    • Missing data were not considered in the data analysis.
  • Results – Recoding Classification
    • To answer the research question, the classification data were recoded into three levels (a pandas sketch follows this list).
      • The researchers collapsed the classifications “Freshman” and “Sophomore” into the combined category “Underclassmen” (coded as 1)
      • The classifications “Junior” and “Senior” into the combined category “Upperclassmen” (coded as 2)
      • The classification “Graduate Student” was coded as 3.
    • After recoding, three categories were used in the present study: underclassmen, upperclassmen, and graduate students.
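The recoding itself is a one-line mapping; here is a sketch in pandas. The raw column name `classification` is an assumption, while the codes follow the slides.

```python
import pandas as pd

# Hypothetical raw responses; the real data came from the survey tool.
df = pd.DataFrame({"classification": ["Freshman", "Senior", "Graduate Student"]})

# Collapse five classifications into the three analysis groups.
recode_map = {
    "Freshman": 1, "Sophomore": 1,   # 1 = Underclassmen
    "Junior": 2, "Senior": 2,        # 2 = Upperclassmen
    "Graduate Student": 3,           # 3 = Graduate students
}
df["group"] = df["classification"].map(recode_map)
```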
  • Results - MANOVA
    • The researchers used a multivariate analysis of variance (MANOVA) to address the research question.
    • The alpha level was set at .05.
    • The independent variable was the fixed factor “group” which had 3 levels: underclassmen (freshmen and sophomores), upperclassmen (juniors and seniors), and graduate students.
    • The dependent variables were:
      • The number of classes taken fully or partially online
      • The level of technical problems with using Wimba Classroom
      • The rating of technical support provided
      • The composite variable average of “usefulness of features in the Wimba Classroom”
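The analysis was run in SPSS; for illustration only, an equivalent one-factor MANOVA in Python's statsmodels might look like the sketch below. The file name and the four dependent-variable column names are hypothetical stand-ins.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("wimba_survey.csv")  # hypothetical export of the survey data

# Four dependent variables, one fixed factor with three levels (1/2/3).
model = MANOVA.from_formula(
    "online_classes + tech_problems + tech_support + feature_usefulness"
    " ~ C(group)",
    data=df,
)
print(model.mv_test())  # reports Pillai's trace, Wilks' lambda, etc.
```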
  • MANOVA Results Continued
    • Box’s test revealed a minor violation of the homogeneity of covariance matrices assumption.
    • Therefore, the Pillai’s Trace statistic is reported, because it is the most robust to that particular assumption violation.
    • Significant multivariate differences were found among the three student groups, Pillai’s Trace = .16, F(8, 254) = 2.69, p = .007, partial η² = .08, observed power = .93.
    • The multivariate partial eta-squared was relatively low.
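For reference (a standard conversion, not stated on the slides): the multivariate partial eta-squared follows from Pillai's trace as partial η² = V / s, where s = min(number of dependent variables, df of the effect) = min(4, 2) = 2; here .16 / 2 = .08, which matches the reported effect size.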
  • Follow-Up Tests
    • Given the significant multivariate differences, univariate follow-up tests were examined.
    • Two of the four dependent variables displayed significant differences:
      • The rating of the level of technical problems with using Wimba Classroom, F(2, 129) = 3.65, p = .03
      • The “average usefulness of features of Wimba Classroom”, F(2, 129) = 4.65, p = .01
    • The number of classes taken fully online and the rating of technical support provided were not statistically significant. (An illustrative SciPy sketch of these follow-ups appears below.)
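Continuing the illustrative Python analysis, the univariate follow-ups are one-way ANOVAs run per dependent variable (same assumed column names and `df` as in the sketches above):

```python
from scipy.stats import f_oneway

# One-way ANOVA for each dependent variable across the three groups.
dvs = ["online_classes", "tech_problems", "tech_support", "feature_usefulness"]
for dv in dvs:
    samples = [df.loc[df["group"] == g, dv].dropna() for g in (1, 2, 3)]
    f_stat, p_value = f_oneway(*samples)
    print(f"{dv}: F = {f_stat:.2f}, p = {p_value:.3f}")
```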
  • And Finally…
    • Dunnett’s C post hoc analysis for the rating of the level of technical problems showed that:
      • Underclassmen reported significantly fewer problems than graduate students.
      • No other groups differed significantly from each other.
    • Tukey’s HSD post hoc analysis for the average usefulness rating of the Wimba Classroom features showed that:
      • Underclassmen rated the features significantly more useful than both upperclassmen and graduate students.
      • There was no statistically significant difference between upperclassmen and graduate students.
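Of the two post hoc procedures, Tukey's HSD has a direct statsmodels implementation; Dunnett's C does not, so this sketch covers only the Tukey comparison (same assumed column names as above):

```python
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Pairwise Tukey HSD on the composite feature-usefulness rating.
result = pairwise_tukeyhsd(
    endog=df["feature_usefulness"],  # dependent variable
    groups=df["group"],              # 1 = under-, 2 = upper-, 3 = graduate
    alpha=0.05,
)
print(result)  # table of pairwise mean differences and significance
```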
  • Discussion – Technical Issues
    • Classification
      • underclassmen
      • upperclassmen
      • graduate
    • Experience Level
      • number of online courses taken
  • Practical Applications
    • Best Practices
      • audio checks
      • regular interaction
      • start slow
    • Preparation
      • orientation
      • organization
      • practice
    • Archiving
  • Limitations
    • Researchers were also participating instructors
      • possible unintended influence
    • Sample size
      • questionnaires were distributed through intermediaries and may have been lost in transfer
    • Subjective measures
      • measured students’ own perceptions
  • Recommendations for Future Research
    • Case study
      • Compare measured learning outcomes
      • Face-to-face vs. online course
    • Grouping by age
    • Anxiety level
      • Pre-test and post-test
      • potential differences
    • Usefulness
      • vs. technical problems reported
      • vs. type of Internet connection
  • Where are we now?
    • Submitted to the journal The Internet and Higher Education
    • Presenting at Wimba Connect 2010
    • Pursuing new comparison study between synchronous and asynchronous online learning environments
  • Questions???
    • [email_address]
    • [email_address]
    • [email_address]
    • http://instructtech.wordpress.com