Evaluating the Success of Digital Content

WCET Annual Meeting Presentation

  1. Evaluating the Success of Digital Content. WCET Annual Meeting, San Antonio, Texas, November 2, 2012. Presenters: Dr. Darlene Williams, Northwestern State University (LA); Ms. Julie Ricke (eduKan)
  2. Evaluating the Success of Digital Content. Abstract: Institutions are converting textbook-based courses to more interactive courses with digitally embedded content. The presenters will discuss measures of success with the implementation of digitally embedded content at two very different institutions. Additionally, they will describe the model and strategy employed by their respective institutions for their digital content initiatives; share the successes and stumbling blocks of implementation; and enumerate methods to measure and analyze student success and engagement with digital content and open educational resources.
  3. Northwestern State University of Louisiana • 32 Online Degree Programs • 68% of Students Take Online Courses • 30% Are Exclusively Online • 12,000 Enrollments Each Semester • 550-600 Course Sections Each Semester • LMS: Moodle
  4. Northwestern State University of Louisiana • Experimentation (1999-2000): Desktop Video, Course Submission Database, Student Assessment, Splash Page, "Core" Course Development, Video Conferencing Infrastructure • Technology Costing Methodology Project (2001: WCET and NCHEMS Joint Project): Recognizing the Need to Be Adult Friendly • SREB Assessment (September 2004) • Professional Development Redesign (2005): Competency Levels, Staff (Instructional and Media), Mobile Initiatives (Web, Course Content, etc.) • Digital Content and Social Media (2008-2009): Storage, Sharing, Backup, Lecture Capture
  5. Barriers to the Adoption of Digital Content • Digital resources (web resources, video, audio) provide great options when developing digital content • Findings: during the development and review processes established for online courses, the resources are: not necessarily evaluated for quality; not necessarily evaluated as an appropriate measure of learning outcomes; alignment with learning outcomes is a time-intensive process; university resources are often limited to support the digital effort; not necessarily used to their fullest potential
  6. Early Initiatives • Publisher Course Packs Designed for Learning Management Systems • Learning Objects (Merlot) • Faculty-Produced Digital Content (Video Cameras, Audio Recorders) • Video Conferencing Recordings (Full and Partial Lectures) • Podcasts (Podcast Producer): RSS Feeds, Imported into the LMS, Designed for Multiple Devices • There was still limited digital content being created and distributed in online courses.
  7. Faculty as Digital Content Adopters and Developers • Ebooks (Student Choice) • Audio • Video • YouTube • Vimeo • Blogs • Wikis • Documents • Portfolio Elements
  8. Faculty as Digital Content Adopters and Developers
  9. Challenges in the Support and Development of Digital Content • Creating Content • Methods for Creating Access to the Content • Sharing with Students/Departments/Colleagues • Backup/Storage/Disaster Recovery Protocol • Establishing an IT Support Protocol Is an Important Part of the Process
  10. Challenges in the Support and Development of Digital Content • What Is the Life Cycle of the Digital Content? • How Do You Manage the Process? • What Support Is Required in Order to Be Successful? • How Do We Manage the Promotion of Effective Digital Pedagogy? • What Are the Criteria for Evaluating Digital Content? • How Can We Better Assess the Effectiveness of Digital Content?
  11. Early Initiative: Early Deployment Architecture; Transition to a Fully Integrated, Comprehensive Content Management System • Current Capability: 47 Classrooms Capable of Recording Lectures Automatically or Ad Hoc • A Web Interface Allows for the Upload of Content from the Office, Home, or in the Field • Video Available for Faculty to Publish
  12. Measuring Success of Digital Content • Students • Usage (Do Students Access and View the Content?): Track "Hits" on the Learning Management System; Track "Views" on VIC* (Video Integrated Content system: content created as full lectures in the classroom, ad hoc from the office or field, through Jabber (MOVI), Podcast Producer, etc.) • The First Week of Classes Tracked Hundreds of Views by Students. *VIC is also the name of NSU's mascot.
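As a rough illustration of the usage tracking described on the slide above, here is a small sketch that tallies views per resource from an exported activity log. The log format, field names, and values are hypothetical; Moodle and the VIC system each have their own export formats.

```python
# Hypothetical sketch: counting "hits"/"views" per resource from an exported
# activity log. The record layout and values below are invented for illustration.
from collections import Counter

log_records = [
    {"student": "s1", "resource": "Week 1 lecture capture"},
    {"student": "s2", "resource": "Week 1 lecture capture"},
    {"student": "s1", "resource": "Unit 2 podcast"},
]

# Total views per item.
views_per_resource = Counter(r["resource"] for r in log_records)

# Distinct students who viewed each item.
viewers_per_resource = {
    res: len({r["student"] for r in log_records if r["resource"] == res})
    for res in views_per_resource
}

print(views_per_resource)
print(viewers_per_resource)
```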
  13. Measuring Success of Digital Content • Faculty adoption • Participation in professional development • Work with appropriate staff (instructional and media positions to assist in the development and assessment of desired course content) • Adoption of standards and rubrics to assess learning outcomes • Involvement in a peer review process
  14. Current Efforts: Assist Faculty in the Development of All Forms of Content • Open content • Mobile content • Connected content • Collaborative content • Cross-media content
  15. Future • "Long range planning does not deal with future decisions, but with the future of present decisions." (Peter Drucker) • Developing content that provides the best learning experience for students • Adoption of "digital content" best practices • Incorporate assessment strategies for digital content
  16. Thank You! Dr. Darlene Williams, Vice President for Technology, Research, and Economic Development, Northwestern State University, darlene@nsula.edu
  17. eduKan provides access to quality higher education through degrees, certificates, individual courses, support services, and emerging market-driven programming. We are accessible, convenient, and affordable.
  18. Average cost of a textbook: $157.56 (range $18.29 to $307.90) • Spring 2010: 1,338 students spent $233,082 • The 10 most popular classes accounted for 60% of total textbook costs, at an average cost of $224.20
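To make the arithmetic on this slide explicit, the sketch below uses only the figures quoted above; the per-student average and the top-10 dollar share are derived values, not numbers from the presentation.

```python
# Spelling out the arithmetic implied by the slide's figures.
total_spent = 233_082      # Spring 2010 textbook spending by 1,338 students
students = 1_338
top10_share = 0.60         # share of textbook costs in the 10 most popular classes

print(round(total_spent / students, 2))   # ~174.20 spent per student (derived)
print(round(total_spent * top10_share))   # ~139,849 spent in the top-10 classes (derived)
```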
  19. Project Aristotle
  20. Retention: Project Aristotle sections vs. textbook sections, matched by course.

      | Course ID | Course Title | Match Index | Format | Number Completed | Number Withdrew | Number of Students | Completion Rate | Difference |
      |---|---|---|---|---|---|---|---|---|
      | 4933338 | Introduction to Computer Concepts and Applications, Herrera | 1 | ARISTOTLE | 25 | 6 | 31 | 0.81 | |
      | 4576193 | Introduction to Computer Concepts and Applications, Herrera | 1 | TEXTBOOK | 16 | 2 | 18 | 0.89 | -0.08 |
      | 4933352 | Principles of Macroeconomics, Reynolds | 2 | ARISTOTLE | 18 | 2 | 20 | 0.90 | |
      | 4576219 | Principles of Macroeconomics, Reynolds | 2 | TEXTBOOK | 8 | 2 | 10 | 0.80 | 0.10 |
      | 4933324 | Beginning Algebra, Wenzel | 3 | ARISTOTLE | 18 | 6 | 24 | 0.75 | |
      | 4576098 | Beginning Algebra, Wenzel | 3 | TEXTBOOK | 12 | 5 | 17 | 0.71 | 0.04 |
      | 4933336 | Intermediate Algebra, Goymerac | 4 | ARISTOTLE | 12 | 2 | 14 | 0.86 | |
      | 4576187 | Intermediate Algebra, Goymerac | 4 | TEXTBOOK | 23 | 0 | 23 | 1.00 | -0.14 |
      | 4933328 | College Algebra, Dowell | 5 | ARISTOTLE | 15 | 4 | 19 | 0.79 | |
      | 4576113 | College Algebra, Dowell | 5 | TEXTBOOK | 19 | 7 | 26 | 0.73 | 0.06 |
      | 4933390 | American Government | 6 | ARISTOTLE | 12 | 2 | 14 | 0.86 | |
      | 4573597 | American Government, Kryschtal | 6 | TEXTBOOK | 16 | 0 | 16 | 1.00 | -0.14 |
      | 4933384 | Personal Finance, Niederman | 7 | ARISTOTLE | 11 | 7 | 18 | 0.61 | |
      | 4576166 | Personal Finance, Niederman | 7 | TEXTBOOK | 14 | 2 | 16 | 0.88 | -0.26 |
      | 4933375 | Introduction to Business, M Hatcher | 8 | ARISTOTLE | 14 | 2 | 16 | 0.88 | |
      | 4576141 | Introduction to Business, M Hatcher | 8 | TEXTBOOK | 9 | 3 | 12 | 0.75 | 0.13 |
      | 4955234 | College Algebra, Faullin | 9 | ARISTOTLE | 20 | 2 | 22 | 0.91 | |
      | 4576091 | College Algebra, Faullin | 9 | TEXTBOOK | 11 | 1 | 12 | 0.92 | -0.01 |
      | 4933400 | Introduction to Computer Concepts and Applications, Herrera | 10 | ARISTOTLE | 13 | 4 | 17 | 0.76 | |
      | 4573626 | Introduction to Computer Concepts and Applications, Herrera | 10 | TEXTBOOK | 13 | 6 | 19 | 0.68 | 0.08 |
      | 4933392 | Beginning Algebra, Wenzel | 11 | ARISTOTLE | 11 | 2 | 13 | 0.85 | |
      | 4573600 | Beginning Algebra, Wenzel | 11 | TEXTBOOK | 9 | 1 | 10 | 0.90 | -0.05 |
      | 4933399 | Intermediate Algebra, Goymerac | 12 | ARISTOTLE | 10 | 0 | 10 | 1.00 | |
      | 4573624 | Intermediate Algebra, Goymerac | 12 | TEXTBOOK | 13 | 1 | 14 | 0.93 | 0.07 |
      | 4930692 | Introduction to Computer Concepts and Applications, Herrera | 13 | ARISTOTLE | 9 | 0 | 9 | 1.00 | |
      | 3979823 | Introduction to Computer Concepts and Applications, Herrera | 13 | TEXTBOOK | 8 | 1 | 9 | 0.89 | 0.11 |
      | 4930686 | American Government, Kryschtal | 14 | ARISTOTLE | 4 | 2 | 6 | 0.67 | |
      | 3979813 | American Government, Kryschtal | 14 | TEXTBOOK | 3 | 2 | 5 | 0.60 | 0.07 |

      Slide annotations: 6 matches showed lower retention, 8 showed higher retention; almost no impact overall (average difference -0.003).
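The derived columns above can be recomputed directly. A minimal sketch (not part of the presentation) of the completion-rate and difference calculation for the first two matched pairs:

```python
# Recomputing completion rates and differences for two matched pairs.
# Counts come from the table above; variable names are ours.
pairs = [
    # (match, course, aristotle_completed, aristotle_total, textbook_completed, textbook_total)
    (1, "Intro to Computer Concepts and Applications, Herrera", 25, 31, 16, 18),
    (2, "Principles of Macroeconomics, Reynolds", 18, 20, 8, 10),
]

for match, course, a_done, a_total, t_done, t_total in pairs:
    a_rate = a_done / a_total   # completion rate = number completed / number of students
    t_rate = t_done / t_total
    print(f"Match {match} ({course}): Aristotle {a_rate:.2f}, "
          f"Textbook {t_rate:.2f}, difference {a_rate - t_rate:+.2f}")
```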
  21. Other Findings (2011-12): Students saved approximately $68,000; eduKan retained approximately $24,000
  22. Introduction to Business, M Hatcher • Aristotle design (4933375): completion rate 88%, average grade 83% • Pre-Aristotle design (4576141): completion rate 75%, average grade 86%
  23. Introduction to Business, M Hatcher • Interactivity node/edge chart: Aristotle design vs. pre-Aristotle design • Instructor and student interactivity is higher in the new design; of particular note is the inclusion of all course users
  24. Introduction to Business, M Hatcher • Activity intensity comparison by feature: Aristotle design vs. pre-Aristotle design (chart scale: average minutes of activity per user, 1 to >15)
  25. Introduction to Business, M Hatcher • Activity comparison by week (weeks 1-12): Aristotle design vs. pre-Aristotle design (chart scale: average minutes of activity per user, 1 to >35)
  26. Introduction to Business, M Hatcher • Point accumulations (per user) paired to weekly activity (weeks 1-12): Aristotle design vs. pre-Aristotle design (chart scale: average minutes of activity per user, 1 to >35)
  27. Completion Rates vs. Enrollment • Completion rates, census to course end (eduKan Ops Review)
  28. Advice & Lessons Learned • Plan, plan, plan! • Determine the textbook. • Map out your course. • Decide which learning objects you want to use. • Allow adequate time for delivery.
  29. Design Process • Consultation with textbook representative • Selection of textbooks • Work with instructional designer • Review the course upon delivery
  30. Research on My Labs Resources • Assets: interactive tutorials as a supplement to reading, definition pop-ups, note-taking capability • Podcasts • Simulations • Videos • Selection of My Labs resources
  31. Highlights • Fully Customized Course • Capitalization of Teaching Style • Completely Embedded Digital Content • "I Don't Have" Student Excuse Is Eliminated • Some books are now available via the iPad and Android tablet • Retention • Best of All Worlds in Resources (Subjective Opinion)
  32. Speed Bumps • Timeline: Approximately 3 Months • Repagination • Reading Electronic Book
  33. Lessons Learned • Slowly Integrate • Timeline Expectations • Valuable Project
  34. EDUKAN section analysis - Visuals • Node/Edge charts explained: • Thread interactivity • Nodes: users in threads • Node color is final grade at course end (red = <50%, green = >80%) • Blue node: instructor • Grey node: dropped user or zero-activity user • Node size represents total number of posts • Node location is influenced by node size as it relates to transitive edges (1) • Edges (lines) indicate connections • Edge weight is how often that connection is made • Edge color corresponds to direction: takes on source color • Content map • Nodes: feature type • Node color and size: how much that activity was engaged • Edges indicate connections between features within the course interaction • Edge weight is how often that connection is made • Edge color corresponds to direction: edge takes source color • (1) The positioning of a node is influenced by its size (total interactions) and the push and pull of total edges per node. Consider each node floating in space: each connection that leaves a node provides a push away from the destination node, and each connection to a node provides a pull toward the source node. The imbalance between those two influences, weighted by the size of the node, determines location.
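As a concrete, hypothetical illustration of how a thread-interactivity graph like the one described above could be assembled, the sketch below uses networkx. The post data and attribute names are invented, and the spring layout only approximates the push/pull positioning described in note (1).

```python
# Hedged sketch: building a thread-interactivity graph with networkx.
# The (author, replied_to) pairs below are invented sample data.
import networkx as nx

posts = [
    ("studentA", "instructor"),
    ("studentB", "studentA"),
    ("instructor", "studentB"),
    ("studentA", "studentB"),
]

G = nx.DiGraph()
for author, target in posts:
    if G.has_edge(author, target):
        G[author][target]["weight"] += 1   # edge weight = how often that connection is made
    else:
        G.add_edge(author, target, weight=1)

# Node size = total number of posts by that user.
post_counts = {u: sum(1 for a, _ in posts if a == u) for u in G.nodes}
nx.set_node_attributes(G, post_counts, "posts")

# A force-directed layout stands in for the push/pull positioning of note (1).
pos = nx.spring_layout(G, weight="weight")
print(pos)
```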
  35. EDUKAN section analysis - Visuals • Activity intensity explained: • Week number across the top; range is determined by course start and end date • Units and feature items across the vertical; features are listed by their containing units • Measures a combination of activity minutes and record insertions • Intended to reflect the intensity against a feature • Scale is given in minutes for clarity, but there is an underlying scoring to capture record inserts (1) • (1) The average count of record inserts is measured per feature, per user. If a user's inserts exceed the average, a 1.5 multiplier is applied. This only has an influence for threaded discussion, as an attempt to capture the number of posts, not just time on task.
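A minimal sketch of the scoring rule in note (1), assuming the 1.5 multiplier is applied to a user's activity minutes for a feature when their record inserts exceed the per-feature average; the slide does not give the exact formula, so this is an interpretation.

```python
# Assumed interpretation of the intensity score: minutes, boosted by 1.5x
# when a user's record inserts (e.g. discussion posts) exceed the feature average.
def intensity(minutes: float, inserts: int, avg_inserts: float) -> float:
    score = minutes
    if inserts > avg_inserts:
        score *= 1.5   # rewards posting activity, not just time on task
    return score

# Example: a discussion feature where the average user makes 4 posts.
print(intensity(minutes=22, inserts=6, avg_inserts=4))   # 33.0
print(intensity(minutes=22, inserts=3, avg_inserts=4))   # 22.0
```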
  36. EDUKAN section analysis - Visuals • Point accumulation plots explained: • Week number across the horizontal; range is determined by course start and end date • Cumulative points along the vertical; scale is determined by what is possible (weighted) • Each line represents a user in the course • Intended to expose common/different point accumulation patterns between users and courses
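A hedged sketch of how such a point-accumulation plot could be produced with matplotlib; the weekly point values are invented for illustration.

```python
# Cumulative points per user by week, one line per user (sample data only).
import matplotlib.pyplot as plt

weeks = list(range(1, 13))
weekly_points = {
    "steady user": [20, 25, 15, 30, 20, 25, 30, 20, 25, 30, 20, 40],
    "early fader": [20, 10, 0, 5, 0, 0, 0, 0, 0, 0, 0, 0],
}

for user, pts in weekly_points.items():
    cumulative = [sum(pts[: i + 1]) for i in range(len(pts))]
    plt.plot(weeks, cumulative, label=user)

plt.xlabel("Week number")
plt.ylabel("Cumulative points")
plt.legend()
plt.show()
```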
  37. EDUKAN section analysis - Visuals • Student survival plots explained: • Week number across the horizontal; range is determined by course start and end date • % of total activity (minutes only) on the y-axis • Each area represents a user in the course; the width of the line indicates what % of the user's total minutes for the course were earned during that week • Not intended to differentiate all users, only the users that dropped • Intended to measure improvement in student survival (how long dropped students stay engaged) • Red bars indicate points where user(s) dropped out • Blue line is the instructor's contribution
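And a small sketch of the underlying survival calculation: each user's weekly minutes expressed as a share of their total minutes for the course, so a dropped student's trailing-off engagement is visible. The minute values are invented.

```python
# Weekly activity as a percentage of each user's course total (sample data).
weekly_minutes = {
    "dropped_user": [120, 90, 40, 10, 0, 0, 0, 0, 0, 0, 0, 0],
    "completer":    [60, 70, 65, 80, 75, 60, 70, 65, 80, 75, 60, 90],
}

for user, minutes in weekly_minutes.items():
    total = sum(minutes)
    share = [round(100 * m / total, 1) for m in minutes]
    print(user, share)   # percent of the user's total activity earned each week
```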
  38. Thank You! Ms. Julie Ricke, eduKan, julier@edukan.org
