- 7,000 courses in Fall and Spring semesters in total, but only ran 4,500 last semester
- Students are around the world and “go to VT”… Sakai is used at remote campuses like our campus in Northern VA and South Carolina (VCOM)… and EvalSys is used at Wake Forest for students in a program taught by VT
- Turned off Blackboard in Fall of 2010, but had started piloting Sakai for projects in 2006
- Scholar is our local name for Sakai, and we will probably use the two interchangeably
- Scholar is used by all colleges at VT / the evaluation system is used by everyone other than the Vet School (plans to add them in Fall 2012)
- Campuses in Boston, Medford, Grafton
- Arts & Sciences, Engineering, Medical, Dental, Vet School
- Sakai @ Phase 2 of a 3-phase project, bringing on A&S, Eng, Fletcher (law), Friedman; the third phase may involve the Medical School, Dental School, Vet School
- A&S, Eng, Fletcher (law), Friedman, Vet, Medical, Dental, Tisch community service
- Several years ago, started a project to put evals online using a system from Columbia based upon ColdFusion (used partially through 2011)
- Lots of different systems at VT, which gets very confusing in terms of where data lives and how to access reports
- Needed a system that could scale and provide flexibility for disparate colleges with different goals and questions
- We weren’t sure what response rates to expect, but set a 70% target at the start (without incentives; completing these is totally voluntary at VT)
- 3 levels in the hierarchy – VT-wide (12 core questions, developed by the aforementioned task force); we allow each college/department combination to add up to 8 more questions from a menu
- We do not allow instructors to add their own questions
- In Spring 2012 we went to nearly full production (other than the Vet School) for the first time, as opposed to being in pilot mode
- Student feedback has been extremely favorable; students just seem to find this logical and appropriate
- In the chart above, you can see the percentage of responses we got by day. We’ve changed the number of days slightly each time (13, 15, 16) – standardized on 15 going forward for Fall/Spring
- You can clearly see the “reminder effect” here – we send reminders on odd days, and if you look at one color, like blue, you can see a spike every other day, which lasts the entire period
- As said before, we weren’t sure how response rates would turn out, but we have been in the high 60% range each time – we attribute this to:
  - Emailing reminders to students often (though we feel this is a bit aggressive)
  - Asking instructors to reinforce the importance of this with students and letting them know results are used for promotion/tenure and for improving courses
  - Exposing college admins to the data throughout the cycle by sending out regular response rate reports
- We don’t know if this will be sustained, or whether rates will drop when the newness wears off and we will need more incentives
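The reminder-effect pattern described above can be illustrated with a small sketch. All numbers here are made up for illustration (the real per-day data lives in the chart); the sketch just shows cumulative response rate over a 15-day window where reminder emails go out on odd days and produce visible spikes.

```python
# Hypothetical daily response counts for one evaluation window.
# Odd days (reminder days) spike relative to the following even day.
daily_responses = [850, 300, 800, 270, 750, 240, 700, 220,
                   650, 200, 600, 180, 550, 160, 500]  # made-up numbers
total_requests = 10_000  # made-up total

cumulative = 0
for day, n in enumerate(daily_responses, start=1):
    cumulative += n
    reminder = " (reminder sent)" if day % 2 == 1 else ""
    print(f"day {day:2d}: +{n:3d} responses, "
          f"{cumulative / total_requests:5.1%} cumulative{reminder}")
```

With these illustrative numbers the window ends just under 70%, in line with the high-60s rates reported in the talk.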
- The “form” was going to be revamped regardless of our system, but we decided to do them jointly given the timing
- For the ramp-up, we made changes each semester after surveying teaching faculty in the newly added colleges
- We have a paper with “myths” documented (like it’s a popularity contest, or that results change a good bit if you do evaluations on the last day of class vs. two weeks earlier) – many instructors really want to know why we made some of the implementation decisions we made
- Admin interface needs work to support many end users
- Make policy decisions and communicate them – TAs
- Develop solutions – studentless/instructorless courses
- Understand and *influence* institutional data – cross-listed and team-taught courses; the project can be a catalyst for change
- Pilot pace – can be dictated by external factors; there are more knowns now than several years ago, and more partners are comfortable with online
- The course integration model can influence the eval group/hierarchy integration strategy
- One template is good if there are many overlapping questions with a need for granular customization | Several templates are good if units don’t have many overlapping questions but still standardize items at a high level
- Separate instance – helpful for performance and scale issues | Primary instance – acceptable at smaller scale, provides convenience
- In general, the system is “done” (but we all know it’s never truly done)… we still have some items we want to tackle, like those above
Transcript of "Sakai_EvalSys_VT&Tufts"
Will Humphries, Senior Developer @ Tufts
Brian Broniak, Director of Online Learning & Collaboration Services (OLCS) @ Virginia Tech
June 10–15, 2012
Growing Community; Growing Possibilities
Background information ◦ Both schools and Sakai
The VT story from idea through full production
The Tufts story from mandate to current state
Goal: Share ideas for a successful implementation of EvalSys at your university
2012 Jasig Sakai Conference
VT is in Blacksburg, VA – a college town
30,000 students globally, 7,000 courses
Most students are in Blacksburg, but students are global
4 semesters each year – Fall/Spring, Summer I and II
Very decentralized; hard to do “enterprise”
SunGard Banner as our SIS
Sakai 2.8.1 as of Spring 2012
Scholar: fully a Sakai school since 2010, informally for much longer
Tufts is around Boston, MA
10,000 students, approx. 5,000 undergrad
Sakai for 1 year, 3 of 8 colleges ◦ it’s called Trunk
Moving to PeopleSoft Campus Solutions 9.0
Sakai 2.8.1, eval 1.4
Using paper OpScan forms since the 70s
ColdFusion-based system, first try online ◦ Decided it would not meet needs at scale, so started a tool for Sakai – EvalSys today
Task force(s) on the process and form, in parallel with work on the system
Another system for distance learning (IDDL) at VT (developed in-house)
Decentralized – some online, some paper
A&S and Eng. motivated to leave paper ◦ Cost savings from a centralized service
Evals for all Tufts Trunk adopters Fall 2012 ◦ 30,000 potential responses from 2,500 courses
Evals in the LMS were a selling point for the Trunk project
Provide “one” system to unify this data – replace the ColdFusion, paper, and IDDL systems
Provide robust reporting
Garner sustained 70%+ response rates
Provide a “good” student and instructor experience
Our approach:
◦ Ramp up slowly to allow departments to adjust (a 5-semester ramp-up)
◦ Communicate, communicate, and communicate
EvalSys 1.3 with local changes
We use the hierarchy with 3 levels – VT-wide, college, and department
Email reminders to each course section every 2 days
All scaled questions on a 6-point scale
We have an external data warehouse and a Jaspersoft reporting interface
We purge the data each semester after putting copies of reports into Scholar personal workspaces and into our warehouse
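The 3-level question hierarchy can be sketched as a small data model. This is illustrative only – the class, method, and question names below are hypothetical and not EvalSys's actual API – but it captures the policy from the talk: 12 fixed VT-wide core questions, up to 8 more picked from an approved menu per college/department, and no instructor-added items.

```python
# Hypothetical model of the question hierarchy: core + menu picks only.
from dataclasses import dataclass, field

MAX_DEPT_QUESTIONS = 8  # menu picks allowed per college/department combo

@dataclass
class EvalTemplate:
    core: list                        # 12 VT-wide questions, fixed for everyone
    dept_menu: list                   # approved menu the unit may choose from
    dept_picks: list = field(default_factory=list)

    def add_dept_question(self, q):
        if q not in self.dept_menu:
            raise ValueError("question not on the approved menu")
        if len(self.dept_picks) >= MAX_DEPT_QUESTIONS:
            raise ValueError("department question limit reached")
        self.dept_picks.append(q)

    def questions(self):
        # Instructors cannot add their own items, so the final form is
        # simply core questions plus the department's menu picks.
        return self.core + self.dept_picks

core = [f"VT core Q{i}" for i in range(1, 13)]          # placeholder text
menu = ["Lab equipment was adequate", "Field work was well organized"]
t = EvalTemplate(core=core, dept_menu=menu)
t.add_dept_question("Lab equipment was adequate")
print(len(t.questions()))  # 12 core + 1 pick = 13
```

Enforcing the menu at the template level is what keeps the instrument comparable across colleges while still allowing unit-level customization.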
In Spring 2012 we finally ran this VT-wide!
◦ 4,500 courses, 125,000 requests with 85,000 completed = 69% response rate
Students like it far better; faculty feedback has been mixed
Response rates in the 67–70% range each semester, plus a definite “reminder effect” (see chart)
Asked two questions of a random sampling of students over all 5 semesters
◦ “Compared to the evaluation form used in the past, the items on this form allow me to provide a better evaluation of this course/instructor.”
  22,697 responses – 87% favorable (somewhat agree, agree, or strongly agree); 61% agree or strongly agree
◦ “I prefer filling out the evaluation form on-line rather than filling out the paper-and-pencil form in class.”
  23,017 responses – 87% favorable (somewhat agree, agree, or strongly agree); 77% agree or strongly agree
Need to get chart here from data Steve is working on
Major org challenges, especially at decentralized universities
◦ Get an “owner” outside of IT – our Office of Assessment and Evaluation has been a great partner
Focus on the form, and go ahead and change scales earlier rather than later
Ramp up slowly; it gives you time to refine the system and gives everyone time to adjust
Build a warehouse (lots of valuable data) – do your best to gather report requirements
Work through the change curve – myths, rationale, etc.
Centralized administration
Identify edge cases early
◦ Cross-listed courses
◦ Team-taught courses
◦ TAs / non-official instructors
◦ Studentless/instructorless courses
External reporting? Still in the works
Gradual pilot vs. rapid deployment
CM API (evalgroup) vs. site-based (evalproviders)
One template vs. several
Separate instance vs. primary LMS instance
Consolidated emails
Exposing response rates in real time
Deal with edge cases – non-traditional schedules and GAs
Get back to the Sakai baseline
Contact information:
Brian Broniak – firstname.lastname@example.org
Will Humphries – email@example.com