What Do Users Say? Findings from the Multi-Institutional Survey Initiative (MISI)

Panel presentation at the 10th Sakai Conference in Boston, MA (July 2009)

This panel session will present an overview of preliminary data from the Sakai Multi-Institutional Survey Initiative (MISI), including perspectives and lessons learned from individual participating institutions. Initial trends, similarities, and differences between institutions regarding instructor and student responses will be discussed.

Comments:
  • I disagree on the IRB approval recommendation. This is not research in the IRB sense. It is an internal IT organization trying to improve its work. We are allowed to do such development activity, and even share the results, without going through IRB approval. However, we should of course follow good human-subjects principles, such as only sharing de-identified data. If we keep asking for IRB approval when we don't need it, soon we will need it for everything!
  • Mark,
    In my opinion, the reasons we recommend that institutions apply for IRB approval or exemption are three-fold:
    1) Going through the process signals to the institution's research community, as well as its faculty, that this is serious, validated research: not only about improving IT, but about understanding how instructors and students perceive, use, and feel about technology.
    2) Depending on the institution, they may want to follow up their survey with focus groups, interviews, event-log analysis, etc. Such activities may fall under the purview of IRB organizations, and it is incumbent upon those conducting them to obtain approval or exemption beforehand.
    3) Finally, simply 'following good human subjects principles' is not always entirely clear, even to those with the best intentions and seemingly benign survey questions. An institution's IRB is a service provided to the university to develop and disseminate the best possible research as well as to safeguard subjects' rights and welfare.
  • Steve,
    I understand your reasons, and I often seek IRB approval myself because I am a researcher as well as a developer. But I don't find your reasons compelling.
    1) Not sure why this matters.
    2) I don't believe the activities you list need fall under IRB purview any more than surveys do, if they are for development purposes.
    3) Maybe, but I would never use that as a reason. It's too heavy-handed.
    As I said in my first comment, there is an important difference between product development and academic research (not in rigor, but in purpose). For many development groups, IRB approval processes are daunting enough (or at least perceived as such) that they simply wouldn't gather user input if they had to go through IRB first. I hate to put barriers between developers and users unnecessarily. That's really my core motivation for making these comments. If, in your group, seeking and gaining IRB approval isn't a barrier, then that's great.
    I do recognize the importance and value of IRBs. I'm not one of those anti-IRB people. Their work is very important. For me it's a bit like copyright: if people ask permission to use something (or worse, pay for reuse of something) that is covered by the fair-use provision, it erodes fair use.

Transcript:

  1. What Do Users Say? Findings from the Multi-Institutional Survey Initiative (MISI). Steven Lonn & Stephanie D. Teasley, University of Michigan; Stephanie Conley & Yitna Firfyiwek, University of Virginia; Mary Glackin, Mt. Holyoke College; Keli Amann, Stanford
  2. Panel Overview
     • Organization, Survey Items, and Logistics
     • Preliminary Overall Findings
     • Tales from the Trenches:
       • Contrasting faculty / student experiences; using suggested improvements
       • The small-college experience with Sakai collaboration
       • Major themes found; follow-up questionnaire
       • International perspective: findings and lessons learned
       • Acting on survey results; lessons & modifications
  3. Participating Institutions
     • Bradley University
     • Georgia Institute of Technology
     • Marist College
     • Mount Holyoke College
     • Rice University
     • Rutgers University
     • Stanford University
     • Texas State University - San Marcos
     • Universidad Politécnica de Valencia
     • University of Michigan
     • University of Virginia
     • University of Windsor
     • University of Wyoming
  4. Other Participating Institutions
     • Charles Sturt University
     • Columbia University
     • Universidade Fernando Pessoa
     • University of California, Berkeley
     • University of Limerick
  5. Starting the Initiative
     • Discussions with other institutions at Sakai conferences
       • First multi-institutional panel: 2006 in Vancouver
     • Open email invitation sent out Dec. 2008 after the Virginia Tech regional conference
  6. Agreeing on Core Survey Items
     • Conference calls to discuss scope, logistics, etc.
     • Voting on Michigan survey items via Confluence
     • Wording, order, and core vs. optional items discussed in a second set of conference calls
     • Individual email questions
  7. Data Collection & Analysis
     • Surveys individually administered by each participating institution
       • Some offered incentives, some did not
     • Average survey availability: 23 days (min: 12 days; max: 39 days)
     • Data uploaded to Michigan's Sakai implementation
     • Michigan volunteered to combine & analyze quantitative data
     • Full combined data set available to all MISI institutions
  8-16. Sakai Implementation (one chart, built up across nine slides; counts are numbers of MISI institutions)
     • Sakai version: Sakai 2.4 (3); Sakai 2.5 (11)
     • Usage policy: some / all departments require Sakai use (4); Sakai use optional (10)
     • LMS landscape: Sakai is the only LMS in use (6); other LMS also in use (8)
     • Time running Sakai: less than 1 year (1); 1-2 years (6); 3-4 years (2); 5+ years (3)
  17. Sample Information
     • Majority of courses conducted face-to-face (80%) or in "blended" (17%) formats
     • Primarily "large" institutions (9); some "medium" institutions (4); one "small" institution
  18. Survey Respondents by Institution
     Michigan, Ann Arbor 30%; Michigan, Dearborn 12%; Texas State 8%; Valencia 8%; Mt. Holyoke 7%; Marist 7%; Rice 6%; Virginia 6%; Bradley 5%; Rutgers 5%; Stanford 3%; Georgia Tech 2%; Windsor 1%; Wyoming 1%
  19. Survey Respondents by Role
     • 2,962 instructors (14 institutions; average response rate 20%; min 6%, max 40%)
     • 7,513 students (13 institutions; average response rate 12%; min 1%, max 29%)
  20. Instructors: Years Teaching
     Q: How many years have you been an instructor/faculty in higher education?
     1 year or less: 12%; 2-5 years: 25%; 6-10 years: 19%; 11-20 years: 21%; 21-30 years: 14%; more than 30 years: 8%
  21. Students: Year in Program
     Q: What is your year in your program?
     1st-year undergraduate: 17%; 2nd-year undergraduate: 18%; 3rd-year undergraduate: 20%; 4th-year (or more) undergraduate: 22%; Masters student: 16%; doctoral student: 7%
  22. Use of / Preference for IT in Courses
     Q: Which of the following best describes your use / preference of information technology in your courses?
     [Bar chart: instructor and student responses across None / Limited / Moderate / Extensive / Exclusive]
  23. How Much is Sakai Being Used?
     Q: For how many different courses have you used Sakai?
     [Bar chart: instructor and student responses across None / 1-2 Courses / 3-6 Courses / 7-10 Courses / 11+ Courses]
  24. Activities Within Sakai
     • Different activities can be accomplished in a variety of ways within Sakai
     • 28 different activities rated: are these activities "valuable"?
     • 5-point Likert scale: Strongly Disagree (1) to Strongly Agree (5)
     • In analysis, activities categorized as "Materials Management" (13 activities) or "Interactive Teaching / Learning" (15 activities)
  25. Activities Within Sakai
     • Materials Management activities: Instructors 4.34 (min 3.52; max 4.64); Students 4.18 (min 3.65; max 4.50)
       • Highest rated activity: post / access online readings & supplementary materials
     • Interactive Teaching / Learning activities: Instructors 3.81 (min 2.57; max 4.11); Students 3.72 (min 2.97; max 4.24)
       • Highest rated activity: students turning in assignments online
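The category summaries on slide 25 are plain aggregates of the Likert responses. The sketch below shows one way such a summary could be computed from a long-format response table; the column names and toy rows are illustrative assumptions, not the panel's actual data or pipeline.

```python
# Minimal sketch (not the panel's actual pipeline) of computing
# category-level means from long-format Likert survey data.
import pandas as pd

responses = pd.DataFrame([
    # one row per respondent x activity: a 1-5 rating of "valuable"
    {"role": "Instructor", "category": "Materials Management",
     "activity": "Post / Access a syllabus", "rating": 5},
    {"role": "Instructor", "category": "Interactive Teaching / Learning",
     "activity": "Students ask questions DURING lecture", "rating": 2},
    {"role": "Student", "category": "Materials Management",
     "activity": "Post / Access a syllabus", "rating": 4},
])

# Mean rating for each activity, split by category and role.
per_activity = responses.groupby(["category", "role", "activity"])["rating"].mean()

# Category-level view: overall mean plus min/max across activities,
# matching the "4.34 (min 3.52; max 4.64)" style reported above.
summary = per_activity.groupby(level=["category", "role"]).agg(["mean", "min", "max"])
print(summary.round(2))
```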
  26. Tools Within Sakai
     • Activities happen within tools
     • Included tools in use at two-thirds or more (9 out of 14) of the MISI institutions
     • 18 tools total
     • In analysis, tools categorized as "Materials Management" (10 tools) or "Interactive Teaching / Learning" (8 tools)
  27. Tools Within Sakai
     • Materials Management tools: Instructors 4.30 (min 3.45; max 4.67); Students 4.07 (min 3.35; max 4.39)
       • Highest rated tool: Resources; lowest rated tool: News
     • Interactive Teaching / Learning tools: Instructors 4.05 (min 3.23; max 4.41); Students 3.82 (min 3.11; max 4.36)
       • Highest rated tool: Assignments; lowest rated tools: Wiki / Polls
  28-29. Carnegie Classifications (the same table appears on both slides)

     Institution           Research?   Enrollment
     Bradley               No          Very High Undergraduate
     Georgia Tech          Yes         High Undergraduate
     Marist                No          Very High Undergraduate
     Michigan, Ann Arbor   Yes         Majority Undergraduate
     Michigan, Dearborn    No          High Undergraduate
     Mount Holyoke         No          Very High Undergraduate
     Rice                  Yes         Majority Undergraduate
     Rutgers               Yes         High Undergraduate
     Stanford              Yes         Majority Undergraduate*
     Texas State           No          High Undergraduate
     Valencia              Yes         High Undergraduate
     Virginia              Yes         Majority Undergraduate
     Windsor               Yes         High Undergraduate
     Wyoming               Yes         High Undergraduate
  30. Non-Research vs. Research Differences: Materials Management Activities
     (mean ratings; * = significant, NS = not significant)
                                   Instructors   Students   Significant?
     Non-Research Institutions     4.47          4.20       *
     Research Institutions         4.30          4.17       *
     Significant?                  *             NS
  31. Non-Research vs. Research Differences: Interactive Teaching / Learning Activities
                                   Instructors   Students   Significant?
     Non-Research Institutions     3.96          3.81       *
     Research Institutions         3.75          3.65       *
     Significant?                  *             *
  32. Non-Research vs. Research Differences: Materials Management Tools
                                   Instructors   Students   Significant?
     Non-Research Institutions     4.43          4.04       *
     Research Institutions         4.26          4.09       *
     Significant?                  *             *
  33. Non-Research vs. Research Differences: Interactive Teaching / Learning Tools
                                   Instructors   Students   Significant?
     Non-Research Institutions     4.24          3.84       *
     Research Institutions         3.99          3.79       *
     Significant?                  *             *
  34. Non-Research vs. Research Differences
     • Participants from Non-Research institutions more strongly agreed that activities & tools were valuable than participants from Research institutions
     • Instructors > Students
     • Materials Management > Interactive Teaching / Learning
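Slides 30-34 flag each comparison as significant (*) or not (NS) without naming the statistical test. As a hedged illustration, a two-sample t-test over individual ratings is one plausible way such flags could be produced; the ratings below are fabricated stand-ins, not MISI data.

```python
# Hedged sketch: the deck does not say which test produced its "*" / "NS"
# flags. A two-sample (Welch's) t-test is one plausible choice.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
non_research = rng.normal(loc=4.47, scale=0.6, size=800)   # fabricated individual ratings
research = rng.normal(loc=4.30, scale=0.6, size=2100)

t, p = stats.ttest_ind(non_research, research, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f} -> {'*' if p < 0.05 else 'NS'}")
```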
  35-36. Carnegie Classifications (the table from slides 28-29, repeated to set up the enrollment comparison)
  37. Enrollment Level Differences: Materials Management Activities
     (mean ratings; * = significant, NS = not significant)
                                 Instructors   Students   Significant?
     Very High Undergraduate     4.25          4.16       NS
     High Undergraduate          4.42          4.18       *
     Majority Undergraduate      4.30          4.19       *
     Significant?                *             NS
  38. Enrollment Level Differences: Interactive Teaching / Learning Activities
                                 Instructors   Students   Significant?
     Very High Undergraduate     3.86          3.72       NS
     High Undergraduate          3.87          3.81       NS
     Majority Undergraduate      3.76          3.65       *
     Significant?                *             NS
  39. Enrollment Level Differences: Materials Management Tools
                                 Instructors   Students   Significant?
     Very High Undergraduate     4.25          3.87       *
     High Undergraduate          4.38          4.18       *
     Majority Undergraduate      4.26          4.09       *
     Significant?                *             *
  40. Enrollment Level Differences: Interactive Teaching / Learning Tools
                                 Instructors   Students   Significant?
     Very High Undergraduate     4.01          3.68       *
     High Undergraduate          4.10          3.91       *
     Majority Undergraduate      4.03          3.82       *
     Significant?                NS            *
  41. Enrollment Level Differences
     • Participants from High Undergraduate institutions more strongly agreed that activities & tools were valuable than participants from other institutions
     • Instructors > Students
     • Materials Management > Interactive Teaching / Learning
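For the three-group enrollment comparison on slides 37-41, a one-way ANOVA is the natural analogue of the two-group test sketched earlier; again, the choice of test is an assumption and the data are fabricated.

```python
# Hedged sketch: comparing three enrollment groups with a one-way ANOVA
# (the deck does not name its test). Ratings are fabricated stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
very_high = rng.normal(loc=4.25, scale=0.6, size=400)
high = rng.normal(loc=4.42, scale=0.6, size=900)
majority = rng.normal(loc=4.30, scale=0.6, size=1600)

f, p = stats.f_oneway(very_high, high, majority)
print(f"F = {f:.2f}, p = {p:.4f} -> {'*' if p < 0.05 else 'NS'}")
```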
  42. Tales From the Trenches: University of Virginia. Stephanie Conley & Yitna Firfyiwek
  43. Contrasting Experiences
     • Students view themselves as more advanced
     • Students solve Sakai problems on their own
     • Students believe faculty need more training
     • Students see both problematic and ...
  44. Something New about Faculty / Students
     • What this means to us:
       • Faculty comfort in, and "ownership" of, their Sakai environments is not only a critical element but also one that demands a different sort of support
       • Students' growing understanding of Sakai will impact Sakai support in the future
       • Students can become powerful allies in ...
  45. Addressing Suggested Improvements
     • Faculty support:
       • Move from tool-oriented training to pedagogically oriented faculty development
       • Augmenting support with course-development opportunities through our Teaching Resource Center
       • Differentiating faculty needs by level (minimum, moderate, advanced) and providing support accordingly
     • Prioritizing a dynamic infrastructure:
       • Support integration of external applications
  46. Lessons & Benefits from MISI
     • Lessons:
       • We need to give our community more incentives to increase the response rate
       • Challenge of making a general survey meaningful for our institutional culture
     • Benefits:
       • Jump-starting our survey process
       • Facilitating IRB exemption
       • The power of collaborating with peer institutions
  47. Tales From the Trenches: Mount Holyoke College. Mary Glackin
  48. Mount Holyoke's Comments
     • Students: "Every faculty member should be FORCED to use Sakai for every class."
     • Faculty: "Don't make it required!"
  49. Mount Holyoke College Says
     • Collaboration has allowed us to conduct an extensive survey of our user community
     • Sharing results with support staff is promoting a better understanding of Sakai's value to our community
     • Beginning conversations with faculty about sharing their ...
  50. Tales From the Trenches: Stanford University. Keli Amann
  51-54. Stanford University (built up across four slides)
     • Keli Amann, User Experience Specialist (kamann@stanford.edu)
     • Participants in all 7 schools: ~200 of 1,600 instructors; ~100 of 18,000 students
  55-62. Three Major Themes (built up across eight slides)
     • Visuals matter
       • Students are sensitive to aesthetics
       • Instructors don't only care about text content
     • Don't make me visit if I don't have to
     • Tools may not correspond to actual needs
       • "I want to create pages" != Wiki tool
       • "I want students to discuss" != Forum, ?= BlogWow
  63. Follow-up Pilot Questionnaire
     1. What were your hopes and expectations for this tool when you heard about it?
     2. Did you actually use the tool? If so, please describe the context. If not, why not?
     3. What problems did you, the TAs, or the students encounter? What workarounds were you forced to use?
     4. Despite problems, would you choose to use it again? Why?
     5. If you would not use the tool again: (a) what must we fix before you would reconsider it? (b) what will you do instead?
     6. What would you tell a colleague about this tool?
  64. Tales From the Trenches: Universidad Politécnica de Valencia. Raúl Mengod López
  65. Introduction: UPV and MISI
     • About UPV:
       • 40,000 students, 2,600 faculty
       • Sakai activated by default for all subjects since 2006
       • Faculty are free to use any other tool
     • MISI data:
       • Last general survey in 2006
       • Survey sent to 6,960 students and 1,200 teachers
  66. Survey Conclusions
     • Sakai is the main tool used
     • General satisfaction rose to 89%
     • Mobile devices neither used nor valued (18% / 27%)
     • High value placed on video and multimedia (68% / 68%)
     • Collaborative tools not highly valued
     • The best-valued tools are the most-used tools
  67. Lessons Learned
     • Usability is still Sakai's main problem
     • Sakai has a steep learning curve
     • From students:
       • They demand more intensive use of the tool by their teachers
       • They can distinguish where the shortcoming lies (platform or teachers)
     • From teachers:
       • They don't value new technologies such as social networks or mobile devices
       • They appreciate Sakai's features more after using them
  68. Tales From the Trenches: Marist College. Brian Dashew
  69. Marist College Survey
     • Deployed using the Evaluation System tool
     • 5,804 students (FT, PT, residential, distance, etc.); 638 responses (11% return)
     • 562 instructors (FT, PT, adjunct, etc.); 79 responses (14% return)
  70. Marist Students and Faculty
     • Our second survey in three years
     • Significantly higher turnout
     • Potential reasons: possible reward; care about messaging / not overwhelming users; commitment to Sakai; MISI
  71. Acting on Results
     • Still reviewing data (Jim Regan and a graduate student)
     • 2.6 upgrade briefing
     • Presentations and publications
     • General trends of note: overall satisfaction; social-networking connection desired
  72. Lessons and Modifications
     • Altered survey order (seemed to work!)
     • Evaluation System email settings
     • Shifting demographics
     • Is there a way to shorten the survey?
  73. You Too Can Participate!
     • All MISI core survey items & institution information are available to the entire Sakai community: http://confluence.sakaiproject.org/x/BoBF
     • Additional data welcome
     • Human Subjects (IRB) approval recommended
  74. Next Steps
     • What do potential Sakai implementers want to know from this data?
     • What do current Sakai implementers want to know from this data?
     • What other ways can this data be analyzed?
     • Expanding MISI in future iterations
     • Others?
  75. Materials Management Activities
     (Format for slides 75-78: item: Instructors % use, mean rating; Students % use, mean rating; significance of the difference, p-value or NS)
     • Post / Access a syllabus: Instructors 96.50%, 4.56; Students 98.30%, 4.49; sig 0.001
     • Send / Receive messages or notifications: Instructors 94.20%, 4.56; Students 95.70%, 4.27; sig 0.000
     • Post / Access online reading & supp materials: Instructors 93.50%, 4.64; Students 97.20%, 4.50; sig 0.000
     • Provide / Use single access point for materials: Instructors 86.00%, 4.46; Students 91.50%, 4.19; sig 0.000
     • Publish / Access public course description: Instructors 85.00%, 4.23; Students 85.20%, 3.74; sig 0.000
     • Post / Access lecture outline AFTER lecture: Instructors 78.10%, 4.32; Students 92.40%, 4.41; sig 0.000
     • Post / Access grades: Instructors 74.40%, 4.20; Students 90.10%, 4.23; NS
     • Post / Access lecture outline BEFORE lecture: Instructors 73.60%, 4.24; Students 87.40%, 4.22; NS
     • Post / Access sample exams & quizzes: Instructors 67.10%, 4.21; Students 85.20%, 4.22; NS
     • Post / Access multimedia materials: Instructors 65.70%, 4.12; Students 77.70%, 3.96; sig 0.000
     • Construct / View calendar / schedule of events: Instructors 63.30%, 3.68; Students 80.70%, 3.76; sig 0.008
     • Provide / Use structure to sequence or scaffold activities: Instructors 55.20%, 3.68; Students 70.10%, 3.65; NS
     • Post / Access audio/video lecture recording: Instructors 41.80%, 3.52; Students 64.30%, 3.67; sig 0.000
     • Materials Management Activities - Combined: Instructors 75.00%, 4.34; Students 85.80%, 4.18; sig 0.000
  76. Interactive Teaching & Learning Activities
     • Students turn in assignments online: Instructors 63.80%, 4.11; Students 88.60%, 4.24; sig 0.000
     • Return / Receive assignments w/ comments & grade: Instructors 57.40%, 3.76; Students 79.60%, 3.80; NS
     • Students access library resources / research help: Instructors 55.10%, 3.87; Students 73.40%, 3.80; sig 0.023
     • Monitor / Observe student progress or engagement: Instructors 51.50%, 3.43; Students 63.40%, 3.31; sig 0.001
     • Give / Take exams & quizzes: Instructors 49.20%, 3.56; Students 69.70%, 3.60; NS
     • Students provide course / lecture feedback: Instructors 48.00%, 3.62; Students 70.70%, 3.69; NS
     • Students work together on task / assignment: Instructors 47.60%, 3.65; Students 69.20%, 3.54; sig 0.005
     • Students ask questions BEFORE lecture: Instructors 44.90%, 3.53; Students 70.70%, 3.64; sig 0.003
     • Students generate / share instructional materials: Instructors 44.70%, 3.59; Students 68.10%, 3.68; sig 0.033
     • Support distance learning: Instructors 44.10%, 3.72; Students 66.50%, 3.64; sig 0.031
     • Students ask questions AFTER lecture: Instructors 43.20%, 3.62; Students 66.40%, 3.82; sig 0.000
     • Students read / comment on each other's work: Instructors 42.10%, 3.55; Students 63.30%, 3.50; NS
     • Create / Part of ad-hoc student groups / teams: Instructors 41.90%, 3.32; Students 55.50%, 3.34; NS
     • Hold / Visit online office hours: Instructors 38.50%, 3.03; Students 55.80%, 3.29; sig 0.000
     • Students ask questions DURING lecture: Instructors 31.40%, 2.57; Students 55.30%, 2.97; sig 0.000
     • Interactive Activities - Combined: Instructors 46.90%, 3.81; Students 67.70%, 3.72; sig 0.000
  77. Materials Management Tools
     • Announcements: Instructors 93.60%, 4.61; Students 97.40%, 4.34; sig 0.000
     • Syllabus: Instructors 91.70%, 4.50; Students 96.30%, 4.36; sig 0.000
     • Resources: Instructors 89.70%, 4.67; Students 91.50%, 4.39; sig 0.000
     • My Workspace: Instructors 71.20%, 4.06; Students 75.10%, 3.79; sig 0.000
     • Gradebook: Instructors 64.60%, 4.14; Students 85.20%, 4.13; NS
     • Schedule: Instructors 64.40%, 3.99; Students 79.10%, 3.80; sig 0.000
     • Drop Box: Instructors 56.50%, 3.81; Students 76.60%, 3.80; NS
     • Web Content: Instructors 52.40%, 3.99; Students 57.30%, 3.68; sig 0.000
     • News: Instructors 38.60%, 3.45; Students 58.00%, 3.35; sig 0.016
     • Modules: Instructors 38.00%, 3.93; Students 55.50%, 3.71; sig 0.000
     • Materials Management Tools - Combined: Instructors 66.10%, 4.30; Students 77.20%, 4.07; sig 0.000
  78. Interactive Teaching & Learning Tools
     • Assignments: Instructors 82.10%, 4.41; Students 96.60%, 4.36; sig 0.015
     • Mail Tool: Instructors 71.80%, 4.37; Students 64.10%, 3.40; sig 0.000
     • Email Archive: Instructors 57.40%, 3.96; Students 66.00%, 3.73; sig 0.000
     • Tests & Quizzes: Instructors 53.60%, 3.86; Students 78.30%, 3.90; NS
     • Chat Room: Instructors 42.70%, 3.08; Students 68.20%, 3.26; sig 0.000
     • Forums: Instructors 36.00%, 3.39; Students 61.10%, 3.44; NS
     • Polls: Instructors 31.40%, 3.29; Students 53.00%, 3.11; sig 0.000
     • Wiki: Instructors 31.20%, 3.23; Students 51.60%, 3.30; NS
     • Interactive Tools - Combined: Instructors 50.80%, 4.05; Students 67.40%, 3.82; sig 0.000
