College America Grant Reports: Final Evaluation

Completion Innovation Challenge Grant Evaluation

Evaluation of the Completion Innovation Challenge Grant
Prepared by: JVA Consulting, LLC
September 2012
Table of Contents

List of Figures
List of Tables
Executive Summary
Methodology
Findings
Conclusion
Appendix A: Student Survey
Appendix B: Faculty Survey
Appendix C: Faculty Interview Guide

Prepared by JVA Consulting for Complete College Colorado, September 2012
List of Figures

Figure 1. Gender for the Entire Sample (n = 1,527)
Figure 2. Race by Group for the Entire Sample (n = 1,527)
Figure 3. Gender for Survey Data (n = 153)
Figure 4. Gender by Group for Student Survey Respondents (n = 153)
Figure 5. Hours Worked Per Week During the Semester for Student Survey Respondents (n = 153)
Figure 6. Relationship Status of Survey Respondents (n = 153)
Figure 7. Faculty Perception of Open Entry-Exit Math Labs Compared to a Traditional Format (n = 7; ACC = 1, PPCC = 6, TSJC = 0)
Figure 8. Faculty Preference for the Continuation of Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0)
Figure 9. Faculty Perception of Accelerated and Compressed Courses Compared to a Traditional Format (n = 5; CCA = 0, CCD = 0, FRCC = 5, LCC = 0)
Figure 10. Faculty Preference for the Continuation of Accelerated and Compressed Courses (n = 5; FRCC = 5, LCC = 0)
Figure 11. Faculty Perception of Modularized Courses With Diagnostic Assessments Compared to a Traditional Format (n = 3; MCC = 1, NJC = 0, PCC = 2)
Figure 12. Faculty Preference for the Continuation of Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2)
List of Tables

Table 1. Overview of Math Labs
Table 2. Overview of Accelerated, Compressed, Contextualized and Mainstreaming
Table 3. Overview of Online Hybrid Classes
Table 4. Overview of Modularization and Diagnostic Assessments
Table 5. Percentage Latino and Not Latino for Entire Sample (n = 1,527)
Table 6. Percentage Latino and Not Latino for Survey Data (n = 153)
Table 7. Mean (SD) Age, Number of Children Under 18 and Number of Children Under 18 Living with Respondent for Student Survey Data (n = 153)
Table 8. General Satisfaction Measures (n = 153)
Table 9. Student Perception on Indicators of Institutional Quality (n = 153)
Table 10. Student Ratings of Barriers to Retention (n = 153)
Table 11. Correlation Between Barriers to Retention and Course Completion and Self-Reported Expectation to Continue College (n = 153)
Table 12. Comparison of the Characteristics of the Control and Innovation Groups for Open Entry/Exit Math Labs
Table 13. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Open Entry/Exit Math Labs
Table 14. Process Measures for Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0)
Table 15. Overview of Math Labs
Table 16. Comparison of the Characteristics of the Control and Innovation Groups for Accelerated, Compressed, Contextualized and Mainstreaming
Table 17. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Accelerated, Compressed, Contextualized and Mainstreaming
Table 18. Process Measures for Accelerated, Compressed, Contextualized and Mainstreaming Courses (n = 5; FRCC = 5, LCC = 0)
Table 19. Overview of Accelerated, Compressed, Contextualized and Mainstreaming
Table 20. Comparison of the Characteristics of the Control and Innovation Groups for Online Hybrid Courses
Table 21. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Online Hybrid Courses
Table 22. Overview of Online Hybrid Classes
Table 23. Comparison of the Characteristics of the Control and Innovation Groups for Modularization and Diagnostic Assessments
Table 24. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Modularization and Diagnostic Assessments
Table 25. Process Measures for Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2)
Table 26. Overview of Modularization and Diagnostic Assessments
Table 27. Overview of All Innovation Clusters
Executive Summary

The Colorado Department of Higher Education (CDHE) received a Complete College America (CCA) grant to fund the Completion Innovation Challenge Grant (CICG) project. The CCC project is operated by the Colorado Community College System (CCCS) and seeks to improve college completion rates within CCCS by aligning developmental education (DE) courses with innovative, evidence-based strategies (innovations) and by initiating policy reforms that ensure the state financially rewards institutions that successfully increase the number of college graduates.

This evaluation attempts to answer the following research questions:

• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

This report summarizes the methodology of this evaluation and the findings to date, which include data from the first semester of implementation (spring 2012). A second report will be produced in August of 2013 and will include data from the first three semesters of implementation (spring 2012 through spring 2013).
Evaluation will continue beyond the spring of 2013, though at this time it is not entirely clear what form this evaluation will take.[1]

Innovations

As part of the CICG project, seven innovations in developmental education are being implemented at 12 colleges within CCCS (see the full innovations section below for a description of each):

• Open Entry/Exit Math Labs
• Mainstreaming
• Accelerated and Compressed
• Contextualization
• Modularization
• Diagnostic Assessment
• Online Hybrid Courses for Developmental Education

[1] The CCA grant that funds these innovations and their evaluation will not fund third-party evaluation beyond the spring of 2013. However, JVA will work with CCCS to ensure evaluation continues in some form beyond this time.
Some of these innovations are being implemented as stand-alone innovations, while others are being implemented in combination. Additionally, several innovations closely overlap in practice. While there were not sufficient data available to robustly investigate each institution separately, this study investigates the above innovations in four distinct innovation clusters (see the full innovations section below for a description of each):

• Open Entry/Exit Math Labs
• Accelerated, Compressed, Contextualized and Mainstreaming
• Online Hybrid
• Modularization and Diagnostic Assessments

Though there is some variation within each of these clusters, for analytical purposes they are treated as distinct and mutually exclusive sets of innovative strategies. The institutions within each cluster are presented below.
Open Entry/Exit Math Labs

Three institutions implemented open entry/exit math labs as part of the CCC project:

• Arapahoe Community College (open entry/exit math labs)
• Pikes Peak Community College (open entry math labs)
• Trinidad State Junior College (open entry/exit math labs)

Accelerated, Compressed, Contextualization and Mainstreaming

Four institutions implemented accelerated, compressed, contextualized and/or mainstreaming efforts as part of the CCC project:

• Community College of Aurora (accelerated, compressed and mainstreaming)
• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)
• Front Range Community College (accelerated and compressed)
• Lamar Community College (accelerated and compressed)

Online Hybrid Courses

Two institutions implemented online hybrid courses as part of the CCC project:

• Colorado Community College Online (online hybrid courses)
• Otero Junior College (online hybrid courses)
Modularization and Diagnostic Assessments

Three institutions implemented modularization and diagnostic assessments as part of the CCC project:

• Morgan Community College (diagnostic assessments and math mods)
• Northeastern Junior College (diagnostic assessments and math mods)
• Pueblo Community College (diagnostic assessments and math mods)

Methodology

To answer the research questions, a survey was administered to students to supplement institutional data derived from the Student Unit Record Data System (SURDS). Additionally, a faculty survey was administered and interviews were conducted with key faculty members. The sections below discuss each of these data sources in more detail, as well as how the control groups were constructed and the limitations of this evaluation.

Data Sources

The data used in this study were gathered from four sources:

• CCCS institutional data—demographics, grades and course completion variables from the Student Unit Record Data System (SURDS).
• Student survey—an electronic survey designed to ascertain student satisfaction with DE programming and to identify challenges DE students experience that may act as barriers to graduation (see Appendix A).
• Faculty survey—an electronic survey designed to ascertain the degree to which faculty/staff members feel each innovation is being implemented as intended and faculty perception of the quality of the innovations (see Appendix B).
• Faculty interviews—phone interviews lasting approximately 15–30 minutes with 16 key faculty and staff members to ascertain the degree to which each innovation is being implemented as intended, what is going well and what could be improved upon (see Appendix C).

Control Group

To build control groups, students in traditional-format DE courses were identified and matched by institution and course—for each innovation course, a corresponding traditional course at the same institution was identified. When this was not possible, a course at a similar institution (similar in terms of size and rural/urban location) was identified. This process ensured that, whenever possible, innovation courses were matched to control courses at the same institution. As such, institutionally specific variables were controlled as much as possible. Finally, within each innovation cluster, control groups were matched to the innovation groups along four
demographic factors: (1) gender, (2) ethnicity, (3) race[2] and (4) age. In the findings section below, the relative match between control and innovation groups is identified for each innovation cluster.

Study Limitations

Though this evaluation provides valuable information on the CCC program, it suffers from some limitations:

• A mismatch between the time horizon of the study and the desired outcomes—college retention is a long-term measure that is most effectively assessed over a longer period of time.
• The ambiguity contained within definitions of these innovations—institutions define and implement the same innovations somewhat differently.
• An inability to make distinctions between similar innovations within clusters.
• Generally small sample sizes, which limit the generalizability of these findings.
• Not all of the potential benefits associated with these innovations are measured by this evaluation.

Despite these limitations, this evaluation provides valuable information on the progress made by the CICG project. Though these findings cannot be considered conclusive, they do provide a sense of how the project has progressed and what it has accomplished thus far.

Findings

Findings are presented in six sections below: (1) student demographics, (2) student experience, (3) math lab innovation cluster, (4) accelerated, compressed, contextualized and mainstreaming innovation cluster, (5) online hybrid innovation cluster, and (6) modularization and diagnostic assessment innovation cluster.
Student Demographics

Student demographic data for this study come from two sources: (1) institutional data and (2) the student survey. Data from each of these sources are presented below:

• Gender—more than half (55%) of the entire sample is female, and just over two-thirds (70%) of survey respondents are female.
• Ethnicity—roughly one-fifth (20.3%) of the entire sample identifies as Latino, as did a slightly smaller proportion of survey respondents (17.0%).
• Race—almost three-fifths (58%) of the entire sample identifies as white, and just over one-fifth (22%) did not identify as any of the available racial categories.

[2] In these data, ethnicity is treated as a separate concept from race. Ethnicity consists of Latino/non-Latino, and race consists of five separate racial categories.
Similarly, almost two-thirds (65%) of survey respondents identify as white, which is higher than the sample as a whole. Additionally, 16% did not identify as any of the available racial categories, which is lower than the sample as a whole.

• Age—the mean age for the sample as a whole is 28.06 (SD = 9.676), ranging from 17 to 72 years old. Survey respondents are slightly older, with a mean age of 31 (SD = 11.256).

Student Experience

A student survey was administered to get a sense of the student experience, including student satisfaction with DE programming and challenges DE students experience that may act as barriers to graduation. Though these results contain useful findings, the sample is too small to be confident that it is fully representative of all the students in this study.[3] As such, extreme caution should be taken when reading these results, as they may not be generalizable to the population at large (i.e., all students in the study).

The student survey suggests satisfaction is relatively high among CCCS students, with just over four-fifths (81.6%) of survey respondents indicating they were either satisfied or very satisfied with their college experience. Additionally, 95.2% of survey respondents indicated that their college experience met or exceeded their expectations, nearly three-quarters (72.6%) indicated that they plan on graduating from the college they are attending, and just over half (53.5%) indicated that they plan on transferring to a different college.
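The margin of error cited for the survey sample in footnote 3 follows from the standard formula for a proportion with a finite population correction. A minimal Python sketch that reproduces those figures (the function names here are our own, not from the report):

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Margin of error for a proportion, with finite population correction."""
    fpc = math.sqrt((N - n) / (N - 1))           # finite population correction
    return z * math.sqrt(p * (1 - p) / n) * fpc

def required_sample(N, e=0.05, p=0.5, z=1.96):
    """Sample size needed for margin e at confidence z, given population N."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2         # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / N))    # adjust downward for finite N

print(round(margin_of_error(153, 1527) * 100, 2))  # 7.52 (%)
print(required_sample(1527))                       # 308
```

Both values match the report: 153 respondents from a population of 1,527 yield a 7.52% margin of error at 95% confidence, and 308 respondents would have been needed for a 5% margin.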
When results from these two questions are combined, the data show that 91.6% of respondents plan on graduating from the college they are in and/or plan on transferring to a different college. Thus, at this point, 8.4% of survey respondents do not anticipate progressing through the system to degree completion.

In addition to the satisfaction measures addressed above, students were asked to agree or disagree with a set of statements related to institutional quality. These data suggest that student perception of institutional quality is generally high. Indeed, on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree," for all but three items, mean scores were above 4 (or "Agree") and more than 80% of respondents agreed or strongly agreed with the statements. Further, the remaining items had mean scores above 3 (the neutral point), indicating more agreement than disagreement.

The student survey also asked students to indicate the extent to which certain circumstances were barriers to their ability and/or willingness to attend school next semester. Responses were

[3] The margin of error for this sample (153 from a population of 1,527) is 7.52% at a 95% confidence level. To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
on a four-point Likert-type scale where 1 = "Not a barrier," 2 = "Somewhat of a barrier," 3 = "Moderate barrier" and 4 = "Extreme barrier." As demonstrated by the mean scores (with only one item exceeding a mean score of 2, or "somewhat of a barrier"), respondents do not seem to see these items as overwhelming barriers to their ability to continue with school next semester.

Additionally, correlations[4] were run between these barriers and both the course completion ratio (the ratio of DE courses passed over those attempted) and self-reported continuance (a respondent indicating either an intent to graduate and/or transfer to another school). These data indicated that there is no correlation between a student's perception of each barrier and whether or not he or she expects to graduate or transfer to another college. However, there are correlations between student perception of barriers and their course completion ratio. In particular, the following barriers are significantly negatively correlated with course completion:

• Amount of time required
• Difficulty of the classes
• Navigating the administration
• The lack of a social scene
• The school's fit with my academic needs
• Cost of school

In other words, as student perception of each of the above barriers rises, the likelihood that he or she passes his or her DE courses drops. Yet, there is no such correlation between student perception of these barriers and their self-reported expectation to continue with college.
This suggests that all of the barriers listed in the bullet points above impact student performance (as measured by DE course completion), but that the barriers do not impact student expectations regarding graduation or transfer.

Open Entry/Exit Math Labs (ACC, PPCC and TSJC)

Below (Table 1) is a summary of findings for the math lab innovation cluster (for more complete findings, see the full Open Entry/Exit Math Labs section below).

[4] The Pearson product-moment correlation coefficient is a measure of the relationship between two variables; in other words, a measure of the tendency of the variables to increase or decrease together.
Table 1. Overview of Math Labs

Item                                           Performance
Course Completion Over Control                 Significantly Lower
Term GPA Over Control                          No Significant Difference
Perception of Innovation Quality (Faculty)     About the Same
Desire to Continue Innovation (Faculty)        Yes
Implemented as Intended (Faculty Perception)   Yes

Key Contextual Notes

Positive Developments in Implementation
• Increases flexibility for students
• Allows appropriate pace (not necessarily faster)
• Mastery of the subject matter (not just pass)
• More friendly for some older students
• Reduces point-in-time student-to-teacher ratios

Ongoing Challenges in Implementation
• Different facility requirements
• Increased administrative complexity
• Increased complexity for instructors
• Insufficient time management (on the part of students)
• "Appropriate pace" ≠ faster

Start-Up Growing Pains
• Messaging issues
• Insufficient training

Accelerated, Compressed, Contextualization and Mainstreaming (CCA, CCD, FRCC and LCC)

Table 2 below summarizes the findings for the accelerated, compressed, contextualized and mainstreaming innovation cluster (for more complete findings, see the full Accelerated, Compressed, Contextualization and Mainstreaming section below).
Table 2. Overview of Accelerated, Compressed, Contextualized and Mainstreaming

Item                                           Performance
Course Completion Over Control                 No Significant Difference
Term GPA Over Control                          Significantly Higher
Perception of Innovation Quality (Faculty)     Better
Desire to Continue Innovation (Faculty)        Yes
Implemented as Intended (Faculty Perception)   Yes

Key Contextual Notes

Positive Developments in Implementation
• Allows students to progress more quickly
• Positively impacts student motivation
• Contributes to an improved academic culture
• Increases student autonomy
• Increases curriculum relevance
• Increases student engagement
• Facilitates learning across subjects

Ongoing Challenges in Implementation
• Students' lack of desire to go faster
• Students' lack of ability
• Complexity of administrative logistics
• Less room to adjust to unforeseen issues
• Finding the appropriate pace
• Students' need for additional support
• Occasional tension between contextual projects and basic content

Start-Up Growing Pains
• Messaging issues
• Insufficient training
• Time constraints

Online Hybrid Courses (CCCOnline and OJC)

Table 3 below summarizes the findings for the online hybrid innovation cluster (for more complete findings, see the full Online Hybrid Courses section below).
Table 3. Overview of Online Hybrid Classes

Item                                           Performance
Course Completion Over Control                 No Significant Difference
Term GPA Over Control                          No Significant Difference
Perception of Innovation Quality (Faculty)     No Data
Desire to Continue Innovation (Faculty)        No Data
Implemented as Intended (Faculty Perception)   No Data

Key Contextual Notes

Positive Developments in Implementation
• Adds a "personal touch" to online courses
• Expands tutoring within CCCOnline
• Awareness was established
• Access was provided

Start-Up Growing Pains
• Insufficient program definition
• Messaging issues
• Lack of integration
• OJCs largely not utilized

Modularization and Diagnostic Assessments (MCC, NJC and PCC)

Table 4 below summarizes the findings for the modularization and diagnostic assessments innovation cluster (for more complete findings, see the full Modularization and Diagnostic Assessments section below).
Table 4. Overview of Modularization and Diagnostic Assessments

Course Completion Over Control: No Significant Difference
Term GPA Over Control: No Significant Difference
Perception of Innovation Quality (Faculty): Better
Desire to Continue Innovation (Faculty): Yes
Implemented as Intended (Faculty Perception): Yes

Key Contextual Notes

Positive Developments in Implementation
• Appropriate pace
• Mastery of the subject matter
• Shorter remediation track
• Instant feedback
• Appropriate placement

Challenges in Implementation
• Increased administrative complexity
• Perception that students are "teaching themselves"
• Lack of computer skills
• Diagnostic testing ≠ shorter remediation track

Start-Up Growing Pains
• Messaging issues

Conclusion in Executive Summary

These data go some distance toward answering the outcome-related research questions:

• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?

It is premature to fully answer this question, but thus far there is not strong evidence to suggest that innovation formats are outperforming traditional formats in terms of retention and GPA. This is not entirely surprising, as these are largely long-term measures, and CCCS institutions are still in the initial stages of implementing these innovations. Additionally, it appears that some innovations provide benefits to students that are not objectively measured by this evaluation.

• Which innovations are the most successful (in terms of graduation, retention and GPA)?

At this point in the evaluation, the accelerated, compressed, contextualized and mainstreaming innovation cluster is outperforming the other innovations in terms of retention and GPA.
Additionally, these data address the following process-related research questions:

• Were the innovations implemented as intended?

Despite some initial hurdles, and with a few exceptions, these innovations are being implemented largely as originally intended.

• What can the colleges and CCCS learn from the implementation of the seven innovations?

The evaluation of the first semester of implementation of the CICG project has uncovered a variety of important lessons:

• Messaging is important
• Appropriate pace ≠ faster pace
• There are unanticipated benefits to some of these innovations
• New formats are resource intensive to set up
• New formats have a learning curve
• Innovations are not necessarily replacements for a traditional format

Additionally, several potential barriers to retention not related to these innovations emerged as significantly correlated with course completion (though not with respondents' expectations for graduation or transfer).

These findings are preliminary, and it is far too early to make any conclusive judgments about the success of the innovations implemented as part of the CCC project. Such judgments will come later as data are collected over a longer period of time and these innovations mature. However, the data collected to date suggest that these innovations provide a benefit to students and should continue to be implemented. Despite these benefits, the innovations are unlikely to be a panacea for the challenges faced by DE.
Introduction and Background

The Colorado Department of Higher Education (CDHE) received a Complete College America (CCA) grant to fund the Completion Innovation Challenge Grant (CICG) project. The CICG project is operated by the Colorado Community College System (CCCS) and seeks to improve college completion rates within CCCS by aligning developmental education (DE) courses with innovative, evidence-based strategies (innovations) and by initiating policy reforms that ensure the state financially rewards institutions that successfully increase the number of college graduates.

CDHE and CCCS contracted with JVA Consulting, LLC (JVA) to act as a third-party evaluator for the innovation portion of this project. This evaluation attempts to answer the following research questions:

• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

This report summarizes the methodology of this evaluation and the findings to date, which include data from the first semester of implementation (spring 2012). A second report will be produced in August of 2013 and will include data from the first three semesters of implementation (spring 2012 through spring 2013). Evaluation will continue beyond the spring of 2013, though at this time it is not entirely clear what form this evaluation will take.5

This report is organized around four major sections: (1) Introduction and Background, (2) Methodology, (3) Findings and (4) Conclusion. The Introduction and Background section (this section) introduces the CICG project with a focus on the need for the project, the innovations implemented and the institutions involved. The Methodology section discusses the overall design of the evaluation, each of the data sources, the analysis, limitations of the data and steps taken to protect study participants. The Findings section summarizes the key findings from this study, focusing on four areas: student demographics, student experience, process evaluation (were the innovations implemented as intended?) and outcome evaluation (how successful were the innovations?).

5 The CCA grant that funds these innovations and their evaluation will not fund third-party evaluation beyond the spring of 2013. However, JVA will work with CCCS to ensure evaluation continues in some form beyond this time.
The Need for the CICG Project

Students referred to DE courses are at risk of failing to complete their degree—under the current circumstances, half will not even complete their developmental sequence.6 In 2009, 29% of Colorado's college students required remediation in reading, writing or mathematics, and over half (53%) of students attending two-year institutions needed remediation. At current rates, of 100 students enrolled in the lowest level of developmental math, only four will graduate.

In response to this need, the Higher Education Strategic Planning Steering Committee identified remediation redesign as a top priority for Colorado,7 and the Governor's Office and its partners, the Colorado Commission on Higher Education (CCHE), the Colorado Department of Higher Education (CDHE) and the Colorado Community College System (CCCS), propose to increase the number of college graduates while reducing time to completion by transforming the delivery of DE. Thus, the CICG project is aligned with a larger statewide effort to improve retention among students referred to DE courses.

Innovations

As part of the CICG project, seven innovations in developmental education are being implemented at 12 colleges within CCCS:

• Open Entry/Exit Math Labs—open entry/exit math labs offer developmental math courses that allow students to work at their own pace and to test independently, while making math mentors available to students as needed.
• Mainstreaming—mainstreaming refers to an approach that allows students who test at the upper range of developmental education to enroll in college-level courses with one additional credit hour to allow them time to strengthen their foundational skills.

• Accelerated and Compressed—accelerated courses alter the scheduling of developmental education such that students can complete required courses faster than the traditional semester sequence. A compressed format (e.g., five-week courses) is one type of accelerated course, though there are others (e.g., combined formats where 030 and 060 courses are instructed concurrently in the same semester).

6 Bailey, T., Jeong, D., & Cho, S.-W. (2009). Referral, enrollment, and completion in developmental education sequences in community colleges. New York: Community College Research Center, Teachers College, Columbia University.
7 Colorado Department of Higher Education (2010). The degree dividend: Building our economy and preserving our quality of life: Colorado must decide. Colorado's Strategic Plan for Higher Education.
• Contextualization—contextualized courses embed developmental education within the context of program-specific content. Contextualized courses either (1) relate developmental competencies to career/technical education competencies, or (2) pair developmental education courses with college-level courses.

• Modularization—modularization refers to the reorganization of developmental education courses into distinct stand-alone modules (or mods) that can be taken in a variety of combinations. Currently, modularization is only available for math courses.

• Diagnostic Assessment—diagnostic assessment refers to a pretest used to determine the appropriate placement of students based on the requirements for entrance into their degree program. Currently, diagnostic assessment is being paired with modular math to help determine the appropriate mods for students to ensure they meet the requirements of their degree program.

• Online Hybrid Courses for Developmental Education—this innovation combines elements of traditional formats with online classes. In particular, live tutors are made available to students taking online courses.

Some of these innovations are being implemented as stand-alone innovations, while others are being implemented in combination. Additionally, several innovations closely overlap in practice.
While there were not sufficient data available to robustly investigate each institution separately, this study investigates the above innovations in four distinct innovation clusters:

• Open Entry/Exit Math Labs—though the precise meaning of "open" differs among institutions, math labs are implemented consistently enough across CCCS institutions to treat them as a distinct group.

• Accelerated, Compressed, Contextualized and Mainstreaming—based on faculty interviews, it appears that in practice these innovations overlap substantially within CCCS institutions. Thus, while they are technically distinct innovations, they are clustered together for analysis.

• Online Hybrid—though the form of online hybrid courses differs, they are similar enough to be treated as a single entity.

• Modularization and Diagnostic Assessments—one of the three institutions implementing modular math is not using diagnostic assessments. However, these innovations are similar enough to be treated as a single cluster.

Though there is some variation within each of these clusters, for analytical purposes they are treated as distinct and mutually exclusive sets of innovative strategies. To give a better sense of the variation within each cluster, descriptions of the specific innovation strategies implemented by each institution are presented for each cluster below.
Open Entry/Exit Math Labs

Three institutions implemented open entry/exit math labs as part of the CCC project:

• Arapahoe Community College (open entry/exit math labs)—at Arapahoe Community College (ACC), developmental math courses offered in a math lab format are referred to as FLEX classes. FLEX classes attempt to provide students with the flexibility to decide when and where they work, though the format is not entirely self-paced, as deadlines are provided (but students can work faster if desired). In FLEX classes, students complete their homework online but complete exams on campus. Additionally, students in FLEX courses have access to the FLEX Lab for face-to-face tutoring and support.

• Pikes Peak Community College (open entry math labs)—at Pikes Peak Community College (PPCC), math labs are open entry, but not open exit. This format allows students to work at their own pace and to come to the lab as needed, where they can access tutors and resources such as practice tests, graphing calculators and instructional DVDs. These math labs are also where students go to take their proctored tests.

• Trinidad State Junior College (open entry/exit math labs)—at Trinidad State Junior College (TSJC), math labs provide self-paced instruction incorporating both the MyMathLab program and more traditional paper-and-pencil instruction. Students are given deadlines to complete their courses, but are able to flex their time within set time blocks.
Accelerated, Compressed, Contextualized and Mainstreaming

Four institutions implemented accelerated, compressed, contextualized and/or mainstreaming efforts as part of the CCC project:

• Community College of Aurora (accelerated, compressed and mainstreaming)—the Community College of Aurora (CCA) provides a form of accelerated and compressed courses in which two developmental math courses are combined into one, allowing students to complete their developmental requirements in fifteen weeks instead of thirty. To support students working at this accelerated pace, CCA provides extra tutoring opportunities and requires students to attend a minimum amount of tutoring. Additionally, CCA is experimenting with some mainstreaming efforts in which students who would normally be assigned to a developmental reading course (REA 090) are able to meet these requirements within a college-level course (BIO 111).

• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)—at the Community College of Denver (CCD), the FastStart program combines accelerated, compressed and mainstreaming approaches to allow students to complete their developmental requirements more quickly. FastStart allows students to complete two levels of classes in a single semester, or to combine higher developmental education courses with college-level courses (mainstreaming). In addition to FastStart, CCD students are able to participate in learning communities where they spend an hour per week with their peers and the instructor. Finally, CCD offers a contextualization option in
which students apply the skills they learn in their courses to develop a business plan over the course of the semester.

• Front Range Community College (accelerated and compressed)—Front Range Community College (FRCC) initially intended to engage in mainstreaming efforts, but later shifted to an accelerated and compressed format that it implemented at its Westminster campus. To build on this effort, FRCC will implement what was developed at the Westminster campus at the Longmont campus to allow students from multiple campuses (Longmont, Greeley and Fort Collins) to take advantage of the program.

• Lamar Community College (accelerated and compressed)—at Lamar Community College (LCC), developmental education is offered in a compressed format, which combines two classes into one. This shortens the remediation track and allows students to complete their developmental education requirements more quickly.

Online Hybrid Courses

Two institutions implemented online hybrid courses as part of the CCC project:

• Colorado Community College Online (online hybrid courses)—Colorado Community College Online (CCCOnline) provides online courses for colleges throughout CCCS and, as part of the CICG innovations in developmental education, added in-house tutoring services for developmental English and math. This approach is intended to add a personal touch to online courses and, as such, to combine some of the most promising elements of traditional and online courses.
• Otero Junior College (online hybrid courses)—at Otero Junior College (OJC), students in developmental math are able to take advantage of an online hybrid format that combines face-to-face instruction with online tutoring services offered by CCCOnline.

Modularization and Diagnostic Assessments

Three institutions implemented modularization and diagnostic assessments:

• Morgan Community College (diagnostic assessments and math mods)—at Morgan Community College (MCC), students take the ACCUPLACER to identify their appropriate placement within the MyFoundationsLab program. This program provides online activities and assessments, with an interactive guided solution and sample problem for each exercise. It also provides students with a variety of resources, including video lectures, animations and audio files.

• Northeastern Junior College (diagnostic assessments and math mods)—at Northeastern Junior College (NJC), all developmental math has been converted to a modular format. Students take the ACCUPLACER test within the first week of classes to identify which modules are most appropriate for them. Once placed, students complete modules at their own pace, but are provided timelines to guide them through the semester. To advance through the
sequence, students must pass tests. Initially, students had unlimited opportunities to take these tests, but NJC found that this led many of its students to not take the tests seriously. As such, students now have three opportunities to pass each test.

• Pueblo Community College (diagnostic assessments and math mods)—at Pueblo Community College (PCC), students have the option to take their developmental math courses in modules that allow them to work at their own pace utilizing online math software. A diagnostic assessment is used to identify the competency areas in which students have not demonstrated mastery, and the modules that are associated with these areas. Though the course itself is four credit hours, over the course of a semester students can complete the equivalent of up to 13 credit hours' worth of developmental math coursework.

Methodology

This study has two design components: (1) an outcome evaluation component and (2) a process evaluation component. The outcome evaluation was designed to measure what these innovations accomplished last semester, and to answer the questions:

• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

To measure these outcomes, a case-control quasi-experimental8 design was used.
In this design, student performance within innovation courses, measured by institutional data, was compared to the performance of students within control groups. These data were supplemented by a student survey, which provided additional data to ensure the differences observed between innovation and control courses were not the result of other factors.

The process evaluation component was designed to answer the questions:

• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?

To answer these questions, a survey was administered to students to support institutional data derived from the Student Unit Record Data System (SURDS). Additionally, a faculty survey was administered and interviews were conducted with key faculty. The sections below discuss each

8 Quasi-experimental designs differ from experimental designs in that treatments or interventions are not assigned randomly. In this case, it refers to the fact that students were not randomly assigned to innovation courses.
of these data sources in more detail, how the control groups were constructed, the analyses that were conducted, and the limitations of the study.

Data Sources

The data used in this study were gathered from four sources: (1) CCCS institutional data, (2) a student survey, (3) a faculty survey and (4) interviews with faculty. Each of these sources is discussed below.

Institutional Data

JVA worked with CCCS to access institutional data for all students in the study through the Student Unit Record Data System (SURDS). These data included demographics, grades and course completion variables. Student ID numbers were used as unique identifiers to match these data to student survey data (see below). However, in an effort to maximize protection of student data, student ID numbers were stripped from the data once the match was made and new identifiers were assigned.

Student Survey

In partnership with CCCS, JVA designed and administered an electronic survey to all students in the study. This survey was designed to ascertain student satisfaction with DE programming, and to identify challenges DE students experience that may act as barriers to graduation. Student ID numbers were used to match these data to the institutional data collected (see above) but were stripped once the match was made. Additionally, electronic informed consent was acquired as part of the survey (see Appendix A for a copy of the survey).
Faculty Survey

JVA also worked with CCCS to administer an electronic survey to DE faculty and staff to ascertain the degree to which faculty and staff members feel each innovation is being implemented as intended, as well as faculty perception of the quality of the innovations. Skip logic was used, such that respondents were presented with questions tailored to the innovations their institution is implementing. These data are reported in aggregate, and all personal identifiers (i.e., names and email addresses) were stripped from the data. Additionally, electronic informed consent was acquired as part of the survey (see Appendix B).

Faculty Interviews

JVA conducted phone interviews lasting approximately 15–30 minutes with 16 key faculty and staff members to ascertain the degree to which each innovation is being implemented as intended, what is going well and what could be improved upon. These data are reported in aggregate, and to maintain confidentiality, names are not attached to any of the data. Additionally, verbal informed consent was acquired prior to each interview (see Appendix C).
Control Group

To build control groups, students in traditional-format DE courses were identified and matched by institution and course—for each innovation course, a corresponding traditional course at the same institution was identified. When this was not possible, a course at a similar institution (similar in terms of size and rural/urban location) was identified. This process ensured that, whenever possible, innovation courses were matched to control courses at the same institution. As such, institutionally specific variables were controlled as much as possible. Finally, within each innovation cluster, control groups were matched to the innovation groups along four demographic factors: (1) gender, (2) ethnicity, (3) race9 and (4) age. In the Findings section below, the relative match between control and innovation groups is identified for each innovation cluster.

Data Analysis

The quantitative data contained within this report (institutional data and survey data) were analyzed using SPSS (a statistical analysis software package). Analyses included descriptive statistics as well as basic inferential statistics, including chi-squared tests, Pearson's correlations, independent samples t-tests and analysis of variance (ANOVA). General descriptions of these procedures are contained within footnotes to the procedures themselves.

The qualitative data contained within this report (interview notes and open-ended survey questions) were analyzed using NVivo, a qualitative data analysis software package. Using NVivo, JVA analysts coded the data by source (group) and general themes.
These original codes were then reworked (clustered and split) until coherent stand-alone themes were produced.

Study Limitations

Though this evaluation provides valuable information on the CICG program, it suffers from some limitations. Chief among these is the mismatch between the time horizon of the study and the desired outcomes. In particular, college retention is a long-term measure that will most effectively be measured over time. As such, it is simply too early to reach any definite conclusions regarding the impact these innovations have on retention (though preliminary findings are contained within). Over time, this limitation will be partially mitigated, as this study will continue in its current form for another 12 months, and then continue in a modified form after that. However, data on long-term student retention will not be available for several years to come, and conclusive data may never become available given the already limited sample size and the relatively large attrition rates experienced by this population.

9 In these data, ethnicity is treated as a separate concept from race. Ethnicity consists of Latino/non-Latino, and race consists of five separate racial categories.
Another limitation is the ambiguity contained within definitions of these innovations—institutions define and implement the same innovations somewhat differently. This presents a challenge to evaluation by making it more difficult to draw clear distinctions between innovations. This challenge is exacerbated by the relatively small number of students contained within specific innovations at specific institutions. This study has partially overcome both of these challenges by grouping the innovations into similar innovation clusters, thus providing clearly distinct groups with enough cases to conduct statistical analysis.

This approach has, however, presented an additional limitation. By combining multiple innovations into clusters, the analysis is unable to make distinctions between similar innovations within clusters. For example, though three institutions (ACC, PPCC and TSJC) are implementing open entry/exit math labs, the ways in which they are doing so vary (see the descriptions above). This limitation is particularly stark for the Accelerated, Compressed, Contextualized and Mainstreaming cluster. All of the institutions involved in this cluster engage in some form of accelerated and compressed developmental education, but several include either mainstreaming or contextualization as well. These issues are compounded by the fact that CCCS institutions vary dramatically in size, and as a result, rather large portions of some innovation clusters are made up of single institutions.
This means that a particular form of an innovation implemented by a particular institution may disproportionately influence the results observed for a particular cluster.

An additional limitation is the generally small sample size for some of the measures. In particular, the samples for data from faculty (the faculty survey and interviews with faculty) are too small to be considered representative of the views of all faculty members. Data for students are less limited, as sample size is not a problem for the institutional data grouped by innovation cluster. However, the sample for the student survey is too small to be considered representative of all students in the study.10 As such, the generalizability of these findings is somewhat limited, and extreme caution should be taken when extrapolating from them.

Finally, not all of the potential benefits associated with these innovations are measured by this evaluation. As a particularly cogent example, some innovations appear to be increasing the amount students learn by slowing the pace at which they do so (see the Math Labs section below). While the qualitative data included below are able to partially capture this possibility, the extent to which this is actually occurring cannot be determined here, as the data needed to draw such conclusions were not collected as part of this evaluation.

10 The margin of error for the student survey sample (a sample of 153 from a population of 1,527) is 7.52% at a 95% confidence level.
To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
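The footnote's figures can be reproduced with the standard formulas for a proportion's margin of error under a finite-population correction. A minimal sketch in Python (an illustration only; the evaluators' exact method is not stated, though these textbook formulas match the reported 7.52% and 308):

```python
import math

def margin_of_error(n, N, z=1.96, p=0.5):
    """Margin of error for a proportion, with finite-population correction."""
    se = math.sqrt(p * (1 - p) / n)        # standard error at worst-case p = 0.5
    fpc = math.sqrt((N - n) / (N - 1))     # finite-population correction
    return z * se * fpc

def required_sample(N, e, z=1.96, p=0.5):
    """Smallest n achieving margin of error e for a population of size N."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2   # sample size ignoring population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(round(margin_of_error(153, 1527) * 100, 2))  # → 7.52
print(required_sample(1527, 0.05))                 # → 308
```

The finite-population correction matters here: without it, a sample of 385 (rather than 308) would be needed for a 5% margin of error.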
Despite these limitations, this evaluation provides valuable information on the progress made by the CICG project. Though these findings cannot be considered conclusive, they do provide a sense of how the project has progressed and what it has accomplished thus far.

Findings

Findings are presented in six sections below: (1) student demographics, (2) student experience, (3) math-lab innovation cluster, (4) accelerated, compressed, contextualized and mainstreaming innovation cluster, (5) online hybrid innovation cluster, and (6) modularization and diagnostic assessment innovation cluster.

Student Demographics

Student demographic data for this study are from two sources: (1) institutional data and (2) the student survey. Data from each of these sources are presented below.

Overall (Institutional Data)

Figure 1 below displays the gender breakdown for the entire sample. As shown, more than half (55%) of the sample is female.

Figure 1. Gender for the Entire Sample (n = 1,527): Male 45%, Female 55%

Table 5 below displays the ethnic breakdown (Latino, not Latino) for the sample. As shown, roughly one-fifth (20.3%) of the sample identifies as Latino. Figure 2 below shows the racial breakdown for the sample.

Table 5. Percentage Latino and Not Latino for Entire Sample (n = 1,527)

Ethnicity      Percentage of Sample
Latino         20.3%
Not Latino     79.7%
As shown below, almost three-fifths (58%) of the sample identifies as white, and just over one-fifth (22%) did not identify as any of the available racial categories. Additionally, the mean age for this sample was 28.06 (SD = 9.676), ranging from 17 years old to 72 years old.

Figure 2. Race by Group for the Entire Sample (n = 1,527). [Bar chart: White 58%, Not ID'd 22%; the remaining categories (Asian, Black, Native American, Pacific Islander, Mixed) each fall between 1% and 10%.]

Student Survey Respondents

Figure 3 below displays the gender breakdown for the student survey respondents. As shown, just over two-thirds (70%) of student survey respondents identified as female. This is a higher proportion than for the sample as a whole.

Figure 3. Gender for Survey Data (n = 153): Male 30%, Female 70%

Table 6 below displays the ethnic breakdown (Latino, not Latino) for survey respondents. As shown, just under one-fifth (17.0%) of survey respondents identifies as Latino. This is slightly lower than the proportion in the sample as a whole.
Table 6. Percentage Latino and Not Latino for Survey Data (n = 153)

Ethnicity      Percentage of Sample
Latino         17.0%
Not Latino     83.0%

Figure 4 below shows the racial breakdown for student survey respondents. As shown, almost two-thirds (65%) of survey respondents identify as white, which is higher than the sample as a whole. Additionally, 16% did not identify as any of the available racial categories, which is lower than the sample as a whole.

Figure 4. Race by Group for Student Survey Respondents (n = 153). [Bar chart: White 65%, Not ID'd 16%; the remaining categories (Asian, Black, Native American, Pacific Islander, Mixed) each fall between 1% and 8%.]

As shown in Table 7 below, the mean age for survey respondents is 31, which is three years older than the average for the sample as a whole. Additionally, survey respondents have an average of almost one child (for both children under 18 generally and children under 18 living with the respondent).

Table 7. Mean (SD) Age, Number of Children Under 18 and Number of Children Under 18 Living with Respondent for Student Survey Data (n = 153)

Item                                         Mean (SD)
Age of respondent                            31.31 (11.26)
Children under 18                            0.97 (1.35)
Children under 18 living with respondent     0.99 (1.19)
Figure 5 below shows the breakdown of the number of hours worked per week by survey respondents. As shown, just under two-fifths (39%) of respondents do not work at all during the semester. Further, of those who do work, most work between 21 and 30 hours.

Figure 5. Hours Worked Per Week During the Semester for Student Survey Respondents (n = 153). [Bar chart over the categories 0, 1 to 10, 11 to 20, 21 to 30, 31 to 40 and 40+ hours: 39% work 0 hours; among those who work, the most common range is 21 to 30 hours.]

The relationship status of survey respondents is displayed below in Figure 6. As Figure 6 shows, roughly half of survey respondents (46%) are not in a relationship (single, divorced or separated) and exactly half (50%) are either married or in a relationship. Further, of students who have one or more children under 18 living with them, roughly half (46%) are not in a relationship.

Figure 6. Relationship Status of Survey Respondents (n = 153): Single 33%, Relationship (not married) 21%, Married 29%, Divorced/Separated 13%, Other 3%

Student Experience

A student survey was administered to all students in both the control and innovation groups (a total of 1,527 students) to get a sense of the student experience, including student satisfaction with DE programming and challenges DE students experience that may act as barriers to graduation. One hundred fifty-three students responded to this survey, a response rate of 10%. However, with a population of 1,527 (all students in the study, both
control and innovation), this sample is too small to be confident that it is fully representative of the students in this study.11 This does not mean that these results are meaningless, but extreme caution should be taken when reading them, as they may not be generalizable to the population at large.

The results are presented below in three sections: general satisfaction measures, student perceptions of institutional quality and barriers to retention.

General Satisfaction

The student survey suggests satisfaction is relatively high among CCCS students, with just over four-fifths (81.6%) of survey respondents indicating they were either satisfied or very satisfied with their college experience. Additionally, 95.2% of survey respondents indicated that their college experience met or exceeded their expectations.

Table 8 below displays the results of satisfaction-related questions rated on a five-point Likert-type scale, where 1 = "Definitely NOT," 2 = "Probably NOT," 3 = "Unsure," 4 = "Probably" and 5 = "Definitely." Items are listed in the table in descending order, from highest mean to lowest mean. As shown in Table 8, the vast majority of respondents indicated that they would recommend their college to others and that, if they had it to do over again, they would attend the same college. Additionally, nearly three-quarters (72.6%) indicated that they plan on graduating from the college they are attending, and just over half (53.5%) indicated that they plan on transferring to a different college.
When results from these two questions are combined, the data show that 91.6% of respondents indicated that they either plan on graduating from the college they are in and/or plan on transferring to a different college. Thus, at this point, 8.4% of respondents do not anticipate progressing through the system to degree completion.

Table 8. General Satisfaction Measures (n = 153)

Question                                                         Mean (SD)     % (n) Probably or Definitely
Would you recommend this college to others?                      4.42 (0.80)   90.4% (132)
If you had to do it over again, would you attend this college?   4.31 (0.84)   87.6% (127)
Do you plan on graduating from this college?                     4.09 (1.11)   72.6% (106)
Do you plan on transferring to a different college?              3.65 (1.27)   53.5% (77)

11 The margin of error for this sample (153 from a population of 1,527) is 7.52% at a 95% confidence level. To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
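Assuming the three percentages refer to the same respondent base, the 91.6% combined figure also implies, by inclusion-exclusion, the share of respondents planning to do both. A quick check of the arithmetic:

```python
# Reported shares of "Probably"/"Definitely" responses (Table 8).
p_graduate = 0.726   # plan to graduate from this college
p_transfer = 0.535   # plan to transfer to a different college
p_either   = 0.916   # reported: plan to graduate and/or transfer

# Inclusion-exclusion: P(A and B) = P(A) + P(B) - P(A or B)
p_both    = p_graduate + p_transfer - p_either
p_neither = 1 - p_either

print(f"both: {p_both:.1%}, neither: {p_neither:.1%}")  # → both: 34.5%, neither: 8.4%
```

That is, roughly a third of respondents apparently plan to graduate from their current college and then transfer, which is consistent with a community college population.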
To test whether satisfaction varied by innovation cluster, a one-way Analysis of Variance (ANOVA)12 was conducted on the satisfaction items with innovation cluster as the factor. This test showed no significant difference in these satisfaction measures between innovation clusters.13 In other words, there is no evidence to suggest that satisfaction varies across the innovation clusters.

Student Perception of Institutional Quality

In addition to the satisfaction measures addressed above, students were asked to agree or disagree with a set of statements related to institutional quality. Responses were on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree." These results are displayed in Table 9 below, with items listed from highest to lowest mean score.

Table 9. Student Perception on Indicators of Institutional Quality (n = 153)

Item                                           Mean (SD)     % (n) Agree or Strongly Agree
My professors are knowledgeable                4.32 (0.84)   88.4% (129)
I am able to get to classes                    4.24 (0.81)   89.7% (130)
The content of my courses is valuable          4.23 (0.76)   88.4% (130)
I enjoy my classes                             4.19 (0.84)   84.8% (123)
Campus is safe and secure                      4.18 (0.81)   82.2% (120)
My professors are fair                         4.15 (0.86)   86.4% (127)
The content of my courses is relevant to me    4.07 (0.92)   82.1% (119)
The registration process works well            3.95 (0.99)   77.4% (113)
The tuition I pay is a worthwhile investment   3.90 (1.00)   72.6% (106)
My advisor is helpful                          3.82 (1.13)   63.9% (94)

As shown above, these data suggest that student perception of institutional quality is generally high.
Indeed, for all but three items, mean scores were above 4 (or "Agree") and more than 80% of respondents agreed or strongly agreed with the statements. Further, the remaining items had mean scores above 3 (the neutral point), indicating more agreement than disagreement.

12 Analysis of Variance (ANOVA) is a statistical procedure that examines differences in outcomes for two or more groups. Results from this procedure determine whether two or more values are significantly different, i.e., not due to a chance occurrence in the data.

13 General satisfaction [F(3, 76) = 0.926, p = 0.432]
Recommend to others [F(3, 76) = 0.948, p = 0.422]
Attend if had to do over again [F(3, 76) = 0.915, p = 0.438]
Plan on graduating from this college [F(3, 76) = 0.198, p = 0.898]
Plan on transferring to a different college [F(3, 75) = 2.404, p = 0.074]
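Each bracketed result in the footnote above is a one-way ANOVA of a single survey item across the four innovation clusters. A minimal sketch of that test using SciPy's `f_oneway`, with invented placeholder ratings (not the study's responses):

```python
from scipy.stats import f_oneway

# Hypothetical 1-5 Likert ratings for one satisfaction item, grouped by
# innovation cluster (placeholder values, not the study's data).
math_labs     = [4, 5, 3, 4, 4, 5, 2, 4]
accelerated   = [5, 4, 4, 3, 5, 4, 4]
online_hybrid = [3, 4, 5, 4, 4, 3, 5, 4]
modularized   = [4, 4, 3, 5, 4, 4]

# One-way ANOVA with innovation cluster as the (between-groups) factor.
f_stat, p_value = f_oneway(math_labs, accelerated, online_hybrid, modularized)

# A p-value above 0.05 indicates no significant difference between clusters.
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

The footnote's degrees of freedom, F(3, 75–76), reflect 4 clusters (3 between-groups df) and roughly 79–80 survey respondents who could be assigned to a cluster.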
To test whether these findings varied by innovation cluster, a one-way ANOVA was conducted with innovation cluster as the factor. This test showed no significant difference in these quality measures between innovation clusters.14 In other words, there is no evidence to suggest that student perception of quality varies across the innovation clusters.

Barriers to Retention

The student survey also asked students to indicate the extent to which certain circumstances were barriers to their ability and/or willingness to attend school next semester. Responses were on a four-point Likert-type scale where 1 = "Not a barrier," 2 = "Somewhat of a barrier," 3 = "Moderate barrier" and 4 = "Extreme barrier." These results are presented in Table 10 below, in order of highest to lowest mean score.

Table 10.
Student Ratings of Barriers to Retention (n = 153)

Barrier                                      Mean (SD)     % (n) Indicating Item is a Barrier
Cost of school                               2.03 (1.02)   59.6% (87)
Amount of time required                      1.99 (0.90)   65.1% (95)
Work obligations                             1.92 (0.93)   58.7% (84)
Family obligations                           1.91 (0.93)   59.6% (87)
Difficulty of the classes                    1.68 (0.83)   47.3% (69)
Navigating the administration                1.60 (0.87)   39.7% (58)
This school's fit with my academic needs     1.54 (0.86)   34.5% (50)
The lack of a social scene                   1.42 (0.76)   27.8% (40)
Distance from family/friends                 1.41 (0.69)   30.3% (44)
I don't fit in at this school                1.33 (0.73)   21.4% (31)

As demonstrated by the mean scores above (with only one item exceeding a mean score of 2, or "somewhat of a barrier"), respondents do not see these items as overwhelming barriers to their ability to continue with school next semester.

14 Professors are knowledgeable [F(3, 76) = 0.187, p = 0.905]
I am able to get to classes [F(3, 74) = 1.091, p = 0.358]
Content of courses is valuable [F(3, 76) = 0.322, p = 0.809]
I enjoy my classes [F(3, 75) = 1.520, p = 0.26]
Campus is safe and secure [F(3, 76) = 0.430, p = 0.732]
My professors are fair [F(3, 76) = 0.773, p = 0.513]
Content of courses is relevant [F(3, 75) = 1.237, p = 0.302]
Registration process works well [F(3, 76) = 0.528, p = 0.664]
Tuition I pay is worthwhile [F(3, 76) = 0.850, p = 0.471]
My advisor is helpful [F(3, 76) = 1.227, p = 0.306]
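Barrier ratings like those in Table 10 can be related to an outcome such as the course completion ratio with a Pearson product-moment correlation. A from-scratch sketch on invented placeholder data (an illustration only, not the study's records):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 1-4 barrier ratings paired with DE course completion ratios
# (placeholder values, not the study's records).
barrier_rating   = [1, 1, 2, 2, 3, 3, 4, 4]
completion_ratio = [1.0, 0.8, 0.9, 0.6, 0.5, 0.6, 0.3, 0.2]

r = pearson_r(barrier_rating, completion_ratio)
print(f"r = {r:.3f}")  # negative: higher perceived barrier, lower completion
```

A negative coefficient of this kind is what the correlation analysis below reports for several of the barriers.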
To investigate the relationship between potential barriers and retention further, correlations15 were run between these barriers and both the course completion ratio (the ratio of DE courses passed to those attempted) and self-reported continuance (the respondent indicating either an intent to graduate and/or to transfer to another school). These results are presented in Table 11 below.

Table 11. Correlation Between Barriers to Retention and Course Completion and Self-Reported Expectation to Continue College (n = 153)

Barrier                                      Course Completion   Self-Reported Continuance
Cost of school                               -0.205*             0.144
Amount of time required                      -0.244**            -0.006
Work obligations                             -0.161              -0.125
Family obligations                           -0.122              -0.070
Difficulty of the classes                    -0.241**            -0.028
Navigating the administration                -0.216**            -0.036
This school's fit with my academic needs     -0.214**            -0.023
The lack of a social scene                   -0.215**            0.024
Distance from family/friends                 -0.043              0.102
I don't fit in at this school                -0.134              -0.011

* p < 0.05
** p < 0.01

The data above show that there is no correlation between a student's perception of each barrier and whether or not he or she expects to graduate or transfer to another college. However, there are correlations between student perception of barriers and the course completion ratio. In particular, the following barriers are significantly negatively correlated with
course completion:

• Amount of time required
• Difficulty of the classes
• Navigating the administration
• The lack of a social scene
• The school's fit with my academic needs
• Cost of school

In other words, as student perception of each of the above as a barrier rises, the likelihood that he or she passes his or her DE courses drops. Yet, there is no such correlation between student

15 The Pearson product-moment correlation coefficient is a measure of the relationship between two variables; in other words, a measure of the tendency of the variables to increase or decrease together.
perception of these barriers and their self-reported expectation to continue with college. This suggests that all of the barriers listed in the bullet points above impact student performance (as measured by DE course completion), but that the barriers do not impact student expectations regarding graduation or transfer.

Open Entry/Exit Math Labs
(ACC, PPCC and TSJC)

This section investigates the open entry/exit math lab innovation cluster, which includes developmental math courses that allow students to work at their own pace and to test independently, while making math mentors available to students as needed. Three institutions are included in this cluster:

• Arapahoe Community College (open entry/exit math labs)
• Pikes Peak Community College (open entry math labs)
• Trinidad State Junior College (open entry/exit math labs)

The findings presented below are divided into three sections: (1) outcomes, (2) process and (3) an overview.

Outcomes: How Successful Were Math Labs?

The outcome component of this study draws from institutional data on course completion and GPA, as well as faculty perceptions from the faculty and staff survey. These data, as well as data on how closely the control group matches the innovation group for this cluster, are presented below.

Characteristics of the Control and Innovation Groups

Students in the control group are in the same DE classes (but in a traditional course format) and attend the same institutions as those in the innovation group.
Additionally, the control group was matched to the innovation group along four demographic characteristics: (1) gender, (2) ethnicity, (3) race and (4) age. Table 12 below displays this relative match.

Table 12. Comparison of the Characteristics of the Control and Innovation Groups for Open Entry/Exit Math Labs

Characteristic                   Control (n = 390)   Innovation (n = 390)   Significant Difference^
Percentage Female (gender)       53.0%               53.0%                  No
Percentage Latino (ethnicity)    20.0%               14.6%                  No
Percentage Non-White (race)      39.2%               37.2%                  No
Mean Age                         28.16               28.02                  No

^ Statistical tests were conducted on each of the above to determine whether the difference between the control and innovation groups is statistically significant. Chi-square tests were conducted on gender (percentage female), ethnicity (percentage Latino) and race (percentage non-White), and an independent samples t-test was conducted on age.
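The chi-square balance checks described in the table note can be sketched as follows, using the ethnicity comparison (the largest observed gap). The counts below are reconstructed approximately from the reported percentages of n = 390 per group; this is an assumption, as the evaluators' exact counts are not published:

```python
from scipy.stats import chi2_contingency

# 2x2 contingency table for ethnicity (Latino vs. not Latino), with counts
# reconstructed from the reported 20.0% vs. 14.6% of n = 390 per group
# (approximate; the study's exact counts are not published).
observed = [[78, 312],   # control:    ~20.0% of 390 Latino
            [57, 333]]   # innovation: ~14.6% of 390 Latino

# chi2_contingency applies Yates' continuity correction for 2x2 tables.
chi2, p, dof, expected = chi2_contingency(observed)

print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")  # p > 0.05: not significant
```

With these reconstructed counts the ethnicity gap sits just above the 0.05 threshold, consistent with the table's "No" but close enough that the continuity correction matters.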
As shown above, the control group for this innovation cluster closely mirrors the innovation group in terms of gender, ethnicity, race and age. Indeed, the differences that exist are not statistically significant using this method of comparison.

Course Completion and Term GPA

To measure the impact of this innovation on course completion and term GPA, an independent samples t-test16 was conducted to compare the mean performance of students in the control group to those in the innovation group. Results are presented below.

Table 13. Results From t-Tests Comparing the Performance of the Control Group to the Innovation Group for Course Completion and Term GPA for Open Entry/Exit Math Labs

Item                           Control Mean (SD)   Innovation Mean (SD)   t       df    p
Mean Course Completion Ratio   0.60 (0.47)         0.50 (0.49)            3.065   777   0.002
Mean Term GPA                  2.80 (1.03)         2.79 (1.02)            0.019   510   0.985

As shown above, there is no significant difference in term GPA between the control and innovation groups. However, there is a significant difference between control and innovation in course completion, with math labs having a lower completion rate than control. This suggests that students in math labs complete their DE courses at lower rates than those in traditional formats, but that there is no difference in term GPA.

Faculty Perception

The faculty survey included two questions related to faculty members' perception of the success of open entry/exit math labs over the course of the semester.
The first of these questions asked respondents to indicate whether they thought math labs were better or worse for students than the traditional format. These results are presented in Figure 7 below. As shown, three respondents were unsure and four indicated that they thought math labs were about the same for students overall.

16 A t-test is a statistical procedure that is commonly used to examine differences in mean values across two groups. Results from this procedure determine whether two values are significantly different, i.e., not due to a chance occurrence in the data.
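As an illustration of the test described in the footnote, Table 13's course-completion comparison can be approximately reproduced from its summary statistics alone. A sketch assuming the n = 390 per group reported in Table 12 (the rounded means and SDs, and the published df of 777 rather than 778, mean the result lands near but not exactly on the published t = 3.065):

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Independent-samples t statistic (pooled variance) from summary stats."""
    df = n1 + n2 - 2
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)  # pooled SD
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    return t, df

# Rounded course-completion statistics from Table 13 (control vs. innovation),
# assuming 390 students per group as in Table 12.
t, df = pooled_t(0.60, 0.47, 390, 0.50, 0.49, 390)
print(f"t = {t:.2f}, df = {df}")  # in the neighborhood of the published t = 3.065
```

A t this far from zero at several hundred degrees of freedom corresponds to p well below 0.01, matching the reported p = 0.002.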
Figure 7. Faculty Perception of Open Entry-Exit Math Labs Compared to a Traditional Format (n = 7; ACC = 1, PPCC = 6, TSJC = 0). [Bar chart: "I don't know" 3, "About the same" 4; no respondents answered "Much worse," "Worse," "Better" or "Much better."]

The second question asked faculty members to indicate whether they thought math labs should be continued as a format at their institution. These findings are displayed in Figure 8 below. As shown in Figure 8, all respondents indicated that the math labs should be continued at their institution. Taken together, these data suggest that faculty see enough value in the math lab format to want to continue it, but that it is not always and completely better for students than the traditional format.

Figure 8. Faculty Preference for the Continuation of Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0). [Bar chart: all seven respondents answered "Probably" or "Definitely"; none answered "Unsure," "Probably NOT" or "Definitely NOT."]

Process: Were Math Labs Implemented as Intended?

The process component of this study draws on faculty survey data as well as interviews with key faculty. The faculty survey asked respondents to indicate how much they agreed or disagreed (on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree") with statements related to the intended implementation of math labs. Results presented in Table 14
below suggest that faculty members perceive the math labs as being implemented as intended.

Table 14. Process Measures for Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0)

Item                                               Mean (SD)     % (n) Agree or Strongly Agree
Math mentors are available to students as needed   4.71 (0.49)   100% (7)
Students are able to work at their own pace        4.71 (0.49)   100% (7)
Students test independently                        4.71 (0.49)   100% (7)
The math lab is being implemented as intended      4.43 (0.79)   86% (6)

In addition to the faculty survey data presented above, interviews with key faculty provided important contextual information about what is going well and what could be improved related to math labs. These findings are presented in three sections below: positive developments, ongoing challenges and potential temporary growing pains associated with start-up.

Positive Developments in Implementation

Interviews with key faculty revealed several positive developments in the implementation of open entry-exit math lab courses at CCCS institutions. These included:

• Increased flexibility for students—this format allows students more flexibility in two ways: (1) students are able to flex their daily work schedules, and (2) because math labs are "student centric" rather than "class centric," students can register for a math lab after the start of regular classes.
• Appropriate pace—students are able to progress at the most appropriate pace for them. For some students this is faster than the traditional format; for others it is slower.
• Mastery of the subject matter—with this format, students have to pass each element in order to progress. As such, they are not able to pass a class without understanding a particular element (which is sometimes possible in traditional formats because the course grade accounts for the entire course). Thus, in some instances, this format leads to greater subject mastery than traditional formats.
• More welcoming to older students—some older students are uncomfortable sitting in a classroom with mostly younger students. This format allows older students to avoid such a situation.
• Reduces point-in-time student-to-teacher ratios—this format reduces the student-to-teacher ratio at any given point in time, allowing for more one-on-one instruction.

Ongoing Challenges in Implementation

Interviews with key faculty identified some structural challenges to this format that are likely to be ongoing. These included:
• Different facility requirements—math labs have different facility requirements than traditional formats. In general, they require a larger individual working space (though not necessarily more room space) as well as computers. Attaining such space can be a challenge because most existing space was designed for traditional formats.
• Increased administrative complexity—math labs increase administrative complexity in a few ways: (1) staffing needs fluctuate throughout the day as well as throughout the semester; (2) there are more classes to account for; and (3) existing administrative functions (such as billing) don't always align with this format.
• Increased complexity for instructors—this format can be more complex for instructors, as at any one time they have to assist students working on a variety of concepts (rather than assisting all students with the same concept, as in a traditional format).
• Insufficient time management (on the part of students)—the increased flexibility of this format requires students to carefully manage their time, and some students have difficulty doing so.
• "Appropriate pace" ≠ faster—while math labs allow students to progress at the most appropriate pace for them, this pace can be slower than traditional formats.
Start-Up Growing Pains
Beyond the ongoing structural challenges listed above, several potentially temporary challenges emerged in interviews with key faculty, including:
• Messaging issues—there was some confusion among students and administrators as to what "self-paced" entails within math labs. Additionally, there was some confusion as to the type of student for whom this format works best.
• Insufficient training—several faculty members indicated they could have benefited from more training prior to implementing this new format.

Overview of Math Labs
Below (Table 15) is a summary of findings for the math lab innovation cluster.
Table 15. Overview of Math Labs

Item                                           Performance
Course Completion Over Control                 Significantly Lower
Term GPA Over Control                          No Significant Difference
Perception of Innovation Quality (Faculty)     About the Same
Desire to Continue Innovation (Faculty)        Yes
Implemented as Intended (Faculty Perception)   Yes

Key Contextual Notes
Positive Developments in Implementation
• Increases flexibility for students
• Allows appropriate pace (not necessarily faster)
• Mastery of the subject matter (not just pass)
• More friendly for some older students
• Reduces point-in-time student-to-teacher ratios
Ongoing Challenges in Implementation
• Different facility requirements
• Increased administrative complexity
• Increased complexity for instructors
• Insufficient time management (on the part of students)
• "Appropriate pace" ≠ faster
Start-Up Growing Pains
• Messaging issues
• Insufficient training

Accelerated, Compressed, Contextualization and Mainstreaming
(CCA, CCD, FRCC and LCC)
This section investigates the accelerated, compressed, contextualization and mainstreaming innovation cluster. Accelerated and compressed courses alter the scheduling of developmental education to allow students to complete their remediation courses more quickly. Several institutions combined these innovations with a form of mainstreaming in which students are allowed to enroll in college-level courses with additional credit hours that give them the time they need to strengthen their foundational skills and meet developmental education requirements.
One institution also engaged in contextualization, in which developmental education is embedded within relevant career or technical competencies.
Four institutions are included in this cluster:
• Community College of Aurora (accelerated, compressed and mainstreaming)
• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)
• Front Range Community College (accelerated and compressed)
• Lamar Community College (accelerated and compressed)

The findings for this innovation cluster are presented below in three sections: (1) outcomes, (2) process and (3) an overview.

Outcomes: How Successful Were These Innovations?
Institutional data on course completion and GPA were paired with data from the faculty and staff survey to measure the success of these innovations. These data, as well as how closely the control group matches the innovation group for this cluster, are presented below.

Characteristics of the Control and Innovation Groups
Students in the control group are in the same DE classes (but are in a traditional format) and attend the same institutions as those in the innovation group. Additionally, the control group was matched to the innovation group along four demographic characteristics: (1) gender, (2) ethnicity, (3) race and (4) age. Table 16 below displays this relative match.

Table 16.
Comparison of the Characteristics of the Control and Innovation Groups for Accelerated, Compressed, Contextualized and Mainstreaming

Characteristic                  Control (n = 140)   Innovation (n = 178)   Significant Difference^
Percentage Female (gender)      60.0%               60.1%                  No
Percentage Latino (ethnicity)   19.3%               21.9%                  No
Percentage Non-White (race)     37.9%               36.0%                  No
Mean Age                        27.71               28.42                  No

^ Statistical tests were conducted on each of the above to determine whether the observed difference between the control and innovation groups is statistically significant. Chi-Square tests were conducted on gender (percentage female), ethnicity (percentage Latino) and race (percentage non-White), and an independent samples t-test was conducted on age.

As shown above, the control group for this innovation cluster closely mirrors the innovation group in terms of gender, ethnicity, race and age. Indeed, the differences that exist are not statistically significant using this method of comparison.

Course Completion and Term GPA
An independent samples t-test was conducted comparing the mean performance of students in the control group to those in the innovation group, to measure the impact of this innovation on course completion and term GPA. Table 17 presents these findings below.
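As context before turning to the results, the comparison recipe used throughout this report, Chi-Square tests for the categorical matching variables and independent samples t-tests for continuous measures, can be sketched in a few lines of Python with scipy. All numbers below are invented for demonstration and are not the study's records:

```python
# Illustrative sketch of the report's comparison recipe: a chi-square test
# on a categorical matching variable and an independent samples t-test on a
# continuous outcome. All values are invented, NOT the study's records.
import numpy as np
from scipy import stats

# Hypothetical gender counts; rows = groups, columns = [female, male]
counts = np.array([[84, 56],     # control
                   [107, 71]])   # innovation
chi2, p_gender, dof, _ = stats.chi2_contingency(counts)
print(f"gender: chi2 = {chi2:.3f}, dof = {dof}, p = {p_gender:.3f}")

# Hypothetical term GPAs for each group, drawn from normal distributions
rng = np.random.default_rng(0)
control_gpa = rng.normal(2.7, 0.9, size=140).clip(0, 4)
innovation_gpa = rng.normal(3.1, 0.9, size=178).clip(0, 4)
t, p_gpa = stats.ttest_ind(control_gpa, innovation_gpa)
print(f"term GPA: t = {t:.3f}, p = {p_gpa:.3f}")
```

Note that the chi-square test operates on a contingency table of raw counts (not percentages), which is why the sketch converts the groups' composition into counts first.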
Table 17. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Accelerated, Compressed, Contextualized and Mainstreaming

Item                           Control Mean (SD)   Innovation Mean (SD)   t        df    p
Mean Course Completion Ratio   0.70(0.44)          0.76(0.36)             -1.275   266   0.203
Mean Term GPA                  2.73(0.10)          3.12(0.90)             -2.282   197   0.005

As shown above, there is no significant difference in course completion between the control and innovation groups. There is, however, a significant difference in term GPA: students in accelerated, compressed, contextualized and mainstreaming courses had higher term GPAs than those in traditional formats, while their course completion rates did not differ.

Faculty Perception
Faculty perception of the success of this innovation cluster was measured by two questions contained in the faculty survey. One question asked respondents to indicate whether they thought these innovations were better or worse for students than the traditional format. As shown in Figure 9 below, most (4, or 80%) respondents thought these innovations were an improvement upon the traditional format for DE.

Figure 9.
Faculty Perception of Accelerated and Compressed Courses Compared to a Traditional Format (n = 5; CCA = 0, CCD = 0, FRCC = 5, LCC = 0)

[Bar chart: counts of faculty responses on a five-point scale from "Much worse" to "Much better."]

The second question asked faculty members to indicate whether they thought these innovations should be continued at their institution. Results for this question are presented in Figure 10 below.
Figure 10. Faculty Preference for the Continuation of Accelerated and Compressed Courses (n = 5; FRCC = 5, LCC = 0)

[Bar chart: counts of faculty responses on a five-point scale from "Definitely NOT" to "Definitely."]

As shown in Figure 10 above, all respondents indicated that these innovations should be continued at their institution. Taken together, these data suggest that faculty members view accelerated, compressed, contextualized and mainstreaming formats as superior to a traditional format.

Process: Were These Innovations Implemented as Intended?
Faculty member survey data and interviews with key faculty members provide the data sources for the process component of this study. The faculty survey asked respondents to indicate how much they agreed or disagreed (on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree") with statements related to the intended implementation of these innovations. As seen in Table 18, faculty members perceive accelerated, compressed, contextualized and mainstreaming courses as being implemented as intended.

Table 18.
Process Measures for Accelerated, Compressed, Contextualized and Mainstreaming Courses (n = 5; FRCC = 5, LCC = 0)

Item                                                                          Mean (SD)    %(n) Agree or Strongly Agree
Each semester, students have the opportunity to take two or more DE courses   4.40(0.55)   100%(5)
Students can take DE courses alongside college-level courses                  4.00(0.71)   80%(4)
Compressed/accelerated courses are being implemented as intended*             4.33(0.58)   100%(3)

* One respondent selected "I don't know" for this item, and one did not respond at all; thus this item was calculated for a total of three respondents.
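For readers curious how Table 18's two summary columns, a mean with standard deviation and the share answering "Agree" or "Strongly agree," can be derived from raw five-point responses, here is a minimal sketch. The individual response values are invented; they were chosen only so that the output reproduces the first row of the table:

```python
# Sketch of deriving Likert summary columns (mean, SD, and share who
# "Agree"/"Strongly agree") from five-point responses. The response values
# are invented for illustration, not the survey's raw data.
from statistics import mean, stdev

responses = [4, 5, 4, 4, 5]  # 1 = Strongly disagree ... 5 = Strongly agree

avg = mean(responses)
sd = stdev(responses)  # sample standard deviation
agree = [r for r in responses if r >= 4]  # "Agree" (4) or "Strongly agree" (5)
pct_agree = 100 * len(agree) / len(responses)

print(f"{avg:.2f}({sd:.2f})  {pct_agree:.0f}%({len(agree)})")
# → 4.40(0.55)  100%(5)
```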
In addition to these faculty survey data, interviews with key faculty members provided important contextual information about what was going well and what could be improved related to these innovations. These findings are presented in three sections below: positive developments, ongoing challenges and potentially temporary growing pains associated with start-up.

Positive Developments in Implementation
There have been a variety of positive developments in the implementation of accelerated, compressed and mainstreamed DE courses at CCCS institutions. In particular, interviews with key faculty revealed that, as intended, this format:
• Allows students to progress more quickly—students are allowed to progress through DE courses more quickly than in a traditional format.
• Positively impacts student motivation—some faculty members mentioned a noted improvement in the motivation of their students. In part, faculty members believe students are motivated by (1) seeing the results of their work more quickly, and (2) being treated like college students.
• Contributes to an improved academic culture—some faculty members perceive a qualitative change in the academic culture in accelerated and compressed courses. In particular, faculty members reported observing an increase in interaction between students—helping each other, holding each other accountable, and reaching out to each other outside of the classroom. Faculty also noted an increase in personal responsibility on the part of some students.
Additionally, interviews revealed that contextualized courses:
• Increase student autonomy—students have more autonomy to shape their learning experience within the bounds of the program.
• Increase curriculum relevance—contextualization makes the curriculum more relevant to many students by relating academic subjects to real-world situations.
• Increase student engagement—faculty members observed increased student engagement in contextualized courses.
• Facilitate learning across subjects—contextualization increased learning across subjects for both students and faculty.

Ongoing Challenges in Implementation
Though overall the implementation of accelerated and compressed courses was quite successful, interviews with key faculty members identified some structural challenges to this format that are likely to be ongoing. These include:
• Students' lack of desire to go faster—a subset of students had no interest in completing their courses at a faster pace. The reasons for this vary by student, but faculty members speculated that the following were among the more prominent reasons:
  - When students fear a subject, doing it faster can be scary
  - Some students don't feel pressure to graduate because they enjoy the college lifestyle
  - Some students resist engaging in accelerated and compressed courses because they don't want to be in different classes than their friends and peers
• Students' lack of ability—some students placed within compressed and accelerated courses simply lacked the ability to work at the pace of the class.
• Complexity of administrative logistics—compressed and accelerated courses add a layer of administrative complexity, as administrators need to coordinate courses with different schedules and credit loads.
• Less room to adjust to unforeseen issues—a somewhat challenging byproduct of a compressed and accelerated schedule is that there is less room to adjust to unforeseen issues. For example, one instructor indicated that due to the shorter timeframe, it was more difficult to make up ground after missing a class.
• Finding the appropriate pace—the pace is too fast for some students, and thus an ongoing challenge is to find the appropriate pace for compressed and accelerated courses (i.e., not too fast, not too slow).
• Students' need for additional support—accelerated and compressed courses sometimes require more support for students than traditional formats.
• Occasional tension between contextual projects and basic content—while faculty members indicated that contextualization generally supported coverage of basic content, several also indicated that under time constraints there was occasionally tension between finishing a contextual project and ensuring all basic content was covered.

Start-Up Growing Pains
In addition to the ongoing structural challenges listed above, several potentially temporary challenges emerged in interviews with key faculty, including:
• Messaging issues—there was some confusion among students and administrators as to what accelerated and compressed courses entailed, and some stated that messaging efforts could be improved moving forward.
• Insufficient training—several faculty members indicated they could have benefited from more training prior to implementing this new format.
• Time constraints—faculty members indicated that getting these innovations up and running was a massive push under the time constraints.

Overview of These Innovations
Table 19 below summarizes the findings for the accelerated, compressed, contextualized and mainstreaming innovation cluster.
Table 19. Overview of Accelerated, Compressed, Contextualized and Mainstreaming

Item                                           Performance
Course Completion Over Control                 No Significant Difference
Term GPA Over Control                          Significantly Higher
Perception of Innovation Quality (Faculty)     Better
Desire to Continue Innovation (Faculty)        Yes
Implemented as Intended (Faculty Perception)   Yes

Key Contextual Notes
Positive Developments in Implementation
• Allows students to progress more quickly
• Positively impacts student motivation
• Contributes to an improved academic culture
• Increases student autonomy
• Increases curriculum relevance
• Increases student engagement
• Facilitates learning across subjects
Ongoing Challenges in Implementation
• Students' lack of desire to go faster
• Students' lack of ability
• Complexity of administrative logistics
• Less room to adjust to unforeseen issues
• Finding the appropriate pace
• Students' need for additional support
• Occasional tension between contextual projects and basic content
Start-Up Growing Pains
• Messaging issues
• Insufficient training
• Time constraints

Online Hybrid Courses
(CCCOnline and OJC)
This section investigates the online hybrid courses cluster, which includes institutions that combined elements of online education with a traditional format. Institutions in this cluster include:
• Colorado Community College Online (online hybrid courses)
• Otero Junior College (online hybrid courses)
The findings presented below are divided into three sections: (1) outcomes, (2) process and (3) an overview.

Outcomes: How Successful Were Online Hybrid Courses?
Institutional data on course completion and term GPA, as well as the match between the innovation and control groups, are presented below. No one from CCCOnline or OJC answered the faculty and staff survey; thus, no faculty perception data are available for this cluster.

Characteristics of the Control and Innovation Groups
Students in the control group are in the same DE classes (but are in a traditional format) and attend the same institutions as those in the innovation group. Additionally, the control group was matched to the innovation group along four demographic characteristics: (1) gender, (2) ethnicity, (3) race and (4) age. Table 20 below displays this relative match.

Table 20. Comparison of the Characteristics of the Control and Innovation Groups for Online Hybrid Courses

Characteristic                  Control (n = 27)   Innovation (n = 34)   Significant Difference^
Percentage Female (gender)      51.9%              55.9%                 No
Percentage Latino (ethnicity)   0.0%               17.6%                 Yes*
Percentage Non-White (race)     39.3%              38.2%                 No
Mean Age                        21.14              25.32                 Yes**

^ Statistical tests were conducted on each of the above to determine whether the observed difference between the control and innovation groups is statistically significant. Chi-Square tests were conducted on gender (percentage female), ethnicity (percentage Latino) and race (percentage non-White), and an independent samples t-test was conducted on age.
* p < 0.05
** p < 0.01

As shown above, the control group for this innovation cluster closely mirrors the innovation group in terms of gender and race, but not in terms of ethnicity and age. Indeed, the control group has no Latinos (as opposed to 17.6% of the innovation group) and is significantly younger than the innovation group. These differences may have impacted the findings presented below.

Course Completion and Term GPA
To measure the impact of the online hybrid format on course completion and term GPA, an independent samples t-test was conducted to compare the mean performance of students in the control group to those in the innovation group. These results are presented below.
Table 21. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Online Hybrid Courses

Item                           Control Mean (SD)   Innovation Mean (SD)   t        df   p
Mean Course Completion Ratio   0.55(0.48)          0.63(0.48)             -0.643   60   0.523
Mean Term GPA                  2.84(0.86)          2.69(0.71)             0.616    43   0.541

As shown above, there is no significant difference in term GPA or course completion between the control and innovation groups. This suggests that there is no difference in either course completion or term GPA between students in online hybrid formats and those in traditional formats.

Process: Were Online Hybrid Courses Implemented as Intended?
Because no faculty members from CCCOnline or OJC answered the faculty survey, the process data for this innovation cluster are derived entirely from faculty interviews. However, these interviews provided important contextual information about what was going well and what could be improved related to online hybrid courses. These findings are presented in two sections below: positive developments and potentially temporary growing pains associated with start-up. Additionally, within these sections, findings from CCCOnline are separated from OJC because their experiences were substantially different.

Positive Developments in Implementation
The online hybrid innovation implemented by CCCOnline has experienced several positive developments.
In particular, interviews with key faculty members revealed that, as intended, this format:
• Adds a "personal touch" to online courses—this innovation adds a human component to online education
• Expands tutoring within CCCOnline—this innovation adds additional in-house tutoring support

Interviews with key faculty at OJC revealed that:
• Awareness was established—students were made aware of OJC's online hybrid program
• Access was provided—students had access to OJC's online hybrid program

Start-Up Growing Pains
CCCOnline's online hybrid innovation has both an English and a math component. The math component was largely successful, but the English component experienced some challenges:
• Insufficient program definition—the English program suffered from a lack of definition, which made it difficult to implement and even more difficult to communicate
  • Evaluation  of  the  Completion  Innovation  Challenge  Grant       n n Messaging  issues—there   was   some   confusion   among   students   and   faculty   as   to   how  this  innovation  works   Lack   of   integration—the   additional   tutoring   support   was   not   sufficiently   integrated  into  the  class  structure   On   the   other   hand,   OJC’s   most   fundamental   challenge   was   that   it   was   largely   not   utilized  by  students  last  semester.   Overview  of  Online  Hybrid  Courses   Table  22  below  summarizes  the  findings  for  the  online  hybrid  innovation  cluster.   Table  22.  Overview  of  Online  Hybrid  Classes   Item   Performance   Course  Completion  Over  Control   No  Significant  Difference   Term  GPA  Over  Control   No  Significant  Difference   Perception  of  Innovation  Quality  (Faculty)   No  Data   Desire  to  Continue  Innovation  (Faculty)   No  Data   Implemented  as  Intended  (Faculty  Perception)   No  Data   Key  Contextual  Notes   Positive  Developments  in  Implementation   • Adds  a  “personal  touch”  to  online  courses     • Expands  tutoring  within  CCCOnline     • Awareness  was  established     • Access  was  provided     Start-­‐Up  Growing  Pains   • Insufficient  program  definition     • Messaging  issues     • Lack  of  integration     • OJCs  largely  not  utilized           Modularization  and  Diagnostic  Assessments   (MCC,  NJC  and  PCC)   This  section  investigates  the  modularization  and  diagnostic  assessments  innovation  cluster.       Modularization  refers  to  the  reorganization  of  developmental  education  courses  into  distinct   stand-­‐alone  modules  (or  mods)  that  can  be  taken  in  a  variety  of  combinations.    To  identify  which   mods  are  most  appropriate  for  a  student  to  take,  students  typically  take  a  pretest  (diagnostic   assessment).    
The three institutions in this cluster include:
• Morgan Community College (diagnostic assessments and math mods)
• Northeastern Junior College (diagnostic assessments and math mods)
• Pueblo Community College (diagnostic assessments and math mods)

Findings for this cluster are divided into three sections: (1) outcomes, (2) process and (3) an overview.

Outcomes: How Successful Were Modularization and Diagnostic Assessments?
Institutional data on course completion and GPA were paired with data from the faculty and staff survey to measure the success of these innovations. These data, as well as how closely the control group matches the innovation group for this cluster, are presented below.

Characteristics of the Control and Innovation Groups
Students in the control group are in the same DE classes (but are in a traditional format) and attend the same institutions as those in the innovation group. Additionally, the control group was matched to the innovation group along four demographic characteristics: (1) gender, (2) ethnicity, (3) race and (4) age. Table 23 below displays this relative match.

Table 23. Comparison of the Characteristics of the Control and Innovation Groups for Modularization and Diagnostic Assessments

Characteristic                  Control (n = 84)   Innovation (n = 84)   Significant Difference^
Percentage Female (gender)      54.8%              54.8%                 No
Percentage Latino (ethnicity)   36.9%              36.9%                 No
Percentage Non-White (race)     63.1%              63.1%                 No
Mean Age                        28.79              30.19                 No

^ Statistical tests were conducted on each of the above to determine whether the observed difference between the control and innovation groups is statistically significant.
Chi-Square tests were conducted on gender (percentage female), ethnicity (percentage Latino) and race (percentage non-White), and an independent samples t-test was conducted on age.

As shown above, the control group for this innovation cluster closely mirrors the innovation group in terms of gender, ethnicity, race and age. Indeed, the differences that exist are not statistically significant.

Course Completion and Term GPA
To measure the impact of this innovation on course completion and term GPA, an independent samples t-test was conducted comparing the mean performance of students in the control group to those in the innovation group. These results are presented below.
Table 24. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Modularization and Diagnostic Assessments

Item                           Control Mean (SD)   Innovation Mean (SD)   t        df    p
Mean Course Completion Ratio   0.64(0.474)         0.74(0.421)            -1.462   164   0.146
Mean Term GPA                  2.93(0.866)         3.08(0.901)            -0.893   118   0.374

As shown above, there is no significant difference in term GPA or course completion between the control and innovation groups. This suggests that there is no difference in either course completion or term GPA between students in math mods with (and without) diagnostic assessments and those in traditional formats.

Faculty Perception
The faculty survey included two questions related to faculty members' perceptions of the success of modularization and diagnostic assessments over the course of the semester. The first of these questions asked respondents to indicate whether they thought modularization was better or worse for students than a traditional format. As shown in Figure 11 below, all three respondents thought modularization was an improvement upon the traditional format for DE.

Figure 11. Faculty Perception of Modularized Courses With Diagnostic Assessments Compared to a Traditional Format (n = 3; MCC = 1, NJC = 0, PCC = 2)

[Bar chart: counts of faculty responses on a five-point scale from "Much worse" to "Much better."]

The other question asked faculty members to indicate whether they thought these innovations should be continued at their institution. Figure 12 below displays these findings.
Figure 12. Faculty Preference for the Continuation of Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2)

[Bar chart: counts of faculty responses on a five-point scale from "Definitely NOT" to "Definitely."]

As shown above, all respondents felt strongly that math modularization should be continued at their institution. Taken together, these data suggest that faculty members view modularization with diagnostic assessments as superior to a traditional format.

Process: Were Modular Courses With/Without Diagnostic Assessments Implemented as Intended?
The process component of this study draws on faculty survey data as well as interviews with key faculty. The faculty survey asked respondents to indicate how much they agreed or disagreed (on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree") with statements related to the intended implementation of modular courses. Results presented in Table 25 below suggest that faculty members perceive math modularization and diagnostic assessments as being implemented largely as intended.

Table 25. Process Measures for Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2)

Item                                                                                        %(n) Agree or Strongly Agree
Developmental math has been reorganized into modules                                        100%(3)
Our modular math program is being implemented as intended                                   67%(2)
Our use of diagnostic assessments for developmental math is being implemented as intended^  100%(1)

Note: Due to the exceptionally small sample, means were not provided.
^ PCC does not have diagnostic assessments, and thus the two respondents from PCC were not asked this question.
In addition to the faculty survey data presented above, interviews with key faculty provided important contextual information about what was going well and what could be improved related to modular courses. These findings are presented in three sections below: positive developments, ongoing challenges and potentially temporary growing pains associated with start-up.

Positive Developments in Implementation

CCCS institutions implementing modular math with and without diagnostic assessments have experienced several positive developments in the implementation of modularized DE courses. In particular, interviews with key faculty members revealed that, as intended, this format allows for:

• Appropriate pace—students are able to progress at the most appropriate pace for them within a structured timeframe. For some students this is faster than the traditional format; for others it is slower.
• Mastery of the subject matter—in this format, students have to pass each element in a mod in order to progress, and each student has to pass the mod itself. As such, students cannot pass a class without understanding a particular element (which is sometimes possible in traditional formats because the course grade accounts for the entire course). Thus, in some instances, this format leads to greater subject mastery than traditional formats.
• Shorter remediation track—because this format breaks the content into smaller pieces (or mods) than an entire class, students are able to take only those elements that they require. Additionally, students don't have to register for entire classes if they are unable to complete the entire sequence. As such, their remediation track can be substantially shorter than it would be in a traditional format.
• Instant feedback—students are provided with instant feedback on their progress, which allows them to make necessary adjustments more quickly.
• Appropriate placement—diagnostic assessments place students more precisely and appropriately.

Ongoing Challenges in Implementation

Interviews with key faculty identified some structural challenges to this format that are likely to be ongoing, including:

• Increased administrative complexity—this format introduced additional administrative complexity in a couple of ways:
  • Financial aid policies don't align well with modularization
  • There are administrative limits on the number of certain types of classes that can be offered, and math mods are subject to these limits
• Perception that students are "teaching themselves"—according to faculty, some students view this format as a form of self-teaching, and some don't see the benefit in such a format.
• Lack of computer skills—this format is computer heavy, and some students lack the computer skills necessary to effectively complete these mods. Additionally, some students who have the necessary computer skills simply don't enjoy working on computers.
• Diagnostic testing ≠ shorter remediation track—because a large portion of students need the entire remediation track, diagnostic testing has not shortened the remediation track for many students.

Start-Up Growing Pains

Beyond the ongoing structural challenges listed above, a potentially temporary challenge emerged in interviews with key faculty:

• Messaging issues—some students and administrators were unsure how modular math works.

Overview of Modularization and Diagnostic Assessments

Table 26 below summarizes the findings for the modularization and diagnostic assessments innovation cluster.

Table 26. Overview of Modularization and Diagnostic Assessments

Item | Performance
Course Completion Over Control | No Significant Difference
Term GPA Over Control | No Significant Difference
Perception of Innovation Quality (Faculty) | Better
Desire to Continue Innovation (Faculty) | Yes
Implemented as Intended (Faculty Perception) | Yes

Key Contextual Notes
• Positive Developments in Implementation: appropriate pace; mastery of the subject matter; shorter remediation track; instant feedback; appropriate placement
• Challenges in Implementation: increased administrative complexity; perception that students are "teaching themselves"; lack of computer skills; diagnostic testing ≠ shorter remediation track
• Start-Up Growing Pains: messaging issues
Conclusion

In the first semester of the CICG project, 12 institutions implemented seven innovations in DE in various combinations that fall within one of four innovation clusters:

• Open Entry/Exit Math Labs
• Accelerated, Compressed, Contextualized and Mainstreaming
• Online Hybrid
• Modularization and Diagnostic Assessments

Table 27 below summarizes how each of these innovation clusters performed over the course of the last semester along several items.

Table 27. Overview of All Innovation Clusters

Item | Open Entry/Exit Math Labs | Accelerated, Compressed, Contextualized & Mainstreaming | Online Hybrid Courses | Modularization and Diagnostic Assessments
Course Completion | Significantly Lower | No Significant Difference | No Significant Difference | No Significant Difference
Term GPA | No Significant Difference | Significantly Higher | No Significant Difference | No Significant Difference
Perception Compared to Traditional (Faculty) | About the Same | Better | No Data | Better
Desire to Continue (Faculty) | Yes | Yes | No Data | Yes
Implemented as Intended (Faculty) | Yes | Yes | No Data | Yes

As shown above, open entry/exit math labs had lower completion rates than their control group. Though this is not in the intended direction, interviews with key faculty revealed a potential cause: math labs allow students to progress at their own pace, which is sometimes slower than a traditional format. Additionally, several faculty members suggested that math labs were improving learning by forcing students to master topics before advancing. As one interviewee put it:
"If success is measured in terms of how quickly students pass through, then [math labs] are not a success. But if success is measured in terms of how much students learn, then [math labs] are a complete success."
— Faculty member involved with a math lab as part of the CICG project

Thus, while math labs had significantly lower completion rates, this is not an entirely negative finding.

Additionally, Table 27 demonstrates that the remaining three clusters did not demonstrate statistically significantly higher course completion rates than their control groups, though this may change over time as students progress through their DE courses. The same could be said for term GPA, which showed no significant difference from control for three of the four clusters. However, the accelerated, compressed, contextualized and mainstreaming cluster had term GPAs that were statistically significantly higher than control (3.12 compared to 2.73).

In terms of faculty perception, it appears that faculty members in all innovation clusters (for which data are available) believe that these innovations were implemented as intended. Further, within all clusters for which data are available, faculty members would like to see these innovations continued. Finally, faculty members in both the modularization and diagnostic testing and the accelerated, compressed, contextualized and mainstreaming clusters believe that these innovations are an improvement upon a traditional format for most students. Faculty members in math labs, on the other hand, largely view them as of about the same overall quality. However, interviews with faculty members suggest that this finding is a result of viewing math labs as better for some students, and worse for others. As such, these faculty members view math labs as a positive development, but not a solution for all students.

These data go some distance in answering the outcome-related research questions:

• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?

It is premature to fully answer this question, but thus far there is not strong evidence to suggest that innovation formats are outperforming traditional formats in terms of retention and GPA. This is not entirely surprising, as these measures are largely long-term measures, and CCCS institutions are still in the initial stages of implementing these innovations. Additionally, it appears that some innovations provide benefits to students that are not objectively measured by this evaluation.

• Which innovations are the most successful (in terms of graduation, retention and GPA)?

At this point in the evaluation, the accelerated, compressed, contextualized and mainstreaming innovation cluster is outperforming the other innovations in terms of retention and GPA.
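The term-GPA comparison reported above (3.12 vs. 2.73, statistically significant) can be illustrated with a minimal sketch. The report does not specify which test was used; Welch's two-sample t statistic is one conventional choice, shown here on invented GPA values whose means happen to match the reported figures. These are not the evaluation's actual student records.

```python
import math
from statistics import mean, stdev

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)

# Invented GPA values -- NOT the evaluation's data. Their means match the
# reported 3.12 (innovation) and 2.73 (control) only for illustration.
innovation_gpas = [3.3, 2.9, 3.4, 3.0, 3.0]
control_gpas = [2.6, 2.9, 2.5, 2.8, 2.85]

print(round(welch_t(innovation_gpas, control_gpas), 2))  # prints 3.15
```

A real analysis would also compute degrees of freedom and a p-value (e.g., with a statistics package); the statistic alone is shown here to keep the sketch self-contained.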
Additionally, these data address the following process-related research questions:

• Were the innovations implemented as intended?

Despite some initial hurdles, and with a few exceptions, these innovations are being implemented largely as originally intended.

• What can the colleges and CCCS learn from the implementation of the seven innovations?

The evaluation of the first semester of the implementation of the CICG project has uncovered a variety of important lessons:

• Messaging is important—Many innovations would benefit from better messaging, such that both students and administrators fully understand what each innovation is and the type of student it best serves.
• Appropriate pace ≠ faster pace—While most of the people involved in these innovations envisioned a faster pace (compared to a traditional format), throughout the semester it became clear that for some students, a slower pace is more appropriate.
• There are unanticipated benefits to some of these innovations—Some faculty members have observed unanticipated benefits to the innovations they participated in (e.g., the benefits of a slower pace for some students). Many of these benefits are not adequately captured by this evaluation.
• New formats are resource intensive to set up—Setting up a new format requires additional time and money.
• New formats have a learning curve—There is a learning curve to new formats, and most faculty anticipate improved performance in the coming semester as they adapt to what they learned this semester.
• Innovations are not necessarily replacements for a traditional format—With only a few exceptions, most faculty believe that some innovations are better for some students, but that a traditional format is better for others. As such, these faculty believe that their institution should provide BOTH the innovative format and a traditional format.

Additionally, several potential barriers to retention not related to these innovations emerged as significantly correlated with course completion (though not with respondents' expectations for graduation or transfer). These include:

• Difficulty of the classes
• Amount of time required
• Navigating the administration
• The lack of a social scene

These findings are preliminary, and it is far too early to make any conclusive judgments about the success of the innovations implemented as part of the CICG project. Such judgments will come later as data are collected over a longer period of time and these innovations mature. However, the data collected to date suggest that these innovations provide a benefit to students and should continue to be implemented. Despite the benefits, however, these innovations are unlikely to be a panacea for the challenges faced by DE.
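The barrier analysis mentioned above relates student-reported barriers to course completion. A minimal sketch of one common approach, a point-biserial (Pearson) correlation between a 1–5 barrier rating and a 0/1 completion flag, is shown below on invented data; the report does not name the exact statistic it used.

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation; with a 0/1 variable y this equals the point-biserial r."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Invented responses -- not the survey data. Higher perceived difficulty
# is paired with lower completion to show a negative correlation.
difficulty_rating = [5, 4, 2, 1, 3, 5, 2, 1]   # 1-5 "difficulty of the classes"
completed_course  = [0, 0, 1, 1, 1, 0, 1, 1]   # 1 = completed the course

print(round(pearson_r(difficulty_rating, completed_course), 2))  # prints -0.9
```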
Appendix A: Student Survey
Appendix B: Faculty Survey
Appendix C: Faculty Interview Guide

Colorado Community College System (CCCS) Developmental Education Innovation Interview Protocol

Interviewee Name:
Date: XX/XX/XXXX
Time: XX:XX–XX:XX
Mode: Phone
Interviewer: NAME
Note-taker: NAME
Interview #: XX
Institution(s): ACC / CNCC / NJC / RRCC / CCA / FRCC / PCC / TSJC / CCConline / LCC / OJC / CCD / MCC / PPCC / Other
Logistic Notes:
Key Notes:
Key Quotes:
General Tone:
Potential Tool Improvements:
INTRODUCTION

Hi Mr./Ms. BLANK, my name is BLANK2, and I'm calling on behalf of the Colorado Community College System. I was hoping to ask you a few questions about your experience with NAME OF INTERVENTION at NAME OF INSTITUTION. Is now a good time for you?
- (If no) Could we set up another time to talk? (try to schedule a new time)
- (If yes) Great! (continue with script)

Like I said, my name is BLANK2, and I'm with JVA Consulting. We are working with the Colorado Community College System to better understand the implementation of innovations in developmental education in CCCS institutions. We are interested in your experience with developmental education at your institution, both in what has gone well and in the challenges you face. This interview should take between 15 and 20 minutes, depending on how much you want to share. Will that still work for you?
- (If no) Could we set up another time to talk? (try to schedule a new time)
- (If yes) Ok good. (continue with script)

I have a few prepared questions, and while we'd love to get your feedback on all of them, no question is required. Please feel free to interrupt at any time—we are really just interested in your experience. Also, feel free to ask me to clarify a question if it doesn't make sense. Finally, everything you share with me today is completely confidential. We will be preparing a report that includes quotes from interviewees, but your name will not appear anywhere in this report unless you explicitly allow us to do so. The Colorado Community College System really wants to know what is going well, and what could be improved, so please share freely.

Do you understand the study as I've laid it out to you?
  • Evaluation  of  the  Completion  Innovation  Challenge  Grant   87       - (If no) What could I help to clarify for you? (answer questions) - (If yes) Great! (continue with script) Do you agree to participate in this study as I’ve laid it out? - (If no) Is there anything I can do to alleviate your concerns? (do your best to address concerns without comprising design) - (If yes) Great! (continue with script) Do you have any questions for me before we begin? Questions CONTEXT 1. What is your relationship with NAME OF INNOVATION related to developmental education at NAME OF INSTITUTION? Probe: How familiar are you with NAME OF INNOVATION at NAME OF INSTITUTION? 2. Please briefly describe how NAME OF INNOVATION related to developmental education works at NAME OF INSTITUTION? 3. How does NAME OF INNOVATION compare to the way NAME OF INSTITUTION used to teach developmental education? IMPLEMENTATION vs. INTENTION 4. How does NAME OF INSTITUTION’s current implementation of NAME OF INNOVATION compare to what was originally intended? Probe: Have changes been made to NAME OF INNOVATION since           Prepared  by  JVA  Consulting  for  Complete  College  Colorado,  September  2012  
  • Evaluation  of  the  Completion  Innovation  Challenge  Grant   88       its inception? 5. Are there elements of NAME OF INNOVATION that haven’t been implemented yet? BUILDING ON SUCCESS 6. What are the BEST things about THIS INNOVATION at NAME OF INSTITUTION? Probe. Do you think THIS INNOVATION is helping students accelerate through developmental education more quickly? Probe. Do you think students are learning the appropriate information through THIS INNOVATION to move to college-level courses? 7. How can NAME OF INSTITUTION build on these successes? IMPROVEMENT 8. What are the biggest challenges THIS INNOVATION at NAME OF INSTITUTION faces? Probe. What have you found most frustrating about THIS INNOVATION at NAME OF INSTITUTION? 9. What could NAME OF INSTITUTION do differently to address these issues? Probe. How could NAME OF INSTITUTION make your work on THIS INNOVATION easier? CONCLUSION 10. Those are all the question I have prepared, is there anything else you’d like to share with me about your experience with NAME OF INNOVATION related to developmental education at NAME OF INSTITUTION?           Prepared  by  JVA  Consulting  for  Complete  College  Colorado,  September  2012  
  • Evaluation  of  the  Completion  Innovation  Challenge  Grant   89       Thank you so much for your time. We really appreciate it, and we really value all that you’ve shared today. If you have any questions for me after we hang up today, you can reach me at xxx.xxx.xxxx or BLANK3@JVAConsulting.com. You may also contact the Colorado Community College System Institutional Review Board (IRB) with any questions: 303.797.5870.       Prepared  by  JVA  Consulting  for  Complete  College  Colorado,  September  2012