Minutes of the Third AMCOA Meeting, August 18, 2011
Prepared by Kerry McNally
Host Campus: Holyoke Community College

I. Attendance

The third AMCOA meeting was hosted by Holyoke Community College from 10:00 a.m. to 12:00 noon on August 18, 2011. Representatives from 21 institutions attended the meeting (see list in Appendix A), as did Richard M. Freeland, Commissioner of Higher Education; Jonathan Keller, Associate Commissioner for Research, Planning and Information Systems; Anne Perkins, Vision Project Research Associate, Research and Planning; and Peggy Maki, Consultant under the Davis Educational Foundation Grant awarded to the Department of Higher Education, who also chaired the meeting.

II. Opening Remarks by William Messner, President of Holyoke Community College

President William Messner welcomed the AMCOA Team to Western Massachusetts and Holyoke Community College. He praised the Team for representing campuses all across the state and for getting together and acting “as a system,” observing, “We don’t do enough of that.” He emphasized the importance of this collaborative work and credited Commissioner Freeland’s leadership for bringing campuses into these conversations. Having guided Holyoke Community College through a New England Association of Schools and Colleges (NEASC) accreditation review this past spring, he said it is wonderful that NEASC is reviewing schools with an eye for improvements in assessment
and that assessment is now a priority for the state and federal governments, as well as for the schools.

III. New AMCOA Team Members

Peggy Maki introduced new Team members who were not introduced at the previous meeting: Benjamin Railton, Associate Professor of English and Coordinator of American Studies, Fitchburg State University; Ann Caso, Director of Institutional Research, and Susan Chang, Director of Assessment, Framingham State University; Richard Parkin, Assistant Vice President for Academic Affairs, and Linda Meccouri, Professional Development Coordinator and Professor of Multimedia Technology, Springfield Technical Community College; and Yves Salomon-Fernandez, MassBay Community College.

IV. Update on Conferences (Place, Date, Chairs)

Peggy Maki provided an update on the scheduled conferences and asked for a volunteer to co-chair the November 17th conference at Greenfield Community College with Ellen Wentland. Judith Turcotte volunteered for this role.

Jim Gubbins of Salem State University volunteered to chair the fourth AMCOA conference on April 23rd.

Peggy reminded AMCOA Team members to confirm their correct titles for the September 30th conference brochure as soon as possible. Their names will appear on the brochures for each of our conferences.

Kris Bendikas, chair of the September 30th statewide conference, distributed a handout listing the Conference Planning Group’s recommended sessions for that conference. (See handout in Appendix B.) Not all of the presentations could be included in the first conference,
but those not included will be given at one of the other conferences. It was decided that after each session there will be a companion roundtable, as well as additional roundtables focused on topics identified by the AMCOA group, such as the following:

a) How LEAP outcomes are integrated into campuses
b) How to assess online learning
c) How to facilitate alignment between NEASC expectations and those of the DHE
d) How to collect, store, and represent assessment results

Jonathan Keller will present a session, offer a roundtable after his session, and be available for the rest of the conference. At the meeting Peggy asked members to identify topics for Jonathan that they would find useful. Among those recommended was a focus on identifying “what not to do in preparing or reporting results.”

Discussion also focused on how we can make our four statewide conference presentations available to our public institutions. Specifically, we discussed the possibility of videotaping, audiotaping, recording PowerPoints, or uploading handouts to our social network, Yammer. Lori Dawson is currently looking into the possibility of videotaping and can confirm that method after she sees the final schedule. Several team members stated that faculty and others who will not be able to attend the conference have already said that they would like access to the materials. We also discussed how important session contents are to faculty development.

Peggy reminded the Team to limit the number of colleagues attending the September 30th conference to two or three because of space considerations at the conference location. However, she said that Kerry will be able to track registrations and let institutions know if they can bring more than two or three people.
V. Volunteers to Introduce Sessions on September 30, 2011

Peggy asked for volunteers to introduce sessions at the conference. The following individuals volunteered: Susan Taylor, Yves Salomon-Fernandez, Suzanne Van Wert, Ellen Zimmerman, Michael Vieira, Susan Keith, Martha Stassen, Judith Turcotte, and Carol Lerch.

VI. Discussion of Assessment Experiments (Peggy Maki, AMCOA Consultant; Richard Freeland, Commissioner of Higher Education; and Jonathan Keller, Associate Commissioner for Research, Planning, and Information Systems)

a. Parameters for Assessment Experiments

Peggy Maki distributed a draft of parameters for assessment experiments supported by the Davis Educational Foundation Grant. After reviewing those parameters (see Appendix C), she responded to questions about them. In response to a question about faculty receiving stipends for participating in experiments, she stated that funds could be used in that way. Another question was whether performance on remedial coursework could be included; Peggy said she thought it could. A question was also raised about whether “useful results for transfer from two-year to four-year institutions” should be a criterion for the experiments to address. Charlotte Mandell said that four-year institutions can disaggregate their data to look at the two-year population and that this is an important focus. Peggy noted that a common thread in her conversations at the community colleges is their desire to make sure that transfer students are able to demonstrate the same level of learning as students who complete their first two years at a four-year institution. Overall, Peggy stated, the issue these experiments will have to address is scalability.

It was agreed that AMCOA team members would vet the current draft on their campuses and submit proposed edits or changes by September 15, 2011. Peggy will incorporate proposed changes into a final Request for Proposals that will go out to team members one more time for final
approval before being officially released. It was also agreed that, to allow time to develop strong proposals, the deadline for submission should be moved to October 31, 2011. The Commissioner’s Advisory Committee would then select proposals in November 2011 based on the ranking of proposals by volunteer AMCOA members. (The appended draft reflects the new dates proposed by team members at the meeting.)

b. Richard M. Freeland, Commissioner of Higher Education

Commissioner Freeland expressed his gratitude for the work of the AMCOA team, especially participants’ commitment to meet over the summer months. He also expressed his excitement about the AMCOA project and the importance of our focus on documenting our students’ learning. He stated that he believes, “We are on the edge of a real revolution in higher education, and Massachusetts has a chance to initiate the dialogue. There are many political aspects in public higher education, but the educational outcomes overshadow the political sides. We must emphasize the educational outcome of this process.” Further, he stated that if faculty and other educators at our institutions approach outcomes assessment in the right way, they can become more effective teachers, particularly for the range of students who come into public higher education. This commitment can be the most important development in higher education.

The Commissioner also described how the Davis grant supports campus-based work in assessment through the four statewide conferences, the AMCOA meetings and members’ leadership, and Peggy Maki’s initial campus visits and availability as a campus resource, all of which will lead to the design of an eventual statewide system of reporting students’ levels of achievement. Thus, the State’s approach to assessment operates on two levels: (1) the campus level and (2) the statewide level. The focus on statewide reporting is aimed at learning how well students are doing so that we can identify ways to improve students’ learning.
Even though political issues may emerge, he noted that we are capable of having discussions about
these levels, as demonstrated by the AMCOA project itself. He also added that if a campus does not want to participate in the eventual reporting, that is acceptable. What we do need to keep sight of is that “stronger education is our most important outcome.” Thus, he hopes that all campuses will want to join this statewide effort of reporting assessment results.

The Commissioner also reported on movement toward becoming a LEAP-affiliated state, a step endorsed by the Board in June. Neal Bruss requested that the Board’s action on this be distributed to our campuses. The Commissioner also reported that he intends to share this possible affiliation with NEASC and to meet with NEASC representatives to discuss the ways in which an affiliation would align with NEASC reporting requirements.

The Commissioner reported on his July presentation at a national conference of SHEEOs (State Higher Education Executive Officers), during which he talked about learning outcomes assessment within the context of the Vision Project and the parameters of the AMCOA Project. Responses from the SHEEOs and from nationally recognized leaders such as Peter Ewell, Carol Schneider, and Paul Lingenfelter indicated that Massachusetts is well ahead on this focus. Specifically, he stated that conference participants were intrigued by the notion of a reporting system design that is not mandated. Other states expressed interest in working with us, so there is a lot of interest in our work. After the presentation, a memo was circulated lauding the work taking place in Massachusetts and asking other states to take note of it. Peter Ewell stated that he thought our approach is “the best authentic assessment for accountability and improvement.”

Finally, the Commissioner reported that AAC&U has asked us to participate in a large Lumina grant it expects to receive, with a specific focus on developing a better understanding of what students should
be learning at different levels (as represented in the Lumina Undergraduate Profile) to facilitate effective transfer. The Commissioner stated that because as a state we are well down the road with discussions about LEAP outcomes and VALUE rubrics, it would not be advisable for us to be diverted in another direction. Thus, he stated, we need another year to see how we progress with our current direction and commitment.

c. Jonathan Keller, Associate Commissioner for Research, Planning, and Information Systems

Speaking to the partnership focus of our AMCOA work, particularly the design of assessment experiments that will inform a statewide system of reporting, Jonathan stated that he and his staff are happy to contribute their expertise and experience to the experiments, particularly since the scope of this commitment to design a system can seem overwhelming. He said that the State has resources available to assist team members as they and others design a reporting system. Specifically, he and his staff can assist in (1) developing experiments and, down the road, (2) developing something statewide. He also said that he can assist with analyzing and interpreting NSSE and CCSSE results because he has extensive experience with those instruments. He can also advocate for looking at where institutions shine, such as identifying activities that show critical thinking. And he has resources that can help with database development and definitions, demographics, and tracking students.

VII. Suggestion to Develop a Database of Assessment Instruments

Professor Benway recommended that under our Davis Grant we develop a database of assessment instruments. Discussion focused on the value of this effort to all of our institutions, as well as the significance of this database as a possible way to represent to external stakeholders how we assess our students, perhaps as part of the eventual statewide
reporting system. Rather than simply listing methods, there was a recommendation that the database include some kind of report or description of each method’s particular usefulness. Peggy will ask one of our team members to draft a description of what we might include in this proposed database. This description will come forward to the group for further discussion.

VIII. Next AMCOA Meetings

The next AMCOA meeting will take place on September 14th from 10:00 a.m. to 12:00 noon at Massachusetts College of Liberal Arts. Peggy Maki will send an agenda and driving directions to the site one week before the meeting. Please let Kerry know as soon as possible whether or not you plan to attend.

Mark Your Calendars: September 30, 2011: First Statewide Assessment Conference at Worcester State University, 9:00 a.m. to 3:00 p.m. (registration at 8:30 a.m.)
Appendix A

Institutions Represented at the AMCOA August 18th Meeting:

Berkshire Community College
Bristol Community College
Bunker Hill Community College
Cape Cod Community College
Fitchburg State University
Framingham State University
Holyoke Community College
Massachusetts College of Liberal Arts
Massasoit Community College
MassBay Community College
Middlesex Community College
Northern Essex Community College
Quinsigamond Community College
Roxbury Community College
Salem State University
Springfield Technical Community College
University of Massachusetts Amherst
University of Massachusetts Boston
University of Massachusetts Lowell
Westfield State University
Worcester State University
Appendix B

Session Proposals for September 30th AMCOA Conference

Topic: “Teamwork: the Key to Faculty Engagement.” Discussion of the Title III-funded program, The Connected College, administered at Bristol Community College, which involves collaborating with faculty to develop institution- and program-level assessment strategies through the creation of a set of Course Design Toolkits.
Presenters: Kevin Forgard, the project’s instructional designer; Maureen Melvin Sowa, Professor of History
Institution: Bristol Community College

Topic: “The development and function of BHCC’s faculty-driven assessment initiative: the Student Learning Outcomes Assessment Project (SLOAP).” Based on a conceptual assessment framework that provides an institutional model for all assessment activities.
Presenters: Judy Lindamood, Chair, Early Childhood Education & Human Services Department; Timothy McLaughlin, Chair, English Department; Natalie Oliveri, English Department; David Leavitt, Director of Institutional Research
Institution: Bunker Hill Community College

Topic: “Using Business Process Analysis to Facilitate Buy-In and Effective Assessment Information Flow.” (1) How to improve faculty buy-in by approaching VALUE Rubrics as a customizable tool to test in a low-pressure pilot project; (2) how to get started on a Business Process Analysis.
Presenters: Ann Caso, Director of Institutional Research; Susan Chang, Director of Assessment; Cynthia Glickman, Business Systems Analyst; Patricia Lynne, Associate Professor and Assessment Liaison, English Department; Ellen Zimmerman, Associate Vice President for Academic Affairs
Institution: Framingham State University

Topic: “How Institutional Research Supports Middlesex Community College’s Program Review Process.” Improving access and advancing student success by strengthening evidence-based practices and resource allocation. Embedded in this program review process is the assessment of program effectiveness, including impact on students (student success and student learning) for both academic and co-curricular programs.
Presenters: Lois Alves, Vice President of Institutional Research and Enrollment Services; Cynthia Lynch, Service Learning Coordinator; Elise Martin, Associate Dean of Assessment
Institution: Middlesex Community College
Appendix B (continued)

Topic: “Assessment of Supplemental Instruction.” In the spring of 2011, NECC carried out an extensive assessment project focused on classes designated for Supplemental Instruction (SI). Quantitative and qualitative data were collected through methods including surveys completed by students, faculty, and SI leaders. The findings provided important insights into SI at NECC and suggested numerous strategies for improvement.
Presenters: Lynne Nadeau, Coordinator, Academic Resource and Tutoring Center; Linda Shea, Assistant Dean, Library and Academic Support Services; Ellen Wentland, Assistant Dean, Academic Program Review, Outcomes Assessment, and Educational Effectiveness
Institution: Northern Essex Community College

Topic: “Discussion of UMass Dartmouth’s Commitment to Student Learning statement (CSL).” Founded on broad campus discussion, using LEAP principles, and motivated by NEASC Standard 4.16, the CSL states aspirational goals for students upon graduation. Developed collaboratively, the CSL has guided renovation of general education, improved advising, and will soon help refocus academic major learning goals.
Presenter: Richard Panofsky, Assistant Chancellor for Institutional Research & Assessment
Institution: UMass Dartmouth

Topic: “Institutional Research.”
Presenter: Jonathan Keller, Associate Commissioner for Research, Planning and Information Systems
Institution: DHE

Topic: “Using NSSE/CCSSE Results.” A panel discussion to share strategies and best practices.
Presenters: David Leavitt, Director of Institutional Research; Elise Martin, Associate Dean of Assessment; TBD; TBD
Institutions: Bunker Hill Community College; Middlesex Community College
Appendix C

Draft for Review by the AMCOA Team
Parameters for AMCOA Assessment Experiments
August 18, 2011

Funding for the AMCOA project under the Davis Educational Foundation grant awarded to the DHE includes support for scalable assessment experiments developed among our public higher education institutions that have the potential to (1) “provide a foundation for the system-wide plan connected to the Vision Project” and (2) demonstrate that a “campus and system collaborative approach to assessment of student learning can be helpful to public campuses and to the public system as a whole” (Davis Educational Foundation Grant).

The following list provides parameters for developing a proposal to undertake the design of an assessment experiment. The experiment:

1. Builds on current campus-based systems that report results, as well as plans to improve student learning, creating a statewide system that addresses both (a) accountability and (b) improvement.
2. Reports exiting students’ achievement levels in General Education based on scoring students’ authentic work, using agreed-upon, nationally informed scoring rubrics such as the LEAP rubrics, and translating assessment results into scores or dashboards that are useful across campuses. Initially, GE reporting will begin with a focus on results of scoring student work that demonstrates critical thinking, writing, and quantitative reasoning.
3. Demonstrates or offers the ability to aggregate or disaggregate assessment results based on institutional demographics.
4. Experiments with
   a. web platforms (including current existing data systems at DHE) to enable disaggregation and aggregation of assessment results across student populations, or
   b. portfolio assessment technologies that would provide cost efficiencies for campuses.
5. Identifies plans to improve patterns of weakness in student work.
6. Provides useful results for transfer from two-year to four-year institutions.
7. Experiments with the use of NSSE and CCSSE surveys and consortia that enable comparisons between or among similar institutions, including consortia development of additional survey items for the NSSE and CCSSE instruments.
Appendix C (continued)

Funding:
The Davis Grant provides a total of $60,000 for this phase of the AMCOA Project. The maximum grant for a proposed experiment is $12,000.

Proposal:
Please submit to Peggy Maki by October 31, 2011 a one- to two-page proposal that includes the following information:

- A description of your overall approach to your experiment based on the seven parameters listed above
- Identification of those involved in your experiment, including the person who will chair or convene the group
- An approximate timeline you will follow to develop an experiment that you will demonstrate at the fourth statewide assessment conference
- A budget, which may include support for meetings, compensation for faculty time, and compensation for new DHE staff demands focused on developing, pilot testing, or assisting the development of web-based reporting in an assessment experiment

Proposals that involve collaboration between or among two-year and four-year institutions are welcome, as they demonstrate that reported results are useful for our public institutions and provide valuable information about our students’ achievement levels across their education. Submitted proposals will be reviewed and ranked by a panel of AMCOA members and then submitted to the Commissioner and Advisory Board for final selection in November 2011.