Improving Student Response Quality Utilizing “Thinking Maps®”
Action Research Final Report
Ann Younce
University of Colorado at Denver
IT 6720 – Research in Information and Learning Technologies
May 2, 2010

Abstract

This introductory experience into action research focuses on the utilization of “Thinking Maps®” (specialized visual organizational tools for content learning, designed by Thinking Maps, Inc.) to increase student response quality and depth of understanding of complex content. A fifth grade teacher (the researcher) working in a suburban district southeast of Denver, Colorado, was motivated to find out about the effectiveness of “Thinking Maps®” (or TMs) strategies in her science class, and whether the maps could produce a significant change in the quality and depth of responses from her students, many of whom were reluctant to provide more than surface answers and who generally resisted higher level thought. The researcher had never before used the TMs tools, but was inspired by a team member's inclusion in a school-wide pilot of the “Thinking Map®” program, and observed its use in the neighboring classroom earlier in the year. The researcher believed that these purposeful thinking tools offered the right blend of graphic organizer and critical thinking strategies to increase student engagement and achieve additional scaffolding for differentiated student learning.

The guiding question evolved into determining to what extent “Thinking Maps®” (or TMs) are effective tools and strategies to significantly produce a change in the quality and depth of written responses, as well as to build scaffolding techniques, deepen understanding, and increase engagement. The research process included implementing the use of a variety of TMs into the instruction of a new “Human Body” science unit. As the action research progressed over the course of the four-week study, the researcher taught students to use a variety of TMs, applied related key word and questioning strategies, recorded field notes, observed student behaviors and attitudes, collected data in the form of questionnaires and assessments, and monitored changes in the quality of student written responses and the level of engagement with new information. She also pursued peer contributions in the form of discussions and interviews. Throughout the implementation, she began to recognize improved note taking and study patterns among her students, and determined that TMs indeed added a new layer of metacognition for students that led to an increase in performance on science assessments. As a result of this action research, the researcher plans to fully incorporate TMs into her instructional methodology, and to participate in a school-wide “Thinking Map®” cohort next school year to continue with a new phase of action research.

Introduction

In my current role as a fifth grade teacher in a Denver-area elementary school, I serve an impacted suburban community, represented by the following school profile data: 580 student enrollment (100 of which are in the fifth grade; 24 of which are in my class), with 34% on free-reduced lunch, an 82% stability rate, a 95% attendance rate, and diversity demographics including Native American (0.9%), Asian American (8.7%), African American (16.5%), Hispanic American (17%) and Caucasian (56.9%) populations. The school setting and culture is one of high expectations for best practice and professional learning, with a strong base of data-driven practices among faculty and staff. The school utilizes embedded Professional Learning Communities to build capacity while continually seeking and expanding learning experiences. The school's organizational purpose and motto clearly states, “Learning…Whatever It Takes,” and it is an effective philosophy for student achievement as well as professional growth. This foundational message both supports the implementation of research-based solutions and encourages action research among colleagues.

Research Statement and Guiding Question

I planned to focus my action research on the largely unexplored general area of effective instructional strategies that impact all content. I aimed to take a closer look at the use of graphic organizers as metacognitive tools for deepening understanding and promoting higher level thinking. A large number of my students were consistently providing written responses that lacked substance and failed to reach higher levels of thinking. I had many questions about what the appropriate solutions could be, and I saw this action research as an opportunity to investigate and attempt a workable remedy. The overall research goal was to try to increase literacy and science achievement (as demonstrated by increased comprehension and improved written response, predominantly with large amounts of new science information) by using “Thinking Maps®” (or TMs) in an effort to close current achievement gaps. This topic involved both an increase of skills for students (the need for independent/automatic skill use to communicate and show their thinking, and the ability to apply strategies in cross-curricular ways) as well as their ability to internalize a new learning strategy process, incorporating specific, visual representations and patterns for thinking, utilizing a common language (Hyerle, 1996). My main goal was to incorporate new TMs strategies during a large instructional science unit, utilizing the process of journaling, interviews, disaggregated pre- and post-assessment data, anchor work samples, and similar artifacts.

The projected effect of such research was three-fold: specifically, my students could gain improved ability to acquire and retain large amounts of information, making it both meaningful and relevant to them through cross-curricular strategies.
The teachers (initially me and my grade level team) could also gain in our ability to differentiate instruction for those who need a more structured approach to internalizing information and communicating higher level thought. Ultimately, my colleagues and the entire school community could benefit from shared information and increased student, instructional and programmatic successes. This was a beneficial challenge because my personal interest in brain-compatible instruction and metacognition served as another impetus for this research. Finally, the research was both timely and relevant because sections of my school have been in various stages of the implementation of a new “Thinking Maps®” pilot program and have established a cross-grade level cohort for vertical alignment.

As a desired outcome, I hoped for increased understanding and achievement for students, utilizing quick, readily-available scaffolding frames at an independent level in order to successfully accomplish three goals: (1) read/acquire to filter information, consistently identify the main idea and provide supporting details, (2) organize to construct meaningful information, secure logical patterns and make emotional connections, and (3) write to demonstrate increased comprehension by showing quality responses, high level thinking and internalized deep understanding. The guiding question for my research continued to evolve and finally emerged with several facets:

To what extent are “Thinking Maps®” effective tools and strategies: (1) to significantly produce a change in the quality and depth of written responses, and (2) to build scaffolding techniques, deepen understanding, and increase engagement?

Review of Literature

It was important for me to select parameters to guide my search for relevant literature as I began the ongoing process of researching “Thinking Maps®” (or TMs). As my topic became increasingly narrow, it naturally diverged into several themes: graphic organizers, engagement, higher-order thinking, effective cross-disciplinary instructional strategies, depth/complexity of understanding, and metacognition. I quickly made use of various online databases such as Google Scholar and Education Full Text (Wilson's Web), available from the university's Auraria Library, to access electronic resources. In addition, I ran general Internet searches of familiar professional sites I regularly use, such as ASCD and McRel, in addition to the “Thinking Maps®” website itself, and made use of professional texts from my own personal library. A quick search through each of their bibliographies allowed me to identify additional data sources that these other references had used.

As I began to search for relevant literature to collect and review, I compiled a list of keyword search terms and tried multiple variations, such as: “thinking maps,” “graphic organizers + deep understanding,” “common visual language,” “non-linguistic representation,” and similar combinations involving maps or organizers used to solidify learning and increase comprehension.
At first, it seemed like a treasure hunt as I searched through articles about “Thinking Maps®” (or TMs) and found references to additional articles, studies and names of researchers. In particular, the “Thinking Maps®” website had devoted entire sections to the research conducted in support of their product, including links to the Thinking Foundation and Designs for Thinking websites. These sites contained various research formats including journal articles, case studies, and news articles, and I found excerpts and glimpses of classroom teacher research projects explained throughout the article texts. In order to produce relevant search results from the literature, I instituted various descriptors to serve as lenses or filters through which to pass my reference sources. These filter criteria included: recently published references, results from actual studies, implementation information, elementary/middle school-preferred ranges, and a science-based focus (for example, brain research in education). These filters helped me to stay organized, categorize my research into subtopic themes, and evaluate potential references. The chart below provides a general overview of the types of collected literature, which includes a variety of source formats such as professional journals, professional texts, and relevant websites. (For a complete listing of all literature sources, see References.)

Figure 1: Sample Literature Collected

Texts/Authors     Periodicals                       Websites                      Other
Hyerle            New Hampshire Journal of Ed       www.mapthemind.com            course website
Medina            Journal of Educational Psych.     www.thinkingfoundation.org    site cohort
Marzano           Teaching, Thinking & Creativity
Costa             Educational Leadership
Robinson & Lai    The Reading Teacher
Jensen            Metacognitive Learning
Tomlinson

I began to collect large amounts of information before I realized that most of it was related to the same researcher (Dr. David Hyerle, the founder of “Thinking Maps®”) and was limited to a specific time frame (the late 1980s and through the 1990s). I quickly became frustrated not being able to find many current (within the last few years) studies or publications that related to my topic. Although I utilized several database searches, I still came up with the same listings, from 10 to 15 years ago. Even though the brain research of the last decade shows links between understanding, non-linguistic representation and achievement (Marzano, Pickering & Pollock, 2001), I was unable to find any more recent studies and immediately wondered why, second-guessing my choice of research. These gaps of somewhat outdated literature, as well as some additional identified problems (such as many articles that applied to ELA or special education needs only), left me disappointed with the small scope of the research, and I had to consider some sources less valid and reliable than others.
I have since assumed that some of the gaps (especially in Hyerle's case) may have been due to renewed longitudinal studies since the earlier years of research. Therefore, as far as the scope is concerned, certain literature may not be included because there seems to be very little (as far as research studies) conducted past 2001.

Amid the articles and studies that I did happen to find, there were varying degrees of cohesiveness in the findings, and just as many areas of conflict. From the significant researchers, relevant ideas emerged and my beliefs were further derived and substantiated. As for a few key studies or articles of significance to my research, there were some similarities that related directly to my own questions, beginning with the foremost expert in the field, Dr. David Hyerle. Hyerle suggests that after being reinforced over several years (a main difference from my own study) students are able to “transfer multiple maps into each content area, becoming spontaneous in their ability to choose and use maps for whatever content information and concepts they are learning” (Hyerle & Williams, 2009). Although my students had only about four weeks of “intensive” map training, they too could choose from multiple maps. (The breakdown here for my students was that the maps were not always used correctly for the corresponding skill.) Like Hyerle's recent study, some of my students were also able to identify the correct thinking process, select the appropriate map and use the obvious cognition pattern. In a path that diverges from Hyerle's study, I found that my students would all start with the common language and graphic, yet how they completed the information varied greatly among students.

I also found design strategies among the literature that were appealing to me and similar to those I chose for my study. Stull and Mayer's (2007) study, although far more complex than my own, had students in one portion generate their own graphic organizers after being frontloaded in how to generate various types of visual tools, such as hierarchies, lists, flowcharts and matrices. Both their study and my own had the instructor first modeling types of graphic organizers.

In contrast, however, I found conflicting analyses and conclusions in the Stull and Mayer (2007) piece, where they found that there was little to no evidence that students constructing their own graphic organizers achieved any more than students who received author-provided organizers. Although my students all received the same “blank” organizer, they differed in their responses within it. There were many discrepancies in the results of studies and the conclusions of articles as to whether or not the visual patterns (and I interpreted “Thinking Maps®” here) resulted in extraneous processing or generative processing, and the evidence for deeper learning was contradictory. I question here whether this affects “Thinking Maps®” (or TMs): do all the various types of graphic organizers interfere with a student's cognitive process as they're selecting and using the maps? Do students use too many versions of graphic organizers and waste time and cognitive processes independently deciphering which is the best tool to use?

In addition, I found very little from which to model the design of my own research among the articles and studies.
Many of the earlier studies had hundreds of participants across multiple schools or districts, whereas I was limited to my own class of 24 students, although I did find a few usable similarities in some of the shared vignettes from classroom to classroom that were mentioned in some of the articles and research studies (Leary, 1999; Hyerle, Curtis & Alper, 2004).

TMs are advanced graphic tools used to secure brain pathways for learning, and metacognitive tools used to increase awareness of cognitive cues and help to deepen student understanding. One other aspect that concerned (and confused) me was the contradiction in some of the evidence in some studies (Merkley & Jeffries, 2000/2001; Stull & Mayer, 2007; and Ritchhart, Turner & Hadar, 2009) showing that general graphic organizers (not TMs specifically) may or may not increase student comprehension or deepen learning. Following recent brain research, it was stated that vision dominates all other senses in the brain, and the more elaboration, the better the coding and hence the better the storing and retrieval (Medina, 2008), which seems to make an argument in favor of visual tools like graphic organizers. Medina also states that it is through pictures – not through written or spoken words – that the brain makes the most lasting connections, which leads me to wonder why the TMs program doesn't make mention of incorporating sketches and pictures in place of some words within the various map structures.

In summary of the literature, although there were definitely more articles than recent studies that pertained to my research topic, I feel as though I semi-exhausted my search across the related fields of cognitive science, effective instructional practice, and brain research, and it did allow me to form a more complete research picture, especially as I began to see a saturation of the same researchers mentioned again and again. Some further thoughts or questions prompted by my literature review would have to include whether or not my choice of study still represents a valid instructional practice. The methodology and research presented by the TMs website itself seems relevant and robust, yet I am still not clear as to why there is not more current research out there. Perhaps I have stumbled upon a gap in the literature, as it seems as though there should be more studies that explore variations of the TMs strategies or, at the very least, further challenge the relevance of this type of tool as it compares with other brain research.

Relevant Background Knowledge

TMs were designed to integrate content learning with thinking process instruction across disciplines and be utilized as a “common visual language” for learning (Hyerle & Yeager, 2007). Learners are expected to identify 8 fundamental thinking skills and link them to dynamic visual representations to create specific pathways for “thinking about their thinking”. Maps are designed to be used in combination to increase depth and complexity, and as a developmental process, complexity in use increases over time (Hyerle, Curtis & Alper, 2004). “Frames of reference” surround each map to provide connections to sources, reflection and ownership.
The maps are consistent, integrative, and flexible for various types of implementation. Each map corresponds with a specific thinking process designed to activate and build schema:

Figure 2: “Thinking Maps®” and 8 Corresponding Cognitive Skills

“circle map”           defines in context; presents point of view
“bubble map”           describes sensory, emotional, logical qualities
“double bubble map”    compares & contrasts qualities
“tree map”             shows relationships between main idea & supporting detail
“flow map”             relates events as a sequence
“multi-flow map”       indicates cause & effect; helps predict outcomes
“brace map”            deconstructs physical structures & part-to-whole relationships
“bridge map”           transfers or forms analogies

(See Appendix D for actual map designs)

Expository text structure is often difficult to digest for intended meaning, complex concepts, specialized vocabulary and inferred relationships such as cause & effect, compare/contrast, sequences or cycles (McCormick, 1995). TMs enable learners to assess prior knowledge, respond to complex inquiry and, working independently or collaboratively, draw out essential knowledge, identify critical information with supporting details, make inferences and find connections to the text.

TMs have been purposefully organized to align with current brain research to enable movement from concrete to abstract thinking with greater depth and to directly apply thinking to complex tasks. Meaningful context links to emotional and metacognitive “frames of reference” and ownership, addressing essential questions, prior knowledge, primary sources, point of view and other influences. Over the last decade, brain research has shown that 80% of all information that comes into the brain is visual (Jensen, 1998), and the visual patterns of TMs help learners create concrete images from abstract thought. The brain is a natural pattern detector, and teachers need to provide students with experiences that enable them to perceive patterns and make connections (Caine & Caine, 1991). It has been shown that “explicitly engaging students in the creation of nonlinguistic representations stimulates and increases activity in the brain” (Gerlic & Jausovec, 1999), and unlike earlier forms of concept maps (Novak & Gowin, 1984) and other types of random or ready-made graphic organizers that do not fully suffice, TMs encourage learners to become independent thinkers due to the ownership and connections made in their completion.

In further support, the dual-coding theory (Paivio, 1986) states that simultaneous use of pictures and words enhances the brain's ability to organize information for subsequent retrieval and use. Research shows that the more learners use both linguistic and nonlinguistic representations, the better their thinking and recall (Marzano, Pickering & Pollock, 2001).
TMs align nicely and integrate into practice with many of Marzano's 9 strategies most likely to increase achievement, among them: identifying similarities & differences, summarization & note-taking, cooperative learning, and cues, questions and advanced organizers. Class time spent using TMs increases engagement with specialized thinking skills, and it is suggested that students should spend 60-80% of class time engaged in “process, discussion, group work, self-assessment, journal writing, feedback, mapping” (Tomlinson & Kalbfleisch, 1998), with such differentiation evident in the strategic components of TMs.

Methodology/Research Design

The purpose of the study was to determine the effectiveness of “Thinking Maps®” (or TMs) strategies for use in science instruction, and whether the maps could produce a significant change in the quality and depth of responses from students, many of whom were reluctant to provide more than surface answers and who generally resisted higher level thought. The guiding research question was: To what extent are “Thinking Maps®” effective tools and strategies: (1) to significantly produce a change in the quality and depth of written responses, and (2) to build scaffolding techniques, deepen understanding, and increase engagement?

In order to best serve my research into TMs, one small way in which I employed aspects of problem-based methodology, using both qualitative and quantitative research and design traditions, was in exploring some limited use of “Theory of Action” techniques (Robinson & Lai, 2006) to examine constraints, actions and consequences, which will be addressed later in this study. I began by asking questions about the assumptions and beliefs I held about my classroom and my concepts of student response quality. Students need a certain type of response to demonstrate understanding, and I believe I had been a good judge of the depth of their understanding, asking them to “write/draw it in another way” or to “tell me more” to elicit the best possible response. From here I needed to further investigate what I do in my practice and why I do it when assessing student response quality.

The main type of data analysis I employed, however, most resembles Wolcott's Three Stages of Analysis (Wolcott, 1994), where different aspects of analysis based on qualitative data collection inform each other to incorporate (1) descriptive data from original notes, questionnaires and pre-/post-test experiences, (2) analyses based on how the descriptive data initially relates to the research question in an exploratory manner, looking for patterns and themes, identifying key factors and relationships, and (3) my interpretations to make sense of findings and the conclusions based on the extent to which they pose meaningful, relevant and compelling answers to the research question. Quantitative data collection appeared in the formulation of results tables to quantify, through tallies, the number of occurrences of specific written response types, the frequency of TMs used, or data points for behaviors.
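
The kind of tally table described above can be illustrated with a minimal sketch. The field-note records, event labels and participant codes below are invented for illustration only; they are not the study's actual data, and the study itself relied on paper tallies rather than software.

```python
# Minimal sketch of turning coded field-note records into tally tables.
# The records below are hypothetical examples, not the study's actual data.
from collections import Counter

# Each record: (participant_code, observed_event)
field_notes = [
    ("B5", "successful TM use"),
    ("G 4/3", "marginal TM use"),
    ("B5", "looked around room for visual cues"),
    ("G5", "successful TM use"),
    ("B 4/3", "non-TM written response"),
    ("G 4/3", "successful TM use"),
]

# Overall tallies of each event type.
overall = Counter(event for _, event in field_notes)

# Tallies broken out by participant code (as in the report's results tables).
by_group = Counter((code, event) for code, event in field_notes)

print("Overall tallies:")
for event, count in overall.most_common():
    print(f"  {event}: {count}")

print("Tallies by participant code:")
for (code, event), count in sorted(by_group.items()):
    print(f"  {code} - {event}: {count}")
```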

Site Selection

I selected my own fifth grade classroom as the site for this study, as I believed this would help me achieve the maximum impact and benefit, thus making the experience as meaningful and personally relevant as possible. The classroom is one of four adjacent fifth grade classes, with 24 students (11 boys and 13 girls) and a demographic composition that closely reflects that of the entire school population (see Introduction). The science lessons take place predominantly in the afternoon block, being the last 45-minute class of the day. The students are situated in table groups of 4 to 5 students per group, and a typical day consists of a quick preview of new material (or a review of the previous day's lessons), followed by either a group exploration activity or an experiment to “kick off” the inquiry. Time is then usually spent in direct instruction through interactive SmartBoard lessons, class discussion, group reading or data mining, collaborative or independent work production, and finally reflection. Occasionally, due to schedule changes or anticipated interruptions of the instructional time, I would switch (or add) science instruction time into the morning block, with each day usually starting out with a pre-reading activity or TMs preparatory work as part of a morning menu.

Role

My role was generally that of “facilitator”, where I would introduce content, model and suggest strategies, employ questioning techniques, elicit responses and reflection, and monitor collaboration. My goal for consistency was to develop an instructional routine to closely follow in order to keep the experience familiar and consistent for the students, and to keep an order for me to follow in terms of making observational notations and focusing specifically on student response quality and incidents of engagement. The classroom climate needed to be one where thinking is valued, visible and actively modeled (Ritchhart, Turner & Hadar, 2009). Just prior to the beginning of the first data collection cycle, we held an initial class meeting to discuss the action research project the students would be a vital part of – using TMs to improve content learning and assess response quality. I answered any questions and addressed any concerns they had (and the only “fear” that needed to be alleviated was that they wouldn't be graded on their questionnaire responses – only on the regular unit quizzes and tests). In describing my relationship with the students, I tried to maintain a positive rapport and communication style, rigorous expectations, and a casual teaching style that incorporated a flexible pace as needed and interjected humor to engage. I utilized a school-wide positive reinforcement behavior management system (“Mountain of Trust” – earning their way “up the mountain” using silent and visual cues employed without a break in instructional momentum) with predictable rewards and consequences. My students were comfortable with me, and that climate allowed them to take risks in this group endeavor when asked, and at times, independently.

Purposeful Sampling Strategies

When it came to the selection of individual respondents, I made the decision to sample all 24 of the students in my class. As an intermediate grade level teacher, it was inconceivable for me to run any sort of “control group” and “variable group” when I am ethically bound to provide consistent instruction for all. Instead, I would make general comparisons of my class, as TM beginners, to other classes with more or less TM training. Although my grouping was a relatively small data set, it was one that was manageable for me. For my interview subjects, I first selected one of my grade level colleagues who was participating in the tandem TMs pilot cohort on site, as well as an instructional coach/trainer who was overseeing the cohort. Additional questions and probes were asked of other team members who were not part of the pilot.

The selection of activities to observe included taking anecdotal field notes on random student behaviors, attitudes, engagement, and expressions. I rotated around the classroom listening in on table collaborations and cooperative learning routines. Other activities included observing TM use during class and for assessments. I was looking for confident and appropriate use of TMs as well as an improvement in overall response quality.

As for the selection of documents to analyze, I looked at the use of thinking strategies, TM completion and response quality on student work products such as class and individual TMs, science notebooks, test questions and research questionnaires, and my own field notes. I created a mix of closed questions with a range of pre-determined responses, and open-ended opportunities that were non-threatening and would allow for rich and detailed responses (Ritchhart, Turner & Hadar, 2009).

Data Collection Strategies

In order to remain consistent with the traditions of the selected research designs, data collection tools were designed purposefully to elicit clear pictures of student thought processes to more accurately determine content knowledge and assess response quality. I established a balance of different artifacts to collect (Robinson & Lai, 2006), which included observations of students recorded in field notes, questionnaires for students to explain use of strategies or to rate difficulty, examination of student work product (actual TMs), and interviews with colleagues. Data was obtained by observation techniques, note taking, disaggregating student responses on questionnaires, pre-tests and post-tests, and comparisons of student TMs to anchor maps. I was intrigued by the notion that students learn more deeply when they construct their own organizers, or “learn by doing,” than when the organizers are provided for them, or “learn by viewing” (Stull & Mayer, 2007). I knew I had to model the initial use of each type of TM while balancing the need to allow for learner-generated variance.
I established a collection cycle and kept it consistent for each subtopic within the unit: pre-test, map introduction and use (“circle map”, followed by “tree map”, “bubble map” or “brace map” as needed, and culminating with a “flow map”), and finally a post-test. I would rotate around from group to group, observing students over the shoulder, having students verbally explain their strategy to me, or observing from a clear vantage point during an assessment. Interviews of colleagues were conducted largely in an unstructured conversational style, and I designed questions so as not to be leading in any way. The collection of actual data was planned for as I recorded observations, thoughts and process notes in a spiral notebook. Throughout the lessons and assessments I would collect data in an informal way (sticky notes or clipboard), tally according to criteria (for example, the number of times students looked up vaguely, trying to remember, or the number of times they looked around the room for visual cues), or make anecdotal notations to be formally transcribed after class.

Assurances of Confidentiality

In order to ensure confidentiality throughout the process, assurances were made and designed into the research methodology. Verbal consent of colleagues was obtained, and informed consent of the students was implied as I spoke with them at length about the process and their role in it. Parental consent was not necessary as this was part of the daily routines of class and no identifying elements would present a conflict. I concentrated on the student group as a whole, with no individual identifying comments in field notes (other than pre-determined codes to safeguard identity), and I took care not to mention any identifying factors of specific individuals in class conversations, interviews with colleagues, or in the writing of this document.

Data Analysis Strategies

I endeavored to employ techniques consistent with traditional qualitative and quantitative methods, although my approach was not a particularly linear process but one that was flexible among the complexities of the teaching and learning day. I looked for obvious shifts in the classroom as far as behavior, engagement, strategy use, and quality of response, and I also looked to evaluate changes in student conceptions of thinking. The literature provided me with guidance in the form of 3 components for successful and creative data analysis: (1) patience – reminding myself that it was my first time with action research, as well as a first for my students, (2) mistakes – maintaining a willingness to take risks and an acceptance that mistakes would be made, and (3) playfulness – remembering to have fun with the process and with my students as we investigated TMs together (Hubbard & Power, 2003).

In each of the data collection tools I attempted to examine the frequency of specific question responses and tally instances of successful use, marginal attempts at use, or non-use of various TMs.
Some of the participant coding makes reference to both gender and ethnicity data (and this particular use of demographics is something that my district and school consistently utilize in order to focus on equity issues, as well as serving as a reminder to always be cognizant of potential equity constraints in instructional practices). The coding indicated: B (boy), G (girl), 4/3 (ethnicity code for “hispanic/black”), and 5 (ethnicity code for “white”), as there are no other ethnicity groups represented in this participant pool. I also made an attempt to code emotions (such as “easy, fun, hard, or confusing”) in my field notes and look for the emergence of new types of responses. There also came the realization that, after many years of experimenting with tracking student responses and behavior, it may be difficult to analyze the “subtle nuances of change” and find different ways to code than what had been intuitive thus far as simple anecdotal records (Hubbard & Power, 2003).

Other techniques of verification included transcription after the fact into usable data, looking for recurring patterns to identify, as well as the use of triangulation to observe from several different angles, such as student interviews and observations and pre-tests/post-tests, to see if they all independently produce the same conclusions (Robinson & Lai, 2006). The additional use of colleague data would help to round out the questionnaires, observations and test data to confirm stronger triangulation within data collection.

Presentation of Findings: Results, Implications and Significance

When looking at the data through the lens of “Theory of Action” (Robinson & Lai, 2006), making sense of the constraints meant identifying the factors that led to my use of TMs: the program was being piloted at my school with a pilot cohort on site, and one of my fifth-grade team members was actively using TMs in our classes. In my class there was varying ability to read/write independently and a wide variance in the depth of learning and quality of written responses for some individuals, so there were many diverse learning styles and needs within the same group. TM authors recommend at least an 8-week introduction period followed by a year-long process to work with TMs until they start to become intuitive for learners. My own 3-4 week study pales in comparison, and the limited exposure to TMs (except in flexible grade-level groupings for math and reading, where students may have been exposed to more TM training with the pilot teacher, or from other grade level team teachers attempting to emulate it without the training) may have produced inequities, with TM instruction being repetition for some and new for others. In addition to absences, which are unavoidable, the majority of instruction occurred at the end of the day, which may have produced a loss of attention span, frequent interruptions (PA announcements, students leaving school for appointments) and the like. One other possible constraint was determining whether it was the presentation of new material (“human body”) or new map use (or both) that made the difference in results.

In an attempt to satisfy these constraints on the problem, various actions through instruction were used to narrow to a range of solutions (Robinson & Lai, 2006). I worked to keep students focused (using SmartBoard strategies designed to engage) and created predictable routines for each lesson.
I implemented alternate activities to increase attention span (integrated movement, varied use of text, video, SmartBoard, discussion, and cooperative groupings for pair-shares and jigsaw activities). I also tried to transition the students in their use of TMs by starting out with the pre-printed blank TMs and gradually releasing them to use their own self-drawn maps. TMs by their design were meant to allow for differentiation, and the needs of those learners with varying abilities were met in the flexible nature of the maps. I attempted to prevent lesson interruption by occasionally rearranging the day so that the science TMs lesson could be earlier to avoid afternoon disruption. I also maintained constant communication with other TM teachers to aim for consistency in instruction, modeling and the rate of exposure to TMs.

The subsequent consequences of instructional actions produced a more focused and engaged climate with an apparent increase in student interest (but again, was that the TMs or the subject matter – or both?), and both intended and unintended consequences emerged (Robinson & Lai, 2006). Intended consequences were positive reactions to the TMs demo: students were more motivated to get their own maps written, were more productive in written responses, and the collaborative groups seemed to increase the sense of community while working with new maps and sharing connections. I made sure to encourage different types of color-coding on the maps, and because of this, student connections to references and schema were more effective in the depth to which they could transfer their learning during class.

Other expected results included the findings that TMs were most effective when used in combination for students to fully develop their conclusions (Hyerle & Yeager, 2007). The most precise and detailed responses came when students used two or more maps to explain their thinking. Also intended was my increased ability to use TMs to assess students' development of concepts and misconceptions (Novak & Gowin, 1984). It was far easier to spot patterns and connections in thinking as well as the glaring gaps from misconceptions or map misuse. It was anticipated that students would achieve the ability to organize their ideas independently, assess the quality and quantity of their own written responses, and possess increased awareness of their own thinking and an increased motivation to learn. In conversations with students, it was made clear that they thought these strategies helped them as they read through text, watched videos and participated in class discussions. 20% of the class even alluded to the fact that the use of TMs helped them more than the traditional study guides when preparing for the test.

There were, however, unintended consequences that I did not anticipate. Although I knew that those students who generally do well would continue to do well (and the data supported this), I assumed that all of those who did not generally do well would make some growth. Instead, I found that only a small group began to improve, and for some, there was no improvement. For some inexplicable reason, there was also a reversal in performance about midway through, and students started to use TMs less and less effectively. I relied on the assumption that the maps would begin to make a permanent change in the written response quality of most students, and during direct instruction or independent class time, it did.
However, on assessments it appears that perhaps the “novelty” wore off, because TMs were utilized less and less across the board. My only thought is that the constraint of available time and length of data collection could not be altered; therefore it could still be too early to tell if any significant growth is possible without continuing the TM instruction and monitoring again toward the end of the school year.

The overall presentation of the data is organized both by type of data tool and by chronological stage of data collection. By examining the data closely I was able to comment and reflect on different aspects, which appear either below the data sets or further on in the discussion section. As with all the data presented, absences occurred on a daily basis, so there are few data sets showing results for all the respondents at any one time. For “Questionnaire #1”, I thought it was important to establish a baseline of prior knowledge about the specific content to be studied (the human body) and of various thinking strategies and tools previously used, prior to the introduction of any TMs in my science class.

Figure 3: Student Questionnaire #1: Prior Knowledge

Q-A: How much do you know about the human body before we start our unit?

Participant code    “a lot”   “some”   “a little”   “nothing”
B 4/3               0         2        0            0
B5                  1         2        2            0
G 4/3               0         0        6            0
G5                  0         4        0            1

Q-B: What strategies and tools do you use most to help you learn information?*

Participant code    “study guide”   “books”   “notes”   “TMs”
B 4/3               0               0         2         1
B5                  1               2         1         0
G 4/3               0               1         3         2
G5                  1               1         1         1

*Other: There were other responses noted, such as: mnemonics, cheers/chants, flash cards, games, videos, online support, group study, memorization…

(See Appendix A for actual questionnaire)

Most of the respondents (about 88%) had “some” or “a little” prior knowledge about the human body content, which I think would make for an easier transition into using TMs to organize knowledge and remember terms, facts and processes. There were 4 students (18%) who already used TMs to varying degrees as a learning strategy or tool in other classes, and the majority (59%) used some traditional form of study strategy. A smaller percentage (22%) used additional strategies that they delineated, but the data shows that all students were comfortable using strategies of some kind. I anticipated that even though the number of participants previously familiar with TMs was small, this number would grow with the instruction in each sub-unit, or body system. As for the ethnicity data, there are few, if any, implications here that show variance among ethnic groups, and I wasn't sure going forward if any significant patterns would be revealed.
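
As a quick check of the arithmetic behind these percentages, the questionnaire tallies in Figure 3 can be summed directly. This is a sketch only: the Q-B denominator of 22 respondents is an assumption inferred from the quoted percentages (17 tallies in the four listed categories plus the “Other” responses), not a figure stated explicitly in the report, and rounding lands within a point of the numbers quoted above.

```python
# Worked check of the Questionnaire #1 percentages (sketch only).

# Q-A tallies summed across the four participant-code groups in Figure 3.
q_a = {"a lot": 1, "some": 8, "a little": 8, "nothing": 1}
q_a_total = sum(q_a.values())  # 18 respondents present for Q-A
some_or_little = q_a["some"] + q_a["a little"]
print(f"Q-A 'some' or 'a little': {some_or_little}/{q_a_total} = {some_or_little / q_a_total:.0%}")

# Q-B tallies summed across groups; 22 respondents is an assumption, not a stated figure.
q_b = {"study guide": 2, "books": 4, "notes": 7, "TMs": 4}
q_b_total = 22
traditional = q_b["study guide"] + q_b["books"] + q_b["notes"]
print(f"Q-B traditional strategies: {traditional}/{q_b_total} = {traditional / q_b_total:.0%}")
print(f"Q-B already using TMs: {q_b['TMs']}/{q_b_total} = {q_b['TMs'] / q_b_total:.0%}")
```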

(One of the initial constraints I noticed was the issue of participant absences, which made for a varying number of responses every time I collected data. Because the absences were attributed to different individuals each time, I wasn't sure how to address the validity of data collection, other than to continue to be consistent for all who were present. If I were to try to “make up” the data when a student returned, it might skew the overall data because it was collected in different settings, and if I were to ignore the absent individuals, then the total number of participants would change with each collection tool. I chose to try a little of each remedy, although I'm not sure that I can disaggregate the effect from the overall data.)

Throughout each sub-unit, pre-tests were given to assess prior knowledge. At no time (especially in the later pre-tests) did students independently utilize TMs to share their thinking. I assumed that they would gradually start to appear, but not once in a pre-test situation was a TM used by a student. During instruction, TMs were utilized by all students to their fullest capacity, since I began by modeling with the whole class emulating, and then I would wander as they worked, looking for completeness. When it came time to assess independent use, the next form of data collection was the post-tests following each sub-unit, where it was anticipated that TMs would appear in student responses. Both the “pre-test” and “post-test” asked an identical question: whether the student could describe the process for that specific body system (such as the process of digestion, the process of breathing, the process of blood circulation, the process of the brain sending a message, etc.). Most pre-tests were answered in the same manner (short written responses to the question, or “I don't know”), but after content instruction and TMs modeling of the “flow map”, the post-tests began to show some TMs use with varying degrees of success:

Figure 4: Post-Test Results

Body System    Successful TM Use   Marginal TM Use   Non-TM Use   Participant Code
Digestive      1                   1                 0            B 4/3
               4                   5                 0            B5
               4                   3                 0            G 4/3
               1                   5                 0            G5
Respiratory    0                   2                 0            B 4/3
               2                   5                 1            B5
               2                   6                 0            G 4/3
               2                   0                 3            G5
Circulatory    0                   3                 0            B 4/3
               1                   3                 3            B5
               4                   2                 2            G 4/3
               1                   3                 0            G5
Nervous        0                   2                 3            B 4/3
               0                   1                 4            B5
               0                   4                 1            G 4/3
               1                   2                 2            G5

The initial pre-test responses were for the most part the same – minimal answers with little terminology or content knowledge, and no use of TMs – even after the students had started to receive instruction in their use. Targeted TMs were introduced (such as the “flow map” for sequences of events/processes) and utilized throughout instruction, and when the same question was posed on the post-test, my expectation was that the students would use flow maps on the assessment to explain the specific body function process. The data shows TMs usage with varying degrees of success over time – the digestive system was the first sub-unit introduced, with the nervous system being one of the last in the 4-week time frame. Some participants successfully used the correct TMs with the correct terminology at each body system stage. Others attempted to use the flow map but did not provide the correct information in each stage of the process. Still others made no attempt at using TMs whatsoever, reverting back to short written responses or various other diagrams/graphic organizers such as Venn diagrams, cycles and labeled models. Subsequent body systems (skeletal and muscular) did not make use of the same type of flow map for processes, and a different set of TMs was utilized. Data was not collected in the same manner for these final systems.

After all of the TM instruction and modeling was introduced, about midway through the data collection, a questionnaire was given to students to see if they could identify the specific thinking process and match it to the corresponding map:

Figure 5: Student Questionnaire #2: Identify Intended Thinking Strategy

Q: Match each TM to its intended thinking strategy (that is, why do we use it)?

Participant code     0   1   2   3   4   5   6   (number correct of 6)
B 4/3                0   0   1   0   0   0   1
B5                   0   2   1   2   2   0   1
G 4/3                0   0   2   1   3   0   0
G5                   0   0   1   3   2   0   0

(See Appendix D for actual questionnaire)

Most of the students ranged from 2 to 4 correct matches, with only about 41% passing acceptably. It was still only midway through an admittedly short research period, so there was hope that with more practice an increasing number of students would be more successful in the future.

Yet another area within the data where unintended results occurred was with the human body “final test”. It was designed by one of the other grade level team members and administered to the entire grade level – including my participants (and ultimately was not always consistent in language with my earlier post-tests). Students were asked on 3 occasions to describe body function “processes” – all 3 of which could have been suited to TMs responses if the student was aware of the cues (the test used similar language like “describe how __ works”, “describe how ___ occurs”, and, in one instance very consistent with a pre-test and class work, “compare and contrast ____”; in addition to a bonus section where students were asked to “describe” additional facts not previously required).
Even though the questions were worded in a slightly different way, it was still my hope that my students would see the key words, make the connections and utilize the appropriate TMs:

Figure 6: Final Test Questions (that included key word descriptions of specific processes)

Participant Code   Arteries & Veins     Breathing            Muscle Pairs         “Extra credit facts”   No TM attempt
                   (success/marginal)   (success/marginal)   (success/marginal)   (success/marginal)
B 4/3              0/1                  0/1                  0/0                  0/0                    1
B5                 1/1                  0/2                  0/0                  0/0                    5
G 4/3              1/2                  0/0                  0/0                  0/0                    3
G5                 2/0                  0/0                  0/0                  0/0                    4

Students seemed to utilize TMs much more fully during class and independently than they did on the final assessment. Perhaps this was due to the fact that I purposely did not mention TMs as a specific option for test responses; I wanted rather to see if they would transfer the use of TMs independently, without my prompting. The number of students who were successful with their maps on these questions (only about 16%) was disappointing, but even more alarming was the number of students who chose not to use a TM at all (about 54%) on test responses, even within the “low pressure” extra credit section. The “muscle pairs” question also yielded no attempts at TM usage, for some unknown reason, even though we distinctly used TMs in class to discuss the relationship.

At the end of all the TM instruction and modeling, and after the final assessment near the end of the data collection, a third questionnaire was given to students to see if they could make connections with the key words and phrases associated with each thinking process and then draw the corresponding map:

Figure 7: Student Questionnaire #3: Connect with Key Words and Draw the TM

Q: Match each TM to its intended thinking strategy (that is, why do we use it)?

Participant code     0   1   2   3   4   5   6   (number correct of 6)
B 4/3                0   1   0   0   0   1   0
B5                   0   0   2   1   3   0   1
G 4/3                0   1   0   2   0   0   2
G5                   0   1   1   2   0   0   1

(See Appendix E for actual questionnaire)

Most of the students ranged once again from 2 to 4 correct responses and, similarly to the second questionnaire, only about 42% passed within acceptable margins. Although a few had moved over into the 4, 5, or 6 correct columns, even more had moved in the opposite direction, showing more incorrect responses overall. I was curious how this result might have related to the frequency of use of each type of map over the course of the data collection. I tallied the frequency of each map as it was used throughout the sub-units, then I determined the percentage of students who correctly connected key words with drawings to see if there were any correlations:

Figure 8: Frequency of TM Use vs. Success Rate on Questionnaire #3

Type of map             Frequency of use across sub-units   % correct responses
“circle map”            7                                   70%
“tree map”              3                                   39%
“double bubble map”     1                                   48%
“flow map”              5                                   57%
“cause & effect map”    1                                   39%
“brace map”             1                                   30%

*Not all TM types were used during this data collection period.

The “circle map” was most widely used, and this showed in the number of correct responses. The “flow map” and “tree map” were the next most used maps, yet the number of correct responses doesn't seem to correlate. Maybe it had to do with the amount of exposure time – or perhaps my sample is too small?

The remainder of the data was collected in the form of observation notes on student engagement, depth of knowledge and colleague interview responses (see Appendices F & G). General observations included that, after four weeks of TM introduction, some students (about 25%) began to speak about content in a more confident manner and often referred back to a TM when making a point and attempting to clarify their thoughts. More students (about 30%) were able to discuss concepts in great detail using TMs with little prompting from me. I observed a limited increase in written response depth and quality (about 20%), but a much larger increase in class engagement (about 60%). The data collected from colleague input was more revealing, however. All (100%) of my colleagues found the use of TMs, especially the common language they created, to be extremely helpful. Vertical conversations including teachers and administration revealed that students were largely on task and participated more when able to offer the different perspectives that the TMs encourage. The use of TMs also enabled the students in their classrooms to be more independent and concise, and several expressed their amazement at what their students could do with the maps. Most (about 80%) agreed that more teacher professional training was necessary moving forward.
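
One concrete way to pose the correlation question raised with Figure 8 is to compute a correlation coefficient over the six (frequency of use, % correct) pairs reported there. The sketch below does this in plain Python; it is offered as an illustration rather than as part of the study's analysis, and with only six maps the result is suggestive at best, not a significance test.

```python
# Sketch: Pearson correlation between frequency of map use and % correct (Figure 8 data).
from math import sqrt

usage = {
    "circle map": (7, 70),
    "tree map": (3, 39),
    "double bubble map": (1, 48),
    "flow map": (5, 57),
    "cause & effect map": (1, 39),
    "brace map": (1, 30),
}

freqs = [f for f, _ in usage.values()]
correct = [c for _, c in usage.values()]

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"Pearson r (frequency vs. % correct): {pearson(freqs, correct):.2f}")  # roughly 0.87
```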

Emerging Themes/Results

As a result of all the data, I was surprised to see that the strongest data (and participation in TMs usage) came at the very beginning and declined with every data collection that followed. Perhaps most curious for me is that the data appears to show a decrease in consistent and independent TM use; instead of getting stronger with TMs use, it appears that the students were becoming less and less consistent with the proper TMs, and not independently choosing those graphic organizers, but reverting back to other more general forms (or poorly constructed written responses). Not at all what I had expected! In addition, there appears to be some difficulty for some participants in remembering which TM belongs with which thinking/process skill, and I'm concerned that this might present a “double load” and be just as hard as learning the content material itself, presenting further areas of confusion. As an initial response to my research questions, the first use of TMs produced more depth of responses and more engagement in the TMs process. As each sub-unit progressed, the participants utilized TMs in their notes, class discussions and post-tests, but with decreasing degrees of accuracy. There seemed to be a small number of students who increased their understanding, response levels and engagement, but they weren't the ones who needed to in terms of skill progression. I am still trying to sort out the reasons why this may have occurred.

Some other constraints I hadn't thought to explore emerged as learning style constraints that TMs seemed to produce – TMs being predominantly visual and systematic organizers (with some tactile qualities of writing or drawing, and auditory qualities when paired with “verbal rehearsal” of their contents). I realize after looking at the data that I should have allowed students the opportunity to “think aloud” and verbalize the process; they could have explained it better that way, and it would have produced different results for some students. In wanting to remain consistent in how I modeled the use of TMs and the expectations for assessment responses, I overlooked meeting the needs of other learning styles. Another potential barrier to clear and consistent data could have been that this participant group may have had varying degrees of previous “response instruction” (modeled by different team teachers in flexible groupings for reading and math, in addition to a different graphic organizer system used in a school-wide writing program), as well as different amounts of previous exposure to TMs use, and these instances may have created some confusion when students were confronted with the TMs approach.

My overall findings appear to parallel those I read in a sample study (Burden & Silver, 2006) where teachers were convinced that students were capable of achieving more than what they were demonstrating and intended to raise expectations and realize an increased number of students becoming more active learners with TMs. Although the results did not quite meet their assumptions, and the researchers noted that the TMs needed extensive further piloting, they came to the conclusion that students were just starting to show signs that the TM tools helped guide thinking and problem-solving and that there were many different uses still ahead for multiple maps implementation.

I too agree that there are early signs of habit-forming and that TMs are beneficial, having connections to several related clusters of Costa's “Habits of Mind” (Hyerle, 2008), such as “attending to tasks that require patience & analytical, detailed thinking”. Over time, students should develop fluency with each map and become more able to transfer skills from multiple maps into each content area, having “the flexibility to choose and use maps for a variety of purposes” (Hyerle & Williams, 2009).

Discussion and Suggestions for Further Research

From the perspective of my action research, I have learned that TMs are really powerful tools when modeled and used in appropriate ways. Consistent use (and especially their use across disciplines) provides the ability to make connections to one's learning in more depth.
thoughts and personalize learning: students can utilize the map features (frames of reference and color coding) to differentiate their learning, and the maps give structure to note-taking and to the absorption of new content.

One realization I had as a result of this action research is that I have barely scratched the surface with my implementation of TMs. Although I learned these TM strategies "second-hand" from a team member who was part of the actual pilot group, I now recognize that there are more deliberate skills and strategies I need to learn in order to have the maximum effect on my students. As I observed my students, I noticed an emerging trend: many still required constant "nudging" to utilize the TMs to their fullest potential, and the most difficult instructional glitch was getting some of the students to transition to independence in their use of TMs.

In seeking answers to my research questions, I have come to appreciate (and need to reinforce) the notion that only through extended exposure over time and consistent practice will TM use become intuitive for the students (and teachers)! Reflecting on my research technique, I found this use of field notes to be more powerful than other types of anecdotal record keeping I have attempted in the past. I have also gained increased reflection on, and metacognition about, my own instructional practices, and my use of these strategies has grown as TMs have slowly become more intuitive with use.

As for the future, I will continue to utilize TMs in my classroom across all subject disciplines. I will also begin using portfolios to monitor learners as they assimilate new knowledge and develop incremental skill with TMs, making note of patterns as they emerge. This research has warranted continued exploration of my practice: actively inviting explicit thinking, integrating more TMs into the curriculum, and monitoring the effects of TMs on scores. The way to increase long-term memory (and improve student achievement) is to introduce new information gradually and repeat it at regular intervals to "mind the gap" while attracting and holding attention (Medina, 2008); this increased exposure to TM usage should solidify these metacognitive strategies for students to take with them in future years.

I have since participated in a school-wide training on TMs and reviewed a culminating report of the first pilot group's findings. I will be part of a summer training conducted by the first pilot group to review "lessons learned" and to share how to implement TMs successfully in the classroom. I have also recently been invited to participate in the second round of the TMs pilot program in the fall, so I will be starting again with action research on TMs and refocusing my efforts with a new class. Perhaps I will also look at keeping students motivated once they have been trained in TM use, in an effort to prevent the decline in use over time that I found in this current study. I look forward to learning more about Thinking Maps® in the future, increasing my instructional skill and continually informing my practice.
References

Burden, B., & Silver, J. (2006). Thinking maps in action. Teaching Thinking & Creativity. Retrieved from http://www.teachthinking.com
Caine, R. N., & Caine, G. (1991). Making connections: Teaching and the human brain. Wheaton, MD: ASCD.
Dual-coding theory. (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Dual-coding_theory
Gerlic, I., & Jausovec, N. (1999). Multimedia: Differences in cognitive processes observed with EEG. Educational Technology Research and Development, 47(3), 5-14.
Hubbard, R. S., & Power, B. M. (2003). The art of classroom inquiry: A handbook for teacher-researchers. Portsmouth, NH: Heinemann.
Hyerle, D. (1996). Thinking maps: Seeing is understanding. Educational Leadership, 53(4).
Hyerle, D., Curtis, S., & Alper, L. (2004). Student successes with thinking maps: School-based research, results and models for achievement using visual tools. Thousand Oaks, CA: Corwin Press.
Hyerle, D., & Yeager, C. (2007). Thinking maps: A language for learning. Cary, NC: Thinking Maps, Inc.
Hyerle, D. (2008). Thinking maps: Visual tools for activating habits of mind. In A. L. Costa & B. Kallick (Eds.), Learning and leading with habits of mind: 16 essential characteristics for success (pp. 149-176). Alexandria, VA: ASCD.
Hyerle, D., & Williams, K. (2009). Bifocal assessment in the cognitive age: Thinking maps for assessing content learning and cognitive processes. New Hampshire Journal of Education, 12, 32-38.
Jensen, E. (1998). Brain-based learning: The new paradigm of teaching. Thousand Oaks, CA: Sage.
Leary, S. F. (1999). The effect of thinking maps instruction on the achievement of fourth-grade students. Retrieved from http://www.thinkingfoundation.org/research/graduate_studies/pdf/samuel-leary-dissertation.pdf
Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: ASCD.
McCormick, S. (1995). Instructing students who have literacy problems. Englewood Cliffs, NJ: Prentice Hall.
Medina, J. (2008). Brain rules. Retrieved from IT 6710 course site: http://cu.ecollege.com
Merkley, D. M., & Jeffries, D. (2000/2001). Guidelines for implementing a graphic organizer. The Reading Teacher, 54(4), 350-357.
Novak, J. D., & Gowin, D. B. (1984). Learning how to learn. New York, NY: Cambridge University Press.
Paivio, A. (1990). Mental representations: A dual coding approach. Available from http://books.google.com/books?id=hLGmKkh_4K8C&lpg=PA3&ots=B1GWeFjljn&dq=Paivio%20%2B%20dual%20coding%20theory&lr&pg=PP1#v=onepage&q=Paivio%20+%20dual%20coding%20theory&f=false
Ritchhart, R., Turner, T., & Hadar, L. (2009). Uncovering students' thinking about thinking using concept maps. Metacognition and Learning, 4(2), 145-159.
Robinson, V., & Lai, M. K. (2006). Practitioner research for educators: A guide to improving classrooms and schools. Thousand Oaks, CA: Corwin Press.
Stull, A. T., & Mayer, R. (2007). Learning by doing versus learning by viewing: Three experimental comparisons of learner-generated versus author-provided graphic organizers. Journal of Educational Psychology, 99(4), 808-820.
Tomlinson, C. A., & Kalbfleisch, M. L. (1998). Teach me, teach my brain: A call for differentiated classrooms. Educational Leadership, 56(3), 52-55.
Appendix A: Data collection questionnaire sample 1

Questionnaire #1

Directions:
• Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.

1. How much do you know about "the human body" before we start our unit? (Circle one)
   a lot     some     a little     nothing

2. What strategies or tools do you use the most to help you learn and remember new information?
   ________________________________________________
   ________________________________________________

3. How do you think you usually do on tests? (Circle one)
   very well     pretty good     just OK     not very well

4. What do you think of the quality of your test responses? (Select the most appropriate answer)
   ☐ My work is usually thorough and I'm proud of my success
   ☐ I usually recognize my mistakes and I can fix them
   ☐ I'm usually not sure where I went wrong and need further clarification
   ☐ My responses are not usually sufficient
   ☐ Other: _______________________________________

**************************************************************

The questions above were about your learning strategies for science. Please answer the following questions about learning in all subject areas:

Should students be taught "thinking skills"? (Circle one)   Y   N
Why or why not?
   ________________________________________________
   ________________________________________________

List all of the thinking skills/tools/strategies that you use to help you learn in school:
   ____________________________     ____________________________
   ____________________________     ____________________________
   ____________________________     ____________________________
Appendix B: Data collection pre-test sample

Pretest #1: Digestion

Directions:
• Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.

1. Using words or pictures, describe how human digestion works:

2. How do you know?
   ________________________________________________
   ________________________________________________
   ________________________________________________

3. Where will you be able to find out more about digestion?
   ________________________________________________
   ________________________________________________
   ________________________________________________
Appendix C: Data collection post-test sample

Post-test #3: Circulation

Directions:
• Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.

1. Using words and/or pictures, describe the process of human circulation:

2. How do you know?
   ________________________________________________
   ________________________________________________
   ________________________________________________
Appendix D: Data collection questionnaire sample 2

Questionnaire #2

Directions:
• Look at each of the "thinking maps" on the left side of the page.
• Carefully draw a line to match each to its intended thinking strategy on the right side of the page.
Restated: Match each "thinking map" to the reason we use it…

[Images of the six thinking maps appeared in the left column of the original form.]

• comparing & contrasting
• sequencing & ordering
• identifying part-to-whole relationships
• brainstorming or defining in context
• analyzing cause & effect
• classifying & grouping
Appendix E: Data collection questionnaire sample 3

Questionnaire #3

Directions:
• Carefully read each description below that contains the "code words & phrases" that help us know which type of "thinking map" works best for what we study.
• Decide which "thinking map" is being described and draw it into the frame of reference.
Restated: Draw the "thinking map" that best describes the phrases below it…

define · classify · compare/contrast
   brainstorm; categorize; similarities/differences; describe in context; organize; alike/unlike; list; group; distinguish between…; identify; sort; show what ___ has in common…; generate ideas…; give sufficient details for…; describe shared components of…; …as many as you can; types of…; …tell everything you know; describe parts & functions of…; …use your prior knowledge; list & elaborate on…; explore the meaning of…

sequence · cause/effect · part-to-whole
   order events; discuss consequences of…; describe/show structure; describe phases/cycles/process; why did/does…; what are the parts/components; recount/retell; what would happen if…; take apart __ / put together __; summarize procedures; predict the outcome of…; __ is made up of…; tell/show how…; describe changes; describe the subparts/subcomponents; identify steps/stages; discuss strategy/outcome; first/then/next…; what is the impact of…; how does __ fit in a larger context; put in order; what would be the result if…; how to get from ___ to ___
Appendix F: Field notes excerpts

Entry (3/9/10): Started on the third sub-unit today, "Circulatory System"… introduced circle map to be completed as part of the "morning menu"… First elicited "what you already know" and marked the pre-established color-coding on the board: brain = pencil. For the first time, a G5 asked at the beginning of "morning menu", "Can we add in our own notes with other colors once we finish our brainstorming?" This is the first time that anyone has made the connection to use additional colors for additional sources on their own, without my prompting. Maybe they're starting to get it?? We then added the remainder of the color-coding chart to the board and left it there permanently for the duration of all the human body units: brain = pencil; text = 2nd color; class discussions & teacher lesson = 3rd color; video = 4th color. Assigned corresponding text readings and note-taking to the menu… Will introduce tree map this afternoon for 3 types of blood cells.

***

Entry (3/17/10): I walked around the room during the group work activity (students were creating a flow chart for "Nervous System": How does the brain send/respond to a message?)… observed individuals displaying negative (-) and/or positive (+) behaviors with respect to the DB (double bubble) map skill…

General observations:
(B 3/4) participants: - quickly scribbling a double bubble, no care taken to align related bubbles; - off task and talking, no signs of map being started; - needs explanation from elbow partner, does not remember how to use DB
(B 5) participants: - starts a Venn diagram when asked to compare/contrast; + explaining how to use a double bubble to elbow partner; + completed DB map before rest of group, with lots of detail
(G 3/4) participants: + 2 working together, "divide and conquer" (one uses red for compare, the other uses blue for contrast); + using tree map for facts to put into DB map; + comparing DB map to the difference paragraph "bug box" in writing
(G 5) participants: - uses bubble map instead of DB map

Note: this week's lessons are becoming "extra", feeling like "one more thing" in an already tight schedule; have to embed brace map as a "quick check" activity (direct instruction) instead, to make time prior to the quiz…

***

Entry (3/23/10): Group observation tally of behaviors during the "Skeletal/Muscular" assessment:

Behavior | (B 3/4) | (B 5) | (G 3/4) | (G 5)
Staring at page or looking up in thought | III | I | I | II
Using FM instead of MFM to show contractions | II | IIII | III | II
Color-coding maps on test | I | I | II | I
Asks what map to use | I | I

Note: 6 absences today!! Need to determine how that will affect overall data… Make-up exams!!
Appendix G: Interview question format

The following questions were asked of my colleagues (grade-level team members, pilot participants, and other teachers trying to emulate the pilot instruction on their own) in an effort to find common experiences and to help me make comparisons between my class and other classes at similar levels of implementation:

Quantitative
1. Which TM do you most frequently use? Why?
2. How often do you use each of the TMs?
3. How many of each type of map have the students completed?
4. Have you measured growth?
Probe further: How long does the introduction/modeling take for each?

Qualitative
1. What instructional strategies do you use to teach TMs?
2. Describe the experience of using TMs in your classroom.
3. How have you extended the use of TMs?
Probe further: Which TMs seem easiest for your students? Most difficult?

Reflective: Outcome of Lesson
1. How does the TM relate to the lesson being taught?
2. How do you monitor engagement?
3. How do you incorporate TMs into assessment?
Probe further: How do you differentiate using TMs?

Professional Impact and Practice
1. Describe your pacing for TM implementation.
2. What successes have you observed?
3. What areas for improvement have you observed?
Probe further: How does the use of TMs support our school or grade-level agreements?

(Adapted from the literature: Leary, 1999; that study's questionnaire format was used as the model.)
Appendix H: Research Timeline

Task | Date/Time Frame | People involved
Scope the research problem | Begin Feb. 1 (due Feb. 7) | Self
Create timeline, or "plan of action" | Begin Feb. 1 (due Feb. 14) | Self
Conduct the initial literature review | Begin Feb. 1 (due Apr. 4) | Self
Design the study; plan data collection | Begin Feb. 7 (due Feb. 21) | Self; team
Create data collection tools | Begin Feb. 7 (due Feb. 21) | Self
Arrange to conduct research with respondents and involved educators | Begin Feb. 7 (due Feb. 21) | Self; team; students
Create permission forms and obtain permission to conduct research at designated site | Begin Feb. 7 (due Feb. 21) | Self; team; parents
Collect data (activity part 1) | Begin Feb. 22 (due Feb. 28) | Self; team; students
Collect data (activity part 2) | Begin Feb. 22 (due Mar. 7) | Self; team; students
Continue data collection | Begin Feb. 22 (due Mar. 26) | Self; team; students
Analyze data | Begin Mar. 8 (due Mar. 28) | Self; team
Make inferences and determine findings | Begin Mar. 8 (due Apr. 11) | Self
Literature review (activity due) | Begin Feb. 1 (due Apr. 4) | Self
Finalize literature review | Begin Feb. 1 (due Apr. 25) | Self
Write final report draft | Begin Feb. 7 (due Apr. 11) | Self
Peer edit | Begin Apr. 12 (due Apr. 18) | Self; IT 6720 cohort; team
Review findings with key person | Begin Apr. 12 (due Apr. 18) | Self; team
Revise final report | Begin Apr. 19 (due Apr. 25) | Self
Submit final report | ** Due Apr. 25 ** | Self
Prepare presentation | Begin Apr. 25 (due May 2) | Self
Presentation: review of study and findings | ** Due May 2 ** | Self