Teacher behavior and student achievement.2
  • 1. TEACHER BEHAVIOR AND STUDENT ACHIEVEMENT: A STUDY IN SYSTEM DYNAMICS SIMULATION A Dissertation Presented to the Faculty of the Graduate School The University of Memphis In Partial Fulfillment of the Requirements for the Degree Doctor of Education by Jorge O. Nelson December, 1995
  • 2. Copyright © Jorge O. Nelson, 1995. All rights reserved.
  • 3. DEDICATION
    This dissertation is dedicated to my parents, Doris and Gene Nelson, who have given me the love of learning.
  • 4. ACKNOWLEDGMENTS
    I would like to thank my major professor, Dr. Frank Markus, for his understanding and guidance. I would also like to thank the other committee members, Dr. Robert Beach, Dr. Randy Dunn, Dr. Ted Meyers and Dr. Tom Valesky, for their comments and suggestions on this project. This project was not without difficulties, but without the support and interest of all the committee members, there would not have been a project at all. I would like to thank my mother, Doris Nelson, for her integrity and belief in my potential. I would like to thank my father, Gene Nelson, and his father, my grandfather, Andy Nelson, for starting me on this journey, as I hope to help my son with his someday.
  • 5. ABSTRACT
    Nelson, Jorge O. Ed.D. The University of Memphis. December 1995. Teacher behavior and student achievement: A study in system dynamics simulation. Major Professor: Frank W. Markus, Ph.D.
    Systemic change has been identified as a necessary step in improving public schools. A tool to help define and improve the system of schools could be helpful in such reform efforts. Computer simulations are common tools for improvement and training in business and industry. These computer models are used by management for continually improving the system as well as training employees in how to work within the system. Education is one area that is due for such a tool to help in identifying and continually improving the system of public schools.
    A simple beginning in creating a tool for schools is the development and validation of a simulation of, arguably, the most basic and profound process in education—the relationship between student and teacher. The following study was an attempt to develop a computer simulation of a single component in the entire system of education: a model of identified teacher behaviors which affect student achievement in a regular classroom setting.
    Forrester's (1968) system dynamics method for computer simulation was used as a theoretical approach in creating the model. A knowledge synthesis matrix of research-based conclusions as identified by Brophy and Good (1986) was used to insure that valid, research-based conclusions were represented in the model. A theory of direct instruction, adapted from Hunter's (1967) Instructional Theory Into Practice (ITIP) model, was used for the structural framework of the teaching paradigm. The simulation was developed on a Macintosh platform
  • 6. running in the ithink! system dynamics software environment (High Performance Systems, Inc., 1994).
    Data were gathered in the form of simulation outcomes of the knowledge synthesis findings modeled in a computer simulation. Participants were solicited to volunteer and experience the simulation in two validation sessions. Results of two sets of six individual simulation sessions, wherein teachers, school administrators and university professors manipulated the inputs during simulation sessions, were used for validation purposes.
    It was concluded that knowledge synthesis findings from a body of research-based studies could be simulated on a computer in a system dynamics environment. The implications of the simulation of research-based conclusions in a computer model are that such simulations may provide educators with a tool to experiment with processes that go on in the regular classroom without causing any strain or stress to students, teachers, or any other part of the real-world system in the process.
    It is recommended that simulations such as this one be used to help educators better understand the interdependencies which exist in the school setting. Teachers, administrators, and researchers can see the "big picture" as they manipulate the research-based findings in this "microworld" of a classroom. It is also recommended that more simulations such as this one be developed, and that by improving the process and product of educational simulations, these studies will help educators better understand schools as complex, dynamic systems.
  • 7. TABLE OF CONTENTS
    Chapter I. INTRODUCTION AND LITERATURE REVIEW
      Introduction
      Problem Statement
      Purpose of the Study
      Research Question
      Review of the Literature
        Counterintuitive Behaviors
        Complexity in Schools
        Cognition in Simulation
        Promises of Simulation in Educational Reform
        Teacher/Student Simulation Component
        A Model of Direct Instruction
        Knowledge Synthesis Matrix
        Simulation Software
      Limitations of the Study
    Chapter II. RESEARCH METHODOLOGY
      Simulation Modeling
        Model Criteria & Knowledge Synthesis
        System Definition
        Grouping of Variables & Data Identification
        Mapping the Model
        Model Translation
        Model Calibration
        Model Validation
  • 8. TABLE OF CONTENTS (continued)
    Chapter III. RESULTS
      Simulation Model of Knowledge Synthesis
      Validation Results
        Pre-simulation Questionnaire
        Simulation Sessions
        Post-simulation Questionnaire
      Additional Findings
    Chapter IV. SUMMARY, CONCLUSIONS, IMPLICATIONS AND RECOMMENDATIONS
      Summary
      Conclusions
      Implications
        Teachers and Instruction
        Administrators
        Research
        Schools as Learning Organizations
      Recommendations
    REFERENCES
    APPENDIX A
    APPENDIX B
    APPENDIX C
    APPENDIX D
    APPENDIX E
    APPENDIX F
    VITA
  • 9. LIST OF TABLES
    1. The Instructional Theory Into Practice (ITIP) Direct Instruction Model
    2. Knowledge Synthesis Matrix of Teacher Behaviors that Affect Student Achievement
    3. Inclusion Criteria by Smith and Klein (1991) as Applied to Knowledge Synthesis Report by Brophy and Good (1986)
    4. Example of Equations Generated From a Stock and Flow Diagram
    5. Example of "Steady State" Simulation Session versus Actual Respondent Data Entry
    6. Study 1: Pre-Simulation Questionnaire Data Collection Report
    7. Study 2: Pre-Simulation Questionnaire Data Collection Report
    8. Study 1: Post-Simulation Questionnaire Data Collection Report
    9. Study 2: Post-Simulation Questionnaire Data Collection Report
  • 10. LIST OF FIGURES
    1. Four basic constructs of system dynamics notation
    2. High-level description of entire simulation model
    3. Example of submodel
    4. Example of initial stock data entry
    5. First example of converter data entry
    6. Second example of converter data entry
    7. First example of initial flow data entry
    8. Second example of initial flow data entry
    9. Example of simulation output
    10. Example of sensitivity analysis data entry
    11. Example of four sensitivity analyses runs
    12. Flow diagram of the simulation modeling sequence
    13. Feedback circle diagram of how teacher behaviors may affect student performance
    14. Student sector
    15. Teacher sector
    16. Giving information
    17. Questioning the students
    18. Reacting to student responses
    19. Handling seatwork
    20. Teacher burnout
  • 11. LIST OF FIGURES (continued)
    21. System dynamics model of teacher behavior as it affects student achievement
    22. Post-question wait time impact
    23. Current student achievement
    24. Example of "steady state" output using data set from table 5
    25. Example of respondent #6 output using data set from table 5
  • 12. CHAPTER I. INTRODUCTION AND LITERATURE REVIEW
    Introduction
    Continuous improvement in American public education is perhaps one of the most important issues facing educators in the future. Digate and Rhodes (1995) describe how there is an increase in complaints regarding the improvement of public schools. New strategies and tactics to assist efforts to improve must be developed to counteract this growing perception of mediocrity in our schools.
    Recent educational reforms have not been successful in upgrading school districts in the United States (Bell, 1993; Reilly, 1993). These reform efforts—for example, raised student achievement standards, more rigorous teaching requirements, a more constructive professional teaching environment, and changes in the structure and operation of individual schools—have not satisfied the changing needs of today's society. Educational leaders must find effective ways to reform the quality of public education if the gap between what society desires and what education delivers is to be reduced.
    One way to improve quality in schools is to approach reform efforts using Forrester's (1968) computer-based system dynamics point of view. Lunenberg and Ornstein (1991) reaffirmed that "one of the more useful concepts in understanding organizations is the idea that an organization is a system" (p. 17). Darling-Hammond (1993), Kirst (1993), and others believe that problems in educational reform are to be found in an excessive focus on specific areas within education, and an insufficient focus on the entire system, as illustrated in Darling-Hammond's "learning community" concept (1993, p. 760). A closer look
  • 13. at schools as systems promises to help educational leaders better understand the problematic environment they face.
    Other than an attempt 20 years ago to simulate student performance in the classroom (Roberts, 1974), means to study schools as systems, for the most part, have been unavailable to the practitioner. With the confluence of the affordable, powerful desktop computer; system dynamics modeling languages; and the graphical user interface comes the potential to create such a tool: a simulation model of schools to be used by educational leaders. An initial modeling of one of the most critical subsystems within schools—the interaction between student and teacher—is a logical place to begin the simulation modeling process due to the large body of research-based findings describing this area (e.g., Wittrock's Handbook of Research on Teaching, 1986).
    Using simulation to better understand the behavior of schools, educational leaders can develop and implement more effective policies. This saves "real" schools from yet another failed reform, while providing a cyberspace for the developing, testing, practicing and implementing of educational policies.
    Problem Statement
    One relevant component necessary for creating a simulation of public education is the relationship between teacher behavior and student achievement. This study simulates the interaction of a number of research-based conclusions about teacher behaviors that, one way or another, affect student achievement. While there have been numerous studies that identify isolated teacher behaviors in context-specific classrooms, there has been little or no effort to integrate these findings into a generalizable simulation model. Only one study, Roberts' (1974) simulation exercise, was found to have attempted this kind of simulation.
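The system dynamics method referenced above represents a system as stocks (accumulations) changed over time by flows. The following is a minimal, hypothetical sketch of that idea—not the dissertation's actual ithink! model, and with invented parameter values—in which student achievement is treated as a single stock whose inflow depends on a teacher-effectiveness input:

```python
# Minimal stock-and-flow sketch in the spirit of system dynamics.
# All parameter values are hypothetical, for illustration only.

def simulate(weeks=36, dt=1.0, teacher_effectiveness=0.6):
    """Euler-integrate a single 'student achievement' stock.

    achievement: the stock, on an assumed 0-100 scale
    learning:    inflow, driven by teacher effectiveness and the gap
                 between current achievement and the maximum
    forgetting:  outflow, a small constant fraction of the stock
    """
    achievement = 20.0            # assumed initial stock value
    history = [achievement]
    for _ in range(weeks):
        learning = teacher_effectiveness * (100.0 - achievement) * 0.1
        forgetting = 0.02 * achievement
        achievement += (learning - forgetting) * dt   # net flow * dt
        history.append(achievement)
    return history

trajectory = simulate()
print(f"achievement after one school year: {trajectory[-1]:.1f}")
```

With these assumed rates the stock rises toward an equilibrium where the learning inflow balances the forgetting outflow; changing `teacher_effectiveness` shifts both the equilibrium and the speed of approach, which is the kind of behavior-over-time output a stock-and-flow tool like ithink! graphs for the user.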
  • 14. The nonlinear complexity of outcomes found in human interactions, sometimes counterintuitive in nature, can make it difficult for teachers to understand and choose appropriate behavior(s) for a given classroom setting (Senge, 1990a). There needs to be a generalizable model, using a system dynamics approach, which illustrates how research-based conclusions regarding teacher behaviors affect student achievement for a particular population.
    Purpose of the Study
    The purpose of this particular study was to create a system dynamics simulation of research-based conclusions of teacher behaviors which affect student achievement.
    Research Question
    Can the knowledge synthesis findings of teacher behaviors that affect student achievement be usefully modeled in a system dynamics computer simulation?
    Review of the Literature
    Currently, there is not a tool to help educational leaders integrate knowledge about the systemic areas needed for reform. The following review of the literature provides some rationale for such a tool by: identifying the need due to counterintuitive behaviors and complexity in public education; recognizing cognition in, and promises of, simulation as the tool; defining a teacher/student
  • 15. simulation component based on an existing model of direct instruction; identifying research-based conclusions in a knowledge synthesis matrix format; and identifying a simulation software package which can be used to create the simulation exercise.
    Rhodes (1994) describes some uses of technology which: support the organizational interactions that align and connect isolated actions of individuals and work groups as they attempt to separately fulfill the common aims of the organization; and ensure that experiential 'data' turns into institutionalized knowledge as the organization 'learns'. (p. 9)
    Prigogine and Stengers (1984) report that "we are trained to think in terms of linear causality, but we need new 'tools of thought': one of the greatest benefits of models is precisely to help us discover these tools and learn how to use them" (p. 203).
    Schools are complex organizations. Senge (1990a) points out that counterintuitive behaviors appear in these kinds of organizations. The variables are confusing. For example, Waddington's (1976) housing project case hypothesized that by simply building housing projects, slums would be cleared out. However, these projects then attract larger numbers of people into the area, and these people, if unemployed, remain poor and the housing projects become overcrowded, thereby creating more problems than before. An awareness of such counterintuitive behavior is a prerequisite if educational leaders are to develop and implement successful reform efforts.
    Counterintuitive Behaviors
  • 16. Four major counterintuitive behaviors of systems that cognitively challenge educational leaders have been identified by Gonzalez and Davidsen (1993). These behaviors are: a) origin of dynamic behavior, b) lag times, c) feedback, and d) nonlinearities.
    Origin of Dynamic Behavior
    Dynamic behavior is the interaction between systemic variables and the rate of resources moving into and out of those variables. The origin of dynamic behavior itself is confusing to humans in that the accurate identification of all relevant variables and rates can be difficult.
    Lag Time
    In systems, the element of time can create counterintuitive behaviors through lags between systemic variables and rates of moving resources. When management is unaware of lag time and ignores or postpones corrective actions until they are too late, oscillatory behaviors can appear in system outputs. Short term gains can actually create long term losses.
    Feedback
    The identification of feedback usage can illustrate how actions can reinforce or counteract each other (Senge, 1990a). This provides educational leaders with the ability to discover interrelationships rather than linear cause-effect chains. This discovery provides the opportunity to focus on recurring patterns.
    Nonlinearities
  • 17. Schools operate in response to the effects of complex interrelationships rather than linear cause and effect chains (Senge, 1990a). Sometimes these interrelationships operate at a point where small changes in the system result in a "snowballing" effect that seems out of proportion to the cause (Briggs, 1992). A better grasp of the nonproportional interrelationships in systems is needed for decision-making and policy analysis.
    Complexity in Schools
    In focusing on the nonlinear system aspects of schools, one must incorporate concepts from the emerging science of complexity, or the study of "the myriad possible ways that the components of a system can interact" (Waldrop, 1992, p. 86). Complexity is a developing discipline that accounts for the newly discovered relationship between order and chaos. Wheatley (1992) describes how these two traditionally opposing forces are linked together.
    Those two forces [chaos and order] are now understood as mirror images, one containing the other, a continual process where a system can leap into chaos and unpredictability, yet within that state be held within parameters that are well-ordered and predictable. (p. 11)
    Schools are complex environments that evolve over time. Educational leaders generally measure and even forecast social and political climates before implementing reform efforts, but they must also understand how unexpected changes in the school district can create the need to adapt policies and procedures if success is to be achieved. O'Toole (1993) argues that the goal of policy is simplification. "Before an executive can usefully simplify, though, she must fully understand the complexities involved" (p. 5). Wheatley (1992) states that "when we give up myopic attention to details and stand far enough away to
  • 18. observe the movement of the total system, we develop a new appreciation for what is required to manage a complex system" (p. 110).
    Using the science of complexity as a theoretical foundation, educational leaders can observe and better understand the interrelationships found in school districts, and therefore, increase their competence in implementing successful reform measures. Simulation is the tool for representing reality in social settings and it has been studied since the 1960s (Forrester, 1968; Gonzalez & Davidsen, 1993; Hentschke, 1975; Richardson, 1991; Richardson & Pugh, 1981; Roberts, 1974; Senge, 1990b; Waldrop, 1992). Richardson (1991) describes how "the use of digital simulation to trace through time the behavior of a dynamic system makes it easy to incorporate nonlinearities" (p. 155). Only recently has the technology caught up with the theory. Simulation now combines both the power of systems theory and the emerging science of complexity to be used as a management tool by the practitioner in the field in the form of a computer model.
    Cognition in Simulation
    Complex simulations create an opportunity for the participant to master both experiential and reflective cognition (Norman, 1993).
    They [simulations] support both experiential and reflective processes: experiential because one can simply sit back and experience the sights, sounds, and motion; reflective because simulators make possible experimentation with and study of actions that would be too expensive in real life. (p. 205)
    For over 20 years the cognitive processes of subjects have been studied in laboratory settings, where they interact with complex systems in such a way that "the comprehensive work . . . has provided a rich theory of human cognition and
  • 19. decision-making in . . . problem situation[s], including social systems" (Gonzalez & Davidsen, 1993, p. 7).
    In experiential cognition, the participant's skills are developed and refined to the point of automatic reflexive action during simulation sessions (Norman, 1993; Schön, 1983). The participant's reasoning and decision-making skills are developed and refined to the point where reflective thought is automatic both before and after simulation sessions.
    One important element to be considered when participants experience simulation sessions is Erikson's (1963) notion of play, or the attempt to synchronize the bodily and social processes with the self.
    When man plays he must intermingle with things and people in a similarly uninvolved and light fashion. He must do something which he has chosen to do without being compelled by urgent interests or impelled by strong passion; he must feel entertained and free of any fear or hope of serious consequences. He is on vacation from social and economic reality—or, as is most commonly emphasized: he does not work. (p. 212)
    Experiential cognition is reinforced when participants lose themselves playing with simulations—similar to the child absorbed while playing an arcade-style video game. The results from these simulation sessions are a heightened experience, or flow (Csikszentmihalyi, 1990).
    Playing simulations also provides a collaborative environment for educational leaders—a sharing of new ideas with colleagues who also play the simulation. The players create a standardized vocabulary of school district simulation terms. Hass and Parkay's (1993) findings regarding simulation sessions of the M-1 tank indicate that during simulated stressful conditions, groups of tank simulators were as useful for teaching teaming skills as they were for teaching the mechanics of tank operation. Simulations can help to build teams as well as teach appropriate skills to individuals.
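The lag-time effect described in the review above—corrective actions based on a stale reading of the system state can overshoot the goal—can be made concrete in a few lines of code. The following toy sketch uses invented numbers (target, gain, and lag are illustrative assumptions, not values from the study) to compare a correction based on the current state with one based on a delayed observation:

```python
# Toy illustration of the lag-time effect: a correction computed from
# a DELAYED observation keeps pushing after the target is reached,
# so the output overshoots. All numeric values are hypothetical.

from collections import deque

def run(lag_steps, steps=60, target=50.0, gain=0.3):
    """Move a stock toward `target`; each correction is computed from
    the stock's value as observed `lag_steps` steps ago."""
    stock = 0.0
    # deque acts as the delay line holding stale readings
    readings = deque([stock] * lag_steps, maxlen=lag_steps or None)
    history = []
    for _ in range(steps):
        observed = readings[0] if lag_steps else stock  # possibly stale
        stock += gain * (target - observed)             # corrective flow
        if lag_steps:
            readings.append(stock)                      # record for later
        history.append(stock)
    return history

no_lag = run(lag_steps=0)   # approaches the target smoothly from below
lagged = run(lag_steps=4)   # overshoots past the target before settling
print(max(no_lag), max(lagged))
```

With no lag the stock climbs monotonically toward the target and never exceeds it; with a four-step observation lag the same control rule drives the stock past the target, the overshoot pattern behind the "short term gains can actually create long term losses" point made earlier.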
  • 20. Promises of Simulation in Educational Reform
    Computer simulation promises a number of outcomes for educational reform. People who are not scientists will be able to create models of various policy options, without having to know all the details of how that model actually works (Waldrop, 1992). Such models would be management flight simulators for policy, and would allow educational leaders to practice crash-landing school reform measures without taking 250 million people along for the ride. The models wouldn't even have to be too complicated, so long as they gave people a realistic feel for the way situations developed and for how the most important variables interact.
    The technology, rationale, and expertise required for creating school simulations are all readily available. It is now up to educational leaders to come forth and demand system simulations as viable data-based decision-making tools for the coming millennium. Any other choice may lead to less than desirable outcomes.
    Teacher/Student Simulation Component
    In searching for a beginning to school simulations, a logical starting point is the interaction between student and teacher. The focus of this study is the delivery of instruction from the teacher to the student as identified in relevant teacher behaviors which significantly affect student achievement. Twenty years ago Roberts (1974) made an attempt to describe such an integration of research findings and personal experiences into a teacher-behavior/student-performance simulation. The Roberts study occurred when computer simulation was an
  • 21. unwieldy, expensive, and complex venture better left to computer programming experts. The outputs from the simulation were simple two-dimensional graphs showing directional movement in trend lines after variables were manipulated. One drawback of this type of simulation is that educators had to depend on programming experts to generate outputs through data entry. Educators could not input data themselves, nor could they manipulate variables "on the fly". Educators need to control data themselves to develop a working knowledge of the nonlinear outcomes of public education (Nelson, 1993). With the newer, more powerful, easier-to-use technologies available today comes the potential for everyone to develop and "play" simulations by themselves and develop greater understanding of the complexities in schools (Waldrop, 1992).
    A Model of Direct Instruction
    In searching for an existing educational model of teacher/student interaction upon which the simulation would be based, Corno and Snow (1986) reported that a large body of research findings fully supports an agreed-upon description of direct instruction, or "a form of . . . teaching that promotes on-task behavior, and through it, increased academic achievement" (p. 622). Using direct instruction as a model for this simulation exercise helped to insure that a sufficient number of valid research-based findings would be available for construction and validation purposes.
    For this study, a conceptual framework for describing the interaction during direct instruction between teacher and student was found in Hunter's (1967) Instructional Theory Into Practice (ITIP) model of effective teaching. The ITIP model identified "cause-effect relationships among three categories of decisions the teacher makes: a) content decisions, b) learner behavior decisions,
  • 22. and c) teaching behavior decisions" (Bartz & Miller, 1991, p. 18). This model was chosen due to the repetition of its essential elements as reported in numerous studies dating back to World War II (Gagné, 1970; Good & Grouws, 1979; Hunter & Russell, 1981; War Manpower Commission, 1945). Table 1 is an outline of the main elements found in the ITIP model for direct instruction.
    TABLE 1
    THE INSTRUCTIONAL THEORY INTO PRACTICE (ITIP) DIRECT INSTRUCTION MODEL
    1. Provide an Anticipatory Set
    2. State the Objectives and Purpose
    3. Provide Input in the Form of New Material
    4. Model Appropriate Examples
    5. Check for Student Understanding
    6. Provide Guided Practice with Direct Teacher Supervision
    7. Provide Independent Practice
    Source: Adapted from Improved Instruction. Hunter (1967).
    In the ITIP model, as in other teaching models, there are certain elements that need to be addressed when determining which direct instruction strategies are deemed effective. Those elements include student socioeconomic status (SES)/ability/affect, grade level, content area, and teacher intentions/objectives (Brophy & Good, 1986). How the interactions between these elements react with selected teaching strategies in an ITIP model for direct instruction is one problematic area that needs to be addressed by educators before they "tinker" with a classroom situation. Occasionally these interactions have been found
  • 23. more significant than main effects in classroom settings (Brophy & Evertson, 1976; Solomon & Kendall, 1979). These conclusions suggest "qualitatively different treatment for different groups of students. Certain interaction effects appear repeatedly and constitute well-established findings" (Brophy & Good, 1986, p. 365). A simulation of these interactions is one way for educators to develop better strategies for direct instruction to increase student achievement.
    The literature firmly supported direct instruction as a model for teacher behaviors as they affect student achievement. The next step was to define a set of research-based conclusions regarding teacher behaviors that affect student achievement. It was recognized that there were potentially scores of available variables which may have influenced the setting of this model, so this study relied upon gathered research-based conclusions reported in a knowledge synthesis matrix from Brophy and Good (1986) as outlined below.
    Knowledge Synthesis Matrix
    The knowledge synthesis matrix of researchers' findings (Table 2) is evidence of identified conclusions regarding the relationship between teacher behavior and student achievement. Knowledge synthesis has been identified to include four general purposes:
    1. to increase the knowledge base by identifying new insights, needs, and research agendas that are related to specific topics;
    2. to improve access to evidence in a given area by distilling and reducing large amounts of information efficiently and effectively;
    3. to help readers make informed decisions or choices by increasing their understanding of the synthesis topic; and
4. to provide a comprehensive, well-organized content base to facilitate interpretation activities such as the development of textbooks, training tools, guidelines, information digests, oral presentations, and videotapes (Smith & Klein, 1991, p. 238).

A logical addition to general purpose number four might include another entry for interpretation activities in the form of simulations.

Within the knowledge synthesis, some of the research-based conclusions were context-specific, whereby certain teacher behaviors enhanced student achievement at one grade level but hindered student achievement at another. Therefore, such context-specific findings were included in the model at the appropriate level and magnitude.

Criteria for Selection

To insure that findings reported in the knowledge synthesis matrix were of sufficient significance, a summary and integration of consistent and replicable findings as identified by Brophy and Good (1986) was selected for the knowledge synthesis variables, using an adaptation of Smith and Klein's (1991) inclusion criteria for acceptance. These variables have been "qualified by reference to grade level, student characteristics, or teaching objective" (Brophy & Good, 1986, p. 360). The following is a brief description of the research-based conclusions integrated and summarized in Brophy and Good's (1986) knowledge synthesis used as initial data for this study.

TABLE 2
KNOWLEDGE SYNTHESIS MATRIX OF TEACHER BEHAVIORS THAT AFFECT STUDENT ACHIEVEMENT
________________________________________________________________________
Student: Socio-econ. Status; Grade Level
Teacher: Management Skills & Org.; Experience; Expectation
Giving Information: Vagueness in Terminology; Degree of Redundancy; Question/Interactions Rate
Questioning the Students: Student Call-outs; Higher Level Questioning; Post-Ques. Wait Time; Response Rate
Reacting to Student Responses: Teacher Praise; Negative Feedback; Positive Feedback
Handling Seatwork & Homework: Independent Success Rate; Available Help
Teacher Burnout¹: Hours Worked per Day; Enthusiasm
________________________________________________________________________
Source: Adapted from Teacher behavior and student achievement. Brophy & Good (1986).
¹ Optional set of variables for experimentation use only.

Relationship of Knowledge Synthesis to Simulation Variables

The knowledge synthesis matrix described four divisions of variables of lesson form and quality as well as two areas of context-specific findings, describing both teacher and student (Brophy & Good, 1986). Seventeen variables were represented throughout these six areas. An experimental division, teacher burnout, was added as an optional seventh component, along with two more variables, to introduce a chaotic function to the simulation model for discussion and experimentation after the validation was completed.

Category 1—student. There were two context-specific variables in the student arena which affected achievement: grade level and socioeconomic status (SES). Grade level was described as early grades (1-6) and late grades (7-12). Regarding SES, Brophy and Good (1986) stated:
    SES is a proxy for a complex of correlated cognitive and affective differences between sub-groups of students. The cognitive differences involve IQ, ability, or achievement levels. Interactions between process-product findings and student SES or achievement level indicate that low-SES-achieving students need more control and structure from their teachers: more active instruction and feedback, more redundancy, and smaller steps with higher success rates. This will mean more review, drill, and practice, and thus more lower-level questions. Across the year it will mean exposure to less material, but with emphasis on mastery of the material that is taught and on moving students through the curriculum as briskly as they are able to progress. (p. 365)

Students have differing needs in upper and lower grade levels as well as in higher and lower socioeconomic levels. For example, regarding socioeconomic status, Solomon and Kendall (1979) found that 4th grade students from a high socioeconomic status (SES) need demanding and impersonal instructional delivery, while students from a low SES need more warmth and encouragement in delivery strategies. Brophy and Evertson (1976) found that low SES students needed to answer 80% of questions correctly before moving on to newer content, but high SES students could accept a 70% rate of success before moving on to newer content. SES is a variable which affects the way students achieve in school.

To illustrate differences in student achievement at upper and lower grade levels, Brophy and Good report on the impact of praise at differing levels. "Praise and symbolic rewards that are common in the early grades give way to the more impersonal and academically centered instruction common in the later grades" (1986, p. 365). This example describes how teachers need to be aware of the use of praise when trying to increase student achievement.
Praise has been shown to elicit a more positive impact on younger children than on their older counterparts.
Grade level and socioeconomic status are variables that affect student achievement at differing rates and levels. Teachers need to be aware of these differences to better meet the individual needs of their students.

Category 2—teacher. The teacher category had three types of variables identified in Brophy and Good's (1986) knowledge synthesis report. Those were management skills and organization, experience, and expectation.

Management skills and organization played a large part in affecting student achievement across the grade levels. "Students learn more in classrooms where teachers establish structures that limit pupil freedom of choice, physical movement, and disruption, and where there is relatively more teacher talk and teacher control of pupils' task behavior" (Brophy & Good, 1986, p. 337). Teachers who are more organized have a more positive effect on student achievement than those who are not.

Another conclusion showed that teachers need time—experience—to develop their expertise and increase their effectiveness, because ". . . the majority of [first year] teachers solved only five of the original 18 teaching problems of first-year teachers in fewer than three years. Several years may be required for teachers to solve problems such as classroom management and organization" (Alkin, Linden, Noel, & Ray, 1992, p. 1382). Teachers who have more teaching experience have been shown to have a more positive effect on student achievement than those who do not.

A number of findings were integrated under the expectation variable in Brophy and Good's knowledge synthesis report. "Achievement is maximized when teachers . . . expect their students to master the curriculum" (Brophy & Good, 1986, p. 360). "Early in the year teachers form expectations about each student's academic potential and personality. . . If the expectations are low . . . the student's achievement and class participation suffers" (Dunkin, 1987, p. 25).
Expectation was also found to be context specific in increasing student achievement, showing that ". . . in the later grades . . . it becomes especially important to be clear about expectations . . ." (Brophy & Good, 1986, p. 365). Expectation is a variable which teachers must address when thinking about increasing student achievement, generally speaking as well as in context.

The initial two categories were based upon the context-specific variables of both student and teacher. The remaining four categories of variables were based upon lesson design and quality of instruction.

Category 3—giving information. One of the variables found in this group was identified as vagueness in terminology. "Smith and Land (1981) report that adding vagueness terms to otherwise identical presentations reduced student achievement in all of 10 studies in which vagueness was manipulated" (Brophy & Good, 1986, p. 355). Teachers need to be clear and precise in their delivery methods if student achievement is to be maximized. If a teacher uses vague terminology in his/her delivery, student achievement is not maximized.

A second variable in the giving information group was the degree of redundancy in the delivery of instruction. "Achievement is higher when information is presented with a degree of redundancy, particularly in the form of repeating and reviewing general rules and key concepts" (Brophy & Good, 1986, p. 362). When teachers repeat new concepts, student achievement was shown to increase more than if new concepts were not repeated.

In the same group, a third variable was identified as the number of questions/interactions per classroom period. "About 24 questions were asked per 50 minute period in the high gain classes. . . . In contrast, only about 8.5 questions were asked per period in the low-gain classes . . ." (Brophy & Good, 1986, p. 343). When teachers initiate a higher incidence of questions/interactions with
their students, achievement has been shown to increase more than when teachers elicit a lower incidence of questions/interactions.

Category 4—questioning the students. A fourth category of variables, questioning the students, contained four research-based conclusions which affect student achievement: student call-outs (i.e., speaking without raising their hands), higher-level questioning, post-question wait time, and response rate.

One variable, student call-outs, was shown to make a difference in achievement because "student call-outs usually correlate positively with achievement in low-SES classes but negatively in high-SES classes" (Brophy & Good, 1986, p. 363). Student call-outs are defined as incidents where students "call out" answers to questions without raising their hands or observing other forms of classroom management for responses. In low-SES situations, student call-outs correlate with higher achievement, as when a student who is frequently shy or withdrawn takes a risk and calls out an answer. This situation often helps a student's self-esteem if the call-out is not prohibited or frowned upon by the teacher. In high-SES situations, student call-outs tend to distract the more secure student body and, in turn, lower student achievement levels due to an increase in off-task behaviors.

The higher-level questioning variable illustrated how, although ". . . the frequency of higher-level questions correlates positively with achievement, the absolute numbers on which these correlations are based typically show that only about 25% of the questions were classified as higher level" (Brophy & Good, 1986, p. 363). Teachers who use higher-level questions about one-fourth of the time had better results in increasing student achievement than any other amount of higher-level questioning of the students.

The post-question wait time variable described how teachers gave students a few seconds to think about a question before soliciting answers to the question.
"Studies . . . have shown higher achievement when teachers pause for about three seconds (rather than one second or less) after a question, to give the students time to think before calling on one of them" (Brophy & Good, 1986, p. 363). For example, Hiller, Fisher, and Kaess (1969); Tobin (1980); and Tobin and Capie (1982) all report that three seconds seems to be a desirable amount of wait time for teachers to observe before calling on students for responses to questions. Wait time gives slower students an extra moment or two to further process the question before they attempt to reply with the answer. This quantitative wait time data was used as a desirable benchmark in a "wait time" variable requirement, with less time creating less than desirable output from the model.

The fourth variable in this category was identified as response rate from the student to teacher input. "Optimal learning occurs when students move at a brisk pace but in small steps, so that they experience continuous progress and high success rates (averaging perhaps 75% during lessons when a teacher is present, and 90-100% when the students must work independently)" (Brophy & Good, 1986, p. 341). Teachers who monitor student success and move on to new material or independent work after seeing about a 75% success rate from the students elicited higher gains in student achievement than any other response rate.

Category 5—reacting to student responses. The fifth category, as identified by Brophy and Good (1986), describes general lesson form and quality and was named reacting to student responses. This category integrates three research-based conclusions: teacher praise, negative feedback, and positive feedback.

Teacher praise was shown to affect student achievement in a socioeconomic status context. "High-SES students . . . do not require a great deal of . . . praise. Low-SES students . . . need more . . . praise for their work" (Brophy & Good, 1986, p. 365).
Praise was also found to affect students differently at
differing grade levels. "Praise and symbolic rewards that are common in the early grades give way to the more impersonal and academically centered instruction common in the later grades" (p. 365).

The type of negative feedback given to students was found to affect their achievement. "Following incorrect answers, teachers should begin by indicating that the response is not correct. Almost all (99%) of the time, this negative feedback should be simple negation rather than personal criticism, although criticism may be appropriate for students who have been persistently inattentive" (Brophy & Good, 1986, p. 364). Teachers who almost never personally criticized their students had higher achievement from those students than teachers who personally criticized their students.

Positive feedback was the third and final variable identified in the Brophy and Good (1986) knowledge synthesis category of reacting to student responses. "Correct responses should be acknowledged as such, because even if the respondent knows that the answer is correct, some of the onlookers may not. Ordinarily (perhaps 90% of the time) this acknowledgment should take the form of overt feedback" (p. 362). Teachers who acknowledge success in student responses elicit more student achievement than those who ignore successful responses from their students.

Category 6—handling seatwork. A sixth category of variables used to illustrate how teacher behaviors affect student achievement in the knowledge synthesis was handling seatwork. Within this category were two conclusions defined in the research: independent success rate and available help.

Interrelated in the final research-based conclusion in this category, independent success rate and available help were shown to affect achievement. "For assignments on which students are expected to work on their own, success rates will have to be very high—near 100%. Lower (although still generally high)
success rates can be tolerated when students who need help get it quickly" (Brophy & Good, 1986, p. 364). Teachers who successfully taught the lesson and, therefore, had students who could work independently with high success rates elicited higher achievement gains than those teachers who needed to help students who did not "get" the lesson in the first place.

Category 7—teacher burnout (experimental in nature). A seventh and optional category was added to the model. This category has been titled teacher burnout and was included to add a chaotic function for experimental purposes. This part of the simulation model was "turned off" during validation studies to insure that only the findings described in the knowledge synthesis were simulated during validation.

The assumption used in this category was adapted from a simulation of employee burnout by High Performance Systems, Inc. (1994). In this simulation it was assumed that "a certain amount of burnout accrues from each hour of overtime that's worked" (p. 166). This adaptation of a burnout variable was added to an enthusiasm variable as defined by Brophy and Good (1986). They found that "enthusiasm . . . often correlates with achievement" (p. 362).

In the High Performance Systems burnout simulation, burnout impacted the work yield of the employee. In this study, the adaptation attempted to illustrate how the burnout of the teacher impacted the enthusiasm of the teacher, thereby affecting the achievement of the student.

Brophy and Good's (1986) summary report findings can be related to Smith and Klein's (1991) criteria for knowledge synthesis as outlined in Table 3. In this table, evidence is presented to support using the Brophy and Good report as a knowledge synthesis upon which the system dynamics model may be based.

TABLE 3
INCLUSION CRITERIA BY SMITH AND KLEIN (1991) AS APPLIED TO KNOWLEDGE SYNTHESIS REPORT BY BROPHY AND GOOD (1986)

1. Focus on normal school settings with normal populations. Exclude studies conducted in laboratories, industry, the armed forces, or special facilities for special populations.

2. Focus on the teacher as the vehicle of instruction. Exclude studies of programmed instruction, media, text construction, and so on.

3. Focus on process-product relationships between teacher behavior and student achievement. Discuss presage² and context variables that qualify or interact with process-product linkages, but exclude extended discussion of presage-process or context-process research.

4. Focus on measured achievement gain, controlled for entry level. Discuss affective or other outcomes measured in addition to achievement gain, but exclude studies that did not measure achievement gain or that failed to control or adjust for students' entering ability or achievement levels.

5. Focus on measurement of teacher behavior by trained observers, preferably using low-inference coding systems. Exclude studies restricted to teacher self-report or global ratings by students, principals and so forth, and experiments that did not monitor treatment implementation.

6. Focus on studies that sampled from well-described, reasonably coherent populations. Exclude case studies of single classrooms and studies with little control over or description of grade level, subject matter, student population, and so on.

7. Focus on results reported (separately) for specific teacher behaviors or clearly interpretable factor scores. Exclude data reported only in terms of typologies or unwieldy factors or clusters that combine disparate elements to mask specific process-outcome relationships, or only in terms of general systems of teacher behavior (open vs. traditional education, mastery learning, etc.).
After the knowledge synthesis was identified and described, a technology-based tool was needed to create the simulation which would assist educators in developing policy and in problem-solving exercises. The next step was to determine what software was to be used to create such a tool.

² Presage variables include "teacher characteristics, experiences, training, and other properties that influence teaching behavior" (Shulman, 1986, p. 6).
Simulation Software

To create the simulation, the purchase of a software package was required. Computer simulations have been in use by the military since World War II. One of the first computer simulations combining feedback theory with decision-making was developed in the early 1960s using a computer programming language entitled DYNAMO (Forrester, 1961). These simulations were considered complex due to "their numerous nonlinearities, capable of endogenously shifting active structure as conditions change, [which] give these models a decidedly nonmechanical, lifelike character" (Richardson, 1991, p. 160).

System Dynamics

Out of the simulation sessions by Forrester and others grew a field of study entitled system dynamics, which has "academic and applied practitioners worldwide, degree-granting programs at a few major universities, newsletters and journals, an international society, and a large and growing body of literature" (Richardson, 1991, p. 296).

    System dynamics is a field of study which includes a methodology for constructing computer simulation models to achieve better understanding and control of social and corporate systems. It draws on organizational studies, behavioral decision theory, and engineering to provide a theoretical and empirical base for structuring the relationships in complex systems. (Kim, 1995, p. 51)

Within the system dynamics paradigm is a structural representation of components within the system in question. There are four major constructs found in every system dynamics simulation that represent components in
systems: Forrester's (1968) system dynamics notation of stocks, flows, converters, and connectors.

In Figure 1, a simulation sector entitled Student is defined using these four constructs and is included here as an example to help describe how the modeling process uses system dynamics notation.

FIGURE 1. FOUR BASIC CONSTRUCTS OF SYSTEM DYNAMICS NOTATION

Stocks. Tyo (1995) describes the first of these four constructs as "stocks, which are comparable to levels" (p. 64). Lannon-Kim (1992) defines stocks as accumulators, or "a structural term for anything that accumulates, e.g. water in a bathtub, savings in a bank account, current inventory. In system dynamics notation, a stock is used as a generic symbol for anything that accumulates" (p. 3). For example, in Figure 1 the identified stock represents the current level of
student achievement, reported as an accumulation of a percentage score ranging from 0 to 100, with 100 being the maximum percentage of achievement possible over a given amount of time—in this case 180 school days. This level of student achievement can fluctuate depending on certain teaching behaviors attributed to changes in student achievement as modeled in the simulation.

Flows. The second construct in system dynamics notation is called a flow, or rate. This part of the model determines the "amount of change something undergoes during a particular unit of time, such as the amount of water that flows out of a tub each minute, or the amount of interest earned in a savings account" (Lannon-Kim, 1992, p. 3). In Figure 1, the indicated flow determines the amount and direction of change in achievement as reported in a percentage score in the adjacent stock—either higher, lower, or equal—and is updated "daily" during the simulation run (i.e., 180 times each session). This flow is represented mathematically in system dynamics notation with the following equation:

    Achievement Change = Behavior Impact - Current Student Achievement

This equation shows how the impact of teaching behaviors directly affects outcomes in student achievement.

The relationship in Figure 1 between the stock Current Student Achievement and the flow Achievement Change is represented in system dynamics notation in the following equation:

    Current Student Achievement(t) = Current Student Achievement(t - Dt) + (Achievement Change) * Dt

This equation shows how student achievement changes over time depending on the rate of change in achievement and the current level of achievement. To
clarify, suppose we said that Achievement Change = A, Behavior Impact = B, Current Student Achievement = C, Time = t, and Change in Time = Dt. Then the formulae would look like the following:

    A = B - C, and
    C = C(t - Dt) + A * Dt

Converters. The third construct, converters—sometimes called auxiliaries and/or constants—are similar to formula cells in a spreadsheet. These values, whether automatically and/or manually entered, are used to modify flows in the model. In Figure 1, the selected converter represents the identified behaviors which impact the flow involving changes in student achievement. The following equation illustrates the mathematical representation of the interrelationships between all of the knowledge synthesis research findings in all of the subsystems within the model, as compiled in the converter Behavior Impact:

    Behavior Impact = MEAN (Level of Independence, Management of Response Opportunities, Quality of Structuring, Quality of Teacher Reactions, (Teacher Expectation * GRADE & EXPECTATIONS))

This equation states that the outputs from all of the model sectors are averaged together to be used as a modifier in the flow Achievement Change³. Again, to clarify, suppose we said that Behavior Impact = B, Level of Independence = L,

³ Teacher Expectation was first determined by the relationship between grade level and previous grade point average as described in the knowledge synthesis matrix findings by Brophy and Good (1986) and then averaged with the rest of the sector outcomes.
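The stock-and-flow arithmetic just described can be sketched in ordinary code. The following Python fragment is a hypothetical illustration, not part of the original ithink! model: it steps the Current Student Achievement stock through 180 daily updates, and it assumes a constant Behavior Impact value, whereas in the actual model that converter is recomputed each day from the sector outputs.

```python
# Hedged sketch of the flow/stock equations:
#   Achievement Change = Behavior Impact - Current Student Achievement
#   C(t) = C(t - Dt) + A * Dt
# Assumption: behavior_impact is held constant (the real model varies it daily).

def simulate_achievement(initial_achievement, behavior_impact, days=180, dt=1.0):
    """Euler integration of the Current Student Achievement stock (0-100 scale)."""
    achievement = initial_achievement
    for _ in range(int(days / dt)):
        # Flow: the gap between behavior impact and the current stock level
        achievement_change = behavior_impact - achievement
        # Stock update: accumulate the flow over one time step
        achievement = achievement + achievement_change * dt
    return achievement

print(simulate_achievement(50.0, 85.0))
```

Note that with dt = 1 the stock closes the entire gap in a single step; a smaller dt would show the more gradual, exponential adjustment toward the Behavior Impact level that system dynamics models typically exhibit.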
Management of Response Opportunities = M, Quality of Structuring = S, Quality of Teacher Reactions = T, Teacher Expectation = E, and GRADE & EXPECTATIONS = G. Then the formula would look like the following:

    B = (L + M + S + T + (E * G)) / 5

Connectors. The connector, or link, represents the fourth construct in system dynamics and is a connection between converters, flows, and stocks. In Figure 1, the behavior impact converter and the achievement change flow are tied together with the indicated connector. This link ties the two parts together in a direct relationship, where behavior impacts directly on achievement.

A system dynamics approach, based on the four major constructs of stocks, flows, converters, and connectors, allows computer models to "extract the underlying structure from the noise of everyday life" (Kim, 1995, p. 24). This "noise" has been described as chaos, or "the science of seeing order and pattern where formerly only the random . . . had been observed" (p. 24). The next step in creating a system dynamics-based tool for educational reform was to determine which computer application would be used to create the simulation exercise. The following section describes requirements for the simulation and a brief review of various computer programs and their functions.

Computer Applications of System Dynamics

The first system dynamics computer simulations were written mainly in the DYNAMO computer language on mainframe computers in the late fifties and early sixties. These simulations were programmed by computer experts, and researchers either had to learn complex programming languages or wait until the computer programmers completed the task and reported the results. Today, this
type of simulation programming has its drawbacks due to the lack of portability and the extremely long learning curve associated with mainframe programming languages. A more ideal simulation software package would be developed and run on laptop computers in a more user-friendly environment (e.g., operating systems using graphical user interfaces such as Windows or Macintosh) by educators and researchers in the field, not on mainframes by computer programming experts.

In the search for a more ideal software package for the novice, a few options were uncovered. Tyo (1995) and Kreutzer (1994) reviewed system dynamics software packages, similar in design to DYNAMO, which are currently available for inexpensive personal computers. Based on these two distinct software reviews, two types of system dynamics software were considered for the purpose of creating the simulation: Powersim (ModellData AS., 1993) and ithink! (High Performance Systems, Inc., 1994). The following is a brief summary of those two reviews.

Kreutzer (1994) described the ithink! package:

    Because of its powerful features and ease of use, ithink! is one of the most popular system dynamics modeling tools. It allows you to draw stock-and-flow diagrams on the computer screen, completely mapping the structure of the system before you enter equations. You can add more detail and then group elements into submodels, zooming in for more detail in complex models. The manual is such a good introduction to system dynamics that we recommend it even if you use another program. (p. 3)

Stating that "ithink! is one of the most powerful simulation packages reviewed," Tyo (1995, p. 64) goes on to describe how ithink! has "by far the best tutorials and documentation, as well as a large number of building blocks" (p. 64). Both reviewers agreed that ithink! provides good authoring support for novice modelers as well as good support for sensitivity analysis so that the
models can be tested and retested with varying inputs. Tyo (1995) gave ithink! a five-star rating (out of a possible five).

Powersim was given high ratings due to its built-in workgroup support. Tyo (1995) stated that "the multi-user game object lets several users run the model concurrently to cooperate or compete against one another. This is particularly useful for testing workgroups" (p. 64).

Both system dynamics packages, Powersim and ithink!, were purchased for the purpose of creating and testing the simulation model. The final simulation model was created using the ithink! system dynamics modeling language and was developed and simulated on an Apple Macintosh laptop computer platform, model Duo 230. The ithink! package was chosen over Powersim because there was a wealth of available documentation and support material included in the ithink! package and because of the ease with which programming was accomplished in the Macintosh computer operating system.

Simulation using ithink! software. To develop a simulation in the ithink! environment, a number of steps have to be followed to insure logical functioning of the model. Those steps include: defining a high-level description of the entire model; creating subsystems (i.e., sectors), which represent the separate parts of the model; placing each variable in its respective sector for accurate visual representation; and constructing connections (i.e., dependencies) between the variables in and across sectors.
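Before turning to the graphical modeling steps, the converter arithmetic described earlier can be checked in a few lines of ordinary code. This Python sketch is illustrative only (the parameter names are paraphrases of the model's sector outputs, and the sample values are assumptions): it mirrors the Behavior Impact equation, B = (L + M + S + T + (E * G)) / 5, averaging the five sector outputs after Teacher Expectation is scaled by the grade/expectations modifier.

```python
# Hedged sketch of the Behavior Impact converter.
# Assumption: sector outputs are percentage scores (0-100), and the
# grade/expectations modifier G is a dimensionless weight near 1.

def behavior_impact(level_of_independence,
                    management_of_response_opportunities,
                    quality_of_structuring,
                    quality_of_teacher_reactions,
                    teacher_expectation,
                    grade_expectations_modifier):
    """Average the five sector outputs, weighting expectation by grade context."""
    sector_outputs = [
        level_of_independence,
        management_of_response_opportunities,
        quality_of_structuring,
        quality_of_teacher_reactions,
        teacher_expectation * grade_expectations_modifier,  # E * G
    ]
    return sum(sector_outputs) / len(sector_outputs)

print(behavior_impact(80, 70, 90, 75, 85, 1.0))  # → 80.0
```

Writing the converter this way makes the design choice visible: every sector contributes equally to the daily impact on achievement, with grade-level context entering only through the expectation term.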
FIGURE 2. HIGH-LEVEL DESCRIPTION OF ENTIRE SIMULATION MODEL

The first step in the ithink! modeling process is to construct a high-level description of the entire model. Figure 2 shows an example of how the entire model can be defined in process frames, "each of which models one subsystem,
such as the rocket propellant in a space shuttle" (Tyo, 1995, p. 64). Process frames are constructed to represent elements found in a particular system, such as the elements of a teacher behavior/student achievement simulation as described in the knowledge synthesis of Brophy and Good (1986).

Each arrow in the high-level description can represent the connection of one or more research findings to the corresponding sectors. These arrows are drawn to illustrate dependencies among sectors such as those identified in the knowledge synthesis. According to Tyo (1995), "frames can be connected to one another to show dependencies between subsystems. For example, if a company incorrectly bills its customers, the number of calls to customer service will increase" (p. 66). If there is doubt about how the sectors are connected in the high-level description, there is a function in ithink! which allows connections to be made at the subsystem level, where the interdependencies are sometimes more clear to the user.

After the high-level frames are laid out, the model is further defined by identifying submodels within each pertinent area. Tyo (1995) describes how "the modeler steps down into each of the frames to add the necessary constructs for each submodel" (p. 66).

To construct a submodel, the modeler uses the four basic components in system dynamics notation previously mentioned—stocks, flows, converters, and connectors. A map of the submodel, using the four components, is drawn on the computer screen to visually represent the relationship(s) desired. In Figure 3, the following example is a submodel entitled Student Enrollment. The stock is entitled Current Student Enrollment. Two flows entering and exiting the stock are entitled New Students and Losing Students. Two converters with connectors to the flows are entitled Student Growth Fraction and Student Loss Fraction.
The stock returns information via feedback loops to the flows in the form of connectors.

FIGURE 3. EXAMPLE OF SUBMODEL

In this submodel a graphical representation was drawn to visually describe the relationships between new students entering the school as well as students leaving the school. The flow of students entering and leaving the school is illustrated by the direction of the arrows on the flows themselves.

For each stock and flow diagram in the simulation, the software also creates a set of generic equations (see Table 4). Tyo (1995) describes how the modeler

    moves into modeling mode to define the mathematical relationships among the stocks, flows and other constructs. . . . ithink! presents the modeler with valid variables to use in defining mathematical relationships. (p. 66)

Johnston and Richmond (1994) describe the set of equations in simplified terminology:
     What you have now, is what you had an instant ago (i.e., 1 dt [delta time] in the past) + whatever flowed in over the instant, - whatever flowed out over the instant. The software automatically assigns + and - signs based on the direction of the flow arrowheads in relation to the stock. (p. 9)

TABLE 4
EXAMPLE OF EQUATIONS GENERATED FROM A STOCK AND FLOW DIAGRAM
________________________________________________________________________
Current_Student_Enrollment(t) = Current_Student_Enrollment(t - dt) + (new_students - losing_students) * dt
INIT Current_Student_Enrollment = { Place initial value here… }
new_students = { Place right hand side of equation here… }
losing_students = { Place right hand side of equation here… }
student_growth_fraction = { Place right hand side of equation here… }
student_loss_fraction = { Place right hand side of equation here… }
________________________________________________________________________

The new student enrollment is dependent on the relationship between the flow of new students affected by the student growth fraction. The number of students leaving the school is dependent on the student loss fraction as it affects the flow called losing students. Johnston and Richmond (1994) describe the relationships further:

     In order to simulate, the software needs to know How much is in each accumulation at the outset of the simulation? It also needs to know What is the flow volume for each of the flows? The answer to the first question is a number. The answer to the second may be a number, or it could be an algebraic relationship. (p. 9)
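The stock equation in Table 4 is a simple Euler-integration update. As an illustrative sketch only (ithink! generates and evaluates these equations itself), the update rule can be written in Python, with names mirroring the diagram and the example fraction values used later in this chapter:

```python
def step(stock, growth_fraction, loss_fraction, dt):
    """One dt of the Table 4 equation:
    stock(t) = stock(t - dt) + (inflow - outflow) * dt"""
    new_students = stock * growth_fraction    # inflow
    losing_students = stock * loss_fraction   # outflow
    return stock + (new_students - losing_students) * dt

# With 100 students, growth fraction .2 and loss fraction .02,
# one monthly step gives:
print(step(100.0, 0.2, 0.02, dt=1.0))   # 118.0
```

The point of the sketch is only the sign convention: the inflow is added and the outflow subtracted, exactly as the software assigns + and - from the flow arrowheads.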
For example, let's assume that there are currently 100 students enrolled in the school. To enter this information into the submodel, the modeler simply "double-clicks" with the mouse on the Current Student Enrollment stock and a new "window" appears on the screen (see Figure 4). Within this window are numerous options to enter and change data requirements. The modeler types the number 100 in the box entitled "INITIAL(Current_Student_Enrollment) =".

FIGURE 4
EXAMPLE OF INITIAL STOCK DATA ENTRY

After closing this window the modeler can then enter the student growth fraction in the same manner: double-clicking on the converter icon, and typing in
the number desired. For this example it was assumed that for every 10 students currently enrolled in the school, two new students entered as well. Thus, the number ".2" was entered in the appropriate box within the window entitled "student_growth_fraction = ..." (see Figure 5).

FIGURE 5
FIRST EXAMPLE OF CONVERTER DATA ENTRY

As with the student growth fraction, the student loss fraction was entered in the same manner in the corresponding converter (see Figure 6). This time it was assumed that only a few students were leaving the school (2 for every 100), so the number ".02" was entered into the corresponding window entitled "student_loss_fraction = ...".

FIGURE 6
SECOND EXAMPLE OF CONVERTER DATA ENTRY
After the ratio of student population growth to student population decline was determined, the rate of each of the flows of incoming and outgoing students had to be determined. By double-clicking on the flows "new students" and "losing students", data was entered into corresponding new windows that appeared on the screen (see Figure 7).

FIGURE 7
FIRST EXAMPLE OF INITIAL FLOW DATA ENTRY

To set the rate of flow an equation had to be entered into the window entitled "new_students = ...". This equation, "Current_Student_Enrollment * student_growth_fraction", describes the relationship between the current number of students and the new incoming students. Every time the simulation model completes one time step (i.e., dt = delta time), the level of current student enrollment changes as well.
The flow of losing students was similar in data entry to the previous flow data entry. The only difference in the operation was that the equation had a student loss fraction as the other part of the equation, not a student growth fraction as previously described (see Figure 8).

FIGURE 8
SECOND EXAMPLE OF INITIAL FLOW DATA ENTRY

Once the initial data entry is completed the modeler can return to the stock and flow diagram, initiate a simulation "run", and watch the results as they unfold on the screen. To represent the results a number of graphical options exist within the software. For this example a simple graph was used to illustrate the outcomes of the simulation "run" (see Figure 9).

The graph describes X and Y axes, where X represents the number of months during the school year and Y represents the number of students enrolled in the school (from zero to a possible 1500 students maximum). In the initial
"day" of the school year, the number of students currently enrolled was 100, the same as initially entered in the stock entitled Current Student Enrollment.

During the simulation "run" the number of students enrolled gradually increased, due to more new students enrolling (two for every 10) versus students lost (two for every 100), so that by the end of the "year" the student population was close to 800 students. With this type of simulation, an administrator could predict needs for future facilities and when capacities were going to be met during the year.

FIGURE 9
EXAMPLE OF SIMULATION OUTPUT
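The year-long run just described can be approximated outside ithink! with a short loop. This is a hypothetical re-implementation, not the software's own output, and the exact end-of-year figure depends on the integration step dt; with a monthly step the loop below lands near 730, in the neighborhood of the roughly 800 read off the graph:

```python
def simulate_enrollment(init, growth_fraction, loss_fraction, months, dt=1.0):
    """Return the month-by-month enrollment trajectory of the submodel."""
    stock, history = init, [init]
    for _ in range(int(months / dt)):
        inflow = stock * growth_fraction      # new_students flow
        outflow = stock * loss_fraction       # losing_students flow
        stock += (inflow - outflow) * dt
        history.append(stock)
    return history

trajectory = simulate_enrollment(100.0, 0.2, 0.02, months=12)
print(round(trajectory[-1], 1))   # 728.8
```

A smaller dt (ithink! lets the modeler choose it) pushes the result higher, toward the continuous-compounding limit, which is one reason the printed trajectory and a screen graph need not match digit for digit.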
After the simulation has been set up to run, sensitivity analysis is necessary to insure that the model describes what is out there in the "real" world. Tyo (1995) describes how:

     ithink! lets users do sensitivity analysis on the model by running it repeatedly with varying inputs. The results of each run are written to a separate line on the output graph. For input, the user can set basic statistical distributions or use graphs. An ithink! model can be animated with the level of the stocks moving up and down as appropriate. (p. 66)

To do a sensitivity analysis the modeler needs to choose certain specifications in the simulation software and run the simulation a number of times until the outcomes match the desired results. For example, in the Student Enrollment model a desired outcome might be that the number of enrolled students not exceed an upper limit of 800 (for reasons of facilities, etc.). In the software, there is a function that allows the modeler to do a number of sensitivity runs automatically until the desired output is achieved. To keep the enrollment down, fewer students can be allowed into the school than two for every 10 currently enrolled students, or more can be made to leave the school than two for every 100 currently enrolled students. Thus, the modeler enters differing values in the student growth or loss fraction converters and watches the respective outcomes.

To initiate the sensitivity analysis the modeler selects an option entitled Sensitivity Specifications and a window appears on the computer screen (see Figure 10). In this window the modeler first selects the system dynamics component to be analyzed—in this example the student growth fraction converter was selected. Then the modeler selects the number of runs desired to automate the simulation. For this example four runs were selected. After this the modeler chooses a range of data for test purposes. The example shows a
range of values from "0.05" (i.e., five for every 100 students currently enrolled) to the initial "0.2" from the first simulation run (i.e., two for every 10 students currently enrolled). This selected range is less than or equal to the original specification in the student growth fraction converter data entry, to bring down the number of new students enrolling in the school.

FIGURE 10
EXAMPLE OF SENSITIVITY ANALYSIS DATA ENTRY

After the data is entered the modeler selects the graph function in the previously described window and runs the simulation. The simulation cycles four times, each time using a different value for the student growth fraction converter. The resulting output is drawn on the graph and the modeler can see which, if any, output is desirable (see Figure 11). Then the modeler can go back to the student growth fraction converter and change the data entry to the desired
fraction to "set" this component in the model, or run the analysis again with different data sets to create a more desirable outcome.

FIGURE 11
EXAMPLE OF FOUR SENSITIVITY ANALYSIS RUNS

Using sensitivity analysis, the modeler can insure that the simulation describes a situation that reflects what is happening in the system in question. The desired outcomes can be achieved with a logical application of the program.

In conclusion, ithink! is a system dynamics package that can be run on inexpensive computers; its ease of use and practicality make it an ideal program for creating a technology-based tool to be used in educational reform.
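The four sensitivity runs described above amount to re-running the same model over a range of growth fractions and comparing the end states. A minimal sketch of that procedure (hypothetical code, not ithink!'s own mechanism):

```python
def final_enrollment(growth_fraction, loss_fraction=0.02,
                     init=100.0, months=12, dt=1.0):
    """End-of-year enrollment for one sensitivity run."""
    stock = init
    for _ in range(int(months / dt)):
        stock += stock * (growth_fraction - loss_fraction) * dt
    return stock

# Four runs spanning the 0.05-to-0.2 range used in the example.
for g in (0.05, 0.10, 0.15, 0.20):
    print(g, round(final_enrollment(g), 1))
```

Each pass plays the role of one line on the sensitivity graph; the modeler then picks the growth fraction whose end state stays under the desired 800-student ceiling.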
Limitations of the Study

This study is limited to the simulated observation of the variables described in the knowledge synthesis matrix based upon Brophy and Good's (1986) summary of teacher behaviors and student achievement found in Table 2.
CHAPTER II

RESEARCH METHODOLOGY

Simulation Modeling

Modeling educational processes in a computer simulated environment involved a number of ordered steps to complete the product in a logical, sequential fashion. This sequence involved selection of the software; design and construction of the model; and calibration and validation of the outputs from the simulation.

The simulation exercise was constructed in a simulation and authoring tool environment known as ithink! (High Performance Systems, Inc., 1994). To identify the proper selection and magnitude of inputs for the simulation, a matrix was developed using information reported in a knowledge synthesis report of research-based conclusions of significant teacher behaviors that affect student achievement (Brophy & Good, 1986). To insure the creation of a valid, reliable simulation, the methodology for creating a computer simulation model of the variables identified in the matrix was based upon recommended modeling techniques reported by Whicker and Sigelman (1991). This sequential order included the following components: model criteria, knowledge synthesis, system definition, grouping of variables, data identification, mapping the model, model translation, model calibration, and validation of the model.

The organization of this chapter is illustrated in Figure 12—a flow diagram of the simulation design and construction process for each step of the modeling process as well as pertinent subheadings.
FIGURE 12
FLOW DIAGRAM OF THE SIMULATION MODELING SEQUENCE
________________________
Source: Adapted from Whicker and Sigelman (1991).
Model Criteria & Knowledge Synthesis

Following the flow diagram of the modeling process, the first step was to define the criteria of a simulation exercise of teacher behaviors which affect student achievement. In the previous chapter the criteria for the model as well as the knowledge synthesis were discussed and defined. The next section describes an attempt to define the system itself.

System Definition

The actual student achievement/teacher behavior findings were reduced into interactions among variables and factors. These interactions, described in Figure 13, are known as feedback circle diagrams or "circles of influence rather than straight lines" (Senge, 1990a, p. 75).

This feedback circle diagram was the first step in describing a school in system dynamics terminology (Richardson, 1991). For example, the desired academic achievement for a classroom setting influenced the teacher's perception of how great the gap in knowledge is at the present time (e.g., standardized test scores, previous grade reports). This perception, in turn, influenced behaviors the teacher exhibited, using the Instructional Theory Into Practice (ITIP) model as a conceptual framework for planning appropriate instruction strategies. These behaviors, in turn, influenced the student performance which, in turn, influenced current academic achievement. As the current academic achievement approached the desired level of achievement, the perceived knowledge gap decreased and, in turn, teacher behaviors changed.
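The feedback circle just described is a balancing loop, and its behavior can be sketched numerically. Everything in the sketch is illustrative: the responsiveness gain is an invented parameter, not a value from the study; the point is only that a gap-driven loop closes on the desired level:

```python
def knowledge_gap_loop(desired, current, responsiveness=0.3, periods=10):
    """Balancing feedback: perceived gap -> teacher behaviors (ITIP) ->
    student performance -> current achievement -> smaller gap."""
    history = [current]
    for _ in range(periods):
        gap = desired - current            # perceived knowledge gap
        current += responsiveness * gap    # behaviors move performance
        history.append(current)
    return history

levels = knowledge_gap_loop(desired=100.0, current=50.0)
print(round(levels[-1], 1))   # 98.6 -- the gap has nearly closed
```

Each pass around the loop shrinks the remaining gap by the same fraction, which is the "circle of influence" reading of Senge's diagram rather than a straight cause-to-effect line.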
FIGURE 13
FEEDBACK CIRCLE DIAGRAM OF HOW TEACHER BEHAVIORS MAY AFFECT STUDENT PERFORMANCE

Grouping of Variables & Data Identification

The grouping of the variables was previously discussed and identified in the knowledge synthesis (see Table 2). The data requirements of the model described in this study were identified and recorded to provide initial values for the variables as reported in the knowledge synthesis matrix in the previous chapter.
Mapping the Model

The theoretical part of simulation design was followed by writing the system dynamics application using research-based conclusions, or "mapping" the model, using the available software tools found in ithink!. In this next section, the four divisions of generic variables previously identified in the knowledge synthesis matrix (Table 2)—giving information, questioning students, reacting to student responses, and handling seatwork—were mapped into unique model sectors, describing a structural framework based upon the Instructional Theory Into Practice (ITIP) model (Table 1). A student sector and a teacher sector describing context-specific research-based conclusions completed the model designed for validation of Brophy and Good's findings as they were illustrated in a system dynamics environment. An optional sector, teacher burnout, was added outside of the structural framework, to be introduced after validation was completed.

Each process frame (i.e., sector) of the actual computer model is illustrated in the following figures. In each of the following figures, there is a graphic representation of the research-based systemic interactions between variables, connectors, and stock-and-flow diagrams in each illustration as identified in the knowledge synthesis matrix (see Table 2) and discussed in the review of the literature. The sector names correspond directly to the category names as outlined in the knowledge synthesis.

Sector 1—Student

The first sector was a description of the student in two context-specific areas: socioeconomic status (SES) and grade level (Figure 14). The two context-specific areas identified by Brophy and Good (1986) that are directly affected by SES are student call-outs and teacher praise.

FIGURE 14
STUDENT SECTOR

SES affects achievement in regard to student call-outs. In low-SES situations student call-outs correlate positively with achievement, and vice versa in high-SES situations.

SES also affects achievement in regard to the level of teacher praise. Low-SES students need a great deal of praise, as opposed to their high-SES counterparts, who do not need as much praise.

In regard to grade level, four areas (Brophy & Good, 1986) are directly affected in context-specific findings: vagueness in terminology from the teacher, the teacher's level of organizational skills, the impact of teacher praise, and teacher expectations.

Vagueness in teacher delivery of instruction is more important in later grades than in the earlier grades. The more vague the teacher is in her/his delivery of instruction, the less achievement is realized from older students.

In the earlier grades, organizational skills of the teacher are more important than in later grades. Students in earlier grades need more help in following rules and procedures than their older counterparts.
Praise is more important in the earlier grades than in the later grades. Praise increases student achievement more with younger students than with older students.

In the later grades it is more important to be clear about teacher expectations than in the earlier grades. To help them increase academic achievement, older students need to know more about what, exactly, is expected.

Sector 2—Teacher

A description of the teacher in three research findings (Brophy & Good, 1986)—organizational skills, experience, and expectation—is illustrated in Figure 15.

FIGURE 15
TEACHER SECTOR

A teacher's organizational skills affect student achievement. Teachers who are well organized have better success in increasing academic achievement from their students, especially in the lower grades, than teachers who are less than well organized. Students who are placed in environments where the teacher controls student task behaviors experience more academic successes.
A teacher's experience directly affects student achievement as well. Teachers who have more than three years of teaching experience solve more teaching problems and have better classroom management than their more inexperienced counterparts.

Expectation plays a major role in student achievement. Teachers form expectations about a student's ability and personality early in the year. If this expectation is low, the student's achievement is not maximized.

Sector 3—Giving Information

In this sector, a combination of the three variables as identified by Brophy and Good (1986) was included: vagueness in terminology, degree of redundancy, and question/interactions per period (Figure 16).

FIGURE 16
GIVING INFORMATION

Vagueness in terminology has been shown to reduce student achievement. This finding is especially noticeable in the upper grades.
Teachers who are redundant in their explanations of new content material realize higher achievement in their students than those who do not repeat content in their delivery of instruction. This is especially true in repeating and reviewing general rules and key concepts.

Teachers who generate more question/interactions among their students realize more achievement gains than their counterparts who have fewer interactions with students. High-gain classes experience about 24 question/interactions per 50-minute period, versus 8.5 question/interactions or fewer in lower-gain classrooms.

Sector 4—Questioning the Students

The fourth sector was designed to include the following variables defined by Brophy and Good (1986): student call-outs, higher-level questions, post-question wait time, and student response rate (Figure 17).

FIGURE 17
QUESTIONING THE STUDENTS
Teachers who allow student call-outs in low-SES classes notice greater student achievement than those who do not allow call-outs. The opposite effect is true if the students are from a high-SES background.

The percentage of higher-level questioning techniques also affects student achievement. A teacher who poses about 25% of questions in the higher-level domain increases student achievement more than if a larger or smaller share of these types of questions is used.

Postquestion wait time has been shown to increase student achievement if optimally used. Three seconds seems to be the most efficient amount of time to wait before calling on students after questioning.

The teacher who moves at a brisk enough pace to engage students during questioning realizes more achievement gains from the students than teachers who do not briskly cover the content area. A rate of about 75% of student successes in responding to questions is sufficient for achievement to increase.

Sector 5—Reacting to Student Responses

This fifth sector was defined in three variables (Brophy & Good, 1986): teacher praise, and negative and positive feedback (Figure 18).

FIGURE 18
REACTING TO STUDENT RESPONSES
Teacher praise impacts low-SES students' achievement gains—the more praise from the teacher, the more achievement realized from the student. Grade level is another area that is sensitive to teacher praise. Younger students learn more when praised for their effort than do their older counterparts.

Negative feedback is an important factor to consider when teaching. When a teacher encounters an incorrect answer to his/her question, 99% of the time the feedback from the teacher should be simple negation rather than personal criticism for the wrong answer.

Positive feedback for correct responses from students also impacts student achievement gains. Correct responses should be acknowledged almost all of the time (90%)—if not for the respondent's sake, then for the onlookers who are wondering if the answer was correct.

Sector 6—Handling Seatwork

The sixth sector was described in Figure 19 as two variables from Brophy and Good's (1986) knowledge synthesis: independent seatwork, and amount of available help.

FIGURE 19
HANDLING SEATWORK
Students who work independently after the lesson is taught need to be successful nearly 100% of the time in their seatwork. Those students who need help and immediately receive attention from the teacher will also benefit in their academic achievement.

Sector 7—Teacher Burnout

This optional sector was defined using two variables: enthusiasm and amount of teacher burnout (Figure 20).

FIGURE 20
TEACHER BURNOUT

Enthusiasm from the classroom teacher has been shown to affect achievement gains in students. Brophy and Good (1986) report that "enthusiasm often correlates with achievement" (p. 362).

Teacher burnout was one area that was not covered in the knowledge synthesis. This variable was based on assumptions derived solely from the personal experiences of the researcher and a number of colleagues.

After the individual sectors were defined, the entire model was integrated into a large-scale simulation format by connecting the seven sectors together in a logical structure.
This integration exercise added structure to the entire model by graphically, as well as functionally, defining the places where there was feedback between the individual sectors based on the research findings in the knowledge synthesis. The next section describes how the sectors were connected using the knowledge synthesis as a basis for integration.

Model Integration

Mapping of the complete model of teacher behavior as it affects student achievement was constructed by combining the seven sectors previously described. The next step in mapping was to connect the sectors to each other by way of logical intersections derived from the knowledge synthesis research findings (see Figure 21).

The connections between sectors in the complete model brought together all the research-based conclusions into one complete system. For example, the socioeconomic status variable in Sector 1—Student was connected to the student call-outs impact variable in Sector 4—Questioning the Students due to the finding which stated that "student call-outs usually correlate positively with achievement in low-SES classes but negatively in high-SES classes" (Brophy & Good, 1986, p. 363). All of the connections were based upon the findings in the knowledge synthesis matrix discussed in the literature review in Chapter I, or they were not included in the model.

Figure 21 is the actual graphic representation—the "map"—of the simulation model. This map includes all of the 17 variables and seven divisions of variables according to the conclusions previously stated in the knowledge synthesis matrix (Brophy & Good, 1986).
Model Translation

After this visual mapping of the model was created, the model was defined in mathematical equations in the manner previously described in Chapter I (see Table 4). The numerical equivalencies were taken from the research and, subsequently, entered into formulae (see Appendix F).

Model Calibration

In this study, the sensitivity of the initial model was adjusted to behave in the anticipated fashion when certain input values were altered. To do this, every identified variable (e.g., teacher behavior, classroom demographics, student achievement, etc.) was isolated in each of the separate sectors, and the subsequent simulation outputs from each variable in their individual simulation runs were compared against real-world outputs to insure significance in the sameness of generated data versus reality. For example, the wait time variable, as described in the knowledge synthesis, was isolated from the rest of the sector, with all other sectors in isolation as well, and sensitivity analysis was used to insure that three seconds was, in fact, the amount of time which reflected the highest level of achievement possible before connecting this variable back into the sector and the complete system as well.

To achieve this isolated calibration of each variable within every sector, a relationship was identified and assigned between all pertinent variables within each sector and the student achievement variable, based upon the research findings in the knowledge synthesis. For example, the aforementioned wait time variable was described in relating the impact of wait time on student achievement. The relationship must either cause the student achievement to rise
or fall during the simulation. "Studies . . . have shown higher achievement when teachers pause for about 3 seconds . . . after a question, to give the students time to think before calling on one of them" (Brophy & Good, 1986, p. 363).

Using this conclusion, a wait time impact graph was plotted and sensitivity runs were used to determine the necessary relationship between post-question wait time and student achievement. This graphical function enabled the simulation to reproduce "complex nonlinear relationships" (High Performance Systems, Inc., 1994, p. 14) throughout the simulation model. Figure 22 illustrates the post-question wait time impact graph found in the Sector 4—Questioning the Students component of the model.

FIGURE 22
POST-QUESTION WAIT TIME IMPACT
In Figure 22, there is a table of inputs and outputs as well as a graphical description. When postquestion_wait_time (X axis) data is entered into the simulation, a relative WAIT_TIME_IMPACT output (Y axis) corresponds to that input. For example, if the number 3.00 is entered by a participant in a given simulation run, a corresponding 100 (i.e., a student achievement score equivalency) is plotted on the WAIT_TIME_IMPACT axis. Any other input returns less than optimal output from the model. Therefore, the three-second wait time variable is insured to impact student achievement more than any other data entered in this particular variable.

The relationships between each variable and the student achievement outcome variable were defined using graphical functions similar to the example described above. A listing of X and Y axes plots from every graphical function in the model can be recreated using the numerical outputs found in the System Dynamics Computer Simulation Model Formulae in Appendix F.

Once each variable was calibrated in isolation, the individual sectors were calibrated in isolation as well, to assure that they reacted to input in the anticipated fashion as defined by the research. Each sector was isolated from the model, run with differing data sets, and calibrated so that optimum results were achieved when the ideal input, as reported in the research, was entered. For example, the giving information sector was calibrated so that: 1) when there was little or no vagueness from the teacher, 2) when there was a high degree of redundancy, and 3) when there was the optimum amount of question/answer interactions (i.e., 24 per 50-minute period), then, and only then, did the stock-and-flow diagram labeled Quality of Structuring indicate an output of ideal proportions (i.e., the number 100).
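A graphical function of this kind is essentially a piecewise-linear lookup over plotted (X, Y) points. The sketch below illustrates the idea; the plot points are invented for illustration (the model's actual values are in Appendix F), with the peak placed at the three-second wait as in Figure 22:

```python
def graphical_function(points):
    """Piecewise-linear lookup over (x, y) plot points, in the spirit of
    an ithink! graphical function. Clamps outside the plotted range."""
    def lookup(x):
        if x <= points[0][0]:
            return points[0][1]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        return points[-1][1]
    return lookup

# Hypothetical plot of WAIT_TIME_IMPACT: output peaks at a 3-second wait.
wait_time_impact = graphical_function(
    [(0, 40), (1, 60), (2, 85), (3, 100), (4, 90), (5, 75)])
print(wait_time_impact(3.0))   # 100.0
```

Any input other than 3.0 falls on a segment below the peak, which is how the calibrated model guarantees that only the research-supported wait time yields the maximal impact.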
Sector Integration

Each sector of the system dynamics model was integrated back into the model by incorporating variables from other sectors at logical connecting points. For example, the socioeconomic status (SES) variable in Sector 1—Student was found to affect the impact of student call-outs in Sector 4—Questioning the Students. "Student call-outs usually correlate with achievement in low-SES classes but negatively in high-SES classes" (Brophy & Good, 1986, p. 363). Therefore a connection was created between the two sectors, whereby SES directly affected the impact of student call-outs on achievement.

FIGURE 23
CURRENT STUDENT ACHIEVEMENT
A graphical representation of the stock-and-flow entitled Current Student Achievement was created to illustrate the combined outcomes from all of the sectors (see Figure 23). Tyo (1995) describes how ithink! "provides both time series and scatter graphs to view the output of a simulation run" (p. 66). This "scorecard" of student achievement was situated in Sector 1—Student as a logical reflection of the results of manipulating the data entered during individual simulation sessions. The range of scores could run from 0%, reflecting no student achievement, up to 100%, reflecting the maximum amount of student achievement.

"Steady State" Simulation

After the integration of every variable as it affected every other pertinent variable, a number of simulation runs were performed to find the "steady state" of the model. This "steady state" (i.e., ideal simulation output) was required to insure that each of the individually-calibrated sectors, when combined in the completed model, did not adversely affect the other sectors in ways not described in the research.

To insure that a true "steady state" was attained, the simulation outcomes were balanced until the ideal achievement level for the student occurred when all of the research-based conclusions from each calibrated sector were represented during the manipulation of data entry in each and every variable. Numerical outputs from each variable, as well as the accumulated results illustrated in the Current Student Achievement stock, were checked, and when all of the outcomes were as described in the knowledge synthesis, the model was considered at "steady state".

Two complete sets of data: one set required for a "steady state" simulation run and one set from an actual respondent who was included in the validation
studies were included in Table 5 to illustrate a comparison of how differing sets of data can completely change the final outcome of the simulation run, as reflected in the Achievement graph in the Current Student Achievement stock.

TABLE 5
EXAMPLE OF "STEADY STATE" SIMULATION SESSION VERSUS ACTUAL RESPONDENT DATA ENTRY
________________________________________________________________________
VARIABLE                                     Study #1 - Resp. #6      Knowledge Synthesis
                                             DATA ENTRY               "Steady State" DATA ENTRY
• amount of questions/interactions           5 (per 50 min. period)   24 (per 50 min. period)
• degree of redundancy in instruction        1 (low degree)           1 (high degree)
• teacher vagueness factor                   4 (near max. amount)     0 (least amount)
• teacher help available                     3 (not available)        1 (readily available)
• student success @ seatwork                 30 (% of time)           100 (% of time)
• % of correct responses from student        5 (% of time)            75 (% of time)
• % of higher-level questions from teacher   30 (% of time)           25 (% of time)
• postquestion wait time from teacher        3 (seconds)              3 (seconds)
• student call-outs                          1 (allowed)              3 (not allowed)
• correct response feedback from teacher     50 (% of time)           100 (% of time)
• incorrect response feedback                50 (% of time)           100 (% of time)
• teacher praise                             0 (much praise)          1 (little or no praise)
• grade level of student                     5 (5th grade-Primary)    12 (senior-High School)
• socio-economic status of student           0 (low SES)              1 (high SES)
• student G.P.A. from previous year          1 (cum. G.P.A.)          4 (cum. G.P.A.)
• teacher organizational skills              0 (unorganized)          1 (highly organized)
• years experience of teacher                3 (years experience)     10 (years experience)
Total Score accumulated in the Current
Student Achievement stock                    53 (% achievement)       100 (% achievement)
________________________________________________________________________

If a modeler were to input the "steady state" data set from Table 5 into the simulation, a Current Student Achievement score of 100 would be plotted on the Achievement graph (see Figure 24).
FIGURE 24

EXAMPLE OF "STEADY STATE" OUTPUT USING DATA SET FROM TABLE 5

If the modeler instead entered the data set from respondent #6 found in Table 5, the resulting Current Student Achievement score would be as plotted on the Achievement graph in Figure 25. This function allows users to run ideal, "steady state" simulations as well as actual simulation runs from other sources for comparison purposes. A data base of simulation runs could be developed for further study and discussion.
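The idea of keeping a data base of simulation runs for comparison could be sketched as follows. This is a hypothetical illustration, not part of the ithink! model itself: the function and variable names (`record_run`, `compare`, the input keys) are invented, and only a few of the Table 5 inputs are shown.

```python
# Hypothetical sketch of a small "data base" of simulation runs for
# side-by-side comparison. The two data sets reproduce a subset of the
# values recorded in Table 5; the simulation model itself is not
# reimplemented here -- only its recorded inputs and final scores.

runs = {}

def record_run(label, inputs, achievement):
    """Store one simulation session's inputs and its final score."""
    runs[label] = {"inputs": inputs, "achievement": achievement}

record_run("Study #1 - Resp. #6",
           {"questions_per_period": 5, "vagueness_factor": 4,
            "seatwork_success_pct": 30, "correct_response_pct": 5},
           achievement=53)

record_run("Steady state",
           {"questions_per_period": 24, "vagueness_factor": 0,
            "seatwork_success_pct": 100, "correct_response_pct": 75},
           achievement=100)

def compare(label_a, label_b):
    """Return the achievement gap and the inputs that differ."""
    a, b = runs[label_a], runs[label_b]
    differing = {k: (a["inputs"][k], b["inputs"][k])
                 for k in a["inputs"] if a["inputs"][k] != b["inputs"][k]}
    return a["achievement"] - b["achievement"], differing

gap, diffs = compare("Steady state", "Study #1 - Resp. #6")   # gap -> 47
```

A store of this kind would let facilitators replay any respondent's session against the "steady state" run and see at a glance which inputs account for the difference in outcome.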
FIGURE 25

EXAMPLE OF RESPONDENT #6 OUTPUT USING DATA SET FROM TABLE 5

This "steady state" did not include outcomes from Sector 7—Teacher Burnout, due to the fact that this sector was experimental in design and not a "true" indicator of all of the combined variables as defined in the knowledge synthesis matrix.

One assumption in the calibration step was that there was no weighting of the outputs from the stock-and-flow diagrams as they interrelated with each other. Each of the 17 variables was given exactly the same level of impact, or weight, as all of the others. For example, the output from Quality of Structuring was given the same weight as the output from Management of Response Opportunities, as well as all the other outputs from the stock-and-flow diagrams.
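The equal-weighting assumption described above amounts to taking an unweighted mean of the sector outputs. The sketch below is illustrative only: the function name is invented, and the outputs are treated as already normalized to a 0-1 scale, which is an assumption, not a detail taken from the model.

```python
# Illustrative sketch of the equal-weighting assumption: each of the 17
# variable outputs (assumed normalized to 0-1) contributes with identical
# weight to the Current Student Achievement score.

def current_student_achievement(sector_outputs):
    """Unweighted mean of normalized sector outputs, as a 0-100 score."""
    if len(sector_outputs) != 17:
        raise ValueError("the model combines exactly 17 variable outputs")
    return 100 * sum(sector_outputs) / len(sector_outputs)

# A "steady state" run: every output at its maximum.
ideal = current_student_achievement([1.0] * 17)   # -> 100.0
```

Any alternative weighting scheme (e.g., letting Quality of Structuring count double) would require an explicit weights vector; none was used in this study.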
Model Validation

The research-based conclusions gathered for this study, as identified by Brophy and Good (1986) and illustrated in the knowledge synthesis matrix, were used as baseline findings of teacher behaviors which affect student achievement. In addition to reliance upon the knowledge synthesis matrix, the model was demonstrated to a select group of practitioners—teachers, administrators and university professors—who helped to determine whether the outputs from the model were similar to how real teacher behaviors affect student achievement in the classroom.

Selection of the practitioners for the first validation study was determined by responses to a cover letter sent to the chief executive officers of 197 overseas American-type schools, whose office addresses were obtained from The ISS directory of overseas schools 1993-94 edition (International School Services, Inc., 1993). This validation study group, comprised mainly of private, overseas, American-curriculum school administrators attending a recruitment conference, was self-selected and conveniently available for the first validation sessions.

Individual simulation sessions were conducted with each of the six interested respondents at the Association for the Advancement of International Education (AAIE) annual conference in New Orleans, Louisiana, in February, 1995. A pre-simulation questionnaire was used to identify and categorize practitioner predictions of what data the model would generate before the actual simulation was run. A post-simulation questionnaire was used to gather participant reflections about the outputs after the simulation had been demonstrated, for a comparison of the expected versus actual outputs.
Copies of the cover letter, reply form from the respondents, follow-up letter, pre-simulation questionnaire, and post-simulation questionnaire are included in Appendices A, B, C, D and E respectively.

The practitioners for the second validation study were drawn from a group of educational leadership graduate students attending a 1995 College of Education summer session course entitled Leadership and Policy Studies 7120—The Supervisory Process, at The University of Memphis, Memphis, Tennessee. This group, comprised mainly of public school teachers from the Memphis area, was not self-selected, but was coerced by the professor to volunteer for participation in the validation study.

Individual simulation sessions were conducted with each of the six interested respondents using the same pre- and post-simulation questionnaires to gather participant reflections about the outputs after the simulation had been demonstrated, for a comparison of the expected versus actual outputs.

The limitations of the two small groups used for validation purposes are obvious and need to be addressed here. Neither group was randomly selected, nor was there sameness in populations between the groups. One group was self-selected, while the other was coerced to participate. One group was mainly private school administrators, the other mainly public school teachers. The groups were purposively selected for their availability, not for randomness. Results may have been different if the groups had been randomly selected and larger in size.
CHAPTER III

RESULTS

Simulation Model of Knowledge Synthesis

The purpose of this particular study was to create a system dynamics simulation of research-based conclusions of teacher behaviors which affect student achievement. Results regarding the research question—whether the knowledge synthesis findings of teacher behaviors that affect student achievement can be usefully modeled in a system dynamics computer simulation—are presented in this chapter, based on observations from each of the individual questions presented to the 12 respondents who participated in the two validation sessions.

Validation Results

Two attempts to validate the simulation model, using practitioners in the field of education as session participants, were made to find whether the model represented how teacher behaviors affect student achievement. The results from both validation studies were comprised of simulation and questionnaire data and are reported in this section.

In both validation studies the respondents participated in individual simulation sessions using pre- and post-simulation instruments to gather data, as well as individual computer simulations using the system dynamics model previously described. Each of the participants orally completed a pre-simulation questionnaire, witnessed a computer simulation session, and completed a written post-
simulation questionnaire used to gather their observations. The complete process lasted approximately 25-30 minutes for each simulation session, including the pre-simulation questionnaire, simulation run, and post-simulation questionnaire.

In the first validation study there were four males and two females in attendance: one private school teacher, four private school administrators and one university professor. In the second study there were three males and three females: five public school teachers and one post-doctoral student.

Pre-simulation Questionnaire

The pre-simulation questionnaire solicited information regarding the participant's choice of data to be entered into the computer model that would reflect the highest achievement possible for the type of student chosen by the participant (see Appendix D). This information, obtained by personal interview, was entered into the simulation model before the participants watched the computer sessions.

The participants from both studies were asked 17 questions derived from the 17 knowledge synthesis findings. These questions were designed to reduce data entry into numerical equivalencies to determine how separate variables would perform within the complete simulation. An eighteenth variable—amount of hours worked per day—was entered to allow the participants to observe how the model would function when the optional sector, teacher burnout, was "switched on".

The pre-simulation questionnaire data collection results from the two validation studies are described in Tables 6 and 7 respectively. These tables illustrate how each respondent in the two studies answered the 17 questions.
TABLE 6

STUDY 1: PRE-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT

 #   QUESTION                   resp. #1   resp. #2   resp. #3   resp. #4   resp. #5   resp. #6
 4   amount of interactions     24         10         12         24         10         5
 5   degree of redundancy       1          1          1          1          1          1
 3   vagueness factor           0          0          2          0          3          4
14   help available             1          1          1          3          3          3
13   success @ seatwork         100        87.5       70         100        50         30
 8   % of correct response      100        80         60         85         40         5
 7   % of higher level ques.    75         35         35         20         30         30
 9   postquestion wait time     10         10         6          3          2          3
 6   student call-outs          3          3          3          3          3          1
12   corr. resp. feedback       100        100        80         50         40         50
11   incorr. resp. feedback     100        0          70         75         70         50
10   teacher praise             0          1          1          1          1          0
 2   grade level                8          7          10         5          6          5
 1   socio-econ. status         1          1          1          1          1          0
15   student G.P.A.             3          4          2.9        3          2          1
17   organizatl skills          1          1          .5         1          .5         0
16   years experience           8          10         5          7          1          3

     Total Score reported in
     Current Student            84%        74%        74%        93%        63%        53%
     Achievement
TABLE 7

STUDY 2: PRE-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT

 #   QUESTION                   resp. #1   resp. #2   resp. #3   resp. #4   resp. #5   resp. #6
 4   amount of interactions     24         15         18         24         24         10
 5   degree of redundancy       0          1          0          1          1          1
 3   vagueness factor           0          1          1          1          0          0
14   help available             1          1          3          1          1          1
13   success @ seatwork        100         85         85         75         80         100
 8   % of correct response      90         70         70         80         75         60
 7   % of higher level ques.    60         25         60         75         25         60
 9   postquestion wait time     3          3          3          3          3          3
 6   student call-outs          3          3          3          3          3          1
12   corr. resp. feedback       100        100        40         100        100        100
11   incorr. resp. feedback     100        92         50         100        100        100
10   teacher praise             0          0          0          0          0          0
 2   grade level                8          8          8          5          2          1
 1   socio-econ. status         1          0          1          1          1          1
15   student G.P.A.             4          3          3          3          3          4
17   organizatl skills          1          .5         .5         1          1          1
16   years experience           7          8          7          5          5          8

     Total Score reported in
     Current Student            94%        86%        73%        82%        86%        82%
     Achievement
Simulation Sessions

In individual simulation sessions, each of the 12 participants, together with the author, observed how the simulation model reacted to her/his inputs. The simulation results were dependent on the choices of numerical input by the participant, according to answers from the pre-simulation questionnaire as entered into the computer simulation. After the initial simulation run, the participant observed a "steady state" simulation run, previously identified by the author, whereby all of the outcomes were maximized (i.e., student achievement was realized at 100%). From the outcomes generated by the "steady state" computer simulation, the participants then reviewed research-based conclusions (recorded within the simulation) as to how each item in question realized its maximum potential within the system. This second simulation run gave each participant a chance to understand how the less than desirable outcomes generated in his/her simulation run might be rectified in future sessions. This type of comparison between ideal simulated outcomes and respondents' simulated outcomes may be one way to "teach" desirable teacher behaviors in future in-service activities.

After the two simulation runs, Sector 7—Teacher Burnout was included (i.e., "switched on") in a third simulation run to demonstrate to the participants how one basic variable—in this case, hours worked per day—could completely change simulation outputs throughout the representations of teacher behaviors in the simulation. This chaotic function was described to the participants as an optional sector, based on assumptions, not on knowledge synthesis findings. Also explained to the participants was how other sectors could be constructed and included at a later date, giving the simulation an open-ended, continuous improvement quality—modifiable at any time.
Post-simulation Questionnaire

The post-simulation questionnaire was designed to solicit reflections from each participant about the results of the individual simulation sessions. There were four questions regarding demographic data of the participants and eight questions regarding the knowledge synthesis/simulation outcomes (see Appendix E). The following section describes each question presented to the respondents and the observations of the responses generated during the validation sessions. The responses to the questions were described using the following Likert-type scale, with the stem being neutral and the direction provided in the response option:

        5              4                3                2              1
    ________       ________        ________         ________       ________
    strongly         agree       neither agree      disagree       strongly
     agree                       nor disagree                      disagree

Using a scale of one to five, with five indicating that the respondent strongly agreed with the question and one indicating that the respondent strongly disagreed, a mean score was calculated to show differences between responses to each question as well as directions in opinions and beliefs among the respondents regarding the following statements.

Question 1. Computer simulation of a classroom setting is very important: The results from the first question were that three respondents strongly agreed, seven respondents agreed, and two respondents neither agreed nor disagreed. Mean score = 3.83 (study 1); mean score = 4.33 (study 2).
Question 2. Direct instruction (active teaching) is very important: Eight of the respondents strongly agreed and four respondents agreed. Mean score = 4.67 (study 1); mean score = 4.67 (study 2).

Question 3. The computer simulation truly reflected what really happens in the classroom: Two of the respondents strongly agreed, eight respondents agreed, and two respondents neither agreed nor disagreed. Mean score = 4.00 (study 1); mean score = 4.00 (study 2).

Question 4. I feel very confident that the research findings used in this study were very accurately modeled in the computer simulation: Four respondents strongly agreed, seven respondents agreed, and one respondent neither agreed nor disagreed. Mean score = 4.00 (study 1); mean score = 4.50 (study 2).

Question 5. The computer simulation session was very helpful in identifying appropriate behaviors for a given classroom setting: One respondent strongly agreed, ten respondents agreed, and one respondent neither agreed nor disagreed. Mean score = 4.00 (study 1); mean score = 4.00 (study 2).

Question 6. The computer simulation is a very good way to demonstrate to teachers how certain behaviors function with one group of students and do not function with another group: Five respondents strongly agreed, five respondents agreed, one respondent neither agreed nor disagreed, and one respondent disagreed. Mean score = 4.00 (study 1); mean score = 4.17 (study 2).

Question 7. The computer simulation helped me to better understand the complexities of classroom teaching in general: Two respondents strongly agreed, five respondents agreed, and five respondents neither agreed nor disagreed. Mean score = 3.83 (study 1); mean score = 3.67 (study 2).
Question 8. I look forward to more computer simulations of this type: Four respondents strongly agreed and eight respondents agreed. Mean score = 4.33 (study 1); mean score = 4.33 (study 2).

Additional Findings

The data findings from the post-simulation questionnaires are reported in Tables 8 and 9 respectively. One unsolicited comment from one of the respondents was that the simulation seemed to be grounded upon a very important instructional theory (i.e., the Instructional Theory Into Practice (ITIP) model for direct instruction by Hunter, 1967). This respondent thought that ITIP was very effective as a framework for teaching. Another unsolicited comment was from a respondent who wanted to receive a copy of the finished dissertation project along with a run-time copy of the simulation as soon as it was published. This participant, when asked, replied that it would be used in teacher in-service activities.

One respondent commented that she believed higher achievement students (which she attributed to high SES) need fewer lower-level questioning techniques than reported by Brophy and Good in the knowledge synthesis findings (1986). Another respondent was surprised that this same finding was not what they had expected at all. Higher-level questioning techniques were an issue brought up more than once during participant comments. Overall, the reactions to the simulation exercise were more positive than not.
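The mean-score calculation used throughout the questionnaire results can be illustrated with a short sketch. The function name and dictionary representation are hypothetical; the tallies reproduce Question 1 of study 1 (one "strongly agree", three "agree", two "neither agree nor disagree").

```python
# Sketch of the mean-score calculation for the post-simulation
# questionnaire: responses on the 5-point Likert scale are averaged
# over the six respondents in a study group.

def likert_mean(tallies):
    """tallies maps a scale value (5 down to 1) to a respondent count."""
    total = sum(value * count for value, count in tallies.items())
    n = sum(tallies.values())
    return round(total / n, 2)

study1_q1 = {5: 1, 4: 3, 3: 2}   # one strongly agree, three agree, two neither
mean = likert_mean(study1_q1)    # -> 3.83
```

Running the same calculation on the Question 2 tallies (four "strongly agree", two "agree") reproduces the reported 4.67, which is how the tallies in Tables 8 and 9 can be checked against the means given in the text.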
TABLE 8

STUDY 1: POST-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT (N=6)

                      under 30     30-50        over 50
  Age range                        III          III

                      teacher      administrator    other
  Nature of work      I            IV               I

                      elementary   middle school    high school    university
  grade level(4)      III          III              III            I

                      bachelor     master       doctor
  highest degree                   IV           II

 MEAN                      5             4              3              2             1
 SCORE   question #:   strongly       agree      neither agree     disagree      strongly
                        agree                    nor disagree                    disagree
 3.83        1.           I            III            II
 4.67        2.          IV             II
 4.00        3.          II             II            II
 4.00        4.           I             IV             I
 4.00        5.           I             IV             I
 4.00        6.          II            III                            I
 3.83        7.          II              I           III
 4.33        8.          II             IV

(4) Administrators may have been represented in more than one grade level due to the scope of their responsibilities.
TABLE 9

STUDY 2: POST-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT (N=6)

                      under 30     30-50        over 50
  Age range           II           IV

                      teacher      administrator    other
  Nature of work      V                             I

                      elementary   middle school    high school    university
  grade level         IV           II

                      bachelor     master       doctor
  highest degree      III          II           I

 MEAN                      5             4              3              2             1
 SCORE   question #:   strongly       agree      neither agree     disagree      strongly
                        agree                    nor disagree                    disagree
 4.33        1.          II             IV
 4.67        2.          IV             II
 4.00        3.                         VI
 4.50        4.         III            III
 4.00        5.                         VI
 4.17        6.         III             II                            I
 3.67        7.                         IV            II
 4.33        8.          II             IV
CHAPTER IV

SUMMARY, CONCLUSIONS, IMPLICATIONS, AND RECOMMENDATIONS

Summary

The purpose of this particular study was to create a system dynamics simulation of research-based conclusions of teacher behaviors which affect student achievement. Interpretations of and conclusions regarding the results presented in the previous chapter are described here. Post-questionnaire responses will be discussed, along with observations and recommendations for future research as well. It is evident from the small sample size of respondents (study #1: N = 6; study #2: N = 6) that conclusions from this study might possibly mask a particular result, and that more validation sessions with a larger sample size would help to strengthen (or weaken) the following interpretations.

In light of the research question, the findings from this study tentatively support the position that knowledge synthesis conclusions can be usefully modeled in a system dynamics computer simulation. The results from two validation studies of educators who predicted possible outcomes, witnessed the simulation, and recorded their observations about the computer-generated outcomes from the simulation indicate a positive response to the research question of whether or not knowledge synthesis findings could be modeled in a system dynamics environment in a computer simulation. The study also revealed that all of the respondents valued the usage of computer simulations in education and believed that student/teacher interactions can be usefully simulated.
Conclusions

In chapters one and two the research question stated: can the knowledge synthesis findings of teacher behaviors that affect student achievement be usefully modeled in a system dynamics computer simulation? The results from this study indicate that knowledge synthesis findings can, more likely than not, be usefully modeled in a system dynamics environment. The following is a description of the conclusions regarding the study.

The most important response recorded during the validation sessions was in regard to the theoretical basis on which the simulation was founded. In question number 2 it is apparent from the respondents' observations that the direct instruction approach to teaching was highly valued (Mean scores5 = study #1: 4.67 and study #2: 4.67). This indicates that the area of research on teaching used in developing the simulation—Hunter's (1967) Instructional Theory Into Practice (ITIP) model—was pertinent to all of the respondents and can be considered important enough for future simulation studies. Another important consideration is that 100% of the respondents look forward to such simulations in the future (Mean scores = study #1: 4.33 and study #2: 4.33). Thus, it can be concluded that simulations based on ITIP models are valuable exercises for educators.

The validity of the simulation exercise appeared to be supported by the fact that responses to questions 3 and 5—whether the simulation truly reflected what really happens in the classroom, and whether the session was very helpful in identifying appropriate teacher behaviors—indicated a tentative agreement from the majority of the respondents (Mean scores = 4.00 in both studies for both questions). Responses to questions 1 and 6—the importance of simulations and how good

_______________
5 All mean scores are based on a maximum possible rating of 5.
they are in demonstrating behaviors to teachers—indicate that the respondents generally valued these factors as well (question #1 mean scores = study #1: 3.83 and study #2: 4.33; question #6 mean scores = study #1: 4.00 and study #2: 4.17), with one disagreement from the participants. It can be concluded that, for the most part, the simulation generally represented the research-based conclusions found in the knowledge synthesis in an important, valid and understandable manner.

In question 7 there was less agreement than with any of the other questions that simulations help educators to understand the complexities of teaching. Five of the respondents neither agreed nor disagreed, five agreed and only two strongly agreed (Mean scores = study #1: 3.83 and study #2: 3.67). It can be concluded that the complexities of teaching may not be as easily simulated in a system dynamics simulation as the other elements addressed in the study.

In light of the fact that the two validation study respondent groups were diverse in many ways, and limited in size, one point must be addressed: the results indicate that the observations from both of the groups are quite alike. The diversity between groups may actually be considered a positive factor in supporting the conclusion that the simulation seems to reflect what happens in the classroom between teacher and student. Though there was opportunity to strongly disagree with the results of the simulation sessions, most of the respondents indicated a positive response to every one of the questions in the survey instrument.
Implications

Some implications about the use of a computer simulation model such as this have been identified. In the pages that follow, these implications are categorized into four areas of education: teachers and instruction, administrators, research, and schools as learning organizations. All of these areas can use simulations similar to the one in this study as part of a learning laboratory to practice and learn in "an environment that is risky, turbulent, and unpredictable" (Kim, 1995, p. 46). This simulation exercise can provide educators with such an environment. Kim (1995) states that learning laboratories are designed to:

    create an environment that is of operational relevance in which managers can step out of day-to-day demands to:
    • reflect on their decision making
    • develop a common language
    • learn new tools for thinking systematically
    • discuss operational objectives and strategies in an open forum
    • test operating assumptions
    • experiment with new policies and strategies
    • and have fun. (p. 47)

The second element of learning laboratories—that a common language can be developed among participants—is one area of education that needs to be addressed here. For too long educators have lacked a common vocabulary with which to describe events in schools. Simulations can help to formulate and apply agreed-upon terminology which identifies certain behaviors/ideas/applications that affect students, teachers, administrators and researchers in schools. After playing the simulations, the participants can reflect on certain areas of concern, and identify those elements by referring to the modeled research-based conclusions in the simulation. Teachers and administrators can use the common
terminology to help better understand the complexities found in education and to better understand their oftentimes differing perspectives.

Learning laboratories are used in business and industry as a way to train employees away from the job site. There are university programs which deal solely with this type of instructional strategy. For example, one of these programs, at MIT's Sloan School of Management, uses learning laboratories to teach leadership training by incorporating system dynamics simulations into systems thinking courses (Senge, 1990a).

Teachers and Instruction

Teacher Training

In the area of teacher training, learning laboratories using a model such as the one in this study may be used as a risk-free environment to help student teachers experiment with strategies for increasing student achievement without actually entering the classroom setting. The students can attempt to increase their "scores" by playing the simulation, or isolated parts of it—experiencing the research-based conclusions in sessions with or without their master teacher/professor present. The sessions can be stopped at any time, allowing the instructor or end-user to intervene where appropriate/necessary.

Staff Development

In the area of staff development, the simulation may help to introduce or reinforce the research-based conclusions of increasing student achievement to veteran teachers. These experts can attempt to optimize the system for increasing student achievement. One advantage of this model is that, during simulation sessions, teachers may suspend the individual simulation runs at any
time during the "school year" to modify teaching behaviors if the immediate results do not reflect sufficient increases in achievement. The teachers can reflect on their choices and change mid-year without having to start the simulation sessions all over again.

An argument against such simulations in staff development activities, or any other educational realm for that matter, is that these kinds of exercises are scientifically-based representations of teaching and there is no representation of the "art" of teaching in simulation technology. In response to the question of art versus science in simulations, artists tend to realize that only a small amount of the world is important, and they visualize those important details. Kim (1995) suggests that if, in paying attention to those details, we fail to capture "the core structures that are important, we may be the unwitting producers of our own chaos" (p. 25). Simulations allow educators the opportunity to know more about the processes in question, broadening the scope of what is possible and what is happening out there in "real" terms, not just based on partial perception, artistically speaking or otherwise.

Administrators

Staff Evaluation

In the area of supervisor/administrator assessment, the simulation may be used to ascertain whether principals have a sufficient research knowledge base to effectively observe and identify effective teaching behaviors during the evaluation process. For example, principals can run the simulation and observe a specific group of teacher behaviors as well as the student achievement resulting from this unique combination of behaviors from the teacher. After the session, the principals can give recommendations for the "virtual" teacher to improve
their performance. Then the principals can view their recommendations in action during the following session to see if they predicted correctly.

In-Service Program Development

Superintendents may use this model as a tool for training administrators about research-based conclusions in non-threatening learning laboratories. Using the simulation to compare the outcomes from participants' simulated sessions and the "ideal" session is one way to reinforce research findings of the effectiveness of teacher behaviors which affect student achievement. This type of comparison between ideal simulated outcomes and respondents' simulated outcomes is one way to reinforce in-service programs which enhance desirable teacher behaviors.

Research

Making Research Results Relevant

One implication for research in the field of education may be an identification process of the gaps in quality of existing research, as identified by professors who visualize, for the first time, outcomes of a simulation model of conclusions based on research. Forrester (1993) states that models can, and must, help to organize information in a more understandable way. With these types of simulations researchers can "try out" their findings to "see" the conclusions modeled in a visual, dynamic environment. Once researchers "see" the big picture they may be able to identify areas which seem counterintuitive and observe seemingly chaotic functions in real time, giving them more clarity in visualizing distances between causality and time and space.
Making Research Results Clear

When simulations are put together with findings from the research, the large amount of prose describing the processes and products becomes distilled into "sound bites" of numerical equivalencies and one-sentence descriptors. The need for these equivalencies helps to clarify to the user as well as the researcher exactly what was found, and the gaps in what else is needed to get the model to function like the "real world."

Making Research Results Useful

Reading the wealth of research findings in education can be a tedious and confusing ordeal. The chapter solely dedicated to teacher behavior and student achievement (Brophy & Good, 1986) which was modeled in this simulation is comprised of 47 pages of single-spaced text and 205 citations within that text. The body of research in itself is a useful tool for educators who have a need to study such findings, but in a 15-minute simulated session of the conclusions the findings can be manipulated in a classroom setting by a complete novice (as were all of the respondents).

Schools as Learning Organizations

Simulations such as the one presented here will allow schools to draw closer to becoming what Senge (1990a) calls learning organizations. A learning organization is one that recognizes and celebrates interdependencies within the organization versus defending the need for independence for individual parts of that system, be it persons, departments, management, etc. Learning organizations stop blaming individuals for systemic problems because the organization is just that—a system, comprised of many interdependent parts,
and no one part is independent from the rest. Therefore, no one part or person in the organization is to blame for a systemic problem, as every part and all people are involved in and affected by the system in one way or another.

The simulation presented in this study gives the individual or workgroup a chance to see one subsystem in its entirety, where all the parts, as identified in the research, are represented, for once, in a user-friendly, dynamic model of teacher behavior as it affects student achievement. This chance to see the "big picture" at one instant is necessary for persons in organizations to experience interdependencies in training sessions for decision-making and policy analysis. Business and industry use these kinds of simulations to forecast successful events and learn about systemic structures and archetypes. Schools must break the static, linear cause-and-effect mode of yesterday and begin to think in systemic, circular, holistic approaches as learning organizations, changing together in interdependent "webs" of relationships such as teacher behaviors and student achievement.

Recommendations

The following recommendations for further study are made based on the results of this research:

1. It might be helpful to include more classroom teachers in the sample population, to insure that a more representative group is involved in the validation process, as well as those previously mentioned.

2. It might be helpful to increase the sample size for statistical purposes. The simulation sessions, including completion of the pre- and post-questionnaires, were time consuming for both the respondents as well as the researcher, but another study using the same simulation, questions, and
methodology, but with a larger group of respondents, could help to increase (or decrease) the probability of significance.

3. It is inevitable that this model will be modified for continuous improvement of how it behaves and predicts outcomes generated from the strengths of and relationships between the variables. It is also necessary to continuously validate the outcomes from this model after each attempt at modification.

4. It might be helpful to develop a run-time version of the model for distribution to end users, which can be used without a facilitator present. This might help the validation process by giving the respondent more liberty in viewing and manipulating the model, lifting the pressures of time and/or the personal interview.

5. It might be helpful to develop a similar simulation using another type of instructional technique versus direct instruction (e.g., cooperative learning, one-on-one instruction) and compare the outcomes of the additional simulations. Some teaching techniques require implicit skills versus the explicit ones described by Brophy and Good (1986). A comparison of those teaching strategies might enable educators to better understand differences in teaching styles and techniques.

6. A number of studies of other important areas in educational reform (e.g., student behaviors, parent involvement, teacher training, school size, financial and legal aspects of education, etc.) need to be connected to the existing simulation in an attempt to investigate how factors inside and outside the classroom setting affect student achievement. This type of study could continue until a whole school district is described in system dynamics terms. Such a simulation exercise could be a basis for a true technological tool for ongoing systemic reform.
7. It might be helpful to develop changes in the model as the research-based conclusions change. For example, a future version of the model might be constructed to be sensitive to new developments in the field of education, particularly as new research is received. Models built in programs such as ithink! can be continually improved, as any of the submodels within the system can be changed at any time without affecting the structure of the model, only the output from the differing equations and data entry.
REFERENCES

Alkin, M. C., Linden, M., Noel, J., & Ray, K. (Eds.). (1992). Encyclopedia of educational research (6th ed.). New York: Macmillan Publishing Company.

Bartz, D. E., & Miller, L. K. (1991). Twelve teaching methods to enhance student learning. Washington, DC: National Education Association.

Bell, T. (1993, April). Reflections one decade after A Nation At Risk. Phi Delta Kappan, 74, 592-597.

Briggs, J. (1992). Fractals: The patterns of chaos. New York: Touchstone.

Brophy, J., & Evertson, C. (1976). Learning from teaching: A developmental perspective. Boston: Allyn and Bacon.

Brophy, J. E., & Good, T. L. (1986). Teacher behavior and student achievement. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 328-375). New York: Macmillan Publishing Company.

Corno, L., & Snow, R. E. (1986). Adapting teaching to individual differences among learners. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 605-629). New York: Macmillan Publishing Company.

Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper & Row.

Darling-Hammond, L. (1993, June). Reframing the school reform agenda. Phi Delta Kappan, 74, 753-761.

Digate, G. A., & Rhodes, L. A. (1995, March). Building capacity for sustained improvement. The School Administrator, 52, 34-38.

Dunkin, M. J. (Ed.). (1987). The international encyclopedia of teaching and teacher education. New York: Pergamon Press.

Erikson, E. H. (1963). Childhood and society. New York: W. W. Norton & Company.

Forrester, J. W. (1961). Industrial dynamics. Cambridge, MA: MIT Press.

Forrester, J. W. (1968). Principles of systems. Cambridge, MA: MIT Press.

Forrester, J. W. (1993). System dynamics and the lessons of 35 years. In K. B. DeGreene (Ed.), Systems-based approach to policy-making. New York: Kluwer Academic Publishers.
Gagné, R. (1970). The conditions of learning. New York: Holt, Rinehart and Winston.

Gonzalez, J. J., & Davidsen, P. (1993, July). Integrating systems thinking and instructional science. Paper presented at the NATO Advanced Study Institute, Grimstad, Norway.

Good, T., & Grouws, D. (1979). The Missouri mathematics effectiveness project: An experimental study in fourth-grade classrooms. Journal of Educational Psychology, 71, 355-362.

Hass, G., & Parkay, F. W. (1993). Curriculum planning: A new approach (6th ed.). Boston: Allyn and Bacon.

Hentschke, G. C. (1975). Management operations in education. Berkeley, CA: McCutchan Publishing Corporation.

High Performance Systems, Inc. (1994). Introduction to systems thinking and ithink!. Hanover, NH: Author.

Hiller, J., Fisher, G., & Kaess, W. (1969). A computer investigation of verbal characteristics of effective classroom lecturing. American Educational Research Journal, 6, 661-675.

Hunter, M. (1967). Improved instruction. El Segundo, CA: TIP Publications.

Hunter, M., & Russell, D. (1981). Planning for effective instruction: Lesson design. In M. Hunter, Increasing your teaching effectiveness. Palo Alto, CA: The Learning Institute.

International School Services, Inc. (1993). The ISS directory of overseas schools 1993-94 edition: The comprehensive guide to K-12 American and international schools worldwide. Princeton, NJ: Author.

Johnston, D., & Richmond, B. (1994). Getting started with ithink!: A hands-on experience. Hanover, NH: High Performance Systems, Inc.

Kim, D. (1995). Systems thinking tools: A user's reference guide. Cambridge, MA: Pegasus Communications.

Kirst, M. W. (1993, April). Strengths and weaknesses of American education. Phi Delta Kappan, 74, 613-618.

Kreutzer, W. B. (1994). Creating your own management flight simulator: A step-by-step builder's handbook (with software reviews). In P. M. Senge, C. Roberts, R. B. Ross, B. J. Smith, & A. Kleiner (Eds.), The fifth discipline fieldbook: Strategies and tools for building a learning organization. New York: Currency Doubleday.
Lannon-Kim, C. (1992, January). The vocabulary of systems thinking: A pocket guide. The Systems Thinker, 2(10), 3-4.

Lunenburg, F. C., & Ornstein, A. C. (1991). Educational administration: Concepts and practices. Belmont, CA: Wadsworth Publishing Company.

ModellData AS. (1993). Powersim: User's guide and reference. Manger (Bergen), Norway: Author.

Nelson, J. O. (1993). School system simulation: An effective model for educational leaders. The AASA Professor, 16(1).

Norman, D. A. (1993). Things that make us smart: Defending human attributes in the age of the machine. Reading, MA: Addison-Wesley Publishing Company.

O'Toole, J. (1993). The executive's compass: Business and the good society. New York: Oxford University Press.

Prigogine, I., & Stengers, I. (1984). Order out of chaos: Man's new dialogue with nature. New York: Bantam Books.

Reilly, D. H. (1993, January). Educational leadership: A new vision and a new role within an international context. Journal of School Leadership, 3(1), 9-19.

Rhodes, L. A. (1994, March). Technology-driven systemic change. Paper presented at the annual International Conference on Technology and Education, London, England.

Richardson, G. P. (1991). Feedback thought in social science and systems theory. Philadelphia: University of Pennsylvania Press.

Richardson, G. P., & Pugh, A. L., III. (1981). Introduction to system dynamics modeling with DYNAMO. Cambridge, MA: MIT Press.

Roberts, N. H. (1974, September). A computer system simulation of student performance in the elementary classroom. Simulation & Gaming, 5, 265-290.

Schön, D. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Senge, P. (1990a). The fifth discipline: The art and practice of the learning organization. New York: Doubleday.

Senge, P. (1990b). The leader's new work: Building learning organizations. Sloan Management Review, 32(1), 7-23.
Shulman, L. S. (1986). Paradigms and research programs in the study of teaching: A contemporary perspective. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 3-36). New York: Macmillan Publishing Company.

Smith, C. B., & Klein, S. S. (1991). Synthesis research in language arts instruction. In J. Flood, J. M. Jensen, D. Lapp, & J. R. Squire (Eds.), Handbook of research on teaching the English language arts. New York: Macmillan Publishing Company.

Smith, L., & Land, M. (1981). Low-inference verbal behaviors related to teacher clarity. Journal of Classroom Interaction, 17, 37-42.

Solomon, D., & Kendall, A. (1979). Children in classrooms: An investigation of person-environment interaction. New York: Praeger.

Tobin, K. (1980). The effect of an extended teacher wait-time on science achievement. Journal of Research in Science Teaching, 17, 469-475.

Tobin, K., & Capie, W. (1982). Relationships between classroom process variables and middle-school science achievement. Journal of Educational Psychology, 74, 441-454.

Tyo, J. (1995). Simulation modeling tools. InformationWeek, 535, 60-67.

Waddington, C. H. (1976). Tools for thought. St. Albans, England: Paladin.

Waldrop, M. M. (1992). Complexity: The emerging science at the edge of order and chaos. New York: Simon & Schuster.

War Manpower Commission. (1945). The training within industry report. Washington, DC: Bureau of Training.

Wheatley, M. J. (1992). Leadership and the new science: Learning about organization from an orderly universe. San Francisco: Berrett-Koehler Publishers.

Whicker, M. L., & Sigelman, L. (1991). Computer simulation applications: An introduction (Applied Social Research Methods Series, Vol. 25). Newbury Park, CA: Sage Publications.

Wittrock, M. C. (Ed.). (1986). Handbook of research on teaching (3rd ed.). New York: Macmillan Publishing Company.
APPENDIX A

Cover Letter To The Chief Executive Officer Of The School
APPENDIX B

Reply Form From Respondent
APPENDIX C

Follow-Up Letter To Respondent
APPENDIX D

Pre-Simulation Questionnaire To Be Used For Validation Purposes

Please indicate your choices for the statements given below. Choose the answer that you feel will result in the highest overall achievement scores for the grade level and socioeconomic level you have selected. The choices you pick will be used as input for the computer simulation session. After the session, a post-session questionnaire will be administered to record your reflections regarding these inputs and the resulting score.

1. Choose socioeconomic status (SES)
Input: 0 = low socioeconomic status (SES) or dependent/anxious; 1 = high SES or assertive/confident

2. Choose grade level
Input: Choose a grade level from 0 (kindergarten) to 12

3. Choose amount of vagueness in terminology as used by the teacher
Input: Choose a number from 0 to 5, where 0 indicates the least amount of vagueness in terminology and 5 indicates the most

4. Choose amount of question/answer interactions per class period
Input: Choose from 0 to 24 teacher question/student answer interactions per 50-minute class period
5. Choose degree of redundancy
Input: 0 = little or no redundancy of information given to students; 1 = high degree of redundancy of information given to students

6. Choose whether student call-outs are allowed by the teacher
Input: 1 = student call-outs are allowed by the teacher; 3 = student call-outs are not allowed by the teacher

7. Choose amount of higher-level questions (Bloom's taxonomy)
Input: Choose from 0% to 100% of higher-order questions used by the teacher

8. Choose response rate
Input: Choose from 0% to 100% for the correct response rate from the student before the teacher moves on to a new question

9. Choose amount of wait-time [6]
Input: Choose from 0 to 10 seconds of post-question wait-time before the teacher calls on a student to respond to the question

10. Choose amount of praise from the teacher
Input: 0 = great deal of praise from the teacher; 1 = little or no praise from the teacher

11. Choose amount of time negative feedback is simple negation versus personal criticism
Input: Choose from 0% to 100% of the time the teacher should use simple negation rather than personal criticism for incorrect responses to questions

[6] In an attempt to continually improve the model, in the second validation study this question was modified to read from 0 to 3 seconds.
12. Choose amount of time correct responses are acknowledged
Input: Choose from 0% to 100% of the time in which correct responses are acknowledged by the teacher

13. Choose amount of success in independent seat work
Input: Choose from 0% to 100% of the time in which the student experiences success in seat work assignments

14. Choose amount of help available during seat work
Input: 1 = help is readily available; 3 = help is not readily available

15. Choose student cumulative G.P.A. (from 0.0 to 4.0)
Input: Grade point average (GPA) from previous year: 4 = A; 3 = B; 2 = C; 1 = D; 0 = F

16. Choose amount of teacher experience [7]
Input: Select amount of experience in years from 0 to 10

17. Choose amount of organizational skills
Input: Choose 1 for a high amount of organization, .5 for a fair amount of organization, or 0 for not much organization

18. Choose amount of hours worked per day [8]
Input: Choose from 8 to 16 hours a day worked by the teacher

[7] In an attempt to continually improve the model, in the second validation study this question was modified to read from 0 to 40 years.
[8] Optional section based upon assumptions, not on knowledge synthesis (KS) findings.
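The eighteen questionnaire items above are, in effect, a typed input record for the simulation. A minimal Python sketch of how such a record could be encoded and range-checked follows; the key names and the `validate` helper are my own invention, and only the allowed ranges come from the instrument itself.

```python
# Hypothetical encoding of the pre-simulation questionnaire inputs.
# Key names are mine; the (low, high) ranges come from the instrument.
INPUT_RANGES = {
    "ses": (0, 1),                       # 0 = low SES, 1 = high SES
    "grade_level": (0, 12),
    "vagueness": (0, 5),
    "interactions_per_period": (0, 24),
    "redundancy": (0, 1),
    "call_outs": (1, 3),                 # 1 = allowed, 3 = not allowed
    "pct_higher_level_questions": (0, 100),
    "correct_response_rate": (0, 100),
    "wait_time_seconds": (0, 10),
    "teacher_praise": (0, 1),
    "pct_simple_negation": (0, 100),
    "pct_correct_acknowledged": (0, 100),
    "pct_seatwork_success": (0, 100),
    "help_available": (1, 3),
    "gpa": (0, 4),
    "years_experience": (0, 10),
    "organizational_skills": (0, 1),
    "hours_per_day": (8, 16),
}

def validate(inputs):
    """Reject any respondent choice outside the instrument's allowed range."""
    for key, value in inputs.items():
        low, high = INPUT_RANGES[key]
        if not low <= value <= high:
            raise ValueError(f"{key}={value} outside [{low}, {high}]")
    return inputs
```

A run-time version of the model (Recommendation 4) would need exactly this kind of guard before feeding respondent choices into the equations.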
APPENDIX E

Post-Simulation Questionnaire To Be Used For Validation Purposes

age range: under 30 ___ 30-50 ___ over 50 ___
nature of work: teacher ___ administrator ___ other ___
grade level: elementary ___ middle school ___ high school ___
degree of education: bachelor ___ master ___ doctor ___

Please indicate how important the following factors are to you in determining your perception of the outcomes of the teacher/student simulation session. Each statement is rated on the same five-point scale: strongly agree, agree, neither agree nor disagree, disagree, strongly disagree.

1. Computer simulation of a classroom setting is very important.

2. Direct instruction (active teaching) is very important.

3. The computer simulation truly reflected what really happens in the classroom.

4. I feel very confident that the research findings used in this study were very accurately modeled in the computer simulation.

5. The computer simulation session was very helpful in identifying appropriate behaviors for a given classroom setting.

6. The computer simulation is a very good way to demonstrate to teachers how certain behaviors function with one group of students but not with another group.

7. The computer simulation helped me to better understand the complexities of classroom teaching in general.

8. I look forward to more computer simulations of this type.
APPENDIX F

System Dynamics Computer Simulation Model Formulae

Burnout Sector

Burnout(t) = Burnout(t - dt) + (increase_in_burnout - dissipation) * dt
INIT Burnout = 0
DOCUMENT: Assuming more hours worked during months approaching each semester break (final exams, report cards, programs, and parent conferences)

increase_in_burnout = IF (hours_worked_per_day ≤ 8) THEN (0) ELSE (hours_worked_per_day - 8)

dissipation = Burnout * dissipation_frac

dissipation_frac = GRAPH(Burnout)
(0.00, 0.25), (10.0, 0.24), (20.0, 0.229), (30.0, 0.216), (40.0, 0.198), (50.0, 0.182), (60.0, 0.163), (70.0, 0.145), (80.0, 0.121), (90.0, 0.09), (100, 0.0175)

hours_worked_per_day = GRAPH(time)
(1.00, 8.08), (18.9, 8.92), (36.8, 10.3), (54.7, 14.2), (72.6, 13.4), (90.5, 10.0), (108, 9.44), (126, 9.36), (144, 10.9), (162, 13.3), (180, 13.3)
DOCUMENT: Input: Choose from 8 to 16 hours a day worked by the teacher

impact_of_burnout_on_enthusiasm = GRAPH(Burnout)
(0.00, 100), (10.0, 56.0), (20.0, 20.0), (30.0, 11.0), (40.0, 9.00), (50.0, 6.00), (60.0, 3.50), (70.0, 3.00), (80.0, 2.50), (90.0, 1.00), (100, 0.00)

Giving Information

ENTHUSIASM(t) = ENTHUSIASM(t - dt) + (change_in_enthusiasm) * dt
INIT ENTHUSIASM = 75
DOCUMENT: "Enthusiasm . . . often correlates with achievement" (Brophy & Good, 1986, p. 362)

change_in_enthusiasm = (impact_of_burnout_on_enthusiasm - ENTHUSIASM) / adjustment_delay

Quality_of_Structuring(t) = Quality_of_Structuring(t - dt) + (changes_in_structuring) * dt
INIT Quality_of_Structuring = 0

changes_in_structuring = MEAN((MANAGEMENT_IMPACT * GRADE_&_MANAGEMENT), REDUNDANCY_IMPACT, (ENTHUSIASM * GRADE_&_ENTHUSIASM), ACADEMIC_INTERACTIONS_IMPACT, (CLARITY_IMPACT * GRADE_&_CLARITY)) - Quality_of_Structuring

adjustment_delay = 2

amount_of_interactions = 24
DOCUMENT: Input: Choose from 0 to 24 teacher question/student answer interactions per 50-minute class period
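As a minimal sketch of how ithink! evaluates a sector such as Burnout Sector above, the stock-and-flow equations reduce to Euler integration, and a GRAPH converter is a piecewise-linear lookup. This is my own Python rendering, not part of the dissertation's model; it assumes NumPy, whose `np.interp` clamps at the end points much like a graphical function.

```python
import numpy as np

# Graph points copied from dissipation_frac = GRAPH(Burnout) above.
DISSIPATION_FRAC = ([0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100],
                    [0.25, 0.24, 0.229, 0.216, 0.198, 0.182,
                     0.163, 0.145, 0.121, 0.09, 0.0175])

def simulate_burnout(hours_per_day, dt=1.0, burnout=0.0):
    """Burnout(t) = Burnout(t - dt) + (increase_in_burnout - dissipation) * dt."""
    trace = []
    for hours in hours_per_day:
        # increase_in_burnout = IF (hours <= 8) THEN 0 ELSE hours - 8
        increase = 0.0 if hours <= 8 else hours - 8
        # dissipation = Burnout * dissipation_frac, with the fraction read
        # off the graphical function by linear interpolation
        dissipation = burnout * np.interp(burnout, *DISSIPATION_FRAC)
        burnout += (increase - dissipation) * dt
        trace.append(burnout)
    return trace
```

With this structure an 8-hour day adds no burnout, while a run of 16-hour days accumulates it faster than the shrinking dissipation fraction can drain it.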
redundancy = 1
DOCUMENT: Input: 0 = little or no degree of redundancy of information given to students; 1 = high degree of redundancy of information given to students

vagueness_factor = 0
DOCUMENT: Input: Choose a number from 0 to 5, where 0 indicates the least amount of vagueness in terminology and 5 indicates the most

ACADEMIC_INTERACTIONS_IMPACT = GRAPH(amount_of_interactions)
(0.00, 0.00), (12.0, 61.0), (24.0, 100)
DOCUMENT: "About 24 questions were asked per 50 minute period in the high gain classes, . . . In contrast, only about 8.5 questions were asked per period in the low-gain classes . . ." (Brophy & Good, 1986, p. 343).

CLARITY_IMPACT = GRAPH(vagueness_factor)
(0.00, 100), (1.25, 67.5), (2.50, 40.0), (3.75, 20.5), (5.00, 1.50)
DOCUMENT: "Smith and Land (1981) report that adding vagueness terms to otherwise identical presentations reduced student achievement in all of 10 studies in which vagueness was manipulated" (Brophy & Good, 1986, p. 355).

GRADE_&_CLARITY = GRAPH(GRADE_LEVEL)
(0.00, 0.8), (1.20, 0.865), (2.40, 0.915), (3.60, 0.955), (4.80, 0.985), (6.00, 1.00), (7.20, 1.00), (8.40, 1.00), (9.60, 1.00), (10.8, 1.00), (12.0, 1.00)
DOCUMENT: "In later grades, lessons are typically with the whole class and involve applications of basic skills or consideration of more abstract content.
Overt participation is less important than factors such as . . . clarity of statements and questions . . ." (Brophy & Good, 1986, p. 365).

GRADE_&_ENTHUSIASM = GRAPH(GRADE_LEVEL)
(0.00, 0.805), (1.20, 0.86), (2.40, 0.915), (3.60, 0.97), (4.80, 0.995), (6.00, 1.00), (7.20, 1.00), (8.40, 1.00), (9.60, 1.00), (10.8, 1.00), (12.0, 1.00)
DOCUMENT: "[Enthusiasm] often correlates with achievement, especially for older students" (Brophy & Good, 1986, p. 362). "In later grades, lessons are typically with the whole class and involve applications of basic skills or consideration of more abstract content. Overt participation is less important than factors such as . . . enthusiasm . . ." (p. 365).

GRADE_&_MANAGEMENT = GRAPH(GRADE_LEVEL)
(0.00, 1.00), (1.20, 1.00), (2.40, 1.00), (3.60, 1.00), (4.80, 1.00), (6.00, 1.00), (7.20, 0.985), (8.40, 0.965), (9.60, 0.93), (10.8, 0.88), (12.0, 0.805)
DOCUMENT: "In the early grades, classroom management involves a great deal of instruction in desired routines and procedures. Less of this instruction is necessary in the later grades . . ." (Brophy & Good, 1986, p. 365).

REDUNDANCY_IMPACT = GRAPH(redundancy)
(0.00, 50.0), (1.00, 100)
DOCUMENT: "Achievement is higher when information is presented with a degree of redundancy, particularly in the form of repeating and reviewing general rules and key concepts" (Brophy & Good, 1986, p. 362).
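The Giving Information sector relies throughout on a first-order adjustment toward a goal, e.g. change_in_enthusiasm = (impact_of_burnout_on_enthusiasm - ENTHUSIASM) / adjustment_delay. A minimal Python sketch of my own (not the dissertation's code) shows how that structure makes the stock converge on the goal, with the gap shrinking by a factor of (1 - dt/delay) each step:

```python
def smooth_toward(level, goal, delay, dt=1.0):
    """One Euler step of a stock adjusting toward a goal over a delay."""
    return level + (goal - level) / delay * dt

# INIT ENTHUSIASM = 75; suppose burnout has pushed the goal down to 20.
enthusiasm = 75.0
for _ in range(10):
    enthusiasm = smooth_toward(enthusiasm, 20.0, delay=2.0)
# With delay = 2 and dt = 1, the gap halves every step, so after ten
# steps enthusiasm sits within about 0.06 of the goal.
```

The same structure, with different delays, appears again in the Handling Seatwork and Reacting to Student Response sectors.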
Handling Seatwork

Level_of_Independence(t) = Level_of_Independence(t - dt) + (change_in_independence) * dt
INIT Level_of_Independence = 100

change_in_independence = (SUCCESS_RATE_IMPACT - Level_of_Independence) / AVAILABLE_HELP_IMPACT

AVAILABLE_HELP_IMPACT = help_available

help_available = 1
DOCUMENT: Input: 1 if help is readily available; 3 if help is not readily available

success_rate = 90
DOCUMENT: Input: Choose amount of time, from 0% to 100% of the time, in which student experiences success in seatwork assignments

SUCCESS_RATE_IMPACT = IF (success_rate > 89) THEN 100 ELSE 75
DOCUMENT: "For assignments on which students are expected to work on their own, success rates will have to be very high—near 100%. Lower (although still generally high) success rates can be tolerated when students who need help get it quickly" (Brophy & Good, 1986, p. 364).
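A short sketch of my own (not the model's code) makes the seatwork logic above concrete: the success-rate threshold sets the target, and the help-availability input (1 or 3) acts as the adjustment delay, so independence recovers three times faster when help is readily available.

```python
def success_rate_impact(success_rate):
    # IF (success_rate > 89) THEN 100 ELSE 75
    return 100 if success_rate > 89 else 75

def step_independence(level, success_rate, help_available, dt=1.0):
    """change_in_independence =
    (SUCCESS_RATE_IMPACT - Level_of_Independence) / AVAILABLE_HELP_IMPACT."""
    target = success_rate_impact(success_rate)
    return level + (target - level) / help_available * dt
```

Starting from INIT Level_of_Independence = 100, a drop in seatwork success pulls the level toward 75, slowly if help is scarce (help_available = 3) and in a single step if help is immediate (help_available = 1, with dt = 1).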
Questioning the Students

Management_of_Response_Opportunities(t) = Management_of_Response_Opportunities(t - dt) + (change_in_management) * dt
INIT Management_of_Response_Opportunities = 100

change_in_management = (IF (STUDENT_CALLOUTS_IMPACT = 0) THEN (questioning_factors) ELSE ((questioning_factors * 3) + STUDENT_CALLOUTS_IMPACT) / 4) - Management_of_Response_Opportunities

percent_of_correct_responses = 75
DOCUMENT: Input: Choose from 0% to 100% for the correct response rate from the student before the teacher moves on to a new question

percent_of_higher_level_questions = 25
DOCUMENT: Input: Choose from 0% to 100% of higher-order questions used by the teacher

postquestion_wait_time = 3
DOCUMENT: Input: Choose from 0 to 10 seconds of post-question wait-time before the teacher calls on a student for a response to the question

questioning_factors = MEAN(CLARITY_IMPACT, COGNITIVE_LEVEL_IMPACT, SUCCCESS_RATE_IMPACT, WAIT_TIME_IMPACT)

student_call-outs = 1
DOCUMENT: Input: 1 if student call-outs are allowed by teacher; 3 if student call-outs are not allowed by teacher

COGNITIVE_LEVEL_IMPACT = GRAPH(percent_of_higher_level_questions)
(0.00, 0.00), (5.26, 59.5), (10.5, 79.5), (15.8, 91.5), (21.1, 100), (26.3, 100), (31.6, 96.5), (36.8, 92.5), (42.1, 85.5), (47.4, 79.5), (52.6, 72.0), (57.9, 65.0), (63.2, 56.0), (68.4, 44.5), (73.7, 34.5), (78.9, 27.0), (84.2, 18.0), (89.5, 11.5), (94.7, 5.00), (100.0, 0.5)
DOCUMENT: ". . . the frequency of higher-level questions correlates positively with achievement, the absolute numbers on which these correlations are based typically show that only about 25% of the questions were classified as higher level" (Brophy & Good, 1986, p. 363).

STUDENT_CALLOUTS_IMPACT = GRAPH(SES + student_call-outs)
(1.00, 100), (2.00, -100), (3.00, 0.00), (4.00, 0.00)
DOCUMENT: "Student call-outs usually correlate positively with achievement in low-SES classes but negatively in high-SES classes" (Brophy & Good, 1986, p. 363).

SUCCCESS_RATE_IMPACT = GRAPH(percent_of_correct_responses)
(0.00, 0.00), (10.0, 21.0), (20.0, 41.0), (30.0, 62.5), (40.0, 81.0), (50.0, 90.0), (60.0, 96.5), (70.0, 100), (80.0, 100), (90.0, 96.5), (100, 86.0)
DOCUMENT: "Optimal learning occurs when students move at a brisk pace but in small steps, so that they experience continuous progress and high success rates (averaging perhaps 75% during lessons when a teacher is present, and 90-100% when the students must work independently)" (Brophy & Good, 1986, p. 341).
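The call-out graph above encodes an interaction effect by indexing on the sum SES + student_call-outs: call-outs help low-SES classes (+100), hurt high-SES classes (-100), and are neutral when disallowed (0). A small Python sketch of my own (not the dissertation's code) spells out that lookup and the 3:1 blend used by change_in_management:

```python
# GRAPH(SES + student_call-outs): (1, 100), (2, -100), (3, 0), (4, 0)
CALLOUT_IMPACT = {1: 100, 2: -100, 3: 0, 4: 0}

def callouts_impact(ses, callouts):
    # ses: 0 = low SES, 1 = high SES; callouts: 1 = allowed, 3 = not allowed
    return CALLOUT_IMPACT[ses + callouts]

def management_target(questioning_factors, ses, callouts):
    """Target of change_in_management: the plain questioning-factors mean when
    the call-out impact is 0, otherwise a 3:1 weighted blend with it."""
    impact = callouts_impact(ses, callouts)
    if impact == 0:
        return questioning_factors
    return (questioning_factors * 3 + impact) / 4
```

So with identical questioning factors, allowing call-outs raises the management target in a low-SES class and lowers it in a high-SES class, which is exactly the Brophy and Good finding the graph was built to reproduce.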
WAIT_TIME_IMPACT = GRAPH(postquestion_wait_time)
(0.00, 0.00), (0.273, 25.5), (0.545, 45.0), (0.818, 59.5), (1.09, 70.5), (1.36, 79.5), (1.64, 85.0), (1.91, 90.0), (2.18, 93.5), (2.45, 97.0), (2.73, 99.5), (3.00, 100)
DOCUMENT: "Studies . . . have shown higher achievement when teachers pause for about 3 seconds (rather than 1 second or less) after a question, to give the students time to think before calling on one of them" (Brophy & Good, 1986, p. 363).

Reacting to Student Response

Quality_of_Teacher_Reactions(t) = Quality_of_Teacher_Reactions(t - dt) + (change_in_reactions) * dt
INIT Quality_of_Teacher_Reactions = 0

change_in_reactions = (collective_responses - Quality_of_Teacher_Reactions) / delay_in_adjusting

collective_responses = MEAN(CORRECT_RESPONSE_FEEDBACK_IMPACT, INCORRECT_RESPONSE_IMPACT, (TEACHER_PRAISE_IMPACT * IMPACT_PRAISE))

correct_response_feedback = 90
DOCUMENT: Input: Choose amount of time, from 0% to 100% of the time, in which correct responses are acknowledged by the teacher

incorrect_response_feedback = 100
DOCUMENT: Input: Choose from 0% to 100% of the time the teacher should use simple negation versus personal criticism for incorrect responses to questions

teacher_praise = 0
DOCUMENT: Input: 0 = great deal of praise from the teacher; 1 = little or no praise from the teacher

TEACHER_PRAISE_IMPACT = IF (SES + teacher_praise = 0) OR (SES + teacher_praise = 2) THEN 100 ELSE 75
DOCUMENT: "High-SES students . . . do not require a great deal of . . . praise. Low-SES students . . . need more . . . praise for their work" (Brophy & Good, 1986, p. 365).

CORRECT_RESPONSE_FEEDBACK_IMPACT = GRAPH(correct_response_feedback)
(0.00, 0.00), (4.76, 3.50), (9.52, 6.00), (14.3, 8.50), (19.0, 11.0), (23.8, 13.0), (28.6, 15.0), (33.3, 17.5), (38.1, 21.0), (42.9, 25.5), (47.6, 31.0), (52.4, 38.5), (57.1, 46.5), (61.9, 56.0), (66.7, 66.5), (71.4, 74.5), (76.2, 83.5), (81.0, 92.0), (85.7, 100), (90.5, 100), (95.2, 100), (100.0, 93.5)
DOCUMENT: "Correct responses should be acknowledged as such, because even if the respondent knows that the answer is correct, some of the onlookers may not. Ordinarily (perhaps 90% of the time) this acknowledgement should take the form of overt feedback" (Brophy & Good, 1986, p. 362).
delay_in_adjusting = GRAPH(collective_responses / Quality_of_Teacher_Reactions)
(0.9, 0.35), (1.00, 2.00), (1.10, 10.0)
DOCUMENT: The graph that is drawn makes the adjustment "sticky" upward and "slippery" downward.

IMPACT_PRAISE = GRAPH(GRADE_LEVEL)
(0.00, 1.00), (1.20, 1.00), (2.40, 1.00), (3.60, 1.00), (4.80, 1.00), (6.00, 1.00), (7.20, 0.985), (8.40, 0.965), (9.60, 0.93), (10.8, 0.88), (12.0, 0.795)
DOCUMENT: "Praise and symbolic rewards that are common in the early grades give way to the more impersonal and academically centered instruction common in the later grades" (Brophy & Good, 1986, p. 365).

INCORRECT_RESPONSE_IMPACT = GRAPH(incorrect_response_feedback)
(0.00, 0.00), (10.0, 0.00), (20.0, 8.00), (30.0, 11.5), (40.0, 16.5), (50.0, 23.5), (60.0, 29.5), (70.0, 38.5), (80.0, 53.0), (90.0, 100), (100, 100)
DOCUMENT: "Following incorrect answers, teachers should begin by indicating that the response is not correct. Almost all (99%) of the time, this negative feedback should be simple negation rather than personal criticism, although criticism may be appropriate for students who have been persistently inattentive" (Brophy & Good, 1986, p. 364).
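The delay_in_adjusting graph above is worth unpacking: because the delay depends on the ratio collective_responses / Quality_of_Teacher_Reactions, the stock rises slowly (a long delay when the ratio exceeds 1, "sticky" upward) and falls quickly (a short delay when it is below 1, "slippery" downward). A Python sketch of my own illustrates this, assuming NumPy for the clamped linear lookup; note the model initializes the stock at 0, so this sketch assumes a positive level to keep the ratio defined.

```python
import numpy as np

# Graph points copied from delay_in_adjusting above.
RATIO_PTS = ([0.9, 1.0, 1.1], [0.35, 2.0, 10.0])

def step_reactions(level, collective_responses, dt=1.0):
    """change_in_reactions = (collective_responses - level) / delay_in_adjusting,
    where the delay is a graphical function of the target/level ratio."""
    delay = np.interp(collective_responses / level, *RATIO_PTS)
    return level + (collective_responses - level) / delay * dt
```

With a level of 80 and a target of 100, the ratio 1.25 clamps to the long delay of 10, so the level creeps up by only 2 per step; with a level of 100 and a target of 80, the short delay of 0.35 lets the level plunge, the asymmetry the DOCUMENT note describes.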
Student

Current_Student_Achievement(t) = Current_Student_Achievement(t - dt) + (achievement_change) * dt
INIT Current_Student_Achievement = 0

achievement_change = behavior_impact - Current_Student_Achievement

behavior_impact = MEAN(Level_of_Independence, Management_of_Response_Opportunities, Quality_of_Structuring, Quality_of_Teacher_Reactions, (Teacher_Expectation * GRADE_&_EXPECTATIONS))

GRADE_LEVEL = 6
DOCUMENT: Input: Choose a number from 0 (kindergarten) to 12

SES = 0
DOCUMENT: Input: 0 = low socioeconomic status (SES) or dependent/anxious; 1 = high SES or assertive/confident

GRADE_&_EXPECTATIONS = GRAPH(GRADE_LEVEL)
(0.00, 0.8), (1.20, 0.875), (2.40, 0.925), (3.60, 0.96), (4.80, 0.99), (6.00, 1.00), (7.20, 1.00), (8.40, 1.00), (9.60, 1.00), (10.8, 1.00), (12.0, 1.00)
DOCUMENT: ". . . in the later grades . . . it becomes especially important to be clear about expectations . . ." (Brophy & Good, 1986, p. 365).
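The Student sector above is the point where the subsystems converge: achievement chases the unweighted mean of the five sector outputs, with teacher expectation scaled by the grade-level graph. A minimal Python sketch of my own (not the dissertation's code) makes the aggregation explicit:

```python
from statistics import mean

def behavior_impact(independence, response_mgmt, structuring, reactions,
                    expectation, grade_expectations=1.0):
    """behavior_impact = MEAN(Level_of_Independence,
    Management_of_Response_Opportunities, Quality_of_Structuring,
    Quality_of_Teacher_Reactions, Teacher_Expectation * GRADE_&_EXPECTATIONS)."""
    return mean([independence, response_mgmt, structuring, reactions,
                 expectation * grade_expectations])
```

Because the mean is unweighted, each sector contributes at most 20 points of drag: even a perfect classroom on four dimensions cannot fully offset a collapsed fifth one, which is how the model expresses the interdependence argued for in the conclusions.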
Teacher

Average_Management_Level(t) = Average_Management_Level(t - dt) + (change_in_managing) * dt
INIT Average_Management_Level = 0

change_in_managing = (EXPERIENCE_IMPACT - Average_Management_Level) * organizational_skills

Teacher_Expectation(t) = Teacher_Expectation(t - dt) + (change_in_pcvd_ability) * dt
INIT Teacher_Expectation = 0
DOCUMENT: "Achievement is maximized when teachers . . . expect their students to master the curriculum" (Brophy & Good, 1986, p. 360). "Early in the year teachers form expectations about each student's academic potential and personality. . . . If the expectations are low . . . the student's achievement and class participation suffer" (Dunkin, 1987, p. 25).

change_in_pcvd_ability = ((ACADEMIC_POTENTIAL_INDICATOR * 25) - Teacher_Expectation)

ACADEMIC_POTENTIAL_INDICATOR = 4
DOCUMENT: Input: Grade point average (GPA) from previous year: 4 = A; 3 = B; 2 = C; 1 = D; 0 = F
organizational_skills = 1
DOCUMENT: Input: Choose 1 for a high amount of organization, .5 for a fair amount of organization, or 0 for not much organization

years_experience = 10
DOCUMENT: Input: Select amount of experience in years from 0 to 10

EXPERIENCE_IMPACT = GRAPH(years_experience)
(0.00, 0.00), (1.00, 3.50), (2.00, 13.5), (3.00, 27.0), (4.00, 52.5), (5.00, 71.0), (6.00, 83.0), (7.00, 90.0), (8.00, 95.5), (9.00, 98.5), (10.0, 100)
DOCUMENT: ". . . the majority of teachers solved only 5 of the original 18 teaching problems of first-year teachers in fewer than 3 years. Several years may be required for teachers to solve problems such as classroom management and organization" (Alkin, 1992, p. 1382).

MANAGEMENT_IMPACT = GRAPH(Average_Management_Level)
(0.00, 0.5), (10.0, 40.0), (20.0, 61.0), (30.0, 73.5), (40.0, 83.0), (50.0, 87.0), (60.0, 91.0), (70.0, 94.0), (80.0, 95.5), (90.0, 98.0), (100, 100)
DOCUMENT: "Students learn more in classrooms where teachers establish structures that limit pupil freedom of choice, physical movement, and disruption, and where there is relatively more teacher talk and teacher control of pupils' task behavior" (Brophy & Good, 1986, p. 337).
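In the Teacher sector above, experience sets a ceiling on management via the EXPERIENCE_IMPACT graph, while organizational_skills (0, .5, or 1) scales how much of the remaining gap is closed each step. A Python sketch of my own (assuming NumPy for the graph lookup, not the dissertation's code):

```python
import numpy as np

# Graph points copied from EXPERIENCE_IMPACT above.
EXPERIENCE_PTS = ([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
                  [0, 3.5, 13.5, 27.0, 52.5, 71.0, 83.0, 90.0, 95.5, 98.5, 100])

def step_management(level, years_experience, organizational_skills, dt=1.0):
    """change_in_managing = (EXPERIENCE_IMPACT - level) * organizational_skills."""
    target = np.interp(years_experience, *EXPERIENCE_PTS)
    return level + (target - level) * organizational_skills * dt
```

A highly organized ten-year veteran reaches the ceiling of 100 in a single dt = 1 step, a fairly organized five-year teacher closes half the gap toward 71 each step, and a teacher with no organizational skills never moves regardless of experience.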
VITA

Jorge O. Nelson was born in Vancouver, WA, on September 28, 1957. He attended elementary, junior high, and senior high school in Fremont, Nebraska, graduating in 1975. He graduated from the Harry Lundeberg School of Seamanship, Piney Point, MD, in 1978 as an ordinary seaman. Following a short tour of duty in the U.S. Merchant Marine, he entered Tacoma Community College, Tacoma, WA, in 1980 and graduated with an Associate in Arts and Sciences in 1983. He entered The Evergreen State College, Olympia, WA, in 1983 and graduated in 1985 with a Bachelor of Arts, with a major in Elementary Education and a minor in Drama. He received his teaching credential from the University of Puget Sound, Tacoma, WA, in 1985. In 1985 he started his teaching career in a self-contained sixth-grade class at the International School Bangkok, Thailand. He was hired by the International School Islamabad, Pakistan, in 1987 to teach middle and high school technology education. He enrolled in a degree program through Michigan State University in 1985 and graduated with a Master of Arts in Curriculum and Teaching in 1988. In 1990, he began his administrative work at the American School of Asunción, Paraguay, as Assistant Director. In 1992, after receiving a doctoral fellowship sponsored by the Office of Overseas Schools, U.S. Department of State, and The University of Memphis, he moved to Memphis, TN, and enrolled as a doctoral student in the College of Education, Department of Leadership. He was awarded the Outstanding Student Award for Scholarship, Professional Accomplishment, and Commitment in Educational Leadership in 1994 and graduated with an Ed.D. in Administration and Supervision in 1995.

He is presently employed as the Director of the American School of Durango, México.