Gdp2 2013 14_3

  • In the previous course we have seen that children possess several skills that lead some (e.g. Alison Gopnik) to theorize that they are like scientists in the crib: not only are they capable of extracting causal relations, and sensitive to them; they are curious and in search of explanations; and they perform complex observations and simple experiments.

Babies as scientists

In addition to being recalled in theories of conceptual change, the existence of folk, naïve, pre-instructional explanations and ideas about the child's natural and social environment has been used to defend the idea that the child, and even the baby, is more intelligent than we would think (e.g. by developmental psychologists like Paul Bloom) - and even that the child and the baby are scientists. Alison Gopnik in particular has defended the idea that young children think just like scientists: they test their hypotheses against data, set up experiments, extract conclusions that include causality, and make use of Bayesian inference in order to assess the probability of hypotheses. The study of how babies think and the study of how scientists think thus converge. This kind of theory defies the image of the cognitive development of the child that has its origin in the work of Jean Piaget (still largely dominant in the psychology of education, yet often overtaken by recent research): the child as illogical, irrational, incapable of abstraction and of drawing causal links until the end of primary school. The last ten years of research in developmental psychology have changed this picture while strengthening another idea shared by Piaget: the refusal to treat the brain of the learner, at any moment of his life, as an empty box, a tabula rasa without knowledge or constraints.
In fact, even more than Piaget thought, the brain of the human newborn comes with a number of dispositions and forms of knowledge, and quickly develops capacities as complex as extracting statistical regularities from the environment, attributing causes to events, giving meaning to the environment in terms of objects separated from each other and obeying specific physical laws, distinguishing animate from inanimate objects on the basis of their behavior, and assigning intentions, thoughts, and desires to moving entities. Other forms of pre-instructional knowledge and early capacities that are shared by humans and develop precociously are: the special treatment of faces, the Theory of Mind or folk psychology, some form of folk sociology that leads us to treat kin and in-groups differently from out-groups, a natural number sense, and face recognition. All this gives rise to expectations that can be revealed by placing the baby, or the child who has not yet developed language, in situations that contradict these expectations (if they exist). The reaction of surprise at the "anomaly" reveals the expectations, and therefore the knowledge and thought processes, entertained by the very young. All these capacities and forms of knowledge undermine the myth of the blank slate. They are considered to be modular (independent from one another) and are clearly content-specific capacities, recruited in the solving of content-specific problems, so that they also undermine the myth of the mind-brain as dominated by general capacities. In other words, their existence reinforces the evolutionary-psychology model of the mind as modular and operating through domain-specific mechanisms that have been selected by evolution for their adaptive value in relation to separate problems posed by the environment.

The baby and science, or: to play is to do science

For Gopnik, recent empirical research (from the 1990s on) reveals that children's play is equivalent to real experimentation.
More generally, children deploy three operations typical of professional scientists: analysis of statistical patterns, experimentation, and learning from the data and theories of other scientists. First, children are sensitive to statistical patterns and use them to infer causes, mental or otherwise. Then they conduct experiments and use Bayesian inference: like scientists, they do not limit themselves to recording and collecting data from the environment, but test hypotheses and evaluate the data in the light of these assumptions, in order to identify the causes of certain events. Finally, as "other scientists" do, children learn by imitating others and can learn causal relations by observing what others are doing and what effects it produces; they are also sensitive to the intentions of others, especially when others are preparing to teach them something. Even in this case, they do not limit themselves to copying, but use a probabilistic (Bayesian) model. The result of all these operations is not just intuitions, but true theories concerning the environment (Gopnik defends a "theory theory" of conceptual change).

Practical implications for education

This view has, according to Gopnik, practical implications for education. First, the probabilistic model of learning reflects both the stability of certain ideas acquired early (those having received the most empirical support in spontaneous exploration) and the possibility of changing these ideas (through repeated exposure to data that force a recalculation of the likelihood of a certain event). Then, the wealth of exploratory activity that the child performs naturally in the environment is an argument against early schooling, or school-like kindergartens: the child is already learning, developing inferences and statistical analyses through free exploration in free environments, so the best choice for the youngest is to guarantee an environment that allows them to generate their own hypotheses.
Teaching that is too explicit, too early, may limit the number of hypotheses that children are spontaneously willing to consider. Even in school, according to Gopnik, children should (also) be left free to explore by themselves, creating a balance between teacher-led activities and activities where the child leads his own explorations. In other words, even school children should be left free to do science, not only to learn about science. It is in the light of the continuity between children's ways of thinking and learning and scientific thought that science should be taught in school. Gopnik is aware, however, that children as "natural scientists" do not understand the process of science as such, but she thinks that these processes should be taught in the framework of experiments and observation rather than through explicit teaching. The theory of "the child as a scientist" is conducive to an educational approach generically described as inquiry-based, or as investigation, exploration and discovery. Recent research thus seems to provide a scientific basis for the educational approach of teaching science through inquiry. But our new understanding of children's scientific intuitions should help us go beyond a simple emphasis on active investigation: it should lead us to science-based science education, in which science (of the mind, brain, and behavior) itself helps us translate the natural curiosity and intelligence of young children into the best systems for teaching and learning science.
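The probabilistic learning that Gopnik attributes to children can be illustrated with a minimal Bayesian update. The scenario and all numbers below are invented for illustration (a toy machine and two competing hypotheses about which block activates it); this is only a sketch of the inference pattern, not a model from Gopnik's own work.

```python
# Toy illustration of Bayesian hypothesis evaluation: two hypotheses
# about which block activates a machine, updated after each observed trial.

def bayes_update(priors, likelihoods, observation):
    """Return posterior P(hypothesis | observation) for each hypothesis."""
    unnormalized = {h: priors[h] * likelihoods[h][observation] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# P(outcome | hypothesis) when the red block is placed on the machine
# (made-up numbers for the sketch)
likelihoods = {
    "red works":  {"lights": 0.9, "silent": 0.1},
    "blue works": {"lights": 0.2, "silent": 0.8},
}

beliefs = {"red works": 0.5, "blue works": 0.5}  # no initial preference
for obs in ["lights", "lights", "silent"]:       # three trials with the red block
    beliefs = bayes_update(beliefs, likelihoods, obs)

print(beliefs)  # belief in "red works" now dominates despite one failure
```

The one "silent" trial lowers but does not overturn the favored hypothesis, which mirrors the point made above: early ideas with strong empirical support are stable, yet remain revisable under repeated contrary data.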
The baby as a scientist: a controversial idea

While the idea of babies as having ideas about the natural, psychological, and social world (whether these are called "naive", "intuitive", or "preconceptions"), and the idea of the child's brain as prepared to perform complex thought processes, including inferences, are now dominant in developmental psychology, the idea of "the child as a scientist" is more controversial. For example, Gopnik believes that the ideas held by children (intuitions or "naive beliefs") are more than fragmented ideas developed in response to specific contexts: they are real theories, like those held by scientists, while not necessarily correct. We have discussed this idea above, and shown that it can be criticized. Also, Gopnik speculates that children may not only be as "smart" as adult scientists, but even smarter: children are more open-minded and would be better than adults at imagining improbable hypotheses and new forms of exploration. The image of science and scientific thinking that Gopnik puts forward relegates to the background an important component of the scientific enterprise, one that is actually part of the process of scientific discovery and of science as a cultural endeavor: "critical thinking" and the establishment of criteria and of experimental controls that are not natural for the human mind and that help master the natural biases of observation and intuition. In a sense, if the description that Gopnik provides of children's thought processes is a fair account of their complexity, the image it gives of scientific thought seems at the same time oversimplified. Children also possess different forms of knowledge: innate principles for organizing their experiences, and beliefs, concepts, and intuitions about the physical, biological, and psychological domains of knowledge, as well as about mathematics, geometry and space.
Confirmation of the existence of innate beliefs and early concepts related to physical phenomena has come from developmental psychology, and in particular from the study of babies through what is known as the "habituation paradigm": young pre-linguistic children are exposed to a variety of physical or psychological phenomena and their reactions to change are observed; in this way it is possible to decide whether the child distinguishes between two situations, or whether one situation looks more familiar, and thus unsurprising, to the child. Renée Baillargeon, Elizabeth Spelke, and Susan Carey, among others, have thus been able to unmask a number of expectations about the physical world that babies, and then children and pupils, seem to hold or form spontaneously. These intuitions form a useful and solid background against which the young child can interpret what happens in her environment and try to anticipate it or react appropriately. They are a head start. It is plausible that some of them have been forged through natural selection - as adaptations, or at least as by-products, exaptations, spandrels (Buss et al. 1998). This might be the case, for example, for the early capacity of parsing the flux of stimuli into discrete objects, or for the irresistible tendency to see patterns, correlations, and even causality in uncorrelated phenomena. Others are possibly the fruit of early observation, and then of the natural tendency to observe, inquire, and interpret according to categories and causality; others still, of imitation. And so on: natural mechanisms of learning, representation and interpretation endow us with the capacity of quickly filling our mindset with various forms of content, more or less constrained and shaped by "irresistible" biases. Dualism seems to be one such bias. So even what looks like just "culture" at first glance reveals its natural - evolutionary, developmental - roots at a closer look (Barkow, Cosmides & Tooby 1992).
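The logic of the habituation / violation-of-expectation method can be sketched numerically. The looking times below are entirely made up for illustration; the point is only the inference pattern used in this literature: reliably longer looking at the "impossible" event is taken as evidence that the infant held the expectation it violates.

```python
# Sketch of the looking-time logic behind the habituation paradigm,
# with hypothetical data (seconds of looking per infant).
from statistics import mean, stdev
from math import sqrt

possible_event   = [4.1, 3.8, 5.0, 4.4, 3.9, 4.6]  # hypothetical looking times
impossible_event = [7.9, 6.5, 8.2, 7.1, 6.8, 7.5]  # hypothetical looking times

def welch_t(a, b):
    """Welch's t statistic for two independent samples (b minus a)."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(b) - mean(a)) / sqrt(var_a / len(a) + var_b / len(b))

t = welch_t(possible_event, impossible_event)
print(f"mean looking: {mean(possible_event):.1f}s (possible) vs "
      f"{mean(impossible_event):.1f}s (impossible), t = {t:.2f}")
```

A large positive t on data like these is what, in the actual studies, licenses the claim that the infants were surprised, and hence that they held the expectation being tested.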
This does not mean that these natural endowments of the human mind are for the good, or only for the good. As a matter of fact, as much as they reveal the richness of the mind and the early competences of the child, they also point at the intrinsic limits and obstacles that our history interposes between us and reasoning, or science. These intuitions give children a head start when it comes to understanding and learning about objects and people. But they also sometimes clash with scientific discoveries about the nature of the world, making certain scientific facts difficult to learn. As Susan Carey once put it, the problem with teaching science to children is "not what the student lacks, but what the student has, namely alternative conceptual frameworks for understanding the phenomena covered by the theories we are trying to teach" (Bloom & Weisberg 2007). For example, innate dualism, which helps distinguish animate from inanimate objects, later clashes with neuroscience, because the latter treats the human mind as a manifestation of the human brain, an evolved organ, and does not require an independent, immaterial soul. On the contrary, children and adults "naturally" tend to treat themselves as using their brain, as being separate from the organ of thought, and to treat human actions as commands that "we" give to our bodies. Memories, thoughts, and desires are considered part of an immaterial world, other than the physical-biological one. We are Cartesians (Bloom 2004). And this explains the belief - present in every culture - that souls can survive death. We also tend to treat inanimate objects in teleological terms, as being for a purpose and designed for that purpose: a vision that fits well with creationism or "intelligent design".
As underlined by Pascal Boyer, the interesting thing is that nobody seems to be bothered by the resulting gap between our spiritual mind and our physical actions - nobody but philosophers, who take the naïve intuition very seriously. Apparently, our naïve ideas and intuitions can generate contradictions, conflicts, and gaps but, unless we are philosophers, we don't care. Paul Bloom's arguments also point to the extraordinary mystery of our ordinary thoughts, to the fact that commonsense thinking is anything but banal: Consider this: because we are dualists, we think we have (or rather we are) an immaterial mind that floats about with no physical implementation. We think of our brains as something we use. Now we also think of our bodies as things that are governed by our thoughts: when I want to raise my hand, lo and behold, it does rise. How is that possible? How could our thoughts have an influence on meat and bone stuff? The interesting thing about that is not the question itself (it is entirely created by dualism and once you abandon dualism it vanishes) but the fact that no-one is bothered by the question. That is, there is a wide gap in our commonsense world-view, and unless we are born or made philosophers, we just don't care. (Boyer 2007, http://www.edge.org/discourse/natural-born.html#boyer)

In summary: the human brain-mind is not a blank slate - not even before the first year of life. Children possess a rich understanding of the physical, biological, and psychological phenomena that surround them. Innate (or at least pre-instructional) and intuitive (or at least observational) "knowledge" about relevant phenomena of daily life matters for our understanding of scientific facts as presented during formal instruction.
In fact, our "commonsense world-view" can be invoked to explain general resistance to science, the difficulty of acquiring correct scientific knowledge in certain domains, and a specific phenomenon known as "conceptual change". Our commonsense world-view can withstand conflicts and gaps - even internal gaps - when not challenged by instruction or otherwise. These intuitions and pre-instructional conceptions have been invoked to explain the difficulty of learning science in school (case 3): scientific facts land on ground that is not virgin, and previous conceptions have to be dismantled before new ones can be sown and bear fruit. Thus, learning scientific facts for which previous ideas exist - somehow, somewhere - in the mind of the learner demands more than good explanations. It requires a real change of perspective: a conceptual change.
  • Cultural factors (religious indoctrination and insufficient exposure to scientific facts) do not tell the entire story. This is just a theory, but it is supported by the existence of several examples of resistance to science that occur in the absence of contrary indoctrination and in the presence of exposure to scientific facts. It appears plausible, then, that even the "socio-cultural forms of resistance" have natural bases - they are rooted in our mindset. According to Bloom: The main source of resistance to scientific ideas concerns what children know prior to their exposure to science. The last several decades of developmental psychology has made it abundantly clear that humans do not start off as "blank slates." Rather, even one-year-olds possess a rich understanding of both the physical world (a "naïve physics") and the social world (a "naïve psychology"). Babies know that objects are solid, that they persist over time even when they are out of sight, that they fall to the ground if unsupported, and that they do not move unless acted upon. They also understand that people move autonomously in response to social and physical events, that they act and react in accord with their goals, and that they respond with appropriate emotions to different situations. (Bloom & Weisberg 2007)
  • Intuitions/naïve beliefs, core knowledge/skills: they provide an early understanding of the natural world and of others' minds; they are partly inherited, the product of natural selection (or exaptations, spandrels…), and partly acquired through early observation and imitation, for which a predisposition for observation and a sensitivity to certain stimuli are required.
  • An example of folk or naïve knowledge is represented by the work of Scott Atran and Douglas Medin, or of Inagaki and Hatano, on folk or naïve biology: What are the components of children's biological-knowledge system before systematic teaching at school? Can this knowledge system be called naive biology? We propose that young children's biological-knowledge system has at least two essential components - (a) the knowledge needed to identify biological entities and phenomena and (b) teleological and vitalistic causality - and that these components constitute a form of biology. We discuss how this naive biology serves as the basis for performance and learning in socially and culturally important practices, such as health practices and biology instruction. (Inagaki & Hatano 2006, p. 78) Other examples are those cited above as persistent "mistakes" in physics, astronomy, biology, and psychology. They are all rooted in some form of folk or naïve knowledge, in pre-instructional representations that tend not to go away with age and to resist instruction. This description refers to a notion that is widely diffused in the psychology of education and in the developmental psychology of school-age children: the notion of misconception, or preconception, and the related notion of conceptual change (introduced by Susan Carey).
  • The notion of conceptual change, and research on it, start in the late 1970s with instructional problems in mind, but on the trail of recent changes in the panorama of the philosophy of science (Nersessian 1992). Conceptual change in science has been one of the major topics of the philosophy of science in the twentieth century. In the mid-1960s the image of conceptual change as continuous and cumulative is strongly criticized by the critics of positivism (Feyerabend, Kuhn), and the notions of revolution and incommensurability are introduced. These are subsequently more or less abandoned in favor of the search for reasons for choosing among competing theories. The question of conceptual change has subsequently taken an important place in science education, with cognitive psychologists arguing that conceptual change in science learning is similar to scientific revolutions and tracing analogies between history and the individual learning of science. Nersessian recasts the analogy in the opposite direction: ordinary thinking is not a copy of scientific thinking and its history; rather, scientific thinking and its historical changes are on a continuum with ordinary thinking and problem solving. At the same time, the study of the history of (scientific) thinking constitutes a repository of cognitive strategies that are used by scientists as humans. Conceptual change is also one of the best examples of the fact that the relationship between cognitive science and education can be bi-directional, because educational research can be as informative for the understanding of the human mind as cognitive science can be for education. It strongly characterizes research in the learning sciences until the 1990s, then fades without disappearing. Its framework is the effort to reform science and mathematics education in the US, following the success of the Russian space race (Sputnik, 1957).
In 1959 the National Science Foundation and the American Academy of Science commission the psychologist Jerome Bruner to organize a ten-day conference on curriculum. The book that Bruner publishes the next year, summarizing the contents of the conference (Bruner 1960), starts the Discovery Learning movement in science, while the New Math reform takes place in mathematics education. In 1969, the images of Neil Armstrong walking on the Moon cool American worries about their science program, and funds drop until post-industrial market competition raises the problem again in 1983 (see the report A Nation at Risk, ANAR). Immediately before the raised awareness expressed by ANAR, science educators and psychologists express their discontent with the science education reform of the 1960s: The last concerted national initiative to improve math and science education was in the 1960s, in response to Sputnik. Prominent mathematicians and scientists joined forces with educators to analyze core concepts in mathematics and the sciences, to work out a coherent timetable for developing these concepts, and to work out many innovative curricular approaches for meeting this timetable. Despite this massive effort, math and science instruction in this country is now in a crisis. Many of the reasons for this have nothing to do with shortcomings of the materials developed under the 1960s initiative, but there was one crucial shortcoming, with vast implications for the art and practice of educating our youngsters. Simply put, in the 1960s, educators and psychologists misanalyzed the very problem math and science education must solve. (Carey 2000) Educators notice that learning and teaching in certain areas is especially difficult: there are topics that are "systematically" difficult for students. This happens mostly in the sciences - at every level of instruction, from elementary school to university.
Put differently: certain concepts that are part of science instruction meet a resistance in the mind of the learner that is not experienced with other topics. But is it because they are "intrinsically" difficult? It is observed that learners are not simply incapable of acquiring the right knowledge and understanding the correct explanation. In the classic "tossed coin" problem (what are the forces that act on a tossed coin as it rises and falls?), students actively propose alternative, incorrect explanations - and, curiously, these explanations often coincide with explanations that have had a place in the history of science. So students do not lack explanations, nor do they simply fail at acquiring explanations: they have bad explanations, or misconceptions, before formal instruction reaches them. Before considering how to propose the good explanations, educators thus have to deal with misconceptions. John Clement (1982) was one of the first to study how we reason about the coin problem. He found that what leads us to choose … is the misconception that flipping a coin gives it an impetus. On the upward path, we reason, the impetus gradually diminishes, until it becomes less than the force of gravity and the coin falls. According to Newton, however, once we toss the coin, it would continue on a straight-line path indefinitely unless an unbalanced force acts on it. The only force acting on it is gravity. Gravity causes the coin to slow down until it reaches the top of its trajectory, and then to speed up as it falls back to the ground. Clement found that only 12% of students in a first-year engineering course, all high school physics graduates, answered the coin problem correctly. (Bruer 1993, p. 130) Michael McCloskey (1983a, b) did a series of studies that became perhaps the most famous of all misconceptions studies. He claimed that students entered physics with a remarkably coherent and articulate theory that competed directly with Newtonian physics in instruction.
The naïve theory, in fact, was very nearly the novice explanation of the toss … Within McCloskey's theory theory, he also proposed a strong content connection to medieval scientists' ideas, such as those of John Buridan and early claims of Galileo. (diSessa 2006, pp. 7-8) Psychologists have joined educators on the "misconception trail", especially psychologists with a Piagetian view. In the Piagetian view, the mind is different from a blank slate where knowledge accumulates; moreover, knowledge is seen by Piaget as the product of a double movement of accommodation and assimilation of new information into previous structures, in the light of previously acquired knowledge. That is why children do not differ from adults only in the quantity of things they know, but in the quality of their thinking processes. This view has been expanded and has become a movement in psychology and in education: constructivism. In the Piagetian view, any kind of knowledge is acquired through a process that is much more active and complex than accumulation. However, not all knowledge seems to be equal - and this is a pragmatic consideration. Only certain concepts offer special resistance to acquisition, or at least more resistance than others, and science is the domain with the biggest troubles. Research on conceptual change has thus taken a distance from the Piagetian view of intelligence as a general capacity, and adhered to a more domain-specific view of learning and knowledge. Psychologists who have explored the "conceptual change problem" have also taken inspiration from the notion of scientific revolution proposed by Thomas Kuhn in the history of science.
Scientific concepts do not simply accumulate one upon the other, creating a linearly growing pile of scientific progress; rather, science endures radical transformations: once a worldview - a paradigm - is established, change from this worldview to another comes at the price of rejecting the old one and radically changing the global paradigm. This happens when new ideas come to form a coherent unit that contradicts the previous one. At this point everything changes and the two paradigms become incommensurable, not only at the level of their conceptual contents, but in how concepts are produced and evaluated. Before and after a revolution, the world is no longer the same for science. Susan Carey (1991, 1999) was one of the earliest and most consistent in citing Kuhn's ideas in the context of children's conceptual change. She has systematically used the idea of incommensurability between conceptual systems as a primary index of conceptual change ("deep restructuring"). Incommensurability distinguishes conceptual change from "enrichment" (adding new ideas or beliefs) or even mere change of beliefs. (diSessa 2006, p. 7) All good teachers have always realized that one must start "where the student is." Since the 1960s, we have come to a completely new understanding of what this means. Back then, it was defined in terms of what the student lacked, and this was seen as a lack of science content knowledge, combined with age-related limitations in general cognitive capacities (e.g., the elementary school child is a concrete thinker not capable of abstract reasoning). Now we understand that the main barrier to learning the curricular materials we so painstakingly developed is not what the student lacks, but what the student has, namely, alternative conceptual frameworks for understanding the phenomena covered by the theories we are trying to teach.
Often these conceptual frameworks work well for children, so we face a problem of trying to change theories and concepts. (Carey 2000)
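The Newtonian answer to the coin problem quoted from Bruer above (after the toss, gravity is the only force, so the coin slows on the way up and speeds up on the way down, with no stored "impetus") can be checked with a minimal numerical sketch. The initial speed is arbitrary and the time step is chosen for illustration.

```python
# Numerical sketch of the tossed coin after release: the only
# acceleration acting is gravity, downward, the whole time.
G = 9.81    # gravitational acceleration, m/s^2
DT = 0.001  # time step, s

v = 5.0     # initial upward speed, m/s (arbitrary choice)
y = 0.0     # height above the release point, m
speeds = []
while y >= 0.0:
    speeds.append(abs(v))
    v -= G * DT          # gravity decelerates the rise, accelerates the fall
    y += v * DT

top = speeds.index(min(speeds))  # slowest point = top of the arc
print(f"speed falls until the top of the arc (t ≈ {top * DT:.2f}s), then rises again")
```

The recorded speeds decrease monotonically to near zero at the apex (at roughly v0/g seconds) and then grow again, exactly the profile the impetus misconception fails to predict: nothing "runs out" on the way up except upward velocity.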
  • The notion and research on conceptual change starts in the late 1970s with instructional problems in mind, but on the trail of recent changes in the panorama of the philosophy of science (Nersessian 1992). Conceptual change in science has been one of the major topics of the philosophy of science in the XX century. In the mid-1960s the image of conceptual change as  continuous and cumulative is strongly criticized by the critics of positivism (Feyerabend, Kuhn) and the notions of revolution and incommensurability are introduced. They are successively more or less abandoned in favor of  the search for reasons for choosing among competing theories. The question of conceptual change has successively taken an important place in  science education, with cognitive psychologists arguing that conceptual change in science learning is similar to scientific revolutions and tracing analogies between history and individual learning of science. Nersessian recasts the analogy in the opposite direction: ordinary thinking is not a copy of scientific thinking and its history; rather, scientific thinking and its historical changes are on a continuum with ordinary thinking and problem solving. At the same time, the study of the history of (scientific) thinking constitutes a repository of cognitive strategies that are used by scientists as humans.  Conceptual change is also one of the best examples of the fact that the relationship between cognitive science and education can be bi-directional, because educational research can be informative for the understanding of the human mind as much as cognitive science can be for education. It strongly characterizes research in the learning sciences until the 1990s, then fades without disappearing. Its framework is represented by the effort of reforming science and mathematics education in the US, following the success of the Russian space race (Sputnik: 1957). 
In 1959 the National Science Foundation and the American Academy of Science mandate the psychologist Jerome Bruner of the organization of a ten-days conference on curriculum. The book that Bruner publishes the next year, summarizing the contents of the conference (Bruner 1960) starts the movement of Discovery Learning in science, while the new maths reform is taking place in mathematics education. In 1969, the images of Neil Armstrong walking on the Moon cool the worries of the Americans for their science program and funds drop until post-industrial age market competition will raise again the problem in 1983 (see the report: A Nation at Risk, ANAR). Immediately before the raising of consciousness expressed by ANAR, science educators and psychologists express their discontent with the science education reform of the 60s: The last concerted national initiative to improve math and science education was in the 1960s, in response to Sputnik. Prominent mathematicians and scientists joined forces with educators to analyze core concepts in mathematics and the sciences, to work out a coherent timetable for developing these concepts, and to work out many innovative curricular approaches for meeting this timetable. Despite this massive effort, math and science instruction in this country is now in a crisis. Many of the reasons for this have nothing to do with shortcomings of the materials developed under the 1960s initiative, but there was one crucial shortcoming, with vast implications for the art and practice of educating our youngsters. Simply put, in the 1960s, educators and psychologists misanalyzed the very problem math and science education must solve. (Carey 2000) Educators notice that learning and teaching in certain areas is especially difficult: there are topics that are “systematically” difficult for students. It happens, mostly, in the domain of the sciences – at any level of instruction, from elementary to university. 
Put differently: certain concepts that are part of science instruction meet a resistance in the mind of the learner that is not experienced with other topics. But is it because they are “intrinsically” difficult? It is observed that learners are not simply incapable of acquiring the right knowledge and understanding the correct explanation. In the classic “tossed coin” problem (which forces act on a tossed coin in flight?), students actively propose alternative, incorrect explanations; curiously, these explanations often coincide with explanations that have had a place in the history of science. So students do not lack explanations, nor are they simply failing at acquiring explanations: they have bad explanations, or misconceptions, before formal instruction reaches them. Before considering how to propose the good explanations, educators thus have to deal with misconceptions. John Clement (1982) was one of the first to study how we reason about the coin problem. He found that what leads us to choose … is the misconception that flipping a coin gives it an impetus. On the upward path, we reason, the impetus gradually diminishes, until it becomes less than the force of gravity and the coin falls. According to Newton, however, once we toss the coin, it would continue on a straight-line path indefinitely unless an unbalanced force acts on it. The only force acting on it is gravity. Gravity causes the coin to slow down until it reaches the top of its trajectory, and then to speed up as it falls back to the ground. Clement found that only 12% of students in a first-year engineering course, all high school physics graduates, answered the coin problem correctly. (Bruer 1993, p. 130) Michael McCloskey (1983a, b) did a series of studies that became perhaps the most famous of all misconceptions studies. He claimed that students entered physics with a remarkably coherent and articulate theory that competed directly with Newtonian physics in instruction. 
The naïve theory, in fact, was very nearly the novice explanation of the toss … Within McCloskey’s theory theory, he also proposed a strong content connection to medieval scientists’ ideas, such as those of John Buridan and early claims of Galileo. (DiSessa 2006, p. 7-8) Psychologists have joined educators on the “misconception trail”, especially psychologists with a Piagetian view. In the Piagetian view, the mind is different from a blank slate where knowledge accumulates; moreover, knowledge is seen by Piaget as the product of a double movement of accommodation and assimilation of new information into previous structures, in the light of previously acquired knowledge. That is why children differ from adults not only in the quantity of things they know, but also in the quality of their thinking processes. This view has been expanded and has become a movement in psychology and in education: constructivism. In the Piagetian view, any kind of knowledge is acquired through a process that is much more active and complex than accumulation. However, not all knowledge seems to be equal, and this is a pragmatic consideration: only certain concepts offer special resistance to acquisition, or at least more resistance than others, and science is the domain with the biggest troubles. Research in conceptual change has thus taken a distance from the Piagetian view of intelligence as a general capacity, adhering to a more domain-specific view of learning and knowledge. The psychologists who have explored the “conceptual change problem” have also taken inspiration from the notion of scientific revolution proposed by Thomas Kuhn in the history of science. 
Scientific concepts do not simply accumulate one upon the other, creating a linearly growing pile of scientific progress; rather, science endures radical transformations: once a worldview — a paradigm — is established, change from this worldview to another comes at the price of rejecting the old one and radically changing the global paradigm. This happens when new ideas come to form a coherent unit that is in contradiction with the previous one. At this point everything changes and the two paradigms become incommensurable, not only at the level of their conceptual contents, but at the level of how concepts are produced and evaluated. Before and after a revolution the world is no longer the same for science. Susan Carey (1991, 1999) was one of the earliest and most consistent in citing Kuhn’s ideas in the context of children’s conceptual change. She has systematically used the idea of incommensurability between conceptual systems as a primary index of conceptual change (“deep restructuring”). Incommensurability distinguishes conceptual change from “enrichment” (adding new ideas or beliefs) or even mere change of beliefs. (diSessa, 2006, p. 7) All good teachers have always realized that one must start “where the student is.” Since the 1960s, we have come to a completely new understanding of what this means. Back then, it was defined in terms of what the student lacked, and this was seen as a lack of science content knowledge, combined with age-related limitations in general cognitive capacities (e.g., the elementary school child is a concrete thinker not capable of abstract reasoning). Now we understand that the main barrier to learning the curricular materials we so painstakingly developed is not what the student lacks, but what the student has, namely, alternative conceptual frameworks for understanding the phenomena covered by the theories we are trying to teach. 
Often these conceptual frameworks work well for children, so we face a problem of trying to change theories and concepts. (Carey 2000)
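The Newtonian account of the tossed-coin problem quoted above (after release, the only force is gravity) can be checked with a minimal numerical sketch. This is an illustrative aside, not part of the original studies; the launch speed and time step are arbitrary values chosen for the example:

```python
# Minimal sketch of the tossed-coin problem under Newtonian mechanics.
# Assumption (illustrative, not from the cited studies): after release the
# ONLY force acting on the coin is gravity, so acceleration is a constant -g.
g = 9.8      # m/s^2, gravitational acceleration
v0 = 4.9     # m/s, initial upward speed (arbitrary illustrative value)
dt = 0.001   # s, time step for Euler integration

v, y, t, peak = v0, 0.0, 0.0, 0.0
while y >= 0.0:
    v -= g * dt          # gravity slows the coin on the way up...
    y += v * dt          # ...and speeds it up on the way down
    t += dt
    peak = max(peak, y)

# Symmetric flight: no "leftover impetus" is needed; the coin returns
# with (almost) its launch speed under gravity alone.
print(round(t, 2), round(peak, 2), round(abs(v), 2))
```

The simulation reproduces the textbook kinematics (flight time 2·v0/g, apex v0²/2g), which is exactly the point of the Newtonian explanation: a single constant force accounts for both the deceleration upward and the acceleration downward, with no diminishing impetus required.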
  • In a certain “radical” view of conceptual change, possessing naïve beliefs and theories about how the physical or biological world works is a sort of paradigm that must be transformed through a revolution in order to acquire new, scientific concepts. This is why acquiring scientific concepts is hard and effortful: when an old concept is exchanged for a new one, a deep reorganization occurs. And this is how we recognize true conceptual change: new ideas are not simply more complex or richer than previous ones, they are radically different. This is also how ideas that will be difficult to learn can be identified: they are the ideas for which misconceptions exist that have to be changed in order to acquire new, scientifically correct knowledge. Where no misconception (or a whole nest of them, actually) is in place, there is no need for change, and learning is less problematic. The respective literature of science education, cognitive development, and the history of science are filled with examples of tenaciously held beliefs that seem bizarre from the standpoint of modern scientific literacy. Examples from cognitive development are cars are alive or air is immaterial. It seems unlikely that preschool children who insist that cars are alive could have the same concept of life as the adult and merely be mistaken about cars, and indeed, they do not. Rather, preschool children have constructed a very different theoretical framework from that held by adults, in which they have embedded their understanding of animals, just as children of elementary school age have constructed a different framework theory in which they embed their understanding of the material world. (Carey 2000) Many instructional consequences are extracted from this general model; namely, that only challenged beliefs can be changed, and that success consists in the successful replacement of one concept with another. 
The strong conceptual change model also implies that children possess structured, coherent concepts; according to some, structured concepts are proper theories, and the child is much like a real scientist, building theories from observation, hypotheses and experiments; according to others, concepts are embedded into larger ontologies, and it is at this level that the resistance can be strongest. Rational models hold that students, like scientists, maintain current ideas unless there are good (rational) reasons to abandon them. Posner, Strike, Hewson, and Gertzog (1982) established the first and possibly most important standard in rational models. They argued that students and scientists change their conceptual systems only when several conditions are met: (1) they became dissatisfied with their prior conceptions (experience a “sea of anomalies” in Kuhn’s terms); (2) the new conception is intelligible; (3) the new conception should be more than intelligible, it should be plausible; (4) the new conception should appear fruitful for future pursuits. (DiSessa, 2006, p. 8-9) The “traditional” approach to conceptual change is thus committed to three ideas:
- the idea of misconceptions (which block or filter correct conceptions, and which are coherent and organized in theory-like structures);
- the idea of transformation (a radical, non-cumulative change of perspective in which one concept is given up for another);
- the idea of conflict between old and new views, and of the experience of conflict as the necessary and sufficient condition for fueling the transformation.
In the broad educational experience, some topics seem systematically to be extremely difficult for students. Learning and teaching in these areas are problematic and present persistent failures of conventional methods of instruction. 
Many areas in the sciences, from elementary school through university level, have this characteristic, including, in physics: concepts of matter and density, Newtonian mechanics, electricity, and relativity; in biology: evolution and genetics. The name “conceptual change” embodies a first approximation of what constitutes the primary difficulty: students must build new ideas in the context of old ones, hence the emphasis on “change” rather than on simple accumulation or tabula rasa (“blank slate”) acquisition. Strong evidence exists that prior ideas constrain learning in many areas. The “conceptual” part of the conceptual change label must be treated less literally. Various theories locate the difficulty in such entities as “beliefs,” “theories,” or “ontologies,” in addition to “concepts.” Conceptual change contrasts with less problematic learning such as skill acquisition and acquisition of facts, where difficulty may be evident, but for more apparent reasons such as sheer mass of learning, or the necessity of practice to produce quick, error-free, highly refined performance. Conceptual change is among the most central areas in the learning sciences for several reasons. First, many of the most important ideas in science seem to be affected by the challenges of problematic learning. Conceptual change also engages some of the deepest, most persistent theoretical issues concerning learning. What is knowledge in its various forms? When and why is it difficult to acquire? What is deep understanding; how can it be fostered? Conceptual change is important not only to education, but also to developmental psychology, epistemology, and the history and philosophy of science. In the history of science, … (DiSessa, 2006) The “traditional” conceptual change view and line of research has received criticisms, and several modified versions exist. The first controversial topic concerns what changes: do children really possess theories? How structured are they?
  • What changes? The theory theory is the claim that children or beginning students have theories in very much the same sense that scientists have them. While this may have been inspired by the broader analogy with the history of science, it has often been invoked independent of content or process similarity. Carey has consistently advocated a version of the theory theory. With respect to another domain, theories of mind, Alison Gopnik (Gopnik & Wellman, 1994) strongly advocates the theory theory. Gopnik is fairly extreme in the parallelism she claims (while still admitting some differences between scientists and children, such as meta-cognitive awareness); others are more conservative in allowing such differences as limits in systematicity and breadth of application (Vosniadou, 2002). (DiSessa 2006, p. 7-8) Alison Gopnik has indeed defended the idea that scientific theory formation and cognitive development share deep similarities: in order to develop their knowledge, children use the same devices that adults use when doing science. Namely, both observe reality and form interpretative theories, make predictions coherent with the theory, perform experiments to test predictions, gather relevant evidence, compare predictions with evidence, and, when predictions and evidence are in conflict, seek alternative theories that do a better job, eventually replacing the old theory. Within the last ten years the idea that there are deep similarities between scientific theory formation and cognitive development, an idea we have called “the theory theory” has become, at the least, a serious developmental hypothesis. The cognitive abilities involved in science do seem to also be involved in everyday cognitive development. The basic idea is that children develop their everyday knowledge of the world using the same cognitive devices that adults use in science. In particular, children develop abstract, coherent, systems of entities and rules, particularly causal entities and rules. 
That is, they develop theories. These theories enable children to make predictions about new evidence, to interpret evidence, and to explain evidence. Children actively experiment with and explore the world, testing the predictions of the theory and gathering relevant evidence. Some counter-evidence to the theory is simply reinterpreted in terms of the theory. Eventually, however, when many predictions of the theory are falsified, the child begins to seek alternative theories. If the alternative does a better job of predicting and explaining the evidence it replaces the existing theory. (Gopnik 2003, p. 3-6) Others have defended weaker views about how structured the conceptions children hold really are. According to Stella Vosniadou, the acquisition of knowledge about the physical world is constrained by the presence of framework theories that bias the way new information is processed and new concepts acquired. Children do not possess theories of the physical world, but rather frameworks of presuppositions. Change happens through enrichment or through revision of beliefs and presuppositions, or of theories and frameworks. Revision of frameworks is the most difficult process of change. It is argued that concepts are embedded into larger theoretical structures which constrain them. A distinction is drawn between a naive framework theory of physics, which is built early in infancy and which consists of certain fundamental ontological and epistemological presuppositions, and various specific theories which are meant to describe the internal structure of the conceptual domain within which concepts are embedded. It is assumed that conceptual change proceeds through the gradual modification of one’s mental models of the physical world, achieved either through enrichment or through revision. Enrichment involves the addition of information to existing conceptual structures. 
Revision may involve changes in individual beliefs or presuppositions or changes in the relational structure of a theory. Revision may happen at the level of the specific theory or at the level of the framework theory. Revision at the level of the framework theory is considered to be the most difficult type of conceptual change and the one most likely to cause misconceptions. Misconceptions are viewed as students’ attempts to interpret scientific information within an existing framework theory that contains information contradictory to the scientific view. (Vosniadou 1994, p. 46) According to Michelene Chi, conceptual change concerns those contents of knowledge for which change is really difficult: no amount of incremental information, correction, or traditional instruction can produce change. Misconceptions are robust: they resurface in several situations and can be abandoned only with great effort. Where does the difficulty arise? Chi proposes that the difficult changes concern beliefs that have been assigned to the wrong category. That is, misconceptions derive from miscategorizations. This fact automatically produces a conflict. “many misconceptions are not only “in conflict” with the correct scientific conceptions, but moreover, they are robust in that the misconceptions are difficult to revise, so conceptual change is not achieved. The robustness of misconceptions has been demonstrated in literally thousands of studies, about all kinds of science concepts and phenomena, beginning with a book by Novak (1977) and a review by Driver and Easley (1978), both published almost three decades ago. By 2004, there were over 6,000 publications describing students’ ideas and instructional attempts to change them (Confrey, 1990; Driver, Squires, Rushworth, & Wood-Robinson, 1994; Duit, 2004; Ram, Nersessian, & Keil, 1997), indicating that conceptual understanding in the presence of misconceptions remains a challenging problem. 
The daunting task of building conceptual understanding in the presence of robust misconceptions is sometimes referred to as radical conceptual change (Carey, 1985). We propose the operational definition that certain misconceptions are robust because they have been mistakenly assigned to an inappropriate lateral category. Our claim, then, is that some false beliefs and flawed mental models are robustly resistant to change because they have been laterally or ontologically miscategorized. That is, if a misconception belongs to one category and the correct conception belongs to another lateral or ontological category, then they conflict by definition of kind and/or ontology. This means that conceptual change requires a shift across lateral or ontological categories. In order to support this claim, we have to characterize the nature of misconceptions and the nature of correct information to see whether they in fact belong to two categories that differ either in kind or in ontology, thereby are “in conflict.”” (Chi 2008, p. 72)
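The predict–test–revise loop that the theory theory attributes to children is often glossed as Bayesian belief updating: counter-evidence is first absorbed, but repeated disconfirmation eventually flips which theory is favored. The following is a toy sketch of that gloss; the two "theories" and all the numbers are illustrative assumptions, not values from the literature:

```python
# Toy Bayesian gloss on theory revision: two competing "theories" assign
# different probabilities to a surprising observation; repeated
# counter-evidence eventually flips which theory the learner favors.
# All numbers are illustrative assumptions, not taken from the literature.

prior = {"naive": 0.9, "scientific": 0.1}        # start confident in the naive theory
likelihood = {"naive": 0.2, "scientific": 0.8}   # probability each theory gives the observation

belief = dict(prior)
history = []
for trial in range(5):                           # five surprising observations in a row
    unnorm = {h: belief[h] * likelihood[h] for h in belief}
    z = sum(unnorm.values())                     # normalizing constant
    belief = {h: p / z for h, p in unnorm.items()}
    history.append(round(belief["scientific"], 3))

print(history)
```

The sketch captures the qualitative pattern described above: a single anomaly barely moves a strong prior (counter-evidence is "reinterpreted"), but accumulated anomalies make the alternative theory dominant.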
  • How is change produced? The debate about “what changes?” influences proposals about how to produce change. The most widely shared view, expressed by G. Posner, is that children change their views only when a conflict arises, that is, when they have good (rational) reasons to change their mind, and that they change it in accord with the most rational hypothesis. Our central commitment in this study is that learning is a rational activity. That is, learning is fundamentally coming to comprehend and accept ideas because they are seen as intelligible and rational. Learning is thus a kind of inquiry. The student must make judgments on the basis of available evidence. It does not, of course, follow that motivational or affective variables are unimportant to the learning process. The claim that learning is a rational activity is meant to focus attention on what learning is, not what learning depends on. Learning is concerned with ideas, their structure and the evidence for them. It is not simply the acquisition of a set of correct responses, a verbal repertoire or a set of behaviors. We believe it follows that learning, like inquiry, is best viewed as a process of conceptual change. The basic question concerns how students’ conceptions change under the impact of new ideas and new evidence. (Posner et al. 1982, p. 212) Another view, represented by J. Minstrell and J. Clement, puts the accent both on conflicts and on analogies: instructors should help students identify their facets, correct and erroneous. Erroneous facets are put in conflict with experiences, and their limits revealed; correct facets are identified and used to create good explanations. … the trick is to identify the students’ correct intuitions – their facets that are consistent with formal science – and then build on these. … Some facets are anchors for instruction; others are targets for change. (Bruer 1993, p. 
162-163) In a benchmark lesson, the teacher and the students dissect their qualitative reasoning about vivid, everyday physics problems into facets. They become aware of the limitations of each facet, and they identify which facets are useful for understanding a particular phenomenon. Minstrell calls two students to the front to help conduct the crucial experiment. Such demonstrations are dramatic and exciting for the students and allow them to see which prediction is correct. Research also suggests that such experiences have an important cognitive role in inducing conceptual change. They provide an initial experience that places naïve and expert theories in conflict. As the students try to resolve the conflict, the dramatic demonstration serves as an organizing structure in long-term memory (an anchor) around which schemas can be changed and reorganized (Hunt 1993). (Bruer, 1993, p. 164-166) Clement defends an analogical teaching strategy: expose misconceptions through appropriate questions (e.g., about the belief that there is no upward force on a book resting on a table), then find an analogy that appeals to correct intuitions (e.g., a hand holding up the book). Within the past decade and a half there has been an increasing awareness of the detrimental effects (to school learning) of some of students’ prior knowledge. Students come to class with preconceptions which inhibit the acquisition of content knowledge and are often quite resistant to remediation … For several years we have been testing an analogical teaching strategy which attempts to build on students’ existing valid physical intuitions. … build on students’ conceptions in order to change their conceptions … … we intend to increase the range of application of the useful intuitions and decrease the range of application of the detrimental intuitions … By establishing analogical connections between situations students initially view as not analogous, students may be able to extend their valid intuitions to initially troublesome target situations. 
This strategy, called the bridging strategy, has been used in tutoring, computer tutoring and classroom instruction, with some apparent success… The first step in the bridging strategy is to make the misconception explicit by means of a target question…. The next step is to suggest a case which the instructor views as analogous … which will appeal to the student’s intuition. We call such a situation an anchoring example... (Brown & Clement 1989, p. 238-239) The three views have in common the claim that the first move in teaching science is to reveal the beliefs and the conceptual understanding students have of phenomena (diagnosis and formative assessment); conflict is important for all three. But in the “analogy view” the intuitions of learners are no longer just obstacles: they can also be head starts and hooks for correct beliefs. All these considerations are practical and come from direct experience, but they are rarely the object of fair trials. Like Clement, A. diSessa has especially stressed the importance of analogies, but he has proposed an alternative to the idea that the difficulty lies in the change itself, that is, in the presence of a previous theory. According to him, the difficulty is not inherent to the existence of previous structures or concepts: collecting and coordinating pieces of knowledge is difficult even in the absence of a competitor, and the same difficulties can be present when a system is created from scratch from observation as when a system requires a change. However, strategies for remediation might be the same as for those who claim the specificity of “conceptual change”. DiSessa thus acknowledges that different strategies of “remediation” exist, but also that there is no evidence about which is better, and why. 
A distinctive characteristic of the knowledge in pieces perspective is that the reasons for difficulty of change may be the same in cases where a conceptual structure evolves from scratch, compared to cases where one conceptual system emerges from a different one (theory change). Theory theory views and knowledge in pieces prescribe some strong differences in strategy and process (e.g., rational decision-making vs. a long period of multi-context accumulation and coordination). (diSessa 2006, p. 14) 1. Instruction is a complex mixture of design and theory, and good intuitive design can override the power of theory to prescribe or explain successful methods. Almost all reported innovative interventions work; almost none of them lead to improvements that distinguish them categorically from other good instruction. 2. The very general constructivist heuristic of paying attention to naïve ideas seems powerful, independent of the details of conceptual change theory. Interventions that merely teach teachers about naïve ideas have been surprisingly successful. 3. Researchers of different theoretical persuasions often advocate similar instructional strategies, if for different reasons. Both adherents of knowledge in pieces and of theory theories advocate student discussion, whether to draw out and reweave elements of naïve knowledge, or to make students aware of their prior theories in preparation for judgment in comparison to instructed ideas. The use of instructional analogies, metaphors, and visual models is widespread and not theory-distinctive. 4. Many or most interventions rely primarily on pre/post evaluations, which do little to evaluate specific processes of conceptual change. (diSessa, 2006, p. 
14) In conclusion, even if conceptual change/misconception studies have produced a large literature, attracted the interest of both educators and representatives of the sciences of mind-brain-behavior, and helped undermine the “tabula rasa” image of the mind, their impact is limited by the fact that their theory is shaky (there is no common ground on what changes) and that the instructional methods claimed to be capable of producing the change are not really connected to the theory and, moreover, have not been tested enough. One of the great positive influences of misconceptions studies was bringing the importance of educational research into practical instructional circles. Educators saw vivid examples of students responding to apparently simple, core conceptual questions in non-normative ways. Poor performance in response to such basic questions, often years into the instructional process, could not be dismissed. One did not need refined theories to understand the apparent cause: entrenched, “deeply held,” but false prior ideas. The obvious solution was very often phrased, as in the quotation heading this section, in terms of “overcoming,” or in terms of convincing students to abandon prior conceptions. (DiSessa, 2006, p. 7) In 1986 Minstrell initiated a collaboration with Earl Hunt, a cognitive psychologist … to refine and assess his classroom method. … A comparison of students’ scores on pretests and posttests makes it clear that Minstrell’s method works. The students learn physics. But why does it work? One concern is whether the method’s success depends entirely on Jim Minstrell’s pedagogical talents. Could someone other than Minstrell use the method successfully? Is Minstrell’s method better than other instructional methods currently in use? Hunt … has begun a research program back in his laboratory to refine the theory underlying Minstrell’s method. Why are benchmark lessons so important? How does the transfer occur? 
How do students develop deep representations and make appropriate generalizations? (Bruer 1993, p. 168-169)
  • John Minstrell and Andrea diSessa are at the origin of a weak view of conceptual change, one that is not really conceptual and not really a change. According to Minstrell, children are neither experts nor scientists; their knowledge is not structured, but rather fragmentary and local. The “pieces of knowledge” used to deal with physics are facets: facets derive from everyday experience; some facets are correct, others false. In the early 70s, after a decade of outstanding teaching … Minstrell became concerned about its effectiveness. His students couldn’t transfer their formal book and lecture learning to the physics of everyday situations. … students are given two clay balls of equal size. Students agree that the two balls weigh the same. But if one ball is then flattened into a pancake, many students will then say that the pancake weighs more than the ball. … This is not a logical error but a conceptual one. Unlike expert scientists who want to explain phenomena with a minimum of assumptions and laws, students are not driven by a desire for conceptual economy. Their knowledge works well enough in daily life, but it is fragmentary and local. Minstrell calls pieces of knowledge that are used in physics reasoning facets. Facets are schemas and parts of schemas that are used to reason about the physical world. Students typically choose and apply facets on the basis of the most striking surface features of a problem. They derive their naïve facets from everyday experience. Such facets are useful in particular situations; however, they are most likely false in general, and for the most part they are only loosely interrelated. Thus students can quickly fall into contradictions. (Bruer, 1993, p. 162-163) From an instructional perspective, Minstrell (1982, 1989) viewed intuitive ideas as resources much more than blocks to conceptual change in physics, in contrast with the predominant misconceptions point of view. 
He described intuitive ideas as threads that, rather than rejecting, need reweaving into a different, stronger, and more normative conceptual fabric. Recent work has charted hundreds of “facets”—which are elemental and instructionally relevant ideas students have upon entering instruction—in many topics in physics instruction (Hunt & Minstrell, 1994). The idea that knowledge is in pieces is a central tenet of diSessa, as are the ideas that not all pieces are incorrect, that pieces are not coherently structured, but only loosely, and that pieces can be highly contextual: they need not be there, in the mind of the student, before instruction is provided or the question is posed; they can be created on the spot. Coherent naïve theories are nowhere to be seen in this view. In the same book in which McCloskey provided perhaps his definitive statement of “naïve theories,” diSessa (1983) introduced the idea that intuitive physics consisted largely of hundreds or thousands of elements, called p-prims, at roughly the size-scale of Minstrell’s facets. P-prims are explanatory primitives: they provide people with their sense of which events are natural, which are surprising, and why. P-prims are many, loosely organized and sometimes highly contextual, so that the word “theory” is highly inappropriate. P-prims are hypothesized to play many productive roles in learning physics. (DiSessa 2006, p. 11) 
  • Are children intuitively wrong? Another question that troubles traditional approaches to conceptual change concerns the very premise of change: that beliefs prior to explicit instruction are wrong. There is large consensus that children have a lot in their minds. But are we capable of assessing the contents of children’s beliefs without influencing them? Is it not possible that we get the wrong answer just because we ask the wrong question? Michael Siegal (2003; Siegal et al. 2004) suspects that research on conceptual change is partly flawed by inappropriate methods for evaluating children’s beliefs: restricted response options might reduce the ambiguity of the question and produce fewer errors in the answers. Open questionnaires might be one of the reasons why children appear to hold flattened, hollow or dual intuitive models of the Earth. When questions specify with precision the alternative views at stake, children turn out to hold correct ideas about the Earth as spherical. It would then not be proved that children hold models of the Earth in a spontaneous and intuitive manner, since these models could just be experimental artifacts. (Vosniadou (2008), however, disagrees with this method because it is not representative of the variety of answers children can provide.) As proposed by DiSessa, before explicit instruction children might just hold fragments of knowledge communicated by their cultural environment, rather than intuitions, let alone intuitions shaped as models. One could then suspect that some of the erroneous beliefs that children and adults hold are not the effect of pre-instructional beliefs, but rather a product of education. E.g. 
the use of anthropomorphic metaphors for describing the behavior of inanimate physical entities might reinforce dualistic and teleological views, as in the case of the pathetic fallacy, which consists in masquerading animism as scientific description (http://www.ems.psu.edu/~fraser/Bad/PatheticFallacy.html). This hypothesis is not incompatible with more traditional views of conceptual change and with the idea that children hold intuitive beliefs. Do children (and adults) really change their minds?  A further question raised by the literature on conceptual change/misconceptions concerns the real effects of instruction: can instruction actually transform the pre-instructional beliefs and intuitions of learners, or are intuitions just masked, ready to resurface under cognitive load, when the question is asked in a particular way, or when learners do not feel required to provide a “scientific answer”?  Both Andrew Shtulman and Kevin N. Dunbar have performed experiments that point in the direction of masking or suppression rather than supplanting. These experiments are consistent with resistance to change and, more than that, with the resurgence of pre-instructional views in adults, at least under special conditions (as in case 3 discussed above).  When students learn scientific theories that conflict with their earlier, naïve theories, what happens to the earlier theories? Are they overwritten or merely suppressed? We investigated this question by devising and implementing a novel speeded-reasoning task. Adults with many years of science education verified two types of statements as quickly as possible: statements whose truth value was the same across both naïve and scientific theories of a particular phenomenon (e.g., ‘‘The moon revolves around the Earth’’) and statements involving the same conceptual relations but whose truth value differed across those theories (e.g., ‘‘The Earth revolves around the sun’’). 
Participants verified the latter significantly more slowly and less accurately than the former across 10 domains of knowledge (astronomy, evolution, fractions, genetics, germs, matter, mechanics, physiology, thermodynamics, and waves), suggesting that naïve theories survive the acquisition of a mutually incompatible scientific theory, coexisting with that theory for many years to follow. (Shtulman & Valcarcel 2012) Decades of research in cognitive psychology, developmental psychology, and science education have dispelled the myth that students enter the science classroom as ‘‘empty vessels’’ ready to be filled with knowledge. Rather, they enter with rich, pre-instructional theories of the domain-relevant phenomena that often interfere with learning (Carey, 2000; Keil, 2011; Vosniadou, 1994). In the domain of mechanics, for instance, students hold theories of motion predicated on the belief that forces are transferred from one object to another upon contact and must dissipate before objects can come to a rest (Clement, 1982; McCloskey, 1983). In the domain of thermodynamics, students hold theories of heat predicated on the belief that heat is a kind of substance that flows in and out of objects and can ultimately be trapped or contained (Reiner, Slotta, Chi, & Resnick, 2000; Wiser & Amin, 2001). And in the domain of evolution, students hold theories of adaptation predicated on the belief that all members of a species evolve together, with each organism producing offspring better adapted to the environment than it was at birth (Shtulman, 2006; Shtulman & Schulz, 2008). Science educators are thus charged with two tasks: not only must they help students learn the correct, scientific theory at hand, but they must also help students unlearn their earlier, less accurate theories. Psychologists who have studied this process – typically termed ‘‘conceptual change’’ – have characterized the transition from naïve theories to scientific theories in several ways. 
Some have emphasized the role of category knowledge, characterizing conceptual change as a series of conceptual differentiations, in which new category boundaries are established, and conceptual coalescences, in which old category boundaries are collapsed (Carey, 2009; Smith, 2007). Some have emphasized the role of ontological hierarchies, characterizing conceptual change as the reassignment of a key concept, or system of concepts, from one branch of an ontological hierarchy to another (Chi, Slotta, & de Leeuw, 1994; Thagard, 1992). And some have emphasized the role of causal expectations, characterizing conceptual change as a revision of the core presuppositions of a causal model or causal theory (Vosniadou, 1994; Wellman & Gelman, 1992). Common to all characterizations is a commitment to knowledge restructuring, or the conversion of one conceptual system into another by radically altering the structure (and not just the content) of that system. Implicit in the idea of knowledge restructuring is the idea that early modes of thought, once restructured, should no longer be accessible, for the basic constituents of the earlier system are no longer represented. A number of recent findings have challenged this idea, however, by showing that early modes of thought do sometimes reemerge later in life. (Shtulman & Valcarcel 2012) Apparently:
- Alzheimer patients endorse teleological explanations that are typical of children; non-Alzheimer adults can do so under time pressure
- adults, including biology professors, are slower at categorizing plants as living things than animals
- verifying the truth value of assertions for which a conflicting naïve explanation exists takes longer than verifying assertions for which instruction does not imply “change”.
Our findings suggest that naïve theories are suppressed by scientific theories but not supplanted by them. 
Across 10 domains, participants were significantly slower and less accurate at verifying statements whose truth-value reversed across a conceptual change (e.g., ‘‘1/13 is greater than 1/30’’) than at verifying structurally analogous statements whose truth-value remained constant across that change (e.g., ‘‘12/13 is greater than 1/13’’). This effect was observed not only in domains where participants were introduced to the correct, scientific concepts in late adolescence but also in domains where they were introduced to those concepts in early childhood. Indeed, the latency data suggest that participants exhibited more cognitive conflict in the latter than in the former, possibly because naïve theories in the latter domains emerge earlier and are thus more deeply entrenched. (Shtulman & Valcarcel 2012) The educational consequences of this kind of approach to pre- or misconceptions would consist in greater attention to the use of analogies and preconceptions as anchors for scientific concepts, and in greater attention to the cognitive process of inhibition or control of automatisms (how it works, when it fails, how its functioning can be favored); there are also immediate pragmatic consequences that stem from observing when and how we fail at mobilizing the scientific rather than the pre-scientific option:
- time constraints, stress
- level of expertise
- formulation of the question.
Evidence is still scarce, and it is not clear whether, or to what extent, the resilience of early intuitions (their tendency to “come back”) is domain-specific (only certain early intuitions or preconceptions behave this way) or domain-general (it is a general characteristic of preconceptions to be masked but not suppressed).
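The fraction items quoted above make the conflict concrete: under the whole-number intuition that bigger numbers mean bigger quantities, 1/13 looks smaller than 1/30, while the correct comparison reverses that verdict; for 12/13 vs. 1/13 the two evaluations agree. A minimal sketch in Python (the `whole_number_bias` function is a hypothetical, deliberately simplified model of the naïve intuition, not a procedure from the study):

```python
from fractions import Fraction

def correct(a, b):
    """Correct verdict: is fraction a greater than fraction b?"""
    return a > b

def whole_number_bias(a, b):
    """Hypothetical naïve verdict: compare numerators, then denominators,
    as whole numbers, ignoring that a larger denominator shrinks the value."""
    return (a.numerator, a.denominator) > (b.numerator, b.denominator)

# "1/13 is greater than 1/30": truth value reverses across the conceptual change
inconsistent = (Fraction(1, 13), Fraction(1, 30))
# "12/13 is greater than 1/13": truth value is the same under both evaluations
consistent = (Fraction(12, 13), Fraction(1, 13))

assert correct(*inconsistent) is True          # mathematically true...
assert whole_number_bias(*inconsistent) is False  # ...but "false" to the intuition

assert correct(*consistent) is True            # true either way:
assert whole_number_bias(*consistent) is True  # no conflict to suppress
```

The statements on which the two verdicts disagree are exactly those that Shtulman and Valcarcel's participants verified more slowly and less accurately.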
  •  What do scientists really do? There is a wealth of studies that try to answer this question from several points of view and through different methods of analysis: from historical to ethnographic studies, from philosophy to cognitive science (e.g. the work of Ronald Giere, Nancy Nersessian, Kevin Dunbar, David Klahr, Herbert Simon, Frederick Reif, Ryan Tweney, Willard Quine …): …our understanding of scientific knowledge practices needs to be psychologically realistic. Putting it baldly, creative scientists are not only exceptionally gifted human beings - they are also human beings with a biological and social makeup like all of us. In a fundamental sense, science is one product of the interaction of the human mind with the world and with other humans. We need to find out how human cognitive abilities and limitations constrain scientific theorizing… (Nersessian 1992, p. 5) Whatever image we have of the scientist and of her actions (cognitive and pragmatic), there are good reasons to be wary of claims that children are real scientists, or that any adult not trained for a career in science is doing "good science." The ability of science to provide a realistic understanding of the natural world depends largely on operations that are costly and effortful in terms of thought. These operations, often described as the "scientific method," but which also include informal logic and rules for good argumentation, can bring under control the biases and illusions that are part of our normal mental functioning, including our functioning as "intuitive scientists": curious and explanation-prone. 
While spontaneous exploration is perfectly suited to helping children, and adults in many situations, to discover patterns and causes, and constitutes the basis of all scientific thought, it is very different from what makes scientific thought a royal road to understanding the world as it really is. In a sense, in fact, what makes us intelligent is what makes us stupid: for example, evolution has selected a brain capable of extracting regularities from the environment and of inferring causal processes from related events. These are the capabilities that we see develop early in children. It is also thanks to these capabilities that we are able, as adults, to notice regularities and patterns, e.g. in the night sky (otherwise, no night navigation would have been possible for our ancestors). These are the same skills that helped people notice, among other things, that women in labor who had been treated by hospital doctors were more likely to contract an infection than women who gave birth at home - hence a great scientific discovery: the practice of antisepsis. But they are also the same capabilities that make us see a face in the moon or canals on Mars, or attribute causal powers to practices that have none. "External" tools then have to be put in place in order to use our natural capacities with control, since our strategies and solutions are not good at all times (some of them are called "heuristics" because they help to find solutions fast and efficiently in some situations - those for which they were selected, in fact - but prove harmful when used indiscriminately). The history of science, and in some measure of human culture in general, is the history of the search for and invention of these tools to "think better."
  • See also (Wolpert 1994; Boyer 1994). Science does not come naturally to the human mind (while religion does). Grounding his argument in studies in developmental and evolutionary psychology and in the observation of science as an anthropological, social enterprise, the philosopher Robert McCauley (2000) defends the idea that science is unnatural for the human mind (and opposes science to religion, which comes naturally to the mind). The term "natural" is used to refer to judgments and actions that do not require reflection and that have non-cultural foundations. (McCauley's arguments are inspired by the biologist Lewis Wolpert (1994) and by Pascal Boyer (1994). They belong to a recent tendency to naturalize the study of science and of religion, which involves a strong implication of evolutionary psychology and cognitive psychology models.) Some cognitive capacities seem to turn neither on any particular cultural input nor, as in the case of face recognition, on any peculiarly cultural input at all. Children's proclivity to acquire language and nearly all human beings' appreciation of some of the basic physics of solid objects, their assumptions about the mutual exclusivity of taxonomic classes in biology, and their abilities to detect and read agents' minds are just some of the proposed candidates for human cognitive capacities that arise independently of any particular cultural input. These capacities seem in place comparatively early in human development, and their functioning usually seems both automatic and fast. Their operations occasion no conscious searches for evidence, and even if they did, the associated inferences seem woefully underdetermined by whatever evidence might be available. … In calling religion "natural" and science "unnatural" in this second sense, I am suggesting two things. First, the elaborate cultural institutions surrounding each play a far more integral role in the generation and persistence of science than they do in the case of religion. 
(Indeed, for some religious systems, e.g., among prehistoric hunter-gatherers, such far-reaching cultural institutions have never existed.) Second, most of the cognitive activity underlying religion concerns cognitive processes that rely far less on particular cultural input, particular forms of cultural input, or even peculiarly cultural input than is the case with science. (McCauley 2000) McCauley does not completely refute the theory-theory proposed by Gopnik and others; he grants (1) that scientists' and children's conceptual structures are theories, (2) that, for children as well as scientists, these theories provide explanations of events in the world, (3) that, like scientists, children are sensitive to the role that evidence can play in improving their conceptual structures, and (4) that conceptual development in children is, like scientific change, a process of formulating, evaluating, amending, and sometimes even replacing theories. In claiming that religion is more natural than science, it does not follow that nothing about science comes naturally. Undoubtedly, some cognitive activities scientists engage in--their formation of hypotheses, their attention to evidence, and their elaboration, modification, and replacement of theories--predate the emergence of distinctively scientific traditions and institutions and probably do constitute fundamental operations in cognitive development. (McCauley 2000) Science certainly has natural, pre-institutional and pre-cultural foundations – e.g. the drive for explanatory theories seems to be natural, and common to both science and religion. 
However, several characteristics distinguish science from religion, and science from the bare drive for explanations:
- the explanations sought by science are sophisticated and systematic
- they rely on empirical evidence that is difficult to pursue, produce, and appraise
- they rely on social and cultural arrangements.
The capacity for doing science in this "professional" sense does not come "naturally" (automatically, without reflection or cultural help), nor does it become automatic to the human mind with habit. The scientist continues to fight all her life against confirmation bias and availability heuristics, even if she is more ready than non-scientists to adopt tools and strategies for controlling their effects on her ideas.  Among the huge range of activities scientists undertake, two deserve particular attention when considering the unnaturalness of science: (1) scientists develop explanatory theories that challenge received views about empirical matters and (2) their critical assessment of those theories highly values evidence born of empirical tests. (McCauley) What distinguishes science is, first, the relative sophistication and systematicity it brings both to the generation of empirical evidence and to the assessment of that evidence's import for explanatory theories and, second, the pivotal roles that social and cultural arrangements--as opposed to our ordinary cognitive predilections--play in those processes. (See Gopnik and Meltzoff, 1997, pp. 20 and 38, Gopnik, 1996, p. 508, and Brewer and Samarapungavan, 1991, p. 222.) (McCauley) This is not to question children's recognition of the importance of collecting evidence. Nor shall I question the religious on this front either, though that may be unduly charitable, as remarks on memory in the final section will suggest. Rather, the points I wish to make turn on highlighting both the centrality and the difficulty of systematically pursuing, producing and appraising empirical evidence in science. 
(Brewer and Samarapungavan, 1991, especially p. 221.) The requisite skills neither automatically come to human beings nor automatically become habits of the human mind. This is one of the reasons why science must be taught and why so many have such difficulty both learning it and learning how to do it. (McCauley 2000) Like children, scientists continue to be prey to biases and illusions, automatic ways of reasoning, and fallacies. It is also a reason why speaking of "the scientist as child" is so apt. (Gopnik and Meltzoff, 1997, pp. 13-47) Children are not so much like sophisticated little scientists as scientists, their considerable training and expertise notwithstanding, are like children, not only insofar as they exhibit similar explanatory interests and strategies but also insofar as they exhibit the same cognitive biases and limitations that other human beings do. Whether as children or educated scientists, human beings seek explanations, generate theories, and consider evidence, but they also operate with vague hypotheses, perform fallacious inferences, have memory lapses, and display confirmation bias (see the final paragraphs of this section). (McCauley 2000) 
  • So-so scientists. Even if children seem to have some form of predisposition to do and learn science, a certain approach to the psychology of science opposes folk science to advanced or professional science. The latter does not come naturally to the human mind. The cognitive psychologist Steven Pinker has expressed this idea by describing humans (children and adults) as "so-so scientists": Natural selection, however, did not shape us to earn good grades in science class or to publish in refereed journals. It shaped us to master the local environment, and that led to discrepancies between how we naturally think and what is demanded in the academy. ... Good science is pedantic, expensive, and subversive. It was an unlikely selection pressure within illiterate foraging bands like our ancestors', and we should expect people's native "scientific" abilities to differ from the original article. (Pinker 1997, p. 303) Incidentally: why didn't we evolve to become good scientists, but only so-so scientists? Pinker suggests three reasons. First, good science (the kind using logic, appropriate generalization, correct inferences) has to abstract away from content, that is, from what we know; one has to reason in the abstract, but in daily life ignoring what we know hardly makes sense. Second, good science is expensive: a lot of trial and error might not be worth the trouble for small illiterate bands (no setting down in writing, no accumulation). Third, good science aims at truth, not necessarily my truth and my interests; sometimes truth can be adaptive, but at other times, when I want my truth to prevail, it is not. In other words, "professional science" requires us to ignore what we intuitively know and what we want (to doubt, and to dismiss the tendency to confirm our own ideas); it requires accumulation, hence transmission systems and writing. 
So, much remains to be learned through formal education systems: not only contents, but tools and methods for going beyond what comes naturally to the human mind. This could also help explain why teaching and learning science is so difficult and often creates resistance: it is not just because of naïve, false beliefs, but also because of the discrepancy between what our mind is good at and has been shaped for and what is demanded for doing good science. As much as it grows from the continuity between the child and the scientist, knowledge produced by professional science is the outcome of highly unnatural cognitive actions - such as controlling observation biases - and of a cultural apparatus: an institutionalized community with rules, a worldview, shared methods, and the capacity to take the past into account and build on it. Homo scientificus can only exist because homo is a social, culturally bound species. We cannot hope to grasp everything intuitively! Luckily, Homo sapiens has evolved as a collaborative species. In this sense, even the most unnatural cognitive actions emerge from natural humus.
  • It should be added that "professional science" has gone far in the last four centuries, well beyond the middle world our brains are shaped for. The micro-world and the cosmologically large one pose a serious threat to intuition. Quantum physics and string theory are clearly not intuitive. Tools, diagrams, and mathematical language are then required in order to manipulate their parts, evaluate their degree of consistency, and devise experiments that might or might not put their hypotheses to the test of experience. What counts as experience is, moreover, rather twisted in both cases. Limited minds have created conceptual tools for moving beyond intuition. The following is a transcription of a conversation between the evolutionary biologist Richard Dawkins and the philosopher of mind Daniel Dennett (The Four Horsemen, 2007, http://www.youtube.com/watch?v=MuyUz2XLp1E): R. Dawkins (about mystery in physics): Isn't it possible that, because our evolved brains evolved in what I call "middle world", where we never have to cope either with the very small or with the cosmologically very large, we may never actually have an intuitive feel for what is going on in quantum mechanics? We can still test the predictions, do the mathematics and do the physics to actually test the predictions, because anybody can read the diagrams
  • The paradox is that in spite of our cognitive limitations we DO science. And with success. Nonetheless, we are not pre-wired for professional science, and our natural endowment is not enough.
  • A possible answer is that we can enjoy a mix of natural predispositions and social cooperation and cultural tools that make science possible. We’ve seen that scientists use “tools”. In this sense, and for this reason, scientists are not like children, nor like other adults. Science, no matter how much it characterizes the human species, is not a universal trait of human beings. Scientists can get around some of their cognitive limitations by exploiting a vast array of tools (such as literacy and mathematical description) and cultural arrangements (such as journals, professional associations, and the division of labor). Children, by contrast, mostly work in comparative isolation unaided by these tools, unable to take advantage of such arrangements, and unacquainted with the enormous bodies of knowledge to which scientists have access. (Brewer and Samarapungavan, 1991). (McCauley 2000) Tools are not limited to language, including mathematical language, diagrams and other forms of representation, or even methods for producing evidence (both in the sense of the technology and of the experimental apparatus or “baloney detection kits” that scientists use in order to evaluate their own work and that of their colleagues). Tools include institutions and the practice of doing science as a community of scientists. The institution of science does an even better job than either individual scientists or local research teams of getting around cognitive limitations, because it is the collective product of an international community of inquirers for whom prestige, fame, and wealth turn, in no small part, on their seizing opportunities to criticize and correct each other's work. Such communal features of the scientific enterprise establish and sustain norms that govern scientific practice. 
They also ensure that the collective outcome of the efforts and interactions of mistake-prone individuals and small research groups with one another in the long run is more reliable than any of their individual efforts are in the short run. (McCauley 2000) Science requires cultural support: cognitive proclivities are not enough to ensure that theories become more and more sophisticated, powerful, and consistent with evidence. What evidence exists that science is unnatural, that is, that it depends on cultural arrangements that entail and allow for special cognitive actions? McCauley names:
  • the rarity of science: once science is not confounded with technology, no sign of science as described above can be found in prehistory, and even its presence in history is relatively rare (only two periods of human history seem describable as original “scientific” eras: ancient Greece and modern Europe)
  • science challenges intuitions and common sense
  • science has progressively reduced the explanatory role of agent causality, whereas causality and agency are two essential features of the human mind and in fact dominate in every religion
  • scientific descriptions differ significantly from everyday descriptions of phenomena: science pursues explanatory depth
  • science is concerned with understanding nature not (only) because of its effects on us, but for its own sake
  • science involves forms of thought and practices that are effortful and painstakingly laborious
  • humans are resistant to science, and stick to their commonsense beliefs; this is true not only for non-scientists but for scientists as well
  • scientific concepts are difficult to learn and require years of specialized study, contrary to religious concepts, which are easily learnt and remembered; the fact that religious concepts comply with the natural tendencies of the human mind is reinforced by the tendency to put them down in narratives, again a natural ally of the human mind.
Explanatory theories in science possess increasingly greater theoretical depth because, unlike religion, science is finally concerned with understanding nature for its own sake and not merely for its effects on us. Lewis Wolpert argues that the historical scarcity of inquiries committed to the intrinsic value of understanding nature is evidence not only of the comparative unnaturalness of such inquiries but of the limits of humans' natural curiosity. The idea that man is innately curious is partial myth: man's curiosity extends only to what affects his conduct. (Wolpert, 1992, p. 54) In their pursuits scientists are not impervious to our practical concerns with nature, but such concerns are not necessary for doing science. Many scientists devote their entire careers to highly esoteric, impractical studies of nature’s narrowest corners. Their interests in appraising comparatively detailed, low-level proposals ensure that those theories remain empirically responsible. (See Barbour, 1980, p. 242.) In addition to the persistent unnaturalness of scientific proposals, institutionalized science also involves forms of thought and types of practice that human beings find extremely difficult to master. The acquisition of scientific knowledge is a painstaking and laborious process. To become a professional scientist requires at least a decade of focused education and training, and even then the scientist typically gains command of only one sub-field within a single scientific discipline. Not only is scientific knowledge not something that human beings acquire naturally, its mastery does not even guarantee that someone will know how to do science.
After four centuries of astonishing accomplishment, science remains an overwhelmingly unfamiliar activity, even to most of the learned public and even in those cultures where its influence is substantial.… In science higher level cultural forces--in contrast to lower level psychological ones--play a far more significant role in shaping the relevant (explanatory) materials (e.g., the contents of theories as opposed to the contents of myths). The importance of the activities and experiences of a highly trained elite--compared with those of an untutored public--differs vastly for ensuring the persistence of the two systems in question. (McCauley 2000) Scientists, themselves, have produced evidence about the difficulties of doing science. Experimental psychologists (Tweney, Doherty, and Mynatt, 1981) have revealed that college-level science students often fail to exhibit the forms of judgment and inference suitable for rational assessment of scientific theories. Even experienced researchers are sometimes prone to erroneous forms of reasoning (Kahneman, Slovic, and Tversky, 1982), although they are less likely to make some types of errors when they are operating in areas where they possess expertise. These sorts of findings have at least two implications. First,
overcoming these cognitive biases and errors, to which human beings seem all too naturally prone, requires extensive study and experience, yet even these provide no guarantee against such shortcomings. Second, it is the comparatively narrow community of research scientists that is primarily responsible for maintaining science's critical traditions. Scientific standards, just like scientific knowledge, depend mostly on the evolution of the expert scientific community's collective judgment in the long run. Individual scientists are far too susceptible to such problems as errors in reasoning, flawed heuristics, and confirmation bias. The difficulties associated with reasoning properly, judging reliably, and comprehending esoteric scientific concepts go some way toward explaining why science progresses so slowly most of the time. These difficulties are also excellent indications of just how unnatural doing science is from a cognitive standpoint. (McCauley 2000)
  • D. Dennett: But what we can see is that what scientists have constructed over the centuries is the tools, mind tools, thinking tools, mathematical tools, which enable us to some degree to overcome the limitations of our evolved brains, our stone-age, if you like, brains; and overcoming those limitations is not always direct. Sometimes you have to give up something you get: you just may never be able to think intuitively about this, but you can know, even if you can't think it intuitively; there is this laborious process by which you can make progress, and you can have a certain authority about the progress: you can test it, and it can carry you from A to B, in the same way that, if you are quadriplegic, an artificial device can carry you from A to B: you can't walk from A to B, but you get from A to B. So, new tools must be devised in order to teach something (not at the elementary school level, maybe) that is far beyond what can be grasped through intuition. This science is not something that can be mastered by a Faraday, with almost no formal instruction. It is deeply dependent upon education, transmission, and technologies for thinking (not necessarily digital technologies, but they help). It is an eminent example of distributed cognition (Hutchins 1995).

Gdp2 2013 14_3 Presentation Transcript

  • 1. Obstacles to learning COGNITIVE RESISTANCE TO SCIENTIFIC FACTS AND THEORIES BABIES AS SCIENTISTS? THE DIFFICULT ACQUISITION OF SCIENTIFIC CONCEPTS: CONCEPTUAL CHANGE THE ORIGINS AND DEVELOPMENT OF SCIENTIFIC THINKING: BABIES AS SCIENTISTS?
  • 2. Resistance to the theory of evolution
  • 3. Resistance to scientific knowledge in physics & cosmology
  • 4. Resistance to scientific knowledge in physics
  • 5. Resistance to scientific knowledge in physics
  • 6. Neuromyths
  • 7. Resistance to scientific knowledge as implied in superstition and pseudo-scientific claims
  • 8. • Feynman, 1974 During the Middle Ages there were all kinds of crazy ideas, such as that a piece of rhinoceros horn would increase potency… Then a method was discovered for separating the ideas – which was to try one to see if it worked, and if it didn’t work, to eliminate it. This method became organized, of course, into science. And it developed very well, so that we are now in the scientific age. It is such a scientific age, in fact, that we have difficulty in understanding how witch doctors could ever have existed, when nothing that they proposed ever really worked – or very little of it did. But even today I meet lots of people who sooner or later get me into a conversation about UFOs or astrology, or some form of mysticism, expanded consciousness, new types of awareness, ESP, and so forth. And I’ve concluded that it’s not a scientific world. (Feynman, 1974)
  • 9. Obstacles to learning COGNITIVE RESISTANCE TO SCIENTIFIC FACTS AND THEORIES BABIES AS SCIENTISTS? THE DIFFICULT ACQUISITION OF SCIENTIFIC CONCEPTS: CONCEPTUAL CHANGE THE ORIGINS AND DEVELOPMENT OF SCIENTIFIC THINKING
  • 10. Scientists in the crib Curiosity Core knowledge Observation, experimentation Several mechanisms for learning from experience Folk knowledge / Naïve representations / Common sense
  • 11. Core knowledge Objects Actions Number Social partners Space Human cognition is founded, in part, on four systems for representing objects, actions, number, and space. It may be based, as well, on a fifth system for representing social partners. Each system has deep roots in human phylogeny and ontogeny, and it guides and shapes the mental lives of adults. Converging research on human infants, non-human primates, children and adults in diverse cultures can aid both understanding of these systems and attempts to overcome their limits.
  • 12. Folk knowledge / Naïve representations / Common sense Folk physics Folk chemistry Folk biology Folk social psychology Folk psychology All of us, from the most sophisticated adults to the youngest children, often engage in what is commonly called “folk science,” that is, certain ways of understanding the natural and artificial world that arise more informally and not as direct reflections of formal instruction in scientific principles (Carey, 1988). There is now extensive work on how children and adults have developed folk psychologies (Wellman, 1990), folk physics (Proffitt, 1999; Vosniadou, 2001), and folk biologies (Inagaki & Hatano, 2002), as well as some indications of folk sciences in such areas as the behaviors of materials and substances (folk chemistry; Au, 1994), the behaviors of heavenly bodies (folk cosmology; Siegal, Butterworth, & Newcombe, 2004), and the nature of value transactions (folk economics; Lakshminaryanan, Chen, & Santos, 2008).
  • 13. The head start model The main source of resistance to scientific ideas concerns what children know prior to their exposure to science. The last several decades of developmental psychology has made it abundantly clear that humans do not start off as "blank slates." Rather, even one-year-olds possess a rich understanding of both the physical world (a "naïve physics") and the social world (a "naïve psychology"). Babies know that objects are solid, that they persist over time even when they are out of sight, that they fall to the ground if unsupported, and that they do not move unless acted upon. They also understand that people move autonomously in response to social and physical events, that they act and react in accord with their goals, and that they respond with appropriate emotions to different situations. (Bloom & Weisberg, 2007)
  • 14. The head start model These intuitions give children a head start when it comes to understanding and learning about objects and people. But these intuitions also sometimes clash with scientific discoveries about the nature of the world, making certain scientific facts difficult to learn. As Susan Carey once put it, the problem with teaching science to children is "not what the student lacks, but what the student has, namely alternative conceptual frameworks for understanding the phenomena covered by the theories we are trying to teach." (Bloom & Weisberg, 2007)
  • 15. Folk knowledge / Naïve representations / Common sense Without explicit instruction in such areas, people seem to develop domain-specific ways of thinking about relatively bounded sets of phenomena such as the behavior of solid objects, living kinds, and the minds of others. These domain-specific understandings have been referred to as “intuitive theories” or “naïve theories,” on the assumption that they reflect sets of beliefs that cohere in a manner that resembles, in important respects, scientific theories (Carey, 1985; Carey & Spelke, 1996; Slaughter & Gopnik, 1996). (Keil, 2010, pp. 826-827)
  • 16. Pre-instructional knowledge in biology What are the components of children’s biological-knowledge system before systematic teaching at school? Can this knowledge system be called naive biology? We propose that young children’s biological-knowledge system has at least two essential components—(a) the knowledge needed to identify biological entities and phenomena and (b) teleological and vitalistic causality—and that these components constitute a form of biology. We discuss how this naive biology serves as the basis for performance and learning in socially and culturally important practices, such as health practices and biology instruction. (Inagaki & Hatano 2006, p. 78)
  • 17. Obstacles to learning COGNITIVE RESISTANCE TO SCIENTIFIC FACTS AND THEORIES BABIES AS SCIENTISTS? THE DIFFICULT ACQUISITION OF SCIENTIFIC CONCEPTS: CONCEPTUAL CHANGE THE ORIGINS AND DEVELOPMENT OF SCIENTIFIC THINKING
  • 18. Misconceptions & the conceptual change model All good teachers have always realized that one must start “where the student is.” Since the 1960s, we have come to a completely new understanding of what this means. Back then, it was defined in terms of what the student lacked, and this was seen as a lack of science content knowledge, combined with age-related limitations in general cognitive capacities (e.g., the elementary school child is a concrete thinker not capable of abstract reasoning). Now we understand that the main barrier to learning the curricular materials we so painstakingly developed is not what the student lacks, but what the student has, namely, alternative conceptual frameworks for understanding the phenomena covered by the theories we are trying to teach. Often these conceptual frameworks work well for children, so we face a problem of trying to change theories and concepts. (Carey 2000)
  • 19. An open debate on (mis)conceptions and change Uncontroversial: • Students come to instruction with prior ideas/knowledge • Prior ideas constrain subsequent learning • For better and for worse Controversial: • Are all preconceptions misconceptions? • Are preconceptions concepts? Are they structured in theories? • What does the change consist in? • What changes? • How does change occur? • What is the positive role of preconceptions? • How can research be used to inform practice?
  • 20. Radical Version • Rather, preschool children have constructed a very different theoretical framework from that held by adults, in which they have embedded their understanding of animals, just as children of elementary school age have constructed a different framework theory in which they embed their understanding of the material world. (Carey 2000) • Misconceptions block or filter new acquisitions; they are coherent and organized in theory-like structures. Change is a transformation (radical, noncumulative, a change of perspective in which one concept is given up for another, with incommensurability between conceptual systems), driven by conflict between old and new views; the experience of conflict is the necessary and sufficient condition for fueling the transformation. – 2 main influences: • Thomas Kuhn • Jean Piaget
  • 21. Radical Version The theory theory is the claim that children or beginning students have theories in very much the same sense that scientists have them. … With respect to another domain, theories of mind, Alison Gopnik (Gopnik & Wellman, 1994) strongly advocates the theory theory. Gopnik is fairly extreme in the parallelism she claims (while still admitting some differences between scientists and children, such as meta-cognitive awareness); others are more conservative in allowing such differences as limits in systematicity and breadth of application (Vosniadou, 2002). (diSessa 2006, pp. 7-8) • Radical view of what changes = – theories (e.g. Susan Carey, Alison Gopnik) that contain concepts – ontologies have to change too (e.g. Michelene Chi), because resistant mistakes derive from miscategorizations, not just wrong concepts • Less radical view = frameworks (e.g. Stella Vosniadou) • Theories are structured • Frameworks are less structured: internal quasi-coherent explanatory systems, presuppositions
  • 22. Radical Version Our central commitment in this study is that learning is a rational activity. That is, learning is fundamentally coming to comprehend and accept ideas because they are seen as intelligible and rational. (Posner et al., 1982, p. 212) • Change is produced when a conflict arises • = there are good reasons to change one’s own mind • = learning is a rational activity
  • 23. Soft Version A distinctive characteristic of the knowledge in pieces perspective is that the reasons for difficulty of change may be the same in cases where a conceptual structure evolves from scratch, compared to cases where one conceptual system emerges from a different one (theory change). (diSessa 2006, p. 14) • Soft view of what changes – Knowledge in pieces, facets, or p-prims (John Minstrell, Andrea diSessa) – P-prims are many, loosely structured, sometimes highly contextual – Children are not scientists • Soft view of the nature of change – Reasons for difficulty might be the same in the absence of previous intuitions: collecting and coordinating pieces is always difficult • Soft view of how to produce change – Some facets are consistent with science and can anchor instruction (John Minstrell) – Use both conflict and analogy to produce good explanations (John Clement) – Not necessarily a rational process of transformation, but accumulation and coordination (Andrea diSessa)
  • 24. Soft Version When students learn scientific theories that conflict with their earlier, naïve theories, what happens to the earlier theories? Are they overwritten or merely suppressed? We investigated this question by devising and implementing a novel speeded-reasoning task. Adults with many years of science education verified two types of statements as quickly as possible: statements whose truth value was the same across both naïve and scientific theories of a particular phenomenon (e.g., “The moon revolves around the Earth”) and statements involving the same conceptual relations but whose truth value differed across those theories (e.g., “The Earth revolves around the sun”). – Are children really intuitively wrong? • Or is it an artifact of how their beliefs are evaluated? (e.g. Michael Siegal) • Isn’t it possible that at least certain misconceptions are induced by instruction? (e.g. the pathetic fallacy) – Do children (and adults) really change their minds? • There is evidence that instruction masks previous beliefs rather than transforming them (e.g. Andrew Shtulman, Kevin Dunbar)
  • 25. Soft Version Participants verified the latter significantly more slowly and less accurately than the former across 10 domains of knowledge (astronomy, evolution, fractions, genetics, germs, matter, mechanics, physiology, thermodynamics, and waves), suggesting that naïve theories survive the acquisition of a mutually incompatible scientific theory, coexisting with that theory for many years to follow. (Shtulman & Valcarcel 2012)
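The logic of this speeded-reasoning comparison can be sketched in a few lines of code. The statements and reaction times below are hypothetical, invented purely for illustration; they are not the authors' actual materials or data. The sketch simply makes the prediction concrete: statements whose truth value conflicts across the naïve and scientific theories should be verified more slowly and less accurately than consistent ones.

```python
# Illustrative sketch of the Shtulman & Valcarcel statement-verification
# analysis. All trial data here are made up for demonstration.

from statistics import mean

# Hypothetical trials: (statement, condition, reaction_time_ms, correct)
trials = [
    ("The moon revolves around the Earth", "consistent", 820, True),
    ("Rocks are made of matter", "consistent", 790, True),
    ("The Earth revolves around the sun", "inconsistent", 1130, True),
    ("People turn food into energy", "inconsistent", 1240, False),
]

def condition_summary(trials, condition):
    """Mean RT (over correct trials only) and accuracy for one statement type."""
    subset = [t for t in trials if t[1] == condition]
    correct = [t for t in subset if t[3]]
    return {
        "mean_rt_ms": mean(t[2] for t in correct),
        "accuracy": len(correct) / len(subset),
    }

for cond in ("consistent", "inconsistent"):
    print(cond, condition_summary(trials, cond))
```

The coexistence claim rests on exactly this contrast: if the naïve theory were simply overwritten, the inconsistent statements would be no harder than the consistent ones.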
  • 26. Obstacles to learning COGNITIVE RESISTANCE TO SCIENTIFIC FACTS AND THEORIES BABIES AS SCIENTISTS? THE DIFFICULT ACQUISITION OF SCIENTIFIC CONCEPTS: CONCEPTUAL CHANGE THE PARADOX OF SCIENCE
  • 27. Science unnatural Among the huge range of activities scientists undertake, two deserve particular attention when considering the unnaturalness of science: (1) scientists develop explanatory theories that challenge received views about empirical matters and (2) their critical assessment of those theories highly values evidence born of empirical tests. What distinguishes science is, first, the relative sophistication and systematicity it brings both to the generation of empirical evidence and to the assessment of that evidence's import for explanatory theories and, second, the pivotal roles that social and cultural arrangements--as opposed to our ordinary cognitive predilections--play in those processes. The requisite skills neither automatically come to human beings nor automatically become habits of the human mind. This is one of the reasons why science must be taught and why so many have such difficulty both learning it and learning how to do it. (Robert McCauley 2000)
  • 28. So-so scientists Natural selection, however, did not shape us to earn good grades in science class or to publish in refereed journals. It shaped us to master the local environment, and that led to discrepancies between how we naturally think and what is demanded in the academy. …Good science is pedantic, expensive, and subversive. It was an unlikely selection pressure within illiterate foraging bands like our ancestors', and we should expect people's native “scientific” abilities to differ from the original article. (Pinker 1997 p. 303)
  • 29. Counter-intuitive science Isn't it possible that our evolved brains, because we evolved in what I call "middle world", where we never have to cope either with the very small or the cosmologically very large, may never actually have an intuitive feel for what is going on in quantum mechanics? We can still test the predictions, do the mathematics and do the physics to actually test the predictions, because anybody can read the diagrams. (R. Dawkins)
  • 30. The paradox of science • We do science: it is a fact • Our cognitive apparatus must be somehow prepared for science – Research on cognitive precursors of science in the evolutionary (phylogeny) and developmental (ontogeny) past • But it is not pre-wired for professional science – Research on tools that make science viable
  • 31. The natural-cultural hypothesis • A mixed origin of science – Nature: core knowledge, curiosity, causal reasoning, sensitivity to regularities, … • = capacities that reveal themselves very easily in the ontogenetic development and probably go far in our evolutionary past – Culture: social cooperation and tools for augmenting cognitive capacities (e.g. writing for transmission, spatial external representations) • = capacities that have a natural basis and make our culture special
  • 32. The role of cognitive artefacts But what we can see is that what scientists have constructed over the centuries is the tools, mind tools, thinking tools, mathematical tools, which enable us to some degree to overcome the limitations of our evolved brains, our stone-age, if you like, brains; and overcoming those limitations is not always direct. Sometimes you have to give up something you get: you just may never be able to think intuitively about this, but you can know, even if you can't think it intuitively; there is this laborious process by which you can make progress, and you can have a certain authority about the progress: you can test it, and it can carry you from A to B, in the same way that, if you are quadriplegic, an artificial device can carry you from A to B: you can't walk from A to B, but you get from A to B. (D. Dennett)
  • 33. Naturalization of scientific cognition Precursors of scientific thinking in phylogenesis Natural (cognitive) enemies of scientific thinking and knowledge in phylogeny Mithen McCauley Liebenberg Boyer Carruthers Povinelli Precursors of scientific thinking in ontogeny Cognitive skills and dispositions displayed by scientists Cognitive skills and dispositions required for science Gopnik Simon Quine Chi Spelke Holyoak DiSessa Atran Carey Carey Dunbar Tooby & Cosmides Bloom Bloom Pinker Natural (cognitive) enemies of scientific thinking and knowledge in ontogeny