My talk on the evidence for agency and consciousness in biological evolution: replacing the Neo-Darwinian paradigm with a "deep" Neo-Lamarckism. Featuring Carl Jung, Nietzsche, de Chardin, and a touch of Jordan Peterson
Modern Physics - Part 9 of Piero Scaruffi's class "Thinking about Thought" at UC Berkeley (2014), excerpted from http://www.scaruffi.com/nature I keep updating these slides at www.scaruffi.com/ucb.html
This essay presents a neo-evolutionary theory based on a supposed ideoplastic and mutagenic power of the psyche, and establishes an intimate relationship between evolution and mimicry.
The theory is called Evolutionary Plasticity, or Ideoplastic Evolutionism, and seeks to present itself as a possible meeting point between positions that are only apparently irreconcilable: those of the neo-Darwinian evolutionists and the supporters of Intelligent Design.
Particularly interesting are the naturalistic observations related to mimicry and the original links to hypnology, plant neurobiology, quantum mechanics, and the philosophical conception of Giordano Bruno.
The author of this study hypothesis, presented in 2009, is Dr. Pellegrino De Rosa, an Italian agronomist, journalist, and novelist.
1) The author argues that Wilber's integral approach could be improved by making it more fully integral at all times of human value realization, not just at certain moments.
2) An example is given of how the descriptive, evaluative, normative, and interpretive methods are all necessary to fully understand and realize a value.
3) The author emphasizes distinguishing between epistemology and ontology when considering nondual experiences from Eastern traditions, noting that such experiences speak more to phenomenology than metaphysics.
Ecology and Altruism - Part 6 of Piero Scaruffi's class "Thinking about Thought" at UC Berkeley (2014), excerpted from http://www.scaruffi.com/nature I keep updating these slides at www.scaruffi.com/ucb.html
This document summarizes Spinoza's mind-body monism and its advantages over Cartesian dualism. It discusses how Descartes proposed a dualist view of the mind and body as distinct substances that interact in some unclear way. Later philosophers like Leibniz and Malebranche tried to explain this interaction but did not fully resolve the issue. Spinoza alone proposed monism, where the mind and body are two attributes of a single substance and are not distinct. This avoids the problem of how two distinct things could interact. The document argues that Spinoza's view aligns better with empiricism and growing scientific evidence that mental states arise from physical processes in the brain.
Did consciousness emerge from the cosmos, or vice versa?
Science: Human consciousness emerged from a cosmic big bang.
Religion: The cosmos emerged from Divine consciousness
How can we reconcile these complementary worldviews?
Archetypal Pattern. Fundamentals Of Non-Traditional Psychoanalysis. Chapter 3.
The authors reason that, logically, an archetype in its traditional sense cannot be the prototype (preimage) which, as an initial idea, determines the individual human psyche, because by definition it belongs to culture: it is its artifact. Consequently, according to existing semantics, an archetype can be anything except an archetype as an idea. This scientific paper examines whether the semantics of an archetype in its traditional sense corresponds to the concept of "an image" if an image is considered in terms of "a copy" or "a duplicate", and whether an archetype of culture can seriously be considered as something that directly forms the individual human psyche, as a structure that appeared long before symbolism.
The authors think that not every image is archetypal, because not every image is equal to the prototype (preimage) as an initial idea corresponding to the concept of "an archetype." An image is archetypal only when it is an individual value which, through natural analogues, allows a person to learn about his own self and about his individual qualities, recorded in the individual structure of the psyche, and which, as a result, gives a person the possibility of legitimizing his own innate qualities.
Here we try to apply to Psychiatry the concept of Possible/Parallel Worlds from Logic, which came to our knowledge through the hands of Graham Priest and through a French movie. We think this concept is ideal because we can make use of mathematical elements to draw theories of control and diagnosis, and therefore also therapeutic theories. We will make use of the new model of the psyche proposed by us to expand on a few items. Perhaps the best use of this paper is empowering the professionals of Psychiatry and Psychology by providing new tools for their studies and work. The main focus is the human psyche. In order to explain the World of God, Inner Reality, and Outer Reality, which are the divisions obtained from applying the concept of parallel worlds to the study of the human psyche, we end up paying a light, and perhaps enlightening, visit to the concepts of schizophrenia, autism, Down Syndrome, and psychopathy.
This section discusses life and the human experience from a scientific perspective. It proposes that humans experience narrative structure that creates a sense of identity through the meaning-making process. The objective world and our subjective experiences interact to form narratives that are shared between people. Understanding life requires comprehending how structure, process, and pattern interact in our experiential world.
This document summarizes Joseph A. Bracken's essay on self-organizing systems and final causality. It discusses how 17th century thinkers like Galileo shifted away from teleological views of the natural world towards mechanistic views. It then discusses how Darwin's theory of evolution by natural selection was interpreted mechanistically. Some scientists like Polanyi and Sheldrake have challenged this view by proposing theories of "morphogenetic fields" and "formative causation" that reintroduce notions of teleology. Bracken seeks to provide a metaphysical framework from Whiteheadian philosophy to support these alternative conceptions.
The document discusses convergence in social technologies and infrastructures, and how current social media systems have failed to fully utilize network properties and distribution. It proposes that deconstructing rigid socio-political systems through liberalizing network properties and realizing network systems through mutations in social reality could emanate new social infrastructures without reference to previous systems. This would cancel media and arrive at pure information and a system without mediation through new syntax or language that enables instant cognition beyond current formats.
Freud, Jung & the Hard Problem of Consciousness - cheriching
The document discusses Freud and Jung's work on the human "networking system" or mechanism of psychical continuity between individuals. It defines this system from objective and subjective perspectives. Freud and Jung described it as an inherited, hardwired structure in the newborn brain and a process of transmitting mental states between generations. They saw this system as the basis for social psychology and the development and progression of civilization.
What is Consciousness? Are Near-Death Experiences Proof of Consciousness After Death? - Paul H. Carr
I. SCIENCE: Human Consciousness Emerged from the Cosmos
In The Beginning: Energy, a hot Big Bang 13.8 Billion years ago.
II. RELIGION: Cosmos Emerged From Divine Consciousness
In The Beginning: “Spirit of God,” Gen 1:2
“Word (Logos) was God,” John 1:1
Human micro-consciousness part of cosmic macro-consciousness.
Emerson’s “Oversoul”
III. ARE NEAR-DEATH EXPERIENCES PROOF OF CONSCIOUSNESS AFTER DEATH?
Equation of Everything, i.e. Quantum Fields: the Real Building Blocks of the Universe - inventionjournals
Mind, the innermost box of nature, has not been investigated by modern physicists. Mind has not been incorporated in the Standard Model. Mind can only be studied by participatory science. Having searched for the basic building blocks of the universe, i.e. the mass part of reality, we have also investigated the mind part of reality, and finally two fundamental particles with mind and mass realities are hypothesized. Now we discuss how to further investigate mind so as to know its structures and functions. Atomic genetics is the branch of science in which we investigate the fundamental interactions of the universe, i.e. atomic transcription and translation. New words have been coined to understand the hidden science of the mind part of reality. Mind realities were recognized as different faces by “I” about 5000 years ago to Arjuna in the Mahabharata. It is just like understanding any language through alphabets. These (different faces) are the alphabets of mind reality. One mind reality has one face identity, the second mind reality has a second face identity, and so on. The facial expression represents the phenomenon of intelligence, and different faces represent different types of properties. Open eyes mean a property is activated, while closed eyes mean a property is inactivated. In spite of carrying properties of consciousness, they also know how to conduct not only the origin of the universe but also how to create two different universes, i.e. the next creation could be different from this creation. In all, it is the automatic system of the universe. The mind realities with good properties have devtas’ face identity (the first five faces on both sides), and those with bad properties have demons’ face identity (the last four faces on both sides). These are named code PCPs, or messenger atomic genes. The central face is the CCP, or Thought Script, where all thoughts of the universe are banked.
It is the bank of all information of the universe. It is the face identity of anti-mind particles, as the data of all information of the universe is stored as anti-mind particles. It is the time mindness (biological clock) that keeps on expressing different thoughts from this Thought Script (CCP). There are four more faces (black bodies), shown on the extreme left and right floating in fire, which are CPs (translating atomic genes) that translate the messages, realize them, and react accordingly. The remaining pictures are creations of different individuals and of nature (sun, moon, snake, and other pictures made on hands and body) by different thoughts of the Almighty B.B.B. The entire picture has been explained in the Geeta in 11/10 and 11. Whatever is being created in this universe is basically not by our thoughts; rather, it is the thought of the Almighty B.B.B. (Yang B.B.B., or matter B.B.B., or male B.B.B., working as the highest center of the universe) that dominates the creation and destruction of this cycle of the universe. Hence the World of Everyday Experience, in One Equation, is a Myth.
aesthetics: a philosophy of art / the recovery of virtues and principles - introduction - derek dey
A short introduction to aesthetics. The philosophy of art described here is defined by universals, the recent advances in the psychology of creativity and innate character and calling. Aesthetics is a series containing 1. the Introduction. 2. The Psychology of the Creative Self. 3. The Philosophy of Art, and 4. Models of Education. Contact the author for slide supported presentations at derekdey@gmail.com
1) The document outlines Robert Lanza's theory of biocentrism, which argues that biology and consciousness are central to understanding the universe, as evidenced by quantum experiments like the double-slit experiment.
2) Specifically, Lanza believes that consciousness causes reality rather than just observing it, and that without observation, particles exist in an undetermined quantum state.
3) The author critiques Lanza's "strong" biocentrism and argues instead for a "weak" version - that while consciousness may influence quantum phenomena, it does not determine all of reality. Observable objects seem to have characteristics independent of observation.
The story begins with the escape from the apocalyptic reality into the mind of a genius of the brush and stroke. Being an ambitious and busy mind after the Cézanne Drawing exhibition at MoMA NYC, I was compelled to apply the acquired energy to compelling research. Although my mind was still in a noisy state, I had a couple of thoughts on revisiting and reviving my old research from 2006/07. The main ambition was to make a significant imprint of earth-shattering beauty on the collective conscience.
Traveling from New York to Mexico, I decided to execute the project amid primordial beauty in a cultivated jungle paradise. Photographs were taken on the 8th of August 2021 at the exhibition at MoMA NYC and projected at a couple of locations in Tulum, Mexico, in October 2021. The event was also filmed with a GoPro camera. It was therapy for me, and I am hoping the entire project can have a therapeutic influence on the bruised collective soul and mind.
Progressing, the research expanded to comprehension of the laws of the universe and into contextual landscaping, cognitive modeling, and intelligent environment(s). In a tête-à-tête with the universe, my input was Cézanne’s artwork, and in return I got answers in the form of puzzles yet to be cracked.
The true power of the project is in unlocking the potential of the environment by relinquishing, or letting go of, the role of creator, producer, and viewer. I have sensitized the environment to be able to activate its self-producing force and intelligence. While in the first research I created a system that is implosive by nature, constantly generating systems without recollection of the previous one, in this one I have emptied the referent point/space to activate the environment, relying on the superposition property of the system to unlock the essence, to reach the energy-momentum and tap into cosmic reason.
The idea is also to dip the research into the NFT art world to gain an extra perspective of art in blockchain technology and to test its reception on alternate realities.
YouTube channel Contextual Landscaping https://www.youtube.com/channel/UCBnuw3Mz1n0j5FFGh-6cWwQ
This document summarizes a talk given by Jack Sarfatti on the topic of how the future may affect the past. Some of the key points discussed include:
1) Theories that attempt to unify general relativity and quantum mechanics suggest that time travel to the past may be possible through wormholes.
2) Quantum experiments have shown that measurements performed in the future can influence the present, raising questions about whether the universe has a predetermined destiny.
3) A post-quantum view incorporates the idea that future events contribute causal influences on present ones, with the past and future interacting at the present moment. This perspective suggests retrocausality, or the idea that the future affects the past.
1. Only certain animals with highly complex and connected brain networks can be conscious, including all mammals and some birds.
2. Everything that arises in the mind or body comes from the dynamics of brain states interacting with each other, not between brain and mind.
3. As brains evolved and enlarged over time, the range of possible brain states and associated mental states expanded, allowing for richer conscious experiences and creativity. However, consciousness remains profoundly mysterious and difficult to explain through normal scientific means alone.
Thinking about Thought - Theories of Brain, Mind, Consciousness - Part 6. Consciousness, Self, Free Will. I keep updating these slides at http://www.scaruffi.com/ucb.html
This document presents a holistic view of the universe as a divine system. It argues that the universe can be seen as a closed system that is able to generate and integrate all natural systems within it. Seeing the universe as a huge, all-encompassing system respects the laws of nature and incorporates all natural entities without chance. Conceptualizing God and the universe as having the same body maintains God's properties of being eternal and all-powerful while viewing the universe and nature as logical and systematic. Understanding the engineering of nature can help assemble reality from micro to macro scales and validate this vision of a divine, logical system of the universe.
Cao, Santiago. "Body and performance in the era of virtual communication. The..." - Santiago Cao
“To see” is an act far more complex than a purely physiological one. Among other things, acquired and inherited knowledge come into play, serving as tools to decode that which is seen in order to understand and assimilate it. And when I make this distinction between acquired and inherited, I consider the former a result of the experience of the subject, which generates a personal way of "seeing the world", unlike inherited knowledge ("seeing the world"), which is imposed by the culture that raised one (or should I say co-raised?). But "to see the world" (Ver el mundo) is not the same as "to see the world" (Ver al mundo). To make this distinction, we develop in this text the premise that "Seeing is Creating and Creating is Believing", which will then be useful for thinking that, if what we see is not what it is but what we believe it is, what then happens to devices of visual representation of "reality" and to those with the power to disseminate those devices? New technologies such as the Internet and cellular telephony have led to a break in this concept, crossing through the notions of context and paratext, expanding the creative act of "seeing" and thus generating new realities from a single observed event. And the body in all this will not be left out. We will consider what happens in Performance as an artistic discipline, where the body, which was traditionally the support for the work, is now faced with these new ways of seeing and creating it.
The document discusses the relationship between nature and humanity. It explores the perspectives of Einstein, who viewed nature as independent of humanity, and Tagore, who saw nature and humanity as interdependent. The document also examines how indigenous people rationalized their impact on nature during hunts. While humanity has evolved apart from nature, quantum mechanics suggests humans can still interact with and influence nature non-locally. Therefore, both nature and humanity share responsibility for the current natural reality.
Financial Engineering and Its Discontents by Emanuel Derman at QuantCon 2016 - Quantopian
Neoclassical finance has been with us for over half a century, and its methods have become somewhat uncritically ingrained in the minds of quants. From mean-variance optimization to options theory to behavioral finance, Dr. Derman will discuss which of these ideas work better, and which don’t.
Draft notes for keynote to The Image conference, UCLA and Common Ground, 2 December 2010. Final version will be submitted to http://ontheimage.com/journal/
Philosophy of Mind - Part 2 of Piero Scaruffi's class "Thinking about Thought" at UC Berkeley (2014) - piero scaruffi
The document discusses various theories regarding the relationship between the mind and body/brain, including dualism, monism, idealism, materialism, and neutral monism. Dualism proposes that the mind and body are separate substances that interact, while monism argues they are different aspects of the same substance. Materialism specifically states that only matter exists and the mind can be explained physically. The debate examines ideas such as consciousness, cognition, intelligence and how to define the mind.
We are reaching a critical state of the time cycle: 2012. The only thing that can save us is Truth, or knowledge of Nature, understood in a simple manner. There is a necessity to awaken humanity to Truth. Someone with media skills should read this and cause a revolution, as Wael Ghonim did, to liberate humanity and take the world to a new level of thinking.
The document discusses humanity entering a "scientific period" where scientific method will be applied to understand human nature and guide human progress according to natural laws. It asserts that establishing a "science and art" of directing human capacities and energies for human welfare is consistent with human nature.
The second document criticizes modern rulers for being largely ignorant of modern science and lacking a proper historical and anthropological background. This ignorance creates conflicts between advancing science and outdated worldviews, resulting in a "chaotic" global situation.
The third summary emphasizes studying science and mathematics as forms of human behavior produced by the human nervous system, and studying advances in various fields to understand how knowledge is obtained, with the goal of developing evaluative tools or
This document provides an overview of the Hong Kong Programs course PHIL 250: Philosophy of Mind. It introduces the instructor, textbook, and topics that will be covered in the first lecture, including ideas, concepts, science, theory, and the nature of theories. It also includes two related readings on mapping the brain and the mystery of consciousness. The summaries focus on key concepts that will be discussed in the course rather than providing a comprehensive summary of the document.
This section discusses life and the human experience from a scientific perspective. It proposes that humans experience narrative structure that creates a sense of identity through the meaning-making process. The objective world and our subjective experiences interact to form narratives that are shared between people. Understanding life requires comprehending how structure, process, and pattern interact in our experiential world.
This document summarizes Joseph A. Bracken's essay on self-organizing systems and final causality. It discusses how 17th century thinkers like Galileo shifted away from teleological views of the natural world towards mechanistic views. It then discusses how Darwin's theory of evolution by natural selection was interpreted mechanistically. Some scientists like Polanyi and Sheldrake have challenged this view by proposing theories of "morphogenetic fields" and "formative causation" that reintroduce notions of teleology. Bracken seeks to provide a metaphysical framework from Whiteheadian philosophy to support these alternative conceptions.
The document discusses convergence in social technologies and infrastructures, and how current social media systems have failed to fully utilize network properties and distribution. It proposes that deconstructing rigid socio-political systems through liberalizing network properties and realizing network systems through mutations in social reality could emanate new social infrastructures without reference to previous systems. This would cancel media and arrive at pure information and a system without mediation through new syntax or language that enables instant cognition beyond current formats.
Freud, Jung & the Hard Problem of Consciousnesscheriching
The document discusses Freud and Jung's work on the human "networking system" or mechanism of psychical continuity between individuals. It defines this system from objective and subjective perspectives. Freud and Jung described it as an inherited, hardwired structure in the newborn brain and a process of transmitting mental states between generations. They saw this system as the basis for social psychology and the development and progression of civilization.
What is Consciousness? Are Near-Death Experiences Proof of Consciousness Afte...Paul H. Carr
I. SCIENCE: Human Consciousness Emerged from the Cosmos
In The Beginning: Energy, a hot Big Bang 13.8 Billion years ago.
II. RELIGION: Cosmos Emerged From Divine Consciousness
In The Beginning: “Spirit of God,” Gen 1:2
“Word (Logos) was God,” John 1:1
Human micro-consciousness part of cosmic
macro-consciousness.
Emerson’s “Oversoul”
III. ARE NEAR-DEATH EXPERIENCES PROOF OF CONSIOUSNESS AFTER DEATH?
Equation of everything i.e. Quantum Fields: the Real Building Blocks of the U...inventionjournals
Mind, the inner most box of nature has not been investigated by modern physicists .Mind has not been incorporated in Standard model. Mind can only be studied by participatory science. Having searched Basic building blocks of the universe i.e. mass part of reality, we have also investigated mind part of reality and finally two fundamental particles with mind and mass realities are hypothesized . Now we discuss how to further investigate mind so as to know their structures and functions. Atomic genetics is the branch of science where we investigate about fundamental interactions of the universe i.e. atomic transcription and translations. New words have been coined to understand hidden science of mind part of reality. Mind reality have been recognized as different faces by “I” about 5000 years back to Arjuna in Mahabharata. It is just like to understand any language through Alphabets. These are (different faces) Alphabets of mind reality. One Mind reality has one face identity and the second mind reality has second face identity and so on. The facial expression represents phenomenon of intelligence and different face represents different types of properties carrying property. The open eyes means property is activated while close eye means property is inactivated. In spite of carrying properties conscious ness they also know how to conduct not only origin of universe but also how to create two different universe i.e. next creation could be different from this creation. In all, It is automatic system of the universe. The mind realities which are of good properties have devtas face identity (first five faces on both side and those mind realities which are of bad properties have demons face identity ( last four faces on both side) . These are named as code PCPs or messenger atomic genes. The central face is CCP or Thought script where all thoughts of the universe are banked. 
It is bank of data of all information s of the universe It is face identity of Anti mind particles as data of all information’s of the universe are stored as anti mind particles . It is the Time mind ness (biological clock) that keeps on expressing different thoughts from this thought script (CCP). There are four more faces (black bodies) shown on extreme left and right floating in fire are CPs (translating Atomic genes). That translates the messages and realizes it and reacts accordingly. Rest pictures are creation of different individuals and nature (sun, moon and snake and other pictures made on hands and body) by different thoughts of Almighty B.B.B. The entire picture has been explained in Geeta in 11/ 10 and 11.Whatever is being created in this universe is basically not by our thoughts rather it is the thought of Almighty B.B.B (Yang B.B.B or matter B.B.B. or Male B.B.B working as Highest center of the universe. ) that is dominated over creation and destruction of this cycle of the universe. Hence the World of Everyday Experience, in One Equation is Myth.
aesthetics: a philosophy of art / the recovery of virtues and principles - int... - derek dey
A short introduction to aesthetics. The philosophy of art described here is defined by universals, the recent advances in the psychology of creativity and innate character and calling. Aesthetics is a series containing 1. the Introduction. 2. The Psychology of the Creative Self. 3. The Philosophy of Art, and 4. Models of Education. Contact the author for slide supported presentations at derekdey@gmail.com
1) The document outlines Robert Lanza's theory of biocentrism, which argues that biology and consciousness are central to understanding the universe, as evidenced by quantum experiments like the double-slit experiment.
2) Specifically, Lanza believes that consciousness causes reality rather than just observing it, and that without observation, particles exist in an undetermined quantum state.
3) The author critiques Lanza's "strong" biocentrism and argues instead for a "weak" version - that while consciousness may influence quantum phenomena, it does not determine all of reality. Observable objects seem to have characteristics independent of observation.
The story begins with an escape from the apocalyptic reality into the mind of a genius of the brush and stroke. With an ambitious and busy mind after the Cézanne Drawing exhibition at MoMA NYC, I was compelled to channel the acquired energy into compelling research. Although my mind was still in a noisy state, I had a couple of thoughts about revisiting and reviving my old research from 2006/07. The main ambition was to make a significant imprint of earth-shattering beauty on the collective conscience.
Traveling from New York to Mexico, I decided to execute the project amid primordial beauty in a cultivated jungle paradise. Photographs were taken on 8 August 2021 at the exhibition at MoMA NYC and projected at a couple of locations in Tulum, Mexico, in October 2021. The event was also filmed with a GoPro camera. It was therapy for me, and I hope the entire project can have a therapeutic influence on the bruised collective soul and mind.
As it progressed, the research expanded into comprehension of the laws of the universe and into contextual landscaping, cognitive modeling, and intelligent environments. In a tête-à-tête with the universe, my input was Cézanne’s artwork, and in return I got answers in the form of puzzles yet to be cracked.
The true power of the project lies in unlocking the potential of the environment by relinquishing, or letting go of, the roles of creator, producer, and viewer. I have sensitized the environment so it can activate its self-producing force and intelligence. While in the first research I created a system that is implosive by nature, constantly generating systems without recollection of the previous one, in this one I emptied the reference point/space to activate the environment, relying on the superposition property of the system to unlock the essence, reach the energy-momentum, and tap into cosmic reason.
The idea is also to dip the research into the NFT art world, to gain an extra perspective on art in blockchain technology and to test its reception in alternate realities.
YouTube channel Contextual Landscaping https://www.youtube.com/channel/UCBnuw3Mz1n0j5FFGh-6cWwQ
This document summarizes a talk given by Jack Sarfatti on the topic of how the future may affect the past. Some of the key points discussed include:
1) Theories that attempt to unify general relativity and quantum mechanics suggest that time travel to the past may be possible through wormholes.
2) Quantum experiments have shown that measurements performed in the future can influence the present, raising questions about whether the universe has a predetermined destiny.
3) A post-quantum view incorporates the idea that future events contribute causal influences on present ones, with the past and future interacting at the present moment. This perspective suggests retrocausality, or the idea that the future affects the past.
1. Only certain animals with highly complex and connected brain networks can be conscious, including all mammals and some birds.
2. Everything that arises in the mind or body comes from the dynamics of brain states interacting with each other, not between brain and mind.
3. As brains evolved and enlarged over time, the range of possible brain states and associated mental states expanded, allowing for richer conscious experiences and creativity. However, consciousness remains profoundly mysterious and difficult to explain through normal scientific means alone.
Thinking about Thought - Theories of Brain, Mind, Consciousness - Part 6: Consciousness, Self, Free Will. I keep updating these slides at http://www.scaruffi.com/ucb.html
This document presents a holistic view of the universe as a divine system. It argues that the universe can be seen as a closed system that is able to generate and integrate all natural systems within it. Seeing the universe as a huge, all-encompassing system respects the laws of nature and incorporates all natural entities without chance. Conceptualizing God and the universe as having the same body maintains God's properties of being eternal and all-powerful while viewing the universe and nature as logical and systematic. Understanding the engineering of nature can help assemble reality from micro to macro scales and validate this vision of a divine, logical system of the universe.
Cao, Santiago. "Body and performance in the era of virtual communication. The... - Santiago Cao
“To see” is a far more complex act than a purely physiological one. There come into play, among other things, acquired and inherited knowledge, which serve as tools to decode what is seen in order to understand and assimilate it. When I make this distinction between acquired and inherited, I consider the former the result of the subject's own experience, which generates a personal way of "seeing the world", unlike inherited knowledge ("Seeing the world"), which is imposed by the culture that raised one (or should I say co-raised?). But "to see the world" (Ver el mundo) is not the same as "to see the world" (Ver al mundo). To make this distinction, we develop in this text the premise that "Seeing is Creating and Creating is Believing," which will then be useful for asking: if what we see is not what it is but what we believe it is, what happens to devices of visual representation of "reality" and to those with the power to disseminate those devices? New technologies such as the Internet and cellular telephony have broken this concept, cutting across the notions of context and paratext, expanding the creative act of "seeing" and thus generating new realities from a single observed event. And the body will not be left out of all this. We will consider what happens in Performance as an artistic discipline, where the body, which was traditionally a support for the work, now faces these new ways of seeing and creating it.
The document discusses the relationship between nature and humanity. It explores the perspectives of Einstein, who viewed nature as independent of humanity, and Tagore, who saw nature and humanity as interdependent. The document also examines how indigenous people rationalized their impact on nature during hunts. While humanity has evolved apart from nature, quantum mechanics suggests humans can still interact with and influence nature non-locally. Therefore, both nature and humanity share responsibility for the current natural reality.
Financial Engineering and Its Discontents by Emanuel Derman at QuantCon 2016 - Quantopian
Neoclassical finance has been with us for over half a century, and its methods have become somewhat uncritically ingrained in the minds of quants. From mean-variance optimization to options theory to behavioral finance, Dr. Derman will discuss which of these ideas work better, and which don’t.
Draft notes for keynote to The Image conference, UCLA and Common Ground, 2 December 2010. Final version will be submitted to http://ontheimage.com/journal/
Philosophy of Mind - Part 2 of Piero Scaruffi's class "Thinking about Thought... - piero scaruffi
The document discusses various theories regarding the relationship between the mind and body/brain, including dualism, monism, idealism, materialism, and neutral monism. Dualism proposes that the mind and body are separate substances that interact, while monism argues they are different aspects of the same substance. Materialism specifically states that only matter exists and the mind can be explained physically. The debate examines ideas such as consciousness, cognition, intelligence and how to define the mind.
We are reaching a critical state of the time cycle: 2012. The only thing that can save us is truth, or knowledge of Nature, understood in a simple manner. There is a necessity to awaken humanity to truth. Someone with media skills should read this and cause a revolution, as Wael Ghonim did, to liberate humanity and take the world to a new level of thinking.
The document discusses humanity entering a "scientific period" where scientific method will be applied to understand human nature and guide human progress according to natural laws. It asserts that establishing a "science and art" of directing human capacities and energies for human welfare is consistent with human nature.
The second document criticizes modern rulers for being largely ignorant of modern science and lacking a proper historical and anthropological background. This ignorance creates conflicts between advancing science and outdated worldviews, resulting in a "chaotic" global situation.
The third summary emphasizes studying science and mathematics as forms of human behavior produced by the human nervous system, and studying advances in various fields to understand how knowledge is obtained, with the goal of developing evaluative tools or
This document provides an overview of the Hong Kong Programs course PHIL 250: Philosophy of Mind. It introduces the instructor, textbook, and topics that will be covered in the first lecture, including ideas, concepts, science, theory, and the nature of theories. It also includes two related readings on mapping the brain and the mystery of consciousness. The summaries focus on key concepts that will be discussed in the course rather than providing a comprehensive summary of the document.
1) The document discusses a new kind of positivism founded on a model that meets Carnap's liberal physicalism and is supported by biophysical evidence.
2) It argues that experience is a primitive aspect of the world that plays a role in the formation of physical structures, in contrast to contemporary emergence theory which dismisses experience.
3) The author proposes developing a notion of sensory manifolds as the basic element of apprehension and motility, and reformulating our treatment of time, with the present having something special about it compared to the past and future.
This document discusses the scientific processes of analysis and synthesis. It explains that analysis is the intellectual operation that considers parts of a whole separately, while synthesis assembles separated parts back into a unified whole. The analysis and synthesis are used together in the scientific method to decompose phenomena into parts for study, and then reintegrate the parts to gain a full understanding. These processes allow scientists to better investigate the causes of complex phenomena.
This document provides a summary of the key ideas from the book "Understanding Computers and Cognition: A New Foundation for Design" by Terry Winograd and Fernando Flores. The book brings together topics of computer technology and human existence to generate new understandings. It draws from philosophers like Heidegger, Gadamer, Maturana, and Austin to develop a new foundation for understanding cognition and designing technology based on our situatedness in social and linguistic traditions.
A New Approach to the Hard Problem by Klee Irwin - Klee Irwin
This document proposes a theory that reality is made up of primitive "units of consciousness" that operate as a quasi-crystalline language at the Planck scale. It suggests that these units cooperate to observe and interpret their environment mathematically, expressing the patterns of physical reality through their linguistic rules and choices. By organizing in this way, the units are able to actualize the information that makes up the universe and allow for non-deterministic expressions like consciousness to emerge from larger cooperative structures. The theory aims to address both the hard problem of consciousness and fundamental questions in physics by grounding reality in a system of primitive conscious entities following an algorithmic language at the smallest scales.
This document provides an introduction to the author's paper on objectivity in science. It begins by outlining the debate around whether objectivity exists in science. The author then defines key terms like objectivity and science. The main body discusses the problem of underdetermination, which questions objectivity by showing that multiple hypotheses can be consistent with the evidence. The author argues this problem strikes a "death blow" to the idea of objective science. They intend to later argue that using perspectives and context, an intellectual consensus can be reached that approaches objectivity, though true objectivity cannot be achieved.
This document summarizes a report on creativity. It defines creativity as involving novelty and appropriateness. The cognitive mechanisms behind creativity involve a shift from associative to causal thinking. Personality traits like psychoticism and intelligence are discussed in relation to creativity, though intelligence may be more important as it allows for generating and critically analyzing novel ideas. The direction of future study may involve developing creativity in artificial intelligence.
This document discusses theories of human evolution and the emergence of human neurology and reasoning abilities. It explores monist and dualist perspectives on explaining consciousness and behavior. Quantum effects on microtubules and the evolution of sensation in organisms are considered. The development of neuronal patterns, neural systems, and the human brain are examined as factors in the rise of human phenomenology, including rational and emotional behavior. The origins of language, tool use, and reason are debated from different theoretical standpoints.
The document summarizes and evaluates the type-identity thesis, which claims that mental states are identical to physical brain states. It discusses the motivations for the type-identity thesis in response to behaviorism. It examines the arguments of philosophers J.J.C. Smart and D. Lewis in support of the type-identity thesis, including Smart's analogy to lightning and Lewis' argument based on causal roles and transitivity of identity. It also considers criticisms of the type-identity thesis, such as the "dual property" objection and the argument from multiple realizability, and how defenders of the thesis have responded to these criticisms. Overall, the document argues that the type-identity thesis as formulated by D. Lewis withstands
An Analysis of the Phenomena That Have Led Some Philosophers to Introduce the... - inventionjournals
The standpoint that all observable phenomena in the universe are fitting material for science if they are studied by the scientific method is basically positivistic. All things and facts which can be immediately learned by observation, together with their relationships and uniformities discoverable by reason without exceeding the limits of empirical observation, are designated as positivism. In positivism, the belief in the sensory observation of empirical phenomena, that is, empiricism, therefore plays a predominant part. Methodologically, positivism thus stands in controversial opposition to the metaphysical abstraction of traditional philosophy. The term metaphysical is applied to everything that aims to go beyond the sphere of empiricism and seek the hidden essence of phenomena or the ultimate cause of things.
QUESTION 11. Modern-day, more sophisticated versions of mind-bod.docx - audeleypearl
QUESTION 1
1. Modern-day, more sophisticated versions of mind-body identity theory
a. say that belief in physical science requires as much faith as belief in religion.
b. back away from saying that every single mental phenomenon that has a mental description has a physical description.
c. deny that there are any nonphysical entities such as minds or souls, so these terms do not refer to anything they are attempting to define or explain.
d. allow for the possibility that there may be some mental events--at the sub-atomic level of quarks, leptons, or hadrons, for example--that are not actually physical events.
e. believe that experiments in neurophysics prove the truth of mind-body identity theory beyond a shadow of a doubt.
0 points
QUESTION 2
1. The question that philosophers ask about how it can be possible for something physical to causally interact with something nonphysical comes under the heading of
a. the law of contradiction.
b. the appearance-reality distinction.
c. the free will problem.
d. the mind-body problem.
e. the law of cause and effect.
0 points
QUESTION 3
1. Which of the following best applies to the philosophical position of skepticism?
a. Knowledge can be attained only through experience of what is real.
b. All knowledge is relative to the knowing subject.
c. Some forms of knowledge are constituted by true but unjustified belief.
d. The human attainment of certain knowledge is impossible.
e. It is false to equate knowledge with power.
0 points
QUESTION 4
1. Identify the epistemological position which claims that the human mind is, at birth, a tabula rasa (a blank slate), onto which the facts of experience are written; moreover, the sum of our experience forms the basis of human knowledge.
a. Experiential Epistemology
b. Empiricism
c. Rational Sensationism
d. Conceptualism
e. Scientific Realism
0 points
QUESTION 5
1. According to Locke’s “Representational Theory of Knowledge,”
a. all empirical propositions are certain.
b. ideas are not caused by anything; they are original sources of knowledge.
c. only our ideas of primary qualities provide true pictures of the external world.
d. only our ideas of secondary qualities provide true pictures of the external world.
e. only innate ideas can accurately represent reality.
0 points
QUESTION 6
1. Berkeley’s epistemology leads him to the ontological position that
a. reality does not actually exist.
b. only God exists.
c. minds and ideas constitute a separate world from the physical world of matter.
d. mind and matter both exist, but we can perceive only the effects of matter.
e. all that exists in reality are minds and ideas in minds.
0 points
QUESTION 7
1. In discussing the controversy in philosophy between the empiricists and the rationalists, Russell explains that although both schools of thought got some things right and some things wrong, the rationalists were right in asserting that
a. a priori knowledge is itself a product of exp ...
CHAPTER 4: The Nature of Substance, Reality, and Mind: Idealism,.docx - christinemaritza
The document discusses different philosophical views on the nature of substance and reality, including idealism, dualism, and materialism. It summarizes the views of philosophers like Berkeley, Descartes, Kant, Hobbes, and Searle on issues like the relationship between mind and body, and whether reality is composed of physical or non-physical substances. It also discusses contemporary theories in the philosophy of mind like reductionism, identity theory, and functionalism regarding how the mind might be explained physically.
Idealism holds that the most basic unit of reality is conceptual rather than material. There are several types of idealism: subjective idealism views reality as constituted by consciousness and its contents; divine idealism sees reality as manifestations of God's mind; ontological idealism argues reality is made of ideas or concepts at its foundation; and epistemological idealism focuses on how the mind structures our understanding of reality. Idealism contrasts with materialism, which views the physical world as the only true reality and consciousness as a physical process in the brain.
Jack Oughton: Intelligent design is not science - Jack Oughton
This document discusses intelligent design and why it is not considered a scientific theory. It provides background on intelligent design and its arguments, such as that biological systems are too complex to have evolved naturally. However, the document argues these claims have been disproven as biological systems have been shown to evolve gradually from simpler precursors. It also discusses how intelligent design is not falsifiable and therefore not a valid scientific theory according to Karl Popper's definition. The document concludes intelligent design is a religious argument disguised as science and has no place being taught as such in public school science classes.
Paper # 11. READ THE ARTICLE THAT FOLLOWS THESE INSTRUCT.docx - alfred4lewis58146
Paper # 1
1. READ THE ARTICLE THAT FOLLOWS THESE INSTRUCTIONS
2. THINK ABOUT IT
3. ANSWER THE FOLLOWING QUESTION (USING 2 PHILOSOPHERS YOU HAVE READ OR READ ABOUT SO FAR IN THE CLASS). DOES MARY LEARN ANYTHING NEW WHEN SHE SEES RED FOR THE FIRST TIME? IF SHE DOES, THEN, WHAT IS IT? IF SHE DOES NOT, WHY NOT?
The paper should be:
· 12 font
· Times New Roman
· With a cover page
· A works cited page
· Cite all references and quotations made
· 3 pages
What Did Mary Know?
Marina Gerner on a thought experiment about consciousness.
Imagine a girl called Mary. She is a brilliant neuroscientist and a world expert on colour vision. But because she grew up entirely in a black and white room, she has never actually seen any colours. Many black and white books and TV programmes have taught her all there is to know about colour vision. Mary knows facts like the structure of our eyes and the exact wavelengths of light that stimulate our retinas when we look at a light blue sky.
One day, Mary escapes her monochrome room, and as she walks through the grey city streets, she sees a red apple for the first time.
What changes upon Mary’s encounter with the red apple? Has Mary learnt anything new about the colour red upon seeing the colour for the first time? Since Mary already knew everything about the physics and biology of colour perception, she must surely have known all there is to know about the colour red beforehand. Or is it possible that some facts escape physical explanations? (‘Physical’ in this sense refers to all the realms of physical science, including chemistry, biology, neuroscience, etc.). If Mary has learnt something new, then we can conclude that scientific explanations cannot capture all there is to know, argues Professor Frank Jackson, who thought up this scenario in ‘Epiphenomenal Qualia’, in The Philosophical Quarterly (1982). The story of Mary is known as the ‘knowledge argument’ and it has become one of the most prominent thought experiments in the philosophy of mind.
You might say, “Hang on a minute, how was it possible that Mary grew up in a black and white room in the first place?” Never mind the first place. Some philosophers have put forth that she wore special goggles. But this issue need not concern us, because philosophical thought experiments depend on logical coherence rather than practical feasibility. Philosophers devise such narratives to think through an imagined situation, so as to learn something about the way we understand things. Thought experiments require no Bunsen burners or test tubes; they are laboratories of the mind. In thought experiments, time travel is logically possible, but no philosophy professor is expected to travel back in time to prove their point.
Reinvigorating The Debate
The reason Professor Jackson devised the thought experiment involving Mary was to challenge the physicalist school of thought. In philosophy of mind debates, proponents of physicalism argue that what really m.
The Naturalist Challenge to Religion, Michael Ruse, Naturali.docx - dennisa15
The Naturalist Challenge to Religion
Michael Ruse
Naturalism
“Philosophical Naturalism” – an intention to let one’s philosophical discussions be as science-like and science-based as possible
“Methodological Naturalism” – the attempt to understand the world in terms of unbroken law, i.e. no appeal to supernatural interventions.
“Metaphysical Naturalism” – it claims that there is nothing beyond this natural world, e.g. no gods.
Methodological Naturalism and Metaphysical Naturalism
Many methodological naturalists are metaphysical naturalists, while some methodological naturalists are not metaphysical naturalists.
Outline
The Case for Methodological Naturalism
Objections to Methodological Naturalism
1. Inadequacy of Natural Selection
a. Origin-of-Life Objections (Plantinga)
b. Adaptations Objections (Intelligent Design)
2. Problem of Humans
a. Free Will
b. Preferences and Character Dispositions
c. Consciousness
d. Morality
3. Incoherence of Methodological Naturalism
Metaphysical Naturalism
Methodological Naturalism to Metaphysical Naturalism
The Case for Methodological Naturalism
Thesis
Methodological naturalism is true in the sense that it embodies the proper procedure for acquiring knowledge.
Initial Argument
After 400 years since the Scientific Revolution, the thesis should have been obvious by now. The world operates lawfully. We, increasingly, are knowing more and more about such laws. Anomalous or difficult to explain events have been resolved, according to unbroken law. Many religious scientists feel absolutely no tension between their religion and wholehearted methodological naturalism, since it works and they feel that they can better reveal and understand God’s creation.
Naysayers
There are invokers of miracles (actual violations of the laws of nature). At some level, you cannot argue with them. But, from the standpoint of evidence and reason, it is more reasonable to conclude that an alleged miraculous event is likely explicable naturalistically.
Existence and Nature of Organisms: Organisms are adaptively organized, i.e. not just thrown together randomly but complex, integrated, and functioning (in accordance with their “final causes” – means to ends). For example, hands and eyes have purposes. Hence, the objection runs, there is a need for an intelligent designer.
For naturalists, Darwin already solved this problem by proposing natural selection. For Darwin, all organisms are the end product of a long and slow process of change. Some, the fittest, survive and reproduce, and their distinctive features (showing final cause) are passed on to their offspring. Over time, a change in the direction of adaptive advantage is produced. After Darwin, we reject Aristotelian special life forces, which direct organisms or their parts to ends. But the metaphor of design remains among Darwinists, since natural selection produces design-like entities, e.g. the eye.
Objections to Methodological Naturalism
1. Inadequacy of Natural Selecti.
For most of the twentieth century a "brain-first" approach dominated the philosophy of consciousness. The idea was that the brain is the thing we really understand, through neuroscience, and the task of the philosopher is to try to understand how that thing "gives rise" to subjective experience: to the inner world of colours, smells and sounds that each of us knows in our own case. This philosophical project has not gone all that well: nobody has provided even the beginnings of a satisfying solution to what David Chalmers called "the hard problem" of consciousness.
Similar to Teleology in Evolution - A Presentation for the New Orleans C G Jung Society
Evidence of Jet Activity from the Secondary Black Hole in the OJ 287 Binary S... - Sérgio Sacani
We report the study of a huge optical intraday flare on 2021 November 12 at 2 a.m. UT in the blazar OJ287. In the binary black hole model, it is associated with an impact of the secondary black hole on the accretion disk of the primary. Our multifrequency observing campaign was set up to search for such a signature of the impact based on a prediction made 8 yr earlier. The first I-band results of the flare have already been reported by Kishore et al. (2024). Here we combine these data with our monitoring in the R-band. There is a big change in the R–I spectral index, by 1.0 ± 0.1, between the normal background and the flare, suggesting a new component of radiation. The polarization variation during the rise of the flare suggests the same. The limits on the source size place it most reasonably in the jet of the secondary BH. We then ask why we have not seen this phenomenon before. We show that OJ287 was never before observed with sufficient sensitivity on the night when the flare should have happened according to the binary model. We also study the probability that this flare is just an oversized example of intraday variability, using the Krakow data set of intense monitoring between 2015 and 2023. We find that the occurrence of a flare of this size and rapidity is unlikely. In machine-readable Tables 1 and 2, we give the full orbit-linked historical light curve of OJ287 as well as the dense monitoring sample of Krakow.
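The R–I spectral-index change reported above rests on simple two-band photometry: a shift in the flux ratio between two bands maps directly to a shift in the two-point spectral index. A minimal sketch of that arithmetic (the fluxes and the band effective wavelengths below are illustrative assumptions, not values from the paper):

```python
import math

def spectral_index(f_r, f_i, lam_r_nm=658.0, lam_i_nm=806.0):
    """Two-point spectral index alpha, defined here by F_nu ~ nu**(-alpha).
    Effective wavelengths for the R and I bands (nm) are illustrative."""
    nu_r = 1.0 / lam_r_nm  # frequencies up to a shared constant factor
    nu_i = 1.0 / lam_i_nm
    return -math.log(f_r / f_i) / math.log(nu_r / nu_i)

# Hypothetical fluxes (arbitrary units): a redder quiescent background
# versus a flatter, bluer flare spectrum.
alpha_quiescent = spectral_index(f_r=1.0, f_i=1.45)
alpha_flare = spectral_index(f_r=1.0, f_i=1.18)

# A positive delta of order unity signals a new, bluer emission component.
delta_alpha = alpha_quiescent - alpha_flare
```

With these made-up fluxes the index change comes out near 1, which is only meant to show how a modest color change translates into an index change of the size the abstract quotes.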
Signatures of wave erosion in Titan’s coasts - Sérgio Sacani
The shorelines of Titan’s hydrocarbon seas trace flooded erosional landforms such as river valleys; however, it is unclear whether coastal erosion has subsequently altered these shorelines. Spacecraft observations and theoretical models suggest that wind may cause waves to form on Titan’s seas, potentially driving coastal erosion, but the observational evidence of waves is indirect, and the processes affecting shoreline evolution on Titan remain unknown. No widely accepted framework exists for using shoreline morphology to quantitatively discern coastal erosion mechanisms, even on Earth, where the dominant mechanisms are known. We combine landscape evolution models with measurements of shoreline shape on Earth to characterize how different coastal erosion mechanisms affect shoreline morphology. Applying this framework to Titan, we find that the shorelines of Titan’s seas are most consistent with flooded landscapes that subsequently have been eroded by waves, rather than a uniform erosional process or no coastal erosion, particularly if wave growth saturates at fetch lengths of tens of kilometers.
PPT on Alternate Wetting and Drying presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' on April 22, 2024.
BIRDS DIVERSITY OF SOOTEA BISWANATH ASSAM.ppt.pptx - goluk9330
Ahota Beel, nestled in Sootea, Biswanath, Assam, is celebrated for its extraordinary diversity of bird species. This wetland sanctuary supports a myriad of avian residents and migrants alike. Visitors can admire the elegant flights of migratory species such as the Northern Pintail and Eurasian Wigeon, alongside resident birds including the Asian Openbill and Pheasant-tailed Jacana. With its tranquil scenery and varied habitats, Ahota Beel offers a perfect haven for birdwatchers to appreciate and study the vibrant birdlife that thrives in this natural refuge.
Discovery of An Apparent Red, High-Velocity Type Ia Supernova at z = 2.9 wi... - Sérgio Sacani
We present the JWST discovery of SN 2023adsy, a transient object located in the host galaxy JADES-GS+53.13485−27.82088, with a host spectroscopic redshift of 2.903 ± 0.007. The transient was identified in deep James Webb Space Telescope (JWST)/NIRCam imaging from the JWST Advanced Deep Extragalactic Survey (JADES) program. Photometric and spectroscopic followup with NIRCam and NIRSpec, respectively, confirm the redshift and yield UV-NIR light-curve, NIR color, and spectroscopic information all consistent with a Type Ia classification. Despite its classification as a likely SN Ia, SN 2023adsy is both fairly red (E(B−V) ∼ 0.9) despite a host galaxy with low extinction, and has a high Ca II velocity (19,000 ± 2,000 km/s) compared to the general population of SNe Ia. While these characteristics are consistent with some Ca-rich SNe Ia, particularly SN 2016hnk, SN 2023adsy is intrinsically brighter than the low-z Ca-rich population. Although such an object is too red for any low-z cosmological sample, we apply a fiducial standardization approach to SN 2023adsy and find that the SN 2023adsy luminosity distance measurement is in excellent agreement (≲ 1σ) with ΛCDM. Therefore, unlike low-z Ca-rich SNe Ia, SN 2023adsy is standardizable and gives no indication that SN Ia standardized luminosities change significantly with redshift. A larger sample of distant SNe Ia is required to determine if SN Ia population characteristics at high-z truly diverge from their low-z counterparts, and to confirm that standardized luminosities nevertheless remain constant with redshift.
The cost of acquiring information by natural selectionCarl Bergstrom
This is a short talk that I gave at the Banff International Research Station workshop on Modeling and Theory in Population Biology. The idea is to try to understand how the burden of natural selection relates to the amount of information that selection puts into the genome.
It's based on the first part of this research paper:
The cost of information acquisition by natural selection
Ryan Seamus McGee, Olivia Kosterlitz, Artem Kaznatcheev, Benjamin Kerr, Carl T. Bergstrom
bioRxiv 2022.07.02.498577; doi: https://doi.org/10.1101/2022.07.02.498577
PPT on Direct Seeded Rice presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' workshop on April 22, 2024.
PPT on Sustainable Land Management presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' workshop on April 22, 2024.
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdfSelcen Ozturkcan
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
Teleology in Evolution - A Presentation for the New Orleans C G Jung Society
1. Teleology in Nature
From Awareness to Adaptation and the link
between Psyche and Matter
Saturday, September 1, 18
Slide 1:
- Contribute what I can: Science now achieving Jung’s dream to flesh out his theory of the archetypes, specifically their psychoid nature - that is the “missing
link” between matter and psyche
- Pauli and Jung jumping the gun with physics; reclaim biology
2. Jung and Archetypes
Split of Self/Other
Male and Female
Sacrificial Hero -
transformation
Categories of thought =
categories of reality
Matrix for framing all
experience
Slide 2:
- “The archetypal structure of the unconscious corresponds to the average run of events... they are variations of certain typical occurrences”
- Archetypes; psychological matrices; categories of thought and ultimate categories of reality; they are equivalent to the German philosophical concept of
categories (Hegel, Fichte - that’s why Anglo-Americans have such a difficult time - they are forms without content, molds, matrices, into which experiential reality
automatically falls and crystallizes into particular shapes)
- Perceived indirectly; the images which partly reflect their aspects are not equivalent to the archetype itself, which is a liminal phenomenon at the edge of the subjective
and objective, the world of matter and the realm of ideas; the map is not the terrain, the image is not the reality - “Ceci n’est pas une pipe”
- Objective experience is collective subjective experience
3. The Psychoid Archetypes
Life transcends itself -
individual and collective
“the bridge to matter in
general”
As above so below
Objective reality is collective subjective reality
Slide 3:
- “…the position of the archetype would be located beyond the psychic sphere, analogous to the position of physiological instinct, which is
immediately rooted in the stuff of the organism and, with its psychoid nature, forms the bridge to matter in general.” Carl Jung, CW 8, Para 420.
- Archetypal categories have validity on all levels; our experience is the experience of our constituent parts
- Science as struggle with the material manifestation
4. Ontological Reductionism
Experience and
Meaning
Self-identity; goal
seeking and perception
AI “Frame Problem” vs
Heideggerian AI
das Ding an sich? Nein!
Source: “The Minds of Machines” by Namit Arora
Walter Freeman - Neurodynamics
The problems encountered in developing “symbolic AI” best illustrate the flaws of reductionism. It was assumed that since people are essentially
robots, it would be easy to make a machine that behaves like a human. It turns out that framing the world is the most important function of
consciousness, that is, behavior is a constant adjustment of external circumstances to past experience and drives and goals. Symbolic AI developers
assumed that goals and agency are indeed illusions contributing nothing to the behavior of systems, and emboldened by psychology department
colleagues who were more interested in the similarities between humans and pigeons than their differences, embarked on a formalized top-down
approach to AI development.
“Why Heideggerian AI Failed and how Fixing it would Require making it more Heideggerian” by Hubert L. Dreyfus
“My interlocutors countered that although extremely complex, the human brain is clearly an instance of matter amenable to the laws of physics.
They posited a reductionist and computational approach to the brain that many, including Steven Pinker and Daniel Dennett, continue to champion
today… Our intelligence, and everything else that informs our being in the world, had to be somehow coded into our brain’s circuitry – including the
great many symbols, rules, and associations we rely on to get through a typical day. Was there any reason why we couldn’t decode this, and
reproduce intelligence in a machine some day? Couldn’t a future supercomputer mimic our entire neural circuitry and be as smart as us?…They
assumed that our brain stored discrete thoughts, ideas, and memories at discrete points, and that information is ‘found’ rather than ‘evoked’ by
humans. In other words, the brain was a repository of symbols and rules which mapped the external world into neural circuits. And so the problem of
creating AI was thought to boil down to creating a gigantic knowledge base with efficient indexing, ie, a search engine extraordinaire. That is, the
researchers thought that a machine could be made as smart as a human by storing context-free facts, and rules which would reduce the search time
effectively. Marvin Minsky of MIT’s AI lab went as far as claiming that our common sense could be produced in machines by encoding ten million facts
about objects and their functions. It is one thing to feed millions of facts and rules into a computer, another to get it to recognize their significance and
relevance. The ‘frame problem’, as this last problem is called, eventually became insurmountable for the ‘symbolic AI’ research paradigm. One critic,
Hubert L. Dreyfus, expressed the problem thus: “If the computer is running a representation of the current state of the world and something in the
world changes, how does the program determine which of its represented facts can be assumed to have stayed the same, and which might have to be
updated?”” [this is an extension of the Duhem-Quine thesis]
Of course, reality does not consist of facts, but of what Heidegger called “readiness-to-hand” or what J.J. Gibson called “affordances” or evoked
contextual relationships framed by internal goals of the subject. When you want to sit down, you begin seeing potential chairs everywhere. When
you’re hungry your colleagues in the elevator become food, not too soon, I hope. We do not experience objects as “object” but only in their function
so that the object-as-such disappears when we experience it. Reality changes constantly and requires constantly renewed evaluation and pruning of
internal models, which would be too costly and difficult, especially if done consciously. Instead, the “model of reality” is subjective reality itself - what
you get is what you seek and what you seek is what you get.
Coping with reality is natural and rarely involves active “problem solving”, or representational, abstract thought. Instead, all you need is “dynamic
coupling” with the world. An expert chef doesn’t measure his ingredients or use the same amount each time - true mastery is so adaptive as to not be
6. Neo-Darwinian Paradigm
Julian Huxley - 1942
Synthesis of Mendelian
genetics (population
genetics - alleles)
“natural selection” (pos/
neg environmental
pressure)
Why not acquired
characteristics? No
reason apart from
philosophical biases
Slide 2: Darwin proposed that organisms change over time due to the selection of sufficiently fit individuals by the environment - as the environment changes, those
individuals with better variations tend to survive. The modern view of the mechanism of this adaptation is called the “Neo-Darwinian Paradigm” or the “Modern Synthesis.”
The latter name more accurately reflects the origins of the theory: a synthesis of Mendelian population genetics with the aforementioned Darwinian theory of Natural
Selection.
Lamarck famously proposed that acquired characteristics of an animal can be passed on to its offspring, but no mechanism was known to explain this. This idea is in no
way incompatible with Darwin’s theory of Natural Selection - it merely provides an alternative or parallel mechanism by which an organism evolves. It suggests agency, a
will to power, on the part of the living system, and historically it has been ridiculed and denied by many biologists (Huxley, Dawkins, Dennett, Ernst Mayr) because it flew in
the face of the reductionistic-materialistic models of biology that became dominant in the 1950s.
How do variants arise in the first place though? Random mutation was proposed later, but has become an utterly unfounded fundamental assumption in population
genetics - as we shall see, random mutations are the rare exception rather than the rule when it comes to genetic change.
(These lovely ladies are bdelloid rotifers - we’ll come back to them later)
7. Origins
Darwin did not firmly
propose a mechanism
Germplasm Theory -
Weismann - 1890’s
Environment and
“germplasm”
interaction
[Diagram: the germplasm is passed directly from generation to generation; the somatoplasm branches off from it anew in each generation]
Slide 3: Charles Darwin himself did not propose a mechanism - in fact he speculated that sperm and eggs, the germ cells, were influenced by somatic signals from the body
of the organism. However, the Modern Synthesis, which rejects the heritability of acquired characteristics, has its origins in the Germplasm theory of Weismann - he
proposed that the hereditary material generated the “soma” (body) but was passed down independently. To quote his 1893 work, “we must assume natural selection [as
opposed to inheritance of acquired characteristics] to be the principle of explanation of the metamorphoses, because all other apparent principles of explanation fail us, and it
is inconceivable that there could yet be another capable of explaining the adaptations of organisms, without assuming the help of a principle of design.” He was the first to frame a
false dichotomy between the two evolutionary theories of “design,” presumably by God or aliens, on the one hand, and, on the other hand, the random number generator of
nature playing against the random number generator “germplasm,” aka DNA, mutations. He did not conceive of any capacity for intentional change to the organism itself.
8. Origins
“Central Dogma” -
Crick
Fisher - small,
separate, additive
allelic contribution to
phenotype
[Diagram: DNA → RNA → Protein in each generation, with DNA replicated from generation to generation]
Slide 4: The Germplasm theory became the “Central Dogma,” an ironically-named hypothesis of Crick, a hard-headed materialist (despite rumors that LSD inspired his
vision for the structure of DNA). The basic premise of this “dogma” states that information can only flow from DNA to protein via RNA. This has been completely invalidated,
and we now know (and have known for decades) that information can flow in every conceivable direction in the cell.
In Of Molecules and Men, he wrote, “The ultimate aim of the modern movement in biology is to explain all biology in terms of physics and chemistry.” This aim set
biology on a dead-end track for the last half-century. Instead of focusing on what makes you different from the wood you’re sitting on, biologists have tried to understand life
by adding up its inanimate components rather than studying what emerges from the sum of those parts. But as we shall see, despite a desperate rear-guard effort by militant
materialists to stymie public dissemination and discourse by ignoring contrary observations, this reductionist model has been invalidated since the 1960s.
9. Ad Hoc Mods - Genes vs
Population Phenotypes
Haldane - Random mutations
independent of utility;
independent of each other
Kimura, King & Jukes - Neutral
Theory
Gould & Eldredge - Punctuated
Equilibrium?
Genetic Drift
Slide 5: Now every theory needs revisions, so other great figures in the development of the Modern Synthesis, especially Haldane, demonstrated mathematically that very
slight selective pressures combined with random mutation could result in very rapid evolutionary change - too fast to be resolved in the fossil record, yet too slow to show up
in field measurements. Further modifications to the theory - the addition of the concepts of Neutral Mutations and Genetic Drift (random assortment of alleles) -
shored up other problems with the paradigm. One theory, called “Punctuated Equilibrium” (not so much that change happens rapidly, but that it often doesn’t occur for some
reason), from Stephen Jay Gould, is often lambasted by the likes of Dawkins, although others support its compatibility with the Modern Synthesis. As a developmental biologist
he was more interested in the origin of variation among organisms than in population-sized “gene pools” and “pressures.”
Stephen J. Gould said: "If acquired characters are inherited only rarely and weakly, then Lamarckism might aid natural selection in developing adaptation
more quickly - a position advocated by Darwin himself throughout the Origin. But if acquired characters are inherited faithfully all the time, then natural selection
will be overwhelmed and Lamarckism becomes a refutation of Darwinism. Relative frequency determines the distinction." The Structure of Evolutionary Theory
(2002).
In the classic view of the Modern Synthesis, ‘‘evolution’’ is defined as shifting the frequencies of genes in the ‘‘gene pool,’’ which maintains an abundance of
infinitesimal ‘‘random’’ variation; accordingly, evolutionary causes are conceived as ‘‘forces’’ (‘‘pressures’’) that cause mass–action shifts, that is, classic
evolutionary theory is a theory of forces [acting on a “pool”] (Sober 1984). In this view, mutation is seen as a weak force opposed effectively by selection, given
that mutation rates are so low (Yampolsky and Stoltzfus 2001; Stoltzfus 2006b). This view suggests that an effect of mutation would depend on special conditions
such as unnaturally high rates of mutation, or absence of selection, that is, neutral evolution. Indeed, the research literature associates mutation-biased evolution
with neutral evolution [absence of selection].
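The mass-action picture sketched in these notes - weak selective "pressure" plus drift acting on allele frequencies in a gene pool - can be made concrete with a minimal Wright-Fisher simulation. This is my illustrative sketch, not material from the talk; the population size, selection coefficient, and starting frequency are arbitrary:

```python
import random

def wright_fisher(p, N, s, generations):
    """Track the frequency p of an allele with selective advantage s
    in a population of N individuals. Each generation combines a
    deterministic selection step with binomial resampling (drift)."""
    history = [p]
    for _ in range(generations):
        # selection: expected frequency after weighting by fitness 1+s
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        # drift: resample N individuals around that expectation
        p = sum(random.random() < p for _ in range(N)) / N
        history.append(p)
        if p in (0.0, 1.0):   # allele lost or fixed
            break
    return history

random.seed(1)
# A 1% advantage starting at 10% frequency in a population of 1000:
traj = wright_fisher(p=0.1, N=1000, s=0.01, generations=2000)
print(f"final frequency after {len(traj) - 1} generations: {traj[-1]:.2f}")
```

Even with s = 0.01 - far too small to measure in the field - the favored allele usually sweeps to fixation within a few thousand generations, which is Haldane's point about weak pressures producing geologically rapid change.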
10. Crucial Questions
How did life arise? If new genes are
modified old genes, how do new genes
arise in individuals?
Are natural selection and “genetic
drift” the only mechanisms of change
in a population - are acquired traits
heritable?
Can organisms act to alter their
genome? Is there a “meta-genome?”
Are mutations random? Independent?
Infinitesimally variable? Unintended?
Slide 6: In summary, the Modern Synthesis (I prefer this term as it is more accurate and including Darwin in this mess is unfair) consists of: 1) Natural selection (and
random genetic drift and a few other possible minor mechanisms) of random alleles as the main driving force of evolutionary change. 2) That population variation results
from different frequencies of gene variants and the steady accumulation of small, random genetic changes provides the variation that the whims of nature act upon. And 3)
the genetic models of the “Central Dogma” - that mutations are rare, gradual, random, “mistakes” - help to justify this view.
We cannot test natural selection by scientific means. But we can test the theory about the nature of the genome, its mutations, and how they arise. The problem has been
that theory dictated what scientists saw. Let’s free ourselves of our prior assumptions and ask a few reasonable questions.
1) First, the biggest one is how natural forces pressured inorganic matter to acquire the characteristics of life in the first place. No complete theories exist. Testability is
limited. We’ll ignore this question, but I may touch on it later if time permits.
Can organisms genetically modify themselves and their offspring for a purpose? Why not? Are organisms “aware” of the genetic material, i.e. does the system have a meta-
level of encoding what genes are where and the structure/modifications present on the chromosome allowing internal modification? Have these questions been asked and
funded or are scientists as psychologically primitive and backward as the cardinals of the late medieval church they like to make fun of?
Are mutations truly random and independent of each other and the environment? Obviously there are mutagenic chemicals which disrupt the structure of DNA, DNA-
repair mechanisms, or the chromosomal structures, but these are external toxins and act (usually) nonspecifically, randomly, and directly. This in fact proves organisms are at
least somewhat aware of the genome (otherwise there would be no repair mechanisms), but the random-mutation model suggests that errors are the result of failure to
dutifully copy or failing to recognize and repair a disruption. “Positive” errors, or intentional active modifications are simply rejected as impossible by so-called “skeptics” -
why not look for them?
11. A Clockwork Universe
Descartes, Newton,
Laplace had it wrong
Materialism is inherently
theistic
At best probabilistic
(depending on ontological
interpretation of
quantum theory)
Bad at describing high-
level phenomena
Worldview              God           Nature
Traditional Christian  Interactive   Animate
Early Enlightenment    Interactive   Machine
Enlightenment Deism    Creator Only  Machine
Romantic Deism         Creator Only  Animate
Romantic Atheism       None          Animate
Reductive Materialism  None          Machine
From Rupert Sheldrake’s “Science Set Free”
“Gone, alas, is his faith in his dignity, uniqueness,
irreplaceableness in the rank-ordering of beings - he has
become animal, literally, unqualifiedly, and unreservedly an
animal, man who in his earlier faiths was almost God - child of
God, man of God” - Nietzsche, Genealogy of Morals
Slide 30: But surely, even if living organisms are not “mechanical,” they must be deterministic, just like everything in the Universe. Well, the Universe doesn’t seem to be
deterministic. At best, according to the most “determined” interpretation of Quantum Theory - i.e., that which claims the theory to be an ABSOLUTE, ACTUAL description
of fundamental reality, in which things exist as probability clouds - the Universe may be said to be probabilistically determined. The only things that are “determined,” or fixed, are
probabilities of events rather than mechanical cause-effect chains - but these probabilities diverge rather quickly in all but the smallest, simplest systems or those at or near
equilibrium (see the work of Nobel Laureate Viscount Ilya Prigogine, summarized in The End of Certainty).
In fact, it may be argued that the “clockwork universe” models of Descartes, Newton, and Laplace (with his famous demon), which are still the basis of modern,
enlightenment era rationalist philosophy - expounding rational cause-effect chains as the only legitimate means of understanding the world - are inherently theistic (see table).
The modern twist is the replacement of dualism with a materialistic monism - the ultimate expression of rationalism, which ironically, negates the very concept of rationality or
any theory of mind as not real or meaningful, a mere epiphenomenon. This, in turn, leads to the bleak nihilism predicted by Nietzsche. In fact, Nietzsche had some choice
words for modern science!
“Extreme positions are not succeeded by moderate ones but by extreme positions of the opposite kind. Thus the belief in the absolute immorality of nature, in aim-
and meaninglessness, is the psychologically necessary affect once the belief in God and an essentially moral order becomes untenable… It is the most scientific
of all possible hypotheses. We deny end goals: if existence had one it would have to have been reached. That science is possible in this sense that is cultivated
today is proof that all elementary instincts, life's instincts of self-defense and protection, no longer function. We no longer collect, we squander the capital of our
ancestors, even in the way in which we seek knowledge…
a. In the natural sciences ("meaninglessness"); causalism, mechanism. "Lawfulness" an entr’acte, a residue.”
…classic Freddy!
12. Emergent Physics of Life
Varied exposure, pressure
Information storage
Self-awareness of genome
Formulation of meaning/
problems from internal and
external state
Self-engineering capacity
Horizontal vs Vertical leaps
13. Creativity is “Real”
What is life?
“cooperative, complex
cybernetic systems”
Self Awareness
Genomic “Web”
Organisms are not
Turing machines - they
are self-referential,
self-modifying
Gödel's Theorem
lemma - a closed system
cannot create a more
complex system
Slide 27: Many smart people claim (and this is the underlying assumption of the Modern Synthesis) that there is “In truth, only atoms and the void” colliding randomly
and deterministically, and that life is no different from any mechanical interaction. Somehow these fixed entities have randomly combined into self-aware agents. But life is clearly
different in its behavior from non-living matter, and it takes a lot of intellectual rationalization to explain away this evident difference. A better way of describing life is as a complex,
self-engineering, cybernetic system. “Complex” means it is a self-organizing system through which a flux of matter and energy flows (i.e. “open”) in certain consistent patterns (as
described by complex systems theory - chaos, attractors, ergodicity, etc). A “cybernetic system” is actually anything as simple as a thermostat and heater. A sensor senses the
environment, compares it to an internal set-point, and changes the environment until the feedback shuts it off again. This is what cells and organisms do to maintain a
constant internal and external environment, but for some reason extending this to the genome is controversial for no other reason than the historical bias of Weismann’s theory.
Living systems, contrary to the Modern Synthesis, are able to modify themselves on every level, as we’ve seen. The problem is that now they have agency and cannot be
reduced to inanimate matter.
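The thermostat-and-heater loop described above - sense, compare to a set-point, act, let the environment feed back - fits in a few lines. This is a toy sketch of my own; the set-point, heating rate, and heat-loss rate are invented for illustration:

```python
def thermostat_step(temp, setpoint, heater_on):
    """One cycle of a minimal cybernetic loop: a sensor reading (temp)
    is compared to an internal set-point, the actuator (heater) is
    switched, and the environment responds before the next reading."""
    heater_on = temp < setpoint           # compare sensor to goal
    temp += 0.5 if heater_on else -0.3    # heating vs. ambient heat loss
    return temp, heater_on

temp, heater = 15.0, False
for _ in range(50):                       # negative feedback holds temp near 20
    temp, heater = thermostat_step(temp, setpoint=20.0, heater_on=heater)
print(round(temp, 1))
```

Nothing in the loop "knows" physics; the stability is a property of the feedback arrangement itself, which is the sense in which a cell maintaining homeostasis is doing the same kind of thing at vastly greater depth.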
“The definition of sets leads to logical paradoxes (Russell-type, like the famous barber paradox: a barber is a person who cuts the hair of every man who does
not cut his own hair.) when we try to include a notion of self-reference. Russell and others have devoted much effort to construct formal axiomatic systems free of
inherent logical paradoxes. Gödel's theorem [72,73] proved that they all have to be “incomplete,” including the Principia Mathematica of Russell and Whitehead. It
is important to emphasize that Gödel's theorem applies to closed systems which are also fixed in time. I propose that one has to take an entirely different
approach and not start with the notion of sets of elements. I believe that here is exactly where the reductionist approach fails. We cannot reach self-awareness
starting from passive elements, no matter how intricate is their assembly. I propose to replace elements by agents that possess internal structure…” [The sort of
Munchausen-esque “bootstrapping” that seems to occur does not relate to Gödel’s theorem which suggests that a system cannot develop another system more complex than
itself, a long-derived corollary from his basic statement that a formal logical system cannot be both complete and consistent, so that a self-consistent system must make
reference to a larger set of axioms provable only outside of itself, ie, be unprovable.]
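The Russell-type paradox the quote alludes to can be stated in one line of naive set theory (a standard formulation, added here for reference):

```latex
R = \{\, x \mid x \notin x \,\} \quad\Longrightarrow\quad R \in R \iff R \notin R
```

The barber is the same schema in costume: if the barber shaves exactly those men who do not shave themselves, then he shaves himself if and only if he does not - the contradiction that self-reference forces on any "closed" formal system of this kind.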
Ben-Jacob noted that because the bacterial colony is greater than a single bacterium, its “solutions” can involve modifying and improving its constituent parts (bacterial
genomes, which are self-aware agents, not merely passive “elements”) and thus solve problems and challenges that bacteria on their own would not be able to overcome. This
cooperative self-improvement represents a “vertical” genomic leap rather than more common “horizontal” change on the same organizational or systemic level. That’s the
“cooperative” bit right there.
The concept of a “Universal Turing Machine” is the basic computational model developed by Alan Turing. In fact, all computers (non-quantum) can be modeled using
this theoretical set of rules which consists of nothing more than reading and manipulating symbols on an infinite tape. The programs are nothing more than instructions on
moving and modifying the symbols on each square of infinite tape. Surprisingly ALL computation maps to this basic model. The common thought-cliche that organisms are
nothing more than biological robots is fundamentally wrong because of life’s recursive complexity and ability to change underlying “rules” and structure, as Ben-Jacob
clearly explains, and as the evidence so far suggests. We will consider more evidence in the following slides, and return to the philosophical implications of these ideas and
their opposition to the prevailing paradigm.
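The Turing machine just described - a fixed control table reading and writing symbols on an unbounded tape - can be simulated in a few lines. This is my sketch, not Ben-Jacob's; the example program (a unary incrementer) is a hypothetical illustration:

```python
def run_turing_machine(program, tape, state="start", head=0, max_steps=1000):
    """program maps (state, symbol) -> (new_state, write_symbol, move).
    The control table is static: nothing the machine does can rewrite it,
    which is exactly the contrast Ben-Jacob draws with a self-modifying genome."""
    cells = dict(enumerate(tape))          # sparse "infinite" tape, '_' = blank
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        state, write, move = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Hypothetical program: append one '1' to a unary number.
program = {
    ("start", "1"): ("start", "1", "R"),   # scan right over the 1s
    ("start", "_"): ("halt",  "1", "R"),   # write a 1 at the end and halt
}
print(run_turing_machine(program, "111"))  # prints 1111 (unary 3 becomes 4)
```

Note that the machine's "program" and its tape are strictly separated; the talk's argument is that living systems violate precisely this separation by rewriting their own rules.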
“…metaphorically speaking, the genome includes a user with a computational unit and a hardware engineer with a team of technicians for continuous design
and implementation of changes in the hardware. Such a complex is beyond a universal Turing machine. In the latter, the structure is static and is decoupled from
the input/output and the computation process. The genome is a dynamic entity. If its structure changes adaptively it does so according to the performed
computations. It implies that the genome is capable of self-reference, has self-information and, most crucially, has self-awareness. The user represents the ability
of the genome to recognize that it faces a difficulty (imposed by the environmental conditions), formulate the problem associated with the difficulty and initiate a
search for its solution”
If you’re interested in Ben-Jacob’s work and more pretty pictures of bacterial colonies, look up his papers and his great youtube presentation.
Source: Bacterial wisdom, Gödel's theorem and creative genomic webs, Eshel Ben-Jacob. Physica A. Aug 1998; 57-76
Antigenic variation and phase variation
http://shapiro.bsd.uchicago.edu/TableII.5.shtml
14. Artificial vs Natural
Meaning and agency
Semantic and pragmatic
communication
Self-identity and
associate identity
Generation and
assignment of meaning
Intentional behavior
Self-modification
Input, Output, Control
Energy and waste exchanged
with environment
Genetic code as program
Output is self-assembly,
duplication and search algos
“Finite control” -
reproduction errors are the
source of variation
This prevailing instinct and fashion which would much rather come
to terms with absolute randomness, and even the mechanistic
senselessness of all events, than the theory that a power-will is
acted out in all that happens.
- Genealogy of Morals
Slide 31: Champions of materialistic philosophy (Crick, Dawkins, Marvin Minsky, Daniel Dennett, and most of the famous so-called “skeptics”) claim that humans are
“nothing but biological robots,” or some other metaphor that reduces the complexity of human beings to material interactions. As we’ve seen, however, even a bacterium
exceeds the basic functional and computational limitations of a computer, as modeled by a Universal Turing Machine, which has a “finite control” or “program script” and an
infinite tape on which it reads and modifies symbols - no context dependence, no self-identity, and therefore no self-writing or self-modification capacity. How embarrassing!
A change in the environment has no meaning or relevance for a Turing Machine in the way it does for a living system, whose actions are goal-oriented (i.e. teleological)
towards survival. Throwing a puppy and a cell phone into a river is not equivalent. It’s naively obvious that a living system cannot be “dead” like a machine. Simply the
existence of our own subjective experience of reality demonstrates that we are different from a Universal Turing Machine, but amazingly enough many intelligent
philosophers and scientists hold on to the bizarre and empirically false belief that human self-awareness is an “illusion.” Nor is it exactly clear who or what is being deceived
if subjective experience itself is the illusion. In truth, only atoms and the void…
Here we have a comparison between the two models (taken from Ben-Jacob’s paper). You can also see a diagram of the Universal Turing Machine below, which
effectively describes the function of all computers (see slide 18 for details). Also included is a cybernetic diagram created by psychologist Jordan Peterson (from “Maps of
Meaning”) describing the basic pattern of mammalian behavior. It shows a present state, certain assumptions about the internal and external environment, and a modeled
“ideal” internal/external state to which an organism strives via planned and adaptive goal-seeking behavior. The essential feature is novelty - the constantly emerging chaos of
nature can only be modeled in very limited dimensions by an individual organism, almost entirely by the organism’s goals and predictions of future states. Adaptation and
survival is therefore an active teleological pursuit starting at extremely basic (as we have seen, even molecularly basic) processes in the organism - processes which go all the
way to the highest technical, artistic, and spiritual pursuits of mankind. These are attempts to model, describe, and master nature and reality. This origin of novelty is
depicted in human symbolism by images of the archetype of chaos: Tiamat, Echidna, the Ouroboros, and other representations of combined cthonic, maternal, and reptilian
symbols. To overcome chaos, reality must be carefully observed and understood - which is why Marduk who slew Tiamat had eyes all the way around his head, and why
Argos Panoptes, the hundred-eyed, was the one who slew Echidna (essentially a version of the same myth), and why the falcon-headed Horus, with his famous eye, defeated
Set, god of chaos and the desert, in later Egyptian myths, and why Perseus slew Medusa by viewing her image in a polished shield, i.e. by modeling chaos, etc.
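The cybernetic pattern described above - a present state, a modeled "ideal" state, goal-seeking moves toward it, and novelty as the mismatch between prediction and outcome - can be sketched as a minimal feedback loop. All names and numbers here are my own illustration, not code taken from "Maps of Meaning":

```python
def step(present, ideal, predict, act, novelty_threshold=1.0):
    """One cycle: plan a goal-directed move, compare outcome to prediction,
    and flag novelty when reality diverges from the internal model."""
    planned = act(present, ideal)         # goal-seeking move toward the ideal
    expected = predict(present, planned)  # what the internal model anticipates
    actual = planned                      # in this toy world, actions succeed
    novelty = abs(actual - expected)
    return actual, novelty > novelty_threshold

# Toy usage: states are numbers, the ideal state is 10, each move halves the gap.
present, ideal = 0.0, 10.0
act = lambda state, goal: state + (goal - state) / 2
predict = lambda state, planned: planned  # a perfect model: no surprises
for _ in range(5):
    present, surprised = step(present, ideal, predict, act)
```

The loop converges toward the ideal state; a mismatched `predict` function would instead keep raising the novelty flag, which in Peterson's scheme is the signal to re-model.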
Ben-Jacob comes to the same conclusions:
“Accordingly it is now largely assumed that all aspects of life can in principle be explained solely on the basis of information storage in the structure of the
genetic material. Hence, an individual bacterium, bacterial colony or any eukaryotic organism is in principle analogous to a pre-designed Turing machine. In this
analogy, the environment provides energy (electric power of the computer) and absorbs the metabolic waste products (the dissipated heat), and the DNA is the
program that runs on the machine. Unlike in an ordinary Turing machine, the program also has instructions for the machine to duplicate and disassemble itself
[von Neumann Machine] and assemble many machines into an advanced machine – the dominant Top-Level Emergence view in the studies of complex systems
and system-biology based on the Neo-Darwinian paradigm…We explicitly propose that the ability to assign contextual meaning to externally gathered information
is an essential requirement for survival, as it gives the organism the freedom of contextual decision-making. By contextual, we mean relating to the external and
internal states of the organism and the internally stored ontogenetic knowledge it has generated. We present the view that contextual interpretation of information
[within the context of its goals] and consequent decision-making are two fundamentals of natural intelligence that any living creature must have.”
Source: Meaning-based natural intelligence vs. information-based artificial intelligence. E. Ben Jacob, Y. Shapira
15. Recursive Complexity
Organisms change environment
Termite Mounds
Oxygen
Biofilms
Behavior
Baldwin effect
Shielding
Humans!
‘‘The Earth was probably born
by accident; but, in accordance
with one of the most general
laws of evolution, scarcely had
this accident happened than it
was immediately made use of
and recast into something
naturally directed.”
- de Chardin, The Phenomenon
of Man
Saturday, September 1, 18
Slide 25: Now we will begin to synthesize these disparate observations. Remember, my main argument is that life does not work randomly but rather deliberately, and this
can be seen at the highest levels of organization as well as at the molecular level.
Consider the fact that phenotypes can feed back to influence the evolution and the direction of changes in the genotype. For example, termites have evolved large mounds
with a constant internal environment. This artificial environment, an evolved extension of the organism, will change the set of selection pressures to which the population is
subject. The same applies to any colonial organism which alters its local environment. And what about the Oxygen catastrophe brought about by photosynthetic cyanobacteria?
That has certainly changed the direction of life on this planet. Certain paths close, and new possibilities open - these are no longer random, but follow a probabilistic,
stochastic pattern with constraints, and so-called “attractors.”
On a smaller scale, behavior can feed back onto evolutionary change. For example, a “faster learning” phenotype means that the organism is less likely to make deadly
mistakes and more likely to acquire resources without subjecting itself to more specific or specialized (e.g. physical) selective pressures. Thus behavior eventually affects the
genes - this is called the Baldwin effect. A good example is adult lactose tolerance among Northern Europeans (a positive feedback with dairy culture: more dairying
promotes more lactose tolerance in the population, and more lactose tolerance favors more dairy production). The opposite of this is called “shielding,” as when hygienic
behaviors may preclude selection pressures for an innate resistance to a pathogen.
These examples demonstrate the phenomenon of recursive complexity - feedback loops whose full effect must be played out to be understood and cannot be predicted
only from the component parts. These occur at all organizational levels of structure and include the genome. Clearly this invalidates the model of evolution as the product of
two random number generators (nature and organisms) playing against each other.
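The dairy-culture loop above can be made concrete with a toy model in which allele frequency and cultural practice each raise the other. The coupling constants are purely illustrative, not empirical estimates:

```python
def simulate(generations=200, allele=0.05, dairy=0.05):
    """Toy niche-construction loop: dairying selects for lactose tolerance,
    and tolerance in turn sustains more dairying. All constants invented."""
    for _ in range(generations):
        allele += 0.5 * dairy * allele * (1 - allele)  # selection from dairying
        dairy += 0.2 * allele * (1 - dairy)            # tolerance rewards dairying
    return allele, dairy

# The coupled loop runs away toward near-fixation of both...
allele, dairy = simulate()
# ...but with no tolerant individuals to start, the loop never ignites.
frozen_allele, frozen_dairy = simulate(allele=0.0)
```

Neither "random number generator" alone predicts the runaway: it is the feedback between the two levels that locks the trajectory in.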
As a system moves further and further from equilibrium, its behavior can split into many paths and eventually becomes chaotic. Life is a far-from-equilibrium system. If
chaotic behavior occurs in relatively simple chemical systems, it’s silly to assume that life must be less complex.
“An emergent property is a new behaviour or phenomenon which is found at higher levels of organization as a result of interactions at a lower level. Traffic is a
familiar example of emergent behaviour — it’s a property of a collection of cars but not of an individual car. In the realm of physics, things like temperature and
pressure are emergent; an individual molecule doesn’t have a temperature or pressure, which are properties of collections of molecules like gases or liquids. An
interesting thing about emergence is that the arrow of causation doesn’t only point from lower to higher levels; the properties emerging from collective behaviour
can also constrain the behaviour of the individual entities. Again, traffic serves as an excellent example: while traffic emerges from the interaction of many
individual cars, it also affects their behaviour, limiting and guiding them. This sort of “downwards causation” is at odds with reductionist approaches to science,
including a strictly gene-centred approach to evolution.”
http://www.kcl.ac.uk/ip/davidpapineau/Staff/Papineau/OnlinePapers/SocLearnBald.htm
This concept turns reductionism on its head. Quite simply, instead of high-level phenomena being explained by lower-level components, the behavior of low-level
components cannot be truly understood or accurately described without reference to influences from the whole.
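The traffic example can be made concrete with a minimal ring-road model (my own toy, not from the quoted source): every car has the same top speed, but the emergent density of the collection determines what any individual car can actually do - downward causation in miniature:

```python
def average_speed(n_cars, road_length=100, v_max=5, steps=50):
    """Cars on a ring road; each step a car moves min(v_max, gap ahead - 1)."""
    pos = [i * road_length // n_cars for i in range(n_cars)]
    total, moves = 0, 0
    for _ in range(steps):
        gaps = [(pos[(i + 1) % n_cars] - pos[i]) % road_length
                for i in range(n_cars)]
        speeds = [max(0, min(v_max, g - 1)) for g in gaps]
        pos = [(p + v) % road_length for p, v in zip(pos, speeds)]
        total += sum(speeds)
        moves += n_cars
    return total / moves

light = average_speed(10)  # sparse road: every car reaches top speed
heavy = average_speed(50)  # dense road: the emergent collective caps each car
```

No car's `v_max` changed between the two runs; only the collective did, yet each car's realized behavior is constrained by it.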
16.
Slide 7: This is a human karyotype - a pretty picture of the content of the cell’s karyon (nucleus). These are tightly bound continuous segments of DNA bound to a
massive structure of regulatory and scaffolding proteins. We humans have 23 pairs of chromosomes - 22 “autosomes” and a pair of sex-determining chromosomes. One of
each pair is provided by each biological parent (the assumption is that this happens randomly, but that's not true, because only the sperm with the greatest fighting spirit succeeds).
Furthermore, since we have at least 2 copies of each gene (one from each parent), the question arises: how does the organism know which version of a particular gene to select to
transcribe? For a long time the assumption was “randomly,” but this has been proven to be false. The ability to select and express the “best” gene of a pair requires a
higher-level “meta-genomic” (epigenetic) regulatory framework - a cell has to interpret the code, not just faithfully copy it. In fact we now know many genetic disorders
(Prader-Willi, Angelman, etc.) which involve exactly these regulatory mechanisms.
If you’re a woman, your cells “silence” one of the X chromosomes - but how is that decided? We won’t go into details here, but these are the sort of complications which for
a long time were simply assumed to be random processes and now we know they are not.
Other animals have different numbers of chromosomes and use completely different methods to determine sex (haplodiploidy, the ZZ/ZW system, etc.). Some plants have hundreds, even thousands, of chromosomes. Some protozoa have thousands of tiny chromosomes. The male Jack Jumper ant has 1 chromosome - the female has 2. The number of copies of
each also varies from species to species, and even tissue to tissue, with no clear purpose. We all know that trisomy of chromosome 21 results in Down Syndrome, but
your salivary gland cells normally contain a non-standard number of chromosome copies. Consider the 1,048,576 copies in the silk glands of the commercial silkworm. Clearly we
have little to no understanding of the structure-function relationship of absolute chromosome number or set.
17. How DNA “Works”
The central dogma of
molecular biology deals with
the detailed residue-by-
residue transfer of
sequential information. It
states that such information
cannot be transferred from
protein to either protein or
nucleic acid. (Crick, 1970)
Slide 8: Let us define some terms. This is a simple illustration of the “Central Dogma,” a now (even more) meaningless term, because the flow of information has been
proven - and exploited by molecular biologists and genetic engineers - to go every possible way imaginable. The process of transcription generates an RNA intermediate from
DNA, and the process of translation generates a protein strand from the RNA code. Enzymes not only modify the expression of DNA and add “tags” but actively change the
DNA nucleotides (e.g. cytosine deaminase) and proteins affect the folding of other proteins, most dramatically in the “prion” diseases, where it is hypothesized that a
misfolded protein induces other proteins to misfold.
Rqc2p and 60S ribosomal subunits mediate mRNA-independent elongation of nascent chains Science 02 Jan 2015: Vol. 347, Issue 6217, pp. 75-78
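The two canonical transfers just defined - transcription and translation - are literally string operations. A minimal sketch, reading the mRNA off the coding strand (T → U) and using only four entries of the standard 64-codon table for brevity:

```python
# Only a handful of codons from the standard genetic code, for illustration.
CODONS = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

def transcribe(coding_strand: str) -> str:
    """DNA coding strand -> mRNA: thymine is replaced by uracil."""
    return coding_strand.replace("T", "U")

def translate(mrna: str) -> list:
    """mRNA -> protein: read triplets in frame until a STOP codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODONS[mrna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

mrna = transcribe("ATGTTTGGCTAA")
protein = translate(mrna)
```

The point of the slide is that the cell also runs these arrows in other directions (reverse transcription, enzymatic editing of the DNA itself), which no such one-way pipeline captures.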
18. Your Genome Structure
Majority is transcribed!
Active germ cells,
cancer
Vast repetitive
sequences vs
prokaryotes
The Onion Test
Source: Annu Rev Genomics Hum Genet. 2006;7:407-42. Structural
variation of the human genome. Sharp AJ, Cheng Z, Eichler EE.
Slide 9: This pie chart shows the breakdown of the human genome. Total coding regions for protein and special functional (transfer and ribosomal) RNA represent less
than 1.5% (although in some cells, esp cancer and sperm and ova, much, much more DNA is actually transcribed). About 5% is structurally variant from person to person
(insertions, deletions, inversions, large tandem repeats, etc.), while individual nucleotides vary by only about 0.03%. In terms of alleles that code for proteins, there is up to
50% variation in human beings - that means that between any 2 people, half of their functional proteins are different.
The mutation rate of different parts of the genome can vary by an order of magnitude (we’ll discuss the mechanisms shortly). An additional quarter of the genome is
introns (sequences transcribed into RNA but untranslated into protein). Unique non-transcribed DNA (10x more than the exons) includes pseudogenes, which look like
protein-coding regions but are not translated into protein. About 2/3rds is various classes of repetitive elements. Let me repeat this - 2/3rds of your genome is the equivalent of
falling asleep on your keyboard or just copying and pasting a paragraph over and over. Compared to prokaryotes, the eukaryotic genome has many more of these repetitive
sequences.
You’ve heard the term junk DNA, right? But have you heard the term “incompetent and arrogant scientists with no imagination?” No? Well you should because they’re
the ones who deserve the credit for the term “Junk DNA.” Despite claims by Crick in the 1970s that it “conveys little or no selective advantage to the organism,” or later
claims that it represents a selfish, parasitic sequence, similar to an endogenous virus, the so-called “junk” provides the self-directed biological engineering capacity of
eukaryotes. It is a source of both short-term and long term variation, as we’ll see - it’s no wonder it was dismissed as junk lest it upset the precious theoretical applecart.
Now what does this have to do with evolution - other than the fact that the genome is supposedly the place where the magic of new mutations happens? Because according to the
Modern Synthesis, only the expressed phenotype is subject to selection. The genome isn’t selected. The body is - the proteins are. In fact, as Dawkins pointed out, there are
situations where more of a gene could mean less of the physical organism. The physical variations in a species are (apparently) only related to that tiny sliver of the pie chart,
as far as we can understand. How, if at all, does natural selection reach down and affect the regulatory and structural aspect of the genome on “the back end” when only the
“user interface” is visible to nature? How has life achieved the variety of forms (and variable genomic structures) we see through natural selection if natural selection only
selects off 1.5%? What is the molecular basis of the creation of body plans and structure? It seems that there are highly conserved “kernel” genes and networks which,
although identical across species as diverse as butterflies and lizards, actually regulate completely different developmental processes (like wing color patterns and limb
placement for DLX, respectively). This is the marbled lungfish. Why does it have a genome 40x bigger than ours? Surely scientists are curious. Other organisms, e.g.
certain amoebae, have even more massive DNA caches. Even closely related organisms have orders of magnitude differences in DNA content. Let me reiterate: the structure/
function relationship remains an enigma (called the C-value paradox). And when you have enough paradoxes in a field, it’s time for a paradigm shift, not the dismissal of 2/3rds of
your object of study as clearly serving no purpose.
“The onion test is a simple reality check for anyone who thinks they can assign a function to every nucleotide in the human genome. Whatever your proposed
functions are, ask yourself this question: Why does an onion need a genome that is about five times larger than ours?”
—T. Ryan Gregory (personal communication)
19. Built-In Biases
A - known effects 150% or
more on probability of
nucleotide transversion and
transition
B - triplet code to AA
mutation probabilities
assuming equal nucleotide
mutation frequency
C = A+B
Climbing Mount Probable: Mutation as a Cause of Nonrandomness
in Evolution Arlin Stoltzfus and Lev Y. Yampolsky J Hered (2009) 100
(5): 637-647.
Explains/predicts 45%
of human-chimp AA
differences!!!
Slide 10: Variation in mutation rates across the genome can be explained by a few mechanisms. Observational studies consistently confirm that mutation rates vary across
time and genomic space (on various scales, from adjacent bases to entire chromosomes), as well as across species, genders, and individuals (males have slightly higher
mutation rates in primates).
The Modern Synthesis - in its “shifting allele frequencies” model - implies that absolutely novel changes do not occur. Instead, all variation is built into the initial gene
pool and evolutionary adaptation occurs via these changing allele frequencies in response to environmental shifts, not an internal impetus. This population-view of evolution
says nothing about how an individual’s novel phenotype arises and spreads in a population - and refuses to do so! Evolutionary biologists absolutely balk at the question.
Such a phenomenon is claimed to be irrelevant. Don’t believe me? Read the literature. While mutations are necessary as a source of variation, the actual “pressure” that
causes adaptation is all external and mutations themselves are claimed to be incapable of influencing evolutionary direction. The likes of Dawkins consider the so-called
“lucky mutant” model, a common popular misunderstanding of evolutionary theory, a blasphemous heresy.
However, change is a two step process and a bias in what kind of mutations occur (before they are displayed to the environment for selection) would mean a bias in the
direction of evolution itself - and there is no reason (except quasi-religious dogmatism) to suppose, a priori, that mutations cannot be restricted, localized, or otherwise
controlled in time and genomic space. As we shall see, overwhelming evidence shows this to be true.
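This two-step logic - mutation proposes, selection disposes - is easy to simulate. A hedged sketch in the spirit of "Climbing Mount Probable" (all rates invented for illustration, not taken from the paper):

```python
import random

def first_to_fix(mu_a=1e-3, mu_b=1e-2, s=0.05, rng=random):
    """Two variants confer the SAME fitness advantage s, but B arises by
    mutation ten times more often than A. In an origin-fixation regime
    (each new mutant fixes with probability ~2s), whichever variant arises
    and fixes first wins the adaptive race."""
    while True:  # generations tick until one variant arises AND fixes
        for name, mu in (("A", mu_a), ("B", mu_b)):
            if rng.random() < mu and rng.random() < 2 * s:
                return name

random.seed(42)
b_wins = sum(first_to_fix() == "B" for _ in range(1000))
# B wins roughly mu_b / (mu_a + mu_b), i.e. ~91% of races, despite equal fitness
```

Selection never distinguishes the two variants; the direction of the outcome is set entirely by the mutation bias, which is the whole point of the argument above.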
First of all, due to the very nature of the triplet-codon/AA mapping, not all amino acids have an equal chance of being substituted for one another (even assuming all
mutation “paths” are identical in probability, some transitions simply have more “paths”) - this is pretty obvious when pointed out. (See figures, C). Is the code then just a
random jumble, or do the triplets that represent certain AAs make sense for some reason (e.g. chemically similar AA’s are more likely to substitute for one another)?
Furthermore, beyond the actual code itself, certain nucleotide changes are more or less likely simply due to chemistry. The figure shows in A) nucleotide mutation biases
incorporating effects of 150% or greater, which include: the transition:transversion bias (the A/G and C/T transitions are more likely than purine-to-pyrimidine transversions); CpG
sites (the well-known CpG effect results in 30x increased transition and 5x increased transversion rates at consecutive CGCG chains in primates, but half that rate in other mammals); and
C:G sites (when these nucleotides are across from each other, though this also varies greatly depending on adjacent structures). The resulting pattern of AA substitution
(weighing all synonymous codons equally) gives the pattern B, which explains 45% of the variation in chimp to human individual AA changes; large structural duplications
account for much more of the genetic difference between us and chimps. Further refinements are given by the author, increasing the R-squared.
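The "more paths" argument can be checked directly from the standard genetic code: count, over all 64 codons, how many single-nucleotide changes convert one amino acid into another. Even with perfectly uniform nucleotide mutation, the amino-acid substitution spectrum comes out biased:

```python
from collections import Counter
from itertools import product

BASES = "TCAG"
# Standard genetic code (NCBI table 1), codons ordered TTT, TTC, TTA, ...
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {a + b + c: aa for (a, b, c), aa in zip(product(BASES, repeat=3), AA)}

# Count single-nucleotide "paths" between every ordered pair of amino acids.
paths = Counter()
for codon, aa in CODE.items():
    for i in range(3):
        for base in BASES:
            if base != codon[i]:
                neighbor = codon[:i] + base + codon[i + 1:]
                paths[(aa, CODE[neighbor])] += 1
```

Phe → Leu turns out to have six one-step routes, while Phe → Met has none at all: a built-in bias that exists before any biology, let alone selection, is considered.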
These built-in features, ignoring for the moment all sorts of biological regulation of the mutation processes themselves, suggest that evolution is NOT a random walk and
has inherent bias in the kind of changes that occur. These evolutionary tendencies are therefore predictable and mutation-driven. Natural selection comes later. Clearly, the
supremacy of natural selection is utter nonsense - this is blatantly obvious to many molecular biologists, bioengineers, and biophysicists, but we, the general public, continue
to be fed sophistries by the biologist establishment interested in protecting their cushy academic sinecures. Send them to work the fields!
Source: Cryptic Variation in the Human Mutation Rate. Alan Hodgkinson, Emmanuel Ladoukakis, Adam Eyre-Walker. Feb 03, 2009: PLOS Biology.
Abstract: The mutation rate is known to vary between adjacent sites within the human genome as a consequence of context, the most well-studied example
being the influence of CpG dinucelotides. We investigated whether there is additional variation by testing whether there is an excess of sites at which both
humans and chimpanzees have a single-nucleotide polymorphism (SNP). We found a highly significant excess of such sites, and we demonstrated that this
excess is not due to neighbouring nucleotide effects, ancestral polymorphism, or natural selection. We therefore infer that there is cryptic variation in the
mutation rate. However, although this variation in the mutation rate is not associated with the adjacent nucleotides, we show that there are highly nonrandom
patterns of nucleotides that extend ~80 base pairs on either side of sites with coincident SNPs, suggesting that there are extensive and complex context effects.
Finally, we estimate the level of variation needed to produce the excess of coincident SNPs and show that there is a similar, or higher, level of variation in the
mutation rate associated with this cryptic process than there is associated with adjacent nucleotides, including the CpG effect. We conclude that there is
substantial variation in the mutation that has, until now, been hidden from view.
Climbing Mount Probable: Mutation as a Cause of Nonrandomness in Evolution. ARLIN STOLTZFUS AND LEV Y. YAMPOLSKY. Journal of Heredity, 2009;
100(5):637–647.
20. Mutations by Region
Large Scale
Chromosomes
Cancer
Hodgkinson A, Eyre-Walker A: Variation in the mutation rate across
mammalian genomes. Nat Rev Genet 2011, 12:756-766.
Qu W, Hashimoto S, Shimada A, Nakatani Y, Ichikawa K, Saito TL, Ogoshi K, Matsushima K,
Suzuki Y, Sugano S, Takeda H, Morishita S: Genome-wide genetic variations are highly
correlated with proximal DNA methylation patterns. Genome Res 2012, 22:1419-1425.
Small Scale
Methylated CpG leads to C->T + mutant zone -
spontaneous or cytosine deaminase
Doublet/Triplet changes
Other NT sequences
SNPs at orthologous sites - WHY?
Medium Scale
Transcription frequency
(germ vs soma)
CG-Islands
Recombination zones
Hot
Genes!
Slide 11: But there aren’t just inherent trends in the way mutations can occur at the individual nucleotide level, there are major unexplained variations in mutation rates on
all scales in the genome!
First, on the small scale, the rate of simultaneous doublet and triplet changes is statistically far above the expected combined independent rate, although still 2-3 orders of
magnitude less than that of individual nucleotides.
Second, we return to the well-studied CpG phenomenon, responsible for an estimated quarter of all human point mutations. CpG sites mutate frequently and are less
abundant, because cytosine residues are often methylated at the discretion of the cell, and this makes them more likely to turn into thymine. This deamination step may
be spontaneous or enzymatically controlled via cytosine deaminase - an enzyme that single-handedly falsifies the Modern Synthesis. A cell can use one enzyme to tag certain Cs
and then use another enzyme to turn them into Ts when conditions are right. It changes its own genome enzymatically when needed.
This increased mutation rate at CpG sites also “bleeds” into adjacent sequences for unknown reasons. Other nucleotide sequences have smaller (3-4x above average) and
also highly variable context effects, again for unknown reasons (e.g. ATTG, ATAG, ACAA).
Finally, single nucleotide polymorphisms (SNPs) and substitutions tend to occur at similar sequence sites across species. This implies conserved hypermutable
zones, independent of flanking sequences or local clustering!
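A toy version of this context effect: mutate a sequence in which half the cytosines sit in a CpG context, applying the ~30x transition multiplier quoted above (all other numbers are illustrative). The resulting mutation spectrum is biased by the sequence's own composition before selection ever sees it:

```python
import random

def mutate(seq, base_rate=1e-3, cpg_multiplier=30, rng=random):
    """Point-mutate a sequence; a C followed by G transitions to T at an
    elevated rate (a stand-in for methylation-driven deamination)."""
    out = list(seq)
    for i, nt in enumerate(seq):
        rate = base_rate
        if nt == "C" and i + 1 < len(seq) and seq[i + 1] == "G":
            rate *= cpg_multiplier
        if rng.random() < rate:
            out[i] = "T" if nt == "C" else rng.choice("ACGT".replace(nt, ""))
    return "".join(out)

def c_to_t(seq, mutated, next_base):
    """Count C->T changes whose original context was C followed by next_base."""
    return sum(1 for i in range(len(seq) - 1)
               if seq[i] == "C" and seq[i + 1] == next_base and mutated[i] == "T")

random.seed(7)
seq = "CG" * 2000 + "CA" * 2000   # equal numbers of CpG and non-CpG cytosines
mutated = mutate(seq)
```

Counting C→T hits by context shows the CpG cytosines mutating far more often than the CpA ones, even though both pools are the same size.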
Next, on larger scales, large ~1kb CpG islands (common at promoter regions) have much lower mutation rates - of only the CpG cytosine-to-thymine conversion - than
small, commonly methylated CpG sites. This anti-mutation effect extends for kilobases to nearby small CpG sites. Seemingly, the important sequences that
should not be modified by the cell reside in clearly demarcated zones.
In somatic cells, transcribed DNA has much lower mutation rates than non-transcribed DNA (the junky DNA likes to get funky). In germ cells, there are subtle
differences in the kinds of mutations, but not in the overall rate (see figure) - these are the sequences that will form the next generation, so they need to be adjusted. Finally,
recombination sites on chromosomes (where the chromosomes your parents gave you are mixed and matched before being packaged into your own sperm or eggs, which is why
children are often more like their grandparents) are known to have higher mutation rates.
Large-scale variation in mutation rates has also been demonstrated in areas where selection is very unlikely to explain it. Correlations of mutation frequencies between
adjacent segments of DNA (at the Mb scale) vary greatly among species. A variety of explanatory associations have been proposed, but they don’t explain more than half of
non-CpG mutations. The overall effects are small, but statistically robust (10-50% change in mutation rate). Human Y chromosomes have 50% higher mutation rates than
autosomes, which have 30% higher rates than the X (the situation is completely different in rodents). Interestingly, this supports the observation that males are evolutionarily
experimental, with higher mortality rates and greater trait variance (e.g. in intelligence), while females cluster closer to the average and are, on average, more likely to reproduce. Interautosomal
variability in mutation rates is about 20%. Cancer cells show extremely diverse patterns of mutation on all scales which is why they can be so hard to kill.
Finally, mutation rates between transcribed genes vary enormously. Measures of neutral substitution rates - i.e. silent mutations, substitutions in the triplet code that do
not change the AA, usually in the 3rd codon position (in this case 4-fold synonymous sites on orthologous genes in humans compared to mice) - show that the
mutation rates A) have much fatter tails than a normal distribution (leptokurtic) and B) depend on gene types/classes. Indeed these “hot” or “cold” genes can be both
clustered and not clustered, suggesting that chromosome recombination hot-spots do not fully explain the trend. “Cold” genes generally are older, regulatory and
housekeeping genes while “hot” genes deal more with environmental interactions (e.g. smelling, immune interactions, secretions). The fact that mutations occur more
frequently in genes whose greater variability would help the organism and occur less frequently in genes whose alteration could harm the organism suggests that mutation
itself is just as important a driver of evolution as the natural selection of those mutants.
A very specific example of conserved zones and variable regions that exist within a gene appear in the venom genes of Cone snails.
“We found that the SNP rate also increased by ∼50% (P < 10−2170), and the substitution rates in all dinucleotides increased simultaneously (P < 10−441) around
methylated CpG sites. In the hypomethylated regions, the “CGCG” motif was significantly enriched (P < 10−680) and evolutionarily conserved (P = ∼ 0.203%), and
slow CpG deamination rather than fast CpG gain was seen, indicating a possible role of CGCG as a candidate cis-element for the hypomethylation state. In
regions that were hypermethylated in germline-like tissues but were hypomethylated in somatic liver cells, the SNP rate was significantly smaller than that in
21. Transcriptional relationships on chromosome 22 of testis and prostate tissue sequences (plus
and minus end coding sequences indicated in green and purple, respectively)
Slide 12: “Genes” don’t exist; more precisely, the term denotes an abstraction - a complex functional unit that rarely corresponds to a single stretch of DNA and includes
many regulatory sequences. The classic idea of a reading frame with a fixed and reliable “start” and “stop” code applies to at most 15% of eukaryotic coded regions (that tiny
sliver of the pie). In reality, chromosomes code for “transcriptional networks.” To make a big protein the cell must transcribe scattered pieces which may code for various
segments, like lego blocks. This leads to the phenomenon of chimeric transcripts, thousands of which exist in humans. They occur by pre-mRNA trans-splicing, chromosomal
translocations, gene fusions, or exotic methods of DNA transcription. An independent epigenetic RNA regulatory network has been proposed for certain gene classes.
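The "lego block" picture can be written out as data: a transcript is an ordered splice of scattered segments, and a chimeric transcript simply draws its pieces from more than one chromosome. Coordinates and sequences here are invented for illustration:

```python
# A toy "genome": two chromosomes as plain strings (invented sequences).
GENOME = {
    "chr22": "AAATTTGGGCCCAAATTTGGG",
    "chr11": "CCCGGGAAATTT",
}

def assemble(segments):
    """Splice an ordered list of (chromosome, start, end) pieces
    into a single transcript, lego-block style."""
    return "".join(GENOME[chrom][start:end] for chrom, start, end in segments)

# An ordinary spliced product stays on one chromosome...
plain = assemble([("chr22", 0, 3), ("chr22", 9, 12)])
# ...while a chimeric transcript joins trans-spliced pieces from two.
chimera = assemble([("chr22", 0, 3), ("chr11", 3, 6)])
```

Nothing in the assembly routine cares where the pieces come from - which is exactly why a fixed "one gene, one stretch of DNA" picture fails to describe it.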
“Transcriptional network on chromosome 22 in a pool of testis and prostate tissues. The chromosome is depicted as a circle [45], and RACEfrag connections
as inner links between genomic regions (5′ and 3′ RACE [rapid amplification of cDNA ends] connections are red and blue, respectively). The circular tracks are,
going inwards: (1) - chromosome scale (in megabases, starting at 14 Mb), (2) - plus-strand annotated genes (green), (3) - plus-strand annotated pseudogenes
(black), (4) - minus-strand annotated genes (purple), (5) - minus-strand annotated pseudogenes (black).”
22. Mobile Genetic Elements
Predicted genetic
regulation - selective
silencing, “genomic
shock,” stressors
Multiple classes,
targets, functions
Major promoters,
regulators - 10-20%
of all identified
control sequences
V(D)J
recombinases too!
Centromeres!
Imprinting!
Telomeres!
Pregnancy!
Slide 13: Now let’s consider the other 2/3rds of the genome, which is composed of mobile elements (and is more mutable anyway). Quite simply, these sequences move
around the genome, and they are doing so in your cells right now. They come in many types: copy-and-insert, cut-and-insert, or recombining; encoding their own transposase or
retrotransposase, or borrowing another element’s; using or not using RNA intermediates; linear, circular, or folding; duplicative or non-duplicative with regard to adjacent
sequences, etc. New ones are constantly being discovered. For a university database of MGE’s, check out GyDB.org.
These “transposable elements” or “jumping genes” were discovered by Barbara McClintock, who actually speculated extensively on their role in gene regulation, which
she believed was her major contribution (predating the “operon” theory). She was a bit wacky, and her discoveries were considered a mere curiosity and limited to corn, until
transposable elements were discovered in bacteria 20 years later (they are largely responsible for carrying drug-resistance genes). In her 1983 Nobel Prize lecture she said:
“There must be numerous homeostatic adjustments required of cells. The sensing devices and the signals that initiate these adjustments are beyond our
present ability to fathom. A goal for the future would be to determine the extent of knowledge the cell has of itself and how it utilizes this knowledge in a
“thoughtful” manner when challenged... In the future, attention undoubtedly will be centered on the genome, with greater appreciation of its significance as a
highly sensitive organ of the cell that monitors genomic activities and corrects common errors, senses unusual and unexpected events, and responds to them,
often by restructuring the genome.”
She was clearly ahead of her time, but to avoid upsetting the rotten apple cart of the Modern Synthesis, transposable elements were lumped under the “selfish gene”
hypothesis, which erroneously implied that they act randomly. This shoehorned the uncomfortable fact of massive genome plasticity into the dogma of random, independent
genomic behavior. We can’t have too much agency in living systems: that would be un-materialistic, Science forbid! The truth is that the movements of mobile elements are
not random; most types are targeted to specific sequences and only allowed to function under certain conditions (e.g. the early embryo, the brain, the testes, during stress)
and they contribute widely to genetic regulation (promoters, enhancers, splice sites, coding exons for amplification and reassortment of protein subdomains, etc). In fact, as
an example, the Ty1, 2, and 3 transposons in yeast target upstream of start codons to avoid messing up the reading frame, and the Ty5 class targets only untranscribed regions
(unless the cell is stressed) - so much for selfishness! These jumping genes are downright considerate. This is true for many eukaryotic mobile elements, which behave more
like helpful genes than like the selfish jerks imagined by Dawkins, who is likely projecting his own character.
MGEs function analogously to chunks of computer code that represent a single functional motif. For example, the famous Alu retrotransposon (which is increased in
higher primates and relies on LINE elements’ retrotransposase) has been observed in the end-points of 30% of segmental duplications/low copy repeats - structures involved
in recombination sites that are rare in mammals but increase as you evolutionarily approach humans. MGEs are also associated with areas for binding nuclear scaffolding
proteins (providing a standardized scaffold), suggesting even more complex levels of structural regulation of transcription and replication. Retroelements that encode reverse
transcriptase are also able to reverse transcribe other mRNAs, resulting in a “processed pseudogene” - a so-called “cDNA” sequence - being inserted into the genome.
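In computing terms, Ty-style targeting is an insertion routine with a site-selection rule. A hypothetical sketch (the element, sequences, and rule are invented, modeled loosely on the Ty1 behavior described above):

```python
ELEMENT = "xxxx"  # stand-in for a transposon's own sequence (invented)

def targeted_insert(genome: str, element: str = ELEMENT) -> str:
    """Insert the element immediately upstream of the first ATG start codon,
    leaving the downstream reading frame intact (Ty1-style site selection)."""
    site = genome.find("ATG")
    if site == -1:
        return genome          # no target found: the element stays put
    return genome[:site] + element + genome[site:]

before = "CCCCATGAAATTTTAG"
after = targeted_insert(before)
```

A truly "selfish," random insertion routine would have no `find("ATG")` step at all - the site-selection rule is precisely what makes the element look considerate rather than parasitic.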
For long lists of factors (with references) that promote transposon activity, check out: http://shapiro.bsd.uchicago.edu/TableII.7.shtml.
“These factors include nutritional deprivation, intercellular signaling molecules, exposure to toxic substances (not all of which are DNA-damaging agents
themselves), and life history events such as hybridizations and infections. In eukaryotes, there is a strong correlation between these life history events and ones
that disrupt epigenetic control systems (http://shapiro.bsd.uchicago.edu/TableII.10.shtml)…There is not space to include the many examples of NGE targeting in
this review, but the details are accessible in a table online (http://shapiro.bsd.uchicago.edu/TableII.11.shtml).”
“‘Our analysis identified three extended periods in the evolution of gene regulatory elements. Early vertebrate evolution was characterized by regulatory gains
near transcription factors and developmental genes, but this trend was replaced by innovations near extracellular signaling genes, and then innovations near
posttranslational protein modifiers.’…One evolutionary mystery has been how the same binding site locates at multiple dispersed genome positions fast enough
to produce a useful coordinated network of the kind initially proposed by Britten and Davidson [25,222]. It would take an indefinitely long time for transcription
factor binding sites, promoters, and tissue-specific enhancers to accumulate at multiple loci by independent random changes at each position. The existence of
mobile genetic elements provides a mechanism for the rapid dispersal of regulatory sequences through the genome [223]. Mobile elements, such as Alu SINEs,
contain many binding sites for transcription factors that allow them to play a role in establishing the regulation of developmental processes[224].”
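The waiting-time argument in the quote can be sketched with back-of-envelope arithmetic (the rates below are toy numbers chosen for illustration, not measured values): building the same binding site independently at many loci is far slower than dispersing a ready-made copy by transposition.

```python
# Toy comparison of two ways to get the same regulatory site at n loci.

def generations_independent(per_locus_rate: float, n_loci: int) -> float:
    # If each locus must separately stumble on the site by random point
    # mutation, and the network needs all n, waiting times add up.
    return n_loci * (1.0 / per_locus_rate)

def generations_transposition(jump_rate: float, n_loci: int) -> float:
    # A mobile element already carrying the site needs only n insertions.
    return n_loci * (1.0 / jump_rate)

# Hypothetical rates: de novo origination of a specific site is rare
# (~1e-8 per generation); transposition is comparatively common (~1e-4).
slow = generations_independent(1e-8, n_loci=10)
fast = generations_transposition(1e-4, n_loci=10)
assert fast < slow / 1000  # orders of magnitude faster in this sketch
```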
23. Regulation of MGEs
piRNAs/rasiRNAs
Sequence or structure targeting
ERV elements - p53 in primates; 30% of binding sites
ZF regulation
Brain mosaicism
306 J.A. Shapiro / Physics of Life Reviews 10 (2013) 287–323
Fig. 8. Impact of transposable elements (TEs) on eukaryotic genome architecture and gene expression. TEs (shown as black vertical lines in a
chromosome) can be involved in formation and maintenance of important chromosomal structures, such as centromere and telomere. TEs induce
heterochromatin formation in chromosomes. TEs (shown as black boxes) can also influence local gene expression in various ways. At the transcrip-
tional level, a TE insertion can introduce promoter sequences and cis-regulatory element(s) to a nearby gene. Outward-reading transcription from a
TE downstream of a gene can generate an antisense RNA that potentially interfere with sense transcription. In addition, TEs can induce epigenetic
gene silencing by chromatin remodeling that potentially represses the transcription of adjacent gene(s). At the post-transcriptional level, a TE that
has inserted in the 3′UTR of a gene can introduce a target of miRNA that interferes translation. Finally, TEs or TE-derived sequences can produce
non-coding RNAs (ncRNAs) that operate as a cis- or trans-gene regulator. Reproduced from [24] with permission from John Wiley and Sons.
4.4. Targeting of NGE and mobile elements
Just as cells have the ability to silence and activate NGE functions, they also have the ability to target changes to
specific sites or classes of locations in the genome [10]. There is not space to include the many examples of NGE tar-
geting in this review, but the details are accessible in a table online (http://shapiro.bsd.uchicago.edu/TableII.11.shtml).
The mechanisms of this targeting are all well understood in molecular biology: proteins recognizing DNA sequences,
proteins recognizing DNA structures, protein–protein binding, DNA–DNA or RNA–DNA sequence interactions, and
coupling to other cell functions, such as DNA replication, transcription and chromatin formatting.
Many NGE targeting modalities are of obvious adaptive utility. In terms of the spread of bacterial antibiotic
resistance determinants, we can note (1) the targeting of the Tn7 transposon to replicating DNA [187], which is
characteristic of plasmid molecules in the process of transferring to new cells, and (2) the specificity of integron
cassette insertion for single-stranded cassettes [188], which is the molecular form in which they enter a cell during
horizontal transfer by DNA uptake or conjugation. In other words, these targeting specificities are optimal for the
molecular mechanisms of horizontal DNA transfer in bacteria.
In the case of yeast LTR retrotransposons, we see a different kind of adaptive utility for targeting (see Fig. 2 of
[189]). The Ty1–Ty3 elements are all targeted to insert upstream of transcription start sites, which prevents them from
disrupting important coding sequence data. The Ty5 element is targeted to silent chromatin regions, where it will
likewise not disrupt important vital functions, except in case of stress, when its mutagenic capabilities may prove
useful.
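The targeting rules in this passage can be caricatured in a few lines (the region labels and the stress switch are illustrative, not a real genomics API):

```python
# Sketch of yeast LTR retrotransposon targeting as described above:
# insertion sites are chosen so coding sequence is left intact, and
# Ty5's restraint is relaxed under stress.

def choose_insertion_site(element: str, stressed: bool = False) -> str:
    if element in {"Ty1", "Ty2", "Ty3"}:
        return "upstream_of_transcription_start"  # ORFs left undisrupted
    if element == "Ty5":
        # Normally parked in silent chromatin; under stress its
        # mutagenic potential is deployed more broadly.
        return "anywhere" if stressed else "silent_chromatin"
    raise ValueError(f"unknown element: {element}")

assert choose_insertion_site("Ty1") == "upstream_of_transcription_start"
assert choose_insertion_site("Ty5") == "silent_chromatin"
assert choose_insertion_site("Ty5", stressed=True) == "anywhere"
```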
Coupling DNA restructuring to transcription is particularly important because there is no question that cells have
the ability to target transcription to particular sites in the genome as part of a biologically adaptive response to external
and internal circumstances. Isotype class switching in activated B cells illustrates this capability. The location of
class switch recombination (CSR) is determined by lymphokine signals that control the transcription of “switch”
regions where the necessary double-stranded breaks occur [190,191]. In other words, the immune system instructs
the activated B cells which class of antibody to produce in response to a particular infection by targeting a specific
DNA breakage-and-joining process. Thus, if we are searching for a feasible molecular mechanism to integrate widely
accepted adaptive cell responses with the NGE toolbox, coordination between transcriptional regulation and genome
change operations is an ideal candidate.
Saturday, September 1, 18
Slide 14: How are transposable elements regulated? One answer is the recently discovered Piwi-interacting RNAs, first found in mammalian testes, a transcriptionally
hyper-active tissue, as previously discussed. piRNAs seem to originate from repetitive regions and affect DNA methylation (an epigenetic regulator) and RNA silencing. In
somatic cells, MGEs are often found in heterochromatin (DNA wrapped on methylated histones, i.e. folded, quietly mothballed sections of DNA that are turned “off”).
Other pathways include rasiRNAs, which also regulate transposon expression through a different, but related, mechanism. Barbara McClintock was able to induce
transposon activity in maize via broken chromosome ends and proposed that a stressor, or “genomic shock,” was necessary for their activation - or rather, as with most
stress responses, their derepression, as now appears to be the case. The cell has to constantly push the lid closed, and when it’s stressed it lets the transposon pot erupt. We now
have much documented evidence of variable transposon activity in different tissues and upon exposure to different stressors (see the reference tables by Shapiro below). It’s
likely that different stressors promote different kinds of transposon activity. It’s a very ancient way for cells to adapt directly to environmental pressures - not just die and let the
pre-existing lucky mutants flourish. Life has a will to survive.
It is hypothesized that large gene-regulatory networks (e.g. the p53 network, placental signaling cascades, etc.) may have arisen with the help of a special class of large
transposons called endogenous retroviral elements (ERVs), or long terminal repeat (LTR) retroelements, comprising 5% of the human genome. About a third of the genome
binding sites of p53 (an important growth regulator, well studied in cancer cells) are associated with this class of repetitive sequences. Class I and II MHC genes are also associated with a
high density of ERV sequences, which are found near other novel mammalian genes. These elements are in turn suppressed and regulated by zinc-finger DNA-binding
genes, which correlate in number and variety with the LTRs in the genome. Removing these zinc-finger DNA-binding complexes causes massive endogenous retrovirus
bursts in mice. Black dots are mammal data points; possum on top right. If you ever thought possums were weird, now you know why…
But that’s not all! It seems transposons/MGEs are also responsible for mosaicism. Nobody bats an eye when you talk about how immune cells are able to engineer
their own DNA (using DSB repair in the case of B-cell isotype switching) to combat novel antigens, or that these functions are highly active in germ cells, but what should
make you really flip out is that neurons in the brain are a massive genetic mosaic thanks to the expression and activity of at least 3 retrotransposons (LINE-1, plus the non-
autonomous Alu and SVA, i.e. SINE-VNTR-Alu) in neural progenitor cells (NPCs). Insertion sites appear rather specific, and almost exclusively intronic in the case of LINE-1. NPCs
in all areas of the brain appear to have active retrotransposons. This has been well studied with a glowing transgenic mouse, in vitro cultures, and human brain specimens.
The patterns appear highly variable by region and individual, and are hypothesized to be related to various neurological disorders, including autism (see Gilman et al). But
what hasn’t been linked to autism these days? In fruit flies, one class of neurons (αβ) in the olfactory memory center has been observed to do this. The significance of all
this crazy genetic engineering activity remains a mystery - unlike the usual effect of LINE-1, intronic insertion seems to increase rather than decrease transcription of
affected genes. Some suggest that poor LINE-1 regulation may be related to Rett syndrome and ataxia-telangiectasia, and MGEs may contribute to at least a few dozen other
diseases. This may even occur in glial cells, but at lower levels - NPCs also give rise to astrocytes and oligodendrocytes, after all.
“It is becoming increasingly difficult to escape the conclusion that eukaryotic genome evolution is driven from within not just by the gentle breeze of the genetic
mechanisms that replicate and repair DNA, but by the stronger winds (with perhaps occasional gale-force gusts) of transposon activity.” - Nina V. Fedoroff,
former president of the American Association for the Advancement of Science
“…We explored the gene regulatory landscape of mammalian endometrial cells using comparative RNA-Seq and found that 1,532 genes were recruited into
endometrial expression in placental mammals, indicating that the evolution of pregnancy was associated with a large-scale rewiring of the gene regulatory…”
24. de novo Genes
Est. 18/24,000 - “brain and balls” hypothesis
Small (<200 AAs)
Gene dense region
Transposons; epistatic - multistep changes
Emergence of a new gene from an intergenic region. Heinen TJ, Staubach F, Häming D, Tautz D. Curr Biol. 2009 Sep 29;19(18):1527-31. doi: 10.1016/j.cub.2009.07.049. Epub 2009 Sep 3.
Origins, evolution, and phenotypic impact of new genes. Henrik Kaessmann. Genome Res. 2010;20:1313-1326.
“The utility of an organ does not explain its origin; on the contrary! For most of the time during which a property is forming it does not preserve the individual and is of no use to him, least of all in the struggle with external circumstances and enemies.”
- Nietzsche
Consider the paradox: if new genes are modified old genes, how do completely novel (never previously encoded) genes arise? And if it takes multiple steps for a gene to
arise from non-coding material, which seems likely, how are these multiple changes selected for by natural selection and propagated in a population if they occur in a proto-
gene and cannot directly affect the phenotype? These are all uncomfortable questions for defenders of the Modern Synthesis, who basically deny the existence of de novo
genes and invoke “genetic drift” as a mechanism for fixing undetectable traits in a population - equivalent to saying “it just happens to work out.” Observation suggests that
genes do arise in multiple steps across evolutionary lineages and spread in the population while apparently still unripe for prime-time performance. But how exactly a
particular configuration of proto-gene becomes “fixed” without natural selection is not clear. One theory suggests that “cryptic” regulatory signals (the term “regulatory” is
extremely broad) exist in all intergenic regions. Another researcher postulates the “Writing Phenotype,” which we will discuss in a few slides. Briefly, the theory states that
genetic data is encoded onto another chromosomal region, allowing a higher level of selection - selection on a meta-genome - an allele that describes what combination of alleles
or non-transcribed DNA exists at another location.
Processes of creating new genes using preexisting genes as the raw materials are well characterized. Examples include exon shuffling, gene duplication, retroposition, gene
fusion, and fission. It seems that de novo gene evolution is more likely to occur in an ancestrally transcribed region - remember, much more DNA is transcribed than translated,
up to 50% in germ cells! These cells “recall” and toy with the atavistic genes of the ancestors.
The “brain and balls” (aka “nuts and noggin,” aka “cabeza y cojones,” aka “telencephalon and testes”) hypothesis states that novel genes tend to be expressed in novel
tissues at the “cutting edge” of adaptive pressure - the male mammalian reproductive organs and the newer parts of the mammalian brain, for example. This is derived from
many interesting recent observations:
“FLJ33706 in brain tissue. Segment exists in eutherian mammals. Cross-species analysis revealed interesting evolutionary paths of how this gene had originated
from noncoding DNA sequences: insertion of repeat elements especially Alu contributed to the formation of the first coding exon and six standard splice
junctions on the branch leading to humans and chimpanzees, and two subsequent substitutions in the human lineage escaped two stop codons and created an
open reading frame of 194 amino acids.”
Sources: Emergence of a new gene from an intergenic region. Heinen TJ, Staubach F, Häming D, Tautz D. Current Biology. 2009 Sep 29;19(18):1527-31.
De novo origination of a new protein-coding gene in Saccharomyces cerevisiae. Cai J, Zhao R, Jiang H, Wang W. Genetics. 2008 May;179(1):487-96.
Recent de novo origin of human protein-coding genes. Knowles DG, McLysaght A. Genome Res. 2009 Oct;19(10):1752-9.
A human-specific de novo protein-coding gene associated with human brain functions. Li CY, Zhang Y, et al. PLoS Computational Biology. 2010 Mar 26;6(3):e1000734.
25. Sex - Network Level Evolution
Why sex?
Why so prevalent (1000:1 species)?
Variation persists
Performing vs writing phenotype - 2nd order selection
Working sperm theory
Slide 30: This model incorporates and explains the prevalence of sex, the predominant reproductive strategy. It is often assumed that sex evolved later than asexual
reproduction, but is this necessarily true? Sex is merely genetic recombination and exchange, but there is no definitive theory for why it even exists, or why it is so common and
apparently necessary for life. Asexual groups are generally recent and short-lived - “broken” sexual organisms. The advantages of sex are not obvious: if you have a good
combination of alleles, why would you risk breaking it up and creating some other combination which may be inferior? If the writing phenotype is a feature of genomic
variation, it would explain the prevalence of sexual recombination, because it would mitigate the loss of fitness in a new allele combination. The previous combination of
alleles would leave a coded archive on its germ-cell genome, which would match with a complementary partner’s allele archive. This would allow “abstract” or
“second order” selection for better combinations of genes rather than individual genes themselves, providing a very convenient way for specific, useful, and novel
allele combinations to spread and be maintained.
The “Writing Phenotype” model also explains why so much DNA is transcribed in sperm cells: not because it is performing functions, but because it is being “written”
into other parts of the genome. As mentioned, sperm and cancer cells transcribe almost 50% of the genome. Knocking out some of these pseudo-genes does indeed cripple
sperm, which would seem to imply that this transcribed mess is somehow critical for cellular function, but another way of interpreting this is to say that these genes are being
actively written, or are doing the writing, or form part of the read/write machinery so critical in gametes. The high level of “writing” activity in sperm cells makes them appear
to be evolving very rapidly.
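Since the “Writing Phenotype” is a speculative proposal, the following is a deliberately speculative toy, not an established mechanism: alongside ordinary alleles, a germline “archive” locus records which allele combination worked, so selection can act on recorded combinations (second order) rather than on alleles one by one.

```python
# Toy model of second-order selection via a hypothetical "archive" locus.
import random
random.seed(1)

def fitness(alleles):
    # Epistatic fitness: only the exact combination (1, 1, 0) is good.
    return 1.0 if alleles == (1, 1, 0) else 0.1

def mate(parent_a, parent_b):
    alleles_a, archive_a = parent_a
    alleles_b, archive_b = parent_b
    # Free recombination would usually break up a good combination...
    child = tuple(random.choice(pair) for pair in zip(alleles_a, alleles_b))
    # ...but if a parental archive records a fitter combination, the model
    # lets the child re-express it (the hypothesized "read" step).
    for archive in (archive_a, archive_b):
        if archive is not None and fitness(archive) > fitness(child):
            child = archive
    # The "write" step: archive the child's combination if it is fit.
    return (child, child if fitness(child) == 1.0 else None)

parent_fit = ((1, 1, 0), (1, 1, 0))   # carries an archive of its combination
parent_other = ((0, 0, 1), None)
child, archive = mate(parent_fit, parent_other)
assert child == (1, 1, 0)  # the fit combination survives recombination
```

In the sketch, the archive shields a useful allele combination from being shuffled away, which is the intuition the slide attributes to the model.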
“Sexual selection, selfish genes, and genetic conflict provide compelling explanations for many atypical features of gene expression in spermatogenic cells
including the gross overexpression of certain mRNAs, transcripts encoding truncated proteins that cannot carry out basic functions of the proteins encoded by
the same genes in somatic cells, the large number of gene families containing paralogous genes encoding spermatogenic cell-specific isoforms, the large number
of testis-cancer-associated genes that are expressed only in spermatogenic cells and malignant cells, and the overbearing role of Sertoli cells in regulating the
number and quality of spermatozoa.”
“One of the greatest challenges for evolutionary biology is explaining the widespread occurrence of sexual reproduction and the associated process of genetic
recombination. A large number of theories have been developed that provide a sufficient short-term advantage for sex to offset its two-fold cost. These theories
can be broadly classified into environmental (or ecological) and mutation-based models.”
$ Indeed, “self-replication” is a misleading term. Strictly speaking, there is no such thing as “self-replication”. Do we mean by it that nothing other than the
object itself takes part in the replication of the object? This is logically impossible and empirically evident not to exist. An individual can only be replicated in the
right environment, not to mention that its replication involves material and energy coming from the environment. Since the right environment is indispensable, the
responsibility for replication is not only within the “self”. Furthermore, under sexual reproduction, the individual is not really “replicated” at all.
$ It is not random accident that generates the variance that selection operates on. Rather, a phenotype causing syntactic internal change is absorbing
information from the outside world and changes itself in the process.
Source: A pluralist approach to sex and recombination. West, Lively, Read. Article first published online: 25 DEC 2001
26. CRISPR/Cas System
Extends the RM (restriction-modification) system
40% bacteria, 90% archaea
Acquired immunity
Similar to siRNA in eukaryotes but encoded!
ICP1 phage hijacking
“The type II CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) system is used by bacteria and archaea to provide immunological memory
against subsequent invasions of foreign DNA. It works by incorporating short exogenous DNA sequences from the invading pathogen into specific loci of the host
genome. Upon transcription, these sequences are processed into pre-CRISPR RNAs (pre-crRNAs) and, following maturation, further into crRNAs, which then function
as detectors of foreign DNA.” Clearly this is more evidence for the plasticity of the genome (and it corresponds to the epigenetic iRNA and siRNA mechanism in
eukaryotes). The system has recently been found to be hijacked by phage ICP1, currently being explored as a biological treatment for cholera.
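The acquired-immunity logic in the quote can be sketched in a few lines (greatly simplified; real Cas machinery involves PAM recognition, crRNA maturation, and nucleases, none of which are modeled here):

```python
# Minimal sketch of CRISPR-style heritable immunity: the host writes
# short "spacers" copied from an invader into its own genome and later
# uses them to recognize that invader again.

class CrisprLocus:
    def __init__(self):
        self.spacers = []  # heritable memory, written into the genome

    def acquire(self, invader_dna: str, spacer_len: int = 6):
        # Adaptation: store a fragment of the invading sequence.
        self.spacers.append(invader_dna[:spacer_len])

    def recognizes(self, dna: str) -> bool:
        # Interference: match incoming DNA against stored spacers.
        return any(spacer in dna for spacer in self.spacers)

locus = CrisprLocus()
phage = "ATGCGTACCTTAG"
assert not locus.recognizes(phage)  # naive host: first infection succeeds
locus.acquire(phage)                # survivor writes a spacer into its genome
assert locus.recognizes(phage)      # immune on re-infection; memory is heritable
```

The point of the sketch is the one the slide makes: the immune record is written into the genome itself, so it is inherited — genome plasticity as memory.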
27. Stress Induced Mutation
Common in nature
Variable response
Negatively correlated with constitutive hypermutation
Correlation with niche
Secondary selection
Mutation as a Stress Response and the Regulation of Evolvability. Rodrigo S. Galhardo, P. J. Hastings, and Susan M. Rosenberg. Crit Rev Biochem Mol Biol. 2007 Sep-Oct;42(5):399–435.
Stress-Induced Mutagenesis in Bacteria. Bjedov I, Tenaillon O, Gérard B, Souza V, Denamur E, Radman M, Taddei F, Matic I. Science. 2003 May;300(5624):1404-9.
Slide 8: Over the last 20 years, much evidence has accumulated for adaptive mutations in bacteria. This research was pioneered by John Cairns and Barry G. Hall. The
research is summarized, and criticisms refuted, in the excellent review article cited here (although more recent studies do fill in some gaps). Stress-induced mutagenesis is
common in bacteria, negatively correlated with constitutive hypermutation, variable in its response, correlated with niche and virulence, and occurs by different mechanisms
under different stressors; the most prominent examples will be discussed here.
Source: Mutation as a Stress Response and the Regulation of Evolvability. Rodrigo S. Galhardo, P. J. Hastings, and Susan M. Rosenberg. Crit Rev Biochem Mol Biol.
2007 Sep-Oct; 42(5): 399–435.
28. RNA
Dozens of RNA types: -zymes, circle, scan...
120 papers since 2004 on lncRNA - 10k in mammals
Various functions in DNA regulation, repair
Independently heritable - “cache”
Chimeric transcription
29. Epigenetics
Multiple modifications
Highly dynamic
Chicken stress syndromes are heritable
Dutch famine studies
Endocrine disruptors, e.g. DES
Paramutation
Early Embryos Reprogram DNA Methylation in Two Steps. Hao Wu, Yi Zhang. Cell, Volume 10, Issue 5, 4 May 2012, Pages 487–489
Slide 35: “Epigenetic mechanisms include covalent chemical modification of DNA (methylation) and chromatin (covalent histone modifications), non-coding
RNAs, and polycomb group (PcG) genes, and are ultimately related to the regulation of gene expression and chromatin structure.”
Besides methylation of cytosine as a regulator signal, there is also the deamination and controlled stepwise oxidation of methylated cytosine via DNA dioxygenases.
Cytosine deaminase and thymine-DNA glycosylase catalyze and reverse, respectively, the C-T conversion. Cytosine modification is a dynamic process, occurring at different
rates in the paternal vs maternal DNA, with a minimum at the blastocyst stage. Certain transposons like LINEs and LTRs are unleashed, but other classes, like IAPs
(intracisternal A-particles) are still highly methylated even during the blastocyst stage. Not all DNA methylation is erased during early development (at least in mice).
Broader studies, like the famous Dutch Famine Studies, show that epigenetic effects are robust and discernible in humans. Details can be found elsewhere, but one of the
first findings was that you lived longer if your grandfather starved when he hit puberty, and you had a shortened lifespan if your grandmother starved while she was in utero.
In humans, the effects of endocrine disruptors, like diethylstilbestrol, have been well documented to extend to the next generation via diverse epigenetic effects.
Epigenetic effects on learning have been found in mice and chickens. The social implications of this research are unsettling:
“The scientists grew groups of chickens under stressful conditions, where a randomly fluctuating day-night rhythm made access to food and resting perches
unpredictable. This caused a marked decrease in the ability of the stressed birds to solve a spatial learning task. Remarkably, their offspring also had a decreased
learning ability, in spite of being kept under non-stress conditions from the point of egg-laying. They were also more competitive and grew faster than offspring of
non-stressed birds.”
One well-documented example of stable epigenetic change is “paramutation” in maize plants[199]. Paramutation refers to the phenomenon whereby certain
alleles of specific genetic loci will convert, or “paramutate,” the normal wild-type form of a locus to the epimutated version in the progeny of heterozygous plants
that carry both forms of the allele. No DNA sequences are altered, but epigenetic marking patterns change. These paramutations have proven stable for many
generations of maize breeding. A similar form of paramutation has been reported in mice, but there is less data on how long the epimutations endure [200].
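The paramutation pattern described above has a simple logical shape, caricatured here (a toy of the maize-style system: the allele labels are illustrative, and no real epigenetics is modeled):

```python
# Sketch of paramutation: in a heterozygote, the paramutagenic epiallele
# converts its partner, and the converted state is itself transmitted.
# No DNA sequence ever changes here - only an epigenetic "mark".

def cross(allele_a: str, allele_b: str):
    # "B'" = paramutagenic epiallele, "B" = ordinary epiallele.
    if "B'" in (allele_a, allele_b):
        return ("B'", "B'")   # the partner allele is converted, heritably
    return (allele_a, allele_b)

gen1 = cross("B'", "B")       # heterozygous plant
assert gen1 == ("B'", "B'")
gen2 = cross(gen1[0], "B")    # the converted allele is itself paramutagenic
assert gen2 == ("B'", "B'")
```

The self-propagating conversion is what makes the epimutation stable over many generations of breeding, as the quote notes.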
“DNA methylation is highly dynamic in germ cells and during the early development of mammalian embryos. In primordial germ cells (PGCs) DNA methylation is
erased and new parent-of-origin patterns are established in mature gametes genome-wide and at differentially methylated regions of imprinted genes. After
fertilization, DNA methylation is reprogrammed again in the developing embryo. At blastocyst stage the overall levels of 5-methyl cytosine are low and most gene
promoters are not methylated. New patterns of DNA methylation are established in embryonic lineage cells by the epiblast stage E6.5 and these continue to change
in differentiated tissues. Although DNA methyltransferases are absolutely essential for this process, there are other proteins that regulate the action of these
enzymes either globally or at specific genomic loci.”
Heritable genome-wide variation of gene expression and promoter methylation between wild and domesticated chickens. Daniel Nätt, Carl-Johan Rubin, Dominic Wright, Martin Johnsson, Johan Beltéky, Leif Andersson, and Per Jensen. 4 February 2012.
Early Embryos Reprogram DNA Methylation in Two Steps. Hao Wu, Yi Zhang. Cell, Volume 10, Issue 5, 4 May 2012, Pages 487–489.
30. Horizontal Gene Transfer
Conjugation and transduction
Sometimes sequence specific
Common bacterial adaptation - 2-33%
“informational” vs “operational” genes
Patching damage
Slide 21: Other problems with the Modern Synthesis include the prevalence of promiscuous gene swapping among organisms. Horizontal gene transfer in bacteria
and archaea is extremely common and widespread, and occurs by various mechanisms, including conjugation, transduction (via phages), and competence (just plain
absorption of free-floating DNA), which can sometimes be sequence specific - organisms “sense” certain genetic sequences before taking them in (you don’t want to absorb a
viral sequence). Stress and DNA damage are well known to enhance competence in bacteria.
Interesting DARPA-funded research suggests that there are two distinct classes of genes when it comes to sharing: “informational” and “operational.” The former
represents “how to” genes (common coded enzymes, for example) while the latter gives regulatory information on morphology and context (like a computer operating
system). Informational genes are much more commonly exchanged all throughout the tree of life while operational genes seem to stay in (and define) specific evolutionary
lineages. See papers by Deem and Park, (bio)physicists at Rice.
31. Humans
Telegony
Microchimerism
Cancer cells
rRNA
Mitochondria
From humans to pathogens
Retroviral “syncytins”
Slide 22: Although many snazzy genes are thought to be products of horizontal gene transfer (HGT), such as the vertebrate adaptive immune system (according to
Professor Michael Deem), human somatic cells appear to share DNA as well. Studies show that apoptotic cell debris carrying DNA is incorporated into the genome of
neighboring cells (via simple fluorescent labeling studies). See “Extracellular Nucleic Acids,” edited by Yo Kikuchi and Elena Y. Rykova. 2-5% of the leukocytes of a
trauma patient who received donor whole blood will remain chimeric after 10 years.
It appears that some portion of bacterial genomes is incorporated into cancer cells and even into the mitochondria of cancer cells (at least, bacterial rRNA transcripts have
been found in cancer cells). It’s not clear where this is coming from or what they are doing there, but tumor-causing bacteria are a common phenomenon in other
kingdoms, and Bartonella is well known for occasionally causing endothelial tumors. It is even hypothesized that the mammalian placenta uses repurposed viral genes in the
creation of the all-important syncytium (e.g. the herpesviruses are known for their ability to induce syncytia formation in infected tissues). In fact, it appears that different virus
types contributed these genes to different groups of eutherian mammals. This, along with the wide diversity of placental structures, suggests placentas may have evolved
independently. Older siblings pass on bits of their DNA to younger siblings via their common mother, the mother’s cells get DNA from all her fetuses, and, in fruit flies at
least, internal fertilization by one male somehow passes on traits to offspring sired by a subsequent male, apparently vindicating the ancient Aristotelian theory of telegony. So my
Brazilian buddy Fernando who hates condoms is actually on to something.
$ Male Microchimerism in the Human Female Brain: “In humans, naturally acquired microchimerism has been observed in many tissues and organs. Fetal
microchimerism, however, has not been investigated in the human brain. Microchimerism of fetal as well as maternal origin has recently been reported in the
mouse brain. In this study, we quantified male DNA in the human female brain as a marker for microchimerism of fetal origin (i.e. acquisition of male DNA by a
woman while bearing a male fetus).”
$ “We examined bacterial DNA integration into the human somatic genome. Here we present evidence that bacterial DNA integrates into the human somatic
genome through an RNA intermediate, and that such integrations are detected more frequently in (a) tumors than normal samples, (b) RNA than DNA samples, and
(c) the mitochondrial genome than the nuclear genome... Acinetobacter DNA were fused to human mitochondrial DNA in acute myeloid leukemia samples...DNA
with similarity to Pseudomonas DNA near the untranslated regulatory regions of four proto-oncogenes. This supports our hypothesis that bacterial integrations
occur in the human somatic genome that may potentially play a role in carcinogenesis.”
Sources: Bacteria-Human Somatic Cell Lateral Gene Transfer Is Enriched in Cancer Samples. David R. Riley, Karsten B. Sieber, Kelly M. Robinson, James Robert White,
Ashwinkumar Ganesan, Syrus Nourbakhsh, Julie C. Dunning Hotopp
Horizontal gene transfer: you are what you eat. Holmgren L .Biochem Biophys Res Commun. 2010 May 21;396(1):147-51. doi: 10.1016/j.bbrc.2010.04.026.
32. Other critters
Eukaryote symbiogenesis
Coffee borer
Bacterial polyketides
2% of Rafflesia genes
Photosynthetic sea slug
Amborella mito’s
Wolbachia
Slide 20: Here are a few more organisms (out of many, many others) that exhibit known HGT. The coffee borer, a major coffee pest, has a gene (HhMAN1) which
encodes a mannanase, a glycosyl hydrolase, for digesting the major sugar in coffee berries. This protein, according to the researchers, is clearly of bacterial origin.
Many fungi, especially those that form symbiotic lichens, are loaded with bacterial polyketides - common defensive chemicals. The Rafflesia plant, a famous smelly jungle
parasite of certain tropical vines, has up to 2% of its transcribed genes stolen from its host, most of them replacing the original. The sea slug Elysia chlorotica steals
chloroplasts from the algae it eats, but what is truly amazing is that this animal has the required genes in its nucleus to keep those chloroplasts functioning - the only animal
ever discovered to have them. The mitochondria of the plant genus Amborella have absorbed the mitochondrial genomes of other plant species. How or why, we don’t
know. Wolbachia is a common endosymbiont and possible parasite of many insects that has a profound effect on their life cycle, reproduction, and sex determination (it can
turn all the offspring of an insect into females to ensure it gets passed down). There is ample evidence for gene exchange in this interaction. Especially with arthropods and
plants, the variety of bacterial interactions (and therefore the possibility of gene exchange) is enormous - these interactions certainly alter evolutionary development in a non-
random way.
Source: Adaptive horizontal transfer of a bacterial gene to an invasive insect pest of coffee. Ricardo Acuña, Beatriz E. Padilla, Claudia Flórez-Ramos, José D. Rubio,
Juan C. Herrera, Pablo Benavides, Sang-Jik Lee, Trevor H. Yeats, Ashley N. Egan, Jeffrey J. Doyle, and Jocelyn K. C. Rose
33. Bdelloid Rotifers
400+ species
8% non-metazoan genes
40% enzymes via HGT
Non-homologous
chromosomes - ameiotic
Biochemical Diversification through Foreign Gene Expression in Bdelloid
Rotifers Chiara Boschetti, Adrian Carr, Alastair Crisp, Isobel Eyres,
Yuan Wang-Koh, Esther Lubzens, Timothy G. Barraclough, Gos
Micklem, Alan Tunnacliffe. November 15, 2012
Slide 21: One organism that deserves its own slide is the bdelloid rotifer, a group that includes a few hundred species and was, until recently, believed to be the
largest group of asexual organisms. Asexuals are very rare and scattered in nature (it’s not a very effective mode of life, and we will discuss theories about this later). One
theory is Muller’s ratchet, which states that without genetic recombination, deleterious mutations accumulate irreversibly and eventually wipe out the species. When a bdelloid
genome (Adineta vaga) was studied (it contains an archaic non-functional diploid set, making the organisms technically tetraploid but functionally diploid, allegedly), it was found that they somehow
undergo allele exchange and recombination, just like sexual animals. One theory was that, like some related rotifers, they occasionally spawn males, which would allow
genetic recombination, though such males have never been observed. Furthermore, the genome (its functional half) also included 8% non-animal genes (from plants, fungi, and
prokaryotes), especially in sub-telomeric regions. It contained the largest diversity (though lowest abundance) of transposable elements of any known genome (we’ll talk more about
transposons later). One theory is that they patch their DNA with whatever is available after desiccation cycles, much like bacterial competence. Some of these foreign genes
were indeed transcribed and produced functional proteins associated with Homeobox genes (morphological regulatory elements), redox processes, detoxification and antioxidant activity,
genome repair and regulation, and glycoside hydrolases (digestion). It’s no surprise that these rotifers are 10x more resistant to radiation and oxidative stress than related
species.
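Muller’s ratchet is easy to see in a toy simulation. The sketch below is illustrative only - the population size, selection coefficient, and mutation rate are arbitrary placeholders, not rotifer parameters - but it shows the one-way “click”: with no recombination and no back-mutation, the minimum mutation load in the population can only stay flat or increase, because once the least-loaded class is lost to drift it can never be rebuilt.

```python
import random

def muller_ratchet(n=100, gens=200, s=0.05, u=0.3, seed=0):
    """Toy Muller's ratchet: an asexual population with no recombination
    and no back-mutation. Each individual is represented only by its
    count of deleterious mutations; fitness is (1 - s) ** count."""
    rng = random.Random(seed)
    pop = [0] * n          # everyone starts mutation-free
    min_load = []          # least-loaded class, tracked each generation
    for _ in range(gens):
        # selection + drift: sample n offspring, parents weighted by fitness
        weights = [(1 - s) ** k for k in pop]
        pop = rng.choices(pop, weights=weights, k=n)
        # mutation: each offspring gains one new deleterious mutation
        # with probability u; nothing ever removes one
        pop = [k + (rng.random() < u) for k in pop]
        min_load.append(min(pop))
    return min_load

loads = muller_ratchet()
# the ratchet only clicks forward: the minimum load never decreases
assert all(b >= a for a, b in zip(loads, loads[1:]))
print(f"min load: gen 1 = {loads[0]}, gen {len(loads)} = {loads[-1]}")
```

Sexual recombination escapes the ratchet by reassembling low-load genotypes from parents carrying different mutations; the quote below suggests bdelloids may substitute gene conversion and HGT for that role.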
“The high number of horizontally acquired genes, including some seemingly recent ones, suggests that HGTs may also be occurring from rotifer to rotifer. It is
plausible that the repeated cycles of desiccation and rehydration experienced by A. [Adineta] vaga in its natural habitats have had a major role in shaping its
genome: desiccation presumably causes DNA double-strand breaks, and these breaks that allow integration of horizontally transferred genetic material also
promote gene conversion when they are repaired. Hence, the homogenizing and diversifying roles of sex may have been replaced in bdelloids by gene conversion
and horizontal gene transfer, in an unexpected convergence of evolutionary strategy with prokaryotes… By contrast, in bdelloid rotifers we found many genes
that appear to have originated in bacteria, fungi, and plants, concentrated in telomeric regions along with diverse mobile genetic elements. Bdelloid proximal
gene-rich regions, however, appeared to lack foreign genes, thereby resembling those of model metazoan organisms. Some of the foreign genes were defective,
whereas others were intact and transcribed; some of the latter contained functional spliceosomal introns. One such gene, apparently of bacterial origin, was
overexpressed in Escherichia coli and yielded an active enzyme. The capture and functional assimilation of exogenous genes may represent an important force in
bdelloid evolution. Approximately 80% of horizontally acquired genes expressed in bdelloids code for enzymes, and these represent 39% of enzymes in identified
pathways. Many enzymes encoded by foreign genes enhance biochemistry in bdelloids compared to other metazoans, for example, by potentiating toxin
degradation or generation of antioxidants and key metabolites. They also supplement, and occasionally potentially replace, existing metazoan functions. Bdelloid
rotifers therefore express horizontally acquired genes on a scale unprecedented in animals, and foreign genes make a profound contribution to their metabolism.”
Source: Biochemical Diversification through Foreign Gene Expression in Bdelloid Rotifers. Chiara Boschetti, Adrian Carr, Alastair Crisp, Isobel Eyres, Yuan Wang-
Koh, Esther Lubzens, Timothy G. Barraclough, Gos Micklem, Alan Tunnacliffe