The document discusses the concepts of emergence and reduction in physics, specifically arguing that they pose a false dichotomy as they are logically independent. It provides examples where emergence occurs alongside reduction, such as the emergence of classical behavior from quantum mechanics in certain limits, and the emergence of thermodynamic laws and properties from statistical mechanics. The key point is that reduction, viewed as deduction, allows for novelty through the choices made in taking limits, such as which symmetries to break or which states to keep; so emergence and reduction can be compatible.
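One standard illustration of this point about limits (a textbook example, not taken from the slides themselves) is the classical limit of quantum mechanics. Ehrenfest's theorem shows that quantum expectation values obey equations of Newtonian form, but genuinely classical behavior is deduced only under a further assumption about which states one keeps in the limit:

```latex
% Ehrenfest's theorem: expectation values evolve quasi-classically
\frac{d}{dt}\langle \hat{x} \rangle = \frac{\langle \hat{p} \rangle}{m},
\qquad
\frac{d}{dt}\langle \hat{p} \rangle = -\,\langle V'(\hat{x}) \rangle .
% Newton's law for \langle \hat{x} \rangle follows only when
% \langle V'(\hat{x}) \rangle \approx V'(\langle \hat{x} \rangle),
% e.g. for narrow wave packets; the restriction to such states
% is where the novelty (classicality) enters the deduction.
```

This is the sense in which reduction-as-deduction leaves room for novelty: the deduction goes through only relative to a choice of limit and of states.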
Emergence and Reduction in Physics
1. Emergence and Reduction in Physics
SEBASTIAN DE HARO
UNIVERSITY OF AMSTERDAM AND UNIVERSITY OF CAMBRIDGE
LONDON, 3 JANUARY 2017
2. Emergence and Reduction: motivation
Emergence and reduction are two concepts which are often invoked in discussions relating different levels in nature, or different disciplines in science.
For example, it is sometimes said that life ‘emerges’ out of the underlying chemical or biological components. The implication of this use of emergence is that there is novelty in that which emerges (i.e. life).
Motto: “more is different” (Philip W. Anderson, 1972).
Reduction (or reductionism) is often used to imply that the whole is “nothing but” the sum of its parts. For example, the physics of the solar system is nothing but the sum of the mutual two-body interactions of the planets and the Sun.
3. Aims of this talk: emergence and reduction
(i) To clarify the use of these concepts (‘emergence’ and ‘reduction’)
in science, especially in physics.
(ii) Specifically: to argue that the contrast ‘emergence vs. reduction’
poses a false dichotomy, since they are independent.
(iii) To point out that the independence of the two concepts may open
interesting avenues for the philosophy of mind.
But I will not work this out in a theory of the mind.
4. Emergence vs. reduction: not just an academic discussion
Steven Weinberg: “One of the members of the [SSC] board argued
that we should not give the impression that we think that
elementary particle physics is more fundamental than other fields,
because it just tended to enrage our friends in other areas of
physics. The reason we give the impression that we think that
elementary particle physics is more fundamental than other
branches of physics is because it is. I do not know how to defend
the amounts being spent on particle physics without being frank
about this.”
Philip Anderson: “They [the results of particle physics] are in no sense
more fundamental than what Alan Turing did in founding the
computer science, or what Francis Crick and James Watson did in
discovering the secret of life.”
5. This is what is left of the Superconducting Super Collider…
6. Stephen Hawking
“By far the most important [prediction] is supersymmetry which is
fundamental to most attempts to unify Einstein's General Relativity
with Quantum Theory. This would be confirmed by the discovery of
superpartners to the particles that we already know. The
Superconducting Super Collider (the SSC) was being built in Texas
and would have reached the energies at which super partners
were expected. However, the United States went through a fit of
feeling poor and canceled the project half way. At the risk of
causing embarrassment, I have to say I think this was a very short
sighted decision. I hope that the US, and other governments will
do better in the next millennium.” (Millennium lecture)
7. A misconstrued debate?
On the face of it, Anderson seems to have adopted the wiser attitude.
But both seem to have part of the truth.
In fact, they both agree in their acceptance of reductionism:
“The reductionist hypothesis may still be a topic for controversy among
philosophers, but among the great majority of active scientists I think it is
accepted without question.” (Anderson, opening paragraph).
In case this was not sufficiently clear: “We must all start with reductionism, which
I fully accept.”
But reduction and reductionism (complete reduction, i.e. “nothing but…”)
are different things!
8. A misconstrued debate? And a proposed reconciliation
Reduction is often taken to be emergence’s natural opponent.
But there is a growing consensus, in philosophy of physics, that emergence and
reduction need not be each other’s opposites.
In fact, they are logically independent of each other!
Jeremy Butterfield (2011):
“Emergence, Reduction and Supervenience: A Varied Landscape”
“Less is Different: Emergence and Reduction Reconciled”
And so, we do not need to take sides in the ‘emergence vs. reduction’
debate.
We better get some good idea of what emergence and reduction are in
physics.
Else, how will our theories fare when (if!) we try to apply these concepts to the mind?
9. Intermezzo: vindicating non-reductionist science
The idea of reduction is, of course, ubiquitous in science, especially physics.
Familiar cases:
At low speeds, special relativity (Einstein) reduces to classical mechanics (Newton).
For sufficiently large systems, quantum mechanics reduces to classical mechanics, in
the limit ℏ → 0 (more precisely, the limit in which actions are large compared to ℏ).
But the cases in which reduction fails are equally ubiquitous!
10. An analogy: mapping a sphere to the plane
There is no simple way to project a sphere onto the plane, without
‘tearing the sphere apart’. There will always be at least one point
(e.g. the North pole) which does not map.
In doing so, we lose information about the topology of the sphere.
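This can be made precise with the stereographic projection, a standard example (my addition, not in the slides): every point of the sphere except the north pole N = (0, 0, 1) is mapped to the plane, but N itself has no image:

```latex
\[
  \pi : S^2 \setminus \{N\} \;\to\; \mathbb{R}^2, \qquad
  \pi(x, y, z) \;=\; \Big( \frac{x}{1-z},\; \frac{y}{1-z} \Big)
\]
```

The map is undefined exactly at z = 1. And no single such map can cover the whole sphere: S² is compact while ℝ² is not, so a continuous surjection from the sphere onto the plane's image cannot preserve the sphere's topology.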
11. Example: Maxwell’s theory of electromagnetism
James Clerk Maxwell (1831-1879) unified electric and magnetic phenomena
through his theory of electromagnetism (Maxwell’s equations).
According to this theory, light is a wave propagating at a constant speed, c.
The basic quantity of Maxwell’s theory was the electromagnetic field. This field
could not be reduced to the quantities of classical mechanics (which describes
particles). Nevertheless, Maxwell believed that his theory would ultimately
receive a mechanical explanation. This never happened.
The shift to Maxwell’s theory required the introduction of new concepts.
Maxwell’s theory was not simply a complex application of the fundamental
principles already known.
Reduction is useful in many cases. But there are cases in which one simply
needs to think in a different way. So, reduction is not always the most
productive way to proceed in formulating new scientific theories.
Insisting on reduction in such cases could be like mapping the sphere to the plane!
12. Example: Maxwell’s theory of electromagnetism
One may say: but both Maxwell’s theory and classical mechanics are
special cases of a bigger framework, viz. field theory.
Agreed! But this only reinforces my point. The issue is not about the lack
of an explanation or a more comprehensive theory, but about
reductionism not always being the right way to approach a problem.
13. Emergence, Reduction, Supervenience
Emergence: properties or behaviour of a system which are novel
and robust relative to some appropriate comparison class.
Novel means, typically: not definable from the comparison class,
maybe showing features absent from the class.
Robust: the same for various choices of, or assumptions about, the
comparison class.
Think of a composite system (e.g. a metal): the properties and
behaviour (e.g. hardness, electrical conductivity, colour) are novel
and robust compared to those of its components (atoms).
Robustness is the independence of the properties of the metal from the
specific properties (location, etc.) of the individual atoms.
14. Emergence, Reduction, Supervenience
Reduction: deduction of one theory (or phenomenon) from another.
For example, classical mechanics can be deduced from quantum mechanics
(by taking a limit, ℏ → 0).
Normally, reduction requires the introduction of bridge principles: new definitions,
or correspondence rules between the two theories, regarded as two languages,
i.e. a dictionary between the two (Ernest Nagel, 1961).
Also, the taking of these limits is often delicate: the strict limits are often not well-
defined, and one needs to appropriately renormalise the physical quantities of
interest. This procedure is often not unique, nor can it be defined a priori from the
theory of interest alone.
Physical judgment is needed: to decide which is the limit of interest, and how to take
this limit.
16. Double-well potential
The ground-state wave-function is symmetric between the two wells of the potential.
The naive limit ℏ → 0 preserves this symmetry, and so the state in the
‘classical’ limit still has two peaks.
This is usually solved by breaking the symmetry, e.g. by adding a small term
to the potential that breaks the symmetry (since realistic potentials will have
correction terms that remove the symmetry).
In this case, one of the two peaks is suppressed, and the classical limit is as we
would expect.
But, in doing so, one is sneaking in extra information!
In other cases, other kinds of extra information may be needed: initial conditions,
boundary conditions, choice of states to keep, etc.
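The mechanism can be sketched numerically. The following is a minimal illustration (my own, not from the talk), assuming a quartic double well V(x) = (x² − 1)² with unit mass, discretized on a grid; a small linear tilt εx plays the role of the symmetry-breaking perturbation:

```python
import numpy as np

def ground_state_density(hbar, eps=0.0, n=400):
    """Ground-state probability density of
    H = -(hbar^2/2) d^2/dx^2 + (x^2 - 1)^2 + eps*x
    via finite differences and dense diagonalization."""
    x = np.linspace(-2.5, 2.5, n)
    dx = x[1] - x[0]
    # 3-point Laplacian for the kinetic term
    lap = (np.diag(np.ones(n - 1), -1) - 2.0 * np.eye(n)
           + np.diag(np.ones(n - 1), 1)) / dx**2
    V = (x**2 - 1.0)**2 + eps * x  # symmetric well, plus optional tilt
    H = -0.5 * hbar**2 * lap + np.diag(V)
    _, vecs = np.linalg.eigh(H)
    p = vecs[:, 0]**2
    return x, p / p.sum()

# Symmetric potential: the ground state keeps two peaks as hbar shrinks,
# with equal weight in each well.
x, p_sym = ground_state_density(hbar=0.1)
# Tiny tilt: one peak is suppressed and the state localizes in the
# lower well, recovering the expected classical behaviour.
x, p_flea = ground_state_density(hbar=0.1, eps=0.05)
```

With `eps = 0`, the weight on each side of the barrier is 0.5; with the tilt, essentially all the weight sits in the lower well, even though the perturbation is small compared to the barrier height.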
17. Emergence, Reduction, Supervenience
Supervenience: a set of properties A supervenes upon another set B
just in case no two things can differ with respect to A-properties
without also differing with respect to their B-properties. In short:
“there cannot be an A-difference without a B-difference”.
Take A = classical mechanics, B = quantum mechanics.
Any change in A (e.g. a change in the position of the particle) will imply
a corresponding change in B (e.g. a change in some of the observables
of the corresponding quantum theory).
Supervenience is weaker than reduction, because we do not need
to have an explicit relation between A and B (such a relation may
only be given implicitly, or may not exist at all).
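The definition can be stated compactly (my formalization of the slide’s wording): for objects x, y and property families A, B,

```latex
\[
  A \text{ supervenes on } B
  \;\iff\;
  \forall x \,\forall y \; \big(\, x \approx_B y \;\Rightarrow\; x \approx_A y \,\big)
\]
```

where x ≈_B y means that x and y agree on all B-properties. Contraposing gives the slogan: an A-difference requires a B-difference.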
18. Emergence and Reduction
Clearly, there is a tension between emergence and reduction. For
emergence emphasises novelty, whereas reduction emphasises
deduction.
And logic teaches that valid deduction gives no new ‘content’.
So, how can there be novel behaviour, in cases in which there is
reduction?
Indeed, many authors take the definition of emergence to also involve
irreducibility (i.e. lack of reduction), thus turning this tension into a built-in
incompatibility.
Answer: there can be novelty in the use of limits.
As we have already seen, taking limits involves making choices about
e.g. which symmetries to break, or which states to keep!
19. A (technical, but important) warning about reduction!
There are additional reasons why reduction is often not unique, but requires
making choices. Recall the idea of reduction as taking a limit:
Defining the limit of a sequence depends on a choice of a topology (e.g. a
choice of open sets).
A given sequence may converge in one topology, but not in another.
And so, to claim that classical mechanics can be deduced as a limit of special
relativity, one needs to say with respect to which topology it is deduced.
It has been argued (S. Fletcher, British Journal for the Philosophy of Science, 2016)
that there is no canonical choice of topology.
So, also for this additional reason, reduction (viewed as deduction) requires
making some global choices about how theories or phenomena are to be
compared.
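The dependence of limits on topology can be illustrated with a textbook sequence (my example, not from the talk): whether f_n(x) = xⁿ converges on [0, 1) depends on whether one uses the pointwise or the uniform (sup-norm) topology:

```python
def f(x, n):
    """f_n(x) = x**n on [0, 1)."""
    return x ** n

# Pointwise topology: at each fixed x < 1, f_n(x) -> 0.
pointwise = [f(0.9, n) for n in (1, 10, 100, 1000)]

# Uniform (sup-norm) topology: evaluating at x_n = 1 - 1/n gives
# (1 - 1/n)**n -> 1/e, so the sup-norm distance to the zero function
# stays above ~0.36 for all n: the sequence does NOT converge here.
sup_lower_bounds = [f(1.0 - 1.0 / n, n) for n in (10, 100, 1000)]
```

The same sequence converges to the zero function in one topology and fails to converge in the other; the claim that a limit “exists” only makes sense once a topology has been fixed.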
20. Summary so far, on reduction
Consider completely reducing one theory to another (‘reductionism’):
Reduction = deduction.
There are several valid deductions possible, depending on:
Choices of states and quantities, symmetries to break (flea on Schrödinger’s cat).
Initial and boundary conditions.
Non-uniqueness of limits (which limits to take, choices of topology).
...
In most cases in which we are told that reduction has been successfully carried out,
implicit choices have been made about these issues.
Additional (usually global) information is sneaked into what (perhaps superficially)
looks like a complete reduction.
And so, it is not true that the ‘parent’ theory, all by itself, determines everything (even if
everything can be deduced, given the right assumptions!).
21. Emergence and Reduction
This means that reduction (deduction) allows for novelty, through the choices
made.
So, emergence and reduction are, in fact, compatible!
Claim (Butterfield 2011): emergence is independent of reduction and
supervenience. So there can be emergence with or without reduction
(and the same for the weaker notion of supervenience).
Reduction without emergence: unsurprising, given that these are often seen as
each other’s opposites.
Emergence without reduction: also unsurprising.
Surprising, nevertheless, that such cases appear in physics! Vis-à-vis Anderson’s claim
that “The reductionist hypothesis... is accepted without question [by physicists]”.
See George and Sachith’s talk!
Reduction with emergence: surprising! And the examples abound!
22. Example 1: the double-well potential
We have already seen the possible emergence of a classical state of the
double-well potential from the quantum state.
What emerges? In the limit ℏ → 0, the particle has a well-defined position
(determined deterministically).
There is novelty, because that property is forbidden by the quantum theory.
It will be robust, if it does not depend on details of the microscopic
perturbation (the flea on Schrödinger’s cat).
Reduction: because there is a well-defined limit from which the classical
behaviour is mathematically derived.
Since we needed to use extra information about how to take the limit (and in
making this choice we are often motivated by the kind of classical behaviour
we wish to reproduce), we may be inclined to call this reduction incomplete.
23. Example 2: Thermodynamics
Thermodynamics is the branch of physics that describes physical systems in terms of
macroscopic quantities such as heat, temperature, energy, work, etc.
Statistical mechanics is the application of (classical or quantum) mechanics to
systems of many particles (e.g. 10²³, roughly the number of atoms in 12 g of carbon).
The quantities and laws of thermodynamics readily follow from the ‘thermodynamic
limit’ of statistical mechanics, i.e. the limit in which the volume of the system and the
number of particles go to infinity (keeping their ratio constant).
In this limit, thermodynamical quantities such as heat and temperature, which have
no microscopic analogue, emerge: in an uncontroversial way (as I have discussed:
these are novel properties, robust relative to the microscopic dynamics).
So we have a case of emergence with reduction.
Notice the strong sense of emergence here: the laws of thermodynamics emerge
along with the thermodynamical properties.
The precise nature of the reduction limit is still a matter of debate (e.g. arrow of time!).
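A toy version of this limit (my illustration, not from the talk): for N independent two-state spins with equal occupation, the entropy per spin S/N = ln C(N, N/2) / N only settles to a definite value, ln 2, as N → ∞; the intensive quantity emerges in the limit:

```python
import math

def entropy_per_spin(n):
    """S/N = ln(binomial(n, n//2)) / n for even n, via log-gamma
    (numerically stable even for very large n)."""
    log_binom = math.lgamma(n + 1) - 2.0 * math.lgamma(n // 2 + 1)
    return log_binom / n

# The per-particle entropy approaches ln 2 ~ 0.6931 from below:
trend = [entropy_per_spin(n) for n in (10, 100, 10_000, 1_000_000)]
```

At any finite N the value still depends on N; only in the limit does “entropy per particle” become a sharp, N-independent quantity.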
24. A warning!
I have discussed a single notion of emergence (novel and robust behaviour,
relative to a comparison class).
There are several ways of construing ‘novelty’: epistemic-ontological,
weak-strong (in principle/in practice, amount of anti-reductionism).
Also, the comparison class can be synchronic or diachronic.
So, unpacking the details of the notion of emergence discussed, we obtain different
kinds of emergence, all of which nevertheless fall under the same conception.
Guay & Sartenaer, A New Look at Emergence
25. Some philosophical options for the mind
I think it is interesting to look at some philosophical options available
in trying to conceptualise the mind:
For simplicity, I base the comments here on the notion of ‘substance’, as
a useful notion.
A single substance:
Materialism: humans are only specific configurations of matter.
Idealism: only mind. Berkeley, Hegel,...
Dual aspect theory: one substance with two different aspects, a
physical and a mental aspect.
Dualism: Plato, Descartes,... Two individual substances, linked in
some (usually problematic, but interesting) way.
26. An excursion: double aspect theory
Emergence (and supervenience) seem to sit well with double aspect
theory.
The words you speak have a physical aspect (sound waves with specific
combinations of frequencies, etc.) and a mental aspect (the meaning that is
attached to those frequencies).
These two aspects cannot be separated without destroying the message of the
communication.
This meaning is actually emergent, in the way I described before.
The mind could work in this way as well. For our mental capacities are not
independent of the physical processes in our brain (which would amount
to a form of dualism).
Even if our mental capacities are supervenient on the brain (no mental activity
possible without brain activity), emergence still allows for novelty in the contents
of our thoughts and in the targets of our desires.
27. Summary and some conclusions
Emergence and reduction are independent notions.
I am not even discussing cases in which reduction fails, which arise in physics.
Even in cases in which there are strong forms of reduction (like the transition from
statistical mechanics to thermodynamics), reduction tends to be subtle and
incomplete (because non-unique). Even in such cases, no reductionism.
Global choices need to be made, such as: which quantities to take the limit of, and
how (which quantities to keep and which ones to throw away), which physical states
are relevant, boundary conditions, initial conditions, topology,... Typically, the number
of choices that one can make is infinite.
Despite the presence of deduction, this does not seem to be algorithmic.
So, perhaps, we should not conflate reduction/deduction with determinism (since
there are choices that need to be made for deduction).
We require our physical judgment to decide which forms of emergence are relevant.