Universal constants like Planck's constant h, the speed of light c, gravitational constant G, and Boltzmann's constant k can be used to structure theoretical physics. They lead to three main theories: quantum field theory (h and c), general relativity (c and G), and quantum statistics (h and k). While not fully unified, these theories underlie the standard models of particle physics and cosmology. Fundamental metrology provides reliable standards by determining the values of dimensionless constants that depend on h, c, k, and G. Metrology exists at the intersection of fundamental physics described by these theories and emergent physics involving statistical mechanics.
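The combinations of h, c, G, and k described above can be made concrete by forming the natural (Planck) units from them; this is a minimal illustrative sketch, with constant values taken from standard CODATA recommendations:

```python
import math

# Standard recommended values of the structuring constants (CODATA-style)
h = 6.62607015e-34      # Planck constant, J s
hbar = h / (2 * math.pi)
c = 2.99792458e8        # speed of light in vacuum, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
k = 1.380649e-23        # Boltzmann constant, J/K

# Natural (Planck) units built from the constants that organize the theories
l_P = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_P = math.sqrt(hbar * G / c**5)   # Planck time,   ~5.4e-44 s
m_P = math.sqrt(hbar * c / G)      # Planck mass,   ~2.2e-8 kg
T_P = m_P * c**2 / k               # Planck temperature, ~1.4e32 K
```

Each pairing of constants singles out one of the theories named above; combining all four fixes the scales at which they must meet.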
Vasil Penchev. Cyclic mechanics: The principle of cyclicity
1) The document discusses a theory of cyclic mechanics intended to unify quantum mechanics and general relativity.
2) It proposes several foundational principles, including that the universe can return to any point, time is not uniformly flowing, and all laws must be invariant to discrete and continuous transformations.
3) A key concept is the introduction of "quantum measure" and "quantum information" to provide a common measure for equating equations from different theories, with the goal of unifying them.
1) The document discusses the classical theory of electromagnetic radiation confined within an isothermal enclosure and the discrepancies with experimental observations.
2) It analyzes the temperature dependence of the energy density and pressure of the radiation using thermodynamic considerations.
3) This leads to the derivation of the Stefan-Boltzmann law relating the emissive power of a black body to the fourth power of its temperature.
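The Stefan-Boltzmann law derived above can be stated as j = σT⁴; the following sketch (with the standard value of σ assumed) shows its characteristic fourth-power scaling:

```python
# Stefan-Boltzmann law: emissive power per unit area of a black body, j = sigma * T^4
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emissive_power(T):
    """Total power radiated per unit area of a black body at temperature T (in kelvin)."""
    return SIGMA * T**4

# Doubling the temperature multiplies the emitted power by 2^4 = 16
assert abs(emissive_power(600.0) / emissive_power(300.0) - 16.0) < 1e-9
```

For the Sun's effective surface temperature of about 5772 K this gives roughly 6.3 × 10⁷ W/m², consistent with the observed solar output.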
This document presents new ideas in loop quantum gravity, including:
1. Deriving a relationship between time and vorticity using wavefunction continuity.
2. Introducing a new "Eikonal constraint" and showing how it removes acausality by gauging time to light cones.
3. Proposing a new form of the Hamiltonian constraint in the style of the Dirac equation, which reduces state fuzziness in line with Penrose's ideas.
1. The document presents an alternative theory called the Dynamic Universe (DU) approach that views relativity as a consequence of conservation of total energy rather than distorted time and distance.
2. In DU, time and distance are universal quantities, and relativity arises from the linkage between local energy states and the overall energy balance of space.
3. DU relies on mathematics based on an overall zero-energy balance as the fundamental law of nature, making it more metaphysically driven than the empirically based theory of relativity. Both approaches produce precise predictions.
This document discusses whether quantum mechanics is involved in the early evolution of the universe and if a Machian relationship between gravitons and gravitinos can help answer this question. It proposes that:
1) Gravitons and gravitinos carry information and their relationship, described as a Mach's principle, conserves this information from the electroweak era to today. This suggests quantum mechanics may not be essential in early universe formation.
2) A minimum amount of initial information, such as a small value for Planck's constant, is needed to set fundamental cosmological parameters and could be transferred from a prior universe.
3) Early spacetime may have had a pre-quantum state with low entropy and few degrees of freedom.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
This document discusses the key branches and concepts of modern physics. It begins by outlining the objectives of becoming familiar with modern physics, stating the postulates of special relativity, and differentiating between inertial and non-inertial reference frames. The main branches of modern physics discussed are atomic and nuclear physics, quantum physics, relativistic physics, solid state physics, and plasma physics. Special relativity introduced ideas like time dilation, length contraction, and mass increase that defy common sense. The general theory of relativity further unified these ideas and proposed that gravity is a manifestation of curved spacetime due to mass-energy and momentum.
1) John Nash presents an equation that is a 4th order covariant tensor partial differential equation applicable to the metric tensor of spacetime. This equation is formally divergence free like the Einstein vacuum equation.
2) The equation can be derived from a specific Lagrangian involving terms quadratic in the scalar curvature and Ricci tensor. Previous theorists had considered such Lagrangians in attempts at quantum gravity theories but had not focused on this specific choice.
3) The equation may allow a wider variety of gravitational waves, including compressional waves; standard general relativity, like electromagnetic theory, admits only transverse waves.
The objective of this paper is to propose an approach to the unification of physics by attempting
to construct a physical worldview which can be used as the context for a unified physical theory.
The underlying principle is that we have to construct a clear description of the physical world
before we can build a unified physical theory.
The present state of physics is such that there are many theories which all differ in the descriptive
context in which they operate. The theories of general relativity, quantum theory, quantum
electrodynamics, string theory and the standard model of particle physics are based on differing
concepts of the nature of the physical world.
Hilbert Space and pseudo-Riemannian Space: The Common Base of Quantum Informa... (Vasil Penchev)
Hilbert space, underlying quantum mechanics, and pseudo-Riemannian space, underlying general relativity, share a common base of quantum information. Hilbert space can be interpreted as the free variable of quantum information, and any point in it, being equivalent to a wave function (and thus to a state of a quantum system), as a value of that variable. In turn, pseudo-Riemannian space can be interpreted as the interaction of two or more quantities of quantum information, and thus as two or more entangled quantum systems. Consequently, one can distinguish local physical interactions, describable by a single Hilbert space (or by any factorizable tensor product of such spaces), from non-local physical interactions, describable only by a Hilbert space that cannot be factorized as a tensor product of the Hilbert spaces describing the interacting quantum subsystems separately. Any interaction that can be exhaustively described in a single Hilbert space, such as the weak, strong, and electromagnetic interactions, is local in terms of quantum information. Any interaction that cannot be so described is nonlocal in terms of quantum information. Any interaction exhaustively describable by pseudo-Riemannian space, such as gravity, is nonlocal in this sense. Consequently, all known physical interactions can be described on a single geometrical base interpreted in terms of quantum information.
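The factorizability criterion invoked above — local means describable by a tensor product, nonlocal means not — has a concrete textbook form for two qubits: a pure state factorizes exactly when its 2×2 coefficient matrix has rank 1, i.e. zero determinant. A minimal sketch (function name is my own):

```python
import math

# A two-qubit pure state |psi> = sum_{ij} a[i][j] |i>|j> factorizes into a
# tensor product of single-qubit states iff the 2x2 coefficient matrix a has
# rank 1, i.e. its determinant vanishes. A nonzero determinant means the
# state is entangled (non-factorizable).
def is_entangled(a, tol=1e-12):
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return abs(det) > tol

s = 1 / math.sqrt(2)
product_state = [[s, s], [0.0, 0.0]]   # |0> tensor (|0>+|1>)/sqrt(2): factorizable
bell_state = [[s, 0.0], [0.0, s]]      # (|00>+|11>)/sqrt(2): not factorizable
```

In the terminology of the abstract, `product_state` admits a description by separate Hilbert spaces (local), while `bell_state` does not (nonlocal).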
Albert Einstein (1879-1955) developed the theories of special and general relativity. Special relativity, published in 1905, established that the laws of physics are the same in all inertial frames of reference and that the speed of light in a vacuum is constant. General relativity, published in 1915, introduced gravitation as a result of the curvature of spacetime caused by the uneven distribution of mass and energy. One of Einstein's most famous equations is E=mc2, which shows that mass and energy are equivalent and interconvertible.
Albert Einstein (1879-1955) was a renowned German-born physicist who developed the theory of general relativity and special relativity. Some of his key contributions include developing the general theory of relativity, one of the pillars of modern physics, and his mass-energy equivalence formula E=mc2, which demonstrated that mass and energy are the same physical entity and can be changed into one another. He showed that the laws of physics are the same for all observers regardless of their motion or frame of reference.
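The mass-energy equivalence E = mc² summarized in the two abstracts above is a one-line computation; this sketch uses the standard value of c:

```python
C = 2.99792458e8  # speed of light in vacuum, m/s

def rest_energy(mass_kg):
    """Energy equivalent of a rest mass, E = m c^2, in joules."""
    return mass_kg * C**2

# One gram of mass is equivalent to roughly 9e13 J
e_one_gram = rest_energy(1e-3)
```

The enormous value of c² is why a tiny amount of mass corresponds to a vast amount of energy.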
This document provides an overview of modern physics concepts covered in a 1EE modern physics course, including special relativity. It discusses how classical physics concepts of space, time, and velocity were found to be inconsistent with experimental results. The document uses examples such as particle lifetime experiments and velocity addition to show how observations cannot be explained unless space and time are treated as relative and velocities are limited to the speed of light. The principles of relativity were developed by Einstein to resolve these inconsistencies and form the foundations of modern physics.
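The particle-lifetime and velocity-addition examples mentioned above follow directly from the Lorentz factor and the relativistic composition law; this is an illustrative sketch (the muon lifetime and speeds are my own example values):

```python
import math

C = 2.99792458e8  # speed of light in vacuum, m/s

def gamma(v):
    """Lorentz factor for speed v < c."""
    return 1.0 / math.sqrt(1.0 - (v / C)**2)

def dilated_lifetime(proper_lifetime, v):
    """Lifetime measured in the lab frame for a particle moving at speed v."""
    return gamma(v) * proper_lifetime

def add_velocities(u, v):
    """Relativistic velocity addition; the result never exceeds c."""
    return (u + v) / (1.0 + u * v / C**2)

# A muon (proper lifetime ~2.2 microseconds) at 0.99c lives ~7x longer in the lab
tau_lab = dilated_lifetime(2.2e-6, 0.99 * C)

# Composing 0.9c with 0.9c gives ~0.994c, not the classical 1.8c
w = add_velocities(0.9 * C, 0.9 * C)
```

Both results illustrate the abstract's point: observations are only consistent if time is relative and speeds are bounded by c.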
Introduction to Classical Mechanics:
UNIT-I: Elementary survey of classical mechanics: Newtonian mechanics for a single particle and for systems of particles; types of forces, with single-particle examples; limitations of Newton's program; conservation laws, viz. linear momentum, angular momentum, and total energy; the work-energy theorem; open systems (with variable mass); the principle of virtual work; D'Alembert's principle and its applications.
UNIT-II: Constraints: definition, types, causes and effects, need, and justification for realizing constraints on the system.
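The work-energy theorem listed in UNIT-I can be checked numerically for a single particle: the total work done by the net force equals the change in kinetic energy. This is an illustrative sketch (function names and step sizes are my own choices):

```python
# Numerical check of the work-energy theorem for a single particle under a
# constant force: W = integral of F dx should equal the change in kinetic energy.
def simulate(force, mass, v0, dt, steps):
    """Integrate F = m a in small steps; return (work_done, delta_KE)."""
    v, x, work = v0, 0.0, 0.0
    for _ in range(steps):
        a = force / mass
        dx = v * dt + 0.5 * a * dt**2   # exact displacement for constant a
        x += dx
        work += force * dx              # accumulate W = integral of F dx
        v += a * dt
    delta_ke = 0.5 * mass * v**2 - 0.5 * mass * v0**2
    return work, delta_ke

# 2 N on 1 kg from rest for 1 s: v = 2 m/s, x = 1 m, so W = delta_KE = 2 J
work, delta_ke = simulate(force=2.0, mass=1.0, v0=0.0, dt=1e-4, steps=10000)
```

The two quantities agree to numerical precision, as the theorem requires.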
The document discusses two approaches to describing relativistic phenomena: 1) an observer-oriented theory that relies on distorted time and distance, and 2) a system-oriented theory with universal time. The author argues that a holistic perspective views time and distance as universal quantities and sees relativity as a consequence of energy conservation. In this view, the dynamics of a spherically closed, expanding hyperspace can explain phenomena like varying atomic clock speeds via changes in local motion and energy, without needing different times. The choice between the two approaches is considered philosophical as both produce accurate predictions.
Dark matter modeled as a Bose-Einstein gluon condensate with an energy density relative to baryonic energy density in agreement with observation (arXiv:1507.00460)
1) The document proposes a theory of statistical geometrodynamics derived from novel statistical postulates about fundamental constituents of spacetime called "geomets".
2) Key results include deriving the Einstein field equations as a first law of geometrodynamics, and relating the Ricci curvature tensor to the entropy of a holographic surface bounding spacetime via a second law.
3) A third law and a zeroth law of geometrodynamics are also proposed, relating the mean curvature of the holographic surface to bit saturation in the bulk and on the surface.
The Phase Theory towards the Unification of the Forces of Nature the Heart Be... (IOSR Journals)
A new theory is presented, for the first time, called the "Phase Theory", which is the natural evolution of physical thought and is considered the step beyond superstring theory. This theory addresses the unsolved mysteries of matter, antimatter, and interactions, and takes a wide step towards the unification of the forces of nature. In this theory, the vibrating string whose different frequency modes determine the different types of elementary particles is replaced by three-dimensional infinitesimal pulsating (black) holes sharing the same frequency. Different types of elementary particles are determined by different phase angles associated with that common frequency. This allows interactions to take place among elementary particles without invoking force-carrier particles, since (stable) interactions can never take place between elementary particles at different frequencies. Besides the mathematical proofs given in the paper to support it, an experimental prediction is offered to confirm the theory, in the form of a relation between the electron radius and the quark radii. The paper shows that quarks are a direct consequence of this theory, addresses "the flavor problem" in QCD, and offers a clue to the question "Why are there so many flavors?" The paper also derives an equation of big bang theory describing the singularity at the moment of the creation of the universe.
A relationship between mass as a geometric concept and motion associated with a closed curve in spacetime (a notion taken from differential geometry) is investigated. We show that the 4-dimensional exterior Schwarzschild solution of the General Theory of Relativity can be mapped to a 4-dimensional Euclidean spacetime manifold. As a consequence of this mapping, the quantity M in the exterior Schwarzschild solution, which is usually attributed to a massive central object, is shown to correspond to a geometric property of spacetime. An additional outcome of this analysis is that, because M is a property of spacetime geometry, an anisotropy in its spacetime components, measured in a Minkowski tangent space defined at a spacetime event P by an observer O who is stationary with respect to P, may be a sensitive measure of anisotropic cosmic accelerated expansion. The presence of anisotropy in the cosmic accelerated expansion may help explain why there are currently two prevailing measured estimates of this quantity.
1) The document discusses the emergent perspective of gravity, which views gravity as arising from entropy balance rather than directly coupling to energy density.
2) In this perspective, dark energy in the form of a cosmological constant arises naturally as an integration constant, with its smallness due to a near-symmetry.
3) The document proposes that dark matter and dark energy could both arise as components of the quantum vacuum rather than being associated with non-vacuum matter.
Translation of four-dimensional axes anywhere within the spatial and temporal boundaries of the universe would require quantitative values from convergence between parameters that reflect these limits. The presence of entanglement and volumetric velocities indicates that the initiating energy for displacement and transposition of axes would be within the upper limit of the rest mass of a single photon, which is the same order of magnitude as a macroscopic Hamiltonian of the modified Schrödinger wave function. The representative metaphor is that any local 4-D geometry, rather than displaying restricted movement through Minkowskian space, would instead expand to the total universal space-time volume before re-converging into another location, where it would be subject to cause and effect. Within this transient context the contributions from the anisotropic features of entropy and the laws of thermodynamics would be minimal. The central operation of a fundamental unit of 10⁻²⁰ J, the hydrogen line frequency, and the Bohr orbital time for ground-state electrons would be required for the relocalized manifestation. Similar quantified convergence occurs for the ~10¹² parallel states within space per Planck time, which solve for phase-shift increments where Casimir and magnetic forces intersect. Experimental support for these interpretations and potential applications is considered. The multiple, convergent solutions of basic universal quantities suggest that translations of spatial axes into adjacent spatial states and the transposition of four-dimensional configurations anywhere and at any time within the universe may be accessed, but would require alternative perspectives and technologies.
The Significance of the Speed of Light relating to Einstein (Kenny Hansen)
This document discusses Albert Einstein's Special Theory of Relativity and how it relates to the universal constant speed of light. It explores how light can behave as both a particle and wave, and how its speed is directly linked to concepts like mass, time, and energy. The speed of light is fundamental to our understanding of physics and the nature of the universe according to Einstein's theory.
Conversion of the vacuum energy of electromagnetic zero point oscillations in... (PublicLeaker)
This document provides a summary of a scientific work that proposes a method for converting vacuum energy into classical mechanical energy. It begins with an introduction and outlines the philosophical and theoretical background. It then describes experiments conducted to test a concept of an electrostatic rotor that successfully demonstrated the conversion of vacuum energy into rotational mechanical energy. The document concludes by discussing potential applications and opportunities for future development of the energy conversion method.
This document profiles several rock climbing guides who work for HighXposure Adventures in the Shawangunk Ridge area of New York. The guides have a variety of climbing experiences across North America, Mexico, Europe, and Asia and hold certifications from organizations like AMGA and PCGI. They began climbing between the 1970s and 1999 and have been guiding and instructing for HighXposure Adventures and other companies for multiple years.
The document discusses the concept of serendipity and its elements. It defines serendipity as an unexpected discovery that provides insight and value, involving three elements: an unexpected event or encounter, insight, and recognized value. Luck is described as overrated in discussions of serendipity. Instead, preparedness of mind and social capital are key to harnessing serendipity. Social capital refers to trust and diversity in one's networks which can enable "reverse serendipity," where insight comes from others.
Conférence prononcée à Marseille le 9/12/2016 et à Lyon le 13/12/2016 dans laquelle est avancée l'idée qu'est en cours une nouvelle révolution scientifique, celle de la possible réconciliation de la relativité générale et de la physique quantique
Seats2Meet, global conference, Utrecht 23.3.2015Ilkka Kakko
The document discusses harnessing serendipity through competence platforms. It defines serendipity as unexpected, surprising discoveries made while looking for something else. Competence platforms aim to support unexpected encounters and collective insights to generate new ideas. They do this through workspace design, both physical and virtual, and facilitating interactions. Three challenges are outlined: creating critical mass, showcasing early success stories, and integrating skills documentation with project management tools. The future of competence platforms is promising but still nascent, with major players potentially dominating through acquisitions or by building their own platforms.
Zheleznogorsk Innovation Forum 29.11.2013Ilkka Kakko
This document discusses the evolution of innovation environments and the importance of communities within innovation ecosystems. It describes how traditional science and technology parks focus on supporting startups and research, while newer 3GSP parks aim to balance support between existing companies and various types of new entrepreneurs. Communities play a key role in generating interaction dynamics within these ecosystems. Different types of communities are described, including virtual organization breeding environments, professional virtual communities, and online communities. Case studies of innovation hubs like Urban Mill that bring together diverse communities are also provided.
Plenary session keynote at Tangerang Selatan Global Innovation Forum 21.9.2016Ilkka Kakko
How to support and develop innovation-oriented entrepreneurship in turbulent VUCA conditions? Ecosystem development, platform thinking and serendipity management as key drivers to improve vucability.
The Fundamentals of Third Generation Science Park ConceptIlkka Kakko
The document discusses the fundamentals of third generation science park concepts. It describes the shift towards a "post-normal era" with more dynamic knowledge sharing. It discusses the rise of "research clouds" through coworking spaces and communities. The third generation science park (3GSP) concept focuses on community building, pre-incubation, ecosystem thinking, and supporting different types of entrepreneurs. Serendipity management and understanding communities are important aspects. Examples provided include the Demola and Urban Mill cases that utilize open innovation and community building principles.
The Fundamentals of Third Generation Science Park ConceptIlkka Kakko
The document discusses the fundamentals of third generation science park concepts. It notes that existing structures like institutions and corporations are not well adapted to rapid changes and unpredictability. Third generation science parks (3GSP) aim to address this by focusing on ecosystem dynamics and quality of interactions between stakeholders. 3GSP provide both physical and virtual platforms according to a "pull" philosophy, as opposed to traditional science parks which focus on established companies and rely more on hierarchies and a "push" approach. The document also briefly discusses a 2009 report on future knowledge ecosystems and science and technology parks.
Hvs restaurant industry in india - trends opportunitiesadityan_k
This report analyzes trends in the Indian restaurant industry based on 165 questionnaire responses. It finds that India has over 500,000 restaurants currently, and this figure is expected to rapidly increase due to urbanization, rising incomes, and a growing trend of eating out. The report provides an overview of the restaurant industry in India, examines financial data from existing restaurants, identifies global food trends, and offers guidance on conducting feasibility studies and valuing restaurants. It also includes case studies of both successful and unsuccessful restaurants.
Optical Fiber Cables :- An Introduction Pradeep Singh
This document discusses fiber optic cables and their components. It begins by classifying optical fibers into single-mode fibers, which carry light along a single path, and multi-mode fibers, which carry multiple light paths. It then describes the core, cladding and coating layers that make up an optical fiber. Total internal reflection is discussed as the mechanism that keeps light confined in the fiber. Common fiber optic components like connectors, couplers and circulators are also outlined.
Relativity theory project & albert einstenSeergio Garcia
Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, one of the pillars of modern physics. He was born in 1879 in Germany and died in 1955 in the United States. He is best known for his mass-energy equivalence formula E=mc2, which has been called the world's most famous equation. He developed the special theory of relativity, which describes the laws of motion at high speeds and led to his famous equation, and the general theory of relativity, which describes gravity as a geometric property of space and time.
Relativity theory project & albert einstenSeergio Garcia
Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, one of the pillars of modern physics. He was born in 1879 in Germany and died in 1955 in the United States. He is best known for his mass-energy equivalence formula E=mc2, which has been called the world's most famous equation. The document provides background on Einstein's life and work, and summarizes his theories of special relativity, which describes physics at high speeds, and general relativity, which proposes that gravity results from the curvature of spacetime.
MASSIVE PHOTON HYPOTHESIS OPENS DOORS TO NEW FIELDS OF RESEARCHijrap
1) A massive photon hypothesis is proposed, where the photon mass is directly calculated from kinetic gas theory to be 1.25605 x 10-39 kg.
2) This photon mass explains various experiments like light deflection near the Sun and the gravitational redshift.
3) The photon gas is found to behave as a perfect blackbody and ideal gas, with photons having 6 degrees of freedom.
4) The thermal de Broglie wavelength of this photon gas is calculated to be 1.75967 x 10-3 m, matching the wavelength of the cosmic microwave background radiation.
5) This links the CMB radiation to being continuously generated by the photon gas permeating space, rather than being a relic of
Discussions with einstein on epistemological science in atomic phisycs niel...Sabiq Hafidz
1) Niels Bohr discusses his conversations with Albert Einstein regarding epistemological problems in atomic physics. They debated whether quantum theory's departure from classical physics' causal descriptions should be seen as temporary or a permanent change.
2) Over many years of discussions, Bohr and Einstein had contrasting views on this issue. As quantum theory developed, it increasingly required renouncing causal analysis and pictures, which Einstein found difficult. Bohr saw this as necessary to coordinate the growing evidence from atomic experiments.
3) By the 1920s, quantum theory was becoming more comprehensive but its apparent contradictions remained acute. Developments like matrix mechanics helped provide a quantitative formulation but did not resolve the paradoxes of the theory.
Fundamental principle of information to-energy conversion.Fausto Intilla
Abstract. - The equivalence of 1 bit of information to entropy was given by Landauer in 1961 as kln2, k the Boltzmann constant. Erasing information implies heat dissipation and the energy of 1 bit would then be (the
Landauer´s limit) kT ln 2, T being the ambient temperature. From a quantum-cosmological point of view the minimum quantum of energy in the universe corresponds today to a temperature of 10^-29 ºK, probably forming a cosmic background of a Bose condensate [1]. Then, the bit with minimum energy today in the Universe is a quantum of energy 10^-45 ergs, with an equivalent mass of 10^-66 g. Low temperature implies low energy per bit and, of course, this is the way for faster and less energy dissipating computing devices. Our conjecture is this: the possibility of a future access to the CBBC (a coupling/channeling?) would mean a huge
jump in the performance of these devices.
This document provides a summary of the key developments in the history of quantum mechanics:
- Max Planck proposed quantizing light energy in 1900 to explain blackbody radiation.
- Albert Einstein showed that light behaves as both waves and particles in 1905.
- Niels Bohr applied quantization to explain hydrogen atom spectra in 1913.
- Louis de Broglie proposed that all particles exhibit wave-particle duality in 1924.
- Werner Heisenberg formulated matrix mechanics in 1925.
- Erwin Schrodinger developed wave mechanics and the Schrodinger equation in 1926.
- Max Born correctly interpreted the Schrodinger wavefunction as a probability amplitude in 1926.
hf=mc^2 ? Let's try to discover it (WWW.OLOSCIENCE.COM)Fausto Intilla
The document discusses using the Planck-Einstein relation (E=hf) to interpret mass and energy at the fundamental particle level. It proposes that all fundamental particles can be described as mass-energy wave quanta with an intrinsic frequency, in the same way photons are defined. This frequency would be what defines a particle's mass and energy, regardless of its motion. The relation fits with relativity, as the relativistic equations can be derived from a single expression involving hf, mc2, and velocity. Describing fundamental particles and photons as wave quanta with frequency provides a way to understand mass-energy conversions and changes in a physical way, in terms of changes to the internal wave motion of particles.
This document provides a brief introduction and history of quantum mechanics. It discusses how quantum mechanics developed from Planck's hypothesis of quantized energy in 1900 to Schrodinger formulating the wave equation in 1926. The key developments include Einstein explaining the photoelectric effect in 1905, de Broglie proposing that particles are associated with waves in 1924, and Born correctly interpreting Schrodinger's wave as a probability amplitude in 1926.
1) Albert Einstein developed the theory of relativity, which describes the laws of physics for both non-accelerating and accelerating frames of reference.
2) The theory of special relativity, published in 1905, describes motion at constant velocity while general relativity, published in 1915, describes accelerated frames and gravity.
3) One of the most famous equations in physics, E=mc2, comes from the theory of relativity and shows that mass and energy are equivalent and interconvertible.
Quantum mechanics provides a mathematical description of the wave-particle duality of matter and energy at small atomic and subatomic scales. It differs significantly from classical mechanics, as phenomena such as superconductivity cannot be explained using classical mechanics alone. Key aspects of quantum mechanics include wave-particle duality, the uncertainty principle, and discrete energy levels determined by Planck's constant and frequency.
Quantum teleportation allows the transfer of quantum states between particles at a distance without physically transporting the particles themselves. It relies on the phenomenon of quantum entanglement where the quantum states of particles become linked even when separated spatially. The experiment demonstrated successful quantum teleportation of photons' polarization states between two locations, confirming the non-local effects predicted by quantum mechanics. This technique could enable future applications for quantum communication but does not allow the teleportation of macroscopic objects as depicted in science fiction.
Original scientific paper on which September 3&9 presentations by Chongquing University's Dr. Andrew Beckwith is based, and published within the July 2011 edition of the Journal of Modern Physics. (Scientific Research Publishing)
First part of abstract, i.e. part I•
Detailing Coherent, Minimum Uncertainty States of Gravitons, g , y , as Semi Classical Components of Gravity Waves, and How Squeezed States Affect Upper Limits to Graviton Mass /•
“We present what is relevant to squeezed states of initial space time and how that affects both the composition of relic GW and also gravitons. A side ii f li GW, d l i id issue to consider is if gravitons can be configured as semi classical “particles”, which is akin to the Pilot particles model of Quantum Mechanics as embedded in a larger non linear “deterministic” background.” g g -Dr. Andrew Beckwith, Chongquing University, PRC
Links:
G999 Conference, Philadelphia, PA http://ggg999.org/
San Marino Workshop on Astrophysics and Cosmology For Matter and Antimatter, in San Marino, N. Italy.
http://www.workshops-hadronic-mechanics.org/
Slideshare: http://t.co/vEv2ci5
WordPress: http://asimov52.wordpress.com/2011/09/03/dr-andrew-beckwith-%E2%80%9Cdetailing-coherent-minimum-uncertainty-states-of-gravitons-g-y-as-semi-classical-components-of-gravity-waves-and-how-squeezed-states-affect-upper-limits-to-gravito/
This document is Einstein's seminal 1905 paper "Concerning an Heuristic Point of View Toward the Emission and Transformation of Light". In the paper, Einstein summarizes issues with existing theories of light and blackbody radiation. He proposes that light energy is quantized rather than continuous, consisting of discrete "energy quanta" localized in space that can only be emitted or absorbed as complete units. This revolutionary idea helped lay the foundations for the development of quantum mechanics.
The document provides information on quantum theory and its application to atomic structure. It discusses key concepts such as:
1) Energy is quantized and can only be emitted or absorbed in discrete packets called quanta.
2) Electrons in atoms exist in discrete energy levels called shells or orbitals. They can only transition between these levels by absorbing or emitting quanta of energy.
3) Quantum numbers are used to describe the specific energy state of each electron in an atom, including its distance from the nucleus, energy, and orientation.
Development of a theory of origin of the Universe based on the mathematical calculation of the Hubble constant and the temperature of the cosmic background radiation. Calculation based on known physical constants of nature.
1) The document proposes an alternative cosmological model where dark matter and dark energy are described as forms of ether, analogous to Mach's principle of inertia.
2) In this model, dark matter arises from the QCD vacuum or "sea" of quark-antiquark pairs and gluons at the confinement scale, while dark energy corresponds to the zero-point energy of the QCD vacuum.
3) The model aims to replace the standard LambdaCDM model, treating the expanding universe as a dynamically stable "biking" Einstein universe where the running cosmological constant compensates for the effect of gravity at all epochs.
This document discusses Louis de Broglie's famous hypothesis that matter exhibits wave-particle duality in the same way that light does. It argues that de Broglie's hypothesis was not actually a conjecture formulated within a unified theory of light and matter, as there was no such common theory at the time in 1924. It also suggests that de Broglie's hypothesis could be more accurately stated as proposing that the wave-particle duality seen in matter may also occur in light, since theories of light at the time were less mechanistic than theories of matter. The document goes on to discuss dualism in physics and different ways it could be interpreted or related to experiment.
The document provides an overview of statistical thermodynamics including:
- Its historic background beginning with Bernoulli's work in the 18th century and contributions from Maxwell, Boltzmann, Gibbs, and others.
- The key difference between classical thermodynamics and statistical thermodynamics is that the latter links microscopic properties to macroscopic behaviors.
- Statistical thermodynamics is needed to explain thermodynamic parameters in terms of molecular properties and interactions since classical thermodynamics does not address this microscopic level.
- Central topics in statistical thermodynamics include the partition function, degrees of freedom, heat capacity, and Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein statistics.
Vasil Penchev. Gravity as entanglement, and entanglement as gravityVasil Penchev
1. The document discusses interpreting gravity as entanglement by investigating the conditions under which general relativity and quantum mechanics can be mapped to each other mathematically.
2. It outlines a strategy to interpret entanglement as inertial mass and gravitational mass, and to view gravity as another interpretation of any quantum mechanical or mechanical movement.
3. This would allow gravity to be incorporated into the standard model by generalizing the concept of quantum field to include entanglement, represented by a cyclic Yin-Yang mathematical structure.
Neuro Quantology is an international, interdisciplinary, open-access, peer-reviewed journal that publishes original research and review articles on the interface between quantum physics and neuroscience. The journal focuses on the exploration of the neural mechanisms underlying consciousness, cognition, perception, and behavior from a quantum perspective. Neuro Quantology is published monthly.
Similar to Universal constants, standard models and fundamental metrology (20)
This is a "post-preprint" of a contributed paper at the XXII International conference on High Energy Physics Leipzig, July 19-25, 1984. Some of the ideas developed in this paper may turn out to be useful in the interpretation of the standard model of modern cosmology
This is a post-preprint of a contributed paper at the XXII International conference on High Energy Physics Leipzig, July 19-25, 1984. Some of the ideas developed in this paper may be useful in the interpretation of the standard model of modern cosmology
Lambda, the cosmological constant, is considered the fifth foundational constant in physics along with the speed of light c, Planck's constant h, gravitational constant G, and Boltzmann's constant k. Einstein, de Sitter, Lemaitre, and others originally considered Lambda in their early formulations of cosmology. Lambda represents the energy density of empty space and is related to the curvature of spacetime. Its observed value plays a key role in the current Lambda-CDM model of cosmology, which describes our flat, expanding universe containing dark matter, dark energy, and other components.
Cet article a été publié comme un chapitre de l'ouvrage collectif Mutations de l'écriture (Publications de la Sorbonne) édité sous la direction de François Nicolas avec l'aide d'Aurélien Tonneau. Avec Michel Spiro, nous nous étions largement inspirés de cet article pour la rédaction du chapitre 5, consacré à l'électrodynamique quantique, de notre livre Le boson et le chaprau mexicain
Texte (mis à jour en aout 2015) de ma présentation lors de la rencontre Physique et interrogations fondamentales "La science et l'impossible" de novembre 2014
Exposé le 29/07/2015 lors de la session A destinée à un public d'enseignants du secondaire de LISHEP 2015 à Manaus, à l'occasion de l'année internationale de la lumière
Embracing Deep Variability For Reproducibility and Replicability
Abstract: Reproducibility (aka determinism in some cases) constitutes a fundamental aspect in various fields of computer science, such as floating-point computations in numerical analysis and simulation, concurrency models in parallelism, reproducible builds for third parties integration and packaging, and containerization for execution environments. These concepts, while pervasive across diverse concerns, often exhibit intricate inter-dependencies, making it challenging to achieve a comprehensive understanding. In this short and vision paper we delve into the application of software engineering techniques, specifically variability management, to systematically identify and explicit points of variability that may give rise to reproducibility issues (eg language, libraries, compiler, virtual machine, OS, environment variables, etc). The primary objectives are: i) gaining insights into the variability layers and their possible interactions, ii) capturing and documenting configurations for the sake of reproducibility, and iii) exploring diverse configurations to replicate, and hence validate and ensure the robustness of results. By adopting these methodologies, we aim to address the complexities associated with reproducibility and replicability in modern software systems and environments, facilitating a more comprehensive and nuanced perspective on these critical aspects.
https://hal.science/hal-04582287
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdfSelcen Ozturkcan
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptxshubhijain836
Centrifugation is a powerful technique used in laboratories to separate components of a heterogeneous mixture based on their density. This process utilizes centrifugal force to rapidly spin samples, causing denser particles to migrate outward more quickly than lighter ones. As a result, distinct layers form within the sample tube, allowing for easy isolation and purification of target substances.
The cost of acquiring information by natural selectionCarl Bergstrom
This is a short talk that I gave at the Banff International Research Station workshop on Modeling and Theory in Population Biology. The idea is to try to understand how the burden of natural selection relates to the amount of information that selection puts into the genome.
It's based on the first part of this research paper:
The cost of information acquisition by natural selection
Ryan Seamus McGee, Olivia Kosterlitz, Artem Kaznatcheev, Benjamin Kerr, Carl T. Bergstrom
bioRxiv 2022.07.02.498577; doi: https://doi.org/10.1101/2022.07.02.498577
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆Sérgio Sacani
Context. The early-type galaxy SDSS J133519.91+072807.4 (hereafter SDSS1335+0728), which had exhibited no prior optical variations during the preceding two decades, began showing significant nuclear variability in the Zwicky Transient Facility (ZTF) alert stream from December 2019 (as ZTF19acnskyy). This variability behaviour, coupled with the host-galaxy properties, suggests that SDSS1335+0728 hosts a ∼ 106M⊙ black hole (BH) that is currently in the process of ‘turning on’. Aims. We present a multi-wavelength photometric analysis and spectroscopic follow-up performed with the aim of better understanding the origin of the nuclear variations detected in SDSS1335+0728. Methods. We used archival photometry (from WISE, 2MASS, SDSS, GALEX, eROSITA) and spectroscopic data (from SDSS and LAMOST) to study the state of SDSS1335+0728 prior to December 2019, and new observations from Swift, SOAR/Goodman, VLT/X-shooter, and Keck/LRIS taken after its turn-on to characterise its current state. We analysed the variability of SDSS1335+0728 in the X-ray/UV/optical/mid-infrared range, modelled its spectral energy distribution prior to and after December 2019, and studied the evolution of its UV/optical spectra. Results. From our multi-wavelength photometric analysis, we find that: (a) since 2021, the UV flux (from Swift/UVOT observations) is four times brighter than the flux reported by GALEX in 2004; (b) since June 2022, the mid-infrared flux has risen more than two times, and the W1−W2 WISE colour has become redder; and (c) since February 2024, the source has begun showing X-ray emission. From our spectroscopic follow-up, we see that (i) the narrow emission line ratios are now consistent with a more energetic ionising continuum; (ii) broad emission lines are not detected; and (iii) the [OIII] line increased its flux ∼ 3.6 years after the first ZTF alert, which implies a relatively compact narrow-line-emitting region. Conclusions. 
We conclude that the variations observed in SDSS1335+0728 could be either explained by a ∼ 106M⊙ AGN that is just turning on or by an exotic tidal disruption event (TDE). If the former is true, SDSS1335+0728 is one of the strongest cases of an AGNobserved in the process of activating. If the latter were found to be the case, it would correspond to the longest and faintest TDE ever observed (or another class of still unknown nuclear transient). Future observations of SDSS1335+0728 are crucial to further understand its behaviour. Key words. galaxies: active– accretion, accretion discs– galaxies: individual: SDSS J133519.91+072807.4
(June 12, 2024) Webinar: Development of PET theranostics targeting the molecu...Scintica Instrumentation
Targeting Hsp90 and its pathogen Orthologs with Tethered Inhibitors as a Diagnostic and Therapeutic Strategy for cancer and infectious diseases with Dr. Timothy Haystead.
PPT on Alternate Wetting and Drying presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' workshop on April 22, 2024.
Evidence of Jet Activity from the Secondary Black Hole in the OJ 287 Binary S...Sérgio Sacani
Wereport the study of a huge optical intraday flare on 2021 November 12 at 2 a.m. UT in the blazar OJ287. In the binary black hole model, it is associated with an impact of the secondary black hole on the accretion disk of the primary. Our multifrequency observing campaign was set up to search for such a signature of the impact based on a prediction made 8 yr earlier. The first I-band results of the flare have already been reported by Kishore et al. (2024). Here we combine these data with our monitoring in the R-band. There is a big change in the R–I spectral index by 1.0 ±0.1 between the normal background and the flare, suggesting a new component of radiation. The polarization variation during the rise of the flare suggests the same. The limits on the source size place it most reasonably in the jet of the secondary BH. We then ask why we have not seen this phenomenon before. We show that OJ287 was never before observed with sufficient sensitivity on the night when the flare should have happened according to the binary model. We also study the probability that this flare is just an oversized example of intraday variability using the Krakow data set of intense monitoring between 2015 and 2023. We find that the occurrence of a flare of this size and rapidity is unlikely. In machine-readable Tables 1 and 2, we give the full orbit-linked historical light curve of OJ287 as well as the dense monitoring sample of Krakow.
Evidence of Jet Activity from the Secondary Black Hole in the OJ 287 Binary S...
Universal Constants, Standard Models and Fundamental Metrology
G. Cohen-Tannoudji
Laboratoire de Recherche sur les Sciences de la Matière (LARSIM), CEA-Saclay, F-91191 Gif-sur-Yvette Cedex
Email: gilles.cohen-tannoudji@cea.fr
Abstract. Taking into account four universal constants, namely Planck's constant h, the velocity of light c, the constant of gravitation G and Boltzmann's constant k, leads to structuring theoretical physics in terms of three theories, each taking into account a pair of constants: the quantum theory of fields (h and c), the general theory of relativity (c and G) and quantum statistics (h and k). These three theories are not yet unified but, together, they underlie the standard models that allow a satisfactory phenomenological description of all experimental and observational data in particle physics and in cosmology, and they provide fundamental metrology, through the modern interpretation of quantum physics, with a reliable theoretical basis.
1 Introduction
The scientific revolution that arose at the beginning of the 20th century coincided with the discovery or re-discovery of universal constants that were interpreted as reflecting some fundamental limitation principles. It appeared that these limitations are not to be considered as insuperable obstacles but rather as horizons [1], which means that there exist rationales to think theoretically about which reality lies behind or beyond them. Actually, as the Swiss mathematician and philosopher Ferdinand Gonseth (1890-1975) says, it seems that the whole of human knowledge is bounded by such horizons [2]: “The previous results have a value that goes beyond the framework of geometry. They concern the entirety of knowledge, we mean the state in which knowledge comes to us, at a given instant: Nothing authorizes us to think that our knowledge, even at its last frontiers, is more than a knowledge horizon; that the last ‘realities’ that we have conceived are more than a reality horizon.” In this chapter, we intend to show how this concept of a reality horizon, related to universal constants, allows establishing a strong relation between fundamental physics and metrology: just as a sailor is able to determine his or her position on Earth by measuring the height of the sun with respect to the horizon, structuring theoretical physics in terms of reality horizons determined by dimensioned universal constants allows establishing a comprehensive, reliable frame of reference consisting of standard models that depend on a finite number of dimensionless universal constants, the determination of which is the task of fundamental metrology.
2 Universal constants and the role of fundamental metrology
2.1 The interpretation of universal constants may change according to the evolution of
theories
What one calls fundamental constants are the fixed and universal values of some physical observables
(dimensioned or not) that enter the theories designed to describe the physical reality. There are several
possible interpretations of these constants. They can appear as parameters in the framework of some
models. They can lead to the unification of previously independent domains. They can serve as
conversion factors between physical units. They can also, and this is the main role they can play in
structuring physics, reflect some fundamental limitation principles. Actually, the interpretation changes
according to the evolution of theories. This is the case for the four constants we shall consider in this
chapter.
2.1.1 The vacuum velocity of light c
In Maxwell’s electromagnetic theory of light, c appeared as a phenomenological parameter, with
the dimension of a velocity, related to the electric and magnetic properties of the “vacuum”. It is when
Hertz reformulated Maxwell’s theory in terms of the propagation of electromagnetic waves, and
when he experimentally discovered that the propagation velocity of these waves was compatible with
the already measured velocity of light, that c got the status of a universal constant equal to the velocity
of light in the vacuum, leading to the unification of electricity, magnetism and optics.
Later on, in 1905, c got a new interpretation, the one of an upper bound on any propagation
velocity, translating a fundamental limitation principle, namely the absence of instantaneous action at
a distance. The theory of special relativity takes this fundamental limitation into account by means of
the concept of space-time, introduced in 1908 by Minkowski, and once this theory has been
established, c got yet another interpretation, namely the one of a conversion factor between length and
time units.
2.1.2 Newton’s constant G
The constant G allows unifying celestial and terrestrial mechanics in Newton’s theory of universal
gravitation. In general relativity, Einstein’s theory of universal gravitation, which encompasses
Newton’s theory and gives it back as an approximation in the weak-field limit, G is a coupling constant
relating the matter energy-momentum tensor and the Einstein curvature tensor. It may also be
interpreted as a constant relating the mass of a black hole to the radius of its event horizon. When
associated with c and with the Planck constant h, it leads to the definition of the Planck units of
mass, length and time, characteristic of fundamental limitation principles related to quantum gravity.
2.1.3 Boltzmann’s constant k
According to the current interpretive consensus [3], Boltzmann’s constant k is just a conversion
factor relating temperature and energy: at thermal equilibrium, temperature is proportional to the
average kinetic energy of molecules, and the proportionality factor is k. Since temperature is not
considered as a fundamental physical quantity, k is not considered as a fundamental universal
constant; it is a conversion factor allowing us to express temperatures in energy units. The
interpretation of the universal constants that we are going to propose is at odds with this interpretive
consensus, in particular about the status of the Boltzmann constant. Actually, the history of the
discovery and interpretation of this constant has given rise to several interpretations that do not
reduce to that of a mere conversion factor. Consider first the interpretation by
Boltzmann of entropy in the framework of statistical thermodynamics: this physical quantity expresses
the lack of knowledge we have about a system obeying the deterministic laws of rational mechanics
but in which there exists a large number of microscopic configurations (called complexions) leading to
the same macroscopic state; entropy S is proportional to the logarithm of the number W of
complexions, and k (which has thus the dimensional content of entropy) is the proportionality factor,
S = k Ln W    (1)
This equation, which is the fundamental equation of statistical thermodynamics, has been
extensively used by Einstein at the beginning of the 20th century [4] for the purpose of two applications:
one in an attempt to estimate the fluctuations affecting a system at thermal equilibrium in order to
have an insight on the fundamental properties of the microscopic constituents of matter (atoms or
molecules) [5], and another one in the interpretation of the Planck’s formula on the black body
radiation in terms of energy quanta [6].
In the framework of information theory, developed in the middle of the 20th century, the
Boltzmann constant got yet another interpretation: that of the minimal entropy cost of a single
bit of information. If information is considered as a fundamental physical quantity, the
Boltzmann’s constant thus translates a fundamental limitation principle.
2.1.4 Planck’s constant h
It is precisely when trying to use the fundamental equation of statistical thermodynamics (1) to explain
the spectrum of the black body radiation that Planck was led to introduce the constant now known as
the Planck constant, the elementary quantum of action h. The formula he derived realizes, by means
of the three universal constants c, h, and k (which Planck himself named the Boltzmann constant), the
unification of statistical thermodynamics and electromagnetism. In the Planck’s formula, as well as in
its interpretation in terms of energy quanta proposed by Einstein in 1905, the Planck’s constant
appears as a proportionality factor between energy and frequency. According to the particle/wave
duality, the Planck’s constant is interpreted as a conversion factor between particle kinematics
(energy, momentum) and wave kinematics (frequency, wave vector). More generally, in quantum
physics, the Planck’s constant translates a fundamental limitation principle stating that conjugate
variables, i.e. the product of which has the dimensional content of an action, cannot be determined
with arbitrary accuracies, but that these accuracies are constrained by the Heisenberg’s inequalities.
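As an illustration added here (not from the source), one can check numerically that a Gaussian wave packet saturates the Heisenberg inequality Δx·Δp ≥ ℏ/2; units are chosen so that ℏ = 1:

```python
import numpy as np

# Minimal numerical sketch (added illustration): for a Gaussian wave
# packet the product of position and momentum spreads equals hbar/2.
hbar = 1.0
sigma = 1.0

x = np.linspace(-10.0, 10.0, 4001)
h = x[1] - x[0]
psi = (2.0 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4.0 * sigma**2))

# Position spread: sqrt(<x^2>), with <x> = 0 by symmetry
dx = np.sqrt(np.sum(x**2 * psi**2) * h)

# Momentum spread from <p^2> = hbar^2 * integral |dpsi/dx|^2 dx
# (valid for a real wave function)
dpsi = np.gradient(psi, x)
dp = hbar * np.sqrt(np.sum(dpsi**2) * h)

print(f"dx*dp = {dx * dp:.4f} (bound: hbar/2 = {hbar / 2})")
```

Any non-Gaussian packet would give a strictly larger product; the Gaussian is the minimum-uncertainty state.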
2.2 A purely theoretical interpretation: the cube of theories
2.2.1 Two classes of universal constants
According to the program of rational mechanics, there are three, and only three fundamental physical
units entering the dimensional content of all physical quantities, namely the mass M, the length L and
the time T units. Already in 1899, when he had an idea about the constant that bears his name, Planck
remarked that by means of combinations of the three dimensioned constants c, G and h, one can build
three fundamental dimensioned quantities, now known as the Planck’s scales of length, time and mass,
namely
l_P = (hG/c^3)^{1/2} ≈ 10^{-35} m
t_P = (hG/c^5)^{1/2} ≈ 10^{-43} s
m_P = (hc/G)^{1/2} ≈ 10^{19} GeV/c^2    (2)
and, by means of the Boltzmann constant, he also introduced a “Planck scale of temperature”,
T_P = m_P c^2 / k. Once these fundamental units are fixed, all other universal constants can be considered
as dimensionless: the mass of the electron, say, is some dimensionless fraction of the Planck mass.
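For concreteness, the orders of magnitude in eq. (2) can be checked numerically (an added sketch; like the text, it uses h rather than ℏ, following Planck's original 1899 definition):

```python
import math

# Numerical check of the Planck scales of eq. (2).
h = 6.62607015e-34   # Planck constant, J s
G = 6.67430e-11      # Newton constant, m^3 kg^-1 s^-2
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

l_P = math.sqrt(h * G / c**3)   # Planck length, m
t_P = math.sqrt(h * G / c**5)   # Planck time, s
m_P = math.sqrt(h * c / G)      # Planck mass, kg
T_P = m_P * c**2 / k            # Planck temperature, K

GeV = 1.602176634e-10           # 1 GeV in joules
print(f"l_P ~ {l_P:.2e} m")                      # ~ 4.1e-35 m
print(f"t_P ~ {t_P:.2e} s")                      # ~ 1.4e-43 s
print(f"m_P c^2 ~ {m_P * c**2 / GeV:.2e} GeV")   # ~ 3.1e19 GeV
print(f"T_P ~ {T_P:.2e} K")
```

Using ℏ = h/2π instead would shrink each scale by a factor √(2π), without changing the orders of magnitude quoted in eq. (2).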
Now, since the dimensioned universal constants seem to be related to fundamental limitation
principles, it seems natural to associate them to some theoretical or axiomatic framework, and to
associate the dimensionless constants with parameters in the framework of models. As long as these
models are heuristic or phenomenological models, the constants are considered as adjustable
parameters, but when a model, compatible with the axiomatic framework agrees well with all available
observational or experimental data in a given phenomenological realm, the model acquires the status of
a “standard model” and the adjustable parameters on which it depends are fixed and acquire the status
of universal constants, which, hopefully would be calculable in the framework of an ultimate theory
able to account for all dimensioned constants.
2.2.2 The “cube of theories”
According to the current interpretive consensus, there are, just as according to the program of rational
mechanics, three and only three fundamental units, and thus three and only three dimensioned
universal constants, h, c and G. Entropy is considered as a dimensionless quantity; the Boltzmann’s
constant is considered as a conversion factor allowing to express temperatures in energy units;
statistical physics is not considered as fundamental. The theoretical framework is structured [7] as
shown in fig. 1, according to the way the three fundamental constants are taken into account (i.e.
set to 1) or not (i.e. set to zero). At the origin of the frame, the three constants h, G and 1/c are set to
zero, which represents non gravitational Newtonian mechanics; we have then the three theories that
account for the three constants one by one separately, namely special relativity in which 1/c is set to
1, whereas h and G are set to zero, Newtonian gravity in which G is set to 1 whereas 1/c and h are set
to zero and quantum mechanics in which h is set to 1 whereas 1/c and G are set to zero; we have then
three theories taking into account pairs of universal constants, namely the quantum theory of fields (h
and 1/c set to 1, G set to zero), general relativity (1/c and G set to 1, h set to zero), and a theory which
does not seem very meaningful from the interpretive point of view, namely non relativistic quantum
gravity (h and G set to 1, 1/c set to zero); finally the summit of the cube, opposite to the origin,
represents what is considered as the ultimate goal of theoretical physics, a theory of quantum
relativistic gravity which would take into account the three constants and which would hopefully
allow determining the values of all dimensionless constants.
Fig. 1. The “cube of theories”: its axes correspond to the constants 1/c, G and h, and its vertices to Newtonian mechanics (at the origin), special relativity, Newtonian gravity, quantum mechanics, general relativity, quantum theory of fields, non relativistic quantum gravity and, at the summit, quantum gravity (the “Theory of Everything”).
2.3 An interpretation of universal constants relevant to fundamental metrology
Obviously this purely theoretical interpretation is not suitable to discuss the role of fundamental
metrology. On the one hand, it suggests that the three fundamental units are the Planck units which are
far from being “natural” (for instance, a particle with the Planck energy would have the kinetic energy
of a transportation aircraft), and a theory able to take into account the three universal constants h, c
and G is not yet at hand. On the other hand, the link between the theoretical framework and
phenomenology is not taken into account in this interpretation. In our opinion, this lack is due to an
underestimation of the role of information as a genuine physical quantity precisely providing this link
between theory and phenomenology. According to its informational interpretation, entropy is the
information that is sacrificed when statistical averaging or coarse graining is performed. Now it turns
out that in the modern interpretation of quantum physics, coarse graining is unavoidable in order for
quantum mechanics to provide probabilities for different sets of alternative decoherent histories:
decoherence may fail if the graining is too fine; moreover a coarser graining is necessary for having
approximate classical predictability. If one thus wants an interpretation of universal constants
that is consistent with this modern interpretation of quantum physics, which is so basic in fundamental
metrology, one had better include among the fundamental physical quantities the one that depends
on coarse graining, that is entropy or information, and thus include among the universal constants
that determine the relevant theoretical framework the Boltzmann constant, which has the dimension of
an entropy.
2.3.1 The c, h, k scheme: kinematics versus dynamics.
Such an interpretive scheme, suitable for fundamental metrology, is the one adopted by Christian
Bordé [8], who distinguishes a kinematical framework, framed by the three dimensioned constants c, h
and k interpreted as reflecting fundamental limitation principles, from a dynamical framework, in
which interactions are described by means of dimensionless constants. Relativity allows the value of c
to be fixed and the standard of length to be redefined, quantum mechanics allows the value of h to be
fixed and the mass standard to be defined and statistical mechanics allows the value of k to be fixed
and the scale of temperature to be defined. Interactions are then implemented in the dynamical
framework and all the “quantities related to the various forces of nature can be formulated from the
mechanical units and coupling constants without dimensions”.
2.3.2. The (h, c), (G, c), (h, k) scheme: fundamental versus emergent physics.
The interpretation we propose in terms of a tripod of theories, each taking into account a pair of
universal constants, is compatible with the scheme just described; it enlarges it in a way that makes it
possible to emphasize the role of modern quantum physics and fundamental metrology in the establishment
of the standard models of particle physics and cosmology, and in the search for any new physics not
described by them, in the hope of eventually unifying general relativity and quantum theory.
This interpretation is schematized in fig. 2: quantum field theory (QFT, taking into account h
and c), which generalizes quantum mechanics and underlies the standard model of particle physics, and
general relativity (GR, taking into account G and c), which generalizes relativity and underlies the
standard model of cosmology, are not yet unified but, together, they constitute the current conceptual
framework of fundamental physics. Quantum statistics (QS, taking into account h and k), which
generalizes statistical mechanics and underlies modern quantum physics, rather provides the
conceptual framework of emergent physics. Figure 2 is meant to show that fundamental metrology
evolves at the intersection of fundamental and emergent physics. This is what we intend to explain in
more detail in the following sections of this chapter.
Fig. 2. Fundamental metrology at the intersection of fundamental and emergent physics: quantum field theory and the standard model of particle physics (h, c), general relativity and the standard model of cosmology (G, c), and quantum statistics and modern quantum physics (h, k).
3 Quantum Field Theory and the Standard Model of Particle Physics
3.1 What is Quantum Field Theory?
3.1.1 Fundamental implications of QFT
Quantum Field Theory (QFT) is the result of the merging of quantum mechanics and special relativity.
In ordinary quantum mechanics (QM), the space coordinates of a particle are operators, whereas time
is a continuous parameter. QM is local in time but not local in space. In QFT, the four space-time
coordinates are continuous parameters, just as in classical relativistic field theory. QFT can be made
local in space-time.
The relativistic generalizations of the Schrödinger equation, the Dirac equation
(iγ^μ ∂_μ − m) ψ(x) = 0, or the Klein-Gordon equation (□ + m²) ψ(x) = 0, admit negative energy
solutions that are difficult to interpret if ψ(x) is the wave function of an isolated particle. This problem is the
first one met when one has tried to reconcile quantum mechanics and relativity. It is solved if Dirac or
Klein Gordon equations apply to quantum fields rather than to wave functions of isolated particles. A
quantum field is a field of operators, creating or annihilating, at any point of space-time a particle or
an antiparticle. A particle and its antiparticle have the same mass, the same spin but opposite charges.
A particle with negative energy going backward in time is replaced by its antiparticle, with positive
energy going forward in time.
Particles and antiparticles are not material points: in fundamental interactions, the number of
particles is not conserved; particles or antiparticles can appear or disappear. The creation or the
annihilation of a particle or an antiparticle is a quantum event that causality and relativity force us to
treat as strictly localized in space-time. Heisenberg indeterminacy inequalities imply that such strictly
localized events cannot be individually predicted in a deterministic way. The predictability in QFT
concerns statistical ensembles of events.
In the framework of QFT two fundamental theorems have been established:
• The spin/statistics connection theorem that states that fermions have a half integer spin, and
that bosons have an integer spin
• The PCT theorem that states that all interactions are invariant under the PCT symmetry that
is the product of space parity P by time reversal T and by charge conjugation C
3.1.2 The Fock space
In the absence of interactions (i.e. for free fields), the equations of motion (Dirac, Klein-Gordon
or Maxwell equations) can be solved entirely. In momentum space, the free field Hamiltonian reduces
to an infinite sum of independent harmonic oscillators (normal modes). Quantum statistics (Bose-
Einstein or Fermi-Dirac) are most easily taken into account by means of the Fock space, namely a
Hilbert space in which the quantum states of the field are defined in terms of the occupation numbers,
i.e. the number of field quanta with a given four momentum.
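The occupation-number bookkeeping can be sketched in code (a toy illustration added here for bosonic modes, not part of the source; mode labels such as "k1" are arbitrary stand-ins for four-momenta):

```python
import math

# Toy Fock-space sketch: a basis state is a map mode -> occupation
# number; creation and annihilation operators act with the usual
# bosonic sqrt factors, a^dag|n> = sqrt(n+1)|n+1>, a|n> = sqrt(n)|n-1>.
def create(state, mode):
    """Apply a_mode^dagger to a basis state; return (factor, new_state)."""
    n = state.get(mode, 0)
    new_state = dict(state)
    new_state[mode] = n + 1
    return math.sqrt(n + 1), new_state

def annihilate(state, mode):
    """Apply a_mode; the factor is sqrt(n), and zero on the vacuum."""
    n = state.get(mode, 0)
    if n == 0:
        return 0.0, dict(state)
    new_state = dict(state)
    new_state[mode] = n - 1
    return math.sqrt(n), new_state

vacuum = {}
f1, one = create(vacuum, "k1")   # one quantum in mode k1, factor 1
f2, two = create(one, "k1")      # two quanta, factor sqrt(2)
print(two, f1 * f2)              # {'k1': 2} 1.414...
```

For fermionic modes the same bookkeeping applies with occupation numbers restricted to 0 and 1, which is the Pauli exclusion principle in Fock-space language.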
3.2 The Path Integral formulation of QFT
In a 1948 paper entitled Space-time Approach to Non-Relativistic Quantum Mechanics, Feynman [9]
re-formulates and re-interprets quantum mechanics as a field theory in terms of an integral over all the
trajectories that a particle can virtually follow (the so-called Path Integral formulation):
• Thanks to the principle of superposition of probability amplitudes, the wave function is
interpreted as a field amplitude that is evaluated by means of the Huygens principle
• It is the field intensity (proportional to the modulus squared of the amplitude) that is given a
probabilistic interpretation
• This reformulation can be extended to the quantization of relativistic field theories in a way
that preserves Lorentz invariance.
In QFT, the path integral (PI) is a functional integral, i.e. an integral over the fields of the
exponential of i times the “classical” action in units of ℏ = h/2π. The “classical” action integral is the
invariant integral over space-time of the Lagrangian, expressed in terms of c-number fields, which
would lead through the least action principle to the “classical” equations of motion. In absence of
interactions one recovers with PI the results of the standard quantization method for Klein Gordon,
Dirac and Maxwell fields. When the Lagrangian involves an interaction term consisting of the product
of fields multiplied by a small coupling constant, PI reduces to an expansion, called the perturbative
expansion, in powers of this small coupling constant, the coefficients of which involve only ordinary
integrals.
3.3 The paradigm of Quantum Electrodynamics (QED)
3.3.1 The QED Lagrangian
QED is the quantum relativistic theory of the electromagnetic interaction of the electron field, ( )xψ .
From the Dirac Lagrangian for this electron field

L_Dirac = ψ̄(x) (iγ^μ ∂_μ − m) ψ(x)    (3)
we derive the Dirac equation, its adjoint, and a conserved current:

(iγ^μ ∂_μ − m)ψ = 0;   ψ̄(iγ^μ ∂_μ + m) = 0 (the derivative acting to the left)
⇒ ∂_μ j^μ = 0   if   j^μ = −e ψ̄(x) γ^μ ψ(x)    (4)
The QED Lagrangian is obtained by adding to the Dirac Lagrangian the Maxwell Lagrangian
and an interaction Lagrangian:
L_QED = L_Maxwell + L_Dirac + L_Interaction
L_Maxwell = −(1/4) F_μν F^μν;   F_μν = ∂_μ A_ν − ∂_ν A_μ
L_Dirac = ψ̄ (iγ^μ ∂_μ − m) ψ
L_Interaction = −e ψ̄ γ^μ ψ A_μ    (5)
3.3.2 The perturbative expansion
The perturbative expansion is obtained through the generating functional

Z_QED(j_ψ, j_ψ̄, j_A) = ∫ Dψ Dψ̄ DA exp{ i ∫ d⁴x [ L_QED + j_ψ̄ ψ + ψ̄ j_ψ + j_A A ] }
where the j’s are non dynamical sources of the fields, and the integration symbol D represents a
functional integration over field configurations.
• The power expansion in the sources generates the Green’s functions.
• The power expansion of the Green’s functions in powers of e is the perturbative expansion, the
coefficients of which are sums of Feynman amplitudes associated with Feynman diagrams.
• The power expansion in ℏ leads to the calculation of the effects of quantum fluctuations.
3.3.3 Renormalisability and renormalisation
Due to the occurrence, as implied by micro-causality, of products of fields evaluated at the same event
(space-time point), the integrals necessary to define the Feynman amplitudes associated with
Feynman diagrams containing loops in general diverge, which could endanger the whole program of
QFT. Apart from the difficulty of negative energy that we discussed above, the problem of infinities in
the perturbative expansion has been considered as the major drawback of QFT. The way out of this
difficulty relies on the renormalisation procedure that we sketch very briefly here.
In QED, this procedure consists of three steps:
a. Regularizing the divergent integrals by means of an energy “cut-off” to discard virtual
processes involving arbitrarily high energies;
b. Splitting the values of the parameters entering the Lagrangian, the mass m and the charge e of
the electron into their bare values, m0 and e0, i.e. the values they would have in absence of
interaction, and their renormalized values, mR and eR, i.e. the values they acquire due to the
interaction.
c. Expressing the Green’s functions in terms of the renormalized parameters: it may happen that,
when one tries to express them in terms of the (unphysical) bare parameters, the cut-off
parameter cannot be sent to infinity without getting an infinite answer, whereas, when they are
expressed in terms of the (physical) renormalized parameters, the Green’s functions go to a finite
limit when the cut-off goes to infinity. A theory in which this occurs for all Green’s functions
and at all orders of the perturbative expansion is called renormalisable, and as such it is free
from the problem of infinities.
It has been proven that QED is renormalisable, which means that this theory has a predictive
power such that it can underlie a genuine standard model: the renormalized parameters cannot be
predicted, but once they are determined from experiment, there are physical quantities that can be
measured experimentally and theoretically predicted. In QED, the agreement between experiment and
theoretical predictions is surprisingly good.
However there remains a problem: the renormalized parameters depend on an arbitrary energy
scale related to the coarse graining implied by the experimental conditions, whereas one expects the
predictions of a “fundamental” theory not to depend on the coarse graining. Fortunately, in a
renormalisable theory, the renormalization group equations constrain the dependence of the
renormalized parameters on the coarse graining in such a way that the physical predictions of the
theory actually do not depend on it.
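The renormalization-group running of a coupling can be illustrated with the standard one-loop evolution of the QED fine-structure constant (an added sketch, not from the source; it keeps only the electron loop, so the value at high energy is indicative):

```python
import math

# One-loop running of the QED coupling with the energy scale mu:
#   alpha(mu) = alpha0 / (1 - (alpha0 / 3pi) * ln(mu^2 / mu0^2)),
# with the reference scale mu0 taken at the electron mass (in GeV).
def alpha_qed(mu, mu0=0.000511, alpha0=1 / 137.036):
    """One-loop running coupling; electron loop only (an illustration)."""
    return alpha0 / (1 - (alpha0 / (3 * math.pi)) * math.log(mu**2 / mu0**2))

# The effective coupling grows slowly with energy, but predictions do
# not depend on the arbitrary reference scale, only on the measured
# value of alpha at that scale -- the point made in the text.
print(1 / alpha_qed(0.000511))  # 137.036 at the electron mass
print(1 / alpha_qed(91.19))     # ~134.5 at the Z mass (electron loop only)
```

Including all charged fermions in the loop would bring 1/α at the Z mass closer to its measured value near 128; the qualitative lesson, scale-dependent couplings with scale-independent physics, is unchanged.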
3.3.4 Gauge invariance in QED
The success of QED was considered as a miracle that it was very tempting to generalize to the theory
of other fundamental interactions. It was then necessary to uncover the property of QED that could be
at the origin of the miracle of renormalisability and that could be generalized to other interactions.
Gauge invariance is precisely this property. Let us first recall how, in classical electrodynamics, gauge
invariance is related to current conservation. We write
L_Classical = L_Maxwell + j^μ A_μ
L_Maxwell = −(1/4) F_μν F^μν;   F_μν = ∂_μ A_ν − ∂_ν A_μ    (6)
This Lagrangian is invariant under the gauge transformation that consists in adding to the potential (which
we now call the gauge field of the electromagnetic interaction) the four-gradient of an arbitrary function,
A_μ → A_μ + ∂_μ λ(x), if and only if the current is conserved. Consider now the symmetry of the Dirac
Lagrangian, L_Dirac = ψ̄ (iγ^μ ∂_μ − m) ψ: it is invariant under the change of the phase of the electron field

ψ(x) → exp(−iα) ψ(x)
ψ̄(x) → ψ̄(x) exp(iα)    (7)
Because of the space-time derivative, the phase invariance is global and not local, which means that
the phase changing factor cannot depend on x.
The interaction term makes it possible to impose a local phase invariance by compensating
(infinitesimally) the local change of phase induced by the derivative with a local gauge transformation of
the potential; in fact the QED Lagrangian is invariant under the following transformation, which one
now calls the local gauge invariance of QED
A_μ → A_μ + ∂_μ λ(x)
ψ(x) → exp(−iα(x)) ψ(x)
ψ̄(x) → ψ̄(x) exp(iα(x))
iff α(x) = e λ(x)    (8)
The very important feature is that one could have done the reasoning the other way around:
one would have started from the Dirac Lagrangian for a spinorial field; one would have noticed its
global phase invariance; one would have imposed a local phase invariance through the coupling with a
gauge field to compensate the variation induced by space-time derivative; one would have completed
the Lagrangian by adding the Maxwell Lagrangian describing the propagation of the gauge field thus
introduced. One would have thus recovered exactly the QED Lagrangian!
Due to the fact that local gauge invariance is crucial in the proof of renormalisability of QED,
it is very tempting to make of this invariance a unifying principle, a “road map” towards tentative
renormalisable theories of other fundamental interactions:
• For a given fundamental interaction, identify the matter fields and their global symmetries,
• make these symmetries local through the coupling with suitable gauge fields,
• complete the Lagrangian with the propagation terms for these gauge fields,
• verify the renormalisability of the thus derived theory.
3.4 The gauge theories of the standard model
The construction of the standard model of particle physics has followed essentially this road map.
Yang and Mills [10] first generalized the local gauge invariance of QED, that is said to be Abelian, to
non Abelian (i.e. non commutative) local gauge invariance. However, it seemed impossible to design a
Yang-Mills theory for the weak interaction, which has a finite range, whereas the quanta of the gauge
fields in a Yang-Mills theory are necessarily massless, which implies an interaction with an infinite
range. As far as the strong interaction is concerned, the task of describing it by means of a Yang-Mills
theory looked even more hopeless because of the proliferation of the family of hadrons, the particles
that participate in all fundamental interactions including the strong one, a proliferation that seemed
hard to reconcile with any sensible QFT.
A key step in the building of the standard model of particle physics was the discovery of a
“sub-hadronic” level of elementarity, the level of quarks. These particles are spin-½ elementary
constituents of hadrons (the hadronic fermions, the baryons, are three-quark bound states, and the hadronic
bosons, the mesons, are quark-antiquark bound states) with fractional electric charges, and they appeared as
good candidates to be the quanta of matter fields in a gauge theory for the strong interaction and,
together with the electron and other leptons, the quanta of a gauge theory for the weak and
electromagnetic interactions.
Another major step in establishing the standard model of particle physics was the proof of the
renormalisability of gauge theories, with or without spontaneous symmetry breaking, by ‘t Hooft,
Veltman [11], Lee and Zinn-Justin [12].
For the strong interaction at the level of quarks, the non Abelian renormalisable gauge theory
is Quantum Chromo Dynamics (QCD), for which the gauge group is a SU(3) group, called the
“colour” group, in reference to an analogy with ordinary colours: the quarks exist in three
“fundamental colours” and combine to make “white” or “colourless” hadrons. The gauge fields are the
eight gluon fields. The coarse-graining dependence of the coupling constant translates into a dependence
on the resolution with which the interaction is probed, i.e. on energy. At small distances, i.e. at high
energies, the QCD coupling constant is small (one says that QCD is asymptotically free), which leads to
a reliable perturbative expansion, whereas at distances of a typical hadronic size, the coupling
diverges, which suggests that non perturbative QCD could explain the impossibility of observing freely
propagating quarks and gluons (the so-called confinement of subhadronic quanta).
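Asymptotic freedom can be made concrete with the standard one-loop formula for the QCD running coupling (an added illustration, not from the source; the number of active flavours n_f and the scale Λ_QCD ≈ 0.2 GeV are assumed round values):

```python
import math

# One-loop QCD running coupling:
#   alpha_s(Q) = 12 pi / ((33 - 2 n_f) * ln(Q^2 / Lambda^2)),
# with n_f active quark flavours and Lambda_QCD an assumed ~0.2 GeV.
def alpha_s(Q, n_f=5, Lambda=0.2):
    """One-loop strong coupling at scale Q (GeV); diverges as Q -> Lambda."""
    return 12 * math.pi / ((33 - 2 * n_f) * math.log(Q**2 / Lambda**2))

# Asymptotic freedom: the coupling shrinks at high energy...
print(alpha_s(91.19))  # ~0.13 at the Z mass
print(alpha_s(10.0))   # larger at lower energy
# ...and blows up near the hadronic scale, hinting at confinement.
print(alpha_s(0.3))    # > 1: the perturbative expansion breaks down
```

The sign difference with QED comes from the gluon self-coupling of the non Abelian theory: the coefficient (33 − 2 n_f) is positive for n_f < 17, so the coupling decreases with energy instead of increasing.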
In the standard model, the weak and electromagnetic interactions are combined in the
electroweak theory of Glashow [13], Salam [14] and Weinberg [15], that is, before symmetry breaking,
a gauge theory, with massless chiral quarks and leptons as matter fields, the three massless
intermediate vector bosons and the photon as gauge fields. In order to obtain from this theory a reliable
phenomenological model it was necessary to design the mechanism [16], known as the Higgs
mechanism, able to generate the observed non vanishing masses of the intermediate vector bosons and
of the fermions, while preserving renormalisability. This mechanism leads to the prediction of the
existence of a not yet discovered particle, the Higgs boson. Apart from the discovery of this particle,
which is its last missing link, all the predictions of the standard model of particle physics have been
experimentally verified at energies up to 200 GeV. The search for the Higgs boson is the main
objective of the scientific program at the LHC facility, which is going to be commissioned in 2008.
4 General relativity and the standard model of cosmology
4.1 Relativity and the theoretical status of space
In the fifth appendix, entitled Relativity and the Problem of Space, added in 1952 to his book on
relativity [17] for a wide public, Einstein discusses the relation between the theory of relativity and the
theoretical status of space. He explains that in classical Newtonian physics, space has an independent
existence, an idea that “can be expressed drastically in this way: If matter were to disappear, space and
time alone would remain behind (as a kind of stage for physical happening).” In special relativity, the
concept of ether, as a mechanical carrier of electromagnetic waves, is eliminated and replaced by the
concept of field, which is one of the “most interesting events in the development of physical thought.”
However, in special relativity, apart from the field, there exists matter, in the form of material points,
possibly carrying electric charge. This is the reason why, in special relativity, space keeps an
independent existence, although, through the concept of space-time, it acquires a fourth dimension: “it
appears therefore more natural to think of physical reality as a four-dimensional existence, instead of,
as hitherto, the evolution of a three-dimensional existence.”
General relativity is an extension of the theory of relativity to arbitrary change of reference
frame through a detour to gravitation and with the help of the equivalence principle:
• Any arbitrary change of reference frame can locally be replaced by an adequate gravitational
field
• Any gravitational field can locally be replaced by an adequate change of reference frame.
General relativity is thus a geometric theory of gravitation: matter and the gravitational field it
generates are replaced by a non-Euclidean space-time, the metric of which is a universal field. So,
Einstein could conclude his 1952 appendix by saying “On the basis of the general theory of relativity,
space, as opposed to ‘what fills space’ has no separate existence (...) There exists no space empty of
field.”
4.2 The finite, static universe of Einstein
In terms of the metric field $g_{\mu\nu}(x)$, Einstein's equation,

$$R_{\mu\nu}(x) - \frac{1}{2}\, g_{\mu\nu}(x)\, R = -\kappa\, T_{\mu\nu}(x) \qquad (9)$$

relates the Ricci-Einstein curvature tensor $R_{\mu\nu}(x)$ and the scalar curvature R to the energy-momentum
tensor of matter $T_{\mu\nu}(x)$. In this equation, the proportionality factor $\kappa = \frac{8\pi G}{c^2}$, involving the universal
constants G and c, ensures that one recovers, through this new theory of gravitation, the Newtonian
theory in the non-relativistic limit (c going to infinity).
The study of the universe as a whole is the objective of cosmology, a domain that, until
recently, belonged to philosophy rather than to science. It is not the least merit of 20th century physics
to have provided this domain with a scientific basis through Einstein's theory of general relativity. This
theoretical framework has made it possible to organize the observational data into cosmological
models. A cosmological model can be obtained by modelling the material content of the universe
through a specific form of the energy-momentum tensor in the right-hand side of Einstein's
equation. The first cosmological model was attempted by Einstein himself. Since, at the time when he
tried this model (in 1917), it was hard to imagine the universe not being static, he modified his
equation, which seemed incompatible with a static universe, by adding to the left-hand side of (9)
a term, fully preserving general covariance, that he called the cosmological constant term, which
would provide a universal repulsive force (a negative pressure), preventing the universe from
collapsing under the action of its own gravitation:
$$R_{\mu\nu}(x) - \frac{1}{2}\, g_{\mu\nu}(x)\, R + \Lambda\, g_{\mu\nu}(x) = -\kappa\, T_{\mu\nu}(x) \qquad (10)$$

In this model, it turns out that the universe is spatially finite but without boundary, a so-called elliptic
space with a radius $r = \sqrt{3/\Lambda}$. However, this model has the unsatisfactory feature that the equilibrium
between the action of the cosmological constant and the gravitation induced by matter is unstable, and
when theoretical and observational arguments led to the conclusion that the universe is actually not
static, Einstein abandoned his cosmological model, saying that introducing the cosmological
constant had been one of his biggest mistakes.
4.3 The standard cosmological model, from the Big Bang model to the concordance
cosmology
4.3.1 The Hubble law
The discovery that the spectral rays of stars in distant galaxies are red-shifted with a relative shift
$z = \frac{\Delta\lambda}{\lambda}$, which was found to be proportional to the distance, was interpreted as an indication of a
recession motion of galaxies with a velocity proportional to the distance. One thus arrived at the idea
of the expansion of the universe, based on the Hubble law

$$\frac{\Delta\lambda}{\lambda} \cong \frac{H_0 L}{c} \qquad (11)$$
where L is the distance of the galaxy, and H0 is the Hubble constant that has the dimensional content of
inverse time (the subscript 0 refers to present time, because this Hubble “constant” may have varied in
the evolution of the universe). By mentally reversing the direction of time, the expansion of the
universe leads to the idea that the present universe has its origin in a primordial explosion, the Big
Bang, that is, a singularity involving an infinitely large density of energy and temperature that occurred
some ten to twenty billion years ago.
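The age scale quoted here follows directly from the Hubble law: since H0 has the dimensional content of inverse time, 1/H0 gives the order of magnitude of the age of the universe. A minimal Python sketch, assuming the present-day value H0 ≈ 70 (km/s)/Mpc (a number not given in the text):

```python
# Order-of-magnitude check: 1/H0 sets the age scale of the universe.
# Assumed value (not from the text): H0 ~ 70 (km/s)/Mpc.
MPC_IN_KM = 3.0857e19        # one megaparsec expressed in kilometres
SECONDS_PER_YEAR = 3.156e7

H0 = 70.0 / MPC_IN_KM        # Hubble constant in s^-1
hubble_time_years = 1.0 / H0 / SECONDS_PER_YEAR

print(f"1/H0 ~ {hubble_time_years:.2e} years")   # ~1.4e10 years
```

The result falls squarely in the "ten to twenty billion years" range quoted above.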
4.3.2 The Friedman-Lemaître-Robertson-Walker metric
In the big bang model, one models the material content of the universe as a homogeneous and isotropic
fluid (as a consequence of the cosmological principle) with an energy density ρ and a pressure p.
With such a right-hand side, a solution of Einstein's equation is given by the Friedman-Lemaître-
Robertson-Walker (FLRW) metric (c is set to one):

$$ds^2 = -dt^2 + a^2(t)\left[\frac{dr^2}{1 - kr^2} + r^2\, d\theta^2 + r^2 \sin^2\theta\, d\phi^2\right] \qquad (12)$$

where k, independent of time, is equal to 1, 0 or -1, according to whether the spatial curvature of the
universe is positive, zero or negative, and where a(t) is the time-dependent scale factor of the universe,
related to the Hubble constant by $H_0 \equiv (\dot a / a)_0$, the expansion rate of the universe at present.
The scale factor of the universe is governed by the two fundamental equations of
"cosmodynamics":

$$\frac{\dot a^2 + k}{a^2} = \frac{8\pi G}{3}\,\rho\,; \qquad \frac{d(\rho a^3)}{da} = -3\, p\, a^2 \qquad (13)$$
The first one that depends on the spatial curvature index k relates the expansion rate of the
universe to the energy density ρ. The second one, when coupled to the equation of state $p = p(\rho)$
which relates the pressure p to the energy density, determines the evolution of the energy density as a
function of the scale factor. For a spatially flat universe (k = 0) to occur, the energy density must have
a value called the critical density $\rho_c = \frac{3 H_0^2}{8\pi G}$. In phenomenological cosmology, it is convenient to
parameterize the contribution of a given component of matter to the energy density by its ratio to the
critical density: $\Omega_i = \rho_i / \rho_c$.
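The two equations of (13) can be made concrete numerically. The sketch below, again assuming H0 ≈ 70 (km/s)/Mpc, evaluates the critical density and checks that the second equation, combined with an equation of state p = wρ, reproduces the well-known scaling ρ ∝ a^(−3(1+w)) for matter (w = 0), radiation (w = 1/3) and vacuum energy (w = −1):

```python
import numpy as np

G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.0857e22    # one megaparsec in metres
H0 = 70e3 / MPC_IN_M    # assumed H0 = 70 km/s/Mpc, converted to s^-1

# Critical density rho_c = 3 H0^2 / (8 pi G)
rho_c = 3 * H0**2 / (8 * np.pi * G)
print(f"rho_c ~ {rho_c:.2e} kg/m^3")   # ~9e-27: a few hydrogen atoms per cubic metre

def rho_of_a(w, a_final, n=100000):
    """Integrate d(rho a^3)/da = -3 w rho a^2 (i.e. d(rho a^3) = -p d(a^3)
    with p = w rho) by Euler steps, starting from rho = 1 at a = 1."""
    a, u = 1.0, 1.0                     # u stands for rho * a^3
    da = (a_final - 1.0) / n
    for _ in range(n):
        u += -3.0 * w * (u / a) * da
        a += da
    return u / a**3

# Compare with the analytic scaling rho = a^(-3(1+w))
for w, label in [(0.0, "matter"), (1/3, "radiation"), (-1.0, "vacuum")]:
    print(label, rho_of_a(w, 2.0), 2.0 ** (-3 * (1 + w)))
```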
4.3.3 Observational evidence for the Big Bang model
Apart from the recession of distant galaxies according to the Hubble law, which is at its very origin, the
Big Bang model is supported by two other pieces of observational evidence:
• the discovery and increasingly precise measurement, in agreement with Planck's
law, of the cosmic microwave background radiation (CMBR), a black body radiation
predicted to have been emitted when nuclei and electrons recombined into neutral atoms, and
when the universe thus became transparent to photons;
• The observed relative abundance of light elements in agreement with the one predicted by
means of a model for nucleosynthesis in the primordial universe.
4.3.4 The inflation scenario
However, the Big Bang model suffered from some drawbacks, on the one hand because of the lack of
precision in the observational data (in particular in the measurement of distances) and, on the other
hand, because of at least two theoretical problems:
• The horizon problem results from the principle that information cannot travel faster than light.
In a universe of a finite age, this principle sets a bound, called the particle horizon, on the
separation of two regions of space that are in causal contact. The surprising isotropy of the
measured CMBR spectrum raises a horizon problem: if the universe had been dominated by
matter and radiation up to the time of the emission of the CMBR, the particle horizon at that
time would correspond to two degrees on the sky at present, and there is no reason why the
temperatures of two regions in the CMBR map separated by more than two degrees should be
the same.
• The flatness problem belongs to the family of “fine tuning problems”. It turns out that the
observed density of energy is of order of the critical density corresponding to a spatially flat
universe (k = 0); now, since any departure from the critical density grows exponentially with
time, the energy density in the primordial universe should have been fine tuned with an
incredible accuracy in order for the present density to be of the order of the critical density.
A possible solution to these two theoretical problems could be found in a scenario, called
inflation, according to which there has been in the very early universe an epoch of exponential
expansion of space-time such that all regions of the CMBR observed at present are in the particle
horizon of each other and that the currently observable universe, that is a very small part of the whole
universe, approximately appears as spatially flat. However, until very recently, this very speculative
scenario was lacking observational evidence.
4.3.5 Concordance cosmology, the new cosmological standard model [18]
In recent years observational cosmology (why not call it cosmological metrology?) has made
considerable progress, on the one hand in the domain of the measurement of distances of large-z
galaxies by means of the observation of type Ia supernovae, which improved the determination of
the expansion rate of the universe, and on the other hand in the domain of CMBR measurement, which
improved the determination of the various components of the energy density of the universe. The
phenomenological interpretation of the data coming from these two domains converged to what is now
known as the concordance cosmology that is compatible with the inflation scenario, and that can be
summarized as follows
a. The spectrum of fluctuations in the CMBR is compatible with the inflation scenario
b. The age of the universe is $(13.7 \pm 0.2) \times 10^9$ years
c. Decoupling (emission of CMBR) time is $t_{\text{dec}} = (379 \pm 8) \times 10^3$ years
d. Our universe has $0.98 \le \Omega_{\text{tot}} \le 1.08$, which is compatible with spatial flatness (k = 0)
e. Observation of primordial deuterium as well as CMBR observations show that the total
amount of baryons (i.e. ordinary matter) in the universe contributes about $\Omega_B \cong 0.04\text{--}0.06$,
which means that most of the universe is non-baryonic
f. Observations related to large scale structure and dynamics suggest that the universe is
populated by a non-luminous component of matter (dark matter, DM) that contributes about
$\Omega_{\text{DM}} \cong 0.20\text{--}0.35$
g. Combining this last observation with the approximate criticality of the universe (item d) one
concludes that there must be one more component to energy density representing about 70%
of critical density. Supernovae observations indicate that this component, called dark energy
(DE), has negative pressure, for which the simplest choice is Einstein's cosmological
constant.
A comment is in order about this last item. The fact that the dark energy component of the
energy density can be attributed to the effect of the cosmological constant is maybe a scientific
discovery of paramount importance, namely, the discovery of a universal constant reflecting a property
of the universe as a whole, its radius.
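The budget implied by items d-g is a matter of elementary arithmetic; a sketch using midpoints of the quoted ranges (illustrative numbers only):

```python
# Energy budget implied by items d-g, using midpoints of the quoted ranges.
omega_total   = 1.0      # item d: compatible with spatial flatness
omega_baryons = 0.05     # item e: Omega_B ~ 0.04-0.06
omega_dm      = 0.27     # item f: Omega_DM ~ 0.20-0.35

# Item g: whatever remains must be dark energy.
omega_de = omega_total - omega_baryons - omega_dm
print(f"Omega_DE ~ {omega_de:.2f}")   # ~0.68, i.e. about 70% of critical density
```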
5 The interface between General Relativity and Quantum Field Theory
5.1 The tentative Grand Unification Theory (GUT) of strong and electroweak theories
From the unification of celestial and terrestrial mechanics in the Newtonian theory and the unification
of electric, magnetic and optical phenomena in Maxwell's theory, to the electroweak theory that
began to unify the weak and electromagnetic interactions, the quest for unity has been a very strong
motivation in theoretical physics. The success obtained with the standard model of particle physics and
the progress made in observational cosmology encourage us to push this quest further. The three
non-gravitational interactions are described in terms of renormalisable gauge theories, in which the
coupling “constants” measuring the intensities at the elementary level, are not constant, since they
depend on energy. Now it turns out that, according to the renormalization group equations that govern the
evolution of these couplings, they seem to converge towards the same value at an energy of about $10^{15}$
GeV, which amounts to looking for a theory unifying the three interactions at
such an energy. The simplest group that contains as a sub-group the product of the three symmetry groups
of the three interactions is the SU(5) group that Georgi, Quinn and Weinberg [19] considered in 1974
as the local gauge symmetry group of what has been called the Grand Unified Theory (GUT). In this
theory, quarks and leptons form fivefold multiplets and there are 24 gauge bosons, the eight gluons, the
four electroweak gauge bosons and twelve leptoquarks, i.e. gauge bosons transforming a quark into a
lepton. To explain why these leptoquarks are not observable, being very massive (a mass of about
$10^{15}$ GeV/$c^2$), one has to imagine a Higgs mechanism leading to two stages of symmetry breaking: one
occurring at about $10^{15}$ GeV, in which the leptoquarks get their mass while gluons and electroweak
bosons remain massless, and another occurring at about 200 GeV, in which the weak intermediate
bosons become massive whereas the photon remains massless. In this theory, the exchange of
leptoquarks leads to the non-conservation of the baryon number, which implies that the proton could be
unstable. With the mass expected for the leptoquarks, a possible lifetime of about $10^{30}$ years was
estimated for the proton, which could in principle be tested experimentally.
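The quoted lifetime can be recovered by a crude dimensional estimate, τ ~ M_X⁴/(α_GUT² m_p⁵) in natural units. The sketch below assumes M_X ~ 10¹⁵ GeV (from the text) and a unified coupling α_GUT ~ 1/40 (an assumption, not from the text):

```python
# Crude dimensional estimate of the proton lifetime in a GUT:
# tau ~ M_X^4 / (alpha_GUT^2 * m_p^5) in natural units (hbar = c = 1).
GEV_INV_IN_S = 6.582e-25     # one GeV^-1 of time expressed in seconds
SECONDS_PER_YEAR = 3.156e7

M_X = 1e15          # leptoquark mass in GeV (from the text)
m_p = 0.938         # proton mass in GeV
alpha_gut = 1 / 40  # assumed unified coupling (not from the text)

tau_gev = M_X**4 / (alpha_gut**2 * m_p**5)   # lifetime in GeV^-1
tau_years = tau_gev * GEV_INV_IN_S / SECONDS_PER_YEAR
print(f"tau_p ~ {tau_years:.1e} years")      # within a couple of orders of 1e30
```

Such an estimate only fixes the order of magnitude, which is why experimental bounds on proton decay are so constraining for GUTs.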
5.2 The Minimal Supersymmetric Standard Model (MSSM)
The grand unified theory became doubtful because, on the one hand, experimental searches did not show
any evidence for the instability of the proton and, on the other hand, when the evolution of the
coupling "constants" governed by the renormalization group equations was established with better
accuracy (that is metrology!), it appeared that they actually do not converge towards a unique value at
high energy. In any case, the idea of grand unification at a very high energy scale suffers from a severe
theoretical difficulty, known as the hierarchy problem: a Higgs mechanism involving symmetry
breaking at two very different scales requires very fine tuning, which makes it hardly
credible. The attempts to cure all these drawbacks, in the same move, rely on a new symmetry,
supersymmetry (familiarly called SUSY).
Supersymmetry is a new symmetry that was invented to solve certain problems in the
quantization of gravity. It puts together bosons and fermions within "supermultiplets." It unifies
internal symmetries and space-time symmetries: the square of a supersymmetry transformation is the
generator of space-time translations.
Since the particles of the standard model cannot be put together in supermultiplets, one
must imagine, if supersymmetry exists, that it is broken in such a way that each particle of the
standard model has a yet-undiscovered partner of the other statistics. The Minimal
Supersymmetric Standard Model [20] (MSSM) is a supersymmetric extension of the standard model
preserving all its assets at energies up to 200 GeV, and involving the breaking of SUSY at a few
hundred GeV (i.e. with partners of such masses that they could be discovered at the LHC).
This MSSM could correct all the defects of the SU(5) theory.
• With SUSY broken between 500 and 1000 GeV, one can make the three coupling "constants"
converge exactly.
• The hierarchy problem of the Higgs mechanism would be solved: in a self-energy diagram,
say, the boson loops and fermion loops have contributions of opposite signs which tend to
compensate each other, in such a way that the quadratic divergences at the origin of the
hierarchy problem disappear.
• In a gauge theory with SUSY, the Higgs fields and the potential of the Higgs mechanism are
constrained, whereas in a theory without SUSY, they are completely ad hoc.
• The lightest neutral supersymmetric partners (called “neutralinos”) would be stable particles,
interacting only very slightly: they are excellent candidates for solving the cosmological
problem of dark matter.
• Superstring theory, currently considered as the best candidate for unifying all fundamental
interactions, including gravitation, inevitably implies supersymmetry.
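The convergence claim of the first bullet can be illustrated with the one-loop renormalization group equations, 1/α_i(μ) = 1/α_i(M_Z) − (b_i/2π) ln(μ/M_Z). The sketch below uses standard one-loop beta coefficients (GUT-normalized U(1)) and approximate inputs at M_Z (α_em(M_Z) ≈ 1/128, sin²θ_W ≈ 0.231, α_s(M_Z) ≈ 0.118); all of these numbers are assumptions not stated in the text:

```python
import math

# One-loop running of the three inverse gauge couplings, SM vs MSSM.
MZ = 91.2                                 # Z boson mass, GeV
inv_alpha_mz = [59.0, 29.6, 8.5]          # 1/alpha_1, 1/alpha_2, 1/alpha_3 at M_Z
b_sm   = [41/10, -19/6, -7]               # standard model beta coefficients
b_mssm = [33/5, 1, -3]                    # MSSM beta coefficients

def run(inv_alpha, b, mu):
    """1/alpha_i(mu) = 1/alpha_i(MZ) - (b_i / 2 pi) ln(mu / MZ)."""
    t = math.log(mu / MZ)
    return [ia - bi * t / (2 * math.pi) for ia, bi in zip(inv_alpha, b)]

for name, b, mu in [("SM", b_sm, 1e15), ("MSSM", b_mssm, 2e16)]:
    vals = run(inv_alpha_mz, b, mu)
    spread = max(vals) - min(vals)
    print(name, [f"{v:.1f}" for v in vals], f"spread = {spread:.2f}")
```

With these inputs the three SM couplings miss each other by several units of 1/α near 10¹⁵ GeV, while the MSSM couplings nearly coincide around 2 × 10¹⁶ GeV, which is the quantitative content of the bullet.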
6 Quantum statistics and modern quantum physics
6.1 The modern interpretation of quantum physics: emergence of a quasi-classical
realm
Quantum Statistics is the part of quantum physics that deals not with "pure states" but
with mixed states, i.e. with partially known states. Since partial knowledge is due to coarse graining,
one can say that quantum statistics is nothing but coarse-grained quantum physics. The fact that there
exists no reliable phenomenological description of physical reality without some coarse graining
reflects the fundamental limitation principle related to the universal constants h and k. In this respect it
is interesting to notice that in the path integral formulation of QFT, Planck's constant is not
actually interpreted as a limitation; it just provides the unit of action necessary to define the integrand
of the functional path integral. The fact that the limitation aspect of quantum physics is relegated to the
domain of phenomenology, experiment and observation is maybe the reason why the PI formulation
of QFT is so well suited for free theoretical investigations and explorations.
Quantum Statistics is fully integrated in the orthodox (Copenhagen) interpretation of quantum
physics that has not suffered any contradiction with experiment in about eighty years. One can say that
quantum physics supplies, through suitable coarse graining, satisfactory descriptions for all phenomena
observable in the world at our scales [21] from the atomic to the galactic scales. However, for our
purpose of establishing a link between phenomenology and the fundamental theories underlying the
standard models of particle physics and cosmology, the orthodox interpretation may look insufficient
because it postulates the existence of a classical world together with a quantum world and because it
seems to require an observer outside the system making measurements. If one wants an interpretation
of quantum physics that explains the emergence of the classical world and that authorizes quantum
effects to occur in the primordial universe (when there was no “observer” whatsoever), one has to
generalize the interpretation of quantum physics.
According to the interpretation of quantum physics in terms of decoherent alternative histories
that performs this generalization [22], the object of quantum physics is to supply probabilities for sets
of alternative histories of the universe, by means of a path integral functional, called the decoherence
functional. In order to be able to attribute probabilities to alternative histories, one has to perform the
path integral with a graining that is sufficiently coarse so that interferences that might prevent
ascribing additive probabilities to independent events actually cancel: this is what is meant by the term
decoherence. Coarse graining beyond the one necessary for decoherence is needed to achieve
approximate deterministic predictability of classical physics. A set of coarse-grained alternative
histories is called a quasi-classical realm. The coarse graining leading to such a quasi-classical realm
is connected with the coarse graining leading to the familiar entropy of thermodynamics. The
probabilities attributed to alternative histories of the universe are a priori probabilities, and as Gell-
Mann notes [23] in his article entitled What is Complexity? "These are a priori probabilities rather than
statistical ones, unless we engage in the exercise of treating the universe as one of a huge set of
alternative universes, forming a ‘multiverse.’ Of course, even within a single universe cases arise of
reproducible events (such as physics experiments), and for those events the a priori probabilities of the
quantum mechanics of the universe yield conventional statistical probabilities”.
In any case, the modern interpretation of quantum physics aims at explaining how the quasi-
classical realm, in which we live and possibly perform metrology experiments, emerges from the
primordial universe describable in terms of the fundamental theories (QFT and GR).
6.2 The interface between Quantum Field Theory and Quantum Statistics
6.2.1 Renormalisable QFT and critical phenomena
The functional integral over locally coupled fields $\varphi$ in QFT in 3+1 dimensions can be translated into
the Boltzmann-Gibbs classical probability distribution for a configuration S of a four-dimensional
system of spins coupled to near neighbours by means of a strict "dictionary":

$$Z^{\text{QFT}} = \int \mathcal{D}\varphi\, \exp\!\left(i\!\int\! L(\varphi)\, dt\, dx_1 dx_2 dx_3\right) \;\Longleftrightarrow\; Z^{\text{Classical}} = \int \mathcal{D}S\, \exp\!\left(-\frac{1}{kT}\!\int\! H(S)\, dx_1 dx_2 dx_3 dx_4\right)$$

$$i\, dt \Longleftrightarrow dx_4\,, \qquad \varphi \Longleftrightarrow S\,, \qquad L \Longleftrightarrow H \qquad (14)$$
This dictionary can be used to translate quantum field theoretic problems into classical statistical
problems and vice-versa: for instance, to the perturbative expansion in QFT corresponds, in classical
statistical physics, the calculation of corrections to the mean field approximation. Now it turns out that
in the case of second order phase transitions (leading to critical phenomena), the mean field
approximation breaks down because the correlation length goes to infinity and the system undergoes
self-similar fluctuations at all scales. K. Wilson has shown that the method used to solve this difficulty
in statistical physics corresponds, in the above-mentioned correspondence (14), to the renormalization
procedure in a renormalisable theory, which extends the above mentioned dictionary to the very
fruitful new item: critical phenomena occurring at the second order phase transition correspond to the
results of a renormalisable quantum field theory.
In statistical physics, the difficulty encountered in second order phase transitions is
comparable with the divergences which, in QFT, prevent the application of perturbative methods: the
presence of fluctuations at all scales indicates that the behavior of the system seems to depend on the
details of the interaction at the microscopic level, just as the regularization in QFT seems to make
physics depend on an arbitrary high-energy cut-off parameter. The recursion method, due to
Kadanoff [24], used to average, scale after scale, the self similar fluctuations affecting the system at a
second order phase transition consists in looking for a fixed point in an ensemble of systems
corresponding to varying coarse graining, related to one another by a transformation, called by Wilson
[25] a renormalization group transformation. Near such a fixed point, the critical phenomenon
somehow “forgets” the lattice spacing, just as in a renormalisable QFT, one can without disadvantage
let the cut-off parameter go to infinity (see paragraph 3.3.3).
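The Kadanoff recursion can be illustrated on the simplest possible example, the one-dimensional Ising model, where tracing out every second spin gives the exact recursion tanh K′ = tanh²K for the dimensionless coupling K. A toy sketch (not from the text):

```python
import math

# Real-space decimation of the 1D Ising model: summing over every second
# spin yields the exact recursion tanh(K') = tanh(K)^2 for the coupling K.
def decimate(K):
    return math.atanh(math.tanh(K) ** 2)

K = 2.0                 # start from a strong (low-temperature) coupling
flow = [K]
for _ in range(10):
    K = decimate(K)
    flow.append(K)

print([f"{k:.4g}" for k in flow])
# The coupling flows monotonically towards the K = 0 (high-temperature)
# fixed point: in one dimension there is no finite-temperature transition.
```

Here the only fixed points are K = 0 and K = ∞; in higher dimensions a non-trivial intermediate fixed point appears, and it is near such a fixed point that the critical phenomenon "forgets" the lattice spacing, as described above.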
Just as, in a renormalisable QFT, renormalized Green’s functions and renormalized
parameters depend on the coarse graining, the dynamics near the fixed point of the systems related by
renormalization group transformations depends on the coarse graining. But, in the same way as, in
critical phenomena, the fixed point equation expresses self-similarity, i.e. what is invariant under a change
of scale, in a renormalisable QFT the renormalization group equation expresses what, in the dynamics,
is invariant under a change in coarse graining.
The correspondence we have just described is at the origin of a remarkable interdisciplinary
synthesis that occurred in the 70’s, since one was able to use, with great success, the same theoretical
tools in two domains of physics which hitherto seemed completely disconnected, the physics of critical
phenomena on one hand and Quantum Chromo Dynamics, the physics of strong interaction on the
other hand. In particular, this correspondence allowed the design of computer simulations of QCD,
the so-called "lattice QCD", providing some insight into the non-perturbative regime of this quantum
field theory.
6.2.2 Emergence, universality and effective theories
The epistemological lesson of this success is that it helps in understanding the meaning of the
concept of emergence. In a second order phase transition, the behaviour of the system showing
correlations with a range much larger than the lattice spacing, with giant fluctuations, is what one
typically calls an emergent phenomenon. Since systems which are very different at microscopic scales
like a spin lattice and a renormalisable QFT may show similar emergent behaviours, one can say that
what is emergent exhibits some sort of universality. Actually, two such systems that share the same
renormalization group equation are said to belong to the same universality class. Another
important feature of emergence is that universal emergent behaviour depends only on a few relevant
degrees of freedom or parameters, whereas all other degrees of freedom or parameters can be
considered as marginal. It is thus legitimate to say that two systems, one as simple as a spin lattice
suitable for a computer simulation, and another one as sophisticated as QCD, which belong to the same
universality class, differ only by marginal degrees of freedom or parameters. This remark is at the heart
of the methodology of effective theories [26] that provides the link between the theoretical framework
and phenomenological modelling. The basic idea of effective theories is that if there are parameters
that are very large or very small with respect to the relevant physical quantities (of the same
dimension), one can obtain a much simpler approximate physical description by setting the
very small parameters to zero and the very large ones to infinity. Actually the standard model of particle
physics is not a "fundamental theory"; it is but a set of effective theories adequate for the description
of physics up to energies of a few hundred GeV; all the "effects" that are used in quantum metrology
rely on effective theories adequate for the description of physics in our quasi-classical realm.
6.3 The interface between General Relativity and Quantum Statistics
6.3.1 The concept of horizon in Quantum Statistics and in General Relativity
In quantum statistics, the concept of horizon is relevant in the interpretation of the fundamental
limitations of human knowledge implied by Planck's and Boltzmann's constants: these limitations
are not to be considered as insuperable obstacles but rather as informational horizons, namely
boundaries defined in an abstract space beyond which lies some inaccessible information. The fact of
assuming the existence of an informational horizon does not mean that one neglects or forgets the
information lying beyond it. The methodology that allows keeping track of this missing information is
based on functional integration: to evaluate the probabilities of the values of the dynamical variables
bearing the accessible information (the followed variables) one integrates out the dynamical variables
bearing the inaccessible information (the non-followed variables). Basically, this is the meaning of the
methodology of effective theories sketched in the preceding paragraph.
In general relativity, gravitation appears as an interaction (actually the only one) capable of
curving space-time so much that it leads to the formation of a geometric horizon, namely a light-like
surface acting as a "one-way membrane", a two-dimensional informational horizon hiding information
lying beyond it. Because of the expansion of universe, such a horizon obviously exists in cosmology: it
is defined by the distance at which lie galaxies whose light is infinitely red-shifted. A theoretical
laboratory to explore the physics of such horizons is the physics of black holes. The horizon of a black
hole, called its event horizon, is the surface of infinite gravitational red-shift surrounding it, beyond
which any matter (and thus any information) trapped by the black hole escapes perception. At
the horizon, the gravitational field is so intense that it may induce in matter certain quantum effects
such as the production of particle antiparticle pairs, which have to be dealt with. Since, in quantum
statistics, missing information is equivalent to entropy, it is natural, in this framework, to attribute
entropy to such a horizon. Bekenstein and Hawking have shown that the entropy corresponding to the
information trapped inside a black hole is k times one quarter of the area of the horizon in Planck
units:

$$S_{\text{Horizon}} = \frac{k}{4}\,\frac{A_{\text{Horizon}}}{A_P} = \frac{1}{4}\,\frac{k c^3}{\hbar G}\, A_{\text{Horizon}} \qquad (15)$$
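To get a feeling for the magnitudes involved in (15), one can evaluate the entropy of a solar-mass black hole; a sketch using approximate CODATA-style values of the constants (the numerical inputs are illustrative, not from the text):

```python
import math

# Bekenstein-Hawking entropy, S = k c^3 A / (4 hbar G), for a solar-mass
# black hole whose horizon area follows from the Schwarzschild radius.
G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8      # speed of light, m/s
hbar  = 1.055e-34    # reduced Planck constant, J s
k_B   = 1.381e-23    # Boltzmann constant, J/K
M_SUN = 1.989e30     # solar mass, kg

r_s = 2 * G * M_SUN / c**2        # Schwarzschild radius, ~3 km
A   = 4 * math.pi * r_s**2        # horizon area, m^2
S   = k_B * c**3 * A / (4 * hbar * G)

print(f"S/k ~ {S / k_B:.2e}")     # ~1e77: an enormous amount of hidden information
```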
6.3.2 A promising new concept: Holography
Since entropy is hidden information, (15) can be interpreted as a bound, the so-called Bekenstein
bound [27], on the information that can be packed in the volume of a black hole. The finiteness of this
bound leads to the conjecture that the world has to be discrete, for, were it continuous,
every volume of space would contain an infinite amount of information. The fact that the Planck
units appear in (15) suggests that the discrete structure of space-time that it involves is related to
quantum gravity. This reasoning can be generalized to any horizon (and even to any surface) in terms
of what one calls the holographic principle [28], that Lee Smolin [29] states as follows: “Consider any
physical system, made of anything at all, let us call it The Thing. We require only that The Thing can
be enclosed within a finite boundary, which we shall call The Screen. (…) Since the observer is
restricted to examining The Thing by making observations through The Screen, all what is observed
could be accounted for if one imagined that, instead of The Thing, there was some physical system
defined on the screen itself. This system would be described by a theory which involved only The
Screen. (…) If the screen theory were suitably chosen, the laws of physics inside the screen could
equally well be represented by the response of The Screen to the observer.”
It is interesting to note that in the correspondence (14) between quantum and classical
statistical physics, one has something that is similar to the holographic principle: since, in this
correspondence, imaginary time corresponds to a fourth Euclidean dimension, one can say that
somehow quantization adds an extra space dimension to classical physics: quantum physics in a three-
dimensional space is equivalent to classical physics in a four-dimensional space.
7 Conclusion: universal constants, fundamental metrology and
scientific revolutions
7.1 The Copernican revolution
Let us conclude this chapter by reviewing the role of universal constants and fundamental metrology in
the occurrence and accomplishment of scientific revolutions. Consider first the Copernican scientific
revolution. It marks the outset of modern science with Newton's theory of universal gravitation,
taking into account the constant G and unifying terrestrial and celestial mechanics. We then have the
accomplishment of this scientific revolution at the turn of the 20th century by means of the first
standard models, consisting of:
• Newton's theory of universal gravitation
• the unification of rational mechanics and the atomistic conception through the kinetic theory
of matter and statistical thermodynamics that take into account the constant k
• the unification of electricity, magnetism and optics through the Faraday-Maxwell-Hertz
theory that takes into account the constant c.
Kepler's laws, the accurate measurements of the velocity of light, the discovery of Neptune,
hypothesized to explain the anomalies in the motion of Uranus, and the thirteen independent measurements
by Jean Perrin of the Avogadro number, which proved in an irrefutable way the existence of atoms, are all
elements belonging to the realm of metrology that played an essential role in the establishment and
accomplishment of the Copernican revolution.
7.2 Universal constants and the crisis of physics at the beginning of the 20th century
The beginning of the last century was marked by a severe conceptual crisis related to the discovery or
re-discovery of universal constants:
• Boltzmann's constant k is connected to the critical problem of the unobservability, identity
and stability of the atoms, the elementary constituents of matter
• The velocity of light c is connected to the inconsistency of the model of the ether
• Planck's constant h is connected to the crisis of objectivity, causality and determinism.
7.3 The 20th century scientific revolution
This is the revolution that we have described in this chapter and that consists of the re-foundation of
the theoretical framework by means of quantum field theory, general relativity and quantum statistics,
which take into account pairs of the four universal constants. The accomplishment of this revolution
consists of the validation of the standard models that give an acceptable agreement with all
experimental and observational data by means of a fixed and finite number of free parameters, the
dimensionless universal constants, the determination of which relies on fundamental metrology. The
stake of the LHC program the main objective of which is the search for the very last missing link of
this revolution is particularly high. This revolution is characterized by its complete openness to new
experimental or observational discoveries (in this respect the LHC, Planck, VIRGO and LIGO
programs appear extremely promising) and to new theoretical ideas or principles at the interfaces of
the three theories of the tripod (supersymmetry, superstring, loop quantum gravity, non commutative
geometry holographic principle, membranes, new dimensions of space…).
7.4 Is the cosmological constant the fifth universal constant signalling a new scientific
revolution at the horizon?
What we have said about holography puts in the foreground a striking analogy with the situation at the
beginning of the 20th century. At that time it was the attempt to understand the thermodynamic
properties of electromagnetic radiation that induced progress in the understanding of atomic
physics, and that led to the quantum physics revolution. Today it seems that it is the attempt to
understand the meaning of the entropy of a horizon (of a black hole or of the cosmos with a non-
vanishing cosmological constant) that may open, through the holographic principle, a thermodynamic
route [30] towards quantum gravity. On the other hand, the crisis induced by the cosmological constant,
whose QFT and GR interpretations disagree by more than 120 orders of magnitude, is at least as
severe as the crisis that arose at the beginning of the 20th century in relation with quantum physics.
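The size of this discrepancy can be checked with a back-of-the-envelope estimate: the naive QFT value takes the vacuum energy density at the Planck scale, while the GR value is the measured dark-energy density inferred from the Hubble constant. The sketch below uses rounded SI values for the constants; the precise exponent depends on conventions, but the gap of more than 120 orders of magnitude is robust.

```python
import math

# Universal constants (SI, rounded values)
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

# Naive QFT estimate: vacuum energy cut off at the Planck scale,
# i.e. the Planck mass density rho_P = c^5 / (hbar G^2)  (~5e96 kg/m^3)
rho_planck = c**5 / (hbar * G**2)

# GR side: observed dark-energy density, roughly 0.7 of the critical
# density 3 H0^2 / (8 pi G), with H0 ~ 68 km/s/Mpc  (~6e-27 kg/m^3)
H0 = 2.2e-18       # Hubble constant in s^-1
rho_lambda = 0.7 * 3 * H0**2 / (8 * math.pi * G)

orders = math.log10(rho_planck / rho_lambda)
print(f"QFT/GR discrepancy: ~{orders:.0f} orders of magnitude")
```

With these round values the ratio comes out near 10^123, consistent with the "more than 120 orders of magnitude" quoted in the text.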
What we can say for sure is that the energy density, space and time scales of the domain of
quantum gravity are so extreme that one will never be able to explore it by direct experiments. One
will necessarily rely on theoretical modelling leading, in observable quantities, to deviations with
respect to the standard models that ignore quantum gravity. Very accurate metrology to detect and
ascertain such deviations will be the main tool, maybe the only one, available to prove or disprove any
future theory of quantum gravity. Fundamental metrology thus has a very promising future.
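How extreme these scales are follows from combining ħ, c and G alone, since the Planck length, time and energy are the only quantities with those dimensions that can be built from them. A minimal sketch with rounded SI values:

```python
import math

# Universal constants (SI, rounded values)
hbar = 1.055e-34  # reduced Planck constant, J s
c = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2

l_planck = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_planck = l_planck / c                 # Planck time, ~5.4e-44 s
E_planck = math.sqrt(hbar * c**5 / G)   # Planck energy, ~2e9 J

E_planck_GeV = E_planck / 1.602e-10     # convert J to GeV (~1.2e19 GeV)
print(f"l_P ~ {l_planck:.1e} m, t_P ~ {t_planck:.1e} s, "
      f"E_P ~ {E_planck_GeV:.1e} GeV")
```

The Planck energy of about 10^19 GeV lies some fifteen orders of magnitude above the reach of the LHC, which is why only indirect, high-precision deviations from the standard models can probe this regime.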
References
1. G. Cohen-Tannoudji, Les Constantes Universelles, Paris, Hachette Littérature, 1998
2. F. Gonseth, La géométrie et le problème de l’espace, p. IV-46 (310), Neuchâtel, Éditions du Griffon, 1949. An
analysis, by F. Gonseth, of the concept of reality horizon can be found (in French) at the following address:
http://www.logma.ch/afg/fghr/fg.htm
3. J.P. Uzan and R. Lehoucq, Les Constantes Fondamentales, Paris, Belin, 2005
4. A. Einstein, On Boltzmann’s principle and some consequences that immediately follow from it, unpublished
manuscript of the conference given at the physical society in Zurich (November 1910), translated by Bertrand
Duplantier (http://parthe.lpthe.jussieu.fr/poincare.htm)
5. A. Einstein, Annalen der Physik, vol. XVII, 1905, p. 549-560
6. A. Einstein, Annalen der Physik, vol. XVII, 1905, p. 132
7. Lev B. Okun, Trialogue on the Number of Fundamental Constants, arXiv:physics/0110060v3
8. C. Borde, Base units of the SI, fundamental constants and modern quantum physics, Phil. Trans. R.
Soc. A (2005) 363, 2177-2201
9. R.P. Feynman, Review of Modern Physics, 1948 20, p. 367
10. C.N. Yang and R.L. Mills, Phys. Rev. 96, p. 191, 1954
11. G. ‘t Hooft and M.T. Veltman, Nucl. Phys., ser B, vol 50, p. 318, 1972
12. B.W. Lee and J. Zinn-Justin, Phys. Rev., ser. D, vol. 5, pp. 3121, 3137 and 3155; ser. D, vol. 7, p. 318, 1972
13. S.L. Glashow, Nucl. Phys. 22, 579, 1961
14. A. Salam, in Elementary Particle Physics, ed. N. Svartholm (Almqvist and Wiksells, Stockholm,
1968)
15. S. Weinberg, Phys. Rev. Lett. 19, 1264 (1967)
16. P.W. Higgs, Phys. Lett. 12, 132 (1964) ; F. Englert and R. Brout, Phys. Rev. Lett. 13, 321 (1964)
17. A. Einstein, Relativity, the special and the general theory, Crown Trade Paperback, New York 1961
18. T. Padmanabhan, Advanced Topics in Cosmology: A Pedagogical Introduction, arXiv:astro-ph/0602117v1
19. H. Georgi, H.R. Quinn and S. Weinberg, Phys. Rev. Lett. 33, 451-454 (1974)
20. P. Fayet, J. Iliopoulos, Phys. Lett. B 51 (1974) 461.
21. R. Balian La physique quantique à notre échelle in Un siècle de quanta M. Crozon and Y. Sacquin (ed.) EDP
Sciences, 2003
22. M. Gell-Mann and J. B. Hartle, Quasiclassical Coarse Graining and Thermodynamic Entropy arXiv:quant-
ph/0609190v1
23. M. Gell-Mann, What is Complexity ? http://www.santafe.edu/~mgm/complexity.html
24. L.P. Kadanoff, Physics 2, 263, (1966)
25. K.G. Wilson and J. Kogut, Phys. Rep. 12C, 75 (1974)
26. H. Georgi, Effective field theory, Ann. Rev. Nucl. Sci. 1993, 43, 209 - 52
27. J.D. Bekenstein, Phys. Rev. D9 (1974) 3292
28. L. Susskind and J. Lindesay, An Introduction to Black Holes, Information and the String Theory Revolution:
The Holographic Universe, World Scientific, 2005
29. Lee Smolin, Three Roads to Quantum Gravity, pp. 169-178, New York, Basic books, 2001
30. T. Padmanabhan, Dark energy: Mystery of the Millennium, talk presented at the Albert Einstein Century
International Conference at Palais de l’Unesco, Paris, July 2005 (arXiv:astro-ph/0603114), 2006