1. The document estimates the maximum intensity of dark matter glow based on Dr. Hewett's Time Symmetric Cosmology theory, which predicts dark matter is a boson resulting from Hawking evaporation of primordial black holes.
2. Using classical approximations, the author estimates the total photon intensity of the Andromeda galaxy's dark matter halo would be 4.67×10^24, far greater than its visible light intensity of 3.19×10^9.
3. However, the author acknowledges the model's assumptions and approximations likely overestimate the intensity by many orders of magnitude, and more rigorous modeling is needed to test if dark matter glow could be detectable.
The Cosmic Microwave (CMB), and Infra-Red (CIRB) Backgrounds are Simple Effec... (David Harding)
In Continuous Quantum Iteration (CQI) theory, a new approximation of the universal engagement is explored. Science cannot yet define infinity, the absolute beginning. Yet it is only natural that science should ask: how did the local Universe get started? Where does its energy keep coming from? And what is the quantum vacuum really doing? Aspects of virtual quantum iteration are examined to assess the plausibility of this new approximation. The Cosmic Microwave Background (CMB) and the Cosmic Infra-Red Background (CIRB) are shown to be simple effects of continual quantum vacuum iterations occurring through Planck-Stoney scale interactions with the universal equilibrium event-horizon background.
This document explains the experimental and mathematical derivation of Heisenberg's Uncertainty Principle. A simple application, estimating a single-degree-of-freedom particle in a potential-free environment, is also discussed.
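The single-degree-of-freedom estimate mentioned above can be sketched numerically; a minimal Python sketch, assuming the usual confinement argument that localizing a free particle to a width Δx forces Δp ≥ ħ/(2Δx):

```python
# Order-of-magnitude estimate from the uncertainty principle:
# confining a free particle to a region of width dx forces
# dp >= hbar / (2 * dx), giving a minimum kinetic energy dp^2 / (2m).
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg

def min_kinetic_energy(dx, m=M_E):
    """Minimum kinetic energy (J) for a particle confined to width dx (m)."""
    dp = HBAR / (2.0 * dx)
    return dp**2 / (2.0 * m)

# Electron confined to an atomic scale (~1 angstrom):
E = min_kinetic_energy(1e-10)
print(f"{E / 1.602176634e-19:.2f} eV")  # ~1 eV -- the atomic energy scale
```

The estimate lands on the right energy scale for atomic electrons, which is the usual point of the exercise.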
The optimal cosmic epoch for precision cosmology (Sérgio Sacani)
The document discusses the optimal epoch for precision cosmology measurements based on the number of independent Fourier modes available. It finds that the best constraints on the primordial power spectrum are accessible at redshifts around z=10 through instruments like 21-cm intensity mapping. The ability to constrain the initial cosmological conditions will deteriorate rapidly in our cosmic future as the exponential expansion erases information beyond 100 Hubble times from now.
This document provides an overview of quantum electrodynamics (QED). It begins by discussing cross sections and the scattering matrix, defining cross section as the effective size of target particles. It then derives an expression for cross section in terms of the transition rate and flux of incident particles. Next, it summarizes the derivation of the differential cross section and decay rate formulas in QED using relativistic quantum field theory and Feynman diagrams. It concludes by briefly reviewing the historical development of QED and the equivalence of the propagator approach and other formulations.
Planck’s length is the scale at which the classical ideas of gravity and space-time cease to be valid and where uncertainty dictates the rules. It is the size of the information bits on a black hole’s event horizon, and there is good reason to assume it is the size of the basic building blocks of the fabric of space. This article makes three leading assumptions: 1. space is quantized into a lattice (grid) of unit cells, referred to as 3D voxels of space (voxels), one Planck length on each side; 2. time is quantized into Planck-time sequences (pulses); 3. light travels one space voxel per time pulse. Based on these three assumptions, the article shows that Newton’s gravitational constant (G) increases as the universe expands, which may shed some light on the mystery of dark matter.
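The third assumption above is equivalent to the identity l_P / t_P = c, which a short sketch can verify from the standard definitions of the Planck units:

```python
import math

# Planck length and time from fundamental constants (SI units).
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C    = 2.99792458e8      # speed of light, m/s

l_planck = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
t_planck = math.sqrt(HBAR * G / C**5)   # ~5.4e-44 s

# "One space voxel per time pulse" is just the statement that
# l_planck / t_planck equals c:
print(l_planck / t_planck / C)  # -> 1.0 (up to float rounding)
```

Algebraically the ratio is sqrt(c^5/c^3) = c exactly, so the assumption is built into the definitions rather than being an independent postulate.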
This document reviews research on the convergence of perturbation series in quantum field theory. It discusses Dyson's argument that perturbation series in quantum electrodynamics (QED) have zero radius of convergence due to vacuum instability when the coupling constant is negative. Large-order estimates show that perturbation series coefficients grow factorially fast in quantum mechanics and field theories. Finally, it describes the method of Borel summation, which may allow extracting the exact physical quantity from a divergent perturbation series through a unique mapping.
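The Borel mapping can be illustrated on the textbook divergent series Σ (-1)^n n! g^n, whose Borel transform Σ (-1)^n t^n resums to 1/(1+t); a minimal numerical sketch (this standard example is for illustration, not taken from the reviewed work):

```python
import math

# Borel summation of the divergent series  sum_n (-1)^n * n! * g^n.
# Its Borel transform resums to 1/(1+t), so the Borel sum is the
# convergent integral  B(g) = integral_0^inf e^(-t) / (1 + g*t) dt.
def borel_sum(g, t_max=40.0, steps=200_000):
    """Evaluate the Borel integral with the trapezoid rule."""
    h = t_max / steps
    total = 0.5 * (1.0 + math.exp(-t_max) / (1.0 + g * t_max))
    for i in range(1, steps):
        t = i * h
        total += math.exp(-t) / (1.0 + g * t)
    return total * h

def partial_sum(g, n_terms):
    """Truncated (factorially divergent) perturbation series."""
    return sum((-1)**n * math.factorial(n) * g**n for n in range(n_terms))

g = 0.1
print(borel_sum(g))        # ~0.9156, the resummed value
print(partial_sum(g, 10))  # optimal truncation agrees to a few parts in 10^4
```

Truncating the series near n ≈ 1/g gives the best the divergent expansion can do (error ~ e^(-1/g)); the Borel integral recovers the full function.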
The document describes an experiment to measure the refractive index of HCl gas using a Michelson interferometer. A HeNe laser beam is split into two paths, with one path passing through an evacuated glass cell. As the cell is pumped out, the interference fringes shift due to the changing optical path length. Counting the number of fringe shifts allows calculating the refractive index from the changing wavelength of light in the gas versus vacuum. The experiment is performed at varying HCl pressures and temperatures, with results corrected to standard temperature and pressure for comparison to literature values of the molar refractivity and effective molecular radius of HCl.
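The fringe-counting relation can be sketched as follows; the numerical values are hypothetical placeholders for illustration, not results from the experiment:

```python
# Refractive index from fringe counting in a Michelson interferometer.
# Evacuating a cell of length L in one arm changes the optical path by
# 2*L*(n - 1) (the beam traverses the cell twice), so N observed fringe
# shifts give  n = 1 + N * lam / (2 * L).
def refractive_index(n_fringes, wavelength, cell_length):
    return 1.0 + n_fringes * wavelength / (2.0 * cell_length)

# Hypothetical numbers for illustration (not from the experiment itself):
lam = 632.8e-9   # HeNe laser wavelength, m
L   = 0.10       # cell length, m
N   = 140        # counted fringe shifts
print(refractive_index(N, lam, L))  # -> about 1.000443
```

Note the factor of 2 for the double pass through the cell; omitting it is the classic mistake in this analysis.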
The document summarizes a study of the recycling timescale of the quiet Sun's coronal magnetic field. The key points are:
1) Using magnetogram data, the researchers tracked the motions of photospheric magnetic flux concentrations and extrapolated their connectivity into the corona.
2) They found that through reconnection alone, driven by the motions of photospheric concentrations, all coronal flux would remap in about 1.4 hours, much faster than the 14-hour recycling timescale of photospheric flux.
3) When also accounting for emergence and cancellation of photospheric flux, they estimated the coronal recycling timescale is even shorter, at only 1 hour and 24 minutes. This implies the quiet solar
This document discusses key concepts relating to the wave-particle duality of light and matter. It explains that light exhibits both wave-like and particle-like properties, behaving as photons with a specific energy determined by frequency. The document also covers the development of models of the atom, including Bohr's model of electron orbitals and Schrodinger's wave equation describing electron probability clouds. It introduces concepts like atomic orbitals, electron configuration, and the Aufbau principle for building up orbitals according to energy levels.
Supersymmetry (SUSY) is a proposed symmetry between bosons and fermions that could help solve issues in the Standard Model such as the hierarchy problem. SUSY introduces new "quantum" dimensions beyond the usual 3 spatial and 1 time dimension. SUSY generators called Q transform fermions into bosons and vice versa. The SUSY algebra involves the generators Q satisfying anticommutation relations in addition to the usual commutation relations of generators like momentum P and angular momentum M. While experimental evidence for SUSY is still lacking, it is an attractive theoretical idea that may be discovered at energy scales below 1 TeV.
Planck discovered an empirical formula in 1900 that accurately described blackbody radiation spectra across all wavelengths and temperatures. Seeking a physical explanation, he postulated that the walls of a hot cavity contained identical vibrating oscillators that exchange energy with radiation in quantized units proportional to their frequency, called quanta. This quantum hypothesis explained the experimental observations and values Planck calculated for the universal constant h. While initially a mathematical trick, Planck's quantum theory marked a revolutionary beginning of modern quantum mechanics.
1) A new cosmological model is proposed where the universe is spontaneously created from nothing via quantum tunneling into a de Sitter space.
2) After tunneling, the model evolves according to the inflationary scenario, avoiding the big bang singularity and not requiring initial conditions.
3) The model suggests that the universe was created via quantum tunneling from a state of literally nothing into a de Sitter space, which then evolved into the expanding universe we observe according to known physics.
The document discusses string theory, extra dimensions, and cosmology. It notes that inflation and dark energy require new physics beyond general relativity and quantum field theory. Most string cosmology models invoke string theory, branes, or extra dimensions as approximations of string theory solutions. However, reliably anchoring cosmology in string theory has proven difficult, with success being rare. While cosmological observations can't determine the nature of dark energy, understanding fundamental physics through experiments like the LHC remains a key hope.
The document discusses incompressible liquid droplets, specifically:
1) Spherically symmetric liquid droplets have a density that is independent of size and molecules move within them.
2) Heat of vaporization and binding energy relate to the energy required to separate molecules in a liquid.
3) Fission and fusion reactions can be modeled using terms that account for volume, surface area, electrostatic forces, asymmetry, pairing, and binding energies.
4) Equations are presented for calculating force between charges, energy, volume, area, and nuclear mass.
Planck was able to account for the measured spectral distribution of radiation from a thermal source by postulating that the energies of harmonic oscillators are quantized. Einstein then used this idea to explain the photoelectric effect. The Planck radiation law provides the frequency distribution of stored energy in a resonator in thermal equilibrium. It avoids the ultraviolet catastrophe seen in the Rayleigh-Jeans law. Einstein introduced phenomenological coefficients (A and B) to describe absorption, stimulated emission, and spontaneous emission in a two-level system, which relate to the Planck radiation law.
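The contrast between the two radiation laws is easy to see numerically; a minimal sketch using the standard spectral energy density formulas:

```python
import math

# Planck's law vs the Rayleigh-Jeans law for spectral energy density
# u(nu, T).  Rayleigh-Jeans grows without bound at high frequency (the
# ultraviolet catastrophe); Planck's quantized-oscillator form cuts off
# exponentially once h*nu >> kB*T.
H  = 6.62607015e-34   # Planck constant, J*s
KB = 1.380649e-23     # Boltzmann constant, J/K
C  = 2.99792458e8     # speed of light, m/s

def planck(nu, T):
    return (8 * math.pi * H * nu**3 / C**3) / (math.exp(H * nu / (KB * T)) - 1)

def rayleigh_jeans(nu, T):
    return 8 * math.pi * nu**2 * KB * T / C**3

T = 5000.0
for nu in (1e13, 1e14, 1e15):
    print(f"{nu:.0e} Hz  Planck {planck(nu, T):.3e}  R-J {rayleigh_jeans(nu, T):.3e}")
# At low frequency the two agree; at high frequency R-J keeps growing
# while Planck's law is exponentially suppressed.
```

Expanding the exponential for small h·nu/(kB·T) recovers Rayleigh-Jeans exactly, which is why the disagreement only appears in the ultraviolet.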
The Phase Theory towards the Unification of the Forces of Nature the Heart Be... (IOSR Journals)
A new theory, called the "Phase Theory", is presented for the first time; it is offered as a natural evolution of physical thought beyond superstring theory. The theory addresses the unsolved mysteries of matter, antimatter, and interactions, and takes a wide step towards the unification of the forces of nature. In it, the vibrating string whose different frequency modes determine the different types of elementary particles is replaced by three-dimensional infinitesimal pulsating (black) holes sharing the same frequency. Different types of elementary particles are instead distinguished by different phase angles associated with that common frequency. This allows interactions to take place among elementary particles without invoking force-carrier particles, since a (stable) interaction can never occur between particles at different frequencies. Besides the mathematical proofs given in the paper in support of the theory, an experimental prediction is offered to test it, in the form of a relation between the electron radius and the quark radii. The paper argues that quarks are a direct consequence of the theory, addresses the "flavor problem" in QCD, and offers a clue to the question "Why are there so many flavors?" It also derives an equation of big bang theory describing the singularity at the moment of the universe's creation.
Lecture 4: Introduction to Quantum Chemical Simulation graduate course taught at MIT in Fall 2014 by Heather Kulik. This course covers: wavefunction theory, density functional theory, force fields and molecular dynamics and sampling.
This document describes an experiment to observe and record the surface plasmon resonance (SPR) curve for a thin metal film. Light from a laser is shone through a glass prism onto the metal film at varying angles of incidence. The intensity of the reflected light is recorded versus the angle to generate the SPR curve. Surface plasmons are quantum phenomena that can be excited at the metal-air interface by photons and decay back into photons. The SPR curve depends on the dielectric constant of the metal film and its thickness. Matching the wavevector of incident light to that of surface plasmons requires increasing the wavevector by passing light through a higher index material like glass before it reaches the metal film.
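The wavevector-matching condition can be sketched numerically; the dielectric constants and prism index below are illustrative textbook values (gold at 632.8 nm on a BK7 prism in air), not parameters from the experiment:

```python
import cmath, math

# Surface-plasmon wavevector matching at a metal-dielectric interface.
# The plasmon wavevector k_sp = (w/c) * sqrt(eps_m*eps_d/(eps_m+eps_d))
# exceeds the free-space photon wavevector, which is why the prism
# (index n_prism > 1) is needed: resonance occurs at the angle where
# n_prism * sin(theta) matches the plasmon's effective index.
def spr_angle_deg(eps_metal, eps_dielectric, n_prism):
    n_eff = cmath.sqrt(eps_metal * eps_dielectric /
                       (eps_metal + eps_dielectric)).real
    return math.degrees(math.asin(n_eff / n_prism))

# Illustrative values: gold at 632.8 nm, air above the film, BK7 prism.
print(spr_angle_deg(-11.6 + 1.2j, 1.0, 1.515))  # -> roughly 43-44 degrees
```

Because the effective plasmon index here (~1.05) exceeds 1, no angle of incidence from air alone can match it, which is the point made in the summary about needing the higher-index glass.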
This document discusses several laws relating to the radiation of heat, including:
1) Kirchhoff's law, which states that a body's emissivity at a given wavelength and temperature (the ratio of its emission to the blackbody emission) equals its absorptivity. Real objects emit less radiation than black bodies.
2) Wien's law, which says the dominant wavelength at which a blackbody emits radiation is inversely proportional to its temperature.
3) The Stefan-Boltzmann law, stating that a blackbody's total energy flux is directly proportional to the fourth power of its temperature.
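Both of the last two laws reduce to one-line formulas; a quick sketch applying them to the Sun (the solar temperature is an assumed illustrative input, not taken from the document):

```python
# Wien's displacement law and the Stefan-Boltzmann law applied to the Sun.
WIEN_B = 2.897771955e-3   # Wien displacement constant, m*K
SIGMA  = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def peak_wavelength(T):
    """Dominant blackbody wavelength (m), inversely proportional to T."""
    return WIEN_B / T

def total_flux(T):
    """Total emitted flux (W/m^2), proportional to T^4."""
    return SIGMA * T**4

T_sun = 5778.0  # effective solar surface temperature, K
print(peak_wavelength(T_sun) * 1e9)  # -> ~501 nm, in the visible band
print(total_flux(T_sun) / 1e6)       # -> ~63 MW per square metre
```

The T^4 dependence means doubling the temperature multiplies the emitted flux sixteenfold, which is why small temperature changes matter so much in radiative balance.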
The document contains conceptual problems and their solutions related to properties of light.
1. A ray of light reflects from a plane mirror at an angle of 70° between the incoming and reflected rays. The angle of reflection is 35°.
2. A lifeguard hears a swimmer calling for help. Taking the least time path, the lifeguard chooses to run on land then swim through point D to reach the swimmer.
3. Blue light appears blue underwater because the color receptors in the eye respond to the frequency of light, not the wavelength, which changes with the medium's index of refraction.
Louis De Broglie proposed in 1924 that electrons and other particles exhibit wave-like properties described by an equation relating the wavelength of a particle to its momentum. De Broglie's equation showed that all moving particles can be associated with a wavelength, and calculated wavelengths for everyday objects like cars and baseballs, though the wavelengths are too small to detect directly. The wavelength of electrons calculated using the equation could be measured using specialized equipment, providing evidence for the wave-particle duality of matter.
The nuclear shell model can explain the observed magic numbers of nuclei by considering the quantized energy levels of nucleons moving in a spherically symmetric nuclear potential. However, it fails to accurately predict properties like binding energies, magnetic dipole moments, and excited state spectra. The collective model improves on this by allowing for a non-spherical nuclear potential, leading to nuclear deformation and rotational excited states. This explains phenomena like large electric quadrupole moments observed in heavier nuclei and the characteristic rotational band structure seen in their low-lying excited states.
- The atom consists of a small, dense nucleus surrounded by an electron cloud.
- Electrons can only exist in certain discrete energy levels around the nucleus. Their wavelengths are determined by the principal quantum number.
- The Bohr model improved on earlier models by introducing energy levels and quantization, but had limitations. The quantum mechanical model treats electrons as waves and uses Schrodinger's equation.
1) De Broglie hypothesized that particles like electrons can behave as waves, with a wavelength given by λ = h/mv, where h is Planck's constant, m is the particle's mass, and v is its velocity.
2) This hypothesis provided an explanation for the quantization of angular momentum and energy levels in Bohr's model of the hydrogen atom.
3) Experiments have verified that electrons and other particles do exhibit wave-like properties such as interference and diffraction, confirming the wave-particle duality predicted by De Broglie's hypothesis.
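The contrast De Broglie drew between everyday objects and electrons is easy to reproduce; a minimal sketch with illustrative masses and speeds:

```python
H = 6.62607015e-34  # Planck constant, J*s

def de_broglie_wavelength(mass, velocity):
    """De Broglie wavelength lambda = h / (m*v), in metres."""
    return H / (mass * velocity)

# A baseball (0.145 kg at 40 m/s) vs an electron at 1e6 m/s:
print(de_broglie_wavelength(0.145, 40.0))       # ~1.1e-34 m, unobservably small
print(de_broglie_wavelength(9.109e-31, 1.0e6))  # ~7.3e-10 m, atomic scale
```

The electron's wavelength is comparable to crystal lattice spacings, which is exactly why electron diffraction from crystals provided the experimental confirmation mentioned above.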
Lecture 5: Introduction to Quantum Chemical Simulation graduate course taught at MIT in Fall 2014 by Heather Kulik. This course covers: wavefunction theory, density functional theory, force fields and molecular dynamics and sampling.
Lecture 3: Introduction to Quantum Chemical Simulation graduate course taught at MIT in Fall 2014 by Heather Kulik. This course covers: wavefunction theory, density functional theory, force fields and molecular dynamics and sampling.
The document discusses nuclear models, specifically the liquid drop model. It provides three key points:
1. The liquid drop model views the nucleus as similar to a liquid drop, with nucleons interacting through short-range forces like molecules in a liquid. This explains trends in binding energy with mass number.
2. The Bethe-Weizsäcker formula provides a semi-empirical expression for binding energy as a function of mass number and atomic number. It includes terms for volume, surface tension, electrostatic repulsion, and asymmetry.
3. The formula allows predicting stability against alpha or beta decay. Alpha decay energy can be calculated and nuclei with mass over 200 are predicted to alpha decay. Mass parabol
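A minimal sketch of the semi-empirical binding energy, using one common set of fitted coefficients (the exact values and the pairing-term convention vary between fits, so this is indicative only):

```python
# Semi-empirical (Bethe-Weizsacker) binding energy in MeV.
# Coefficients are one commonly quoted fit; other fits differ slightly.
A_V, A_S, A_C, A_A, A_P = 15.75, 17.8, 0.711, 23.7, 11.18

def binding_energy(Z, A):
    """Approximate nuclear binding energy (MeV) for Z protons, A nucleons."""
    N = A - Z
    B = (A_V * A                         # volume term
         - A_S * A**(2 / 3)              # surface term
         - A_C * Z * (Z - 1) / A**(1 / 3)  # Coulomb repulsion
         - A_A * (A - 2 * Z)**2 / A)     # asymmetry term
    if Z % 2 == 0 and N % 2 == 0:        # pairing term (even-even nuclei)
        B += A_P / A**0.5
    elif Z % 2 == 1 and N % 2 == 1:      # (odd-odd nuclei)
        B -= A_P / A**0.5
    return B

# Iron-56, near the peak of binding energy per nucleon:
print(binding_energy(26, 56) / 56)  # -> ~8.8 MeV per nucleon
```

With these terms one can scan Z at fixed A to trace the mass parabolas the summary alludes to, and estimate alpha-decay energies from binding-energy differences.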
This document provides recipes and event planning tips from an international Christmas celebration held at Laurea University of Applied Sciences in Finland. It includes an appetizer, main course and dessert recipes from 10 different countries representing 5 continents. The recipes are accompanied by tips on budgeting, facilities, entertainment and menu planning for events. The document aims to give the reader ideas and tastes from the international buffet menu that was served at the Christmas event.
This document discusses key concepts relating to the wave-particle duality of light and matter. It explains that light exhibits both wave-like and particle-like properties, behaving as photons with a specific energy determined by frequency. The document also covers the development of models of the atom, including Bohr's model of electron orbitals and Schrodinger's wave equation describing electron probability clouds. It introduces concepts like atomic orbitals, electron configuration, and the Aufbau principle for building up orbitals according to energy levels.
Supersymmetry (SUSY) is a proposed symmetry between bosons and fermions that could help solve issues in the Standard Model such as the hierarchy problem. SUSY introduces new "quantum" dimensions beyond the usual 3 spatial and 1 time dimension. SUSY generators called Q transform fermions into bosons and vice versa. The SUSY algebra involves the generators Q satisfying anticommutation relations in addition to the usual commutation relations of generators like momentum P and angular momentum M. While experimental evidence for SUSY is still lacking, it is an attractive theoretical idea that may be discovered at energy scales below 1 TeV.
Planck discovered an empirical formula in 1900 that accurately described blackbody radiation spectra across all wavelengths and temperatures. Seeking a physical explanation, he postulated that the walls of a hot cavity contained identical vibrating oscillators that exchange energy with radiation in quantized units proportional to their frequency, called quanta. This quantum hypothesis explained the experimental observations and values Planck calculated for the universal constant h. While initially a mathematical trick, Planck's quantum theory marked a revolutionary beginning of modern quantum mechanics.
1) A new cosmological model is proposed where the universe is spontaneously created from nothing via quantum tunneling into a de Sitter space.
2) After tunneling, the model evolves according to the inflationary scenario, avoiding the big bang singularity and not requiring initial conditions.
3) The model suggests that the universe was created via quantum tunneling from a state of literally nothing into a de Sitter space, which then evolved into the expanding universe we observe according to known physics.
The document discusses string theory, extra dimensions, and cosmology. It notes that inflation and dark energy require new physics beyond general relativity and quantum field theory. Most string cosmology models invoke string theory, branes, or extra dimensions as approximations of string theory solutions. However, reliably anchoring cosmology in string theory has proven difficult, with success being rare. While cosmological observations can't determine the nature of dark energy, understanding fundamental physics through experiments like the LHC remains a key hope.
The document discusses incompressible liquid droplets, specifically:
1) Spherically symmetric liquid droplets have a density that is independent of size and molecules move within them.
2) Heat of vaporization and binding energy relate to the energy required to separate molecules in a liquid.
3) Fission and fusion reactions can be modeled using terms that account for volume, surface area, electrostatic forces, asymmetry, pairing, and binding energies.
4) Equations are presented for calculating force between charges, energy, volume, area, and nuclear mass.
Planck was able to account for the measured spectral distribution of radiation from a thermal source by postulating that the energies of harmonic oscillators are quantized. Einstein then used this idea to explain the photoelectric effect. The Planck radiation law provides the frequency distribution of stored energy in a resonator in thermal equilibrium. It avoids the ultraviolet catastrophe seen in the Rayleigh-Jeans law. Einstein introduced phenomenological coefficients (A and B) to describe absorption, stimulated emission, and spontaneous emission in a two-level system, which relate to the Planck radiation law.
The Phase Theory towards the Unification of the Forces of Nature the Heart Be...IOSR Journals
A new theory has been presented, for the first time, called the "Phase Theory", which is the natural evolution of the physical thought and is considered the one beyond the super string theory. This theory solves the unsolved problems of the mysterious of matter, antimatter and interactions and makes a wide step towards the unification of the forces of nature. In this theory, the vibrating string of different frequency modes which determines the different types of elementary particles is replaced by a three dimensional infinitesimal pulsating (black)holes with the same frequency. Different types of elementary particles are determined by different phase angles associated with the same frequency. This allows the force of interactions to take place among elementary particles, without the need to invoke the notion of the force carrier particles, as the (stable) force of interactions can never take place between elementary particles at different frequencies. Besides the strong mathematical proofs given in this paper to prove its truthfulness, an experimental prediction has been given to confirm the theory presented in the form of the relation between the electron radius and quarks radii. The paper shows that quarks are direct consequence of this theory, and solves "the flavor problem" in QCD, and gives the clue to answer the questions of "Why are there so many flavors? The paper also derives the equation of the big bang theory which describes the singularity of the moment of creation of the universe.
Lecture 4: Introduction to Quantum Chemical Simulation graduate course taught at MIT in Fall 2014 by Heather Kulik. This course covers: wavefunction theory, density functional theory, force fields and molecular dynamics and sampling.
This document describes an experiment to observe and record the surface plasmon resonance (SPR) curve for a thin metal film. Light from a laser is shone through a glass prism onto the metal film at varying angles of incidence. The intensity of the reflected light is recorded versus the angle to generate the SPR curve. Surface plasmons are quantum phenomena that can be excited at the metal-air interface by photons and decay back into photons. The SPR curve depends on the dielectric constant of the metal film and its thickness. Matching the wavevector of incident light to that of surface plasmons requires increasing the wavevector by passing light through a higher index material like glass before it reaches the metal film.
This document discusses several laws relating to the radiation of heat, including:
1) Kirchhoff's law, which states that the emission of a body at a given wavelength and temperature is equal to the ratio of the emissivity and blackbody emission. Real objects emit less radiation than black bodies.
2) Wien's law, which says the dominant wavelength at which a blackbody emits radiation is inversely proportional to its temperature.
3) The Stefan-Boltzmann law, stating that a blackbody's total energy flux is directly proportional to the fourth power of its temperature.
The document contains conceptual problems and their solutions related to properties of light.
1. A ray of light reflects from a plane mirror at an angle of 70° between the incoming and reflected rays. The angle of reflection is 35°.
2. A lifeguard hears a swimmer calling for help. Taking the least time path, the lifeguard chooses to run on land then swim through point D to reach the swimmer.
3. Blue light appears blue underwater because the color molecules in the eye respond to the frequency of light, not the wavelength, which changes with the medium's index of refraction.
Louis De Broglie proposed in 1924 that electrons and other particles exhibit wave-like properties described by an equation relating the wavelength of a particle to its momentum. De Broglie's equation showed that all moving particles can be associated with a wavelength, and calculated wavelengths for everyday objects like cars and baseballs, though the wavelengths are too small to detect directly. The wavelength of electrons calculated using the equation could be measured using specialized equipment, providing evidence for the wave-particle duality of matter.
The nuclear shell model can explain the observed magic numbers of nuclei by considering the quantized energy levels of nucleons moving in a spherically symmetric nuclear potential. However, it fails to accurately predict properties like binding energies, magnetic dipole moments, and excited state spectra. The collective model improves on this by allowing for a non-spherical nuclear potential, leading to nuclear deformation and rotational excited states. This explains phenomena like large electric quadrupole moments observed in heavier nuclei and the characteristic rotational band structure seen in their low-lying excited states.
- The atom consists of a small, dense nucleus surrounded by an electron cloud.
- Electrons can only exist in certain discrete energy levels around the nucleus. Their wavelengths are determined by the principal quantum number.
- The Bohr model improved on earlier models by introducing energy levels and quantization, but had limitations. The quantum mechanical model treats electrons as waves and uses Schrodinger's equation.
1) De Broglie hypothesized that particles like electrons can behave as waves, with a wavelength given by λ = h/mv, where h is Planck's constant, m is the particle's mass, and v is its velocity.
2) This hypothesis provided an explanation for the quantization of angular momentum and energy levels in Bohr's model of the hydrogen atom.
3) Experiments have verified that electrons and other particles do exhibit wave-like properties such as interference and diffraction, confirming the wave-particle duality predicted by De Broglie's hypothesis.
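The de Broglie relation above lends itself to a quick numerical illustration. The following stdlib-only Python sketch (my own, not from the summarized slides) contrasts an electron with a baseball, as the summary describes:

```python
# Quick illustration (mine, not from the summarized slides) of the
# de Broglie relation lambda = h / (m * v).
h = 6.62607015e-34  # Planck constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Return the de Broglie wavelength in metres."""
    return h / (mass_kg * speed_m_s)

# An electron at 1% of light speed: atomic-scale, measurable by diffraction.
electron = de_broglie_wavelength(9.109e-31, 3.0e6)   # ~2.4e-10 m

# A 0.145 kg baseball at 40 m/s: ~1e-34 m, far too small to ever detect.
baseball = de_broglie_wavelength(0.145, 40.0)

print(f"electron: {electron:.2e} m, baseball: {baseball:.2e} m")
```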
Lecture 5: Introduction to Quantum Chemical Simulation graduate course taught at MIT in Fall 2014 by Heather Kulik. This course covers: wavefunction theory, density functional theory, force fields and molecular dynamics and sampling.
Lecture 3: Introduction to Quantum Chemical Simulation graduate course taught at MIT in Fall 2014 by Heather Kulik. This course covers: wavefunction theory, density functional theory, force fields and molecular dynamics and sampling.
The document discusses nuclear models, specifically the liquid drop model. It provides three key points:
1. The liquid drop model views the nucleus as similar to a liquid drop, with nucleons interacting through short-range forces like molecules in a liquid. This explains trends in binding energy with mass number.
2. The Bethe-Weizsäcker formula provides a semi-empirical expression for binding energy as a function of mass and atomic number. It includes terms for volume, surface tension, electrostatic repulsion and asymmetry.
3. The formula allows predicting stability against alpha or beta decay. Alpha decay energy can be calculated, and nuclei with mass over 200 are predicted to alpha decay. Mass parabolas derived from the formula indicate which isobars are stable against beta decay.
1) The document proposes an alternative cosmological model where dark matter and dark energy are described as forms of ether, analogous to Mach's principle of inertia.
2) In this model, dark matter arises from the QCD vacuum or "sea" of quark-antiquark pairs and gluons at the confinement scale, while dark energy corresponds to the zero-point energy of the QCD vacuum.
3) The model aims to replace the standard LambdaCDM model, treating the expanding universe as a dynamically stable "biking" Einstein universe where the running cosmological constant compensates for the effect of gravity at all epochs.
Fundamental principle of information to-energy conversion.Fausto Intilla
Abstract. The equivalence of 1 bit of information to entropy was given by Landauer in 1961 as k ln 2, k being the Boltzmann constant. Erasing information implies heat dissipation, and the energy of 1 bit would then be (the Landauer limit) kT ln 2, T being the ambient temperature. From a quantum-cosmological point of view, the minimum quantum of energy in the universe corresponds today to a temperature of 10^-29 K, probably forming a cosmic background of a Bose condensate [1]. Then, the bit with minimum energy today in the Universe is a quantum of energy 10^-45 erg, with an equivalent mass of 10^-66 g. Low temperature implies low energy per bit and, of course, this is the way toward faster and less energy-dissipating computing devices. Our conjecture is this: the possibility of a future access to the CBBC (a coupling/channeling?) would mean a huge jump in the performance of these devices.
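Landauer's bound kT ln 2 is easy to evaluate directly. A small Python sketch (mine, not from the abstract) reproduces both the ambient-temperature limit and the ~10^-45 erg figure quoted above:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_k):
    """Minimum heat (J) dissipated when erasing one bit at temperature T."""
    return k_B * temperature_k * math.log(2)

room = landauer_limit(300.0)    # ~2.87e-21 J at ambient temperature
cold = landauer_limit(1e-29)    # ~9.6e-53 J, i.e. ~1e-45 erg as quoted

print(f"300 K: {room:.3e} J; 1e-29 K: {cold:.3e} J = {cold * 1e7:.1e} erg")
```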
Dark Matter Annihilation inside Large-Volume Neutrino DetectorsSérgio Sacani
New particles in theories beyond the standard model can manifest as stable relics that interact strongly with visible matter and make up a small fraction of the total dark matter abundance. Such particles represent an interesting physics target since they can evade existing bounds from direct detection due to their rapid thermalization in high-density environments. In this work we point out that their annihilation to visible matter inside large-volume neutrino telescopes can provide a new way to constrain or discover such particles. The signal is the most pronounced for relic masses in the GeV range, and can be efficiently constrained by existing Super-Kamiokande searches for dinucleon annihilation. We also provide an explicit realization of this scenario in the form of secluded dark matter coupled to a dark photon, and we show that the present method implies novel and stringent bounds on the model that are complementary to direct constraints from beam dumps, colliders, and direct detection experiments.
This document is Einstein's seminal 1905 paper "Concerning an Heuristic Point of View Toward the Emission and Transformation of Light". In the paper, Einstein summarizes issues with existing theories of light and blackbody radiation. He proposes that light energy is quantized rather than continuous, consisting of discrete "energy quanta" localized in space that can only be emitted or absorbed as complete units. This revolutionary idea helped lay the foundations for the development of quantum mechanics.
Dark matter modeled as a Bose Einstein gluon condensate with an energy density relative to baryonic energy density in agreement with observation (ArXiv: 1507.00460)
Black holes and dark matter must have formed early in the universe's development for galaxies and stars to later form, according to this document. It proposes that fundamental particles called dyons, which carry both electric and magnetic charges, aggregated in the early exponentially expanding universe to form black holes and dark matter. As the universe expanded and its energy density decreased, these dyon aggregates could have evaporated or dissociated into the elementary particles observed in experiments today. The document presents models showing how fundamental particle energies may have decreased exponentially as the universe expanded, in a way that could explain the formation of black holes and dark matter from dyon aggregates in the early universe.
1) The observable universe has a mass of approximately 24x10^53 kg.
2) Normal matter, including atoms, stars and galaxies, constitute only about 4% of the observable universe. The rest is dark matter (27%) and dark energy (71%).
3) Dark matter interacts gravitationally but is unseen, and helps hold galaxies together. Dark energy is causing the accelerating expansion of the universe against the force of gravity.
The current ability to test theories of gravity with black hole shadowsSérgio Sacani
Our Galactic Centre, Sagittarius A*, is believed to harbour a supermassive black hole, as suggested by observations tracking individual orbiting stars [1,2]. Upcoming submillimetre very-long-baseline interferometry images of Sagittarius A* carried out by the Event Horizon Telescope collaboration (EHTC) [3,4] are expected to provide critical evidence for the existence of this supermassive black hole [5,6]. We assess our present ability to use EHTC images to determine whether they correspond to a Kerr black hole as predicted by Einstein's theory of general relativity or to a black hole in alternative theories of gravity. To this end, we perform general-relativistic magnetohydrodynamical simulations and use general-relativistic radiative-transfer calculations to generate synthetic shadow images of a magnetized accretion flow onto a Kerr black hole. In addition, we perform these simulations and calculations for a dilaton black hole, which we take as a representative solution of an alternative theory of gravity. Adopting the very-long-baseline interferometry configuration from the 2017 EHTC campaign, we find that it could be extremely difficult to distinguish between black holes from different theories of gravity, thus highlighting that great caution is needed when interpreting black hole images as tests of general relativity.
Quantization of photonic energy and photonic wave lengthEran Sinbar
The document proposes that if space is quantized at the Planck length, then photonic energy and wavelength must also be quantized. It suggests that future experiments could detect these quantization levels in cosmic radiation or particle collisions. It also puts forward a "grid dimensions" theory proposing extra non-local dimensions between Planck-length pieces of space that could explain quantum non-local effects like entanglement. Key equations quantify the proposed quantized limits on the momentum, mass, and velocity of particles if space-time is quantized.
Unification theory with no extra dimensions. The first part unifies the strong nuclear force with the gravitational force in a mathematical way; the quantum vacuum is treated as a deformable system by the strong nuclear force. The second part unifies the nuclear force with the quantum vacuum in a hypothetical structure; the quantum vacuum is treated as a supersymmetric and metastable system with properties related to the different types of particles’ motion.
This document discusses whether quantum mechanics is involved in the early evolution of the universe and if a Machian relationship between gravitons and gravitinos can help answer this question. It proposes that:
1) Gravitons and gravitinos carry information and their relationship, described as a Mach's principle, conserves this information from the electroweak era to today. This suggests quantum mechanics may not be essential in early universe formation.
2) A minimum amount of initial information, such as a small value for Planck's constant, is needed to set fundamental cosmological parameters and could be transferred from a prior universe.
3) Early spacetime may have had a pre-quantum state with low entropy and few degrees of freedom.
Detailing Coherent, Minimum Uncertainty States of Gravitons, as Semi Classical Components of Gravity Waves, and How Squeezed States Affect Upper Limits to Graviton Mass
We present what is relevant to squeezed states of initial space-time and how that affects both the composition of relic gravitational waves and also gravitons. A side issue to consider is whether gravitons can be configured as semi-classical "particles", which is akin to the pilot-particle model of quantum mechanics as embedded in a larger nonlinear "deterministic" background.
First Presented Saturday, September 3, 2011 at the G999 Conference, Philadelphia, PA http://ggg999.org/
Next Presentation: Friday, September 9, SAN MARINO WORKSHOP ON ASTROPHYSICS AND COSMOLOGY FOR MATTER AND ANTIMATTER
http://www.workshops-hadronic-mechanics.org/
San Marino, N. Italy
Discussion: The Semi-classical Nature of Gravity Reviewed; and Can We Use a Graviton Entanglement Version of the EPR System to Answer Whether the Graviton is Classical or Quantum in Origin?
Publisher’s note: Dr. Beckwith: I am honored that you have seen fit to acknowledge me in your presentations of today, September 3, 2011 at the G999 Conference in Philadelphia, and Friday, September 9, at the San Marino Workshop on Astrophysics and Cosmology For Matter and Antimatter, in San Marino, N. Italy.
In my view, these are rather extraordinary postulations, especially the probability of extra-universal black hole gravitational origins, along with expansion beyond Hawking of quantum wave theory and the "quantizing" of gravity. The latter may very well lead to a thorough reexamination of our concept of space and time, and to the idea that the latter might not be so unidirectional after all. The potential is breathtaking, as it represents steps forward in proving the existence of subspace, the possibility of concomitant multi-universality and the 5D dimensionality of black holes.
Can time manipulation and Biefeld-Brown suggested electro-gravitics as a spacecraft propulsive methodology be far behind? I think not…
Wave particle unity and a physically realist interpretation of lightquantumrealism
Welcome to Quantum Realism, where we introduce you to the real world of quantum mechanics and scientific realism. Download eBooks about quantum mechanics, scientific realism and related topics.
Black hole entropy leads to the non-local grid dimensions theory Eran Sinbar
Following Prof. Bekenstein and Prof. Hawking, a black hole's maximal entropy, the maximum amount of information it can absorb beyond its event horizon, is proportional to the area of its event horizon divided by quantized area units on the scale of the Planck area (the square of the Planck length) [1].
This quantization of entropy and information into units of Planck area leads to the assumption that space is not "smooth" but rather divided into quantized units ("space cells"). Although the Bekenstein-Hawking entropy equation describes a specific case regarding the quantization of the 2D event horizon, the idea can be generalized to standard three-dimensional (3D) flat space, outside and far away from the black hole's event horizon. In this general case we assume that these quantized units of space are 3D space "cells" on the scale of the Planck length in each of the three dimensions.
If this is truly the case and the fabric of space is quantized into local 3D space cells of Planck-length scale in each dimension, then we assume there must be extra non-local space dimensions situated at the non-local borders of these 3D space cells, since there must be something dividing space into these quantized cells.
Our assumption is that these borders are extra non-local dimensions, which we have named the "GRID" (or grid) extra dimensions, since they look like a non-local 3D grid bordering the local 3D space cells. These non-local grid dimensions are responsible for all unexplained non-local phenomena, like the well-known non-local entanglement, or in the phrase of Albert Einstein, "spooky action at a distance" [2]. So by proving that space-time is quantized, we prove the existence of the non-local grid dimension that divides space-time into these quantized 3D Planck-scale cells.
The binding of cosmological structures by massless topological defectsSérgio Sacani
Assuming spherical symmetry and weak field, it is shown that if one solves the Poisson equation or the Einstein field equations sourced by a topological defect, i.e. a singularity of a very specific form, the result is a localized gravitational field capable of driving flat rotation (i.e. Keplerian circular orbits at a constant speed for all radii) of test masses on a thin spherical shell without any underlying mass. Moreover, a large-scale structure which exploits this solution by assembling concentrically a number of such topological defects can establish a flat stellar or galactic rotation curve, and can also deflect light in the same manner as an equipotential (isothermal) sphere. Thus, the need for dark matter or modified gravity theory is mitigated, at least in part.
Anti-Universe And Emergent Gravity and the Dark UniverseSérgio Sacani
Recent theoretical progress indicates that spacetime and gravity emerge together from the entanglement structure of an underlying microscopic theory. These ideas are best understood in Anti-de Sitter space, where they rely on the area law for entanglement entropy. The extension to de Sitter space requires taking into account the entropy and temperature associated with the cosmological horizon. Using insights from string theory, black hole physics and quantum information theory we argue that the positive dark energy leads to a thermal volume law contribution to the entropy that overtakes the area law precisely at the cosmological horizon. Due to the competition between area and volume law entanglement the microscopic de Sitter states do not thermalise at sub-Hubble scales: they exhibit memory effects in the form of an entropy displacement caused by matter. The emergent laws of gravity contain an additional ‘dark’ gravitational force describing the ‘elastic’ response due to the entropy displacement. We derive an estimate of the strength of this extra force in terms of the baryonic mass, Newton’s constant and the Hubble acceleration scale a0 = cH0, and provide evidence for the fact that this additional ‘dark gravity force’ explains the observed phenomena in galaxies and clusters currently attributed to dark matter.
Relativity and Quantum Mechanics assignment 5&7Brandy Wang
1. General relativity describes large astronomical scales while quantum mechanics describes microscopic scales. When applying the theories at small scales, general relativity's smooth geometric model of space conflicts with quantum mechanics' principle of uncertainty.
2. Quantum tunneling allows particles to temporarily "borrow" energy to pass through classically forbidden areas, but does not violate energy conservation as any additional energy is given back when measured.
3. Pauli's exclusion principle states that two fermions cannot be in the same quantum state. When compressing fermions, their wavelengths shrink and momenta/energy increase, requiring more energy to further reduce separation below their wavelengths. This creates degeneracy pressure resisting compression.
The distribution and_annihilation_of_dark_matter_around_black_holesSérgio Sacani
A new computer simulation made by NASA shows that dark matter particles colliding in the extreme gravity of a black hole can produce strong, potentially observable gamma-ray light. Detecting this emission would provide astronomers with a new tool for understanding both black holes and the nature of dark matter, an elusive substance responsible for most of the mass of the universe that neither reflects, absorbs nor emits light.
KR_SeniorResearch
Estimating the Maximum Intensity of Dark Matter Glow
Kevin L. Romans
Texas A&M Kingsville, Texas, 78363, Department of Physics & Geosciences
Submitted December 11, 2014; Revised December 14, 2014
ABSTRACT
In Dr. Lionel Hewett's theory, Time-Symmetric Cosmology [i] (TSC), an attempt to explain the origin of dark matter is made. The theory predicts that dark matter is a boson that results from the Hawking evaporation (radiation) of Primordial Black Holes down to a final residual ground state; its predicted mass is 4.5E-34 kg. This residual Dark Matter Particle (DMP) can form a Bose-Einstein condensate with another, and the pair perforce must decay back down to the ground state. The most probable mode of decay is the symmetric release of two photons, again via Hawking radiation, each ~10 nm in wavelength. Using a series of classical approximations, a simplistic model was created to estimate the total photon intensity of a given dark matter halo. Looking at Andromeda, a value of 4.67E+24 was obtained for the dark matter photon intensity, while a visible photon intensity of 3.19E+09 was calculated for comparison. This stark difference needs to be accounted for with more rigorous modeling methods.
Key Topics: Time-Symmetric Cosmology, Dark Matter Particles, Photon Intensity
I. Introduction
The current standard model of cosmology, the Λ-Cold Dark Matter model [ii], does not present a satisfactory explanation for the origin and properties of Dark Matter (DM). Time-Symmetric Cosmology (TSC) is an alternative theory to cosmological inflation authored by Dr. Lionel Hewett of Texas A&M University - Kingsville. Assuming the universe began with a physical singularity (or creation event) and utilizing the symmetry of time surrounding this event, TSC correctly predicts over twenty cosmological parameters, such as the cosmological constant, the density of vacuum energy, and the age of photon decoupling. In developing the model Dr. Hewett was able to explain the origin of DM, the mechanisms behind its strange properties, and a way to truly see DM beyond mere gravitational interactions.
TSC is composed of two models, classical and quantum. The classical model is similar to the well-known Friedmann-Lemaître-Robertson-Walker model after inflation. The quantum model expands upon the classical model to predict that the ensemble of first events which immediately followed the creation event were Primordial Black Holes (PBH). Created out of a high energy density confined to an infinitesimal spatial extent, these PBH are predicted to have zero velocity relative to their respective timelines and are kinetically cold. They also exhibit Schwarzschild geometry and evaporate symmetrically via Hawking radiation so as to produce the expected radiation density in the early universe, the excess of baryonic matter, and the observed DM content of the universe.
These PBH are assumed to radiate down to a ground state, leaving a residual black hole. This tiny particle retains its Schwarzschild characteristics and has all zero quantum numbers except its mass; its predicted mass of

    m ≈ 4.5E-34 kg    (1)

is enough to account for the observed DM content, making it a perfect candidate for a Dark Matter Particle (DMP). Since the DMP has zero spin it is a boson, and two of them may form a Bose-Einstein condensate [iii]. When two DMPs condense they form a particle with twice the ground state energy, which perforce must evaporate via Hawking radiation. In doing so it must release the energy of one DMP's rest energy,

    E = mc² ≈ 4.04E-17 J    (2)

Given this small energy, the most likely mode of decay is the symmetric emission of two photons, each of wavelength

    λ = 2hc/(mc²) = 2h/(mc) ≈ 9.8E-9 m    (3)

This suggests that DM should glow at the spectral border between high ultraviolet and X-ray.
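The ~10 nm figure quoted above is straightforward to check. The following Python sketch (mine, not part of the paper; constants are CODATA values) splits the DMP rest energy symmetrically between two photons:

```python
# Sketch (not from the paper): checking the quoted ~10 nm wavelength from
# E = m c^2 split symmetrically between two photons, lambda = h c / E_photon.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
m_dmp = 4.5e-34      # TSC's predicted DMP mass, kg

rest_energy = m_dmp * c**2           # energy released per condensate decay
photon_energy = rest_energy / 2      # shared equally by the two photons
wavelength = h * c / photon_energy   # ~9.8e-9 m: the high-UV/X-ray border

print(f"E = {rest_energy:.3e} J, lambda = {wavelength * 1e9:.2f} nm")
```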
II. The Project Model
Before describing the model, let us consider the simplifying assumptions needed. The target of interest is a spherically symmetric, homogeneous DM halo. In order for the DMPs to condense, their interaction needs to be perfectly symmetric; they must be in the same quantum state within uncertainty. For the scope of this article this will simply mean that two particles are in the same location with the same momentum vector. Finally, the probability of two DMPs reacting under the above conditions is assumed to be unity. All of these simplifying assumptions will have the effect of increasing the number and rate of emission events, thus increasing the intensity of the glow.
The goal of the model is to predict the photon intensity of a DM halo as seen from Earth. In order to get this we first need the photon flux,

    Flux = (2 photons per emission) / (average time between collisions) × (probability that a collision is an emission)    (4)

A collision here is a classical one, in which the cross section of a particle sweeps out a volume and another identical particle's geometric center falls into this volume. As a particle sweeps this volume out in a given time interval it may interact with many particles. However, not all of these collisions will result in an emission event, and the collision rate must be scaled by the appropriate factor.
A. Same State Probability
We begin by deriving the factor that corresponds to the probability that two DMPs will be in the same state and therefore emit. This will be tackled in two parts by treating the probability of them being in the same location and of them having the same momentum separately; the total probability factor will simply be the product of the two.
1. Position
To get a handle on the probability that two DMPs will be in the same location within some reference volume V, we need to estimate the uncertainty in the size of the particle.
Assumption 1: The size of a 3-D Infinite Square Well (ISW), with ground state energy equal to the rest mass energy of the DMP, represents the uncertainty in position of the particle.

    3π²ℏ² / (2m(Δx)²) = mc²    (5)

We have made the further simplifying assumption that this uncertainty in position is symmetric with respect to each of the three Cartesian axes (x, y, z). With this we can use the length of one side of the ISW as the uncertainty in position of the particle along each dimension. Furthermore, we will use the square of the uncertainty in position along one dimension (say x) and treat it as the particle's collision cross section. Solving for this uncertainty we get

    Δx = (πℏ/mc)√(3/2) ≈ 3.0E-9 m    (6)

Using the minimum-uncertainty relationship, and still using the three-dimensional symmetry, we can get the uncertainty in momentum of the particle,

    Δp = ℏ/(2Δx) ≈ 1.75E-26 kg·m/s    (7)
The following values have been calculated and are tabulated in Table 1 below,
    Δx              Δp
    3.0E-9 m        1.75E-26 kg m/s
Table 1. Calculated uncertainties along one dimension
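Table 1's values follow directly from the stated assumptions. A short Python check (a sketch of mine using CODATA constants; not part of the paper):

```python
import math

# Sketch reproducing Table 1: a 3-D infinite square well whose ground-state
# energy 3*pi^2*hbar^2 / (2*m*dx^2) equals the DMP rest energy m*c^2 gives
# dx, and the minimum-uncertainty relation dp = hbar / (2*dx) gives dp.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
m_dmp = 4.5e-34          # predicted DMP mass, kg

dx = (math.pi * hbar / (m_dmp * c)) * math.sqrt(1.5)   # ~3.0e-9 m
dp = hbar / (2 * dx)                                   # ~1.75e-26 kg*m/s

print(f"dx = {dx:.2e} m, dp = {dp:.2e} kg m/s")
```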
Assumption 2: The probability that at least two DMPs share the same location is given by the ratio of one particle's volume to that of the reference volume.

    P₁ = (Δx)³ / V    (8)

To see the logic behind this, consider the following two-dimensional example. Imagine that you are throwing darts into an area A at a constant rate and at random locations. Within this area there is a smaller area A₁, seen in Figure 1. When averaged over time, we expect the ratio of the number of darts that landed in A₁ versus A to be equal to the ratio of these areas. This is easily extended into three dimensions as a ratio of volumes.
If we recognize that a given reference volume V is filled with N particles, each of which sees approximately N other particles, then the probability scales up by a factor of N². Also, since the volume is arbitrary, we can divide P₁ by another factor of V, leading to a probability per unit volume,

    Pₓ = N²(Δx)³ / V² = n²(Δx)³    (9)

where n is the number density of the halo (N/V). This will be calculated by dividing the mass density of the halo by the mass of the DMP,

    n = ρ_halo / m    (10)
2. Momentum
Assumption 3: The DMPs can be described with kinetic molecular theory, where their source of energy is gravitational.
In this section we use the simplifying assumptions that the halo of interest contains DMPs that are slow moving (relative to light) and have low density (i.e., the distance between the particles is large). Under these conditions classical statistical mechanics applies. Starting from Boltzmann statistics one can derive a momentum vector distribution,

    f_B(p) d³p = (β/2πm)^(3/2) exp(−βp²/2m) d³p    (11)
A1
A
Figure 1
where p is the magnitude of the momentum, β is 1/(k_B T), k_B is the Boltzmann
constant, T is the absolute temperature, f_B is the distribution, and d³p is a
volume element in momentum space.
Given some momentum p, the distribution times the cube of the uncertainty in
momentum gives the probability that a DMP's momentum lies within that small
interval. What we are interested in, however, is how likely it is, at each value
of p, that neighboring momenta fall within our calculated uncertainty. This
average is taken over all possible values of p,
⟨f_B Δp³⟩ = ∫ f_B(p) Δp³ f_B(p) d³p                                (12)
and evaluating this integral yields,
⟨f_B Δp³⟩ = Δp³ (β / (4πm))^(3/2)                                  (13)
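The closed form in equation 13 can be checked against a direct numerical integration of the squared distribution; the mass and β below are arbitrary test values, not DMP parameters:

```python
# Numerical check of equation (13): integrating the squared Boltzmann momentum
# distribution over momentum space should give (beta / (4*pi*m))**1.5.
import numpy as np

m = 1.0e-30    # kg, arbitrary test value
beta = 1.0e40  # 1/J, arbitrary test value

def f_boltzmann(p):
    """Maxwell-Boltzmann momentum distribution f_B(p)."""
    return (beta / (2.0 * np.pi * m)) ** 1.5 * np.exp(-beta * p**2 / (2.0 * m))

# Integrate f^2 over all momentum space using d^3p = 4*pi*p^2 dp; the
# distribution is negligible beyond ~10 * sqrt(m / beta).
p = np.linspace(0.0, 10.0 * np.sqrt(m / beta), 200_001)
integrand = 4.0 * np.pi * p**2 * f_boltzmann(p) ** 2
numeric = float(np.sum((integrand[1:] + integrand[:-1]) * np.diff(p)) / 2.0)
closed_form = (beta / (4.0 * np.pi * m)) ** 1.5

print(f"numeric     = {numeric:.6e}")
print(f"closed form = {closed_form:.6e}")
```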
3. Same State
As stated earlier, the probability that at least two DMP's will be in the same
state is the product of equations 9 and 13. Carrying out the product and
simplifying, we get the total probability per unit volume,
P = n²Δx³ Δp³ (β / (4πm))^(3/2)                                    (14)
B. Relaxation Time
Assumption 4: The relaxation time for a DM halo is equal to the average time between
collisions found in Classical Collision Theory.
Here the relaxation time refers to the average amount of time needed for the system
(particles in the halo) to be randomly shuffled and the probability relations can be reapplied. The
result can be quoted directly from any undergraduate university physics textbook [iv],
τ = 1 / (√2 n Δx² v)                                               (15)
where the speed v used here is the approximate root-mean-square (rms) speed of
the particles in the halo. Local particles a small radial distance apart will be
moving at slightly different speeds. Although they cannot condense, they can
still interact gravitationally. Given a small region of these weakly interacting
particles, we would expect each one to be jostled in many directions with
varying strength. Over time this region will come to contain randomly moving
particles (while still maintaining its orbit); this propagation of energy
resembles thermal conduction. We can then invoke the equipartition theorem and
the equation for the kinetic energy per particle to eliminate β,
β = 3 / (mv²)                                                      (16)
To use equation 15 effectively we restricted our analysis to halos coupled to a
parent galaxy with a known rotation curve. The speed v is the value taken from
averaging the speeds along the flat section of the curve; this average speed
will be approximately constant all the way out to the edge of the parent galaxy.
At this point we have everything we need to fill out equation 4.
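These two relations can be sketched numerically. The sketch assumes the common kinetic-theory form τ = 1/(√2·n·σ·v) with σ = Δx² (textbook prefactors vary), uses the Andromeda-like halo values from Table 2, and takes a hypothetical placeholder for the DMP mass, which in the actual calculation is supplied by TSC:

```python
import math

def relaxation_time(n, delta_x, v):
    """Mean time between collisions, tau = 1 / (sqrt(2) * n * sigma * v),
    with the cross-section sigma taken to be delta_x**2."""
    return 1.0 / (math.sqrt(2.0) * n * delta_x**2 * v)

def beta_from_speed(m_dmp, v_rms):
    """Equipartition: (3/2) kB T = (1/2) m v_rms^2  ->  beta = 3 / (m * v_rms^2)."""
    return 3.0 / (m_dmp * v_rms**2)

# Andromeda-like halo values from Table 2
n = 3.64e12       # m^-3
delta_x = 3.0e-9  # m
v = 2.0e5         # m/s, flat rotation-curve speed
tau = relaxation_time(n, delta_x, v)

m_dmp = 1.0e-33   # kg, hypothetical placeholder; the real value comes from TSC
beta = beta_from_speed(m_dmp, v)
print(f"tau  = {tau:.2e} s")
print(f"beta = {beta:.2e} 1/J")
```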
C. Photon Flux
Plugging equation 14 in for the probability per unit volume and equation 15 in
for the average time between collisions, and using equation 16 to eliminate β,
we get,
F = P / τ = √2 n³ Δx⁵ Δp³ v (3 / (4πm²v²))^(3/2)                   (17)
where F is the number of emitted photons per unit time (or photon flux) per unit volume emitted
by a DM halo. We can multiply through by a spherical volume of radius R to get the total photon
flux. Also, if we use the inverse square law of intensity at a distance d, then we can derive an
equation for the number of photons striking a unit area per unit time (or photon intensity),
I = F (4πR³/3) / (4πd²) = F R³ / (3d²)                             (18)
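The geometry of equation 18 can be sketched as follows; the flux density fed in here is a hypothetical placeholder value, not the TSC prediction, with the halo radius and distance taken from Table 2's Andromeda row:

```python
import math

def photon_intensity(flux_density, halo_radius, distance):
    """Equation (18): total flux F * (4/3) * pi * R^3 spread over a sphere of
    radius d, giving I = F * R^3 / (3 * d^2) in photons / (m^2 s)."""
    total_flux = flux_density * (4.0 / 3.0) * math.pi * halo_radius**3
    return total_flux / (4.0 * math.pi * distance**2)

# Andromeda-like geometry from Table 2, with a hypothetical flux density.
intensity = photon_intensity(1.0e7, 7.57e20, 2.37e22)
print(f"I = {intensity:.2e} photons / (m^2 s)")
```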
As a comparison, let us consider the visible photon flux that would be emitted
by the parent galaxy. If we know the apparent magnitude of the galaxy, then we
can calculate the visible photon intensity using the Pogson Equation in the
form [v],
I_v = I_s · 10^((m_s − m) / 2.5)                                   (19)
where I_s is the visible photon intensity of the Sun, and m_s and m are the
apparent magnitudes of the Sun and the target galaxy, respectively.
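The Pogson relation reduces to a one-liner; with the Table 3 inputs for Andromeda it reproduces the tabulated visible intensity to within the rounding of the magnitudes:

```python
# Pogson relation in ratio form, equation (19): Iv = Is * 10**((m_s - m) / 2.5).
def visible_intensity(i_sun, m_sun, m_galaxy):
    """Visible photon intensity of a galaxy of apparent magnitude m_galaxy."""
    return i_sun * 10.0 ** ((m_sun - m_galaxy) / 2.5)

# Andromeda, using the values listed in Table 3.
iv = visible_intensity(3.77e21, -26.7, 3.44)
print(f"Iv = {iv:.2e} photons / (m^2 s)")  # ~3.3e9; Table 3 lists 3.20E+09
```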
The calculated values of I and I_v for three galaxies, as well as the pertinent
parameters, are given in Table 2 and Table 3 respectively.
Object       Mass M (kg)   Radius R (m)   Distance to Earth d (m)   Number density n (m^-3)   Orbital speed v (m/s)   I (1/(m^2·s))
Andromeda    2.98E+42      7.57E+20       2.37E+22                  3.64E+12                  2.00E+05                4.67E+24
Triangulum   9.94E+40      4.73E+20       2.84E+22                  4.98E+11                  1.10E+05                6.71E+21
Virgo A      1.19E+43      1.54E+21       5.11E+23                  1.73E+12                  2.00E+05                9.03E+21
Table 2. Galaxy parameters were sourced from Wikipedia
Object       m      m_s     I_s at 550 nm (1/(m^2·s))   I_v at 550 nm (1/(m^2·s))
Andromeda    3.44   -26.7   3.77E+21                    3.20E+09
Triangulum   5.72   -26.7   3.77E+21                    3.91E+08
Virgo A      9.59   -26.7   3.77E+21                    1.11E+07
Table 3. Parameters sourced from Wikipedia (m_s and I_s are the Sun's values and apply to every row)
III. Conclusion
Comparing the photon intensities in Tables 2 and 3, the predicted DM glow is
some thirteen to fifteen orders of magnitude larger than the parent galaxy's
visible glow. If this model were true, then someone would have observed this
intense light by now, even accidentally.
Looking at equation 17 for qualitative reference, we can see the strong
dependence on the number density n, the cross-section Δx², and the geometry of
the halo. Since these are raised to second and third powers, varying them will
significantly alter the photon intensity. A true DM density profile, such as the
NFW profile, is much lower than the assumed homogeneous blob; it falls off as
the distance from the center of the halo increases. This would also force us to
consider a different halo geometry, as this profile tends to form DM webs or
strings rather than spherical clumps. Also, the probability of reaction was
assumed to be unity, but when comparing the reaction cross-sections of known
particles (which relate to the reaction probability), we can expect the true
reaction cross-section to be much smaller than Δx². Finally, although not as
critical, it might be worthwhile to utilize the formalism of Quantum Statistics.
Even with this leeway of more than ten orders of magnitude, and given that more
rigorous considerations should yield a lower DM photon intensity, the DM glow
could still be detectable with modern telescopes plus the correct monochromatic
filter. We now have some justification for searching out this glow, thereby
giving TSC its first empirical test.
ACKNOWLEDGMENTS
This research project was made possible by the support of the Texas A&M University –
Kingsville (TAMUK) and its physics faculty. In particular the guidance of Dr. Hewett kept the
author from being hopelessly lost in the dark. The discussions with Mr. Charles Allison, physics
lecturer at TAMUK, concerning astronomy and light were extremely helpful. The constructive
criticism posed by Dr. Daniel Vrinceanu, professor of physics at Texas Southern University, was
vital in discovering fallacies in the logic used. Finally, special thanks to Mr. Jesus Salas, whose
help early on in the project saved the author from pursuing an impossible alternative model.
REFERENCES
i. L. D. Hewett, Time-Symmetric Cosmology, Professor’s handout, (unpublished).
ii. R. J. GaBany, The Formation and Evolution of Galaxies,
(http://www.cosmotography.com/images/galaxy_formation_and_evolution.html).
iii. P. A. Tipler, R. A. Llewellyn, Modern Physics, 6th ed., (W. H. Freeman and Company,
NY, 2012), pp. 351 – 357, pp. 277 – 279, pp. 213 – 214.
iv. H. D. Young, R. A. Freedman, Sears and Zemansky’s University Physics, 12th ed.,
(Pearson Education, CA, 2008), pp. 611 – 625, pp. 629 – 631.
v. D. Scott Birney, Guillermo Gonzalez, David Oesper, Observational Astronomy, 2nd ed.,
(Cambridge University Press, NY, 2010), pp. 85 – 87.