We explain that black holes are the most efficient capacitors of quantum information. It is thereby expected that all sufficiently advanced civilizations ultimately employ black holes in their quantum computers. The accompanying Hawking radiation is democratic in particle species. Because of this, alien quantum computers will radiate in ordinary particles, such as neutrinos and photons, within the range of potential sensitivity of our detectors. This offers a new avenue for SETI, including the search for civilizations composed entirely of hidden particle species that interact with our world exclusively through gravity.
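To make the information-capacity claim concrete, the standard semiclassical formulas can be evaluated directly. The sketch below is a minimal illustration (not taken from the paper), computing the Hawking temperature and Bekenstein-Hawking entropy for an arbitrarily chosen black hole mass; the entropy in units of k_B, divided by ln 2, gives a rough qubit capacity.

```python
import math

# Physical constants (SI units)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s
k_B = 1.381e-23     # Boltzmann constant, J/K

def hawking_temperature(M):
    """Hawking temperature T = hbar c^3 / (8 pi G M k_B) of a Schwarzschild black hole."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def bekenstein_hawking_entropy(M):
    """Entropy S/k_B = A c^3 / (4 G hbar), with horizon area A = 16 pi G^2 M^2 / c^4."""
    A = 16 * math.pi * (G * M)**2 / c**4
    return A * c**3 / (4 * G * hbar)   # dimensionless (in units of k_B)

M = 1e12  # kg: an illustrative microscopic black hole mass, chosen arbitrarily
print(f"T_H ~ {hawking_temperature(M):.3e} K")      # ~1e11 K for this mass
print(f"S/k_B ~ {bekenstein_hawking_entropy(M):.3e} (qubit capacity ~ S / ln 2)")
```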
Life, the Universe, and SETI in a Nutshell - Sérgio Sacani
To date, no SETI observing program has succeeded in detecting any unambiguous evidence of an extraterrestrial technology. Regrettably, this paper will therefore not dazzle you with the analysis of the contents of any interstellar messages. However, as is appropriate for a plenary presentation, this paper does provide an update on the state of SETI programs worldwide. It discusses the various flavors of observational SETI projects currently on the air, plans for future instrumentation, recent attempts to proactively plan for success, and the prospects for future public/private partnerships to fund these efforts. The paper concludes with some tentative responses to the "What if everybody is listening, and nobody is transmitting?" query.
The general theory of space time, mass, energy, quantum gravity - Alexander Decker
The document discusses the relationships between various concepts in physics including general unified theory (GUT), space-time, mass-energy, quantum gravity, vacuum energy, and quantum fields. It explores how quantum computation may be possible using quantum discord rather than entanglement. Experiments showed that noisy, mixed quantum states could still enable computation through discord rather than requiring pristine entangled states. Theoretical work is ongoing to better understand how and when discord enables computation compared to entanglement.
Quantum Information Science and Quantum Neuroscience.ppt - Melanie Swan
This document summarizes a presentation on quantum neuroscience given by Melanie Swan. It discusses how quantum effects may be relevant to neuroscience, outlines various research topics within quantum neuroscience like imaging and protein folding, and describes mathematical approaches like wavefunctions and topological data analysis that are being applied. It also provides background on the levels of organization in the brain from the nervous system down to ion channels, and reviews the current status of the connectome and motor neuron mapping projects in different organisms. Finally, it discusses modeling of neural signaling across scales using techniques like partial differential equations.
1. Teleportation is the theoretical transfer of matter or objects from one point to another without traversing the physical space between them.
2. Quantum teleportation uses the phenomenon of quantum entanglement to transfer quantum information between particles. An experiment in 1998 successfully teleported photons over a short distance.
3. While human teleportation remains theoretical due to the complexity of the human body, quantum teleportation experiments provide a potential foundation for long-distance quantum communication networks.
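Point 2's protocol can be reproduced with a small state-vector simulation. The sketch below is illustrative only (plain numpy, my own qubit ordering): it teleports an arbitrary single-qubit state through a Bell pair using the standard circuit of a Bell-basis measurement followed by conditional X and Z corrections.

```python
import numpy as np

# Single-qubit gates
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
P0 = np.diag([1.0, 0.0])   # |0><0|
P1 = np.diag([0.0, 1.0])   # |1><1|

def op(gates):
    """Kronecker product of per-qubit operators; qubit 0 is the leftmost factor."""
    out = np.array([[1.0]])
    for g in gates:
        out = np.kron(out, g)
    return out

# Arbitrary state to teleport on qubit 0 (any normalized pair works)
alpha, beta = 0.6, 0.8
psi = np.kron([alpha, beta], [1, 0, 0, 0])          # qubits 1,2 start in |00>

# Prepare Bell pair on qubits 1-2: H on qubit 1, then CNOT 1->2
psi = op([I2, H, I2]) @ psi
psi = (op([I2, P0, I2]) + op([I2, P1, X])) @ psi

# Alice: CNOT 0->1, then H on qubit 0 (rotation into the Bell basis)
psi = (op([P0, I2, I2]) + op([P1, X, I2])) @ psi
psi = op([H, I2, I2]) @ psi

# Simulate measuring qubits 0 and 1 (sample an outcome, project, renormalize)
rng = np.random.default_rng(0)
outcome = rng.choice(8, p=np.abs(psi)**2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
psi = op([P1 if m0 else P0, P1 if m1 else P0, I2]) @ psi
psi /= np.linalg.norm(psi)

# Bob: conditional corrections X^m1 then Z^m0 on qubit 2
if m1: psi = op([I2, I2, X]) @ psi
if m0: psi = op([I2, I2, Z]) @ psi

# Qubit 2 now carries (alpha, beta); extract its amplitudes and check
bob = psi.reshape(2, 2, 2)[m0, m1, :]
print(np.allclose(bob, [alpha, beta]))   # True: the state was teleported
```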
Philosophy-aided Physics at the Boundary of Quantum-Classical Reality The philosophical themes of truth-knowledge and appearance-reality are used to interrogate the contemporary situation of the quantum-classical boundary, and more broadly the quantum-classical-relativistic stratification of physical scale boundaries. The contemporary moment finds us at breakneck pace in the industrial information revolution, digitizing remaining matter-based industries into a seamless exchange between physical-digital reality. Digitized news is giving way to digitized money and perhaps in the farther future, digitized mindfiles (such as personalized connectome files for precision medicine, autologous (own-DNA) stem cell therapies, and CRISPR for Alzheimer’s disease prevention). Our technologies are allowing us control over vast new domains, the relativistic with GPS and space-faring, and the quantum with quantum computing, harnessing the properties of superposition, entanglement, and interference. Philosophy provides critical thinking tools that can help us understand and master these rapid shifts in science and technology to avoid an Adornian instrumental reality (subsuming humanity under societal structures) and to maintain a Heideggerian backgrounded and enabling relation with technology (versus technology enframing us into mindless standing reserve).
The philosophical theme underlying the investigation of the scales of planets, persons, and particles is the relationship between truth and knowledge (or appearance and reality). The truth-knowledge problem is whether knowledge of the truth, true knowledge, the reality under the appearance, is even possible. Three salient moments in the history of the truth-knowledge problem are examined here. These are the German idealism of Kant and Hegel, the deconstructive postmodernism of Foucault and Derrida, and the unclear leanings of the current moment. The German idealism lens incorporates the self-knowing subject as agent into the truth and knowledge problem. The postmodernist view breaks with the subject and emphasizes the hidden opposites in the formulations, the constant reinterpretation of meaning, and porous boundaries. The contemporary moment wonders whether truth-knowledge boundaries still hold, in a Benjaminian view of non-identity between truth and knowledge, and truth increasingly being seen as a Foucauldian biopolitical manufactured quantity. Contemporaneity has a bimodal distribution of the subject: the hyperself (the constantly digitally represented selfie self) and the alienated post-subject subject.
These moments in the truth and knowledge debate inflect into the scale considerations of relativity, classicality, and quantum mechanics. Whereas general relativity and quantum mechanics are domains of universality, totality, and multiplicity, everyday classical reality is squeezed in as a belt between the two multiplicities as the concretion of drawing a triangle or tossing a ball. Recasting truth and knowledge...
M82 X-2 is the first pulsating ultraluminous X-ray source discovered. The luminosity of these extreme pulsars, if
isotropic, implies an extreme mass transfer rate. An alternative is to assume a much lower mass transfer rate, but
with an apparent luminosity boosted by geometrical beaming. Only an independent measurement of the mass
transfer rate can help discriminate between these two scenarios. In this paper, we follow the orbit of the neutron star
for 7 yr, measure the decay of the orbit (Ṗ_orb/P_orb ≈ −8 × 10⁻⁶ yr⁻¹), and argue that this orbital decay is driven by
extreme mass transfer of more than 150 times the mass transfer limit set by the Eddington luminosity. If this is true,
the mass available to the accretor is more than enough to justify its luminosity, with no need for beaming. This also
strongly favors models where the accretor is a highly magnetized neutron star.
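The quoted factor of 150 can be sanity-checked with textbook formulas. A rough sketch (my own arithmetic with assumed round-number neutron star parameters, not the paper's fit):

```python
import math

# Constants (cgs units)
G = 6.674e-8          # cm^3 g^-1 s^-2
c = 2.998e10          # cm/s
m_p = 1.673e-24       # proton mass, g
sigma_T = 6.652e-25   # Thomson cross-section, cm^2
M_sun = 1.989e33      # g
yr = 3.156e7          # s

M_ns = 1.4 * M_sun    # assumed neutron star mass
R_ns = 1.0e6          # assumed neutron star radius, cm (10 km)

# Eddington luminosity: radiation pressure on electrons balances gravity
L_edd = 4 * math.pi * G * M_ns * m_p * c / sigma_T       # erg/s

# Accretion luminosity onto a neutron star surface: L = G M Mdot / R
Mdot_edd = L_edd * R_ns / (G * M_ns)                      # g/s
print(f"L_Edd ~ {L_edd:.2e} erg/s")
print(f"Mdot_Edd ~ {Mdot_edd * yr / M_sun:.2e} M_sun/yr")
print(f"150 x Mdot_Edd ~ {150 * Mdot_edd * yr / M_sun:.2e} M_sun/yr")
```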
Constraints on the Universe as a Numerical Simulation - solodoe
This document discusses the possibility that our observed universe is a numerical simulation performed on a cubic space-time lattice. It first motivates this idea by extrapolating trends in computational resources needed for lattice QCD simulations into the future. It then investigates potentially observable consequences if the universe is an early simulation with unimproved Wilson fermion discretization. The most stringent bound on the inverse lattice spacing of the universe is derived from the high-energy cutoff of the cosmic ray spectrum. The numerical simulation scenario could be revealed in distributions of the highest energy cosmic rays exhibiting symmetry breaking reflecting the underlying lattice structure.
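The logic of the cosmic ray bound reduces to a one-line estimate. A hedged sketch with illustrative numbers (not the paper's exact derivation): a lattice of spacing b cuts the spectrum off near momentum π/b, so an observed cutoff near 10¹¹ GeV bounds b from above.

```python
import math

hbar_c = 0.1973e-15  # hbar*c in GeV*m (0.1973 GeV*fm)
E_cut = 1.0e11       # GeV: approximate high-energy (GZK-like) cosmic ray cutoff

# Requiring E_cut <= pi * hbar * c / b gives b <= pi * hbar * c / E_cut
b_max = math.pi * hbar_c / E_cut   # meters
print(f"b <~ {b_max:.2e} m")       # ~6e-27 m, far below nuclear scales
```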
1) Conventional computers will soon face fundamental limits to performance, but quantum computers based on molecules in a liquid may become powerful in the future.
2) Quantum computers encode information in quantum bits (qubits) that can exist in multiple states simultaneously, allowing immense parallel processing power.
3) Researchers have used nuclear magnetic resonance techniques to manipulate the quantum states of molecules in liquid and demonstrate basic quantum logic operations, taking an important step toward building a practical quantum computer.
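The parallelism claim in point 2 is easiest to see in state-vector form. A minimal numpy sketch (generic, not tied to the NMR experiments described): n Hadamard gates place an n-qubit register in an equal superposition of all 2^n basis states.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 3
state = np.zeros(2**n)
state[0] = 1.0                                  # start in |000>

# Apply H to every qubit: the register operator is H (x) H (x) H
Hn = np.array([[1.0]])
for _ in range(n):
    Hn = np.kron(Hn, H)
state = Hn @ state

# All 2^n amplitudes are now equal: one register "holds" every input at once
print(state)                                    # 8 entries, each 1/sqrt(8)
print(np.allclose(state, 1 / np.sqrt(2**n)))    # True
```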
1. The document introduces the Union-Dipole Theory (UDT), a proposed Theory of Everything based on the discovery of a fundamental building block particle called the Union-Dipole Particle (UDP).
2. The UDP consists of two empty spheres that are united and accentuate each other, existing within a permeable medium. The motion of this medium generates electric and magnetic disturbances that cause the UDP to move at light speed.
3. The UDT aims to unify all forces including gravity as one electromagnetic force. It provides explanations for mysteries in nature like photons, charges, and mass. The theory claims to offer a simple and consistent framework for understanding the universe.
This document discusses quantum teleportation. It begins by explaining the need for teleportation to reduce transportation time. It then describes how early theories of quantum teleportation failed due to limitations of measuring quantum states. The document outlines breakthroughs in 1998 when entanglement was used to teleport photons across laboratories. It discusses challenges in teleporting larger objects like atoms and humans. The document concludes by mentioning applications of quantum teleportation in cryptography and computing, as well as advantages like secure data transmission and reduced transportation costs.
The document discusses three proposed research projects:
1) Studying quantum phases such as supersolidity and quantum glass in ultra-cold atoms confined in optical lattices, to support future experiments.
2) Extending diagrammatic quantum Monte Carlo techniques to calculate properties of multi-species boson and fermion systems.
3) Investigating vector and chiral spin liquid phases in type-II multiferroic materials using computational and analytical approaches, to better understand and control their properties for applications.
Cryptography, entanglement, and quantum blocktime: Quantum computing offers a more scalable energy-efficient platform than classical computing and supercomputing, and corresponds more naturally to the three-dimensional structure of atomic reality. Blockchains are a decentralized digital economic system made possible by the 24-7 global nature of the internet.
[DSC Europe 23] Mila Pandurovic - Data science in high energy physics - DataScienceConferenc1
High energy physics experiments such as the currently running Large Hadron Collider (LHC) or the future collider experiments (CEPC, CLIC, ILC, FCC) rely strongly on data science. From the four LHC experiments alone, the CERN Data Centre stores more than thirty petabytes of data per year, and over a hundred petabytes are archived permanently. The collider experiments are characterized not only by the vast amount of data but also by the necessity for high-precision measurement and an unfavorable signal-to-background ratio, where tiny signals are buried under a huge pile of background events at a ratio of one per million or less. In Higgs physics, a special challenge is presented by studies with purely hadronic final states (jets), where the lack of sharp tagging variables leads to strenuous signal and background separation. The presentation will give an overview of the use of data science in Higgs boson physics at the future Circular Electron Positron Collider (CEPC), China.
The document discusses methods for studying quantum dynamics, localization, and quantum machine learning. It is divided into four parts. Part I develops numerical and analytical methods for studying complex quantum systems and their dynamics. Part II explores scenarios where dynamics is slow and information is localized, focusing on disorder-induced and kinetically constrained localization. Part III designs thermodynamic protocols to speed up thermalization while keeping dissipated work constant. Part IV examines intersections between tensor network methods and machine learning, applying techniques like neural network quantum states and tensor networks to problems in machine learning and approximating probability distributions.
Abstract: Dr. David Joseph Bohm was an American scientist who theorized quantum mechanics in the most ordinary and understandable way, in what is commonly referred to as the "pilot wave model". He also worked in neuropsychology and gave the holonomic model of the brain, affecting our view of quantum mechanics. His theories suggest that the phenomenon of "non-locality", or quantum entanglement, is due to the famous "frame dragging" phenomenon predicted by Albert Einstein's theory of relativity.
Bohm's theory also suggests that time does not exist in the way we think it does, as stated by the "Big Crunch" theory. According to it, time exists due to the interacting frequencies of waves arising from particle vibrations in space, and the universe never began.
In this paper, the existence of quantum entanglement is used to question the degree of correctness of the space-time fabric theory.
Continuum emission from within the plunging region of black hole discs - Sérgio Sacani
The thermal continuum emission observed from accreting black holes across X-ray bands has the potential to be leveraged as a
powerful probe of the mass and spin of the central black hole. The vast majority of existing ‘continuum fitting’ models neglect
emission sourced at and within the innermost stable circular orbit (ISCO) of the black hole. Numerical simulations, however,
find non-zero emission sourced from these regions. In this work, we extend existing techniques by including the emission
sourced from within the plunging region, utilizing new analytical models that reproduce the properties of numerical accretion
simulations. We show that in general the neglected intra-ISCO emission produces a hot-and-small quasi-blackbody component,
but can also produce a weak power-law tail for more extreme parameter regions. A similar hot-and-small blackbody component
has been added in by hand in an ad hoc manner to previous analyses of X-ray binary spectra. We show that the X-ray spectrum
of MAXI J1820+070 in a soft-state outburst is extremely well described by a full Kerr black hole disc, while conventional
models that neglect intra-ISCO emission are unable to reproduce the data. We believe this represents the first robust detection of
intra-ISCO emission in the literature, and allows additional constraints to be placed on the MAXI J1820+070 black hole spin,
which must be low (a• < 0.5) to allow a detectable intra-ISCO region. Emission from within the ISCO is the dominant emission
component in the MAXI J1820+070 spectrum between 6 and 10 keV, highlighting the necessity of including this region. Our
continuum fitting model is made publicly available.
Quantum communication and quantum computing - IOSR Journals
Abstract: The subject of quantum computing brings together ideas from classical information theory, computer
science, and quantum physics. This review aims to summarize not just quantum computing, but the whole
subject of quantum information theory. Information can be identified as the most general thing which must
propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics.
However, the mathematical treatment of information, especially information processing, is quite recent, dating
from the mid-20th century. This has meant that the full significance of information as a basic concept in physics
is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information
and computing puts this significance on a firm footing, and has led to some profound and exciting new insights
into the natural world. Among these are the use of quantum states to permit the secure transmission of classical
information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of
quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible
noise processes (quantum error correction), and the use of controlled quantum evolution for efficient
computation (quantum computation). The common theme of all these insights is the use of quantum
entanglement as a computational resource.
Keywords: quantum bits, quantum registers, quantum gates and quantum networks
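Of the insights listed above, quantum error correction is the most compact to demonstrate. The sketch below is the textbook three-qubit bit-flip code (a standard example, not drawn from this review): it encodes a logical qubit, injects a random X error, locates it from the Z⊗Z stabilizer expectation values, and recovers the state.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
P0 = np.diag([1.0, 0.0])
P1 = np.diag([0.0, 1.0])

def op(gates):
    """Kronecker product of per-qubit operators; qubit 0 is the leftmost factor."""
    out = np.array([[1.0]])
    for g in gates:
        out = np.kron(out, g)
    return out

# Encode a|0> + b|1>  ->  a|000> + b|111> via CNOT(0->1), CNOT(0->2)
a, b = 0.6, 0.8
psi = np.kron([a, b], [1, 0, 0, 0])
psi = (op([P0, I2, I2]) + op([P1, X, I2])) @ psi   # CNOT 0->1
psi = (op([P0, I2, I2]) + op([P1, I2, X])) @ psi   # CNOT 0->2

# Introduce a bit-flip error on a random qubit
err = int(np.random.default_rng(1).integers(3))
flip = [I2, I2, I2]; flip[err] = X
psi = op(flip) @ psi

# Syndrome: the corrupted codeword is a +/-1 eigenstate of Z0Z1 and Z1Z2
s01 = psi @ op([Z, Z, I2]) @ psi   # -1 iff exactly one of qubits 0,1 flipped
s12 = psi @ op([I2, Z, Z]) @ psi
loc = {(-1, 1): 0, (-1, -1): 1, (1, -1): 2}.get((int(round(s01)), int(round(s12))))
if loc is not None:                 # (+1,+1) would mean no error
    fix = [I2, I2, I2]; fix[loc] = X
    psi = op(fix) @ psi

# Decode (inverse CNOTs) and check that qubit 0 carries the logical state again
psi = (op([P0, I2, I2]) + op([P1, I2, X])) @ psi
psi = (op([P0, I2, I2]) + op([P1, X, I2])) @ psi
print(np.allclose(psi, np.kron([a, b], [1, 0, 0, 0])))  # True
```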
This document discusses quantum dots and quantum wires. It begins by providing a brief history of quantum dots, discovered in 1981 and coined in 1985. It describes their characteristics as semiconductor nanoparticles 1-100 atoms in diameter that exhibit quantum confinement. Applications include transistors, solar cells, LEDs and lasers. Health concerns relate to toxicity from cadmium. Quantum wires are nanowires that exhibit quantum effects, quantizing transverse energy. They may be used to link tiny circuit components. Various materials can be used including metals, semiconductors and insulators. Properties result from reducing dimensionality from quantum wells to wires to dots, decreasing degrees of freedom until full three-dimensional confinement in dots.
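The size-dependent behavior described here follows from quantum confinement, which an idealized particle-in-a-box model makes explicit. A toy sketch (infinite 1D well, not a realistic quantum dot Hamiltonian):

```python
h = 6.626e-34      # Planck constant, J s
m_e = 9.109e-31    # electron mass, kg
eV = 1.602e-19     # J per electronvolt

def E1(L):
    """Ground-state energy E = h^2 / (8 m L^2) of a particle in an infinite 1D well."""
    return h**2 / (8 * m_e * L**2)

for L_nm in (2, 5, 10):
    print(f"L = {L_nm:2d} nm -> E1 ~ {E1(L_nm * 1e-9) / eV:.3f} eV")
# Confinement energy scales as 1/L^2: smaller dots, larger gaps, bluer emission
```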
A Search for Technosignatures Around 11,680 Stars with the Green Bank Telesco... - Sérgio Sacani
We conducted a search for narrowband radio signals over four observing sessions in 2020–2023 with
the L-band receiver (1.15–1.73 GHz) of the 100 m diameter Green Bank Telescope. We pointed the
telescope in the directions of 62 TESS Objects of Interest, capturing radio emissions from a total of
∼11,860 stars and planetary systems in the ∼9 arcminute beam of the telescope. All detections were
either automatically rejected or visually inspected and confirmed to be of anthropogenic nature. In
this work, we also quantified the end-to-end efficiency of radio SETI pipelines with a signal injection
and recovery analysis. The UCLA SETI pipeline recovers 94.0% of the injected signals over the usable
frequency range of the receiver and 98.7% of the injections when regions of dense RFI are excluded. In
another pipeline that uses incoherent sums of 51 consecutive spectra, the recovery rate is ∼15 times
smaller at ∼6%. The pipeline efficiency affects SETI search volume calculations as well as calculations
of upper bounds on the number of transmitting civilizations. We developed an improved Drake Figure
of Merit for SETI search volume calculations that includes the pipeline efficiency and frequency drift
rate coverage. Based on our observations, we found that there is a high probability (94.0–98.7%) that
fewer than ∼0.014% of stars earlier than M8 within 100 pc host a transmitter that is detectable in
our search (EIRP > 10¹² W). Finally, we showed that the UCLA SETI pipeline natively detects the
signals detected with AI techniques by Ma et al. (2023).
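The efficiency-weighted figure of merit can be illustrated with a simple stand-in. The sketch below uses the classic Drake Figure of Merit (total bandwidth times sky coverage over sensitivity to the 3/2 power) with pipeline efficiency as a multiplicative factor; the paper's exact formulation, including drift-rate coverage, is not reproduced here, and the sky coverage and sensitivity values are placeholders.

```python
def dfm(bandwidth_hz, sky_coverage_sr, sensitivity_jy, eta=1.0):
    """Relative Drake Figure of Merit; eta = fraction of injected signals recovered."""
    return eta * bandwidth_hz * sky_coverage_sr / sensitivity_jy**1.5

# Same survey, two pipelines from the abstract: 94% vs ~6% recovery.
# Bandwidth is the receiver's 1.15-1.73 GHz; coverage and sensitivity are placeholders.
full = dfm(5.8e8, 2.0e-5, 10.0, eta=0.94)
inco = dfm(5.8e8, 2.0e-5, 10.0, eta=0.06)
print(f"efficiency-weighted DFM ratio ~ {full / inco:.0f}x")  # ~16x
```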
The document discusses the concept of teleportation technology. It defines teleportation as the transfer of physical objects from one place to another without physically moving the object. The document outlines several pioneering experiments in teleportation, including teleporting photons over short distances. It explains the key principles of quantum teleportation using entanglement theory and discusses challenges of teleporting larger objects like humans. While theoretically possible, the document notes that teleporting humans remains a challenge and is not yet feasible with today's technology.
This document numerically analyzes the wave function of atoms under the combined effects of an optical lattice trapping potential and a harmonic oscillator potential, as used in Bose-Einstein condensation experiments. It employs the Crank-Nicolson scheme to solve the Gross-Pitaevskii equation. The results show that the wave function distribution responds to parameters like the trapping frequencies ratio, optical lattice intensity, chemical potential, and energy. Careful adjustment of the time step and grid spacing is needed to satisfy conservation of norms and energy as required by the physical system. Distributions of the overlapping potentials for different q-factors are presented.
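The numerical scheme named here is standard, and a compact version fits in a few lines. The sketch below is my own simplified 1D variant (assumed harmonic-plus-lattice potential and parameters): a semi-implicit Crank-Nicolson step for the Gross-Pitaevskii equation with the nonlinear term frozen at the current density, which keeps each step unitary and the norm conserved.

```python
import numpy as np

# 1D GPE: i dpsi/dt = [-0.5 d^2/dx^2 + V(x) + g|psi|^2] psi   (hbar = m = 1)
N, L = 256, 20.0
x = np.linspace(-L/2, L/2, N)
dx, dt, g = x[1] - x[0], 1e-3, 1.0
V = 0.5 * x**2 + 5.0 * np.sin(4 * np.pi * x / L)**2   # harmonic trap + optical lattice

# Second-derivative matrix (Dirichlet boundaries)
D2 = (np.diag(np.ones(N-1), 1) - 2*np.eye(N) + np.diag(np.ones(N-1), -1)) / dx**2

psi = np.exp(-x**2 / 2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)            # normalize

for _ in range(1000):
    # Crank-Nicolson: (I + i dt/2 H) psi_new = (I - i dt/2 H) psi_old,
    # with the nonlinearity frozen at the current |psi|^2 (semi-implicit)
    Hmat = -0.5 * D2 + np.diag(V + g * np.abs(psi)**2)
    A = np.eye(N) + 0.5j * dt * Hmat
    B = np.eye(N) - 0.5j * dt * Hmat
    psi = np.linalg.solve(A, B @ psi)

print(np.sum(np.abs(psi)**2) * dx)   # norm stays ~1, as required by conservation
```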
The paper proposes a model of a unitary quantum field theory where the particle is represented as a wave packet. The frequency dispersion equation is chosen so that the packet periodically appears and disappears without changing its form. The envelope of the process is identified with a conventional wave function. Equation of such a field is nonlinear and relativistically invariant. With proper adjustments, they are reduced to Dirac, Schroedinger and Hamilton-Jacobi equations. A number of new experimental effects are predicted both for high and low energies.
The 2022 Nobel Prize in Physics was awarded to Alain Aspect, John Clauser, and Anton Zeilinger for their groundbreaking experimental work on quantum entanglement and violations of Bell's inequalities. John Clauser performed the first conclusive experiment in 1972 showing violations of Bell's inequalities. Alain Aspect then designed experiments in the 1980s enforcing stricter locality conditions. Anton Zeilinger demonstrated quantum teleportation in 1997 and performed another key Bell violation experiment in 1998. Together, their work confirmed the predictions of quantum mechanics and ruled out local hidden variable theories, resolving a decades-long debate between Einstein and Bohr. This established the foundations for the rapidly growing field of quantum information science.
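The Bell-inequality violations honored by the prize can be checked numerically. A short sketch (standard CHSH setup with my choice of optimal angles): the singlet state yields |S| = 2√2 ≈ 2.83, above the classical bound of 2.

```python
import numpy as np

# Singlet state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)

def meas(theta):
    """Spin measurement along an axis at angle theta in the x-z plane."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

def E(a, b):
    """Correlation <A(a) (x) B(b)> in the singlet state; equals -cos(a - b)."""
    return psi @ np.kron(meas(a), meas(b)) @ psi

# Optimal CHSH angles: a = 0, a' = pi/2, b = pi/4, b' = 3pi/4
a, a2, b, b2 = 0.0, np.pi/2, np.pi/4, 3*np.pi/4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2*np.sqrt(2))   # ~2.828 > 2: the classical bound is violated
```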
Superconducting qubits for quantum information: an outlook - Gabriel O'Brien
The document discusses the progress and future directions of quantum information processing using superconducting qubits. It describes the stages needed to build a functional quantum computer, from controlling individual qubits to implementing error correction. Superconducting qubits are well-suited for this task as their Hamiltonians can be designed using circuit elements like inductors and Josephson junctions. While full fault-tolerant quantum computing has yet to be achieved, the performance of superconducting qubits has improved dramatically in recent years, suggesting the goals may be within reach this century.
The binding of cosmological structures by massless topological defects - Sérgio Sacani
Assuming spherical symmetry and weak field, it is shown that if one solves the Poisson equation or the Einstein field
equations sourced by a topological defect, i.e. a singularity of a very specific form, the result is a localized gravitational
field capable of driving flat rotation (i.e. Keplerian circular orbits at a constant speed for all radii) of test masses on a thin
spherical shell without any underlying mass. Moreover, a large-scale structure which exploits this solution by assembling
concentrically a number of such topological defects can establish a flat stellar or galactic rotation curve, and can also deflect
light in the same manner as an equipotential (isothermal) sphere. Thus, the need for dark matter or modified gravity theory is
mitigated, at least in part.
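The flat-rotation claim corresponds to a logarithmic potential, which is quick to verify. A brief check using the standard Newtonian circular-orbit relation (illustrative velocity and radii):

```python
import numpy as np

# Circular orbits obey v_c(r)^2 = r * dPhi/dr. For Phi = v0^2 ln(r),
# dPhi/dr = v0^2 / r, so v_c = v0 at every radius: a flat rotation curve.
v0 = 220.0                      # km/s, a typical galactic rotation speed
r = np.linspace(1.0, 50.0, 6)   # kpc, illustrative radii

phi_grad = v0**2 / r            # dPhi/dr for the logarithmic potential
v_c = np.sqrt(r * phi_grad)
print(v_c)                      # [220. 220. ...] km/s at all radii
```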
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste... - Sérgio Sacani
Context. With a mass exceeding several 10⁴ M⊙ and a rich and dense population of massive stars, supermassive young star clusters
represent the most massive star-forming environment that is dominated by the feedback from massive stars and gravitational interactions
among stars.
Aims. In this paper we present the Extended Westerlund 1 and 2 Open Clusters Survey (EWOCS) project, which aims to investigate
the influence of the starburst environment on the formation of stars and planets, and on the evolution of both low and high mass stars.
The primary targets of this project are Westerlund 1 and 2, the closest supermassive star clusters to the Sun.
Methods. The project is based primarily on recent observations conducted with the Chandra and JWST observatories. Specifically,
the Chandra survey of Westerlund 1 consists of 36 new ACIS-I observations, nearly co-pointed, for a total exposure time of 1 Msec.
Additionally, we included 8 archival Chandra/ACIS-S observations. This paper presents the resulting catalog of X-ray sources within
and around Westerlund 1. Sources were detected by combining various existing methods, and photon extraction and source validation
were carried out using the ACIS-Extract software.
Results. The EWOCS X-ray catalog comprises 5963 validated sources out of the 9420 initially provided to ACIS-Extract, reaching a
photon flux threshold of approximately 2 × 10⁻⁸ photons cm⁻² s⁻¹. The X-ray sources exhibit a highly concentrated spatial distribution,
with 1075 sources located within the central 1 arcmin. We have successfully detected X-ray emissions from 126 out of the 166 known
massive stars of the cluster, and we have collected over 71 000 photons from the magnetar CXO J164710.20-455217.
The debris of the 'last major merger' is dynamically young - Sérgio Sacani
The Milky Way’s (MW) inner stellar halo contains an [Fe/H]-rich component with highly eccentric orbits, often referred to as the
‘last major merger.’ Hypotheses for the origin of this component include Gaia-Sausage/Enceladus (GSE), where the progenitor
collided with the MW proto-disc 8–11 Gyr ago, and the Virgo Radial Merger (VRM), where the progenitor collided with the
MW disc within the last 3 Gyr. These two scenarios make different predictions about observable structure in local phase space,
because the morphology of debris depends on how long it has had to phase mix. The recently identified phase-space folds in Gaia
DR3 have positive caustic velocities, making them fundamentally different than the phase-mixed chevrons found in simulations
at late times. Roughly 20 per cent of the stars in the prograde local stellar halo are associated with the observed caustics. Based
on a simple phase-mixing model, the observed number of caustics is consistent with a merger that occurred 1–2 Gyr ago.
We also compare the observed phase-space distribution to FIRE-2 Latte simulations of GSE-like mergers, using a quantitative
measurement of phase mixing (2D causticality). The observed local phase-space distribution best matches the simulated data
1–2 Gyr after collision, and certainly not later than 3 Gyr. This is further evidence that the progenitor of the ‘last major merger’
did not collide with the MW proto-disc at early times, as is thought for the GSE, but instead collided with the MW disc within
the last few Gyr, consistent with the body of work surrounding the VRM.
Observation of Io's Resurfacing via Plume Deposition Using Ground-based Adapt... - Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes
on Io’s surface have been monitored from both spacecraft and ground-based telescopes.
Here, we present the highest spatial resolution images of Io ever obtained from a ground-based telescope. These images, acquired by the SHARK-VIS instrument on the Large
Binocular Telescope, show evidence of a major resurfacing event on Io’s trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images
show that a plume deposit from a powerful eruption at Pillan Patera has covered part
of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high resolution imaging of Io’s surface using adaptive
optics at visible wavelengths.
Earliest Galaxies in the JADES Origins Field: Luminosity Function and Cosmic ... - Sérgio Sacani
We characterize the earliest galaxy population in the JADES Origins Field (JOF), the deepest
imaging field observed with JWST. We make use of the ancillary Hubble optical images (5 filters
spanning 0.4−0.9µm) and novel JWST images with 14 filters spanning 0.8−5µm, including 7 medium-band filters, and reaching total exposure times of up to 46 hours per filter. We combine all our data
at > 2.3µm to construct an ultradeep image, reaching as deep as ≈ 31.4 AB mag in the stack and
30.3-31.0 AB mag (5σ, r = 0.1” circular aperture) in individual filters. We measure photometric
redshifts and use robust selection criteria to identify a sample of eight galaxy candidates at redshifts
z = 11.5−15. These objects show compact half-light radii of R1/2 ∼ 50−200 pc, stellar masses of
M⋆ ∼ 10⁷−10⁸ M⊙, and star-formation rates of SFR ∼ 0.1−1 M⊙ yr⁻¹. Our search finds no candidates
at 15 < z < 20, placing upper limits at these redshifts. We develop a forward modeling approach to
infer the properties of the evolving luminosity function without binning in redshift or luminosity that
marginalizes over the photometric redshift uncertainty of our candidate galaxies and incorporates the
impact of non-detections. We find a z = 12 luminosity function in good agreement with prior results,
and that the luminosity function normalization and UV luminosity density decline by a factor of ∼ 2.5
from z = 12 to z = 14. We discuss the possible implications of our results in the context of theoretical
models for evolution of the dark matter halo mass function.
THE IMPORTANCE OF MARTIAN ATMOSPHERE SAMPLE RETURN - Sérgio Sacani
The return of a sample of near-surface atmosphere from Mars would facilitate answers to several first-order science questions surrounding the formation and evolution of the planet. One of the important aspects of terrestrial planet formation in general is the role that primary atmospheres played in influencing the chemistry and structure of the planets and their antecedents. Studies of the martian atmosphere can be used to investigate the role of a primary atmosphere in its history. Atmosphere samples would also inform our understanding of the near-surface chemistry of the planet, and ultimately the prospects for life. High-precision isotopic analyses of constituent gases are needed to address these questions, requiring that the analyses are made on returned samples rather than in situ.
Multi-source connectivity as the driver of solar wind variability in the heli... - Sérgio Sacani
The ambient solar wind that fills the heliosphere originates from multiple
sources in the solar corona and is highly structured. It is often described
as high-speed, relatively homogeneous, plasma streams from coronal
holes and slow-speed, highly variable, streams whose source regions are
under debate. A key goal of ESA/NASA's Solar Orbiter mission is to identify
solar wind sources and understand what drives the complexity seen in the
heliosphere. By combining magnetic field modelling and spectroscopic
techniques with high-resolution observations and measurements, we show
that the solar wind variability detected in situ by Solar Orbiter in March
2022 is driven by spatio-temporal changes in the magnetic connectivity to
multiple sources in the solar atmosphere. The magnetic field footpoints
connected to the spacecraft moved from the boundaries of a coronal hole
to one active region (12961) and then across to another region (12957). This
is reflected in the in situ measurements, which show the transition from fast
to highly Alfvénic then to slow solar wind that is disrupted by the arrival of
a coronal mass ejection. Our results describe solar wind variability at 0.5 au
but are applicable to near-Earth observatories.
Gliese 12 b: A Temperate Earth-sized Planet at 12 pc Ideal for Atmospheric Tr... - Sérgio Sacani
Recent discoveries of Earth-sized planets transiting nearby M dwarfs have made it possible to characterize the
atmospheres of terrestrial planets via follow-up spectroscopic observations. However, the number of such planets
receiving low insolation is still small, limiting our ability to understand the diversity of the atmospheric
composition and climates of temperate terrestrial planets. We report the discovery of an Earth-sized planet
transiting the nearby (12 pc) inactive M3.0 dwarf Gliese 12 (TOI-6251) with an orbital period (Porb) of 12.76 days.
The planet, Gliese 12 b, was initially identified as a candidate with an ambiguous Porb from TESS data. We
confirmed the transit signal and Porb using ground-based photometry with MuSCAT2 and MuSCAT3, and
validated the planetary nature of the signal using high-resolution images from Gemini/NIRI and Keck/NIRC2 as
well as radial velocity (RV) measurements from the InfraRed Doppler instrument on the Subaru 8.2 m telescope
and from CARMENES on the CAHA 3.5 m telescope. X-ray observations with XMM-Newton showed the host
star is inactive, with an X-ray-to-bolometric luminosity ratio of log 5.7 L L X bol » - . Joint analysis of the light
curves and RV measurements revealed that Gliese 12 b has a radius of 0.96 ± 0.05 R⊕,a3σ mass upper limit of
3.9 M⊕, and an equilibrium temperature of 315 ± 6 K assuming zero albedo. The transmission spectroscopy metric
(TSM) value of Gliese 12 b is close to the TSM values of the TRAPPIST-1 planets, adding Gliese 12 b to the small
list of potentially terrestrial, temperate planets amenable to atmospheric characterization with JWST.
Gliese 12 b, a temperate Earth-sized planet at 12 parsecs discovered with TES...Sérgio Sacani
We report on the discovery of Gliese 12 b, the nearest transiting temperate, Earth-sized planet found to date. Gliese 12 is a
bright (V = 12.6 mag, K = 7.8 mag) metal-poor M4V star only 12.162 ± 0.005 pc away from the Solar system with one of the
lowest stellar activity levels known for M-dwarfs. A planet candidate was detected by TESS based on only 3 transits in sectors
42, 43, and 57, with an ambiguity in the orbital period due to observational gaps. We performed follow-up transit observations
with CHEOPS and ground-based photometry with MINERVA-Australis, SPECULOOS, and Purple Mountain Observatory,
as well as further TESS observations in sector 70. We statistically validate Gliese 12 b as a planet with an orbital period of
12.76144 ± 0.00006 d and a radius of 1.0 ± 0.1 R⊕, resulting in an equilibrium temperature of ∼315 K. Gliese 12 b has excellent
future prospects for precise mass measurement, which may inform how planetary internal structure is affected by the stellar
compositional environment. Gliese 12 b also represents one of the best targets to study whether Earth-like planets orbiting cool
stars can retain their atmospheres, a crucial step to advance our understanding of habitability on Earth and across the galaxy.
The importance of continents, oceans and plate tectonics for the evolution of...Sérgio Sacani
Within the uncertainties of involved astronomical and biological parameters, the Drake Equation
typically predicts that there should be many exoplanets in our galaxy hosting active, communicative
civilizations (ACCs). These optimistic calculations are however not supported by evidence, which is
often referred to as the Fermi Paradox. Here, we elaborate on this long-standing enigma by showing
the importance of planetary tectonic style for biological evolution. We summarize growing evidence
that a prolonged transition from Mesoproterozoic active single lid tectonics (1.6 to 1.0 Ga) to modern
plate tectonics occurred in the Neoproterozoic Era (1.0 to 0.541 Ga), which dramatically accelerated
emergence and evolution of complex species. We further suggest that both continents and oceans
are required for ACCs because early evolution of simple life must happen in water but late evolution
of advanced life capable of creating technology must happen on land. We resolve the Fermi Paradox
(1) by adding two additional terms to the Drake Equation: foc
(the fraction of habitable exoplanets
with significant continents and oceans) and fpt
(the fraction of habitable exoplanets with significant
continents and oceans that have had plate tectonics operating for at least 0.5 Ga); and (2) by
demonstrating that the product of foc
and fpt
is very small (< 0.00003–0.002). We propose that the lack
of evidence for ACCs reflects the scarcity of long-lived plate tectonics and/or continents and oceans on
exoplanets with primitive life.
A Giant Impact Origin for the First Subduction on EarthSérgio Sacani
Hadean zircons provide a potential record of Earth's earliest subduction 4.3 billion years ago. Itremains enigmatic how subduction could be initiated so soon after the presumably Moon‐forming giant impact(MGI). Earlier studies found an increase in Earth's core‐mantle boundary (CMB) temperature due to theaccumulation of the impactor's core, and our recent work shows Earth's lower mantle remains largely solid, withsome of the impactor's mantle potentially surviving as the large low‐shear velocity provinces (LLSVPs). Here,we show that a hot post‐impact CMB drives the initiation of strong mantle plumes that can induce subductioninitiation ∼200 Myr after the MGI. 2D and 3D thermomechanical computations show that a high CMBtemperature is the primary factor triggering early subduction, with enrichment of heat‐producing elements inLLSVPs as another potential factor. The models link the earliest subduction to the MGI with implications forunderstanding the diverse tectonic regimes of rocky planets.
Climate extremes likely to drive land mammal extinction during next supercont...Sérgio Sacani
Mammals have dominated Earth for approximately 55 Myr thanks to their
adaptations and resilience to warming and cooling during the Cenozoic. All
life will eventually perish in a runaway greenhouse once absorbed solar
radiation exceeds the emission of thermal radiation in several billions of
years. However, conditions rendering the Earth naturally inhospitable to
mammals may develop sooner because of long-term processes linked to
plate tectonics (short-term perturbations are not considered here). In
~250 Myr, all continents will converge to form Earth’s next supercontinent,
Pangea Ultima. A natural consequence of the creation and decay of Pangea
Ultima will be extremes in pCO2 due to changes in volcanic rifting and
outgassing. Here we show that increased pCO2, solar energy (F⨀;
approximately +2.5% W m−2 greater than today) and continentality (larger
range in temperatures away from the ocean) lead to increasing warming
hostile to mammalian life. We assess their impact on mammalian
physiological limits (dry bulb, wet bulb and Humidex heat stress indicators)
as well as a planetary habitability index. Given mammals’ continued survival,
predicted background pCO2 levels of 410–816 ppm combined with increased
F⨀ will probably lead to a climate tipping point and their mass extinction.
The results also highlight how global landmass configuration, pCO2 and F⨀
play a critical role in planetary habitability.
Constraints on Neutrino Natal Kicks from Black-Hole Binary VFTS 243Sérgio Sacani
The recently reported observation of VFTS 243 is the first example of a massive black-hole binary
system with negligible binary interaction following black-hole formation. The black-hole mass (≈10M⊙)
and near-circular orbit (e ≈ 0.02) of VFTS 243 suggest that the progenitor star experienced complete
collapse, with energy-momentum being lost predominantly through neutrinos. VFTS 243 enables us to
constrain the natal kick and neutrino-emission asymmetry during black-hole formation. At 68% confidence
level, the natal kick velocity (mass decrement) is ≲10 km=s (≲1.0M⊙), with a full probability distribution
that peaks when ≈0.3M⊙ were ejected, presumably in neutrinos, and the black hole experienced a natal
kick of 4 km=s. The neutrino-emission asymmetry is ≲4%, with best fit values of ∼0–0.2%. Such a small
neutrino natal kick accompanying black-hole formation is in agreement with theoretical predictions.
Detectability of Solar Panels as a TechnosignatureSérgio Sacani
In this work, we assess the potential detectability of solar panels made of silicon on an Earth-like
exoplanet as a potential technosignature. Silicon-based photovoltaic cells have high reflectance in the
UV-VIS and in the near-IR, within the wavelength range of a space-based flagship mission concept
like the Habitable Worlds Observatory (HWO). Assuming that only solar energy is used to provide
the 2022 human energy needs with a land cover of ∼ 2.4%, and projecting the future energy demand
assuming various growth-rate scenarios, we assess the detectability with an 8 m HWO-like telescope.
Assuming the most favorable viewing orientation, and focusing on the strong absorption edge in the
ultraviolet-to-visible (0.34 − 0.52 µm), we find that several 100s of hours of observation time is needed
to reach a SNR of 5 for an Earth-like planet around a Sun-like star at 10pc, even with a solar panel
coverage of ∼ 23% land coverage of a future Earth. We discuss the necessity of concepts like Kardeshev
Type I/II civilizations and Dyson spheres, which would aim to harness vast amounts of energy. Even
with much larger populations than today, the total energy use of human civilization would be orders of
magnitude below the threshold for causing direct thermal heating or reaching the scale of a Kardashev
Type I civilization. Any extraterrrestrial civilization that likewise achieves sustainable population
levels may also find a limit on its need to expand, which suggests that a galaxy-spanning civilization
as imagined in the Fermi paradox may not exist.
Jet reorientation in central galaxies of clusters and groups: insights from V...Sérgio Sacani
Recent observations of galaxy clusters and groups with misalignments between their central AGN jets
and X-ray cavities, or with multiple misaligned cavities, have raised concerns about the jet – bubble
connection in cooling cores, and the processes responsible for jet realignment. To investigate the
frequency and causes of such misalignments, we construct a sample of 16 cool core galaxy clusters and
groups. Using VLBA radio data we measure the parsec-scale position angle of the jets, and compare
it with the position angle of the X-ray cavities detected in Chandra data. Using the overall sample
and selected subsets, we consistently find that there is a 30% – 38% chance to find a misalignment
larger than ∆Ψ = 45◦ when observing a cluster/group with a detected jet and at least one cavity. We
determine that projection may account for an apparently large ∆Ψ only in a fraction of objects (∼35%),
and given that gas dynamical disturbances (as sloshing) are found in both aligned and misaligned
systems, we exclude environmental perturbation as the main driver of cavity – jet misalignment.
Moreover, we find that large misalignments (up to ∼ 90◦
) are favored over smaller ones (45◦ ≤ ∆Ψ ≤
70◦
), and that the change in jet direction can occur on timescales between one and a few tens of Myr.
We conclude that misalignments are more likely related to actual reorientation of the jet axis, and we
discuss several engine-based mechanisms that may cause these dramatic changes.
The solar dynamo begins near the surfaceSérgio Sacani
The magnetic dynamo cycle of the Sun features a distinct pattern: a propagating
region of sunspot emergence appears around 30° latitude and vanishes near the
equator every 11 years (ref. 1). Moreover, longitudinal flows called torsional oscillations
closely shadow sunspot migration, undoubtedly sharing a common cause2. Contrary
to theories suggesting deep origins of these phenomena, helioseismology pinpoints
low-latitude torsional oscillations to the outer 5–10% of the Sun, the near-surface
shear layer3,4. Within this zone, inwardly increasing differential rotation coupled with
a poloidal magnetic field strongly implicates the magneto-rotational instability5,6,
prominent in accretion-disk theory and observed in laboratory experiments7.
Together, these two facts prompt the general question: whether the solar dynamo is
possibly a near-surface instability. Here we report strong affirmative evidence in stark
contrast to traditional models8 focusing on the deeper tachocline. Simple analytic
estimates show that the near-surface magneto-rotational instability better explains
the spatiotemporal scales of the torsional oscillations and inferred subsurface
magnetic field amplitudes9. State-of-the-art numerical simulations corroborate these
estimates and reproduce hemispherical magnetic current helicity laws10. The dynamo
resulting from a well-understood near-surface phenomenon improves prospects
for accurate predictions of full magnetic cycles and space weather, affecting the
electromagnetic infrastructure of Earth.
Extensive Pollution of Uranus and Neptune’s Atmospheres by Upsweep of Icy Mat...Sérgio Sacani
In the Nice model of solar system formation, Uranus and Neptune undergo an orbital upheaval,
sweeping through a planetesimal disk. The region of the disk from which material is accreted by
the ice giants during this phase of their evolution has not previously been identified. We perform
direct N-body orbital simulations of the four giant planets to determine the amount and origin of solid
accretion during this orbital upheaval. We find that the ice giants undergo an extreme bombardment
event, with collision rates as much as ∼3 per hour assuming km-sized planetesimals, increasing the
total planet mass by up to ∼0.35%. In all cases, the initially outermost ice giant experiences the
largest total enhancement. We determine that for some plausible planetesimal properties, the resulting
atmospheric enrichment could potentially produce sufficient latent heat to alter the planetary cooling
timescale according to existing models. Our findings suggest that substantial accretion during this
phase of planetary evolution may have been sufficient to impact the atmospheric composition and
thermal evolution of the ice giants, motivating future work on the fate of deposited solid material.
Exomoons & Exorings with the Habitable Worlds Observatory I: On the Detection...Sérgio Sacani
The highest priority recommendation of the Astro2020 Decadal Survey for space-based astronomy
was the construction of an observatory capable of characterizing habitable worlds. In this paper series
we explore the detectability of and interference from exomoons and exorings serendipitously observed
with the proposed Habitable Worlds Observatory (HWO) as it seeks to characterize exoplanets, starting
in this manuscript with Earth-Moon analog mutual events. Unlike transits, which only occur in systems
viewed near edge-on, shadow (i.e., solar eclipse) and lunar eclipse mutual events occur in almost every
star-planet-moon system. The cadence of these events can vary widely from ∼yearly to multiple events
per day, as was the case in our younger Earth-Moon system. Leveraging previous space-based (EPOXI)
lightcurves of a Moon transit and performance predictions from the LUVOIR-B concept, we derive
the detectability of Moon analogs with HWO. We determine that Earth-Moon analogs are detectable
with observation of ∼2-20 mutual events for systems within 10 pc, and larger moons should remain
detectable out to 20 pc. We explore the extent to which exomoon mutual events can mimic planet
features and weather. We find that HWO wavelength coverage in the near-IR, specifically in the 1.4 µm
water band where large moons can outshine their host planet, will aid in differentiating exomoon signals
from exoplanet variability. Finally, we predict that exomoons formed through collision processes akin
to our Moon are more likely to be detected in younger systems, where shorter orbital periods and
favorable geometry enhance the probability and frequency of mutual events.
Emergent ribozyme behaviors in oxychlorine brines indicate a unique niche for...Sérgio Sacani
Mars is a particularly attractive candidate among known astronomical objects
to potentially host life. Results from space exploration missions have provided
insights into Martian geochemistry that indicate oxychlorine species, particularly perchlorate, are ubiquitous features of the Martian geochemical landscape. Perchlorate presents potential obstacles for known forms of life due to
its toxicity. However, it can also provide potential benefits, such as producing
brines by deliquescence, like those thought to exist on present-day Mars. Here
we show perchlorate brines support folding and catalysis of functional RNAs,
while inactivating representative protein enzymes. Additionally, we show
perchlorate and other oxychlorine species enable ribozyme functions,
including homeostasis-like regulatory behavior and ribozyme-catalyzed
chlorination of organic molecules. We suggest nucleic acids are uniquely wellsuited to hypersaline Martian environments. Furthermore, Martian near- or
subsurface oxychlorine brines, and brines found in potential lifeforms, could
provide a unique niche for biomolecular evolution.
WASP-69b’s Escaping Envelope Is Confined to a Tail Extending at Least 7 RpSérgio Sacani
Studying the escaping atmospheres of highly irradiated exoplanets is critical for understanding the physical
mechanisms that shape the demographics of close-in planets. A number of planetary outflows have been observed
as excess H/He absorption during/after transit. Such an outflow has been observed for WASP-69b by multiple
groups that disagree on the geometry and velocity structure of the outflow. Here, we report the detection of this
planet’s outflow using Keck/NIRSPEC for the first time. We observed the outflow 1.28 hr after egress until the
target set, demonstrating the outflow extends at least 5.8 × 105 km or 7.5 Rp This detection is significantly longer
than previous observations, which report an outflow extending ∼2.2 planet radii just 1 yr prior. The outflow is
blueshifted by −23 km s−1 in the planetary rest frame. We estimate a current mass-loss rate of 1 M⊕ Gyr−1
. Our
observations are most consistent with an outflow that is strongly sculpted by ram pressure from the stellar wind.
However, potential variability in the outflow could be due to time-varying interactions with the stellar wind or
differences in instrumental precision.
hematic appreciation test is a psychological assessment tool used to measure an individual's appreciation and understanding of specific themes or topics. This test helps to evaluate an individual's ability to connect different ideas and concepts within a given theme, as well as their overall comprehension and interpretation skills. The results of the test can provide valuable insights into an individual's cognitive abilities, creativity, and critical thinking skills
Authoring a personal GPT for your research and practice: How we created the Q...Leonel Morgado
Thematic analysis in qualitative research is a time-consuming and systematic task, typically done using teams. Team members must ground their activities on common understandings of the major concepts underlying the thematic analysis, and define criteria for its development. However, conceptual misunderstandings, equivocations, and lack of adherence to criteria are challenges to the quality and speed of this process. Given the distributed and uncertain nature of this process, we wondered if the tasks in thematic analysis could be supported by readily available artificial intelligence chatbots. Our early efforts point to potential benefits: not just saving time in the coding process but better adherence to criteria and grounding, by increasing triangulation between humans and artificial intelligence. This tutorial will provide a description and demonstration of the process we followed, as two academic researchers, to develop a custom ChatGPT to assist with qualitative coding in the thematic data analysis process of immersive learning accounts in a survey of the academic literature: QUAL-E Immersive Learning Thematic Analysis Helper. In the hands-on time, participants will try out QUAL-E and develop their ideas for their own qualitative coding ChatGPT. Participants that have the paid ChatGPT Plus subscription can create a draft of their assistants. The organizers will provide course materials and slide deck that participants will be able to utilize to continue development of their custom GPT. The paid subscription to ChatGPT Plus is not required to participate in this workshop, just for trying out personal GPTs during it.
ESR spectroscopy in liquid food and beverages.pptxPRIYANKA PATEL
With increasing population, people need to rely on packaged food stuffs. Packaging of food materials requires the preservation of food. There are various methods for the treatment of food to preserve them and irradiation treatment of food is one of them. It is the most common and the most harmless method for the food preservation as it does not alter the necessary micronutrients of food materials. Although irradiated food doesn’t cause any harm to the human health but still the quality assessment of food is required to provide consumers with necessary information about the food. ESR spectroscopy is the most sophisticated way to investigate the quality of the food and the free radicals induced during the processing of the food. ESR spin trapping technique is useful for the detection of highly unstable radicals in the food. The antioxidant capability of liquid food and beverages in mainly performed by spin trapping technique.
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdfSelcen Ozturkcan
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
ESA/ACT Science Coffee: Diego Blas - Gravitational wave detection with orbita...Advanced-Concepts-Team
Presentation in the Science Coffee of the Advanced Concepts Team of the European Space Agency on the 07.06.2024.
Speaker: Diego Blas (IFAE/ICREA)
Title: Gravitational wave detection with orbital motion of Moon and artificial
Abstract:
In this talk I will describe some recent ideas to find gravitational waves from supermassive black holes or of primordial origin by studying their secular effect on the orbital motion of the Moon or satellites that are laser ranged.
The technology uses reclaimed CO₂ as the dyeing medium in a closed loop process. When pressurized, CO₂ becomes supercritical (SC-CO₂). In this state CO₂ has a very high solvent power, allowing the dye to dissolve easily.
PPT on Direct Seeded Rice presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' workshop on April 22, 2024.
aziz sancar nobel prize winner: from mardin to nobel
arXiv:2301.09575v1 [physics.pop-ph] 19 Jan 2023
Draft version January 24, 2023
Typeset using LaTeX twocolumn style in AASTeX63
Black holes as tools for quantum computing by advanced extraterrestrial civilizations
Gia Dvali (1,2) and Zaza N. Osmanov (3,4)
(1) Arnold Sommerfeld Center, Ludwig Maximilians University, Theresienstraße 37, 80333 Munich, Germany
(2) Max Planck Institute for Physics, Föhringer Ring 6, 80805 Munich, Germany
(3) School of Physics, Free University of Tbilisi, 0183 Tbilisi, Georgia
(4) E. Kharadze Georgian National Astrophysical Observatory, Abastumani 0301, Georgia
Corresponding authors: Gia Dvali (gdvali@mpp.mpg.de), Zaza Osmanov (z.osmanov@freeuni.edu.ge)
(Received June 1, 2019; Revised January 10, 2019; Accepted January 24, 2023)
ABSTRACT
We explain that black holes are the most efficient capacitors of quantum information. It is thereby
expected that all sufficiently advanced civilizations ultimately employ black holes in their quantum
computers. The accompanying Hawking radiation is democratic in particle species. Due to this,
the alien quantum computers will radiate in ordinary particles such as neutrinos and photons within
the range of potential sensitivity of our detectors. This offers a new avenue for SETI, including
the civilizations entirely composed of hidden particles species interacting with our world exclusively
through gravity.
Keywords: Technosignatures — SETI — Astrobiology
1. INTRODUCTION
The search for extraterrestrial intelligence (SETI) is one of the longstanding problems of humanity. It is clear that any technological civilization most probably possesses certain technosignatures that might be potentially detectable by our facilities. The first SETI experiments started in the last century, in the framework of radio listening to stars (Tarter 2001). Nowadays, the most advanced instrument is the five-hundred-meter aperture spherical radio telescope FAST, which is dedicated to observing the sky, and one of whose missions is to search for radio messages from extraterrestrial civilizations (Lu et al. 2020). In 1960 Freeman Dyson proposed the revolutionary idea of searching for megastructures visible in the infrared spectrum (Dyson 1960). He proposed that if an advanced alien civilization is capable of constructing a spherical shell around its host star to utilise the star's total emitted energy, the megastructure, having a radius of the order of 1 AU, will be visible in the infrared spectral band. This idea has been revived thanks to the discovery of Tabby's star
(Boyajian et al. 2016), characterized by flux dips of the order of 20% that automatically exclude the possibility of a planet shading the star. This stimulated further research, and a series of papers have been published considering Fermi's paradox (Osmanov 2021a), planetary megastructures (Osmanov 2021b), the stability of Dyson spheres (DS) (Wright 2020), the extension to hot DSs (Osmanov & Berezhiani 2018, 2019), and megastructures constructed around black holes (Hsiao et al. 2022), white dwarfs (Zuckerman 2022) and pulsars (Osmanov 2016, 2018).
Since the search for technosignatures is complex, the best strategy for SETI is to test all possible channels. In light of this paradigm, it is worth noting that, on average, solar-type stars in the Milky Way are much older than the Sun. Therefore, potentially detected technosignatures will belong to advanced alien societies.
It is natural to assume that the technological advancement of any civilization is directly correlated with the level of its computational performance. This is fully confirmed by our own history. Here, we wish to point out that there exist certain universal markers of computational advancement which can be used as new signatures in SETI.
2. 2 Dvali & Osmanov
In particular, in the present paper we wish to argue
that any highly advanced civilization is expected to em-
ploy black holes for the maximal efficiency of their quan-
tum computations. The optimization between the infor-
mation storage capacity and the time scale of its process-
ing suggests that a substantial fraction of the exploited
black holes must be the source of highly energetic Hawk-
ing radiation.
This gives new ways for the SETI. It also provides
us with new criteria for constraining the parameters of
highly advanced civilizations.
The logic flow of the paper is as follows. We first introduce the current framework of fundamental physics and justify why the most basic laws of quantum theory and gravity must be shared by all ETI. This applies even to civilizations that may be entirely composed of yet undiscovered elementary particle species and experience some unknown interactions.
We then argue that for all civilizations, regardless of their composition, black holes are the most efficient storers of quantum information. This does not exclude the parallel use of other hypothetical objects which, within their particle sector, maximize the information storage capacity, the so-called “saturons” (Dvali 2021). Following the previous paper, we explain that among all such objects, black holes are the most efficient and universal capacitors of quantum information.
Our conclusions are based on recent advancements in the development of a microscopic theory of a black hole (Dvali & Gomez 2011; Dvali & Gomez 2012; Dvali et al. 2013), as well as on an understanding of the universal mechanisms of enhanced information capacity (Dvali 2017a,b,c; Dvali, Michel & Zell 2018; Dvali & Panchenko 2016; Dvali 2018; Dvali et al. 2020) and on using them for “black hole inspired” quantum computing in prototype systems (Dvali & Panchenko 2016).
Especially important is the knowledge that, among objects saturating the universal quantum field theoretic bound on microstate entropy, black holes possess the maximal capacity (Dvali 2021).
Distilling the above knowledge, we argue that black-hole-based computer technologies represent universal attractor points on the advancement path of any sufficiently long-lived civilization.
Relying solely on the laws of physics that must be shared by all ETI, we draw some very general conclusions about their usage of black hole computing. The optimization of information storage capacity and the time of its retrieval suggests that ETI likely invest their energy in manufacturing a high multiplicity of microscopic black holes, as opposed to a few large ones. Correspondingly, their computers are likely to produce Hawking radiation of high temperature and intensity.
Another source of radiation of comparable frequency is expected to come from the manufacturing process of the black holes. As we explain, this process likely requires high-energy particle collisions.
We give some estimates of what the above implies for SETI. In particular, considering the IceCube instrument (the most efficient neutrino observatory), based on its sensitivity and the observed VHE neutrino flux in the energy domain of 40 TeV, we conclude that this observatory can detect emission from the black holes of advanced civilizations if their numbers are in the following intervals: $2.5 \times 10^{3} \leq N^{\nu}_{\rm II} \leq 4.2 \times 10^{4}$, $2 \times 10^{6} \leq N^{\nu}_{\rm III} \leq 1.4 \times 10^{8}$.
Finally, we discuss the possibility that nature houses a large number of hidden particle species interacting with us via gravity. We point out that, because of the known universal relation between the number of species and the mass of the lightest black holes (Dvali 2007; Dvali & Redi 2008), the existence of many species makes black hole manufacturing more accessible for ETI. This applies equally to our civilization. In this light, a possible creation of microscopic black holes at our particle accelerators would reveal powerful information about the computational capacities of ETI.
The paper is organized as follows: in sections 2-10 we examine the basic concepts of quantum computing using black holes; in section 11 we discuss the observational features of such black holes; in section 12 we discuss the hidden sector of particles; and in section 13 we summarize our findings.
2. THE FRAMEWORK
2.1. Quantum physics and elementary particles
Our current framework for describing nature at a fundamental level is based on quantum physics. This framework passes all the tests, both theoretical and experimental, available to our civilization. The experiments prove that the quantum laws work down to the shortest distances directly probed at human-made accelerators. In particular, the scale $\sim 10^{-17}$ cm has been reached at the Large Hadron Collider (LHC) at CERN. This extraordinary resolution is achieved in particle collisions at around $\sim$ TeV energies.
In addition, the predictions obtained by theoretical extrapolations of the quantum framework to much higher energies are fully consistent with all the existing data. A classic example is provided by Quantum Chromodynamics (QCD), the theory of the nuclear forces. QCD is an asymptotically free quantum field theory, self-consistently valid at arbitrarily short distances/high energies.
This success by no means implies that our knowledge of fundamental forces is complete. Certainly, new interactions can (and hopefully will) be discovered by experiments at higher energies and/or higher precision. Many such interactions have been hypothesized and studied. These are motivated by various open questions, such as the origin of dark matter, early cosmology, the unification of forces, and so on. For none of these extensions does there exist even slight evidence for the non-applicability of the basic laws of quantum physics. Even string theory, which theoretically culminates the idea of unification of all known interactions, fully relies on the laws of quantum physics.
Of course, particular realizations of the quantum framework may encounter computational difficulties in certain regimes. This happens, for instance, when the interactions among the elementary particles become strong. In such a case, the perturbation theory in a given coupling constant may break down, and one needs to resort to non-perturbative techniques such as, for example, lattice simulations. None of these computational obstacles bears any indication of the invalidity of the basic concepts of the quantum theory. Rather, they only signal the need to improve computational tools without going outside the quantum framework.
In the light of all the above, in our analysis we shall assume the validity of the laws of quantum physics for ETI.
2.2. Gravity
The second powerful factor is gravity. At the classical level, the gravitational interaction is described by Einstein's General Relativity (GR). Again, this is a remarkably successful theory, tested by direct observations within an impressive range of distances, $10^{-1} - 10^{28}$ cm. In addition, various successes of early cosmology suggest that the validity domain of GR extends to much shorter distances.
At the quantum level, the excitations of the gravitational field are gravitons, massless particles of spin 2. All the classical interactions of macroscopic sources can be derived from the quantum theory in the form of exchanges of virtual gravitons, in an expansion in series of Newton's constant $G$.
The unique property of the graviton is that it is sourced by the energy-momentum tensor. This feature is universal and accounts for all contributions, including the energy-momentum of the graviton itself. Due to this property, gravitons self-gravitate. In other words, any graviton acts as a source for others.
The quantum states of isolated gravitons have never been tested directly in lab experiments. This is explained by the extraordinary weakness of their coupling at laboratory scales. For example, the interaction strength of a graviton of $\sim$ cm wavelength is suppressed by $\sim 10^{-66}$ (Dvali & Gomez 2011). The strength of the classical gravitational field produced by macroscopic sources is a collective effect of a large number of gravitons. This is why it is much easier to detect the classical effects of gravity than its quantum manifestations.
The story, however, is scale-dependent. In the quantized version of Einstein gravity, the interactions among individual gravitons are weak at distances larger than the Planck length,
$$L_P \equiv \sqrt{\hbar G / c^3}\,, \quad (1)$$
where $\hbar$ is Planck's constant and $c$ is the speed of light. The Planck length is approximately $10^{-33}$ cm. At such separations, the characteristic momentum transfer among the individual gravitons is of order the Planck momentum, $\sim M_P c$, where
$$M_P \equiv \frac{\hbar}{c L_P}\,, \quad (2)$$
is the Planck mass. Quantum gravitational effects become strong at this scale. Due to this, $L_P$ is commonly considered to be the ultimate cutoff of the Einstein theory of gravity, viewed as an effective quantum field theory.
However, in any extension of pure Einstein gravity, the actual length scale, $L_*$, at which quantum gravity becomes strong is longer than $L_P$. In general, if a theory includes $N_{sp}$ distinct elementary particle species, we have (Dvali 2007; Dvali & Redi 2008),
$$L_* = \sqrt{N_{sp}}\, L_P\,. \quad (3)$$
The corresponding scale of the momentum transfer, $M_* c$, is respectively lowered to
$$M_* c = \frac{M_P c}{\sqrt{N_{sp}}}\,. \quad (4)$$
This scale is usually referred to as the “species scale”.
The scale $L_*$ imposes an absolute lower bound on compactness. In particular, the localization radius of a quantum information bit is bounded from below by $L_*$ (Dvali & Gomez 2011).
The currently known elementary particle species include the degrees of freedom of the Standard Model (quarks, leptons, the Higgs and gauge bosons) and the graviton. Counting all physical polarizations, this amounts to $N_{sp} \sim 100$. Correspondingly, the theory-independent bound on the compactness cutoff is $L_* \geq 10 L_P \sim 10^{-32}$ cm.
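To make these scales concrete, the following minimal Python sketch (ours, not from the paper) evaluates Eqs. (1)-(4) with standard CODATA constants; the only input assumption is the paper's count of $N_{sp} \sim 100$ known polarizations.

```python
# Minimal sketch (ours): evaluate Eqs. (1)-(4) numerically.
import math

hbar = 1.054571817e-34  # reduced Planck constant [J s]
G = 6.67430e-11         # Newton's constant [m^3 kg^-1 s^-2]
c = 2.99792458e8        # speed of light [m/s]

L_P = math.sqrt(hbar * G / c**3)  # Eq. (1): Planck length [m]
M_P = hbar / (c * L_P)            # Eq. (2): Planck mass [kg]

N_sp = 100                        # assumed species count (Standard Model + graviton)
L_star = math.sqrt(N_sp) * L_P    # Eq. (3): species length scale [m]
M_star = M_P / math.sqrt(N_sp)    # Eq. (4): lowered quantum-gravity mass [kg]

print(f"L_P    = {L_P:.3e} m  (~1.6e-33 cm)")
print(f"M_P    = {M_P:.3e} kg")
print(f"L_star = {L_star:.3e} m  (= 10 L_P, i.e. ~1.6e-32 cm)")
print(f"M_star = {M_star:.3e} kg")
```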
2.3. Universal criteria obeyed by ETI
Putting all this knowledge together, we cannot assume that an advanced ETI necessarily shares its elementary particle composition with us. In particular, some ETIs may originate from entirely different particle species, interacting with us only gravitationally.
In light of the above, we shall make the following (maximally conservative) assumption:
We share with ETI the laws of quantum physics and gravity.
One consequence of this is that the physics of black holes is common, irrespective of which ETI manufactures them. This will play a crucial role in our idea of SETI.
3. OPTIMIZATION OF INFORMATION STORAGE
The two essential ingredients of any type of computation are hardware and software. We shall not attempt to make any assumptions about the software used by advanced ETI. The power of their programming and the sophistication of their algorithms are most likely far beyond our imagination.
However, since the laws of quantum physics and gravity are common, we can draw certain conclusions about the limiting capacities of their hardware. We shall parameterize these limits according to certain well-defined characteristics. The main features we will be interested in are the efficiencies of quantum information storage and of its retrieval. These parameters are not paid much attention at the present stage of development of quantum computing by our civilization, due to more urgent technological issues. However, they must become crucial for any civilization that is able to approach the limiting capacities.
In order to understand what these limits imply, we proceed in two steps. First, in the current section we define what it means for a generic device to have an enhanced capacity of information (memory) storage, and what the underlying mechanism behind it is. That is, we shall reduce the feature of enhanced information storage capacity to its bare essentials, borrowing the results of the series of papers (Dvali 2017a,b,c; Dvali, Michel & Zell 2018; Dvali 2018; Dvali et al. 2020). The reader can find a self-contained summary and discussion in (Dvali et al. 2020).
Next, we shall explain that among all possible hypothetical devices that saturate the information storage capacity, black holes are the most efficient ones (Dvali 2021).
In general, quantum information is stored in a quantum state of the system. These states are usually labeled by the excitation levels of a set of $N$ elementary degrees of freedom. Their choice is a matter of convenience, depending on the system parameters, the available technology and the computational task. The choice may range from atomic spin states, for less advanced ETI, all the way to graviton excitations that can be used by highly advanced civilizations. Regardless of their particular nature, it is customary to describe them as quantum oscillators in the number representation. Following the terminology of (Dvali 2017a,b,c; Dvali, Michel & Zell 2018; Dvali 2018; Dvali et al. 2020), we shall refer to these as the “memory modes”.
The simplest mode is a qubit, a quantum degree of freedom that can exist in two basic states. These states can be labelled as the occupation number eigenstates $|0\rangle$ and $|1\rangle$ of the corresponding quantum oscillator with a number operator $\hat{n}_j$, where $j = 1, 2, \ldots, N$ is the mode label. Correspondingly, the set of $N$ qubits produces $n_{st} = 2^N$ basic states. These represent the tensor products of individual basis states of $N$ distinct memory modes, $|n_1, n_2, \ldots, n_N\rangle \equiv |n_1\rangle \otimes |n_2\rangle \otimes \cdots \otimes |n_N\rangle$, where $n_j = 0$ or $1$ for each $j$. In short, they count all possible sequences of 0s and 1s, such as $|0,0,\ldots,0\rangle, |1,0,\ldots,0\rangle, \ldots, |1,1,\ldots,1\rangle$. We shall refer to these states as the “basic memory patterns”. They form a Hilbert (sub)space, which we shall call the “memory space”.
The system can exist in an arbitrarily entangled state of the memory modes. A typical entangled state would represent a linear superposition of an order-one fraction of the $n_{st}$ basis states. However, there can also exist maximally entangled states of special types, such as the generalized EPR state, $\frac{1}{\sqrt{2}}\left(|0,0,\ldots,0\rangle + |1,1,\ldots,1\rangle\right)$.
As a measure of the information storage capacity of the system, it is convenient to use the microstate entropy. As usual, we define it as the logarithm of the dimensionality of the memory space,
$$S = k_B \ln(n_{st})\,. \quad (5)$$
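As a toy illustration (ours, not from the paper), Eq. (5) applied to a register of $N$ independent qubits, for which $n_{st} = 2^N$, gives $S = N k_B \ln 2$:

```python
# Toy illustration (ours): Eq. (5) for N independent qubits, n_st = 2^N.
import math

k_B = 1.380649e-23  # Boltzmann constant [J/K]

for N in (1, 10, 1e10):
    S = N * k_B * math.log(2.0)  # S = k_B ln(2^N) = N k_B ln 2
    print(f"N = {N:.0e} qubits -> S = {S:.3e} J/K (S/k_B = {N * math.log(2.0):.3e})")
```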
We shall focus on two important characteristics of the efficiency of a quantum memory device: the energy efficiency and the compactness.
The first is determined by the energy gap of the qubits, $\epsilon$. This also determines the total energy gap occupied by all $n_{st}$ memory states, which we denote by $\Delta E$.
For example, if for each degree of freedom the states $|0\rangle$ and $|1\rangle$ are gapped by $\epsilon$, the gap spanned by the memory space will be $\Delta E = N\epsilon$.
Another important parameter is the localization radius, $R$, of the quantum-information-storing device. Ordinarily, the compactness of the system increases the gap. For example, consider a qubit realized by a relativistic particle, say a photon, localized within a box of size $R$. The two basis states can, for example, be chosen as the states with one and zero photons. If the interactions of the photon inside the box are weak, the minimal energy gap due to quantum uncertainty is $\epsilon \sim \hbar c / R$. Consequently, the localization of $N$ such qubits within the same box will result in a basic energy cost $\Delta E \sim N \hbar c / R$.
The first necessary requirement fulfilled by a device of enhanced memory capacity is to minimize the gap of the memory qubits below the “naive” quantum uncertainty,
$$\epsilon \ll \frac{\hbar c}{R}\,, \quad (6)$$
while maximizing their number $N$ (equivalently, $S$). The crucial point is that the two are not independent: for a system of given compactness $R$ there exists an absolute upper bound on $S$ imposed by unitarity (Dvali 2021). We shall discuss this bound later.
At the same time, the energy gap of a qubit, $\epsilon$, sets the lower bound on the time scale required for processing the information stored in that qubit,
$$t_q = \frac{\hbar}{\epsilon}\,. \quad (7)$$
In particular, this is the minimal time required for extracting the information from the qubit. For example, (7) is the minimal time required for detecting a photon of energy $\epsilon$.
Notice that $\epsilon$ stands for an effective energy gap in which the contributions due to interactions with the environment, with other qubits and with other parts of the device are already taken into account.
The expression (7) shows that there is a tradeoff: cheap storage of information requires a longer time for its retrieval. This shall play an important role in estimating the computational parameters of ETI.
At the earliest stages of the development of quantum computers, as is currently the case for our civilization, the pressing hardware issues are the problems of decoherence. The questions of the energy efficiency of qubits, posed above, are secondary and not paid much attention. This makes sense, since even for highly optimistic numbers of memory qubits, the energy costs due to the minimal quantum uncertainty are insignificant.
For example, for $N \sim 10^{10}$, the energy gap of the memory space for a modern-technology information-storing device of size $R \sim$ cm is only $\Delta E \sim 10^5$ eV. This is less than the rest energy of a single electron. Obviously, this is negligible compared to the rest energy of a typical supporting device consisting of a macroscopic number of atoms.
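This estimate is easy to check numerically; the sketch below (ours) plugs $N \sim 10^{10}$ and $R \sim 1$ cm into $\Delta E \sim N \hbar c / R$:

```python
# Quick check (ours) of the text's estimate: Delta E ~ N hbar c / R.
hbar = 1.054571817e-34  # reduced Planck constant [J s]
c = 2.99792458e8        # speed of light [m/s]
eV = 1.602176634e-19    # joules per electronvolt

N = 1e10   # number of memory qubits
R = 1e-2   # device size: 1 cm in meters

delta_E = N * hbar * c / R
print(f"Delta E = {delta_E:.2e} J = {delta_E / eV:.2e} eV")  # ~2e5 eV
print("electron rest energy ~ 5.11e5 eV, for comparison")
```

The result, $\Delta E \approx 2 \times 10^5$ eV, is indeed below the rest energy of a single electron.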
However, the amount of quantum information processed by an advanced civilization must require incomparably larger values of $N$. Therefore, an advanced civilization that cares about the compactness and the energy efficiency of its computations must find a way of optimizing the information storage capacity of the system. That is, it has to manufacture devices that maximize the microstate entropy per given size of the system. Basically, the goal is to maximize $S$ for given $R$.
Let us now explain why, for achieving the above goal, sufficiently advanced ETIs are expected to use black holes.
4. WHY BLACK HOLES?
Black holes exhibit an extraordinary capacity for information storage and processing. For various comparisons, it is illuminating to take the human brain as a reference point (Dvali 2017c). For example, a black hole with the mass of a human brain ($\sim$ kg) has a million times larger memory capacity while being only $\sim 10^{-25}$ cm in size and having an operational time of memory read-out $t_q \sim 10^{-19}$ sec. This is impressive, but are black holes the optimal tools?
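These numbers can be reproduced from the formulas quoted later in Section 5; here is a back-of-the-envelope sketch (ours), assuming a 1 kg Schwarzschild black hole and using Eqs. (12) and (14) below:

```python
# Back-of-the-envelope check (ours) of the brain-mass black hole numbers,
# using the Schwarzschild radius and Eqs. (12), (14) quoted in Section 5.
import math

hbar = 1.054571817e-34
G = 6.67430e-11
c = 2.99792458e8

M = 1.0                    # ~mass of a human brain [kg]
R = 2 * G * M / c**2       # Schwarzschild radius [m]
L_P = math.sqrt(hbar * G / c**3)

S_over_kB = math.pi * R**2 / L_P**2  # Eq. (12): microstate entropy / k_B
t_q = S_over_kB * R / c              # Eq. (14): read-out time for N_sp ~ 1

print(f"R     = {R:.2e} m  (~1e-25 cm)")    # ~1.5e-27 m
print(f"S/k_B = {S_over_kB:.2e}")           # ~2.7e16
print(f"t_q   = {t_q:.2e} s  (~1e-19 s)")   # ~1.3e-19 s
```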
In order to classify black holes as the ultimate devices used in quantum computing by advanced ETIs, the following set of questions must be answered:
• How unique are black holes in their ability to store information?
• What are the general mechanisms behind this ability?
• Can these mechanisms be implemented in other devices to achieve a comparable, or even higher, efficiency of information storage and processing?
Fortunately, these questions have already been largely addressed in a research program consisting of three parallel strategies:
• Formulate a microscopic theory of a black hole (Dvali & Gomez 2011). Demonstrate that analogous mechanisms of information storage and processing can be implemented in other quantum systems (Dvali & Gomez 2012; Dvali et al. 2013; Dvali et al. 2015; Dvali & Panchenko 2016).
• Use the well-established features of a black hole to deduce the properties of its memory modes, such as the energy gaps of qubits, their diversity and their evolution time scales. Next, identify the mechanisms in generic quantum systems leading to similar features. Implement such systems for quantum information processing (Dvali & Panchenko 2016; Dvali 2017b,c,a, 2018; Dvali, Michel & Zell 2018; Dvali et al. 2020).
• Identify objects in generic quantum field theories that are equally (or more) efficient in information storage and processing as black holes. Extract the universal underlying physical reasons behind these features (Dvali 2021).
Remarkably, all three strategies have converged on one and the same picture of the properties of the black hole memory modes (qubits) and the mechanism behind their gaplessness and abundance.
In particular, it turns out that these features are generic for objects saturating a certain universal upper bound on the microstate degeneracy imposed by unitarity (Dvali 2021). The objects are called “saturons”. Since the bound is universal, it is shared by all sectors of the universe, regardless of their particle composition. In particular, it is applicable even to yet undetected hypothetical particle species from hidden sectors.
A long list of such objects has been identified in various consistent quantum field theories (Dvali 2019a,b; Dvali & Sakhelashvili 2021; Dvali, Kaikov & Bermúdez 2021; Dvali, Kühnel & Zantedeschi 2021). Interestingly, they evidently exist in already discovered interactions, such as QCD, in the form of the color glass condensate of gluons (Dvali & Venugopalan 2021).
The universal mechanism of enhanced information storage capacity can potentially be implemented in laboratory settings, such as cold atoms. This is a realistic prospect even for our civilization in the not-too-distant future.
The bottom line of this discussion is that an advanced civilization can certainly use non-gravitational saturons for efficient information storage.
Depending on the perspective, this may be regarded as good news or bad news. On the one hand, it is exciting to be able to exploit the black hole mechanism of information storage without the need to manufacture actual black holes, which for our civilization may be a distant prospect. Instead, we can implement the same mechanism in ordinary systems such as cold atoms or nuclear matter.
On the other hand, one may worry that this option makes the signatures for SETI less universal and more technology dependent.
However, we can claim with certainty (Dvali 2021) that among all possible saturons, black holes stand out as the most efficient information storers. This gives us confidence that highly advanced ETI must be using them as information-storage devices.
This provides us with a prospect of searching for advanced civilizations even if they are composed of entirely unknown particle species. For example, an ETI may consist of particles from a hidden sector not sharing any constituents with the species that are known to us.
Nevertheless, the black hole quantum computing devices of such a hidden civilization will provide signatures detectable by our observers. This is because Einstein gravity is universal for all particle species. A direct consequence of this fact is the universality of Hawking radiation. Correspondingly, the quantum computers of dark ETI will radiate democratically into particles of all species, including the particles of the Standard Model, which we can potentially detect.
5. CLOSER LOOK AT BLACK HOLES
5.1. Generalities
Black holes are the most compact objects in nature. Their effective size is given by the gravitational radius $R$. For a non-rotating, electrically neutral black hole, this is set by the Schwarzschild radius, $R = 2GM/c^2$.
In the classical description, black holes are stable and eternal. In the quantum picture the story changes: due to quantum effects, black holes evaporate by emitting Hawking radiation. Hawking's original calculation (Hawking 1974) was performed in the exact semi-classical limit. This implies taking the black hole mass to infinity and Newton's constant to zero while keeping the black hole radius fixed: $G \to 0$, $M \to \infty$, $R =$ finite. In this limit, the back reaction from the emitted quanta vanishes and the emission rate can be evaluated exactly. The resulting spectrum of radiation is thermal, with an effective temperature given by (Hawking 1974),
$$T = \frac{\hbar c^3}{8\pi k_B G M}\,. \quad (8)$$
This leads to the following Stefan-Boltzmann law,
$$c^2 \frac{dM}{dt} = -N_{sp}\, 4\pi \sigma R_H^2\, T^4\,. \quad (9)$$
Here $N_{sp}$ counts all particle species (all polarizations) lighter than $T$; all such species are produced at a universal thermal rate. Particles heavier than $T$ are Boltzmann suppressed and can be ignored. The democratic production of all particle species is one of the important properties of Hawking radiation. This feature originates from the universal nature of the gravitational interaction.
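As a numerical illustration (ours, not from the paper) of Eq. (8), the sketch below evaluates the Hawking temperature for a solar-mass, a $10^9$ g, and a 1 kg black hole:

```python
# Numerical sketch (ours) of Eq. (8): Hawking temperature versus mass.
import math

k_B = 1.380649e-23
hbar = 1.054571817e-34
G = 6.67430e-11
c = 2.99792458e8
eV = 1.602176634e-19

def hawking_temperature(M_kg):
    """Eq. (8): T = hbar c^3 / (8 pi k_B G M), in kelvin."""
    return hbar * c**3 / (8 * math.pi * k_B * G * M_kg)

for label, M in (("solar mass", 1.989e30), ("1e9 g", 1e6), ("1 kg", 1.0)):
    T = hawking_temperature(M)
    print(f"{label:>10}: T = {T:.3e} K, k_B T = {k_B * T / eV:.3e} eV")
# A 1e9 g hole radiates at k_B T ~ 10 TeV, i.e. in the VHE domain
# mentioned in the Introduction in connection with IceCube.
```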
Of course, in Hawking's computation the black hole has infinite mass and therefore evaporates eternally. For a finite-mass black hole, the story is different. In order to estimate the decay time, one needs to make some additional assumptions. The most common is to assume that the black hole evaporation is self-similar, i.e., that the relation between the mass and the temperature (8) holds throughout the evaporation process. Under this assumption, the system can easily be integrated, and one gets the following expression for the half-decay time,
$$\tau_{1/2} = N_{sp}^{-1}\, \frac{4480\pi\, G^2 M^3}{\hbar c^4}\,. \quad (10)$$
To get a better feeling for the time scales, it is useful to rewrite this as an approximate expression,
$$\tau_{1/2} \simeq 0.7 \times \frac{100}{N_{sp}} \times \left(\frac{M}{10^9\,\text{g}}\right)^{3} \text{sec}\,. \quad (11)$$
The factor $\sim 100$ in the numerator stands for a normalization relative to the number of known particle species, while $N_{sp}$ refers to the total number of species available at the initial temperature of the black hole.
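A quick consistency check (ours): evaluating the exact expression (10) against the approximate form (11) for the reference mass $M = 10^9$ g and $N_{sp} = 100$:

```python
# Consistency check (ours): Eq. (10) versus the approximation Eq. (11).
import math

hbar = 1.054571817e-34
G = 6.67430e-11
c = 2.99792458e8

def tau_half_exact(M_kg, N_sp):
    """Eq. (10): tau_1/2 = 4480 pi G^2 M^3 / (N_sp hbar c^4), in seconds."""
    return 4480 * math.pi * G**2 * M_kg**3 / (N_sp * hbar * c**4)

def tau_half_approx(M_kg, N_sp):
    """Eq. (11): tau_1/2 ~ 0.7 * (100/N_sp) * (M / 1e9 g)^3 seconds."""
    return 0.7 * (100.0 / N_sp) * (M_kg / 1e6) ** 3  # 1e9 g = 1e6 kg

M, N_sp = 1e6, 100
print(f"exact : {tau_half_exact(M, N_sp):.2f} s")   # ~0.74 s
print(f"approx: {tau_half_approx(M, N_sp):.2f} s")  # 0.70 s
```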
However, it is important to keep in mind that for finite-mass black holes, thermality is only an approximation, since the back reaction is finite. By the time the black hole loses half of its mass, both expressions are expected to receive substantial corrections due to quantum back reaction. This is evident from various sources: the general consistency arguments (Dvali 2015), the detailed analysis of prototype quantum systems (Dvali et al. 2020), as well as the explicit microscopic theory of a black hole (Dvali & Gomez 2011). It is independently evident that, due to the back reaction from the carried quantum information, black hole evaporation cannot stay self-similar throughout the process, and certainly not beyond its half-decay (Dvali et al. 2020).
This knowledge is no obstacle for our analysis. For conservative estimates, the above expressions can be used as a good approximation up to the black hole half-decay time. This self-consistently shows that the expression (11) gives a correct order-of-magnitude approximation for this time scale, which is fully sufficient for our purposes. Our discussions will be limited to this time scale.
Let us now discuss the information storage capacity of black holes. It is well known that a black hole of radius $R$ has a microstate entropy equal to its surface area measured in units of $G$ (Bekenstein 1973),
$$S = k_B\, \frac{\pi c^3 R^2}{\hbar G} = k_B\, \frac{\pi R^2}{L_P^2}\,, \quad (12)$$
where in the last part of the equation we have expressed it through the Planck length.
Another important feature of a black hole is the information retrieval time. The huge entropy of a black hole shows that it can carry a large amount of information. During the evaporation, the black hole loses its mass. However, initially very little information comes out. This is because at the initial stages of decay, the spectrum of Hawking radiation is thermal to a very good approximation. The quantum information about the black hole state is encoded in deviations from thermality, which are of order $1/S$ per emission time (Dvali & Gomez 2011; Dvali 2015). Initially, the effects of these corrections are small and it takes a very long time to resolve them.
Page argued (Page 1993) that the start of the information retrieval from a black hole is given by its half-decay time (11). As we shall discuss below, this feature has a physically transparent explanation in terms of the gaps of the black hole qubits (Dvali & Gomez 2011). It was also shown that the same feature is shared by all objects of maximal microstate entropy (Dvali 2021).
Taking (12) as a fact, one can deduce useful knowledge about the information-storing qubits without making any assumptions about their nature (Dvali 2017a,c,b; Dvali et al. 2020; Dvali et al. 2015). In particular, it is clear that the number of qubits is $N \sim S/k_B$. From here it follows that the energy gap of the black hole qubits is around
$$\epsilon = \frac{k_B}{S} \frac{\hbar c}{R} = \frac{\hbar^2 G}{\pi c^2 R^3}\,. \quad (13)$$
Notice that for $N_{sp} \sim 1$, the corresponding information storage time scale (7),
$$t_q = \frac{\hbar}{\epsilon} = \frac{S}{k_B} \frac{R}{c} = \frac{\pi c^2 R^3}{\hbar G}\,, \quad (14)$$
is of the same order as the half-decay time (11). Thus, we have
$$t_q \sim \tau_{1/2}\,. \quad (15)$$
This nicely reproduces Page's estimate. It also shows that this estimate is a straightforward consequence of the energy gaps of the black hole qubits.
Of course, for $N_{sp} \gg 1$, the decay time is shortened, and so is the time scale of information processing,
$$t_q = \frac{1}{N_{sp}} \frac{S}{k_B} \frac{R}{c} = \frac{1}{N_{sp}} \frac{\pi c^2 R^3}{\hbar G}\,. \quad (16)$$
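The sketch below (ours, not from the paper) illustrates Eq. (15) for a $10^9$ g black hole. Both time scales share the parametric form $\sim c^2 R^3/(\hbar G)$; the exact ratio $\tau_{1/2}/t_q = 4480\pi/(8\pi) = 560$ is the kind of numerical prefactor that the order-of-magnitude language of the text ignores.

```python
# Numerical illustration (ours) of Eq. (15): t_q tracks tau_1/2.
import math

hbar = 1.054571817e-34
G = 6.67430e-11
c = 2.99792458e8

M, N_sp = 1e6, 100                    # 1e9 g black hole, ~100 species
R = 2 * G * M / c**2                  # Schwarzschild radius [m]
t_q = math.pi * c**2 * R**3 / (N_sp * hbar * G)            # Eq. (16)
tau = 4480 * math.pi * G**2 * M**3 / (N_sp * hbar * c**4)  # Eq. (10)

print(f"t_q     = {t_q:.3e} s")
print(f"tau_1/2 = {tau:.3e} s")
print(f"ratio   = {tau / t_q:.0f}  (= 560, 'same order' in the text's sense)")
```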
5.2. Cross-check from a microscopic theory
The expressions (8), (12), (13), (14), (16) have been independently derived within a microscopic theory of a black hole, the so-called “quantum N-portrait” (Dvali & Gomez 2011). This theory tells us that at the fundamental level a black hole is described as a bound state of $N \sim S$ gravitons of wavelength $\sim R$. This collection of gravitons can be regarded as a coherent state or a condensate.
The black hole qubits originate from the collective Bogoliubov-type excitations of the graviton condensate. As shown in (Dvali & Gomez 2012), and further studied in a series of papers (Dvali et al. 2013; Dvali et al. 2015; Dvali & Panchenko 2016; Dvali 2017b,a,c, 2018; Dvali, Michel & Zell 2018; Dvali et al. 2020), a similar emergence of gapless qubits is characteristic of systems of attractive Bose-Einstein condensates at quantum critical points.
For a black hole, the qubits represent the excitations of gravitons with high angular momentum, which become gapless due to the attractive interaction with the condensate. These excitations can be described in several equivalent languages. For example, they can be viewed as Goldstone modes of the symmetries that are spontaneously broken by the black hole. Their emergence can also be understood in terms of the mechanism of “assisted gaplessness” (Dvali 2017a). The essence of this phenomenon is that the negative energy of the attractive interaction compensates the positive kinetic energy of the would-be free mode.
The independent derivation of the relations (8), (12), (13), (14), (16) from the microscopic theory provides an important cross-check of the validity of these equations.
6. COMPETING DEVICES: SATURONS
Let us now ask: How do we know that there ex-
ist no information-storing devices competing with black
holes? For a long time, it was implicitly assumed that
the area form of the entropy (12) was exclusively the
property of black holes. However, this understanding
has changed recently (Dvali 2021). It has been shown
that the area-law expression is common for all field the-
oretic objects that have maximal entropy permitted by
unitarity. Namely, for a localized object of radius R, there exists the following universal upper bound on the microstate entropy (Dvali 2021),

S = k_B π c³ R² / (ℏ G̃) ,  (17)
where G̃ is the coupling of the Nambu-Goldstone mode
of spontaneously broken Poincare symmetry. This mode
is universally present for an arbitrary localized object,
since any such object breaks the Poincare symmetry
spontaneously. Due to this, G̃ represents a well-defined
parameter for an arbitrary macroscopic object. The ob-
jects saturating the above bound are referred to as “sat-
urons” (Dvali 2021).
Various explicit examples of such objects have been constructed in non-gravitational quantum field theories, including some candidates among cold atomic systems (Dvali 2019a,b; Dvali & Sakhelashvili 2021; Dvali, Kaikov & Bermúdez 2021; Dvali, Kühnel & Zantedeschi 2021).
An important finding of these studies is that all saturons exhibit the same information processing properties as black holes, with the role of G taken up by G̃. Namely, all saturons decay by the emission of Hawking-like radiation with temperature given by (8),

T = ℏc / (4π k_B R) .  (18)
The energy gap of the saturon qubits is

ǫ = (k_B/S) (ℏc/R) = ℏ² G̃ / (π c² R³) ,  (19)

and the corresponding information recovery time is

t_q = ℏ/ǫ = (S/k_B) (R/c) = π c² R³ / (ℏ G̃) .  (20)
Of course, unlike black holes, generic saturons radiate only into the particle species with which their constituents interact.
It is certainly conceivable that some civilizations use non-gravitational saturon devices, as opposed to black holes, for their quantum computations. In fact, even our civilization may not be too far from implementing saturated systems for such a purpose. First, soon after the formulation of the black hole N-portrait, the existence of similar gaplessness mechanisms was established in the prototype N-boson models (Dvali & Gomez 2012; Dvali et al. 2013; Dvali et al. 2015). Further, it has been proposed (Dvali & Panchenko 2016; Dvali 2017b,c; Dvali, Michel & Zell 2018) that critical Bose-Einstein condensates can be used for implementing this black hole mechanism for controlled information processing. More recently, it has been argued (Dvali & Venugopalan 2021) that saturated states have already been created in laboratory experiments in the form of the color glass condensate of gluons (Gelis et al. 2010). It is not necessarily too far-fetched to expect the use of such states for quantum information processing on a reasonable time-scale of the advancement of our civilization.
7. CAN ETI USE OTHER SATURONS INSTEAD OF BLACK HOLES?
In the above light, one may wonder whether the existence of saturons as alternative information-storing devices may impair our idea of SETI. The reason is that the radiation signature from such quantum computers will not be universal and will rather depend on the particle composition of ETI. Fortunately, this is not the case. The point is that among all saturated objects of any given size R, black holes are unbeatable in their information storage capacity (Dvali 2021). This is due to the fact that the spontaneous breaking of Poincare symmetry is maximized by a black hole. Correspondingly, the coupling of the Poincare Goldstone in the case of a black hole is the weakest. That is, for any other object of the same size, we have G̃ > G. In other words, the non-black-hole saturons, irrespective of their composition, must have G̃ > G. Any decrease of G̃ below G will collapse the saturon into a black hole.
The details of the proof can be found in the original article (Dvali 2021). However, due to the interdisciplinary nature of the present paper, for the sake of the reader's comfort, we shall make the presentation maximally self-contained. For this purpose, we provide the following physical explanation.
Imagine we would like to produce a saturated device of size ∼ R in the form of a self-sustained bound state of N quanta of wavelengths ∼ R. Let us assume that the dominant interaction responsible for binding the system has an interaction strength set by a dimensionless coupling α. That is, the attractive potential energy between a pair of particles is V = −αℏc/R. Of course, at the same time these particles interact gravitationally, with the strength α_gr = ℏG/(c³R²) = L_P²/R². The physical meaning of this parameter is easy to understand if we recall that the energies of the quanta are ∼ ℏc/R (see, e.g., (Dvali & Gomez 2011)). However, we have assumed that gravity is subdominant with respect to the binding force of coupling α.
It is clear that in order to form a bound state, the kinetic energy of each quantum, ℏc/R, must be balanced by the potential energy of attraction from the rest, V = αNℏc/R. This gives the equilibrium condition αN ∼ 1.
Now, the bound state breaks Poincare symmetry spontaneously. It breaks both the translations and the Lorentz boosts. The order parameter for the breaking of Poincare symmetry is ℏN/(πc³R²), and the Goldstone coupling is G̃ = πc³R²/(ℏN). Then, according to the unitarity bound (17), the maximal possible entropy of such a bound state is S = k_B N. Let us assume that this entropy is attained. Can it exceed the entropy of the same size black hole (12)?
This requires making G̃ = πc³R²/(ℏN) as small as G. But this implies that the gravitational coupling α_gr must become equal to that of the non-gravitational binding force, α ∼ 1/N. Thus, we are pushed to the equality α_gr = α. Remembering that the rest energy of the object is ∼ Nℏc/R, it is easy to evaluate its gravitational radius, which comes out equal to ∼ R. It is clear that at this point the object is within its gravitational radius and thus is a black hole. That is, any attempt at pushing the entropy of a saturon of size R beyond the entropy of the same size black hole collapses the saturon into a black hole.
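This argument can be verified symbolically. The following sketch (ours, using sympy) imposes the equilibrium condition αN ∼ 1 with α set equal to α_gr and confirms that the resulting bound state lies within its own gravitational radius:

```python
import sympy as sp

R, hbar, G, c = sp.symbols('R hbar G c', positive=True)

alpha_gr = hbar * G / (c**3 * R**2)   # gravitational coupling of quanta with energy ~ hbar*c/R
N = 1 / alpha_gr                      # equilibrium condition alpha*N ~ 1 with alpha -> alpha_gr
E_rest = N * hbar * c / R             # rest energy of the bound state of N such quanta
r_g = 2 * G * E_rest / c**4           # its Schwarzschild radius

print(sp.simplify(r_g / R))           # -> 2, i.e. r_g ~ R: the object is a black hole
```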
Correspondingly, if a civilization is sufficiently advanced, it will ultimately resort to black holes, rather than to other saturons, for its information processing. This gives an exciting prospect of searching for advanced civilizations based on the detection of Hawking radiation from their quantum computers. This applies equally to civilizations belonging to hidden particle species, interacting with us exclusively via gravity. Regardless, due to its universal nature, the Hawking radiation from their quantum computers will inevitably contain our particle species.
8. SOME REMARKS ON BLACK HOLE MANUFACTURING TECHNOLOGY
A universal prescription for the creation of a black hole of mass M is that the energy E = Mc² must be localized within a sphere of the corresponding Schwarzschild radius R = 2MG/c².
Of course, we cannot make a precise prediction of the methods used by an advanced civilization for achieving this goal. However, it is possible to outline some expectations depending on the masses of the black holes manufactured by ETI.
In particular, we wish to separate two regimes of black hole formation, to which we shall refer as "soft" and "hard" respectively.
If the mass of a black hole is sufficiently high, it can be manufactured softly, by putting together a low density of many non-relativistic particles and letting them collapse. This is essentially how an ordinary stellar collapse works.
However, for the creation of microscopic black holes this method cannot be used. Instead, one needs to accelerate particles to high energies and collide them with small impact parameters. Such collisions are usually referred to as "hard" in the sense that the momentum transfer per colliding particle exceeds their rest masses.
Let us explain the difference with a concrete example. In order to produce a solar mass black hole (R ∼ km), it is sufficient to place of order 10^57 non-relativistic neutrons within the proximity of a few km. Such a system will collapse into a black hole without the need of any hard collision among the neutrons.
At the same time, if we wish to produce a black hole of the size of a neutron, R ∼ 10^-14 cm, this method will not work. The mass of such a black hole is approximately 10^38 GeV, which is 10^38 times the rest mass of a neutron. We thus need to localize about 10^38 neutrons within the Compton wavelength of a single neutron. This cannot be done softly due to the resistance from the nuclear forces. For example, a chunk of nuclear matter consisting of 10^38 neutrons will not collapse easily into a black hole, due to Fermi pressure and other factors.
Instead, in order to achieve the goal, the neutrons have to be accelerated to ultra-relativistic energies and collided with a very high momentum transfer.
For example, in an extreme case we can accelerate two neutrons to a center of mass energy of ∼ 10^38 GeV and collide them with an impact parameter ∼ 10^-14 cm. Of course, one can instead collide more neutrons, each with less energy. However, for each choice the collision must be hard, since the neutrons must be relativistic and exchange momenta exceeding (or comparable to) their rest masses.
This example shows the general tendency: manufacturing microscopic black holes requires hard collisions among high energy particles.
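In numbers (our sketch, SI units):

```python
G, c = 6.67430e-11, 2.99792458e8
GeV = 1.602176634e-10                  # J per GeV

R_n = 1e-16                            # m, i.e. 1e-14 cm, the size of a neutron
M = R_n * c**2 / (2 * G)               # from the Schwarzschild relation R = 2*M*G/c^2
print(f"M ~ {M:.1e} kg ~ {M * c**2 / GeV:.1e} GeV")   # ~7e10 kg ~ 4e37 GeV, i.e. of order 1e38 GeV
```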
Of course, one can imagine an exotic situation in which an advanced civilization has at its disposal a new type of heavy elementary particle, interacting purely via gravity. For example, we can imagine a stable scalar particle of mass ∼ M_P. Notice that such a particle is essentially a quantum black hole, since its Compton wavelength as well as its gravitational radius are both of order L_P (Dvali & Gomez 2011). Using this elementary building block, black holes of arbitrary masses can be produced softly. Indeed, an arbitrary number of non-relativistic particles with proper initial velocities can collapse into a black hole. Of course, upon crossing the Schwarzschild radius, the system always becomes relativistic. But the process is soft in the sense that such particles need not exchange momenta exceeding their rest masses.
Putting such highly exotic possibilities aside, it is reasonable to expect that for manufacturing small black holes, an advanced ETI would use the method of colliding highly energetic elementary particles.
Such collisions are expected to be accompanied by the radiation of quanta with typical energy given by the inverse impact parameter. Since the latter is of order the black hole radius, the expected energy of the radiated quanta is ∼ ℏc/R. This is comparable to the temperature of the Hawking radiation from the black hole that is the outcome of the collision. The difference, of course, is that the collision radiation is neither thermal nor democratic in particle species. Due to this, its detectability, unlike that of the Hawking radiation, will depend on the composition of the particle species used in the collisions manufacturing the black holes.
In conclusion, the following general message emerges. ETI can also be searched for through the primary radiation emitted during the process of manufacturing their black holes. The characteristic energy is set by the inverse impact parameter and is comparable to the energy of the Hawking quanta emitted by a black hole manufactured in the process.
9. REMARKS ON OPTIMISATION OF INFORMATION PROCESSING
Another remark we would like to make concerns the optimisation of information processing by ETI. Of course, we are not going to speculate about the details. Our discussion will be limited to aspects that are imposed by the laws of physics, which all ETI have to respect. These are the laws of quantum field theory and gravity. These laws imply the relation (16) between the information extraction time from a black hole and its radius/temperature and entropy. The analogous relation (20) holds for all saturons (Dvali 2021).
This universal relation shows that for a single black hole the information extraction time t_q is fixed in terms of its radius R (or temperature, T ∼ 1/R) and N_sp. Of course, from the same relation, the entropy of a black hole is also uniquely fixed in terms of t_q and N_sp. Notice that the total number of species N_sp at a given scale is a constant of nature, which ETI cannot change. The optimization problem must thereby be solved for fixed N_sp.
This allows us to draw some general conclusions about the likely arrangement of information storing devices by ETI. Namely, for a given t_q, the only way of increasing the amount of stored information is by multiplying the number of black holes. Increasing the entropies of individual black holes is not possible, as this would increase t_q.
As is clear from (16), for a system of n black holes, each of mass M, the information retrieval time t_q ∝ M³/M_P² is independent of n. One can thus arbitrarily increase the entropy of the system by increasing n without altering t_q. The total entropy grows as n while t_q stays the same.
In contrast, taking a single black hole of mass M′ = nM increases the entropy of the system by an extra power of n relative to the total entropy of n separated black holes. However, simultaneously the information retrieval time grows as the cubic power of n,

t_q → t′_q = n³ t_q .  (21)

Obviously, this compromises the information processing efficiency of the system. Thus, the optimal way of increasing the information capacity of the system for a fixed t_q is to increase the number of black holes instead of investing the same total energy into the creation of bigger ones.
We thus arrive at a very general conclusion: the quantum computing devices of advanced ETI are more likely to consist of multiple small black holes rather than a few big ones. This implies that from the black hole "farms" of ETI, we are more likely to expect Hawking radiation of higher temperature and intensity.
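The tradeoff can be restated as a toy scaling check (ours; it uses only the schematic scalings S ∼ M² and t_q ∼ M³ that follow from Eqs. (12) and (16) with R ∝ M):

```python
# n separate black holes of mass M versus one merged hole of mass n*M,
# using the schematic scalings S ~ M^2 and t_q ~ M^3 (arbitrary units).
n, M = 100, 1.0

S_farm, t_farm = n * M**2, M**3            # farm: entropy adds up, t_q unchanged
S_big, t_big = (n * M)**2, (n * M)**3      # single merged hole

print(S_big / S_farm)    # n     -> entropy gained by merging
print(t_big / t_farm)    # n**3  -> retrieval time lost by merging, Eq. (21)
```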
10. TIME SCALES AND MESSENGERS
The relation (16) between the information processing time t_q and the black hole radius/temperature allows us to give some estimates of the energies and the nature of the emitted Hawking radiation. We must however note that such estimates change if we move beyond Einstein gravity. This departure is scale-dependent. As already discussed, in any theory with a number of particle species N_sp, the gravitational cutoff is lowered to the species scale L∗ given by equation (3) (Dvali 2007; Dvali & Redi 2008).
This length gives an absolute lower bound on the size of a black hole tractable within Einstein gravity. Taking into account the number of all currently known elementary particle species (N_sp ∼ 100), the minimal size of a black hole is still very short, L∗ ∼ 10^-32 cm.
The corresponding half-life of an Einsteinian black hole is only slightly longer than the Planck time, t_q ∼ 10^-43 sec. This time sets an absolute lower bound on the information processing speed available to any ETI.
Here, for definiteness, we shall assume that L∗ ∼ 10^-32 cm marks the validity domain of Einsteinian gravity. A discussion of the cases with a much larger number N_sp is postponed towards the end of the paper.
However, here we wish to note that increasing N_sp only helps in more efficient manufacturing of "artificial" black holes. This is due to the fact that the threshold energy of black hole formation in particle collisions is lowered according to (4) (Dvali & Redi 2008). The size of the smallest black hole (4) is correspondingly much larger than in Einstein gravity with a small number of species. Correspondingly, the existence of a large number of species makes black hole quantum computing more easily accessible for less advanced civilizations. In particular, in the limiting case of N_sp ∼ 10^32, the manufacturing of the smallest black holes would become (or perhaps already is) possible for our civilization.
Assuming for the time being that we are within the validity of Einstein gravity for the scales of interest, we can estimate the parameters of the expected Hawking radiation for certain reasonable time-scales.
For example, demanding that t_q is not much longer than a few seconds, we arrive at the conclusion that the energy of the Hawking quanta is above the ∼TeV scale. With such a high temperature, all the known particle species will be radiated by the ETI black hole computers.
Of course, how efficiently the radiation can reach us depends on a number of factors. These factors include the presence of a medium that can absorb or shield the radiation.
In general, since neutrinos have the deepest penetration length, they appear to be the most robust messengers. In addition, mediation by high energy gamma quanta and other species is certainly a possibility.
Of course, the high energy radiation coming from distant ETI will be subjected to the same universal cutoffs as the high energy cosmic rays produced by natural sources. An example is given by the Greisen-Zatsepin-Kuzmin (GZK) cutoff (Greisen 1966; Zatsepin & Kuzmin 1966) due to the scattering of high energy protons off the cosmic microwave background.
On top of the known natural factors, there can be unknown artificial shielding effects. For instance, ETI may construct absorbing shields around their black hole computers. The motivation may vary from a simple defence against the life-threatening high energy radiation, as would be the case for humans, all the way to sophisticated energy-recycling purposes. These are factors about which it is hard to speculate. However, in their light, the case for the neutrino as the most promising known messenger is strengthened.
11. SOME GENERAL ESTIMATES
All the parameters required for our estimates, such as the time scale of processing, the temperature of the accompanying radiation, the black hole mass and the entropy, can be expressed through one another. The only extra quantity is N_sp, which shortens the processing time. For the moment we assume that this quantity is not much different from the number of known species. Then we are left with a single input parameter, which we shall take to be the information processing time t_q (14). As we already discussed (15), this time is approximately equal to the half decay time of a black hole, τ_1/2, which shall be used below.
We assume that the typical timescale required for a computation is of the order of 1 sec. Then, from Eq. (11) one can show that the black hole mass will be of the order of 1.2 × 10^9 g. In this context, an important issue a civilization has to address is the energy required for constructing such an extremely compact object, Mc² ≃ 10^30 erg.
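The quoted mass can be cross-checked against the standard Hawking lifetime estimate (our sketch; the prefactor 5120π and the plain division by N_sp are indicative assumptions rather than the paper's exact relation (11)):

```python
import math

G, c, hbar = 6.67430e-11, 2.99792458e8, 1.054571817e-34  # SI units

def half_decay_time(M_kg, N_sp=100):
    """Rough evaporation time t ~ 5120*pi*G^2*M^3/(hbar*c^4), shortened by N_sp."""
    return 5120 * math.pi * G**2 * M_kg**3 / (hbar * c**4) / N_sp

M = 1.2e6   # kg, i.e. 1.2e9 g
print(f"tau_1/2 ~ {half_decay_time(M):.1f} s")     # of order 1 s
print(f"M c^2 ~ {M * c**2 * 1e7:.1e} erg")         # ~1e30 erg
```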
In order to understand the capabilities of extraterrestrial civilizations, one should examine them in the light of the classification introduced by Kardashev (1964). According to this approach, alien societies should be distinguished by their technological level. Type-I civilizations, in particular, use the entire power attained on a planet. In the case of the Earth, the corresponding value is of the order of

P_I ≃ (L_⊙/4) × (R_E/R_AU)² ≃ 1.7 × 10^24 erg s^-1,

whereas the current power consumption is about 1.5 × 10^20 erg s^-1, classifying our civilization as level 0.7 (Sagan 1973). A Type-II civilization utilizes the whole stellar power, P_II ≃ L_⊙, and Type-III is an alien society that consumes the total power of a host galaxy, P_III ≃ 10^11 × L_⊙. Here, L_⊙ ≃ 3.8 × 10^33 erg s^-1 is the Solar luminosity, R_AU ≃ 1.5 × 10^13 cm is the distance from the Earth to the Sun (astronomical unit, AU) and R_E ≃ 6.4 × 10^8 cm denotes the radius of the Earth. From these estimates it is obvious that even a Type-I civilization can build a black hole with the aforementioned mass: it only needs to accumulate the energy incident on the planet during ∼ 7 days.
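The seven-day figure follows directly (our sketch, CGS units as in the text):

```python
L_sun = 3.8e33                 # erg/s, Solar luminosity
R_E, R_AU = 6.4e8, 1.5e13      # cm

P_I = (L_sun / 4) * (R_E / R_AU)**2    # power intercepted by the Earth
E_bh = 1.0e30                          # erg, rest energy of the 1.2e9 g black hole

print(f"P_I ~ {P_I:.1e} erg/s")                 # ~1.7e24 erg/s
print(f"t ~ {E_bh / P_I / 86400:.0f} days")     # ~7 days
```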
In the search for extraterrestrial intelligence, the ma-
jor task is to find specific fingerprints of their activity.
Using black holes will inevitably lead to such observa-
tional features which, on the one hand, might be de-
tected and, on the other hand, distinguished from typi-
cal astrophysical events. In particular, we use the knowl-
edge that black holes emit particles with energy in the
range (E, E + dE) with the rate (Hawking 1974)
R(E, M) ≡
d2
N
dtdE
=
1
2π~
×
Γs(E, M)
e
8πGME
~c3 − (−1)2s
, (22)
where Γs(E, M) = E2
σs(E, M)/(π~2
c2
), s denotes the
spin of a particle, σs = αsG2
M2
/c4
, α1/2 = 56.7, α1 =
20.4 and α2 = 2.33 (MacGibbon Webber 1990). As
discussed, the flux of particles might be screened out
but, we assume that neutrinos escape. Therefore, the
major observational feature might be the energetic flare
of neutrinos. Using Eq. (22), one can easily demonstrate
that the spectral power, ER, of neutrinos, (s = 1/2), has
a peak at energies
Eν
m ≃
3.13~c3
8πGM
≃ 48 ×
100
Nsp
×
1 sec
τ1/2
!1/3
T eV, (23)
which can be detected by the IceCube detector
(Ahrens et al. 2004). For the neutrino emission lumi-
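Eq. (22) is simple to put into code. The sketch below (ours, SI units; the grid search is only for illustration) implements the rate and recovers the maximum of the neutrino spectral power E·R at x ≡ 8πGME/ℏc³ ≈ 3.13, which is where the coefficient in Eq. (23) comes from:

```python
import math

G, c, hbar = 6.67430e-11, 2.99792458e8, 1.054571817e-34  # SI units
ALPHA = {0.5: 56.7, 1: 20.4, 2: 2.33}   # MacGibbon & Webber (1990)

def rate(E, M, s):
    """Emission rate of Eq. (22), d^2N/(dt dE), with E in J and M in kg."""
    sigma = ALPHA[s] * G**2 * M**2 / c**4              # absorption cross-section
    Gamma = E**2 * sigma / (math.pi * hbar**2 * c**2)  # greybody factor
    x = 8 * math.pi * G * M * E / (hbar * c**3)
    return Gamma / (2 * math.pi * hbar) / (math.exp(x) - (-1)**int(2 * s))

M = 1.2e6                                     # kg, the reference black hole
E0 = hbar * c**3 / (8 * math.pi * G * M)      # natural energy unit of the spectrum
E_peak = max((x * E0 for x in [i * 0.01 for i in range(1, 1000)]),
             key=lambda E: E * rate(E, M, 0.5))
print(f"x_peak ~ {E_peak / E0:.2f}")          # ~3.13, as in Eq. (23)
```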
For the neutrino emission luminosity at E_m we obtain

L^ν_m ≃ 3 × (E^ν_m)² R(E^ν_m, M) ≃ 1.2 × 10^28 × (120/N_sp × 1 sec/τ_1/2)^{2/3} erg/s ,  (24)

where the multiplier 3 comes from the three flavors of neutrinos. As is evident, even a single event is characterized by a very high luminosity of the neutrino flare.
In general, it is obvious that a civilization might perform many calculations per second. On the other hand, any civilization is limited by the level of technology it possesses. The maximum number of quantum processings per second, n_x (where x = I, II, III denotes the level of technology), is severely constrained by a civilization's maximum power, n_x ≃ P_x/⟨L⟩, where ⟨L⟩ ≃ Mc²/(2τ_1/2) is the average luminosity of the black hole's emission:
n_I = 3.5 × 10^-6 × (100/N_sp)^{1/3} × (τ_1/2/1 sec)^{2/3} sec^-1 ,  (25)

n_II = 7.7 × 10^3 × (100/N_sp)^{1/3} × (τ_1/2/1 sec)^{2/3} sec^-1 ,  (26)

n_III = 7.7 × 10^14 × (100/N_sp)^{1/3} × (τ_1/2/1 sec)^{2/3} sec^-1 .  (27)
Since the value of n_I is very small, the best way is to capture a single event of duration τ_1/2. For Type-I alien societies the luminosity will be defined by Eq. (24), but for Type-II,III civilizations the luminosity might be much higher, reaching the solar luminosity for Type-II and the galactic luminosity for Type-III societies.
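These rates are simply n_x ≃ P_x/⟨L⟩ with ⟨L⟩ ≃ Mc²/(2τ_1/2); a short numerical restatement (ours, CGS units, using the Kardashev powers quoted above):

```python
M_erg = 1.0e30                     # erg, rest energy M*c^2 of the reference black hole
tau = 1.0                          # s, tau_1/2
L_avg = M_erg / (2 * tau)          # average emission luminosity, erg/s

P = {"I": 1.7e24, "II": 3.8e33, "III": 3.8e44}   # Kardashev powers, erg/s
for x, Px in P.items():
    print(f"n_{x} ~ {Px / L_avg:.1e} s^-1")      # ~3e-6, ~8e3, ~8e14
```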
This is not the end of the story. In 1961, Frank Drake formulated an equation used to estimate the number of communicative civilizations (Drake 1961)

N = R⋆ × f_p × n_e × f_l × f_i × f_t × L ,  (28)

where R⋆ denotes the average rate of star formation, f_p the fraction of stars with planets, n_e the average number of potentially habitable planets per star, f_l the fraction of potentially habitable planets that actually develop life, f_i the fraction of planets with life that develop intelligent life, f_t the fraction of technological communicative civilizations, and L the average length of time for which alien advanced civilizations release detectable signals into space. From the study of the Milky Way (MW), it is well known that the modern value of the star formation rate is in the interval (1.5 − 3) stars per year (Kennicutt & Evans 2012). Based on the Kepler space mission, it has been revealed that there should be around 40 billion Earth-sized planets in the habitable zones of solar-type stars and red dwarfs (Petigura et al. 2013). This automatically implies that the value of f_p × n_e should be 0.4. The average value of R_Astro = R⋆ × f_p × n_e ≃ 0.9 yr^-1 is more or less well defined, but about the rest almost nothing is known and we can only apply very speculative approaches. Based on the observation that life on Earth appeared soon after the conditions became favorable, one can speculate that the value of f_l is close to 1. In his statistical approach to the Drake equation, Maccone (2012) uses a moderate value of 1/3 and a value of 0.01 for f_i × f_t. In the scientific literature, a range (10^-3 − 1) (Prantzos 2013) is considered for the quantity f_Biotech = f_l × f_i × f_t.
For estimating the value of L, we assume that it should not be less than the timescale on which we need to reach the corresponding level of technology. Following an approach by Dyson (1960), if one accepts that a 1% annual growth rate of industry is maintained, then, taking the current power consumption P_0 ≃ 1.5 × 10^20 erg/s into account, one can straightforwardly show that reaching the Type-I and Type-II levels requires approximately t_I = 1000 yrs and t_II = 3000 yrs respectively. The calculation of t_III should be performed in a different way. By definition, a galactic civilization is one which has colonized the whole galaxy; therefore, the corresponding value should be of the order of D_MW/υ, where D_MW ≃ 26.8 kpc is the average diameter of the MW galaxy and υ is the velocity with which a civilization covers the mentioned distance. Taking a very moderate value of 0.01c, one obtains t_III ≃ 10^8 yrs. It is clear that a theoretical minimum value of t_III imposed by the speed of light is of the order of 10^6 yrs.
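The growth-time estimates follow from exponential extrapolation; a minimal check (ours):

```python
import math

P0 = 1.5e20                       # erg/s, current consumption
r = math.log(1.01)                # 1% annual growth rate
for name, P in (("I", 1.7e24), ("II", 3.8e33)):
    print(f"t_{name} ~ {math.log(P / P0) / r:.0f} yrs")   # ~940 and ~3100 yrs
```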
Assuming that the minimum values of L_x are of the order of t_x, the number of civilizations, N = R_Astro × f_Biotech × L, can vary in the interval (3, 1.2 × 10^3) for Type-I and (6, 3.6 × 10^3) for Type-II. The aforementioned values might be even higher if L_x exceed t_x. One should also emphasise that since nearly 75% of stars with habitable planets are ∼ 1 Gyr older than the Sun (Lineweaver et al. 2004), the previously mentioned assumption is quite natural; in earlier works (Sagan 1963; Cameron 1963; Shklovskii & Sagan 1966) the number of civilizations has been estimated to be of the order of 10^6.
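With these inputs, the intervals quoted above follow, up to the order-one factors inherent in such estimates (our sketch; only indicative):

```python
R_astro = 0.9                         # yr^-1, R* x f_p x n_e
f_lo, f_hi = 1e-3, 1.0                # range of f_Biotech (Prantzos 2013)
for name, t in (("I", 1.0e3), ("II", 3.0e3)):     # minimal L_x ~ t_x, in years
    print(f"Type-{name}: N in ({R_astro * t * f_lo:.1f}, {R_astro * t * f_hi:.0f})")
```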
For the uniform distribution of civilizations in the galactic plane, one can find that the order of magnitude of the minimum distance between alien societies is given by d_m ≃ √(A_MW/N), where A_MW ≃ πD²_MW/4 is the area of the MW galaxy. Then, for the neutrino flux, F^ν ≃ L^ν_m/(4πd²_m), multiplied by π²n_II for Type-II civilizations (the multiplier π² comes from the summation over all civilizations in the galactic plane), one obtains

F^ν_I ≃ 10^-13 × (100/N_sp) × (1 sec/τ_1/2)^{2/3} × (R_Astro/0.9 yr^-1) × (f_Biotech/1) × (L_I/t_I) GeV s^-1 cm^-2 ,  (29)
F^ν_II ≃ 2.3 × 10^-8 × (100/N_sp) × (R_Astro/0.9 yr^-1) × (f_Biotech/1) × (L_II/t_II) GeV s^-1 cm^-2 .  (30)
Considering a Type-III civilization, for the flux from the nearest galaxy one obtains

F^ν_III ≃ 9.2 × 10^-6 × (100/N_sp) × (R_Astro/0.9 yr^-1) × (f_Biotech/1) × (L_III/t_III)^{2/3} GeV s^-1 cm^-2 .  (31)

We have taken into account that the average distance between closest galaxies is given by (V_U/N_g)^{1/3}, where V_U ≃ 1.22 × 10^13 Mpc³ is the volume of the visible universe and N_g ≃ 10^12 is the average number of galaxies in the universe.
As we have already explained, for Type-I societies the best way is to capture a single event, whereas for Type-II and Type-III civilizations the upper limits of the possible fluxes will be defined by the net effect of all quantum processing events every second. An interesting feature of Eqs. (30,31) is that they do not depend on τ_1/2 or on the black hole mass. Such a behaviour is a direct consequence of the fact that n_x ∼ τ_1/2/M ∼ τ_1/2^{2/3}, multiplied by L^ν_m ∼ τ_1/2^{-2/3} (see Eq. 24), cancels the dependence on τ_1/2.
The IceCube sensitivity for detecting muon neutrinos is of the order of F^ν_IC,min ≃ 7 × 10^-9 GeV s^-1 cm^-2 (Ahrens et al. 2004); therefore, a Type-I alien society cannot be detected for the mentioned parameters, and detection would be possible only for f_Biotech × L_I/t_I ≃ 7 × 10^4, which seems to be an unrealistically large value.
On the contrary, Type-II and Type-III alien societies might be visible to the IceCube detectors. Therefore, it is reasonable to estimate upper limits on the number of civilizations which might provide the flux at 30 TeV, F^ν_IC ≃ 3.6 × 10^-7 GeV s^-1 cm^-2, observed by the IceCube instruments. By assuming that F^ν_IC is produced by the advanced ETI, one obtains

N^ν_II,max ≃ D²_MW F^ν_IC / (n_II L^ν_m) ≃ 4.2 × 10^4 ,  (32)

N^ν_III,max ≃ V_U × (36 F^ν_IC / (13π n_II L^ν_m))^{3/2} ≃ 1.4 × 10^8 ,  (33)
where we have taken into account that the flux from the nearest Type-III civilization should be multiplied by 13π²/9 to obtain the net effect from all civilizations uniformly distributed over the observed universe. Similarly, by taking F^ν_IC,min into account, one can obtain the minimum numbers of civilizations which might be detectable by the IceCube instruments: N^ν_II,min ≃ 2.5 × 10^3, N^ν_III,min ≃ 2 × 10^6.
Other signatures might also exist if ETI do not need to screen out the black hole radiation. In this case other particles will escape the region. In particular, if one considers escaping photons (s = 1 and α_1 = 20.4, see Eq. (22)), one can straightforwardly show that the spectral power peaks at

E^γ_m ≃ 2.82 ℏc³/(8πGM) ≃ 43 × (100/N_sp × 1 sec/τ_1/2)^{1/3} TeV ,  (34)

corresponding to the following gamma ray luminosity

L^γ_m ≃ (E^γ_m)² R(E^γ_m, M) ≃ 1.5 × 10^27 × (100/N_sp × 1 sec/τ_1/2)^{2/3} erg/s .  (35)
By taking the number of all civilizations in the galaxy into account, one can derive the high energy photon fluxes for Type-I,II,III ETIs:

F^γ_I ≃ 1.2 × 10^-14 × (100/N_sp) × (1 sec/τ_1/2)^{2/3} × (R_Astro/0.9 yr^-1) × (f_Biotech/1) × (L_I/t_I) GeV s^-1 cm^-2 ,  (36)

F^γ_II ≃ 2.8 × 10^-9 × (100/N_sp) × (R_Astro/0.9 yr^-1) × (f_Biotech/1) × (L_II/t_II) GeV s^-1 cm^-2 ,  (37)

F^γ_III ≃ 1.1 × 10^-6 × (100/N_sp) × (R_Astro/0.9 yr^-1) × (f_Biotech/1) × (L_III/t_III)^{2/3} GeV s^-1 cm^-2 .  (38)
The Major Atmospheric Gamma Imaging Cherenkov (MAGIC) telescope, aiming to detect very high energy (VHE) gamma rays, has a sensitivity of the order of 10^-13 erg cm^-2 s^-1 ≃ 6 × 10^-11 GeV cm^-2 s^-1 (Aharonian 2004). Therefore, MAGIC can detect the VHE gamma rays from Type-I civilizations only if L_I ∼ 5 × 10^3 t_I ∼ 5 × 10^6 yrs.
From Eqs. (37,38) it is evident that for the chosen parameters the VHE photons from Type-II and Type-III alien societies might be detected by the MAGIC instruments. But one should clearly realise that if the advanced civilizations do not manufacture black holes at the maximum rate (see Eqs. (25-27)), the fluxes will be smaller. In any case, the sensitivity of the MAGIC facilities imposes a constraint on the minimum number of civilizations: N^γ_II,min ≃ 60, N^γ_III,min ≃ 7.5 × 10^3.
12. ETI FROM THE DARK SIDE
All currently known elementary particles are described by the Standard Model plus Einstein gravity. The known particle species consist of the Higgs scalar, the fermions (quarks and leptons) and the mediators of the gravitational and gauge interactions. Counting all polarizations, this amounts to N_sp ∼ 100 field theoretic particle species.
The Standard Model is an extraordinarily successful theory. It suffices to say that within the range of distances of approximately 10^-17 − 10^28 cm there is no experimental evidence invalidating it. Yet, we know that
this theory is incomplete and there must exist particle
species beyond it.
Perhaps the most direct evidence for this is provided by dark matter. Although a part of the dark matter could reside in black holes (for a review, see (Escrivà, Kuhnel & Tada 2022)), almost certainly this option too requires the existence of new particle species. The current experimental constraints tell us that, if lighter than ∼TeV, the new particle species cannot interact with us via the known forces other than gravity. A potential exception is provided by hypothetical particles with very small electric charges.
Of course, new particles may interact with our species via some yet undiscovered forces. The only constraint on such interactions is that they are sufficiently weak to have avoided experimental detection.
Nothing much beyond this is known about the hidden sector species. They can come in a number of interesting forms. For example, dark sectors can be organized in the form of many hidden copies of the Standard Model (Dvali 2007; Dvali & Redi 2008), or of parallel branes separated from us in large extra dimensions (Arkani-Hamed, Dimopoulos & Dvali 1998).
However, their number is limited. There exists a universal upper bound (Dvali 2007),

N_sp ∼ 10^32 ,  (39)

on the total number of distinct particle species. This bound comes from the fact that species lower the fundamental scale of quantum gravity to (4) (Dvali 2007; Dvali & Redi 2008). Then, the lack of observation of strong quantum gravity effects, such as the creation of micro black holes, at the Large Hadron Collider puts the lower bound M∗ ≥ TeV. This translates into the upper bound (39) on the total number of particle species. This number of course includes the known particle species of the Standard Model.
Currently, there are no model-independent restrictions on the ways the hidden species organize themselves into different sectors and on how they interact with one another. Given this vast freedom and our almost zero knowledge of the possible forms and evolutionary paths of intelligence, there is no reason to restrict ETI exclusively to creatures consisting of particles of the Standard Model. Thus, with our current knowledge there is room for the existence of ∼ 10^32 distinct dark sectors, each producing more than one type of intelligent beings.
On this extended landscape of possibilities, black holes provide a universal search tool. First, as explained, due to their maximal efficiency, black hole based devices represent universal attractor points in the evolution of quantum computing. Secondly, the usage of black holes as information storing devices can produce observable signatures of the dark ETI. Due to the fact that Hawking radiation is democratic in particle species, the dark quantum computers will inevitably radiate the species of the Standard Model, such as neutrinos and photons.
Some comments are, however, in order. If the number of species N_sp is large, the characteristics of the lightest black holes will be affected. In pure Einstein gravity, quantum gravity becomes strong at the Planck scale M_P. That is, the interactions of particles at distances ∼ L_P are fully dominated by gravity. Correspondingly, the Planck scale sets both the mass, M ∼ M_P, and the radius, R ∼ L_P, of the lightest black hole. No lighter black hole can exist in nature within the Einstein theory of gravity.
This fact creates a technical obstacle for civilizations seeking to manufacture light black holes. For example, in order to produce an M_P-mass black hole, we need to collide some elementary particles with a center of mass energy ∼ M_P and an impact parameter ∼ L_P. This is highly non-trivial. For example, with the current technology available to our civilization, boosting particles to such high energies would require a particle accelerator of cosmic extent.
However, in a theory with a large number of particle species, the story is dramatically different. The scale of strong gravity is lowered to the scale M∗ given by (4). Correspondingly, the mass of the lightest black hole is now given by M∗, as opposed to M_P. Such a low scale can be much more easily accessible. It suffices to notice that, for example, in the extreme case of N_sp ∼ 10^32, such a black hole can even be manufactured by our current civilization at the LHC (Dvali & Redi 2008).
In fact, a particular case of this is provided by a well known prediction (Antoniadis et al. 1998) of the theory of large extra dimensions introduced in (Arkani-Hamed, Dimopoulos & Dvali 1998). In this theory, the role of the additional particle species is played by the Kaluza-Klein tower of gravitons. Their number is precisely N_sp ∼ 10^32 (Dvali 2007).
The general message is that with large N_sp, black holes smaller than a certain critical radius R∗ are lighter than their counterparts in pure Einstein gravity (Dvali 2010). Below this size, the precise relation between M and R is theory-dependent. For example, in the theory of Arkani-Hamed, Dimopoulos & Dvali (1998) with d extra dimensions of size R∗, black holes of radius R < R∗ and mass M obey the following relation,

R^{1+d} ∼ M / M∗^{2+d} .  (40)

The theory-independent phenomenological bound on the critical scale R∗ comes from short distance tests of Newtonian gravity. The current bound of R∗ ≤ 38 µm is provided by torsion-balance experiments (Lee et al. 2020).
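For illustration (our sketch; the choices M∗ = 1 TeV and d = 6 are assumptions for the example, not values fixed by the text), Eq. (40) in natural units gives the size of a collider-scale black hole:

```python
hbar_c = 1.973e-17                 # hbar*c in cm * TeV

def micro_bh_radius(M_TeV, M_star=1.0, d=6):
    """Eq. (40) in natural units: R^(1+d) ~ M / M*^(2+d); result in cm."""
    R_natural = (M_TeV / M_star**(2 + d)) ** (1.0 / (1 + d))   # in TeV^-1
    return R_natural * hbar_c

print(f"R ~ {micro_bh_radius(10.0):.1e} cm")   # a ~10 TeV black hole: ~3e-17 cm
```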
In the light of the above, we must take into account that for a large number of species it becomes easier to manufacture small black holes. Correspondingly, with larger N_sp, the compact storage and fast processing of information via black holes becomes accessible even to less advanced ETIs.
13. SUMMARY
The advancement of a civilization is directly linked with its ability to efficiently store and process information. It is therefore important to understand the fundamental limitations on this advancement imposed by our current understanding of the laws of nature. At the most fundamental and experimentally best-verified level, this understanding comes from the framework of quantum field theory.
Recent studies (Dvali 2021) show that the validity of quantum field theory imposes a universal upper bound on the information storage capacity of a device that can be composed of the corresponding quanta. The objects saturating this bound, the so-called "saturons", are the most efficient storers of quantum information.
Interestingly, the universality of the gravitational interaction tells us that, among all possible hypothetical saturons, black holes have the highest capacity of information storage (Dvali 2021).
Correspondingly, any sufficiently advanced civilization is ultimately expected to develop black hole based quantum computers. This opens up an exciting prospect for SETI through the Hawking radiation from such computers.
Certain generic features emerge. The estimated efficiency of information processing indicates that "memory disks" that allow for short information retrieval times (e.g., t_q ≤ 1 sec) must be based on microscopic black holes (e.g., of size R ≤ 10^-18 cm). Correspondingly, the Hawking radiation from ETI must be in a high energy spectrum that is potentially detectable by human devices.
Even if ETI are entirely composed of totally new types of particle species, due to the universality of Hawking radiation the emitted quanta will contain the known particles, such as neutrinos or photons. Thus, the ultimate efficiency of black hole quantum computing allows us to search for ETI even if they are entirely composed of some unknown quanta.
The main message of our paper is that new knowledge about ETI can be deduced by analysing the Hawking radiation from their quantum computers. In turn, the non-observation of such radiation can be translated into new limits on ETI.
We gave some indicative estimates. Analysing the IceCube VHE (∼30 TeV) data, we showed that the mentioned instruments are able to detect the neutrino emission flux from black hole farms of Type-II,III advanced civilizations. In particular, the current sensitivity and the observed VHE flux impose constraints on the numbers of civilizations: 2.5 × 10^3 ≤ N^ν_II ≤ 4.2 × 10^4, 2 × 10^6 ≤ N^ν_III ≤ 1.4 × 10^8.
We have also pointed out that the need for black hole computers can lead to accompanying signatures from the manufacturing process. For example, black hole factories will result in radiation from the high energy collisions that are necessary for the formation of micro black holes.
Although, most likely, our knowledge covers only a limited number of the existing particle species, the above features are independent of this lack of knowledge. Moreover, a detection of Hawking radiation from ETI can indirectly contribute to widening our knowledge about hidden particles. This is due to the general fact that the larger the number of particle species N_sp, the easier it becomes for any civilization to manufacture black holes (Dvali & Redi 2008). This is true also for our civilization, which has a chance of creating micro black holes at colliders, provided N_sp ∼ 10^32.
In summary, the universality of the laws of quantum physics and gravity allows us to draw some surprisingly powerful conclusions about ETI. We have argued that on the evolutionary road of any long-lasting civilization, black hole based quantum computers are natural attractor points. This opens up a new avenue for SETI. The general smoking guns include potential detections of high energy neutrino fluxes with an approximately thermal distribution. Another interesting but more exotic possibility, which requires additional clarification, is that farms of closely spaced black holes, which likely produce an extended high temperature medium, may occasionally create and radiate away heavy composite structures.
Acknowledgments — The work of G.D. was supported in part by the Humboldt Foundation under the Humboldt Professorship Award, by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy - EXC-2111 - 390814868, and by Germany's Excellence Strategy under the Excellence Cluster Origins. Z.O. would like to thank the Arnold Sommerfeld Center, Ludwig Maximilians University and the Max Planck Institute for Physics for hospitality during the completion of this project.
REFERENCES
Aartsen, M.G., et al., 2014, Phys. Rev. Lett., 113, 101101
Ahrens, J., et al., 2004, Astropart. Phys., 20, 507
Aharonian, F.A., 2004, Very High Energy Cosmic Gamma Radiation - A Crucial Window on the Extreme Universe, World Scientific Publishing Co. Pte. Ltd.
Antoniadis, I., Arkani-Hamed, N., Dimopoulos, S. & Dvali, G.R., 1998, Phys. Lett. B, 436, 257 [arXiv:hep-ph/9804398 [hep-ph]].
Arkani-Hamed, N., Dimopoulos, S. & Dvali, G.R., 1998, Phys. Lett. B, 429, 263 [arXiv:hep-ph/9803315 [hep-ph]].
Boyajian, T.S., et al., 2016, MNRAS, 457, 3988
Bekenstein, J.D., 1973, Phys. Rev. D, 7, 2333
Cameron, A.C.W., 1963, Communicating with Extraterrestrial Intelligence on other worlds, Sky & Telescope, 26, 258
Carroll, B.W. & Ostlie, D.A., 2010, An Introduction to Modern Astrophysics and Cosmology, Cambridge University Press, Cambridge, UK.
Dyson, F., 1960, Science, 131, 1667
Drake, F.D., 1961, Discussion at the Space Science Board, National Academy of Sciences, Conference on Extraterrestrial Intelligent Life, November 1961, Green Bank, West Virginia
Dvali, G. & Gomez, C., 2009, Phys. Lett. B, 674, 303
Dvali, G. & Gomez, C., 2011, Fortsch. Phys., 61, 742
Dvali, G. & Gomez, C., 2012, Eur. Phys. J. C, 74, 2752
Dvali, G., Flassig, D., Gomez, C., Pritzel, A. & Wintergerst, N., 2013, Phys. Rev. D, 88, 124041
Dvali, G., Franca, A., Gomez, C. & Wintergerst, N., 2015, Phys. Rev. D, 92, 125002
Dvali, G., 2010, Fortsch. Phys., 58, 528, [arXiv:0706.2050 [hep-th]].
Dvali, G. & Redi, M., 2008, Phys. Rev. D, 77, 045027, [arXiv:0710.4344 [hep-th]].
Dvali, G. & Redi, M., 2009, Phys. Rev. D, 80, 055001, [arXiv:0905.1709 [hep-ph]].
Dvali, G., 2010, Int. J. Mod. Phys. A, 25, 602
Dvali, G. & Panchenko, M., 2016, Fortsch. Phys., 64, 569
Dvali, G., [arXiv:1711.09079 [quant-ph]].
Dvali, G., 2018, Phys. Rev. D, 97, 105005, [arXiv:1712.02233 [hep-th]].
Dvali, G., 2018, Fortsch. Phys., 66, 1800007, [arXiv:1801.03918 [hep-th]].
Dvali, G., Michel, M. & Zell, S., 2019, EPJ Quant. Technol., 6, 1, [arXiv:1805.10292 [quant-ph]].
Dvali, G., [arXiv:1810.02336 [hep-th]].
Dvali, G., Eisemann, L., Michel, M. & Zell, S., 2020, Phys. Rev. D, 102, 103523
Dvali, G., 2021, JHEP, 03, 126
Dvali, G., 2019, Fortsch. Phys., 69, 2000090, [arXiv:1906.03530 [hep-th]].
Dvali, G., 2021, Fortsch. Phys., 69, 2000091, [arXiv:1907.07332 [hep-th]].
Dvali, G. & Sakhelashvili, O., 2022, Phys. Rev. D, 105, 065014, [arXiv:2111.03620 [hep-th]].
Dvali, G., Kaikov, O. & Bermúdez, J.S.V., 2022, Phys. Rev. D, 105, 056013, [arXiv:2112.00551 [hep-th]].
Dvali, G., Kühnel, F. & Zantedeschi, M., 2022, Phys. Rev. Lett., 129, 061302, [arXiv:2112.08354 [hep-th]].
Dvali, G. & Venugopalan, R., 2022, Phys. Rev. D, 105, 056026
Dvali, G., 2016, Fortsch. Phys., 64, 106, [arXiv:1509.04645 [hep-th]].
Escrivà, A., Kuhnel, F. & Tada, Y., [arXiv:2211.05767 [astro-ph.CO]].
Gelis, F., Iancu, E., Jalilian-Marian, J. & Venugopalan, R., 2010, Ann. Rev. Nucl. Part. Sci., 60, 463
Greisen, K., 1966, Phys. Rev. Lett., 16, 748
Zatsepin, G.T. & Kuzmin, V.A., 1966, JETP Lett., 4, 78
Hsiao et al., 2022, MNRAS, 506, 1723
Hawking, S.W., 1974, Nature, 248, 30
Kardashev, N.S., 1964, Soviet Ast., 8, 217
Kennicutt, R.C. & Evans, N.J., 2012, ARAA, 50, 531
Lee, J.G., Adelberger, E.G., Cook, T.S., Fleischer, S.M. & Heckel, B.R., 2020, Phys. Rev. Lett., 124, 101101
Lineweaver, C.H., Fenner, Y. & Gibson, B.K., 2004, Science, 303, 59
Los, J.H., Zakharchenko, K.V., Katsnelson, M.I. & Fasolino, A., 2015, Phys. Rev. B, 91, 045415
Lu, J., Lee, K. & Xu, R., 2020, Science China Physics, Mechanics, and Astronomy, 63, 229531
Maccone, C., 2012, AcAau, 78, 3
MacGibbon, J.H. & Webber, B.R., 1990, Phys. Rev. D, 41, 3052
Tarter, J., 2001, ARAA, 39, 511
Osmanov, Z.N., 2021, IJAAE, 6, 049
Osmanov, Z.N., 2021, JBIS, 74, 193
Osmanov, Z., 2018, IJAsB, 17, 112
Osmanov, Z., 2016, IJAsB, 15, 127
Osmanov, Z. & Berezhiani, V.I., 2019, JBIS, 72, 254
Osmanov, Z. & Berezhiani, V.I., 2018, IJAsB, 17, 356
Page, D.N., 1993, Phys. Rev. Lett., 71, 3743
Petigura, E.A., Howard, A.W. & Marcy, G.W., 2013, Prevalence of Earth-size planets orbiting Sun-like stars, Proceedings of the National Academy of Sciences, 110 (48), 19273-19278
Prantzos, N., 2013, IJAsB, 12, 246
Sagan, C., 1973, The Cosmic Connection: An Extraterrestrial Perspective, Dell Publishing Co., Inc., New York, USA.
Sagan, C., 1963, Direct contact among galactic civilizations by relativistic interstellar spaceflight, Planet. Space Sci., 11, 485
Sahu, S. & Miranda, L.S., 2015, EPJC, 75, 273
Schleicher, D.R.G. & Bovino, S., 2022, IJAsB (accepted)
Shapiro, S.L. & Teukolsky, S.A., 2004, Black Holes, White Dwarfs, and Neutron Stars: The Physics of Compact Objects, Wiley-VCH, Weinheim
Shklovskii, I. & Sagan, C., 1966, Intelligent Life in the Universe, Random House, New York.
Wright, J.T., 2020, Ser. Astr. J., 200, 1
Zuckerman, B., 2022, MNRAS, 514, 227