arXiv:1502.03424v3 [quant-ph] 19 Feb 2015
Is dark energy an artifact of decoherence?
Chris Fields
528 Zinnia Court
Sonoma, CA 95476 USA
fieldsres@gmail.com
February 20, 2015
Abstract
Within the quantum Darwinist framework introduced by W. H. Zurek (Nat. Phys.,
5:181-188, 2009), observers obtain pointer-state information about quantum systems
by interacting with the surrounding environment, e.g. the ambient photon field. This
framework is applied to the observation of stellar center-of-mass positions, which are
assumed to be encoded in a way that is uniformly accessible to all observers regardless
of their location. Assuming Landauer’s Principle, constructing such environmental
encodings requires ∼ kT per bit. For 10^25 stars and a binary encoding of center-of-mass
positions into (10 km)^3 voxels, the free energy required at T = 2.7 K is equivalent to a
mass density of ∼ 5 · 10^−27 kg · m^−3, in striking agreement with the observed value of
ΩΛρc. Decreasing the voxel size to l_P^3 results in a free energy requirement 10^117 times
larger.
Keywords: Classicality; Critical density; Dark sector; Environment as witness; Informa-
tion; Landauer’s Principle; Quantum Darwinism
1 Introduction
The Planck results [1] together with earlier data [2, 3, 4] clearly establish the observational
effects of dark energy in our universe, setting the value ΩΛ = 0.69 ± 0.01 for the fractional
contribution of dark energy to the critical density. The physical meaning of this result,
however, remains unclear. Here I show that if the ambient photon field is taken to classically
encode the center-of-mass position pointer states of 10^25 stars with 10 km spatial resolution,
the free energy required to construct this encoding, assuming Landauer’s principle [5, 6],
is equivalent to a uniform mass density of ∼ 5 · 10^−27 kg · m^−3, in good agreement with
the observed value for the dark-energy density ΩΛρc = 5.7 · 10^−27 kg · m^−3. Decreasing
the encoding resolution to the Planck scale results in a factor of 10^117 increase in the free
energy required for encoding, suggesting that the well-known discrepancy between effective
field theory calculations of the vacuum energy and the observed dark-energy density may
be largely due to unrealistic assumptions about the effective classicality of information at
small scales.
2 Decoherence and the environment as witness
Decoherence is the apparent loss of quantum coherence from a system that is exposed to an
unobserved and uncharacterized environment [7, 8, 9, 10, 11, 12, 13, 14]. Following [11, 14],
decoherence can be viewed as a scattering process characterized by a scattering constant L.
Using [14], Eq. 3.67 and converting to SI units, L for an object of radius a and dielectric
constant ε ≫ 1 exposed to an ambient photon field at an effective temperature T can be
approximated as:
L ∼ 10^32 · a^6 · T^9 m^−2 · s^−1. (1)
The characteristic time for decoherence at a spatial scale x is then ([14], Eq. 3.58):
τ_x = L^−1 · x^−2 s. (2)
Consider the CMB as the ambient field, so T = 2.7 K, T^9 ∼ 10^4, and let x = 1 m. For an
atom of the Bohr radius, we obtain L ∼ 1.5 · 10^−26 m^−2 · s^−1 and τ ∼ 7 · 10^25 s, considerably
more than the age of the universe. For a star of one solar radius, however, L ∼ 10^89 m^−2 · s^−1
and τ ∼ 10^−89 s, i.e. the CMB efficiently decoheres the center-of-mass positions of stars.
W. H. Zurek and colleagues introduced the “environment as witness” formulation of de-
coherence [15, 16] in recognition of the fact that observers typically obtain pointer-state
information about quantum systems by interacting with the surrounding environment, e.g.
the ambient photon field. In this formulation and its “quantum Darwinist” extension to
multiple observers [17, 18, 19, 20, 21], the environment is taken to encode pointer-state
information for quantum systems embedded in it in a way that is massively redundant
and hence equally accessible to many non-interacting observers. Other than interaction
with the environment, no physical restrictions are placed on the observers in this picture;
in particular, they are not restricted in either their location with respect to the observed
system(s) or the resolution with which they interact with the environment. The resolution
with which the environment encodes pointer-state information about any particular object
is similarly unrestricted by any a priori consideration.
Consider now a description of astronomical observations within this quantum Darwinist
framework. The apparently classical values of center-of-mass positions of stars are, within
this framework, attributed to decoherence of their position pointer states by the ambient
photon field, which as seen above is a very efficient process. These center-of-mass position
values are encoded by the ambient photon field in a way that is massively redundant and
hence equally accessible to many non-interacting observers. Multiple observers can, there-
fore, obtain these center-of-mass position values by interacting with the ambient photon
field without disturbing either the encodings available to other observers or the center-of-
mass positions themselves, consistent with the effective classicality of the encoding.
3 Free energy required for encoding
Equal-sized voxels provide a simple encoding of 3-dimensional positional information that
is independent of the location of potential observers. Let us assume that the center-of-mass
positions of stars (or of other objects with apparently classical locations) are given by a
digital encoding, with “1” assigned to a voxel if the center of mass of the object in question
is within that voxel and “0” assigned to the voxel otherwise. The number of bits required
to unambiguously specify the center-of-mass position of any object then equals the number
of voxels. For an observable universe with approximate volume VU ∼ 4 · 10^80 m^3 and
(10 km)^3 voxels, the number NV of voxels is:

NV = VU · 10^−12 m^−3 voxels ∼ 4 · 10^68 voxels. (3)
Assuming a total of 10^25 stars, consistent with observations indicating a “bottom-heavy”
initial mass function [22, 23, 24], the total number Nb of bits required to encode the voxel
location of each star is:

Nb = NV · 10^25 ∼ 4 · 10^93 bits. (4)
We now invoke Landauer’s Principle [5, 6] by requiring that this environmental encoding
be irreversible and hence consistent with the possibility of observations being made by
observers at different locations after different elapsed times. In this case a free-energy cost
of ∼ kT per bit is required for the encoding. At T = 2.7 K, kT ∼ 4 · 10^−23 J, so an
expenditure of 1 J is sufficient to encode ∼ 2.5 · 10^22 bits. In this case, the total energy
required to encode the center-of-mass positions of all 10^25 stars is:

Eencoding = Nb kT ∼ 1.6 · 10^71 J, (5)

or an equivalent mass ∼ 2 · 10^54 kg. Assuming an approximately uniform distribution of
stars at cosmological scale, this energy expenditure must also be approximately uniform,
yielding a free-energy density:
ρencoding = (Nb kT)/(VU c^2) ∼ 5.0 · 10^−27 kg · m^−3, (6)
in equivalent mass units. This number clearly compares well with the value:
ΩΛρc = ΩΛ [(3 H0^2)/(8πG)] = 5.7 · 10^−27 kg · m^−3 (7)

for the dark energy contribution to the critical density ρc obtained with the values ΩΛ = 0.69
and H0 = 67.8 km · s^−1 · Mpc^−1 reported by Planck [1].
4 Physical interpretation
As emphasized by Zurek [12, 13], decoherence is a factorization-dependent phenomenon:
absent an a priori division of the universe into a “system” and its “environment” there is
no system-environment interaction and hence no decoherence. As pointed out from both
physical [25, 26, 27, 28, 29, 30, 31, 32] and philosophical [33, 34] perspectives, no physical
principle requires that any particular factorization be “preferred” in any way. What has
been shown here is that stipulating some particular factorization as being preferred in the
sense of being equally accessible to all observers regardless of their location has a physical
consequence within the framework of quantum Darwinism and Landauer’s Principle: any
such stipulation generates a free energy cost that depends on the resolution x at which
pointer states of the preferred systems are encoded by the environment. If the stipulated
factorization picks out 10^25 stellar center-of-mass positions that are encoded at 10 km
resolution, that free energy cost is remarkably close to the observed value of ΩΛρc.
Limiting the free energy cost of encoding to that allowed by the observed value of ΩΛρc
effectively limits the number of bits that may be encoded to a fixed NΛ. In this case, Eqs.
(6) and (7) can be combined with Nb as the empirical NΛ, allowing G to be written in an
essentially thermodynamic form:

G = (ΩΛ 3 H0^2 VU c^2)/(8π NΛ kT). (8)
It is unusual to think of G as a thermodynamic constant, but from the “it from bit”
perspective of decoherence it must be one.
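Eq. (8) can be checked numerically (a sketch, with NΛ set equal to the Nb of Eq. (4)); it recovers G to within a factor of ∼ 1.5, consistent with the order-of-magnitude agreement between Eqs. (6) and (7):

```python
import math

# Sketch of Eq. (8): back out G from the encoding parameters.
k_B = 1.380649e-23               # Boltzmann constant, J/K
c = 2.998e8                      # speed of light, m/s
T = 2.7                          # CMB temperature, K
V_U = 4e80                       # volume of the observable universe, m^3
N_L = 4e93                       # N_Lambda, taken equal to N_b from Eq. (4)
Omega_L = 0.69                   # dark-energy fraction
H0 = 67.8 * 1e3 / 3.086e22       # Hubble constant, s^-1

# Eq. (8): G = Omega_L * 3 * H0^2 * V_U * c^2 / (8 * pi * N_Lambda * k * T)
G_thermo = (Omega_L * 3 * H0**2 * V_U * c**2) / (8 * math.pi * N_L * k_B * T)
print(f"G (Eq. 8) ~ {G_thermo:.1e} m^3 kg^-1 s^-2")  # ~1e-10, vs. 6.7e-11
```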
For any fixed number of objects, increasing the encoding resolution by a factor ∆x (i.e.
decreasing the voxel side length by ∆x) increases the number of bits required, and hence
the free energy cost of the encoding, by a factor (∆x)^3. Encoding 10^25 stellar center-of-mass
positions at x ∼ lP would incur a free-energy cost ∼ 10^117 times larger than that found
here. It is well known that estimates of ΩΛρc from the vacuum energy given by an effective
field theory are in error by ∼ 10^120. As a free energy cost on the order of the vacuum
energy would be expected to characterize encodings of classical information at x ∼ lP, it
seems reasonable to suggest that the discrepancy between these numbers may be due to
the assumption that encoding classical information at x ∼ lP can be considered physically
meaningful.
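The (∆x)^3 scaling can be made explicit (a sketch; with a 10 km voxel side and the standard Planck length the factor comes out ∼ 10^116-10^117, depending on rounding):

```python
import math

# Scaling of the encoding cost with voxel side length: shrinking the side
# from 10 km to the Planck length multiplies the bit count, and hence the
# Landauer cost, by (x / l_P)^3.
x = 1e4            # 10 km voxel side, m
l_P = 1.616e-35    # Planck length, m

factor = (x / l_P)**3
print(f"cost increase factor ~ 10^{math.log10(factor):.0f}")
```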
If dark energy is indeed the free energy required to encode classical center-of-mass positions
as suggested here, it is in an important sense an artifact of the “choice” of a factorization
in which center-of-mass position is an observable. It is not surprising that this free energy
cost would remain unnoticed prior to deep-sky surveys revealing a previously unsuspected
density of distant galaxies embedded in an almost-uniformly cold background. One would,
moreover, expect an equivalent “dark energy” cost associated with all observations medi-
ated by a decohering environment. The potential observational signatures of such encoding
costs in relatively high-temperature environments at non-cosmological scales remain to be
investigated.
5 Conclusion
While it is easy to dismiss simple calculations such as the one demonstrated here as mere
numerology, they point to a physical consideration, that of the free-energy cost of encoding classical
information, that has heretofore largely been ignored in cosmological discussions. This free-
energy cost is factorization-dependent; absent factorization of the universe into “systems”
there is no classical information and hence no cost of encoding. One would expect, therefore,
a purely quantum mechanical description of the universe to be consistent with ΩΛ = 0.
Acknowledgement
This work was supported in part by The Federico and Elvia Faggin Foundation. Thanks
to Durmus Ali Demir for encouraging me to think more about the physical interpretation
of Eq. 6.
References
[1] PLANCK Collaboration (P. A. R. Ade et al.). Planck 2013 results. I. Overview of
products and scientific results. Astronomy and Astrophysics, 571:A1, 2013 [48 pages].
[2] M. Tegmark et al. Cosmological parameters from SDSS and WMAP. Phys. Rev. D, 69:
103501, 2004.
[3] M. Tegmark et al. Cosmological constraints from the SDSS luminous red galaxies. Phys.
Rev. D, 74: 123507, 2006.
[4] J. Frieman, M. Turner and D. Huterer. Dark energy and the accelerating universe.
Annual Review of Astronomy and Astrophysics, 46:385-432, 2008.
[5] R. Landauer. Irreversibility and heat generation in the computing process. IBM J.
Research Development, 5: 183-195, 1961.
[6] R. Landauer. Information is a physical entity. Physica A, 263:63-67, 1999.
[7] D. Zeh. On the interpretation of measurement in quantum theory. Found. Phys., 1:
69-76, 1970.
[8] D. Zeh. Toward a quantum theory of observation. Found. Phys., 3: 109-116, 1973.
[9] W. H. Zurek. Pointer basis of the quantum apparatus: Into what mixture does the
wave packet collapse? Phys. Rev. D, 24: 1516-1525, 1981.
[10] W. H. Zurek. Environment-induced superselection rules. Phys. Rev. D, 26: 1862-1880,
1982.
[11] E. Joos and D. Zeh. The emergence of classical properties through interaction with
the environment. Z. Phys. B: Condensed Matter, 59: 223-243, 1985.
[12] W. H. Zurek. Decoherence, einselection and the existential interpretation (the rough
guide). Phil. Trans. Royal Soc. A, 356: 1793-1821, 1998.
[13] W. H. Zurek. Decoherence, einselection, and the quantum origins of the classical. Rev.
Mod. Phys., 75:715-775, 2003. (arXiv:quant-ph/0105127v3).
[14] M. Schlosshauer. Decoherence and the Quantum to Classical Transition. Berlin:
Springer, 2007.
[15] H. Ollivier, D. Poulin, and W. H. Zurek. Objective properties from subjective quantum
states: Environment as a witness. Phys. Rev. Lett., 93:220401, 2004. (arXiv:quant-
ph/0307229v2).
[16] H. Ollivier, D. Poulin, and W. H. Zurek. Environment as a witness: Selective prolifer-
ation of information and emergence of objectivity in a quantum universe. Phys. Rev.
A, 72:042113, 2005. (arXiv:quant-ph/0408125v3).
[17] R. Blume-Kohout and W. H. Zurek. Quantum Darwinism: Entanglement, branches,
and the emergent classicality of redundantly stored quantum information. Phys. Rev.
A, 73:062310, 2006. (arXiv:quant-ph/0505031v2).
[18] W. H. Zurek. Quantum Darwinism. Nat. Phys., 5:181-188, 2009. (arXiv:0903.5082v1
[quant-ph]).
[19] C. J. Riedel and W. H. Zurek. Quantum Darwinism in an everyday environment: Huge
redundancy in scattered photons. Phys. Rev. Lett., 105: 020404, 2010.
[20] C. J. Riedel, W. H. Zurek and M. Zwolak. The rise and fall of redundancy in decoher-
ence and quantum Darwinism. New J. Phys., 14: 083010, 2012.
[21] M. Zwolak, C. J. Riedel and W. H. Zurek. Amplification, redundancy, and quantum
Chernoff information. Phys. Rev. Lett., 112: 140406, 2014.
[22] P. van Dokkum and C. Conroy. A substantial population of low-mass stars in luminous
elliptical galaxies. Nature, 468: 940-942, 2010.
[23] C. Conroy and P. van Dokkum. Counting low-mass stars in integrated light. Astro-
physical J., 747, 69, 2012.
[24] C. Tortora, A. J. Romanowsky and N. R. Napolitano. An inventory of the stellar initial
mass function in early-type galaxies. Astrophysical J., 765: 8, 2013.
[25] P. Zanardi. Virtual quantum subsystems. Phys. Rev. Lett., 87:077901, 2001.
arXiv:quant-ph/0103030v2.
[26] P. Zanardi, D. A. Lidar, and S. Lloyd. Quantum tensor product structures are
observable-induced. Phys. Rev. Lett., 92:060402, 2004. arXiv:quant-ph/0308043v1.
[27] M. Dugić and J. Jeknić. What is “system”: Some decoherence-theory arguments. Int.
J. Theor. Phys., 45: 2249-2259, 2006.
[28] M. Dugić and J. Jeknić-Dugić. What is “system”: The information-theoretic argu-
ments. Int. J. Theor. Phys., 47: 805-813, 2008.
[29] A. C. de la Torre, D. Goyeneche, and L. Leitao. Entanglement for all quantum states.
Eur. J. Phys., 31:325–332, 2010. (arXiv:1002.2893v1).
[30] N. L. Harshman and K. S. Ranade. Observables can be tailored to change the en-
tanglement of any pure state. Phys. Rev. A, 84:012303, 2011. (arXiv:1102.0955v1
[quant-ph]).
[31] W. Thirring, R. A. Bertlmann, P. Köhler, and H. Narnhofer. Entanglement or sepa-
rability: The choice of how to factorize the algebra of a density matrix. Eur. Phys. J.
D, 64:181–196, 2011. (arXiv:1106.3047v2 [quant-ph]).
[32] M. Dugić and J. Jeknić-Dugić. Parallel decoherence in composite quantum systems.
Pramana 79:199–209, 2012. (arXiv:1011.5919v6 [quant-ph]).
[33] C. Fields. On the Ollivier-Poulin-Zurek definition of objectivity. Axiomathes, 24:
137-156, 2014. (arXiv:1102.2826v5 [quant-ph]).
[34] R. Kastner. Einselection of pointer observables: The new H-theorem? Stud. Hist.
Phil. Mod. Phys., 48 Part A, 56-58, 2014. (arXiv:1406.4126v2 [quant-ph]).