THE PRINCIPLE OF CONSERVATION OF INFORMATION AS THE EVIDENCE OF
EXISTENCE OF GOD
Prof. dr hab. inż. Jerzy Lechowski
International Academy of Sciences AIS
Department of Cybernetics
This work aims to show that the validity of the second principle of thermodynamics follows from the principle of conservation of information, a proof of which is carried out here. It is ascertained that in the universe as we know it this principle must be limited by the nature of its carrier (material or energetic), but information is at the same time infinite, as it may be transformed into other forms, like mass or energy. A way to calculate the maximum volume of information contained in the whole matter of the universe, as we know it, is presented. It is argued that water is the most perfect antientropic material of all living creatures.
1. Relation between the principle of conservation of information and the second principle of thermodynamics
The second principle of thermodynamics is considered one of the most important in the physical and chemical sciences. Its validity might have been questioned by "Maxwell's demon", a sly and intelligent being, were it not for the principle of conservation of information in isolated systems, published for the first time in 1972, which decisively expelled the demon from thermodynamics. The second principle of thermodynamics is connected with the notion of efficiency η, defined as the quotient of the useful work Wu and the performed work Ww: η = Wu/Ww. Because Ww = Wu + Wr, where Wr is the dissipated work (converted into heat), we have:

η = (Ww − Wr)/Ww = (Q1 − Q2)/Q1 = (m·cw·ΔT1 − m·cw·ΔT2)/(m·cw·ΔT1) = (ΔT1 − ΔT2)/ΔT1 = [(T1 − T0) − (T2 − T0)]/(T1 − T0)
Assuming T0 = 0 K, we have:

η = (T1 − T2)/T1 = 1 − T2/T1 = (Q1 − Q2)/Q1 = 1 − Q2/Q1
Since Q2/Q1 = T2/T1, it follows that Q1/T1 = Q2/T2. Denoting S1 = Q1/T1 and S2 = Q2/T2, we arrive at:

ΔS = S1 − S2 = Q1/T1 − Q2/T2 = 0
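The derivation above can be checked numerically. The sketch below uses hypothetical reservoir temperatures and a hypothetical heat input; it verifies that the two expressions for the efficiency agree and that the entropy balance of the reversible cycle vanishes:

```python
import math

# Hypothetical reservoir temperatures (K) for a reversible Carnot cycle.
T1, T2 = 600.0, 300.0
Q1 = 1000.0                 # heat drawn from the hot reservoir (J), assumed value
Q2 = Q1 * T2 / T1           # heat rejected; reversibility fixes Q2/Q1 = T2/T1

eta_T = 1 - T2 / T1         # efficiency computed from temperatures
eta_Q = 1 - Q2 / Q1         # efficiency computed from heats
dS = Q1 / T1 - Q2 / T2      # entropy balance S1 - S2 of the reversible cycle

print(eta_T, eta_Q, dS)     # both efficiencies agree; dS = 0
```
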
In isolated systems (and we can assume that the universe represents an isolated system) and in reversible processes occurring in nature, the entropy remains unchanged. So does the volume of information, defined as negentropy, that is, a decrease of entropy. This statement may be reversed, allowing us to affirm that from the principle of conservation of the volume of information in reversible processes and in isolated systems follows the validity of the second principle of thermodynamics, i.e. ΔS = 0.
This problem is linked closely to the thought experiment of L. Szilard of 1928 [6], who built on suggestions articulated by M. Smoluchowski in 1906 concerning the second principle of thermodynamics. L. Brillouin commented on this fact in 1956 as follows:
"When we discover the notable similarity between information and entropy, physics comes into question. This similarity was noted long ago by Szilard in his work of 1929, which is believed to be a precursor of the present theory of information. That work was a truly pioneering penetration of a new territory which we are thoroughly investigating right now." [1, p. 20]
From the Smoluchowski–Szilard thought experiment it follows that the entropy of the system must change when it is subjected to data collection, if the second principle of thermodynamics is to be reaffirmed. The collection of certain information about a system is possible only when the entropy of both the system and its environment changes: during the collection of data by one system about another, a change of entropy takes place in both systems and in their environment. To reveal this fact, the Smoluchowski–Szilard experiment is described below [5,6].
A cylinder containing one molecule is partitioned into two parts, of volumes V1 and V2 respectively. Let the partition serve as a piston sliding up and down inside the cylinder. We assume that a man is able to observe the said molecule and, depending on its actual position, either in V1 or in V2, can switch the piston with the help of a coupled lever, making the molecule perform work in an isothermal process, for example lifting some weight while losing its energy. Depending on the position of the molecule we should distinguish two components of the entropy, S1 and S2, such that:

S = p1·S1 + p2·S2
where p1 and p2 are the respective probabilities, defined by:

p1 = V1/(V1 + V2),  p2 = V2/(V1 + V2)

When the molecule is observed in V1, the observation (data collection about its position) decreases the entropy of the system by s1 = k·ln(V1/(V1 + V2)) = k·ln p1, and when it is observed in V2, by s2 = k·ln(V2/(V1 + V2)) = k·ln p2; the subsequent isothermal expansion of the gas raises the entropy again by the corresponding amount. The average value of the entropy change produced by observation (data collection) is negative and amounts to:

s̄ = p1·s1 + p2·s2 = k·(p1·ln p1 + p2·ln p2) ≤ 0
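The bookkeeping above can be illustrated numerically. In the sketch below the volumes V1 and V2 are arbitrary assumptions; it checks that the average entropy change of observation is negative and that each observation branch is exactly compensated by the expansion in the reversible limit:

```python
import math

k = 1.38e-23              # Boltzmann constant, J/K (value used in the text)
V1, V2 = 1.0, 3.0         # hypothetical partition volumes (arbitrary units)

p1 = V1 / (V1 + V2)       # probability of finding the molecule in V1
p2 = V2 / (V1 + V2)

s1 = k * math.log(p1)     # entropy change of observation, molecule found in V1
s2 = k * math.log(p2)     # entropy change of observation, molecule found in V2
s_avg = p1 * s1 + p2 * s2 # average entropy change produced by observation

# The isothermal expansion that follows raises the entropy by exactly -s_i,
# so in each branch the total balance dS_i + s_i is zero (reversible limit).
dS1, dS2 = -s1, -s2
print(p1 + p2, s_avg < 0, dS1 + s1, dS2 + s2)
```
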
Without this, the second principle of thermodynamics could not be fulfilled, due to "Maxwell's demon". The demon could sort all the molecules into those of smaller and greater energy, and then it would be possible to avail oneself of the heat of the system without a temperature difference between T1 and T2.
If the second principle of thermodynamics is to be fulfilled in open systems, the following condition must hold:

ΔS + s̄ ≥ 0
Therefore, the second principle of thermodynamics arises out of the principle of conservation of information in isolated systems. However, two conditions must be fulfilled:

I. p1 + p2 = 1
II. S1 = −k·ln p1 ≥ 0 and S2 = −k·ln p2 ≥ 0

Condition I is fulfilled by the definition of probability, since the molecule may be present only either in V1 or in V2. Condition II is also fulfilled, because for each of the separate events the probability satisfies 0 < pi ≤ 1, so the inequality −ln pi ≥ 0 holds, and therefore S1 ≥ 0 and S2 ≥ 0.
From the second condition it is evident that if S1 is arbitrarily low, then S2 must be correspondingly high, and vice versa. Each measurement is burdened with an error, because it disturbs, to a lesser or greater extent, the measured object; hence it is a good example in support of the principle of conservation of information. This becomes evident because during the measurement a certain entropy, equal to the information collected, is introduced into the system along with the provided energy. If the system provides information while sending off some energy to the environment, the entropy may remain in that environment and the information may be taken over by another system. This is linked directly to Heisenberg's uncertainty principle, which says that we cannot measure exactly, at the same time, the momentum and the position of a body: when Δp decreases, Δx rises, so the following condition must always be fulfilled:

Δp · Δx ≥ h,  and analogously  ΔE · Δt ≥ h
It is not only the taking of a measurement that is responsible for introducing information and entropy into the system and its environment. So is our intention to collect information about the system through calculation, because in doing so a certain amount of energy must be used up, increasing the entropy of the environment to compensate for the collected information.
2. Examples removing doubts over the evidence of the principle of conservation of information
1. Why does a lecturer, while passing over free information to the listeners, not lose it himself? To deliver a lecture, the lecturer first had to capture from the environment both energy and information: consuming food products, he had to extract from the environment, in the digesting process, both the energy essential for life and the linked information inherent in the food products, increasing their entropy.
2. Why, while availing ourselves of information sources such as books, CDs and others, do we not drain them? Moreover, different beneficiaries often get different information from the same source, depending on the level of their erudition and their actual necessities. To answer this question it is enough to carry out reasoning similar to that which led to the discovery of the principle of conservation of electric charge. That principle was accepted in spite of the alleged evidence that electric charges of different poles tend to disappear when colliding. Moreover, it was accepted in physics as a conservation principle of the utmost importance, even more important than the principle of conservation of mass, which, according to the theory of relativity, changes as a function of speed. Brillouin [1], citing Szilard [6] and underlining the fundamental importance of his work, drew attention to the fact that any collected information must be "paid for" with entropy. In each case of collecting information from any source, energy is needed, which dissipates, increasing the entropy of the environment.
3. To read a book, a light source is necessary to light up the text. The light energy is dissipated, increasing the entropy of the environment, which becomes "the price" paid for the information collected from the text. In any other case of collecting information from a source, certain energy is always needed, which, in dissipating, increases the entropy of the environment.
4. It might be said that by destroying an information source one destroys the information itself contained in it. That situation is similar to fuel combustion, which sets the energy free to be dissipated, though the principle of conservation of energy remains valid. The information contained in a destroyed source is conserved like the energy of fuel combustion; what changes is the form of the information. The information continues to exist, which is underlined in the word "in-forma", that is, in another form [10].
5. Information, like God alone, is eternal and infinite. It may transform itself infinitely from one form to another.
6. Just as "Maxwell's demon" was expelled from thermodynamics, so will forgery be expelled from human minds, and the truth will make man free.
3. Calculation of the information potential and information power of the system
In principle, any physical quantity may become an information potential whose gradient is a stimulus forcing the flow of the information carrier, and along with it, of the information itself. Therefore the gravitational force acting on a unit of mass may also be considered an information potential, in other words the intensity of the gravitational field, that is, the gradient of the gravitational potential, like the other gradients: of temperature, pressure, concentration, electric field, magnetic field, etc. Here, however, we follow the accepted definition of information potential [4,5,9].
In order to determine its value we must assume the value q (the information charge of one bit) and the work W1 (the work needed to transpose one bit of information, along with its minimum carrier, from infinity to a fixed point A in which there is already another bit of information), equal to:

W1 = kT·ln 2

(for T = 1 K). The information potential of point A, in which there is already another bit of information, then equals:

V1 = 0.7 · 1.38·10⁻²³ = 0.966·10⁻²³
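The value of W1 is easy to verify with the constants used in the text; note that the text rounds ln 2 to 0.7, which gives 0.966·10⁻²³ instead of the slightly smaller exact value:

```python
import math

# Check of W1 = k*T*ln 2 at T = 1 K, with the Boltzmann constant from the text.
k = 1.38e-23              # Boltzmann constant, J/K
T = 1.0                   # temperature, K

W1 = k * T * math.log(2)  # minimum work to transpose one bit of information
print(W1)                 # ~0.957e-23 J; the text's 0.966e-23 rounds ln 2 to 0.7
```
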
The work needed to transpose subsequent bits from infinity to a given material point with radius r0 is determined by the formula:

W∞ = ∫[r0, ∞] (n − 1)·q·q / r² dr = (n − 1)·q² · ∫[r0, ∞] r⁻² dr = (n − 1)·q² / r0
As may be seen, the work rises linearly with the transposed information, and the information potential at that point will show the same linear trend. In general it may be written that:

Wn = n·W1

(n – a natural number corresponding to the subsequent information bits transposed to the physical point, or otherwise to the elementary information carrier with radius r0 in which there is one bit of information). Let us denote by VnA the information potential of the point A in which there are n bits of information. That potential amounts to:

VnA = n·W1/q = n·10⁻²³ Vi

Assuming the definition of the information volt as [Vi] = J/bit, the information potential of point A will amount to n·10⁻²³ Vi. The information potential of a given point will represent the value 1 Vi if about 1/6 of a mole (some 10²³) of information bits is found at this point. The farad of information Fi may be determined on the basis of the …
Material and energetic carriers may be identified with each other, as mass and energy are bound by the formula E = mc². Mass may always be converted into energy, which (as an elementary carrier of information) may in turn be converted into the information transposed by the carrier. The carrier of information cannot equal zero. Theoretically, it is possible to calculate the maximum volume of information contained in (bound information) or transposed by (free information) a given mass or energy.
Example: let us calculate the maximum volume of information which may be contained in one gram of matter. The maximum volume of bound information in the structure of a system may be calculated from:

n = mc² / (kT·ln 2)  [bits]

where:
m – mass of the system
c – speed of light
k – Boltzmann's constant (k = 1.38·10⁻²³ J/K)
T – temperature in kelvins
There is also a formula for the maximum volume of free information the system can emit without abruptly changing its internal structure:

n = ε·S·σ·T⁴·t / (hν)

where:
ε – coefficient of emission capacity (emissivity) of the given body
S – surface of the system
σ – the Stefan–Boltzmann constant (σ = 5.67·10⁻⁸ W/(m²·K⁴))
t – time of emission of the information
h – Planck's constant (h = 6.62·10⁻³⁴ J·s)
ν – frequency of the emitted quantum
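Reading the formula as the emitted energy ε·S·σ·T⁴·t divided by the energy hν of one emitted quantum, the estimate can be sketched as follows. All input values below (emissivity, surface, temperature, time, frequency) are hypothetical illustrations, not values from the text:

```python
# Sketch of the free-information estimate: total radiated energy over the
# energy of one quantum, i.e. the number of emitted quanta.
sigma = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
h = 6.62e-34         # Planck constant, J*s
eps = 0.9            # hypothetical emissivity
S = 1.0              # hypothetical surface, m^2
T = 300.0            # hypothetical temperature, K
t = 1.0              # hypothetical emission time, s
nu = 3e13            # hypothetical frequency of the emitted quanta, Hz

E = eps * S * sigma * T**4 * t   # total energy radiated (grey-body law)
n = E / (h * nu)                 # number of emitted quanta = free bits
print(E, n)
```
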
The maximum volume of bound information contained in one gram of matter depends, visibly, on the temperature. Assuming we have reached the temperature of 1 K, in one gram of matter there can be at most:

n = mc² / (kT·ln 2) = 10⁻³ kg · (3·10⁸ m/s)² / (0.7 · 1.38·10⁻²³ J) ≈ 9.3·10³⁶ bits
It is easy to calculate that one electron cannot contain more than about 8.4·10⁹ bits. Assuming that in the whole universe known to us there are about 10⁸⁰ atoms, it can be estimated that it cannot contain more than about 10¹⁰⁰ bits of information.
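These estimates are easy to reproduce. The short sketch below uses the rounded constants from the text (and the standard electron mass, which the text does not state explicitly):

```python
import math

c = 3e8               # speed of light, m/s (rounded value used in the text)
k = 1.38e-23          # Boltzmann constant, J/K
T = 1.0               # temperature assumed in the text, K

def max_bound_bits(m_kg: float) -> float:
    """Maximum bound information n = m c^2 / (k T ln 2), in bits."""
    return m_kg * c**2 / (k * T * math.log(2))

print(f"{max_bound_bits(1e-3):.2e}")        # one gram  -> ~9.4e36 bits
print(f"{max_bound_bits(9.11e-31):.2e}")    # electron  -> ~8.6e9 bits
```
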
Each system, apart from the maximum of bound information which it may possess, also has an information potential as well as an information power. The information potential of the system depends, like electric potential, on its capacity. Therefore, it is the information power of the system that is decisive for its quality. Often, however, an excess of information creates a problem, because we do not know which information is most important.
Well, the most important is the information which has the greatest information power, i.e. a higher product of the "draft-lift power" [4,9] and a higher flow speed. What advantage can we get from information which is stored and cannot be transposed? In such a case the only, and most important, advantage is the fact that it creates the specific structure of the system, which remains unproductive for arranging the environment unless it constitutes a somewhat important element of the system. Free information of great power has the capacity to overcome information resistance because of its higher speed of flow v. The power is expressed by the formula:

P = W/t = F·s/t = F·v
According to M. Mazur [7], the power of a system is determined by the formula:

P = a·c·ν

where:
a – quality of the material,
c – volume (quantity) of the material,
ν – unit power of the material, i.e., the information power proper to the system.
The quality of the material, or otherwise the information power proper to the system, is bound up with its resistance to the increase of entropy.
Water is a material resistant to the increase of entropy due to the hydrogen inherent in it. Hydrogen features one of the highest specific heats, and this is probably why both hydrogen and water are the main ingredients of all living creatures. For the author this fact certifies that it was God, the Creator of life, who put up the best material to build living organisms, not the casual natural selection glorified by the theory of evolution.
4. Antientropic material of living creatures
The best material for building life has liquid form, neither solid nor gas. A liquid substance certainly represents the richest variety of physical resources as well as of dynamic and organizational possibilities. Water proved to be the best building material of life for many reasons. The first and most important is that water has, after hydrogen and helium, the highest specific heat of all substances present in nature. This is probably owing to hydrogen, which is the lightest element. The specific heat of hydrogen amounts to about 14300 J/(kg·K), and that of water to about 4190 J/(kg·K). The specific heat of helium, about 5190 J/(kg·K), is indeed higher than that of water, but helium, being a noble gas, does not react with other elements.
It is not accidental that hydrogen, having the highest specific heat, is present in nearly all organic compounds. Besides, it is worth noting that all trace elements found in living organisms have a relatively high specific heat; this is easy to verify by reviewing the trace elements present in living organisms. Why is precisely the specific heat so important for the selection of the building material of life? This feature is bound to the principle of conservation of information and to the conception of entropy. It turns out that substances of high specific heat are resistant to the increase of entropy, that is, to disorder, that is, to deficient organizational order. Isn't it great to have a body resistant to disorder? The increase of entropy in living bodies leads to death, destroying their structure and organization. The growth of entropy, as is known, is determined by the formula:
organization. The growth of entropy, as is known, is determined by the formula:
∆Q J m ⋅ cw ⋅ ∆T
∆S = =
From this formula it follows that the lower the specific heat of a body, the smaller the quantity of heat needed to warm a given mass m by 1 K, and vice versa: the higher the specific heat, the more heat must be provided to raise the body temperature by 1 K. Hence the conclusion that living bodies with a high specific heat are resistant to the increase of entropy, that is, are more fit to retain their organization against the destructive activity of the environment.
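The argument can be illustrated with a small numerical comparison: for the same absorbed heat ΔQ, a body with a higher specific heat warms less, ΔT = ΔQ/(m·cw). The specific-heat values below are standard handbook figures, and iron is chosen here only as a hypothetical low-specific-heat contrast:

```python
# For the same absorbed heat dQ, a body with higher specific heat warms less:
# dT = dQ / (m * c_w) -- the text reads this as resistance to entropy growth.
c_w = {"water": 4190.0, "iron": 450.0}   # specific heats, J/(kg K)
m = 1.0        # mass, kg
dQ = 4190.0    # heat input, J: enough to warm 1 kg of water by exactly 1 K

for name, c in c_w.items():
    dT = dQ / (m * c)                    # resulting temperature rise, K
    print(name, round(dT, 2))            # water: 1.0 K, iron: ~9.31 K
```
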
Some scientists point out that we owe our life to evolution and to random natural selection. However, from the principle of conservation of information it follows, the author argues, that it is the Creator who designed this wonderfully organized universe for us. Determinism (which assumes the existence of basic natural laws) and probabilism (which prizes statistical methods as the basic instrument for understanding interdependencies) lead us to the same way of understanding the universe. Therefore, they do not differ much, except for the method chosen and the initial assumptions adopted. There is no room for chance in the universe; only a man unfamiliar with the complexity of natural laws may think otherwise. So we can say with conviction that everything began with information. Moreover, information is and always will be present and decisive, though limited in view of its material and energetic carrier, and infinite in view of the process of permanent transformation of which we are witnesses.
[1] Brillouin L., Nauka a teoria informacji, Warszawa 1969.
[2] Gabor D., Communication Theory and Physics, Phil. Mag. 41, 7, 1950.
[3] Lechowski J., Modelowanie elektryczne pola informacyjnego w strukturach biologicznych, Materiały III Sympozjum PTFM, Zabrze 1972.
[4] Lechowski J., Analiza możliwości modelowania elektrycznego przepływu informacji w środowisku, Postępy Cybernetyki, 3, 1983.
[5] Smoluchowski M., Granice stosowalności drugiej zasady termodynamiki, [W:] Wkład polskich uczonych do fizyki statystyczno-molekularnej, PWN, Warszawa 1962.
[6] Szilard L., Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen, Zeitschrift für Physik, Bd. 53, Berlin 1928.
[7] Mazur M., Cybernetyczna teoria układów samodzielnych, PWN, Warszawa 1966.
[8] Mitiugow W. W., Fizyczne podstawy teorii informacji, PWN, Warszawa 1980.
[9] Lechowski J., Zastosowanie zasady zachowania informacji w układach izolowanych, Postępy Cybernetyki, 3, 1987.
[10] Lechowski J., Analiza możliwości modelowania elektrycznego przepływu informacji w organizmie człowieka i jego otoczeniu, Praca habilitacyjna, AIS, San Marino 1994.