We have considerable knowledge of self-organization at the level of physical, chemical, biological and social phenomena, and research into complexity is both an intellectual challenge and very useful. This paper presents a theoretical insight into scientific complexity in specific areas of financial trading through the third law of thermodynamics / entropy and the relation between quantum and Shannon information theory. The theoretical analysis draws implications for entropy through the informational pillars of reality. Financial trading (different financial models of the stock market, some based on competition between informed investors and "noise traders": noisy coding theory, game theory, and extreme value theory) is followed by an analysis and synthesis of the interrelationship of the third law of thermodynamics / entropy, their meaning following Ludwig Boltzmann's formula for physical entropy (S = k log W, the relation between the microscopic and the macroscopic world view), and the evolution of biology and biological complexity (John Kelly's formula for maximizing profit, discussed by Fredrik Burton and Richard Dawkins), under which the wealth of a successful gambler / investor grows in the same way. Physics and information are interwoven; real-world information differs from what it seems at first glance, because the world is ultimately quantum mechanical (quantum information theory supersedes Shannon's information, reducing to Shannon's information in some cases). Information is synonymous with knowledge. The network of human behavior rests on the theoretical principles of information, as in biology and physics. We ask: what is the entropy of a stock? What is the role of intelligence, the human brain, and entropy?
2. Third Law of Thermodynamics / Entropy and Financial Trading through Quantum and Shannon's Theory of Information
Njegovanović A. 116
analogy between entropy known in thermodynamics and entropy in information theory (this does not mean that understanding the basis of information theory requires knowledge of thermodynamics)." In the general mathematical sense there are numerous ways of understanding and interpreting entropy. Shannon's theorem gives the upper bound on channel capacity, in bits per second (bps), as a function of the available bandwidth and the signal-to-noise ratio of the link: C = B · log2(1 + S/N), where C is the channel capacity, B is the bandwidth of the line, S is the average signal strength and N is the average noise strength; the signal-to-noise ratio is expressed in decibels (dB) as 10 · log10(S/N). Shannon, in turn, formally defined the amount of information in a message as a function of the likelihood of occurrence of each possible message. Given the message space M = {m1, m2, ..., mn} and the probability p(mi) of the occurrence of each message, the information content of M is: H(M) = −Σⁿᵢ₌₁ p(mᵢ) log₂ p(mᵢ).
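The two formulas above can be checked numerically. The sketch below (illustrative numbers, not taken from the text) computes the Shannon–Hartley channel capacity and the entropy of a small message space:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def message_entropy(probs):
    """Information content of a message space: -sum p(mi) * log2 p(mi)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A 3 kHz line at a linear S/N of 1000 (i.e. 30 dB, since 10*log10(1000) = 30)
# carries roughly 29.9 kbps.
print(round(channel_capacity(3000, 1000)))        # 29902
# A uniform four-message space carries exactly 2 bits per message.
print(message_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Note that a deterministic source (one message with probability 1) has zero entropy: it conveys no information.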
Financial markets are dynamic systems that are constantly
developing and continually generating large amounts of
data. Trading in stocks, bonds, or currencies creates a
transaction network among brokers resulting from the
architecture of the financial market. The attention of
physicists to financial markets enables them to investigate
the dynamics of the market by the techniques of statistical
mechanics. Stock markets, futures, currencies and commodities are analyzed in the search for universal properties.
By defining the third law of thermodynamics / entropy (from
the aspect of physics) we try to find an analogy with the
financial system through the information theory.
"One version of the third law of thermodynamics states that an infinite number of steps would be needed to reach absolute zero, which means you will never get there. If you could reach absolute zero, it would violate another law, because with a refrigerator at absolute zero you could build a 100 percent efficient machine." (Saibal Mitra, professor of physics at Missouri State University). According to David McKee, a professor of physics at Missouri Southern State University, "there is a field of ultra-low-temperature research, and every time you turn around there is a new record low. Nanokelvin (nK = 10⁻⁹ K) temperatures are now easy to achieve, and everyone is working in picokelvins (pK = 10⁻¹² K). The YKI group of the Low Temperature Laboratory at Aalto University in Finland cooled a piece of rhodium metal to 100 pK, or 100 trillionths of a degree Celsius above absolute zero, surpassing the previous record of 280 pK set in 1993."
"Since the absolute zero of temperature is physically unattainable, the third law can be restated to apply to the real world as: the entropy of a perfect crystal approaches zero as its temperature approaches absolute zero. From experimental data we can extrapolate that the entropy of a perfect crystal reaches zero at absolute zero, but we can never empirically prove it." We can
conclude that entropy is the most intriguing term in science
and is used in the interpretation of processes in the field of
research of natural, social and human sciences.
How does the third law of thermodynamics / entropy relate to market volatility (the range of price movements of the market, an index or individual securities)? Volatility is usually measured by the standard deviation (deviation of returns from their average), but the standard deviation has some deficiencies; is there an alternative measure that addresses the problem of measuring volatility in the market? Some assets are more volatile than others; an individual share is more volatile than a stock exchange index that contains many different stocks. Less risk-tolerant investors may avoid volatile securities because of the uncertainty of their return. Volatility lets investors evaluate a security's beta (how volatile the security is relative to the broader market) using the Capital Asset Pricing Model (CAPM), which calculates the expected return on an asset based on its beta and the expected market return. Changes in return volatility affect risk-averse decision makers, but also spending patterns, corporate capital investment decisions and macroeconomic variables. We therefore ask: how should volatility be measured? If the standard deviation has drawbacks, what is the alternative measure? Entropy? But why entropy? Entropy can capture uncertainty and disorder in a time series without imposing any restrictions on the theoretical probability distribution, which is its main advantage. Entropy is a measure of the disorder or randomness of a particular system. Because it depends on the initial and final state of the system, the absolute value of entropy cannot be determined; we have to consider the difference between the initial and the final state in order to determine the entropy change. Which entropy measure should be used, since there are several: Shannon's entropy or Tsallis's entropy? The difference is that Rényi's and Tsallis's entropies are suitable for anomalous systems, while Shannon's suits systems in optimal balance.
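The contrast between these entropy measures can be illustrated with a short sketch. The distributions below are invented for illustration, and the Tsallis and Rényi formulas are written in their standard parametric forms, both reducing to Shannon's entropy as q, α → 1:

```python
import math

def shannon(p):
    """Shannon entropy in nats: -sum p_i ln p_i."""
    return -sum(x * math.log(x) for x in p if x > 0)

def tsallis(p, q):
    """Non-additive Tsallis entropy: S_q = (1 - sum p_i^q) / (q - 1)."""
    return (1 - sum(x ** q for x in p)) / (q - 1)

def renyi(p, alpha):
    """Renyi entropy: H_a = ln(sum p_i^a) / (1 - a)."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

uniform = [0.25] * 4                  # a system in "optimal balance"
skewed = [0.85, 0.05, 0.05, 0.05]     # an anomalous, concentrated system

# All three measures rank the uniform distribution as more uncertain, but the
# parameter q / alpha lets Tsallis and Renyi weight rare outcomes differently.
print(shannon(uniform) > shannon(skewed))   # True
print(round(tsallis(uniform, 2), 4))        # 0.75
```

Because Tsallis entropy is non-additive for q ≠ 1, it can accommodate the heavy tails and long-range dependence seen in financial return series.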
Trading on the stock exchange is a complex interlinking of
coexistence between rationality and irrationality. The
principles of biology are complex, complicated, and require
a great infrastructure to develop new knowledge.
However, our understanding of finance in permanent
global changes, new technologies, techniques, seeks a
new, innovative way of thinking. The principles of biology
are useful for understanding the internal functioning of the
financial industry. Financial markets are the adjustment
that enables us to stimulate economic growth and
development and risk management. Adjustment in the
sense of finance implies high innovation and
competitiveness, whereby evolution will help explain the
dynamics of financial markets.
Consider the application of information to biology, where genetics developed using the language of the preservation and transfer of information. There, information has a clear and well-defined meaning. Biological information is known for its
persistence, but the core is in the principles of universality.
Perhaps we can use it as a framework for successful stock
trading? Complexity is on average increasing, the key
component of this growth is the natural selection process
that clearly defined Charles Darwin. Natural selection is a
process that connects us to our environment because the
properties that survive are the best adapted to that
environment. In accordance with Darwin's definition of natural selection (although the financial world is not perfectly analogous to Darwinian natural selection), evolutionary finance studies the behavior of stock trading. In this regard, the market can be seen as a selection mechanism acting on financial decision-making, conscious adaptation and accidental mutation. Information, in other words, is driven by evolutionary forces within financial markets.
THEORETICAL BACKGROUND
Understanding the notion of entropy necessarily requires
its historical context from Clausius, Boltzmann, Gibbs and
Planck. According to Jean-Bernard Brissaud, "entropy measures freedom and allows a coherent interpretation of entropy formulas and experimental facts". Brissaud distinguishes several aspects of entropy: 1. entropy is the information about the physical system lost to an outside observer but present within the system; it represents countable information; 2. entropy measures degrees of freedom. Brissaud also noted that entropy is commonly assimilated with disorder (though temperature is a better measure of disorder).
Entropy in finance can be observed through the entropy of
information and probability entropy (portfolio selection and
asset pricing). G. C. Philippatos and C. J. Wilson suggested a mean-entropy approach which, despite some shortcomings in experimental research, is recognized in the area of portfolio selection; namely, "mean-entropy portfolios are consistent with Markowitz's full-covariance and Sharpe's single-index models."
Entropy has also been applied to the pricing of options, as in Entropy Pricing Theory (EPT), which was introduced through the pricing of stock options and bonds. Determining the option price leads to the principle of maximum entropy (MEP); the research of Buchen and Kelly showed that "the maximum-entropy distribution could match the known probability density function with precision". Their research also influenced Neri and Schneider, who developed a test for the maximum-entropy distribution. Entropy methods have been used to model the correlation between different currency pairs (Krishnan, Nelken, 2001), to obtain the risk-neutral density of future risky shares or other risky assets (Rompolis, 2010), and to derive the implied probability density and distribution from option prices (Guo, 2001). Stutzer and Hawkins used the MEP to price derivative securities and swaps (Stutzer, 1996; Hawkins, 1997).
Shannon's (1948) entropy is a "mathematically quantified degree of lost information"; building on the work of Nyquist and Hartley, it is the core of the formulation of information theory. By Shannon's formal definition, entropy is the average amount of information, choice and uncertainty encoded in samples drawn from a signal or message. The generalization of Shannon's entropy to any series with a defined probability distribution is well known in finance; applications include financial prediction (Molgedey, Ebeling, 2000), market efficiency (Jang, Kwak, Kaizoji, 2008; Mensi, 2012; Zhang) and exchange rates (Petron, Serva, 2003). Shannon's work is based on quantifying uncertainty and disorder in problems where classical methods (such as the Fourier transform) are insufficient. However, the lack of temporal relationships presumes some prior knowledge of the system, and the description of non-linear chaotic regimes is weak. This led to permutation entropy (with applications in biomedicine) as a technique for overcoming the problem (Bandt and Pompe, 2002).
It is also necessary to emphasize Boltzmann-Gibbs statistics as a basis of modern physics, representing the relation between statistical mechanics and classical thermodynamics by extending Clausius's concept of uncertainty to the microscopic state of the system (Boon, Tsallis, 2005). The applicability of Boltzmann-Gibbs statistics is, however, limited rather than universal, applying only to extensive regimes. Contrary to Boltzmann-Gibbs, Tsallis proposed that entropy need not be extensive as prescribed by the laws of thermodynamics, leading to the non-additive entropy known as Tsallis entropy. Despite some shortcomings, Tsallis entropy is accepted within the concept of non-extensivity in the study of financial systems. Important research here includes Rak et al. on the non-extensive statistical features of price fluctuations in the Polish market (Rak et al., 2007); Matsuba and Takahashi on the Nikkei market index (Matsuba, Takahashi, 2003); Vosvrda, who quantified the efficiency of capital markets with Tsallis entropy (Vosvrda, 2009); Namaki and associates, who compared emerging markets with mature markets in times of financial crisis (Namaki et al., 2013); and Senapati, who proposed a theoretical framework based on Tsallis entropy to explain the behavior of stock trading throughout the day (Senapati, Karmesh, 2016). In 1960, Alfréd Rényi proposed a mathematical generalization of Shannon's entropy, motivated by a theoretical proof of the central limit theorem, which he never completed. However, Rényi's entropy lacks the connection with thermodynamics that Shannon's entropy has, which prevents its integration into statistical mechanics. Stationarity is a prerequisite of many statistical methods, meaning that the mean, variance and autocorrelation structure remain invariant under time translation, which is important for modeling and predicting the financial system. But financial series are usually non-stationary: their statistical properties change over time. Time-dependent entropy was introduced in information theory to capture the time evolution of entropy; applications span disciplines such as biomedicine and finance (Alvarez-Ramirez, Rodriguez, 2012).
Multiscale entropy was introduced in 2002 to assess the complexity of finite signals over a range of time scales. Multiscale entropy also captures the short-term and long-term correlations of the "structural richness" of the signal. As a prevalent measure of complexity, the multiscale entropy algorithm has been applied in biomedicine (Costa et al., 2005), geophysics (Guzman-Vargas et al., 2008), hydrology (Li, Zhang, 2007; Wang et al., 2013) and finance (Wang et al., 2012; Liu et al., 2010; Xia et al., 2014; Niu and Wang, 2015; Sharma et al., 2010; Xia, Shang, 2012). Standard multiscale entropy relies on a coarse-graining algorithm that averages signal fluctuations over different scales.
FINANCIAL EVOLUTION
The finance theory continues to develop beyond the
traditional academic areas of rational expectations and
efficient markets with innovations coming from
psychology, neuroscience, biology, and physics.
Analyzing financial markets from a biological perspective,
within the evolutionary framework in which financial
markets, instruments, institutions, investors communicate
dynamically, developing under the "financial selection"
law. In the financial world, financial intermediaries adapt, though perhaps not optimally. Evolutionary models of financial markets are a new frontier whose research is still under way. The core of finance is the necessary compromise between risk and expected return: a positive change in the price of a security is the prize attracting investors, with corresponding risks, while risk aversion implies avoiding investment because of unforeseeable returns.
The fruitful collaboration of Doyne Farmer (University of Oxford) with Andrew Lo (finance professor at the MIT Sloan School of Management) and John Geanakoplos brought understanding of the financial market through Andrew Lo's adaptive markets hypothesis, which tries to reconcile Fama's efficient-market hypothesis with human behavior by using evolutionary biology and economic equilibrium. Farmer's method of understanding market behavior is based on agent-based modeling: simulation of investor participation in the financial market. This complex approach means building a library of information needed to simulate the interaction of participants in the financial market.
Professor Andrew Lo (MIT) points to the need for a paradigm shift in financial economics, with the aim of analyzing financial markets from an evolutionary perspective. The reason for this approach is "evolutionary mismatch": evolution may help explain risk aversion and the tendency toward "probability matching" in financial terms, which can ultimately lead to non-optimal investment. Behavioral finance does not explain why people make suboptimal decisions; we can look for an answer in the field of neurofinance, which adds biology to the mix of economics and psychology. The brain does not like risk and uncertainty. Why do people give up investing in risky forms of investment? Research conducted at the University of Bonn (CENs - Center for Economics and Neuroscience), combining socioeconomic, psychological and neuroscientific data in an innovative way, found the anterior insular cortex more active among people who do not trade stocks, while among professional traders anterior insular activity was lower.
Namely, many factors
influence investment decision-making, whether the investment turns out lucrative or not. The decision made follows the company's development: price movements, quarterly reports and stock price movements. It is difficult to digest the large amount of information confronting investors. The brain was not built to fully grasp complex systems such as the financial market and extract rational decisions; our brain has not changed, in a sense, and we still use the same "hardware" as our ancestors did millennia ago. A study conducted at Cambridge University under the supervision of neuroscientist John Coates carefully examined the stress hormone cortisol among traders. Monitoring the work of brokers at the London Stock Exchange over a longer period, cortisol levels were 68% higher among brokers than usual (double-blind checked results). The study concludes that cortisol causes traders to fall into a shock-induced paralysis, which makes them less risk-tolerant than usual and causes them to miss decisive opportunities, especially in times of crisis; in turn, this prolongs market falls longer than needed. Without this shock-induced paralysis, the researchers are convinced, risk-aware investors would more quickly identify favorable opportunities and the crisis would be overcome. Research shows that even professional investors are driven by their emotions. Two key brain players in financial matters are the reward system and the anxiety center. The reward system is responsible for motivating a person: it recalls experiences that felt good and should be repeated. The anxiety center, on the other hand, is responsible for risk awareness: potential threats, including abstract capital loss, are weighed here before each decision. Human
intelligence involves understanding and reasoning about an endlessly changing external environment. A brain capable of high variability in neural configurations or states will understand and anticipate variable external events. Entropy measures the variety of possible configurations within a system, and recently the concept of brain entropy has been defined as the number of neural states that a particular brain can access. The relationship between high brain entropy and high intelligence indicates the role of entropy in the functioning of the brain. Thus, access to variable neuronal states predicts complex behavior, and indicates that entropy derived from the neuroimaging signal carries information about intellectual capacity. Entropy is a word generally associated with physics, especially with the second law of thermodynamics. The entropic-brain hypothesis is a model of brain function that links total entropy in the brain to long-term psychological conditions, particularly pathological conditions such as schizophrenia and obsessive-compulsive disorder. Small fluctuations in brain entropy are natural, but according to the Carhart-Harris model many conscious states can be classified as "low entropy" or "high entropy". Neuroscientists have isolated two main networks in the brain: the Task Positive Network (TPN), associated with activity and action, and the Default Mode Network (DMN), associated with idling and maintaining a coherent ego, or the feeling of "self", while the TPN is engaged during active tasks. As Ilya Prigogine says, "entropy is the price of structure." Entropy is more than "mess" or "chaos"; metaphors that capture only some aspects of entropy (such as disorder and rigidity) carry negative, normative connotations that obscure what entropy generally means in mathematics and physics: increasing possibilities.
QUANTUM THEORY AND SHANNON'S THEORY OF INFORMATION
The contemporary definition of information relates the information in an event to the logarithm of the inverse of its probability of occurrence: I = log(1/p). This definition presupposes the existence of an event and the possibility of calculating the probability of that event. Translated into economic terms, one event could be a drop in stock prices (whatever the event, we can apply information theory to it). Shannon's definition of information is proportional to the logarithm of the inverse probability of an event (as in the well-known and highly illustrative story of the two users of a communication channel, Alice and Bob).
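The definition I = log(1/p) is easy to verify numerically; in the sketch below the event probabilities are purely illustrative:

```python
import math

def surprisal_bits(p):
    """Information content of an event with probability p: I = log2(1/p)."""
    return math.log2(1 / p)

# A rare event, e.g. a large one-day price drop assumed to occur with
# probability 1/32, is far more informative than an everyday move (p = 1/2).
print(surprisal_bits(1 / 32))  # 5.0
print(surprisal_bits(1 / 2))   # 1.0
```

A certain event (p = 1) carries zero information, and the rarer the event, the more information its occurrence conveys.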
Understanding information-exchange channels plays an important role in quantum Shannon theory, where several important properties have been proved (Adami, Cerf, 1997). The operational significance of channel information was shown by Bennett et al. as a tool for classical capacity merging. Giovannetti and Fazio (2005) found several capacities of the damped channel, with Wolf and Perez-Garcia adding clarifications. The classical channel capacity assisted by unlimited quantum feedback equals the entanglement-assisted classical capacity (Bowen, 2004). There are also results on the strong converse and on the second-order characterization of the entanglement-assisted capacity.
The quantum reverse Shannon theorem quantifies the rate of classical communication needed to simulate a quantum channel in the presence of unlimited entanglement between sender and receiver (Bennett, 2014). It is necessary to point out the work of Gupta and Wilde (2015), who provided a direct proof of the strong converse using Rényi entropy. The same strong converse holds in the presence of a quantum feedback channel, amplifying Bowen's results (2004). Second-order results for entanglement-assisted classical communication revealed a characterization for some channels, building on the earlier work of Matthews and Wehner (Datta et al., 2014). The relationship between information and entropy is deeply subtle. Shannon's derivation of the mathematical theory uses simple axioms that information has to satisfy (Shannon, Weaver, 1949); Shannon introduced his theory of information as a mathematical theory of communication (Shannon, 1948; Shannon and Weaver, 1949), introducing the general model of a communication system.
The quantum capacity theorem is one of the most important theorems in quantum Shannon theory. It is a basic quantum theorem showing that a fundamental quantity of quantum information, the coherent information, gives the achievable rate of quantum information transmission across a quantum channel. Coherent information has no strong analogue in classical Shannon theory, which separates quantum from classical information theory.
If we look at information as a binding force between different aspects of a quantum system, this means that quantum systems can share more than 100% of their information, so any theory of information should be able to deal with this in order to describe reality fully; otherwise, some parts of reality will remain inaccessible to our understanding. Quantum information theory supersedes Shannon's information theory, reducing to Shannon's information in some cases. Quantum information tells us about reality beyond what we have learned through Shannon, namely that it has a lot of untapped potential.
In its mathematical form, information is also entropy. We ask whether this information is the same quantity that appears in statistical mechanics. A simple example is obtaining the Boltzmann distribution by maximizing information entropy.
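That derivation can be mimicked numerically: the Gibbs/Boltzmann weights below form the distribution that maximizes Shannon entropy subject to a fixed mean energy, and as the inverse temperature beta → 0 it tends to the unconstrained entropy maximum, the uniform distribution. The energy levels here are arbitrary illustrative values:

```python
import math

def boltzmann(energies, beta):
    """Maximum-entropy distribution at fixed mean energy:
    p_i = exp(-beta * E_i) / Z."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # the partition function
    return [w / z for w in weights]

def entropy(p):
    """Shannon/Gibbs entropy in nats: -sum p_i ln p_i."""
    return -sum(x * math.log(x) for x in p if x > 0)

levels = [0.0, 1.0, 2.0]
cold = boltzmann(levels, beta=1.0)    # low temperature: concentrated, low entropy
hot = boltzmann(levels, beta=0.001)   # high temperature: near-uniform, near ln(3)
print(entropy(cold) < entropy(hot))   # True
print(round(entropy(hot), 3))         # 1.099  (ln 3 = 1.0986...)
```

The same functional form appears whether the "energies" are physical energy levels or any other constrained average, which is exactly the bridge between statistical mechanics and information theory that the text invokes.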
Can information theory help maximize profits on the stock market? Maximizing profit in financial speculation is the same problem as maximizing the capacity of a communication channel. Shannon solved the latter problem, and his solution carries over to the former: channel capacity is maximized when code lengths follow Shannon's rule of "logarithms of inverse probabilities". In that sense, the amount of money after investing plays the role of the message length, giving what stock-exchange practitioners call the "log-optimal portfolio". In practice, a log-optimal portfolio implies that investing everything in one option can lead to a loss in the initial stage and a lack of funds for the next investment; just as in communication, some positions involve large amounts of money while others must be very small. The value of a company's shares rises but also falls over time, i.e., there is natural fluctuation in the market; if the company is financially and strategically healthy, the overall growth trend is long-term. Using publicly available information we can roughly estimate that some company will grow next year by X%. Shannon's formula gives us a guideline: if the investment shows a success rate of only 50%, then on average there is no gain, with the potential of losing the invested capital.
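The 50% break-even claim is exactly what the Kelly criterion says at even odds. The following sketch uses a standard textbook form of Kelly's formula (not code from the paper): at p = 0.5 the optimal stake and the growth rate are both zero, while any informational edge (p > 0.5) yields positive log-growth equal to 1 − H(p) bits per bet, mirroring Shannon's capacity formula:

```python
import math

def kelly_fraction(p, b=1.0):
    """Kelly's optimal bet fraction at odds b:1 with win probability p:
    f* = (p * (b + 1) - 1) / b."""
    return (p * (b + 1) - 1) / b

def log_growth(p, b=1.0):
    """Expected log2-growth per bet at the Kelly fraction; at even odds this
    equals 1 - H(p), the link to Shannon's channel capacity."""
    f = kelly_fraction(p, b)
    if f <= 0:
        return 0.0  # no edge: the optimal action is not to bet at all
    return p * math.log2(1 + f * b) + (1 - p) * math.log2(1 - f)

print(kelly_fraction(0.5))        # 0.0 -- a 50% success rate means no stake, no gain
print(round(log_growth(0.6), 4))  # 0.029 -- a 60% edge compounds capital
```

Betting more than the Kelly fraction eventually ruins the gambler even with a positive edge, which is why "some positions must be very small".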
In financial market analysis and monetary speculation, can we use thermodynamics to describe market behavior and deduce general trends of the financial world? Thermodynamics points to the exhaustion of energy resources and the consequences for planet Earth as they disappear. We notice such an impact on the economy: the price of gasoline is linked to the price of oil, which depends on oil reserves and their availability, and any political instability in the Middle East leads to higher gasoline prices. Behavior in the financial market mirrors the behavior of physical systems in thermodynamics. The corresponding law of finance states: "There is no risk-free financial gain in an efficient market." The third law of thermodynamics has its analogy in stock trading: the law forbids reaching absolute zero, which in terms of trading means being completely cooled off from, i.e., entirely free of, the risk of investing. We can conclude that Shannon's formula for maximum channel capacity, Boltzmann's formula for physical entropy, and Kelly's formula for profit maximization are the same formula.
A communication signal is transmitted by a quantum system. Simple classical information is expressed in bits and is transmitted by a physical system with two distinct states: one state is logical 0, the other logical 1. Consequently, classical information expressed in bits can be carried over into quantum communication as the basis states |0⟩ and |1⟩. However, we then need the qubit, which can be any quantum system with two orthogonal states.
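A qubit's "both 1 and 0" character can be made concrete with the Born rule. The sketch below is an elementary illustration (not tied to any cited work): a qubit is represented by its two amplitudes over the orthogonal basis states |0⟩ and |1⟩:

```python
import math

def born_probabilities(alpha, beta):
    """Measurement probabilities of a qubit alpha|0> + beta|1>:
    p(0) = |alpha|^2, p(1) = |beta|^2 (amplitudes may be complex)."""
    return abs(alpha) ** 2, abs(beta) ** 2

# A classical bit is the special case of a qubit pinned to one basis state.
print(born_probabilities(1, 0))      # (1, 0)
# The equal superposition is "both": each outcome occurs with probability 1/2.
h = 1 / math.sqrt(2)
p0, p1 = born_probabilities(h, h)
print(round(p0, 3), round(p1, 3))    # 0.5 0.5
```

Normalization (|alpha|² + |beta|² = 1) is what makes the two numbers a probability distribution; measurement collapses the superposition to a single classical bit.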
QUANTUM COMPUTERS AND STOCK TRADING
Conventional computers store data as binary code: strings of ones (1) and zeros (0). Quantum computers instead use qubits (based on the quantum-mechanical principles that describe the particle-like and wave-like behavior of matter and energy on an extremely small scale), a unit of information that can be "1", "0", or both "1" and "0" at once.
How fast can computers become at processing information? Is there a limit to the exponential growth? If today we use 100 electrons to encode one bit of information, we can ask how few electrons will be needed to encode one bit in the future. Understanding these limits brings us back to the very definition of a computer. Today's computers are based on the laws of Boolean logic.
Classical information processing can be a good approximation of the situation at the macroscopic level, and sometimes that level of detail is sufficient for daily purposes. Exploring computation on a quantum basis, from physics, computer science, and information theory to mathematics and philosophy, opens a complexity whose study is growing rapidly. If we view computing as a process that maximizes mutual information between inputs and outputs, then the speed of computation can be regarded as the rate at which mutual information is established, i.e. the growth rate of the correlation between outputs and inputs. Qubits offer a higher rate of mutual information than is possible with bits, and this translates directly into the quantum speed-up seen in Shor's algorithm (named after mathematician Peter Shor, a quantum algorithm that runs on a quantum computer for integer factorization, formulated in 1994) and Grover's algorithm (a quantum algorithm that finds the unique input of a black-box function producing a particular output value by evaluating the function, created by Lov Grover in 1996).
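Grover's speed-up can be illustrated with a small classical state-vector simulation of the quantum process (a sketch under our own assumptions; the function name and problem size are illustrative):

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Classical state-vector sketch of Grover's search over N = 2**n items.

    Uses about (pi/4)*sqrt(N) oracle applications, versus ~N/2 queries
    for classical search on average.
    """
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))                   # uniform superposition
    oracle = np.eye(N)
    oracle[marked, marked] = -1.0                        # flip marked amplitude
    diffusion = 2.0 / N * np.ones((N, N)) - np.eye(N)    # inversion about mean
    for _ in range(int(np.pi / 4 * np.sqrt(N))):
        state = diffusion @ (oracle @ state)
    return np.abs(state) ** 2                            # outcome probabilities

probs = grover_search(3, marked=5)   # search 8 items for index 5
print(probs.argmax(), round(probs[5], 3))
```

After only two Grover iterations the marked item among eight dominates the measurement statistics, which is the "rate of establishing mutual information" the text alludes to.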
Quantum computers are ahead of us. Quantum effects have been observed experimentally in macroscopic objects such as pieces of solid bodies and organic molecules in living systems, indicating that quantum computers enable a "higher order" of data processing, solving problems that conventional computers are not capable of solving.
Quantum computing can help traders, analysts and their companies overcome the challenges of financial research. Finance is not like an empirical science, because it has no way of repeating experiments or testing theory under controlled conditions (most claims about research results in financial economics are probably incorrect, argues Professor Campbell Harvey, President of the American Finance Association; they hold only in "simple toy models", Lopez de Prado).
Just as modern physics cannot progress without instruments such as the Large Hadron Collider (LHC), useful finance that takes into account the true randomness and complexity of the market requires machines equal to the task, as Lopez de Prado points out. Many financial problems demand computations that go far beyond the capabilities of the fastest supercomputers (Lopez de Prado), for example dynamic portfolio optimization, which calculates the optimal path for core investments: "A better portfolio means less rebalancing and cost" (Lopez de Prado), and "some complex derivatives are path-dependent" (Lopez de Prado). The first rule of quantum mechanics is: "If you think you understand quantum mechanics, then you do not understand quantum mechanics."
The work of Ovidiu Sorin Racorean, "Decoding Stock Market Behavior with the Topological Quantum Computer" (2014), uses modern mathematical concepts (braids, knots, the Jones polynomial, knot invariants) to show the parallels between the movement of stocks and the movement of tiny particles called non-abelian anyons.
The relationship between physics and financial markets, through the uncertainty principle in a broader sense, goes back to Werner Heisenberg's development of quantum physics. Heisenberg's uncertainty principle can be applied, by analogy, to financial markets: the more precisely a particle's momentum is determined, the less precisely the position of the same particle can be determined. An interesting and intriguing analysis of the application of quantum mechanics and quantum computing to financial problems is "Quantum computing for finance: overview and prospects" (Roman Orus, Samuel Mugel, Enrique Lizaso, 2018). With reference to quantum annealing (quantum annealing is mainly used for problems where the search space is discrete, i.e. combinatorial optimization problems with many local minima, such as finding the ground state of a spin glass; it was proposed by T. Kadowaki and H. Nishimori in "Quantum Annealing in the Transverse Ising Model", although an earlier form was put forward by A. B. Finnila, M. A. Gomez, C. Sebenik and J. D. Doll in "Quantum Annealing: A New Method for Minimizing Multidimensional Functions"; Wikipedia), optimization is the nucleus of financial problems. Quantum annealing uses quantum physics to find low-energy states of a problem and hence an optimal or near-optimal combination of elements. Sampling from energy-based distributions is a computationally intensive task that suits the way the D-Wave system solves problems, namely by seeking low-energy states. In quantum annealing, tunneling events are thus exploited in pursuit of a global minimum. Optimal trading trajectories have already been developed to reduce trading costs, with execution algorithms balancing fast trading (minimizing market risk) against slow trading (minimizing market impact).
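The annealing idea can be sketched with classical simulated annealing on a toy asset-selection problem (a classical stand-in: a quantum annealer tunnels between states instead of hopping thermally; all asset numbers and names here are illustrative, not from the paper):

```python
import math
import random

# Classical simulated annealing on a toy asset-selection problem, as a
# stand-in for quantum annealing.  All figures below are hypothetical.
returns = [0.12, 0.10, 0.07, 0.03]            # hypothetical expected returns
risk = [[0.10, 0.02, 0.01, 0.00],             # hypothetical covariance matrix
        [0.02, 0.08, 0.01, 0.00],
        [0.01, 0.01, 0.05, 0.01],
        [0.00, 0.00, 0.01, 0.02]]

def energy(x):
    """Lower energy = better portfolio: risk penalty minus expected return."""
    ret = sum(r * xi for r, xi in zip(returns, x))
    var = sum(risk[i][j] * x[i] * x[j] for i in range(4) for j in range(4))
    return var - ret

def anneal(steps=5000, t0=1.0):
    """Cool the system slowly, accepting uphill moves with Boltzmann odds."""
    random.seed(0)
    x = [random.randint(0, 1) for _ in range(4)]
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9            # linear cooling schedule
        y = x[:]
        y[random.randrange(4)] ^= 1                   # flip one asset in/out
        de = energy(y) - energy(x)
        if de < 0 or random.random() < math.exp(-de / t):
            x = y
    return x, energy(x)

best, e = anneal()
print(best, round(e, 4))
```

The low-energy state found corresponds to the "optimal or near-optimal combination of elements" described above; a D-Wave-style machine would sample such low-energy states directly in hardware.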
INSTEAD OF A CONCLUSION
The paper gives a theoretical insight into the complexity of the financial world by comparing physics (the third law of thermodynamics) and entropy with quantum and Shannon's theory of information. The comparison gives a broader understanding of the interdependent, dynamic activities of the financial system.
Concepts and applications of entropy are relevant in finance, although the term entropy itself belongs to thermodynamics. The benefits lie in risk measurement and in describing distributions: entropy can serve as a risk measure in portfolio selection, replace variance in mean-variance models, or be added to the original portfolio model to optimize a new one. The principle of maximum entropy in asset pricing allows probabilities to be extracted from incomplete or limited information, successfully solving the canonical option-valuation problem, and is used in setting up different pricing models.
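As a minimal sketch of entropy as a distribution descriptor in portfolio selection (our illustration, not a model from the paper): the Shannon entropy of the portfolio weights is zero for a fully concentrated portfolio and maximal, ln(n), for equal weights, so it quantifies diversification.

```python
import math

# Minimal sketch (illustrative): Shannon entropy of portfolio weights
# as a diversification descriptor.  H = -sum w_i ln(w_i) is 0 for a
# fully concentrated portfolio and ln(n) for equal weights.
def portfolio_entropy(weights):
    return sum(-w * math.log(w) for w in weights if w > 0)

concentrated = [1.0, 0.0, 0.0, 0.0]    # everything in one asset
balanced = [0.25, 0.25, 0.25, 0.25]    # equally weighted

print(portfolio_entropy(concentrated))   # 0.0 (no diversification)
print(portfolio_entropy(balanced))       # ln(4), about 1.386 (maximal)
```

Entropy-based portfolio models add a term like this to the mean-variance objective, rewarding spread-out weights.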
Financial systems show extremely dynamic and chaotic behavior. The paper draws on well-documented methods of financial analysis such as autocorrelation (Mantegna and Stanley, 2000), nonlinear time-series analysis (Kodba et al., 2005) and cross-correlations (Coronnello et al., 2005; Coronnello et al., 2007; Garas and Argyrakis, 2007; Gonzalez et al., 2007; Gopikrishnan et al., 2000; Jung et al., 2006; Laloux et al., 1999; Mantegna, 1999; Noh, 2000; Pafka and Kondor, 2004; Plerou et al., 2002; Utsugi et al.). Such systems are extremely complex and require a different approach to understanding puzzling "financial mysteries".
The third law states: "The entropy of a perfect crystal is zero when the temperature of the crystal is equal to absolute zero (0 K)." According to Purdue University, the crystal "must also be at 0 K; otherwise there will be thermal motion within the crystal, which leads to disorder." Siabal Mitra, professor of physics at Missouri State University, points out: "One version of the third law states that an infinite number of steps would be needed to reach absolute zero, which means you will never get there. If you could reach absolute zero, it would violate the second law, because if you had a refrigerator at absolute zero, you could build a machine that was 100 percent efficient."
Historically, the third law of thermodynamics was first formulated by the German chemist and physicist Walther Nernst. In his book "A Survey of Thermodynamics" (American Institute of Physics, 1994), Martin Bailyn cites Nernst's statement of the third law as: "It is impossible for any procedure to lead to the isotherm T = 0 in a finite number of steps." This essentially establishes absolute zero as unattainable, in a way similar to the speed of light c: theory and experiment show that no matter how fast something moves, it can always be made to move faster, but it can never reach the speed of light. Similarly, no matter how cold a system is, it can always be made colder, but it can never reach absolute zero.
According to David McKee, a professor of physics at Missouri Southern State University, "there is a field of ultra-low-temperature research, and every time you turn around there is a new record. Nanokelvin (nK = 10^-9 K) temperatures are reasonably easy to reach, and work is now being done at picokelvins (pK = 10^-12 K)." The YKI group of the Low Temperature Laboratory at Aalto University in Finland cooled a piece of rhodium metal to 100 pK, or 100 trillionths of a degree Celsius above absolute zero, surpassing the previous record of 280 pK set in 1993.
The physicist Valery Chalidze first suggested the existence of a relation between entropy and money, but did not try to make this relation explicitly analytical. This can be done by taking into account the probability of a transaction, that is, the probability that a financial transaction takes place in a given time interval.
Information theory grew out of a specific problem that Shannon considered: maximizing the communication capacity between two users. It is enough to attribute a probability to an event; this is the foundation for defining a metric that quantifies the information content of that event. Biological information can be viewed through Shannon's theory as communication in time (where the goal of natural selection is to propagate the gene pool into the future). It should be emphasized, however, that this is not only about communication: biology also optimizes information, just as physical systems arrange themselves so that entropy is maximized, and this entropy is quantified in the same way as Shannon's information. Financial speculation is governed by the same concept of entropy, optimizing the capacity of our channel. Shannon's theory of information has been extended by quantum theory; quantum information theory manifests itself in cryptographic protocols, a new order in computer science.
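Shannon's capacity question can be made concrete with the binary symmetric channel, the canonical textbook instance (sketched here for illustration): each transmitted bit is flipped with probability p, and the capacity is C = 1 - H(p) bits per channel use, where H is the binary entropy.

```python
import math

# Capacity of a binary symmetric channel with crossover probability p:
# C = 1 - H(p), with H the binary entropy in bits.  This is the
# canonical instance of Shannon's capacity-maximization problem.
def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0: a noiseless channel carries one full bit
print(bsc_capacity(0.5))    # 0.0: pure noise carries no information
```

The same entropy functional H is what the text identifies in physical systems and in financial speculation; quantum information theory generalizes it to the von Neumann entropy.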
We have great knowledge of self-organization at the levels
of physical, chemical, biological and social phenomena.
Global financial flows, with digital technology as a tool of financial systems, require multidisciplinarity, a network of neuroscience, biology, physics and engineering, for understanding the complex activities of the financial system.
REFERENCES
Alvarez-Ramirez, J.; Rodriguez, E.; Alvarez, J. A
multiscale entropy approach for market efficiency.
International Review of Financial Analysis, v. 21, p. 64
– 69, 2012.
Benguigui, L. The different paths to entropy. European
Journal of Physics, v. 34, n. 2, p. 303, mar. 2013.
Boltzmann, L. Weitere studien über das
wärmegleichgewicht unter gasmolekülen. In:
Kinetische Theorie II. [S.l.]: Vieweg Teubner Verlag,
1970, (WTB Wissenschaftliche Taschenbücher, v. 67).
p. 115–225.
Bandt, C.; Pompe, B. Permutation entropy: A natural
complexity measure for time series. Phys. Rev. Lett.,
American Physical Society, v. 88, p. 174102, Apr 2002.
Buchen, P.W.; Kelly, M. The maximum entropy distribution
of an asset inferred from option prices. J. Financ.
Quant. Anal. 1996, 31, 143–159.
Borwein, J.; Choksi, R.; Maréchal, P. Probability
distributions of assets inferred from option prices via the
principle of maximum entropy. J. Soc. Ind. Appl. Math.
2003, 14, 464–478
Boon, J. P.; Tsallis, C. Special issue overview
nonextensive statistical mechanics: new trends, new
perspectives. Europhysics News, v. 36, n. 6, p. 185–
186, 2005
Baltzer, H. et al. Multi-scale entropy analysis as a method
for time-series analysis of climate data. Climate, v. 3, n.
1, p. 227, 2015.
Clausius, R.; Hirst, T. The Mechanical Theory of Heat:
With Its Applications to the Steam-engine and to the
Physical Properties of Bodies. [S.l.]: J. Van Voorst,
1867.
Costa, M.; Goldberger, A. L.; Peng, C.-K. Multiscale
entropy analysis of complex physiologic time series.
Phys. Rev. Lett., American Physical Society, v. 89, Jul
2002.
Christoph Adami and Nicolas J. Cerf. von Neumann
capacity of noisy quantum channels. Physical Review
A, 56(5):3470–3483, November 1997. doi:
10.1103/PhysRevA.56.3470. arXiv:quant-ph/9609024.
Charles H. Bennett. Quantum cryptography using any two
nonorthogonal states. Physical Review Letters,
68(21):3121–3124, May 1992. doi:
10.1103/PhysRevLett.68.3121
Charles H. Bennett. Quantum information and
computation. Physics Today, 48(10):24–30, October
1995
Charles H. Bennett. A resource-based view of quantum
information. Quantum Information and Computation,
4:460–466, December 2004. ISSN 1533-7146.
Carhart-Harris, R., Leech, R., Hellyer, P., Shanahan, M.,
Feilding, S., Tagliazucchi, E., Chialvo D., and Nutt, D.
(2014). The entropic brain: a theory of conscious states
informed by neuroimaging research with psychedelic
drugs. Frontiers in Human Neuroscience.
Coronnello, C., Tumminello, M., Lillo, F., Micciche, S. and
Mantegna, R. N. (2005). Sector Identification in a Set of
Stock Return Time Series Traded at the London Stock
Exchange. Acta Physica Polonica B, 36, 2653–2679.
Coronnello, C., Tumminello, M., Lillo, F., Micciche, S. and
Mantegna, R. N. (2007). Economic Sector Identification
in a Set of Stocks Traded at the New York Stock
Exchange: A Comparative Analysis. Noise and
Stochastics in Complex Systems and Finance, 6601,
U198–U209.
Darrigol, O. The Origins of the Entropy Concept. In:
Dalibard, J.; Duplantier, B.; Rivasseau, V. (Ed.).
Poincaré Seminar 2003, Progress in Mathematical
Physics, Volume 38. Birkhäuser Verlag. [S.l.: s.n.],
2004. p. 101.
Gulko, L. The entropy theory of stock option pricing. Int. J.
Theoretical Appl. Finance 1999, 2, 331–355
Guo, W.Y. Maximum entropy in option pricing: a convex-
spline smoothing method. J. Futures Markets 2001, 21,
819–832.
Guzman-Vargas, L.; Ramirez-Rojas, A.; Angulio-Brown, F.
Multiscale entropy analysis of electroseismic time
series. Natural Hazards and Earth System Sciences, v.
8, n. 4, p. 855–860, 2008.
Garry Bowen. Quantum feedback channels. IEEE
Transactions on Information Theory, 50(10):2429–
2434, October 2004. arXiv:quant-ph/0209076.
Garry Bowen and Rajagopal Nagarajan. On feedback and
the classical capacity of a noisy quantum channel. IEEE
Transactions on Information Theory, 51(1):320–324,
January 2005. arXiv:quant-ph/0305176.
Garas, A. and Argyrakis, P. (2007). Correlation Study of
the Athens Stock Exchange. Physica A: Statistical
Mechanics and its Applications, 380, 399–410.
Gibbs, J. Elementary principles in statistical mechanics:
developed with especial reference to the rational
foundation of thermodynamics. [S.l.]: Yale University
Press, 1914. (Yale bicentennial publications).
Gopikrishnan, P., Plerou, V., Liu, Y., Amaral, L. A. N.,
Gabaix, X. and Stanley, H. E. (2000). Scaling and
Correlation in Financial Time Series. Physica A:
Statistical Mechanics and its Applications, 287, 362–
373.
Hawkins, R.J. Maximum entropy and derivative securities.
Adv. Econometrics 1997, 12, 277–300
Hartley, R. V. L. Transmission of information. Bell System
Technical Journal, Blackwell Publishing Ltd, v. 7, n. 3,
p. 535–563, 1928.
Igor Devetak and Andreas Winter. Classical data
compression with quantum side information. Physical
Review A, 68(4):042301, October 2003. doi:
10.1103/PhysRevA.68.042301. arXiv:quant-
ph/0209029.
Jung, W.-S., Chae, S., Yang, J.-S. and Moon, H.-T. (2006).
Characteristics of the Korean Stock Market
Correlations. Physica A: Statistical Mechanics and its
Applications, 361, 263–271
Kodba, S., Perc, M. and Marhl, M. (2005). Detecting Chaos
from a Time Series. European Journal of Physics, 26,
205–215.
Kolmogorov, A. On tables of random numbers. Theoretical
Computer Science, v. 207, n. 2, p. 387 – 395, 1998.
Krishnan, H.; Nelken, L. Estimating implied correlations for
currency basket options using the maximum entropy
method. Derivatives Use Trading Regul. 2001, 7, 1–7.
Laloux, L., Cizeau, P., Bouchaud, J. P. and Potters, M.
(1999). Noise Dressing of Financial Correlation
Matrices. Physical Review Letters, 83, 1467–1470.
Li, Z.; Zhang, Y.-K. Multi-scale entropy analysis of
Mississippi river flow. Stochastic Environmental
Research and Risk Assessment, v. 22, n. 4, p. 507–
512, 2007.
Liu, L.-Z.; Qian, X.-Y.; Lu, H.-Y. Cross-sample entropy of
foreign exchange time series. Physica A: Statistical
Mechanics and its Applications, v. 389, n. 21, p. 4785 –
4792, 2010. ISSN 0378-4371.
Mantegna, R. N. (1999). Hierarchical Structure in Financial
Markets. European Physical Journal B, 11, 193–197.
Mantegna, R. N. and Stanley, H. E. (2000). An Introduction
to Econophysics: Correlation and Complexity in
Finance. Cambridge, Cambridge University Press
Matsuba, I.; Takahashi, H. Generalized entropy approach
to stable Lévy distributions with financial application.
Physica A: Statistical Mechanics and its Applications, v.
319, p. 458 – 468, 2003.
Molgedey, L.; Ebeling, W. Local order, entropy and
predictability of financial time series. The European
Physical Journal B-Condensed Matter and Complex
Systems, v. 15, n. 4, p. 733–737, 2000.
Mensi, W. et al. Crude oil market efficiency: An empirical
investigation via the shannon entropy. International
Economics, v. 129, p. 119 – 137, 2012.
Namaki, A. et al. Comparing emerging and mature
markets during times of crises: A non-extensive
statistical approach. Physica A: Statistical Mechanics
and its Applications, v. 392, n. 14, p. 3039 – 3044, 2013.
Neumann, J. V. Mathematische Grundlagen Der
Quantenmechanik. [S.l.]: Dover, 1932. (Grundlehren
der mathematischen Wissenschaften).
Nicolas J. Cerf and Christoph Adami. Negative entropy
and information in quantum mechanics. Physical
Review Letters, 79(26):5194–5197, December 1997.
arXiv:quant-ph/9512022.
Niu, H.; Wang, J. Quantifying complexity of financial short-
term time series by composite multiscale entropy
measure. Communications in Nonlinear Science and
Numerical Simulation, v. 22, n. 1–3, p. 375 – 382, 2015.
Noh, J. D. (2000). Model for Correlations in Stock Markets.
Physical Review E, 61, 5981–5982.
Nyquist, H. Certain factors affecting telegraph speed. Bell
System Technical Journal, Blackwell Publishing Ltd, v.
3, n. 2, p. 324–346, 1924.
Nyquist, H. Certain topics in telegraph transmission theory.
Transactions of the American Institute of Electrical
Engineers, v. 47, n. 2, p. 617–644, April 1928.
Pafka, S. and Kondor, I. (2004). Estimated Correlation
Matrices and Portfolio Optimization. Physica A:
Statistical Mechanics and Its Applications, 343, 623–
634.
Pincus, S. M. Approximate entropy as a measure of
system complexity. Proceedings of the National
Academy of Sciences, v. 88, n. 6, p. 2297–2301, 1991.