Asset Pricing: Theories of Financial Markets in the Context of a Crisis
By
Bradley Russell
Professor Gray, Thesis Advisor
Senior Thesis
Department of Economics
Willamette University
4/29/15
Introduction:
Financial markets play a crucial role in the modern global economy, perhaps best shown
by the 2007-08 financial crisis and the succeeding years of global macroeconomic downturn.
When asset prices across a variety of sectors fell and confounded the expectations of financial
experts, the repercussions were felt across the board as investors saw the value of their holdings
plummet.
While the causes of the financial crisis are not the purpose of this paper, one element
bears mentioning: the underlying assumption of the models used throughout the financial world,
the Efficient Markets Hypothesis (EMH). In a similar vein to much of neoclassical theory, the
EMH states that markets are efficient, in this case in an informational sense, and correctly price
financial assets. A large number of valuation models are based, even if implicitly, on the EMH.
Considering the market's failure to accurately price assets in the lead-up to the 2007-08 financial
crisis, the underlying theory deserves some critical attention.
The Random Walk Hypothesis (RWH) is a corollary of the EMH, shown to be
theoretically true by Paul Samuelson (1965) and empirically true by Eugene Fama (1965b) at
roughly the same time, although it was first hypothesized at the beginning of the 20th century by
Louis Bachelier (1900). At its core the RWH states that as a market becomes more
informationally efficient, i.e., as prices reflect more available information, those prices move in a
more random fashion, eventually coming to what is called Brownian motion¹.
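For readers who want to see what such a process looks like, a Gaussian random walk (the discrete counterpart of the Brownian motion described above) can be simulated in a few lines. The function name, step count, and seeds below are illustrative choices, not anything from the sources cited:

```python
import numpy as np

def brownian_path(n_steps, seed=0):
    """Discrete approximation of Brownian motion: the cumulative sum
    of i.i.d. standard normal increments."""
    rng = np.random.default_rng(seed)
    return np.cumsum(rng.standard_normal(n_steps))

# Four sample paths, mirroring the four reference graphs in footnote 1.
paths = [brownian_path(1000, seed=s) for s in range(4)]
```

Each path wanders with no drift; the increments, not the levels, are what the RWH claims are unpredictable.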
By labeling the RWH as Brownian motion, its advocates implicitly associate the work
being done with that of the "real" or physical sciences. From an
¹ Brownian Motion is named for the work of botanist Robert Brown, who documented the apparently random
movement of pollen grains in water in the early 19th century. I have included four graphs of Brownian Motion based
on a standard normal distribution for reference.
outsider’s perspective, this gives apparent weight to the arguments behind market efficiency and
makes the conclusions being reached seem stronger than those of other social sciences such as
sociology or anthropology.
The idea that financial markets are efficient hinges on market participants having
rational expectations such that in the aggregate the market always prices an asset appropriately,
given available information. Because rational expectations may not be a realistic assumption
about market participants, this raises questions about the robustness of the EMH.
In sharp contrast to the EMH stands Kindleberger’s Manias, Panics, and Crashes (1978),
detailing a variety of financial crises that persist over time, in some cases far beyond what would
be expected of an efficient market. Minsky (1992) follows in this vein by proposing that debt
accumulation and the easing of credit leads to these financial crashes and crises. While the
explanation of these crashes is not the specific purpose of this paper, a brief overview of them
will help frame the argument being made. The argument made by Stiglitz (2009) seems
particularly relevant here: because mortgage originators were not the ultimate owners of the
mortgages, they had insufficient incentive to properly evaluate mortgage applications, and thus
there was a lack of information available to mortgage purchasers. This meant that the
collateralized mortgage obligations (CMOs) and collateralized debt obligations (CDOs) into
which individual mortgages of various qualities were packaged were not necessarily priced
correctly to compensate for the actual risk associated with the CMO or CDO in question.
Other explanations include the Federal Reserve failing to correctly set interest rates (Taylor
2009), the increased financialization of the economy as a means to prevent economic stagnation
(Foster and Magdoff 2008), increasing inequality (Stockhammer 2012), and many others. While
Stiglitz’s explanation may not be complete by any means, it does show how there may not have
been as much information available to investors as they would have liked.
I will first consider the EMH as initially proposed by Fama (1970) and the subsequent
work in the literature and examine the challenges presented by econometric tests before
proceeding on to alternative theories explaining how financial markets function. Additionally I
will conduct an econometric test of the RWH corollary of the EMH, and although this will not
reject or affirm the EMH as a whole as I will show later, it at least serves as empirical evidence
in the argument. I have selected twenty stocks from the NASDAQ stock exchange, specifically
those that are traded in the highest volume. This selection was made because high trading
volume ideally serves as a proxy for available information and active market participation,
hallmarks of an efficient market. While this is by no means sufficient proof that these stocks are part of an
efficient market, or even an efficient subset of a market, it must suffice for my purposes.
Econometric Tests and the Joint Hypothesis Problem:
The first test provides evidence as to whether prices move in random directions from the
previous price. More technically, for any three consecutive prices, call them p1, p2, p3, if
p2 − p1 and p3 − p2 have the same sign (positive or negative) then that is defined as a
continuation, and if they have differing signs then it is termed a reversal. If there is a random
distribution of continuations and reversals then there is insufficient evidence to show that the
price data in question does not follow a random walk. This follows the work of Cowles and
Jones (1937) in testing stock market prices from the beginning of the century. Cowles and Jones
(1937) rejected the hypothesis that
their prices followed the RWH over the holding periods sampled. It is important to note that,
since their work was published, it has become well documented that individual stocks tend to
follow the RWH while indices fail to do so². Again though, this could be due to an incorrect
price-setting mechanism rather than the market being inefficient as a whole.
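The continuation/reversal bookkeeping described above is straightforward to sketch in code. This is a minimal illustration of the definitions, not the exact procedure Cowles and Jones used:

```python
import numpy as np

def continuations_and_reversals(prices):
    """Count continuations and reversals in a price series, following the
    Cowles and Jones (1937) definitions: a continuation occurs when two
    consecutive price changes share a sign, a reversal when they differ."""
    changes = np.sign(np.diff(prices))
    pairs = changes[:-1] * changes[1:]   # > 0: same sign, < 0: opposite signs
    return int(np.sum(pairs > 0)), int(np.sum(pairs < 0))

c, r = continuations_and_reversals([1, 2, 3, 2, 4, 5])
# changes +, +, -, +, + give two continuations and two reversals
```

Triples containing a zero price change are classified as neither, which keeps the later binomial reasoning clean.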
The joint hypothesis problem states that in a logical framework, for some conditions A
and B, if A and B hold, then P is true. The contrapositive of this statement is logically equivalent,
which is to say if not P then not (A and B), which by De Morgan's laws³ is equivalent to saying if
not P then not A or not B. Thus for our purposes, if we could show that stocks fail to follow a
random walk, then we would be able to say that either prices were specified incorrectly or that
they were set in an inefficient manner, or both. However it is impossible to determine which of
these cases it may be. It may seem that this renders econometric testing of the RWH somewhat
useless, as there is always a logical defense available to its advocates; however, failing to satisfy
the RWH may serve as additional evidence rather than being sufficient in and of itself.
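Because the argument rests on a small piece of propositional logic, it can be checked mechanically. The sketch below models P as exactly A ∧ B purely for the check, which is a simplification (the EMH itself supplies only an "if", not an "if and only if"):

```python
from itertools import product

# De Morgan's law over every truth assignment of A and B.
demorgan_holds = all(
    (not (a and b)) == ((not a) or (not b))
    for a, b in product([True, False], repeat=2)
)

# The contrapositive step: whenever P (= A and B) is false,
# at least one of A, B must be false.
contrapositive_holds = all(
    (not a) or (not b)
    for a, b in product([True, False], repeat=2)
    if not (a and b)
)
```

Both checks pass on all four truth assignments, which is all the joint hypothesis argument needs.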
Conversely, however, showing that the RWH holds is not necessarily sufficient from a
logical perspective to accept market efficiency as the cause of the random walk. The EMH is not
an if-and-only-if statement: while the statement if (A and B) then P is true, that does not
imply that if P then (A and B). Thus showing that a stock or a set of stocks satisfies a random
walk, or more technically that the test fails to reject the null hypothesis at the given significance
level, is not sufficient to show that the market from which that set comes is efficient.
These ideas lend themselves to the conclusion that there is no feasible or logically
consistent way to test the EMH in its entirety, or at least by testing the RWH conclusion of it,
which is somewhat the point. However, just because a test does not have the logical power
to reject or invalidate an argument does not mean it is not worthwhile; it just means that the
² Lo and MacKinlay (1988) document this phenomenon quite well.
³ De Morgan's laws state that for statements A and B, ¬(A ∧ B) ⇔ (¬A) ∨ (¬B) and ¬(A ∨ B) ⇔ (¬A) ∧ (¬B).
initial assumptions being used could be different from those used in the EMH; the robustness
of the theory, however, is a subject for later in the paper.
The Efficient Market Hypothesis:
The EMH has three specific forms, the strong, semi-strong, and weak. From Fama
(1970), the three forms are as follows. The strong form states that asset prices incorporate all
information, even private information, instantaneously, and thus markets are always at
equilibrium with no time lag. The semi-strong case weakens the assumptions to not include
private information, however any publicly available information such as company
announcements or performance reports is instantly consumed by market participants and factored
into their decision making. Finally, the weak form assumes that the data set that is incorporated
into decision making is past prices only.
The strong form of the EMH represents an almost untestable and self-fulfilling prophecy
because there is no market available that possesses the properties that it requires. One particular
barrier is the presence of insider trading laws that slow and/or prevent the spread of private
information to market participants. The semi-strong form is somewhat testable, and perhaps is
the most strongly disputed means of modeling a market, as it requires that all public information
is included in people's expectations. As will be shown later, there are a few well-documented
counterexamples to this being a constant description of financial markets. The weak form is by
far the least disputable, as the data set that is being assumed to be incorporated into expectations
is just previous price data.
Samuelson (1965) provides the theoretical proof that if prices are set in an
informationally efficient manner then they fluctuate randomly. However a core assumption of
this is that market participants act as though they are risk neutral. Further work on the EMH from
Lucas (1978) shows that even with risk-averse market participants prices reflect available
information and when weighted with the aggregated marginal utility of those market participants,
move in Brownian Motion. Effectively this means that when the desires of those involved are
taken into account, and they can vary, then prices fluctuate according to some probability
distribution. Samuelson makes no claims, however, regarding the probability distribution that
describes the Brownian Motion; thus it could be a Gaussian distribution or, as Mandelbrot (1963)
described, a more general probability distribution called a Lévy distribution. This contrasts with
Bachelier's (1900) work, which states that prices follow a Gaussian distribution. The mathematical
framework that Samuelson uses provides a further sense of truth to the RWH and the EMH in the
same way that labeling the hypothesized random walk of stock prices as Brownian Motion
makes it appear that the work being done is the equivalent of physics. While logical deductions
and proofs are a series of true statements, the statements themselves are entirely dependent on
the initial assumptions being used, and when those assumptions are changed, the conclusion that
was reached may no longer be logically supported.
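The difference between Bachelier's Gaussian assumption and Mandelbrot's heavier-tailed Lévy distributions can be made concrete with a small simulation. The sketch below uses the standard Cauchy distribution, which is a special case of the Lévy alpha-stable family (alpha = 1), as the heavy-tailed stand-in; the sample size and threshold are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
gaussian_moves = rng.standard_normal(n)
cauchy_moves = rng.standard_cauchy(n)   # a Lévy alpha-stable law (alpha = 1)

# Fraction of moves larger than five standard-normal deviations:
# essentially never under the Gaussian, routine under the heavy-tailed law.
gaussian_tail = np.mean(np.abs(gaussian_moves) > 5)
cauchy_tail = np.mean(np.abs(cauchy_moves) > 5)
```

The practical stake is large: a Gaussian model treats five-sigma price moves as near-impossible, while a Lévy-type model expects them regularly.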
Few proponents of the EMH argue that the market is always efficient and in
equilibrium (Lo 2007); rather, there are forces that exploit inefficiencies in the market
such that it is brought back to an efficient state. Black (1986) provides an excellent
example of this with what he terms "noise traders": individuals who believe they are using
accurate information as the basis for their actions but instead are merely using what is effectively
noise that has little actual predictive power. When the market is not efficient, it is because
individuals have been acting on incorrect information, causing deviations from correct
pricing; these misinformed traders, however, represent opportunities for a clever investor to take
advantage of the noise traders' biases.
One source of empirical tests of the EMH is, as Lo (2007) points out, that in a world
without any uncertainty the price of a share of stock should equal the present value of its
future dividends. With the introduction of uncertainty into a model, the difference between the
present value of dividends and the market price must represent errors in forecasting. Shiller
(1981) notes that the sum of the variances of the forecasting error and the market price must
equal the variance of the present value of future dividends. He finds that this so-called "variance
bound" is violated for his sampling periods, implying that there is potentially too much
volatility in the market for it to be considered efficient. While this would be a refutation of the
EMH, further work found significant issues with it that dramatically reduce the power of
Shiller's conclusion. In particular, Michener (1982) shows that if one alters the assumption of
risk neutrality among market participants to include risk-averse behavior, then theoretically this
variance bound property is violated even though the market is efficient.
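Shiller's variance bound can be illustrated with a toy construction (this is not Shiller's dataset or procedure): assume i.i.d. dividends and a constant discount rate, so that the rational-expectations price is a constant, and compare its variance to that of the ex-post perfect-foresight price. All parameters below are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
beta, mu = 0.95, 1.0            # discount factor and mean dividend (assumed)
T, horizon = 500, 200

dividends = mu + 0.1 * rng.standard_normal(T + horizon)
discounts = beta ** np.arange(1, horizon + 1)

# Ex-post "perfect foresight" price: discounted sum of realized dividends.
p_star = np.array([discounts @ dividends[t + 1 : t + 1 + horizon]
                   for t in range(T)])

# With i.i.d. dividends the rational forecast of every future dividend is mu,
# so the efficient-market price is the constant beta * mu / (1 - beta).
p_rational = np.full(T, beta * mu / (1 - beta))

# Shiller's bound: the rational price varies no more than the ex-post price.
bound_holds = p_rational.var() <= p_star.var()
```

Because the forecast error is uncorrelated with the forecast, Var(ex-post price) = Var(price) + Var(error) ≥ Var(price); Shiller's finding was that market data appear to violate this inequality.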
Persistent, repetitive anomalies in financial markets provide a potential method of
rejecting the EMH. In particular, momentum strategies as documented by Jegadeesh and Titman
(2001) are a clear departure from the EMH as in a truly efficient market with assets following
random price walks there cannot exist a strategy that delivers returns in excess of the market,
adjusted for risk. These mid-term momentum strategies seem to be consistently exploitable by
traders, and there does not seem to be a method of reconciling the EMH with this apparent
contradiction. However, their persistence may be due to the fact that these strategies carry an
inherent risk above the market, and that the returns are in fact fair compensation for the risk
being assumed.
While empirical challenges to the EMH seem to be fraught with difficulties and fail to
lead to any firm rejection of the EMH, Grossman and Stiglitz (1980) argue that a perfectly
informationally efficient market is a theoretical impossibility. Their argument centers on the lack
of profit opportunities for market participants if prices already contain all available information.
If an asset is correctly priced and reflects the preferences of the market as a whole then no
individual trader should seek to purchase an asset as they have nothing to gain from its purchase.
Rather, Grossman and Stiglitz propose that market inefficiencies provide traders with an
incentive to obtain superior information to that of their competitors, and thus as a market becomes
more and more efficient, fewer profit opportunities exist. This is a provocative and compelling
argument that provides a source of refutation of the EMH.
The Anatomy of a Financial Crisis:
Minsky (1992) provides the framework that Kindleberger uses to describe financial crises
and bubbles. Initially Minsky suggests that the crisis starts with some form of displacement or
shock to an economic system which in turn generates profitable opportunities for market
participants. To take advantage of these, firms and individuals tend to leverage their assets to
maximize potential profitability. In the case of the 1990s stock bubble, Kindleberger argues that
it was the increased use of information technology and the ease of communications that provided
new opportunities. In the housing bubble, he contends that the advent of CDOs and CMOs
provided this initial displacement. Following this initial systemic shock, Minsky states that the
next step in a mania is that a sense of euphoria may set in. While mindful of previous crises,
regulators and participants convince themselves of the uniqueness of the current situation and the
lack of relation between the current situation and past manias. During this period there is an
“overtrading” of assets that helps increase the price of the asset around which the mania is
forming. This decreases the leveraging of loan holders because their loan has stayed constant
while the asset has appreciated. As non-asset holders see asset holders getting rich, they take out
loans to try and see similar results, driving prices even higher.
Eventually prices distort sufficiently from the asset’s intrinsic value that asset holders
start to sense that they will not go higher. Because asset holders have access to superior
information about the fundamental price drivers of their asset due to actually possessing that
asset, they will seek to sell off prior to the price falling. This is what Minsky and Kindleberger
describe as financial distress. Holders start to sell and soon the price drops, and does so quickly.
Some who can’t sell off their assets soon enough go bankrupt when the price falls too low to be
able to deleverage their debt successfully. Ultimately this leads to a “revulsion” where all holders
are seeking to sell, thus lowering the price even more quickly.
This framework is nestled within Minsky’s (1992) organization of finance that delineates
finance into hedge finance, speculative finance, and Ponzi finance. Minsky defines hedge finance
as when lenders have sufficient funds to cover all of their potential losses and then some.
Speculative finance refers to lenders having sufficient funds to cover all potential losses exactly,
and Ponzi⁴ finance refers to lenders not having sufficient funds to cover losses. Minsky's
partition of financial states links to his model of crises, as lenders transition down the states,
loosening lending standards in their efforts to share in the profits being made.
⁴ Ponzi finance, or a Ponzi scheme, refers to the scheme run by Charles Ponzi in Boston in the early 20th century,
where he used new investors' funds to pay the interest due to old investors. While he was not the first to use this
scheme, it has become uniquely associated with his name.
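Minsky's three-way partition can be sketched as a simple classification rule. The cash-flow thresholds below are a stylized reading for illustration, not Minsky's own formalization:

```python
def minsky_regime(cash_flow, interest_due, principal_due):
    """Classify a borrower's financing posture in Minsky's taxonomy
    (a stylized cash-flow reading; the thresholds are illustrative)."""
    if cash_flow >= interest_due + principal_due:
        return "hedge"        # income covers both interest and principal
    if cash_flow >= interest_due:
        return "speculative"  # income covers interest only; principal is rolled over
    return "ponzi"            # income cannot even cover interest; new debt is required
```

For example, a borrower earning 100 against 30 interest and 50 principal due is classified as "hedge", while one earning only 20 is "ponzi"; a mania moves borrowers down this ladder.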
Kindleberger and Aliber (2011) describe the ten largest crises in history, including the
bubble in over-the-counter stocks in the United States in the late 1990s as well as the more
recent housing bubble that started in 2002 and helped precipitate the 2007-2009 financial crisis.
In each case the authors show that the crisis fits well into Minsky’s framework. The initial
displacement in the housing bubble, as has already been stated, was provided by the financial
innovation to create collateralized debt obligations and collateralized mortgage obligations that
allowed assets of varying qualities to be packaged together in a way that concealed poor quality
ones. As the value of houses started to rise, fueled by a variety of causes, the financial industry
responded by gradually relaxing credit standards as evidenced by the proliferation of sub-prime
lending and adjustable rate mortgages in the mid 2000’s. This nicely mirrors the transition
Minsky describes from hedge finance through speculative finance to Ponzi finance. The
accompanying euphoria can be seen in the number of housing starts⁵ rapidly increasing in this
same period as participants tried to share in the profits being made.
The accompanying crash came about after sub-prime lending increased to the point that
the loan recipients were unable to make mortgage payments as their rates increased with the
adjustable rates they had agreed to. While it seems irrational to take out a mortgage that is
beyond the feasible means of the homebuyer to pay off, there is evidence⁶ that these mortgage
recipients were either financially illiterate or were duped by the mortgage originators. As more
and more mortgage recipients failed to pay off their mortgages, the effects were felt throughout
the financial system with many individuals failing to recover their investments just as Minsky
describes.
⁵ The number of housing starts increased annually from 2002-2005 according to the US Census, before sharply
declining starting in 2006.
⁶ The Washington Post provides anecdotal evidence (St. George, 2009) of the lack of financial acumen that sub-prime
borrowers often had. They tended to be first-time home buyers, and it was not necessarily in the interest of
lenders to provide a more comprehensive understanding of the contract to which they were agreeing.
On the whole the EMH fails to describe the crises that Kindleberger documents. The usual
defense is that the market will eventually correct itself from any inefficient state through the
actions of traders who take advantage of inefficiencies. However, when some crises persist for
periods of multiple years, as Kindleberger shows, it becomes harder to believe that the market
will price assets correctly.
Behavioral Finance and Robustness of the Efficient Market Hypothesis:
The theoretical rejection and empirical rejection of the EMH provided by Grossman and
Stiglitz (1980) and Kindleberger and Aliber (2011), however, do not address its robustness,
which should be confronted through a variety of lenses, in particular non-rational decision
making in the work of Kahneman and Tversky (1974, 1979) and Simon (1955). In particular,
Simon's work provides a nice stepping stone from the hyper-rational homo economicus to more
of a behavioral approach, and when combined with the herding behavior shown by Keynes (1936)
and Huberman and Regev (2001), has the potential to lead to a Minskyian-style financial crash.
Rather than assuming that individuals operate in a utility-maximizing fashion, Simon
(1955) posits that economic agents use back-of-the-envelope calculations and heuristics to
approximate utility-maximizing solutions and decisions, which he terms bounded rationality.
In effect, traders do not necessarily have a complete preference ordering of their choices at
all times and thus are forced to sort their preferences and look for satisfactory solutions rather
than utility-maximizing ones. In particular, with sequential choices on an unclear horizon,
individuals are forced to accept or reject possibilities without a clear alternative in mind in the
way that standard rational choice theory would expect. Thus, for traders who operate
intertemporally there is not necessarily a clear method of evaluating their options at a given time t.
Simon notes that at the time of publishing, the distance between psychology and economics was
sufficient that the actual heuristics used would be impossible to determine.
If market participants are assumed to act in this satisficing manner rather than in a
perfectly rational manner, then it becomes much harder to show that the market will be efficient.
When individuals are approximating optimal solutions rather than finding them, they are not
necessarily maximizing their utility (although there is nothing to say that their approximations
are not in fact optimal); there is then some choice available, although perhaps unclear
to the actor, which would increase their utility. While this seems like a contradiction of the
EMH, it is not. Because actors are potentially, and perhaps probably, mispricing
assets with their actions, there must be a superior action that a rational agent could achieve. A
counterargument is that while individuals may fail to be rational, on the aggregate
they are: for larger populations the law of large numbers takes hold, which allows the use of a
normal distribution to approximate the set in question. For our purposes the expectations could
thus be assumed to be normally distributed about some mean μ with some standard deviation σ.
This critique is addressed through herding concepts later.
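Formally, this aggregation argument is the central limit theorem: even if each individual's expectation is drawn from a decidedly non-normal distribution, the market-wide average expectation across many traders is approximately normal and concentrated near the true mean. A quick simulation, with all numbers illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Each trader's expectation is a uniform draw on [-1, 1] (non-normal),
# yet the average across 1,000 traders clusters tightly around 0.
def average_expectation(n_traders):
    return rng.uniform(-1.0, 1.0, n_traders).mean()

market_means = np.array([average_expectation(1000) for _ in range(2000)])
```

The spread of `market_means` shrinks with the square root of the number of traders; the herding critique targets the independence assumption behind this averaging, not the arithmetic.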
Tversky and Kahneman (1974) provide a more structured approach to Simon’s utility
approximation method. In particular, their empirical work finds that individuals base their
predictions on information that may, in fact, have little to no bearing on the actual outcome.
They find that oftentimes individuals imagine a data set of their own creation to base their
judgments on. For my purposes, this provides a more concrete, yet less rational, starting point
than Simon (1955) as to how individuals go about making decisions. This is somewhat
reminiscent of Black’s (1986) noise traders in that if agents are using non-pertinent information
as a basis for their actions then they are, by definition, trading on noise.
To provide a more structured approach to risk taking, I examine prospect theory as an
alternative to expected utility theory. Kahneman and Tversky (1979) show that individuals are
significantly risk averse; in fact, the potential for losses drives down an individual's valuation
of a risk at a faster rate than gains raise it. A risk-neutral trader in the classical version of the EMH
might accept a bet that has a positive expected outcome according to expected utility theory even
if that outcome involves the potential for a large loss; in prospect theory the value of that same bet
would be significantly lower.
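This asymmetry can be sketched with the Kahneman-Tversky value function. The curvature and loss-aversion parameters below are the commonly cited estimates from their later (1992) work, used purely for illustration, and the probability-weighting component of prospect theory is omitted:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave over gains, convex and
    steeper (by the loss-aversion factor lam) over losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A 50/50 bet: win 110 or lose 100.
expected_value = 0.5 * 110 + 0.5 * (-100)                          # +5
prospect = 0.5 * prospect_value(110) + 0.5 * prospect_value(-100)
# expected_value > 0, yet prospect < 0: the loss looms larger than the gain.
```

The bet has a positive expected value, so a risk-neutral trader accepts it, yet its prospect-theory value is negative because the possible loss is weighted more heavily than the comparable gain.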
Huberman and Regev (2001) provide an example of how behavior spreads among market
participants by documenting how a non-event led to a marked rise in a company's share price⁷.
In general this seems a specific case of Keynes’ (1936) “animal spirits” that lead to herding
behavior among market participants. This specific case can easily be generalized: investors
fail to incorporate available information into their actions and can follow the actions of others
even when there may not be sufficient reason to do so. This does not
refute the assumption of a normal distribution approximating the expectations of the population,
but instead argues that the given mean μ may be significantly far from what analysts might argue
is suggested by the fundamental predictors of asset price.
As an example, let us suppose that we find a trader who has been consistently successful
and outperforming the market repeatedly. The EMH argues that this trader either does not exist
or that their existence is merely a highly improbable event⁸. However, if we assume that the
⁷ A piece of news published by the New York Times saw a large increase in the share price of EntreMed, even
though the same news had already been published in Science more than a year earlier.
⁸ Consistently outperforming the rest of the market implies that they are either a) trading on superior information
relative to everyone else, or b) just amazingly lucky.
prices are not being set rationally due to market participants’ biases, then the opportunity exists
for individuals to gain profits from acquiring superior information. It is not, therefore, a stretch
of the imagination to suppose that this trader has put in effort to obtain better information than
others. When other traders see this happening the opportunity arises for them to mimic this
exceptional trader’s behavior. However this comes with at least some form of time delay, and
thus it is an imperfect imitation.
Arthur et al. (1997) provide an appropriate model in which both EMH-style rational
expectations models and alternative expectations models may coexist and flourish.
Based on the behavior of traders within a given system and their proclivity to examine
alternative expectation models, they find that as the rate at which traders explore alternative
models goes up, the more the market departs from the rational expectations equilibrium and
instead comes to resemble a much more active one. This nicely conforms to Grossman and
Stiglitz’s (1980) argument about the case of a perfectly efficient market resulting in an entire
lack of trading. The case in which traders explore these alternative models at a high rate seems a
fitting description of stock market participants. The image of floor traders actively working and
filling orders is a fairly typical one that occurs throughout media depictions of stock exchanges.
Thus, as the authors themselves note, the more active traders are in trying to gain an edge
through alternative strategies, the more the model comes to resemble the stock market as
most are familiar with it.
The combination of non-rational decision making as described by Kahneman and
Tversky (1974 and 1979) and herding behavior shown by Huberman and Regev (2001) and
Keynes (1936) above provides a nice framework in which to set Kindleberger’s (1978) history of
financial crises. As traders get carried away in euphoria they perpetuate optimistic decision
making, in particular the accepting of increased debt levels, and as demand for assets grows,
their price eventually exceeds the underlying value. These traders are in effect the noise traders
Black (1986) describes, particularly those who are aping the traders at the forefront of the charge,
and they provide profit opportunities for those willing to engage in acquiring superior
information, as Grossman and Stiglitz (1980) describe. Those traders who are unwilling to
acquire that superior information will continue to generate their own beliefs based on the market
euphoria they are experiencing, but eventually they will sense that there are no more profits
to be had and will attempt to exit the market.
Again the individuals who lead this movement will be those who have put in the effort to
acquire better information, with potentially a sequential cascade down of individuals trying to
sell their assets with those who have acquired the best information selling first then proceeding
down by information quality. Eventually, however, there will come a point when the actual value
of the assets held will be greater than the price any individual trader is willing to pay, due to
biases based on the market trending downwards.
I find that the non-rational approach to individual decision making leads to a more
realistic model of financial markets, as it fits nicely within Kindleberger's empirical framework
and explains deviations from equilibrium states better than the EMH does. On
the whole however, there is something almost utopian about the EMH that has rather attractive
qualities to it, particularly with how well it meshes in with the rest of the rational expectations
based neoclassical literature. I might offer that financial markets have the potential to become
more and more efficient, although I am sure that Kindleberger might beg to disagree with a
multitude of examples at his side, through a somewhat Darwinian mechanism such as that
proposed by Arthur et al. (1997) that roots out Black’s noise traders so that the remaining players
are those willing to engage in the higher-effort act of obtaining superior information. While this
fails to present a stable equilibrium at any time, particularly as new entrants come in
looking for profitable opportunities, successful traders will be able to remain while those unable to
procure the necessary survival techniques will exit the market.
These explanations provide a theoretical rejection of the EMH, but examining empirical
evidence to determine whether the RWH holds during different time periods provides the
potential for evidence either in support of or against the EMH.
Econometric Analysis:
I begin by testing twenty stocks selected from the NASDAQ Stock Exchange, in
particular the twenty stocks that had the highest trading volume. These stocks are Microsoft
(MSFT), Apple (AAPL), Intel (INTC), Invesco PowerShares (QQQ), Sirius XM Radio (SIRI),
CISCO (CSCO), Amarin (AMRN), Zynga (ZNGA), Velocity Shares Daily 2x (TVIX), Velocity
Shares Daily Inverse (XIV), Micron Technology (MU), Facebook (FB), Frontier Technology
(FTR), Fox America (FOXA), Applied Materials (AMAT), Cypress Semiconductor (CY),
Qualcomm (QCOM), Comcast (CMCSA), Huntington Bancshares (HBAN), El Pollo Loco
(LOCO). While these represent a very small proportion of the total stocks traded on the
NASDAQ (there are roughly 3,100), the high trading volume serves as a proxy for information
availability, which is a prerequisite for market efficiency. I chose to select from the NASDAQ
rather than the New York Stock Exchange, Chicago Stock Exchange, or other US exchanges
because the NASDAQ handles a higher trading volume than any other exchange in the United
States; thus the stocks selected are a subset of what could be an efficient market.
My test follows the work of Cowles and Jones (1937) by testing each of the twenty
selected stocks for a random walk. If an arbitrary stock follows a random walk, then the
probabilities of the stock increasing in price and decreasing in price should be equal. An
analogous example is an n-times-repeated coin flip: if the coin is normal and has not been
weighted, then on the nth flip the probability of heads equals the probability of tails, regardless
of the outcomes of the previous n−1 flips. Thus at any time tᵢ the probability of a stock
increasing in value is .5 and the probability of it decreasing is also .5.
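The memorylessness that the coin-flip analogy relies on is easy to check directly in simulation; a minimal standard-library sketch (the sample size and seed are arbitrary choices, not part of the original study):

```python
import random

# Simulate a fair coin many times and check memorylessness: the share of
# heads that follow a heads should match the share of heads that follow
# a tails, with both close to .5.
random.seed(42)
flips = [random.random() < 0.5 for _ in range(100_000)]

after_heads = [b for a, b in zip(flips, flips[1:]) if a]
after_tails = [b for a, b in zip(flips, flips[1:]) if not a]

print(round(sum(after_heads) / len(after_heads), 2))  # close to 0.5
print(round(sum(after_tails) / len(after_tails), 2))  # close to 0.5
```

Both shares land near .5: conditioning on the previous flip gives no information about the next one, which is exactly the property a random walk imposes on price changes.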
Cowles and Jones (1937) define a continuation as follows: given any three consecutive
prices, call them pᵢ₋₁, pᵢ, pᵢ₊₁, if pᵢ − pᵢ₋₁ and pᵢ₊₁ − pᵢ have the same sign, then there is a
continuation in the price sequence from i−1 to i+1. If pᵢ − pᵢ₋₁ and pᵢ₊₁ − pᵢ have opposing
signs, then there is a reversal. Any three consecutive prices must produce either a continuation
or a reversal. If price movements are random according to some symmetric distribution, so that
positive and negative changes are equally probable, then continuations and reversals occur with
equal probability, as illustrated below.

                  pᵢ − pᵢ₋₁ > 0    pᵢ − pᵢ₋₁ < 0
pᵢ₊₁ − pᵢ > 0     Continuation     Reversal
pᵢ₊₁ − pᵢ < 0     Reversal         Continuation
Over n prices there is a significant possibility of some drift away from an even split of
continuations and reversals; the test determines whether the drift is large enough to provide
evidence that the joint hypothesis is false.
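The counting scheme above can be sketched in a few lines. This is a minimal illustration under simplifying assumptions (zero price changes are dropped, and a normal approximation to the binomial gives the p-value); the original study's exact procedure may differ:

```python
import random
from math import erf, sqrt

def count_continuations(prices):
    """Count continuations and reversals over consecutive price changes.

    A continuation occurs when two successive changes share a sign, a
    reversal when they differ. Zero changes are dropped for simplicity.
    """
    changes = [b - a for a, b in zip(prices, prices[1:]) if b != a]
    cont = sum(1 for x, y in zip(changes, changes[1:]) if x * y > 0)
    rev = sum(1 for x, y in zip(changes, changes[1:]) if x * y < 0)
    return cont, rev

def binomial_z(cont, rev):
    """Two-sided normal-approximation p-value for H0: P(continuation) = .5."""
    n = cont + rev
    z = (cont - n / 2) / sqrt(n / 4)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# A simulated random walk should (usually) fail to reject H0.
random.seed(1)
prices = [100.0]
for _ in range(2500):
    prices.append(prices[-1] + random.gauss(0, 1))
cont, rev = count_continuations(prices)
z, p = binomial_z(cont, rev)
print(cont, rev, round(p, 3))
```

Applying the same two functions to observed daily closing prices, rather than a simulated walk, yields the per-stock p-values used in the analysis below.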
I examined price data over both a ten-year period and a three-month period to determine
whether (a) random walks occur in both time horizons or in neither, or (b) random walks occur
in one and not the other. I hypothesize that there may be a difference between the two horizons
because markets may become more efficient as the observational period lengthens, so that in the
short run stocks may deviate from a random walk. There is also empirical evidence that
individual stocks usually follow random walks whereas indices do not. With more time I intend
to test a set of index-based assets for random walks as well.
In the ten-year period, I found that three stocks out of the sample of twenty drifted far
enough from an even distribution of continuations and reversals that, at an alpha level of .05, the
null hypothesis that the stock follows a random walk could be rejected. However, as more stocks
are added to the sample, the probability of making a type I error increases. Thus a Bonferroni
correction⁹ must be applied to control for this increased probability of a type I error. With the
new alpha level of .0025, the sample of twenty fails to reject the general null hypothesis that
stocks follow random walks.
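The correction itself is a one-line computation; a short sketch with hypothetical per-stock p-values chosen to mirror the pattern reported above (three naive rejections, none surviving the corrected threshold):

```python
def bonferroni_reject(p_values, alpha=0.05):
    """Reject hypothesis i only if its p-value falls below alpha / m,
    where m is the number of simultaneous tests (familywise control)."""
    m = len(p_values)
    return [p < alpha / m for p in p_values]

# Hypothetical per-stock p-values: three fall below .05 individually,
# but none survives the corrected threshold of .05 / 20 = .0025.
p_vals = [0.03, 0.41, 0.008, 0.77, 0.02] + [0.5] * 15
print(sum(p < 0.05 for p in p_vals))   # 3 naive rejections
print(sum(bonferroni_reject(p_vals)))  # 0 rejections after correction
```

The correction trades power for protection: it guarantees the familywise error rate stays at .05, at the cost of demanding much stronger evidence from any individual stock.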
In the three-month period I again found that three stocks out of the sample of twenty
showed sufficient drift from an even distribution of continuations and reversals that at an alpha
level of .05 the null hypothesis of a random walk could be rejected. With a Bonferroni correction
lowering the alpha level to .0025, the sample fails to reject the overall null hypothesis that
stocks in a three-month period follow random walks.
This follows in the vein of empirical work of the past few decades (Lo and MacKinlay
1988) documenting that individual stocks follow random walks even when indices as a whole do
not.
⁹ A Bonferroni correction controls the familywise error rate by dividing the alpha level (.05) by the
number of hypotheses tested, in this case 20, leaving a new post-correction alpha of .0025.
There are several issues with these econometric tests, chief among them that the sample is
so small relative to the population as a whole that little can actually be extrapolated or inferred
from the results. It could very well be that in this small subset (.6% of listed assets on the
NASDAQ) stocks do in fact follow random walks. Because the sample size was so limited, the
power of the test is low, and general conclusions regarding the NASDAQ market as a whole, or
US stock markets more broadly, cannot be drawn.
Additionally, as stated before, the joint hypothesis problem effectively renders any
empirical counterargument logically invalid; while this test failed to reject the null hypothesis,
even a rejection would have been logically inconclusive and open to a simple counterargument.
However, while rejecting the null hypothesis is not sufficient to reject market efficiency, it
should also be noted that failing to reject it is not sufficient to accept market efficiency, as has
already been stated.
With more time I would have liked to also examine non-NASDAQ exchanges, both
domestic and international, in particular Chinese exchanges like the Shenzhen Stock Exchange
or Shanghai Stock Exchange, because the Chinese exchanges are generally less open to foreign
investors, which may reduce the efficiency of the market as a whole. Furthermore, I would like
to examine stock prices during a set of financial crises or bear-market periods to see whether
during those time frames there is sufficient evidence to reject the null hypothesis.
The conclusion that these stocks follow random walks over the ten-year period is
somewhat surprising, since this time frame includes the 2007-2009 financial crisis, when
domestic stock prices generally fell across the board¹⁰, as well as the 2009-2014 period, when
stock prices generally rose. It seems that there should have been sufficient drift in this time
frame to lead to a rejection of the null hypothesis that there is an even probability of
continuations and

¹⁰ The Dow Jones Index reached a relative minimum on March 6, 2009, when it bottomed out at 6626 points. Since
then the Dow and the two other major indices (the S&P 500 and the NASDAQ) have all trended upwards to record
highs. (Google Finance)
reversals. My conjecture is that the consistent downward drift is countered by micro-movements
in day-to-day trading: slight upticks followed by major drops, or vice versa in the post-2009
world. Because the test merely counts continuations versus reversals and ignores their
magnitudes, large upward or downward trends with slight jitters in the opposite direction can
appear to be random walks according to this test, even if someone looking at the price chart
would come to the opposite conclusion.
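This blind spot can be demonstrated with a contrived series; a sketch using hypothetical, asymmetric price moves (the +1/-3 magnitudes are chosen purely for illustration):

```python
import random

def continuation_share(changes):
    """Fraction of consecutive pairs of price changes that share a sign."""
    pairs = list(zip(changes, changes[1:]))
    cont = sum(1 for x, y in pairs if x * y > 0)
    return cont / len(pairs)

# Hypothetical asymmetric moves: +1 or -3 with equal probability. The
# series trends sharply downward (expected move of -1 per step), yet the
# signs of the moves are symmetric, so a continuation/reversal count
# sees something indistinguishable from a random walk.
random.seed(0)
changes = [1 if random.random() < 0.5 else -3 for _ in range(10_000)]
print(round(continuation_share(changes), 2))  # close to 0.50
print(sum(changes) < 0)                       # True: strong downward drift
```

The continuation share sits near .50 even though the cumulative price falls dramatically, which is exactly the failure mode conjectured above: a sign-only test cannot distinguish a symmetric random walk from a trending series with symmetric signs but asymmetric magnitudes.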
In addition to the ten-year and three-month periods, I also examined these stocks over a
specific nineteen-month period, from August 1st, 2007, to March 31st, 2009. These dates were
selected to cover what is effectively the entirety of the 2007-2009 financial crisis and recession.
The NASDAQ dropped roughly 1,215 points from August 1st, 2007 to its relative minimum on
March 6th, 2009, and so by examining this data set I hoped to determine whether the statistical
test being used can identify a major downward trend. Of the twenty stocks in my sample,
fourteen had data during this period.
In this nineteen-month period I found insufficient evidence to reject the random walk
hypothesis for thirteen of the fourteen qualifying stocks from my initial sample. The one
rejection of the random walk hypothesis survived a Bonferroni correction, minimizing the
chance that the null hypothesis was rejected incorrectly. Still, one stock out of fourteen is a
higher rate than would be expected from pure randomness. Thus a potential conclusion is that
during this period stocks on the NASDAQ did not move in Brownian motion, or more broadly
that stocks in general did not conform to the RWH. Limiting this type of broad conclusion are
the sample size and the population from which it is drawn. As has already been stated, the initial
sample was an incredibly small subset of the total stocks traded
on the NASDAQ, and an even smaller subset of stocks traded in the United States, and the small
sample reduces the inferential power of the test. Additionally, since the stocks were selected
only from the NASDAQ, there is no statistical basis for claims about stocks traded on other
exchanges or about stocks in general. Thus, while it is useful that this test casts doubt on the
RWH for this small subset of stocks during a severe market crash and recession, it does little to
provide general claims or evidence against the RWH or EMH.
Conclusion:
The EMH, at least in its weak form, provides a suitable explanation for how financial
markets work outside of the financial crises described by Aliber and Kindleberger (2011);
however, it seems unable to explain the behavior of asset prices during a crisis beyond simply
saying that the market is continuously pricing the asset correctly and that its value is changing
rapidly over time. It seems reasonable to think that, for there to be such a large price change, the
asset or set of assets around which the bubble was based must at some point have been
mispriced. This directly contradicts the general conclusion of the EMH that the market always
correctly prices an asset according to the information available. The only way around this is to
state that the price changes because more relevant information is being made available, which
could correspond to the 2007-2009 financial crisis as the true amount of risk associated with
CDOs and CMOs became apparent to asset owners. This information was always available to
some extent; it was simply never obtained. This seems much more in line with the explanation
of Grossman and Stiglitz (1980), in which increased returns are available to those who put in the
effort to obtain relevant data.
The logical consistency offered by the EMH makes it understandably attractive and
provides the ultimate barrier to its refutation. The joint hypothesis problem prevents its disproof
through empirical methods, and in its weak form it seems reasonable to conclude that past prices
are common knowledge and thus that its conclusions should follow, at least at a theoretical
level.
The EMH provides an idealized and potentially unattainable explanation for how markets
function. A more realistic or grounded description could revolve around the relative efficiency
of a financial market or of financial markets in general. Engineers describe the efficiency of an
engine or machine relative to a perfectly efficient version that is used only for comparison; no
one expects any existing model to actually attain that level of efficiency. Similarly, financial
markets may never reach perfect efficiency, even in a stable period, but the EMH provides a
measuring stick against which to compare existing markets. A major barrier to using the EMH
as an actual tool for determining efficiency is that a metric must first be constructed, a
potentially high barrier.
The empirical findings of this paper do not necessarily confirm or refute the EMH; rather,
they bear on its corollary, the RWH. The data suggest that over long periods the RWH holds, but
this does not necessarily mean the market was efficient over that time frame; nor does the
finding that the RWH does not necessarily describe asset prices during the August 2007-March
2009 period imply that the market was inefficient at the time. Although the EMH is shielded by
the joint hypothesis problem, evidence that the RWH did not hold during the recent crisis adds
weight to the argument that the EMH does not accurately describe financial markets during
periods of financial instability.
Darwinian mechanisms based on behavioral economics provide a more palatable and
compelling description of financial markets, in both stable and unstable times, than the EMH.
Market participants as a whole may not have rational expectations, and this can cause what is
effectively a mispricing of an asset. Those who have taken the effort to obtain superior
information see higher returns on their investments, while those who have not are eventually left
behind and weeded out of the market. I find the literature in this tradition more compelling than
that in support of the EMH. Because the EMH seems unable to explain the presence of bubbles
well, it cannot be considered a truly accurate way of describing financial markets.
References:
Aliber, R. Z., & Kindleberger, C. P. (2011). Manias, Panics, and Crashes: A History of
Financial Crises. New York: Palgrave Macmillan.
Arthur, W. B., Holland, J. H., LeBaron, B., Palmer, R., & Tayler, P. (1997). Asset Pricing Under
Endogenous Expectations in an Artificial Stock Market. The Economy as an Evolving Complex
System II.
Bachelier, L. (1900). Théorie de la spéculation [The theory of speculation]. Annales
Scientifiques de l'École Normale Supérieure, 3(17), 21–86.
Black, F. (1986). Noise. The Journal of Finance, 41(3), 529–543.
Chan, N., LeBaron, B., Lo, A., & Poggio, T. (1998). Information Dissemination and Aggregation in
Asset Markets with Simple Intelligent Traders. Massachusetts Institute of Technology.
Cowles, A., & Jones, H. E. (1937). Some A Posteriori Probabilities in Stock Market
Action. Econometrica, 5(3), 280–294.
Fama, E. F. (1963). Mandelbrot and the Stable Paretian Hypothesis. The Journal of Business, 36(4),
420–429.
Fama, E. F. (1965a). The Behavior of Stock Market Prices. The Journal of Business, 38(1), 34–105.
Fama, E. F. (1965b). Random Walks in Stock Market Prices. Financial Analysts Journal, 21(5), 55–
59.
Fama, E. F. (1970). Efficient Capital Markets: A Review of Theory and Empirical Work. The Journal
of Finance, 25(2), 383–417.
Foster, J., & Magdoff, H. (2008). The Financialization of Capital and the Crisis. Monthly Review, 1-1.
Grossman, S. J., & Stiglitz, J. E. (1980). On the Impossibility of Informationally Efficient
Markets. The American Economic Review, 70(3), 393–408.
Huberman, G., & Regev, T. (2001). Contagious Speculation and a Cure for Cancer: A Nonevent That
Made Stock Prices Soar. The Journal of Finance, 56(1), 387–396.
Jegadeesh, N., & Titman, S. (2001). Profitability of Momentum Strategies: An Evaluation of
Alternative Explanations. The Journal of Finance, 56(2), 699–720.
Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under
Risk. Econometrica, 47(2), 263–292.
Keynes, J. M. (1936). The General Theory of Employment, Interest, and Money. Palgrave Macmillan.
Kindleberger, C. P. (1978). Manias, Panics, and Crashes: A History of Financial Crises.
Lo, A. W. (2007). Efficient Markets Hypothesis. In L. Blume & S. Durlauf (Eds.), The New
Palgrave: A Dictionary of Economics (2nd ed.). New York: Palgrave Macmillan.
Michener, R. (1982). Variance Bounds in a Simple Model of Asset Pricing. Journal of Political
Economy, 90(1), 166–175.
Minsky, H. P. (1992). Financial Instability Hypothesis. Levy Economics Institute.
Samuelson, P. A. (1965). Proof That Properly Anticipated Prices Fluctuate Randomly. Industrial
Management Review, 6(2), 41–49.
Simon, H. A. (1955). A Behavioral Model of Rational Choice. The Quarterly Journal of
Economics, 69(1), 99–118.
Stiglitz, J. (2009). The Anatomy of a Murder: Who Killed America's Economy? Critical Review,
329–339.
Stockhammer, E. (2012). Rising Inequality as a Root Cause of the Present Crisis. Working Paper
Series, 282.
St. George, D. (2009). Fallout: In the Real Estate Boom, One Mother Took a Chance on
American Dream - and Lost Big. The Washington Post.
Taylor, J. (2009). Getting off track: How government actions and interventions caused, prolonged,
and worsened the financial crisis. Stanford, Calif.: Hoover Institution Press.
Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and
Biases. Science, 185(4157), 1124–1131.
Tversky, A., & Kahneman, D. (1981). The Framing of Decisions and the Psychology of
Choice. Science, 211(4481), 453–458.
U.S. Census Bureau. (2010). New Residential Construction. Retrieved April 26, 2015.
Appendix: Examples of computer-generated Brownian motion using a standard normal
distribution (µ = 0, σ = 1). [Figures not reproduced in this copy.]
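Paths of the kind the caption describes can be generated with a short standard-library simulation; a minimal sketch (no plotting; the function name, step count, and seeds are illustrative choices):

```python
import random

def brownian_path(n_steps, mu=0.0, sigma=1.0, start=0.0, seed=None):
    """Discrete Brownian-motion path: the cumulative sum of independent
    normal increments with mean mu and standard deviation sigma."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(path[-1] + rng.gauss(mu, sigma))
    return path

# Three sample paths with mu = 0 and sigma = 1, as in the figures.
for seed in (1, 2, 3):
    path = brownian_path(250, seed=seed)
    print(seed, round(path[-1], 2))
```

Plotting several such paths against the step index reproduces the familiar jagged, drift-free wandering that the RWH attributes to prices in an informationally efficient market.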