Efficiency of financial markets during crisis periods: fractal analysis approach

Using the Hurst exponent to assess the efficiency of financial markets. Analysis of the Russian financial market. Testing hypotheses about the efficiency of markets in calm and crisis periods. Predicting an impending crisis using the Hurst exponent.

Category: Economics and economic theory
Type: graduation thesis
Language: English
Date added: 01.09.2017
File size: 2.3 MB


Posted on http://www.Allbest.ru/


Federal State Autonomous Educational Institution of Higher Education

National Research University

Higher School of Economics

Faculty of Economic Sciences

Field of study: Economics

Educational programme: Economics and Statistics

Graduation qualification thesis

Topic:

Efficiency of financial markets during crisis periods: fractal analysis approach

(Russian title: Application of fractal analysis to study the impact of crises on the efficiency of financial markets)

Student: M.Yu. Vilkova

Supervisor: E.D. Kopnova,

Cand. Sci. (Tech.), Associate Professor

Moscow, 2017

Table of Contents

  • Introduction
  • 1. Theoretical foundation
    • 1.1 Efficient Market Hypothesis and its Criticism
    • 1.2 Fractal Market Hypothesis
    • 1.3 Review of usage of Hurst exponent for measuring markets' efficiency
    • 1.4 Fractal analysis and Hurst exponent
    • 1.5 Methods of Hurst exponent estimation
      • 1.5.1 R/S analysis
      • 1.5.2 Andrew Lo R/S method
      • 1.5.3 Detrended fluctuation analysis
      • 1.5.4 Detrending moving average method
    • 1.6 Estimating significance of Hurst exponent
  • 2. Empirical analysis
    • 2.1 Data preprocessing
      • 2.1.1 Data description and visual analysis
      • 2.1.2 Testing for structural breaks
      • 2.1.3 Stationarity tests
      • 2.1.4 Filtering data with GARCH (1,1) model
    • 2.2 Measuring efficiency of Russian financial market
      • 2.2.1 R/S analysis
      • 2.2.2 R/S-AL
      • 2.2.3 Andrew Lo rescaled range method
      • 2.2.4 Detrended Fluctuation Analysis
      • 2.2.5 Detrended Moving Average
      • Results
    • 2.3 Analysis of time-dependent Hurst exponent
  • Conclusion
  • References
  • Appendix 1
  • Appendix 2
  • Appendix 3

Introduction

The concept of financial markets' efficiency was developed in the 1960s, and it has since been one of the most discussed topics in financial economics. The Efficient Market Hypothesis (EMH) states that markets are efficient, meaning that assets always trade at their fair value because prices reflect all available information. In terms of market policy, this implies that no regulation is needed: the market is a self-regulating system.

EMH is repeatedly applied in theoretical models and empirical studies; however, it is also often criticised. The main argument against EMH is the existence of financial crises: according to EMH, they should not even be possible. From the criticism of EMH originated another hypothesis, the Fractal Market Hypothesis (FMH). Unlike EMH, it is able to explain crises. It assumes that a market is stable if it consists of a great number of investors with different investment horizons, which guarantees liquidity. When investors change their investment horizons (for instance, fundamental information becomes unreliable and long-term investors leave the market or shorten their horizons), the balance between the short and long run is distorted, the market becomes less liquid, and a crisis occurs.

FMH is closely related to the concept of fractals: self-similar structures that can be found in nature or exist as mathematical objects. It states that markets have a fractal structure, and, among other properties, this means that the process of price changes exhibits long memory. Long memory is a violation of EMH, because the latter assumes that price changes follow a random walk process (are independent of each other). Fractal analysis of time series proposes a measure of long memory: the Hurst exponent. The Hurst exponent was introduced by the British hydrologist H.E. Hurst in the middle of the last century, but it was first applied to financial time series significantly later, and it is still a popular measure for the analysis of financial markets. The Hurst exponent allows one to test EMH (if price changes are independent, it has a value close to 0.5; otherwise it differs from 0.5), and also FMH, which assumes that the Hurst exponent is higher than 0.5 in the market's stable state but should follow a decreasing trend before a crisis as a result of "nervousness" in investors' behaviour.
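As an illustration of this testing logic (a minimal sketch, not code from the thesis, which used Matlab and other tools), a plain rescaled-range (R/S) estimator of the Hurst exponent can be written in a few lines of Python. For independent returns the fitted exponent should lie near 0.5; note that plain R/S is known to be biased upward on short windows, which is one reason the corrected methods reviewed later exist:

```python
import math
import random

def rs_hurst(returns, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent with plain rescaled-range (R/S) analysis:
    average R/S over non-overlapping blocks of each size n, then take the
    least-squares slope of log(R/S) against log(n)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(returns) - n + 1, n):
            block = returns[start:start + n]
            mean = sum(block) / n
            # range of cumulative deviations from the block mean
            cum, running = [], 0.0
            for x in block:
                running += x - mean
                cum.append(running)
            r = max(cum) - min(cum)
            s = math.sqrt(sum((x - mean) ** 2 for x in block) / n)
            if s > 0:
                rs_values.append(r / s)
        log_n.append(math.log(n))
        log_rs.append(math.log(sum(rs_values) / len(rs_values)))
    mx = sum(log_n) / len(log_n)
    my = sum(log_rs) / len(log_rs)
    slope = sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs))
    slope /= sum((x - mx) ** 2 for x in log_n)
    return slope

random.seed(1)
iid_returns = [random.gauss(0.0, 1.0) for _ in range(4096)]
h = rs_hurst(iid_returns)
print(round(h, 2))  # roughly 0.5-0.6: plain R/S is biased upward on short windows
```

The window sizes here are arbitrary illustrative choices; a persistent (long-memory) series would yield a slope noticeably above 0.5, an antipersistent one below.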

The aims of this study are:

· to review the usage of the Hurst exponent for evaluating financial markets' efficiency (including different methods of Hurst exponent estimation and evaluation of its significance);

· to investigate the efficiency of the Russian financial market by testing two hypotheses: (1) markets are efficient during tranquil periods but lose efficiency during financial crises (corresponds to EMH); (2) the Hurst exponent can predict an upcoming crisis through a decreasing trend just before it (corresponds to FMH).

To achieve this aim the following tasks should be solved:

1) to consider the principles of EMH and FMH and their criticism, and to identify their differences, especially concerning market behaviour during crises;

2) to explain the idea of the Hurst exponent as a measure of long memory in time series, and to review different methods of its estimation;

3) to describe methods used for evaluating the significance of the Hurst exponent in the absence of a theoretical distribution;

4) to review recent literature that applies the Hurst exponent to the problem of market efficiency;

5) to divide the time series of the RTS Index from 11.01.2005 until 10.03.2017 into tranquil and crisis periods;

6) to estimate the Hurst exponent and its significance for tranquil and crisis periods using different methods;

7) to calculate the time-dependent Hurst exponent for the period from 11.01.2005 until 10.03.2010 using the Detrended Moving Average method.

The object of this study is the Russian financial market; the subject is the efficiency of the Russian financial market (in terms of the independence of price changes). From the methodological point of view, the study is divided into a qualitative part (based on previous literature) and a quantitative part (statistical analysis). In the quantitative part, various econometric techniques are used to prepare the data for analysis (dividing it into periods, removing the effects of heteroscedasticity); for the estimation of the Hurst exponent, R/S analysis, Detrended Fluctuation Analysis (DFA), Detrended Moving Average analysis (DMA) and Andrew Lo's corrected R/S method are used. In addition, a bootstrap procedure is applied to evaluate the significance of the estimated Hurst exponents. For the analysis of the Russian financial market, series of the RTS and RTSVX indexes were used; they are available on the website of the Moscow Stock Exchange. The analysis was performed with the help of statistical software: Excel, EViews, Gretl and Matlab. The DFA analysis was performed in Matlab using the procedure presented by Weron (2002), and the DMA analysis using the Matlab procedure proposed by Gu and Zhou (2010).

1. Theoretical foundation

1.1 Efficient Market Hypothesis and its Criticism

Market efficiency can be defined as the degree to which prices reflect available information. This concept was independently developed by Eugene F. Fama and Paul A. Samuelson in the 1960s in the form of the Efficient Market Hypothesis (EMH), and it has since been constantly applied in theoretical models and empirical studies. Economists find confirmations of this concept about as often as it is criticised (mostly by behavioural economists and psychologists). This constant research has led to the extension of EMH in many directions and to the development of alternative theories.

Idea of EMH

Samuelson came to the concept of efficient markets through his interest in temporal pricing models; he stated that prices are unpredictable if they incorporate all the available information and all the expectations of market participants. Fama, in contrast, conducted empirical research on stock price behaviour (Fama, 1965), and he was the first to formulate the EMH. Briefly, the idea of EMH is that prices fully reflect all available information; thus, all assets trade at their fair value, and it is impossible "to beat the market". The hypothesis is based on two fundamental economic insights: first, in a competitive market an excessive profit will be reduced or eliminated by new entry; second, prices depend on the information flow. EMH exists in three different forms (weak, semi-strong and strong), but the general positions about the impact of the information flow are that prices reflect all available information (for the weak form, all past information; for the semi-strong form, all current publicly available information; for the strong form, all information, including "insider" information) and react to new information instantaneously.

Random Walk Hypothesis

An efficient market is unforecastable because prices change randomly, and the more efficient the market is, the more random the sequence of price changes. This random behaviour is not an accident; it is the result of the simultaneous behaviour of independent investors. Attempting to make a profit, they exploit even small informational advantages, and in doing so they transfer their information to the whole market; prices then react to this information, and profit opportunities are eliminated.

The fact that EMH leads to randomness of price changes creates great possibilities for empirical testing of the hypothesis, because it means that price changes can be described by a random walk process. Therefore, a large body of literature tests EMH by testing the Random Walk Hypothesis (RWH). Lo (2007) writes that probably the first test of RWH was developed and applied to the stock market by Cowles and Jones (1937), who compared the frequency of sequences and reversals in stock returns (sequences are pairs of returns with the same sign, reversals are pairs with opposite signs). After the formulation of EMH, testing RWH for stock or asset prices became a common research design. Lo (2007) provides a review of papers on this topic; the majority of them find support for RWH using historical prices. Fama and French (1988) find negative serial correlation in US indexes, but this evidence is not sufficient to reject RWH at the usual levels of significance. Lo (1991) proposes another method of EMH testing. He constructs a test for long-term memory that is robust to short-term correlations. Long-term memory, like short-term memory, contradicts RWH and, hence, EMH, but it is more difficult to detect, because it implies that current prices are correlated with ones in the remote past. Lo finds that there is no evidence of long memory in stock market prices, so EMH is not violated. This test will also be applied in this study.
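The Cowles-Jones comparison of sequences and reversals can be sketched as follows (an illustrative Python snippet, not code from any of the cited papers): under a driftless random walk, same-sign and opposite-sign pairs of consecutive returns should be about equally frequent, so their ratio should be close to one.

```python
import random

def cowles_jones_ratio(returns):
    """Cowles-Jones statistic: the number of sequences (adjacent returns with
    the same sign) divided by the number of reversals (opposite signs).
    Under a driftless random walk the ratio should be close to 1."""
    signs = [1 if r > 0 else -1 for r in returns if r != 0]
    pairs = list(zip(signs, signs[1:]))
    sequences = sum(1 for a, b in pairs if a == b)
    reversals = len(pairs) - sequences
    return sequences / reversals

random.seed(0)
rw_returns = [random.gauss(0.0, 1.0) for _ in range(10000)]
cj = cowles_jones_ratio(rw_returns)
print(round(cj, 2))  # close to 1 for independent price changes
```

A ratio persistently above one would suggest trend-following (positively correlated) returns; below one, mean reversion.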

Criticism of EMH

As mentioned before, EMH has been criticised a lot, both empirically and theoretically. The most popular arguments of the theory's opponents are presented here; however, this is not a comprehensive list.

· Overreaction and underreaction. Some investors may be irrational and overreact or underreact to new information, for instance, buying stocks that have just experienced gains even though there are no other signs of future gains (an example of overreaction). In this case, the price is no longer fair, i.e. it does not correspond to the market value. Rational investors take the other side and bring the market back to a stable condition. The consequence is the phenomenon whereby contrarian strategies (strategies of buying "losers" and selling "winners") bring high returns. This phenomenon has been tested: DeBondt and Thaler (1985), using returns on the New York Stock Exchange from 1926 to 1982, find that the "winners" and "losers" of one 36-month period tend to reverse their performance over the next 36-month period. Bernard and Thomas (1990) write that investors tend to underreact to some information. This is also called "post-earnings announcement drift": a situation in which information from an earnings announcement takes several days to become fully incorporated into market prices, violating the statement of EMH that prices react instantly to new information.

· Behavioural criticism. EMH is constantly critiqued by behavioural economists, who argue that investors often, if not always, react irrationally, and their behaviour cannot be described by standard utility functions, as EMH assumes. The cause of this irrationality is the exposure of investors to various cognitive biases: overconfidence, confirmation bias, the hot-hand fallacy, loss aversion, miscalibration of probabilities, mental accounting and others. The impact of these behavioural biases on decision-making processes has been demonstrated in empirical experiments and games (see, e.g., Kahneman and Tversky, 1979). Behavioural finance is one of the relatively new branches of economics and one of the most promising alternatives to EMH.

· Impossibility of efficient markets. Grossman and Stiglitz (1980) develop a theory that is exactly the opposite of EMH. They argue that market inefficiency is the major reason for the market's existence. If markets are efficient, gathering information brings no profit, and, overall, there is little reason to trade. The authors show that the equilibrium in this case is degenerate. At the same time, the degree of market inefficiency determines investors' incentive to gather information, and their profit can be interpreted as a rent that compensates for the cost of information.

· Financial crises. The existence of financial crises is probably the main argument of the opponents of EMH. Financial bubbles occur when asset prices rise above the true economic value of the asset. EMH says that prices always incorporate all the available information; thus they reflect true economic value. Besides, according to EMH, investors are rational and are not affected by such phenomena as herd behaviour. In sum, economic bubbles should not exist under EMH. This discrepancy with reality was noticed soon after the development of the theory; it heated the discussion but did not fully discredit the hypothesis. After the financial crisis of 2007-2008, EMH was criticised as intensively as never before. Former Federal Reserve chairman Paul Volcker said that unjustified faith in market efficiency was one of the major causes of the financial crisis; some people in both academic and business circles went even further, stating that EMH was fully responsible for the crisis because it made investors and policy-makers too confident about the market's ability to regulate itself (Ball, 2009). At the annual American Economic Association (AEA) meeting in 2010, EMH was also extensively discussed, and not in a good way. For instance, Joseph Stiglitz (a long-time opponent of EMH) said that "The idea behind efficient market hypothesis is very powerful but not true" (Subramanian, 2010).

Despite all these accusations, EMH still has many supporters. Ball (2009) writes that if EMH is the main cause of crises, one can only wonder how they could have happened before EMH was formulated (historians usually refer to the 1637 Dutch "tulip mania" as the first bubble). Supporters of EMH argue that the hypothesis is often interpreted not in the way it was initially formulated. One example of this is the belief that collapses of large financial institutions (Lehman Brothers is the usual example) indicate the inefficiency of markets. However, these collapses do not contradict EMH; they even support it by demonstrating that, in a competitive market, an institution whose positions are too risky will one day lose, no matter how big it is.

The most plausible explanation of crises that does not conflict with EMH is connected with information quality. EMH states that all available information is reflected in prices, but it says nothing about what information is actually available. In other words, EMH focuses only on the demand side of the information market and not on the supply side. Available information can be insufficient or of poor quality; it can be distorted by managers in order to deceive investors, or simply published late. That was indeed the case in 2007, when the crisis began in the subprime mortgage market: investors had no information about how risky the mortgages pooled into mortgage-backed securities were. Focusing only on the demand side of information is one of the limitations of EMH, together with the assumptions of costless information and equal perception of information by different investors, and the disregard of the concept of liquidity.

In sum, despite all the new evidence, the increase in data availability and advances in statistical techniques, there is still no consensus about EMH among economists. Lo (2007) suggests that one reason for this is that EMH is not a well-defined hypothesis (as mentioned before, it says nothing about information characteristics or investor behaviour). EMH contains at least two independent aspects, the information content and the price-formation mechanism, so every test of EMH is a test of at least two, and probably more, hypotheses; hence its rejection or acceptance says little about the real nature of the phenomenon. This is also a reason why testing RWH should not be equated with testing EMH: RWH is only a consequence of market efficiency.

EMH can explain some patterns of financial markets, and like every theory, it has its limitations. The heated discussion around it has led to the flourishing of new branches of economic science, such as behavioural finance, and to the creation of new theories (e.g. the Adaptive Market Hypothesis and the Fractal Market Hypothesis). For this study, the most relevant alternative theory is the Fractal Market Hypothesis (FMH).

1.2 Fractal Market Hypothesis

The Fractal Market Hypothesis (FMH) originated from criticism of EMH. The author of FMH, Edgar Peters, noticed that the assumptions of EMH about the statistical characteristics of markets do not correspond to real data (Peters, 1994). He criticises EMH quite harshly, arguing that it was suggested only to give statisticians the right to use the assumptions of a random walk of prices and a normal distribution of price changes, even though these assumptions do not hold in reality. Thus, extreme observations (large changes in prices) are more frequent than expected under the normal distribution. This criticism may not always be fair, because Fama himself proposed "Pareto returns", a distribution with heavy tails. Peters analyses returns of the Dow Jones index, demonstrating that 5- and 90-day returns have a similar shape with heavy tails and high kurtosis, which differs significantly from the bell-shaped distribution. He calls this structure self-similar (because the distributions of returns have a similar shape on different investment horizons). The same is observed for stock prices and exchange rates. Studying the time structure of volatility, he finds that the standard deviation does not scale like the square root of time (as is standardly assumed). The analysis suggests that the standard deviation increases more slowly for investment horizons longer than 1000 days, and this corresponds to the well-known belief that long-term investors bear more risk than short-term investors (if risk is defined as standard deviation). Although up to 1000 days the standard deviation does scale like the square root of time, Peters concludes that the volatility structure also leads to a violation of the random walk hypothesis.
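Peters' scaling check can be illustrated on simulated data (a hedged sketch, not the thesis code; the horizons chosen are arbitrary): for independent returns, the standard deviation of τ-period aggregated returns grows like √τ, so the normalised ratio below stays near one, while a series with long memory would deviate from it.

```python
import math
import random

def horizon_std(returns, tau):
    """Standard deviation of non-overlapping tau-period aggregated returns."""
    agg = [sum(returns[i:i + tau]) for i in range(0, len(returns) - tau + 1, tau)]
    m = sum(agg) / len(agg)
    return math.sqrt(sum((x - m) ** 2 for x in agg) / len(agg))

random.seed(2)
daily = [random.gauss(0.0, 1.0) for _ in range(50000)]
s1 = horizon_std(daily, 1)
for tau in (5, 25, 100):
    # under sqrt-time scaling this normalised ratio stays near 1;
    # a persistent (long-memory) series would drift above 1
    ratio = horizon_std(daily, tau) / (s1 * math.sqrt(tau))
    print(tau, round(ratio, 2))
```

Applied to real index returns, a systematic departure of this ratio from one at long horizons is exactly the kind of evidence Peters reports for horizons beyond 1000 days.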

As a follow-up to his criticism of EMH, Peters (1994) presents FMH. FMH is based on a concept completely omitted in EMH: liquidity. Liquidity describes the ability of a market to buy or sell an asset at a stable price. If the market is stable, there is no trade-off between the speed of a deal and its price; it is therefore liquidity that ensures the price is close to the fair one. EMH says nothing about liquidity: it assumes either that the price is always fair regardless of liquidity, or that markets are always liquid. Markets were initially created to provide liquidity, which allows investors with different investment horizons to trade with each other. The heterogeneity of investors is a cornerstone of FMH, because it is the actual source of liquidity. Investors with different horizons perceive price changes differently. Imagine a situation: the price of an asset falls, and the fall is significant for day-traders, who start to sell the asset, causing a lack of liquidity; however, the fall is insignificant for traders with long investment horizons, so they can buy the asset and create additional liquidity. The impact of new information on prices is also considered with regard to investors' heterogeneity. Investors with different horizons treat newly inflowing information differently: day- or even minute-traders (noise traders) are influenced more by technical information, whereas long-term investors pay more attention to fundamental information.

FMH says that all investors bear the same level of risk, scaled to the length of their investment horizon. This statement is derived from the fact that the distributions of returns look similar for different horizons. This self-similar structure is the reason the hypothesis is called fractal.

FMH does not fail to explain financial crises. As stated before, the existence of investors with different horizons guarantees market stability. A crisis occurs when the fractal structure of the market is distorted. Such a distortion can happen if investors who had long investment horizons either stop participating in the market or shorten their horizons. When horizons become homogeneous, investors' offers are not met by reverse orders; again, there is not enough liquidity, the market goes into "free fall" and prices collapse. This can happen in periods of economic or political instability, when nobody is sure about the future or ready to bear high risk. Peters (1994) provides an example of such a situation. After the assassination of John Kennedy on 22 November 1963, stock prices took a sharp dive: the death of the president made future prospects very uncertain. Fundamental information lost its value, and long-term investors, who usually base their decisions on it, left the market. That morning the S&P500 lost about 3%, and at 2:07 p.m. the New York Stock Exchange was closed; it opened again only a few days later, when investors could make long-term forecasts again. One of the largest trading losses occurred on the first trading day after the 9/11 attacks (the S&P500 lost about 14%, the DJI 11%); chaos and uncertainty were the major forces driving the market even a week after the tragedy (the NYSE and Nasdaq did not open on 11 September and remained closed until 17 September).

Here are the main statements of FMH as formulated by Peters (1994):

1. The market is stable if it consists of a great number of investors with different investment horizons; this guarantees liquidity;

2. The information set differs with the length of the investment horizon: in the short term it consists more of technical information and "market mood", in the long term of fundamental information. This means that prices may incorporate not all available information (as under EMH), but only the information that is important for a particular investment horizon;

3. Price is a combination of short-term changes and long-term valuation. Short-term changes in prices are usually more volatile, but they say little about real economic tendencies; they are mainly the result of profit-taking by short-term investors or of crowd behaviour;

4. When investors change their investment horizons and the balance between the short and long run is distorted, the market becomes less liquid and unstable.

FMH did not become the subject of heated debate as EMH did, because its statements are less controversial. Most works on FMH focus not on the theory but on the statistical technique that Peters (1994) proposes for the analysis of fractal time series (a review is presented further in this paper). However, FMH was not forgotten after the 1990s. Haldane (2011) suggests that it accounted for the lack of market liquidity during the 2010 Flash Crash (a stock crash that lasted 36 minutes but caused prices to drop considerably). Anderson and Noss (2013) argue that FMH suggests a number of possible measures policymakers can implement to maintain market stability. For instance, the introduction of "circuit breakers", mechanisms that stop all trading in an asset if its price moves too rapidly, can give long-term investors a chance to regain their confidence in market information. Another measure against crises is setting a minimum delay between the submission and execution of a transaction. According to FMH, this will slow down trading and give traders more time to evaluate inflowing information (but it will also raise transaction costs).

It cannot be said that FMH is better than EMH. Like every theory, it has its limitations and describes only a part of reality. Concerning financial crises, it can explain the behaviour of investors that triggers the mechanism of a crisis, but it says nothing about, for instance, the formation of market bubbles. FMH places emphasis on aspects that were hardly taken into account before: investment horizons and liquidity. It turns out that these concepts can explain such phenomena as crowd behaviour from an economic rather than a psychological point of view. The theory suggests some characteristic features to look for in market behaviour.

1.3 Review of usage of Hurst exponent for measuring markets' efficiency

The Hurst exponent approach has already been applied to the analysis of critical points in market behaviour several times. All the existing work on this topic can be divided into two categories:

1) Studies whose aim is to evaluate market efficiency during crisis periods. As stated before, market efficiency, according to EMH, is a complicated concept that includes many aspects and thus cannot be tested as a single hypothesis. Here, efficiency means that price changes are independent (or exhibit only short memory) and long memory is absent; usually this is referred to as relative efficiency. The Hurst exponent is quite a unique indicator in that it can detect long memory (certainly, long memory is a departure from the random walk, but it cannot be detected by standard RWH tests). The hypothesis of most of these papers is that markets are less efficient during crisis periods than during tranquil periods (that is, the Hurst exponent is higher during crises).

2) Studies that investigate the dynamics of the Hurst exponent (the time-dependent Hurst exponent) in order to answer the question of whether it can predict an economic crisis. The idea that the Hurst exponent can be used as a forecasting tool comes directly from FMH. It is assumed that during periods of stability and steady market tendencies, price changes exhibit long memory, but as the market becomes uncertain and short-term investors start to dominate, trading activity grows and price changes become more volatile. This means that the Hurst exponent should drop shortly before the critical point, predicting the coming crisis.

It is not hard to notice that the hypotheses of these two directions of research conflict with each other: one assumes that a tranquil market is efficient, i.e. shows no signs of long memory; the other assumes exactly the opposite. This situation is not so puzzling if one recalls that these ideas about price behaviour during crises originated from two different theories: EMH and FMH. This study is an attempt to combine ideas from these two directions. A review of particular studies is presented below to provide a better understanding of the hypotheses and the methods of testing them.

Efficiency evaluating

Though measuring efficiency and testing RWH with the Hurst exponent is a traditional and quite popular approach (e.g. Weron, 2002; Onali and Goddard, 2011), it has been applied to comparing efficiency across different time periods only recently. Cajueiro et al. (2009) used it in the context of market reforms: it was found that the liberalisation of the financial market in Greece in the early 1990s improved its efficiency.

Probably the first time such a research design was applied to the problem of crises was by Horta et al. (2014). In this paper, the authors investigate the indexes of eight countries during the Global financial crisis and the European sovereign debt crisis. The local Hurst exponent is calculated using the MFDMA technique for three periods: a tranquil period (from 3.01.2005 to 31.07.2007), the Subprime crisis period (from 1.09.2007 to 7.12.2009) and the European sovereign debt crisis (from 8.12.2009 to 27.04.2012). The results show that for all the indexes the Hurst exponent is higher during the financial crisis period than during the tranquil period. However, for the most developed markets (the US, the UK and Japan) it remains low (probably because these markets are more efficient than the others; this corresponds to the results obtained by Kristoufek (2010)). The other markets (France, Belgium, Portugal etc.) exhibit long memory during the crisis (high values of the Hurst exponent). During the European sovereign debt crisis, the Hurst exponent for all markets is about 0.5: higher than before the global crisis, but lower than during it. The authors explain this by the increase in the confidence of investors, who returned to the market in spite of the recent crisis and brought additional liquidity. Moreover, the European debt crisis had a different nature and caused a smaller fall in the stock markets than the 2008-2009 crisis.

Jin (2016) follows the research design of Horta et al. (2014), provides an analysis for Asian stock markets and obtains similar results, although Asian markets differ from European ones. The study finds that Asian markets are efficient during most of the periods, but lose their efficiency during the 2007-2008 crisis.

Dynamics of Hurst Exponent

The first attempt to predict critical points in market behaviour using the Hurst exponent was made by Grech and Mazur (2004); they supposed that the local Hurst exponent would change its trend prior to a change in prices. The market becomes "nervous" before a crisis, so that daily changes in prices become less correlated than in the stable state, even if there is still no drastic change in market behaviour. A value of the Hurst exponent H > 0.5 indicates the existence of long-range correlations, or the "persistence" of the process, and corresponds to a strong, long-lasting trend. On the contrary, a low value of the Hurst exponent indicates the "nervousness" of the market, or the "antipersistence" of the process (changes in prices look as if they have a random nature). Therefore, the authors' hypothesis was that H is greater than 0.5 while prices follow a specific trend, and that it drops significantly if this trend is going to change in the immediate future.
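A rolling-window local Hurst estimate in the spirit of Grech and Mazur (2004) can be sketched as follows. This is an illustrative Python implementation with a simplified linear-detrending DFA, not the procedure used in the thesis (which relied on Weron's and Gu-Zhou's Matlab code); the 215-day window follows Grech and Mazur, while the step and box sizes are arbitrary choices:

```python
import math
import random

def dfa_hurst(returns, box_sizes=(8, 16, 32, 64)):
    """Simplified DFA(1): integrate the demeaned series, remove a linear
    trend inside each box, and fit log F(s) = H*log(s) + c."""
    mean = sum(returns) / len(returns)
    profile, running = [], 0.0
    for r in returns:
        running += r - mean
        profile.append(running)
    log_s, log_f = [], []
    for s in box_sizes:
        sq = []
        for start in range(0, len(profile) - s + 1, s):
            seg = profile[start:start + s]
            xs = range(s)
            # least-squares line fit inside the box
            bx, by = (s - 1) / 2.0, sum(seg) / s
            slope = sum((x - bx) * (y - by) for x, y in zip(xs, seg))
            slope /= sum((x - bx) ** 2 for x in xs)
            sq.extend((y - by - slope * (x - bx)) ** 2 for x, y in zip(xs, seg))
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(sum(sq) / len(sq)))
    mx = sum(log_s) / len(log_s)
    my = sum(log_f) / len(log_f)
    h = sum((a - mx) * (b - my) for a, b in zip(log_s, log_f))
    h /= sum((a - mx) ** 2 for a in log_s)
    return h

def rolling_hurst(returns, window=215, step=25):
    """Local (time-dependent) Hurst exponent over a sliding window."""
    return [dfa_hurst(returns[i:i + window])
            for i in range(0, len(returns) - window + 1, step)]

random.seed(3)
series = [random.gauss(0.0, 1.0) for _ in range(2000)]
hs = rolling_hurst(series)
print(len(hs), round(sum(hs) / len(hs), 2))  # number of windows, mean H near 0.5
```

On real index returns, one would plot `hs` against time and look for a sustained decline of the local H ahead of a crash, which is the pattern Grech and Mazur report.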

This hypothesis is tested on the examples of three market crashes: September 1929, October 1987 and July 1998. The authors use historical data on the Dow Jones Industrial Average (DJIA) index (1896-2003) and apply the DFA method with a time window of 215 days (they also provide an estimation of the optimal time-window length, which will be discussed later). The hypothesis is verified in all three cases. The first crisis, the Wall Street crash, began on 24.10.1929 (also known as "Black Thursday"), but the change of trend (from positive to negative) occurred about 50 days earlier, on 3.09.1929. A decreasing trend in the local Hurst exponent started about a month before that date and did not end with it. It reached its minimum two weeks before the crash (at a value of about 0.45, compared to 0.65 before the change in trend), after which it started recovering (this can be explained by the strong decreasing trend in the index values).

Even if the decreasing trend before the crisis is clear, it could have happened accidentally, because the calculations show that the Hurst exponent is not stable. To check this, the authors also analyse the crashes of 1987 and 1998. The crash of 1987 is characterized by a huge fall of value in a very short time: on 19.10.1987 the DJIA lost 22.6% of its value, the largest one-day decline in the history of this index. However, as Grech and Mazur (2004) show, the Hurst exponent had been decreasing for a year, predicting the crash, while the index value was constantly rising. Again, the Hurst exponent reached its minimum after the change of the trend from increasing to decreasing but just before the major crash. For the crisis of 1998 the authors obtain similar results; the Hurst exponent showed an even larger decline (the minimum is smaller than 0.35).

Though these results have some limitations (for instance, the period of decrease of the Hurst exponent differs from crisis to crisis, making precise predictions impossible), the hypothesis is verified by three very different crises, which means that the decrease in the Hurst exponent before crises can hardly be a coincidence. This study was a pioneering work showing that the Hurst exponent can detect investors' sceptical attitude about the market's future while indexes are still growing, even though it cannot give precise forecasts of the crisis date.

Kristoufek (2012) also uses the DFA method, but with a larger time-window (N = 500 days, about two trading years). The hypothesis suggests the existence of a decreasing trend in the local Hurst exponent before the Global financial crisis of 2007-2008. As the crisis started in the USA (and was mainly caused by problems of this financial market), three US indexes are chosen: the Dow Jones Industrial Average Index (DJI), the NASDAQ Composite Index (NASDAQ) and the S&P500 Index (SPX). Calculations show that the Hurst exponent started decreasing for all three indexes long before the crisis (from the beginning of 2006 until the beginning of 2007) and remained below 0.5 during the crisis. Moreover, the longest period of low H belongs to the SPX, which also had the longest recovery period, and the shortest to the NASDAQ, the fastest-growing market after the crisis among these three.

This technique was applied not only to the US market: Grech and Pamula (2008) investigated critical points in the behaviour of the WIG20, the main Polish stock index, with the help of the Hurst exponent (and actually predicted the upcoming financial crisis of 2008-2009), and Kristoufek (2010) used it to analyse the PX50 index of the Czech Republic. These studies have similar results, confirming that the Hurst exponent is a useful tool for detecting future critical events.

Some conclusions from the previous research

The review of previous research suggests the following conclusions that are important for the empirical part of this study:

1) The local Hurst exponent has a decreasing trend during the period before a crisis (or a low value over this period);

2) The length of the decreasing period and the points of its start and end vary from crisis to crisis, making all predictions uncertain;

3) There is no common opinion about the value of the Hurst exponent during crises: Kristoufek (2010) argues it should remain low, while Grech and Mazur (2004) show that it recovers quickly;

4) Some studies explain the decreasing trend of the Hurst exponent as predicting the upcoming crisis, in accordance with the FMH, while EMH studies suppose that a low value before the crisis is the normal (efficient) state;

5) FMH studies explain the higher value of the Hurst exponent during a crisis as a recovering trend, while EMH studies refer to it as an inefficient state;

6) These contradictions are caused not only by different underlying theories but also by differences in research design: first, the value of the estimated Hurst exponent differs with the methods used; second, the division into periods matters;

7) Results for developed and emerging markets are most likely different, so, for example, applying theories from developed markets to emerging ones can lead to groundless forecasts.

1.4 Fractal analysis and Hurst exponent

The Hurst exponent characterises long-term dependence in time series. It was initially suggested by the British hydrologist Edwin Hurst (1951), who studied statistical properties of the Nile River overflows for the practical task of optimal dam size determination. His work was inspired by Einstein's work on Brownian motion, according to which the absolute value of a particle's displacement is proportional to the square root of time. Hurst assumed that the rescaled range (a statistical measure of time series variability, also proposed by Hurst) is also a function of time. Studying the Nile River overflows, he found that the rescaled range grows faster than the square root of time. He suggested the following relationship:

\[
R/S = c \cdot n^{H},
\]

where c is a constant, n is the time span (the number of observations in the series) and H is the coefficient that Benoit Mandelbrot later named the Hurst exponent in honour of Edwin Hurst. For the Nile River series the value of H was 0.91 (higher than 0.5), which meant that the Nile overflows were correlated (explained, e.g., in Shiryaev (1988), p. 273). Hurst then studied other natural time series (such as temperatures and rainfalls), and for all of them he estimated H higher than 0.5. The Hurst exponent is sometimes also called the Hurst constant or the index of dependence. Its value lies between 0 and 1, and it equals 1/2 for independent series. A Hurst exponent larger than 0.5 indicates long-term positive correlations, or the “persistence” of the series; a value smaller than 0.5 indicates anti-persistence, which means that the process changes its sign more frequently than a random process does.

The Hurst exponent is closely connected with the fractal dimension, a concept from fractal geometry developed by Benoit Mandelbrot (1967). Fractals are objects whose parts are organized like the whole. Mathematicians have not agreed on an exact definition of fractals, but self-similarity is one of the defining properties. Fractals can be mathematical objects, but they can also be found in nature. A well-known example is an oak tree: from a distance it is a trunk with some large branches; if we come closer, we will see that every branch consists of a number of smaller branches, and so forth. This structure is called self-similar, or fractal. In the case of the oak tree (and all other natural fractals) the self-similarity is approximate; mathematical fractals can exhibit exact self-similarity (be identical at every scale), a popular example being the Koch snowflake. Fractal theory has many branches and applications; for this study the important fact is that fractals are not limited to geometrical objects, they can also describe processes in time.

It was noticed that financial time series are characterised by self-similarity, so that distributions of returns look similar on different investment horizons (Peters, 1994). Shiryaev (1998) formulates it as follows:

If $S_t$, $t = 0, 1, 2, \ldots$, are daily prices, then the empirical probability density functions $f_{\Delta}(x)$ and $f_{k\Delta}(x)$, $k > 1$, for the return series

\[
h_t(\Delta) = \log\frac{S_t}{S_{t-\Delta}} \quad \text{and} \quad h_t(k\Delta) = \log\frac{S_t}{S_{t-k\Delta}}
\]

demonstrate:

\[
f_{k\Delta}(x) \approx k^{-H} f_{\Delta}\!\left(k^{-H} x\right).
\]

This property was explained within the framework of the concept of self-similarity and gave rise to the fractal analysis of financial time series. Shiryaev (1998) gives the definition of statistical self-similarity:

A stochastic process $X = (X(t))_{t \ge 0}$ is self-similar if for every $a > 0$ a constant $b > 0$ can be found such that

\[
\mathrm{Law}(X(at),\ t \ge 0) = \mathrm{Law}(b\,X(t),\ t \ge 0).
\]

If $b = a^{H}$, then the process is a self-similar process with Hurst exponent H.

For Brownian motion $B = (B(t))_{t \ge 0}$ with $\mathrm{Var}\,B(t) = t$ we have:

\[
\mathrm{Var}\,B(at) = at = a\,\mathrm{Var}\,B(t),
\]

which means that the self-similarity property looks like:

\[
\mathrm{Law}(B(at),\ t \ge 0) = \mathrm{Law}\!\left(a^{1/2}B(t),\ t \ge 0\right).
\]

Hence, Brownian motion is a self-similar process with Hurst exponent equal to $1/2$.

Processes with Hurst exponent different from $1/2$ are defined as fractional Brownian motion (or, sometimes, Brownian motion is also defined as fractional Brownian motion with $H = 1/2$).

Fractional Brownian motion (fBM) $B_H = (B_H(t))_{t \ge 0}$ is defined as a Gaussian process with zero mean and covariance function

\[
\mathrm{Cov}\left(B_H(t), B_H(s)\right) = \frac{1}{2}\left(t^{2H} + s^{2H} - |t - s|^{2H}\right). \quad (1)
\]

A comprehensive review of fBM is presented, for instance, in Shiryaev (1998).

Analogous to the increments of Brownian motion, which form white noise, the increment series

\[
X(t) = B_H(t+1) - B_H(t),
\]

where $B_H$ is fBM, is fractional noise with Hurst exponent H. Hence, the covariance function for fractional noise can be derived using covariance formula (1):

\[
\gamma(n) = \mathrm{Cov}\left(X(t), X(t+n)\right) = \frac{1}{2}\left(|n+1|^{2H} - 2|n|^{2H} + |n-1|^{2H}\right). \quad (2)
\]

Then, for $n \to \infty$:

\[
\gamma(n) \sim H(2H-1)\,n^{2H-2}. \quad (3)
\]

If $H = 1/2$, then $\gamma(n) = 0$ for $n \ge 1$ and $X(t)$ is a sequence of independent variables. If $H \ne 1/2$, then from (3) it can be seen that the covariance declines slowly with the increase of n, and this can be interpreted as the presence of long memory. The cases of $H < 1/2$ and $H > 1/2$ are different:

· If $H < 1/2$, then the covariance is negative ($\gamma(n) < 0$);

· If $H > 1/2$, then the covariance is positive ($\gamma(n) > 0$).

Positive covariance means that positive (negative) values of $X(t)$ tend to be followed by positive (negative) values in the future; such a process is persistent. Negative covariance means that positive values tend to be followed by negative ones and vice versa; this effect is sometimes called “fast intermittency”, and the process is called anti-persistent. Positive covariance and persistence are what Hurst observed in natural time series and what is often observed in financial time series.
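The sign pattern above can be checked numerically from formula (2). Below is a minimal Python sketch (the function name is illustrative):

```python
# A numerical check of formula (2) for fractional noise.

def fgn_autocovariance(n, H):
    """Autocovariance of unit-variance fractional noise at lag n, by formula (2)."""
    return 0.5 * (abs(n + 1) ** (2 * H) - 2 * abs(n) ** (2 * H) + abs(n - 1) ** (2 * H))

print(fgn_autocovariance(1, 0.5))        # 0.0: independent increments
print(fgn_autocovariance(1, 0.7) > 0)    # True: persistent process
print(fgn_autocovariance(1, 0.3) < 0)    # True: anti-persistent process
```

The covariance also decays with the lag for $H > 1/2$, in agreement with the asymptotic formula (3).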

The Hurst exponent is related to the fractal dimension D by the formula $D = 2 - H$ (see, e.g., Mandelbrot et al. (1984)). Therefore, processes with higher H (persistent processes) have D close to 1: they behave in time like an ordinary line and are less erratic and less “nervous”, while processes with low H, and consequently higher D, are more erratic.

Therefore, the Hurst exponent is a useful measure for financial market analysis. Its value can give some insights into the nature of the price-change process. As the Hurst exponent can detect long-range dependence in time series, it can be used as a tool for testing the EMH and the FMH, which is exactly what will be done in the empirical part of this study. Moreover, the Hurst exponent can be used for the detection of hidden cycles in data sets (see, e.g., Peters, 1994). There are various methods of Hurst exponent estimation. The first one was R/S analysis, and it is still the most popular method. However, it has some drawbacks, so other techniques were developed, among them Fluctuation Analysis, Detrended Fluctuation Analysis, Detrending Moving Average Analysis, Periodogram Regression and Average Wavelet Coefficient Analysis. In this study three of these methods will be considered: R/S analysis, Detrended Fluctuation Analysis and Detrending Moving Average Analysis.

1.5 Methods of Hurst exponent estimation

1.5.1 R/S analysis

R/S analysis was first introduced by the British hydrologist Hurst (1951), who studied statistical properties of the Nile River overflows for the practical task of optimal dam size determination. The R/S statistic (rescaled range) is the range of partial sums of deviations of a time series from its mean, rescaled by the standard deviation. This technique was inspired by Einstein's work on Brownian motion, according to which the absolute value of a particle's displacement is proportional to the square root of time. So, the idea of R/S analysis is the dependency between the rescaled cumulative deviations from the mean (displacement) and the number of data points used in the analysis (time). The Hurst exponent is the scaling exponent H in $(R/S)_n = c \cdot n^{H}$; it can be found by calculating the rescaled range statistic for various values of n. The detailed algorithm is presented below.

1) The time series of length N is divided into M periods, each of length n, so that $M \cdot n = N$. Each period is labelled $I_m$, $m = 1, \ldots, M$; each element in $I_m$ is labelled $X_{i,m}$, $i = 1, \ldots, n$.

2) For each period $I_m$ the mean value and the standard deviation are calculated:

\[
e_m = \frac{1}{n}\sum_{i=1}^{n} X_{i,m}, \qquad S_m = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(X_{i,m} - e_m\right)^2}.
\]

3) New series of accumulated departures from the mean are created for each period:

\[
Y_{i,m} = \sum_{j=1}^{i}\left(X_{j,m} - e_m\right), \quad i = 1, \ldots, n.
\]

4) For each period the range of accumulated departures from the mean is calculated:

\[
R_m = \max_{1 \le i \le n} Y_{i,m} - \min_{1 \le i \le n} Y_{i,m}.
\]

5) The rescaled range value for the length n is defined as:

\[
(R/S)_n = \frac{1}{M}\sum_{m=1}^{M} \frac{R_m}{S_m}.
\]

6) Then this procedure is repeated for a range of different period sizes n. The Hurst exponent can be found as the slope coefficient of the regression:

\[
\log (R/S)_n = \log c + H \log n.
\]
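The algorithm above can be sketched in Python roughly as follows (a simplified illustration, not the exact implementation used in the empirical part; function names are illustrative):

```python
# A sketch of the R/S algorithm; numpy is assumed.
import numpy as np

def rescaled_range(x, n):
    """Average R/S value over all full non-overlapping periods of length n."""
    M = len(x) // n                              # step 1: M periods of length n
    rs = []
    for m in range(M):
        window = x[m * n:(m + 1) * n]
        e, s = window.mean(), window.std()       # step 2: mean and std
        y = np.cumsum(window - e)                # step 3: accumulated departures
        r = y.max() - y.min()                    # step 4: range
        rs.append(r / s)
    return np.mean(rs)                           # step 5: rescaled range for n

def hurst_rs(x, sizes):
    """Step 6: slope of log(R/S)_n on log(n) estimates the Hurst exponent."""
    log_n = np.log(sizes)
    log_rs = np.log([rescaled_range(x, n) for n in sizes])
    H, _ = np.polyfit(log_n, log_rs, 1)
    return H

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                    # independent series
h = hurst_rs(x, [16, 32, 64, 128, 256])
print(h)                                         # expected near 0.5
```

For an independent series the estimate lies near 0.5, although, as discussed below, the R/S estimate is biased upwards in finite samples.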

However, R/S analysis has some shortcomings. First of all, the value of $H = 0.5$ for independent series is an asymptotic limit, and finite-length time series never give the true value of H. For short data sets the estimated Hurst exponent of independent series is significantly higher than 0.5, which can lead to incorrect conclusions (shown in Weron (2002) and discussed later in this paper). The second shortcoming is the stationarity assumption: R/S analysis requires that the characteristics of the process (mean and variance) remain the same over different time periods. If the process is not stationary, R/S analysis can find long-range correlations where there are none (Couillard, Davison, 2004). The third shortcoming is the fact that R/S analysis is sensitive to short-range correlations and can mistakenly recognise them as long-term dependence. This last shortcoming was noticed by Andrew Lo; it is discussed in the next paragraph.

1.5.2 Andrew Lo R/S method

Andrew Lo (1991) argues that the classical R/S statistic has one important shortcoming: it is sensitive to short-range dependence. The specific behaviour of the R/S statistic that is associated with long memory is not necessarily caused by its presence; it can be just a symptom of short-range dependence. For instance, it is shown through simulations that the Hurst exponent obtained by the classical R/S method rejects the null hypothesis of no long memory 47% of the time for a stationary Gaussian AR(1) process with an autoregressive parameter of 0.3.

To distinguish between short and long memory, the statistic should be invariant to short-range dependence (in its different forms) but still sensitive to long-range dependence. Lo (1991) suggests the modified R/S statistic (it can also be referred to as the Andrew Lo rescaled range method). It is denoted as

\[
Q_n = \frac{1}{\hat{\sigma}_n(q)}\left[\max_{1 \le k \le n}\sum_{j=1}^{k}\left(X_j - \bar{X}_n\right) - \min_{1 \le k \le n}\sum_{j=1}^{k}\left(X_j - \bar{X}_n\right)\right],
\qquad
\hat{\sigma}_n^2(q) = \hat{\sigma}_X^2 + 2\sum_{j=1}^{q}\omega_j(q)\,\hat{\gamma}_j,
\]

where $\hat{\sigma}_X^2$ and $\hat{\gamma}_j$ are the usual variance and autocovariance estimators of X, and $\omega_j(q)$ are the Newey and West (1987) weights:

\[
\omega_j(q) = 1 - \frac{j}{q+1}, \qquad q < n.
\]

This statistic differs from the classical one only in its denominator: the square root of a consistent estimator of the partial sums' variance, which includes not only the variance but also the autocovariances in the case of short-range dependence. The choice of lag q is at the researcher's discretion: q should grow with the number of observations, but should not be too high, because then the distribution of the estimator differs from its asymptotic limit (Lo, 1991). However, a small q can lead to the omission of significant autocovariances beyond lag q. Therefore, the choice of q is data-dependent. Andrews (1991) provides a rule: q should not be larger than

\[
k_n = \left(\frac{3n}{2}\right)^{1/3}\left(\frac{2\hat{\rho}}{1-\hat{\rho}^2}\right)^{2/3},
\]

where $\hat{\rho}$ denotes the first-order autocorrelation coefficient of the series.

For the modified R/S statistic the asymptotic distribution under the null hypothesis of no long memory has been derived, so this hypothesis can be easily tested: the 95% confidence interval for the statistic normalized by dividing by $\sqrt{n}$, $V_n = Q_n/\sqrt{n}$, is [0.809; 1.862].
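A minimal sketch of the modified statistic, normalized as $V_n = Q_n/\sqrt{n}$ so that it can be compared with the interval [0.809; 1.862] (an illustration of the formulas above; the function name is an assumption):

```python
# A sketch of Lo's modified R/S statistic V_n = Q_n / sqrt(n).
import numpy as np

def lo_modified_rs(x, q):
    n = len(x)
    dev = x - x.mean()
    partial = np.cumsum(dev)
    r = partial.max() - partial.min()            # same numerator as classical R/S
    sigma2 = np.mean(dev ** 2)                   # usual variance estimator
    for j in range(1, q + 1):
        w = 1 - j / (q + 1)                      # Newey-West weight
        gamma_j = np.mean(dev[j:] * dev[:-j])    # autocovariance at lag j
        sigma2 += 2 * w * gamma_j
    return r / np.sqrt(sigma2) / np.sqrt(n)      # normalized statistic V_n

rng = np.random.default_rng(1)
v = lo_modified_rs(rng.standard_normal(2000), q=5)
print(0.809 < v < 1.862)   # for i.i.d. data the null is usually not rejected
```

For short-range dependent data (e.g. an AR(1) series) the autocovariance terms inflate the denominator, which is exactly what shields the statistic from the false positives of the classical R/S method.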

Teverovsky et al. (1999) studied the behaviour of the modified R/S statistic and concluded that it has “serious drawbacks”. They found that with the increase of q the statistic has a strong bias towards accepting the null hypothesis of no long-term memory. To avoid this problem, Teverovsky et al. (1999) advise using the classical R/S statistic and handling short-term correlations with other techniques (e.g. using residuals instead of the original series).

1.5.3 Detrended fluctuation analysis

Detrended fluctuation analysis (DFA) was introduced in 1994 and first applied to the analysis of DNA sequences (Peng et al., 1994). Later it has been repeatedly and successfully applied in such fields as heart rate dynamics, cloud structure, geology, ethnology and others (a review is given in Kantelhardt et al., 2002). This method is also applied in some studies devoted to measuring the efficiency of financial markets (e.g. Grech and Mazur (2004) and Kristoufek (2012)). DFA can be applied to noisy, non-stationary time series: it avoids detecting long-range correlations that are artefacts of non-stationarity in the series. This is the reason why it is often preferred to classical R/S analysis.

1) The cumulative profile of the series is constructed, $y(i) = \sum_{j=1}^{i}\left(X_j - \bar{X}\right)$, and the profile of length N is divided into M periods, each of length n, so that $M \cdot n = N$.

2) The linear approximation of the trend in each period m is found by the least squares fit: $y_{n,m}(i) = a_m + b_m\,i$.

3) The root mean square fluctuation is calculated for each period:

\[
F_m(n) = \sqrt{\frac{1}{n}\sum_{i \in I_m}\left(y(i) - y_{n,m}(i)\right)^2}.
\]

4) The average fluctuation over all M periods:

\[
F(n) = \frac{1}{M}\sum_{m=1}^{M} F_m(n).
\]

5) As in R/S analysis, this procedure is repeated for a range of different period sizes n; power-law behaviour is expected, $F(n) \propto n^{H}$, and H can be found by fitting the regression $\log F(n) = \mathrm{const} + H \log n$.

This exponent is usually also called the Hurst exponent. It can take values larger than 1, but, like the classical Hurst exponent, it is equal to $1/2$ for Brownian motion and larger than $1/2$ if the series exhibits long memory.
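The DFA procedure can be sketched in Python as follows (a simplified monofractal illustration with a linear trend fit; the function name is illustrative):

```python
# A sketch of DFA: profile, local linear detrending, power-law fit.
import numpy as np

def dfa(x, sizes):
    y = np.cumsum(x - x.mean())                  # step 1: cumulative profile
    log_n, log_f = [], []
    for n in sizes:
        M = len(y) // n                          # M periods of length n
        t = np.arange(n)
        f = []
        for m in range(M):
            seg = y[m * n:(m + 1) * n]
            b, a = np.polyfit(t, seg, 1)         # step 2: local linear trend
            f.append(np.sqrt(np.mean((seg - (b * t + a)) ** 2)))  # step 3
        log_n.append(np.log(n))
        log_f.append(np.log(np.mean(f)))         # step 4: average fluctuation
    H, _ = np.polyfit(log_n, log_f, 1)           # step 5: power-law exponent
    return H

rng = np.random.default_rng(2)
h = dfa(rng.standard_normal(8192), [16, 32, 64, 128, 256])
print(h)                                         # expected near 0.5 for white noise
```

Because each period is detrended separately, a slowly drifting mean in the raw series does not masquerade as long memory, which is the main advantage over R/S analysis mentioned above.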

1.5.4 Detrending moving average method

Detrending moving average (DMA) is a more recent method. It was first proposed by Vandewalle and Ausloos (1998). Gu and Zhou (2010) extended DMA to multifractal detrending moving average (MFDMA), which is designed to analyse multifractal time series and multifractal surfaces; they found that MFDMA works better than MFDFA for multifractal analysis. As only the monofractal case is considered here, this advantage of MFDMA over MFDFA is not important for this particular study, but DMA is still used because it can also be applied to non-stationary series without any additional assumptions.

The algorithm of MFDMA is presented in Gu and Zhou (2010); they also provide a Matlab realisation of this algorithm.

1) Consider a time series X(t), t = 1, 2, …, N. Construct the sequence of cumulative sums:

\[
y(t) = \sum_{i=1}^{t} X(i), \quad t = 1, 2, \ldots, N.
\]

2) Calculate the moving average function in a moving window of length n:

\[
\tilde{y}(t) = \frac{1}{n}\sum_{k=-\lceil (n-1)\theta\rceil}^{\lfloor (n-1)(1-\theta)\rfloor} y(t-k),
\]

where $\theta$ is the position parameter varying in the range [0, 1]. The moving average function considers $\lfloor (n-1)(1-\theta)\rfloor$ points in the past and $\lceil (n-1)\theta\rceil$ points in the future. For instance, for $\theta = 0$ the backward moving average is calculated over the past n − 1 data points of the series. The case of $\theta = 1$ is called the forward moving average (n − 1 points in the future), and $\theta = 0.5$ gives the centred moving average (half past and half future points in the window).

3) Detrend the series and obtain the residual sequence:

\[
\epsilon(t) = y(t) - \tilde{y}(t).
\]

4) This residual series is divided into $M_n = \lfloor N/n - 1\rfloor$ disjoint segments of the same size n. Each segment can be denoted as $\epsilon_v$ such that $\epsilon_v(i) = \epsilon(l + i)$ for $1 \le i \le n$, where $l = (v - 1)n$. After that the root-mean-square function is calculated as:

\[
F_v(n) = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\epsilon_v^2(i)}.
\]

5) The qth-order overall fluctuation function is calculated as:

\[
F_q(n) = \left[\frac{1}{M_n}\sum_{v=1}^{M_n} F_v^{q}(n)\right]^{1/q},
\]

where q can take any real value except for 0.

6) As in the previous methods, the power-law relation can be determined by varying the segment size n:

\[
F_q(n) \propto n^{h(q)}.
\]
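A simplified monofractal sketch of this procedure in Python, with the backward moving average ($\theta = 0$) and $q = 2$ (the function name is illustrative):

```python
# A sketch of DMA with backward moving average (theta = 0) and q = 2.
import numpy as np

def dma(x, sizes):
    y = np.cumsum(x)                             # step 1: cumulative sums
    log_n, log_f = [], []
    for n in sizes:
        # step 2: backward moving average over the current and past n-1 points
        ma = np.convolve(y, np.ones(n) / n, mode="valid")
        eps = y[n - 1:] - ma                     # step 3: residual sequence
        M = len(eps) // n                        # step 4: disjoint segments
        f = np.array([np.sqrt(np.mean(eps[m * n:(m + 1) * n] ** 2))
                      for m in range(M)])
        fq = np.sqrt(np.mean(f ** 2))            # step 5 with q = 2
        log_n.append(np.log(n))
        log_f.append(np.log(fq))
    H, _ = np.polyfit(log_n, log_f, 1)           # step 6: power-law exponent
    return H

rng = np.random.default_rng(3)
h = dma(rng.standard_normal(8192), [8, 16, 32, 64, 128])
print(h)                                         # expected near 0.5 for white noise
```

The backward moving average ($\theta = 0$) is the natural choice for financial data, since it uses only past observations and therefore mimics the information actually available to market participants.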

1.6 Estimating significance of Hurst exponent

The major drawback of the methods discussed above is that no asymptotic distributions have been derived for the Hurst exponent. Estimated exponents can give some information about the processes (for instance, we can compare different series in terms of long-memory presence by their estimated Hurst exponents), but on their own they do not allow conclusions about the presence of long memory. It is known that H = 0.5 denotes an independent process; however, a value higher than 0.5 does not necessarily indicate that the process is persistent. First, it has been shown that H of an independent process calculated using R/S analysis deviates significantly from 0.5, especially for small samples (e.g. Weron, 2002). Second, without a statistical test it cannot be decided whether the obtained exponent differs significantly from the one characterising an independent process.
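In practice such a test can be approximated by Monte Carlo simulation: generate many independent series of the same length, estimate H for each, and compare the observed exponent with the quantiles of the resulting null distribution. A minimal Python sketch (the simple variance-scaling estimator here is an illustrative stand-in for any of the methods above; all names are assumptions):

```python
# A Monte Carlo sketch: empirical null distribution of an H estimator
# under the hypothesis of an independent series.
import numpy as np

def hurst_var(x, sizes):
    """Slope of log std of n-step increments of the profile on log n."""
    y = np.cumsum(x - x.mean())
    log_n = np.log(sizes)
    log_s = np.log([np.std(y[n:] - y[:-n]) for n in sizes])
    H, _ = np.polyfit(log_n, log_s, 1)
    return H

rng = np.random.default_rng(4)
sizes = [2, 4, 8, 16, 32]
null = np.array([hurst_var(rng.standard_normal(1024), sizes) for _ in range(500)])
lo_q, hi_q = np.quantile(null, [0.025, 0.975])   # empirical 95% band under the null
print(lo_q, hi_q)   # an estimate outside this band differs significantly from 0.5
```

The band depends on the series length and on the estimation method, so the simulated series must match the real data in length and the same estimator must be applied to both.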

...
