
2006 | Book

Practical Fruits of Econophysics

Proceedings of the Third Nikkei Econophysics Symposium


About this book

Some economic phenomena are predictable and controllable, and some are impossible to foresee. Existing economic theories do not provide satisfactory answers as to what degree economic phenomena can be predicted and controlled, and in what situations. Against this background, people working on the financial front lines in real life have to rely on empirical rules based on experiments that often lack a solid foundation. "Econophysics" is a new science that analyzes economic phenomena empirically from a physical point of view, and it is being studied mainly to offer scientific, objective and significant answers to such problems. This book is the proceedings of the third Nikkei symposium on "Practical Fruits of Econophysics," held in Tokyo, November 9-11, 2004. In the first symposium, held in 2000, empirical rules were established by analyzing high-frequency financial data, and various kinds of theoretical approaches were confirmed. In the second symposium, in 2002, the predictability of imperfections and of economic fluctuations was discussed in detail, and methods for applying such studies were reported. The third symposium gave an overview of practical developments that can immediately be applied to the financial sector, or at least provide hints as to how to use the methodology.

Table of Contents

Frontmatter

Market’s Basic Properties

Frontmatter
Correlated Randomness: Rare and Not-so-Rare Events in Finance

One challenge of economics is that the systems treated by this science have no perfect metronome in time and no perfect spatial architecture—crystalline or otherwise. Nonetheless, as if by magic, out of nothing but randomness one finds remarkably fine-tuned processes in time. To understand this "miracle," one might consider placing aside the human tendency to see the universe as a machine. Instead, one might address the challenge of uncovering how, through randomness (albeit, as we shall see, strongly correlated randomness), one can arrive at many temporal patterns in economics. Inspired by principles developed by statistical physics over the past 50 years—scale invariance and universality—we review some recent applications of correlated randomness to economics.

H. E. Stanley, Xavier Gabaix, Parameswaran Gopikrishnan, Vasiliki Plerou
Non-trivial scaling of fluctuations in the trading activity of NYSE

Complex systems comprise a large number of interacting elements, whose dynamics is not always known a priori. In these cases — in order to uncover their key features — we have to turn to empirical methods, one of which was recently introduced by Menezes and Barabási. It is based on the observation that for the activity f_i(t) of the constituents there is a power law relationship between the standard deviation and the mean value: σ_i ∝ ⟨f_i⟩^α. For stock market trading activity (traded value), good scaling over 5 orders of magnitude with the exponent α = 0.72 was observed. The origin of this non-trivial scaling can be traced back to a proportionality between the rate of trades ⟨N⟩ and their mean sizes ⟨V⟩: one finds ⟨V⟩ ∝ ⟨N⟩^0.69 for the ∼1000 largest companies of the New York Stock Exchange. Model-independent calculations show that these two types of scaling can be mapped onto each other, with agreement within the error bars. Finally, there is a continuous increase in α if we look at fluctuations on an increasing time scale up to 20 days.

János Kertész, Zoltán Eisler
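
The scaling law quoted in this abstract is straightforward to estimate numerically. The sketch below uses synthetic data with assumed rate and size distributions (not the NYSE traded-value records) to illustrate the procedure of fitting σ_i ∝ ⟨f_i⟩^α.

```python
# Minimal sketch (synthetic data, not the authors' NYSE records) of the
# fluctuation-scaling analysis: for each constituent i compute the mean and
# standard deviation of its activity f_i(t) and fit alpha on a log-log scale.
import numpy as np

rng = np.random.default_rng(0)
n_stocks, n_steps = 200, 5000

# Hypothetical activity series: heavy-tailed trade values at stock-dependent rates
rates = 10 ** rng.uniform(0, 4, size=n_stocks)                  # trades per interval
activity = np.array([rng.poisson(lam, n_steps) * rng.pareto(3.0, n_steps)
                     for lam in rates])                         # traded value per interval

means = activity.mean(axis=1)
stds = activity.std(axis=1)

# Least-squares fit of log(sigma) against log(mean) gives the scaling exponent.
# For this toy compound-Poisson data the fit comes out near 1/2, not the
# empirical 0.72, because trade sizes here do not grow with the trading rate.
alpha, _ = np.polyfit(np.log(means), np.log(stds), 1)
print(f"fitted scaling exponent alpha ~ {alpha:.2f}")
```
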
Dynamics and predictability of fluctuations in dollar-yen exchange rates

Analysis of tick data of yen-dollar exchange rates using random walk methods has shown that there exists a characteristic time scale of approximately 10 minutes. Accordingly, for time scales shorter than 10 minutes the market exhibits anti-persistence, meaning that it self-organizes so as to restore a given tendency. For time scales longer than 10 minutes the market approaches behavior appropriate to pure Brownian motion. This property is explored here to elucidate the predictability of this type of data. We find that improvement in predictability is possible provided that the data are not “contaminated” with noise.

A. A. Tsonis, K. Nakada, H. Takayasu
Temporal characteristics of moving average of foreign exchange markets

We first introduce an optimal moving average for yen-dollar tick data that makes the residual term an independent noise. This noise separation is realized for weight functions decaying nearly exponentially with a characteristic time of about 30 seconds. We then introduce another moving average, applied to the optimal moving average, in order to elucidate the underlying force acting on it. It is found that for a certain time scale we can actually estimate a potential force that satisfies a simple scaling relation with respect to the time scale of the moving average.

Misako Takayasu, Takayuki Mizuno, Takaaki Ohnishi, Hideki Takayasu
Characteristic market behaviors caused by intervention in a foreign exchange market

In foreign exchange markets, monotonic rate changes can be observed on a time scale of the order of an hour on days when governmental interventions took place. We estimate the starting time of an intervention using this characteristic behavior of the exchange rates. We find that large interventions can shift the averaged rate by about 1 yen per dollar in an hour, and that the rate-change distribution becomes asymmetric for a few hours.

Takayuki Mizuno, Yukiko Umeno Saito, Tsutomu Watanabe, Hideki Takayasu
Apples and Oranges: the difference between the Reaction of the Emerging and Mature Markets to Crashes

We study the behavior of the eigenvalues of the covariance matrices of returns for emerging and mature markets at times of crises. Our results appear to indicate that mature markets respond to crashes differently from emerging ones and that emerging markets take longer to recover than mature markets. In addition, the results appear to indicate that the second largest eigenvalue gives additional information on market movement and that a study of the behavior of the other eigenvalues may provide insight into crash dynamics.

Adel Sharkasi, Martin Crane, Heather J. Ruskin
Scaling and Memory in Return Loss Intervals: Application to Risk Estimation

We study the statistics of the return intervals τ_q between two consecutive return losses below a threshold −q, in various stocks, currencies and commodities. We find that the probability distribution function (pdf) of τ_q scales with the mean return interval in a quite universal way, which may enable us to extrapolate rare events from the behavior of more frequent events with better statistics. The functional form of the pdf shows deviation from a simple exponential behavior, suggesting memory effects in losses. The memory shows up strongly in the conditional mean loss return intervals, which depend significantly on the previous interval. This dependence can be used to improve the estimate of the risk level.

Kazuko Yamasaki, Lev Muchnik, Shlomo Havlin, Armin Bunde, H. Eugene Stanley
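
As a rough illustration of the quantities involved (not the paper's data or code), the sketch below computes loss return intervals for i.i.d. Gaussian returns and rescales them by their mean; for uncorrelated data the rescaled distribution is close to exponential, and deviations from that form are the memory effects the authors discuss.

```python
# Minimal sketch with purely illustrative i.i.d. Gaussian "returns":
# intervals tau_q between consecutive losses below -q, rescaled by their mean.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_normal(1_000_000)

def loss_return_intervals(r, q):
    """Waiting times (in steps) between successive returns below -q."""
    idx = np.flatnonzero(r < -q)
    return np.diff(idx)

for q in (1.0, 1.5, 2.0):
    tau = loss_return_intervals(returns, q)
    scaled = tau / tau.mean()
    # For uncorrelated returns P(tau > 3<tau>) is close to exp(-3) ~ 0.05;
    # real market data deviate from this exponential form.
    frac_long = (scaled > 3).mean()
    print(f"q={q}: <tau> = {tau.mean():6.1f}, P(tau > 3<tau>) = {frac_long:.4f}")
```
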
Recurrence analysis near the NASDAQ crash of April 2000

Recurrence Plots (RP) and Recurrence Quantification Analysis (RQA) are numerical signal-analysis methodologies able to work with nonlinear dynamical systems and non-stationarity. Moreover, they clearly reveal changes in the states of a dynamical system. It is shown that RP and RQA detect the critical regime in financial indices (in analogy with phase transitions) before a bubble bursts, thereby allowing estimation of the bubble's initial time. The analysis is made on the NASDAQ daily closing price between Jan. 1998 and Nov. 2003. The NASDAQ bubble's initial time has been estimated to be Oct. 19, 1999.

Annalisa Fabretti, Marcel Ausloos
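
For readers unfamiliar with the tool, a minimal recurrence-plot construction looks roughly like the sketch below (the generic textbook definition applied to a toy random walk; this is not the authors' code or their parameter choices).

```python
# Rough sketch of a recurrence plot for a univariate series: threshold the
# distances between delay-embedded states.
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=None):
    """R[i, j] = 1 if the embedded states i and j are closer than eps."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[k * delay: k * delay + n] for k in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dist.max()        # common heuristic: 10% of the maximum distance
    return (dist < eps).astype(int)

prices = np.cumsum(np.random.default_rng(2).standard_normal(500))
R = recurrence_matrix(prices)
# Recurrence rate, the simplest RQA measure; RQA adds determinism, laminarity, etc.
print("recurrence rate:", round(R.mean(), 3))
```
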
Modeling a foreign exchange rate using moving average of Yen-Dollar market data

We introduce an autoregressive-type model with self-modulation effects for a foreign exchange rate by separating the foreign exchange rate into a moving average rate and an uncorrelated noise. From this model we infer that traders mainly use strategies with weighted feedbacks of past rates in the exchange market. These feedbacks are responsible for a power law distribution and characteristic autocorrelations of rate changes.

Takayuki Mizuno, Misako Takayasu, Hideki Takayasu
Systematic tuning of optimal weighted-moving-average of yen-dollar market data

We introduce a weighted-moving-average analysis for tick-by-tick data of the yen-dollar exchange market: price, transaction interval and volatility. The weights are determined automatically for given data by applying the Yule-Walker formula for an autoregressive model. Although the data are non-stationary, the resulting moving average has the nice property that the deviation around it becomes a white noise.

Takaaki Ohnishi, Takayuki Mizuno, Kazuyuki Aihara, Misako Takayasu, Hideki Takayasu
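
A minimal reconstruction of the procedure described, under my own assumptions (not the authors' code): AR(p) coefficients obtained from the Yule-Walker equations serve as the weights of a moving average, and the deviation of the price from that average is checked for whiteness. A synthetic random walk stands in for the tick-by-tick data.

```python
import numpy as np
from scipy.linalg import toeplitz

def yule_walker_weights(x, p):
    """Solve the Yule-Walker equations for AR(p) coefficients."""
    x = x - x.mean()
    acov = np.array([np.dot(x[: len(x) - k], x[k:]) / len(x) for k in range(p + 1)])
    return np.linalg.solve(toeplitz(acov[:p]), acov[1: p + 1])

def ar_moving_average(x, a):
    """One-step prediction x_hat[t] = sum_k a[k] * x[t-1-k], a weighted moving average."""
    p = len(a)
    return np.array([np.dot(a, x[t - p: t][::-1]) for t in range(p, len(x))])

rng = np.random.default_rng(3)
price = np.cumsum(rng.standard_normal(5000))     # stand-in for non-stationary tick data
p = 20
a = yule_walker_weights(price, p)
resid = price[p:] - ar_moving_average(price, a)

lag1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
print(f"lag-1 autocorrelation of the deviation: {lag1:.3f}")   # close to 0 => near-white
```
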
Power law and its transition in the slow convergence to a Gaussian in the S&P500 index
Ken Kiyono, Zbigniew R. Struzik, Yoshiharu Yamamoto
Empirical study of the market impact in the Tokyo Stock Exchange

We analyze the trades-and-quotes database of the TSE (Tokyo Stock Exchange) to derive the average price response to transaction volumes. Through the analysis, we point out that the assumption that the amplitude of returns is independent of transaction size cannot fully explain the profile of the average price response.

Jun-ichi Maskawa
Econophysics to unravel the hidden dynamics of commodity markets

Commodity prices act as leading indicators and have important implications for output and business fluctuations, but their dynamics are not well understood. We used some econophysics tools to evaluate five agricultural commodities traded at the NYBOT (cocoa, coffee, cotton, frozen orange juice and sugar), in both price and volume. Results show important differences between price and volume fluctuations and among the commodities. All commodities show highly volatile but non-random dynamics, less so the larger their market.

Sary Levy-Carciente, Klaus Jaffé, Fabiola Londoño, Tirso Palm, Manuel Pérez, Miguel Piñango, Pedro Reyes
A characteristic time scale of tick quotes on foreign currency markets

This study examines a characteristic time scale of an exchange rate market (USD/JPY) for the period 1998 to 2000. Calculating power spectrum densities for the number of tick quotes per minute and averaging them over the year shows that the mean power spectrum density has a peak at high frequencies. Consequently, there exist characteristic time scales on which dealers act in the market. A simple agent model to explain this phenomenon is proposed. The phenomenon may be a result of stochastic resonance between exogenous periodic information and physiological fluctuations of the agents, and may be attributed to traders' behavior in the market. Potential applications include both quantitative characterization and classification of foreign currency markets.

Aki-Hiro Sato
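
A hedged sketch of the type of analysis described: estimate the power spectral density of the number of tick quotes per minute and look for a high-frequency peak. The quote counts below are synthetic (a Poisson series with an assumed ~2.5-minute modulation), not USD/JPY data.

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(4)
minutes = np.arange(24 * 60)
base = 20 + 5 * np.sin(2 * np.pi * minutes / 2.5)          # assumed periodic component
counts = rng.poisson(np.clip(base, 1, None))               # quotes per minute

freqs, psd = periodogram(counts - counts.mean(), fs=1.0)   # fs = 1 sample per minute
peak = freqs[np.argmax(psd[1:]) + 1]
print(f"spectral peak at ~{peak:.3f} cycles/min (period ~{1 / peak:.1f} min)")
```
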

Predictability of Markets

Frontmatter
Order book dynamics and price impact

The price impact function describes how prices change if stocks are bought or sold. Using order book data, we explain the shape of the average price impact function by a feedback mechanism due to a strong anticorrelation between price changes and limit order flow. We find that the average price impact function has only weak explanatory power for large price changes. Hence, we study the time dependence of liquidity and find it to be a necessary prerequisite for the explanation of extreme price fluctuations.

Philipp Weber, Bernd Rosenow
Prediction oriented variant of financial log-periodicity and speculating about the stock market development until 2010

The phenomenon of financial log-periodicity is discussed, and the characteristics that amplify its predictive potential are elaborated. The principal one is self-similarity holding across all time scales. Furthermore, the same preferred scaling factor appears to provide the most consistent description of the market dynamics on all these scales, both in bull and in bear market phases, and is common to all the major markets. These ingredients set very desirable and useful constraints for understanding past market behavior as well as for designing forecasting scenarios. One novel, speculative example of a more detailed S&P500 development until 2010 is presented.

Stan Drożdż, Frank Grümmer, Franz Ruf, Josef Speth
Quantitative Forecasting and Modeling Stock Price Fluctuations

Considering the effect of economic agents' preferences on their actions, relationships between conventional summary statistics and forecasts' profit are investigated. Analytical examination demonstrates that investors' utility maximization is determined by their risk attitude. The computational experiment rejects the claim that the accuracy of the forecast does not depend upon which error criteria are used. Profitability of networks trained with an L_6 loss function appeared to be statistically significant and stable.

Serge Hayward
Time series of stock price and of two fractal overlap: Anticipating market crashes?

The features of the time series for the overlap of two Cantor sets, when one set moves with uniform relative velocity over the other, look somewhat similar to the time series of stock prices. We analyze both and explore the possibilities of anticipating a large (change in Cantor-set) overlap or a large change in stock price. An anticipation method for some of the crashes is proposed here, based on these observations.

Bikas K. Chakrabarti, Arnab Chatterjee, Pratip Bhattacharyya
Short Time Segment Price Forecasts Using Spline Fit Interactions

Empirically, correlations are seen to exist between market action in specific, short market periods such as the AM, PM and overnight (ON) periods for different days of the week on the one hand and market trends (on various time scales) on the other hand. We use real-time spline fits with tunable smoothness parameters and their signs to obtain signals for these market periods and show that they are stationary (and tradable) for S&P 500 futures.

Ke Xu, Jun Chen, Jian Yao, Zhaoyang Zhao, Tao Yu, Kamran Dadkhah, Bill C. Giessen
Successful Price Cycle Forecasts for S&P Futures Using TF3, a Pattern Recognition Algorithm Based on the KNN Method

Based on the perceived stationary internal structure of market movements on appropriate time scales, a series of interrelated pattern recognition programs was designed to compare specific features of current cycle “legs” with a selected universe of analogous prior market periods, which are then queried to obtain a prediction as to the future of the current cycle leg. Similarities are determined by a K-Nearest-Neighbor (KNN) method. This procedure yields good results in simulated S&P futures trading and demonstrates the hypothesized stationarity of market responses to stimuli.

Bill C. Giessen, Zhaoyang Zhao, Tao Yu, Jun Chen, Jian Yao, Ke Xu
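
A toy illustration of the K-Nearest-Neighbor idea described above (not the TF3 algorithm itself): represent the current cycle "leg" as a fixed-length feature vector, find the K most similar historical legs, and average the move that followed each of them. The feature library here is random and purely hypothetical.

```python
import numpy as np

def knn_forecast(leg_features, leg_outcomes, current_leg, k=5):
    """Average outcome of the k historical legs closest to the current one."""
    dist = np.linalg.norm(leg_features - current_leg, axis=1)
    nearest = np.argsort(dist)[:k]
    return leg_outcomes[nearest].mean()

rng = np.random.default_rng(5)
library_features = rng.standard_normal((500, 8))   # 500 past legs, 8 normalized returns each
library_outcomes = rng.standard_normal(500)        # the move that followed each leg
print("forecast for current leg:",
      round(knn_forecast(library_features, library_outcomes, rng.standard_normal(8)), 3))
```
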
The Hurst’s exponent in technical analysis signals

The fractal nature of financial data has been investigated throughout the literature. The aim of this paper is to use the information given by the detection of the fractal measure of data in order to provide support for trading decisions when dealing with technical analysis signals that can be used to trigger buy/sell orders. Trendlines are considered as a case study.

Giulia Rotundo
Financial Markets Dynamic Distribution Function, Predictability and Investment Decision-Making (FMDDF)

The FMDDF system is based on basic scientific ideas from physics and economics involved in the derivation of the dynamic probabilistic distribution function of the state (PDFS) for any financial market (FM). The price-moving process defines a dynamic FM. Price movement is defined by the volume imbalance V. The necessary condition for a dynamic FM is V ≠ 0 at the same price, while the sufficient condition is defined by nonzero price volatility σ. The total probabilistic distribution function of any FM is the sum of two incompatible terms: the regular probabilistic distribution function (PDF), containing the mean value, and the PDFS. The PDFS is structured on the basis of the adiabatic integrals of FM motion, which include the existing or expected volume imbalance, price volatility and number of shares or contracts. The PDFS is not path dependent (the new trajectories' invariant principle) in the special economic space E{ξ}. This fact is important in financial engineering, risk control, quantitative FM predictability and investment decision-making.

Gregory Chernizer
Market Cycle Turning Point Forecasts by a Two-Parameter Learning Algorithm as a Trading Tool for S&P Futures

Among the long-term stationary (although complex) behavior characteristics of futures markets is a set of identifiable intermediate-length (2–21.5 days) price cycles. Using a two-parameter extrapolation technique, time and price objectives of these cycles are determined. The valley-to-valley time differences (wavelengths) are more regular than the top-to-top ones, with standard deviations of the former about 50% smaller than those of the latter. The substantial profitability of S&P futures trading based on these parameters can be further increased by including additional features.

Jian Yao, Jun Chen, Ke Xu, Zhaoyang Zhao, Tao Yu, Bill C. Giessen

Mathematical Models

Frontmatter
The CTRWs in finance: the mean exit time

The continuous time random walk (CTRW) has become a widely used tool for studying the microstructure of random processes appearing in many physical phenomena. We here report the CTRW analysis applied to market dynamics, which has recently been explored by physicists. We focus on the mean exit time problem.

Jaume Masoliver, Miquel Montero, Josep Perelló
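
A small Monte Carlo sketch of the quantity the paper studies (my own illustration, not the paper's analytical result): the mean exit time of a CTRW from the interval [-L, L], with waiting times and jumps drawn independently and the clock accumulated until the walker leaves.

```python
import numpy as np

def mean_exit_time(L=1.0, jump_scale=0.1, mean_wait=1.0, n_walkers=2000, seed=0):
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_walkers):
        x, t = 0.0, 0.0
        while abs(x) < L:
            t += rng.exponential(mean_wait)       # waiting time before the next jump
            x += rng.normal(0.0, jump_scale)      # jump size
        total += t
    return total / n_walkers

print("estimated mean exit time:", round(mean_exit_time(), 1))
```
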
Discretized Continuous-Time Hierarchical Walks and Flights as possible bases of the non-linear long-term autocorrelations observed in high-frequency financial time-series

By using regular time-steps we define discrete-time random walks and flights on subordinate (directed) Continuous-Time Hierarchical (or Weierstrass) Walks and Flights, respectively. The obtained results can be considered a kind of warning that some persistent non-linear long-term autocorrelations (artifacts) indeed accompany the recording of empirical high-frequency financial time series at regular time-steps.

Marzena Kozłowska, Ryszard Kutner, Filip Świtała
Evidence for Superdiffusion and “Momentum” in Stock Price Changes

It is now well established that the probability distribution of relative price changes of stock market aggregates has two prominent features. First, in its central region, the distribution closely resembles a Lévy stable distribution with exponent α ≅ 1.4. Secondly, it has power-law tails with exponent v ≤ 4. Both these results follow from relatively low-resolution analyses of the data. In this paper we present the results of a high-resolution analysis of a database consisting of 132,000 values of the S&P 500 index taken at 10-minute intervals. We find a third prominent feature: a delta function at the origin, the amplitude of which shows power-law decay over time with an exponent c ≅ 2/3. We show that Continuous-Time Random-Walk (CTRW) theory can account for all three features, but predicts subdiffusion, with the variance of ln(price) growing as the c-th power of time. We find instead superdiffusion, with an exponent c ≅ 9/8 rather than 2/3. We conclude that CTRW theory must be extended to incorporate the effects of “Price Momentum”.

Morrel H. Cohen, Prasana Venkatesh
Beyond the Third Dimension: Searching for the Price Equation

The purpose of this study is to examine the deterministic structure of financial time series of prices in the presence of chaos and a low-dimensional attractor. The methodology used consists of transforming the observed system, typically exhibiting higher-dimensional characteristics, into its corresponding best two-dimensional system via the attractor (phase space) reconstruction method, with subsequent intersection of the reconstructed attractor with the best two-dimensional (2D) hyperplane. The 2D system resulting from this slicing operation can be used for financial market analysis applications, by means of the determination of the price equation.

Antonella Sabatini
An agent-based model of financial returns in a limit order market

A body of finance literature shows that asset return processes are characterized by a GARCH-class conditional volatility and fat-tail distributed disturbances, such as a mixture of normal distributions or a t-distribution (Watanabe 2000; Watanabe and Asai 2004). This paper finds that this type of complicated process arises by aggregating returns of a risky asset traded in a limit order market. The conditional volatility of the generated return series can be modeled as a GARCH class since the volatility gradually diminishes as the price assimilates new information about the future asset return. The reason why the error term of the estimated model is fat-tail distributed is that the return of transaction prices is distributed as a mixture of normals; one of the two distributions represents the drift of the price process, and the other represents the liquidity effect.

Koichi Hamada, Kouji Sasaki, Toshiaki Watanabe
Stock price process and the long-range percolation

Using a Gibbs distribution developed in the theory of statistical physics and a long-range percolation theory, we present a new model of a stock price process for explaining the fat tail in the distribution of stock returns.

We consider two types of traders, Group A and Group B: Group A traders analyze the past data on the stock market to determine their present trading positions. The way to determine their trading positions is not deterministic but obeys a Gibbs distribution with interactions between the past data and the present trading positions. On the other hand, Group B traders follow the advice reached through the long-range percolation system from the investment adviser. As the resulting stock price process, we derive a Lévy process.

Koji Kuroda, Joshin Murai
What information is hidden in chaotic time series?

Foundations of Flicker-Noise Spectroscopy (FNS), a new phenomenological approach to extracting information hidden in chaotic signals, are presented. The information is formed by sequences of distinguished types of signal irregularities — spikes, jumps, and discontinuities of derivatives of different orders — at all space-time hierarchical levels of systems. The ability to distinguish irregularities means that parameters or patterns characterizing the totality of properties of the irregularities can be extracted from the power spectra S(f) (f being the frequency) and the difference moments Φ^(p)(τ) (τ being the temporal delay) of the p-th order. It is shown that the FNS method can be used to solve problems of two types: to extract the parameters characterizing the dynamics and the peculiarities of structural organization of open complex systems, and to reveal precursors of the sharpest changes in the states of open dissipative systems of various nature on the basis of a priori information about their dynamics. Applications of FNS to extracting information hidden in economic data (daily market prices for the Nasdaq and Nikkei index time series) are presented.

Serge F. Timashev, Grigory V. Vstovsky, Anna B. Solovieva
Analysis of Evolution of Stock Prices in Terms of Oscillation Theory

Taking advantage of the oscillatory evolution of stock prices, we analyze stock price evolution in terms of oscillation theory. We apply the formalism to Nikkei 225 data and compare with the predictions of random walk theory.

Satoshi Nozawa, Toshitake Kohmura
Simple stochastic modeling for fat tails in financial markets

We have shown that the return distributions observed in the S&P500 can be obtained for a random walk which reacts to moving averages in the technical-analysis sense. Characteristic ingredients are mini-trends in accordance with moving averages, which lead to fat tails; delay in trading, which shifts the tails lower in the distributions; and a reaction to break-outs of the market (in our case, Bollinger bands), which straightens out the curvature of the tails. Though the chart values of the S&P500 are not Gaussian distributed, it is the mini-trends which follow a random walk/Gaussian distribution with unit variance. This leaves considerable doubts about the actual “efficiency” of the market. It will be interesting to analyze other market data to see whether the local correlation a is universal, whether the mini-trends η_i are always standard-normal-distributed, and whether the delay D is shorter in markets with electronic trading.

Hans-Georg Matuttis
Agent Based Simulation Design Principles — Applications to Stock Market

We present a novel agent-based simulation platform designed for general-purpose modeling in social sciences. Beyond providing a convenient environment for modeling, debugging, simulation and analysis, the platform automatically enforces many of the properties inherent to reality (such as causality and precise timing of events). A unique formalism grants agents an unprecedented flexibility of actions while simultaneously isolating researchers from most of the overhead of maintaining the virtual environment.

Lev Muchnik, Yoram Louzoun, Sorin Solomon
Heterogeneous agents model for stock market dynamics: role of market leaders and fundamental prices

We have developed a microscopic model of interacting agents in which agents buy or sell shares depending on the information they get from neighbours and on the relation of a temporary price to a fundamental price. Depending on the magnitude of the noise present in the system (the market temperature), prices oscillate between bull and bear phases or around a mean fundamental value. The oscillation period can be calculated from a mean field theory. A very influential investor (market leader) does not get larger profits than a typical one. A crucial role for profits is played by the coupling constant to the fundamental price.

Janusz A. Hołyst, Arkadiusz Potrzebowski
Dynamics of Interacting Strategies

This paper presents a model of the dynamic process by which a large number of agents switch strategies according to their views of what they deem the most advantageous strategy in relation to the behavior of other agents and/or exogenous environments. The switching process is modeled by the master equation, by suitably specifying the transition rates of continuous-time Markov chains. Computer simulation explains the effects of demand-supply imbalance created by short- to medium-term traders in the dollar-yen foreign exchange market.

Masanao Aoki, Hiroyuki Moriya
Emergence of two-phase behavior in markets through interaction and learning in agents with bounded rationality
Sitabhra Sinha, S. Raghavendra
Explanation of binarized tick data using investor sentiment and genetic learning

This paper attempts to clarify some time-series properties of binarized tick data using investor sentiment and a genetic algorithm. For this purpose, we first explore the conditions for a genetic algorithm to describe investor sentiment. Then we calculate autocorrelations and conditional probabilities using binarized sample paths generated by estimated models of investor sentiment. The best-fitted parameter set of the genetic algorithm has the following implications: first, herd behavior is likely to emerge; second, traders try to perceive brand-new information even if it is not completely correct.

Takashi Yamada, Kazuhiro Ueda
A Game-theoretic Stochastic Agents Model for Enterprise Risk Management

A model of business scenario simulation is developed by applying game theory to stochastic agents described by Langevin equations, for enterprise risk management (ERM). Business scenarios of computer-related industries are simulated using the developed model and compared with real market data. Economic capital was calculated based on the business scenario, as the most basic requisite of ERM.

Yuichi Ikeda, Shigeru Kawamoto, Osamu Kubo, Yasuhiro Kobayashi, Chihiro Fukui

Correlation and Risk Management

Frontmatter
Blackouts, risk, and fat-tailed distributions

We analyze a 19-year time series of North American electric power transmission system blackouts. Contrary to previously reported results we find a fatter than exponential decay in the distribution of inter-occurrence times and evidence of seasonal dependence in the number of events. Our findings question the use of self-organized criticality, and in particular the sandpile model, as a paradigm of blackout dynamics in power transmission systems. Hopefully, though, they will provide guidelines to more accurate models for evaluation of blackout risk.

Rafał Weron, Ingve Simonsen
Portfolio Selection in a Noisy Environment Using Absolute Deviation as a Risk Measure

Portfolio selection has a central role in finance theory and practical applications. The classical approach uses the standard deviation as risk measure, but a couple of alternatives also exist in the literature. Due to its computational advantages, portfolio optimization based on absolute deviation looks particularly interesting and it is widely used in practice. For the practical implementation of any variant, however, one needs to estimate the parameters from finite return series, which inevitably introduces measurement noise that, in turn, affects portfolio selection. Although much research has been devoted to investigating the noise in the classical model, hardly any attention has been paid to the problem in the case of absolute deviation. In this paper, we study the effect of estimation noise in the case of absolute-deviation-based portfolio optimization. We show that the key parameter determining the effect of noise is the ratio of the length of time series to portfolio size and that, other things being equal, the effect of noise is higher than in the classical, variance-based model. This finding points to the importance of checking whether theoretically “better” portfolio selection models can indeed outperform the classical one in practice.

Imre Kondor, Szilárd Pafka, Richárd Karádi, Gábor Nagy
Application of PCA and Random Matrix Theory to Passive Fund Management

We use principal component analysis (PCA) for extracting principal components having larger power in the cross-correlation of risky assets (Elton and Gruber 1973), and random matrix theory (RMT) for removing noise in the correlation matrix and for choosing statistically significant components (Laloux et al. 1999, Plerou et al. 1999), in order to estimate the expected correlation in the portfolio optimization problem. In addition to the correlation between every pair of asset returns, the standard mean-variance model of optimal asset allocation requires estimation of the expected return and risk for each asset. Asset allocation is, in practice, quite sensitive to how the expected return is estimated. We applied an estimation based on “beta” (following the idea of Black and Litterman 1992) to portfolio optimization for 658 stocks on the Tokyo Stock Exchange (TSE). Using daily returns on the TSE and verifying that the TSE has qualitatively similar principal components to the NYSE (Plerou et al. 1999), we show (i) that the error in estimation of the correlation matrix via RMT is more stable and smaller than for the historical, single-index or constant-correlation models, (ii) that the realized risk-return on the TSE based on our method outperforms that of an index fund with respect to the Sharpe ratio, and (iii) that the optimization gives a practically reasonable asset allocation.

Yoshi Fujiwara, Wataru Souma, Hideki Murasato, Hiwon Yoon
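
A minimal sketch of RMT-based noise filtering of a correlation matrix, in the spirit of the Laloux et al. (1999) reference cited above (not the authors' full estimation pipeline): eigenvalues below the Marchenko-Pastur upper edge are treated as noise and replaced by their average. Gaussian fake returns stand in for TSE data.

```python
import numpy as np

def rmt_filter(returns):
    T, N = returns.shape
    C = np.corrcoef(returns, rowvar=False)
    lam, V = np.linalg.eigh(C)
    lam_max = (1 + np.sqrt(N / T)) ** 2                  # Marchenko-Pastur upper edge
    cleaned = lam.copy()
    cleaned[lam < lam_max] = lam[lam < lam_max].mean()   # flatten the noise band
    C_clean = V @ np.diag(cleaned) @ V.T
    np.fill_diagonal(C_clean, 1.0)
    return C_clean

rng = np.random.default_rng(6)
fake_returns = rng.standard_normal((500, 100))           # T=500 days, N=100 assets
C_clean = rmt_filter(fake_returns)
print("largest eigenvalue after filtering:", round(np.linalg.eigvalsh(C_clean).max(), 3))
```
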
Testing Methods to Reduce Noise in Financial Correlation Matrices

As the two noise reduction methods are conceptually different, they also produce different results. Our preliminary studies cannot serve as a basis to make schematic suggestions as to which method ought to be preferred in which situation. This will always be difficult. But further and systematic studies extending the ones presented here might yield some guidelines.

Per-Johan Andersson, Andreas Öberg, Thomas Guhr
Application of noise level estimation for portfolio optimization

Time changes of the noise level at the Warsaw Stock Market are analyzed using a recently developed method based on properties of the coarse-grained entropy. The condition of minimal noise level is used to build an efficient portfolio. Our noise-level approach seems to be a much better tool for risk estimation than standard volatility parameters. Implementation of a corresponding threshold investment strategy gives positive returns for historical data.

Krzysztof Urbanowicz, Janusz A. Hołyst
Method of Analyzing Weather Derivatives Based on Long-range Weather Forecasts

We examined the effectiveness of long-range weather forecasts applied to analyze weather derivatives. We carried out 651 back tests for different historical periods and confirmed that the accuracy of evaluating the risk of weather derivatives could be drastically improved by using long-range weather forecasts.

Masashi Egi, Shun Takahashi, Takeshi Ieshima, Kaoru Hijikata
Investment horizons: A time-dependent measure of asset performance

We review a recent time-dependent performance measure for economic time series — the (optimal) investment horizon approach. For stock indices, the approach shows a pronounced gain-loss asymmetry that is not observed for the individual stocks that comprise the index. This difference may hint at a synchronization of the drawdowns of the stocks.

Ingve Simonsen, Anders Johansen, Mogens H. Jensen
Clustering financial time series

We analyze the shares aggregated into the Dow Jones Industrial Average (DJIA) index in order to recognize groups of stocks sharing synchronous time evolutions. To this purpose, a pairwise version of the Chaotic Map Clustering algorithm is applied: a map is associated to each share and the correlation coefficients of the daily price series provide the coupling strengths among maps. A natural partition of the data arises by simulating a chaotic map dynamics. The detection of clusters of similar stocks can be exploited in portfolio optimization.

Nicolas Basalto, Francesco De Carlo
Risk portfolio management under Zipf analysis based strategies

A so-called Zipf-analysis portfolio management technique is introduced in order to understand risk and returns. Two portfolios are built, each from a well-known financial index. The portfolio management is based on two approaches: one called the “equally weighted portfolio”, the other the “confidence parametrized portfolio”. A discussion of the (yearly) expected return, variance, Sharpe ratio and β follows. Optimization levels of high returns or low risks are found.

M. Ausloos, Ph. Bronlet
Macro-players in stock markets

It is usually assumed that stock prices reflect a balance between large numbers of small individual sellers and buyers. However, over the past fifty years mutual funds and other institutional shareholders have assumed an ever increasing part of stock transactions: their assets, as a percentage of GDP, have been multiplied by more than one hundred. The paper presents evidence which shows that reactions to major shocks are often dominated by a small number of institutional players. Most often the market gets a wrong perception and inadequate understanding of such events because the relevant information (e.g. the fact that one mutual fund has sold several million shares) only becomes available weeks or months after the event, through reports to the Securities and Exchange Commission (SEC). Our observations suggest that there is a radical difference between small (< 0.5%) day-to-day price variations which may be due to the interplay of many agents and large (> 5%) price changes which, on the contrary, may be caused by massive sales (or purchases) by a few players. This suggests that the mechanisms which account for large returns are markedly different from those ruling small returns.

Bertrand M. Roehner
Conservative Estimation of Default Rate Correlations

The risk of a credit portfolio depends crucially on correlations between the probability of default (PD) in different economic sectors. We present statistical evidence that a (one-) factorial model is sufficient to describe PD correlations, and suggest a method of parameter estimation which avoids in a controlled way the underestimation of correlation risk.

Bernd Rosenow, Rafael Weißbach
Are Firm Growth Rates Random? Evidence from Japanese Small Firms

Anecdotal evidence suggests that a small number of firms continue to win until they finally acquire a big presence and monopolistic power in a market. To see whether such a “winner-take-all” story is true or not, we look at the persistence of growth rates for Japanese small firms. Using a unique dataset covering half a million firms in each year of 1995–2003, we find the following. First, scale variables, such as total assets and sales, exhibit a divergence property: firms that have experienced positive growth in the preceding years are more likely to achieve positive growth again. Second, other variables that are more or less related to firm profitability exhibit a convergence property: firms with positive growth in the past are less likely to achieve positive growth again. These two findings indicate that firm growth rates are not random but history dependent.

Yukiko Saito, Tsutomu Watanabe
Trading Volume and Information Dynamics of Financial Markets

We describe a new financial diagnostic method, related to the entropy generated when a limit trader satisfies market demand by filling orders, thereby playing the role of a Maxwell Demon. By comparing the real cumulative trading volume to some measure of historically “normal” demand, one may determine whether the market shows excess order or disorder, and accordingly adjust one’s trading strategy.

S.G. Redsun, R.D. Jones, R.E. Frye, K.D. Myers
Random Matrix Theory Applied to Portfolio Optimization in Japanese Stock Market

We examined the effectiveness of random matrix theory applied to portfolio optimization using Japanese stock market data. We carried out 48 back tests for different historical periods and confirmed that it was possible to drastically improve the accuracy of portfolio risk evaluation using random matrix theory.

Masashi Egi, Takashi Matsushita, Seiji Futatsugi, Keizaburo Murakami
Growth and Fluctuations for Small-Business Firms

Small-business firms have qualitatively different characteristics of firm-size growth from those of large firms. The Credit Risk Database (CRD) is the largest database of Japanese small and midsize companies, covering nearly 1 million small-business firms, more than 60% of all companies in Japan. By employing stock (total assets and debts) and flow (sales) quantities in the CRD, we show that Gibrat's law breaks down for the small and midsize companies corresponding to the non-power-law region, while the law asymptotically holds in the larger-size region, for all the variables examined. In fact, the standard deviation σ of the logarithmic growth rate r = log R = log(x_2/x_1) (where x_1 and x_2 are the values of the variable for two successive years) scales as firm size becomes larger (σ ∝ x_1^β), but asymptotically approaches a non-scaling regime (σ ∼ const). We also show that there is a scaling relation of growth rates for the different time-scales on which one observes firm size: the standard deviation σ of the growth rate r = log R = log(x_{t+Δt}/x_t) from time t to t + Δt scales as σ ∝ (Δt)^γ.

Yoshi Fujiwara, Hideaki Aoyama, Wataru Souma
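
A hedged sketch of the size-dependence analysis described above: bin firms by initial size x_1, compute the standard deviation of r = log(x_2/x_1) in each bin, and fit the exponent β in σ ∝ x_1^β. The "firms" below are synthetic, with an injected exponent of -0.2 (growth-rate volatility decreasing with size), not CRD data.

```python
import numpy as np

rng = np.random.default_rng(8)
x1 = 10 ** rng.uniform(4, 9, size=100_000)            # firm sizes spanning five decades
r = rng.standard_normal(100_000) * x1 ** -0.2         # assumed size-dependent volatility
x2 = x1 * np.exp(r)

bins = np.logspace(4, 9, 11)
centers, sigmas = [], []
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (x1 >= lo) & (x1 < hi)
    if mask.sum() > 10:
        centers.append(np.sqrt(lo * hi))
        sigmas.append(np.log(x2[mask] / x1[mask]).std())

beta = np.polyfit(np.log(centers), np.log(sigmas), 1)[0]
print(f"fitted exponent beta ~ {beta:.2f}")            # recovers the injected -0.2
```
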

Networks and Wealth Distributions

Frontmatter
The skeleton of the Shareholders Networks
Guido Caldarelli, Stefano Battiston, Diego Garlaschelli
Financial Market - A Network Perspective
Jukka-Pekka Onnela, Jari Saramäki, Kimmo Kaski, János Kertész
Change of ownership networks in Japan

As complex networks in economics, we consider Japanese shareholding networks as they existed in 1985, 1990, 1995, 2000, 2002, and 2003. In this study, we use as data lists of shareholders for companies listed on the stock market or on the over-the-counter market. The lengths of the shareholder lists vary with the companies, and we use lists for the top 20 shareholders. We represent these shareholding networks as a directed graph by drawing arrows from shareholders to stock corporations. Consequently, the distribution of incoming edges has an upper bound, while that of outgoing edges has no bound. This representation shows that for all years the distributions of outgoing degrees can be well explained by the power law function with an exponential tail. The exponent depends on the year and the country, while the power law shape is maintained universally. We show that the exponent strongly correlates with the long-term shareholding rate and the cross-shareholding rate.

Wataru Souma, Yoshi Fujiwara, Hideaki Aoyama
G7 country Gross Domestic Product (GDP) time correlations. A graph network analysis
J. Miśkiewicz, M. Ausloos
Dependence of Distribution and Velocity of Money on Required Reserve Ratio

The impact of money creation on the statistical mechanics of money circulation is investigated in this paper by focusing on the dependence of the monetary wealth distribution and the velocity of money on the required reserve ratio. In reality, money creation is important to the economic system. The process of money creation can be represented by the money multiplier model of traditional economics. From this model it follows that the required reserve ratio set by the central bank is one of the main determinants of the monetary aggregate, and under some assumptions the monetary aggregate can be expressed in steady state as the product of the monetary base and the required reserve ratio. Taking into account the role that the required reserve ratio plays in the monetary system, we developed a random transfer model by introducing a fractional reserve banking system and carried out simulations to observe how the monetary aggregate evolves over time, how monetary wealth is distributed among agents, and how fast money is transferred in the transfer process. Monetary wealth is found to follow an asymmetric Laplace distribution, and the fact that the latency time of money follows an exponential distribution indicates that the transfer process is of Poisson type. Theoretical formulas for the monetary wealth distribution and the velocity of money in terms of the required reserve ratio are given, and they are in good agreement with the simulation results.

Ning Xi, Ning Ding, Yougui Wang
Prospects for Money Transfer Models

Recently, in order to explore the mechanism behind wealth or income distribution, several models have been proposed by applying principles of statistical mechanics. These models share some characteristics, such as consisting of a group of individual agents, a pile of money and a specific trading rule. Whatever the trading rule is, the most noteworthy fact is that money is always transferred from one agent to another in the transfer process, so we call them money transfer models. Besides explaining income and wealth distributions, money transfer models can also be applied to other areas. In this paper we summarize these areas as statistical distribution, economic mobility, transfer rate and money creation. First, the money distribution (or income distribution) can be exhibited by recording the money stock (flow). Second, economic mobility can be shown by tracing the change in wealth or income over time for each agent. Third, the transfer rate of money and its determinants can be analyzed by tracing the transfer process of each unit of money. Finally, the money creation process can also be investigated by permitting agents to go into debt. Anticipated future extensions to these models include structural improvements and generalized mathematical analysis.

Yougui Wang, Ning Ding, Ning Xi
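
A toy version of the basic random transfer rule underlying this family of models (my own minimal variant, not any specific model from the papers): at each step one unit of money moves from a randomly chosen payer to a randomly chosen receiver, unless the payer has no money left.

```python
import numpy as np

rng = np.random.default_rng(7)
n_agents, n_steps = 1000, 1_000_000
money = np.full(n_agents, 10.0)                    # every agent starts with 10 units

pairs = rng.integers(n_agents, size=(n_steps, 2))  # (payer, receiver) for every step
for i, j in pairs:
    if i != j and money[i] >= 1.0:
        money[i] -= 1.0
        money[j] += 1.0

# For this class of models the stationary distribution is approximately
# exponential (Boltzmann-Gibbs), so most agents end up below the mean.
print("mean money:", money.mean(),
      " fraction below the mean:", (money < money.mean()).mean())
```
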
Inequalities of Wealth Distribution in a Society with Social Classes

We study a simple model of capital exchange among economic agents in which the effect of a correlation between wealth and connectivity is considered within two different hypotheses: a) agents interact within their own social or economic class and b) agent’s connectivity is related to its success in exchange transactions. The wealth distribution in the first case may generate a two-class society with a clear gap in the middle and highly unequal power law distributions with a great number of strongly impoverished agents and a few very rich ones. In the second case the wealth distribution is modified by the dynamics of the lattice, getting closer to a power law for some values of the parameters of the model. As expected, the lattice itself is different from the random initial one.

J. R. Iglesias, S. Risau-Gusman, M. F. Laguna
Analyzing money distributions in ‘ideal gas’ models of markets
Arnab Chatterjee, Bikas K. Chakrabarti, Robin B. Stinchcombe
Unstable periodic orbits and chaotic transitions among growth patterns of an economy

We analyze a chaotic growth cycle model which represents essential aspects of macroeconomic phenomena. Unstable periodic solutions detected from a chaotic attractor of the model are categorized into some hierarchical classes, and relationships between each class of them and characteristics of the attractor are discussed. This approach may be useful to clarify economic laws hidden behind complicated phenomena.

Ken-ichi Ishiyama, Yoshitaka Saiki
Power-law behaviors in high income distribution

In this paper we discuss a model which shows power-law behavior in a ranking problem of income. Monte Carlo simulation of our model shows a satisfactory fit to the data for Japanese and US CEOs. We investigate the origin of the power-law behavior, which is shown to arise from a fractal structure formed in our model system.

Sasuke Miyazima, Keizo Yamamoto
The power-law exponent and the competition rule of the high income model

We propose a model for the power-law problem of high income versus rank. The model is simple enough that we can examine the effect of differences in competition rules, for which we obtain different power-law exponents, as well as carry out an exact analysis via the master equation.

Keizo Yamamoto, Sasuke Miyazima, Hiroshi Yamamoto, Toshiya Ohtsuki, Akihiro Fujihara

New Ideas

Frontmatter
Personal versus economic freedom

In this paper we present a model which allows us to discriminate between two kinds of behavior connected with areas which we call personal and economic. It seems that an attitude in the personal area spreads in a different way than one in the economic area. We assume that each agent tries to influence its neighbors, but in the personal area the information flows inward from the neighborhood to the agent (as in most opinion dynamics models), whereas in the economic area the information flows outward from the agent or group of agents to the neighborhood (as in the Sznajd model).

Katarzyna Sznajd-Weron, Józef Sznajd
Complexity in an Interacting System of Production
Yuji Aruka, Jürgen Mimkes
Four Ingredients for New Approaches to Macroeconomic Modeling

This paper outlines some of the concepts and tools which, although not in the mainstream macroeconomics literature, have been effective either in providing new results, or insights to known results.

Briefly put, the new approaches borrow concepts and tools from population genetics, condensed matter physics, and recently developed stochastic combinatorial analysis in statistics. Continuous-time Markov chains are constructed for clusters of heterogeneous types of interacting economic agents. We can then draw macroeconomic policy implications by examining solutions of master (Chapman-Kolmogorov) equations, Fokker-Planck equations or Langevin equations, as the need arises.

This paper attempts to introduce the reader to some of these new notions, and procedures to gain new insights and results.

This paper reports on some of these by loosely organizing them into four sections.

Masanao Aoki
Competition phase space: theory and practice

A competition phase-space approach is proposed and its theoretical background is presented. Practical applications of the proposed approach to competition media monitoring are discussed.

Dmitri B. Berg, Valerian V. Popkov
Analysis of Retail Spatial Market System by the Constructive Simulation Method
Hirorhide Nagatsuka, Katsuya Nakagawa
Quantum-Monadology Approach to Economic Systems

In order to describe the non-classical nature of economic phenomena, the use of quantum monadology, a quantum world model reinforced with Leibnizian monadology, is proposed. It provides not only a way of thinking but also a mathematical system that enables us to simulate personal and social human behaviors and to describe integrative or anomalous processes of economic systems.

Teruaki Nakagomi
Visualization of microstructures of economic flows and adaptive control

Economic flows are modeled on the basis of microscopic particle pictures and studied by renormalized stochastic dynamics and simulations. From the universal behavior obtained, catastrophic behavior of the flows can be understood, as well as which sub-system is involved and how to avoid it by suitable control measures.

Yoshitake Yamazaki, Zhong-can Ou-Yang, Herbert Gleiter, Kunquan Lu, Dianhong Shen, Xing Zhu
Metadata
Title
Practical Fruits of Econophysics
edited by
Hideki Takayasu
Copyright Year
2006
Publisher
Springer Tokyo
Electronic ISBN
978-4-431-28915-9
Print ISBN
978-4-431-28914-2
DOI
https://doi.org/10.1007/4-431-28915-1
