
2005 | Book

New Tools of Economic Dynamics

Editors: Prof. Dr. Jacek Leskow, Prof. Dr. Lionello F. Punzo, Prof. Martín Puchet Anyul

Publisher: Springer Berlin Heidelberg

Book Series: Lecture Notes in Economics and Mathematical Systems


About this book

New Tools of Economic Dynamics gives an introduction and overview of recently developed methods and tools, most of them developed outside economics, to deal with the qualitative analysis of economic dynamics. It reports the results of a three-year research project by a European and Latin American network on the intersection of economics with mathematical, statistical, and computational methods and techniques. Focusing upon the evolution and manifold structure of complex dynamic phenomena, the book reviews and shows applications of a variety of tools, such as symbolic and coded dynamics, interacting agents models, microsimulation in econometrics, large-scale system analysis, and dynamical systems theory. It shows the potential of a comprehensive analysis of growth, fluctuations, and structural change along the lines indicated by pioneers like Harrod, Haavelmo, Hicks, Goodwin, and Morishima, and it highlights the explanatory power of the qualitative approach they initiated.

Table of Contents

Frontmatter

Large Interactive Economies

1. Modeling a Large Number of Agents by Types: Models as Large Random Decomposable Structures
Summary
This paper introduces methods, based on decomposable random combinatorial analysis, to model a large number of interacting agents. It also discusses a possibility largely ignored in the mainstream economic literature: that hitherto unknown types of agents may enter the models at some future time. We apply the notion of holding times, and introduce to the economic literature the results of the one- and two-parameter inductive methods of Ewens, Pitman and Zabell. More specifically, we use the notion of exchangeable random partitions of a finite set to produce a simple rule of succession, that is, the expressions for the probabilities of entries by new or known types, conditional on the observed data. Then the Ewens equilibrium distribution for the sizes of clusters is introduced, and its use in examining market behavior is sketched, especially when a few types of agents are dominant. We suggest that the approaches of this paper and the notion of holding times are relevant to agent-based simulations because holding times can be used to randomly select agents that “act” first.
Masanao Aoki
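The rule of succession sketched in this abstract has a well-known simulable form. The following is a minimal illustration (not taken from the chapter) of the one-parameter Chinese restaurant process that underlies the Ewens distribution; the parameter name `theta` and the helper `crp_sizes` are this sketch's own choices.

```python
import random

def crp_sizes(n, theta, seed=0):
    """Simulate the one-parameter Chinese restaurant process underlying
    the Ewens distribution: the (k+1)-th agent is of a brand-new type
    with probability theta/(k+theta) (the rule of succession for entry
    by an unknown type) and of an existing type i with probability
    n_i/(k+theta), where n_i agents are already of type i."""
    rng = random.Random(seed)
    sizes = []                       # current cluster (type) sizes
    for k in range(n):
        if rng.random() * (k + theta) < theta:
            sizes.append(1)          # entry by a hitherto unknown type
        else:
            r = rng.uniform(0, k)    # pick an existing agent uniformly,
            acc = 0.0                # i.e. choose a type size-biasedly
            for i, c in enumerate(sizes):
                acc += c
                if r < acc:
                    sizes[i] += 1
                    break
    return sizes

sizes = crp_sizes(1000, theta=1.0)   # cluster sizes after 1000 entries
```

With a small `theta`, a few types come to dominate the population, which is exactly the regime the abstract highlights for market behavior.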
2. An ABM-Evolutionary Approach: Bilateral Exchanges, Bargaining and Walrasian Equilibria
Summary
This paper analyzes, via intensive use of simulation techniques, the effects of the introduction of direct exchange relationships through bilateral trades in a simple general equilibrium pure exchange economy. Agents are heterogeneous in their endowments and repeatedly match in random pairs, bargaining on how to split the advantages of a trade; possibly they can agree to exchange at the known market clearing prices. Simulations of this evolutionary process show that while Walrasian outcomes emerge in the interaction among people with similar outside opportunities, people of different groups converge to accept an equilibrium in which agents with the best outside opportunity extract the greater part of the surplus from an exchange. On the other hand, the acceptance of market mediation (i.e. Walrasian outcomes) is more probable either when the parties try to extract too much from the opponent or when there is anonymity in the trading process. The results show evidence that the acceptance of decentralized, personalized contracting (apart from efficiency considerations) increases the probability of amplifying the asymmetries in the initial distribution beyond what is produced by the pure market mechanism.
Nicolás Garrido, Pier Mario Pacini
3. A Genetic Algorithms Approach: Social Aggregation and Learning with Heterogeneous Agents
Summary
We analyze an economy in which increasing returns to scale create incentives for social aggregation in a population of heterogeneous, boundedly rational agents; these incentives, however, are limited by the presence of imperfect information on others’ actions. We show by simulations that the equilibrium coalitional structure strongly depends on agents’ initial beliefs and on the characteristics of the individual learning process, which is modeled by means of genetic algorithms. The most efficient coalition structure is reached starting from a very limited set of initial beliefs. Furthermore we find that (a) the overall efficiency is an increasing function of agents’ computational abilities; (b) an increase in the speed of the learning process can have ambiguous effects; (c) imitation can play a role only when computational abilities are limited.
Davide Fiaschi, Pier Mario Pacini
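The genetic-algorithm machinery mentioned in the abstract can be sketched in a few lines. This toy version (bit-string individuals, tournament selection, one-point crossover, bit-flip mutation) stands in for the learning dynamics the chapter models; it is not the authors' implementation, and the one-max fitness function used below is purely illustrative.

```python
import random

def evolve(fitness, n_bits=10, pop_size=30, generations=60, p_mut=0.02, seed=0):
    """Minimal genetic algorithm: maintain a population of bit strings,
    select parents by 3-way tournament, recombine by one-point
    crossover, and mutate each bit with probability p_mut."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            p1 = max(rng.sample(pop, 3), key=fitness)   # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = evolve(sum)   # one-max problem: maximize the number of 1-bits
```

In the chapter's setting, an individual would encode an agent's beliefs or strategy rather than a one-max string, and fitness would be the payoff obtained in the coalition-formation game.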
4. Structure and Macroeconomic Performance: Heterogeneous Firms and Financial Fragility
Summary
In this paper we adopt a new macrodynamic tool: a system of non-linear difference equations describing the evolution over time of the first and second moments of the distribution of firms’ degrees of financial robustness, captured by the ratio of the equity base to the capital stock (the equity ratio, for short), which affects supply and capital accumulation decisions. For particular configurations of parameters, the dynamic patterns of the average equity ratio and the variance generate irregular and asymmetric time series in which growth and fluctuations are jointly determined (fluctuating growth).
Domenico Delli Gatti, Mauro Gallegati
5. Firms Interaction and Technological Paradigms
Summary
This paper deals with the aggregate effects of small, exogenous but idiosyncratic technological shocks on locally interacting firms. Its main purpose is to model a situation in which technological paradigms emerge through endogenous propagation and diffusion of information leading to an aggregate pattern. We develop a theoretical framework in which large technological correlations emerge due to localised interaction of single firms. The paper states some simple results on spill-over dynamics determined by firms trying to improve their current technology and thus generating new information through investment in R&D and through localised technological search. The first part shows that different growth regimes can arise from the general framework of interaction that we propose. The second part shows that an interesting regime characterised both by long run innovation growth and endogenous short run fluctuations emerges spontaneously.
Rainer Andergassen, Franco Nardini, Massimo Ricottilli
6. Can Catastrophe Theory Become a New Tool in Understanding Singular Economies?
Summary
The aim of this paper is to show that economic systems must be characterized by their possible singularities rather than by their regularities. Changes in the parameters of a regular economy imply only small changes in the optimal choice of the agents, i.e. the economic system is structurally stable, and the equilibria before and after such minor changes are consistent with one another. But in a singular economy, small changes in parameters affect the choice of the agents in a relevant way: the equilibria after and before the changes are radically different states, i.e. the economic system is structurally unstable. Catastrophe theory and Morse theory are used here to characterize singular economies. These are classical theories in mathematics, but nevertheless they are new tools to help understand the behavior of an economic system. The approach of Negishi is also followed, which allows us to consider in a unified way economies with finitely or infinitely many goods.
Elvio Accinelli, Martín Puchet Anyul
7. Pretopological Analysis on the Social Accounting Matrix for an Eighteen-Sector Economy: The Mexican Financial System
Summary
This paper analyzes the structural relationships of the financial transactions represented in a Social Accounting Matrix (SAM) for the Mexican economy through a pretopological approach. Based on a simple binary relationship between incomes and expenditures of institutional accounts, the pretopology is used as a mathematical tool to get an insight into the economic structure represented by the SAM. Such an analysis can be useful to identify the set of relationships between several institutional accounts, ordered according to their influence or domination.
Andrés Blancas, Valentín Solís
8. Firm Creation as an Inductive Learning Process: A Neural Network Approach
Summary
I present a neural network model for the spontaneous emergence of enterprises, following a dynamic approach to firm formation in the tradition started by Adam Smith and further pursued by Joseph Schumpeter. I suggest that the “natural propensity to truck and barter” is the observable behaviour of self-interested economic actors who are continually exposed to and confronted with an environmental complexity that transcends their limited cognitive and computational capabilities. In their learning process they build networks of relations with other agents. It is this interaction among heterogeneous agents that often leads to the formation of successful organizations that completely solve the original problem. Firms can be seen simultaneously as the result of the entrepreneur’s induction process and as the essential instrument for the elaboration of a solution to the problem. Increasing returns to scale and market size find a very natural representation in this framework. The role of competition in the nurturing of efficiency, and the issue of protection of infant industries, can be tackled by this model.
Francesco Luna
9. Agent-Based Environments: A Review
Summary
Recent years have seen a proliferation of multi-agent-based simulation (MABS) models, in a growing range of domains and using an increasing variety of software. In this article we compare several agent-based environments that anyone can download from the Internet. The aim of the article is to discuss the general principles of each environment, not to declare which is the best or the worst. The comparison is performed along several dimensions, such as ease of learning, flexibility, and available support. It should also help potential practitioners of agent-based economics choose a language. The article closes with the author’s personal assessment of these environments.
Alessandro Perrone

Econometrics and Time Series

10. Smooth Transition Models of Structural Change
Bernhard Böhm
11. Fraction-of-Time Approach in Predicting Value-at-Risk
Summary
The aim of this work is to present a new method of Value-at-Risk calculation using the fraction-of-time probability approach used in signal processing (see e.g. Leśkow and Napolitano (2001)). This method allows making statistical inferences based only on a single observation of the phenomenon in time. Such a setup is very convenient for time series data in financial analysis, where the assumption of having multiple realizations of a time series is seldom satisfied. Another advantage of this method is that it can be used without assumptions on the distributions of returns. The paper presents the method as well as an application to financial data sets.
Jacek Leśkow, Antonio Napolitano
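The fraction-of-time idea can be illustrated with a very small sketch, assuming only that probabilities are identified with fractions of time along a single observed path. This is not the authors' estimator, only a minimal empirical-quantile version of the same single-realization, distribution-free spirit; the function name `var_fot` is this sketch's own.

```python
import numpy as np

def var_fot(returns, alpha=0.05):
    """Value-at-Risk from one realization, in the fraction-of-time
    spirit: the probability of a loss below a threshold is identified
    with the fraction of time the observed return falls below it, so
    VaR is (minus) the empirical alpha-quantile of the single observed
    path -- no ensemble of realizations and no distributional
    assumption on returns."""
    return -np.quantile(np.asarray(returns, dtype=float), alpha)

rng = np.random.default_rng(0)
v = var_fot(rng.standard_normal(200_000), alpha=0.05)   # one simulated path
```

For a standard normal return series the 5% VaR computed this way should approach the theoretical value of about 1.645.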
12. Spectral Analysis for Economic Time Series
Summary
The last ten years have witnessed an increasing interest of the econometrics community in spectral theory. In fact, decomposing a series’ evolution into periodic contributions allows a more insightful view of its structure and of its cyclical behavior at different time scales. In this paper, the issues of cross-spectral analysis and filtering are concisely broached, dwelling in particular upon the windowed filter [15]. In order to show the usefulness of these tools, an application to real data, namely to US unemployment and inflation, is presented. By means of cross-spectral analysis and filtering, a correlation can be found between these two quantities (i.e. the Phillips curve) in some specific frequency bands, even if it does not appear in the raw data.
Alessandra Iacobucci
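The band-specific correlation the abstract describes can be seen in miniature with SciPy's magnitude-squared coherence. This sketch uses synthetic series (a shared cycle buried in independent noise) rather than the chapter's unemployment and inflation data; the frequencies and noise levels are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
t = np.arange(2048)
cycle = np.sin(2 * np.pi * t / 64)              # common cyclical component
x = cycle + 0.5 * rng.standard_normal(t.size)   # series 1: cycle + noise
y = -cycle + 0.5 * rng.standard_normal(t.size)  # series 2: inverted cycle + noise

# Magnitude-squared coherence: frequency-by-frequency squared correlation
f, Cxy = coherence(x, y, fs=1.0, nperseg=256)
band = np.argmin(np.abs(f - 1 / 64))            # bin nearest the shared cycle
```

The two series are nearly uncorrelated in the raw data (the cycle enters with opposite signs), yet the coherence is close to 1 in the band of the shared cycle and low elsewhere, mirroring the frequency-band Phillips-curve finding.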
13. Policy Analysis Using a Microsimulation Model of the Italian Households
Summary
In this paper, we apply a dynamic microsimulation approach, which allows us to examine the evolution of the system as a whole and at the same time to focus our attention on the different typologies of workers and pensioners. The latter objective is achieved by simulating individual reactions to systemic changes, while taking into account the regional dimensions. This technique also enables us to perform a general micro-analysis of the effects of past reforms on family pension-income distribution and average individual pension benefits. The analytical framework used is the dynamic microsimulation model MIND, jointly developed by the Universities of Parma and Pisa, which incorporates behavioural analysis of individual choices of retirement age, derived from the Stock-Wise [24] option value model. We also perform sensitivity analysis on the model. This represents a valuable technique for treating uncertainty in input variables and for testing the robustness of the simulation results to possible changes in the macroeconomic scenario. In particular, we measure the economic impact resulting from alternative values of the income growth rate and the real interest rate, as well as the effect of different distributions of the educational attainment of individuals.
Carlo Bianchi, Marzia Romanelli, Pietro A. Vagliasindi
14. Validating a Dynamic Microsimulation Model of the Italian Households
Summary
The recent literature — including among others Redmond et al. [22], Gupta and Kapur [13], Mitton et al. [19] — has highlighted model alignment and validation as crucial issues to be tackled when microsimulating the consequences of public policies. This paper discusses some preliminary validation experiments performed on the model inputs, procedures and simulation results. The validation process that we use involves external checks, such as the ex-post comparison (from 1996 to 1999) of aggregated key macro variables (e.g. dependent workers’ incomes, and demographic variables) with official data (e.g. supplied by the Italian National Institute for Statistics, ISTAT) at the national and regional levels (North, Centre and South). We also test the appropriateness of the main assumptions and specifications of the model, and the effectiveness of policies, using Monte Carlo simulations. Specifically, our analysis allows us to test MIND’s ability to forecast demographic and economic trends and to capture socio-economic dynamics for regional areas.
Carlo Bianchi, Marzia Romanelli, Pietro A. Vagliasindi
15. Recent Advances in Micromodeling: The Choice of Retiring
Summary
Recently, much attention has been devoted to econometric models as a new tool for dynamic microsimulation. In particular, microeconometric approaches to retirement decisions have been increasingly adopted for “calibrating” dynamic microsimulation frameworks aiming at endogenizing retirement choices. By doing this, both the understanding and the prediction of the effects of policy reforms (for instance, of Social Security systems) can be significantly improved. In this work an overview of the most recent developments in micromodeling retirement decisions is carried out. In particular, as for the choice of the estimation strategy, special emphasis is placed on the trade-off between the degree of realism of the hypotheses, on the one hand, and data tractability and/or estimation performance, on the other hand. Finally, some issues which represent a challenging avenue for future research are discussed.
Luca Spataro
16. Applied Econometrics Methods and Monetary Policy: Empirical Evidence from the Mexican Case
Summary
The main objective of this paper is to illustrate, using Mexican data, how the results yielded by modern econometric methods depend upon each specific technique as well as upon the statistical properties of the series analyzed. The problems are even stronger and more evident in the case of economic series with structural changes and high variability, as is the case of Mexico. Applied econometrics should be explicitly based upon a probability viewpoint, and different methods should be understood as producing only approximations to the actual data generation process. Thus, alternative techniques can only show distinctive features of the actual data that still need to be validated against the rest of the empirical evidence. This indicates that applied econometricians have to look for maximum information by correctly applying different techniques, without forgetting the relevance of economic reasoning. Using Mexican data, alternative econometric estimations are evaluated, indicating that the formulation of monetary policy solely on the basis of some specific technique, without considering its potential pitfalls, is not to be recommended.
Luis Miguel Galindo, Horacio Catalán

Themes of Growth and Development

17. Environmental Policy Options in the Multi-Regimes Framework
Summary
In this paper we extend the multi-regime framework to variables involved in the debate on economic growth and environmental quality, starting from a reexamination of the so-called Environmental Kuznets Curve. The aim is to discuss the double convergence hypothesis that implicitly stems from a recent line of research. According to it, some stylized facts would support the almost paradoxical hypothesis that economic growth produces not only cross-country or cross-regional convergence in per capita output, but also convergence in (the demand for) environmental quality.
Factual analysis seems to reject the hypothesis of convergence in output or income levels. Available evidence, rather, seems to point out that there is no such thing as a unique avenue to sustainable development, while the convergence predicted by more conventional analyses, in particular within the framework of the so-called Environmental Kuznets Curve, is far from being demonstrated. Actual growth processes do differ from each other in a deep qualitative sense, to the effect of profoundly influencing final outcomes as well as the unfolding of the processes themselves. This reflects differences in initial conditions, of course, but also the different sectoral or integrated policies that have been implemented along the way.
Therefore, in contrast to the double convergence hypothesis, in our contribution we argue that growth is a necessary but not sufficient condition for the change in individuals’ preferences needed to shift social preferences away from private and towards public goods, and that, moreover, the relationship between growth and environmental quality depends crucially upon the country’s growth model. Therefore, more than the quantitative, it is the qualitative aspects that matter. The theoretical context that seems to lend itself to the analysis of such issues falls within the boundaries of the theories of endogenous growth. We argue that sustainable development, if it emerges at all, is the result of investment in immaterial capital (research, education and the like) more than the reflection of the exogenous forces (technological progress and demographic growth) of the neoclassical theory. In the analysis of such issues, the environment offered by the multi-regime approach proves useful, as it highlights the qualitative properties of the dynamic processes, instead of focusing upon quantitative estimation of some special asymptotic states whose existence is often yet to be demonstrated.
Salvatore Bimonte, Lionello F. Punzo
18. An Empirical Analysis of Growth Volatility: A Markov Chain Approach
Summary
This paper studies the determinants of growth rate volatility, focusing on the effects of the level of GDP, structural change and the size of the economy. First we provide a graphical analysis based on nonparametric techniques, then a quantitative analysis following the distribution dynamics approach. Growth volatility appears to (i) decrease with per capita GDP, (ii) increase with the share of the agricultural sector in GDP, and (iii) decrease with the size of the economy, measured by a combination of total GDP and trade openness. However, we show that the explanatory power of per capita GDP tends to vanish when we control for the size of the economy.
Davide Fiaschi, Andrea Mario Lavezzi
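The distribution dynamics approach mentioned here typically works with a Markov transition matrix estimated from a discretized series and its implied long-run distribution. The following is a generic sketch of that machinery, not the chapter's specification; the discretization and the toy trajectory are illustrative assumptions.

```python
import numpy as np

def transition_matrix(states, n_states):
    """Maximum-likelihood estimate of a Markov transition matrix from a
    discretized trajectory: count transitions, then normalize each row.
    Rows for unvisited states fall back to a uniform distribution."""
    P = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        P[a, b] += 1
    rows = P.sum(axis=1, keepdims=True)
    return np.divide(P, rows, out=np.full_like(P, 1.0 / n_states),
                     where=rows > 0)

def ergodic_distribution(P, iters=1000):
    """Long-run (stationary) distribution by power iteration."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

# Toy trajectory of volatility classes (e.g. 0 = low, 1 = high)
P = transition_matrix([0, 0, 1, 0, 0, 1], 2)
pi = ergodic_distribution(P)
```

In the chapter's setting the states would be classes of growth volatility (possibly conditioned on GDP or economy size), and the ergodic distribution summarizes where the cross-section of countries tends in the long run.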
19. New Measurement Tools of the External-Constrained Growth Model, with Applications for Latin America
Summary
In the spirit of the New Tools perspective, this paper presents an overview that identifies the main changes in the measurement toolkit that has been used to test the empirical adequacy of the balance of payments constrained growth model (BPC-model) for comparative macroeconomic analysis as well as for the study of the long-term constraints on economic growth. It takes into account the BPC-model in its initial formulation, as put forward by A.P. Thirlwall nearly three decades ago, as well as in its recent developments introduced to account for alternative definitions of long-term equilibrium. It shows how the theoretical revisions of the BPC-model have been accompanied by a sophistication of its empirical testing techniques, better suited for the applied analysis of long-term economic relations. In addition, the toolkit currently used within this analytical perspective is illustrated with data for the Mexican economy, based on a version of the BPC-model that explicitly focuses on the relevance of the influence of interest payments on the economy’s long-term external equilibrium. The paper ends with some comments on the policy implications that may be derived from the application of the BPC-model, with special reference to the case of Latin America.
Juan Carlos Moreno-Brid, Carlos Ricoy
20. The Fractal Structure, Efficiency, and Structural Change: The Case of the Mexican Stock Market
Summary
In the first part of this paper we present experimental evidence that the Mexican stock market has a fractal structure. We obtained the experimental results by using the Matsushita-Ouchi method, the box-counting method, and the fractal image compression technique of M. Barnsley. The results obtained by applying the Matsushita-Ouchi technique to the returns of the Indice de Precios y Cotizaciones (IPC) of the Mexican stock market support the assertion that the returns of the IPC behave like a fractional Brownian motion. The box-counting method allows us to calculate the dimension of the graphs of the IPC, and its Hurst exponent H. The application of the fractal image compression technique produces attractors which are close to the graphs of the returns of the IPC, and has permitted us to estimate H. In the second part of the paper, we present two applications: first we study the efficiency graphs of the Mexican stock market by plotting the Hurst exponent H as a function of time, and then we localize its structural changes by making use of a fractal attractor located in the phase space H-IPC. We show that the fall of the IPC that occurred between 1994 and 1995 corresponded to a rise in the value of H. We localize approximately the structural changes of the Mexican economy between 1987 and 1996.
Guillermo Romero-Meléndez, Mauricio Barroso-Castorena, Jorge Huerta-González, Manuel Santigo-Bringas, Carlos Alberto García-Valdéz
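The Hurst exponent that recurs in this abstract can be estimated in several ways. As a minimal sketch (the classical rescaled-range method, not the Matsushita-Ouchi or box-counting techniques the chapter actually uses), applied to a return series:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent H by the rescaled-range (R/S)
    method: split the series into windows of doubling size, compute
    the range of the mean-adjusted cumulative sum divided by the
    standard deviation in each window, and regress log(R/S) on
    log(window size). H ~ 0.5 indicates uncorrelated increments,
    H > 0.5 persistence, H < 0.5 anti-persistence."""
    x = np.asarray(x, dtype=float)
    n = x.size
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            s = chunk.std()
            if s > 0:
                vals.append((dev.max() - dev.min()) / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(4096))   # white-noise returns: H near 0.5
```

On the IPC returns, an H drifting away from 0.5 over time is what the chapter reads as a change in market efficiency.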
21. Processes of Evolutionary Self-Organization in High Inflation Experiences
Summary
We study some features of the processes that have generated high inflation in Latin American countries. The statistical evidence shows that these inflationary experiences are fractional Brownian noises. Several authors have shown that self-organized criticality (SOC) processes may constitute the best explanation of the origin of such noises. But this hypothesis requires that the underlying structure remains time-invariant. We conjecture, instead, that economic structures evolve in time, being, at each stage of their evolution, self-organized structures. We find that such ESO (evolutionary self-organized) processes still generate fractional Brownian noises. Thus, they seem to provide a better explanation for the economic phenomenon of high inflation.
Fernando Tohmé, Carlos Dabús, Silvia London
22. Semi-Infinite Programming: Properties and Applications to Economics
Summary
In recent years semi-infinite programming has become one of the most substantial research topics in the field of operations research. The goal of this paper is to familiarize a broader class of mathematicians, economists and engineers with what a semi-infinite programming problem is and how it can be applied to the modelling and solution of real-life problems from economics, finance and engineering. Several examples illustrate the characteristic geometric features of this class of problems as well as their consequences for the use of solution methods. Then the paper turns to applications which are modelled as semi-infinite programming problems: a control problem, a Stackelberg game, a portfolio problem, a robust optimization problem and a technical trading system for futures contracts. Finally, conclusions and open questions are discussed, and a special bibliography on applications of semi-infinite programming is presented.
Francisco Guerra Vázquez, Jan-J. Rückmann
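A semi-infinite program has finitely many variables but infinitely many constraints, one for each point of a continuous index set. A standard way to get a feel for this, and a common numerical approach, is to discretize the index set. The sketch below (an illustration of the problem class, not an example from the chapter) computes the best uniform linear approximation of t^2 on [0, 1], whose exact solution is known to be t - 1/8 with error 1/8:

```python
import numpy as np
from scipy.optimize import linprog

# Minimize eps subject to |a + b*t - f(t)| <= eps for ALL t in [0, 1]:
# a semi-infinite program, solved here by replacing [0, 1] with a grid.
f = lambda t: t ** 2
grid = np.linspace(0.0, 1.0, 201)        # finite surrogate for [0, 1]

# Variables x = [a, b, eps].  Each grid point t contributes two rows:
#   a + b*t - eps <= f(t)   and   -a - b*t - eps <= -f(t)
ones = np.ones_like(grid)
A = np.vstack([
    np.column_stack([ones, grid, -ones]),
    np.column_stack([-ones, -grid, -ones]),
])
b = np.concatenate([f(grid), -f(grid)])
res = linprog(c=[0, 0, 1], A_ub=A, b_ub=b, bounds=[(None, None)] * 3)
a, slope, eps = res.x                    # expected: a ~ -1/8, slope ~ 1, eps ~ 1/8
```

The same discretization template carries over to the economic applications the chapter lists, e.g. robust portfolio constraints that must hold for every scenario in a continuum.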
Backmatter
Metadata
Title
New Tools of Economic Dynamics
Editors
Prof. Dr. Jacek Leskow
Prof. Dr. Lionello F. Punzo
Prof. Martín Puchet Anyul
Copyright Year
2005
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-28444-4
Print ISBN
978-3-540-24282-6
DOI
https://doi.org/10.1007/3-540-28444-3