Contemporary economics is in troubled waters. This is true most of all for the particular area of economic discourse labeled macroeconomics. Although there now exists a consolidated and celebrated mainstream framework known as the Dynamic Stochastic General Equilibrium (DSGE) model (Blanchard, 2008; Woodford, 2008), its internal coherence and its ability to explain the empirical evidence are increasingly questioned from several quarters (Colander, 2006; Howitt et al., 2008; Juselius and Franchi, 2007), especially after the turmoil of the first global crisis of the XXIst century materialized almost unannounced and misconstrued (Driffill, 2008).
In the final chapter of his General Theory, Keynes (1936) described the politicians of his day as intellectual slaves of economists who had died at least a decade before. Extending this metaphor, the economists of the XXIst century are intellectual slaves of the mummified physicists of the XVIIth century (see also Mirowski, 1989).
Phelps (1990) situates the rise of the probabilistic approach in economics within the wider intellectual revolution known as modernism, a cultural and philosophical movement which transformed the Western world between 1860 and 1930, and which comprises “… the cubism of Picasso and Braque, the atonalism of Schoenberg and Berg, the fragmented poetry of Eliot and Pound, and various writings from Nietzsche to Sartre” (Phelps, 1990, p. 5).
This is the case for the well-known as if argument put forth by Friedman (1953), according to which the primary appraisal criterion of a theory should be its predictive success, and not the truth content of its assumptions.
Illuminatingly enough, for evolutionary biologists an organism reaches a state of rest (i.e., an equilibrium, in the sense of mainstream economics) only when it is dead.
We resist the pretension to a more precise definition for the simple reason that the term itself - and its counterpart, complexity, to which we will come in due course - is one of the most slippery in the recent history of science (Israel, 2005), and its antecedents are dreadfully connected to non-scientific argumentations (Epstein, 1999).
Methodological reductionism, that is, the reduction of theories along a tree of knowledge (from socio-economics down to psychology, biology, chemistry and eventually physics), is also referred to as hierarchical reductionism by Dawkins (1976).
We doubt, however, that Smith, Marshall and Walras would have embraced the idea of economic theory that Lucas has in mind.
It seems worthwhile to notice that this procedure for the microfoundation of macroeconomics is very different from its methodological counterpart in physics. The latter starts from the micro-dynamics of the single particle, as expressed by the Liouville equation, and, through the master equation, ends up with macroscopic equations. In the aggregation process, the individual entities lose their degrees of freedom and behave coherently in the aggregate.
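The logic of that aggregation step can be sketched schematically (the notation is ours; the note itself does not spell out the equations). For a system whose aggregate state $n$ occurs with probability $P(n,t)$, a generic master equation reads

```latex
\frac{\partial P(n,t)}{\partial t}
  = \sum_{n' \neq n} \Big[ W(n \mid n')\, P(n',t) - W(n' \mid n)\, P(n,t) \Big],
```

where $W(n \mid n')$ denotes the transition rate from state $n'$ to state $n$. Taking the first moment, $\langle n \rangle(t) = \sum_n n\, P(n,t)$, yields a deterministic macroscopic equation of motion in which the individual degrees of freedom no longer appear, which is the sense in which particles "behave coherently in the aggregate."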
From this viewpoint, modern macroeconomists have provided an unambiguous answer to the question that Marshall posed when commenting on a cornerstone of marginalism, Mathematical Psychics by F.Y. Edgeworth: “It will be interesting to watch the development of his theory, and, in particular, to see how far he succeeds in preventing his mathematics from running away with him, and carrying him out of sight of the actual facts of economics” (Marshall, 1881, p. 457).
Davis (2006) identifies three “impossibility results” at the root of the breakdown of neoclassical economics and, by extension, of the NNS: (i) Arrow's impossibility theorem, showing that neoclassical theory is unable to explain social choices (Arrow, 1951); (ii) the Cambridge capital debate, pointing out that neoclassical economics is contradictory with respect to the concept of aggregate capital (Cohen and Harcourt, 2003); and (iii) the Sonnenschein-Debreu-Mantel result, showing that the standard comparative-static reasoning is inapplicable in general equilibrium models. In the main text we limit ourselves to discussing several variants of the latter point.
For instance, one might require that all agents in the economy have Cobb-Douglas preferences.
That is, cases of nonconvergence towards the GE occur on open sets of initial conditions, not just at isolated points.
In philosophy, the dual of the fallacy of composition is called the fallacy of division, defined as the erroneous practice of attributing properties to a level of analysis different from the one at which the property is observed.
Vanberg (2004) speaks in this case of the rationality postulate, as opposed to the rationality hypothesis, which requires global (i.e., complete) consistency of preferences, beliefs and actions.
A possible way to define Knightian uncertainty is that the potential outcomes of an action are described by two or more probability distributions at the same time, and that these distributions overlap.
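A minimal numerical sketch of this definition may help. In the snippet below (all parameters are illustrative, not drawn from the text), the same action has outcomes governed by one of two overlapping normal distributions, and the agent cannot attach probabilities to which of the two is the true one; only an interval of expected payoffs can be stated, which is what distinguishes Knightian uncertainty from ordinary risk.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two candidate (overlapping) outcome distributions for the SAME action.
# Under Knightian uncertainty the agent cannot assign probabilities to
# which distribution governs the world. Parameters are hypothetical.
draws_a = rng.normal(loc=1.0, scale=2.0, size=100_000)  # candidate model A
draws_b = rng.normal(loc=2.0, scale=2.0, size=100_000)  # candidate model B

# Under risk, a single known distribution would pin down ONE expected
# payoff; here the agent can only bound the expectation by an interval.
exp_low, exp_high = sorted([draws_a.mean(), draws_b.mean()])
print(f"expected payoff lies somewhere in [{exp_low:.2f}, {exp_high:.2f}]")

# Because the supports overlap, an observed outcome (say 1.5) is plausible
# under both models, so experience alone cannot resolve the ambiguity.
share_a = np.mean(np.abs(draws_a - 1.5) < 0.5)
share_b = np.mean(np.abs(draws_b - 1.5) < 0.5)
print(f"P(outcome near 1.5) under A: {share_a:.2f}, under B: {share_b:.2f}")
```

The point of the exercise is only that no single expected value exists to maximize, so the standard expected-utility machinery has no unambiguous input.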
An interesting strand of literature which allows a micro-meso-macro computational analysis, and that proves to be very close in spirit - and to some extent complementary - to our approach, is the one which applies agent-based techniques to evolutionary Schumpeterian models of endogenous technological change. Inspired by the path-breaking work of Nelson and Winter (1982), recent examples include Dosi et al. (2008a, b) and Saviotti and Pyka (2008).
According to Beinhocker (2006), complex economies are open-ended, dynamic, non-linear, far from equilibrium systems, while mainstream economics deals with closed, static, linear systems perpetually in equilibrium. Complex systems undertake an evolutionary process of differentiation, selection and amplification, which provides the system itself with novelty and is responsible for its growth in order and complexity.
Actually, the topic of how social relations may affect the allocation of resources has been investigated by some neoclassical economists (e.g., Leibenstein, 1950; Arrow, 1971; Pollak, 1976). However, they went almost completely unheard, until the upsurge in the early 1990s of a brand new body of work aimed at understanding and modelling the social context of economic decisions, usually labelled new social economics or social interaction economics (Durlauf and Young, 2001). Models of social interactions are generally able to produce several properties, such as multiple equilibria (Brock and Durlauf, 2001); non-ergodicity and phase transition (Durlauf, 1993); equilibrium stratification along social and/or spatial dimensions (Benabou, 1996; Glaeser et al., 1996); and the existence of a social multiplier of behaviours (Glaeser et al., 2002). The key idea consists in recognizing that the social relationships in which individual economic agents are embedded can have a large impact on economic decisions. In this literature, the social context affects individual economic decisions through several mechanisms. First, social norms, cultural processes and economic institutions may influence motivations, values and tastes and, ultimately, make preferences endogenous (Bowles, 1998). Second, even if one admits that individuals are endowed with exogenously given preferences, the pervasiveness of information asymmetries in real-world economies implies that economic agents voluntarily share values, notions of acceptable behaviour and socially based enforcement mechanisms in order to reduce uncertainty and favour coordination (Denzau and North, 1994). Third, the welfare of individuals may depend on social characteristics like honour, popularity, stigma or status (Cole et al., 1992).
Finally, interactions not mediated by enforceable contracts may occur because of pure technological externalities in network industries (Shy, 2001) or indirect effects transmitted through prices (pecuniary externalities) in non-competitive markets (Blanchard and Kiyotaki, 1987), which may lead to coordination failures due to strategic complementarities (Cooper, 1999).
According to the mainstream approach, information is complete and free for all agents. Note that one of the key assumptions in the Walrasian tradition is that any strategic behaviour is ruled out, and the collection of all relevant information is left to the market via the auctioneer (or a benevolent dictator [Barone, 1908]). In fact, one could read the rational expectations “revolution” as an attempt at decentralizing the price-setting procedure by defenestrating the auctioneer. Limited information is now taken into account, but the constraints have to affect every agent in the same way (the so-called Lucas islands hypothesis), while the Greenwald-Stiglitz theorem (Greenwald and Stiglitz, 1986) states that in this case the equilibrium is not even constrained Pareto-efficient. If information is asymmetric or private, agents have to be heterogeneous and direct interaction has to be considered: this simple fact destroys the efficiency property of the mainstream model and generates coordination failures. On the contrary, ABM are built upon the hypothesis that agents have limited information and learn through experience and by interacting with other agents.
Brian Arthur offers an effective statement of its relevance for economic theory: “Standard neoclassical economics asks what agents' actions, strategies, or expectations are in equilibrium with (consistent with) the outcome or pattern these behaviours aggregatively create. Agent-based computational economics enables us to ask a wider question: how agents’ actions, strategies or expectations might react to - might endogenously change with - the pattern they create. […] This out-of-equilibrium approach is not a minor adjunct to standard economic theory; it is economics done in a more general way. […] The static equilibrium approach suffers two characteristic indeterminacies: it cannot easily resolve among multiple equilibria; nor can it easily model individuals’ choices of expectations. Both problems are ones of formation (of an equilibrium and of an “ecology” of expectations, respectively), and when analysed in formation - that is, out of equilibrium - these anomalies disappear” (Arthur, 2006, p. 1552).
- Introducing Bottom-up Adaptive Macroeconomics
- Domenico Delli Gatti
- Springer Milan
- Chapter 1