
About this book

Hans Schneeweiß is one of the best-known German econometricians and statisticians. He was born in Glatz, Silesia, on March 13, 1933. Hans Schneeweiß studied mathematics and physics and received his Ph.D. degree from the Johann-Wolfgang-Goethe University, Frankfurt, in 1960. He was a member of the academic staff of the Faculty of Law and Economics of the Saar University between 1959 and 1965. Following his Habilitation in 1964, he was appointed to the chair of Statistics and Econometrics at the same university. As a visiting professor he worked at the Institute for Academic Studies in Vienna in 1967 and at the Department of Statistics of the University of Waterloo, Canada, in 1970/71. He has been a full professor of Econometrics and Statistics at the Ludwig-Maximilians-University in Munich since 1973. His extensive research activities abroad included important projects in Waterloo, Vienna, Dundee (Scotland), Sydney, China, and Kiev. During the more than 40 years of his academic work he has published outstanding, original articles on econometrics and statistics. To give an example, his research on decision theory has marked developments in this field. His book Entscheidungskriterien bei Risiko, published in 1967, is an excellent starting point for anyone looking for an introduction to the complex issues involved.

Table of contents

Frontmatter

Errors-in-Variables

Frontmatter

Errors in Variables in Econometrics

Summary
This article discusses the use of instrumental variables and grouping methods in the linear errors-in-variables or measurement error model. Comparisons are made between these methods, standard measurement error model methods with side conditions, least squares methods, and replicated models. It is demonstrated that there are close relationships between these apparently diverse estimation techniques.
Chi-Lun Cheng, John W. Van Ness
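As a purely illustrative aside (not taken from the chapter), the contrast between least squares and instrumental variables under measurement error can be sketched in a few lines; the data, the instrument and all parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)
n, beta = 5000, 2.0

# Latent regressor x, error-prone measurement w = x + u, and an instrument z
# correlated with x but independent of both error terms (hypothetical data).
z = rng.normal(size=n)
x = z + rng.normal(size=n)
w = x + rng.normal(0, 1.0, n)                 # measurement error u
y = beta * x + rng.normal(0, 1.0, n)

ols = np.sum(w * y) / np.sum(w * w)           # attenuated towards zero
iv = np.sum(z * y) / np.sum(z * w)            # consistent for beta

print("OLS on the error-prone regressor:", ols)
print("IV estimate                     :", iv)
```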

Estimation for the Nonlinear Errors-in-Variables Model

Summary
An estimator for the parameters of the nonlinear errors-in-variables model with smaller bias than that of the functional maximum likelihood estimator is presented. The estimator is a least squares estimator with an internal Monte Carlo adjustment for bias.
Wayne A. Fuller

Nonparametric Regression Splines for Generalized Linear Measurement Error Models

Summary
In many regression applications both the independent and dependent variables are measured with error. When this happens, conventional parametric and nonparametric regression techniques are no longer valid. This is further complicated when one instead wants to fit a generalized linear model to the collected data. We consider two different estimation techniques. The first method is the SIMEX (SIMulation Extrapolation) algorithm which attempts to estimate the bias, and remove it. The second method is a structural approach, where one hypothesizes a distribution for the independent variable which depends on estimable parameters. For both methods, two different knot selection methods are developed.
Raymond J. Carroll, Jeffrey D. Maca, Suojin Wang
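The SIMEX idea mentioned in the summary can be sketched for the simple linear case; the chapter itself treats regression splines and generalized linear models, so the following is only a minimal illustration with hypothetical data and a known measurement error variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: Y depends linearly on a latent X, but we only
# observe W = X + U with known measurement-error standard deviation.
n, beta0, beta1, sigma_u = 500, 1.0, 2.0, 0.8
x = rng.normal(0.0, 1.0, n)
y = beta0 + beta1 * x + rng.normal(0.0, 0.5, n)
w = x + rng.normal(0.0, sigma_u, n)           # error-prone measurement of x

def naive_slope(w, y):
    """OLS slope of y on w (biased towards zero under measurement error)."""
    return np.polyfit(w, y, 1)[0]

# SIMEX: add extra error with variance lambda * sigma_u^2, record the naive
# slope at each lambda, fit a quadratic in lambda, and extrapolate back to
# lambda = -1, which corresponds to the error-free case.
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
B = 100                                       # Monte Carlo replicates per lambda
slopes = []
for lam in lambdas:
    reps = [naive_slope(w + np.sqrt(lam) * sigma_u * rng.normal(size=n), y)
            for _ in range(B)]
    slopes.append(np.mean(reps))

coefs = np.polyfit(lambdas, slopes, 2)
simex_slope = np.polyval(coefs, -1.0)

print("naive slope :", naive_slope(w, y))
print("SIMEX slope :", simex_slope, "(true value 2.0)")
```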

Different Nonlinear Regression Models with Incorrectly Observed Covariates

Summary
We present quasi-likelihood models for different regression problems when one of the explanatory variables is measured with heteroscedastic error. In order to derive models for the observed data, the conditional mean and variance functions of the regression models are expressed solely through functions of the observable covariates. The latent covariate is treated as a random variable that follows a normal distribution. Furthermore, it is assumed that enough additional information is provided to estimate the individual measurement error variances, e.g. through replicated measurements of the fallible predictor variable. The discussion includes the polynomial regression model as well as the probit and logit models for binary data, the Poisson model for count data and ordinal regression models.
Markus Thamerus

ML Estimation from Binomial Data with Misclassifications

A Comparison: Internal Validation versus Repeated Measurements
Summary
Tenenbein (1970) presented a double sampling scheme to estimate the proportion parameter of binomial data in the presence of misclassification. In the context of measurement error models this strategy is known as the internal validation method. A second broad strategy is the use of repeated measurements. We show how to apply this strategy to the estimation of a binomial proportion parameter and address the question of which method should be preferred by comparing the asymptotic variances of the estimators.
Gerhard Schuster

The Indeterminacy of Latent Variable Models

Abstract
It is well known that the standard normal factor model is indeterminate in the sense that linear rotations in the factor space leave the covariance matrix unchanged. Thus if the model is written
$$x = \mu + \Lambda y + e \qquad (1.1)$$
where Ψ = Cov(e) is diagonal and E(ey′) = 0, then this model is indistinguishable from one with factors z = My and loading matrix Λ* = ΛM′, where M is a non-singular orthogonal matrix with MM′ = I. In both cases the covariance matrix is Σ = ΛΛ′ + Ψ.
David J. Bartholomew
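The rotation indeterminacy described in the abstract is easy to verify numerically. The following minimal sketch (hypothetical loading matrix and uniquenesses, not taken from the chapter) confirms that Λ and ΛM′ imply the same covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

# A small two-factor model: Sigma = Lambda Lambda' + Psi, with Psi diagonal.
Lambda = rng.normal(size=(5, 2))              # 5 observed variables, 2 factors
Psi = np.diag(rng.uniform(0.2, 1.0, 5))
Sigma = Lambda @ Lambda.T + Psi

# Any orthogonal M (M M' = I) defines rotated factors z = M y with
# loading matrix Lambda* = Lambda M'; the implied covariance is unchanged.
theta = 0.7
M = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Lambda_star = Lambda @ M.T
Sigma_star = Lambda_star @ Lambda_star.T + Psi

print(np.allclose(Sigma, Sigma_star))         # True: the data cannot tell them apart
```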

Theoretical Econometrics

Frontmatter

An Introduction to the Economic Study of Knowledge

Abstract
Philosophy is expected to give answers to three basic questions: What can we know? What may we hope? What shall we do? The theory of knowledge or epistemology deals with the first of these.
Martin J. Beckmann

Potentials and Limitations of Econometric Forecast and Simulation Models

Abstract
Formal methods and models have become an integral part of the social sciences and in particular of economics. The availability of empirical data and the data processing capacity of computers have indeed created demand for quantitative analysis which in former times had been done mostly in a qualitative manner. In this paper I give an overview of different approaches to modelling and empirically analysing economies, starting from early approaches of the Cowles type which were first discussed in the 1950s. I also treat large-scale models which were developed in the 1960s and 1970s and which were and still are used mainly for forecasting. However, during the last two decades two types of quantitative models have become important which are drastically different in their approach compared to econometric models: one is the “Real Business Cycle” (RBC) approach, which is based mainly on macroeconomic theory, and the other is the “Computational General Equilibrium” (CGE) approach, which starts from microeconomics. I briefly describe some recent developments in economics which were at least partly responsible for these two new classes of models, which could be characterized as empirical models without data since there is no longer need for a large sample, i.e. a long time series. For economies which have undergone a radical change, such as most Eastern European countries, this is the only way to formally analyse the new structure of the economy and to forecast the future development, although equilibrium models in particular may not be the most adequate way of describing the present structure of these economies.
Gerd Ronning

Consistent Estimation of the Number of Cointegration Relations in a Vector Autoregressive Model

Summary
A class of criteria is developed which estimate the cointegration rank of a vector autoregressive (VAR) model consistently. It turns out that the usual consistent criteria for lag length selection can be adapted for the present purpose. However, alternative criteria may be advantageous. The small sample relevance of the asymptotic results is demonstrated by a small simulation study.
Helmut Lütkepohl, Don S. Poskitt

Locally Weighted Autoregression

Summary
Estimation of mean and volatility functions for nonlinear time series models of the ARCH type is discussed. The mean function is estimated with local linear autoregression. The volatility function is estimated with a kernel estimator based on the squared residuals of the mean function. Asymptotic bias and variance of these estimators are investigated. The proposals are applied to daily exchange rates of DEM/USD.
Yuanhua Feng, Siegfried Heiler
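A minimal sketch of the two estimation steps described in the summary, using simulated data, a Gaussian kernel and a fixed, hand-picked bandwidth; the chapter's actual bandwidth selection and asymptotic analysis are not reproduced here.

```python
import numpy as np

def gauss_kernel(u):
    return np.exp(-0.5 * u**2)

def local_linear(x, y, x0, h):
    """Local linear fit of y on x at x0 with bandwidth h; returns m_hat(x0)."""
    w = gauss_kernel((x - x0) / h)
    X = np.column_stack([np.ones_like(x), x - x0])
    XtW = X.T * w                                 # kernel-weighted design
    beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta[0]                                # intercept = fit at x0

def nw_estimate(x, y, x0, h):
    """Nadaraya-Watson (locally constant) estimate of E[y | x = x0]."""
    w = gauss_kernel((x - x0) / h)
    return np.sum(w * y) / np.sum(w)

# Hypothetical nonlinear AR(1) series with ARCH-type conditional variance.
rng = np.random.default_rng(2)
T = 800
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * np.tanh(y[t - 1]) + np.sqrt(0.2 + 0.3 * y[t - 1] ** 2) * rng.normal()

lag, resp = y[:-1], y[1:]
h = 0.3                                           # fixed bandwidth, hand-picked here
grid = np.linspace(-2, 2, 41)

# Step 1: mean function by local linear autoregression.
m_hat = np.array([local_linear(lag, resp, x0, h) for x0 in grid])

# Step 2: volatility function from the squared residuals of the mean fit.
resid2 = (resp - np.array([local_linear(lag, resp, x0, h) for x0 in lag])) ** 2
vol_hat = np.array([nw_estimate(lag, resid2, x0, h) for x0 in grid])  # sigma^2 on grid
```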

Locally Weighted Least Squares in Categorical Varying-Coefficient Models

Summary
In varying-coefficient models the number of coefficients that have to be estimated is usually very high. Consequently, local likelihood estimates, which are based on an iterative procedure, are rather time-consuming. In the present paper an alternative local estimation procedure is proposed. It is based on the weighted least squares estimate applied locally for a fixed effect modifier, but additionally observations in the neighbourhood are used in a weighted form. The asymptotic behaviour of the locally weighted least squares estimator is shown to be equivalent to that of the local likelihood estimate. The performance of the estimator is illustrated by a small simulation study and an application to ordinal regression.
Gerhard Tutz, Göran Kauermann

Using First Differences as a Device against Multicollinearity

Abstract
In his textbook, Hans Schneeweiß (1971) warned that taking first differences in econometric models involving data with a common trend does not offer a solution to the multicollinearity problem. In the following we confirm his point of view by giving an alternative proof that the generalized least squares (GLS) method, applied to the model in first differences, will not produce estimates other than those of the ordinary least squares (OLS) method in the original model. Moreover, we demonstrate that related estimation procedures also cannot be expected to provide a substantial improvement over the OLS method. For this analysis the tools from the theory of oblique and orthogonal projectors turn out to be extremely helpful.
Helge Toutenburg, Götz Trenkler
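The projection argument can be illustrated numerically: GLS applied to the first-differenced model, with the covariance matrix of the differenced errors as weight matrix, reproduces the OLS slope estimates of the levels model, which is assumed in this sketch to contain an intercept. All data below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50

# Hypothetical trending regressors plus an intercept in the levels model.
t = np.arange(n)
X = np.column_stack([t + rng.normal(0, 1, n), 0.5 * t + rng.normal(0, 1, n)])
y = 1.0 + X @ np.array([2.0, -1.0]) + rng.normal(0, 1, n)

# OLS in the original (levels) model with an intercept; keep slope coefficients.
Z = np.column_stack([np.ones(n), X])
ols = np.linalg.lstsq(Z, y, rcond=None)[0][1:]

# First-difference matrix D ((n-1) x n) and GLS in the differenced model,
# using Cov(D u) proportional to D D' as the weight matrix.
D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]
Omega_inv = np.linalg.inv(D @ D.T)
DX, Dy = D @ X, D @ y
gls = np.linalg.solve(DX.T @ Omega_inv @ DX, DX.T @ Omega_inv @ Dy)

print(np.allclose(ols, gls))                     # True: differencing gains nothing
```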

Asymptotic Equivalence of Ordinary Least Squares and Generalized Least Squares with Trending Regressors and Stationary Autoregressive Disturbances

Summary
This note generalizes previous results on the asymptotic equivalence of Ordinary and Generalized Least Squares estimates in Linear Regression models with trending data.
Walter Krämer

The Analysis of Growth and Learning Curves with Mean- and Covariance Structure Models

Abstract
The author is grateful for the support of the Deutsche Forschungsgemeinschaft (DFG) given to the project B2 (Multivariate Prognosesysteme in der Warenwirtschafts- und Produktionssteuerung) within the SFB 475 (Komplexitätsreduktion in multivariaten Datenstrukturen) at the University of Dortmund.
Gerhard Arminger

Applied Econometrics

Frontmatter

How Important are Real Shocks for the Real Exchange Rate?

Summary
This paper analyzes long-run relations and driving stochastic trends of the real exchange rate between Germany and the United States in a structural cointegrated VAR framework. This allows the identification of common trends with permanent effects.
We find that monetary shocks are predominant, especially in the long run, while real disturbances with permanent effects on output fail to explain real exchange rate movements.
Kai Carstensen, Gerd Hansen

Trading Strategies of a Financially Strong Investor in Futures and Stocks

Is Profitable Manipulation Possible?
Summary
We consider a financially strong investor whose orders have an impact on share prices. We address the question whether trading strategies in the spot market and the corresponding market for index futures can be profitable. A criterion for profitability is derived and, given this criterion is satisfied, the manipulator’s optimal order volume is calculated explicitly. It seems that under realistic conditions the profit does not exceed transaction costs.
Günter Bamberg, Gregor Dorfleitner, Klaus Röder

Estimation of the Stochastic Volatility by Markov Chain Monte Carlo

Summary
In this paper, we consider stochastic volatility, which is used to measure the fluctuation of financial assets. Based on the stochastic volatility model introduced by Taylor (1986), a Bayesian point of view is taken to estimate the stochastic volatility by MCMC methods. The performance of these methods is evaluated in a simulation study. In addition, they are applied to a real dataset for estimating the volatility of swap rates.
Hans Boscher, Eva-Maria Fronk, Iris Pigeot
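A minimal sketch of the ingredients: a Taylor-type SV model is simulated and the latent log-volatilities are updated by a single-site random-walk Metropolis step, with the model parameters held fixed at hypothetical values; the chapter's full MCMC scheme also samples these parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a Taylor-type SV model (all parameter values hypothetical):
#   h_t = phi * h_{t-1} + eta_t,   y_t = exp(h_t / 2) * eps_t
phi, sig_eta, T = 0.95, 0.3, 500
h = np.zeros(T)
for t in range(1, T):
    h[t] = phi * h[t - 1] + sig_eta * rng.normal()
y = np.exp(h / 2) * rng.normal(size=T)

def log_target(ht, yt, prior_mean, prior_var):
    # log N(y_t; 0, exp(h_t)) plus log of the conditional AR(1) prior of h_t
    return (-0.5 * ht - 0.5 * yt**2 * np.exp(-ht)
            - 0.5 * (ht - prior_mean) ** 2 / prior_var)

# Single-site random-walk Metropolis over the interior log-volatilities.
h_draw = np.log(y**2 + 1e-8)          # crude starting values
step = 0.5                            # random-walk proposal scale
for sweep in range(200):              # MCMC sweeps
    for t in range(1, T - 1):
        prior_mean = phi * (h_draw[t - 1] + h_draw[t + 1]) / (1 + phi**2)
        prior_var = sig_eta**2 / (1 + phi**2)
        prop = h_draw[t] + step * rng.normal()
        log_ratio = (log_target(prop, y[t], prior_mean, prior_var)
                     - log_target(h_draw[t], y[t], prior_mean, prior_var))
        if np.log(rng.uniform()) < log_ratio:
            h_draw[t] = prop
```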

Money and Prices in Germany

Empirical Results for 1962 to 1996
Summary
In this contribution the linkages between money growth and inflation are investigated. Two vector error correction models are estimated, one with data before and the other with data including the German monetary union (GMU), which took place on 1 July 1990. The models contain the variables money, prices, output, and interest rate. We find one cointegrating relation, which can be interpreted as a long-run money demand equation. Including the GMU, this relation remains stable if the structural break is captured by dummy variables. Although the dynamic structure of the two vector error correction models is hardly influenced by the GMU, the linkages between money growth and inflation are sensitive to whether international price movements are included or not.
Imke Brüggemann, Jürgen Wolters

Quasi-Minimax Estimation, Prior Information and Money Demand in Germany

Summary
We illustrate the application of various quasi-minimax estimators in a linear regression model in which money demand in Germany is related to real GNP, the inflation rate and the nominal interest rate. Initial interval constraints on the coefficients are transformed into ellipsoidal restrictions. The resulting quasi-minimax estimators are shown to outperform ordinary least squares according to a minimax risk criterion.
Hans-Joachim Mittag, Dietmar Stemann, Bernhard Schipp

Bayesian Forecasting of Turning Points in Economic Cycles

Summary
Bayesian methods are applied to the ifo business climate to predict the state of the German economy at the beginning of 1996. The models predict a cyclical downturn. However, 1996 is not registered as a recession year in Germany. It is argued that Germany was in fact in a recession and that business cycle timing is in a poor state. Official GDP data for 1996 is implausibly high.
Walter Naggl

Regression Approaches to Rental Guides

Summary
Rental guides have to provide information about “usual” rents that are paid for flats comparable in type, size, equipment, quality and location. At least in larger cities, traditional cross-classification tables have serious drawbacks. Model-based techniques of regression analysis with rent as the dependent variable are obvious alternatives in this situation. The statistician has to build a regression model that is simple, robust and easily interpretable, but also complex enough to reflect reality. In this paper, we first discuss some parametric regression approaches, including a model that has been used for the Munich’94 rental guide. We then proceed to semiparametric alternatives that are useful at least for exploratory data analysis and for the selection of adequate parametric models.
Ludwig Fahrmeir, Christian Gieger, Artur Klinger

An Econometric Model for the Transition Process of China’s Economy

Summary
The Chinese economy has experienced the most striking changes in the last two decades, with a structural transition from a centrally planned economy to a market-oriented economy and strong growth of GNP. The economic reform was the basis for a new industrialization mechanism that is responsible for the fast growth of production, mainly outside the planned segment.
This paper presents an econometric model that can be used for an analysis of the quantitative expansion of the non-planned activities and the qualitative changes of the state-owned enterprises.
Chen Pu, Joachim Frohn

Decision Theory and Statistics

Frontmatter

Information Value as a Metacriterion for Decision Rules Under Strict Uncertainty

Summary
This contribution proposes the notions of information structure and information value as tools for judging the rationality of decision rules in decision theory under uncertainty. Traditionally, sets of axioms which consider the ranking of alternatives or the choice set are used for this purpose. But some intuitively unfavourable criteria, for instance the maximax rule, are not rejected decidedly enough with the help of the known axioms alone. With the concept of information resistance, newly introduced here into the discussion, so-called “global decision rules” can be criticized in an adequate manner.
On the other hand it is pointed out that the concept of information value must sometimes be used with caution. For instance, A. Wald’s much esteemed maximin rule in general reacts to intuitively “worthless” information with a positive information value.
Franz Ferschl

Analysing Ellsberg’s Paradox by Means of Interval-Probability

Summary
The results reported by Ellsberg (1961) are often considered paradoxical. Indeed they reveal behaviour that conflicts with the sure-thing principle. An analysis employing modern tools produces two insights: 1) The experimental setting of Ellsberg can be adequately described only by means of interval-probability. 2) The validity of the sure-thing principle is restricted to those situations which can be sufficiently described by classical probability. In addition, Ellsberg’s results indicate a typology of behaviour when knowledge is given by interval-probability.
Kurt Weichselberger, Thomas Augustin

Some Robust and Adaptive Tests Versus F-Test for Several Samples

Summary
For testing the equality of c means, the application of the F-test depends on very restrictive assumptions such as normality and equal variances of the c populations. If these assumptions are not satisfied it is more appropriate to apply a robust version of the F-test. We consider the Welch test, a rank version of the Welch test, the trimmed Welch test and some nonparametric counterparts, each of which is very efficient for a special class of distributions. But usually the practising statistician has no clear idea of the underlying distribution. Therefore, an adaptive test should be applied which takes the given data into account. We compare the F-test with its robust and adaptive competitors under normality and nonnormality as well as under homoscedasticity and heteroscedasticity. The comparison refers to the level α and the power β of the tests and is carried out via Monte Carlo simulation. It turns out that the Welch test is the best one in the case of unequal variances; for equal variances, however, special rank tests are to be preferred. It is also shown that the adaptive test behaves well over a broad class of distributions.
Herbert Büning
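As an illustration of one of the competitors discussed in the summary, the following sketch implements Welch's test for c samples directly from the usual formula for the statistic and its approximate denominator degrees of freedom; the rank-based, trimmed and adaptive variants of the chapter are not reproduced here, and the example data are hypothetical.

```python
import numpy as np
from scipy.stats import f

def welch_anova(samples):
    """Welch's test for equality of c means under unequal variances.
    `samples` is a list of 1-D arrays; returns (statistic, p-value)."""
    k = len(samples)
    n = np.array([len(s) for s in samples])
    m = np.array([np.mean(s) for s in samples])
    v = np.array([np.var(s, ddof=1) for s in samples])
    w = n / v                                    # precision weights
    mw = np.sum(w * m) / np.sum(w)               # weighted grand mean
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    num = np.sum(w * (m - mw) ** 2) / (k - 1)
    den = 1 + 2 * (k - 2) * tmp / (k**2 - 1)
    stat = num / den
    df2 = (k**2 - 1) / (3 * tmp)
    return stat, f.sf(stat, k - 1, df2)

# Hypothetical heteroscedastic samples: equal means, very unequal variances.
rng = np.random.default_rng(5)
groups = [rng.normal(0, s, n) for s, n in [(1, 20), (3, 15), (10, 25)]]
print(welch_anova(groups))
```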

A Sequential Test for the Comparison of the Precision of Two Measurement Methods

Summary
In order to compare the precision (expressed by the variance) of a new measurement method with that of a traditional measurement method, the null hypothesis of equality of the two variances can be tested with a one-sided F-test. A sequential Wald test is proposed as an alternative, and its operating characteristic function is investigated by means of a simulation study. This study shows that the test can be ended at twice the sample size of the corresponding F-test without a substantial change of the operating characteristic function.
Peter-Th. Wilrich

Backmatter
