
2012 | Book

Regression Analysis Under A Priori Parameter Restrictions


About this book

This monograph focuses on the construction of regression models with linear and nonlinear inequality constraints from a theoretical point of view. Unlike previous publications, this volume analyses the properties of regression with inequality constraints, investigating the flexibility of such constraints and their ability to adapt in the presence of additional a priori information. The implementation of inequality constraints improves the accuracy of models and decreases the likelihood of errors. Based on the theoretical results obtained, a computational technique for estimation and prediction problems is suggested. This approach lends itself to numerous applications in various practical problems, several of which are discussed in detail. The book is a useful resource for graduate students, PhD students, and researchers who specialize in applied statistics and optimization. It may also be useful to specialists in other branches of applied mathematics, technology, econometrics and finance.

Table of Contents

Frontmatter
Chapter 1. Estimation of Regression Model Parameters with Specific Constraints
Abstract
Consider the regression
$$y_{t} = \tilde{f}(\mathbf{x}_{t},\alpha^{0}) + \epsilon_{t},\quad t = 1,2,\ldots, \qquad (1.1)$$
where $y_{t} \in \mathbb{R}^{1}$ is the dependent variable, $\mathbf{x}_{t} \in \mathbb{R}^{q}$ is an argument (regressor), $\alpha^{0} \in \mathbb{R}^{n}$ is the true (unknown) regression parameter, $\tilde{f}(\mathbf{x}_{t},\alpha)$ is some (nonlinear) function of $\alpha$, $\epsilon_{t}$ is the noise, and $t$ is the observation number.
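As a purely illustrative sketch (not taken from the book; the model function, data, and non-negativity constraints are assumptions made here), the following Python snippet estimates α in a model of the form (1.1) by least squares under inequality constraints, using scipy.optimize.minimize:

```python
# Minimal sketch: least-squares estimation of alpha in y_t = f(x_t, alpha) + eps_t
# subject to inequality constraints on alpha.  The model, data and constraints
# below are illustrative assumptions, not taken from the book.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def f(x, alpha):
    # example nonlinear response: alpha[0] * exp(-alpha[1] * x)
    return alpha[0] * np.exp(-alpha[1] * x)

# simulated observations
x = np.linspace(0.0, 5.0, 100)
alpha_true = np.array([2.0, 0.7])
y = f(x, alpha_true) + 0.1 * rng.standard_normal(x.size)

def sse(alpha):
    # sum of squared residuals
    r = y - f(x, alpha)
    return r @ r

# a priori restrictions: alpha_1 >= 0 and alpha_2 >= 0
constraints = [{"type": "ineq", "fun": lambda a: a[0]},
               {"type": "ineq", "fun": lambda a: a[1]}]

result = minimize(sse, x0=np.array([1.0, 1.0]), method="SLSQP",
                  constraints=constraints)
print("constrained LS estimate of alpha:", result.x)
```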
Pavel S. Knopov, Arnold S. Korkhin
Chapter 2. Asymptotic Properties of Parameters in Nonlinear Regression Models
Abstract
In this chapter, we investigate some regression models with unknown coefficients. We assume that the parametric set of unknown parameters is closed and, generally speaking, unbounded. The case of open sets is easier to study, because in most cases the asymptotic distribution of the estimates is normal. This is not always the case when the constraint sets are compact. Throughout, we consider discrete-time observations. It is known that observation errors taken at different times can be dependent. We do not consider the continuous-time version here, although in that case many of the statements below remain valid.
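A small Monte Carlo sketch can illustrate the point about non-normality when a constraint is active at the true parameter; the scalar example below (estimating a mean under the restriction θ ≥ 0 with true θ = 0) is an assumption chosen for illustration and is not one of the book's models:

```python
# Monte Carlo sketch (illustrative assumption, not the book's construction):
# under the restriction theta >= 0 with true theta = 0, the constrained
# estimate max(sample_mean, 0) is not asymptotically normal -- roughly half
# of its mass sits exactly at the boundary.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 200, 10_000

samples = rng.standard_normal((reps, n))        # observations with true theta = 0
unconstrained = samples.mean(axis=1)            # ordinary estimate of the mean
constrained = np.maximum(unconstrained, 0.0)    # estimate under theta >= 0

print("share of constrained estimates exactly at the boundary:",
      np.mean(constrained == 0.0))              # close to 0.5
print("mean of the constrained estimate (positive, showing the bias):",
      constrained.mean())
```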
Pavel S. Knopov, Arnold S. Korkhin
Chapter 3. Method of Empirical Means in Nonlinear Regression and Stochastic Optimization Models
Abstract
In stochastic optimization and identification problems (Ermoliev and Wets 1988; Pflug 1996), it is not always possible to find the explicit extremum of the expectation of some random function. One of the methods for solving this problem is the method of empirical means, which consists in approximating the original cost function by its empirical estimate, for which one can solve the corresponding optimization problem. Moreover, many problems in mathematical statistics (for example, estimation of unknown parameters by the least squares, least moduli, or maximum likelihood methods) can be formulated as special stochastic programming problems with specific constraints on the unknown parameters, which stresses the close relation between stochastic programming and estimation theory. In such problems the distributions of the random variables or processes are often unknown, but their realizations are available. Therefore, one approach to solving such problems consists in replacing the unknown distributions with empirical distributions, and the corresponding mathematical expectations with their empirical means. The difficulty is in finding conditions under which the approximating problem converges, in some probabilistic sense, to the initial one. We discussed this briefly in Sect. 2.1. Convergence conditions depend essentially on the cost function, the probabilistic properties of the random observations, the metric properties of the space in which convergence is investigated, the a priori constraints on the unknown parameters, etc. In the terminology of statistical decision theory, these problems are closely related to the asymptotic properties of estimates of unknown parameters, i.e., their consistency, asymptotic distribution, rate of convergence, etc.
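A minimal sketch of the idea, with an invented quadratic loss and simulated observations (neither comes from the book), might look as follows: the expectation in the cost is replaced by an empirical mean over the observed realizations, and the resulting problem is minimized under an a priori constraint:

```python
# Sketch of the method of empirical means (illustrative objective and data,
# not the book's examples): the cost E[F(theta, xi)] is replaced by its
# empirical counterpart (1/N) * sum_t F(theta, xi_t), which is then minimized
# under the a priori constraint theta >= 0.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# observations xi_1, ..., xi_N of a random variable with unknown distribution
xi = rng.gamma(shape=2.0, scale=1.5, size=500)

def F(theta, xi_t):
    # per-observation loss; with quadratic loss the minimizer of E[F]
    # is the (unknown) expectation of xi
    return (xi_t - theta) ** 2

def empirical_mean_objective(theta):
    return np.mean(F(theta[0], xi))

res = minimize(empirical_mean_objective, x0=np.array([0.0]),
               bounds=[(0.0, None)], method="L-BFGS-B")
print("empirical-means estimate:", res.x[0], " sample mean:", xi.mean())
```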
Pavel S. Knopov, Arnold S. Korkhin
Chapter 4. Determination of Accuracy of Estimation of Regression Parameters Under Inequality Constraints
Abstract
This chapter is devoted to the accuracy of estimation of regression parameters under inequality constraints. In Sects. 4.2 and 4.3 we construct a truncated estimate of the matrix of mean square errors (m.s.e.) of the estimate of a multidimensional regression parameter. In this construction, inactive constraints are not taken into account. Another approach, which takes all constraints into account, is considered in Sects. 4.4–4.7.
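For illustration only, the m.s.e. matrix of a constrained estimate can be approximated by straightforward Monte Carlo simulation; the sketch below does exactly that for non-negative least squares and is not the truncated estimate constructed in Sects. 4.2 and 4.3:

```python
# Monte Carlo approximation (illustrative, not the truncated estimate of
# Sects. 4.2-4.3) of the m.s.e. matrix of a linear-regression estimate
# computed under the inequality constraints alpha >= 0.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)

n, reps = 50, 2_000
X = rng.standard_normal((n, 2))
alpha_true = np.array([0.5, 0.0])        # second component lies on the boundary

errors = np.zeros((reps, 2))
for r in range(reps):
    y = X @ alpha_true + 0.5 * rng.standard_normal(n)
    alpha_hat, _ = nnls(X, y)            # least squares under alpha >= 0
    errors[r] = alpha_hat - alpha_true

mse_matrix = errors.T @ errors / reps    # empirical m.s.e. matrix of the estimate
print(mse_matrix)
```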
Pavel S. Knopov, Arnold S. Korkhin
Chapter 5. Asymptotic Properties of Recurrent Estimates of Parameters of Nonlinear Regression with Constraints
Abstract
In Sect. 1.2 we consider iterative procedures for estimating multidimensional nonlinear regression parameters under inequality constraints. Here we investigate the asymptotic properties of the iterative approximations of the estimates obtained at each iteration. Moreover, we allow for the situation in which the multidimensional regressor has a trend, for example, it may increase to infinity. We start with a special case in which constraints are absent.
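The following sketch shows one simple recurrent scheme of this kind, a projected stochastic-gradient recursion on simulated data; it is an illustrative stand-in, not the iterative procedures of Sect. 1.2:

```python
# Minimal sketch of a recurrent estimate with a projection step (an
# illustrative projected stochastic-gradient recursion, not the procedures
# of Sect. 1.2): after each new observation (x_t, y_t) the current estimate
# is corrected and projected back onto the constraint set {alpha >= 0}.
import numpy as np

rng = np.random.default_rng(4)
alpha_true = np.array([1.5, 0.2])
alpha = np.zeros(2)                                 # starting approximation

for t in range(1, 5001):
    x_t = rng.standard_normal(2)                    # regressor at time t
    y_t = x_t @ alpha_true + 0.3 * rng.standard_normal()
    gradient = -2.0 * (y_t - x_t @ alpha) * x_t     # gradient of the squared residual
    alpha = alpha - (0.5 / t) * gradient            # recurrent correction
    alpha = np.maximum(alpha, 0.0)                  # projection onto {alpha >= 0}

print("recurrent estimate after 5000 observations:", alpha)
```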
Pavel S. Knopov, Arnold S. Korkhin
Chapter 6. Prediction of Linear Regression Evaluated Subject to Inequality Constraints on Parameters
Abstract
In this chapter we investigate statistical properties of prediction in regression with constraints. This problem is rather complicated, since even in the case of linear regression with linear constraints the estimation of parameters is a nonlinear problem. In particular, the problem of interval prediction, i.e., of finding a confidence interval for the predicted value, is highly non-trivial. In Sect. 6.1 the interval prediction is constructed based on the distribution function of the prediction error, whose parameters are the true regression parameter $\alpha^0$ and the noise variance $\sigma^2$. Section 6.2 is devoted to interval prediction based on the conditional distribution function of the prediction error.
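As a rough illustration of interval prediction under constraints, the sketch below builds a parametric-bootstrap prediction interval for a linear regression estimated under α ≥ 0; this is an assumed simplification, not the constructions of Sects. 6.1 and 6.2, which rely on the distribution of the prediction error:

```python
# Rough simulation sketch of interval prediction for a linear regression
# estimated under alpha >= 0 (an illustrative parametric-bootstrap interval,
# not the constructions of Sects. 6.1-6.2).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)

n = 80
X = np.column_stack([np.ones(n), rng.uniform(0, 3, n)])
alpha_true = np.array([1.0, 0.4])
sigma = 0.5
y = X @ alpha_true + sigma * rng.standard_normal(n)

alpha_hat, _ = nnls(X, y)                          # constrained estimate
resid_var = np.sum((y - X @ alpha_hat) ** 2) / (n - 2)

x_new = np.array([1.0, 2.5])                       # point at which we predict y
sims = []
for _ in range(2_000):
    y_star = X @ alpha_hat + np.sqrt(resid_var) * rng.standard_normal(n)
    a_star, _ = nnls(X, y_star)                    # re-estimate under the constraint
    sims.append(x_new @ a_star + np.sqrt(resid_var) * rng.standard_normal())

lo, hi = np.percentile(sims, [2.5, 97.5])
print("point prediction:", x_new @ alpha_hat, " approx. 95% interval:", (lo, hi))
```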
Pavel S. Knopov, Arnold S. Korkhin
Backmatter
Metadata
Title
Regression Analysis Under A Priori Parameter Restrictions
Authors
Pavel S. Knopov
Arnold S. Korkhin
Copyright Year
2012
Publisher
Springer New York
Electronic ISBN
978-1-4614-0574-0
Print ISBN
978-1-4614-0573-3
DOI
https://doi.org/10.1007/978-1-4614-0574-0
