2010 | Book

Advances in Degradation Modeling

Applications to Reliability, Survival Analysis, and Finance

Edited by: M.S. Nikulin, Nikolaos Limnios, N. Balakrishnan, Waltraud Kahle, Catherine Huber-Carol

Publisher: Birkhäuser Boston

Book Series: Statistics for Industry and Technology

About this book

William Q. Meeker has made pioneering and phenomenal contributions to the general area of reliability and, in particular, to the topics of degradation and accelerated testing. His research publications and the numerous citations he has received over the past three decades provide an ample testimony to this fact. Statistical methods have become critical in analyzing reliability and survival data. Highly reliable products have necessitated the development of accelerated testing and degradation models and their analyses. This volume has been put together in order to (i) review some of the recent advances on accelerated testing and degradation, (ii) highlight some new results and discuss their applications, and (iii) suggest possible directions for future research in these topics. With these specific goals in mind, many authors were invited to write a chapter for this volume. These authors are not only experts in lifetime data analysis, but also form a representative group from former students, colleagues, and other close professional associates of William Meeker. All contributions have been peer reviewed and organized into 26 chapters. For the convenience of readers, the volume has been divided into the following six parts:

• Review, Tutorials, and Perspective
• Shock Models
• Degradation Models
• Reliability Estimation and ALT
• Survival Function Estimation
• Competing Risk and Chaotic Systems

It needs to be emphasized here that this volume is not a proceedings, but a carefully and deliberately planned volume comprising chapters consistent with the editorial goals and purposes mentioned above. Our thanks go to all the authors who have contributed to this volume. Thanks are also due to Mrs. Debbie Iscoe for the excellent typesetting of the entire volume. Special thanks go to Ms. Regina Gorenshteyn and Mr. Tom Grasso (Editor, Birkhäuser, Boston) for their interest and support for this project.

Table of Contents

Frontmatter

Review, Tutorials, and Perspective

Frontmatter
1. Trends in the Statistical Assessment of Reliability
Abstract
Changes in technology have had and will continue to have a strong effect on changes in the area of statistical assessment of reliability data. These changes include higher levels of integration in electronics, improvements in measurement technology and the deployment of sensors and smart chips into more products, dramatically improved computing power and storage technology, and the development of new, powerful statistical methods for graphics, inference, and experimental design and reliability test planning. This chapter traces some of the history of the development of statistical methods for reliability assessment and makes some predictions about the future.
William Q. Meeker
2. Degradation Processes: An Overview
Abstract
In this chapter we survey research on different types of degradation processes. Over time, a device is subject to degradation, generally described as an increasing stochastic process. The device has a threshold Y, and it fails once the degradation level exceeds the threshold. Life distribution properties and maintenance policies for such devices are discussed.
Mohamed Abdel-Hameed
3. Defect Initiation, Growth, and Failure – A General Statistical Model and Data Analyses
Abstract
This chapter describes a versatile new model for defect initiation and growth leading to specimen failure. This model readily yields (1) the distribution of time to defect initiation and (2) the distribution of time to failure (when a specified defect size is reached). The model can be readily fitted to defect size and failure data from lab tests or actual service, when defect size is observed once for each specimen. This can be done with software for fitting regression models to censored data. The model and fitting methods are illustrated with an application to dendrite growth on circuit boards. The model and data analyses are also suited to non-engineering applications, for example, initiation and growth of tumors.
Wayne B. Nelson
4. Properties of Lifetime Estimators Based on Warranty Data Consisting only of Failures
Abstract
Nowadays, many consumer durable goods, such as automobiles, appliances, and photocopiers, are sold with a manufacturer's warranty to insure product quality and reliability. Warranty claims contain a large amount of useful information about the reliability of the products, such as failure times, usage, and failure modes. For engineering purposes, usage is more relevant, and hence modeling usage accumulation is of great interest in reliability analysis using warranty data. Such models are needed by manufacturers to evaluate reliability, predict warranty costs, and assess design modifications and customer satisfaction. Usually, warranty data consist of failure information only; non-failure information is not obtainable, which makes reliability analysis difficult. The sales data are also important for reliability analysis, as they contain the time-in-service in calendar timescale for each non-failed product during the warranty plan. This chapter discusses maximum likelihood estimation of lifetime parameters using warranty data along with sales data, and examines the precision of the estimators via the asymptotic variances obtained from the Fisher information matrix. The practical consequence of this finding is that the proposed method produces estimators of the lifetime parameters with good precision for large sales volumes.
Kazuyuki Suzuki, Watalu Yamamoto, Takashi Hara, Md. Mesbahul Alam

Shock Models

Frontmatter
5. Shock Models
Abstract
The standard assumptions in shock models are that the failure (of a system) is related either to the cumulative effect of a (large) number of shocks or is caused by a single shock that exceeds a certain critical level. An extension is to consider a mixture, that is, a system breaks down either because of one large shock or as a result of many smaller ones, whichever occurs first. In this chapter we survey our results on this problem as well as on a further generalization in which a shock can partly harm the system, which lowers the critical boundary for the following shocks to be fatal. For the cumulative model we also deal with the case in which only the sum of the most recent shocks causes a system failure. In addition, we consider the combination of both models via some link functions and briefly discuss an extension to shock models based on a Markovian model.
Allan Gut, Jürg Hüsler
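A minimal simulation sketch of the mixed model (failure at the first shock exceeding a critical level, or at the first crossing of a cumulative boundary, whichever comes first) may help fix ideas. The exponential shock distribution and all numerical values below are illustrative assumptions, not taken from the chapter:

```python
import random

def shocks_to_failure(rng, big=4.0, cum_limit=20.0, max_shocks=100_000):
    """Mixed shock model: i.i.d. exponential(mean 1) shocks arrive one by
    one; the system fails at the first shock that either exceeds the
    critical size `big` (extreme-shock part) or pushes the running sum
    past `cum_limit` (cumulative part), whichever happens first."""
    total = 0.0
    for n in range(1, max_shocks + 1):
        x = rng.expovariate(1.0)
        total += x
        if x > big or total > cum_limit:
            return n
    return max_shocks

rng = random.Random(42)
ns = [shocks_to_failure(rng) for _ in range(5000)]
print("mean number of shocks to failure:", sum(ns) / len(ns))
```

With these values the cumulative route alone would need about 20 shocks on average, so a mean noticeably below 20 reflects the contribution of early extreme shocks.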
6. Parametric Shock Models
Abstract
For analyzing the reliability of technical systems it is often important to investigate degradation processes. In this chapter we describe a degradation process (Z_t) which is assumed to be generated by a position-dependent marking of a doubly stochastic Poisson process. Some characteristics of this process are described. For general and special cases, the probability that the damage process does not cross a certain threshold before a time t is calculated. For some parametric intensity kernels of the corresponding marked point process, maximum likelihood estimates are determined. Censored observations are taken into account. Furthermore, the large sample case is considered. Moment estimates are found and compared with the maximum likelihood estimates.
Waltraud Kahle, Heide Wendt
7. Poisson Approximation of Processes with Locally Independent Increments and Semi-Markov Switching – Toward Application in Reliability
Abstract
In this chapter, the weak convergence of additive functionals of processes with locally independent increments and with semi-Markov switching in the scheme of Poisson approximation is investigated. Singular perturbation problem for the compensating operator of the extended Markov renewal process is used to prove the relative compactness. This approach can be used in applications and especially in shock and degradation in random environment arising in reliability.
V.S. Koroliuk, N. Limnios, I.V. Samoilenko
8. On Some Shock Models of Degradation
Abstract
We discuss several shock models that describe univariate degradation. Poisson and renewal-type point processes of shocks are considered. We assume that degradation (damage) caused by each shock is accumulated by an overall degradation characteristic. The failure occurs when this accumulated degradation reaches the defined boundary. Asymptotic properties for the shot noise shock process and for the “imperfect repair-type” shock process are considered. The combined model when each shock results either in termination of the process or in the corresponding increment of degradation is discussed. A simple explicit formula for the probability of failure in this case is analyzed. Possible generalizations are suggested.
Maxim Finkelstein, Ji Hwan Cha

Degradation Models

Frontmatter
9. The Wiener Process as a Degradation Model: Modeling and Parameter Estimation
Abstract
In this chapter we describe a simple degradation model based on the Wiener process. A failure occurs when the degradation reaches a given level for the first time. In this case, the time to failure is inverse Gaussian distributed. The parameters of the lifetime distribution can be estimated from observation of degradation only, from observation of failures, or from observation of both degradation increments and failure times. In the chapter, statistical methods for estimating the parameters of degradation processes for different data structures are developed and compared.
Waltraud Kahle, Axel Lehmann
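The first-passage setup described here is easy to sketch in a few lines: a Wiener process with drift hits a fixed level, and the hitting time is inverse Gaussian. The drift, volatility, and threshold values below are illustrative assumptions; the simulated mean hitting time should be close to the inverse Gaussian mean h/mu:

```python
import math
import random

def first_passage_time(rng, mu, sigma, h, dt=0.005, t_max=100.0):
    """Simulate one Wiener degradation path X(t) = mu*t + sigma*W(t)
    on a grid and return the first time it reaches the threshold h."""
    x, t = 0.0, 0.0
    while t < t_max:
        x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if x >= h:
            return t
    return t_max  # path censored at t_max (rare for these parameters)

rng = random.Random(1)
mu, sigma, h = 1.0, 0.5, 5.0
times = [first_passage_time(rng, mu, sigma, h) for _ in range(500)]
# Theory: the hitting time is inverse Gaussian with mean h/mu
# and variance h*sigma**2/mu**3.
print("empirical mean:", sum(times) / len(times), "theoretical:", h / mu)
```

The grid-based crossing check detects the passage slightly late, so a small positive bias relative to h/mu is expected for coarse dt.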
10. On the General Degradation Path Model: Review and Simulation
Abstract
In this chapter we review the estimation of the survival function based on the general path model with noise, as studied by Bagdonavičius and Nikulin (Lifetime Data Analysis 10: 65–81, 2004). We also discuss the nonparametric, semi-parametric, and parametric estimation of the survival function in a degradation model with multiple failure modes arising from ignoring measurement error, described in Bagdonavičius and Nikulin (Lifetime Data Analysis 10: 65–81, 2004), Bagdonavičius et al. (Communications in Statistics - Theory and Methods 34: 1771–1793, 2005), and Haghighi et al. (Proceedings of the Second International Conference on Accelerated Life Testing in Reliability and Quality Control, 2008), respectively. Finally, different simulation studies are conducted and the comparison of the proposed estimation methods is illustrated.
Firoozeh Haghighi, Nazanin Nooraee, Narges Nazeri Rad
11. A Closer Look at Degradation Models: Classical and Bayesian Approaches
Abstract
Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this chapter is to illustrate how degradation data can be modeled and analyzed by using “classical” and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheels degradation.
Marta A. Freitas, Thiago R. dos Santos, Magda C. Pires, Enrico A. Colosimo
12. Optimal Prophylaxis Policy Under Non-monotone Degradation
Abstract
A stationary server system with observable degradation is considered. An optimization problem for the choice of a time to begin a prophylactic repair of the system is investigated. The mathematical problem is to choose a Markov time on a random process which is optimal with respect to some criterion. A necessary condition for the time to be optimal determines a unique solution for monotone random processes. For non-monotone processes this necessary condition determines a set of Markov times which, if not empty, contains the time of the first fulfilment of the condition (the trivial solution). The question arises: is the trivial solution optimal? We show that the answer depends on the parameters of the process, mainly on the difference between the hazard rate and the rate of useful output of the system. For Markov processes the following alternative holds: either the trivial time is optimal or no optimal time exists.
S.S. Rasova, B.P. Harlamov
13. Deterioration Processes With Increasing Thresholds
Abstract
The present chapter derives the reliability functions and hazard functions, when the threshold for deterioration is an increasing function of time. Four cases are considered. Case I: The threshold is a step function with K jumps at known points. Case II: The threshold is a step function with K jumps, where the location of jumps are random, following a renewal process. Case III: The controlled case. The jumps occur after the first crossing of a control limit. Case IV: The threshold increases linearly. The theory is illustrated in all four cases with a Markovian deterioration process, i.e., a compound Poisson process with i.i.d. jumps following an exponential distribution.
S. Zacks
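Case IV above (a compound Poisson deterioration against a linearly increasing threshold) lends itself to a quick Monte Carlo sketch. The shock rate, jump mean, and threshold coefficients below are invented for illustration and are not values from the chapter:

```python
import random

def crosses_by(rng, t_end, lam, jump_mean, b0, b1):
    """One path of a compound Poisson deterioration process (shock rate
    lam, i.i.d. exponential jumps with mean jump_mean).  Returns True if
    the level exceeds the increasing threshold b(t) = b0 + b1*t at some
    shock epoch before t_end (Case IV: linearly increasing threshold)."""
    t, level = 0.0, 0.0
    while True:
        t += rng.expovariate(lam)                   # next shock arrival
        if t > t_end:
            return False                            # survived the interval
        level += rng.expovariate(1.0 / jump_mean)   # exponential damage jump
        if level > b0 + b1 * t:                     # threshold at shock time
            return True

rng = random.Random(7)
n = 5000
hits = sum(crosses_by(rng, t_end=10.0, lam=1.0, jump_mean=1.0,
                      b0=8.0, b1=0.5) for _ in range(n))
print("estimated failure probability by t = 10:", hits / n)
```

Because both the deterioration level and the threshold only matter at shock epochs, it suffices to check the crossing condition immediately after each jump.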
14. Failure Time Models Based on Degradation Processes
Abstract
Many failure mechanisms can be traced to underlying degradation processes and stochastically changing covariates. Degradation and covariate data can provide much reliability information additional to failure time data and are particularly useful when the application of traditional reliability models is limited due to rare failures of highly reliable items or due to items operating in dynamic environments. This chapter surveys some approaches to model the relationship between failure time data and degradation and covariate data. These models which reflect the dependency between system state and system reliability include threshold models and hazard-based models. In particular, we consider the class of degradation–threshold–shock models in which failure is due to the competing causes of degradation and trauma. For this class of reliability models we compute the survival function of the resulting failure time and derive the likelihood function for the joint observation of failure times and degradation data at discrete times. We consider a special class of degradation–threshold–shock models where degradation is modeled by a process with stationary independent increments and related to external covariates through a random timescale and extend this model class to repairable items by a marked point process approach. The proposed model class provides a rich conceptual framework for the study of degradation–failure issues.
Axel Lehmann
15. Degradation and Fuzzy Information
Abstract
In lifetime analysis, different kinds of uncertainty are present. Besides variability there is also imprecision of measurement results. This kind of uncertainty is best described by fuzzy models. Besides the fuzziness of degradation parameters there is also fuzziness of probability distributions. Using fuzzy models in combination with statistical models and degradation functions, more realistic lifetime analysis is possible.
R. Viertl
16. A New Perspective on Damage Accumulation, Marker Processes, and Weibull’s Distribution
Abstract
This chapter is in two interrelated parts. The two parts share the common feature that items fail when one random variable crosses a fixed or random threshold. Section II of this chapter pertains to Waloddi Weibull's model for material failure and the genesis of the famous distribution that bears his name. Section I of this chapter is more generic, because it is germane to the failure of biological entities as well. Its key feature is that a stochastic process viewpoint is adopted, and the hitting times of the process to a random threshold drive the failure phenomenon. By contrast, Weibull's classic work does not take a process point of view, but is grounded in fracture mechanics; thus it has a stronger connection with engineering science. Weibull's work has some errors and unrealistic assumptions. However, it provides a platform for a synergistic enhancement of the first part of this chapter. These enhancements remain to be explored.
Nozer D. Singpurwalla

Reliability Estimation and ALT

Frontmatter
17. Reliability Estimation of Mechanical Components Using Accelerated Life Testing Models
Abstract
This chapter presents an overview of using accelerated life testing (ALT) models for reliability estimation of mechanical components. The reliability is estimated by considering two test plans: a classical one, testing a sample under accelerated conditions only, and a second plan with previous accelerated damage. The principle of the test plan with previous accelerated damage is to test the sample under step-stress. In the beginning (until time N_1), the sample is tested under stress s_1 (accelerated testing: \(s_{1}>s_{0}\)); when the tested units have used many of their “resources,” the stress s_1 is replaced by the operating conditions s_0 (until the time N_2). Therefore, failure times under the accelerated conditions can be used to estimate the reliability function under operating conditions. The time transformation function is considered as log-linear and four types of estimation are studied: parametric, extended hazard regression (GPH), semi-parametric, and nonparametric models. The chapter is illustrated by a simulation example of ball bearing testing. The results are used to analyze and compare these estimation methods. The simulations have been performed both with censored data and without censoring, in order to examine the asymptotic behavior of the different estimates.
Fabrice Guérin, M. Barreau, A. Charki, A. Todoskoff, S. Cloupet, D. Bigaud
18. Reliability Estimation from Failure-Degradation Data with Covariates
Abstract
Models and estimation methods are presented for failure process parameters and various reliability characteristics, using censored multi-mode failure time-degradation data with covariates. The degradation data are assumed to be measured at discrete times. A modified likelihood function which uses predictors of the degradation process is defined. Examples of predictors when the degradation process is modelled by various stochastic processes are given.
V. Bagdonavičius, I. Masiulaitytė, M.S. Nikulin
19. Asymptotic Properties of Redundant Systems Reliability Estimators
Abstract
Nonparametric and parametric methods for estimating the reliability of redundant systems with “warm” standby units are given. Asymptotic properties of the estimators and asymptotic confidence intervals are obtained. The power of goodness-of-fit tests for finite samples is investigated by simulation.
V. Bagdonavičius, I. Masiulaitytė, M.S. Nikulin
20. An Approach to System Reliability Demonstration Based on Accelerated Test Results on Components
Abstract
Reliability demonstration is often mandatory for industries that need to prove the quality of their production. Given a reliability objective at the global system level, the basic use of standards such as MIL-HDBK-781 or IEC-60300 for reliability demonstration of each sub-system must be adjusted. Practitioners often have to fix high-level reliability goals (around 0.999) on each sub-system to guarantee that system reliability exceeds 0.8. Moreover, the confidence obtained for the overall reliability drops to inadmissible values because of the multiplicity of testing procedures. We propose an approach, based on k accelerated tests, to determine an optimized test time while ensuring that the reliability of a component system exceeds a goal value, avoiding the overly conservative method of multiplying the confidence levels.
Léo Gerville-Réache, Vincent Couallier

Survival Function Estimation

Frontmatter
21. Robust Versus Nonparametric Approaches and Survival Data Analysis
Abstract
W.Q. Meeker and L.A. Escobar's famous book Statistical Methods for Reliability Data [26] is a very well-known reference for all engineers and researchers who are interested in reliability problems. W.Q. Meeker has long experience in modeling and solving degradation problems with the most complex features, so it is a pleasure for me to participate in this volume in his honor. For a long time, survival data analysis and reliability studies walked parallel paths without much interpenetration. Nowadays, however, impelled by several people, among them Bagdonavicius and Nikulin [3, 4], one is aware of the multiple links between reliability and survival analysis, while still acknowledging some specificities. Many parametric models are available, as well as large classes of models such as accelerated models, mainly in use in reliability, and the extended Cox model [11], the favorite for survival data, which have parametric as well as semi-parametric versions. In recent years, there has been increasing interest in purely nonparametric approaches. Their advantage is that they are supposed to be able to adjust to any possible data set through a vast class of regular functions. The drawbacks are, first, that they generally lack easy interpretation for practical purposes and, second, that proving the required properties of the inference procedures, such as consistency, is not trivial when censoring and truncation are present (Huber et al. [18], [19]), which is often the case in the medical field. The result is that some proposed procedures are justified only through simulations, without any theoretical proof of their properties. What I would like to emphasize in this chapter is the need for a robust approach to survival data analysis, which means using robust procedures for flexible parametric models that remain valid when the observations follow a model close, but not exactly equal, to the assumed model [22].
This has always seemed to me a good compromise between a purely parametric approach and a purely nonparametric one. One of the most interesting chapters on robustness in survival analysis is the one by Kosorok et al. [24], which tackles the case of a possibly misspecified frailty model.
Catherine Huber
22. Modelling Recurrent Events for Repairable Systems Under Worse Than Old Assumption
Abstract
The objective of this work is to model the failure process of a repairable system under the “worse than old”, or harmful repairs, assumption. The proposed model is founded on the counting process probabilistic approach and interprets harmful repairs as the accumulation of failures on the same system. Contrary to virtual age models, the increase in the conditional intensity is induced by the number of previous repair actions rather than by time. The LEYP model is defined and some comparisons with existing imperfect repair models are given. The explicit form of the likelihood function is provided. A covariate-dependent model is defined in order to take into account the effect of internal or external factors, which may be constant or time dependent. After a description of the estimation procedure for left-truncated and right-censored data using a multiple-systems data set, we provide some useful formulae for predicting the number of failures in a future period. An application to data from the water distribution system of the city of Oslo (Norway) is given.
G. Babykina, V. Couallier
23. Survival Models for Step-Stress Experiments With Lagged Effects
Abstract
In this chapter, we consider models for experiments in which the stress levels are altered at intermediate stages during the exposure. These experiments, referred to as step-stress tests, belong to the class of accelerated models that are extensively used in reliability and life-testing applications. Models for step-stress tests have largely relied on the cumulative exposure model (CEM) discussed by Nelson. Unfortunately, the assumptions of the model are fairly restrictive and quite unreasonable for applications in survival analysis. In particular, under the CEM the hazard function has discontinuities at the points at which the stress levels are changed.
We introduce a new step-stress model where the hazard function is continuous. We consider a simple experiment with only two stress levels. The hazard function is assumed to be constant at the two stress levels and linear in the intermediate period. This model allows for a lag period before the effects of the change in stress are observed. Using this formulation in terms of the hazard function, we obtain the maximum likelihood estimators of the unknown parameters. A simple least squares-type procedure is also proposed that yields closed-form solutions for the underlying parameters. A Monte Carlo simulation study is performed to study the behavior of the estimators obtained by the two methods for different choices of sample sizes and parameter values. We analyze a real data set and show that the model provides an excellent fit.
N. Kannan, D. Kundu, N. Balakrishnan
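The continuous-hazard idea described above (constant hazard at each of the two stress levels, linear over the lag period) can be sketched directly. The rates and change points below are illustrative values chosen for the sketch, not parameters from the chapter:

```python
import math

def hazard(t, lam1, lam2, tau1, tau2):
    """Continuous step-stress hazard: constant lam1 up to tau1, a linear
    ramp to lam2 over the lag period [tau1, tau2], constant lam2 after."""
    if t <= tau1:
        return lam1
    if t <= tau2:
        return lam1 + (lam2 - lam1) * (t - tau1) / (tau2 - tau1)
    return lam2

def cum_hazard(t, lam1, lam2, tau1, tau2):
    """Closed-form integral of the piecewise hazard up to time t."""
    if t <= tau1:
        return lam1 * t
    if t <= tau2:
        u = t - tau1
        slope = (lam2 - lam1) / (tau2 - tau1)
        return lam1 * tau1 + lam1 * u + 0.5 * slope * u * u
    return (lam1 * tau1 + 0.5 * (lam1 + lam2) * (tau2 - tau1)
            + lam2 * (t - tau2))

lam1, lam2, tau1, tau2 = 0.1, 0.5, 5.0, 7.0
# Survival function S(t) = exp(-H(t)); the hazard is continuous at tau1
# and tau2, unlike under the cumulative exposure model.
survival = math.exp(-cum_hazard(8.0, lam1, lam2, tau1, tau2))
print("S(8) =", round(survival, 4))
```

The middle branch of the cumulative hazard is just the area under a trapezoid, which is what makes the least squares-type estimation mentioned above tractable in closed form.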
24. Estimation of Density on Censored Data
Abstract
We investigate the problem of approximating the MLE when an unknown density must be estimated from censored data. What should the accuracy of such an approximation be if we wish to preserve the accuracy of estimation?
V. Solev

Competing Risk and Chaotic Systems

Frontmatter
25. Toward a Test for Departure of a Trajectory from a Neighborhood of a Chaotic System
Abstract
Many real systems have been identified as potentially chaotic (e.g., trains of heart beats, records of multiple populations in an ecological system, weather and climate data). In such cases, when “healthy” is defined by the measured trajectory being confined to a “strange attractor” in a “phase space,” how can we define and measure degradation? The simplest course is to define degradation as occurring when the attractor changes. The earliest warning of such a change would be a trajectory leaving the attractor without an external forcing, but little seems to have been written on the problem of identifying whether a short trajectory belongs to a given attractor (at least to within a neighborhood defined by stochastic disturbances and uncertainty in the parameters of the defining equations).
This chapter explores the problem of testing short trajectories in the neighborhood of the Lorenz attractor. Although there are some conditions necessary for developing a working test, it turns out that the Lorenz attractor is perhaps one of the kindest examples to explore, being easily visualized as well as having significant work done on its statistical properties. Further work with more complex attractors is necessary before this approach can be considered general. However, the proposed test shows the potential of not only detecting degradation, but of being a general partial goodness of fit test for a short trajectory to a neighborhood of an attractor, even for high-dimensional attractors. In addition, the construction of the test suggests a dimension reduction scheme useful for exploring deficiencies in the attractor in describing the trajectory.
M. LuValle
26. Probability Plotting with Independent Competing Risks
Abstract
In this chapter, we discuss probability plotting for competing risk data in the presence of Type I or right censoring. The construction of probability plots is based on the linearization of the cumulative distribution function of the first-order statistic. We consider the case when risks are independent. We describe procedures for constructing pointwise confidence intervals based on large-sample properties of maximum likelihood estimators. The proposed method is compared with traditional probability plotting that assumes that causes of failure can be eliminated.
Francis G. Pascual, Christopher Gast
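The linearization idea can be illustrated for the simplest case of a single Weibull risk with complete data: the transformation ln(-ln(1 - F(t))) turns the Weibull CDF into a straight line in ln(t). The parameter values below are invented for the sketch, and the median-rank plotting positions are a standard convention rather than a specific of the chapter:

```python
import math
import random

# Simulate complete (uncensored) Weibull failure times via inverse CDF:
# T = eta * (-ln U)**(1/beta) for U ~ Uniform(0, 1).
rng = random.Random(3)
beta, eta = 2.0, 10.0
data = sorted(eta * (-math.log(rng.random())) ** (1.0 / beta)
              for _ in range(200))
n = len(data)

# Linearized coordinates: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta),
# with median-rank plotting positions for the ordered failure times.
xs = [math.log(t) for t in data]
ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
      for i in range(1, n + 1)]

# The least-squares slope of the plot recovers the shape parameter beta.
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print("estimated Weibull shape:", round(slope, 2))
```

With competing risks, the same linearization is applied to the CDF of the first-order statistic rather than to a single marginal, which is the construction the chapter develops.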
Backmatter
Metadata
Title
Advances in Degradation Modeling
Edited by
M.S. Nikulin
Nikolaos Limnios
N. Balakrishnan
Waltraud Kahle
Catherine Huber-Carol
Copyright year
2010
Publisher
Birkhäuser Boston
Electronic ISBN
978-0-8176-4924-1
Print ISBN
978-0-8176-4923-4
DOI
https://doi.org/10.1007/978-0-8176-4924-1