About this book

This comprehensive book, rich with applications, offers a quantitative framework for the analysis of the various capture-recapture models for open animal populations, while also addressing associated computational methods.

The state of our wildlife populations provides a litmus test for the state of our environment, especially in light of global warming and the increasing pollution of our land, seas, and air. In addition to monitoring our food resources such as fisheries, we need to protect endangered species from the effects of human activities (e.g., rhinos, whales, or encroachments on the habitat of orangutans). Pests must be controlled, whether insects or viruses, and we need to cope with growing feral populations such as opossums, rabbits, and pigs.

Accordingly, we need to obtain information about a given population's dynamics, concerning e.g. mortality, birth, growth, breeding, sex, and migration, and determine whether the respective population is increasing, static, or declining. There are many methods for obtaining population information, but the most useful (and most work-intensive) is generically known as "capture-recapture," where we mark or tag a representative sample of individuals from the population and follow that sample over time using recaptures, resightings, or dead recoveries. Marks can be natural, such as stripes, fin profiles, and even DNA; or artificial, such as spots on insects. Attached tags can, for example, be simple bands or streamers, or more sophisticated variants such as radio and sonic transmitters.

To estimate population parameters, sophisticated mathematical models have been devised on the basis of recapture information, along with computer packages to fit them. This book addresses the analysis of such models. It is primarily intended for ecologists and wildlife managers who wish to apply the methods to the types of problems discussed above, though it will also benefit researchers and graduate students in ecology. Familiarity with basic statistical concepts is essential.

Table of Contents

Frontmatter

Chapter 1. A Brief History of Capture–Recapture

Abstract
Capture–recapture methods have a long history, and in this introductory chapter we begin by briefly describing some of the defining research papers up to about 1950. The rest of the book takes us up to the beginning of 2019. Model building plays an essential part in this book, and some basic ideas are described relating to fixed and random sample sizes. Classical frequentist and Bayesian models are briefly introduced using a simple two-sample capture–recapture model as an example. The chapter concludes with reviews of capture–recapture for both open and closed populations and extensions to epidemiology, site-occupancy models, evolutionary ecology, and spatial and state-space models.
George A. F. Seber, Matthew R. Schofield
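
For orientation, the simple two-sample model mentioned in the Chapter 1 abstract underlies the classical Lincoln–Petersen estimator: if n_1 animals are marked and released and a second sample of n_2 animals contains m_2 marked ones, the population size N can be estimated as follows. These are the standard textbook forms, shown here purely as a worked illustration rather than reproduced from the chapter.

```latex
\hat{N} = \frac{n_1 n_2}{m_2},
\qquad
\hat{N}^{*} = \frac{(n_1 + 1)(n_2 + 1)}{m_2 + 1} - 1
\quad \text{(Chapman's bias-adjusted form)}
```

For example, with n_1 = 200, n_2 = 150, and m_2 = 30, the first form gives an estimate of 1000 animals.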

Chapter 2. Tagging Methods and Tag Loss

Abstract
With new developments in technology such as cameras and radio telemetry, we describe a wide variety of tags and markers used for uniquely identifying animals. Examples are various types of attached tags, natural markers on animals (e.g., tiger stripes, tail fins), radio tags, PIT tags that use an integrated circuit chip, underwater acoustic tags, genetic markers, next-of-kin information, and trace methods that avoid capture, such as cameras. We consider in detail models for estimating tag loss using different kinds of tag combinations in both a discrete and an instantaneous modeling framework.
George A. F. Seber, Matthew R. Schofield

Chapter 3. Tag Returns from Dead Animals

Abstract
This scenario arises when tagged animals die through natural mortality and possibly exploitation (e.g., hunting and fisheries), and their tags are recovered. A wide variety of models for both exploited and unexploited populations are considered, depending on what underlying population assumptions are made, the first being time dependence for both the survival and tag-recovery probabilities. In the exploited case, reward and solicited tags are introduced, with commercial fisheries having a special mention. Instantaneous models are included based on Poisson processes and, later, catch-age data. Departures from the underlying assumptions are considered. In endeavoring to separate estimates of natural and exploitation mortality, three types of hypotheses are considered, one example being the additive hypothesis. Other special cases are Ricker's two-release method, a model with constant survival and time-dependent recovery probabilities, and various age-specific and other models. Departures from the assumptions such as heterogeneity lead to the methods of mixtures and random effects.
George A. F. Seber, Matthew R. Schofield
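
As a standard illustration of the instantaneous-rate formulation mentioned in the Chapter 3 abstract (generic notation, not necessarily the chapter's own parameterization), suppose natural and fishing mortality act as competing Poisson processes with instantaneous rates M and F, so that the total rate is Z = M + F. Over one year,

```latex
S = e^{-Z} = e^{-(M+F)} \quad \text{(annual survival)},
\qquad
u = \frac{F}{Z}\,\bigl(1 - e^{-Z}\bigr) \quad \text{(exploitation rate)} .
```

Under an additive hypothesis, an increase in F lowers survival directly through Z = M + F; the chapter also considers hypotheses in which the two mortality sources do not combine so simply.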

Chapter 4. Using Releases and Resightings

Abstract
Instead of dead recoveries, we can obtain recapture information through simply resighting live tagged individuals using a time-specific model. This was described by Richard Cormack in 1964. Underlying assumptions are considered and tests are provided.
George A. F. Seber, Matthew R. Schofield

Chapter 5. Mark–Recapture: Basic Models

Abstract
We consider two basic live-recapture models: the CJS model (after Cormack–Jolly–Seber), which models the tagged data, and the JS model (after Jolly and Seber in Biometrika, 52:225–247, 1965), which also includes the untagged data. Likelihood methods as originally used are described, while Bayesian and random-effects methods have since been developed for the CJS model using logistic transformations of the parameters. Various goodness-of-fit tests, contingency table tests, and score tests are described for testing underlying assumptions and model fitting. Extensions we consider involve batch methods, utilizing various kinds of covariate information (including missing values), and Bayesian covariate methods. Other techniques include using splines for logistic transformations and the EM algorithm for the CJS model. We consider the important development of a super-population approach that provides a better framework for dealing with hierarchical models and extensions to groups.
George A. F. Seber, Matthew R. Schofield
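
To make the structure of the CJS likelihood concrete, here is a minimal sketch in Python, not taken from the book and assuming a simple time-dependent parameterization: conditional on first capture, each capture history contributes survival and recapture terms between its first and last sightings, multiplied by the probability of never being seen again afterwards.

```python
import numpy as np

def cjs_log_likelihood(histories, phi, p):
    """CJS log-likelihood, conditional on first capture.

    histories : (n_animals, T) array of 0/1 capture indicators.
    phi       : length T-1 array; phi[t] = P(survive from occasion t to t+1).
    p         : length T-1 array; p[t] = P(recaptured at occasion t+1 | alive).
    (This indexing convention is an assumption made for the sketch.)
    """
    histories = np.asarray(histories)
    T = histories.shape[1]

    # chi[t] = P(never seen again after occasion t, given alive at t)
    chi = np.ones(T)
    for t in range(T - 2, -1, -1):
        chi[t] = (1.0 - phi[t]) + phi[t] * (1.0 - p[t]) * chi[t + 1]

    loglik = 0.0
    for h in histories:
        seen = np.flatnonzero(h)
        first, last = seen[0], seen[-1]
        for t in range(first, last):
            loglik += np.log(phi[t])                        # survived t -> t+1
            loglik += np.log(p[t]) if h[t + 1] else np.log(1.0 - p[t])
        loglik += np.log(chi[last])                         # never seen again
    return loglik

# Tiny illustrative (made-up) data set: 3 occasions, 4 capture histories.
hist = [[1, 1, 1], [1, 0, 1], [1, 1, 0], [1, 0, 0]]
print(cjs_log_likelihood(hist, phi=np.array([0.8, 0.8]), p=np.array([0.6, 0.6])))
```

In practice the parameters would be estimated by maximizing this function (often after logistic transformation, as the abstract notes) rather than fixed as they are here.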

Chapter 6. Multiple Recaptures: Further Methods

Abstract
In the previous chapter, we introduced some models that have formed a historical basis for developments in capture–recapture, along with some extensions. The subject has now taken off in a number of different directions that will be considered in this and the following chapters. Although the split with Chap. 5 is somewhat arbitrary, we begin this chapter with a useful concept in which a previous model can be reversed, or read backward, to enable one to consider recruitment. Then follows a powerful and now very popular technique that uses hidden or latent variables, where observed values help us estimate hidden quantities. The method of estimating equations, originally developed for closed populations, is considered next, along with diversions into the use of time series and the modeling of stopover times.
George A. F. Seber, Matthew R. Schofield

Chapter 7. Departures from Model Assumptions

Abstract
This topic needs particular emphasis in research. Departures considered include the initial effect of tagging, tag-loss models (including telemetry data), heterogeneity, and catchability dependence, along with appropriate test procedures. Mixture models can be used here, together with incorporating age dependency in the JS (Jolly–Seber) model. State-space models, which later have a large chapter of their own, are incorporated here to deal with heterogeneity. Various types of migration can also affect underlying assumptions, such as random immigration and transients (those just passing through as opposed to residents). Over-dispersion, which affects parameter estimates, is the last topic.
George A. F. Seber, Matthew R. Schofield

Chapter 8. Combined Data Models

Abstract
In previous chapters, we have considered three methods of obtaining recapture data: dead recoveries, resightings, and live recaptures. As the models for all three methods have a similar structure, apart from different definitions of the population parameters, it is not surprising that we can combine the three possible pairs of methods, or all three, into combined models. In this chapter, all four possible models are considered in detail by redefining the parameters. Ken Burnham, in 1993, combined live recaptures with dead recoveries and incorporated either permanent or random emigration into the model. Goodness-of-fit tests are given, and we describe extensions to the model to allow for such things as age, time, age plus time, delayed recoveries, and covariates (including missing values). Bayesian models, hidden Markov models, the utilization of radio tags, and continuous models that focus on instantaneous natural and fishing mortality rates are described in considerable detail. Another combined pair of methods uses recaptures and resightings, with losses that provide some dead individuals. Richard Barker, in 1997, developed such a complex model, aimed particularly at fish angling. Random emigration was included as a special case, along with goodness-of-fit tests for it. Finally, we consider the model combining all three methods of data collection, which can incorporate various forms of migration.
George A. F. Seber, Matthew R. Schofield

Chapter 9. Further Bayesian and Monte Carlo Recapture Methods

Abstract
Although Bayesian methods are sprinkled throughout some of the previous chapters, we now give the topic our full attention and add some extensions. Their particular advantage arises from being able to apply Markov chain Monte Carlo techniques, along with so-called reversible-jump methods, to sample from posterior distributions. We then consider modeling associations between parameters, beginning with survival and birth (emergence), and then extending to deal with density dependence, covariates, and migration. Random effects can be better dealt with using a Bayesian model and Markov chain Monte Carlo simulations for the CJS (Cormack–Jolly–Seber) model. We finally describe a general method gaining in popularity, called data augmentation, which is mentioned again in later chapters.
George A. F. Seber, Matthew R. Schofield

Chapter 10. Log-Linear Models for Multiple Recaptures

Abstract
This short stand-alone chapter is introduced mainly for historical reasons, though it has some interesting ideas and has had a resurgence for closed populations along with some new R software. For open and closed populations, the method consists of taking logarithms of the expected values of random variables, and then applying a two-way analysis of variance model where main effects and interactions are given biological interpretations. This is in contrast to applying logistic transformations to the population parameters. Residual plots are described for detecting heterogeneity and for pinpointing the occurrence of any temporary emigration and the presence of transients. The problem of fitting sub-models is discussed.
George A. F. Seber, Matthew R. Schofield
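
As a minimal sketch of the log-linear idea described in the abstract above, using a generic two-occasion illustration rather than the chapter's own notation, let n_{ij} be the number of animals with capture indicator i on occasion 1 and j on occasion 2. The logarithm of each expected count is modeled with main effects and an interaction, as in a two-way analysis of variance:

```latex
\log \operatorname{E}[n_{ij}] = \mu + \alpha_i + \beta_j + \gamma_{ij},
\qquad i, j \in \{0, 1\} .
```

The main effects \alpha_i and \beta_j act as occasion (capture) effects, while the interaction \gamma_{ij}, if included, is given a biological interpretation such as dependence between captures; the unobserved cell n_{00} (animals never caught) is typically projected from the fitted model.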

Chapter 11. Combining Open and Closed Models

Abstract
A range of models for closed populations are described that allow for such things as heterogeneity and variable catchability, which are difficult to deal with in open populations. Pollock, in 1982, introduced the simple but elegant idea of combining models for both closed and open populations (the so-called "robust" design) by dividing the study period into primary and secondary periods, where the primary periods are well spaced in time, but within each primary period there is a series of short secondary periods. Each primary period is assumed to be of short enough duration for the population to be treated as closed within it. This approach has many advantages; for example, there is a variety of closed-population models that can be used. Also, recruitment can be separated into immigration and birth with at least two age classes, and various types of migration can be incorporated. A number of versions of the robust design are described in detail, and recaptures, resightings, and dead recoveries can be introduced in various ways into the model. Transients can also be modeled. The chapter ends with a comparatively new topic called spatial capture–recapture models, introduced here because they can be incorporated into the closed parts of a robust model. This topic will no doubt eventually warrant its own chapter as extensions to open populations are considered.
George A. F. Seber, Matthew R. Schofield

Chapter 12. Continuous Dead–Recovery Models

Abstract
This chapter could occupy several different places in this book, as it is linked to both dead-recovery models and continuous models. The latter have already been mentioned in earlier chapters, where natural and exploitation mortalities are modeled using Poisson processes with instantaneous rates. The chapter begins by laying out the general theory for continuous processes and then applies it to previous dead-recovery models, with extensions such as a nonparametric version with individual recovery times, and then allowing for a truncated study or using grouped recovery data. Several models are given, such as using grouped releases and single recoveries, allowing for tag loss, incorporating age dependence, and adding catch–effort data to the model.
George A. F. Seber, Matthew R. Schofield

Chapter 13. Multisite and State-Space Models

Abstract
This is a large and complex chapter which, with recent computational developments and matrix methods, is becoming fundamental, as it includes many of the previous models as special cases (e.g., the dead-recovery and CJS models). A multisite model involves applying the capture–recapture method to a number of interconnecting geographical areas (strata or sites) where animals can move from one area to another. This topic has a history going back to 1956. By changing the word "site" to "state," we see that multisite models can now be considered a special case of state-space models. For example, an individual could be in one of two states, breeding and nonbreeding, with the possibility of changing from one state to the other and back again. The basic state-space model is developed along with several modifications, including when species or sex is unidentifiable for some individuals, when capture probabilities depend on the previous stratum, and then a Bayesian method along with an extension to multiple recaptures. The state-space model can also be incorporated into a dynamic model using stochastic processes and stochastic matrices, under the title of integrated population models. Because of the large number of parameters involved with multistate models, we extend the model to include more types of data, such as recaptures, resightings, and dead recoveries, to increase the efficiency of parameter estimation and to develop more hypotheses. Another extension includes recaptures, recoveries, and age data. A number of models are described involving some state uncertainty, such as Pradel's hidden Markov model, so-called memory models, models allowing for partially observable states, and an omnibus model combining the hidden Markov model with the robust model of Pollock described in a previous chapter. Multivariate extensions are also given, as well as consideration of heterogeneity of reporting and the estimation of dispersal components. We see, then, that state-space models are very versatile.
George A. F. Seber, Matthew R. Schofield
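
As a minimal sketch of the two-state example (breeding B, nonbreeding N) mentioned in the abstract above, written in generic multistate notation rather than the chapter's own, the survival and transition structure between two occasions can be collected into a matrix:

```latex
\Phi =
\begin{pmatrix}
\phi^{B}\,(1-\psi^{BN}) & \phi^{B}\,\psi^{BN} \\[2pt]
\phi^{N}\,\psi^{NB}     & \phi^{N}\,(1-\psi^{NB})
\end{pmatrix},
```

where \phi^{B} and \phi^{N} are state-specific survival probabilities, \psi^{BN} and \psi^{NB} are the probabilities of moving between the states conditional on survival, and rows index the state at the current occasion.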

Chapter 14. Designing and Modeling Capture–Recapture Experiments

Abstract
In designing a capture–recapture experiment, a substantial checklist of questions needing answers is provided, as well as references to other aspects of design (e.g., designs using radio telemetry). Test methods for model fitting are briefly described before moving on to a section on various measures of model comparison, such as AIC, TIC, incorporating over-dispersion using QAIC (and a modification), and BIC. Various aspects of these measures are discussed. Parameter redundancy, described as either intrinsic or extrinsic, is a continual headache in capture–recapture, particularly as models get more complicated, as with multistate models. Various methods are described for determining the existence of parameter redundancy (due to a certain matrix having less than full rank) and for finding out which of the parameters are redundant (nonidentifiable). These include symbolic computation packages for the analytical calculation of the rank of a certain derivative matrix, profile likelihood plots, Hessian matrix methods, and simulation (an analytical–numerical method). The chapter ends by introducing the so-called "mother of all models," which is obtained by starting with a simple model for a closed population and progressively adding new components until an all-purpose model results.
George A. F. Seber, Matthew R. Schofield
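
For reference, the standard forms of the main information criteria mentioned above are shown below, with K the number of parameters, n the sample size, \hat{c} the estimated over-dispersion factor, and \hat{L} the maximized likelihood. These are the usual textbook definitions; the chapter's specific modification of QAIC is not reproduced here.

```latex
\mathrm{AIC} = -2\log \hat{L} + 2K,
\qquad
\mathrm{QAIC} = -\frac{2\log \hat{L}}{\hat{c}} + 2K,
\qquad
\mathrm{BIC} = -2\log \hat{L} + K\log n .
```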

Chapter 15. Statistical Computation

Abstract
Capture–recapture has made considerable advances because of progress in the development of extensive computer packages (a list is given in the Appendix). In the past, the emphasis was on obtaining explicit maximum likelihood estimates and using mathematically derived formulae for asymptotic variances and covariances to obtain standard errors. However, iterative methods are now available for solving maximum likelihood equations and obtaining asymptotic sample variance and covariance estimates. In addition, we have powerful simulation methods which are particularly useful for Bayesian models. In this chapter, written by Matthew Schofield, we give some theory for the numerical processes involved, as well as simple numerical examples. We start with numerical optimization using the Newton and quasi-Newton algorithms, and then the Hessian matrix for standard errors, with brief reference to multimaxima problems. In using latent (hidden) variable models, the EM (expectation–maximization) algorithm can be used, along with extensions called the SEM (supplemented EM) and GEM (generalized EM) algorithms. These methods can be applied to hidden Markov models using forward and backward algorithms. Bayesian models require knowledge of the posterior distributions of the parameters, and this can be obtained numerically using Markov chain Monte Carlo (MCMC) to sample from the posterior distributions. Theoretical properties of finite-space Markov chains are discussed, along with the Monte Carlo method. The MCMC approach is applied using the Metropolis–Hastings algorithm, along with a special case, the Gibbs sampler, making use of the Hammersley–Clifford theorem. Auxiliary latent variables can be used with so-called slice sampling and Hamiltonian Monte Carlo. Since some posterior distributions involve parameter vectors of varying dimension, methods for transdimensional sampling are described, such as the so-called reversible-jump MCMC and a universal-variable approach.
George A. F. Seber, Matthew R. Schofield
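
As a minimal illustration of the Metropolis–Hastings step discussed above, here is a random-walk sampler in Python for the posterior of a single recapture probability, with made-up data, a flat prior on the logit scale, and no connection to any particular package; the chapter itself covers the general theory, Gibbs sampling, slice sampling, Hamiltonian Monte Carlo, and reversible-jump methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: of n marked animals at risk, m were recaptured.
n, m = 120, 45

def log_posterior(logit_p):
    """Binomial log-likelihood for the recapture probability p, parameterized
    on the logit scale; a flat prior on logit(p) is assumed for this sketch."""
    p = 1.0 / (1.0 + np.exp(-logit_p))
    return m * np.log(p) + (n - m) * np.log(1.0 - p)

n_iter, step = 10_000, 0.3
chain = np.empty(n_iter)
x = 0.0                                    # current state: logit(p)
log_post = log_posterior(x)
for i in range(n_iter):
    proposal = x + step * rng.normal()             # symmetric random-walk proposal
    log_post_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < log_post_prop - log_post:
        x, log_post = proposal, log_post_prop      # accept; otherwise keep x
    chain[i] = x

p_samples = 1.0 / (1.0 + np.exp(-chain[2_000:]))   # back-transform, drop burn-in
print(p_samples.mean(), np.quantile(p_samples, [0.025, 0.975]))
```

Because the proposal is symmetric, the acceptance probability reduces to the ratio of posterior densities, which is all the algorithm needs; the same scheme scales up to the multi-parameter capture–recapture posteriors the chapter works with.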

Chapter 16. Where to Now?

Abstract
This short chapter does some crystal-ball gazing about the future of capture–recapture. Clearly, computer software will continue to develop, along with increasingly sophisticated computers. As models get more complicated, the number of unknown parameters rises steeply. In contrast, even when capture–recapture, resighting, and dead-recovery data are all available, the amount of data remains limited, however it is packaged. Two methods for parameter reduction are using covariates, usually via logistic transformations of the parameters, and using random effects with Bayesian models. What is particularly needed is more concentration on experimental design and on applying various types of residuals using, for example, box plots. Another approach is to combine capture–recapture with other sampling methods such as adaptive sampling, count methods, aerial censusing, distance sampling, and catch–effort data, usually referred to as integrated data analysis. Also, different types of tags can be used together, such as DNA and photo identification.
George A. F. Seber, Matthew R. Schofield

Backmatter
