2020 | Book

# Likelihood and Bayesian Inference

## With Applications in Biology and Medicine

Authors: Leonhard Held, Daniel Sabanés Bové

Publisher: Springer Berlin Heidelberg

Book Series: Statistics for Biology and Health

This richly illustrated textbook covers modern statistical methods with applications in medicine, epidemiology and biology. Firstly, it discusses the importance of statistical models in applied quantitative research and the central role of the likelihood function, describing likelihood-based inference from a frequentist viewpoint, and exploring the properties of the maximum likelihood estimate, the score function, the likelihood ratio and the Wald statistic. In the second part of the book, likelihood is combined with prior information to perform Bayesian inference. Topics include Bayesian updating, conjugate and reference priors, Bayesian point and interval estimates, Bayesian asymptotics and empirical Bayes methods. It includes a separate chapter on modern numerical techniques for Bayesian inference, and also addresses advanced topics, such as model choice and prediction from frequentist and Bayesian perspectives. This revised edition of the book “Applied Statistical Inference” has been expanded to include new material on Markov models for time series analysis. It also features a comprehensive appendix covering the prerequisites in probability theory, matrix algebra, mathematical calculus, and numerical analysis, and each chapter is complemented by exercises. The text is primarily intended for graduate statistics and biostatistics students with an interest in applications.

Abstract (Chapter 1)

This chapter introduces several examples which will be considered throughout this book. It also gives a brief discussion of different aspects of statistical inference and the role of statistical models.

Abstract (Chapter 2)

Chapter 2 introduces the fundamental notion of the likelihood function and related quantities, such as the maximum likelihood estimate, the score function, and Fisher information. Computational algorithms for obtaining the maximum likelihood estimate, such as numerical optimisation and the EM algorithm, are treated. Finally, the concept of sufficiency and the likelihood principle are discussed in some detail. Exercises are given at the end.
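
As a minimal sketch (not taken from the book), the interplay of score function, observed Fisher information and maximum likelihood estimate can be illustrated with Newton-Raphson iteration for a binomial proportion; all function and variable names here are illustrative:

```python
# Sketch: Newton-Raphson maximisation of a binomial log-likelihood.
# x successes in n trials; the closed-form MLE is x/n, which the
# iteration p <- p + U(p)/I(p) recovers numerically.
def score(p, x, n):
    # Score function U(p): derivative of the binomial log-likelihood
    return x / p - (n - x) / (1 - p)

def observed_info(p, x, n):
    # Observed Fisher information: negative second derivative
    return x / p**2 + (n - x) / (1 - p)**2

def mle_newton(x, n, p=0.5, tol=1e-10):
    # Iterate until the Newton step is negligible
    while True:
        step = score(p, x, n) / observed_info(p, x, n)
        p += step
        if abs(step) < tol:
            return p

p_hat = mle_newton(7, 10)  # converges to the closed-form MLE 7/10
```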

Abstract (Chapter 3)

This chapter discusses fundamental concepts of frequentist inference, such as unbiasedness and consistency, standard errors and confidence intervals, significance tests and P-values. There is also a section on the bootstrap method. Exercises are given at the end.
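
The bootstrap idea mentioned above can be sketched in a few lines; this is an illustrative example with made-up data, not code from the book:

```python
import random
import statistics

# Sketch: nonparametric bootstrap estimate of the standard error of the
# sample mean (resample the data with replacement, recompute the mean).
def bootstrap_se(data, B=2000, seed=1):
    rng = random.Random(seed)
    n = len(data)
    means = [statistics.mean(rng.choices(data, k=n)) for _ in range(B)]
    return statistics.stdev(means)  # spread of the bootstrap means

data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4]  # illustrative sample
se_boot = bootstrap_se(data)
# compare with the analytic standard error s / sqrt(n)
```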

Abstract (Chapter 4)

Frequentist properties of the maximum likelihood estimate of a scalar parameter are derived. The Wald, score and likelihood ratio test statistics and the corresponding confidence intervals are introduced. Variance-stabilising transformations are also discussed. A case study comparing coverage and width of several confidence intervals for a proportion finishes this chapter, completed by a number of exercises at the end.
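
Two of the intervals compared in such case studies, the Wald interval and the score (Wilson) interval for a proportion, follow standard formulas and can be sketched as below; the variable names are mine:

```python
import math

# Sketch: 95% Wald and score (Wilson) confidence intervals for a
# binomial proportion, x successes in n trials.
def wald_ci(x, n, z=1.96):
    p = x / n
    se = math.sqrt(p * (1 - p) / n)          # standard error at the MLE
    return (p - z * se, p + z * se)

def wilson_ci(x, n, z=1.96):
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom    # shrinks the MLE towards 1/2
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return (centre - half, centre + half)

lo_w, hi_w = wald_ci(7, 10)
lo_s, hi_s = wilson_ci(7, 10)
```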

Abstract (Chapter 5)

The concepts described in Chap. 4 are now extended to multiparameter models. The concept of profile likelihood is introduced as well as the generalised likelihood ratio statistic. The conditional likelihood, an alternative way to eliminate a nuisance parameter, is discussed. Exercises are given at the end.
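
A minimal sketch of the profile likelihood idea, using a normal sample with the variance as nuisance parameter (illustrative data, not from the book):

```python
import math
import statistics

# Sketch: profile log-likelihood for the mean mu of a normal sample.
# For fixed mu, the variance is maximised out at sum((x - mu)^2)/n,
# leaving a function of mu alone.
def profile_loglik(mu, data):
    n = len(data)
    s2_hat = sum((x - mu)**2 for x in data) / n  # profiled-out variance
    return -0.5 * n * math.log(s2_hat)           # up to an additive constant

data = [1.2, 0.8, 1.5, 1.1, 0.9]  # illustrative sample
xbar = statistics.mean(data)
# the profile log-likelihood is maximised at the sample mean
```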

Abstract (Chapter 6)

This chapter gives an introduction to Bayesian inference. Conjugate, improper and Jeffreys prior distributions are introduced as well as various Bayesian point and interval estimates. Bayesian inference in multiparameter models is discussed and some results from Bayesian asymptotics are presented. Finally, empirical Bayes methods are described, completed by a number of exercises at the end.
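
Conjugate updating is easiest to see in the beta-binomial model; the following sketch uses the standard result with an illustrative uniform prior:

```python
# Sketch: conjugate Bayesian updating in the beta-binomial model.
# A Beta(a, b) prior combined with x successes in n binomial trials
# gives a Beta(a + x, b + n - x) posterior.
def beta_binomial_update(a, b, x, n):
    return a + x, b + n - x

a_post, b_post = beta_binomial_update(1, 1, 7, 10)  # uniform Beta(1, 1) prior
post_mean = a_post / (a_post + b_post)              # posterior mean 8/12
post_mode = (a_post - 1) / (a_post + b_post - 2)    # posterior mode = MLE 7/10
```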

Abstract (Chapter 7)

This chapter describes methodology for model selection from both a likelihood and a Bayesian perspective. In particular, AIC and BIC are discussed, as well as their connection to cross-validation. Bayesian model selection based on the marginal likelihood is described, including Bayesian model averaging. Finally, DIC is introduced, completed by a number of exercises at the end.
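
The AIC and BIC trade-off between fit and complexity can be sketched from their standard definitions; the maximised log-likelihood values below are illustrative numbers, not results from the book:

```python
import math

# Sketch: AIC and BIC computed from a maximised log-likelihood,
# with k parameters and n observations.
def aic(loglik, k):
    return -2 * loglik + 2 * k

def bic(loglik, k, n):
    return -2 * loglik + k * math.log(n)

# e.g. two nested models fitted to n = 50 observations (made-up values):
n = 50
aic_small = aic(-120.3, k=2)
aic_big = aic(-119.8, k=4)  # small gain in fit, penalised for 2 extra parameters
```

With these numbers the smaller model wins on AIC, and BIC penalises the extra parameters even more heavily once log(n) exceeds 2.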

Abstract (Chapter 8)

This chapter describes numerical methods for Bayesian inference in non-conjugate settings. Standard numerical techniques and the Laplace approximation provide ways to numerically compute posterior characteristics of interest. Monte Carlo methods, including Monte Carlo integration, rejection and importance sampling as well as Markov chain Monte Carlo are described. Finally, numerical computation of the marginal likelihood, necessary for Bayesian model selection, is discussed. Exercises are given at the end.
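
A random-walk Metropolis sampler, the simplest Markov chain Monte Carlo algorithm, can be sketched for the binomial posterior with a uniform prior (a Beta(8, 4)-shaped target); tuning constants and names are illustrative:

```python
import math
import random

# Sketch: random-walk Metropolis sampling from the posterior of a
# binomial proportion (x = 7 successes in n = 10 trials, uniform prior).
def log_post(p, x=7, n=10):
    if not 0 < p < 1:
        return -math.inf                       # zero density outside (0, 1)
    return x * math.log(p) + (n - x) * math.log(1 - p)

def metropolis(n_iter=20000, step=0.2, seed=2):
    rng = random.Random(seed)
    p, samples = 0.5, []
    for _ in range(n_iter):
        prop = p + rng.uniform(-step, step)    # symmetric random-walk proposal
        # accept with probability min(1, posterior ratio)
        if math.log(rng.random()) < log_post(prop) - log_post(p):
            p = prop
        samples.append(p)
    return samples[5000:]                      # discard burn-in

draws = metropolis()
post_mean = sum(draws) / len(draws)            # close to the exact value 8/12
```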

Abstract (Chapter 9)

Chapter 9 describes the statistical methodology to predict future data in the presence of unknown model parameters. Emphasis is placed on probabilistic predictions, obtained with either a likelihood or a Bayesian approach. Connections to the simpler plug-in prediction are also described. Finally, methods to assess the quality of probabilistic predictions, such as the Brier score and the logarithmic score, are described. Exercises are given at the end.
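
Both scoring rules follow standard definitions for binary outcomes and can be sketched as below; the forecasts and outcomes are illustrative:

```python
import math

# Sketch: Brier and logarithmic scores for binary probabilistic
# predictions (smaller is better for both).
def brier_score(probs, outcomes):
    # mean squared difference between forecast probability and outcome
    return sum((p - y)**2 for p, y in zip(probs, outcomes)) / len(probs)

def log_score(probs, outcomes):
    # negative mean log probability assigned to the observed outcome
    return -sum(math.log(p if y else 1 - p)
                for p, y in zip(probs, outcomes)) / len(probs)

probs = [0.9, 0.7, 0.2, 0.6]   # illustrative forecasts
outcomes = [1, 1, 0, 0]        # observed binary outcomes
bs = brier_score(probs, outcomes)
ls = log_score(probs, outcomes)
```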

Abstract (Chapter 10)

A time series is a series of observations of a quantity of interest. Markov models are commonly used in applications to take into account the dependence between successive observations. This chapter describes the statistical analysis of different types of Markov models for categorical and continuous time series data, including hidden Markov models and state space models. Several examples are considered to illustrate how likelihood and Bayesian methods can be used for parameter estimation and prediction. Exercises are given at the end.
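
For a simple discrete-time Markov chain, maximum likelihood estimation of the transition matrix reduces to counting observed transitions and normalising each row; a minimal sketch with an illustrative two-state sequence:

```python
# Sketch: MLE of the transition matrix of a Markov chain with states
# 0..n_states-1, estimated from an observed sequence.
def transition_mle(seq, n_states=2):
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1                 # count each observed transition a -> b
    # each row of the MLE is the corresponding row of counts normalised to 1
    return [[c / sum(row) for c in row] for row in counts]

seq = [0, 0, 1, 0, 1, 1, 1, 0, 0, 1]      # illustrative two-state series
P = transition_mle(seq)                   # P[a][b] estimates Pr(next = b | now = a)
```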