It has been a joy learning from Dev Basu’s work on aspects of statistical inference, especially his deep and often provocative essays on the fallacies of common statistical principles. I will limit myself to his epic paper
Statistical Information and Likelihood
“Statistical Information and Likelihood” is a tour de force in three parts. In the first part, Basu studies the implications of the sufficiency and conditionality principles, and shows that these lead to the likelihood function as the summary of the information in an experiment. His treatment is similar to that of Birnbaum (1962, 1972). The second part reviews non-Bayesian likelihood methods, leaning especially on Fisher’s method of maximum likelihood. He criticizes the use of sampling standard errors around the maximum likelihood estimate (MLE) to create confidence intervals, on the grounds that they violate the likelihood principle. The third part gives various examples that illuminate what he finds problematic about fiducial arguments, improper Bayesian priors, and testing of simple null hypotheses. Although most of his effort is critical, on the positive side Basu advocates subjective Bayesian analysis with proper priors, and making optimal decisions using a utility (or loss) function.
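For readers who want the principle at stake spelled out, a standard formulation of the likelihood principle (in the spirit of Birnbaum’s, not Basu’s exact wording) runs as follows:

```latex
% The likelihood principle (standard formulation):
% if two experiments E_1, E_2 with common parameter \theta yield
% outcomes x, y whose likelihood functions are proportional,
%   L_x(\theta) = p_1(x \mid \theta), \qquad
%   L_y(\theta) = p_2(y \mid \theta),
%   L_x(\theta) = c \, L_y(\theta) \quad \text{for all } \theta
%   \text{ and some constant } c > 0,
% then the evidential content about \theta is the same:
%   \operatorname{Ev}(E_1, x) = \operatorname{Ev}(E_2, y).
```

Sampling standard errors for the MLE depend on the whole sample space of the experiment, not just on the realized likelihood function, which is why confidence intervals built from them can violate this principle.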