1981 | OriginalPaper | Chapter
Statistical Theory and the Computer
Authors : Bradley Efron, Gail Gong
Published in: Computer Science and Statistics: Proceedings of the 13th Symposium on the Interface
Publisher: Springer US
Included in: Professional Book Archive
Everyone here knows that the modern computer has profoundly changed statistical practice. The effect upon statistical theory is less obvious. Typical data analyses still rely, in the main, upon ideas developed fifty years ago. Inevitably, though, new technical capabilities inspire new ideas. Efron (1979B) describes a variety of current theoretical topics which depend upon the existence of cheap and fast computation: the jackknife, the bootstrap, cross-validation, robust estimation, the EM algorithm, and Cox's likelihood function for censored data.
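The first of these computation-dependent ideas, the bootstrap, can be sketched in a few lines: resample the observed data with replacement, recompute the statistic on each resample, and take the spread of the replicates as an estimate of the statistic's standard error. The following is a minimal Python illustration of that idea, not code from the chapter itself; the function name and the toy data are invented for the example.

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=1000, seed=0):
    """Bootstrap estimate of the standard error of `stat`:
    draw n_boot resamples of the data (with replacement, each the
    same size as the original sample), recompute the statistic on
    each, and return the standard deviation of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = []
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(stat(resample))
    return statistics.stdev(replicates)

# Toy data, chosen only for illustration.
data = [2.1, 3.4, 1.9, 5.0, 4.2, 2.8, 3.3, 4.7]
se = bootstrap_se(data, statistics.mean)
```

For the sample mean the bootstrap answer can be checked against the textbook formula s/sqrt(n); the appeal of the method is that the same resampling loop applies unchanged to statistics (medians, correlations, regression coefficients) for which no such formula exists.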