Bootstrapping
A Nonparametric Approach to Statistical Inference
- Christopher Z. Mooney - University of Illinois at Chicago, USA
- Robert D. Duval - West Virginia University, USA
Volume: 95
Other Titles in:
Quantitative/Statistical Research
August 1993 | 80 pages | SAGE Publications, Inc
Bootstrapping, a computational nonparametric resampling technique, enables researchers to draw conclusions about the characteristics of a population strictly from the existing sample rather than by making parametric assumptions about the estimator's sampling distribution. Using real data examples ranging from per capita personal income to median preference differences between legislative committee members and the entire legislature, Mooney and Duval discuss how to apply bootstrapping when the sampling distribution of a statistic cannot be assumed normal, as well as when the sampling distribution has no analytic solution. In addition, they show the advantages and limitations of four bootstrap confidence interval methods: normal approximation, percentile, bias-corrected percentile, and percentile-t. The authors conclude with a convenient summary of how to apply this computer-intensive methodology using various available software packages.
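To make the core idea concrete, here is a minimal sketch of the bootstrap percentile confidence interval described above. It is not drawn from the book's own code; the sample data, the choice of the median as the statistic, the number of resamples, and the function name are illustrative assumptions.

```python
# Minimal sketch of a nonparametric bootstrap percentile confidence interval.
# Assumptions (not from the book): a generic numeric sample, the median as the
# statistic of interest, 2,000 resamples, and a 95% interval.
import numpy as np

def bootstrap_percentile_ci(data, statistic=np.median, n_resamples=2000,
                            alpha=0.05, rng=None):
    """Resample the data with replacement, recompute the statistic each time,
    and take the empirical (alpha/2, 1 - alpha/2) percentiles as the interval."""
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    replicates = np.empty(n_resamples)
    for b in range(n_resamples):
        resample = rng.choice(data, size=data.size, replace=True)
        replicates[b] = statistic(resample)
    lower = np.percentile(replicates, 100 * (alpha / 2))
    upper = np.percentile(replicates, 100 * (1 - alpha / 2))
    return lower, upper

# Example usage with made-up data
sample = np.array([12.1, 9.8, 14.3, 11.0, 10.5, 13.7, 9.2, 15.1, 10.9, 12.6])
print(bootstrap_percentile_ci(sample))
```

The percentile method is only one of the four interval constructions the book compares; the normal approximation, bias-corrected percentile, and percentile-t methods differ in how they turn the same set of bootstrap replicates into interval endpoints.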
PART ONE: INTRODUCTION
Traditional Parametric Statistical Inference
Bootstrap Statistical Inference
Bootstrapping a Regression Model
Theoretical Justification
The Jackknife
Monte Carlo Evaluation of the Bootstrap
PART TWO: STATISTICAL INFERENCE USING THE BOOTSTRAP
Bias Estimation
Bootstrap Confidence Intervals
PART THREE: APPLICATIONS OF BOOTSTRAP CONFIDENCE INTERVALS
Confidence Intervals for Statistics With Unknown Sampling Distributions
Inference When Traditional Distributional Assumptions Are Violated
PART FOUR: CONCLUSION
Future Work
Limitations of the Bootstrap
Concluding Remarks