# Jeromy Anglim's Blog: Psychology and Statistics

## Friday, May 22, 2009

### Bootstrapping and the boot package in R

I was recently asked about options for bootstrapping. The following post sets out some applications of bootstrapping and strategies for implementing it in R. I've found bootstrapping useful in several settings:

• where the statistic I'm interested in is a little unusual: the average R-squared across five separate regressions; the difference between two groups in the average correlation among a set of variables
• nonparametric statistics, such as the median
• when assumptions such as normality or homoscedasticity are not satisfied
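As a small sketch of the last two points, here is a percentile bootstrap for the median using only base R (no packages assumed; the sample `x` and the number of resamples are just illustrative choices):

```r
# Bootstrap a 95% CI for the median of a skewed sample,
# where normal-theory intervals would be dubious.
set.seed(123)                     # reproducibility
x <- rexp(50, rate = 1)           # skewed data: normality clearly not satisfied
n_boot <- 2000
boot_medians <- replicate(n_boot,
                          median(sample(x, replace = TRUE)))  # resample rows, recompute
se <- sd(boot_medians)                          # bootstrap standard error
ci <- quantile(boot_medians, c(0.025, 0.975))   # percentile 95% CI
```

The same few lines work for any statistic: swap `median` for whatever function returns the number you care about.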

### Bootstrapping in R

R is very cool for bootstrapping. I've mainly used the boot package and found it very good. In fact, it is a classic example of something that R makes easy. It's easy to run loops in R, and R is excellent at taking output from one function and using it as input to another. This is the essence of bootstrapping: taking different samples of your data, getting a statistic for each sample (e.g., the mean, median, correlation, regression coefficient, etc.), and using the variability in the statistic across samples to indicate something about the standard error and confidence intervals for the statistic.
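A minimal example with the boot package might look like the following (the built-in `cars` dataset and a correlation are used purely for illustration). `boot()` expects a statistic function that takes the data and a vector of resampled row indices:

```r
library(boot)   # ships with the standard R distribution

# Statistic function: boot() calls this once per resample,
# passing the data and the resampled row indices.
cor_stat <- function(data, indices) {
  d <- data[indices, ]
  cor(d$speed, d$dist)
}

set.seed(42)
b <- boot(data = cars, statistic = cor_stat, R = 2000)
b                                     # original statistic, bias, std. error
boot.ci(b, type = c("perc", "bca"))   # percentile and BCa confidence intervals
```

`boot.ci()` then turns the resampled statistics into confidence intervals using several methods at once, which is handy for checking whether the choice of interval matters.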

#### Intermediate Bootstrapping in R

• John Fox provides a 14-page PDF with a mathematical explanation of bootstrapping and examples in R within a regression context.
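In that regression context, case resampling is straightforward with the boot package: refit the model on each bootstrap sample of rows and return the coefficients (the `mtcars` model below is just an illustrative stand-in, not Fox's example):

```r
library(boot)

# Statistic: refit the regression on each bootstrap sample of rows
# ("case resampling") and return the coefficient vector.
coef_stat <- function(data, indices) {
  fit <- lm(mpg ~ wt + hp, data = data[indices, ])
  coef(fit)
}

set.seed(1)
b <- boot(data = mtcars, statistic = coef_stat, R = 2000)
boot.ci(b, type = "perc", index = 2)  # percentile CI for the wt coefficient
```

Because the statistic function returns a vector, `index` selects which coefficient `boot.ci()` builds the interval for.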

### Bootstrapping in SPSS

You can do bootstrapping with SPSS. I seem to remember there being a Python add-on package that's designed to make bootstrapping easier. I've never used it, and I don't imagine it would be as easy to use as R, given how difficult it is in SPSS to take output and process it further programmatically (even if the OMS is trying to make this easier). For certain specific tests you might be able to find existing macros (e.g., for indirect effects). As an update, it seems that IBM SPSS has released a bootstrapping add-on module.