We describe the use of perfect sampling algorithms for Bayesian variable
selection in a linear regression model.
Starting with a basic case solved by Huang and Djuric (2002), where the model
coefficients and noise variance are assumed to be known,
we generalize the model step by step to allow for other sources of
randomness, specifying perfect simulation algorithms that solve these cases
by combining techniques such as Gibbs sampling, the perfect
independent Metropolis-Hastings (IMH) algorithm, and recently developed
``slice coupling'' algorithms. Applications to simulated data sets suggest
that our algorithms perform well in identifying relevant predictor variables.
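To fix ideas, the following is a generic sketch of coupling from the past (the Propp--Wilson construction underlying perfect sampling), not the paper's own algorithms: it draws an exact sample from the stationary distribution of a toy finite-state chain by rerunning all starting states from ever further in the past with shared randomness until they coalesce. The function and chain here are illustrative inventions, not taken from the source.

```python
import random

def cftp_sample(n_states, step, rng=random.Random(0)):
    """Coupling from the past on a finite state space.

    `step(state, u)` advances the chain one step using the uniform draw u.
    Returns an exact draw from the chain's stationary distribution.
    """
    T = 1
    seeds = []  # uniforms ordered from past to present; reused across restarts
    while True:
        # extend the shared randomness further into the past (prepend new draws)
        seeds = [rng.random() for _ in range(T - len(seeds))] + seeds
        states = set(range(n_states))
        for u in seeds:  # run every starting state forward with the same u's
            states = {step(s, u) for s in states}
        if len(states) == 1:  # all trajectories coalesced: exact sample
            return states.pop()
        T *= 2  # otherwise look twice as far into the past

# toy chain: lazy random walk on {0, 1, 2, 3}
def step(s, u):
    if u < 0.25:
        return max(s - 1, 0)
    if u < 0.5:
        return min(s + 1, 3)
    return s

print(cftp_sample(4, step))
```

Reusing the same uniforms when restarting from further back is what makes the output an exact stationary draw rather than an approximate one; the perfect IMH and slice-coupling algorithms in the paper achieve coalescence by different couplings but share this structure.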