Bayesian inference provides a flexible way of combining data with
prior information. However, quantile regression is not equipped with a parametric likelihood, and therefore Bayesian inference for quantile regression demands careful investigation. This thesis considers a Bayesian empirical likelihood approach to quantile regression. Embedding the empirical likelihood in a Bayesian framework, we show that the resulting posterior is asymptotically normal; its mean shrinks towards the true parameter values, and its variance approaches that of the maximum empirical likelihood estimator. Through empirical likelihood, the proposed method enables us to explore various forms of commonality across quantiles for efficiency gains in the estimation of multiple quantiles. By using an MCMC algorithm in the computation, we avoid the daunting task of directly maximizing empirical likelihoods. The finite-sample performance of the proposed method is investigated empirically, where substantial efficiency gains are demonstrated with informative priors on common features across quantile levels.
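For a single quantile level tau, the core construction can be sketched as follows. This is an illustrative simplification, not the algorithm developed in the thesis: the flat prior, the Newton solver for the empirical likelihood dual, the fixed random-walk scale, and all function names are choices made here. The sketch uses the standard quantile-regression estimating function g_i(beta) = x_i (tau - 1{y_i <= x_i' beta}) and profiles out the observation weights via Owen's convex dual.

```python
import numpy as np

def el_log_ratio(beta, X, y, tau, max_iter=100, tol=1e-8):
    """Profile log empirical likelihood ratio at beta for quantile level tau.

    Weights w_i maximizing prod(n * w_i) subject to sum_i w_i g_i(beta) = 0
    have the dual form w_i = 1 / (n * (1 + lam' g_i)), where lam solves
    sum_i g_i / (1 + lam' g_i) = 0.  Returns -inf when 0 lies outside the
    convex hull of {g_i} (beta is then infeasible for the empirical likelihood).
    """
    g = X * (tau - (y <= X @ beta))[:, None]          # n x p estimating functions
    lam = np.zeros(X.shape[1])
    for _ in range(max_iter):
        denom = 1.0 + g @ lam
        grad = (g / denom[:, None]).sum(axis=0)       # gradient of the concave dual
        if np.max(np.abs(grad)) < tol:
            return -np.log(denom).sum()               # log EL ratio at convergence
        hess = -(g / denom[:, None] ** 2).T @ g       # negative-definite Hessian
        try:
            step = np.linalg.solve(hess, grad)
        except np.linalg.LinAlgError:
            return -np.inf
        t = 1.0                                       # halve step to keep weights positive
        while np.any(1.0 + g @ (lam - t * step) <= 1e-10):
            t *= 0.5
            if t < 1e-10:
                return -np.inf
        lam = lam - t * step
    return -np.inf                                    # treat non-convergence as infeasible

def bel_mcmc(X, y, tau, beta0, n_iter=2000, scale=0.1, seed=0):
    """Random-walk Metropolis sampler for the empirical likelihood posterior.

    A flat prior is assumed here; an informative log-prior term would simply
    be added to the acceptance ratio.
    """
    rng = np.random.default_rng(seed)
    beta = np.asarray(beta0, dtype=float).copy()
    ll = el_log_ratio(beta, X, y, tau)
    draws = []
    for _ in range(n_iter):
        prop = beta + scale * rng.standard_normal(beta.size)
        ll_prop = el_log_ratio(prop, X, y, tau)
        if np.log(rng.uniform()) < ll_prop - ll:      # Metropolis accept/reject
            beta, ll = prop, ll_prop
        draws.append(beta.copy())
    return np.array(draws)
```

Because the sampler only evaluates the empirical likelihood, never maximizes it over beta, this is where the MCMC computation sidesteps direct maximization. For several quantile levels, the same scheme applies to the stacked estimating equations, and an informative prior on features shared across levels is what delivers the efficiency gains described above.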