PAC-Bayesian Theory Meets Bayesian Inference

Part of: Advances in Neural Information Processing Systems 29 (NIPS 2016)



Conference Event Type: Poster


We exhibit a strong link between frequentist PAC-Bayesian bounds and the Bayesian marginal likelihood. Specifically, for the negative log-likelihood loss function, we show that minimizing PAC-Bayesian generalization bounds is equivalent to maximizing the Bayesian marginal likelihood. This provides an alternative explanation of the Bayesian Occam's razor criterion, under the assumption that the data is sampled i.i.d. from an unknown distribution. Moreover, as the negative log-likelihood is an unbounded loss function, we motivate and propose a PAC-Bayesian theorem tailored to the sub-gamma loss family, and we show that our approach is sound on classical Bayesian linear regression tasks.
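As a sketch of the claimed link (the notation here is ours, not necessarily the paper's): for a prior \pi over hypotheses, empirical risk \hat{\mathcal{L}}_X on a sample X of size n, and trade-off parameter \lambda > 0, the posterior minimizing the free-energy term of the standard PAC-Bayesian bound is the Gibbs posterior, and the minimum has a closed form:

\hat{\rho}^*(h) \;\propto\; \pi(h)\, e^{-\lambda \hat{\mathcal{L}}_X(h)},
\qquad
\min_{\hat{\rho}} \Big[ \mathbb{E}_{h \sim \hat{\rho}}\, \hat{\mathcal{L}}_X(h) + \tfrac{1}{\lambda}\, \mathrm{KL}(\hat{\rho} \,\|\, \pi) \Big]
\;=\; -\tfrac{1}{\lambda} \ln \mathbb{E}_{h \sim \pi}\, e^{-\lambda \hat{\mathcal{L}}_X(h)}.

With the negative log-likelihood loss \ell(h, z) = -\ln p(z \mid h) and \lambda = n, we get e^{-n \hat{\mathcal{L}}_X(h)} = p(X \mid h), so the right-hand side becomes -\tfrac{1}{n} \ln p(X): minimizing the bound is exactly maximizing the Bayesian marginal likelihood (evidence).

For Bayesian linear regression, the evidence is available in closed form, which is what makes such experiments convenient. Below is a minimal sketch; the Gaussian prior/noise model and the parameter names alpha and beta are our illustrative assumptions, not the paper's experimental protocol:

    import numpy as np

    def log_evidence(X, y, alpha=1.0, beta=100.0):
        """Log marginal likelihood ln p(y | X) for Bayesian linear regression
        with prior w ~ N(0, I/alpha) and noise model y = Xw + e, e ~ N(0, I/beta),
        so that marginally y ~ N(0, X X^T / alpha + I / beta)."""
        n = X.shape[0]
        K = X @ X.T / alpha + np.eye(n) / beta   # marginal covariance of y
        _, logdet = np.linalg.slogdet(K)         # numerically stable log-determinant
        quad = y @ np.linalg.solve(K, y)         # y^T K^{-1} y without forming K^{-1}
        return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)

    # Toy usage: compare the evidence under two assumed noise precisions.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)
    print(log_evidence(X, y, beta=100.0), log_evidence(X, y, beta=1.0))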