
An objective prior that unifies objective Bayes and information-based inference [article]

Colin H. LaMont, Paul A. Wiggins
2015 arXiv   pre-print
There are three principal paradigms of statistical inference: (i) Bayesian, (ii) information-based and (iii) frequentist inference. We describe an objective prior (the weighting or w-prior) which unifies objective Bayes and information-based inference. The w-prior is chosen to make the marginal probability an unbiased estimator of the predictive performance of the model. This definition has several other natural interpretations. From the perspective of the information content of the prior, the w-prior is both uniformly and maximally uninformative. The w-prior can also be understood to result in a uniform density of distinguishable models in parameter space. Finally, we demonstrate that the w-prior is equivalent to the Akaike Information Criterion (AIC) for regular models in the asymptotic limit. The w-prior appears to be generically applicable to statistical inference and is free of ad hoc regularization. The mechanism for suppressing complexity is analogous to AIC: model complexity reduces model predictivity. We expect this new objective-Bayes approach to inference to be widely applicable to machine-learning problems, including singular models.
arXiv:1506.00745v2 fatcat:isfont2ozfcuhd4dueg7ikkm54