A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2016; you can also visit the original URL. The file type is application/pdf.
Small-Variance Asymptotics for Exponential Family Dirichlet Process Mixture Models
2012
Neural Information Processing Systems
In this paper, we explore small-variance asymptotics for exponential family Dirichlet process (DP) and hierarchical Dirichlet process (HDP) mixture models. ...
This can often be fulfilled by performing small-variance asymptotics, i.e., letting the variance of particular distributions in the model go to zero. ...
Conclusion We considered a general small-variance asymptotic analysis for the exponential family DP and HDP mixture model. ...
dblp:conf/nips/JiangKJ12
fatcat:tfxv5pkcmra5xlbuhkpaatrzh4
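The small-variance limit described in this line of work is commonly illustrated by DP-means: as the cluster variance of a Gaussian DP mixture goes to zero, Gibbs sampling collapses to a k-means-like procedure that opens a new cluster whenever a point's squared distance to every existing mean exceeds a penalty λ. A minimal NumPy sketch (the data, λ, and initialization below are illustrative choices, not taken from the paper):

```python
import numpy as np

def dp_means(X, lam, max_iter=100):
    """Hard clustering from the small-variance limit of a Gaussian DP mixture.

    Assignment: nearest mean, unless all squared distances exceed lam,
    in which case the point seeds a new cluster. Update: recompute means.
    """
    mu = [X[0].copy()]                       # start with one cluster
    z = np.zeros(len(X), dtype=int)
    for _ in range(max_iter):
        # Assignment step.
        for i, x in enumerate(X):
            d2 = [np.sum((x - m) ** 2) for m in mu]
            j = int(np.argmin(d2))
            if d2[j] > lam:                  # penalty for opening a cluster
                mu.append(x.copy())
                j = len(mu) - 1
            z[i] = j
        # Update step: means of non-empty clusters.
        new_mu = [X[z == j].mean(axis=0) for j in range(len(mu)) if np.any(z == j)]
        if len(new_mu) == len(mu) and all(np.allclose(a, b) for a, b in zip(mu, new_mu)):
            break
        mu = new_mu
    return z, np.array(mu)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
z, mu = dp_means(X, lam=4.0)
print(len(mu))  # two well-separated blobs -> 2 clusters
```

Unlike k-means, the number of clusters is not fixed in advance; λ plays the role that the DP concentration parameter plays in the probabilistic model.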
Small-Variance Asymptotics for Hidden Markov Models
2013
Neural Information Processing Systems
Small-variance asymptotics provide an emerging technique for obtaining scalable combinatorial algorithms from rich probabilistic models. ...
We present a small-variance asymptotic analysis of the Hidden Markov Model and its infinite-state Bayesian nonparametric extension. ...
[8] for a Dirichlet process mixture. ...
dblp:conf/nips/RoychowdhuryJK13
fatcat:ygiqvcyhj5cibcgazoqbvydtdm
Dirichlet Process Mixtures of Generalized Linear Models
[article]
2010
arXiv
pre-print
We empirically analyze the properties of the DP-GLM and why it provides better results than existing Dirichlet process mixture regression models. ...
We propose Dirichlet Process mixtures of Generalized Linear Models (DP-GLM), a new method of nonparametric regression that accommodates continuous and categorical inputs, and responses that can be modeled ...
We review Dirichlet process mixture models and generalized linear models. Dirichlet Process Mixture Models. The Dirichlet process (DP) is a distribution over distributions (Ferguson 1973) . ...
arXiv:0909.5194v2
fatcat:xuibtdgb2rg5hodtkdajj2yvnu
Page 713 of Mathematical Reviews Vol. , Issue 84b
[page]
1984
Mathematical Reviews
Field, Christopher 84b:62040 Small sample asymptotic expansions for multivariate M-estimates. Ann. Statist. 10 (1982), no. 3, 672-689. ...
Stijnen, Theo. A monotone empirical Bayes estimator and test for the one-parameter continuous exponential family based on spacings. Scand. J. Statist. 9 (1982), no. 3, 153-158. ...
Page 2842 of Mathematical Reviews Vol. , Issue 84g
[page]
1984
Mathematical Reviews
Some asymptotic results concerning the averaged quantile process are given. A more general model than the location shift model is also explored and some asymptotic theory is given. ...
We obtain recurrence relations for moments and cumulants and the maximum likelihood estimators for the discrete exponential family. ...
On the consistency of Bayes estimates for the infinite continuous mixture of Dirichlet distributions
2021
Hacettepe Journal of Mathematics and Statistics
According to this simulation, we were able to conclude that the prior infinite mixture of Dirichlet distributions offers higher accuracy and flexibility for modeling and learning data. ...
Some asymptotic properties of this estimator were derived, specifically, its bias and variance. ...
The model was achieved using a prior Dirichlet process for the model parameters. The stick-breaking construction is one of the most explicit and general representations of the Dirichlet process. ...
doi:10.15672/hujms.774732
fatcat:c2zu6xcoojd3leakbs2ru3oe7m
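The stick-breaking representation mentioned above can be simulated directly: draw v_k ~ Beta(1, α) and set the weight of atom k to β_k = v_k ∏_{j&lt;k} (1 − v_j). A short illustrative sketch, truncated at K atoms (α and K are arbitrary choices here, not values from the paper):

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights for a Dirichlet process DP(alpha, G0).

    v_k ~ Beta(1, alpha); beta_k = v_k * prod_{j<k} (1 - v_j).
    """
    v = rng.beta(1.0, alpha, size=K)
    # Length of stick remaining before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining

rng = np.random.default_rng(0)
w = stick_breaking(alpha=2.0, K=1000, rng=rng)
print(w.sum())  # close to 1 for a long enough truncated stick
```

Larger α breaks the stick into more, smaller pieces, which is why α controls the expected number of mixture components.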
Page 8586 of Mathematical Reviews Vol. , Issue 2003k
[page]
2003
Mathematical Reviews
likelihood or mixture models. ...
The comparison is realized on the example of the interval estimation of some distribution parameters belonging to the natural exponential families with a quadratic variance function (binomial, Poisson ...
Page 2596 of Mathematical Reviews Vol. , Issue 2001D
[page]
2001
Mathematical Reviews
Summary: “We characterize the stable convergence of a sequence of density processes corresponding to binary filtered experiments to the exponential of a mixture of Gaussian processes, in terms of the convergence ...
Summary: "We study the LRT statistic for testing a single population i.i.d. model against a mixture of two populations with Markov regime. ...
White Matter Supervoxel Segmentation by Axial DP-Means Clustering
[chapter]
2014
Lecture Notes in Computer Science
The resulting supervoxel segmentation could be used to map regional anatomical changes in clinical studies or serve as a domain for more complex modeling. ...
We find our approach to be efficient and effective for the automatic extraction of regions of interest that respect the structure of brain white matter. ...
In particular, this approach is a hard clustering algorithm that behaves similarly to a Dirichlet process (DP) mixture model learned with Gibbs sampling, as a result of recent work on small-variance asymptotic ...
doi:10.1007/978-3-319-14104-6_10
fatcat:eq66rzogozbejmpehl2prkhka4
White Matter Supervoxel Segmentation by Axial DP-Means Clustering
[chapter]
2014
Lecture Notes in Computer Science
The resulting supervoxel segmentation could be used to map regional anatomical changes in clinical studies or serve as a domain for more complex modeling. ...
We find our approach to be efficient and effective for the automatic extraction of regions of interest that respect the structure of brain white matter. ...
In particular, this approach is a hard clustering algorithm that behaves similarly to a Dirichlet process (DP) mixture model learned with Gibbs sampling, as a result of recent work on small-variance asymptotic ...
doi:10.1007/978-3-319-05530-5_10
fatcat:vk5lrkns4rgipjytrscymcpstu
Small-variance nonparametric clustering on the hypersphere
2015
2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
The first, DP-vMF-means, is a batch clustering algorithm derived from the Dirichlet process (DP) vMF mixture. ...
Based on the small-variance limit of Bayesian nonparametric von Mises-Fisher (vMF) mixture distributions, we propose two new flexible and efficient k-means-like clustering algorithms for directional data ...
Dirichlet Process vMF-MM The Dirichlet process (DP) [13, 41] has been widely used as a prior for mixture models with a countably-infinite set of clusters [2, 4, 9, 32] . ...
doi:10.1109/cvpr.2015.7298630
dblp:conf/cvpr/StraubCHF15
fatcat:lzndcla43jemhinpqkwy5izpk4
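On the hypersphere, the analogous small-variance limit replaces squared Euclidean distance with cosine similarity: a point whose best similarity to every existing cluster direction falls below a threshold λ seeds a new cluster. The sketch below is a simplification in the spirit of DP-vMF-means, not the published algorithm itself; the data and λ are illustrative:

```python
import numpy as np

def dp_vmf_means(X, lam, max_iter=50):
    """k-means-like clustering of unit vectors with a cosine threshold.

    A point whose maximum dot product with every current direction is
    below lam opens a new cluster; directions are renormalized means.
    """
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    mu = [X[0].copy()]
    z = np.zeros(len(X), dtype=int)
    for _ in range(max_iter):
        # Assignment step on the sphere.
        for i, x in enumerate(X):
            sims = [float(x @ m) for m in mu]
            j = int(np.argmax(sims))
            if sims[j] < lam:                # too far on the sphere
                mu.append(x.copy())
                j = len(mu) - 1
            z[i] = j
        # Update step: renormalized sums of non-empty clusters.
        new_mu = []
        for j in range(len(mu)):
            if np.any(z == j):
                s = X[z == j].sum(axis=0)
                new_mu.append(s / np.linalg.norm(s))
        if len(new_mu) == len(mu) and all(np.allclose(a, b) for a, b in zip(mu, new_mu)):
            break
        mu = new_mu
    return z, np.array(mu)

rng = np.random.default_rng(1)
# Two tight bundles of directions, around +x and +y.
A = rng.normal([1, 0, 0], 0.05, (15, 3))
B = rng.normal([0, 1, 0], 0.05, (15, 3))
z, mu = dp_vmf_means(np.vstack([A, B]), lam=0.5)
print(len(mu))  # orthogonal bundles -> 2 clusters
```

Here λ ∈ (−1, 1) plays the role of the DP-means penalty: a higher threshold yields more, tighter clusters of directions.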
Bayesian estimation of discrete entropy with mixtures of stick-breaking priors
2012
Neural Information Processing Systems
We therefore define a family of continuous mixing measures such that the resulting mixture of Dirichlet or Pitman-Yor processes produces an approximately flat prior over H. ...
We derive formulas for the posterior mean and variance of H given data. ...
Shlens for retinal data, and Y. W. Teh for helpful comments on the manuscript. This work was supported by a Sloan Research Fellowship, McKnight Scholar's Award, and NSF CAREER Award IIS-1150186 (JP). ...
dblp:conf/nips/ArcherPP12
fatcat:vwyboyd4pzf7jcgrd2de5q6bn4
Small-Variance Asymptotics for Dirichlet Process Mixtures of SVMs
2014
Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference
Infinite SVM (iSVM) is a Dirichlet process (DP) mixture of large-margin classifiers. ...
This paper presents a small-variance asymptotic analysis to derive a simple and efficient algorithm, which monotonically optimizes a max-margin DP-means (M2DPM) problem, an extension of DP-means for both ...
M2DPM was developed by performing a small-variance asymptotic analysis of a Gibbs sampler for DP mixtures of SVMs. ...
doi:10.1609/aaai.v28i1.8959
fatcat:cd7i5rlek5bnli6hc7hlpiuudq
J. K. Ghosh's contribution to statistics: A brief outline
[chapter]
2008
Institute of Mathematical Statistics Collections
including prior and model selection. ...
In roughly chronological order, we discuss his major results in the areas of sequential analysis, foundations, asymptotics, and Bayesian inference. ...
In [20], it was shown that a prior with the Kullback-Leibler property, such as a suitable Pólya tree or a Dirichlet mixture process, can overcome the inconsistency property of Dirichlet processes for ...
doi:10.1214/074921708000000011
fatcat:usg3orwobzchrhjo4ocupub4im
Pitfalls in the use of Parallel Inference for the Dirichlet Process
2014
International Conference on Machine Learning
We end with suggestions of alternative paths of research for efficient non-approximate parallel inference for the Dirichlet process. ...
Recent work done by Lovell, Adams, and Mansingka (2012) and Williamson, Dubey, and Xing (2013) has suggested an alternative parametrisation for the Dirichlet process in order to derive non-approximate ...
Acknowledgments The authors would like to thank Dr Christian Steinruecken, Dr Daniel Roy, and Dr Jose Miguel Hernandez Lobato for reviewing the paper and their helpful comments. ...
dblp:conf/icml/GalG14
fatcat:ao5kccao4fcq3pnjmnv7f3zpmq
Showing results 1 — 15 out of 1,767 results