13,446 Hits in 3.4 sec

Computing Upper and Lower Bounds on Likelihoods in Intractable Networks [article]

Tommi S. Jaakkola, Michael I. Jordan
2013 arXiv   pre-print
We present deterministic techniques for computing upper and lower bounds on marginal probabilities in sigmoid and noisy-OR networks.  ...  These techniques become useful when the size of the network (or clique size) precludes exact computations. We illustrate the tightness of the bounds by numerical experiments.  ...  Saul and the anonymous reviewers for helpful comments and suggestions.  ... 
arXiv:1302.3586v1 fatcat:qtxetpczpvdrlen74cq2an3veu
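
As a concrete illustration of the kind of sigmoid bound such techniques rest on, below is a minimal Python check of the well-known Jaakkola–Jordan quadratic lower bound on the logistic function, σ(x) ≥ σ(ξ)·exp((x−ξ)/2 − λ(ξ)(x² − ξ²)) with λ(ξ) = tanh(ξ/2)/(4ξ). This sketches one building block only; it is not the paper's full algorithm for bounding marginals in a network.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def jj_lower_bound(x, xi):
    # lambda(xi) = tanh(xi/2) / (4 xi); the bound is tight at x = +/- xi
    lam = np.tanh(xi / 2.0) / (4.0 * xi)
    return sigmoid(xi) * np.exp((x - xi) / 2.0 - lam * (x**2 - xi**2))

xs = np.linspace(-6.0, 6.0, 121)
for xi in (1.0, 2.5):
    gap = sigmoid(xs) - jj_lower_bound(xs, xi)
    assert gap.min() >= -1e-12            # the bound never exceeds the true sigmoid
    tight = gap[np.argmin(np.abs(xs - xi))]
    print(f"xi={xi}: largest gap {gap.max():.4f}, gap at x=xi {tight:.2e}")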

On Reversing Jensen's Inequality

Tony Jebara, Alex Pentland
2000 Neural Information Processing Systems  
Jensen's inequality computes simple lower bounds on otherwise intractable quantities such as products of sums and latent log-likelihoods.  ...  This simplification then permits operations like integration and maximization. Quite often (e.g., in discriminative learning) upper bounds are needed as well.  ...  If we want overall lower and upper bounds on ℓ and L(X|Θ), we need to compute reverse-Jensen bounds.  ... 
dblp:conf/nips/JebaraP00 fatcat:csaozf35cbaaznefy56rwdg5qq
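
The forward Jensen bound the paper starts from is easy to verify numerically: for positive terms a_i and any distribution q, log Σ_i a_i ≥ Σ_i q_i log(a_i/q_i), with equality at q_i ∝ a_i. A minimal Python sketch follows; the reverse-Jensen upper bounds the paper derives have no comparable one-line form and are not attempted here.

import numpy as np

a = np.array([0.2, 1.5, 3.0, 0.7])           # positive terms of an intractable-style sum
true_val = np.log(a.sum())

def jensen_lower(a, q):
    # log sum_i a_i >= sum_i q_i * log(a_i / q_i) for any distribution q
    return np.sum(q * np.log(a / q))

q_uniform = np.full(len(a), 1.0 / len(a))
q_optimal = a / a.sum()                      # the equality case
print(f"true {true_val:.4f}, uniform-q bound {jensen_lower(a, q_uniform):.4f}, "
      f"optimal-q bound {jensen_lower(a, q_optimal):.4f}")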

Mean Field Theory for Sigmoid Belief Networks [article]

L. K. Saul, T. Jaakkola, M. I. Jordan
1996 arXiv   pre-print
Our mean field theory provides a tractable approximation to the true probability distribution in these networks; it also yields a lower bound on the likelihood of evidence.  ...  We develop a mean field theory for sigmoid belief networks based on ideas from statistical mechanics.  ...  Neal, and H. Seung for sharing early versions of their manuscripts and for providing many stimulating discussions about this work.  ... 
arXiv:cs/9603102v1 fatcat:gnz6ypmhpbhepn4o6pnrlxzvk4

Mean Field Theory for Sigmoid Belief Networks

L. K. Saul, T. Jaakkola, M. I. Jordan
1996 The Journal of Artificial Intelligence Research  
Our mean field theory provides a tractable approximation to the true probability distribution in these networks; it also yields a lower bound on the likelihood of evidence.  ...  We develop a mean field theory for sigmoid belief networks based on ideas from statistical mechanics.  ...  Neal, and H. Seung for sharing early versions of their manuscripts and for providing many stimulating discussions about this work.  ... 
doi:10.1613/jair.251 fatcat:z2cajh2zqbfklhxehyn4b2yzmq
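
To see what the mean-field lower bound looks like in miniature, here is a brute-force Python sketch on a tiny two-layer sigmoid belief network: a factorized Bernoulli q over the hidden units is fit by grid coordinate ascent, and its ELBO is compared against the exact log-likelihood obtained by enumerating all hidden states. The network size and the grid search are illustrative; the paper's closed-form updates with extra variational parameters are not reproduced.

import numpy as np
from itertools import product

rng = np.random.default_rng(0)
m, n = 3, 4                                  # hidden and visible units, small enough to enumerate
W = rng.normal(size=(n, m)); b = rng.normal(size=m); c = rng.normal(size=n)
v = rng.integers(0, 2, size=n).astype(float) # one observed visible vector
sig = lambda t: 1.0 / (1.0 + np.exp(-t))

def log_joint(h):                            # log p(v, h) for the two-layer network
    ph, pv = sig(b), sig(W @ h + c)
    return (np.sum(h*np.log(ph) + (1-h)*np.log(1-ph))
            + np.sum(v*np.log(pv) + (1-v)*np.log(1-pv)))

H = [np.array(h, float) for h in product([0, 1], repeat=m)]
log_px = np.logaddexp.reduce([log_joint(h) for h in H])   # exact log-likelihood

def elbo(q):                                 # exact ELBO of a factorized Bernoulli q
    out = 0.0
    for h in H:
        qh = np.prod(np.where(h > 0, q, 1 - q))
        out += qh * (log_joint(h) - np.log(qh))
    return out

q, grid = np.full(m, 0.5), np.linspace(0.01, 0.99, 99)
for _ in range(5):                           # crude coordinate ascent on each q_j
    for j in range(m):
        vals = []
        for g in grid:
            q[j] = g
            vals.append(elbo(q))
        q[j] = grid[int(np.argmax(vals))]
print(f"mean-field ELBO {elbo(q):.4f} <= exact log p(v) {log_px:.4f}")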

Field-theoretic methods for intractable probabilistic models [chapter]

Dennis Lucarelli, Cheryl Resch, I-Jeng Wang, Fernando J. Pineda
2003 Proceedings of the 2003 SIAM International Conference on Data Mining  
random variables in the network.  ...  We describe a general technique for estimating the intractable quantities that occur in a wide variety of largescale probabilistic models.  ...  Acknowledgements FJP acknowledges several very educational and enlightening discussions with Pierre Baldi, Manfred Opper and Peter Sollich.  ... 
doi:10.1137/1.9781611972733.33 dblp:conf/sdm/LucarelliRWP03 fatcat:nq362weahnakdbgvbi55k4jvv4

Variational autoencoders trained with q-deformed lower bounds

Septimia Sârbu, Luigi Malagò
2019 International Conference on Learning Representations  
By leveraging the q-deformed logarithm in the traditional lower bounds, ELBO and IWAE, and in the upper bound CUBO, we contribute to this line of research.  ...  In training, they exploit the power of variational inference by optimizing a lower bound on the model evidence.  ...  For one batch of images, we compute the qELBO lower bound and the CUBO upper bound (Dieng et al., 2017), averaged over the batch.  ... 
dblp:conf/iclr/SarbuM19 fatcat:zyqpafv4ujambnkh7l5ms23h5u
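
The one new ingredient relative to the standard bounds is the q-deformed logarithm, ln_q(x) = (x^(1−q) − 1)/(1 − q), which recovers the ordinary logarithm as q → 1. A minimal Python check of that limit (the qELBO itself, which substitutes ln_q into the ELBO, is not reconstructed here):

import numpy as np

def log_q(x, q):
    # q-deformed logarithm; reduces to np.log(x) in the limit q -> 1
    if abs(q - 1.0) < 1e-12:
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

x = 2.5
for q in (0.5, 0.9, 0.99, 0.999, 1.0):
    print(f"q={q}: ln_q({x}) = {log_q(x, q):.6f}")   # -> ln(2.5) ~ 0.916291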

Importance Weighted Hierarchical Variational Inference [article]

Artem Sobolev, Dmitry Vetrov
2019 arXiv   pre-print
We then give an upper bound on the Kullback-Leibler divergence and derive a family of increasingly tight variational lower bounds on the otherwise intractable standard evidence lower bound for hierarchical models.  ...  To overcome this roadblock, we introduce a new family of variational upper bounds on a marginal log density in the case of hierarchical models (also known as latent variable models).  ... 
arXiv:1905.03290v1 fatcat:cu53wbwxwfbfla7vb4z2l2zdam
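
The tightening behavior of importance-weighted bounds is easy to reproduce on a toy model whose marginal is known in closed form: with z ~ N(0,1) and x|z ~ N(z,1), the marginal is p(x) = N(x; 0, 2), and the K-sample bound approaches log p(x) from below as K grows. A Python sketch with a deliberately mismatched proposal q(z) = N(1,1); all numerical choices are illustrative, and this is the plain importance-weighted bound rather than the paper's hierarchical construction.

import numpy as np

rng = np.random.default_rng(1)
x = 1.5
log_N = lambda v, mu, var: -0.5*np.log(2*np.pi*var) - (v - mu)**2 / (2*var)
true_log_px = log_N(x, 0.0, 2.0)             # z ~ N(0,1), x|z ~ N(z,1)  =>  p(x) = N(x; 0, 2)

def iwae_bound(K, S=20000):
    qmu, qvar = 1.0, 1.0                     # deliberately mismatched proposal q(z) = N(1, 1)
    z = rng.normal(qmu, np.sqrt(qvar), size=(S, K))
    logw = log_N(z, 0.0, 1.0) + log_N(x, z, 1.0) - log_N(z, qmu, qvar)
    mx = logw.max(axis=1, keepdims=True)     # stable log-mean-exp over the K samples
    return np.mean(np.log(np.mean(np.exp(logw - mx), axis=1)) + mx[:, 0])

for K in (1, 5, 25, 125):
    print(f"K={K:3d}: bound {iwae_bound(K):.4f}  (log p(x) = {true_log_px:.4f})")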

Neural Variational Inference and Learning in Undirected Graphical Models [article]

Volodymyr Kuleshov, Stefano Ermon
2017 arXiv   pre-print
Central to our approach is an upper bound on the log-partition function parametrized by a function q that we express as a flexible neural network.  ...  Here, we propose black-box learning and inference algorithms for undirected models that optimize a variational approximation to the log-likelihood of the model.  ...  This work is supported by the Intel Corporation, Toyota, NSF (grants 1651565, 1649208, 1522054) and by the Future of Life Institute (grant 2016-158687).  ... 
arXiv:1711.02679v2 fatcat:ozbyqcnqbze7pn7tr6n7cp5iua
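
The two-sided idea can be checked exactly on a model small enough to enumerate. With importance weights w = p̃(x)/q(x) we have E_q[w] = Z, so Jensen gives the lower bound E_q[log w] ≤ log Z, while E_q[w²] ≥ (E_q[w])² gives the upper bound ½·log E_q[w²] ≥ log Z. A Python sketch on a 5-unit Boltzmann machine with a fixed uniform q, rather than the flexible neural q the paper optimizes:

import numpy as np
from itertools import product

rng = np.random.default_rng(2)
n = 5
Wc = np.triu(rng.normal(scale=0.5, size=(n, n)), 1)   # pairwise couplings (i < j)
bb = rng.normal(scale=0.5, size=n)

X = np.array(list(product([0, 1], repeat=n)), float)  # all 2^n states
log_p_tilde = np.einsum('bi,ij,bj->b', X, Wc, X) + X @ bb
logZ = np.logaddexp.reduce(log_p_tilde)               # exact log-partition function

log_q = -n * np.log(2.0)                              # fixed uniform proposal q(x) = 2^-n
log_w = log_p_tilde - log_q                           # log importance weights, E_q[w] = Z
lower = np.mean(log_w)                                # Jensen:  E_q[log w] <= log Z
upper = 0.5 * (np.logaddexp.reduce(2.0*log_w) + log_q)  # 0.5 log E_q[w^2] >= log Z
print(f"{lower:.4f} <= logZ = {logZ:.4f} <= {upper:.4f}")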

Variational Bayes on Monte Carlo Steroids

Aditya Grover, Stefano Ermon
2016 Neural Information Processing Systems  
We demonstrate empirical improvements on benchmark datasets in vision and language for sigmoid belief networks, where a neural network is used to approximate the posterior.  ...  We provide a new approach for learning latent variable models based on optimizing our new bounds on the log-likelihood.  ...  Acknowledgments This work was supported by grants from the NSF (grant 1649208) and Future of Life Institute (grant 2016-158687).  ... 
dblp:conf/nips/GroverE16 fatcat:hp2rmlxuira3pbiurpggpiyldi

A Tighter Bound for Graphical Models

M. A. R. Leisink, H. J. Kappen
2001 Neural Computation  
We show that the third-order bound is strictly better than mean field. Additionally we outline how this bound is applicable to sigmoid belief networks.  ...  We present a method to bound the partition function of a Boltzmann machine neural network with any odd-order polynomial. This is a direct extension of the mean field bound, which is first order.  ...  Acknowledgements This research is supported by the Technology Foundation STW, applied science division of NWO and the technology programme of the Ministry of Economic Affairs.  ... 
doi:10.1162/089976601750399344 pmid:11516361 fatcat:iabuokjrlrenreakyt7seoa32a
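
The first-order (mean field) bound that the paper extends to higher odd orders is simple to reproduce: for a Boltzmann machine with couplings J and biases h, a factorized distribution with means m gives log Z ≥ mᵀJm + h·m + Σ_i H(m_i), where H is the binary entropy. A Python sketch against exact enumeration; the third-order bound itself is not attempted here.

import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n = 5
J = np.triu(rng.normal(scale=0.8, size=(n, n)), 1)    # couplings on pairs i < j
h = rng.normal(scale=0.5, size=n)                     # biases
S = J + J.T                                           # symmetrized view for the updates

X = np.array(list(product([0, 1], repeat=n)), float)
logZ = np.logaddexp.reduce(np.einsum('bi,ij,bj->b', X, J, X) + X @ h)  # exact

m = np.full(n, 0.5)
for _ in range(200):                                  # mean-field fixed-point iteration
    for i in range(n):
        m[i] = 1.0 / (1.0 + np.exp(-(h[i] + S[i] @ m)))

ent = -(m*np.log(m) + (1 - m)*np.log(1 - m)).sum()    # entropy of the factorized q
mf = np.einsum('i,ij,j->', m, J, m) + h @ m + ent     # first-order lower bound
print(f"mean-field bound {mf:.4f} <= exact logZ {logZ:.4f}")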

Recursive Algorithms for Approximating Probabilities in Graphical Models

Tommi S. Jaakkola, Michael I. Jordan
1996 Neural Information Processing Systems  
The approximations we use are controlled: they maintain upper and lower bounds on the desired quantities at all times.  ...  We develop a recursive node-elimination formalism for efficiently approximating large probabilistic networks. No constraints are set on the network topologies.  ...  This applies to parameter estimation as well, even if only a lower bound on the likelihood of examples is used; such a likelihood bound relies on both upper and lower bounds on partition functions.  ... 
dblp:conf/nips/JaakkolaJ96 fatcat:qrwxrare7bfsxnvb2sw6scluhq

Approximating probabilistic inference in Bayesian belief networks

P. Dagum, R.M. Chavez
1993 IEEE Transactions on Pattern Analysis and Machine Intelligence  
Unless P=NP, an efficient, exact algorithm does not exist to compute probabilistic inference in belief networks.  ...  A belief network comprises a graphical representation of dependencies between variables of a domain and a set of conditional probabilities associated with each dependency.  ...  Because this prior knowledge is unknown in advance, BNRAS, likelihood weighting, and logic sampling algorithms employ easily computable lower bounds on the inference to yield upper bounds on N.  ... 
doi:10.1109/34.204906 fatcat:iizconqtvbgfpannqctrpjbjyi

Approximating probabilistic inference in Bayesian belief networks is NP-hard

Paul Dagum, Michael Luby
1993 Artificial Intelligence  
Unless P=NP, an efficient, exact algorithm does not exist to compute probabilistic inference in belief networks.  ...  A belief network comprises a graphical representation of dependencies between variables of a domain and a set of conditional probabilities associated with each dependency.  ...  Because this prior knowledge is unknown in advance, BNRAS, likelihood weighting, and logic sampling algorithms employ easily computable lower bounds on the inference to yield upper bounds on N.  ... 
doi:10.1016/0004-3702(93)90036-b fatcat:73ytlqceyvcoldp7jde3ud5kn4
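
Likelihood weighting, one of the sampling schemes named in the abstract, fits in a few lines on a toy chain A → B → C with evidence C = 1: sample the non-evidence nodes from their conditionals and weight each sample by the probability of the evidence. This is a generic textbook sketch with made-up probability-table values, not the BNRAS algorithm analyzed in the paper.

import numpy as np

rng = np.random.default_rng(4)
pA = 0.3
pB_given_A = {0: 0.2, 1: 0.8}                # Pr[B=1 | A]
pC_given_B = {0: 0.1, 1: 0.7}                # Pr[C=1 | B]

# Exact Pr[A=1 | C=1] by enumeration over A and B
num = den = 0.0
for a in (0, 1):
    for b in (0, 1):
        p = ((pA if a else 1 - pA)
             * (pB_given_A[a] if b else 1 - pB_given_A[a])
             * pC_given_B[b])
        den += p
        if a == 1:
            num += p
exact = num / den

# Likelihood weighting: sample A and B, weight each sample by Pr[C=1 | B]
wsum = wA1 = 0.0
for _ in range(100000):
    a = int(rng.random() < pA)
    b = int(rng.random() < pB_given_A[a])
    w = pC_given_B[b]
    wsum += w
    wA1 += w * a
print(f"exact {exact:.4f}  likelihood-weighting estimate {wA1/wsum:.4f}")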

Tractable Variational Structures for Approximating Graphical Models

David Barber, Wim Wiegerinck
1998 Neural Information Processing Systems  
However, the computing time is typically exponential in the number of nodes in the graph.  ...  Graphical models provide a broad probabilistic framework with applications in speech recognition (Hidden Markov Models), medical diagnosis (Belief networks) and artificial intelligence (Boltzmann Machines).  ...  (c,d,e,f): structures of the directed approximations on H. For each structure, histograms of the relative error between the true log-likelihood and the lower bound are plotted.  ... 
dblp:conf/nips/BarberW98 fatcat:kbrdfuqoqbczth7bh3m4nszv3y

Large Deviation Methods for Approximate Probabilistic Inference [article]

Michael Kearns, Lawrence Saul
2013 arXiv   pre-print
In large networks where exact probabilistic inference is intractable, we show how to compute upper and lower bounds on many probabilities of interest.  ...  In particular, using methods from large deviation theory, we derive rigorous bounds on marginal probabilities such as Pr[children] and prove rates of convergence for the accuracy of our bounds as a function  ...  In large networks, where exact probabilistic inference is intractable, we show how to compute upper and lower bounds on various probabilities of interest.  ... 
arXiv:1301.7392v1 fatcat:ehxyz4pjj5fc7dziescbgrtazi
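
The flavor of these large-deviation bounds can be seen with a plain Hoeffding bound on a weighted sum of independent binary parents: Pr[S − E[S] ≥ t] ≤ exp(−2t²/Σ_i w_i²). The Python sketch below compares that generic bound to the exact tail computed by enumeration; the paper's rates for sigmoid and noisy-OR networks are sharper than this.

import numpy as np
from itertools import product

rng = np.random.default_rng(5)
n = 10
w = rng.uniform(0.2, 1.0, size=n)            # weights on the parent variables
p = rng.uniform(0.2, 0.8, size=n)            # Pr[X_i = 1]
mean = w @ p                                 # E[S] for S = sum_i w_i X_i

X = np.array(list(product([0, 1], repeat=n)), float)   # all 2^n parent states
probs = np.prod(np.where(X > 0, p, 1 - p), axis=1)
S = X @ w

for t in (0.5, 1.0, 1.5):
    exact = probs[S - mean >= t].sum()                  # exact tail by enumeration
    hoeffding = np.exp(-2.0 * t**2 / np.sum(w**2))      # large-deviation upper bound
    print(f"t={t}: exact tail {exact:.4f} <= Hoeffding bound {hoeffding:.4f}")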
Showing results 1 — 15 out of 13,446 results