Block clustering with collapsed latent block models
[article]
2010
arXiv
pre-print
We introduce a Bayesian extension of the latent block model for model-based block clustering of data matrices. Our approach considers a block model where block parameters may be integrated out. ...
This differs from existing work on latent block models, where the number of clusters is assumed known or is chosen using some information criteria. ...
Conclusion: We have considered a collapsed Bayesian extension of the Latent Block Model of Govaert & Nadif (2008). ...
arXiv:1011.2948v1
fatcat:5jn3w234j5fnhdvr2ylk6tx2hy
SC-VAE: Sparse Coding-based Variational Autoencoder with Learned ISTA
[article]
2024
arXiv
pre-print
The second class of methods learns instead a discrete latent representation using vector quantization (VQ) along with a codebook. ...
The first class suffers from posterior collapse, whereas the second class suffers from codebook collapse. ...
With an increasing number of downsampling blocks, our model struggled with image reconstruction. ...
arXiv:2303.16666v2
fatcat:jl3z7wigirfsdmaathvcfkvau4
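The vector quantization step this abstract refers to can be sketched minimally as a nearest-codebook lookup (toy codebook and latent vector, Euclidean distance assumed; this is not SC-VAE's learned ISTA machinery):

```python
# Minimal sketch of vector quantization against a codebook:
# each continuous latent vector is replaced by its nearest codebook entry.

def quantize(latent, codebook):
    """Return (index, entry) of the codebook vector closest to `latent`."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(codebook)), key=lambda k: sq_dist(latent, codebook[k]))
    return best, codebook[best]

# Toy 2-D codebook with three entries.
codebook = [[0.0, 0.0], [1.0, 1.0], [-1.0, 0.5]]
idx, entry = quantize([0.9, 1.2], codebook)  # nearest entry is [1.0, 1.0]
```

Codebook collapse, mentioned in the abstract, is the failure mode where only a few entries ever win this lookup.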
A research article submitted to Water Resources Research: Electro‐Thermal Subsurface Gas Generation and Transport: Model Validation and Implications
2019
Water Resources Research
Water boiling plateaus (i.e., latent heat), heat recirculation within steam clusters, and steam collapse (i.e., condensation) mechanisms were added to ET-MIP. ...
When coupled with continuum heat and mass transport models, MIP has the potential to simulate complex subsurface scenarios. ...
All of the model inputs required to reproduce the presented results are available from cited references. ...
doi:10.1029/2018wr024095
fatcat:zagddjdscjanlad5slvr62ex7e
BayesPy: Variational Bayesian Inference in Python
[article]
2015
arXiv
pre-print
It also supports some advanced methods such as stochastic and collapsed variational inference. ...
It is based on the variational message passing framework and supports conjugate exponential family models. ...
For GMM, the small model used 10 clusters for 200 observations with 2 dimensions, and the large model used 40 clusters for 2000 observations with 10 dimensions. ...
arXiv:1410.0870v3
fatcat:i4pptetpjfgwferh477sh4sv7e
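The conjugate exponential family setting the abstract mentions means each variational update has closed form; a minimal sketch for the simplest case, the posterior over a Gaussian mean with known noise precision (illustrative function names, not BayesPy's API):

```python
# Conjugate update for the mean of a Gaussian with known noise precision:
# the kind of closed-form message underlying variational message passing.

def gaussian_mean_update(data, prior_mean, prior_prec, noise_prec):
    """Return posterior (mean, precision) for the unknown Gaussian mean."""
    n = len(data)
    post_prec = prior_prec + n * noise_prec
    post_mean = (prior_prec * prior_mean + noise_prec * sum(data)) / post_prec
    return post_mean, post_prec

# Vague prior (precision 1e-6), so the posterior mean is close to the sample mean.
m, p = gaussian_mean_update([1.0, 2.0, 3.0],
                            prior_mean=0.0, prior_prec=1e-6, noise_prec=1.0)
```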
Blocking Collapsed Gibbs Sampler for Latent Dirichlet Allocation Models
[article]
2016
arXiv
pre-print
Inference for this model typically involves a single-site collapsed Gibbs sampling step for latent variables associated with observations. ...
In this article, we introduce a blocking scheme to the collapsed Gibbs sampler for the LDA model which can, with a theoretical guarantee, improve chain mixing efficiency. ...
In this article, we propose a blocking scheme to improve the efficiency of the collapsed Gibbs sampler for the latent Dirichlet allocation (LDA) model, which is popular for topic modelling. ...
arXiv:1608.00945v1
fatcat:y77koeiqdbdpxppeogbgawq34i
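The single-site collapsed Gibbs step this entry builds on resamples one token's topic from a conditional proportional to (n_dk + alpha)(n_kw + beta)/(n_k + V*beta); a sketch with toy counts and symmetric priors (this is the standard update, not the paper's blocking scheme):

```python
# Single-site collapsed Gibbs conditional for LDA, with the current token's
# assignment already removed from the counts:
#   p(z = k) ∝ (n_dk + alpha) * (n_kw + beta) / (n_k + V * beta)
# n_dk: topic counts in the document; n_kw: counts of this word per topic;
# n_k: total tokens per topic; V: vocabulary size.

def lda_conditional(n_dk, n_kw, n_k, alpha, beta, V):
    weights = [(n_dk[k] + alpha) * (n_kw[k] + beta) / (n_k[k] + V * beta)
               for k in range(len(n_dk))]
    total = sum(weights)
    return [w / total for w in weights]

# Toy example: 2 topics, vocabulary of 5 words.
probs = lda_conditional(n_dk=[3, 1], n_kw=[2, 0], n_k=[10, 8],
                        alpha=0.1, beta=0.01, V=5)
```

A blocking scheme resamples several such variables jointly instead of one at a time, which is what improves mixing.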
TriNet: stabilizing self-supervised learning from complete or slow collapse on ASR
[article]
2023
arXiv
pre-print
Self-supervised learning (SSL) models confront challenges of abrupt informational collapse or slow dimensional collapse. ...
TriNet learns the SSL latent embedding space and incorporates it to a higher level space for predicting pseudo target vectors generated by a frozen teacher. ...
For pre-training, TriNet uses all but the last Conformer blocks for encoding the mid-level latent embedding space illustrated as blank blocks in Fig. 1, and dedicates the last Conformer block to the ...
arXiv:2301.00656v2
fatcat:3q32myoxdzczjmlgwxrpwn355i
Chain graphs for multilevel models
2007
Statistics and Probability Letters
After a brief introduction to multilevel models and a description of the conditional independencies derived from the model, the paper defines chain graphs for multilevel models. ...
The present work proposes a possible solution to extend graphical models for correlated data. ...
The introduction of a latent node, representing the course programme effect, is substantial: the likelihood ratio test comparing the models with and without the latent grouping effect is significant. ...
doi:10.1016/j.spl.2006.07.011
fatcat:n5spwuborfevldybijlsj6rgmq
Modeling the social media relationships of Irish politicians using a generalized latent space stochastic blockmodel
[article]
2020
arXiv
pre-print
A Bayesian method with Markov chain Monte Carlo sampling is proposed for estimation of model parameters. ...
The proposed model is capable of representing transitivity, clustering, as well as disassortative mixing. ...
Ryan et al. (2017) proposed a Bayesian model selection method for the latent position cluster model by collapsing the model to integrate out the model parameters which allows posterior inference over the ...
arXiv:1807.06063v2
fatcat:hgvb2ealtfekvboty7glf6dar4
Discovering Relevance-Dependent Bicluster Structure from Relational Data: A Model and Algorithm
2018
Transactions of the Japanese society for artificial intelligence
and (2) all clusters are related to at least one dense block. ...
The proposed model factorizes relational data into bicluster structure with two features: (1) each object in a cluster has a relevance value, which indicates how strongly the object relates to the cluster ...
However, in general, the number of latent blocks KL underlying relational data increases as the size of the given data grows. ...
doi:10.1527/tjsai.b-i46
fatcat:l57nqf7ptnhobd62vkfgxw2ekq
Inferring Vertex Properties from Topology in Large Networks
2007
Mining and Learning with Graphs
We introduce a simple probabilistic latent-variable model which finds either latent blocks or more graded structures, depending on hyperparameters. ...
With collapsed Gibbs sampling it can be estimated for networks of 10^6 vertices or more, and the number of latent components adapts to data through a Dirichlet process prior. ...
The algorithm introduced here is a simple generative probabilistic latent mixture model, fitted with (collapsed) Gibbs sampling [5]. ...
dblp:conf/mlg/SinkkonenAK07
fatcat:mcwzce3dsrbhrm2ui72bobwwqm
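The Dirichlet process prior mentioned in this abstract lets the number of components grow with the data; in collapsed Gibbs sampling it appears as the Chinese restaurant process conditional (toy counts, illustrative names):

```python
# Chinese restaurant process conditional: the next item joins existing
# component k with probability proportional to its count, or starts a new
# component with probability proportional to the concentration alpha.

def crp_probs(counts, alpha):
    """Return probabilities over existing components plus one new component."""
    total = sum(counts) + alpha
    return [c / total for c in counts] + [alpha / total]

# Two existing components with 3 and 1 members, concentration 1.0.
probs = crp_probs([3, 1], alpha=1.0)  # [0.6, 0.2, 0.2]
```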
Fast and reliable inference algorithm for hierarchical stochastic block models
[article]
2017
arXiv
pre-print
Statistical inference often treats groups as latent variables, with observed networks generated from latent group structure, termed a stochastic block model. ...
Here we present scalable and reliable algorithms that recover hierarchical stochastic block models fast and accurately. ...
Degree-corrected stochastic block model In some situations, a degree-corrected stochastic block model (DSBM) provides more appealing group structures in a real-world network [14]. ...
arXiv:1711.05150v1
fatcat:ctm2yk272reqddgkqvn2llifai
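The generative view in this abstract, networks generated from latent group structure, can be sketched as follows (toy labels and block matrix, undirected graph; not the paper's hierarchical or degree-corrected variant):

```python
import random

# Stochastic block model generator: edge (i, j) appears with probability
# B[g_i][g_j], where g_i is node i's latent group label.

def sample_sbm(labels, B, seed=0):
    """Sample a symmetric adjacency matrix from group labels and block matrix B."""
    rng = random.Random(seed)
    n = len(labels)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < B[labels[i]][labels[j]]:
                adj[i][j] = adj[j][i] = 1
    return adj

# Two assortative groups: dense within-group blocks, sparse between.
labels = [0, 0, 0, 1, 1, 1]
B = [[0.9, 0.05], [0.05, 0.9]]
adj = sample_sbm(labels, B)
```

Inference inverts this: given only `adj`, recover the latent `labels` (and, hierarchically, groups of groups).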
Bayesian model selection for the latent position cluster model for Social Networks
[article]
2013
arXiv
pre-print
The latent position cluster model is a popular model for the statistical analysis of network data. ...
This is an appealing approach since it allows the model to cluster actors which consequently provides the practitioner with useful qualitative information. ...
A trans-model algorithm for the collapsed latent position cluster model Markov chain Monte Carlo sampling of the collapsed posterior distribution for the latent position cluster model is carried out using ...
arXiv:1308.4871v1
fatcat:db5gipqhmbhbfilzzwxyelqewa
Bayesian model selection for the latent position cluster model for social networks
2017
Network Science
The latent position cluster model is a popular model for the statistical analysis of network data. ...
This is an appealing approach since it allows the model to cluster actors which consequently provides the practitioner with useful qualitative information. ...
A trans-model algorithm for the collapsed latent position cluster model Markov chain Monte Carlo sampling of the collapsed posterior distribution for the latent position cluster model is carried out using ...
doi:10.1017/nws.2017.6
fatcat:3bl3jpalxrehroym2lgahcj6pi
Accelerating Collapsed Variational Bayesian Inference for Latent Dirichlet Allocation with Nvidia CUDA Compatible Devices
[chapter]
2009
Lecture Notes in Computer Science
While LDA is an efficient Bayesian multi-topic document model, it requires complicated computations for parameter estimation in comparison with other simpler document models, e.g. probabilistic latent ...
In this paper, we propose an acceleration of collapsed variational Bayesian (CVB) inference for latent Dirichlet allocation (LDA) by using Nvidia CUDA compatible devices. ...
We are now implementing our method on a cluster of PCs equipped with graphics cards. ...
doi:10.1007/978-3-642-02568-6_50
fatcat:xvubkiiymfb65j3r7kdgduufxa
GOLLIC: Learning Global Context beyond Patches for Lossless High-Resolution Image Compression
[article]
2022
arXiv
pre-print
To address this problem, we propose a hierarchical latent variable model with a global context to capture the long-term dependencies of high-resolution images. ...
Later, shared latent variables are learned according to latent variables of patches and their confidence, which reflects the similarity of patches in the same cluster and benefits the global context modeling ...
Thus, the clustering module collapsed in the model with scale = 1. ...
arXiv:2210.03301v1
fatcat:tcdawumdujfvxagic4fucjnvda
Showing results 1–15 of 14,756 results