DeepECT: The Deep Embedded Cluster Tree
2020
Data Science and Engineering
In this paper, we propose the Deep Embedded Cluster Tree (DeepECT), the first divisive hierarchical embedded clustering method. ...
The cluster tree does not need to know the actual number of clusters during optimization. Instead, the level of detail to be analyzed can be chosen afterward and for each sub-tree separately. ...
Deep Embedded Cluster Tree
Overview: In this section, we discuss the deep embedded cluster tree (DeepECT). An implementation can be found at https://dmm.dbs.ifi.lmu.de/downloads. ...
doi:10.1007/s41019-020-00134-0
fatcat:gdhwz5pokzdpjoeyphf2d4pgcu
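The property highlighted in this abstract — choosing the level of detail after optimization, separately for each sub-tree — can be sketched with a toy cluster tree. This is an illustrative sketch only; the class and function names are invented and are not from the DeepECT implementation.

```python
# Toy cluster tree: the hierarchy is fixed, but the granularity of the
# flat clustering is chosen afterward by cutting at a given depth.

class ClusterNode:
    def __init__(self, points, left=None, right=None):
        self.points = points          # indices of points assigned to this node
        self.left, self.right = left, right

    def is_leaf(self):
        return self.left is None and self.right is None

def clusters_at_depth(node, depth):
    """Collect the clusters obtained by cutting the tree at a given depth."""
    if depth == 0 or node.is_leaf():
        return [node.points]
    return clusters_at_depth(node.left, depth - 1) + \
           clusters_at_depth(node.right, depth - 1)

# A toy tree over six points: the root splits into {0,1,2} and {3,4,5};
# only the left branch is refined further, as a divisive method might do.
tree = ClusterNode([0, 1, 2, 3, 4, 5],
                   left=ClusterNode([0, 1, 2],
                                    left=ClusterNode([0, 1]),
                                    right=ClusterNode([2])),
                   right=ClusterNode([3, 4, 5]))

print(clusters_at_depth(tree, 1))  # two coarse clusters
print(clusters_at_depth(tree, 2))  # left branch refined, right stays whole
```

Nothing about the number of clusters is baked in during construction; different depths (or different cuts per sub-tree) yield different flat clusterings from the same tree.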
Deep hierarchical embedding for simultaneous modeling of GPCR proteins in a unified metric space
2021
Scientific Reports
Furthermore, we demonstrated that further downstream tasks, like phylogenetic reconstruction and motif discovery, are feasible in the constructed embedding space. ...
Metric distances in the deep feature space corresponded to the hierarchical family relation between GPCR proteins. ...
doi:10.1038/s41598-021-88623-8
pmid:33953216
fatcat:cxewfy4ofje5nermpfr6ndmvn4
An Empirical Study on Clustering Pretrained Embeddings: Is Deep Strictly Better?
[article]
2022
arXiv
pre-print
Recent research in clustering face embeddings has found that unsupervised, shallow, heuristic-based methods – including k-means and hierarchical agglomerative clustering – underperform supervised, deep ...
Notably, deep methods are surprisingly fragile for embeddings with more uncertainty, where they match or even perform worse than shallow, heuristic-based methods. ...
Embedding ℓ2-Norm Confidence + GCN-E + Tree Deduction. ...
arXiv:2211.05183v1
fatcat:6zjfid55ijgaxko5bl5t5rbje4
Triphone State-Tying via Deep Canonical Correlation Analysis
2016
Interspeech 2016
The learned embeddings capture similarity between triphones and are highly predictable from the acoustics. We then cluster the embeddings and use cluster IDs as tied states. ...
Our method first learns low-dimensional embeddings of context-dependent phones using deep canonical correlation analysis. ...
Every subphonetic state is assigned a cluster according to the trained decision tree, and every state within a cluster shares the same DNN target or GMM. ...
doi:10.21437/interspeech.2016-1300
dblp:conf/interspeech/WangTL16
fatcat:rznwp7g3o5aezl7swgc2pukjzi
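The tying step these snippets describe — cluster the learned embeddings, then reuse the cluster ID as the tied-state label — can be sketched with a tiny k-means. The toy embeddings and the code below are illustrative assumptions, not the paper's implementation.

```python
# Cluster toy "triphone" embeddings; members of a cluster share a tied state.

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    centers = points[:k]                      # deterministic init for the demo
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist(p, centers[i]))].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers

def tied_state(embedding, centers):
    """A triphone's tied state is simply the ID of its nearest cluster centre."""
    return min(range(len(centers)), key=lambda i: dist(embedding, centers[i]))

# two well-separated groups of toy triphone embeddings
emb = [(0.0, 0.1), (0.1, 0.0), (5.0, 5.1), (5.1, 4.9)]
centers = kmeans(emb, k=2)
states = [tied_state(e, centers) for e in emb]
print(states)  # embeddings in the same cluster share a tied-state ID
```

All states mapped to the same cluster would then share one DNN target or GMM, as the snippet above describes.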
Deep Extreme Multi-label Learning
[article]
2018
arXiv
pre-print
In this paper, we propose a practical deep embedding method for extreme multi-label classification, which harvests the ideas of non-linear embedding and graph-priors-based label-space modeling simultaneously ...
Meanwhile, deep learning has been widely studied and used in various classification problems, including multi-label classification; however, it has not been properly introduced to XML, where the label ...
Moreover, f_x and f_y are close to each other as measured by the embedding loss. • Cluster embedding space: After getting the feature embedding set {f_{x_i}}, we partition the set into several clusters with ...
arXiv:1704.03718v4
fatcat:iybndvgxv5cwre5yg6ruakl32y
Revisiting Semantic Representation and Tree Search for Similar Question Retrieval
[article]
2019
arXiv
pre-print
So we design a specific tree for searching and combine it with a deep model to solve this problem. We fine-tune BERT on the training data to obtain semantic vectors (sentence embeddings) for the test data. ...
We use all the sentence embeddings of the test data to build our tree based on k-means, and perform beam search at prediction time when given a sentence as a query. ...
Tree Building: We choose 5, 8, and 10 as the number of clusters for k-means, and name the resulting trees the 5-K tree, 8-K tree, and 10-K tree accordingly. ...
arXiv:1908.08326v8
fatcat:ebsoxz2hbfd6lcoiiz2ce5nxdq
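The pipeline this abstract sketches — k-means over sentence embeddings to build a tree, then beam search down the tree at prediction time — might look like the following. All details (node layout, scoring, toy data) are assumptions for illustration, not the authors' code.

```python
# Recursively partition embeddings with k-means, then answer a query by
# descending the tree with a small beam and returning the closest point.

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=10):
    centers = points[:k]                      # deterministic init for the demo
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist(p, centers[i]))].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

def build_tree(points, k=2, leaf_size=2):
    if len(points) <= leaf_size:
        return {"leaf": points}
    centers, groups = kmeans(points, k)
    return {"centers": centers,
            "children": [build_tree(g, k, leaf_size) for g in groups if g]}

def beam_search(tree, query, beam=2):
    def score(node):                          # distance to nearest point/centre
        pts = node["leaf"] if "leaf" in node else node["centers"]
        return min(dist(query, p) for p in pts)

    frontier = [tree]
    while any("leaf" not in node for node in frontier):
        expanded = []
        for node in frontier:
            expanded += node["children"] if "children" in node else [node]
        frontier = sorted(expanded, key=score)[:beam]    # keep the best `beam`
    return min((p for node in frontier for p in node["leaf"]),
               key=lambda p: dist(query, p))

embeddings = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0),
              (5.2, 5.0), (9.0, 9.0), (9.1, 9.0)]
tree = build_tree(embeddings)
print(beam_search(tree, (4.9, 5.0)))  # nearest stored embedding
```

The beam width trades recall against the number of nodes visited, which is the point of searching a tree instead of scanning all embeddings.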
Objective-Based Hierarchical Clustering of Deep Embedding Vectors
2021
Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21)
We initiate a comprehensive experimental study of objective-based hierarchical clustering methods on massive datasets consisting of deep embedding vectors from computer vision and NLP applications. ...
This is the first substantial improvement over the trivial 2/3-approximation achieved by a random binary tree. ...
We study the performance of objective-based hierarchical clustering methods on large, beyond-worst-case datasets consisting of deep vector embeddings. ...
doi:10.1609/aaai.v35i10.17094
fatcat:2eu6v2mgmrbylhdh2s34pacfxq
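One widely used objective in this line of work is the Moseley–Wang revenue, which rewards a hierarchy sim(i, j) · (n − |leaves(lca(i, j))|) for every pair, so that similar pairs should be separated as deep in the tree as possible. The sketch below uses an invented tree encoding and made-up similarities; it only illustrates how such an objective scores a candidate tree.

```python
# Score a binary hierarchy under the Moseley–Wang revenue objective.

def mw_revenue(tree, sim, n):
    """Revenue = sum over pairs (i, j) of sim[(i, j)] * (n - |leaves(lca(i, j))|).
    `tree` is a nested 2-tuple whose leaves are item names."""
    total = 0.0

    def walk(t):
        nonlocal total
        if not isinstance(t, tuple):
            return [t]                       # a leaf
        left, right = walk(t[0]), walk(t[1])
        here = left + right                  # pairs split here have lca at t
        for i in left:
            for j in right:
                s = sim.get((i, j), sim.get((j, i), 0.0))
                total += s * (n - len(here))
        return here

    walk(tree)
    return total

sim = {("a", "b"): 1.0, ("c", "d"): 1.0,
       ("a", "c"): 0.1, ("a", "d"): 0.1, ("b", "c"): 0.1, ("b", "d"): 0.1}
good = (("a", "b"), ("c", "d"))   # separates similar pairs deep in the tree
bad = (("a", "c"), ("b", "d"))
print(mw_revenue(good, sim, 4), mw_revenue(bad, sim, 4))
```

A tree that keeps high-similarity pairs together near the leaves scores strictly higher than one that splits them at the root.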
Tree-SNE: Hierarchical Clustering and Visualization Using t-SNE
[article]
2020
arXiv
pre-print
Building on recent advances in speeding up t-SNE and obtaining finer-grained structure, we combine the two to create tree-SNE, a hierarchical clustering and visualization algorithm based on stacked one-dimensional ...
Clustering (AEC, Song et al. 2013), NMF with Deep learning model (NMF-D, Trigeorgis et al. 2014), Task-specific Deep Architecture for Clustering (TSC-D, Wang et al. 2016), Deep Convolutional Embedded ...
arXiv:2002.05687v1
fatcat:jaxik32vlreylkhe4tmkk7jqpy
Cluster analysis of deep embeddings in real-time strategy games
2020
Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications II
The results of this work indicate that the use of deep embeddings provides a promising basis for clustering and interpreting player behavior in complex game domains. ...
In this article, we examine the effectiveness of learned deep embeddings via internal clustering metrics on a dataset comprising unlabelled StarCraft 2 game replays. ...
In this article, we examine the efficacy of deep embeddings applied to game play data via metrics for internal clustering. ...
doi:10.1117/12.2558105
fatcat:azvrg2axeffurng57galz3ygy4
Interpretability Study on Deep Learning for Jet Physics at the Large Hadron Collider
[article]
2019
arXiv
pre-print
After successful training of deep neural networks, examining the trained networks not only helps us understand the behaviour of neural networks, but also helps improve the performance of deep learning ...
problem at the LHC as an example, using recursive neural networks as a starting point, aim at a thorough understanding of the behaviour of the physics-oriented DNNs and the information encoded in the embedding ...
A jet representation is obtained through embedding each clustering node recursively along the clustering tree, and can then be easily fed into downstream tasks. ...
arXiv:1911.01872v1
fatcat:7s3v3idvwfhalnlb73ontcszjy
Intelligent anomaly detection for large network traffic with Optimized Deep Clustering (ODC) algorithm
2021
IEEE Access
The evaluation results show that the ODC deep clustering method outperforms the existing deep clustering methods for anomaly detection. ...
Deep clustering algorithms for anomaly detection have gained significant research attention in recent years. ...
For example, Deep Embedding Clustering (DEC) [8] , Improved Deep Embedding Clustering (IDEC) [9] and Deep Density-based Clustering (DDC) [10] use dense deep AutoEncoder, Deep Convolutional Embedded ...
doi:10.1109/access.2021.3068172
fatcat:v7i7p6j3yvas3fexgmb4wtj4yq
Deep Unsupervised Feature Learning for Natural Language Processing
2012
North American Chapter of the Association for Computational Linguistics
We investigate deep learning methods for unsupervised feature learning for NLP tasks. ...
Recent results indicate that features learned using deep learning methods are not a silver bullet and do not always lead to improved results. ...
They introduce a method for bootstrapping the construction of the tree by initially using a random binary tree to learn word embeddings, and then rebuilding the tree based on a clustering of the learned ...
dblp:conf/naacl/Gouws12
fatcat:dhgxworkebd7dfvhyhl6kk5ly4
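The bootstrap this snippet summarizes — learn word embeddings under a random binary tree, then rebuild the tree from a clustering of those embeddings — could be sketched with a greedy agglomerative rebuild. Everything below (tree encoding, toy embeddings, centroid-distance merging) is an illustrative assumption, not the surveyed method's code.

```python
# Rebuild a binary tree over words by repeatedly merging the two clusters
# with the closest centroids; leaves are 1-tuples, internal nodes 2-tuples.

def rebuild_tree(words, emb):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # each cluster: (subtree, centroid, size)
    clusters = [((w,), emb[w], 1) for w in words]
    while len(clusters) > 1:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]][1], clusters[ij[1]][1]))
        (ta, ca, na), (tb, cb, nb) = clusters[i], clusters[j]
        merged = ((ta, tb),
                  tuple((x * na + y * nb) / (na + nb) for x, y in zip(ca, cb)),
                  na + nb)
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return clusters[0][0]

emb = {"cat": (0.0, 0.0), "dog": (1.0, 0.0),
       "car": (10.0, 10.0), "bus": (12.0, 10.0)}
tree = rebuild_tree(list(emb), emb)
print(tree)  # similar words end up under a common internal node
```

After the rebuild, words with similar learned embeddings share low internal nodes, which is what makes the new tree a better code tree than the random initial one.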
Learning Latent Superstructures in Variational Autoencoders for Deep Multidimensional Clustering
[article]
2019
arXiv
pre-print
Whereas previous deep learning methods for clustering produce only one partition of data, LTVAE produces multiple partitions of data, each being given by one super latent variable. ...
We call our model the latent tree variational autoencoder (LTVAE). ...
Recently, deep learning based clustering methods have been proposed that simultaneously learn nonlinear embeddings through deep neural networks and perform cluster analysis on the embedding space. ...
arXiv:1803.05206v3
fatcat:gyzkg7u2cnahbmw3vtdjvi5v6a
AttentionXML: Label Tree-based Attention-Aware Deep Model for High-Performance Extreme Multi-Label Text Classification
[article]
2019
arXiv
pre-print
We propose a new label tree-based deep learning model for XMTC, called AttentionXML, with two unique features: 1) a multi-label attention mechanism with raw text as input, which allows it to capture the most ...
Traditionally most methods used bag-of-words (BOW) as inputs, ignoring word context as well as deep semantic information. ...
... (1-vs-All), MLC2Seq (deep learning), XML-CNN (deep learning), PfastreXML (instance tree), Parabel (label tree), XT (ExtremeText; label tree) and Bonsai (label tree). ...
arXiv:1811.01727v3
fatcat:xhvznzdwdzgffpzenbamxf5ipm
The Tree Inclusion Problem: In Optimal Space and Faster
[chapter]
2005
Lecture Notes in Computer Science
Given two rooted, ordered, and labeled trees P and T, the tree inclusion problem is to determine whether P can be obtained from T by deleting nodes of T. ...
Computing Deep Embeddings: In this section we present a general framework for answering tree inclusion queries. As in [10], we solve the equivalent tree embedding problem. ...
For any trees P and T, P ⊑ T iff there exists an embedding of P in T (Fig. 1(c)). We say that the embedding f is deep if there is no embedding g such that f(root(P)) ≺ g(root(P)). ...
doi:10.1007/11523468_6
fatcat:sn4y44qpxzeilfuikal3vcfiqy
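The problem statement in this chapter excerpt can be made concrete with a brute-force checker (exponential, purely illustrative; the chapter's contribution is an algorithm that is far faster and uses optimal space). Trees are encoded here as (label, children) pairs, an assumed encoding; deleting a node splices its children into its parent's child list.

```python
# Brute-force ordered tree inclusion: can P be obtained from T by deletions?

def size(t):
    return 1 + sum(size(c) for c in t[1])

def deletions(t):
    """All trees obtained from t by deleting exactly one node. The root may
    only be deleted when it has a single child (the result must stay a tree)."""
    label, ch = t
    if len(ch) == 1:
        yield ch[0]                                        # delete the root
    for i, c in enumerate(ch):
        yield (label, ch[:i] + c[1] + ch[i + 1:])          # delete child i
        for d in deletions(c):                             # delete deeper down
            yield (label, ch[:i] + [d] + ch[i + 1:])

def included(P, T):
    if size(P) > size(T):
        return False
    if size(P) == size(T):
        return P == T
    return any(included(P, t2) for t2 in deletions(T))

T = ("a", [("b", [("c", [])]), ("d", [])])
print(included(("a", [("c", []), ("d", [])]), T))   # delete b -> True
print(included(("a", [("d", []), ("c", [])]), T))   # order matters -> False
```

Since deletions preserve the left-to-right order of the remaining nodes, the second query fails: ordered inclusion is stricter than unordered inclusion.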
Showing results 1 — 15 out of 69,075 results