Deep Long-Tailed Learning: A Survey
[article]
2021
arXiv
pre-print
To be specific, we group existing deep long-tailed learning studies into three main categories (i.e., class re-balancing, information augmentation and module improvement), and review these methods following ...
can be easily biased towards dominant classes and perform poorly on tail classes. ...
In addition to multi-class classification, long-tailed learning is also applied to multi-label classification based on both artificial tasks [37] , [90] (i.e., VOC-LT and COCO-LT) and real-world tasks ...
arXiv:2110.04596v1
fatcat:lpvt2x6cv5crxm2qxdctjrlkqq
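The survey's first category, class re-balancing, can be illustrated with one widely used re-weighting scheme, the "effective number of samples" weighting: tail classes get larger loss weights so the model is not biased towards dominant classes. This is a minimal sketch of that general idea, not code from the survey; the function name and toy class counts are illustrative.

```python
import numpy as np

def class_balanced_weights(counts, beta=0.999):
    """Per-class loss weights from the 'effective number' of samples.

    Head classes (large counts) receive small weights and tail classes
    large ones, counteracting the bias towards dominant classes.
    """
    counts = np.asarray(counts, dtype=float)
    effective_num = 1.0 - np.power(beta, counts)
    weights = (1.0 - beta) / effective_num
    # normalise so the weights sum to the number of classes
    return weights * len(counts) / weights.sum()

# toy long-tailed distribution: one head class, two tail classes
w = class_balanced_weights([1000, 100, 10])
```

The resulting weights are multiplied into the per-sample cross-entropy loss; as `beta` approaches 1 the scheme approaches inverse-frequency weighting.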
SWIPENET: Object detection in noisy underwater images
[article]
2022
arXiv
pre-print
Then, based on the clean detector, multiple detectors focusing on learning diverse noisy data are trained and incorporated into a unified deep ensemble of strong noise immunity. ...
In this paper, we propose a novel Sample-WeIghted hyPEr Network (SWIPENET), and a robust training paradigm named Curriculum Multi-Class Adaboost (CMA), to address these two problems at the same time. ...
Different from the training loss based sample-reweight methods, Multi-Class Adaboost [29] re-weights the samples according to the classification results. ...
arXiv:2010.10006v3
fatcat:gxphoweyt5eylfi6wbfffiynh4
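The Multi-Class Adaboost re-weighting mentioned in the snippet, which adjusts sample weights according to classification results rather than training loss, follows the SAMME update: misclassified samples gain weight so subsequent detectors in the ensemble focus on them. A minimal sketch of one such re-weighting round, assuming hard predictions and a weak learner better than random:

```python
import numpy as np

def samme_reweight(sample_weights, y_true, y_pred, n_classes):
    """One round of SAMME (multi-class AdaBoost) sample re-weighting.

    Misclassified samples gain weight, so later members of the ensemble
    concentrate on the data the current detector gets wrong.
    """
    w = np.asarray(sample_weights, dtype=float)
    miss = (np.asarray(y_true) != np.asarray(y_pred)).astype(float)
    err = np.sum(w * miss) / np.sum(w)
    # the log(K - 1) term is what generalises binary AdaBoost to K classes
    alpha = np.log((1.0 - err) / err) + np.log(n_classes - 1)
    w = w * np.exp(alpha * miss)
    return w / w.sum(), alpha
```

`alpha` is also the voting weight of the current detector when the ensemble's predictions are combined.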
Dynamic Curriculum Learning for Imbalanced Data Classification
[article]
2019
arXiv
pre-print
(2) loss scheduler controls the learning importance between classification and metric learning loss. ...
To address this problem, we propose a unified framework called Dynamic Curriculum Learning (DCL) to adaptively adjust the sampling strategy and loss weighting online within a single batch, resulting in ...
Based on the above-introduced scheduler functions, we propose Dynamic Curriculum Learning framework for imbalanced data classification. ...
arXiv:1901.06783v2
fatcat:7mdcol5jjvhtzdnzf5ls7fvwdm
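DCL's sampling scheduler moves the data distribution "from imbalance to balance" over training. One simple way to realise such a schedule, shown here as an illustrative sketch rather than the paper's exact scheduler function, is to interpolate in exponent space between the empirical class frequencies and a uniform distribution:

```python
import numpy as np

def sampling_schedule(class_counts, progress):
    """Target class-sampling distribution at training progress in [0, 1].

    progress=0 reproduces the empirical, imbalanced distribution;
    progress=1 yields a uniform, balanced one.
    """
    counts = np.asarray(class_counts, dtype=float)
    p = counts ** (1.0 - progress)  # progress=1 -> all ones -> uniform
    return p / p.sum()

dist_start = sampling_schedule([900, 90, 10], 0.0)  # empirical frequencies
dist_end = sampling_schedule([900, 90, 10], 1.0)    # uniform over 3 classes
```

A batch sampler would draw each example's class from this distribution, with `progress` advanced once per epoch.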
Efficient Breast Cancer Classification Network with Dual Squeeze and Excitation in Histopathological Images
2022
Diagnostics
Our method outperformed the ResNet101, InceptionResNetV2, and EfficientNetV2 networks on the publicly available BreakHis dataset for binary and multi-class breast cancer classification in terms of precision ...
In this paper, we propose a convolutional neural network (CNN)-based breast cancer classification method for hematoxylin and eosin (H&E) whole slide images (WSIs). ...
Data Availability Statement: https://web.inf.ufpr.br/vri/databases/breast-cancer-histopathological-database-breakhis/ (accessed on 12 October 2022). ...
doi:10.3390/diagnostics13010103
pmid:36611396
pmcid:PMC9818943
fatcat:xroax6zzxffbhpfpep3jlkaege
CurriculumNet: Weakly Supervised Learning from Large-Scale Web Images
[article]
2018
arXiv
pre-print
With an ensemble of multiple models, we achieved a top-5 error rate of 5.2% on the WebVision challenge for 1000-category classification. ...
We develop a principled learning strategy by leveraging curriculum learning, with the goal of handling a massive amount of noisy labels and data imbalance effectively. ...
Fig. 4: Testing loss of four different models with the BN-Inception architecture, (left) density-based curriculum and (right) K-means-based curriculum. ...
arXiv:1808.01097v4
fatcat:2wpmcqnemrehjo7chvlvpeosra
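CurriculumNet's density-based curriculum orders web images so that samples in dense feature-space regions (assumed cleanly labelled) are learned first and sparse, likely-noisy samples later. A minimal sketch of that ordering idea, using inverse k-nearest-neighbour distance as the density estimate; the function and parameters are illustrative, not the paper's exact clustering procedure:

```python
import numpy as np

def density_curriculum(features, k=5):
    """Order samples from 'easy' (dense regions) to 'hard' (sparse regions).

    Density is estimated as the inverse of the mean distance to the k
    nearest neighbours in feature space; dense samples come first.
    """
    X = np.asarray(features, dtype=float)
    # pairwise Euclidean distances, with self-distances masked out
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    knn_dist = np.sort(d, axis=1)[:, :k].mean(axis=1)
    return np.argsort(knn_dist)  # indices: densest (easiest) first
```

Training then proceeds over growing prefixes of this ordering, so noisy outliers only enter late, once the model has a stable representation.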
Striking the Right Balance with Uncertainty
[article]
2019
arXiv
pre-print
We systematically study the class imbalance problem and derive a novel loss formulation for max-margin learning based on Bayesian uncertainty measure. ...
Learning unbiased models on imbalanced datasets is a significant challenge. ...
arXiv:1901.07590v3
fatcat:6znury3kg5hkppg3q7lx3vjqgy
How Important Is Each Dermoscopy Image?
2020
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
This paper addresses this issue through the extensive comparison of several sample weighting methods, namely class balance and curriculum learning. ...
Systems based on DNNs are able to achieve impressive diagnostic performances, even outperforming experienced dermatologists. ...
Acknowledgments This work was supported by the FCT project and multiyear funding: [CEECIND/ 00326/2017] and LARSyS -FCT Plurianual funding 2020-2023. ...
doi:10.1109/cvprw50498.2020.00379
dblp:conf/cvpr/BarataS20
fatcat:2erir3v65jbp5bze2q6nwztdqu
ACPL: Anti-curriculum Pseudo-labelling for Semi-supervised Medical Image Classification
[article]
2022
arXiv
pre-print
to select informative unlabelled samples, improving training balance and allowing the model to work for both multi-label and multi-class problems, and to estimate pseudo labels by an accurate ensemble ...
Effective semi-supervised learning (SSL) in medical image analysis (MIA) must address two challenges: 1) work effectively on both multi-class (e.g., lesion classification) and multi-label (e.g., multiple-disease ...
[7] explored a pseudo labelling SSL method based on curriculum learning, but we are not aware of SSL methods that explore anti-curriculum learning. ...
arXiv:2111.12918v3
fatcat:rsozqd4jv5hw5gix7hsngjubei
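The anti-curriculum idea in ACPL inverts standard pseudo-labelling: instead of taking the most confident unlabelled predictions, it selects informative, low-confidence samples. A minimal sketch of that selection step under a simple max-probability confidence measure (ACPL's actual informativeness score is more involved); names and shapes are illustrative:

```python
import numpy as np

def anti_curriculum_select(probs, n_select):
    """Pick the *least* confident unlabelled samples for pseudo-labelling.

    probs: (n_samples, n_classes) predicted class probabilities.
    Returns the selected indices and their pseudo-labels.
    """
    probs = np.asarray(probs, dtype=float)
    confidence = probs.max(axis=1)
    idx = np.argsort(confidence)[:n_select]  # lowest confidence first
    pseudo_labels = probs[idx].argmax(axis=1)
    return idx, pseudo_labels
```

In practice the pseudo-labels for such hard samples would come from an accurate ensemble rather than the raw probabilities, as the snippet above notes.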
Improved skin lesion recognition by a Self-Supervised Curricular Deep Learning approach
[article]
2021
arXiv
pre-print
For the multi-class skin lesion classification problem, and ISIC-2019 dataset, we provide experimental evidence showing that: i) a model pretrained by a curriculum of pretext tasks outperforms models pretrained ...
by individual pretext tasks, and ii) a model pretrained by the optimal pretext task curriculum outperforms a model pretrained on ImageNet. ...
As of today, leading approaches for image-based skin lesion assessment [3] are based on Convolutional Neural Networks (CNNs), achieving up to 72.5% accuracy on the multi-class classification problem ...
arXiv:2112.12086v1
fatcat:rgdgsoxrdrcp7jbwxqltsdwh3e
Dynamic Curriculum Learning for Imbalanced Data Classification
2019
2019 IEEE/CVF International Conference on Computer Vision (ICCV)
(2) loss scheduler which controls the learning importance between classification and metric learning loss. ...
Inspired by curriculum learning, DCL consists of two-level curriculum schedulers: (1) sampling scheduler which manages the data distribution not only from imbalance to balance but also from easy to hard ...
Based on the above-introduced scheduler functions, we propose Dynamic Curriculum Learning framework for imbalanced data classification. ...
doi:10.1109/iccv.2019.00512
dblp:conf/iccv/WangGYWY19
fatcat:kzthfmteorerdkbsg4h2ogtl24
Curriculum Loss: Robust Learning and Generalization against Label Corruption
[article]
2020
arXiv
pre-print
As a result, our loss can be deemed as a novel perspective of curriculum sample selection strategy, which bridges a connection between curriculum learning and robust learning. ...
To efficiently optimize the 0-1 loss while keeping its robust properties, we propose a very simple and efficient loss, i.e. curriculum loss (CL). ...
Then, the classification margin for multi-class classification can be defined as follows
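The snippet cuts off before the definition itself. For reference, the standard multi-class classification margin (reconstructed here in common notation, not copied from the paper) is the gap between the score of the true class and the best competing class:

```latex
\gamma(x, y) \;=\; f_y(x) \;-\; \max_{k \neq y} f_k(x),
```

where $f_k(x)$ is the classifier's score for class $k$; the sample is correctly classified exactly when $\gamma(x, y) > 0$.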
Figure 5: Test accuracy and label precision vs. number of epochs on the CIFAR-10 dataset. ...
arXiv:1905.10045v3
fatcat:6mctuaazzrgqzfs52om6eu6f4q
Adaptive Ensembling: Unsupervised Domain Adaptation for Political Document Analysis
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
To bridge this gap, we present adaptive ensembling, an unsupervised domain adaptation framework, equipped with a novel text classification model and time-aware training to ensure our methods work well ...
Further analysis indicates that our methods are more stable, learn better representations, and extract cleaner corpora for fine-grained analysis. ...
Thanks as well to Greg Durrett, Katrin Erk, and the anonymous reviewers for their helpful comments. This work was partially supported by the NSF Grant IIS-1850153. ...
doi:10.18653/v1/d19-1478
dblp:conf/emnlp/DesaiSRL19
fatcat:hhqzsequhbfezecnu62frzinsu
Adaptive Ensembling: Unsupervised Domain Adaptation for Political Document Analysis
[article]
2019
arXiv
pre-print
To bridge this gap, we present adaptive ensembling, an unsupervised domain adaptation framework, equipped with a novel text classification model and time-aware training to ensure our methods work well ...
Further analysis indicates that our methods are more stable, learn better representations, and extract cleaner corpora for fine-grained analysis. ...
Thanks as well to Greg Durrett, Katrin Erk, and the anonymous reviewers for their helpful comments. This work was partially supported by the NSF Grant IIS-1850153. ...
arXiv:1910.12698v1
fatcat:2xdgjt3k6ndebiitkud7lqbshm
A Survey of Unsupervised Deep Domain Adaptation
[article]
2020
arXiv
pre-print
As a complement to this challenge, single-source unsupervised domain adaptation can handle situations where a network is trained on labeled data from a source domain and unlabeled data from a related but ...
While such approaches for supervised learning have performed well, they assume that training and testing data are drawn from the same distribution, which may not always be the case. ...
Multi-domain learning [61, 113] and multi-task learning [29] are related to transfer learning and domain adaptation. ...
arXiv:1812.02849v3
fatcat:paefg5cywbe3tjsp6dffnwkvxy
A Survey of Unsupervised Deep Domain Adaptation
2020
ACM Transactions on Intelligent Systems and Technology
As a complement to this challenge, single-source unsupervised domain adaptation can handle situations where a network is trained on labeled data from a source domain and unlabeled data from a related but ...
While such approaches for supervised learning have performed well, they assume that training and testing data are drawn from the same distribution, which may not always be the case. ...
Multi-domain learning [62, 113] and multi-task learning [29] are related to transfer learning and domain adaptation. ...
doi:10.1145/3400066
pmid:34336374
pmcid:PMC8323662
fatcat:vh52rfgjgrc37kyctkqwlykq7e
Showing results 1 — 15 out of 1,504 results