
Mixed-supervised segmentation: Confidence maximization helps knowledge distillation

by Bingyuan Liu, Christian Desrosiers, Ismail Ben Ayed, Jose Dolz

Released as an article.

2022  

Abstract

Despite achieving promising results in a breadth of medical image segmentation tasks, deep neural networks require large training datasets with pixel-wise annotations. Obtaining these curated datasets is a cumbersome process which limits the applicability in scenarios where annotated data is scarce. Mixed supervision is an appealing alternative for mitigating this obstacle. In this work, we propose a dual-branch architecture, where the upper branch (teacher) receives strong annotations, while the bottom one (student) is driven by limited supervision and guided by the upper branch. Combined with a standard cross-entropy loss over the labeled pixels, our novel formulation integrates two important terms: (i) a Shannon entropy loss defined over the less-supervised images, which encourages confident student predictions in the bottom branch; and (ii) a KL divergence term, which transfers the knowledge (i.e., predictions) of the strongly supervised branch to the less-supervised branch and guides the entropy (student-confidence) term to avoid trivial solutions. We show that the synergy between the entropy and KL divergence yields substantial improvements in performance. We also discuss an interesting link between Shannon-entropy minimization and standard pseudo-mask generation, and argue that the former should be preferred over the latter for leveraging information from unlabeled pixels. We evaluate the effectiveness of the proposed formulation through a series of quantitative and qualitative experiments using two publicly available datasets. Results demonstrate that our method significantly outperforms other strategies for semantic segmentation within a mixed-supervision framework, as well as recent semi-supervised approaches. Our code is publicly available: https://github.com/by-liu/ConfKD
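
To make the objective concrete, below is a minimal PyTorch sketch of the three-term loss the abstract describes: cross-entropy on labeled pixels, Shannon entropy minimization on student predictions, and KL-divergence distillation from the teacher branch. This is an illustration, not the authors' implementation (that lives at https://github.com/by-liu/ConfKD); the loss weights lambda_ent and lambda_kl, the distillation temperature, and the ignore_index=-1 convention for unlabeled pixels are all assumptions made for this example.

import torch
import torch.nn.functional as F

def mixed_supervision_loss(student_logits, teacher_logits, labels,
                           lambda_ent=0.1, lambda_kl=1.0, temperature=2.0):
    """Sketch of the composite objective described in the abstract.

    student_logits: (B, C, H, W) logits from the weakly supervised (student) branch.
    teacher_logits: (B, C, H, W) logits from the strongly supervised (teacher) branch.
    labels: (B, H, W) ground-truth class indices; unlabeled pixels set to -1.
    """
    # (i) Standard cross-entropy over the labeled pixels only
    # (unlabeled pixels are skipped via ignore_index).
    ce = F.cross_entropy(student_logits, labels, ignore_index=-1)

    # (ii) Shannon entropy of the student predictions, minimized to
    # encourage confident student predictions.
    probs = F.softmax(student_logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()

    # (iii) Pixel-wise KL(teacher || student), transferring the teacher's
    # predictions to the student and steering the entropy term away from
    # trivial solutions. The teacher is detached so gradients flow only
    # through the student. The temperature is a common knowledge-distillation
    # detail assumed here, not stated in the abstract.
    t_probs = F.softmax(teacher_logits.detach() / temperature, dim=1)
    s_logp = F.log_softmax(student_logits / temperature, dim=1)
    kl = (t_probs * (t_probs.clamp_min(1e-8).log() - s_logp)).sum(dim=1).mean()
    kl = kl * temperature ** 2

    return ce + lambda_ent * entropy + lambda_kl * kl

# Toy usage with random tensors (B=2, C=4 classes, 8x8 images):
student = torch.randn(2, 4, 8, 8)
teacher = torch.randn(2, 4, 8, 8)
labels = torch.randint(-1, 4, (2, 8, 8))  # -1 marks unlabeled pixels
loss = mixed_supervision_loss(student, teacher, labels)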

Archived Files and Locations

application/pdf  2.9 MB
file_5wczul53c5hprczytiovxgwiwa
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   accepted
Date   2022-11-23
Version   v4
Language   en
arXiv  2109.10902v4
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: b70b755f-bb4a-4a3e-b3d3-9b3bdd9a6f85