5 Hits in 3.7 sec

DTFD-MIL: Double-Tier Feature Distillation Multiple Instance Learning for Histopathology Whole Slide Image Classification [article]

Hongrun Zhang, Yanda Meng, Yitian Zhao, Yihong Qiao, Xiaoyun Yang, Sarah E. Coupland, Yalin Zheng
2022 arXiv   pre-print
Multiple instance learning (MIL) has been increasingly used in the classification of histopathology whole slide images (WSIs).  ...  The proposed framework is ready to be extended for wider MIL applications. The code is available at: https://github.com/hrzhang1123/DTFD-MIL  ...  We then propose DTFD-MIL, which utilizes the idea of pseudo bags and double-tier MIL. The derivation of instance probability underpins the feature distillation in DTFD-MIL.  ... 
arXiv:2203.12081v1 fatcat:filav6rrdfgrlpcbyh6bl6siim
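The pseudo-bag and double-tier idea in the snippet above can be sketched as follows: split a slide's patch features into pseudo-bags, attention-pool each pseudo-bag (tier 1), then attention-pool the resulting pseudo-bag features into a slide-level feature (tier 2). A minimal NumPy sketch, not the authors' code; the attention parameterization and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_pool(feats, w):
    # Scores -> softmax -> weighted sum of instance features.
    scores = feats @ w                      # (n,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ feats                  # (d,)

def dtfd_forward(bag, w1, w2, num_pseudo_bags=4):
    """Split a WSI bag into pseudo-bags, pool each (tier 1),
    then pool the pseudo-bag features (tier 2)."""
    idx = rng.permutation(len(bag))
    splits = np.array_split(idx, num_pseudo_bags)
    tier1 = np.stack([attention_pool(bag[s], w1) for s in splits])
    return attention_pool(tier1, w2)        # slide-level feature

d = 8
bag = rng.standard_normal((100, d))         # 100 patch features
w1, w2 = rng.standard_normal(d), rng.standard_normal(d)
slide_feat = dtfd_forward(bag, w1, w2)
print(slide_feat.shape)                     # (8,)
```

Randomly splitting one slide into several pseudo-bags also multiplies the number of training bags, which is the motivation the later glioma entry attributes to DTFD for small cohorts.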

Multiple Instance Learning for Glioma Diagnosis using Hematoxylin and Eosin Whole Slide Images: An Indian Cohort Study [article]

Ekansh Chauhan, Amit Sharma, Megha S Uppin, C.V. Jawahar, P.K. Vinod
2024 arXiv   pre-print
Using a ResNet-50, pretrained on histopathology datasets for feature extraction, combined with the Double-Tier Feature Distillation (DTFD) feature aggregator, our approach achieves state-of-the-art AUCs  ...  This study advances patient care with findings from rigorous multiple instance learning experimentations across various feature extractors and aggregators in brain tumor histopathology.  ...  Double-Tier Feature Distillation (DTFD) MIL, proposed by Zhang et al. [28], introduces the concept of "pseudo-bags" to address issues associated with small cohorts.  ... 
arXiv:2402.15832v2 fatcat:3rwax4fqfzacndm2yzpdmioucu

Dynamic Graph Representation with Knowledge-aware Attention for Histopathology Whole Slide Image Analysis [article]

Jiawen Li, Yuxuan Chen, Hongbo Chu, Qiehe Sun, Tian Guan, Anjia Han, Yonghong He
2024 arXiv   pre-print
Histopathological whole slide image (WSI) classification has become a foundational task in medical microscopic image processing.  ...  Finally, we obtain a graph-level embedding through the global pooling process of the updated head, serving as an implicit representation for WSI classification.  ...  With the necessity to analyze whole slide images (WSIs) in tissue pathology, graph representations that use patches as entities have also been widely discussed. For instance, Chen et al.  ... 
arXiv:2403.07719v1 fatcat:wfhmss6y7vb4vpbhpgvc7v7ztm
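The graph-level embedding described in this snippet, patches as nodes, message passing, then global pooling, can be illustrated with one neighbour-averaging step followed by mean pooling. This is a generic sketch, not the paper's knowledge-aware attention architecture:

```python
import numpy as np

def graph_embed(node_feats, adj):
    """One neighbour-averaging (message-passing) step followed by global
    mean pooling, yielding a graph-level embedding for the slide."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    h = (adj @ node_feats) / deg      # average neighbour features per node
    return h.mean(axis=0)             # global pooling -> (d,) embedding

# 3 patch nodes in a tiny chain graph, 4-d features
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.arange(12, dtype=float).reshape(3, 4)
emb = graph_embed(feats, adj)
print(emb.shape)   # (4,)
```

Replacing the uniform neighbour average with learned attention weights over edges is where a knowledge-aware attention mechanism would slot in.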

Multiple Instance Learning Framework with Masked Hard Instance Mining for Whole Slide Image Classification [article]

Wenhao Tang and Sheng Huang and Xiaoxian Zhang and Fengtao Zhou and Yi Zhang and Bo Liu
2023 arXiv   pre-print
The whole slide image (WSI) classification is often formulated as a multiple instance learning (MIL) problem.  ...  With several instance masking strategies based on attention scores, MHIM-MIL employs a momentum teacher to implicitly mine hard instances for training the student model, which can be any attention-based  ...  DTFD-MIL: Double-tier feature distillation multiple instance learning for histopathology whole slide image classification.  ... 
arXiv:2307.15254v3 fatcat:55zalkfg6rckvl3rsaeq2wsj4y
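The core of the masking strategy sketched in this snippet is to hide the instances the teacher's attention already finds most salient, so the student must learn from harder ones. A minimal sketch of one such attention-score-based mask; the function name and ratio are illustrative, not MHIM-MIL's implementation:

```python
import numpy as np

def mask_high_attention(att_scores, mask_ratio=0.1):
    """Drop the top-attention (most salient, i.e. easiest) instances,
    keeping the rest for training the student model."""
    n = len(att_scores)
    k = max(1, int(n * mask_ratio))
    keep = np.ones(n, dtype=bool)
    keep[np.argsort(att_scores)[-k:]] = False   # mask the k highest scores
    return keep

scores = np.array([0.05, 0.9, 0.1, 0.8, 0.02])
keep = mask_high_attention(scores, mask_ratio=0.4)
print(keep)   # the two highest-scoring instances are masked
```

In the full method the scores would come from a momentum (EMA) teacher rather than the student itself, which stabilizes which instances count as "easy" across training steps.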

Cross-scale Multi-instance Learning for Pathological Image Diagnosis [article]

Ruining Deng, Can Cui, Lucas W. Remedios, Shunxing Bao, R. Michael Womick, Sophie Chiron, Jia Li, Joseph T. Roland, Ken S. Lau, Qi Liu, Keith T. Wilson, Yaohong Wang (+3 others)
2024 arXiv   pre-print
Analyzing high resolution whole slide images (WSIs) with regard to information across multiple scales poses a significant challenge in digital pathology.  ...  Multi-instance learning (MIL) is a common solution for working with high resolution images by classifying bags of objects (i.e., sets of smaller image patches).  ...  MIL bag (Hashimoto et al., 2020a); (7) a feature concatenation (DS-MIL) at different scales (Li et al., 2021); (8) a Double-Tier Feature Distillation when aggregating features from multiple scales  ... 
arXiv:2304.00216v3 fatcat:gemb3job4rhf3jcym4mneebwdy
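Option (7) in the snippet's list, feature concatenation across scales, is the simplest cross-scale baseline: pool the patch features at each magnification separately, then concatenate the pooled vectors. A hedged sketch with mean pooling standing in for whatever learned pooling a given method uses:

```python
import numpy as np

def cross_scale_concat(feats_hi, feats_lo):
    """Mean-pool patch features at two magnifications and concatenate
    them into one bag-level feature (pooling choice is illustrative)."""
    return np.concatenate([feats_hi.mean(axis=0), feats_lo.mean(axis=0)])

hi = np.ones((50, 16))    # e.g. 20x patch features
lo = np.zeros((10, 16))   # e.g. 5x patch features
fused = cross_scale_concat(hi, lo)
print(fused.shape)        # (32,)
```

Option (8) would instead apply the double-tier distillation scheme from the first entry, treating each scale's pooled features as the inputs to the second tier.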