Universidad Tecnológica Indoamérica
Depth-Enhanced Tumor Detection Framework for Breast Histopathology Images by Integrating Adaptive Multi-Scale Fusion, Semantic Depth Calibration, and Boundary-Guided Detection

Journal
IEEE Access
ISSN
2169-3536
Date Issued
2025
Author(s)
A. Robert Singh
Suganya Athisayamani
Hariharasitaraman S
Faten Khalid Karim
Varela Aldas, José
Centro de Investigación en Mecatrónica y Sistemas Interactivos
Samih M. Mostafa
Type
journal-article
DOI
10.1109/ACCESS.2025.3554342
URL
https://cris.indoamerica.edu.ec/handle/123456789/9416
Abstract
Multiple modalities offer the advantage of providing complementary visual information that enhances overall understanding. This capability is particularly valuable in the domain of autonomous tumor detection. However, occlusions in autonomous tumor perception—especially tumor-to-tumor occlusion—hinder the effective use of occlusion-related features, degrading object detection accuracy. We propose a novel framework for tumor detection in histopathological images that integrates multi-scale RGB features with depth-enhanced semantic information. The method comprises three primary modules: the Adaptive Multi-Scale Fusion Module (AMSF), the Semantic Depth Integration and Calibration Module (SDICM), and the Depth-Guided Tumor Detection Module (DG-TDM). AMSF combines RGB histopathology image channels, binary masks, and multi-scale convolution outputs using attention mechanisms to generate a fused feature map that captures occlusion information and boundary relevance. SDICM constructs dense depth maps by fusing RGB, semantic, and sparse depth data through bidirectional feature aggregation, enhancing spatial continuity and edge clarity. DG-TDM refines tumor boundary detection by combining depth and RGB features with spatial and channel-wise attention mechanisms, which highlight boundary differences and reduce redundancy from background noise. The proposed loss function optimizes RGB-depth feature alignment, balancing the contribution of each modality for robust tumor detection. This approach addresses challenges such as overlapping boundaries, occlusions, and poor contrast in histopathology images, enabling precise localization of tumor regions. Experimental results demonstrate that the proposed framework achieves a detection accuracy of 98% and improved boundary delineation compared to existing methods. This robust and efficient methodology provides a promising solution for advancing tumor detection in medical imaging.
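To make the DG-TDM fusion idea concrete, the following is a minimal, illustrative NumPy sketch of combining RGB and depth feature maps and refining the result with channel-wise and spatial attention gates. It is not the authors' implementation: the function names, the sigmoid gating, and the additive fusion are all assumptions chosen for simplicity; the paper's actual module is a learned network.

```python
import numpy as np

def _sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Gate each channel by its globally pooled response
    # (a squeeze-and-excitation-style mechanism, simplified: no learned weights).
    pooled = feat.mean(axis=(1, 2))            # (C,)
    gates = _sigmoid(pooled)                   # per-channel weights in (0, 1)
    return feat * gates[:, None, None]

def spatial_attention(feat):
    # feat: (C, H, W). Gate each spatial location by its channel-mean response,
    # emphasizing locations with strong aggregate activation (e.g. boundaries).
    pooled = feat.mean(axis=0)                 # (H, W)
    gates = _sigmoid(pooled)                   # per-pixel weights in (0, 1)
    return feat * gates[None, :, :]

def fuse_rgb_depth(rgb_feat, depth_feat):
    # Additively combine the two modalities, then refine the fused map with
    # channel-wise followed by spatial attention. Both inputs: (C, H, W).
    fused = rgb_feat + depth_feat
    return spatial_attention(channel_attention(fused))
```

Because every gate lies in (0, 1), the module only attenuates the fused response, suppressing low-activation background regions relative to high-activation (boundary-like) ones; in the actual framework these gates would be produced by learned convolutional layers rather than fixed pooling.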
Subjects
  • Adaptive multi-scale ...
  • Depth calibration
  • Depth integrations
  • Depth-guided tumor de...
  • Detection modules
  • Fusion modules
  • Histopathology
  • Multiscale fusion
  • Semantic depth integr...
  • Tumour detection
