BReMS-Net: Prediction-Guided Coarse-to-Fine Refinement with Boundary-Aware Multi-Scale Dilated Fusion for Robust Breast Mass Segmentation

Authors

  • Tayyba Sarfraz, Nanjing University of Information Science and Technology, China
  • Tan Ling, Nanjing University of Information Science and Technology, China
  • Ahmad Ijaz, Nanjing University of Information Science and Technology, China

DOI:

https://doi.org/10.64539/sjer.v2i3.2026.489

Keywords:

Breast mass segmentation, Mammography, Multi-stage deep learning, Hybrid dilated convolution, Computer-aided diagnosis

Abstract

Accurate segmentation of breast masses in mammograms is essential for computer-aided diagnosis (CAD) systems that support early detection and treatment decisions. Current approaches struggle to segment lesions with low lesion-to-tissue contrast and diverse textures, resulting in misclassification or poor segmentation accuracy. To overcome these challenges, this paper introduces BReMS-Net, a multi-stage segmentation network designed to improve contextual learning and boundary refinement. Its MBA-Net backbone has two major components: a Multi-scale Hybrid Dilated Convolution (MHD) module that extracts multi-scale contextual features, and a Boundary Feature Auxiliary (BFA) module that strengthens boundary representations via coarse-to-fine feature fusion. A lightweight Prediction-Guided Refinement Module (PRM) then uses initial predictions to produce attention maps, suppress background clutter, and progressively refine boundary regions. The model was evaluated in a cross-dataset setting, trained on the CBIS-DDSM dataset and tested on the INbreast dataset, where BReMS-Net achieves a Dice coefficient of 93.12% and an HD95 of 0.9826, competitive with several state-of-the-art deep learning methods. These results underline its generalization and robustness. Overall, the framework provides a robust and efficient approach to breast mass segmentation, with important implications for the performance and clinical relevance of automated breast cancer diagnosis systems.
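The multi-scale dilated-convolution idea behind the MHD module can be illustrated in miniature: parallel convolution branches share a kernel but use different dilation rates, so each branch sees a different receptive field, and their outputs are fused. The sketch below is a hedged 1-D toy (the paper's module operates on 2-D feature maps inside a deep network); the function names `dilated_conv1d` and `mhd_fuse` and the choice of averaging as the fusion rule are illustrative assumptions, not the authors' implementation.

```python
def dilated_conv1d(x, w, dilation):
    """Valid 1-D convolution of signal x with kernel w, sampling taps
    `dilation` steps apart. Receptive field = (len(w) - 1) * dilation + 1."""
    k = len(w)
    span = (k - 1) * dilation
    return [sum(w[j] * x[i + j * dilation] for j in range(k))
            for i in range(len(x) - span)]

def mhd_fuse(x, w, dilations):
    """Toy multi-scale fusion (an assumption, not the paper's MHD module):
    run parallel branches at several dilation rates, crop all outputs to the
    shortest branch, and average them position-wise."""
    branches = [dilated_conv1d(x, w, d) for d in dilations]
    n = min(len(b) for b in branches)  # align branches of unequal length
    return [sum(b[i] for b in branches) / len(branches) for i in range(n)]

if __name__ == "__main__":
    x = list(range(10))          # toy 1-D "feature map"
    w = [1, 1, 1]                # shared 3-tap kernel
    print(mhd_fuse(x, w, [1, 2, 4]))
```

Larger dilation rates widen the receptive field without adding kernel weights, which is why hybrid dilation stacks are a common way to capture context at several scales at once; the full module would additionally learn a separate kernel per branch and fuse via learned weights rather than a plain average.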

References

[1] Q. Guo et al., "Context-aware feature complementary screening network for mass segmentation in whole mammograms," Multimedia Systems, vol. 32, no. 1, p. 10, 2026. https://doi.org/10.1007/s00530-025-02072-1.

[2] D. E. M. Jaincy, P. Venkatasubbu, "An empirical study for breast cancer detection using MRI images," Biomedical Signal Processing and Control, vol. 118, p. 109640, 2026. https://doi.org/10.1016/j.bspc.2026.109640.

[3] A. Poonia, M. Meena, A. S. Yadav, S. Maheshwari, and D. Songara, "Efficient Breast Cancer Detection and Classification Model by Analyzing Mammogram Images Using ViT‐Aided MobileNet With LSTM Network Based on Adaptive Segmentation," Computational Intelligence, vol. 42, no. 1, p. e70176, 2026. https://doi.org/10.1111/coin.70176.

[4] N. Brancati and M. Frucci, "USE-MiT: Attention-based model for breast ultrasound images segmentation," Computer Methods and Programs in Biomedicine Update, p. 100226, 2026. https://doi.org/10.1016/j.cmpbup.2025.100226.

[5] L. Yadav, G. Chandra, and D. Yadav, "Breast cancer stage detection by differentiating benign and malignant tumor using L0H-CWSNN and FZB-IS," Expert Systems with Applications, vol. 307, p. 131089, 2026. https://doi.org/10.1016/j.eswa.2026.131089.

[6] D. Mathew, K. V. Grace, and M. M. S. J. Preetha, "Breast cancer detection and classification using optimisation enabled deep learning model," International Journal of Bioinformatics Research and Applications, vol. 22, no. 2, pp. 107-125, 2026. https://doi.org/10.1504/IJBRA.2026.152611.

[7] R. Meegada and H. K. Bhuyan, "Segmentation and Feature Extraction Based Breast Cancer Detection Analysis," in 2026 Sixth International Conference on Advances in Electrical, Computing, Communications and Sustainable Technologies (ICAECT), 2026, pp. 1-6: IEEE. https://doi.org/10.1109/ICAECT68478.2026.11426107.

[8] V. Sreelekshmi, K. Pavithran, and J. J. Nair, "An integrated model for early breast cancer prediction using microcalcifications and patient risk factors," Discover Artificial Intelligence, vol. 6, no. 1, p. 30, 2026. https://doi.org/10.1007/s44163-025-00775-y.

[9] A. Wahiba and R. El Mostafa, "Pixel Intensity in Mammography: A Factor of Error in Breast Cancer Detection," in EPJ Web of Conferences, 2026, vol. 350, p. 03006: EDP Sciences. https://doi.org/10.1051/epjconf/202635003006.

[10] K. Mo et al., "Deep Electrical Impedance Spectroscopic Tomography for the Characterization of Tissue Architecture and Composition," IEEE Transactions on Instrumentation and Measurement, vol. 75, 2026. https://doi.org/10.1109/TIM.2026.3667225.

[11] D. M. Jaincy and V. Pattabiraman, "An empirical study for breast cancer detection using MRI images," Biomedical Signal Processing and Control, vol. 118, p. 109640, 2026. https://doi.org/10.1016/j.bspc.2026.109640.

[12] M. Basith, P. Praveen, and P. C. S. Reddy, "Adaptive deep Q-GAN framework for enhanced breast cancer detection in medical imaging," Biomedical Signal Processing and Control, vol. 112, p. 108638, 2026. https://doi.org/10.1016/j.bspc.2025.108638.

[13] B. Khati, S. Mukherjee, A. Sinitca, D. Kaplun, and R. Sarkar, "Reciprocal cooperative gating fusion of SqueezeNet and ShuffleNetV2 for breast cancer detection in histopathology images," Scientific Reports, vol. 16, art. no. 5904, 2026. https://doi.org/10.1038/s41598-026-36375-8.

[14] T. Lehnen, D. Polenske, B. D. Wichtmann, and N. C. Lehnen, "AI software as a third reader in breast cancer screening—a prospective diagnostic observational study," European Radiology, pp. 1-11, 2026. https://doi.org/10.1007/s00330-026-12359-0.

[15] B. N. Chua, D. K. H. Thng, T. B. Toh, and D. Ho, "Artificial intelligence for breast cancer management," Communications Medicine, vol. 6, 2026. https://doi.org/10.1038/s43856-025-01342-3.

[16] E. Elías-Cabot, S. Romero-Martín, J. L. Raya-Povedano, A. Rodríguez-Ruiz, and M. Álvarez-Benito, "AI-based triage and decision support in mammography and digital tomosynthesis for breast cancer screening: a paired, noninferiority trial," Nature Medicine, vol. 32, pp. 1296–1305, 2026. https://doi.org/10.1038/s41591-026-04277-x.

[17] C. Ma, H. Zhang, and L. Guo, "DFMFI: Ultrasound Breast Cancer Detection Method Based on Dynamic Fusion Multi-Scale Feature Interaction Model," Computerized Medical Imaging and Graphics, vol. 128, p. 102710, 2026. https://doi.org/10.1016/j.compmedimag.2026.102710.

[18] M.-J. Lee et al., "Single-tube total analysis system for ratiometric detection of exosomal miRNAs in breast cancer diagnosis," Chemical Engineering Journal, vol. 529, p. 173140, 2026. https://doi.org/10.1016/j.cej.2026.173140.

[19] H. Abudukelimu et al., "DVF-YOLO-Seg: A two-stage breast mass segmentation model with enhanced feature extraction and small lesion detection," Digital Health, vol. 11, 2025. https://doi.org/10.1177/20552076251374192.

[20] Y. Wang, M. Ali, T. Mahmood, A. Rehman, and T. Saba, "Robust Bi-CBMSegNet framework for advancing breast mass segmentation in mammography with a dual module encoder-decoder approach," Scientific Reports, vol. 15, no. 1, p. 24434, 2025. https://doi.org/10.1038/s41598-025-09775-5.

[21] F. J. M. Shamrat et al., "MammoSegNet: a convolutional network analysis for segmenting tumor tissue masses in digital mammograms of breast cancer patients," Neural Computing and Applications, vol. 37, no. 32, pp. 26437-26484, 2025. https://doi.org/10.1007/s00521-025-11631-6.

[22] T. Fatma, P. K. Sahu, S. Choudhury, and A. Wunnava, "Magnification-independent breast cancer diagnosis using a GWO-enhanced vision transformer with multi-stage stain normalization," Scientific Reports, vol. 16, 2026. https://doi.org/10.1038/s41598-026-42490-3.

[23] Q. Chen, Y. Zhao, X. Luo, W. He, and G. Shi, "Hybrid Deep Learning and Classification Framework for Automatic Traffic Inspection Classification Based on Image Detection," Transactions on Emerging Telecommunications Technologies, vol. 37, no. 2, p. e70325, 2026. https://doi.org/10.1002/ett.70325.

[24] A. Jain, R. K. Rupani, K. P. Arunachalam, and D. Veeraswamy, "Multi-scale feature fusion for breast cancer detection using circular dilated convolutional transformer optimized by enhanced wombat algorithm," Computers and Electrical Engineering, vol. 134, p. 111087, 2026. https://doi.org/10.1016/j.compeleceng.2026.111087.

[25] S. Mohammadi and M. A. Livani, "A two-stage self-supervised learning framework for breast cancer detection with multi-scale vision transformers," Information Sciences, vol. 735, p. 123061, 2026. https://doi.org/10.1016/j.ins.2025.123061.

[26] G. Shruthi and P. Ravikumar, "A Hybrid CNN-Transformer Model for Tumor-Infiltrating Lymphocyte Score Prediction in Breast Cancer Histopathological Image," Engineering, Technology & Applied Science Research, vol. 16, no. 2, pp. 32893-32898, 2026. https://doi.org/10.48084/etasr.15757.

[27] P. J. Ho et al., "Breast Cancer Screening Knowledge and Sentiments in Singaporean Women: Mixed Methods Study Using Topic Modeling, Sentiment Analysis, and Structured Questionnaire Data," Journal of Medical Internet Research, vol. 28, p. e78439, 2026. https://doi.org/10.2196/78439.

[28] R. Varsha and S. Veni, "Hybrid Deep Learning: Parallel CNN and Swin Transformer Fusion Network for Breast Cancer Diagnosis," in 2026 International Conference on Electric Power and Renewable Energy (EPREC), 2026. https://doi.org/10.1109/EPREC66546.2026.11412005.

[29] H. Acikgoz, A. Aytekin, and S. Gezici, "BreasTransNeXt: An Enhanced Multi-Module Vision Transformer For Early Breast Cancer Diagnosis," Journal of Imaging Informatics in Medicine, pp. 1-20, 2026. https://doi.org/10.1007/s10278-026-01863-w.

[30] R. M. Al-Tam, A. M. Al-Hejri, F. A. Hashim, S. M. Narangale, M. A. Al-Antari, and S. A. Alzakari, "An Interpretable Ensemble Transformer Framework for Breast Cancer Detection in Ultrasound Images," Diagnostics, vol. 16, no. 4, p. 622, 2026. https://doi.org/10.3390/diagnostics16040622.

[31] L. G. Falconi, M. Perez, W. G. Aguilar, and A. Conci, "Transfer learning and fine tuning in breast mammogram abnormalities classification on CBIS-DDSM database," Adv. Sci. Technol. Eng. Syst. J., vol. 5, no. 2, pp. 154-165, 2020. https://doi.org/10.25046/aj050220.

[32] I. C. Moreira, I. Amaral, I. Domingues, A. Cardoso, M. J. Cardoso, and J. S. Cardoso, "Inbreast: toward a full-field digital mammographic database," Academic radiology, vol. 19, no. 2, pp. 236-248, 2012. https://doi.org/10.1016/j.acra.2011.09.014.

Published

2026-05-06

How to Cite

Sarfraz, T., Ling, T., & Ijaz, A. (2026). BReMS-Net: Prediction-Guided Coarse-to-Fine Refinement with Boundary-Aware Multi-Scale Dilated Fusion for Robust Breast Mass Segmentation. Scientific Journal of Engineering Research, 2(3), 327–338. https://doi.org/10.64539/sjer.v2i3.2026.489
