Hybrid CNN-ViT for MRI Image Brain Tumor Classification with Enhanced Explainability



Publisher

ASTU

Abstract

A brain tumor is an abnormal growth of tissue in the brain that can interfere with normal brain function. Brain tumor classification remains a very challenging task due to its complex nature, and the disease has significant worldwide human and socio-economic consequences, with expensive treatment and diagnosis strategies. Numerous research studies have applied state-of-the-art deep learning models such as CNNs and ViTs to address these issues. Most existing approaches rely solely on either CNNs or Vision Transformers, each with limitations in capturing both local and global features. Despite their strengths, CNNs struggle with long-range dependencies and global context modeling, while ViTs address this but suffer from weak local inductive bias and data inefficiency. The black-box nature of both models is a further issue. Taking this into account, we propose a hybrid CNN-ViT to enhance the accuracy and explainability of brain tumor classification from MRI images, focusing on the glioma, meningioma, no-tumor, and pituitary classes. ResNet50 was employed for local spatial feature extraction, combined with the self-attention mechanism of ViT-B/16 for long-range dependencies and global context modeling of spatial features. An enhanced version of Local Interpretable Model-agnostic Explanations (LIME) with the Discrete Wavelet Transform (DWT) was employed to provide insight into the model's decision-making process. The model was trained on 18,800 images, validated on 2,350 images, and evaluated on 2,350 test images (80%, 10%, and 10%, respectively). Our experimental results demonstrate that the hybrid CNN-ViT achieves 99.62% precision, 99.62% F1-score, 99.65% recall, and 99.62% test accuracy, outperforming standalone CNN, ResNet50, ViT-B/16, ViT-B/32, and baseline studies by utilizing both local and global features.
The enhanced LIME further improves explainability by highlighting the specific tumor patterns that contribute most, and moderately, to the model's decisions, making the model a promising tool for reliable and explainable brain tumor diagnosis in medical imaging.
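LIME's core mechanism — perturb interpretable regions of the input, query the black-box model, and fit a linear surrogate whose coefficients rank each region's influence — can be sketched minimally as below. This is an illustration of plain LIME only: the grid "superpixels", sample count, and toy black-box predictor are assumptions, and the thesis's DWT enhancement step is omitted.

```python
import numpy as np

def lime_explain(image, predict, n_seg=4, n_samples=200, seed=0):
    """Minimal LIME-style sketch: randomly switch grid regions on/off,
    record the black-box prediction for each perturbation, and fit a
    least-squares linear surrogate. Coefficients = region importance."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    sh, sw = h // n_seg, w // n_seg
    n_regions = n_seg * n_seg
    masks = rng.integers(0, 2, size=(n_samples, n_regions))  # binary perturbations
    preds = np.empty(n_samples)
    for i, m in enumerate(masks):
        grid = m.reshape(n_seg, n_seg)
        pix = np.kron(grid, np.ones((sh, sw)))   # upsample on/off grid to pixels
        preds[i] = predict(image * pix)          # query the black box
    X = np.hstack([masks, np.ones((n_samples, 1))])          # features + intercept
    coef, *_ = np.linalg.lstsq(X, preds, rcond=None)         # linear surrogate
    return coef[:-1].reshape(n_seg, n_seg)       # per-region importance map

# toy black box that responds only to the top-left quadrant's mean intensity
img = np.zeros((8, 8)); img[:4, :4] = 1.0
heat = lime_explain(img, predict=lambda x: x[:4, :4].mean())
# regions overlapping the active quadrant receive the largest coefficients
```

In the actual pipeline the `predict` callable would be the trained hybrid CNN-ViT's class probability, and the regions would be image superpixels rather than a fixed grid.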
