2024 Medical Technologies Congress, TIPTEKNO 2024, Muğla, Türkiye, 10-12 October 2024
This paper introduces a deep learning framework for breast cancer staging, and for the potential discovery of new stages, that integrates GAN-generated synthetic images with multi-omics data. StyleGAN3 is used to generate realistic histopathological images, and a Swin Transformer classifier combines visual and biological data to improve the accuracy of staging predictions. The synthetic images achieve a Fréchet Inception Distance (FID) of 35, indicating a reasonable degree of similarity to real images. Together with RNA, miRNA, and clinical data, these images are fed into the Swin Transformer-based classifier, which attains an accuracy of 95.03%, a precision of 95.00%, and an F1 score of 95.00%. At the inference stage, a threshold-based softmax probability analysis is employed to explore the potential discovery of new cancer stages: if the model's maximum class confidence falls below a preliminary, observation-based threshold of 30% (which may be optimized through further experimentation), the image is flagged as a candidate for a previously unidentified stage. This study underscores the potential of multimodal data integration for enhancing breast cancer staging and offers insights into leveraging deep learning models to generate and classify histopathological data and to identify novel disease stages.
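The threshold-based inference step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and example logits are hypothetical, and only the 30% confidence threshold comes from the paper.

```python
import numpy as np

# Preliminary, observation-based confidence threshold from the paper (30%).
THRESHOLD = 0.30

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def flag_novel_stage_candidates(logits, threshold=THRESHOLD):
    """Return (predicted_class, is_candidate) per sample.

    A sample is flagged as a candidate for a previously unidentified
    stage when its maximum softmax probability falls below `threshold`.
    """
    probs = softmax(np.asarray(logits, dtype=float))
    preds = probs.argmax(axis=-1)
    flags = probs.max(axis=-1) < threshold
    return preds, flags

# Illustrative classifier logits for four known stages (hypothetical values).
logits = np.array([
    [4.0, 0.5, 0.2, 0.1],    # confident prediction -> assigned to a known stage
    [0.30, 0.25, 0.20, 0.27] # near-uniform output -> low-confidence candidate
])
preds, flags = flag_novel_stage_candidates(logits)
```

In this sketch the first sample is assigned to a known stage with high confidence, while the second, whose softmax distribution is close to uniform, falls below the 30% threshold and is flagged for review as a possible new stage.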