“The improvement in technologies could potentially assist radiation oncologists in making timely decisions,” said Himanshu Arora, PhD.
Investigators at Sylvester Comprehensive Cancer Center and the Desai Sethi Urology Institute at the University of Miami in Florida found that generative adversarial networks (GANs) have the potential to generate high-quality synthetic images from real MRI images.1
“Technically, the technology developed here is the first start to building more sophisticated models of ‘data augmentation’ where new digital images can be used in further analysis. This is an early phase of our study, but the outcomes are extremely promising,” said Himanshu Arora, PhD, in a news release on the findings.2 Arora is an assistant professor at Sylvester and the Desai Sethi Urology Institute.
The investigators used T2-weighted prostate MRI images from the BLaStM trial (NCT02307058) and other sources to train Single Natural Image GAN (SinGAN) models, yielding a generative model. The SinGAN generative model was then trained with a deep learning semantic segmentation pipeline to segment the prostate biopsy on 2D MRI and histology slices. Scientists with varying degrees of experience (more than 10 years, 1 year, and no experience) then participated in a quality control assessment of the resulting images.
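The adversarial training idea underlying SinGAN can be illustrated with a toy example. The sketch below pits a linear generator against a logistic-regression discriminator on small 1-D vectors in plain NumPy; the study's actual SinGAN operates on multi-scale 2-D image patches, so every detail here (dimensions, data distribution, learning rate) is an illustrative assumption, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real images": 4-D vectors drawn from N(3, 1).
DIM, LATENT, BATCH, STEPS, LR = 4, 2, 64, 500, 0.05

# Generator: linear map z -> x. Discriminator: logistic regression on x.
G_W = rng.normal(0, 0.1, (LATENT, DIM))
G_b = np.zeros(DIM)
D_w = rng.normal(0, 0.1, DIM)
D_b = 0.0

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

for _ in range(STEPS):
    real = rng.normal(3.0, 1.0, (BATCH, DIM))
    z = rng.normal(0, 1, (BATCH, LATENT))
    fake = z @ G_W + G_b

    # Discriminator ascent step: maximize log D(real) + log(1 - D(fake)).
    p_real = sigmoid(real @ D_w + D_b)
    p_fake = sigmoid(fake @ D_w + D_b)
    D_w += LR * (real.T @ (1 - p_real) - fake.T @ p_fake) / BATCH
    D_b += LR * (np.mean(1 - p_real) - np.mean(p_fake))

    # Generator ascent step: maximize log D(fake) (non-saturating loss).
    p_fake = sigmoid(fake @ D_w + D_b)
    dx = (1 - p_fake)[:, None] * D_w          # gradient w.r.t. fake samples
    G_W += LR * (z.T @ dx) / BATCH
    G_b += LR * dx.mean(axis=0)

# Draw new synthetic samples from the trained generator.
samples = rng.normal(0, 1, (10, LATENT)) @ G_W + G_b
print(samples.shape)
```

After a few hundred alternating updates, the generator's output drifts toward the statistics of the "real" data, which is the same competitive dynamic, at vastly larger scale, that lets a GAN produce convincing synthetic MRI slices.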
On average, the group of scientists with more than 10 years of experience correctly identified conventional vs synthetic imaging with 67% accuracy. The group with 1 year of experience identified conventional vs synthetic imaging with 58% accuracy, and the group with no experience did so with 50% accuracy, no better than chance.
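The per-group accuracies above come from a binary real-vs-synthetic labeling task, where 50% corresponds to pure guessing. A minimal sketch of how such accuracy might be tallied (the labels and guesses below are hypothetical, not study data):

```python
def accuracy(truth, guesses):
    """Fraction of images whose origin (real or synthetic) was guessed correctly."""
    correct = sum(t == g for t, g in zip(truth, guesses))
    return correct / len(truth)

# Hypothetical 6-image assessment and one rater's guesses.
truth = ["real", "synthetic", "real", "synthetic", "real", "synthetic"]
novice = ["real", "real", "synthetic", "synthetic", "real", "real"]

print(accuracy(truth, novice))  # 0.5, i.e., chance level in a binary task
```

With only two possible answers per image, an accuracy near 50%, as in the no-experience group, indicates the synthetic images were indistinguishable from conventional ones for that rater.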
The investigators also compared outcomes on the assessment when the synthetic images were manually selected by the study team vs when the images were randomly pooled after passing the segmentation pipeline. Findings showed no significant difference in participants’ quality control performance based on correct scores (P = .725) between their first and second test. However, the number of false negatives increased between the 2 tests, with 47% of the synthetic images included in the assessment mistakenly identified as conventional images.
The team also conducted a blind assessment with a board-certified radiologist, who graded images on a scale of 1 to 10 according to whether a report could be made from the image on the basis of quality alone, with higher scores indicating better quality. The mean grade given by the radiologist was 6.2 for the synthetic images and 5.5 for the conventional images (P = .4839).
Based on these findings, the study authors suggest that machine learning models have the potential to serve as an adjunct to medical decision-making, rather than a replacement.
“Timely diagnosis and assessment of prognosis are challenges for prostate cancer, and this results in many deaths and increases [risk of disease progression]. We cannot replace the human eye when it comes to medical decision-making. Still, the improvement in technologies could potentially assist radiation oncologists in making timely decisions,” said Arora in the news release.
References
1. Xu IRL, Booven DJV, Goberdhan S, et al. Generative adversarial networks can create high quality artificial prostate cancer magnetic resonance images. J Pers Med. 2023;13(3):547. doi:10.3390/jpm13030547
2. Scientists pioneer research to harness power of machine learning in prostate cancer imaging. News release. University of Miami Health System, Miller School of Medicine. March 23, 2023. Accessed March 29, 2023. https://www.newswise.com/articles/scientists-pioneer-research-to-harness-power-of-machine-learning-in-prostate-cancer-imaging?sc=sphr&xy=10016681