TY - JOUR
T1 - Automated lesion detection of breast cancer in [18F] FDG PET/CT using a novel AI-Based workflow
AU - Leal, Jeffrey P.
AU - Rowe, Steven P.
AU - Stearns, Vered
AU - Connolly, Roisin M.
AU - Vaklavas, Christos
AU - Liu, Minetta C.
AU - Storniolo, Anna Maria
AU - Wahl, Richard L.
AU - Pomper, Martin G.
AU - Solnes, Lilja B.
N1 - Funding Information:
The authors wish to acknowledge the following for their support of this work: National Institutes of Health/National Cancer Institute P30CA006973, the National Institutes of Health/National Cancer Institute U01-CA140204, the Translational Breast Cancer Research Consortium, the funding support to the TBCRC from the AVON Foundation, The Breast Cancer Research Foundation, and Susan G. Komen, and the Nvidia Corporation GPU Grant Program.
Publisher Copyright:
Copyright © 2022 Leal, Rowe, Stearns, Connolly, Vaklavas, Liu, Storniolo, Wahl, Pomper and Solnes.
PY - 2022/11/15
Y1 - 2022/11/15
N2 - Applications based on artificial intelligence (AI) and deep learning (DL) are rapidly being developed to assist in the detection and characterization of lesions on medical images. In this study, we developed and examined an image-processing workflow that combines traditional image processing with AI technology and utilizes a standards-based approach for disease identification and quantitation to segment and classify tissue within a whole-body [18F]FDG PET/CT study. Methods: One hundred thirty baseline PET/CT studies from two multi-institutional preoperative clinical trials in early-stage breast cancer were semi-automatically segmented using techniques based on PERCIST v1.0 thresholds, and the individual segmentations were classified as to tissue type by an experienced nuclear medicine physician. These classifications were then used to train a convolutional neural network (CNN) to automatically accomplish the same tasks. Results: Our CNN-based workflow demonstrated a sensitivity at detecting disease (either primary lesion or lymphadenopathy) of 0.96 (95% CI [0.90, 1.00], 99% CI [0.87, 1.00]), a specificity of 1.00 (95% CI [1.00, 1.00], 99% CI [1.00, 1.00]), a Dice score of 0.94 (95% CI [0.89, 0.99], 99% CI [0.86, 1.00]), and a Jaccard score of 0.89 (95% CI [0.80, 0.98], 99% CI [0.74, 1.00]). Conclusion: This pilot work has demonstrated the ability of an AI-based workflow using DL-CNNs to specifically identify breast cancer tissue as determined by [18F]FDG avidity in a PET/CT study. The high sensitivity and specificity of the network support the idea that AI can be trained to recognize specific tissue signatures, both normal and diseased, in molecular imaging studies using radiopharmaceuticals. Future work will explore the applicability of these techniques to other disease types and alternative radiotracers, as well as the accuracy of fully automated and quantitative detection and response assessment.
KW - PERCIST v1.0
KW - artificial intelligence
KW - breast cancer
KW - deep learning
KW - machine learning
UR - http://www.scopus.com/inward/record.url?scp=85143122003&partnerID=8YFLogxK
U2 - 10.3389/fonc.2022.1007874
DO - 10.3389/fonc.2022.1007874
M3 - Article
C2 - 36457510
AN - SCOPUS:85143122003
SN - 2234-943X
VL - 12
JO - Frontiers in Oncology
JF - Frontiers in Oncology
M1 - 1007874
ER -