Diagnosing colorectal abnormalities using scattering coefficient maps acquired from optical coherence tomography

Yifeng Zeng, William C. Chapman, Yixiao Lin, Shuying Li, Matthew Mutch, Quing Zhu

Research output: Contribution to journal › Article › peer-review


Abstract

Optical coherence tomography (OCT) has shown potential in differentiating normal colonic mucosa from neoplasia. In this study of 33 fresh human colon specimens, we report the first use of texture features and computer vision-based imaging features derived from en face scattering coefficient maps to characterize colorectal tissue. En face scattering coefficient maps were generated automatically using a new fast integral imaging algorithm. From these maps, a gray-level co-occurrence matrix algorithm was used to extract texture features, and a scale-invariant feature transform algorithm was used to derive novel computer vision-based features. In total, 25 features were obtained, and the importance of each feature in diagnosis was evaluated using a random forest model. Two classifiers were assessed on two different classification tasks. A support vector machine model was optimal for distinguishing normal from abnormal tissue, with 94.7% sensitivity and 94.0% specificity, while a random forest model was optimal for further differentiating abnormal tissues (i.e., cancerous tissue and adenomatous polyp), with 86.9% sensitivity and 85.0% specificity. These results demonstrate the potential of using OCT to aid the diagnosis of human colorectal disease.
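
As a concrete illustration of the texture-feature step, the sketch below extracts gray-level co-occurrence matrix (GLCM) features from a 2D scattering coefficient map using scikit-image. The quantization level, pixel offsets, and chosen Haralick properties are illustrative assumptions; the paper's exact GLCM configuration is not reproduced here.

```python
# Minimal sketch of GLCM texture-feature extraction from an en face
# scattering coefficient map, assuming scikit-image is available. The
# distances, angles, and quantization level are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(scattering_map, levels=32):
    """Return a vector of GLCM texture features for one 2D map."""
    # Quantize the floating-point scattering coefficients to `levels` gray levels.
    lo, hi = scattering_map.min(), scattering_map.max()
    quantized = np.floor(
        (scattering_map - lo) / (hi - lo + 1e-12) * (levels - 1)
    ).astype(np.uint8)

    # Co-occurrence matrix over four directions at unit pixel offset.
    glcm = graycomatrix(quantized, distances=[1],
                        angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                        levels=levels, symmetric=True, normed=True)

    # Average each Haralick-style property over the four directions.
    props = ['contrast', 'dissimilarity', 'homogeneity', 'energy', 'correlation']
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Example: a synthetic 256x256 map standing in for a real OCT-derived one.
features = glcm_features(np.random.rand(256, 256))
print(features.shape)  # (5,)
```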
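
The classification stage can likewise be sketched with OpenCV and scikit-learn: SIFT keypoint statistics stand in for the paper's computer vision-based features, a random forest ranks feature importance, and an SVM handles the normal-versus-abnormal task. The hyperparameters, the synthetic feature matrix, and the keypoint summary statistics are all assumptions for illustration, not the study's 25-feature pipeline.

```python
# Minimal sketch of SIFT-derived features plus the two classifiers,
# assuming OpenCV and scikit-learn. All data here are synthetic.
import numpy as np
import cv2
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def sift_summary(scattering_map):
    """Summarize SIFT keypoints detected on an 8-bit rendering of the map."""
    img8 = cv2.normalize(scattering_map, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    keypoints, _ = cv2.SIFT_create().detectAndCompute(img8, None)
    sizes = [kp.size for kp in keypoints] or [0.0]
    # Keypoint count and scale statistics as simple computer-vision features.
    return np.array([len(keypoints), np.mean(sizes), np.std(sizes)])

rng = np.random.default_rng(0)
print("SIFT summary:", sift_summary(rng.normal(size=(256, 256))))

# Hypothetical feature matrix X (n_samples x n_features) and labels y
# (0 = normal, 1 = abnormal), standing in for the study's data.
X = rng.normal(size=(120, 8))
y = rng.integers(0, 2, size=120)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Random forest to rank feature importance, as in the paper's analysis.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("feature importances:", rf.feature_importances_)

# SVM for the normal-vs-abnormal task.
svm = SVC(kernel='rbf').fit(X_tr, y_tr)
print("held-out accuracy:", svm.score(X_te, y_te))
```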

Original language: English
Article number: e202000276
Journal: Journal of Biophotonics
Volume: 14
Issue number: 1
DOI
State: Published - Jan 2021

Keywords

  • colorectal cancer
  • feature engineering
  • machine learning
  • optical coherence tomography
  • scattering coefficient map
