Deep image reconstruction using unregistered measurements without groundtruth

Weijie Gan, Yu Sun, Cihat Eldeniz, Jiaming Liu, Hongyu An, Ulugbek S. Kamilov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

One of the key limitations in conventional deep-learning-based image reconstruction is the need for registered pairs of training images containing a set of high-quality groundtruth images. This paper addresses this limitation by proposing a novel unsupervised deep registration-augmented reconstruction method (U-Dream) for training deep neural networks to reconstruct high-quality images by directly mapping pairs of unregistered and artifact-corrupted images. The ability of U-Dream to circumvent the need for accurately registered data makes it widely applicable to many biomedical image reconstruction tasks. We validate it in accelerated magnetic resonance imaging (MRI) by training an image reconstruction model directly on pairs of undersampled measurements from images that have undergone nonrigid deformations.
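As a rough illustration of the training scheme summarized in the abstract, the sketch below gives one plausible reading of registration-augmented, groundtruth-free training for accelerated MRI in PyTorch. The network architectures, the masked-FFT forward model, the warping routine, the loss, and all names and hyperparameters are simplified stand-ins chosen for illustration; they are not the configuration reported in the paper.

```python
# Minimal sketch of registration-augmented training without groundtruth,
# assuming a PyTorch setup with a masked-FFT forward model. All module
# architectures, names, and hyperparameters below are illustrative stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F


def forward_op(image, mask):
    # Undersampled Fourier measurement: mask * FFT(image).
    return torch.fft.fft2(image) * mask


class ReconNet(nn.Module):
    # Toy reconstruction network: refines a zero-filled inverse FFT.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1))

    def forward(self, kspace, mask):
        zero_filled = torch.fft.ifft2(kspace * mask).real
        return zero_filled + self.net(zero_filled)


class RegNet(nn.Module):
    # Toy registration network: predicts a dense nonrigid displacement field.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, 3, padding=1))

    def forward(self, moving, fixed):
        return self.net(torch.cat([moving, fixed], dim=1))


def warp(image, flow):
    # Warp `image` (B, 1, H, W) by a displacement field `flow` (B, 2, H, W)
    # expressed in normalized [-1, 1] coordinates.
    b, _, h, w = image.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h, device=image.device),
                            torch.linspace(-1, 1, w, device=image.device),
                            indexing="ij")
    base = torch.stack([xs, ys], dim=-1).expand(b, h, w, 2)
    return F.grid_sample(image, base + flow.permute(0, 2, 3, 1),
                         align_corners=True)


recon_net, reg_net = ReconNet(), RegNet()
optimizer = torch.optim.Adam(
    list(recon_net.parameters()) + list(reg_net.parameters()), lr=1e-4)


def train_step(y1, mask1, y2, mask2):
    # y1, y2: undersampled k-space measurements of two nonrigidly deformed
    # views of the same anatomy; mask1, mask2: their sampling masks.
    x1 = recon_net(y1, mask1)        # reconstruct view 1
    x2 = recon_net(y2, mask2)        # reconstruct view 2
    flow = reg_net(x1, x2)           # estimate deformation from view 1 to view 2
    x1_warped = warp(x1, flow)       # align reconstruction 1 to view 2
    # Supervise in the measurement domain, so no groundtruth image is needed.
    pred = torch.view_as_real(forward_op(x1_warped, mask2))
    target = torch.view_as_real(y2 * mask2)
    loss = F.mse_loss(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The point the sketch tries to capture is that supervision comes from a second undersampled measurement of a deformed copy of the same anatomy, compared in the measurement domain after warping, so no registered groundtruth image is ever required.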

Original language: English
Title of host publication: 2021 IEEE 18th International Symposium on Biomedical Imaging, ISBI 2021
Publisher: IEEE Computer Society
Pages: 1531-1534
Number of pages: 4
ISBN (Electronic): 9781665412469
DOIs
State: Published - Apr 13 2021
Event: 18th IEEE International Symposium on Biomedical Imaging, ISBI 2021 - Nice, France
Duration: Apr 13 2021 - Apr 16 2021

Publication series

Name: Proceedings - International Symposium on Biomedical Imaging
Volume: 2021-April
ISSN (Print): 1945-7928
ISSN (Electronic): 1945-8452

Conference

Conference: 18th IEEE International Symposium on Biomedical Imaging, ISBI 2021
Country/Territory: France
City: Nice
Period: 04/13/21 - 04/16/21

Keywords

  • Deep learning
  • Deformable image registration
  • Image reconstruction
  • Magnetic resonance imaging
