Image Reconstruction for MRI using Deep CNN Priors Trained without Groundtruth

Weijie Gan, Cihat Eldeniz, Jiaming Liu, Sihao Chen, Hongyu An, Ulugbek S. Kamilov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We propose a new plug-and-play priors (PnP) based MR image reconstruction method that systematically enforces data consistency while also exploiting deep-learning priors. Our prior is specified through a convolutional neural network (CNN) trained without any artifact-free ground truth to remove under-sampling artifacts from MR images. The results on reconstructing free-breathing MRI data into ten respiratory phases show that the method can form high-quality 4D images from severely undersampled measurements corresponding to acquisitions of about 1 and 2 minutes in length. The results also highlight the competitive performance of the method compared to several popular alternatives, including TGV regularization and a traditional UNet3D.
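For orientation only, below is a minimal, hypothetical sketch of a generic PnP-style iteration of the kind the abstract describes: a gradient step on the measurement-consistency term alternated with a learned artifact-removal step. The names (forward_op, adjoint_op, cnn_prior, box_smooth) are illustrative placeholders, the toy "prior" is a box filter standing in for a trained CNN, and this is not the authors' implementation.

import numpy as np

def pnp_reconstruct(y, forward_op, adjoint_op, cnn_prior, step=1.0, n_iters=50):
    """Generic PnP-style loop: data-consistency gradient step, then a learned prior.

    y          : undersampled measurements (e.g., k-space data)
    forward_op : callable mapping image -> measurements (placeholder for A)
    adjoint_op : callable mapping measurements -> image domain (placeholder for A^H)
    cnn_prior  : callable that removes under-sampling artifacts (placeholder for a CNN)
    """
    x = adjoint_op(y)  # zero-filled-style initialization
    for _ in range(n_iters):
        # Data-consistency step: x <- x - step * A^H (A x - y)
        grad = adjoint_op(forward_op(x) - y)
        # Prior step: apply the artifact remover in place of a proximal operator
        x = cnn_prior(x - step * grad)
    return x

def box_smooth(z):
    # Crude 3x3 box filter used here only as a stand-in for a trained CNN prior.
    pad = np.pad(z, 1, mode="edge")
    h, w = z.shape
    return sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

if __name__ == "__main__":
    # Toy usage with an identity "acquisition" model, purely to show the call pattern.
    rng = np.random.default_rng(0)
    clean = box_smooth(rng.standard_normal((64, 64)))
    y = clean + 0.05 * rng.standard_normal((64, 64))
    A = lambda x: x      # stand-in forward operator
    At = lambda v: v     # stand-in adjoint operator
    x_hat = pnp_reconstruct(y, A, At, box_smooth, step=0.5, n_iters=20)
    print(float(np.mean((x_hat - clean) ** 2)))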

Original language: English
Title of host publication: Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 475-479
Number of pages: 5
ISBN (Electronic): 9780738131269
State: Published - Nov 1 2020
Event: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020 - Pacific Grove, United States
Duration: Nov 1 2020 - Nov 5 2020

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
Volume: 2020-November
ISSN (Print): 1058-6393

Conference

Conference: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Country/Territory: United States
City: Pacific Grove
Period: 11/1/20 - 11/5/20

Keywords

  • Image reconstruction
  • deep learning
  • magnetic resonance imaging
  • plug-and-play priors
