Abstract

Accurately counting cells in microscopic images is important for medical diagnosis and biological studies, but manual cell counting is tedious, time-consuming, and prone to subjective errors, while existing automatic counting methods can be less accurate than desired. To improve the accuracy of automatic cell counting, we propose a novel method that employs deeply-supervised density regression. A fully convolutional neural network (FCNN) serves as the primary network for density map regression. Innovatively, a set of auxiliary FCNNs is employed to provide additional supervision for learning the intermediate layers of the primary FCNN, improving network performance. In addition, the primary FCNN is designed as a concatenating framework that integrates multi-scale features through shortcut connections, which improves the granularity of the features extracted from the intermediate layers and further supports the final density map estimation. Experimental results on immunofluorescent images of human embryonic stem cells demonstrate the superior performance of the proposed method over other state-of-the-art methods.
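
To make the architecture described in the abstract concrete, the following is a minimal PyTorch-style sketch (not the authors' code) of deeply-supervised density regression: a small encoder-decoder FCNN with a shortcut concatenation of multi-scale features, an auxiliary 1x1-convolution regressor supervising an intermediate layer, and a combined loss. The layer widths, the single auxiliary head, bilinear upsampling, and the weight `aux_weight` are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    # 3x3 convolution + ReLU; widths here are illustrative only.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class DeeplySupervisedDensityFCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 32)     # full-resolution features
        self.enc2 = conv_block(32, 64)    # half-resolution features
        self.pool = nn.MaxPool2d(2)
        # Decoder concatenates upsampled deep features with shallow ones:
        # the "shortcut connections" that integrate multi-scale features.
        self.dec1 = conv_block(64 + 32, 32)
        self.head = nn.Conv2d(32, 1, kernel_size=1)   # final density map
        # Auxiliary regressor providing deep supervision on an
        # intermediate layer (hypothetical placement).
        self.aux1 = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, x):
        f1 = self.enc1(x)
        f2 = self.enc2(self.pool(f1))
        up = F.interpolate(f2, scale_factor=2, mode="bilinear",
                           align_corners=False)
        d = self.dec1(torch.cat([up, f1], dim=1))     # multi-scale concat
        main = self.head(d)
        aux = F.interpolate(self.aux1(f2), size=main.shape[-2:],
                            mode="bilinear", align_corners=False)
        return main, aux

def deeply_supervised_loss(main, aux, target, aux_weight=0.3):
    # Main regression loss plus a weighted auxiliary loss on the
    # intermediate prediction; aux_weight is an assumed hyperparameter.
    return F.mse_loss(main, target) + aux_weight * F.mse_loss(aux, target)
```

At inference time, the estimated cell count is simply the sum over the predicted density map, e.g. `main.sum(dim=(1, 2, 3))`; the auxiliary head is used only during training.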

Original language: English
Title of host publication: Medical Imaging 2019
Subtitle of host publication: Digital Pathology
Editors: John E. Tomaszewski, Aaron D. Ward
Publisher: SPIE
ISBN (Electronic): 9781510625594
DOIs
State: Published - 2019
Event: Medical Imaging 2019: Digital Pathology - San Diego, United States
Duration: Feb 20, 2019 – Feb 21, 2019

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 10956
ISSN (Print): 1605-7422

Conference

Conference: Medical Imaging 2019: Digital Pathology
Country/Territory: United States
City: San Diego
Period: 02/20/19 – 02/21/19

Keywords

  • Automatic cell counting
  • Concatenating network
  • Deeply-supervised learning
  • Density regression
  • Microscopic images
