Human Connectome Project informatics: Quality control, database services, and data visualization

Daniel S. Marcus, Michael P. Harms, Abraham Z. Snyder, Mark Jenkinson, J. Anthony Wilson, Matthew F. Glasser, Deanna M. Barch, Kevin A. Archie, Gregory C. Burgess, Mohana Ramaratnam, Michael Hodge, William Horton, Rick Herrick, Timothy Olsen, Michael McKay, Matthew House, Michael Hileman, Erin Reid, John Harwell, Timothy Coalson, Jon Schindler, Jennifer S. Elam, Sandra W. Curtiss, David C. Van Essen

Research output: Contribution to journal › Article › peer-review

313 Scopus citations

Abstract

The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study.
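The database services described in the abstract are built as customizations of XNAT, which exposes a REST API for programmatic access to imaging data. As a rough illustration only (not drawn from the paper), the sketch below lists the subjects in a project over that generic REST interface using Python's requests library; the server URL, project ID, and credentials are placeholders, not actual HCP endpoints.

```python
# Minimal sketch of querying an XNAT server's REST API.
# Server, project, and credentials below are hypothetical placeholders.
import requests

XNAT_SERVER = "https://xnat.example.org"   # placeholder XNAT host
PROJECT_ID = "EXAMPLE_PROJECT"             # placeholder project ID

session = requests.Session()
session.auth = ("username", "password")    # placeholder credentials

# XNAT exposes project contents under /data/projects/{id}/...
resp = session.get(
    f"{XNAT_SERVER}/data/projects/{PROJECT_ID}/subjects",
    params={"format": "json"},
)
resp.raise_for_status()

# XNAT wraps tabular JSON results in a ResultSet -> Result structure.
for row in resp.json()["ResultSet"]["Result"]:
    print(row.get("label"))
```

Tools such as the pyxnat client library wrap this same REST interface at a higher level; the raw-HTTP form is shown here only to make the request structure explicit.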

Original language: English
Pages (from-to): 202-219
Number of pages: 18
Journal: NeuroImage
Volume: 80
DOIs
State: Published - Oct 15 2013

