Objective. Hospitals in the National Healthcare Safety Network began reporting laboratory-identified (LabID) Clostridium difficile infection (CDI) events in January 2013. Our study quantified the differences between the LabID and traditional surveillance methods.

design. Cohort study.

setting. A cohort of 29 community hospitals in the southeastern United States.

methods. Six months (January 1, 2013, to June 30, 2013) of prospectively collected data using both LabID and traditional surveillance definitions were analyzed. CDI events with mismatched surveillance categories between the LabID and traditional definitions were identified and characterized further. Hospital-onset CDI (HO-CDI) rates for the entire cohort of hospitals were calculated using each method; hospital-specific HO-CDI rates and standardized infection ratios (SIRs) were then calculated. Hospital rankings based on each CDI surveillance measure were compared.

results. A total of 1,252 incident LabID CDI events were identified during 708,551 patient-days; 286 (23%) mismatched CDI events were detected. The overall HO-CDI rate was 6.0 vs 4.4 per 10,000 patient-days for LabID and traditional surveillance, respectively (P < .001); 25 of 29 hospitals (86%) detected a higher CDI rate using LabID than using the traditional method. Hospital rank within the cohort differed greatly between surveillance measures: a rank change of at least 5 places occurred in 9 of 28 hospitals (32%) between the LabID and traditional CDI surveillance methods, and for the SIR.

conclusions. LabID surveillance resulted in a higher hospital-onset CDI incidence rate than did traditional surveillance. Hospital-specific rankings varied based on the HO-CDI surveillance measure used. A clear understanding of differences in CDI surveillance measures is important when interpreting national and local CDI data.
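The rate and SIR measures reported above follow standard definitions: an incidence rate of events per 10,000 patient-days, and an SIR of observed over expected events. A minimal sketch of that arithmetic is below; the event counts passed in are hypothetical values chosen only to illustrate the calculation, not the study's actual hospital-onset counts.

```python
def hocdi_rate(events: int, patient_days: int) -> float:
    """Hospital-onset CDI incidence rate per 10,000 patient-days."""
    return events / patient_days * 10_000

def sir(observed: int, expected: float) -> float:
    """Standardized infection ratio: observed events / expected events."""
    return observed / expected

# Hypothetical counts over the cohort's 708,551 patient-days,
# back-calculated to match the reported rates of 6.0 and 4.4:
labid_rate = hocdi_rate(425, 708_551)        # ~6.0 per 10,000 patient-days
traditional_rate = hocdi_rate(312, 708_551)  # ~4.4 per 10,000 patient-days

# Hypothetical SIR: 40 observed events vs 50 expected from a risk model.
example_sir = sir(40, 50.0)  # 0.8, i.e., fewer events than expected
```

An SIR below 1.0 indicates fewer events than predicted by the baseline risk adjustment; because LabID and traditional definitions count different event sets, the same hospital can rank very differently depending on which measure feeds the comparison.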