Background: A neutrophil count below 1500 cells per microliter has traditionally been used as the cutoff for neutropenia and has been considered a marker of increased susceptibility to infection and adverse prognosis. Despite the conventional use of this definition, no large studies have examined the direct relationship between low neutrophil counts and overall survival, particularly across ethnicities. The aim of this study was therefore to examine the prevalence of neutropenia and its prognostic impact among different ethnic groups.
Methods: The study cohort was drawn from an inner-city, multiethnic outpatient population and included 26,652 subjects aged 65 years or older. Cox proportional hazards models were built to compare the survival experience of neutropenic and non-neutropenic subjects within three separate ethnic strata. LOESS curves were used to graphically assess the relationship between mortality risk and absolute neutrophil count in each ethnic group.
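The group-wise survival comparison described above can be sketched with a minimal Kaplan-Meier estimator on synthetic data (the study itself used Cox proportional hazards models within ethnic strata; this pure-Python illustration, including its function name and toy inputs, is an assumption, not the authors' code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up time for each subject
    events: 1 if death was observed, 0 if the subject was censored
    Returns a list of (time, survival probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Handle all subjects tied at time t together
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk  # multiply in the conditional survival
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Toy comparison of two groups (times in months, all deaths observed):
neutropenic = kaplan_meier([3, 7, 12], [1, 1, 1])
normal_anc = kaplan_meier([10, 20, 30], [1, 1, 0])
```

In practice each curve would be plotted per group, and the hazard ratios reported in the Results would come from a Cox model adjusted for the listed covariates.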
Results: Using a cutoff of 1500 neutrophils per microliter, neutropenia was observed in approximately 0.6% of non-Hispanic Caucasians, 2.9% of non-Hispanic Blacks (African Americans) and 1.4% of Hispanics (p<0.001, chi-square test). The presence of neutropenia was associated with significantly shorter overall survival among non-Hispanic Caucasians (HR 2.14; 95% CI 1.40 - 3.30; p<0.01), whereas it conferred a lower, albeit not statistically significant, risk for African Americans (HR 0.80; 95% CI 0.60 - 1.06; p=0.12). There was a 26% increased risk of death for Hispanic neutropenic individuals compared to those with normal neutrophil counts, but this was also not statistically significant (HR 1.26; 95% CI 0.90 - 1.77; p=0.18). After adjustment for age, gender, Charlson comorbidity index (CCI) and other hematologic parameters (total white blood cell count, hemoglobin and platelet count), neutropenia remained an independent predictor of mortality for the non-Hispanic Caucasians (HR 1.78; 95% CI 1.11 - 2.71; p=0.015) and maintained a non-significant association with shorter overall survival in the non-Hispanic Black and Hispanic groups (HR 1.03; 95% CI 0.76 - 1.39; p=0.86 and HR 1.28; 95% CI 0.90 - 1.82; p=0.17, respectively). Next, to determine the neutrophil cutoff with optimal prognostic value among African Americans, we used LOESS regression curves and Cox proportional hazards modeling to show that mortality risk became significant at neutrophil counts below 1100 cells per microliter (Figure 1). Using this new cutoff, the prevalence of neutropenia within the African American subcohort was 0.6%, and it was significantly associated with a two-fold increase in the risk of death after adjustment for all of the above covariates (HR 2.01; 95% CI 1.22 - 3.32; p<0.01).
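The idea of reading a prognostic cutoff off a smoothed mortality-versus-count curve can be illustrated with a crude local-window smoother (a stand-in for LOESS; the function name, window width, and toy data below are assumptions for illustration only):

```python
def local_mortality(x0, counts, died, window=200):
    """Fraction of subjects who died among those whose absolute neutrophil
    count (ANC) lies within +/- window of x0 -- a crude local smoother
    standing in for LOESS regression of mortality on ANC."""
    nearby = [d for c, d in zip(counts, died) if abs(c - x0) <= window]
    return sum(nearby) / len(nearby) if nearby else None

# Toy data: ANC (cells/uL) and death indicator (1 = died)
anc = [900, 1000, 1050, 1600, 1700, 1800]
died = [1, 1, 0, 0, 0, 1]

# Scanning x0 over the ANC range would trace the mortality curve;
# the count below which the curve rises sharply suggests a cutoff.
low_end = local_mortality(1000, anc, died)
high_end = local_mortality(1700, anc, died)
```

In the study itself, the cutoff of 1100 cells per microliter was confirmed with Cox proportional hazards modeling rather than read directly from the smoothed curve.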
Conclusions: Using a large inner-city cohort, we demonstrated that neutropenia is an independent prognostic variable in the elderly. Our study suggests that the prognostic value of the currently used definition of neutropenia for overall survival varies among ethnicities. A cutoff of 1100 neutrophils per microliter may be a more prognostically relevant marker of neutropenia in African Americans and may serve as a potential threshold for intervention in this population.
No relevant conflicts of interest to declare.
Asterisk with author names denotes non-ASH members.