Blood donor deferrals: biting the hand that feeds us!
Author(s) - Gorlin, Jed
Publication year - 2008
Publication title - Transfusion
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.045
H-Index - 132
eISSN - 1537-2995
pISSN - 0041-1132
DOI - 10.1111/j.1537-2995.2008.01990.x
Subject(s) - deferral, medicine, hematocrit, donation, blood donor, surgery, culling, demography, immunology, veterinary medicine, finance, herd, sociology, economics, economic growth
In this issue of TRANSFUSION, the article by Zou and colleagues details trends in donor deferral and retention using the American Red Cross database. The power of the study is the enormous experience culled from almost half the nation's blood supply. Prominent among the observations are the increasing deferrals for failing to meet minimum hemoglobin (Hb)/hematocrit (Hct) criteria and for donor travel, especially to areas considered to pose a potential risk for malaria. Travel deferrals for residing in countries with variant Creutzfeldt-Jakob disease (vCJD) risk showed a gradual fall, perhaps reflecting prior culling followed by self-deferral and general awareness of these criteria among repeat donors and the public at large.

More worrisome is the far higher rate of nonreturn among younger donors who receive a temporary deferral (2.2 times higher among 16- to 19-year-olds than among those 50 to 59 years old). Although this phenomenon has been observed before, this study serves to quantify the enormous lifetime loss of donors that results from these early deferrals. Whether it is because the donors have yet to bond with the center or because they have not yet incorporated "donorness" into their self-concept is being explored in various donor motivational studies, but the simple empirical observation is that temporary deferrals wreak the greatest havoc on the donors who represent the future of volunteer blood donation.

The magnitude of the deferrals should give pause, especially in light of their disproportionate effect on younger donors. Just as any new donor infectious disease screening test requires extensive validation, so too should our current and proposed donor screening questions. For example, the authors cite the apparent illogic that, although there is empirical evidence supporting the transfusion transmissibility of vCJD, no case of transfusion transmission from donors subsequently deferred for a family history of CJD has ever been observed, despite extensive lookback efforts. Hence, the cost (in terms of donor loss) of continuing to defer donors when even a distant family member is identified with CJD clearly exceeds any benefit. Fortunately, the number of these deferrals is relatively small.

Conversely, malarial travel represents the single greatest reason for deferral, with an estimated more than 90,000 donors deferred annually (Dr Roger Dodd, American Red Cross testimony before the Blood Products Advisory Committee, Sept 11, 2008). Even more potential donors are likely self-deferring, knowing that others who went on similar trips were deferred. Because many of these donors have traveled to Mexico, primarily to areas with minimal risk such as the "Mexican Riviera" region south of Cancun, there is an unintended emphasis on deferring the majority of donors for having traveled to an area unassociated with prior transfusion transmissions. Specifically, in the retrospective CDC study of the last 91 cases of transfusion-transmitted malaria, none in the past 45 years was associated with casual travel to Mexico, and only 3 of 64 were from civilian travelers.

The authors further observed an increasing rate of deferrals due to failure to meet Hct/Hb criteria. Since their screening technology was largely static during the period of data collection, they ascribe the high rates of donor loss either to moving the assay earlier in the donation process (generally, for logistic reasons, the donor interview stops at the time of first deferral) or to more aggressive recruiting of repeat donors.
Because the timing of the Hb test was a one-time change, if the trend persists one may safely conclude that donor erythropoiesis has become the major limiting factor, as blood centers engaged in donor iron studies have observed. Finally, the authors observed that because our current donor base is already so pedigreed and so safe relative to the total population, further modifications of the donor health history are unlikely to have measurable effects on the diseases we already screen for; the questions have likely reached the limit of their effectiveness. Of note, requiring donors to answer the high-risk questions on a computer elicits more disclosure of high-risk behavior than face-to-face interviews do, which should encourage greater use of computer-assisted self-interview (CASI) strategies.

Food and Drug Administration (FDA) requirements for clearance of new blood donor screening tests or software systems emphasize extensive validation. Yet historically, this same rigor has not been applied to implementing new donor screening questions. Indeed, the operating standard seems to have been that if adding a question might reasonably be expected to defer a donor at risk of transmitting a transfusion-borne pathogen, then the question was implemented, regardless of the query's positive or negative predictive value. This application of the precautionary principle was invoked in advocacy for a question about xenotransplantation, which fewer than 100 US citizens have undergone, and, most recently, in the implementation of a simian foamy virus screening question in Canada. The role of the AABB Uniform Donor

TRANSFUSION 2008;48:2484-2486.
