This paper examines the current state of the blood supply in the US and focuses on the potential for augmenting blood availability by attention to the iron status of donors. Increasing demands are being made upon the national blood supply as rates of blood donation are declining, in part because of the loss of blood donors as a result of enhanced screening and testing procedures. Iron-related means of expanding the blood supply include the use of blood from individuals undergoing therapeutic phlebotomy for hereditary hemochromatosis and enhancing the retention and commitment of women of childbearing age as donors by using iron supplementation to prevent iron deficiency.
In Section I, Dr. Klein discusses the circumstances responsible for a decline in the population of eligible donors, including public attitudes toward donation, factors influencing the retention of donors by blood centers, and the effects of increased screening and testing to maintain the safety of the blood supply.
In Section II, Drs. Kushner and Ajioka focus on the consequences of the decision by the US Food and Drug Administration (FDA) to develop recommendations permitting blood centers to collect blood from patients with hereditary hemochromatosis and to distribute this blood without disease labeling if all other screening and testing requirements are met. After summarizing the pathophysiology of hereditary hemochromatosis, they consider the use by blood centers of blood obtained from heterozygotes and homozygotes for hereditary hemochromatosis.
In Section III, Dr. Brittenham reviews the use of low dose, short-term carbonyl iron supplementation for women donors of childbearing age. Replacing the iron lost at donation can help prevent iron deficiency in women of childbearing age and, by decreasing deferral, enhance the retention and commitment of women who give blood regularly. He emphasizes the use by blood centers of iron-related means to enhance recruitment and retention of blood donors.
I. The Shrinking Pool of Blood Donors
Harvey G. Klein, MD**
NIH Clinical Center/DTM, 9000 Rockville Pike, Bethesda MD 20892
According to the 2000 Nationwide Blood Collection and Utilization Survey conducted by the National Blood Data Resource Center (NBDRC), the most recent national data set available regarding blood collections,1 13,760,000 units of whole blood were collected in 1999. An additional 116,000 units of red blood cells were collected by the process of apheresis including the newly licensed 2-unit technique. Overall, red cell supply in 1999 was 13,876,000 units, a 10.1% increase over the collections reported in 1997, the last previous year for which national data are available.2 In view of this information, one could reasonably question whether there is a shortage of blood in the US and whether there is a problem regarding the size of the blood donor pool and the willingness of volunteers to donate blood. Nonetheless, blood usage has been rising more rapidly than has whole blood collection, and additional testing and screening standards, as well as a variety of social and demographic changes, have progressively pared down the number of eligible donors.
Protection of the Recipient
Blood donor deferrals are introduced for two purposes: to protect the health of the transfusion recipient and to protect the health of the donor. Although screening and testing of blood donors long predated the human immunodeficiency virus (HIV) epidemic, increasing concerns regarding blood safety during this period resulted in new emphasis on donor screening and testing. Measures introduced to increase blood safety have also had the unintended consequence of decreasing blood availability. Results from demographic studies indicate that certain donor groups or donor sites present an unacceptable risk of disease transmission. For example, blood collectors no longer schedule mobile drives at prisons or institutions for the mentally retarded because of the recognized high prevalence of transfusion-transmissible viruses.3,4 Few would argue with the risk-benefit ratio of these exclusions. More questionable were the temporary exclusions of soldiers exposed to multiple tick bites at Fort Chaffee, Arkansas, and the half-million Desert Storm veterans who were deferred for a year because of the fear that they might harbor Leishmania donovani, an agent not known to be associated with transfusion risk. Donors who have received human growth hormone injections have been indefinitely deferred because of the possible risk of transmitting Creutzfeldt-Jakob disease. The impact of this deferral on the blood supply has been negligible. In contrast, the recent exclusion of donors who resided in the United Kingdom for a total of six months or longer between 1980 and 1996, designed to reduce the theoretic risk of transmission of the human variant of “mad cow disease,” has eliminated an estimated 2.2% of US donors. The proposed expansion of this geographic exclusion to the European continent is estimated to eliminate another 2–8% of otherwise acceptable blood donors (Alan E. Williams, Ph.D., personal communication).
Additional donor exclusions appear to be on the horizon. The geographic exclusion for visitors to regions where malaria is endemic is longstanding, but the length of exclusion and provisions of the exclusion (length of stay, area of the country, possibility of mosquito exposure) have been debated for years and frequently modified. Geographic exclusions to address the transmission of Trypanosoma cruzi (the agent that causes Chagas' disease), Babesia microti (the parasite associated with babesiosis), and Borrelia burgdorferi (the agent associated with Lyme disease) have all been discussed. Such exclusions would likely have little impact on blood safety, but each shrinks the potentially eligible volunteer donor pool.
Donor medications constitute another significant area of deferral losses. Certain medications, for example etretinate (Tegison), isotretinoin (Accutane) and finasteride (Proscar), have been identified by the US Food and Drug Administration (FDA) as posing a risk to transfusion recipients because of their teratogenic potential at low plasma concentrations. For other medications, blood centers set their own policies. Most blood centers defer donors on antibiotic therapy, although such deferrals are generally short. These deferrals usually reflect the concern that antibiotics are being administered for some bacterial infection that might taint the collected unit. As rules and policies become more complicated, and as increasing numbers of Americans take prescription and non-prescription drugs, more and more donors are lost.
More troublesome are donor deferrals resulting from false-positive infectious disease screening tests. This problem has been recognized since the introduction of serologic tests for syphilis. However, over the past fifteen years the introduction of as many as seven new screening tests and the imminent licensure of nucleic acid testing (NAT) have resulted in numerous deferrals for “questionable” test results and either complex reentry algorithms or no approved method to re-qualify such donors. Each year an estimated 14,000 donors are deferred from donating blood for an indefinite period because of repeatedly reactive EIA screening tests for HIV and hepatitis C virus, and several hundred donors are deferred for apparently false-positive NAT tests (Louis Katz, M.D., personal communication). The American Red Cross does not engage in donor re-entry, nor do 40% of non-Red Cross community blood centers.
Protection of the Donor
Donor screening criteria are designed to protect the donor as well as the patient. In practice these criteria are not currently decreasing the donor pool to any substantial degree. The criteria are designed to identify individuals predisposed to postdonation reactions (e.g. small donors), to restrict donors for whom a postdonation reaction might have particularly severe consequences (e.g. those with coronary artery disease), and to protect the donor from iron deficiency as a result of frequent donation. In fact the most common cause of on-site donor deferral is failure to meet the hemoglobin standard (12.5 g/dL). The Red Cross transition from earlobe sampling, a practice that overestimated venous hematocrit, to fingerstick sampling in August of 2000 resulted in an immediate deferral of 6% of donors, primarily women. How many of these donors will eventually qualify for donation is uncertain. Whether the increasing use of noninvasive methods to detect asymptomatic vascular disease or the trend toward strict vegetarian diets leading to a reduction in iron stores will affect donor qualification and donations adversely remains to be seen.
Social and Demographic Issues
Comprehensive studies of donor motivation, attitudes toward blood donation, and decisions about participation and nonparticipation were carried out in the early 1980s.5 In a study using controlled populations in cities representative of different kinds of blood supplies (New York, Hartford, Houston) and interviewers skilled in seven languages, Drake et al concluded that donation is limited primarily by the actual need for blood. While there may be more active donors in other countries, the issue in the US is how to make collections efficient and predictable, not how to significantly increase the donor base. With an estimated 5% of the eligible population donating annually and 25% having donated at some time, there appeared to be no shortage in the potential donor pool.
Unfortunately the US population has changed dramatically since 1982, and no comparably detailed analysis of donors and donor motivation has been carried out. The American population, like that of most of Western Europe and Japan, is aging and the World War II generation of donors is disappearing. The country has become culturally more diverse, which raises issues of both recruitment strategies and deferral characteristics. No data are available to indicate whether the current US population is either as willing or as eligible to donate blood as the previous generation of volunteers. Conventional wisdom asserts that the current generation is less altruistic and thus less likely to volunteer to donate blood. I have seen no data to substantiate this opinion. It does appear that, in the wake of the AIDS epidemic, the popular impression of blood transfusion has evolved from a “gift of life” to a risk to be avoided, and, anecdotally, the image of blood donation has suffered as a result. During the height of the AIDS epidemic in the US, 25–50% of surveyed adults believed that donating blood could lead to their being infected with HIV.
The disappearance of large manufacturing plants, sites of large mobile blood drives in the past, and the reluctance of small employers to allow employees to participate more than once or twice a year may make recruitment more challenging but probably do not greatly shrink the donor pool. On the other hand, labor shortages have made employers less willing to provide time to donate blood during work hours (Ronald Gilcher, M.D., personal communication). The complicated and sometimes confrontational screening process, occasionally referred to as the “donor inquisition,” also may have discouraged otherwise eligible volunteers.
Longitudinal Studies of Blood Availability
The balance between transfusion demand and blood collection determines the adequacy of the blood supply. Demand drives the blood system. More than 95% of the transfusion demand originates in the approximately 4,500 hospitals in the US, and changes in demand may occur relatively abruptly. From 1987 to 1997 there was a pronounced decline in the number of allogeneic transfusions followed by an even greater decline in allogeneic collections (Figure 1). The margin of supply over demand, the “safety margin” of inventory, fell to 5.4% in 1997, about half of what it had been only two years earlier. Demand changed again in the subsequent two years as allogeneic transfusions increased more than 8%. Despite the increased collections, the margin between supply and demand was only 9.1%, a decline of 35.7% over that in the decade 1989–99. If demand for red blood cells continues to increase at the same rate, an additional 1.1 million units of blood will be required to meet demand in 2001 and to avoid further reduction of this margin. Autologous collections totaled an additional 651,000 units, an increase of 1.2% over 1997. By contrast, autologous transfusions declined 12.6% from 1997 and represented only 3% of all units of red cells transfused. The number of units discarded because of positive screening tests was 226,000.
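The "safety margin" arithmetic above can be sketched in a few lines. The collection figure below is the 1999 total quoted in the text; the transfusion (demand) figure is a hypothetical value back-calculated to reproduce the quoted 9.1% margin, not a number taken from the survey.

```python
# Sketch of the supply/demand "safety margin" calculation described above.
# collections_1999 is the figure quoted in the text; transfusions_1999 is
# a hypothetical demand figure chosen only to illustrate the arithmetic.

def safety_margin(collections: int, transfusions: int) -> float:
    """Fractional margin of supply over demand."""
    return (collections - transfusions) / transfusions

collections_1999 = 13_876_000    # whole blood plus apheresis red cells
transfusions_1999 = 12_718_000   # hypothetical, for illustration only

print(f"safety margin: {safety_margin(collections_1999, transfusions_1999):.1%}")
# -> safety margin: 9.1%
```

The same function applied to the 1997 figures would yield the 5.4% margin cited in the text.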
The number of units collected per thousand US inhabitants of usual donor age (18–65 years old) was 80.8 in 1999. While this compares favorably with the rate of 72.2 per thousand in 1997, it pales in comparison with the 100 units per thousand population collected in Switzerland. While it is treacherous to try to interpret these numbers, they do suggest that US collecting facilities are performing more efficiently. Data from the American Red Cross indicate that the average volunteer donates about 1.7 times a year (Jacqueline Frederick, American National Red Cross, personal communication). The red cell transfusion rate in 1999 was 45.5 units per thousand population, continuing an upward trend.1 Outdated red cells accounted for 5.3% of the supply, but given the fact that red cells can be transfused only to compatible recipients, the number of usable units outdated appears to be extremely small. More than 99% of group O units and 97% of group A units were transfused.
An estimate of supply adequacy is difficult to obtain. It is probably insufficient to record blood center and hospital inventories or orders partially filled or unfilled. In any case, such data are not available. Nonetheless, of the more than 2,500 hospitals surveyed, 6.6% reported that elective surgeries were cancelled on one or more days during the survey year because of blood shortages and 16.2% of responding hospitals reported at least one day in which non-surgical transfusion needs could not be met.1
Augmentation of Supply
Several approaches are being undertaken to augment the nation's blood supply. The American Association of Blood Banks has appealed to the US Department of Health and Human Services to support a national campaign for increasing public awareness of the need for blood. The American Red Cross has recently announced a $5 million national campaign, the first in its history, to recruit additional blood donors. New technology has made it possible to collect double units of red cells from selected donors and to freeze supplies more efficiently for better inventory control. It has been estimated that the use of hereditary hemochromatosis patients could add between 202,500 and 3 million additional donors to the pool.6 While the actual number of new donors and the contribution of red cell units from patients with hereditary hemochromatosis continues to be debated, in a small ongoing study at the NIH Clinical Center, some 4% of red cell transfusions currently come from donors with hereditary hemochromatosis who are homozygous for the C282Y mutation. Continuing to assemble information from those with hereditary hemochromatosis regarding management and safety may help address some of the concerns regarding the shrinking blood donor pool.
II. Hemochromatosis and the Blood Bank
James P. Kushner, MD**
University of Utah Medical Center, Dept. of Hematology/Oncology, 50 North Medical Drive, Salt Lake City UT 84132
Hemochromatosis is the most common monogenic inherited disease in people of European ancestry, occurring with a frequency of approximately 5 per 1000.1 The disorder is usually due to homozygosity for a mutation in the HLA-linked hemochromatosis gene (HFE), causing a change from cysteine to tyrosine at position 282 in the HFE protein (C282Y).2 Wild type HFE protein forms a heterodimer with β2-microglobulin, and the heterodimer is expressed on the surface of many cells as part of a high-affinity complex with the transferrin receptor.3,4 The C282Y mutation alters the configuration of the HFE protein, impairing the assembly of the transferrin receptor-HFE-β2-microglobulin complex in the endoplasmic reticulum.5 Formation of the complex reduces the affinity of the transferrin receptor for diferric transferrin, somehow resulting in down-regulation of cellular iron uptake through the diferric transferrin-transferrin receptor-mediated endocytic pathway.6–8 In the absence of HFE protein, cellular iron uptake via this pathway is enhanced, but the mechanism by which HFE regulates the transfer of dietary iron across the absorptive enterocyte and into plasma remains unresolved (Figure 2; color page 550).
Absorption of Dietary Iron
The average daily Western diet contains 15–25 mg of iron, but in iron-replete normal adults only about 1 mg is actually absorbed. The amount absorbed matches the amount lost each day within sloughed cells. There is no iron excretory pathway (Figure 3; color page 550). The absorptive behavior of the enterocyte is influenced by both the magnitude of body iron stores and by the rate of erythropoiesis.9,10 A “store regulator,” as yet uncharacterized, increases iron absorption when iron deficiency is present and reduces absorption when iron stores are increased. The amount of dietary iron that can be absorbed when iron deficiency is present is limited by the bioavailability of dietary iron. Iron absorption, in the absence of iron supplementation, rises to 3–4 mg daily when iron deficiency anemia is present. When iron overload is produced in genetically normal subjects, daily iron absorption is reduced to less than 0.5 mg.
Erythropoiesis also affects iron absorption, and the effects of the “erythroid regulator” seem more pronounced than those of the store regulator.9 The erythroid regulator appears to be related to erythron mass and not to erythropoietin. This point is emphasized by the findings in aplastic anemia, where erythropoietin levels are high, erythropoiesis is absent and iron absorption is not increased.11 The ability of the erythroid regulator to increase intestinal iron absorption is particularly evident when ineffective erythropoiesis is present (as occurs in thalassemia).12
Plasma iron, all of which is bound to transferrin, is derived from three cellular sources: the absorptive enterocytes of the duodenum and proximal jejunum; the macrophage; and parenchymal cells of the liver and other organs. The macrophage is the major source of plasma iron (Figure 4; color page 550). Studies utilizing radiolabeled iron have demonstrated that patients with hemochromatosis hyperabsorb iron from the gut, but the increase above normal is small, with absorption equaling 2 to 3 mg daily. As iron overload develops, iron absorption is down-regulated to normal or nearly normal levels, but in relation to the greatly increased iron stores, absorption is inappropriately high.13 During phlebotomy therapy, absorption rises to high levels, presumably in response to increased erythropoiesis.14–16 Once iron stores are depleted, absorption remains high. These data indicate that patients with hemochromatosis respond to the store regulator, but the down-regulation response is blunted. Patients with hemochromatosis appear to up-regulate iron absorption appropriately in response to iron depletion and to the erythroid regulator.
Export of iron from the macrophage (iron derived mainly from senescent red cells) is also accelerated in patients with hemochromatosis and explains the elevated transferrin saturation that serves as the most reliable phenotypic marker for the homozygous hemochromatosis genotype. The transferrin saturation is high before organ iron overload occurs, and remains high even after iron stores have been depleted by phlebotomy therapy.17
Phenotypic Expression and Screening
Phenotypic expression of the homozygous hemochromatosis genotype may vary from that of a fully penetrant clinical syndrome (with skin pigmentation, cirrhosis, cardiomyopathy, endocrinopathy, and arthritis) to a simple laboratory abnormality, namely an elevated percent saturation of transferrin. Numerous factors influence phenotypic expression, including age, sex, and modifier loci.18 The proportion of homozygotes destined to develop organ damage due to iron overload remains a controversial issue, mainly because of ascertainment biases in the reported series. All homozygotes identified because of clinical sequelae of iron overload have disease-related morbidity, whereas screening of healthy subjects generally uncovers few clinically affected homozygotes. An accurate estimate of the frequency of disease-related morbidity could be determined by a large-scale population-based screening project in which the entire population is screened, regardless of health status. Either a phenotypic screen (measuring the transferrin saturation) or a genotypic screen (detection of homozygosity for the C282Y mutation) could be employed. A multicenter, NIH-funded study evaluating both screening methods is now being carried out in the US.19 A population-based screening study in which HFE genotyping was used as the screening probe identified 16 C282Y homozygotes in a population of 3011 of Northern European ancestry, ranging in age from 20 to 79 years, living in a small city in southwestern Australia (5.3 per 1000).20 Half of the homozygotes detected had clinical features of hemochromatosis, but one-quarter had no evidence of iron overload.
An alternate approach to determining the frequency of disease-related morbidity in hemochromatosis homozygotes has been taken by Bulaj et al.18 Two hundred fourteen homozygous relatives of persons with hemochromatosis were identified in pedigree studies by using HLA typing and HFE genotyping. These homozygous relatives were considered to be clinically unselected, as they were ascertained without regard to health status. Nearly all underwent liver biopsy. Hepatic fibrosis, cirrhosis, abnormal liver function tests, and arthropathy served as objective indicators of disease-related morbidity. Eighty-five percent of the men (mean age, 41 years) studied had iron overload, as did 68% of the women (mean age, 44 years). In spite of the high prevalence of iron overload as measured by hepatic iron content, disease-related morbidity was documented in only 38% of homozygous men and 10% of homozygous women. With increasing age, the frequency of disease-related morbidity increased, particularly in homozygous men over 40 years of age. These data indicate that if hemochromatosis homozygotes are identified as young adults, most have no health-related consequences and would be ideal blood donors.
Hemochromatosis Homozygotes as Blood Donors
Recruitment of hemochromatosis homozygotes to serve as regular blood donors is dependent upon identifying homozygotes as young adults, before an appreciable incidence of iron-induced organ damage occurs. Thus, recruitment of donors and the implementation of large-scale screening programs are intimately connected issues. Data from the year 2000 US Census indicate that there are approximately 127,000,000 Americans of European origin between the ages of 20 and 74 years. The incidence of homozygosity for hemochromatosis in this group may approach 5 per 1000, or approximately 635,000 individuals, half men and half women. Identifying even a portion of these individuals would benefit the nation's blood supply, but concerns related to discrimination from health and life insurers and the reluctance of organizations that distribute blood and blood components to utilize donors with hemochromatosis require resolution.
The FDA does not currently prohibit the use of blood from individuals with hemochromatosis but requires that blood obtained through therapeutic phlebotomy be labeled with the donor's disease.21 Blood obtained from individuals with hemochromatosis is infrequently distributed for transfusion, as consignees have been reluctant to accept blood that is labeled with a disease. The labeling requirement has served as a barrier to the use of blood from donors with hemochromatosis even though this blood is as safe as blood from any other donors.22 The major barrier to the use of donors with hemochromatosis is the concern about creating an incentive to donate blood (free of charge) in contrast to paying for a therapeutic phlebotomy. In one study it was estimated that the average charge for a therapeutic phlebotomy is $48 when performed in the home, $52 in a blood center, $69 in a physician's office, and $90 in a hospital.23 A donor with hemochromatosis might have an incentive to deny disqualifying conditions in order to avoid the costs associated with therapeutic phlebotomy.24 A number of studies have shown higher rates of post-transfusion hepatitis when individuals with an incentive to donate (paid donors) rather than volunteer donors were utilized.25 These data may relate to the “high-risk” status of the populations from which these paid donors were recruited. Paid cytapheresis donors from a “low-risk” population were found to exhibit no increase in transfusion-transmittable viral infections compared to volunteer whole-blood donors.26
In the spring of 1999, the Public Health Service Advisory Committee on Blood Safety and Availability recommended that the Department of Health and Human Services (DHHS) create new policies removing the potential financial incentive for individuals with hemochromatosis to donate blood instead of paying for therapeutic phlebotomies. It was also suggested that the labeling requirement be eliminated as a barrier to the use of donors with hemochromatosis.27 In response, the FDA has made a commitment to consider case-by-case exemptions to existing blood labeling regulations, provided that the blood center can verify that therapeutic phlebotomy is provided free of charge even if the prospective donor with hemochromatosis does not meet allogeneic donor suitability requirements.28 In addition, the FDA will require that safety data be collected and submitted so that comparisons can be made with data gathered on the general donor pool. For the foreseeable future, therefore, blood centers planning to utilize donated hemochromatosis blood without labeling will have the responsibility of removing financial incentives for these donors and the administrative responsibility for additional data collection and submission.
A guidance document outlining the FDA's requirements for the variances required for blood centers wishing to use donors with hemochromatosis without labeling the donated blood is available on the web at www.fda.gov/cber/guidelines.htm. The key requirement is that the blood center verify that there will be no charge for phlebotomies performed on individuals with hemochromatosis, including those who do not meet allogeneic donor suitability requirements. This requirement would cast blood centers in the role of a provider of cost-free medical care for a portion of the population with hemochromatosis. It seems unlikely that many blood centers would wish to assume this role.
The treatment of hemochromatosis when hepatic iron overload is established involves an initial phase of rapid-sequence phlebotomy designed to eliminate excessive iron stores and minimize organ injury. Individuals with marked iron loading usually tolerate the removal of 500 mL of blood 4 to 5 times monthly until iron-limited erythropoiesis occurs.29 Less marked iron overload can be safely depleted with two 500 mL phlebotomies per month. There is no inherent reason to exclude blood obtained during the initial phase of treatment from the donor pool, but current guidelines limit the frequency of collection of blood from a donor to once every 8 weeks. A variance permitting blood centers to collect blood from donors with hemochromatosis more frequently would require either a physician's prescription for iron depletion through therapeutic phlebotomy or certification by a blood bank physician that the donor is in good health on the day of donation in accordance with the current regulatory code.30
The issue of utilizing donors with hemochromatosis more frequently than every 8 weeks should become moot if screening programs become accepted practice. The objective of screening programs is to detect hemochromatosis homozygotes before iron loading and organ damage occur. In iron-loaded homozygotes, once iron depletion is accomplished by rapid-sequence phlebotomy, iron balance can be maintained with two to six 500 mL phlebotomies annually.29,31 Healthy homozygotes detected through screening programs would be unlikely to require phlebotomy more frequently than every 8 weeks (approximately 6 times per year).
A 500 mL phlebotomy contains approximately 200 mg of iron. It follows that to maintain normal iron stores while donating blood 6 times annually would require the absorption of 1200 mg of dietary iron, or an average of 3.3 mg of iron daily. This figure approaches the amount of bioavailable iron in the average Western diet. A male with hemochromatosis can achieve this level of iron absorption, but both men and women who are genetically normal and donate frequently are likely to become iron depleted.32–34
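The iron-balance arithmetic above can be written out explicitly; all figures come directly from the text.

```python
# Worked version of the iron-balance arithmetic for a hemochromatosis
# donor on a maintenance schedule (figures quoted in the text).

IRON_PER_PHLEBOTOMY_MG = 200   # iron removed by one 500 mL phlebotomy
PHLEBOTOMIES_PER_YEAR = 6      # maintenance phlebotomy, roughly every 8 weeks

annual_loss_mg = IRON_PER_PHLEBOTOMY_MG * PHLEBOTOMIES_PER_YEAR  # 1200 mg
daily_absorption_needed = annual_loss_mg / 365                   # ~3.3 mg/day

print(f"{annual_loss_mg} mg of iron lost per year; "
      f"{daily_absorption_needed:.1f} mg/day must be absorbed to stay in balance")
```

Because 3.3 mg/day approaches the total bioavailable iron in an average Western diet, only donors who hyperabsorb iron (such as hemochromatosis homozygotes) can sustain this schedule without becoming iron depleted.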
The response by the FDA to the recommendations of the Public Health Service Advisory Committee on Blood Availability has provided a mechanism permitting blood centers to accept donors with hemochromatosis but the barriers to using this resource have not been eliminated. These barriers are not based on biological considerations. Patient advisory groups, physicians, blood centers and the FDA will have to work together to achieve a more workable arrangement for incorporating healthy individuals with hemochromatosis into the blood donor pool.
III. Carbonyl Iron Supplementation for Women Who Give Blood
Gary M. Brittenham, MD**
Columbia University, College of Physicians and Surgeons, Harkness Pavilion, Room HP 550, 180 Fort Washington, New York, NY 10032
The final section in this chapter will consider the use of low dose, short-term carbonyl iron to replace the iron lost at donation and to prevent iron deficiency in women of childbearing age. Carbonyl iron has a bioavailability similar to that of ferrous sulfate but is a safe, non-toxic form of elemental iron that virtually eliminates the risk of iron poisoning in children.1–3 After briefly reviewing the available evidence with respect to the need for iron replacement in women who are committed blood donors, we will summarize a draft protocol for a program of carbonyl iron replacement that was presented at the National Heart, Lung and Blood Institute meeting “Workshop on Maintaining Iron Balance in Women Blood Donors of Child-Bearing Age” (Bethesda, MD, June 8, 2001). The overall aim of the draft protocol is to provide recommendations for consideration by blood centers that wish to develop a program of carbonyl iron replacement for women of childbearing age.
A safe and effective means of preventing iron deficiency resulting from blood donation is needed for women who give blood.4–6 Phlebotomy of a unit of blood produces a loss of 200 to 250 mg of iron in hemoglobin. Because the average amount of storage iron in a woman of childbearing age is only about 300 mg,6 donation of a unit of blood requires the subsequent mobilization of much or all of this reserve. Further donations of blood will produce iron deficiency and then anemia. Normally iron balance is maintained by controlling gastrointestinal iron absorption; iron stores and iron absorption are reciprocally related so that as stores decline absorption increases. In women who donate blood repeatedly, iron absorption from a usual diet cannot increase sufficiently to replace iron losses from frequent phlebotomy. Because of menstrual losses of iron, which average 13.5 mg per month, menstruating women have a higher estimated daily iron requirement than men (1.5 vs. 1.0 mg).4 A minimum acceptable interval between donations of 56 days is recognized by the FDA and national blood collection services. For menstruating women who donate at this interval, an iron loss of 200 mg at donation increases their estimated daily iron requirement to 5.1 mg per day. Because the maximum iron absorption from a usual Western diet is at most 3–4 mg per day,6 a net deficit of 62–118 mg of iron would be expected in the 56-day interval. Repeated donations on this schedule make iron deficiency and then anemia inevitable.
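The 56-day deficit calculation above can be made explicit. The figures are those quoted in the text, with the combined daily requirement rounded to 5.1 mg/day as in the text.

```python
# Sketch of the 56-day iron-deficit arithmetic for menstruating donors,
# using the figures quoted in the text.

BASELINE_NEED_MG_PER_DAY = 1.5   # estimated daily requirement, menstruating women
DONATION_LOSS_MG = 200           # iron lost with one unit of whole blood
INTERVAL_DAYS = 56               # minimum interval between donations

# 1.5 + 200/56 = 5.07..., rounded to 5.1 mg/day as in the text
daily_need = round(BASELINE_NEED_MG_PER_DAY + DONATION_LOSS_MG / INTERVAL_DAYS, 1)

for max_absorption in (4.0, 3.0):  # quoted dietary maximum of 3-4 mg/day
    deficit = (daily_need - max_absorption) * INTERVAL_DAYS
    print(f"absorbing {max_absorption:.0f} mg/day leaves a "
          f"{deficit:.0f} mg deficit per 56-day interval")
```

Running the loop reproduces the 62–118 mg deficit range quoted in the text, which is why repeated donation on this schedule inevitably exhausts a typical 300 mg iron reserve.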
The national blood supply is provided by the voluntary donations of a small minority of the population; overall, about 45% of these dedicated donors are women, most of childbearing age.6,21 Because the major factor limiting the frequency of repeated blood donations is iron depletion, the limited iron stores of donors are one of the main determinants of blood availability. Virtually every investigation of the iron status of women who give blood has confirmed that iron deficiency is a common problem and the major factor limiting the frequency of repeated donation.6,16,17,19-21 Lack of iron is also the most important medical reason for deferral from repeat donation: among donors deferred because of a low hemoglobin concentration, evidence of iron deficiency was found in more than 70%.4
Three potential means of preventing iron deficiency in women of childbearing age who are committed blood donors are available: (i) further limitation of the frequency of blood donation, (ii) improved methods for detection and deferral of iron-deficient donors, and (iii) iron supplementation. Limiting donations by women of childbearing age to 4 per year in the absence of iron supplementation has been proposed but is unlikely to be adequate because, even among women who donate only once per year, the prevalence of iron deficiency is as high as 24%, as judged by a serum ferritin concentration < 12 μg/L.16 Attempting to exclude iron-deficient donors by increasing the hemoglobin concentration required for acceptability for donation would not be effective: only standards for hemoglobin concentration that would exclude most donors would protect against iron depletion.5 Improved screening methods for donors have been difficult to devise. Iron supplementation provides a potential means of preventing iron deficiency, but blood collection services have been reluctant to supply routine iron supplementation after blood donation because of the risk of accidental iron poisoning in the small children of donors that is associated with currently available preparations of ferrous sulfate and other iron salts. Although unit packaging of potent iron supplements is expected to decrease the risk, iron salts remain the leading cause of death from accidental poisoning in children in the US.18 Other possible disadvantages of supplementation programs include limited compliance due to side effects of ferrous salts, the possibility of masking underlying pathological conditions associated with blood loss, and the risk of giving iron to individuals with undiagnosed hereditary hemochromatosis.
Carbonyl Iron Supplementation
Low dose, short-term carbonyl iron after blood donation provides a potential method for iron replacement that avoids many of the risks and disadvantages associated with other iron supplementation programs. Carbonyl iron is a small particle preparation of highly purified metallic iron. “Carbonyl” describes the process of manufacture of the iron particles, not their composition. Heating gaseous iron pentacarbonyl (Fe(CO)₅) deposits metallic iron as submicroscopic crystals that form microscopic spheres of < 5 μm in diameter.3 The preparation is more than 98% pure. Carbonyl iron is inert and incapable of reacting with strong chelators of iron such as transferrin and deferoxamine. When given orally, carbonyl iron is much less toxic than ionized forms of iron such as ferrous sulfate. In humans, as in rats, the estimated lethal dose of oral ferrous sulfate is about 200 mg Fe/kg body weight.13,14,18 Adult human volunteers have taken oral doses of 10,000 mg carbonyl iron (about 140 mg Fe/kg or 70% of the lethal dose of iron as ferrous sulfate) “without deleterious effect.”4 Ethical considerations preclude direct studies of the toxicity of carbonyl iron in human infants and children. Formal toxicity studies of carbonyl iron in rats and guinea pigs found that the LD0 (the dose that all animals survive) was 10,000-15,000 mg Fe/kg and the lethal dose (LD100) was 50,000-60,000 mg Fe/kg.1
Studies in animals have been carried out to determine the mechanism by which carbonyl iron is absorbed and the reason for its low toxicity.8 Within the stomach, the metallic iron is oxidized to the ferrous form by the reaction Fe + 2HCl → FeCl₂ + H₂, in which the hydrogen ion is derived from hydrochloric acid. In brief, the low toxicity of carbonyl iron could be related to its pattern of absorption. Fatal amounts of iron may be absorbed through an anatomically intact intestinal mucosa. With ferrous iron, all the iron is in a soluble, ionized form that is potentially available for absorption. By contrast, with carbonyl iron only the fraction solubilized by gastric acid is available for absorption; in addition, the rate of solubilization is restricted by the rate of gastric acid production. Iron toxicity, the result of the deleterious effects of high concentrations of ionized iron, is minimized both by the rate of gastric acid production and by the equilibrium between the production of ferrous iron by solubilization in gastric acid and its discharge from the stomach to the intestine. Thus solubilization of carbonyl iron by gastric acid is a prerequisite for subsequent absorption. The slow rate of solubilization results in a more prolonged absorption, which is responsible for the low toxicity of carbonyl iron. After oral administration of equivalent amounts of carbonyl and ferrous iron, the amount of iron absorbed and the internal distribution of the absorbed iron are similar. These results indicate that, after solubilization, the fate of iron given in the carbonyl form is indistinguishable from that of iron given in the ferrous state.8 Thus carbonyl iron is an inexpensive form of iron with a safety margin 250 to 300 times greater than that of ferrous sulfate and other iron salts.
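The safety-margin figures quoted above follow directly from the reported lethal doses; a quick check (a sketch in Python, using only the values given in the text):

```python
# Values taken from the text; this is arithmetic only, not a toxicology model.
FERROUS_SULFATE_LETHAL_MG_PER_KG = 200       # estimated lethal oral dose in humans/rats
CARBONYL_LD100_MG_PER_KG = (50_000, 60_000)  # LD100 range in rats and guinea pigs

# Safety margin of carbonyl iron relative to ferrous sulfate:
margins = [ld / FERROUS_SULFATE_LETHAL_MG_PER_KG for ld in CARBONYL_LD100_MG_PER_KG]
print(margins)  # [250.0, 300.0] -- the "250 to 300 times" figure

# The 10,000 mg volunteer dose, in a ~70 kg adult (assumed body weight),
# relative to the lethal dose of iron as ferrous sulfate:
dose_mg_per_kg = 10_000 / 70                                # ~143 mg Fe/kg
fraction = dose_mg_per_kg / FERROUS_SULFATE_LETHAL_MG_PER_KG  # ~0.71, i.e. ~70%
```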
The principal potential advantage of carbonyl iron is its low toxicity. Studies in human volunteers have suggested that the overall bioavailability of carbonyl iron is high, about 70% that of ferrous sulfate, when expressed in terms of elemental iron.12 Other studies with healthy, nonanemic volunteers, with patients with iron deficiency anemia7,10 and with women who were blood donors9 showed that carbonyl iron treatment could correct iron deficiency anemia and replace iron stores. Because of these results, a regimen for carbonyl iron supplementation in women of childbearing age who were committed blood donors was devised that would replace iron losses from donation in all, or nearly all, donors with minimal or absent side effects. Carbonyl iron, 100 mg given once daily at bedtime, was chosen as the treatment regimen to be compared with placebo therapy. Administration at bedtime was chosen to allow the carbonyl iron to remain in the gastrointestinal tract for as long as possible without food buffering the stomach acid required for solubilization of the elemental iron. In a trial of carbonyl iron supplementation for blood donors11 with a randomized, double-blind design, women 18-40 years of age were given placebo or low-dose carbonyl iron, 100 mg po qhs, for 56 days after blood donation. Side effects with placebo and carbonyl iron were almost indistinguishable; capsule counts indicated that compliance with both regimens was similar. On the average, more iron was absorbed by donors who initially had no iron reserves (serum ferritin < 12 μg/L) than by those with some stores. Overall, enough iron was absorbed to replace that lost at donation in 85% of the carbonyl iron group but in only 29% of the placebo group (p < 0.001).11 This experience with single courses of supplementation provided the basis for subsequent extended trials of carbonyl iron supplementation in committed women donors of childbearing age.15
Draft protocol for carbonyl iron replacement for women of childbearing age who are committed blood donors
Participation in this program of iron replacement is restricted to women, 18 to 40 years of age, who are menstruating and meet other inclusion criteria (see below). After each successful donation, eligible women are offered a child-resistant bottle with 56 capsules, each containing 100 mg of iron as carbonyl iron, and instructions to take one capsule at bedtime until all capsules are gone. After completing the 56-day course of iron supplementation, each donor is again eligible to donate and, after each successful donation, receive another course of carbonyl iron replacement. The major features of the draft iron replacement protocol are shown diagrammatically in Figure 5.
Population eligible for participation in project; inclusion and exclusion criteria
Eligible women must also meet the following inclusion criteria: (i) they must intend to donate at least two units of blood in the coming year; (ii) they must wish to receive the iron supplement; (iii) they must satisfy all other blood center requirements for donation; (iv) they must successfully donate a unit of blood; and (v) they must be free of any family or personal history of hereditary hemochromatosis, intestinal polyps, colon cancer, or chronic gastrointestinal or other chronic medical problems. Donors not meeting these criteria will be excluded.
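For blood centers implementing the draft protocol in software, the inclusion screen above reduces to a simple conjunction of checks. The sketch below is illustrative only; the field names are hypothetical and not part of any blood-center system:

```python
from dataclasses import dataclass

@dataclass
class Donor:
    """Hypothetical record of the items in the draft protocol's inclusion screen."""
    age: int
    menstruating: bool
    planned_donations_next_year: int
    wants_supplement: bool
    meets_center_requirements: bool   # all other blood center requirements for donation
    donated_successfully: bool        # successfully donated a unit at this visit
    relevant_history: bool            # family/personal history of hemochromatosis,
                                      # polyps, colon cancer, chronic GI/medical problems

def eligible_for_iron_replacement(d: Donor) -> bool:
    """Apply the draft protocol's inclusion and exclusion criteria."""
    return (18 <= d.age <= 40
            and d.menstruating
            and d.planned_donations_next_year >= 2
            and d.wants_supplement
            and d.meets_center_requirements
            and d.donated_successfully
            and not d.relevant_history)
```

Each criterion is a hard requirement, so failing any single check (for example, a positive family history of hereditary hemochromatosis) excludes the donor from the program.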
Consent process and documentation
Eligible women will generally first learn about the iron supplementation program in a mailing from the Blood Center, which contains a brochure describing the program. Subsequently, after each successful donation, an iron supplement will be offered to each eligible woman. To receive the offered iron supplement, donors will be required at the time of each donation to read and review the Program Brochure and sign an Iron Supplement Form that includes a brief health questionnaire and the elements required for Informed Consent. Together the Program Brochure and Iron Supplement Form are designed to provide the information required for a standard Informed Consent Form but in a format consistent with that used by blood centers to obtain consent for blood donation.
All eligible donors who meet the program criteria will be offered participation in the program through a mailing from the Blood Center, and some Centers may wish to offer each eligible woman a special donor card identifying her as a participant. After each donation, a participant will receive fifty-six iron capsules in a child-resistant bottle, with instructions that one capsule should be taken daily at bedtime until all have been taken, a course of about 2 months after blood donation.
Risks and discomforts
The known risks of the program are (i) the risk of giving iron to individuals with undiagnosed hereditary hemochromatosis and (ii) the risk of masking underlying pathological conditions associated with blood loss. The inclusion and exclusion criteria, the use of low dose, short-term iron supplementation, and the restriction of provision of iron supplementation to only those women who successfully donate a unit of blood are all intended to minimize these risks. There may be other risks of participation as yet unknown.
Other features of the draft protocol
This program requires no additional testing of donors. The only additional expenses for the blood center are those associated with the provision of the carbonyl iron supplement (estimated cost about $1-$2/bottle). Women participating in the program may also take iron or other nutritional supplements on their own, without restriction by the program. Men, post-menopausal women, and women who are not eligible for donation for any reason are excluded from this program.
This draft protocol is presented to provide concrete recommendations for consideration by blood centers that wish to develop a program of carbonyl iron replacement for women of childbearing age. The primary goal of iron replacement in these women donors is to prevent iron deficiency, their most common medical cause of deferral from donation. By preventing iron deficiency and consequent deferral from donation, iron replacement enhances the retention and commitment of these dedicated women donors and improves their iron status even as they increase their blood donations. The adoption of low dose, short-term carbonyl iron supplementation for women of childbearing age who are committed blood donors as a standard procedure for blood centers could improve the iron status of these donors while increasing the national blood supply.
Dr. Klein serves on the board of directors for HaemoneticsCorporation and is scientific advisor for Sangart, Viacell,Zymequest, Vitex, Gambro-BCT, and Alliance.