The value of the allogeneic graft-versus-leukemia effect in adult acute lymphoblastic leukemia (ALL) has now been conclusively demonstrated. While this is true for adults in all age groups, it may not be the best clinical option for young adults, for whom increasingly intensive pediatric protocols are clearly of benefit. On the other hand, there is potentially wider applicability of allogeneic donor transplantation for adults 25 to 45 years old, for whom matched unrelated donors may be as safe and effective as sibling donors, and for the patient older than 45 years, for whom reduced-intensity conditioning may be a promising way forward.
Since the treatment-related mortality of allogeneic transplantation remains significant, careful selection of patients is mandatory. Patients with the Philadelphia chromosome, those with t(4;11) and those with a complex karyotype remain transplant candidates, and allogeneic transplantation remains the best option for salvage, where achievable, in a remission beyond first.
As in childhood ALL, minimal residual disease studies may be extremely useful in predicting outcome and, therefore, strategy, but at present the data in adults are less definite. Clinical indications to harness the allogeneic effect will mature as the true value of pediatric protocols in adult patients, and the safety and efficacy of sibling, unrelated and reduced-intensity transplants, emerge in this disease.
Throughout the last 40 years the outcome of treatment in adult acute lymphoblastic leukemia (ALL) has lagged behind that of children, for whom long-term survival of well over 80% is regularly achieved. Adults have fared much worse: survival in the best series may not even reach 35% to 40% in those younger than 60 years.1-4 For those older than 60 years the outcome falls off steeply; survival of 10% or so is regularly seen. For adults, there have been two distinct approaches to treatment: those based on a relatively liberal use of transplantation, both allograft and autograft, and those that attempt to optimize chemotherapy on a "risk basis," reserving transplantation only for patients with the Philadelphia chromosome (Ph).
Allogeneic transplantation has remained a viable proposition in some clinical circumstances as it is seen to be the most effective antileukemic modality of all, better than any chemotherapy, passive antibody-induced immunotherapy or autologous transplantation. The "price" to pay for allogeneic transplantation has hitherto been relatively high. Patients have high treatment-related mortality (TRM), which is strongly correlated with age, and even those who survive may experience the acute and chronic morbidity of graft-versus-host disease. Allogeneic transplantation will remain a viable option for some patients in the following circumstances: 1) when the outcome data from transplantation, even as modified by TRM, are superior to those from conventional therapy; 2) if transplant morbidity and mortality can be reduced, so that more patients potentially benefit from transplant; and 3) if patients without a sibling donor, and those in a higher age group, can be offered a safe transplant as the outcomes of matched unrelated donor (MUD) and reduced-intensity conditioning (RIC) transplants improve. If newer treatments with chemotherapy, and perhaps targeted agents, produce better outcomes without transplantation, then fewer patients will benefit from transplant.
In adult ALL, the overall complete remission (CR) rate of 85% to 90% is very high3,5-7 and confirms the efficacy of induction regimens with relatively low toxicity, which allow a very high percentage of patients to receive post-remission therapy. The problem with this disease is therefore relapse from remission, despite the fact that, as in pediatric ALL, the treatment is initially very intensive throughout induction and consolidation, and maintenance treatment typically continues for an additional 2 years or more. The evidence thus suggests that many patients are undertreated. It is difficult, however, to see how conventional chemotherapy could contribute any more without the introduction of new drugs, of which there are very few. The possibility arises that monoclonal antibodies to surface antigens such as CD20, CD22 or CD52 may contribute to reducing relapse in this disease, in which remission rates are already very high.8-10
Before the definitive MRC/ECOG study, allogeneic stem cell transplantation had been accepted as being of value in second remission and for high-risk patients in first remission.6,11,12 In the MRC/ECOG study the outcome of allogeneic transplantation was best estimated from a donor versus no donor analysis for patients between 15 and 50 years old, or up to 55 years old as modified midway through the study.13 This eliminated the selection bias that would operate if only those who underwent transplantation were compared with those who did not. The advent of an increasing constituency of patients with a high-quality matched unrelated donor may challenge the donor versus no donor analysis as the way forward for the future. More and more patients who do not have sibling donors, and would previously have fallen into the "no donor" group, now have unrelated donors, and the results of these transplantations may be very similar in outcome to those from matched sibling donors.14 For the future, allogeneic transplantation may be more relevant for those who 1) have a high-quality matched unrelated donor, if not a family donor; 2) can undergo safe RIC allogeneic transplantation at an older age; or 3) can be more clearly identified as having a poor prognosis without an allogeneic transplant. These may include patients who have positive minimal residual disease (MRD) markers at an agreed point in their induction-consolidation therapy and those who have "poor prognosis" cytogenetics, who may not be cured by any treatment other than an allogeneic transplant.
On the other hand, in the future there may be a constituency who will possibly be less in need of allogeneic transplantation than they are today: 1) adolescents and young adults with this disease;15 2) those who may have a better outcome on conventional therapy with the addition of targeted agents, such as monoclonal antibodies to CD20 or CD22, and those with T-cell disease who may benefit by the addition of nelarabine to chemotherapy16; 3) those with Ph-positive disease who may be given imatinib in addition to conventional induction and consolidation therapy17-19; 4) those who are MRD negative at a certain point and therefore have a good prognosis without transplant20,21; and 5) those who have "good" cytogenetics who may do well without taking the risk of a transplant.
Is There Evidence for a Graft-versus-Leukemia Effect in Adult ALL?
There is clear evidence of a graft-versus-leukemia effect in this disease22; this was, in fact, initially recognized in ALL in 1979.23 This has led to initial acceptance of the use of allogeneic transplantation to reduce leukemic relapse at least in high-risk patients but less so until recent years in standard-risk patients. The recent large MRC/ECOG study confirms a major reduction in relapse by the allogeneic effect in both standard-risk patients and high-risk patients.13
What Are the Prognostic Factors that Make Patients Standard Risk or High Risk in Adult ALL?
Patients have been considered high risk, and therefore in theory better candidates for allogeneic transplantation, if they (a) carried the Ph chromosome, (b) had a high white count at presentation, (c) had a very immature phenotype, or (d) had a slow initial response.6,11,12 All patients over 35 years old were considered high risk irrespective of other factors. Immunophenotype is important, with T-cell disease perhaps having a better outcome than B-cell disease in adults.24 Many studies in adult ALL are risk-adapted, using one or more of these prognostic factors to determine the course of therapy after induction. One point that must be emphasized is that poor-risk disease should not "automatically" proceed to transplant, since data from the MRC/ECOG study are less convincing of the overall value of allogeneic transplant in poor-risk patients than in standard-risk patients.13
Can MRD Studies Indicate Which Patient Should Have a Transplant?
MRD studies have proved extremely useful for predicting outcome in childhood ALL, but there are less definitive data in adults. It is possible that a similar approach will be useful in determining which patients should be treated by chemotherapy alone and which patients should be considered for high-risk treatments such as bone marrow transplantation. The techniques involved include polymerase chain reaction (PCR) and amplification of immunoglobulin and T-cell receptor (TCR) gene rearrangements. Such molecular targets can be identified in the vast majority of patients with adult ALL and can potentially be used to identify different patient groups with different long-term outcomes. MRD can be identified at different time points in the disease and potentially allows the definition of different risk groups. In the GMALL studies a small percentage of patients, around 10%, had a rapid decline of MRD to lower than 10⁻⁴ and below the detection limit at day 11 and day 24; these are low-risk patients who had a low relapse rate at 3 years.25 Conversely, there was a high-risk group (approximately 23%) with an MRD of 10⁻⁴ or higher up until week 16; they had a 3-year relapse rate of 94%.25 These patients can clearly be considered for alternative therapy including allogeneic transplantation. The remaining group had a close to 50% relapse rate and represent the intermediate-risk group. All these patients might originally have been considered as "standard risk" without MRD data. Ten percent of patients can thus be identified who certainly do not need to be considered for the high risk of allogeneic transplantation, and a further 23% who may well be very strong candidates. Clearly, MRD values at any time point are dependent on the specific prior therapy given, and care must be exercised in extrapolating from one protocol to another.
While this theoretically better defines the strategy for defining transplantation candidates, it remains to be demonstrated that this is an effective strategy, since those with high MRD have an increased risk of relapse and may possibly need further conventional therapy to reduce tumor load before allogeneic transplantation.
Is There Evidence of a Donor versus No Donor Effect in Randomized Studies?
The French LALA 87 and LALA 94 studies both reported that there was an advantage for high-risk patients who had an HLA-compatible donor.11,12 This was confirmed in the GOELAM study.6 The Spanish group did not confirm these findings.26 The advantage for high-risk patients in having a donor was confirmed by a recent meta-analysis of seven published studies in ALL, which reported a significant advantage for sibling allogeneic transplant in high-risk patients with a donor compared with no donor.27
The utility of an allogeneic transplant for patients who are not at high risk is controversial. There have been various problems in these analyses: many of the studies are small, and the definition of "high-risk" features is itself variable across the studies. The MRC/ECOG trial13 addressed these issues. In this study all patients received virtually identical therapy irrespective of their risk assignment; this differed from the approach of most current or recent clinical trials of therapy in ALL. This highlights one of the most pivotal issues in the whole area: the question of a "risk-adapted" approach versus a "blanket" approach for a given population. The MRC/ECOG trial accrued almost 2000 patients over 13 years (1993 to 2006) and produced a very high CR rate; the 5-year overall survival (OS) was significantly improved in the 443 patients who had a donor versus the 588 Ph-negative patients who did not (53% vs 45%, P = .01). In contrast to other published studies, standard-risk patients also benefited from having a donor: the OS of the 239 with a donor was significantly improved over that of the 323 without (62% vs 52% at 5 years, P = .02) (Figure 1). In this trial the high-risk patients showed a high non-relapse mortality of 36% at 2 years, although the incremental risk of mortality over conventional therapy was only 22%.13 This abrogated the reduction in relapse produced by the allograft, such that the benefit of transplant was less clear among the high-risk patients. In this high-risk group, the 5-year survival for the 204 patients with a donor was 41% compared with 35% for the 261 patients without a donor (P = .2). The overall benefit from having a donor in standard-risk patients appears to have been present at all ages. The major issue was in patients older than 35 to 40 years, in whom a survival advantage could not be demonstrated.
The “donor versus no donor” approach is predicated on the presence of a sibling or matched family donor in the “donor” group. It has always been thought of as a form of biological randomization, since it has never been possible to carry out a randomized trial among those who actually have a sibling donor, randomizing half to the transplant and half not. With new forms of molecular typing we are getting closer to the point where it may be possible to identify unrelated donors so closely matched to the patients that the outcomes may be little different from those of sibling donor transplantations.14
Could Allogeneic Transplantation in Adult ALL Have Its Constituency Extended by the Use of Matched Unrelated Donor Transplants?
Marks et al14 reported retrospective outcomes from the Center for International Blood and Marrow Transplant Research (CIBMTR) for unrelated donor transplants in 169 patients with ALL in first remission. The 5-year TRM, relapse and OS were 42%, 20% and 39%, respectively. The relapse risk was modest, and thus nearly 40% of adults with ALL in CR1 survived 5 years after unrelated donor transplantation. Selecting closely HLA-matched unrelated donors should improve the results even further (Figure 2).
There are more encouraging data about unrelated donor (URD) transplants in this disease. Dahlke et al reported on 84 patients of a young age group (median recipient age 23 years), of whom 43 were transplanted in CR1.28 At 18 months of follow-up the estimated OS for all patients was 45%; there was no difference in outcome between unrelated and related donors. Kiehl et al described 264 adult patients receiving a myeloablative allogeneic stem cell transplantation for ALL at nine transplant centers in Europe.29 Of these, 221 patients received a matched related or unrelated graft. The authors did not observe any differences between matched related and matched unrelated donors in patients receiving transplantation in CR1: disease-free survival (DFS) at 5 years was 45% for patients transplanted in CR1 from a matched unrelated donor and 42% for those with a matched related donor. Some caution is advised in interpreting such retrospective comparisons. The selection of patients for URD transplants is often more stringent than for matched sibling transplants, possibly introducing a significant bias.
Patel et al reported a retrospective series of URD transplants in 48 adults with Ph-negative ALL in CR1 from the British Society of Blood and Marrow Transplantation Registry.30 The median age of these patients was 26 years, and 89% of them had at least one adverse prognostic factor. Sixty-six percent of transplants were matched and 34% were mismatched. T-cell depletion with alemtuzumab in vivo was carried out in 96% of recipients. OS, DFS and non-relapse mortality were 61%, 59% and 13% at 5 years, respectively. This clearly illustrates that T-depleted URD transplants can result in good OS and low TRM. Trials of this approach with high-risk patients in CR1 merit prospective comparison with other methods of GVHD prophylaxis.
One is forced to conclude from all of these studies that there is little disadvantage today for these patients in having a fully matched unrelated donor compared with those having a sibling donor for transplantation.
Can Reduced Intensity Conditioning Offer Anything to Patients, Particularly in the Older Age Group?
The rationale for RIC is that the toxicity of the conditioning regimen has been virtually removed, and the technique relies almost entirely on harnessing the graft-versus-leukemia effect. The procedure is becoming established in AML, but the experience in ALL is far more limited. Mohty et al looked at 97 adult patients with ALL from the EBMT Registry. In patients transplanted in CR1, OS was 52%, leukemia-free survival 42%, and non-relapse mortality 18%.31
The Spanish group32 also looked at the feasibility of RIC transplantation in 27 adult patients, of whom 44% were chemorefractory and 41% were Ph-positive. These patients had a median age of 50 years; the age profile is important as it potentially demonstrates applicability in general to older patients in first remission. In these patients the 2-year TRM was 23% and the 2-year OS was 31%. The Japanese group reported on 33 adult patients receiving RIC for ALL,33 13 in CR1; OS was 30% with a 2-year TRM of 21%. Very recently the group from Minnesota reported on 22 high-risk adult patients with ALL (14 of them in CR1) who received RIC. The OS, TRM and relapse rate at 3 years were 50%, 27% and 36%, respectively (Figure 3).34 Although some of these studies are difficult to interpret in general terms because they include a variety of patients with advanced disease, some with Ph-positive disease, and some patients who had previously had a myeloablative transplant, the outcomes are encouraging, with OS in many cases over 30% and a TRM somewhere between 10% and 20%.
What Is the Best Approach for Young Adults?
The one group that is not routinely considered for allogeneic transplants for ALL is adolescents and perhaps young adults with ALL. One of the issues is what exactly constitutes a “young adult.” Some protocols define adolescents as aged 15 to 25 years; in a few centers pediatric-type protocols are being used for adults up to the age of 45 years.15 Recent publications have compared pediatric and adult regimens for ALL in young persons, seemingly showing a superior outcome for young adults treated on typical pediatric regimens.15 It is almost certainly no longer possible to study the apparent superiority of pediatric protocols over adult protocols in these age groups in a prospective, randomized fashion; pediatric protocols have therefore been adopted despite the fact that all comparisons have been retrospective, and despite the recognized difficulty of comparing patients treated in adult settings with those treated in pediatric units. Pediatric protocols are generally more intensive, with more asparaginase and perhaps fewer gaps in therapy. In pediatric protocols and pediatric units, adherence to time and dose schedules may also be more exact, although this is far from clear.35 The biology of the disease changes quite radically between ages 15 and 20 years, and young adults following apparently similar protocols may differ subtly between those allocated to a pediatric unit and those in an adult unit in the same area. In the MRC/ECOG study, however, it should be noted that young adults with a donor had a superior outcome to those without.
Is Allograft Still Relevant for the Ph-positive Patients in the Era of Tyrosine Kinase Inhibitors?
The MRC/ECOG group recently reported the experience of 267 Ph-positive adults with ALL in the pre-imatinib era, showing an overall CR rate of 82% and an OS of 22% at 5 years; this included patients treated with chemotherapy alone, with sibling and matched unrelated allogeneic transplants, and a tiny number with autologous transplants.36 Only 28% of patients actually received the stem cell transplant that was intended for all patients in the study; patients did not receive such transplants if they were beyond the age limits or had an early event that prevented transplantation even when a donor was available. After adjusting as far as possible for the selection bias favoring receipt of transplant (for age, sex, presenting white cell count, etc), relapse-free survival remained significantly superior in the transplantation group compared with the chemotherapy group, although OS was not significantly better for the group with a sibling donor (34% vs 25% at 5 years). However, a substantial proportion of patients in the “no sibling donor” group received MUD or mismatched stem cell transplants; when the analysis is repeated with MUD, mismatched and non-myeloablative stem cell transplants censored, a slightly increased difference remains between the two groups, with a 5-year survival of 36% for the sibling donor group and 23% for the no donor group.
In published nonrandomized studies of patients with de novo disease, imatinib appears to improve the CR rate compared with historical controls, as was the case in the Japanese adult leukemia study group trial.17 The CR rate in the historical control arm in that study was low at 51%, whereas in the MRC/ECOG study 83% went into CR on chemotherapy alone. Imatinib appears to increase the number of patients going into CR, and in most studies this is followed by an increased number of patients actually receiving allogeneic transplants in CR1, with typically good outcome. Imatinib alone without the stem cell transplant is unlikely to make a significant contribution to long-term survival. In a recent study, 55 older patients (median age 68 years) were randomized to imatinib alone in induction or to induction chemotherapy followed by consolidation chemotherapy together with imatinib. The CR rates were 96% and 50%, respectively, but there was no difference between the two cohorts in overall survival.19 The initial benefit of imatinib in improving CR may therefore not translate into improved survival; this may relate to the fact that abnormal tyrosine kinase (TK) activity alone is not entirely responsible for the phenotype of Ph-positive ALL. It appears that although Src kinases are dispensable for the development of CML, they are absolutely required for the development of Ph-positive ALL.37 It may be that simultaneous inhibition of tyrosine kinase and Src kinase would be better for Ph-positive ALL and that using imatinib alone is not enough. There may be a role for TK inhibitors both during induction and post stem cell transplantation, but there are no prospective studies as yet. We can conclude that, as of 2009, the need for allogeneic stem cell transplantation has not been eliminated by TK inhibitors in the treatment of adult ALL.
Do Other Cytogenetic Subgroups and Oncogene Expression Influence the Indications and Outcome for Allogeneic Stem Cell Transplantation?
Additional Chromosome Abnormalities in Ph-positive ALL
In the MRC/ECOG study38 more extensive analysis revealed the impact of cytogenetics on relapse-free survival. Patients with a Ph chromosome, t(4;11)(q21;q23), t(8;14)(q24.1;q32), complex karyotype or low hypodiploidy/near triploidy had inferior rates of DFS and OS when compared with other patients. In contrast, patients with hyperdiploidy or with a del(9p) had a significantly improved outcome.38 The knowledge of these cytogenetic abnormalities and further investigation of their pathological significance could contribute to a stratified biological classification of adult ALL in the future based on cytogenetics, much like in AML.
t(1;19)/E2a/Pbx1 Oncogene and t(4;11)/MLL-AF4
Data from the LALA 94 study indicate that adult patients with these B-lineage abnormalities should probably undergo allogeneic transplantation in first remission.39 There were 58 patients in the study with either t(1;19) (n = 24) or t(4;11) (n = 34). CR rates for these two abnormalities were similar to those of other B-cell ALL patients; while in CR, patients with a donor were assigned to allogeneic stem cell transplantation (n = 22) and the remainder were randomized between auto SCT (n = 15) or chemotherapy (n = 8). In both cytogenetic groups the DFS was higher in the allo stem cell transplantation arm compared with the auto SCT and chemotherapy arms, and the authors suggest that chemotherapy or autograft intensification did not overcome the poor prognosis of adults with either of these cytogenetic abnormalities. It is important to emphasize that in the large MRC/ECOG study t(1;19) was not associated with poor prognosis.38
TLX1 (Hox 11) Oncogene Expression
Patients with TLX1 oncogene expression have been studied by Ferrando et al.40 The probability of death from leukemia in TLX1-positive patients at a median follow-up of 7 years was 12%, compared with 44% for all other cases of T-cell ALL. Overexpression of TLX1 confers a better outlook for adults with T-cell ALL and may mean that allogeneic bone marrow transplantation is not necessary for these patients.
Notch1 and FBXW7 Mutations in Adult T-cell ALL
The prognostic implications of NOTCH1 and FBXW7 mutations have recently been reviewed in the GRAALL study.41 NOTCH1 and FBXW7 mutations lead to activation of the NOTCH1 pathway and are among the most frequent mutations in T-cell ALL. There was no significant correlation between NOTCH1 and FBXW7 mutations. By multivariate analysis, the presence of NOTCH1/FBXW7 mutations was an independent favorable prognostic factor for DFS and OS. Although these data confirm that patients with NOTCH1 pathway activation by either NOTCH1 or FBXW7 mutation appear to have a better prognosis, they are currently insufficient to influence therapy in adults and therefore the choice of whether or not to undertake bone marrow transplantation.
Allogeneic Transplantation for Patients with Relapsed/Refractory Disease
Two large studies in adults, one from the MRC/ECOG group42 and one from the LALA group,43 confirmed that in most cases salvage after relapse is very unlikely. Patients who fare worst are those with the shortest first remission and those who are older. Perhaps half are unlikely to reach another remission, and many will not be eligible for hematopoietic stem cell transplantation for reasons such as performance status and age. There is minimal evidence that the occasional patient with refractory leukemia can benefit, and registry data sometimes show an apparently useful retrieval of relapsed or refractory adult ALL. However, it remains true that for the younger adult patient achieving a second remission, some form of allogeneic transplantation remains the best chance for long-term survival.42,44
Allogeneic stem cell transplantation remains the single most potent therapy for the reduction of relapse in adult patients with ALL. The largest ever randomized trial in this disease, the MRC/ECOG study, which studied donor versus no donor transplantation, firmly established the benefit of having a sibling donor. This benefit was true of patients at all ages in the study (between 15 and 50 years). However, the donor versus no donor approach is in danger of becoming outmoded, as in many studies those previously in a “no donor” category will now undergo matched unrelated donor transplants or haploidentical transplants, and possibly also cord blood transplants. There is increasing evidence that the outcome of unrelated donor transplants in this disease may be not very different from that of a sibling transplant. Hitherto, transplants have been ruled out for patients over the age of 50 or 55 because of the potential toxicity of the procedures, although those patients have by far the worst prognosis on conventional therapy. It is now possible that RIC transplants may offer an allogeneic effect for this group at a level of safety hitherto unobtainable.
Finally, MRD studies, which need to be worked through considerably further in adults, may offer the possibility of identifying two groups of patients: those who do not need a transplant at all, and those who have a greater predisposition to relapse and may therefore benefit from a transplant, either immediately or at the first molecular sign of relapse.
Disclosures Conflict-of-interest disclosure: AHG declares no competing financial interests. JMR is a consultant for Teva Pharmaceuticals and EpiCept Corporation. Off-label drug use: None disclosed.
UCL Hospitals, London, United Kingdom; Rambam Medical Center and Technion – Israel Institute of Technology, Haifa, Israel