**2. Natural course**

In 1968, when kidney transplant patients were first examined for the development of antibodies after graft failure, antibodies were detected in 11 (38%) of 29 patients who had rejected their grafts (Morris et al., 1969).

Why some patients in desensitization protocols developed AMR while others with similar baseline DSA levels did not has remained unexplained, owing to the lack of detailed post-transplant studies of these patients. Burns et al. (2008) aimed to define the natural history of AMR in highly sensitized patients undergoing positive cross-match kidney transplantation. They found that the serum DSA level after transplantation was the major determinant of AMR. Patients who developed high levels of DSA within the first month after transplantation almost invariably developed acute humoral rejection (AHR), whereas those who maintained low levels remained rejection-free. Importantly, more than half of the patients who had high DSA levels at baseline did not develop high levels after transplantation. Almost all patients, including those who developed AMR, showed a significant decrease or even disappearance of DSA early after transplantation (Gloor et al., 2004; Zachary et al., 2005). Because increases in DSA levels during AMR may thus be transient and self-limited in many patients, it is difficult to assess the effectiveness of therapy aimed at treating AMR.

During the 12th International Histocompatibility Workshop, a multicenter prospective study was initiated in which patients with functioning kidney transplants were tested once for HLA antibodies post-transplantation. The 806 patients without HLA antibodies had a subsequent 4-year graft survival of 81%, compared with 58% for the 158 patients with HLA antibodies; the presence of anti-HLA antibodies thus corresponded to roughly 5% additional allograft loss per year, or about 20% of grafts lost over 4 years (Terasaki et al., 2007).
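The "5% per year" figure can be sanity-checked against the reported survival rates. The short calculation below uses the 81% and 58% 4-year survival values from the text; the additive and compounded per-year rates are illustrative back-of-the-envelope derivations, not figures from the paper itself:

```python
# Sanity check of the 4-year graft survival figures (Terasaki et al., 2007).
# The per-year rates below are illustrative calculations, not from the paper.

surv_no_ab = 0.81   # 4-year graft survival without HLA antibodies
surv_ab = 0.58      # 4-year graft survival with HLA antibodies
years = 4

# Simple additive excess loss per year, matching the text's ~5%/year summary
excess_loss_per_year = (surv_no_ab - surv_ab) / years  # 0.23 / 4 = 0.0575

# Compounded (geometric) annual loss rate implied by each survival figure
annual_loss_ab = 1 - surv_ab ** (1 / years)       # with antibodies
annual_loss_no_ab = 1 - surv_no_ab ** (1 / years) # without antibodies

print(f"excess loss per year (additive): {excess_loss_per_year:.1%}")
print(f"annual loss with antibodies (compounded): {annual_loss_ab:.1%}")
print(f"annual loss without antibodies (compounded): {annual_loss_no_ab:.1%}")
```

The additive reading (23% excess loss spread over 4 years ≈ 5–6% per year) is what the text's "5% per year" summary reflects; a compounded rate would give slightly different per-year numbers.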

Among 512 patients followed for 1 year post-testing in São Paulo, 12% of antibody-positive patients lost their grafts, whereas graft failure occurred in only 5.5% of those without HLA antibodies (P=0.03) (Campos et al., 2006). These results were later updated, showing that at 3 years post-transplantation, patients without HLA antibodies had a 94% graft survival rate compared with 79% for those with HLA class II antibodies (Gerbase-DeLima et al., 2007).

In a large multicenter trial, HLA-specific antibodies were detected in 21% of patients with renal allografts and in 14–23% of patients with heart, liver or lung allografts (Terasaki &amp; Ozawa, 2004). Of 2,278 renal-allograft recipients who were followed prospectively, graft failure at 1 year occurred more frequently in patients who developed alloantibodies than in those who did not (8.6% versus 3.0%). Several studies have reported that de novo antibodies specific for graft HLA class I and class II molecules are a risk factor for premature graft loss as a consequence of renal and cardiac chronic arteriopathy (Michaels et al., 2003; Pelletier et al., 2002; Piazza et al., 2001).

For example, during a 5-year follow-up period, donor-reactive antibodies were present in 51% of patients with graft failure compared with 2% of stable control individuals, and the presence of antibodies preceded graft failure in 60% of cases (Worthington et al., 2003). Worthington et al. (2001) showed that among 12 patients who developed ELISA-detected HLA antibodies post-transplantation, 92% of the grafts failed, whereas among the 64 patients who remained antibody-negative, only 11% of the grafts failed (P&lt;0.001).

Thus, circulating HLA-specific antibodies are typically present months to years before graft dysfunction, indicating that antibody-mediated graft injury may be slow to develop.
