2015-11-06

A moral dilemma argument against clinical trials of incentives for kidney donation

Abstract

Commercial transplant tourism results in significant harm to both kidney donors and recipients. However, proponents of incentives for kidney donation assert that proper oversight of the process prevents these harms and that transplant numbers can be safely increased, alleviating the moral burden of poor end-stage kidney disease outcomes. In a moral dilemma analysis, the principle of preventing donor harm can be dissociated from the principles of providing benefits to the recipient and to society. It is plausible that an incentivized donor is fundamentally different from an uncompensated donor. Incentivized donors can experience harms unrelated to lack of regulation because their characteristics are determined by the incentive superimposed on circumstances of poverty. Moreover, creating a system of incentivized donation without established national registries for capturing all long-term donor outcomes would be morally inconsistent: without prior demonstration that donor outcomes are not income- or wealth-dependent, a population of incentivized donors cannot be morally created in a clinical trial. Socioeconomic factors adversely affect outcomes in other surgical populations, and interventions on income or wealth in these populations have not been studied. Coercion will increase in families not affected by kidney disease, where the incentive is knowledge of a new income source rather than of a potential recipient. In elective surgery such as kidney donation, donor non-maleficence trumps donor autonomy, recipient beneficence, and beneficence to society when these principles conflict. Yet we are still faced with the total moral burden of end-stage kidney disease, which belongs to the society that cannot provide enough donor kidneys. Acting according to one arm of the dilemma, to prevent donor harm, does not erase obligations towards the other, to provide recipient benefit. To resolve the moral burden, we as moral agents must rearrange our institutions to increase the supply of donor organs from other sources. The shortage of donor kidneys creates a moral burden for society, but incentives for donation will only increase the total moral burden of end-stage kidney disease.

Contraindications to kidney transplantation: uneven grounds?

Abstract

Background

Determining eligibility for a kidney transplant is an important decision. Practice guidelines define contraindications to transplantation; however, many are not evidence-based. Canadian guidelines recommend that patients unlikely to survive the wait period not be evaluated. The purpose of this study was to evaluate what proportion of patients with a contraindication would survive the wait time.

Methods

Consecutive incident dialysis patients (January 2006 to December 2012) with a contraindication, defined using Canadian guidelines, were studied. Mortality rates were determined for each individual contraindication. Theoretical survival to the median wait time to transplantation was calculated.
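The abstract does not state how theoretical survival was computed. As a minimal illustration, the Python sketch below evaluates survival to the median wait time under an assumed constant (exponential) mortality hazard; the rate and wait-time values are hypothetical placeholders, not the study's.

# Minimal sketch: probability of surviving to the median wait time,
# assuming a constant (exponential) mortality hazard. Both inputs are
# hypothetical placeholders, not values from the study.
import math

mortality_rate = 0.20       # hypothetical deaths per patient-year
median_wait_years = 3.5     # hypothetical median wait time

# Under an exponential model, S(t) = exp(-rate * t).
survival_to_wait = math.exp(-mortality_rate * median_wait_years)
print(f"Probability of surviving the wait: {survival_to_wait:.2f}")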

Results

Of 746 incident patients, 435 (58%) were deemed to have a contraindication at dialysis start. Nearly 80% had a contraindication associated with a high mortality rate (dementia, multisystem disease, etc.). Patients with these contraindications were more likely to die during the wait time than to survive to transplantation. Patients with non-adherence, obesity, or potentially reversible disease had relatively low mortality rates, were more likely to survive the wait time, and could potentially be transplanted later with the prospect of a better outcome.

Conclusions

This study lends some credence to the view that many patients with a contraindication are unlikely to benefit from transplantation. A better framework for defining contraindications is needed to support better decision-making.

Skin cancer in solid organ transplant recipients: are mTOR inhibitors a game changer?

Abstract

While immunosuppressive agents are necessary to prevent the rejection of transplanted organs, and are a great medical success story for protecting against early allograft loss, graft and patient survival over the long term are diminished by side effects from these same drugs. One striking long-term side effect is a high rate of skin cancer development. The skin cancers that develop in transplant recipients tend to be numerous, as well as particularly aggressive, and are therefore a major contributor to morbidity and mortality in transplant recipients. A likely reason for the high incidence of skin cancer is suppression of immune surveillance mechanisms, but other, more direct effects of certain immunosuppressive drugs are also bound to contribute to cancers of UV-exposed skin. However, over the past few years, evidence has emerged to suggest that one class of immunosuppressants, mammalian target of rapamycin (mTOR) inhibitors, could potentially inhibit skin tumour formation through a number of mechanisms that are still being studied intensively today. Therefore, in light of the high skin cancer incidence in transplant recipients, clinical trials have been conducted to determine whether mTOR inhibitors can significantly reduce these post-transplant skin malignancies. Here, the problem of post-transplant skin cancer is briefly reviewed, along with the possible mechanisms contributing to it, followed by an overview of the relevant clinical trial results using mTOR inhibitors.

Vitamin D and cinacalcet administration pre-transplantation predict hypercalcaemic hyperparathyroidism post-transplantation: a case-control study of 355 deceased-donor renal transplant recipients over 3 years

Abstract

Background

The effects of pre-transplantation medication for secondary hyperparathyroidism on post-transplantation parathyroid hormone (PTH) and calcium levels have not yet been conclusively determined. Therefore, this study sought to quantify the off-label use of cinacalcet and to identify predictors of its administration during the long-term follow-up of a cohort of deceased-donor renal transplant recipients. Furthermore, safety considerations concerning the off-label use of cinacalcet are addressed.

Methods

This was a case-control study of 355 stable renal transplant recipients. The patient cohort was divided into two groups. Transplant group A comprised patients who did not receive cinacalcet treatment, and transplant group B comprised patients who received cinacalcet treatment during follow-up after renal transplantation. The characteristics of the patients were evaluated to determine predictors of cinacalcet use after successful renal transplantation.

Results

Compared with the control individuals (n = 300), the cinacalcet-treated individuals (n = 55) had significantly higher PTH levels at 4 weeks post-transplantation (20.3 ± 1.6 versus 40.7 ± 4.0 pmol/L, p < 0.0001) when they were drug naive. At 3.2 years post-transplantation, cinacalcet-treated patients showed higher PTH (26.2 ± 2.3 versus 18.4 ± 2.3 pmol/L, p < 0.0001), higher calcium (2.42 ± 0.03 versus 2.33 ± 0.01 mmol/L, p = 0.0045) and lower phosphate (0.95 ± 0.04 versus 1.06 ± 0.17 mmol/L, p = 0.0021) levels. Individuals in the cinacalcet (verum) group were more likely to have received cinacalcet therapy whilst on the waiting list for transplantation (45.5% versus 14.3%, p < 0.0001), and they had higher pill burdens for the treatment of hyperparathyroidism during that period (1.40 ± 0.08 versus 0.72 ± 0.03 pills per patient, p < 0.0001). Regression analysis confirmed the associations between hypercalcaemic hyperparathyroidism and PTH levels at 4 weeks post-transplantation (p = 0.0001), pre-transplantation cinacalcet use (p < 0.0001) and the preoperative total pill burden (p < 0.0001). Renal function did not differ between the groups.
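As an illustration of the regression step described above, the Python sketch below fits a logistic model of hypercalcaemic hyperparathyroidism on the three reported predictors. The data, variable names, and effect sizes are hypothetical stand-ins, not the study's.

# Minimal sketch: logistic regression of hypercalcaemic
# hyperparathyroidism on the predictors named in the abstract.
# All data below are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 355
pth_4wk = rng.gamma(shape=4.0, scale=6.0, size=n)   # pmol/L, hypothetical
cinacalcet = rng.integers(0, 2, size=n)             # pre-transplant use (0/1)
pill_burden = rng.poisson(1.0, size=n)              # pills per patient

# Simulate the outcome with arbitrary illustrative effect sizes.
logit = -4.0 + 0.08 * pth_4wk + 1.2 * cinacalcet + 0.5 * pill_burden
outcome = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([pth_4wk, cinacalcet, pill_burden]))
result = sm.Logit(outcome, X).fit(disp=False)
print(result.summary(xname=["const", "pth_4wk", "cinacalcet", "pill_burden"]))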

Conclusions

Parathyroid gland dysfunction pre-transplantation translates into clinically relevant hyperparathyroidism post-transplantation, despite patients being administered more intensive treatment whilst on dialysis. PTH levels at 4 weeks post-transplantation might serve as a marker for the occurrence of hypercalcaemic hyperparathyroidism during follow-up.

Effect of left atrial and ventricular abnormalities on renal transplant recipient outcome—a single-center study

Abstract

Background

Premature cardiovascular (CV) death is the commonest cause of death in renal transplant recipients. Abnormalities of left ventricular (LV) structure (collectively termed uremic cardiomyopathy) and left atrial (LA) dilation, a marker of fluid status and diastolic function, are risk factors for reduced survival in patients with end-stage renal disease (ESRD). In the present analysis, we studied the impact of pre-transplant LA and LV abnormalities on survival after successful renal transplantation (RT).

Methods

One hundred nineteen renal transplant recipients (first transplant, deceased donors) underwent cardiovascular MRI (CMR) as part of CV screening prior to inclusion on the waiting list. Data regarding transplant function and patient survival after transplantation were collected.

Results

Median post-transplant follow-up was 4.3 years (interquartile range (IQR) 1.9, 6.2). During the post-transplant period, 13 patients returned to dialysis after graft failure and 23 patients died with a functioning graft. Survival analyses, censoring patients returning to dialysis, showed that pre-transplant LV hypertrophy and elevated LA volume were significantly associated with reduced survival after transplantation. Multivariate Cox regression analyses demonstrated that longer waiting time, poorer transplant function, presence of LV hypertrophy, higher LA volume on screening CMR, and female sex were independent predictors of death in patients with a functioning transplant.
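For readers who wish to reproduce this type of analysis, the Python sketch below fits a multivariate Cox model with the lifelines library. The column names and simulated data are assumptions for illustration, not the study's variables.

# Minimal sketch: multivariate Cox regression for post-transplant
# survival, treating patients without an observed death (including
# return to dialysis) as censored. All data are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 119
df = pd.DataFrame({
    "wait_time_years": rng.gamma(2.0, 1.0, n),
    "egfr":            rng.normal(50, 12, n),     # transplant function
    "lvh":             rng.integers(0, 2, n),     # LV hypertrophy on CMR
    "la_volume":       rng.normal(50, 10, n),     # LA volume on CMR
    "female":          rng.integers(0, 2, n),
})
df["years_followup"] = rng.exponential(6.0, n)
df["died"] = rng.binomial(1, 0.3, n)              # 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followup", event_col="died")
cph.print_summary()   # hazard ratios and p-values per covariate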

Conclusions

Presence of LVH and higher LA volume are significant, independent predictors of death in patients who are wait-listed and proceed with renal transplantation.

Multipotent adult progenitor cells decrease cold ischemic injury in ex vivo perfused human lungs: an initial pilot and feasibility study

Abstract

Background

Primary graft dysfunction (PGD) is a significant cause of early morbidity and mortality following lung transplantation. Improved organ preservation techniques will decrease the ischemia-reperfusion injury (IRI) that contributes to PGD. Adult bone marrow-derived adherent stem cells, including mesenchymal stromal (stem) cells (MSCs) and multipotent adult progenitor cells (MAPCs), have potent anti-inflammatory actions, and we thus postulated that intratracheal MAPC administration during donor lung processing would decrease IRI. The goal of the study was therefore to determine if intratracheal MAPC instillation would decrease lung injury and inflammation in an ex vivo human lung explant model of prolonged cold storage and subsequent reperfusion.

Methods

Four donor lungs not utilized for transplant underwent 8 h of cold storage (4°C). Following rewarming for approximately 30 min, non-HLA-matched allogeneic MAPCs (1 × 10⁷ MAPCs/lung) were bronchoscopically instilled into the left lower lobe (LLL) and vehicle comparably instilled into the right lower lobe (RLL). The lungs were then perfused and mechanically ventilated for 4 h and subsequently assessed for histologic injury and for inflammatory markers in bronchoalveolar lavage fluid (BALF) and lung tissue.

Results

All LLLs consistently demonstrated a significant decrease in histologic and BALF inflammation compared to vehicle-treated RLLs.

Conclusions

These initial pilot studies suggest that use of non-HLA-matched allogeneic MAPCs during donor lung processing can decrease markers of cold ischemia-induced lung injury.

Factors influencing survival after kidney transplant failure

Abstract

Background

The failure of a kidney transplant is now a common reason for initiation of dialysis therapy. Kidney transplant recipients commencing dialysis have greater morbidity and mortality than transplant-naïve, incident dialysis patients. This study aimed to identify variables associated with survival after graft failure.

Methods

All recipients of first, deceased donor kidney transplants performed in Northern Ireland between 1986 and 2005 who had a functioning graft at 12 months were included (n = 585). Clinical and blood-derived variables (age, gender, primary renal disease, diabetic status, smoking status, human leukocyte antigen (HLA) mismatch, acute rejection episodes, immunosuppression, cardiovascular disease, graft survival, haemoglobin, albumin, phosphate, C reactive protein, estimated glomerular filtration rate (eGFR), rate of eGFR decline, dialysis modality, and access) were collected prospectively and investigated for association with re-transplantation and survival. The association between re-transplantation and survival was explored by modelling re-transplantation as a time-dependent covariate.
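The sketch below illustrates, in Python with the lifelines library, how re-transplantation can be modelled as a time-dependent covariate: follow-up after graft failure is split into intervals so that a patient contributes "not yet re-transplanted" person-time before re-transplantation and "re-transplanted" person-time afterwards. The data and column names are hypothetical, not the study's.

# Minimal sketch: Cox model with re-transplantation as a time-dependent
# covariate. Long format: one row per interval of follow-up after graft
# failure. All values are hypothetical placeholders.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":    [1, 1, 2, 3, 3, 4, 5, 5, 6],
    "start": [0.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.8, 0.0],
    "stop":  [1.5, 6.0, 2.8, 2.0, 5.0, 3.1, 0.8, 5.2, 6.5],
    "retx":  [0, 1, 0, 0, 1, 0, 0, 1, 0],   # 1 after re-transplantation
    "age":   [52, 52, 67, 45, 45, 71, 58, 58, 63],
    "death": [0, 0, 1, 0, 1, 1, 0, 0, 0],   # event at end of interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="death",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for "retx" estimates the effect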

Results

Median follow-up time was 12.1 years. Recipients with a failing graft (158/585) demonstrated rapid loss of eGFR prior to graft failure, reducing the time available to plan for alternative renal replacement therapy. Median survival after graft failure was 3.0 years. In multivariate analysis, age and re-transplantation were associated with survival after graft failure. Re-transplantation was associated with an 88% reduction in mortality.

Conclusions

Optimal management of kidney transplant recipients with failing grafts requires early recognition of declining function and proactive preparation for re-transplantation given the substantial survival benefit this confers. The survival benefit associated with re-transplantation persists after prolonged exposure to immunosuppressive therapy.

Cascade plasmapheresis (CP) as a preconditioning regime in ABO-incompatible live related donor liver transplants (ABOi-LDLT)

Abstract

Background

ABO-incompatible live donor liver transplantation (ABOi-LDLT) is increasingly performed to bridge the gap between organ demand and supply. Different desensitization regimens are used to reduce blood group antibody titers for successful transplantation and graft accommodation. The authors used cascade plasmapheresis (CP) to reduce the titer of naturally occurring blood group antibodies to 16 or lower.

Material and methods

Four recipients of ABOi-LDLT were of blood groups O, O, B, and B, while their donors were of blood groups B, A, AB, and AB, respectively. The desensitization protocol included immunosuppressive drugs and plasmapheresis. CP consisted of separating the patient's plasma as the first step and passing it through a pore-size-based filter column as the second step. The first step was performed using a disposable kit (PL1, Fresenius Kabi, Germany) with minor modification on the COM.TEC apheresis equipment (Fresenius Kabi, Germany). The pore-size-based filter column used was the 2A column (Evaflux, Kawasumi Laboratories, Japan). Blood group antibody (immunoglobulin G (IgG)) titers were measured by column agglutination technology (Ortho-Clinical Diagnostics).

Results

Cases 1, 2, 3, and 4, with pre-CP titers of 1,024, 512, 32, and 64, required four, three, one, and one CP procedures, respectively. None of the patients exhibited signs of antibody-mediated rejection on histopathological evaluation. Successful organ engraftment was documented by post-operative liver function tests and liver biopsy.

Conclusion

Cascade plasmapheresis offers a cost-effective and efficient way to decrease blood group antibody titers and facilitates successful transplantation.

The feasibility and applications of non-invasive cardiac output monitoring, thromboelastography and transit-time flow measurement in living-related renal transplantation surgery: results of a prospective pilot observational study

Abstract

Introduction

Delayed graft function (DGF) remains a significant and detrimental postoperative phenomenon following living-related renal allograft transplantation, with a published incidence of up to 15%. Early therapeutic vasodilatory interventions have been shown to ameliorate DGF, and modifications to immunosuppressive regimens may subsequently lessen its impact. This pilot study assesses the potential applicability of perioperative non-invasive cardiac output monitoring (NICOM), transit-time flow monitoring (TTFM) of the transplant renal artery and pre-/perioperative thromboelastography (TEG) in the early prediction of DGF and perioperative complications.

Methods

Ten consecutive living-related renal allograft recipients were studied. Non-invasive cardiac output monitoring commenced immediately following induction of anaesthesia and was maintained throughout the perioperative period. Doppler-based TTFM was performed during natural haemostatic pauses in the transplant surgery: immediately following graft reperfusion and following ureteric implantation. Central venous blood sampling for TEG was performed following induction of anaesthesia and during abdominal closure.

Results

A single incidence of DGF was seen within the studied cohort, and one intra-operative (thrombotic) complication was noted. NICOM confirmed a predictable trend of increased cardiac index (CI) following allograft reperfusion (mean CI, clamped: 3.17 ± 0.29 L/min/m², post-reperfusion: 3.50 ± 0.35 L/min/m²; P < 0.05), mediated by a significant reduction in total peripheral resistance. Reduced TTFM at the point of allograft reperfusion (227 ml/min versus a cohort mean of 411 ml/min (95% confidence interval: 358 to 465)) was identified in a subject who experienced intra-operative transplant renal artery thrombosis. TEG data exhibited significant reductions in clot lysis (LY30 (%): pre-op 1.0 (0.29 to 1.71), post-reperfusion 0.33 (0.15 to 0.80); P = 0.02) and a trend towards increased clot initiation following allograft reperfusion.
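To illustrate the flow criterion used above, the Python sketch below computes the 95% confidence interval of the mean renal arterial flow and flags any measurement outside it. Only the 227 ml/min value is taken from the abstract; the remaining per-subject flows are hypothetical placeholders.

# Minimal sketch: flag a transplant renal artery flow measurement that
# falls outside the 95% confidence interval of the cohort mean. Flows
# other than 227 ml/min are hypothetical placeholders.
import numpy as np
from scipy import stats

flows = np.array([380, 425, 460, 395, 440, 410, 370, 455, 227])  # ml/min
mean = flows.mean()
ci_low, ci_high = stats.t.interval(
    0.95, df=len(flows) - 1, loc=mean, scale=stats.sem(flows))

for f in flows:
    if not ci_low <= f <= ci_high:
        print(f"{f} ml/min is outside the 95% CI "
              f"({ci_low:.0f} to {ci_high:.0f}) of the mean ({mean:.0f})")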

Conclusions

Reduced renal arterial blood flow (falling outside the 95% confidence interval of the mean) was able to accurately predict anastomotic complications within this pilot study. TEG data suggest the emergence of a prothrombotic state, of uncertain clinical significance, following allograft reperfusion. Abrogation of characteristic haemodynamic trends, as determined by NICOM, following allograft reperfusion may permit identification of individuals at risk of DGF. The findings of this pilot study mandate a larger definitive trial to determine the clinical applications and predictive value of these technologies.

Active video gaming in patients with renal transplant: a pilot study

Abstract

Background

Patients with renal transplant are at higher risk of mortality from cardiovascular disease (CVD) compared with the general population. Physical activity has been shown to reduce the risk of CVD mortality in these patients. Unfortunately, barriers such as the harsh Canadian climate prevent patients from engaging in physical activity and reaping its health benefits. This pilot study explored active video gaming (AVG) as a way for patients with renal transplant to obtain physical activity and examined its effect on their functional status and quality of life (QOL).

Main text

We recruited nine patients for an 8-week prospective pilot study. All patients received a Microsoft Xbox 360™ video gaming console, a Microsoft Kinect™ sensor, and the video game Your Shape Fitness Evolved 2012. Assessment of each participant before and after the intervention included blood pressure measures, a 6-minute walk test, and the Godin Leisure Time Questionnaire (GLTQ). We analyzed all nine patients at the end of the 8-week study period and found no changes in blood pressure or GLTQ scores. However, there was a significant increase in the 6-minute walk distance (P = 0.022), which represented a consistent increase for most patients (pre/post correlation = 0.977). In addition, participants over the age of 45 years (n = 4) were more likely to use the AVG system (P = 0.042).
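As an illustration of the paired pre/post analysis described above, the Python sketch below compares 6-minute walk distances before and after the intervention and computes the pre/post correlation. The walk distances are hypothetical placeholders for the nine patients, not the study's data.

# Minimal sketch: paired comparison of 6-minute walk distance before and
# after the 8-week AVG intervention. Distances are hypothetical.
import numpy as np
from scipy import stats

pre  = np.array([410, 455, 380, 500, 430, 395, 470, 440, 415])  # metres
post = np.array([435, 480, 400, 520, 455, 410, 495, 465, 430])

t_stat, p_value = stats.ttest_rel(post, pre)
r, _ = stats.pearsonr(pre, post)
print(f"paired t-test p = {p_value:.3f}; pre/post correlation r = {r:.3f}")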

Conclusion

AVG has the potential to improve the functional status in patients with renal transplant. Further research is required to corroborate the full health benefits of AVG in this patient population.
