Intradialytic hypotension remains a leading problem, especially in elderly patients and those with cardiovascular compromise. This predominance can be explained by the fact that structural abnormalities of the heart and blood vessels increase patients' sensitivity to changes in fluid status. Intradialytic hypotension not only causes discomfort but also increases mortality, as recent data have shown that a low postdialytic blood pressure is associated with a significantly increased risk of death.[1] Prevention of intradialytic hypotension, especially in elderly and vulnerable patients, therefore remains an important challenge for the dialysis physician.

The pathogenesis of hemodialysis hypotension is multifactorial, but it generally results from a disturbance in any of the three major factors implicated in the maintenance of hemodynamic stability during hemodialysis: first, the refilling of blood volume from the interstitial to the intravascular compartment, so-called blood volume preservation; second, the constriction of resistance vessels such as small arteries and arterioles, leading to an increase in systemic vascular resistance; and third, the maintenance of cardiac output, which occurs through an increase in myocardial contractility and heart rate and through constriction of capacitance vessels such as venules and veins, leading to centralization of blood volume. In this respect, an intact cardiovascular system is of major importance in preserving hemodynamic stability.[2] There are strong arguments for an important role of temperature in the pathogenesis of the impaired vascular response during dialysis. Therefore, this review discusses the effect of dialysate temperature in the management of hypotension in dialysis patients.

Extracorporeal blood temperature

During standard temperature dialysis (37.0 °C to 38.0 °C), core temperature increases, which suggests an effect of the dialysis treatment on the regulation of core temperature.[3],[4],[5] The pathophysiologic mechanism behind this increase remains speculative. Gotch et al hypothesized that the phenomenon is caused by initial peripheral vasoconstriction, which reduces surface heat loss and thereby raises core temperature.[6] An increase in core temperature causes vasodilatation and might thus interfere with the normal vascular response to a decrease in blood volume.[7] In contrast, during isolated ultrafiltration, in the absence of dialysate and infusate, blood cools during extracorporeal circulation. Maggiore et al found a difference of 2.0 °C in blood temperature between the arterial and venous limbs of the extracorporeal circuit during isolated ultrafiltration.[8] More recent studies showed similar results.[9],[10]

In hemofiltration, heat transfer from the substitution fluid to the blood compensates for extracorporeal blood cooling. However, because of the relatively large extracorporeal circuit and a low infusate temperature, this compensation is only partial and depends on the infusate temperature. During hemodiafiltration with a relatively low exchange volume of 1 L/hr, extracorporeal blood temperature increases significantly, whereas it remains unchanged during hemodiafiltration with an exchange volume of 2.5 L/hr.[10]
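The extracorporeal heat exchange described above can be expressed as a simple thermal balance: the energy flow between patient and circuit is the product of blood flow, blood density, specific heat, and the arterio-venous line temperature difference. The sketch below illustrates this for the 2.0 °C cooling reported during isolated ultrafiltration; the physical constants and the blood flow rate of 300 mL/min are illustrative assumptions, not values from the cited studies.

```python
# Sketch: thermal energy flow between patient and extracorporeal circuit,
# J = c * rho * Qb * (T_venous - T_arterial). Negative values mean heat
# loss from the patient to the circuit. Constants are approximations.

RHO_BLOOD = 1052.0   # kg/m^3, approximate density of blood (assumption)
C_BLOOD = 3.64e3     # J/(kg*K), approximate specific heat of blood (assumption)

def energy_flow_watts(blood_flow_ml_min: float, delta_t_celsius: float) -> float:
    """Energy transferred per unit time across the extracorporeal circuit.

    delta_t_celsius is the venous minus arterial line temperature.
    """
    qb_m3_s = blood_flow_ml_min * 1e-6 / 60.0  # mL/min -> m^3/s
    return C_BLOOD * RHO_BLOOD * qb_m3_s * delta_t_celsius

# Isolated ultrafiltration example: venous line 2.0 C cooler than arterial
# at an assumed blood flow of 300 mL/min -> roughly -38 W lost to the circuit.
loss = energy_flow_watts(300.0, -2.0)
```

This also makes clear why compensation by substitution fluid in hemofiltration is only partial: a cooler infusate simply adds a second, smaller term of opposite sign to the same balance.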

During cold temperature dialysis (35.5 °C), core temperature remained unchanged, as shown in several studies.[11],[12],[13],[14] Moreover, the relation between predialytic core temperature and the change in core temperature during dialysis, demonstrated in one of our studies,[9] might explain the finding of Fine and Penner[5] that cold temperature dialysis is most effective in patients with a low predialytic core temperature, because in these patients the increase in core temperature during standard temperature dialysis is more distinct. Although a standard dialysate temperature of 37.5 °C to 38.0 °C is often chosen in the erroneous belief that this temperature is physiological, a substantial proportion of patients (9% to 73% across studies) have subnormal predialysis core temperatures of less than 36.5 °C.[5],[12],[13],[14],[15]

Dialysate temperature and hemodynamics

Since the early 1980s, the effects of different dialysate temperatures on hemodynamic stability have been investigated.[16],[17],[18],[19],[20],[21] Most authors who studied the relation between dialysate temperature and cardiovascular stability during hemodialysis found an improvement in hemodynamic stability when dialysate temperature was lowered to about 35.0 °C.[22],[23] A more pronounced increase in total peripheral resistance[17],[24],[25] and myocardial contractility,[26] in association with higher levels of plasma epinephrine,[22],[25] is probably responsible for these observations.

Baldamus et al observed an increase in systemic vascular resistance during the first 2 hours of dialysis treatment, which tended to decrease afterwards.[27] We observed only minor changes in forearm vascular resistance and venous tone during standard temperature dialysis, in contrast to isolated ultrafiltration and cold temperature dialysis, during which the vascular response was more pronounced.[7],[28],[29]

Energy balance and hemodynamics

Although the effect of dialysate temperature on hemodynamics is well recognised, information on the energy transfer rate between the patient and the extracorporeal system is scarce. As mentioned above, core temperature increases with the use of standard dialysate temperature. However, whether this is caused by heat load from the extracorporeal system or by an effect of the dialysis procedure on core temperature regulation is not known. Moreover, there is little detailed information on the difference in energy balance between standard and cold temperature dialysate. Recently, new devices have been developed that enable measurement of energy balances between the patient and the extracorporeal system.[30],[31],[32]

In a study by Schneditz et al,[33] arterial blood temperature increased with mild extracorporeal cooling but decreased slightly when heat flux was more negative (heat loss from the patient to the extracorporeal system). Moreover, this study showed that during cold temperature dialysis, the higher the arterial blood temperature and blood flow rate, the more negative the energy flow rates. Conversely, during standard temperature dialysis, energy flow rates were less negative and even became positive when arterial blood temperature was lower and blood flow rates were higher. In a recent study, the energy transfer rate between the extracorporeal circuit and the patient, as well as the blood pressure response, was assessed during the use of cold and standard temperature dialysate.[34] Core temperature increased during standard temperature dialysis despite a small negative energy balance from the patient to the extracorporeal circuit. During cold temperature dialysis, however, energy loss was much more pronounced, whereas core temperature remained stable and even increased in some patients with a low predialytic core temperature.
Core temperature generally remains stable during cool dialysis and decreases during isolated ultrafiltration because of pronounced energy loss from the patient to the extracorporeal circuit.[9] Thermal factors may therefore be of great importance in this respect. However, until recently it was not known to what extent differences in extracorporeal energy transfer or changes in core temperature were responsible for the divergent vascular response between isolated ultrafiltration and hemodialysis. We therefore studied the effects of isolated ultrafiltration, standard temperature hemodialysis, cold temperature hemodialysis, and hemodialysis matched for the same extracorporeal energy transfer rate as found during isolated ultrafiltration on changes in core temperature and vascular reactivity.[10] Core temperature decreased significantly during isolated ultrafiltration and during hemodialysis matched for the extracorporeal energy transfer rate, remained unchanged during cold temperature hemodialysis, and increased significantly during standard temperature hemodialysis. Vascular reactivity increased significantly during all treatments except standard temperature hemodialysis. Thus, when isolated ultrafiltration and hemodialysis were matched for the extracorporeal energy transfer rate, all differences in vascular response disappeared, showing that the difference in extracorporeal energy transfer rate was the single most important factor underlying the observed difference in vascular response between isolated ultrafiltration and hemodialysis. In contrast to standard temperature hemodialysis, vascular reactivity improved when the increase in core temperature was prevented during cold temperature dialysis, and it appeared to increase further when core temperature was lowered.
Finally, a recent study concluded that, in order to prevent an increase in core temperature, approximately 6% of energy expenditure must be removed through the extracorporeal circulation for each one percent of ultrafiltration-induced body weight change.[35]
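This rule of thumb can be turned into a rough bedside estimate, as sketched below. The 6%-per-1% factor comes from the cited study; the patient weight, ultrafiltration volume, and resting energy expenditure of about 80 W are illustrative assumptions only.

```python
# Sketch of the cited rule of thumb: remove ~6% of energy expenditure via
# the extracorporeal circuit per 1% of ultrafiltration-induced weight loss
# to keep core temperature stable. Patient figures below are assumptions.

REMOVAL_FRACTION_PER_PERCENT = 0.06  # from the cited study [35]

def required_energy_removal_watts(energy_expenditure_w: float,
                                  weight_kg: float,
                                  ultrafiltration_l: float) -> float:
    """Energy removal rate needed to prevent a rise in core temperature."""
    # 1 L of ultrafiltrate is taken as ~1 kg of body weight change.
    weight_loss_percent = 100.0 * ultrafiltration_l / weight_kg
    return energy_expenditure_w * REMOVAL_FRACTION_PER_PERCENT * weight_loss_percent

# Example: a 70 kg patient with 2.1 L ultrafiltration (3% of body weight)
# and an assumed resting energy expenditure of ~80 W -> about 14 W must be
# removed through the extracorporeal circulation.
removal = required_energy_removal_watts(80.0, 70.0, 2.1)
```

Such an estimate illustrates why the required cooling is modest compared with the tens of watts the circuit can remove, and why merely preventing the rise in core temperature is often achievable.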

A disadvantage of extracorporeal cooling is that a negative energy balance and a decrease in core temperature are uncomfortable for the patient. It may be sufficient, however, merely to prevent the increase in core temperature during dialysis sessions.[10],[36] New techniques enable the clinician to measure and model core temperature and energy transfer continuously during dialysis, and they may be of great help in the prevention of intradialytic hypotension.

Conclusion

The standard dialysate temperature should not be 37.5 °C but should be set at 35.5 °C. If it is still necessary to improve vascular reactivity in order to improve hemodynamic stability, decreasing core temperature by adjusting the energy transfer rate across the extracorporeal system might have an additional, though minor, beneficial effect.