1.
BMC Health Serv Res ; 21(1): 283, 2021 Mar 26.
Article in English | MEDLINE | ID: mdl-33771133

ABSTRACT

Blackpool is one of the most deprived Local Authority (LA) areas in England; in April 2015 the Blackpool Better Start (BBS) Partnership was allocated £45 million over 10 years from the Big Lottery Fund (BLF) as one of five 'A Better Start' initiative areas in England. The aim of the 'A Better Start' initiative is to improve outcomes for children from conception to 3 years of age. Co-designed by professionals and the community, the Community Connector (CCx) programme employs residents to directly engage caregivers of children in seven of Blackpool's most socio-economically deprived wards. The CCx follow a socioecological framework which proposes that caregivers will be positively influenced to engage in early years activities because of connections to trained peers. Peer support models are commonly applied within targeted early years health settings (e.g., infant feeding support, literacy), yet their role in improving child outcomes at a universal level has received little attention. This paper focuses on caregiver-level evidence of the strategies employed by CCx, as part of an early stage pilot study supported by Frontiers of Innovation, the Harvard Centre on the Developing Child's research and development platform. The study collated attendance data from Children's Centres, which are publicly funded community centres providing information and activities for families with children 0-5 years of age. The study data included individual interactions between a CCx and a caregiver over a 1-year period (1st April 2018 - 31st March 2019). A sampling frame was created from which a total of 22 interviews with caregivers were undertaken in early years community settings. The interview data were thematically analysed; the findings highlighted the mechanisms by which CCx served to mediate service and caregiver communication boundaries, negotiate access to spaces, and encourage sustained engagement in longer term activities such as volunteering and training. Value was embedded by the CCx in their process of establishing and maintaining connections with caregivers through 'everyday' conversations, their individualised approach, and their demonstration of self-efficacy behaviours. Further research is required to review the impact of the CCx role on caregivers' recall of early years information; nevertheless, the study provided important learning for establishing formalised CCx programmes elsewhere and has implications for community health and early years policy and practice.


Subject(s)
Caregivers , Child Health , Child , England , Family , Humans , Infant , Pilot Projects
2.
Kidney Int ; 70(1): 24-5, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16763569

ABSTRACT

The importance of hemodialysis session length relative to small solute (e.g., urea) clearance has been debated for many years. Longer session length augments clearance of larger molecules and may facilitate ultrafiltration; however, the independent effects of session length on survival and other outcomes are unknown. In this report, we review two recently published observational studies examining the association between hemodialysis session length and survival. Prospective clinical trials will be required to resolve the debate.


Subject(s)
Kidney Failure, Chronic/mortality , Renal Dialysis , Humans , Kidney Failure, Chronic/therapy , Time Factors
3.
Kidney Int ; 70(1): 211-7, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16723982

ABSTRACT

New technology now supports direct online measurements of total dialysis dose per treatment, Kt. An outcome-based, nonlinear method for estimating target Kt in terms of ionic clearance measurements and body surface area (BSA) has been described recently. This is a validation study of the new method that evaluates the relationship between the (actual Kt-target Kt) difference and death risk. Patients with Kt measurements during March 2004 were identified (N=59,644). Target Kt was determined for each patient using the new method. Patients were then grouped by (actual Kt-target Kt) decile. They were also grouped by (actual URR-target URR) decile. Cox analysis-based risk profiles were constructed using those groupings. The (actual Kt-target Kt) difference profiles suggested improving death risk as Kt increased from below target to equal target. Risk ratios then flattened and remained so until (actual Kt-target Kt) reached the highest decile at which it appeared to improve, suggesting a possible biphasic profile. The (URR-target URR) risk profile was U-shaped. Death risk was related to the difference between the actual Kt and a target Kt value selected using the new nonlinear method. The method is therefore valid for prescribing and monitoring hemodialysis treatment.
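
A minimal sketch of the decile-based risk profiling described above, under stated assumptions: a pandas DataFrame with hypothetical columns actual_kt, target_kt, followup_days, and died (event indicator), and a middle decile chosen here as the Cox reference group (the abstract does not specify the reference); the lifelines package supplies the Cox model.

import pandas as pd
from lifelines import CoxPHFitter

def kt_difference_risk_profile(df: pd.DataFrame) -> pd.DataFrame:
    """Group patients by (actual Kt - target Kt) decile and profile death risk
    with a Cox model, using decile 4 (a middle decile) as the reference."""
    df = df.copy()
    df["kt_diff"] = df["actual_kt"] - df["target_kt"]
    df["decile"] = pd.qcut(df["kt_diff"], 10, labels=False)  # 0 (lowest) .. 9 (highest)
    dummies = pd.get_dummies(df["decile"], prefix="dec", dtype=float).drop(columns=["dec_4"])
    model_df = pd.concat([dummies, df[["followup_days", "died"]]], axis=1)
    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="followup_days", event_col="died")
    # Hazard (risk) ratios for each decile relative to the reference decile.
    return cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]]

The same grouping logic applies to the (actual URR - target URR) profile; only the difference column changes.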


Subject(s)
Kidney Failure, Chronic/mortality , Monitoring, Physiologic/methods , Online Systems , Renal Dialysis/standards , Body Surface Area , Female , Humans , Ions/analysis , Kidney Failure, Chronic/therapy , Male , Middle Aged , Prognosis , Risk , Urea/analysis
4.
Kidney Int ; 60(5): 1917-29, 2001 Nov.
Article in English | MEDLINE | ID: mdl-11703611

ABSTRACT

BACKGROUND: Medical communities often develop practice guidelines recommending certain care processes intended to promote better clinical outcome among patients. Conformance with those guidelines by facilities is then monitored to evaluate care quality, presuming that the process is associated with and can be used reliably to predict clinical outcome. Outcome is often monitored as a facility-specific mortality rate (SMR) standardized to the mix of patients treated, also presuming that inferior outcome implies a suboptimal process. The U.S. Health Care Financing Administration monitors three practice guidelines, called Core Indicators, in dialysis facilities to assist management of its end-stage renal disease program: (1) patients' hematocrit values should exceed 30 vol%, (2) the urea reduction ratio (URR) during dialysis should equal or exceed 65%, and (3) patients' serum albumin concentrations should equal or exceed 3.5 g/dL. METHODS: The associations of a facility-specific SMR with the fractions of hemodialysis patients not conforming to (that is, at variance with) the Core Indicators were evaluated during three successive years (1993 to 1995) in large numbers of facilities (394, 450, and 498) using one-variable and multivariable statistical models. Three related strategies were used. First, the association of the SMR with the fraction of patients not meeting the guideline was evaluated. Second, each facility was classified by whether its SMR exceeded the 80% confidence interval above 1.0 (worse than 1.0, Group 3), was less than the interval below 1.0 (better than 1.0, Group 1), or was within the interval (Group 2). The fraction of those patients who did not meet the Indicator guidelines was then evaluated in each group. Third, the ability of variance from Indicator guidelines to predict into which of the three SMR groups a facility would be categorized was evaluated. RESULTS: SMR was directly correlated with variance from the Indicator guidelines, but the strengths of the associations were weak, particularly for the hematocrit (R² = 2.2%, 5.6%, and 2.2% for each of the 3 years) and URR Indicators (R² = 2.6%, 0.6%, and 3.3%). It was stronger for the albumin Indicator (R² = 11.6%, 20.4%, and 21.8%). The fractions of patients falling outside of the Indicator guidelines tended to be higher in the highest SMR group. The groups were not well separated, however, particularly for the hematocrit and URR Indicators, and there was substantial overlap between them. Finally, although the likelihood that a facility would be a member of the high or low SMR group was associated with fractional variance from Core Indicator guidelines, the strengths of association were weak, and the probability that a facility would be a member of the high or low group could not be easily distinguished from the probability that it would be a member of the middle group. CONCLUSIONS: While there were statistical associations between the SMR and the fraction of patients in facilities who were at variance with these guidelines, they were weak, and variance from the guidelines could not be used reliably to predict a high or low SMR. Such findings do not imply that measures reflecting anemia, dialysis dose, or medical processes that influence serum albumin concentration are irrelevant to the quality of care. They do suggest, however, that more attention needs to be paid to these and other associates and causes of mortality among dialysis patients when developing care process indicator guidelines.
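
As a rough illustration of the one-variable analysis described above (not the authors' exact code), the sketch below computes the R² between facility SMR and the fraction of patients at variance with one Core Indicator; the DataFrame and column names are hypothetical.

import numpy as np
import pandas as pd

def smr_indicator_r2(facilities: pd.DataFrame, variance_col: str) -> float:
    """R² (in percent) of the one-variable association between a facility's
    SMR and the fraction of its patients at variance with a Core Indicator,
    e.g. variance_col = "frac_hematocrit_below_30"."""
    r = np.corrcoef(facilities[variance_col], facilities["smr"])[0, 1]
    return 100.0 * r ** 2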


Subject(s)
Practice Guidelines as Topic , Quality of Health Care , Renal Dialysis/mortality , Humans , Multivariate Analysis , Regression Analysis , Serum Albumin/analysis , Urea/metabolism
5.
Am J Kidney Dis ; 37(2): 267-75, 2001 Feb.
Article in English | MEDLINE | ID: mdl-11157366

ABSTRACT

Health care quality is assessed by profiling measures of care and/or health outcomes. However, outcome measurement tools such as the standardized mortality ratio (SMR) are often used without thorough validation of their strengths and limitations. Our study compared the dialysis facility-specific SMR and SMR-based rating using different statistical methods and followed them over time. All Fresenius Medical Care, North America dialysis facilities (n = 377) that contributed patient data from 1993 to 1995 (>103,500 patient-years) were included. Four distinct statistical methods (US Renal Data System [USRDS], Poisson, logistic, and Cox regression) were used to compute facility-specific SMRs and rank and classify facilities. The analysis compared the SMR and SMR-based rating of dialysis facilities between SMR methods and over time. Different methods produced statistically significant differences in SMR distribution (P < 0.05). The USRDS method produced SMR values that decreased over time (P < 0.001). Based on 90% confidence intervals to determine outliers, the SMR-dependent ranking of dialysis facilities varied by method (P < 0.001). SMR-based ranking was stable over time except for the USRDS method (P < 0.001). Contingency table analysis showed up to a 33% total misclassification rate between SMR methods when ranking facilities. The facility-specific SMR and SMR-based ranking are both sensitive to statistical technique. Because the SMR yields different results within a given year and over time, and because there is no demonstrable gold standard, conclusions based on any one technique are unstable and unreliable. Regulatory monitoring, actions, and/or performance awards based on this measure should be avoided. However, a facility-specific SMR estimated in any valid way may be useful as an epidemiological research tool.
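
For orientation, the quantity being compared is observed deaths divided by case-mix-expected deaths per facility. The sketch below shows one of the four approaches named above (a logistic model) in deliberately simplified form, with hypothetical column names and placeholder case-mix terms; it is not the USRDS, Poisson, or Cox implementation.

import pandas as pd
import statsmodels.formula.api as smf

def facility_smr_logistic(patients: pd.DataFrame) -> pd.Series:
    """Indirectly standardized mortality ratio per facility: observed deaths
    divided by deaths expected from a patient-level logistic case-mix model.
    Assumes one row per patient-year with columns facility, age, diabetes,
    and died (0/1); the case-mix terms are placeholders."""
    model = smf.logit("died ~ age + diabetes", data=patients).fit(disp=False)
    patients = patients.assign(expected=model.predict(patients))
    per_facility = patients.groupby("facility")[["died", "expected"]].sum()
    return per_facility["died"] / per_facility["expected"]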


Subject(s)
Ambulatory Care Facilities/statistics & numerical data , Data Interpretation, Statistical , Quality of Health Care , Renal Dialysis/mortality , Adult , Female , Health Services Research , Humans , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Male , North America , Regression Analysis , Total Quality Management
6.
Kidney Int ; 59(2): 738-45, 2001 Feb.
Article in English | MEDLINE | ID: mdl-11168957

ABSTRACT

BACKGROUND: The urea reduction ratio (URR), a measure quantitating solute removal during hemodialysis, is the fractional reduction of the blood urea concentration during a single hemodialysis treatment. The URR is the principal measure of hemodialysis dose in the United States. Based on studies of patients dialyzed prior to 1994, a minimum URR value of 65% was recommended to optimize survival. Because of new hemodialysis technologies and evolving demographics of the hemodialysis population, the relationship between the amount of hemodialysis and mortality was examined in contemporary cohorts. METHODS: This retrospective cohort included >15,000 patients per year receiving hemodialysis during 1994 through 1997. Each patient's URR was averaged for the three months prior to the beginning of each year. Mortality odds ratios were calculated for patients by URR. To determine the URR value above which no further improvement in mortality was seen ("threshold"), spline functions were tested in logistic regression models, both unadjusted and adjusted for case mix measures. The strength of fit for URR, defined by a range of candidate thresholds from 55 to 75%, was evaluated in increments of 1% for each year using spline functions. RESULTS: The median URR was 63.2, 65.4, 67.4, and 68.1% for 1994 through 1997, respectively. The median length of hemodialysis treatments increased only six minutes from the beginning to the end of the period of analysis. Using spline functions, the threshold URR values were 61.1, 65.0, 68.0, and 71.0% for 1994 through 1997 in models adjusted for case mix. The ratio of median URR to URR threshold decreased from 1.03 in 1994 to 0.97 in 1997. CONCLUSIONS: From 1994 to 1997, the median URR and the URR threshold for mortality benefit increased. Although an increased need in the amount of hemodialysis may be a consequence of changes in patients' demographic characteristics, the likely explanation(s) is a change in the dialysis procedure and/or blood sampling favoring higher URR values without changing the amount of dialysis provided. The recommended minimum URR of 65% appears to be too low to confer an optimal mortality benefit in the context of current practices.
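
For reference, the URR itself is just the fractional fall in blood urea nitrogen over one treatment; a minimal sketch with illustrative numbers follows.

def urea_reduction_ratio(pre_bun: float, post_bun: float) -> float:
    """Urea reduction ratio in percent: the fractional reduction of the blood
    urea concentration over a single hemodialysis treatment."""
    return 100.0 * (pre_bun - post_bun) / pre_bun

# Example: predialysis BUN 70 mg/dL, postdialysis BUN 24 mg/dL -> URR ~65.7%,
# just above the recommended minimum of 65% discussed above.
print(urea_reduction_ratio(70.0, 24.0))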


Subject(s)
Renal Dialysis , Aged , Cohort Studies , Differential Threshold , Female , Humans , Male , Middle Aged , Mortality , Odds Ratio , Retrospective Studies , Survival Analysis , Time Factors , Urea/blood
7.
Kidney Int ; 58(6): 2512-7, 2000 Dec.
Article in English | MEDLINE | ID: mdl-11115085

ABSTRACT

BACKGROUND: Although serum prealbumin is considered a valid indicator of nutritional status in hemodialysis patients, there is relatively little evidence that its determination is of major prognostic significance. In this study, we aimed to determine the independent association of serum prealbumin with survival in hemodialysis patients, after adjusting for serum albumin and other indicators of protein energy nutritional status. METHODS: Serum prealbumin was measured in more than 1600 maintenance hemodialysis patients. We determined the correlations among prealbumin and other indicators of nutritional status, including serum albumin, and bioimpedance-derived indicators of body composition. The relationship between serum prealbumin and survival was determined using proportional hazards regression. RESULTS: The serum albumin was directly correlated with the serum prealbumin (r = 0.47, P < 0.0001), but still explained <25% of the variability in prealbumin. Prealbumin was inversely related to mortality, with a relative risk reduction of 6% per 1 mg/dL increase in prealbumin, even after adjusting for case mix, serum albumin, and other nutritional indicators. The increase in risk with lower serum prealbumin concentrations was observed whether the serum albumin was high or low. CONCLUSION: In hemodialysis patients, the serum prealbumin provides prognostic value independent of the serum albumin and other established predictors of mortality in this population.


Subject(s)
Kidney Failure, Chronic/blood , Nutrition Assessment , Prealbumin/metabolism , Renal Dialysis , Serum Albumin , Aged , Female , Humans , Kidney Failure, Chronic/mortality , Male , Middle Aged , Predictive Value of Tests , Prognosis , Proportional Hazards Models , Protein-Energy Malnutrition/blood , Protein-Energy Malnutrition/diagnosis , Protein-Energy Malnutrition/mortality
11.
Am J Kidney Dis ; 35(4): 598-605, 2000 Apr.
Article in English | MEDLINE | ID: mdl-10739778

ABSTRACT

The urea reduction ratio (URR) and normalized treatment ratio (Kt/V) are related quantities that have become accepted measures of hemodialysis dose. Recent studies, however, have suggested that they combine two elements, both favorably associated with clinical outcome, as a single ratio. These elements, Kt and V, may offset each other, producing a complex quantity that does not reflect a true relationship between dialysis exposure and clinical outcome. This project explored and compared the associations of the URR and the "urea clearance x time" product (Kt) with mortality in a large sample of hemodialysis patients (37,108 patients) during 1998. Survival analyses using conventional techniques were the primary analytic tools. The relationship between URR and survival was U-shaped or J-shaped, with greater relative mortality at both extremes of the URR distribution than at its middle. Thus, identifying a threshold for adequate dialysis was not possible unless one also considers a threshold for overdialysis. Conversely, the association between Kt and outcome was much simpler, reflecting progressive improvement over the range of Kt evaluated here. These analyses suggest that such measures as URR and Kt/V are compound and complex, and that a simpler, more direct, measure, such as the Kt, should be considered to describe hemodialysis dose.
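
One way to see how Kt and V can offset each other inside a ratio is the crude single-pool relation Kt/V ≈ -ln(1 - URR), which ignores ultrafiltration and urea generation; this relation is an assumption added here for illustration and is not stated in the abstract.

import math

def approximate_kt(urr_percent: float, v_liters: float) -> float:
    """Approximate delivered Kt (liters) from URR and the urea distribution
    volume V, using the crude single-pool relation Kt/V = -ln(1 - URR).
    Ignores ultrafiltration and urea generation, so treat it as a rough guide."""
    return -v_liters * math.log(1.0 - urr_percent / 100.0)

# Two patients with identical URR but different V receive very different Kt,
# which is why a ratio like URR or Kt/V can mask the separate effect of each.
print(approximate_kt(65.0, 30.0))  # ~31.5 L
print(approximate_kt(65.0, 45.0))  # ~47.2 L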


Subject(s)
Renal Dialysis/methods , Urea/metabolism , Body Mass Index , Female , Humans , Male , Middle Aged , Renal Dialysis/mortality , Renal Insufficiency/therapy , Survival Rate
12.
Kidney Int ; 57(3): 1176-81, 2000 Mar.
Article in English | MEDLINE | ID: mdl-10720970

ABSTRACT

BACKGROUND: The link between dialysis "vintage" (length of time on dialysis in months to years) and survival has been difficult to define, largely because of selection effects. End-stage renal disease (ESRD) is thought to be a wasting illness, but there are no published reports describing the associations between vintage and body composition in hemodialysis patients. METHODS: We explored the relationships among vintage, nutritional status, and survival in a 3009-patient cohort of prevalent hemodialysis patients. Body weight, total body water, body cell mass, and phase angle by bioelectrical impedance analysis were the body composition parameters of interest. We examined vintage as an explanatory variable in multiple linear regression analyses (adjusted for age, gender, race, and diabetes) using body composition parameters and biochemical indicators of nutritional status as dependent variables. Proportional hazards regression was used to evaluate the association of vintage and survival with and without adjustment for case mix and laboratory variables. RESULTS: Dialysis vintage was 3.8 +/- 3.7 (median 2.6) years. Body composition parameters tended to be lower after dialysis year 2. Linear estimates per year of vintage beyond year 2 include -0.66 kg body wt (P < 0.0001), -0.17 kg total body water (P = 0.0003), -0.14 kg body cell mass (P < 0.0001), and -0.07 degrees phase angle (P < 0.0001). In unadjusted analyses, vintage was not associated with survival, either as a linear or higher order term. The adjustment for case mix yielded a vintage term associated with an increased relative risk (RR) of death (RR 1.04; 95% CI, 1.01 to 1.07 per year). A further adjustment for laboratory data yielded an RR of 1.06 (95% CI, 1.03 to 1.09 per year). CONCLUSION: Dialysis vintage is related to nutritional status in hemodialysis patients, with vintage of more than 2 years associated with a significant decline in all measured nutritional parameters. Cross-sectional analyses probably underestimate these effects. A year accrued on dialysis is associated with a 6% increase in the risk of death, all else equal. Longitudinal assessments of nutritional status, including body composition, are required to better understand the natural history of wasting with ESRD and its implications for long-term survival.
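
A hedged sketch in the spirit of the regression described in METHODS, with the vintage effect estimated only beyond year 2 (matching the reported "per year of vintage beyond year 2" estimates); all column names are hypothetical placeholders, not the authors' dataset.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def vintage_slope_beyond_year2(df: pd.DataFrame, outcome: str) -> tuple[float, float]:
    """Multiple linear regression of a body-composition outcome (e.g. body
    weight in kg) on dialysis vintage beyond year 2, adjusted for age,
    gender, race, and diabetes. Returns the slope per year and its p-value."""
    df = df.assign(vintage_after2=np.clip(df["vintage_years"] - 2.0, 0.0, None))
    formula = f"{outcome} ~ vintage_after2 + age + C(gender) + C(race) + diabetes"
    fit = smf.ols(formula, data=df).fit()
    return float(fit.params["vintage_after2"]), float(fit.pvalues["vintage_after2"])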


Subject(s)
Nutritional Status , Renal Dialysis , Adult , Aged , Body Composition , Cholesterol/blood , Cohort Studies , Female , Humans , Male , Middle Aged , Prealbumin/analysis , Proportional Hazards Models , Serum Albumin/analysis , Survival Analysis , Time Factors
13.
Am J Kidney Dis ; 35(2): 293-300, 2000 Feb.
Article in English | MEDLINE | ID: mdl-10676729

ABSTRACT

Short Form 36 (SF-36) is a well-documented health-related quality-of-life (HRQOL) instrument consisting of 36 questions compressed into eight scales and two primary dimensions: the physical and mental component scores. This tool was used to evaluate QOL among peritoneal dialysis (PD) and hemodialysis (HD) patients. The results of 16,755 HD and 1,260 PD patients (728 continuous ambulatory PD [CAPD] and 532 continuous cycling PD [CCPD]) completing an SF-36 during 1996 were analyzed. Three analyses of variance were performed, consisting of (1) no adjustment, (2) case mix (age, sex, race, and diabetes), and (3) case mix plus laboratory parameters. Compared with HD patients, PD patients were younger (P < 0.001), a larger fraction were white (P < 0.001), fewer had diabetes (P < 0.001), and they had lower serum albumin concentrations (P < 0.001) and higher creatinine, hemoglobin, and white blood cell count values (P < 0.001). Diabetes was present in a larger fraction of CCPD than CAPD patients (P < 0.001). HD and PD patients scored similarly for scales reflecting physical processes. PD patients scored higher for mental processes, but only after statistical adjustment for the laboratory measures. Scores on scales reflecting physical processes were worse, and those reflecting mental processes were better, among CCPD than CAPD patients. HD and CAPD scores were similar. CCPD patients perceived themselves as more physically impaired but better adjusted than HD or CAPD patients. These descriptive data show that perception of QOL among PD and HD patients is similar before adjustment, but PD patients score higher for mental processes with adjustment. CCPD patients score worse for physical function and better for mental function than either CAPD or HD patients. We cannot, however, exclude the influence of therapy selection.


Subject(s)
Peritoneal Dialysis , Quality of Life , Renal Dialysis , Surveys and Questionnaires , Female , Humans , Male , Middle Aged
14.
Kidney Int ; 56(5): 1872-8, 1999 Nov.
Article in English | MEDLINE | ID: mdl-10571796

ABSTRACT

BACKGROUND: Although accepted worldwide as valid measures of dialysis adequacy, neither the Kt/V (urea clearance determined by kinetic modeling) nor the urea reduction ratio (URR) has unambiguously predicted survival in hemodialysis patients. Because the ratio Kt/V can be high with either high Kt (clearance x time) or low V (urea volume of distribution), and V may be a proxy for skeletal muscle mass and nutritional health, we hypothesized that the increase in the relative risk of death observed among individuals dialyzed in the top 10 to 20% of URR or Kt/V values might reflect a competing risk of malnutrition. METHODS: A total of 3,009 patients who underwent bioelectrical impedance analysis were stratified into quintiles of URR. Laboratory indicators of nutritional status and two bioimpedance-derived parameters, phase angle and estimated total body water, were compared across quintiles. The relationship between dialysis dose and mortality was explored, with a focus on how V influenced the structure of the dose-mortality relationship. RESULTS: There were statistically significant differences in all nutritional parameters across quintiles of URR or Kt/V, indicating that patients in the fifth quintile (mean URR, 74.4 +/- 3.1%) were more severely malnourished on average than patients in all or some of the other quintiles. The relationship between URR and mortality was decidedly curvilinear, resembling a reverse J shape that was confirmed by statistical analysis. An adjustment for the influence of V on URR or Kt/V was performed by evaluating the Kt-mortality relationship. There was no evidence of an increase in the relative risk of death among patients treated with high Kt. Higher Kt was associated with a better nutritional status. CONCLUSION: We conclude that the increase in mortality observed among those patients whose URR or Kt/V values are among the top 10 to 20% reflects a deleterious effect of malnutrition (manifest by a reduced V) that overcomes whatever benefit might be derived from an associated increase in urea clearance. Identification of patients who achieve extremely high URR (>75%) or single-pooled Kt/V (>1.6) values using standard dialysis prescriptions should prompt a careful assessment of nutritional status. Confounding by protein-calorie malnutrition may limit the utility of URR or Kt/V as a population-based measure of dialysis dose.


Subject(s)
Renal Dialysis/mortality , Urea/metabolism , Adult , Aged , Female , Humans , Male , Middle Aged , Nutrition Disorders/metabolism , Nutritional Status
15.
Kidney Int ; 56(3): 1136-48, 1999 Sep.
Article in English | MEDLINE | ID: mdl-10469384

ABSTRACT

BACKGROUND: Protein-energy malnutrition is a strong predictor of mortality in maintenance hemodialysis (MHD) patients. This association has generally been described for serum chemistry measures of protein-energy malnutrition. We hypothesized that body weight-for-height relationships also predict survival in MHD patients. METHODS: During the last three months of 1993, data were obtained on 12,965 men and women concerning clinical characteristics (height, postdialysis weight, age, gender, race, and presence or absence of diabetes mellitus) and laboratory measurements (predialysis serum albumin, creatinine and cholesterol, and the urea reduction ratio). Patient survival during the next 12 months was evaluated retrospectively. RESULTS: In comparison to values for normal Americans determined from the National Health and Nutrition Examination Survey II data, weight-for-height relationships tended to be slightly lower than normal in African American men and women and Caucasian men undergoing MHD and were normal or slightly greater in the taller Caucasian women. In both men and women, the mortality rate decreased progressively as the patients' weight-for-height increased. MHD patients who weighed more than normal had the lowest mortality rates. After adjustment for clinical characteristics and laboratory measurements, the inverse relationship between mortality rates and weight-for-height percentiles was still highly significant for patients within the lower 50th percentile of body weight-for-height. Serum albumin correlated directly with weight-for-height in patients in the lower 50th percentile of weight-for-height. Serum creatinine and cholesterol correlated directly with weight-for-height in the entire population of men and women. In contrast, the urea reduction ratio was inversely correlated with weight-for-height. CONCLUSIONS: These data indicate that weight-for-height is a strong predictor of 12-month mortality in male and female MHD patients. Multivariate analyses indicate that lower body weight-for-height is an independent predictor of higher mortality in those patients who are in the lower 50th percentile for this measurement.


Subject(s)
Body Height , Body Weight , Renal Dialysis/mortality , Adult , Aged , Body Mass Index , Cohort Studies , Databases, Factual , Female , Humans , Male , Middle Aged , Multivariate Analysis , Nutritional Status , Odds Ratio , United States/epidemiology
16.
Kidney Int ; 56(2): 729-37, 1999 Aug.
Article in English | MEDLINE | ID: mdl-10432415

ABSTRACT

BACKGROUND: The normalized treatment ratio [Kt/V = the ratio of the urea clearance x time product to total body water] and the urea reduction ratio (URR) have become widely accepted measures of dialysis dose. Both are related to and derived from pharmacokinetic models of blood urea concentration during the dialysis cycle. Theoretical reconsideration of the models revealed that the premise about V on which they rest (that is, that V is a passive diluent with no survival-associated properties of its own) is flawed if the intended use of the models is for profiling clinical outcome (for example, mortality) rather than estimating urea concentration. As a proxy for body mass, V has survival-associated properties of its own. Thus, indexing clearance x time to body size could create an offsetting combination whereby one measure favorably associated with survival (Kt) is divided by another (for example, V). Observed clinical paradoxes support that interpretation. For example, patients with a low body mass have both higher URR and higher mortality than heavier patients. Increasing mortality is often observed at high URR, suggesting the possibility of "over-dialysis." Black patients tend to be treated at lower URR than whites but enjoy better survival on dialysis. Therefore, clearance x time was evaluated as an outcome-based measure of dialysis dose, not indexed to V, and various body size estimates were evaluated as separate and distinct measures. METHODS: The retrospective sample included 17,141 black and white hemodialysis patients treated three times per week. Logistic regression analysis was used to evaluate death odds in age-, gender-, race-, and diabetes-adjusted models. Kt and five body size estimates (total body water or V, body weight, body weight adjusted for height, body surface area, and body mass index) were evaluated using two analytical strategies. First, all of the measures were treated as continuous variables to explore different statistical models. Second, Kt and the body size measures were divided into groups to construct risk profiles. RESULTS: All evaluations revealed improving death odds with increasing Kt (whether adjusted for the body size estimates or not) and also with increasing body size (whether adjusted for Kt or not) for each estimate of size. Significant statistical interactions of Kt with gender, but not Kt with race, were observed in all models. There were no statistical interactions suggesting that higher Kt was routinely required with increasing body size. Separate risk profiles for males and females suggested a higher Kt threshold for males. CONCLUSIONS: The urea clearance x time is a valid outcome-based measure of dialysis dose and is not confounded by indexing it to an estimate of body size, which has outcome-associated properties of its own. Dialysis prescriptions for males and females should be regarded separately, but there appears no need to make a distinction between the races.
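
A simplified sketch in the spirit of the first analytical strategy above (all measures entered as continuous variables in adjusted logistic models); the column names are hypothetical placeholders, and the original analysis also built grouped risk profiles not shown here.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_death_odds(df: pd.DataFrame, size_col: str = "body_surface_area") -> pd.Series:
    """Logistic model of death with Kt and a body-size estimate entered as
    separate continuous covariates, adjusted for age, gender, race, and
    diabetes. Returns odds ratios per unit increase in each covariate."""
    formula = f"died ~ kt + {size_col} + age + C(gender) + C(race) + diabetes"
    fit = smf.logit(formula, data=df).fit(disp=False)
    return np.exp(fit.params)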


Subject(s)
Kidney Failure, Chronic/therapy , Models, Biological , Renal Dialysis/methods , Urea/blood , Adult , Aged , Black People , Body Mass Index , Creatinine/blood , Dialysis Solutions , Female , Humans , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/mortality , Logistic Models , Male , Middle Aged , Risk Assessment , Sex Factors , Treatment Outcome , White People
17.
Adv Ren Replace Ther ; 6(2): 133-40, 1999 Apr.
Article in English | MEDLINE | ID: mdl-10230880

ABSTRACT

A contemporary focus on outcomes assessment has provided affirmation that patient functional status is both an important outcome of medical care and an important predictor of longer term outcomes such as morbidity and/or mortality. Monitoring functional status among end-stage renal disease (ESRD) patients is particularly critical because the cycle of physical deconditioning experienced by renal patients is both insidious and malignant. Over the past several years, patient self-report instruments have been used with increasing frequency to assess functioning. Among ESRD patients, such self-reports have reliably predicted mortality and some morbidity. Additionally, renal patients' self-reported functioning is also correlated with the results of several commonly performed laboratory tests. Based on these findings, measures of self-reported functional status might be considered a practical adjunct to regular patient assessments. They could be routinely used for purposes that might include: identifying the particular areas of functioning and well-being that need improvement; screening for subtle changes in health status; establishing physical status baselines; and corroborating the effectiveness of physical activity interventions. Overall, ESRD patients' self-report of their functioning appears to secure, synthesize, and standardize data about patient health status that is unavailable through any other mechanism. Such information may be essential to medicine's primary missions of promoting health and preserving life.


Subject(s)
Kidney Failure, Chronic/rehabilitation , Outcome Assessment, Health Care , Quality of Life , Activities of Daily Living , Disability Evaluation , Health Status Indicators , Humans , Self Disclosure
19.
Am J Kidney Dis ; 33(3): 523-34, 1999 Mar.
Article in English | MEDLINE | ID: mdl-10070917

ABSTRACT

The current report describes the distributions of selected demographic and biochemical parameters, clearance, and other transport values among patients undergoing peritoneal dialysis (PD) and evaluates the associates of mortality using those values, with and without clearance and peritoneal equilibration test (PET) data. All patients receiving PD on January 1, 1994 were selected (n = 2,686). Patients who switched to another form of dialysis during the study period were removed from the study at the time of therapy change. Working files were constructed from the clinical database to include demographic, laboratory, and outcome data. Laboratory data were available in only 1,603 patients and were used to evaluate the biochemical associates of mortality after merging the biochemical, demographic, and outcome data. Patients with clearance data or PET studies underwent a second analysis to assess the effects of peritoneal and renal clearance on survival. The analysis of demographic and laboratory data confirmed the importance of age and serum albumin concentration as predictors of death. Residual renal function (RRF) was strongly correlated with survival, but peritoneal clearance was not. Several possible explanations for the lack of correlation between peritoneal clearance and survival are discussed. The data suggest that RRF and peritoneal clearance may be separate and not equivalent quantities. Substantial work is required to confirm or refute these findings, because the information is essential to establish the adequate dose of PD in patients with various degrees of RRF.


Subject(s)
Peritoneal Dialysis/mortality , Renal Insufficiency/blood , Renal Insufficiency/mortality , Adult , Age Factors , Aged , Creatinine/blood , Female , Humans , L-Lactate Dehydrogenase/blood , Leukocyte Count , Logistic Models , Male , Middle Aged , Odds Ratio , Peritoneal Dialysis, Continuous Ambulatory/mortality , Renal Insufficiency/therapy , Risk Factors , Serum Albumin/metabolism , Survival Analysis , Urea/blood
20.
Am J Kidney Dis ; 33(3): 584-91, 1999 Mar.
Article in English | MEDLINE | ID: mdl-10070924

ABSTRACT

The 1996 Health Care Financing Administration's (HCFA) Core Indicators Project for in-center, hemodialysis patients collects information on the quality of care delivered in four clinical areas that were anticipated to predict patient outcomes. Included among these clinical performance measurements is the delivered dose of hemodialysis, measured by the fractional reduction of urea achieved during a single hemodialysis session (urea reduction ratio [URR]). A random sample (N = 7,310) of adult (aged > or =18 years), in-center hemodialysis patients was selected, and a one-page data collection form for each patient was sent to the dialysis facility in which care was provided during the last quarter of 1995. The dialysis facilities provided information for 6,861 (94%) patients, and at least one paired predialysis and postdialysis blood urea nitrogen (BUN) concentration was reported for 6,655 (97%) of these patients. The URR of this cohort was 65.5% +/- 8.0% (mean +/- SD), and 41% of patients had a URR less than 65%. The mean dialysis session length was 203 minutes, and more than half of the patients received dialysis with a dialyzer membrane with a KUf less than 10 mL/mm Hg/h. The patients with a URR less than 65% had a mean body weight approximately 10 kg greater than patients with a URR of 65% or greater. This relationship was present for all demographic characteristics studied, including age, gender, race, and primary cause of end-stage renal disease (ESRD). Patients receiving dialysis for less than 6 months were more likely to have a URR less than 65% than patients on dialysis for longer periods. By multivariate analysis, variables significantly associated with a delivered URR less than 65% were body weight in the heaviest quartile (odds ratio [OR] = 6.1), male gender (OR = 2.6), on dialysis therapy less than 6 months (OR = 2.5), youngest quartile of age (<49 years) (OR = 2.0), lowest quartile of serum albumin values less than 3.6 g/dL (bromcresol green method) or less than 3.3 g/dL (bromcresol purple method) (OR = 1.6), black (OR = 1.5), dialyzed with a dialyzer KUf less than 20 mL/mm Hg/h (OR = 1.8), lowest quartile hematocrit (<29.7%) (OR = 1.2), and shorter dialysis session length (OR = 1.02/min). In conclusion, both patient-specific demographic variables and treatment-specific parameters are significantly associated with ESRD patients receiving a URR less than 65%. Furthermore, these data suggest statistically significant linkages between the delivered dose of hemodialysis and other independent outcome predictors such as serum albumin concentration. Prospective study is required to determine whether intervention strategies to improve the delivered dose of hemodialysis will affect this outcome predictor or whether serum albumin and dialysis dose share a common cause not amenable to increasing the URR.


Subject(s)
Body Weight , Kidney Failure, Chronic , Renal Dialysis , Urea/blood , Adult , Age Factors , Aged , Female , Humans , Kidney Failure, Chronic/blood , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/physiopathology , Kidney Failure, Chronic/therapy , Male , Middle Aged , Multivariate Analysis , Sex Factors , Time Factors