Results 1 - 20 of 45
1.
Public Health ; 151: 114-117, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28780066

ABSTRACT

OBJECTIVES: Infants aged <8 months are ineligible for measles vaccination in China but represent a disproportionate number of cases. We examined the risk factors for measles among infants in Tianjin, China. STUDY DESIGN: Case-control study. METHODS: Cases were enrolled from a surveillance system, and IgG-negative controls were sampled from registries at immunization clinics. A logistic regression model assessed risk factors. RESULTS: Among 82 cases and 485 controls, exposure to a municipal hospital (OR [odds ratio]: 5.21; 95% confidence interval [CI]: 1.19-22.82) or a specialty hospital (OR: 13.22; 95% CI: 6.13-28.51) was associated with the disease, whereas visiting township or district hospitals was not associated with increased odds of measles. CONCLUSIONS: Hospitals were an important focal point of measles transmission for infants. Hospitals, particularly higher-level municipal and specialty hospitals, should enforce infection control programs to separate infants with highly communicable diseases to prevent transmission.
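The ORs reported above come from a case-control design, where the crude odds ratio is the cross-product of a 2x2 exposure table. A minimal sketch with hypothetical counts (for illustration only, not the Tianjin study's data):

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Crude odds ratio from a 2x2 case-control exposure table."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical counts: 30 of 82 cases and 40 of 485 controls
# exposed to a given hospital type.
print(round(odds_ratio(30, 52, 40, 445), 2))
```

The study's ORs come from a logistic regression model, which reduces to this cross-product only in the unadjusted, single-exposure case.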


Subject(s)
Measles/epidemiology , Case-Control Studies , China/epidemiology , Cross Infection/epidemiology , Female , Hospitals , Humans , Infant , Infant, Newborn , Male , Measles/transmission , Risk Factors
2.
Am J Transplant ; 17(4): 1081-1096, 2017 04.
Article in English | MEDLINE | ID: mdl-27647626

ABSTRACT

Because results from single-center (mostly kidney) donor studies demonstrate interpersonal relationship and financial strains for some donors, we conducted a liver donor study involving nine centers within the Adult-to-Adult Living Donor Liver Transplantation Cohort Study 2 (A2ALL-2) consortium. Among other initiatives, A2ALL-2 examined the nature of these outcomes following donation. Using validated measures, donors were prospectively surveyed before donation and at 3, 6, 12, and 24 mo after donation. Repeated-measures regression models were used to examine social relationship and financial outcomes over time and to identify relevant predictors. Of 297 eligible donors, 271 (91%) consented and were interviewed at least once. Relationship changes were positive overall across postdonation time points, with nearly one-third reporting improved donor family and spousal or partner relationships and >50% reporting improved recipient relationships. The majority of donors, however, reported cumulative out-of-pocket medical and nonmedical expenses, which were judged burdensome by 44% of donors. Lower income predicted burdensome donation costs. Those who anticipated financial concerns and who held nonprofessional positions before donation were more likely to experience adverse financial outcomes. These data support the need for initiatives to reduce financial burden.


Subject(s)
Liver Transplantation , Living Donors/psychology , Socioeconomic Factors , Tissue and Organ Procurement/economics , Adult , Female , Humans , Interpersonal Relations , Male , Middle Aged , Prospective Studies , Quality of Life , Social Support , Surveys and Questionnaires
3.
Am J Transplant ; 17(5): 1267-1277, 2017 May.
Article in English | MEDLINE | ID: mdl-27865040

ABSTRACT

Although single-center and cross-sectional studies have suggested a modest impact of liver donation on donor psychological well-being, few studies have assessed these outcomes prospectively among a large cohort. We conducted one of the largest, prospective, multicenter studies of psychological outcomes in living liver donors within the Adult-to-Adult Living Donor Liver Transplantation Cohort Study 2 (A2ALL-2) consortium. In total, 271 (91%) of 297 eligible donors were interviewed at least once before donation and at 3, 6, 12, and 24 mo after donation using validated measures. We found that living liver donors reported low rates of major depressive (0-3%), alcohol abuse (2-5%), and anxiety syndromes (2-3%) at any given assessment in their first 2 years after donation. Between 4.7% and 9.6% of donors reported impaired mental well-being at various time points. We identified significant predictors for donors' perceptions of being better people and experiencing psychological growth following donation, including age, sex, relationship to recipient, ambivalence and motivation regarding donation, and feeling that donation would make life more worthwhile. Our results highlight the need for close psychosocial monitoring for those donors whose recipients died (n=27); some of those donors experienced guilt and concerns about responsibility. Careful screening and targeted, data-driven follow-up hold promise for optimizing psychological outcomes following this procedure for potentially vulnerable donors.


Subject(s)
Depressive Disorder, Major/psychology , Liver Transplantation/psychology , Living Donors/psychology , Quality of Life , Adult , Cross-Sectional Studies , Depressive Disorder, Major/epidemiology , Female , Follow-Up Studies , Graft Survival , Humans , Male , Prognosis , Prospective Studies , Surveys and Questionnaires
4.
Am J Transplant ; 14(11): 2535-44, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25293374

ABSTRACT

Following kidney donation, short-term quality of life outcomes compare favorably to US normative data, but long-term effects on mood are not known. In the Renal and Lung Living Donors Evaluation Study (RELIVE), records from donations performed 1963-2005 were reviewed for depression and antidepressant use predonation. Postdonation, in a cross-sectional cohort design 2010-2012, donors completed the Patient Health Questionnaire (PHQ-9) depression screening instrument, the Life Orientation Test-Revised, 36-Item Short Form Health Survey and donation experience questions. Of 6909 eligible donors, 3470 were contacted and 2455 participated (71%). The percent with depressive symptoms (8%; PHQ-9>10) was similar to National Health and Nutrition Examination Survey participants (7%, p=0.30). Predonation psychiatric disorders were more common in unrelated than related donors (p=0.05). Postdonation predictors of depressive symptoms included nonwhite race (OR=2.00, p=0.020), younger age at donation (OR=1.33 per 10 years, p=0.002), longer recovery time from donation (OR=1.74, p=0.0009), greater financial burden (OR=1.32, p=0.013) and feeling morally obligated to donate (OR=1.23, p=0.003). While the cross-sectional prevalence of depression is comparable to population normative data, some factors identifiable around the time of donation, including longer recovery, financial stressors, younger age and moral obligation to donate, may identify donors more likely to develop future depression, providing an opportunity for intervention.


Subject(s)
Emotions , Kidney Transplantation , Living Donors/psychology , Adult , Cohort Studies , Depression/psychology , Female , Humans , Male , Middle Aged
5.
Am J Transplant ; 14(8): 1846-52, 2014 Aug.
Article in English | MEDLINE | ID: mdl-25039865

ABSTRACT

The Renal and Lung Living Donors Evaluation Study assesses outcomes of live lung (lobectomy) donors. This is a retrospective cohort study at University of Southern California (USC) and Washington University (WASHU) Medical Centers (1993-2006), using medical records to assess morbidity and national databases to ascertain postdonation survival and lung transplantation. Serious complications were defined as those that required significant treatment, were potentially life-threatening or led to prolonged hospitalization. The 369 live lung donors (287 USC, 82 WASHU) were predominantly white, non-Hispanic and male; 72% had a biological relationship to the recipient, and 30% were recipient parents. Serious complications occurred in 18% of donors; 2.2% underwent reoperation and 6.5% had an early rehospitalization. The two centers had significantly different incidences of serious complications (p < 0.001). No deaths occurred and no donors underwent lung transplantation during 4000+ person-years of follow-up (death: minimum 4, maximum 17 years; transplant: minimum 5, maximum 19). Live lung donation remains a potential option for recipients when the use of deceased donor lungs is not feasible. However, the use of two live donors for each recipient and the risk of morbidity associated with live lung donation do not justify this approach when deceased lung donors remain available. Center effects and long-term live donor outcomes require further evaluation.


Subject(s)
Living Donors/statistics & numerical data , Lung Diseases/mortality , Lung Diseases/surgery , Lung Transplantation , Adolescent , Adult , Cohort Studies , Databases, Factual , Female , Humans , Length of Stay , Lung/surgery , Male , Middle Aged , Quality Control , Research Design , Retrospective Studies , Treatment Outcome , Young Adult
6.
Am J Transplant ; 13(11): 2924-34, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24011252

ABSTRACT

Live donation benefits recipients, but the long-term consequences for donors remain uncertain. The Renal and Lung Living Donors Evaluation Study surveyed kidney donors (N = 2455; 61% women; mean age 58, range 24-94; mean time from donation 17 years, range 5-48 years) using the Short Form-36 Health Survey (SF-36). The 95% confidence intervals for White and African-American donors included or exceeded SF-36 norms. Over 80% of donors reported average or above average health for their age and sex (p < 0.0001). Donors' age-sex adjusted physical component summary (PCS) scores declined by half a point each decade after donation (p = 0.0027); there was no decline in mental component summary (MCS) scores. White donors' PCS scores were three points higher (p = 0.0004) than non-Whites'; this difference remained constant over time. Nine percent of donors had impaired health (PCS or MCS score >1 SD below norm). Obesity, history of psychiatric difficulties and non-White race were risk factors for impaired physical health; history of psychiatric difficulties was a risk factor for impaired mental health. Education, older donation age and a first-degree relation to the recipient were protective factors. One percent reported that donation affected their health very negatively. Enhanced predonation evaluation and counseling may be warranted, along with ongoing monitoring for overweight donors.


Subject(s)
Kidney Transplantation , Living Donors/psychology , Postoperative Complications , Quality of Life , Adolescent , Adult , Aged , Aged, 80 and over , Cross-Sectional Studies , Female , Humans , Male , Medical Records , Middle Aged , Nephrectomy , Obesity , Racial Groups , Risk Factors , Time Factors , Young Adult
7.
BJOG ; 120(13): 1678-84, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23937077

ABSTRACT

OBJECTIVE: To assess whether the risk of vulvodynia is associated with previous use of oral contraceptives (OCs). DESIGN: Longitudinal population-based study. SETTING: Four counties in south-east Michigan, USA. POPULATION: A population-based sample of women, aged 18 years and older, enrolled using random-digit dialling. METHODS: Enrolled women completed surveys that included information on demographic characteristics, health status, current symptoms, past and present OC use, and a validated screen for vulvodynia. The temporal relationship between OC use and subsequent symptoms of vulvodynia was assessed using Cox regression, with OC exposure modelled as a time-varying covariate. MAIN OUTCOME MEASURE: Vulvodynia, as determined by validated screen. RESULTS: Women aged <50 years who provided data on OC use, completed all questions required for the vulvodynia screen, and had first sexual intercourse prior to the onset of vulvodynia symptoms were eligible (n = 906). Of these, 71.2% (n = 645) had used OCs. The vulvodynia screen was positive in 8.2% (n = 74) for current vulvodynia and in 20.8% (n = 188) for past vulvodynia. Although crude cross-tabulation suggested that women with current or past vulvodynia were less likely to have been exposed to OCs prior to the onset of pain (60.7%), compared with those without this disorder (69.3%), the Cox regression analysis identified no association between vulvodynia and previous OC use (HR 1.08, 95% CI 0.81-1.43, P = 0.60). This null finding persisted after controlling for ethnicity, marital status, educational level, duration of use, and age at first OC use. CONCLUSION: For women aged <50 years, OC use did not increase the risk of subsequent vulvodynia.


Subject(s)
Contraceptives, Oral, Hormonal/therapeutic use , Vulvodynia/epidemiology , Adolescent , Adult , Case-Control Studies , Female , Humans , Longitudinal Studies , Michigan , Middle Aged , Proportional Hazards Models , Regression Analysis , Risk Assessment , Young Adult
8.
Am J Transplant ; 13(2): 390-8, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23137211

ABSTRACT

While cautious criteria for selection of living kidney donors are credited for favorable outcomes, recent practice changes may include acceptance of less than ideal donors. To characterize trends in donor acceptance, the Renal and Lung Living Donors Evaluation (RELIVE) Study evaluated 8,951 kidney donors who donated between 1963 and 2007 at three major U.S. transplant centers. Over the study interval, there was an increase in the percentage of donors >40 years old from 38% to 51%; donors >60 years varied between 1% and 4%. The proportion of donors with obesity increased from 8% to 26% and with glucose intolerance from 9% to 25%. The percentage of hypertensive donors was consistent (5-8%). Accepted donors ≥60 years old were more likely to have obesity, glucose intolerance, and/or hypertension compared to younger donors (p<0.0001). Our results demonstrate important trends in acceptance of older and more obese donors. The fraction of older donors accepted with glucose intolerance or hypertension remains small and for the majority includes mild elevations in glucose or blood pressure that were previously classified as within normal limits.


Subject(s)
Blood Pressure , Kidney Transplantation/methods , Living Donors/statistics & numerical data , Renal Insufficiency/therapy , Adult , Aged , Female , Glucose Intolerance/complications , Glucose Intolerance/physiopathology , Humans , Hypertension/complications , Hypertension/physiopathology , Male , Middle Aged , Models, Statistical , Obesity/complications , Obesity/physiopathology , Registries , Treatment Outcome
9.
Am J Transplant ; 10(7): 1621-33, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20199501

ABSTRACT

Data submitted by transplant programs to the Organ Procurement and Transplantation Network (OPTN) are used by the Scientific Registry of Transplant Recipients (SRTR) for policy development, performance evaluation and research. This study compared OPTN/SRTR data with data extracted from medical records by research coordinators from the nine-center A2ALL study. A2ALL data were collected independently of OPTN data submission (48 data elements among 785 liver transplant candidates/recipients; 12 data elements among 386 donors). At least 90% agreement occurred between OPTN/SRTR and A2ALL for 11/29 baseline recipient elements, 4/19 recipient transplant or follow-up elements and 6/12 donor elements. For the remaining recipient and donor elements, >10% of values were missing in OPTN/SRTR but present in A2ALL, confirming that missing data were largely avoidable. Other than variables required for allocation, the percentage missing varied widely by center. These findings support an expanded focus on data quality control by OPTN/SRTR for a broader variable set than those used for allocation. Center-specific monitoring of missing values could substantially improve the data.


Subject(s)
Liver Transplantation/statistics & numerical data , Living Donors/statistics & numerical data , Adult , Bilirubin/blood , Body Height , Body Weight , Creatinine/blood , Educational Status , Ethnicity , Female , Humans , International Normalized Ratio , Male , Medical Records , Racial Groups , Registries , Research/statistics & numerical data , United States
10.
Clin Nephrol ; 73(2): 104-14, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20129017

ABSTRACT

BACKGROUND: There has been limited research on sleep quality (SQ) in CKD. METHODS: This prospective cohort study of adults with CKD Stages 3 - 5 at four US centers collected self-reported SQ information from the Kidney Disease Quality of Life (KDQOL) instrument, including an estimated SQ score (0 - 100), and 3 SQ-related questions. "Poor" SQ was defined as SQ score < or = 60. Logistic and multiple linear regression assessed associations between SQ and its potential predictors. Times to death and end stage renal disease (ESRD) were examined using Cox regression. A comparison with SQ in ESRD patients from the Dialysis Outcomes and Practice Patterns Study (DOPPS) was additionally performed. RESULTS: Mean SQ score was 59.4 +/- 23.6 (n = 689), and "poor" SQ was reported by 57%. Mean estimated glomerular filtration rate (eGFR) was 24.9 +/- 10.6 ml/min/1.73 m2. Higher SQ significantly correlated with KDQOL mental and physical component summary scales. Significant predictors of lower SQ score included younger age, presence of dyspnea, self-reported depression, pain, and itching. There were no significant pairwise differences in SQ from CKD Stage 3 through ESRD. Self-reported daytime sleepiness was significantly associated with higher risk of mortality prior to ESRD (HR = 1.85, p = 0.02). CONCLUSION: Self-reported "poor" SQ was common in a CKD cohort (Stages 3 - 5) and was not only associated with lower quality of life scores and several modifiable symptoms, but also with higher risk of pre-ESRD mortality. Greater attention to this clinical problem is highly recommended in this high-risk population.


Subject(s)
Kidney Failure, Chronic/physiopathology , Sleep Wake Disorders/physiopathology , Sleep/physiology , Female , Follow-Up Studies , Humans , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/therapy , Male , Middle Aged , Morbidity/trends , Prognosis , Prospective Studies , Renal Dialysis , Risk Factors , Sleep Wake Disorders/epidemiology , Sleep Wake Disorders/etiology , Surveys and Questionnaires , United States/epidemiology
11.
Am J Transplant ; 8(12): 2569-79, 2008 Dec.
Article in English | MEDLINE | ID: mdl-18976306

ABSTRACT

Patients considering living donor liver transplantation (LDLT) need to know the risk and severity of complications compared to deceased donor liver transplantation (DDLT). One aim of the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) was to examine recipient complications following these procedures. Medical records of DDLT or LDLT recipients who had a living donor evaluated at the nine A2ALL centers between 1998 and 2003 were reviewed. Among 384 LDLT and 216 DDLT recipients, at least one complication occurred after 82.8% of LDLT and 78.2% of DDLT (p = 0.17). There was a median of two complications after DDLT and three after LDLT. Complications that occurred at a higher rate (p < 0.05) after LDLT included biliary leak (31.8% vs. 10.2%), unplanned reexploration (26.2% vs. 17.1%), hepatic artery thrombosis (6.5% vs. 2.3%) and portal vein thrombosis (2.9% vs. 0.0%). There were more complications leading to retransplantation or death (Clavien grade 4) after LDLT versus DDLT (15.9% vs. 9.3%, p = 0.023). Many complications occurred more commonly during early center experience; the odds of grade 4 complications were more than two-fold higher among centers' earliest cases. In summary, complication rates were higher after LDLT versus DDLT, but declined with center experience to levels comparable to DDLT.


Subject(s)
Liver Transplantation/adverse effects , Living Donors/statistics & numerical data , Tissue Donors/statistics & numerical data , Transplantation/statistics & numerical data , Adult , Cohort Studies , Female , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Thrombosis/epidemiology , Thrombosis/etiology , Treatment Outcome
12.
Ann Epidemiol ; 17(11): 854-62, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17689259

ABSTRACT

PURPOSE: Group B Streptococcus (GBS) is a common inhabitant of the bowel and vaginal flora, with known transmission routes including sexual contact and vertical transmission from mother to infant. Food-borne transmission is also possible, as GBS is a known fish and bovine pathogen. We conducted a prospective cohort study in order to identify risk factors for acquisition. METHODS: We identified risk factors for GBS acquisition among college women (n = 129) and men (n = 128) followed at 3-week intervals for 3 months. RESULTS: A doubling in sex acts significantly increased the incidence of GBS capsular type V by 80% (95% confidence interval [CI]: 1.19, 2.58), and of other non-Ia or -Ib types combined by 40% (95% CI: 1.00, 2.06); incidence of capsular types Ia (odds ratio [OR] = 1.2; 95% CI: 0.71, 1.88; p = 0.57) and Ib (OR = 1.5; 95% CI: 0.75, 2.86; p = 0.27) was elevated, although not significantly. After adjustment for sexual activity and sexual history, gender, and eating venue, fish consumption increased the risk of acquiring capsular types Ia and Ib combined 7.3-fold (95% CI: 2.34, 19.50), but not of acquiring other capsular types. Beef and milk were not associated with GBS incidence. CONCLUSIONS: Different GBS capsular types may have different transmission routes.


Subject(s)
Streptococcal Infections/transmission , Streptococcus agalactiae/isolation & purification , Adolescent , Adult , Bacterial Capsules , Diet , Electrophoresis, Gel, Pulsed-Field , Female , Hand Disinfection , Humans , Incidence , Male , Michigan/epidemiology , Polysaccharides, Bacterial/isolation & purification , Prospective Studies , Risk Factors , Sexual Behavior , Streptococcal Infections/epidemiology , Streptococcus agalactiae/classification , Streptococcus agalactiae/pathogenicity , Students
13.
Am J Transplant ; 7(6): 1536-41, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17430402

ABSTRACT

Urinary complications are common following renal transplantation. The aim of this study was to evaluate the risk factors associated with renal transplant urinary complications. We collected data on 1698 consecutive renal transplant patients. The association of donor, transplant and recipient characteristics with urinary complications was assessed by univariable and multivariable Cox proportional hazards models, fitted to analyze time-to-event outcomes of urinary complications and graft failure. Urinary complications were observed in 105 (6.2%) recipients, with a 2.8% ureteral stricture rate, a 1.7% rate of leak and stricture, and a 1.6% rate of urine leaks. Seventy percent of these complications were definitively managed with a percutaneous intervention. Independent risk factors for a urinary complication included: male recipient, African American recipient, and the "U"-stitch technique. Ureteral stricture was an independent risk factor for graft loss, while urinary leak was not. Laparoscopic donor technique (compared to open living donor nephrectomy) was not associated with more urinary complications. Our data suggest that several patient characteristics are associated with an increased risk of a urinary complication. The U-stitch technique should not be used for the ureteral anastomosis.


Subject(s)
Kidney Transplantation/adverse effects , Urologic Diseases/epidemiology , Humans , Incidence , Medical Records , Risk Factors , Urologic Diseases/therapy
14.
Kidney Int ; 69(7): 1222-8, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16609686

ABSTRACT

Longer treatment time (TT) and slower ultrafiltration rate (UFR) are considered advantageous for hemodialysis (HD) patients. The study included 22,000 HD patients from seven countries in the Dialysis Outcomes and Practice Patterns Study (DOPPS). Logistic regression was used to study predictors of TT > 240 min and UFR > 10 ml/h/kg bodyweight. Cox regression was used for survival analyses. Statistical adjustments were made for patient demographics, comorbidities, dose of dialysis (Kt/V), and body size. Europe and Japan had significantly longer (P < 0.0001) average TT than the US (232 and 244 min vs 211 in DOPPS I; 235 and 240 min vs 221 in DOPPS II). Kt/V increased concomitantly with TT in all three regions with the largest absolute difference observed in Japan. TT > 240 min was independently associated with significantly lower relative risk (RR) of mortality (RR = 0.81; P = 0.0005). Every 30 min longer on HD was associated with a 7% lower RR of mortality (RR = 0.93; P < 0.0001). The RR reduction with longer TT was greatest in Japan. A synergistic interaction occurred between Kt/V and TT (P = 0.007) toward mortality reduction. UFR > 10 ml/h/kg was associated with higher odds of intradialytic hypotension (odds ratio = 1.30; P = 0.045) and a higher risk of mortality (RR = 1.09; P = 0.02). Longer TT and higher Kt/V were independently as well as synergistically associated with lower mortality. Rapid UFR during HD was also associated with higher mortality risk. These results warrant a randomized clinical trial of longer dialysis sessions in thrice-weekly HD.
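Under the Cox model used here, per-interval relative risks multiply, so the reported RR of 0.93 per additional 30 min of treatment time implies larger cumulative reductions for longer extensions. A minimal sketch of that interpretation (an illustration of the reported estimate, not the study's analysis code):

```python
def rr_longer_tt(rr_per_30min, extra_minutes):
    """Relative mortality risk for extra treatment time, assuming the
    log-linear (Cox) form in which per-30-min relative risks multiply."""
    return rr_per_30min ** (extra_minutes / 30)

# An extra hour on HD under this assumption: 0.93 ** 2
print(round(rr_longer_tt(0.93, 60), 3))
```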


Subject(s)
Renal Dialysis/methods , Ultrafiltration/methods , Adult , Databases, Factual , Humans , Renal Dialysis/mortality , Survival Analysis , Time Factors , Treatment Outcome
15.
Kidney Int ; 69(11): 2087-93, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16641921

ABSTRACT

Hemodiafiltration (HDF) is used sporadically for renal replacement therapy in Europe but not in the US. Characteristics and outcomes were compared for patients receiving HDF versus hemodialysis (HD) in five European countries in the Dialysis Outcomes and Practice Patterns Study. The study followed 2165 patients from 1998 to 2001, stratified into four groups: low- and high-flux HD, and low- and high-efficiency HDF. Patient characteristics including age, sex, 14 comorbid conditions, and time on dialysis were compared between each group using multivariate logistic regression. Cox proportional hazards regression assessed adjusted differences in mortality risk. Prevalence of HDF ranged from 1.8% in Spain to 20.1% in Italy. Compared to low-flux HD, patients receiving low-efficiency HDF had significantly longer average duration of end-stage renal disease (7.0 versus 4.7 years), more history of cancer (15.4 versus 8.7%), and lower phosphorus (5.3 versus 5.6 mg/dl); patients receiving high-efficiency HDF had significantly more lung disease (15.5 versus 10.2%) and received a higher single-pool Kt/V (1.44 versus 1.35). High-efficiency HDF patients had lower crude mortality rates than low-flux HD patients. After adjustment, high-efficiency HDF patients had a significant 35% lower mortality risk than those receiving low-flux HD (relative risk=0.65, P=0.01). These observational results suggest that HDF may improve patient survival independently of its higher dialysis dose. Owing to possible selection bias, the potential benefits of HDF must be tested by controlled clinical trials before recommendations can be made for clinical practice.


Subject(s)
Hemodiafiltration , Renal Dialysis/mortality , Europe , Female , Follow-Up Studies , Humans , Male , Middle Aged , Risk Factors
16.
Clin Nephrol ; 63(5): 335-45, 2005 May.
Article in English | MEDLINE | ID: mdl-15909592

ABSTRACT

BACKGROUND: Mortality in severe acute renal failure (ARF) requiring renal replacement therapy (RRT) approximates 50% and varies with clinical severity. Continuous RRT (CRRT) has theoretical advantages over intermittent hemodialysis (IHD) for critical patients, but a survival advantage with CRRT is yet to be clearly demonstrated. To date, no prospective controlled trial has sufficiently answered this question, and the present prospective outcome study attempts to compare survival with CRRT versus that with IHD. METHODS: Multivariable Cox proportional hazards regression was used to analyze the impact of RRT modality choice (CRRT vs. IHD) on in-hospital and 100-day mortality among ARF patients receiving RRT during 2000 and 2001 at University of Michigan, using an "intent-to-treat" analysis adjusted for multiple comorbidity and severity factors. RESULTS: Overall in-hospital mortality before adjustment was 52%. Triage to CRRT (vs. IHD) was associated with higher severity and unadjusted relative rate (RR) of in-hospital death (RR = 1.62, p = 0.001, n = 383). Adjustment for comorbidity and severity of illness reduced the RR of death for patients triaged to CRRT and suggested a possible survival advantage (RR = 0.81, p = 0.32). Analysis restricted to patients in intensive care for more than five days who received at least 48 hours of total RRT showed the RR of in-hospital mortality with CRRT to be nearly 45% lower than IHD (RR = 0.56, n = 222), a difference in RR that indicates a strong trend for in-hospital mortality with borderline statistical significance (p = 0.069). Analysis of 100-day mortality also suggested a potential survival advantage for CRRT in all cohorts, particularly among patients in intensive care for more than five days who received at least 48 h of RRT (RR = 0.60, p = 0.062, n = 222).
CONCLUSION: Applying the present methodology to outcomes at a single tertiary medical center, CRRT appears to afford a survival advantage for patients with severe ARF treated in the ICU. Unless and until a prospective controlled trial is realized, the present data suggest potential survival advantages of CRRT and support broader application of CRRT among such critically ill patients.


Subject(s)
Acute Kidney Injury/diagnosis , Acute Kidney Injury/therapy , Renal Replacement Therapy/methods , APACHE , Acute Kidney Injury/mortality , Adult , Aged , Cohort Studies , Critical Care/methods , Female , Follow-Up Studies , Hemofiltration/methods , Humans , Intensive Care Units , Kidney Function Tests , Male , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Prospective Studies , Renal Dialysis/methods , Risk Assessment , Severity of Illness Index , Survival Rate , Treatment Outcome
17.
Cent Afr J Med ; 49(5-6): 47-53, 2003.
Article in English | MEDLINE | ID: mdl-15214282

ABSTRACT

OBJECTIVE: To evaluate the performance and the utility of using birthweight-adjusted scores of the Dubowitz method of estimating gestational age in a Zimbabwean population. DESIGN: A validation study. SETTING: Harare Maternity Hospital, from October to December 1999. SUBJECTS: 364 African newborn infants with a known last menstrual period (LMP), within the first 56 hours of life. MAIN OUTCOME MEASURES: Differences between regression lines and variances explained by Dubowitz scores obtained by examining newborn infants compared to gestational age calculated from the last menstrual period, in models with and without the addition of birthweight. RESULTS: The Dubowitz method was a good predictor of gestational age, useful in differentiating term from pre-term infants. The beta coefficients from regression lines with and without addition of birthweight differed significantly from each other (z = 2.83, p < 0.01). Our regression line without adding birthweight was Y(LMP gestational age) = 23.814 + 0.301 * score. Addition of birthweight to the regression models improved prediction of gestational age, Y(LMP gestational age) = 23.512 + 0.219 * score + 0.0015 * grams, and accounted for 69% of the variance compared to 66% in models without birthweight. CONCLUSION: The introduction of birthweight improves estimation of gestational age, correcting for the overestimation reported for the original Dubowitz methods and the error caused by low birthweight. We recommend the use of our birthweight-adjusted Dubowitz maturity scales for studies of prematurity, and for routine clinical practice.


Subject(s)
Birth Weight/physiology , Gestational Age , Infant, Newborn/physiology , Adolescent , Adult , Female , Hospitals, Maternity , Humans , Infant, Premature , Male , Pregnancy , Sensitivity and Specificity , Zimbabwe/epidemiology
18.
Int J Gynaecol Obstet ; 78(1): 7-18, 2002 Jul.
Article in English | MEDLINE | ID: mdl-12113965

ABSTRACT

OBJECTIVES: To evaluate the performance and the utility of using birthweight-adjusted scores of the Dubowitz and Ballard methods of estimating gestational age in a Zimbabwean population. METHOD: The Dubowitz and the Ballard methods of estimating gestational age were administered to 364 African newborn infants with a known last menstrual period (LMP) at Harare Maternity Hospital. RESULTS: Both methods were good predictors of gestational age, useful in differentiating term from pre-term infants. Our regression line was Y(LMP gestational age) = 23.814 + 0.301*score for the Dubowitz method and Y(LMP gestational age) = 24.493 + 0.420*score for the Ballard method. Addition of birthweight to the regression models improved prediction of gestational age: Y(LMP gestational age) = 23.512 + 0.219*score + 0.0015*grams for the Dubowitz method and Y(LMP gestational age) = 24.002 + 0.292*score + 0.0016*grams for the Ballard method. CONCLUSIONS: We recommend the use of our birthweight-adjusted maturity scales; the Dubowitz for studies of prematurity, and the Ballard for routine clinical practice.
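The birthweight-adjusted regression equations reported in this abstract can be applied directly. The sketch below uses the published coefficients; the function names and the example inputs (a maturity score and a birthweight in grams) are illustrative assumptions, not values from the study.

```python
def dubowitz_ga_weeks(score: float, birthweight_g: float) -> float:
    """Gestational age (weeks) from the birthweight-adjusted Dubowitz model:
    Y(LMP gestational age) = 23.512 + 0.219*score + 0.0015*grams."""
    return 23.512 + 0.219 * score + 0.0015 * birthweight_g

def ballard_ga_weeks(score: float, birthweight_g: float) -> float:
    """Gestational age (weeks) from the birthweight-adjusted Ballard model:
    Y(LMP gestational age) = 24.002 + 0.292*score + 0.0016*grams."""
    return 24.002 + 0.292 * score + 0.0016 * birthweight_g

# Hypothetical example: a score of 45 (Dubowitz) or 30 (Ballard)
# and a birthweight of 3000 g.
print(round(dubowitz_ga_weeks(45, 3000), 2))  # 37.87
print(round(ballard_ga_weeks(30, 3000), 2))   # 37.56
```

Both models return an estimate in completed weeks; per the abstracts, the birthweight term corrects the overestimation reported for the original score-only scales.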


Subject(s)
Birth Weight , Gestational Age , Infant, Newborn , Adolescent , Adult , Female , Humans , Infant, Premature , Linear Models , Pregnancy , Zimbabwe
19.
Ophthalmology ; 108(11): 1943-53, 2001 Nov.
Article in English | MEDLINE | ID: mdl-11713061

ABSTRACT

PURPOSE: To report interim outcome data, using all available follow-up through 5 years after treatment initiation, in the Collaborative Initial Glaucoma Treatment Study (CIGTS). DESIGN: Randomized clinical trial. PARTICIPANTS: Six hundred seven newly diagnosed glaucoma patients. METHODS: In a randomized clinical trial, 607 patients with newly diagnosed open-angle glaucoma were initially treated with either medication or trabeculectomy (with or without 5-fluorouracil). After treatment onset and early follow-up, patients were evaluated clinically at 6-month intervals. In addition, quality of life telephone interviews were conducted at a similar frequency to the clinical visits. Patients in both arms of CIGTS were treated aggressively in an effort to reduce intraocular pressure (IOP) to a level at or below a predetermined target pressure specific for each individual eye. Visual field (VF) scores were analyzed by time-specific comparisons and by repeated measures models. MAIN OUTCOME MEASURES: VF loss was the primary outcome variable in CIGTS. Secondary outcomes of visual acuity (VA), IOP, and cataract were also studied. RESULTS: On the basis of completed follow-up through 4 years and partially completed through 5 years, VF loss did not differ significantly by initial treatment. Over the entire period of follow-up, surgical patients had a greater risk of substantial VA loss compared with medical patients. However, by 4 years after treatment, the average VA in the two groups was about equal. Over the course of follow-up, IOP in the medicine group averaged 17 to 18 mmHg, whereas that in the surgery group averaged 14 to 15 mmHg. The rate of cataract requiring removal was greater in the surgically treated group. CONCLUSIONS: Initial medical and initial surgical therapy result in about the same VF outcome after up to 5 years of follow-up. VA loss was greater in the surgery group, but the differences between groups seem to be converging as follow-up continues. When aggressive treatment aimed at substantial reduction in IOP from baseline is used, loss of VF is generally minimal. Because 4 to 5 years of follow-up in a chronic disease is not adequate to draw treatment conclusions, these interim CIGTS outcomes do not support altering current treatment approaches to open-angle glaucoma.


Subject(s)
Adrenergic beta-Antagonists/therapeutic use , Glaucoma, Open-Angle/drug therapy , Glaucoma, Open-Angle/surgery , Trabeculectomy , Adult , Aged , Cataract/complications , Female , Fluorouracil/therapeutic use , Follow-Up Studies , Glaucoma, Open-Angle/physiopathology , Humans , Intraocular Pressure/physiology , Intraoperative Complications , Male , Middle Aged , Ophthalmic Solutions , Prospective Studies , Quality of Life , Treatment Outcome , Visual Acuity/physiology , Visual Fields/physiology
20.
Ophthalmology ; 108(11): 1954-65, 2001 Nov.
Article in English | MEDLINE | ID: mdl-11713062

ABSTRACT

OBJECTIVE: To present interim quality of life (QOL) findings in the Collaborative Initial Glaucoma Treatment Study (CIGTS) using all available follow-up through 5 years from treatment initiation. DESIGN: Randomized controlled clinical trial. PARTICIPANTS: Six hundred seven newly diagnosed patients with open-angle glaucoma from 14 clinical centers. INTERVENTION: Patients were randomly assigned to either initial medical therapy or initial trabeculectomy. After treatment initiation and early follow-up, patients received clinical and QOL evaluations at 6-month intervals. QOL assessments were administered by telephone at a centralized interviewing center. MAIN OUTCOME MEASURES: The CIGTS collected comprehensive QOL information that included both generic and vision-specific QOL measures. This article focuses on initial treatment group differences related to symptom reporting, as measured by a Symptom and Health Problem Checklist, and changes in daily visual functioning, as measured by the Visual Activities Questionnaire (VAQ). RESULTS: Across both treatment groups, there was an overall decline in the percentage of participants reporting symptoms over time. Of 43 possible symptoms, 12 were reported with greater frequency by the surgically treated group and 7 more frequently by the medically treated group. The surgical patients reported higher total Symptom Impact Glaucoma scores (P = 0.005) and, in particular, more bother related to local eye symptoms. Very few treatment group differences were noted in visual functioning, although surgical patients reported more problems with activities related to their visual acuity (P = 0.024). The percentage of patients across treatment groups reporting worry about blindness was 50% at baseline but declined to approximately 25% over time. CONCLUSIONS: Overall, the QOL impact reported by the two treatment groups as measured by instruments used in this study is remarkably similar, with relatively few significant study group differences observed after up to 5 years of follow-up in the CIGTS. Where significant differences in visual function have been detected using the VAQ, they are consistent with the clinical outcomes. To date, the most persistent QOL finding is the increased impact of local eye symptoms reported by the surgical group compared with the medical group. Although no changes are recommended in the treatment of newly diagnosed glaucoma patients at the time of this interim report, further follow-up will allow for more definitive answers on the QOL impact of these two treatment approaches.


Subject(s)
Adrenergic beta-Antagonists/therapeutic use , Glaucoma, Open-Angle/drug therapy , Glaucoma, Open-Angle/surgery , Quality of Life , Trabeculectomy , Adult , Aged , Female , Follow-Up Studies , Glaucoma, Open-Angle/physiopathology , Humans , Intraocular Pressure/physiology , Male , Middle Aged , Ophthalmic Solutions , Prospective Studies , Sickness Impact Profile , Treatment Outcome , Visual Acuity/physiology , Visual Fields/physiology