Results 1 - 18 of 18
1.
J Hosp Infect ; 139: 82-92, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37308061

ABSTRACT

BACKGROUND: Surgical site infection (SSI) is a health-threatening complication following caesarean section (CS); however, to the authors' knowledge, there is no worldwide estimate of the burden of post-CS SSIs. Therefore, this systematic review and meta-analysis aimed to estimate the global and regional incidence of post-CS SSIs and associated factors. METHODS: International scientific databases were searched systematically for observational studies published from January 2000 to March 2023, without language or geographical restrictions. The pooled global incidence rate was estimated using a random-effects meta-analysis (REM), and then stratified by World-Health-Organization-defined regions as well as by sociodemographic and study characteristics. Causative pathogens and associated risk factors of SSIs were also analysed using REM. Heterogeneity was assessed with I2. RESULTS: In total, 180 eligible studies (207 datasets) involving 2,188,242 participants from 58 countries were included in this review. The pooled global incidence of post-CS SSIs was 5.63% [95% confidence interval (CI) 5.18-6.11%]. The highest and lowest incidence rates for post-CS SSIs were estimated for the African (11.91%, 95% CI 9.67-14.34%) and North American (3.87%, 95% CI 3.02-4.83%) regions, respectively. The incidence was significantly higher in countries with lower income and human development index levels. The pooled incidence estimates have increased steadily over time, with the highest incidence rate during the coronavirus disease 2019 pandemic (2019-2023). Staphylococcus aureus and Escherichia coli were the most prevalent pathogens. Several risk factors were identified. CONCLUSION: An increasing and substantial burden from post-CS SSIs was identified, especially in low-income countries. Further research, greater awareness and the development of effective prevention and management strategies are warranted to reduce post-CS SSIs.


Subject(s)
COVID-19 , Staphylococcal Infections , Humans , Female , Pregnancy , Surgical Wound Infection/prevention & control , Incidence , Cesarean Section/adverse effects , COVID-19/complications , Staphylococcal Infections/epidemiology
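The pooled incidence in the abstract above comes from a random-effects meta-analysis of study-level proportions. The sketch below is a minimal illustration of the DerSimonian-Laird estimator on the raw proportion scale, with made-up study counts; the published analysis may well use a transformed scale (e.g. logit or Freeman-Tukey) and dedicated meta-analysis software.

```python
import math

def dersimonian_laird(events, totals):
    """Pool event proportions across studies with a DerSimonian-Laird
    random-effects model (raw proportion scale, for illustration only).

    Returns (pooled estimate, 95% CI, I^2 heterogeneity in percent)."""
    k = len(events)
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]  # within-study variances
    w = [1 / vi for vi in v]                             # fixed-effect weights
    sw = sum(w)
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sw
    # Cochran's Q and the DL moment estimate of between-study variance tau^2
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Random-effects weights fold tau^2 into each study's variance
    w_re = [1 / (vi + tau2) for vi in v]
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se), i2

# Hypothetical counts (not from the review): SSI events / deliveries per study.
est, ci, i2 = dersimonian_laird([12, 30, 8], [200, 250, 400])
```

Because the random-effects estimate is a weighted average of the study proportions, it always lies between the smallest and largest observed incidence; I² then quantifies how much of the spread exceeds sampling error.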
2.
Arch Razi Inst ; 75(2): 249-256, 2020 06.
Article in English | MEDLINE | ID: mdl-32621455

ABSTRACT

Peripheral nerve disorders are among the most common neurological problems; therefore, interventions to treat them or limit their side effects are important. This study aimed to investigate the effect of oat extract on experimental sciatic nerve injury in rats. In total, 50 adult male rats were divided into five groups (n=10). Group 1 was exposed to sham conditions, and group 2 served as the control group (nerve injury without treatment). Groups 3-5 were subjected to sciatic nerve injury and received the oat extract by oral gavage (100, 200, and 400 mg/kg, respectively). At 2 and 4 weeks, the rats were euthanized for pathological evaluation of nerve repair. The results showed a dose-dependent increase in the formation of the perineurium and epineurium in the oat-treated groups (100, 200, and 400 mg/kg) compared with the control group after 2 weeks (P<0.05). Furthermore, the presence of inflammatory cells decreased in the oat extract-treated groups (100, 200, and 400 mg/kg) compared with the control group after 2 weeks (P<0.05). In addition, axonal swelling decreased significantly in the oat extract-treated groups (200 and 400 mg/kg) compared with the control group (P<0.05). However, the axons increased dose-dependently in the oat-treated groups (100, 200, and 400 mg/kg) compared with the control group after 4 weeks (P<0.05). These results suggest that oat extract has positive effects on sciatic nerve repair in rats.


Subject(s)
Avena/chemistry , Neuroprotective Agents/pharmacology , Peripheral Nerve Injuries/drug therapy , Plant Extracts/pharmacology , Sciatic Nerve/injuries , Animals , Male , Neuroprotective Agents/chemistry , Plant Extracts/chemistry , Rats , Rats, Wistar , Seeds/chemistry
3.
Trop Anim Health Prod ; 52(4): 1779-1786, 2020 Jul.
Article in English | MEDLINE | ID: mdl-31898025

ABSTRACT

The present study was conducted to assess the effects of pre-breeding vitamin E and selenium (ESe) injections on the reproductive performance, antioxidant status, and serum progesterone (P4) concentration of estrus-synchronized Mehraban ewes. During the breeding season, 38 ewes (3-4 years) were divided into two groups (n = 18), and estrus was synchronized by intravaginal insertion of a 0.3 g progesterone CIDR device for 13 days, followed by 350 IU eCG at CIDR withdrawal. Ewes were kept under pasture conditions and exposed to Mehraban rams 48 h after CIDR withdrawal. The experimental treatments were control and ESe injection. The ESe group received three intramuscular (5 mL) injections of ESe (0.5 mg/mL of selenium as sodium selenite and 50 IU vitamin E as DL-α-tocopheryl) once every 2 weeks: 2 weeks before CIDR insertion, at CIDR insertion, and at CIDR withdrawal. Fertility, prolificacy, lambing rate, and birth weight were recorded after parturition. Blood samples were collected at CIDR insertion, at CIDR withdrawal, and at 5, 10, and 15 days after ram exposure. Fertility, prolificacy, lambing rate, and birth weight were not improved by ESe treatment, but lamb viability was higher in the ESe group than in the control group (P < 0.05). Serum total antioxidant capacity at day 5 and P4 at day 10 after ram exposure were higher in the ESe group than in the control group (P < 0.05). In conclusion, sheep breeders can use ESe at CIDR insertion and withdrawal to potentiate the antioxidant status and progesterone profile of estrus-synchronized Mehraban ewes.


Subject(s)
Estrus Synchronization/drug effects , Fertility/drug effects , Selenium/pharmacology , Sheep/physiology , Vitamin E/pharmacology , Administration, Intravaginal , Animals , Antioxidants/pharmacology , Birth Weight , Estrus/drug effects , Female , Humans , Pregnancy , Progesterone/blood , Reproduction/drug effects , Seasons , Selenium/administration & dosage , Vitamin E/administration & dosage
4.
Malays Fam Physician ; 14(3): 28-36, 2019.
Article in English | MEDLINE | ID: mdl-32175038

ABSTRACT

BACKGROUND AND OBJECTIVE: A successful family physician program needs ongoing and full cooperation between people and the organizations in charge. Ensuring the satisfaction of family physicians through improvement of the underlying factors could motivate them to provide high-quality services. This study aimed to determine the family physicians' satisfaction level with the factors affecting the dynamism of the urban family physicians program in the Fars and Mazandaran provinces of Iran. METHOD: This cross-sectional study was carried out in urban areas in the Fars and Mazandaran provinces in 2016. The sample consisted of 143 and 96 family physicians, respectively, in Fars and Mazandaran provinces and was selected using the stratified random sampling method. Data were collected using a questionnaire and included both sociodemographic variables and factors assessing the family physicians' satisfaction levels. Each factor was scored based on a Likert scale from 0 to 5 points, and any satisfaction level higher than 3 out of 5 was equated with being satisfied. RESULTS: The overall satisfaction levels among family physicians in Fars and Mazandaran provinces were 2.77±0.53 and 3.37±0.56, respectively, revealing a statistically significant difference between provinces (p<0.001). Moreover, the mean satisfaction scores for the performances of healthcare centers, insurance companies, specialists, healthcare workers, and the population covered were 2.78±0.1, 2.54±0.9, 2.52±0.8, 4.24±0.07, and 2.96±0.8, respectively. The family physicians' levels of satisfaction were significantly correlated with population size (p=0.02, r= -0.106), and willingness to stay in an urban family physician program (p<0.001, r= +0.398). CONCLUSION: This study revealed that family physicians exhibited a low level of satisfaction with the urban family physician program. 
Given the direct association between family physicians' satisfaction levels and their retention in the program, physicians are unlikely to remain in the program, and subsequent administrative problems are likely to follow.

5.
Obes Rev ; 19(3): 313-320, 2018 03.
Article in English | MEDLINE | ID: mdl-29266643

ABSTRACT

The objective of this systematic review and meta-analysis was to examine the association between eating while television viewing (TVV) and overweight or obesity in children (<18 years). A systematic search of PubMed, Scopus, Web of Science, ProQuest and Embase was conducted up to April 2017; pooled odds ratios (OR) and 95% confidence intervals (CI) were calculated using a random effects model. Of 4,357 articles identified, 20 observational studies met inclusion criteria (n = 84,825), and 8 of these 20 (n = 41,617) reported an OR. Eating while TVV was positively associated with obesity-related anthropometric measurements in 15 studies (75%). The meta-analysis revealed that eating while TVV was positively associated with being overweight (OR = 1.28; 95% CI: 1.17, 1.39). Subgroup analyses showed similar positive associations in both girls and boys, as well as in children who ate dinner while TVV. There was no evidence of publication bias. The present systematic review and meta-analysis suggests that eating while TVV could be a risk factor for overweight or obesity in children and adolescents.


Subject(s)
Energy Intake/physiology , Feeding Behavior/psychology , Pediatric Obesity/etiology , Sedentary Behavior , Television , Adolescent , Attitude to Health , Child , Child, Preschool , Humans , Observational Studies as Topic , Pediatric Obesity/epidemiology , Risk Factors
6.
Indian J Nephrol ; 23(3): 201-5, 2013 May.
Article in English | MEDLINE | ID: mdl-23814419

ABSTRACT

Hyperuricemia is common in renal transplant recipients (RTRs), especially those on cyclosporine (CsA)-based therapy. We conducted a retrospective study to determine the prevalence of hyperuricemia and its risk factors among RTRs. A total of 17,686 blood samples were obtained from 4,217 RTRs between April 2008 and January 2011. Hyperuricemia was defined as a uric acid level of ≥7.0 mg/dl in men and ≥6.0 mg/dl in women that persisted for at least two consecutive tests. The majority (68.2%) of RTRs were normouricemic. Hyperuricemia was more frequent in younger and female RTRs. On multivariate logistic regression, we found a high trough level of cyclosporine to be a risk factor for hyperuricemia. In addition, female gender, impaired renal function, and dyslipidemia (hypercholesterolemia, hypertriglyceridemia, and elevated LDL) were also associated with a higher probability of hyperuricemia. Hyperuricemia is a common complication after renal transplantation. Risk factors implicated in post-transplant hyperuricemia include a high trough level of cyclosporine, female gender, renal allograft dysfunction, and dyslipidemia.
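Several abstracts in this listing report risk factors as odds ratios from multivariate logistic regression. As a minimal single-predictor sketch (with synthetic data, not the study's records), the odds ratio for a binary risk factor is exp(β₁) from a fitted logistic model:

```python
import math

def fit_logistic(x, y, lr=0.5, iters=3000):
    """Fit P(y=1) = sigmoid(b0 + b1*x) for one predictor by gradient
    ascent on the mean log-likelihood. exp(b1) is the odds ratio per
    one-unit increase in x. (A real analysis would adjust for multiple
    covariates with a statistics package.)"""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p          # gradient w.r.t. intercept
            g1 += (yi - p) * xi   # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical cohort: exposed group (x=1) has 6/10 events, unexposed 2/10,
# so the sample odds ratio is (6/4)/(2/8) = 6.
x = [0] * 10 + [1] * 10
y = [1, 1] + [0] * 8 + [1] * 6 + [0] * 4
b0, b1 = fit_logistic(x, y)
odds_ratio = math.exp(b1)  # converges to the sample OR of 6
```

For a saturated model like this, the fitted exp(b1) reproduces the cross-tabulated odds ratio exactly, which is a useful sanity check for any implementation.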

7.
Indian J Nephrol ; 22(4): 280-4, 2012 Jul.
Article in English | MEDLINE | ID: mdl-23162272

ABSTRACT

Hyperuricemia is frequent among adult renal transplant recipients; however, data on pediatric kidney recipients are scarce. This study was designed to estimate the prevalence and risk factors of late post-transplant hyperuricemia in pediatric recipients. A retrospective observational multicenter study of 179 pediatric renal recipients (5-18 years) was conducted between April 2008 and January 2011 in five kidney transplant centers in Tehran, Iran. All recipients were followed up for more than 1 year (5.9±3.3 years) after transplantation. A total of 17,686 blood samples were obtained for serum uric acid (SUA) measurement. The normal range of SUA was defined as 1.86-5.93 mg/dl for children between 2 and 15 years in both genders, 2.40-5.70 mg/dl for girls aged >15 years, and 3.40-7.00 mg/dl for boys aged >15 years; hyperuricemia was defined as SUA above 7 mg/dl in boys and 6 mg/dl in girls older than 15 years. The median age of the children was 13 years. Male recipients were more prevalent than female (male/female 59%/41%). Hyperuricemia was detected in 50.2% of patients. The mean SUA concentration was 5.9±1.7 mg/dl overall and 7.7±1.2 mg/dl in hyperuricemic patients. On multivariate logistic regression, elevated serum creatinine concentration (P<0.001) and the time elapsed since renal transplantation (P=0.02) were associated with late post-transplant hyperuricemia. High cyclosporine levels (C0 and C2) were not risk factors for hyperuricemia. Late post-transplant hyperuricemia was found in about half of pediatric renal recipients and was associated with impaired renal allograft function.

10.
Int J Organ Transplant Med ; 3(4): 166-75, 2012.
Article in English | MEDLINE | ID: mdl-25013642

ABSTRACT

BACKGROUND: Kidney transplantation is associated with various biochemical abnormalities, such as changes in serum levels of sodium (Na), potassium (K), calcium (Ca), and phosphorus (P). Although cyclosporine (CsA) is used commonly, the prevalence of its side effects, including electrolyte disturbances, is not well understood. OBJECTIVE: To determine the prevalence of electrolyte disturbances and their relation to CsA blood levels. METHODS: In a retrospective study, 3308 kidney transplant recipients transplanted between 2008 and 2011 were studied. We evaluated the relation between serum Ca, P, Na, and K and CsA trough (C0) and 2-hour post-dose (C2) levels. RESULTS: The mean±SD age of recipients was 37±15 years; 63% of patients were male. Overall, C2 levels correlated with the Ca blood level (p=0.018; OR: 1.13, 95% CI: 1.02-1.25), and C0 levels correlated with blood levels of P and Cr (p<0.001; OR: 1.83, 95% CI: 1.59-2.11). CONCLUSION: Electrolyte disturbances are prevalent. Higher serum levels of CsA can worsen allograft function by disturbing serum P and Ca levels.

11.
Transplant Proc ; 43(2): 488-90, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440741

ABSTRACT

OBJECTIVE: To determine the correlation between the cyclosporine blood concentration at 2 hours after dosing (C2) and renal allograft function. MATERIALS AND METHODS: From 2008 to 2010, 1191 kidney transplant recipients (718 male and 473 female patients) were studied. The correlation between serum creatinine concentration and C2 blood concentration was stratified at 400, 600, 800, and 1000 ng/mL. RESULTS: The mean (SD) C2 was 620 (235) ng/mL, and the serum creatinine concentration was 1.49 (0.68) mg/dL. At multivariate regression analysis, no significant correlation was observed between serum creatinine concentration and C2 blood concentrations of 600, 800, or 1000 ng/mL (P=.18, .57, and .76, respectively); however, a significant association was observed at 400 ng/mL (P=.03). Moreover, 36.1% of 3159 samples demonstrated satisfactory renal allograft function despite a low C2 blood concentration between 400 and 600 ng/mL. CONCLUSION: During maintenance therapy, a C2 blood concentration between 400 and 600 ng/mL is effective and safe for providing prophylaxis against rejection, and can improve long-term survival by decreasing cyclosporine toxicity.


Subject(s)
Cyclosporine/blood , Immunosuppressive Agents/blood , Kidney Transplantation/methods , Creatinine/blood , Cyclosporine/pharmacokinetics , Drug Monitoring/methods , Female , Graft Rejection/prevention & control , Humans , Immunosuppressive Agents/pharmacokinetics , Male , Multivariate Analysis , Regression Analysis , Retrospective Studies , Time Factors , Treatment Outcome
12.
Transplant Proc ; 43(2): 578-80, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440766

ABSTRACT

OBJECTIVE: To evaluate the prevalence of anemia and appraise its risk factors at 6 months after renal transplantation. MATERIALS AND METHODS: This retrospective study was performed between 2008 and 2010 in 2713 adult kidney transplant recipients to determine the prevalence of posttransplantation anemia. Anemia was defined as a hemoglobin concentration of 12 g/dL or less in women and 13 g/dL or less in men. RESULTS: The prevalence of posttransplantation anemia was 52.7%, with severe anemia (hemoglobin≤11 g/dL) detected in 24.4% of patients. Impaired renal function was the only risk factor associated with anemia (odds ratio, 3.6; P=.047). However, severe anemia after kidney transplantation was correlated with female sex (P=.001), renal allograft dysfunction (P=.00), and cytomegalovirus infection (P=.002). CONCLUSION: The present study demonstrated a high prevalence of posttransplantation anemia, particularly associated with impaired renal allograft function. Severe anemia was correlated with female sex, degree of kidney graft dysfunction, and cytomegalovirus infection.


Subject(s)
Anemia/complications , Anemia/epidemiology , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/therapy , Kidney Transplantation/adverse effects , Cytomegalovirus Infections/complications , Female , Graft Survival , Hemoglobins/biosynthesis , Humans , Iran , Kidney Transplantation/methods , Male , Odds Ratio , Postoperative Complications , Prevalence , Regression Analysis , Retrospective Studies , Risk Factors
13.
Transplant Proc ; 43(2): 581-3, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440767

ABSTRACT

BACKGROUND: Although endogenous erythropoietin secretion resumes from the renal allograft a few hours after successful engraftment, anemia is a common early or late complication. In addition, anemia is a risk factor for ischemic heart disease and graft loss. We sought to determine the prevalence of and risk factors for severe anemia immediately posttransplantation (PTA). METHODS: This cross-sectional retrospective study performed between 2006 and 2009 enrolled 864 adult subjects of mean age 40.7±13.8 (range=6-75) years. On the basis of World Health Organization criteria, a hemoglobin (Hb) level less than 11 g/dL for men and less than 10 g/dL for women was defined as severe anemia. RESULTS: Severe anemia occurred frequently (62.7%) among these patients, whose most common underlying disease was hypertension (n=311; 58.2%). Their mean Hb level was 9.9±1.8 g/dL at the time of hospital discharge, almost 2 weeks after transplantation. More than 90% (n=778) of subjects received a kidney from a living donor. Immediate severe anemia was associated with delayed graft function (DGF; P=.01), antithymocyte globulin (ATG)/antilymphocyte globulin (ALG) administration (P=.000), acute rejection (P=.000), recipient gender (P=.000), cold ischemic time (P=.01), pretransplant Hb (P=.000), and posttransplant creatinine (P=.001). On logistic regression analysis, donor age (P=.04, confidence interval [CI]=0.7-0.9), recipient female gender (P=.009, CI=0.08-0.7), and ATG/ALG use (P=.009, CI=1.7-43.4) were significant predictors of severe PTA. CONCLUSION: Immediate anemia after renal transplantation is a consequence of poor renal function. In addition, ATG/ALG use and DGF can induce severe PTA, which may play a role in ischemic heart disease and graft loss.


Subject(s)
Anemia/complications , Kidney Failure, Chronic/complications , Adolescent , Adult , Aged , Anemia/etiology , Anemia/metabolism , Child , Cross-Sectional Studies , Erythropoietin/metabolism , Graft Survival , Humans , Kidney Failure, Chronic/therapy , Kidney Transplantation/methods , Middle Aged , Myocardial Ischemia/complications , Postoperative Complications , Risk , Risk Factors
14.
Transplant Proc ; 43(2): 584-5, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440768

ABSTRACT

BACKGROUND: Hyperuricemia is a common complication after kidney transplantation and may adversely affect graft survival. OBJECTIVE: To assess the prevalence of and predictors for the development of hyperuricemia after renal transplantation. MATERIALS AND METHODS: Hyperuricemia was defined as a serum uric acid concentration of at least 7.0 mg/dL in men and 6.0 mg/dL in women. From March 2008 to May 2010, the uric acid concentration was measured in 12,767 blood samples from 2961 adult renal transplant recipients (64% male and 36% female patients). RESULTS: Hyperuricemia was observed in 1553 patients (52.4%). The disorder occurred more frequently in women (P=.003) and in patients with impaired renal graft function (P=.00). After adjustment for sex, serum creatinine concentration, diabetes mellitus, cyclosporine concentration, and dyslipidemia, only female sex (P=.03) and renal allograft dysfunction (P=.05) remained associated with hyperuricemia after kidney transplantation. CONCLUSION: Hyperuricemia is a common complication after kidney transplantation, and renal allograft insufficiency predisposes to higher uric acid concentrations.


Subject(s)
Hyperuricemia/etiology , Kidney Failure, Chronic/complications , Kidney Transplantation/adverse effects , Adult , Female , Graft Survival , Humans , Hyperuricemia/complications , Kidney Failure, Chronic/therapy , Male , Middle Aged , Postoperative Complications , Prevalence , Retrospective Studies , Sex Factors , Transplantation, Homologous , Uric Acid/blood
15.
Transplant Proc ; 43(2): 586-7, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440769

ABSTRACT

OBJECTIVE: To determine the prevalence of hyperhomocysteinemia (plasma homocysteine [Hcy] concentration ≥15 µmol/L) and evaluate its correlation with allograft function. MATERIALS AND METHODS: The study included 159 stable renal transplant recipients (104 men and 55 women). The prevalence and severity of hyperhomocysteinemia were compared between the transplant recipients and 72 patients (48 men and 24 women) receiving hemodialysis therapy. RESULTS: The mean (SD; range) fasting total Hcy concentration was higher in the hemodialysis group than in the renal transplantation group: 27.4 (18.3; 10-95) µmol/L vs 16.6 (9.5; 4.5-45.0) µmol/L (P=.00). Hyperhomocysteinemia occurred more frequently in patients receiving hemodialysis therapy (74% vs 49%). No significant correlation was observed between the Hcy concentration and recipient sex, cyclosporine trough concentration or concentration at 2 hours after dosing, dyslipidemia, cytomegalovirus infection, diabetes mellitus, or aspartate or alanine aminotransferase concentration. Multivariate regression analysis revealed that the serum creatinine concentration (P=.02) was the major determinant of increased total Hcy concentration in renal transplant recipients. CONCLUSION: A high prevalence of moderate hyperhomocysteinemia was observed in renal transplant recipients. There was no correlation between graft function and Hcy concentration.


Subject(s)
Hyperhomocysteinemia/complications , Kidney Failure, Chronic/complications , Kidney Transplantation/methods , Adult , Cross-Sectional Studies , Female , Humans , Immunosuppressive Agents , Kidney Failure, Chronic/therapy , Male , Middle Aged , Multivariate Analysis , Postoperative Complications , Prevalence , Renal Dialysis , Retrospective Studies
16.
Transplant Proc ; 43(2): 588-9, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440770

ABSTRACT

BACKGROUND: The development of posttransplant malignancy is a well-recognized complication of kidney transplantation due to immunosuppressive therapy. The literature on colorectal malignancy in living-donor renal transplant recipients is limited; most of the data have been collected from deceased-donor cases. As living kidney donation is now growing, we sought to define the characteristics and pattern of gastrointestinal malignancy in this group. METHODS: This cross-sectional, multicenter study analyzed the incidence and characteristics of colorectal malignancy among 17 patients with gastrointestinal malignancy after living-donor renal transplantation between 1985 and 2009 in Iran. We observed new-onset, biopsy-proven colorectal malignancy in eight patients with a mean age of 49.6±10.3 years (range=27-60) at transplantation and 61.1±8.6 years (range=53.4-78.6) at cancer diagnosis. RESULTS: The cumulative incidence rate of colorectal malignancy was 0.03%; all affected patients were male (100%) and had functioning grafts. The mean period from transplantation to diagnosis was 99.7±10.4 months (range=5-284). The majority of the recipients were aged more than 50 years (n=5), and the most frequent immunosuppressive drug was azathioprine (n=5); none had received antithymocyte globulin/antilymphocyte globulin. It was mostly a late-onset malignancy, with 50% of recipients presenting beyond 5 years from transplantation. Patients were followed for a mean of 9.2±2.4 (range=6-12) months after cancer diagnosis; three died within 9 months. CONCLUSION: Given the long latency after transplantation and the poor outcomes of colorectal malignancy, these patients require long-term screening for early detection as well as new therapeutic approaches.


Subject(s)
Colorectal Neoplasms/complications , Colorectal Neoplasms/diagnosis , Kidney Failure, Chronic/complications , Kidney Transplantation/adverse effects , Adult , Age Factors , Aged , Cross-Sectional Studies , Female , Humans , Immunosuppression Therapy/adverse effects , Immunosuppressive Agents/adverse effects , Kidney/metabolism , Kidney Failure, Chronic/therapy , Male , Middle Aged , Postoperative Complications , Risk
17.
Transplant Proc ; 43(2): 590-1, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21440771

ABSTRACT

BACKGROUND: The Kidney Transplant Questionnaire (KTQ) is a quality-of-life instrument designed specifically for renal transplant recipients. AIM: The purpose of this work was to evaluate the validity and reliability of a Persian translation of the KTQ-25 questionnaire as a tool for use in Iran, and also to compare quality of life between dialysis and transplant patients. METHOD: We enrolled 143 subjects in a cross-sectional study. Their mean age was 40.3±13.3 years (range=15-72). All KTQ-25 scales met the criteria for internal consistency (Cronbach's alpha ranged from 0.80 to 0.95), and in terms of construct validity, the correlation coefficients between the 5 scales and the total scale were also acceptable (0.84-0.91). Furthermore, significant correlations were detected between the scales (P<.001). RESULTS: The mean total score was 2.8±1.4 (range=1.5-5.8). The best mean score was observed for the uncertainty and fear dimension, 3.1±1.6 (range=0.5-7), while the lowest was detected for the emotional dimension, 2.4±1.3 (range=0.17-6). Mean follow-up was 50.1 (range=1-264) months. The most common physical problem was aching, tired legs, in 77 (55%) subjects. In a comparison between dialysis and transplant patients using the standard Iranian version of the Kidney Disease Quality of Life (KDQOL) questionnaire, the total and disease-specific scores for dialysis patients were significantly better than the total KTQ-25 score (55.8±14 vs 40.7±20.2, P=.000, and 49.7±15.8 vs 40.7±20.2, P=.000, respectively). CONCLUSION: Considering its validity and reliability, the Persian version of the KTQ-25 questionnaire may be useful for assessing health-related quality of life among Iranian transplant recipients.


Subject(s)
Kidney Failure, Chronic/psychology , Kidney Failure, Chronic/therapy , Kidney Transplantation/methods , Surveys and Questionnaires , Adult , Cross-Sectional Studies , Female , Health Status , Humans , Language , Male , Middle Aged , Patient Satisfaction , Persia , Psychometrics/methods , Quality of Life
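The internal-consistency figure quoted in the study above, Cronbach's alpha, is computed from the item variances and the variance of the total score. A minimal sketch with hypothetical item scores (not the study's data):

```python
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, all over the
    same respondents in the same order.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)"""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n-1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two hypothetical items scored by four respondents; the items mostly
# agree, so alpha is high: 2 * (1 - 3/(17/3)) = 16/17 ≈ 0.94.
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 4, 4]])
```

When every item gives identical scores, the formula returns exactly 1; items that vary independently of one another push alpha toward 0, which is why the 0.80-0.95 range reported above indicates strong internal consistency.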
18.
Int J Organ Transplant Med ; 1(2): 91-3, 2010.
Article in English | MEDLINE | ID: mdl-25013571

ABSTRACT

BACKGROUND: With the success of kidney transplantation, liver disease has emerged as an important cause of morbidity and mortality in kidney recipients. OBJECTIVE: To determine the impact of hepatitis B virus (HBV) infection on patient and graft survival in both the short and long term. METHODS: 99 renal transplant patients infected with HBV and followed up in two major transplant centers were included in a retrospective study. These patients were grafted between 1986 and 2005 and divided into two groups: (1) those positive only for hepatitis B surface antigen (HBsAg) and (2) those also positive for hepatitis C virus antibodies (HCV Ab). RESULTS: There were 88 patients with HBsAg(+) alone and 11 with both HBsAg(+) and HCV Ab(+). The mean±SD age of patients was 38.8±13.2 years, and the median follow-up after transplantation was 19 months. Although not statistically significant, the allograft survival rate in the first group (HBV(+)) was better than that in the second group (HBV(+) and HCV(+)); the 1-, 5- and 10-year graft survival rates were 91%, 77% and 62% in the first group and 70%, 56% and 28% in the second group, respectively (P=0.07). The overall mortality was 5% (4 of 88) in the first group and 27% (3 of 11) in the second group (P=0.02). CONCLUSION: Renal allograft recipients with both HBV and HCV infections have a poorer survival rate than patients with HBV infection alone. However, there is no significant difference in renal graft survival between the two groups.
