1.
Sci Rep ; 13(1): 13651, 2023 08 22.
Article in English | MEDLINE | ID: mdl-37607949

ABSTRACT

A key limitation in assessing the therapeutic impact of non-pharmacological approaches to treating hypertension is the method of reporting outcomes. A reduction in the medication required to achieve the same blood pressure may be reported separately from a reduction in blood pressure on unchanged medication, and thus understate the beneficial impact of treatment. This study aims to derive a novel scoring system to gauge the therapeutic impact of non-drug treatment of hypertension by combining excess blood pressure and the number of antihypertensive agents into a single score-the hypertensive index (HTi). The HTi was empirically derived from the systolic blood pressure and the number of antihypertensive drugs, and applied retrospectively to a cohort undergoing intervention for renovascular hypertension. Subgroup and receiver operating characteristic analyses were used to compare the HTi with traditional methods of reporting outcomes. Following intervention (99 patients), 46% had improvement in both medication load and blood pressure, 29% had benefit in blood pressure without reduction in medication load, 15% had reduction in medication load without significant change in blood pressure and 9% showed no benefit in either parameter. The HTi was superior in detecting benefit from intervention compared with measuring blood pressure or medication load alone (AUC 0.94 vs 0.85 and 0.84, respectively). The hypertensive index may be a more sensitive marker of treatment effect than blood pressure measurement alone. The use of such scoring systems in future trial design may allow more accurate reporting of the effects of interventions for hypertension.
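A minimal sketch of the idea behind such a combined score: excess systolic pressure plus a weighted medication count, with the change in score compared against the change in blood pressure alone by ROC analysis. The target pressure, drug weighting, toy data and benefit labels below are hypothetical illustrations, not the paper's empirically derived formula.

```python
# Hypothetical combined hypertension score in the spirit of the HTi.
# TARGET_SBP and DRUG_WEIGHT are illustrative assumptions, not the
# paper's empirically derived parameters.
from sklearn.metrics import roc_auc_score

TARGET_SBP = 140   # assumed systolic target, mmHg
DRUG_WEIGHT = 10   # assumed mmHg-equivalent per antihypertensive agent

def hti(systolic_bp, n_drugs):
    """Excess systolic pressure plus a medication-load penalty."""
    return max(systolic_bp - TARGET_SBP, 0) + DRUG_WEIGHT * n_drugs

# Toy (SBP, drug count) pairs before and after intervention, with
# invented benefit labels for four patients.
pre     = [(180, 3), (160, 3), (170, 4), (150, 1)]
post    = [(150, 2), (160, 1), (160, 2), (148, 1)]
benefit = [1, 1, 1, 0]

delta_hti = [hti(*a) - hti(*b) for a, b in zip(pre, post)]
delta_bp  = [a[0] - b[0] for a, b in zip(pre, post)]
print("AUC, change in HTi:", roc_auc_score(benefit, delta_hti))  # 1.00
print("AUC, change in BP: ", roc_auc_score(benefit, delta_bp))   # 0.67
```

Patient 2 illustrates the point of the score: blood pressure is unchanged but two drugs are dropped, a benefit that only the combined index credits.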


Subject(s)
Hypertension, Renovascular , Hypertension , Humans , Retrospective Studies , Hypertension/drug therapy , Blood Pressure , Antihypertensive Agents/therapeutic use
2.
Ann Vasc Surg ; 75: 287-293, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33819582

ABSTRACT

OBJECTIVES: Tunneled central venous catheter infection (TCVCi) is a common complication that often necessitates removal of the TCVC and replacement by a further TCVC. In theory, an early-cannulation arteriovenous graft (ecAVG) can be inserted soon after TCVC infection, but this is not widely practiced because of concerns over safety and infection of the ecAVG. Drawing on 8 years of ecAVG experience, the aim of this study was to compare outcomes following TCVC infection between replacement with a further TCVC (TCVCr) and immediate ecAVG placement (ecAVGr). DESIGN: Retrospective comparison of 2 cohorts who underwent replacement of an infected TCVC either by an early-cannulation graft (n = 18) or by a further central catheter (n = 39). METHODS: Data were abstracted from a prospectively completed electronic patient record, covering patient demographics; TCVC insertion, duration and infection, including culture-proven bacteraemia; and subsequent access interventions. RESULTS: Eighteen of 299 patients identified from 2012 to 2020 had an ecAVG implanted as treatment for a TCVCi. In a 1-year period (January 1, 2015-December 31, 2015), 39 of 222 TCVCs inserted were replacements following a TCVCi. No patient with an ecAVGr developed an immediate infection or a complication from the procedure. The rate of subsequent vascular access infection was significantly higher with a TCVCr than with an ecAVGr (0.6 vs. 0.1/patient/1000 HD days, P < 0.001). The number of further TCVCs required was significantly higher in the TCVCr group (7.1 vs. 0.4/patient/1000 HD days, P < 0.001). CONCLUSIONS: An ecAVG early after a TCVC infection is safe, reduces the incidence of subsequent infectious complications and reduces the number of TCVCs required, with better functional patency.
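The infection rates above follow the standard vascular-access audit convention of events per 1000 haemodialysis days; a minimal sketch of the arithmetic, with illustrative figures rather than the study's raw counts:

```python
# Events per 1000 haemodialysis (HD) days; figures are illustrative.
def rate_per_1000_hd_days(events: int, total_hd_days: int) -> float:
    return 1000 * events / total_hd_days

# e.g. 6 access infections over 10,000 cumulative catheter days:
print(rate_per_1000_hd_days(6, 10_000))  # 0.6
```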


Subject(s)
Arteriovenous Shunt, Surgical , Catheter-Related Infections/prevention & control , Catheterization, Central Venous/adverse effects , Catheterization , Catheters, Indwelling/adverse effects , Central Venous Catheters/adverse effects , Renal Dialysis , Aged , Arteriovenous Shunt, Surgical/adverse effects , Catheter-Related Infections/diagnosis , Catheter-Related Infections/microbiology , Catheterization, Central Venous/instrumentation , Device Removal , Female , Humans , Male , Middle Aged , Reinfection , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
3.
J Hosp Infect ; 110: 37-44, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33484781

ABSTRACT

BACKGROUND: Infection is the second highest cause of mortality in end-stage renal disease, with a significant proportion relating to haemodialysis (HD) vascular access-related infection (VARI). AIM: To report the rate and antimicrobial resistance (AMR) of all-source bloodstream infections (BSIs) by vascular access type in a Scottish HD cohort. METHODS: Retrospective analysis was undertaken of data on adult patients attending seven HD units during 2017. Total HD days for each vascular access type were calculated, and BSI rates were expressed per 1000 HD days. AMR was verified using health board microbiology databases. FINDINGS: Excluding contaminant organisms, the overall BSI rate was 0.57 per 1000 HD days. The highest all-source and VARI BSI rates per 1000 HD days were in the non-tunnelled central venous catheter (CVC) group (3.11 and 2.07 respectively), followed by tunnelled CVC (1.10 and 0.67), arteriovenous graft (0.51 and 0.31), and finally arteriovenous fistula (0.29 and 0.02). Non-VARI BSI rates were lowest in the arteriovenous graft group. Staphylococci comprised the majority of events, with Staphylococcus aureus implicated in 29%. Gram-negative BSIs were prevalent, particularly in the CVC groups, and were associated with higher mortality. Rates of multidrug-resistant (MDR) S. aureus and of carbapenem resistance were relatively low, but the proportion of MDR Gram-negative organisms was high compared with the general Scottish population. CONCLUSION: Arteriovenous fistula access is confirmed as having the lowest all-source and VARI BSI rates, and arteriovenous graft access the lowest non-VARI BSI rates. Staphylococci remain the prevailing genus; however, the contribution of Gram-negative BSIs, their higher associated mortality, and the proportion of MDR organisms in this group are notable.


Subject(s)
Bacteremia , Catheter-Related Infections , Renal Dialysis , Sepsis , Adult , Arteriovenous Fistula , Bacteremia/epidemiology , Catheter-Related Infections/epidemiology , Central Venous Catheters , Gram-Negative Bacterial Infections/epidemiology , Humans , Retrospective Studies , Scotland/epidemiology , Sepsis/epidemiology , Staphylococcus aureus , Vascular Grafting
4.
Transplant Proc ; 50(10): 3160-3164, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30577182

ABSTRACT

OBJECTIVES: There has been considerable change in the practice of deceased-donor kidney transplantation in the past 15 years, with kidneys from more extreme donor phenotypes now implanted. The aim of this study was to determine whether increased use of expanded criteria donors (extended criteria donors and donors after circulatory death) affected clinical outcomes, including the incidence and pattern of delayed graft function. METHODS AND MATERIALS: A retrospective analysis of 1359 renal transplants performed over 15 years was undertaken. The first 10 years of data (group 1) were compared with the subsequent 5 years (group 2). Outcomes were analyzed at 6 and 12 months, together with serum creatinine and the pattern of delayed graft function (posttransplant time on hemodialysis, time to peak creatinine, time for creatinine to halve, and time for creatinine to fall within 10% of baseline). RESULTS: There was a significant increase in the percentage of expanded criteria donor allografts used in group 2, with a significant increase in the incidence of delayed graft function. Despite this, serum creatinine and the incidence of biopsy-proven acute rejection both improved in group 2. Group 2 expanded criteria donor kidneys had a significantly lower incidence of type 1 delayed graft function and a significantly higher incidence of types 3 and 4. In both groups, the time for creatinine to halve was the best predictor of a serum creatinine <180 µmol/L at 1 year. CONCLUSION: The increased use of expanded criteria donor kidneys has led to a higher incidence of delayed graft function, but the pattern has changed such that the requirement for hemodialysis has significantly reduced.
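A sketch of one of the delayed graft function metrics above, the time for creatinine to halve, estimated from serial post-transplant measurements. Linear interpolation between the two samples straddling the halfway value, and the example series itself, are assumptions for illustration.

```python
# Days until serum creatinine falls to half of its post-transplant peak,
# estimated by linear interpolation between samples. Data illustrative.
def days_to_halve(days, creatinine):
    """days/creatinine: parallel lists of sample day and value (umol/L)."""
    peak_i = max(range(len(creatinine)), key=creatinine.__getitem__)
    target = creatinine[peak_i] / 2
    for i in range(peak_i, len(creatinine) - 1):
        c0, c1 = creatinine[i], creatinine[i + 1]
        if c1 <= target < c0:
            frac = (c0 - target) / (c0 - c1)  # linear interpolation
            return days[i] + frac * (days[i + 1] - days[i])
    return None  # creatinine never halved over follow-up

print(days_to_halve([0, 2, 5, 9, 14], [600, 640, 480, 300, 200]))  # ~8.6 d
```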


Subject(s)
Delayed Graft Function/etiology , Donor Selection , Kidney Transplantation/adverse effects , Adult , Biopsy/adverse effects , Creatinine/blood , Female , Graft Survival , Humans , Male , Middle Aged , Retrospective Studies , Treatment Outcome
5.
Transplant Proc ; 47(6): 1605-9, 2015.
Article in English | MEDLINE | ID: mdl-26293021

ABSTRACT

BACKGROUND: Histopathological features on time-zero renal biopsies correlate with graft outcome after renal transplantation. With increasing numbers of marginal donors, assessment of pre-implantation graft quality is essential, yet the clinician's choice between wedge and core biopsy is currently made without evidence on relative efficacy or safety. This study aims to compare the information derived from wedge versus core biopsy. METHODS: Prospective evaluation of 37 wedge biopsies and 30 core biopsies was performed. Histopathological data were collected on the number of glomeruli and arterioles observed, and Remuzzi scoring for glomerulosclerosis, tubular atrophy, interstitial fibrosis, and arteriolar narrowing was performed. Clinical data on delayed graft function (DGF) were also collated. Sensitivity, specificity, and positive and negative predictive values for DGF were compared. RESULTS: Patient demographics between the two cohorts were comparable, and no biopsy complications occurred; 81% of wedge biopsies versus 50% of core biopsies had >10 glomeruli (P = .01), whereas 32% of wedge biopsies and 57% of core biopsies had >2 arterioles (P = .02). Wedge biopsies were more likely to identify pathology, showing more glomerulosclerosis, tubular atrophy (P < .01), and interstitial fibrosis (P < .01). There was a non-significant trend toward higher Remuzzi scores in wedge biopsies (22% versus 7% with Remuzzi ≥ 4; P = .12). The sensitivity and positive predictive value of Remuzzi ≥ 4 for predicting DGF were better on wedge biopsy (45.5% versus 0%, P < .01, and 62.5% versus 0%, P < .01, respectively). CONCLUSIONS: Wedge biopsies were safe and superior to core biopsies for identifying clinically significant histopathological findings on pre-implantation renal biopsy. We believe the wedge biopsy is the method of choice for time-zero biopsies.
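The diagnostic statistics reported above come from the usual 2x2 table. The sketch below reproduces the wedge-biopsy sensitivity (45.5%) and positive predictive value (62.5%) with a consistent but hypothetical set of counts, since the abstract does not give the full table.

```python
# Standard 2x2 diagnostic statistics: Remuzzi >= 4 on the time-zero
# biopsy as a test for delayed graft function (DGF).
def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts consistent with the reported wedge-biopsy figures:
# 5 of 11 DGF patients flagged, 3 false positives, 37 biopsies in total.
print(diagnostics(tp=5, fp=3, fn=6, tn=23))  # sens ~0.455, ppv 0.625
```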


Subject(s)
Delayed Graft Function/pathology , Kidney Diseases/pathology , Kidney Transplantation , Kidney/pathology , Aged , Arterioles/pathology , Atrophy , Biopsy/methods , Female , Fibrosis , Humans , Kidney/blood supply , Male , Middle Aged , Predictive Value of Tests , Prospective Studies , Sclerosis , Sensitivity and Specificity , Tissue Donors
6.
Eur J Vasc Endovasc Surg ; 44(1): 55-61, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22521840

ABSTRACT

OBJECTIVES: Risk indices help quantify the risk of cardiovascular events and death prior to making decisions about prophylactic abdominal aortic aneurysm (AAA) repair. This paper aims to study the predictive capabilities of 5 validated indices. DESIGN AND METHODS: A prospective observational multi-centre cohort study in Glasgow recruited 106 consecutive patients undergoing elective open AAA repair from August 2005 to September 2007. The Glasgow Aneurysm Score (GAS), vascular physiology-only Physiological and Operative Severity Score for the enUmeration of Mortality (V(p)-POSSUM), Vascular Biochemical and Haematological Outcome Model (VBHOM), Revised Cardiac Risk Index (RCRI) and Preoperative Risk Score of the Estimation of Physiological Ability and Surgical Stress score (PRS of E-PASS) were calculated. Indices were compared using receiver operating characteristic (ROC) analysis and area under the curve (AUC) estimates. End points were all-cause mortality, major adverse cardiac events (MACE) and cardiac death. RESULTS: GAS, VBHOM and RCRI did not predict outcome. V(p)-POSSUM predicted MACE (AUC = 0.681), cardiac death (AUC = 0.762) and all-cause mortality (AUC = 0.780), as did E-PASS (AUC = 0.682, 0.821 and 0.703 for MACE, cardiac death and all-cause mortality respectively). CONCLUSION: Whilst V(p)-POSSUM and E-PASS predicted outcome, the less complex RCRI and GAS performed poorly, which calls into question the utility of decision-making based on these surgical risk indices.
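Of the indices above, the Glasgow Aneurysm Score is simple enough to show inline: as originally published (Samy et al.), it is the patient's age plus fixed increments for shock and comorbid disease. A sketch for orientation only; verify against the original paper before any use.

```python
# Glasgow Aneurysm Score sketch, per the originally published weights
# (age + 17 for shock + 7 myocardial + 10 cerebrovascular + 14 renal).
def glasgow_aneurysm_score(age, shock=False, myocardial=False,
                           cerebrovascular=False, renal=False):
    return (age
            + (17 if shock else 0)
            + (7 if myocardial else 0)
            + (10 if cerebrovascular else 0)
            + (14 if renal else 0))

print(glasgow_aneurysm_score(72, myocardial=True, renal=True))  # 93
```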


Subject(s)
Decision Making , Elective Surgical Procedures , Laparotomy , Risk Assessment , Vascular Surgical Procedures/methods , Cause of Death/trends , Follow-Up Studies , Hospital Mortality/trends , Humans , Preoperative Period , Prognosis , Prospective Studies , ROC Curve , Risk Factors , Survival Rate/trends , United Kingdom/epidemiology , Vascular Surgical Procedures/mortality
7.
Br J Anaesth ; 107(2): 144-9, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21610013

ABSTRACT

BACKGROUND: The prediction of long-term survival after surgery is complex. Natriuretic peptides can predict short-term postoperative cardiac morbidity and mortality. This study aims to determine the long-term prognostic significance of preoperative B-type natriuretic peptide (BNP) concentration after major non-cardiac surgery. METHODS: We conducted a prospective single-centre observational cohort study in a West of Scotland teaching hospital. Three hundred and forty-five patients undergoing major non-cardiac surgery were included. The primary endpoint was long-term all-cause mortality. RESULTS: Overall survival was 67.8% (234/345), with 27 postoperative deaths (within 42 days) and 84 deaths at subsequent follow-up (median follow-up 953 days). A BNP concentration of >87.5 pg/ml best predicted mortality: mean survival of patients with an elevated BNP (>87.5 pg/ml) was 731.9 days (95% CI 613.6-850.2) compared with 1284.6 days (95% CI 1219.3-1350.0; P < 0.001) in patients with a BNP <87.5 pg/ml. BNP was an independent predictor of survival. CONCLUSIONS: BNP is an independent predictor of long-term survival after major non-cardiac surgery. A simple preoperative blood test can provide predictive information on future risk of death, and potentially has a role in preoperative risk assessment.
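A sketch of the survival comparison described above: dichotomize patients at the BNP cut-point and compare Kaplan-Meier curves with a log-rank test. It uses the lifelines package on synthetic data; the distributions and event fraction are invented placeholders.

```python
# Kaplan-Meier comparison across a BNP cut-point, on synthetic data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
bnp = rng.lognormal(mean=4.0, sigma=1.0, size=200)        # synthetic BNP, pg/ml
high = bnp > 87.5                                         # study cut-point
days = rng.exponential(scale=np.where(high, 700, 1300))   # worse if BNP high
event = rng.random(200) < 0.4                             # ~40% observed deaths

km_hi, km_lo = KaplanMeierFitter(), KaplanMeierFitter()
km_hi.fit(days[high], event_observed=event[high], label="BNP > 87.5")
km_lo.fit(days[~high], event_observed=event[~high], label="BNP <= 87.5")
print(km_hi.median_survival_time_, km_lo.median_survival_time_)

res = logrank_test(days[high], days[~high],
                   event_observed_A=event[high],
                   event_observed_B=event[~high])
print(res.p_value)
```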


Subject(s)
Heart Diseases/diagnosis , Natriuretic Peptide, Brain/blood , Preoperative Care/methods , Surgical Procedures, Operative , Adult , Aged , Aged, 80 and over , Biomarkers/blood , Epidemiologic Methods , Female , Humans , Male , Middle Aged , Postoperative Complications , Prognosis
8.
Surgeon ; 6(4): 204-6, 2008 Aug.
Article in English | MEDLINE | ID: mdl-18697361

ABSTRACT

BACKGROUND: It is common practice to take a specimen of pus for microscopy and bacterial culture during drainage of abscesses. The aim of this study was to determine whether routine culture and sensitivity testing had any therapeutic value in the care of patients with non-perianal cutaneous abscesses. PATIENTS AND METHODS: A retrospective analysis of all patients undergoing drainage of a cutaneous abscess during a two-year period (June 2003 - June 2005) was performed. Patients were identified from the hospital database and theatre records, and those with perianal, pilonidal or surgical wound sepsis were excluded. Notes were reviewed for clinical details, culture results, subsequent admissions and attendance at follow-up. RESULTS: Of the 239 patients treated during this period, 74 patients had 77 operations to drain abscesses that matched the inclusion criteria. Specimens were sent from 52 (67.5%) procedures. Only 65.4% of specimens yielded an organism, of which methicillin-sensitive Staphylococcus aureus (MSSA) was the most commonly isolated (36.5%). Antibiotics were given to 41.6% of patients as part of their treatment. The results of bacterial culture and antibiotic sensitivities were not known prior to discharge in any patient. CONCLUSION: This study shows that bacteriology swabs are frequently taken during incision and drainage of non-perianal cutaneous abscesses but have little impact on subsequent treatment, though these results may not be applicable to immunocompromised patients.


Subject(s)
Abscess/microbiology , Anti-Bacterial Agents/therapeutic use , Bacteria/isolation & purification , Skin Diseases, Bacterial/microbiology , Abscess/drug therapy , Adolescent , Adult , Aged , Aged, 80 and over , Female , Follow-Up Studies , Humans , Male , Middle Aged , Retrospective Studies , Skin Diseases, Bacterial/drug therapy , Treatment Outcome
9.
Am J Transplant ; 8(8): 1673-83, 2008 Aug.
Article in English | MEDLINE | ID: mdl-18510627

ABSTRACT

We assessed the outcome of pretransplant cardiac assessment in a single center. Three hundred patients with end-stage renal disease underwent electrocardiography, Bruce exercise testing (ETT) and ventricular assessment by cardiac MRI. Patients with a high index of suspicion of coronary artery disease (CAD) underwent coronary angiography and percutaneous coronary intervention (PCI) if indicated. Two hundred and twenty-two patients were accepted onto the renal transplant waiting list; 80 patients were transplanted during the follow-up period and 60 died (7 following transplantation). Successful transplantation was associated with improved survival (mean survival 4.5 ± 0.6 years vs. listed but not transplanted 4.1 ± 1.4 years vs. not listed 3.1 ± 1.7 years; p < 0.001). Ninety-nine patients underwent coronary angiography; 65 had normal or low-grade CAD and 34 had obstructive CAD. Seventeen patients (5.6%) were treated by PCI. There was no apparent survival difference between patients who underwent PCI or coronary artery bypass grafting and those who underwent angiography without intervention or no angiography (p = 0.67). Factors associated with nonlisting for renal transplantation included the burden of preexisting cardiovascular disease, poor exercise tolerance and severity of CAD. Pretransplant cardiovascular screening provides prognostic information that can be used to restrict access to transplantation. However, if the aim is to identify and treat CAD, the benefits are far from clear.


Subject(s)
Coronary Artery Disease/complications , Coronary Artery Disease/diagnosis , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/surgery , Adult , Aged , Angioplasty, Balloon, Coronary , Coronary Artery Bypass , Coronary Artery Disease/therapy , Electrocardiography , Exercise Test , Female , Humans , Kidney Transplantation , Magnetic Resonance Imaging , Male , Mass Screening , Middle Aged , Prognosis , Survival Rate , Waiting Lists
10.
Br J Surg ; 94(7): 903-9, 2007 Jul.
Article in English | MEDLINE | ID: mdl-17330928

ABSTRACT

BACKGROUND: The objective of this study was to determine whether measurement of B-type natriuretic peptide (BNP) concentration before operation could be used to predict perioperative cardiac morbidity. METHODS: A prospective derivation study was performed in high-risk patients undergoing major non-cardiac surgery, with a subsequent validation study. A venous blood sample was taken the day before surgery for measurement of plasma BNP concentration. Screening for cardiac events (non-fatal myocardial infarction and cardiac death) was performed using clinical criteria, cardiac troponin I analysis and serial electrocardiography. RESULTS: Forty-one patients were recruited to the derivation cohort and 149 to the validation cohort. In the derivation cohort, the median (interquartile range) BNP concentration in the 11 patients who had a postoperative cardiac event was 210 (165-380) pg/ml, compared with 34.5 (14-70) pg/ml in those with no cardiac complications (P < 0.001). In the validation cohort, the median BNP concentration in the 15 patients who had a cardiac event was 351 (127-1034) pg/ml, compared with 30.5 (11-79.5) pg/ml in the remainder (P < 0.001). BNP concentration remained a significant outcome predictor in multivariable analysis (P < 0.001). Using receiver operating characteristic (ROC) curve analysis, a BNP concentration of 108.5 pg/ml was calculated to best predict the likelihood of cardiac events, with a sensitivity and specificity of 87 per cent each. CONCLUSION: Preoperative serum BNP concentration predicted postoperative cardiac events in patients undergoing major non-cardiac surgery, independently of other risk factors.
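A sketch of how a single "best" cut-point such as 108.5 pg/ml is typically selected from an ROC curve, by maximizing Youden's J (sensitivity + specificity - 1). Both the criterion and the synthetic data are assumptions, as the abstract does not state the exact method used.

```python
# ROC cut-point selection by Youden's J, on synthetic BNP-like data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
event = np.r_[np.ones(15), np.zeros(135)]                 # 15 cardiac events
bnp = np.r_[rng.lognormal(5.5, 0.9, 15),                  # events: higher BNP
            rng.lognormal(3.5, 0.9, 135)]                 # no event: lower BNP

fpr, tpr, thresholds = roc_curve(event, bnp)
best = np.argmax(tpr - fpr)                               # maximize Youden's J
print(f"cut-point {thresholds[best]:.1f} pg/ml, "
      f"sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f}")
```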


Subject(s)
Heart Diseases/mortality , Natriuretic Peptide, Brain/metabolism , Postoperative Complications/mortality , Aged , Cohort Studies , Female , Heart Diseases/blood , Humans , Male , Postoperative Complications/blood , Predictive Value of Tests , Preoperative Care , Prospective Studies , Troponin I/blood
11.
Br J Surg ; 94(8): 957-65, 2007 Aug.
Article in English | MEDLINE | ID: mdl-17377931

ABSTRACT

BACKGROUND: Traditional survival curves cannot easily be used to predict outcome for an individual patient on a year-to-year basis. This difficulty is partly overcome by yearly mortality analysis, which was employed here to analyse long-term follow-up of three cancers: colorectal, ovarian and breast. METHODS: The study used prospectively collected cancer registry data from geographically defined regions in Scotland. Cohort sizes were 7196 patients with breast cancer, 3200 with colorectal cancer and 1866 with ovarian cancer. Follow-up extended to 23 years. RESULTS: Two distinct patterns of mortality emerged. Mortality rates for ovarian and colorectal cancer were initially high (41 and 21 per cent) but decreased rapidly; by 10 years patients had either died or were cured. The influence of stage diminished with follow-up. Breast cancer mortality was lower than that of colorectal or ovarian cancer, but remained elevated in comparison with the general population throughout follow-up. The influence of breast tumour size diminished with follow-up, whereas that of nodal status persisted. CONCLUSION: Patients with breast cancer remain at increased risk of death to the end of follow-up, supporting the concept of dormancy in breast cancer biology. This was not observed with colorectal or ovarian cancer.
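A sketch of the yearly mortality analysis named above: the conditional probability of death in year k among patients still at risk at the start of that year. The follow-up data below are illustrative.

```python
# Yearly (conditional) mortality: deaths during year k divided by
# patients still at risk at the start of year k. Data illustrative.
def yearly_mortality(follow_up_years, died, max_year=10):
    rates = []
    for k in range(max_year):
        at_risk = sum(1 for t in follow_up_years if t >= k)
        deaths = sum(1 for t, d in zip(follow_up_years, died)
                     if d and k <= t < k + 1)
        rates.append(deaths / at_risk if at_risk else float("nan"))
    return rates

fu   = [0.4, 1.2, 2.7, 3.1, 5.0, 8.3, 9.9, 10.0]   # follow-up, years
dead = [1,   1,   0,   1,   0,   1,   0,   0]      # death indicator
print(yearly_mortality(fu, dead, max_year=4))      # [0.125, 0.143, 0.0, 0.2]
```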


Subject(s)
Breast Neoplasms/mortality , Colorectal Neoplasms/mortality , Ovarian Neoplasms/mortality , Adult , Aged , Female , Humans , Middle Aged , Prospective Studies , Risk Factors , Scotland/epidemiology , Survival Analysis , Survival Rate
12.
Br J Surg ; 94(3): 376-81, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17152046

ABSTRACT

BACKGROUND: Quality of care measured by adverse events cannot address errors of process that have no adverse outcomes. The aim of this study was to determine whether process could be used to assess quality of care and whether process analysis could be used to assess interventions designed to improve quality. METHODS: A single-centre prospective cohort study was performed over 12 weeks in an acute surgical admission unit. Data were collected prospectively for the first 24 h of admission on three aspects of process: documentation, general management and presentation-specific criteria. After a period of observation, the impact of three interventions (active observation, increasing awareness and issuing a job description) on the mean number of process errors per patient (process score) was compared. RESULTS: The analysis was based on 566 patients admitted with general surgical pathology. Awareness of being observed failed to improve the process score. Interventions that increased awareness of process reduced the overall process score from 4.79 to 2.38 errors per person (P < 0.001). The mean overall process score in patients with an adverse event was twice that of patients who did not have an adverse event (5.74 (95 per cent confidence interval 4.03 to 7.45) versus 3.43 (3.19 to 3.66)). CONCLUSION: Process can be measured objectively and used as a measure of quality of care. Interventions to increase awareness reduced process error rates and adverse events.


Subject(s)
Emergencies , Medical Errors/prevention & control , Process Assessment, Health Care/standards , Surgical Procedures, Operative/standards , Acute Disease , Cohort Studies , Humans , Prospective Studies , Quality Control
13.
Eur J Vasc Endovasc Surg ; 31(6): 637-41, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16426872

ABSTRACT

OBJECTIVES: The objective of this study was to ascertain the benefit of routine pre-operative cardiac troponin I (cTnI) measurement in patients undergoing major lower extremity amputation for critical limb ischaemia. DESIGN: This was a prospective, blinded observational study. METHODS: All patients scheduled for lower extremity amputation without evidence of unstable coronary artery disease were recruited prospectively over a period of 1 year. In addition to routine pre-operative evaluation, a blood sample was taken for measurement of serum cTnI. Post-operative screening for cardiac events was conducted, with patients followed up to 6 weeks. RESULTS: Ten of the 44 patients included suffered a non-fatal myocardial infarction or died from a cardiac cause post-operatively. A raised pre-operative cTnI was associated with a very poor outcome (two cardiac deaths and one post-operative myocardial infarction) and was the only significant predictor of post-operative cardiac events. CONCLUSION: Routine pre-operative cTnI measurement may be of use in identifying patients at high risk of cardiac complications who would benefit from optimization of cardiac status or in whom surgery could be deferred.


Subject(s)
Amputation, Surgical , Cardiovascular Diseases/blood , Ischemia/blood , Lower Extremity/surgery , Postoperative Complications/blood , Troponin I/blood , Adult , Aged , Aged, 80 and over , Amputation, Surgical/mortality , Cardiovascular Diseases/etiology , Cardiovascular Diseases/mortality , Female , Humans , Ischemia/surgery , Lower Extremity/blood supply , Male , Middle Aged , Postoperative Complications/mortality , Preoperative Care , Prospective Studies
14.
Eur J Surg Oncol ; 31(3): 226-31, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15780555

ABSTRACT

AIM: To determine whether axillary recurrence reflects inadequate axillary treatment or adverse pathological features. METHODS: The case records of 2122 women aged under 75 years, treated for invasive breast cancer in a geographically defined area during the period 1/1/86-31/12/91, were reviewed. Data were abstracted on operations performed, pathological features, post-operative treatments and details of axillary recurrence. The risk of axillary recurrence was examined by pathological, treatment and patient factors. RESULTS: Axillary recurrence was more than twice as likely after inadequate treatment of the axilla as after adequate treatment (adequate staging, axillary radiotherapy or clearance). Delayed treatment of the axilla was not as successful as adequate primary treatment: multiple axillary recurrences were twice as common, one third of which were uncontrolled at the time of death. Inadequate surgical treatment was associated with increased rates of recurrence despite endocrine therapy, chemotherapy or radiotherapy. Lymphoedema was twice as common if axillary radiotherapy was combined with any axillary surgical procedure. CONCLUSIONS: Axillary recurrence is more common in tumours with adverse pathology but may also result from inadequate axillary treatment. To minimise axillary recurrence, optimal treatment of the axilla entails adequate staging (sampling of four or more nodes) and treatment (axillary clearance or radiotherapy and endocrine therapy) in all women.


Subject(s)
Breast Neoplasms/epidemiology , Breast Neoplasms/pathology , Carcinoma, Ductal, Breast/epidemiology , Carcinoma, Ductal, Breast/secondary , Lymph Nodes/pathology , Adult , Aged , Axilla , Breast Neoplasms/therapy , Carcinoma, Ductal, Breast/therapy , Female , Humans , Incidence , Lymphatic Metastasis , Middle Aged , Recurrence , Registries , Scotland/epidemiology
15.
Br J Surg ; 92(4): 422-8, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15609383

ABSTRACT

BACKGROUND: Early trials that compared breast and axillary treatments showed differing recurrence rates without significant differences in survival. Consequently, there was a wide range of opinion and practice in the management of breast cancer. The present study explored this variability in surgical management to determine the impact of breast and axillary treatment on recurrence and survival. METHODS: The records of 2776 women with histologically confirmed invasive breast cancer diagnosed between 1986 and 1991 were reviewed. The relationship between adequacy of breast and axillary treatment, recurrence and survival was examined in 2122 women who had surgery with curative intent. A Cox proportional hazards model that included tumour size, node status, grade, socioeconomic status and use of adjuvant therapy was used. RESULTS: Inadequate treatment was associated with a significantly higher risk of local recurrence after breast-conserving surgery (relative hazard ratio (RHR) 4.19 (95 per cent confidence interval (c.i.) 2.73 to 6.43); P < 0.001). Inadequate axillary treatment was associated with a significantly higher risk of regional recurrence (RHR 2.29 (95 per cent c.i. 1.65 to 3.16); P < 0.001). The risk of death from breast cancer was significantly higher if locoregional treatment was inadequate (RHR 1.29 (95 per cent c.i. 1.07 to 1.55); P = 0.008). CONCLUSION: Adequate surgery is fundamental to the optimal treatment of breast cancer. Inadequate surgery resulted in higher recurrence rates despite adjuvant treatments.
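A sketch of the Cox proportional hazards analysis described in the methods, using the lifelines package. The data frame, covariate names and values are illustrative stand-ins for the study's variables; the small ridge penalty simply keeps the toy fit stable.

```python
# Cox proportional hazards sketch with lifelines; toy data only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_months":    [12, 60, 34, 90, 47, 8, 120, 55, 70, 30],
    "died":           [1, 0, 1, 0, 1, 1, 0, 0, 1, 0],
    "inadequate_tx":  [1, 0, 1, 1, 0, 1, 0, 0, 1, 0],   # treatment adequacy
    "tumour_size_mm": [25, 12, 30, 10, 40, 35, 15, 18, 22, 28],
    "node_positive":  [1, 0, 0, 0, 1, 1, 0, 1, 1, 0],
})

cph = CoxPHFitter(penalizer=0.1)   # ridge penalty stabilizes the tiny sample
cph.fit(df, duration_col="time_months", event_col="died")
cph.print_summary()                # exp(coef) column gives the hazard ratios
```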


Subject(s)
Breast Neoplasms/surgery , Quality of Health Care , Adult , Aged , Axilla , Breast Neoplasms/mortality , Cohort Studies , Female , Humans , Lymph Node Excision/methods , Lymph Node Excision/mortality , Lymphatic Metastasis , Mastectomy/methods , Mastectomy/mortality , Middle Aged , Neoplasm Recurrence, Local/etiology , Neoplasm Recurrence, Local/mortality , Risk Factors , Scotland/epidemiology , Survival Analysis , Treatment Outcome
16.
Breast ; 12(1): 36-41, 2003 Feb.
Article in English | MEDLINE | ID: mdl-14659353

ABSTRACT

BACKGROUND: The assessment of axillary nodal status remains divisive: inaccurate staging may result in untreated axillary disease and in appropriate adjuvant therapy not being delivered. The impact of inadequate axillary treatment on survival remains controversial. We analyse the impact on survival of failure to assess the axillary nodal status adequately. METHODS: All women with confirmed breast cancer over a 15-year period were identified, the original pathology reports examined, and details of radiotherapy obtained. The survival of women, stratified by axillary sample size, was compared with that of a reference group and corrected for nodal status, tumour size, age, deprivation category and speciality of the treating surgeon. FINDINGS: Sampling fewer than four nodes is associated with a significantly increased risk of death. This cannot be due to understaging of the extent of axillary disease, nor is it fully explainable by differential prescription of adjuvant therapies. We conclude that the survival of the women studied may have been adversely affected by inadequate axillary treatment.


Subject(s)
Breast Neoplasms/mortality , Breast Neoplasms/pathology , Diagnostic Errors , Lymph Node Excision/methods , Adult , Aged , Axilla , Female , Humans , Lymph Node Excision/standards , Lymphatic Metastasis , Middle Aged , Neoplasm Staging , Radiotherapy, Adjuvant , Registries , Survival Analysis
17.
Clin Transplant ; 15(4): 221-7, 2001 Aug.
Article in English | MEDLINE | ID: mdl-11683814

ABSTRACT

BACKGROUND: Acute graft rejection (AR) following renal transplantation results in reduced graft survival. However, there is uncertainty regarding the definition, aetiology and long-term graft and patient outcome of AR occurring late in the post-transplant period. AIM: To determine whether rejection episodes can be classified by time from transplantation, according to their impact on graft survival, into early acute rejection (EAR) and late acute rejection (LAR). MATERIALS AND METHODS: 687 consecutive adult recipients of a first cadaveric renal transplant at a single centre were studied. All received cyclosporine (CyA)-based immunosuppression between 1984 and 1996, with a median follow-up of 6.9 yr. Details were abstracted from clinical records, with emphasis on age, sex, co-morbid conditions, HLA matching, rejection episodes, and patient and graft survival. ANALYSIS: Patients were classified by the presence of, and time to, AR from the date of transplantation. Using patients who had no AR (NAR) as a baseline, we determined the relative risk of graft failure by time to rejection, and compared the characteristics of patients with no rejection, EAR and LAR. RESULTS: Compared with NAR, the risk of graft failure was higher for patients who suffered a rejection episode, and much higher when the first rejection episode occurred after 90 d. A period of 90 d was therefore taken to separate EAR and LAR (relative risk of graft failure 3.06 and 5.27 respectively, compared with NAR as baseline, p<0.001). Seventy-eight patients (11.4%) had LAR, 271 (39.4%) had EAR and 338 (49.2%) had NAR. The mean age of these groups differed (LAR 39.6 yr and EAR 40.8 yr, compared with NAR 44 yr, p<0.003). The 5-yr graft survival for those with LAR was 45% and the 10-yr survival 28%. HLA mismatches were more frequent in those with EAR than NAR (zero mismatches in HLA-A: 36 vs. 24%, HLA-B: 35 vs. 23% and HLA-DR: 63 vs. 41%, p<0.003); there was no difference in mismatching frequency between NAR and LAR. CONCLUSIONS: AR had a deleterious impact on graft survival, particularly when occurring after 90 d, and AR episodes should therefore be divided into early and late phases. In view of the very poor graft survival associated with LAR, it is important to gain further insight into the main aetiological factors. Factors such as suboptimal CyA blood levels and non-compliance with medication should be investigated further, with the aim of developing more effective immunosuppressive regimens to reduce the incidence of LAR.
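A sketch of the classification step above: split first-rejection episodes at a candidate day cut-off and compute the relative risk of graft failure in each group against rejection-free (NAR) patients. The group sizes match the abstract; the failure counts are invented for illustration.

```python
# Relative risk of graft failure against a rejection-free baseline.
def relative_risk(failures_exposed, n_exposed, failures_ref, n_ref):
    return (failures_exposed / n_exposed) / (failures_ref / n_ref)

# Toy failure counts; group sizes (271 EAR, 78 LAR, 338 NAR) per the study.
print("EAR vs NAR:", relative_risk(120, 271, 50, 338))  # first AR <= 90 d
print("LAR vs NAR:", relative_risk(56, 78, 50, 338))    # first AR  > 90 d
```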


Subject(s)
Cyclosporine/therapeutic use , Graft Rejection/classification , Graft Survival , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/adverse effects , Acute Disease , Adult , Cadaver , Cyclosporine/administration & dosage , Graft Rejection/epidemiology , HLA Antigens , Histocompatibility Testing , Humans , Immunosuppressive Agents/administration & dosage , Kidney Transplantation/immunology , Kidney Transplantation/mortality , Middle Aged , Risk Factors , Survival Analysis , Time Factors
18.
Clin Transplant ; 15(2): 89-94, 2001 Apr.
Article in English | MEDLINE | ID: mdl-11264633

ABSTRACT

AIMS: To investigate outcomes in patients who have pre-existing diabetes and in those who develop post-transplant diabetes mellitus (PTDM). METHODS: We retrospectively reviewed the charts of 939 patients who received a first functioning renal transplant in the cyclosporine (CyA) era, between 1984 and 1999. RESULTS: Sixty-six (7%) patients had renal failure due to insulin-dependent diabetes mellitus (IDDM) and 7 (0.8%) due to non-insulin-dependent diabetes mellitus (NIDDM). Ten (1.1%) patients had coexistent diabetes and 48 (5.1%) recipients developed PTDM. Mean graft survival was 9.7 yr for patients with PTDM versus 11.3 yr for non-diabetic patients, 10.1 yr for patients with IDDM, 2.9 yr with NIDDM and 8.3 yr for those with coexistent diabetes (p=ns). However, there was a statistically significant difference in patient survival between patients who developed PTDM and those who did not. Mean survival of patients with IDDM, NIDDM, coexistent diabetes and PTDM was 8.4, 3.7, 8.6 and 10.3 yr, respectively, compared with 12.8 yr in patients without pre-existing diabetes or PTDM (p<0.001). The survival of patients older than 55 yr with PTDM was no different from that of the control group; however, in those younger than 55 yr, PTDM was associated with a higher risk of death (relative risk 2.54, p<0.001). Fifty percent of patients with IDDM developed acute rejection episodes, compared with 57.1% in the NIDDM group, 50.0% in the PTDM group, 20.0% in the coexistent diabetes group and 44.3% in the control group (p=ns). CONCLUSION: Patient survival, but not graft survival, was adversely affected by both pre-existing diabetes and PTDM, particularly in those aged under 55 yr.


Subject(s)
Diabetes Mellitus/etiology , Diabetic Nephropathies/surgery , Graft Survival , Kidney Failure, Chronic/surgery , Kidney Transplantation , Acute Disease , Adult , Female , Humans , Kidney Failure, Chronic/mortality , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Life Tables , Male , Middle Aged , Retrospective Studies , Risk Factors
19.
J Rheumatol ; 20(12): 2153-7, 1993 Dec.
Article in English | MEDLINE | ID: mdl-8014948

ABSTRACT

Collagenous colitis is an uncommon cause of chronic watery diarrhea, characterized by colonic deposition of collagen. Nonerosive, oligoarticular, peripheral arthritis has previously been noted in about 7% of patients with collagenous colitis. We describe a patient with collagenous colitis who concurrently developed an erosive, seronegative spondyloarthropathy affecting peripheral and axial joints. Synovial histology showed a conspicuous inflammatory infiltrate composed of histiocytes, lymphocytes and plasma cells. These findings suggest that collagenous colitis is a systemic autoimmune disorder with extraintestinal features such as thyroiditis and arthritis.


Subject(s)
Colitis/complications , Spondylitis, Ankylosing/blood , Spondylitis, Ankylosing/complications , Colitis/diagnosis , Colitis/metabolism , Collagen/metabolism , Colon/metabolism , Humans , Male , Middle Aged , Spondylitis, Ankylosing/immunology