Results 1 - 20 of 32
1.
Occup Med (Lond); 73(6): 304-308, 2023 Sep 29.
Article in English | MEDLINE | ID: mdl-37477883

ABSTRACT

BACKGROUND: The information technology (IT) workforce has been growing more rapidly than others, with occupational health (OH) risks of sedentary behaviour, physical inactivity and poor diet, yet studies of their non-communicable disease risk, notably cancer, are lacking. AIMS: To investigate cancer risk in IT workers compared to others in employment and the nine major Standard Occupational Classification (SOC) groups. METHODS: We evaluated incident diagnosed cancers in the UK Biobank cohort through national cancer registry linkage. Cox proportional hazard regression models, with 15-year follow-up, were used to compare incident cancer risk among IT workers with all other employed participants and with the nine major SOC groups. RESULTS: Overall, 10 517 (4%) employed participants were IT workers. Adjusting for confounders, IT workers had a slightly lower cancer incidence compared to all other employed participants (Model 2: hazard ratio = 0.91, 95% confidence interval [CI] 0.83-1.01). Compared to the nine major SOC groups, they had a similar (Major Groups 2, 5 and 8) or lower (Major Groups 1, 3, 4, 6, 7 and 9) cancer incidence. CONCLUSIONS: Despite their occupational risks of sedentary behaviour, poor diet and physical inactivity, IT workers do not have an increased cancer incidence compared to all other employed participants and the nine major SOC groups. This study paves the way for large, longitudinal health outcome studies of this under-researched and rapidly growing occupational group.
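The hazard ratios above are reported on the ratio scale with 95% confidence intervals; the interval is symmetric on the log scale. As an illustrative sketch (not the paper's code), the log-scale standard error can be recovered from a published CI and used to reconstruct the estimate:

```python
import math

def se_from_ci(lo: float, hi: float, z: float = 1.96) -> float:
    """Back out the standard error of a log hazard ratio from its 95% CI."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

def hr_with_ci(log_hr: float, se: float, z: float = 1.96):
    """Hazard ratio with 95% CI from a log-scale estimate and standard error."""
    return tuple(math.exp(log_hr + m * z * se) for m in (0, -1, 1))

# Reported estimate from the abstract: HR 0.91 (95% CI 0.83-1.01).
# The round trip is only approximate because published figures are rounded.
se = se_from_ci(0.83, 1.01)
hr, lo, hi = hr_with_ci(math.log(0.91), se)
```

Because the published HR is not exactly the geometric midpoint of its rounded CI, the reconstructed bounds land close to, but not exactly on, the reported ones.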


Subject(s)
Information Technology , Neoplasms , Humans , Biological Specimen Banks , Neoplasms/epidemiology , Neoplasms/etiology , Incidence , United Kingdom/epidemiology , Risk Factors
2.
Public Health; 196: 107-113, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34182255

ABSTRACT

OBJECTIVES: This study aimed to provide evidence on the therapeutic prescribing activity by community optometrists in Scotland and to determine its impact on workload in general practice and ophthalmology clinics. STUDY DESIGN: Scottish administrative healthcare data for a 53-month period (November 2013-April 2018) were used to analyse non-medical prescribing practice by optometrists. METHODS: Using interrupted time-series regression (Autoregressive Integrated Moving Average), we assessed the impact of optometrist prescribing on ophthalmology outpatient attendances and general practice prescribing for eye disorders. RESULTS: A total of 54,246 items were prescribed by 205 optometrists over the study period. Since the commencement of data recording, optometrist prescribing activity increased steadily from a baseline of zero to 1.2% of all ophthalmic items prescribed. Neither the monthly number of items prescribed nor the size of the optometric workforce was associated with a reduction in ophthalmology outpatient appointments over time. CONCLUSIONS: Optometrists increasingly contribute to community ophthalmic prescribing in Scotland, releasing capacity and lessening general practice, but not secondary care, workload. There appears to be an underutilisation of optometrists in the management of dry eye, which represents an opportunity to release further capacity.
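The study above fitted an ARIMA interrupted time-series model; as a deliberately simplified stand-in (plain segmented OLS rather than ARIMA, on synthetic data), the core design of an interruption analysis — intercept, pre-existing trend, level change, and slope change at the interruption — can be sketched:

```python
def its_design(n_pre: int, n_post: int):
    """Design matrix for a segmented (interrupted) time series:
    [intercept, time, post-interruption level change, post-interruption slope change]."""
    rows = []
    for t in range(n_pre + n_post):
        post = 1.0 if t >= n_pre else 0.0
        rows.append([1.0, float(t), post, post * (t - n_pre)])
    return rows

def ols(X, y):
    """Ordinary least squares via normal equations + Gaussian elimination
    (no external libraries; fine for tiny well-conditioned problems)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):            # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Synthetic noise-free series: level 2, slope 0.5, +3 level jump at the interruption
X = its_design(6, 6)
y = [2 + 0.5 * t + 3 * post for one, t, post, pchange in X]
beta = ols(X, y)   # recovers approximately [2, 0.5, 3, 0]
```

Unlike ARIMA, this sketch ignores autocorrelation and seasonality; it only shows how the interruption terms enter the regression.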


Subject(s)
Eye Diseases , Ophthalmology , Optometrists , Optometry , Eye Diseases/drug therapy , Humans , Research Design
3.
Occup Med (Lond); 71(2): 68-74, 2021 Apr 9.
Article in English | MEDLINE | ID: mdl-33515462

ABSTRACT

BACKGROUND: Information technology (IT) and the IT workforce are rapidly expanding with potential occupational health implications. But to date, IT worker health is under-studied and large-scale studies are lacking. AIMS: To investigate health, lifestyle and occupational risk factors of IT workers. METHODS: We evaluated self-reported health, lifestyle and occupational risk factors for IT workers in the UK Biobank database. Using logistic regression, we investigated differences between IT workers and all other employed participants. Regression models were repeated for IT worker subgroups (managers, professionals, technicians) and their respective counterparts within the same Standard Occupational Classification (SOC) major group (functional managers, science and technology professionals, science and technology associate professionals). RESULTS: Overall, 10 931 (4%) employed participants were IT workers. Compared to all other employed participants, IT workers reported similar overall health, but lower lifestyle risk factors for smoking and obesity. Sedentary work was a substantially higher occupational exposure risk for IT workers compared to all other employed participants (odds ratio [OR] = 5.14, 95% confidence interval [CI]: 4.91-5.39) and their specific SOC group counterparts (managers: OR = 1.83, 95% CI: 1.68-1.99, professionals: OR = 7.18, 95% CI: 6.58-7.82, technicians: OR = 4.48, 95% CI: 3.87-5.17). IT workers were also more likely to engage in computer screen-time outside work than all other employed participants (OR = 1.42, 95% CI: 1.35-1.51). CONCLUSIONS: Improved understanding of health, lifestyle and occupational risk factors from this, the largest to date study of IT worker health, can help inform workplace interventions to mitigate risk, improve health and increase the work participation of this increasingly important and rapidly growing occupational group.


Subject(s)
Occupational Exposure , Occupational Health , Humans , Information Technology , Life Style , Workplace
4.
Diabet Med; 37(12): 2116-2123, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32510602

ABSTRACT

AIMS: To estimate the rate at which people with diabetes and a low risk of foot ulceration change diabetic foot ulceration risk status over time, and to estimate the rate of ulceration, amputation and death among this population. METHODS: We conducted an observational study of 10 421 people with diabetes attending foot screening in an outpatient setting in NHS Fife, UK, using routinely collected data from a national diabetes register, NHS SCI Diabetes. We estimated the proportion of people who changed risk status and the cumulative incidence of ulceration, amputation and death, respectively, among people with diabetes at low risk of diabetic foot ulceration at 2-year follow-up. RESULTS: At 2-year follow-up, 5.1% (95% CI 4.7, 5.6) of people with diabetes classified as low risk at their first visit had progressed to moderate risk. The cumulative incidence of ulceration, amputation and death was 0.4% (95% CI 0.3, 0.6), 0.1% (95% CI 0.1, 0.2) and 3.4% (95% CI 3.1, 3.8), respectively. CONCLUSIONS: At 2-year follow-up, 5% of people at low risk of diabetic foot ulceration changed clinical risk status and <1% of people experienced foot ulceration or amputation. These findings provide information which will help to inform the current debate regarding optimal foot screening intervals.
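The percentages above carry binomial uncertainty. As an illustrative sketch only (the abstract does not state which interval method was used, and the counts below are hypothetical, chosen to give roughly the reported 5.1%), a normal-approximation CI for a proportion looks like:

```python
import math

def prop_ci(events: int, n: int, z: float = 1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical counts: 531 of 10,421 patients progressing to moderate risk
p, lo, hi = prop_ci(531, 10421)
```

For rare outcomes such as the 0.1% amputation incidence, an exact or Wilson interval would be preferable to this normal approximation.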


Subject(s)
Amputation, Surgical/statistics & numerical data , Diabetes Mellitus/epidemiology , Diabetic Foot/epidemiology , Mortality , Aged , Female , Humans , Male , Mass Screening/methods , Middle Aged , Practice Guidelines as Topic , Risk Assessment , United Kingdom/epidemiology
5.
Hypertension; 73(6): 1202-1209, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31067194

ABSTRACT

Hypertension is a risk factor for cardiovascular disease. Increased urinary sodium excretion, representing dietary sodium intake, is associated with hypertension. Low sodium intake has been associated with increased mortality in observational studies. Further studies should assess whether confounding relationships explain associations between sodium intake and outcomes. We studied UK Biobank participants (n=457 484; mean age, 56.3 years; 44.7% men) with urinary electrolytes and blood pressure data. Estimated daily urinary sodium excretion was calculated using the Kawasaki formulae. We analyzed associations between sodium excretion and blood pressure in subjects without cardiovascular disease, treated hypertension, or diabetes mellitus at baseline (n=322 624). We tested relationships between sodium excretion, incidence of fatal and nonfatal cardiovascular disease, heart failure, and mortality. Subjects in higher quintiles of sodium excretion were younger, with more men and higher body mass index. There was a linear relationship between increasing urinary sodium excretion and blood pressure. During median follow-up of 6.99 years, there were 11 932 deaths (1125 cardiovascular deaths) with 10 717 nonfatal cardiovascular events. There was no relationship between quintile of sodium excretion and outcomes. These relationships were unchanged after adjustment for comorbidity or excluding subjects with events during the first 2 years of follow-up. No differing risk of incident heart failure (1174 events) existed across sodium excretion quintiles. Urinary sodium excretion correlates with elevated blood pressure in subjects at low cardiovascular risk. No pattern of increased cardiovascular disease, heart failure, or mortality risk was demonstrated with either high or low sodium intake.


Subject(s)
Blood Pressure/physiology , Cardiovascular Diseases/mortality , Risk Assessment/methods , Sodium/urine , Biomarkers/urine , Cardiovascular Diseases/physiopathology , Cardiovascular Diseases/urine , Cause of Death/trends , Female , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate/trends , United Kingdom/epidemiology
6.
Open Heart; 3(1): e000140, 2016.
Article in English | MEDLINE | ID: mdl-27335653

ABSTRACT

OBJECTIVES: This is the second of the two papers introducing a cardiovascular disease (CVD) policy model. The first paper described the structure and statistical underpinning of the state-transition model, demonstrating how life expectancy estimates are generated for individuals defined by ASSIGN risk factors. This second paper describes how the model is prepared to undertake economic evaluation. DESIGN: To generate quality-adjusted life expectancy (QALE), the Scottish Health Survey was used to estimate background morbidity (health utilities) and the impact of CVD events (utility decrements). The SF-6D algorithm generated utilities and decrements were modelled using ordinary least squares (OLS). To generate lifetime hospital costs, the Scottish Heart Health Extended Cohort (SHHEC) was linked to the Scottish morbidity and death records (SMR) to cost each continuous inpatient stay (CIS). OLS and restricted cubic splines estimated annual costs before and after each of the first four events. A Kaplan-Meier sample average (KMSA) estimator was then used to weight expected health-related quality of life and costs by the probability of survival. RESULTS: The policy model predicts the change in QALE and lifetime hospital costs as a result of an intervention(s) modifying risk factors. Cost-effectiveness analysis and a full uncertainty analysis can be undertaken, including probabilistic sensitivity analysis. Notably, the impacts according to socioeconomic deprivation status can be made. CONCLUSIONS: The policy model can conduct cost-effectiveness analysis and decision analysis to inform approaches to primary prevention, including individually targeted and population interventions, and to assess impacts on health inequalities.
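The Kaplan-Meier sample average (KMSA) estimator mentioned above weights each interval's expected cost (or quality-of-life value) by the probability of surviving into that interval. A minimal sketch with invented numbers, not the model's actual implementation:

```python
def km_survival(at_risk, deaths):
    """Kaplan-Meier survival probability at the end of each interval."""
    s, out = 1.0, []
    for n, d in zip(at_risk, deaths):
        s *= 1 - d / n
        out.append(s)
    return out

def kmsa(mean_costs, survival):
    """Kaplan-Meier sample average: weight each interval's mean cost by the
    probability of being alive at the start of that interval."""
    weights = [1.0] + survival[:-1]
    return sum(w * c for w, c in zip(weights, mean_costs))

# Toy data: 10 patients, one death in each of the first two years,
# mean annual cost of 100 per person alive
surv = km_survival([10, 9, 8], [1, 1, 0])      # [0.9, 0.8, 0.8]
expected_cost = kmsa([100, 100, 100], surv)    # 100 + 90 + 80 = 270
```

The same weighting applies to utilities, which is how the model turns period-level SF-6D values into quality-adjusted life expectancy.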

7.
Public Health; 132: 13-23, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26917268

ABSTRACT

OBJECTIVE: This paper tests the extent to which differing trends in income, demographic change and the consequences of an earlier period of social, economic and political change might explain differences in the magnitude and trends in alcohol-related mortality between 1991 and 2011 in Scotland compared to England & Wales (E&W). STUDY DESIGN: Comparative time trend analyses and arithmetic modelling. METHODS: Three approaches were utilised to compare Scotland with E&W: 1. We modelled the impact of changes in income on alcohol-related deaths between 1991-2001 and 2001-2011 by applying plausible assumptions of the effect size through an arithmetic model. 2. We used contour plots, graphical exploration of age-period-cohort interactions and calculation of Intrinsic Estimator coefficients to investigate the effect of earlier exposure to social, economic and political adversity on alcohol-related mortality. 3. We recalculated the trends in alcohol-related deaths using the white population only to make a crude approximation of the maximal impact of changes in ethnic diversity. RESULTS: Real incomes increased during the 1990s but declined from around 2004 in the poorest 30% of the population of Great Britain. The decline in incomes for the poorest decile, the proportion of the population in the most deprived decile, and the inequality in alcohol-related deaths, were all greater in Scotland than in E&W. The model predicted less of the observed rise in Scotland (18% of the rise in men and 29% of the rise in women) than that in E&W (where 60% and 68% of the rise in men and women respectively was explained). One-third of the decline observed in alcohol-related mortality in Scottish men between 2001 and 2011 was predicted by the model, and the model was broadly consistent with the observed trends in E&W and amongst women in Scotland. 
An age-period interaction in alcohol-related mortality was evident for men and women during the 1990s and 2000s who were aged 40-70 years and who experienced rapidly increasing alcohol-related mortality rates. Ethnicity is unlikely to be important in explaining the trends or differences between Scotland and E&W. CONCLUSIONS: The decline in alcohol-related mortality in Scotland since the early 2000s and the differing trend to E&W were partly described by a model predicting the impact of declining incomes. Lagged effects from historical social, economic and political change remain plausible from the available data.


Subject(s)
Alcohol-Related Disorders/mortality , Humans , Income/trends , Mortality/trends , Politics , Population Dynamics/trends , Scotland/epidemiology , Socioeconomic Factors
8.
Public Health; 132: 24-32, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26921977

ABSTRACT

OBJECTIVE: To provide a basis for evaluating post-2007 alcohol policy in Scotland, this paper tests the extent to which pre-2007 policy, the alcohol market, culture or clinical changes might explain differences in the magnitude and trends in alcohol-related mortality outcomes in Scotland compared to England & Wales (E&W). STUDY DESIGN: Rapid literature reviews, descriptive analysis of routine data and narrative synthesis. METHODS: We assessed the impact of pre-2007 Scottish policy and policy in the comparison areas in relation to the literature on effective alcohol policy. Rapid literature reviews were conducted to assess cultural changes and the potential role of substitution effects between alcohol and illicit drugs. The availability of alcohol was assessed by examining the trends in the number of alcohol outlets over time. The impact of clinical changes was assessed in consultation with key informants. The impacts of all the identified factors were then summarised and synthesised narratively. RESULTS: The companion paper showed that part of the rise and fall in alcohol-related mortality in Scotland, and part of the differing trend to E&W, were predicted by a model linking income trends and alcohol-related mortality. Lagged effects from historical deindustrialisation and socio-economic change also remain plausible from the available data. This paper shows that policy differences or changes prior to 2007 are unlikely to have been important in explaining the trends. There is some evidence that aspects of alcohol culture in Scotland may be different (more concentrated and home drinking) but it seems unlikely that this has been an important driver of the trends or the differences with E&W other than through interaction with changing incomes and lagged socio-economic effects.
Substitution effects with illicit drugs and clinical changes are unlikely to have substantially changed alcohol-related harms: however, the increase in alcohol availability across the UK is likely to partly explain the rise in alcohol-related mortality during the 1990s. CONCLUSIONS: Future policy should ensure that alcohol affordability and availability, as well as socio-economic inequality, are reduced, in order to maintain downward trends in alcohol-related mortality in Scotland.


Subject(s)
Alcohol-Related Disorders/mortality , Alcohols/supply & distribution , Commerce/trends , Cultural Characteristics , Humans , Income/trends , Policy , Scotland/epidemiology , Social Norms
9.
Int J Equity Health; 14: 142, 2015 Nov 25.
Article in English | MEDLINE | ID: mdl-26606921

ABSTRACT

BACKGROUND: Little is known about the interaction between socio-economic status and 'protected characteristics' in Scotland. This study aimed to examine whether differences in mortality were moderated by interactions with social class or deprivation. The practical value was to pinpoint population groups for priority action on health inequality reduction and health improvement rather than a sole focus on the most deprived socioeconomic groups. METHODS: We used data from the Scottish Longitudinal Study which captures a 5.3% sample of Scotland and links the censuses of 1991, 2001 and 2011. Hazard ratios for mortality were estimated for those protected characteristics with sufficient deaths using Cox proportional hazards models and through the calculation of European age-standardised mortality rates. Inequality was measured by calculating the Relative Index of Inequality (RII). RESULTS: The Asian population had a polarised distribution across deprivation deciles and was more likely to be in social class I and II. Those reporting disablement were more likely to live in deprived areas, as were those raised Roman Catholic, whilst those raised as Church of Scotland or as 'other Christian' were less likely to. Those aged 35-54 years were the least likely to live in deprived areas and were most likely to be in social class I and II. Males had higher mortality than females, and disabled people had higher mortality than non-disabled people, across all deprivation deciles and social classes. Asian males and females had generally lower mortality hazards than majority ethnic ('White') males and females although the estimates for Asian males and females were imprecise in some social classes and deprivation deciles. Males and females who reported their raised religion as Roman Catholic or reported 'No religion' had generally higher mortality than other groups, although the estimates for 'Other religion' and 'Other Christian' were less precise. Using both the area deprivation and social class distributions for the whole population, relative mortality inequalities were usually greater amongst those who did not report being disabled, Asians and females aged 35-44 years, males by age, and people aged <75 years. The RIIs for the raised religious groups were generally similar or too imprecise to comment on differences. CONCLUSIONS: Mortality in Scotland is higher in the majority population, disabled people, males, those reporting being raised as Roman Catholics or with 'no religion' and lower in Asians, females and other religious groups. Relative inequalities in mortality were lower in disabled than non-disabled people, the majority population, females, and greatest in young adults. From the perspective of intersectionality theory, our results clearly demonstrate the importance of representing multiple identities in research on health inequalities.
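The Relative Index of Inequality used above summarises a mortality gradient across ordered socio-economic groups. Conventions vary between papers; one common regression-based version, sketched here with hypothetical decile data rather than the study's, regresses group rates on the midpoint of each group's cumulative population rank:

```python
def sii_rii(rates, pop_shares):
    """Slope (SII) and Relative (RII) Index of Inequality via OLS of group
    rates on cumulative population-rank midpoints, ordered least to most
    deprived. RII here is predicted rate at rank 1 over rank 0; note that
    other RII conventions exist in the literature."""
    mids, cum = [], 0.0
    for share in pop_shares:
        mids.append(cum + share / 2)
        cum += share
    n = len(rates)
    mx, my = sum(mids) / n, sum(rates) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(mids, rates)) \
        / sum((x - mx) ** 2 for x in mids)
    intercept = my - slope * mx
    return slope, (intercept + slope) / intercept

# Hypothetical decile mortality rates per 100,000, least to most deprived
sii, rii = sii_rii([55 + 5 * i for i in range(10)], [0.1] * 10)
```

With these invented rates rising linearly from 55 to 100, the fitted line runs from 52.5 at rank 0 to 102.5 at rank 1, giving an RII near 1.95.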


Subject(s)
Health Status Disparities , Healthcare Disparities , Mortality , Cohort Studies , Ethnicity , Female , Humans , Longitudinal Studies , Male , Religion , Scotland/epidemiology , Sex Factors
10.
Heart; 101(3): 201-8, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25324535

ABSTRACT

OBJECTIVES: A policy model is a model that can evaluate the effectiveness and cost-effectiveness of interventions and inform policy decisions. In this study, we introduce a cardiovascular disease (CVD) policy model which can be used to model remaining life expectancy including a measure of socioeconomic deprivation as an independent risk factor for CVD. DESIGN: A state transition model was developed using the Scottish Heart Health Extended Cohort (SHHEC) linked to Scottish morbidity and death records. Individuals start in a CVD-free state and can transit to three CVD event states plus a non-CVD death state. Individuals who have a non-fatal first event are then followed up until death. Taking a competing risk approach, the cause-specific hazards of a first event are modelled using parametric survival analysis. Survival following a first non-fatal event is also modelled parametrically. We assessed discrimination, validation and calibration of our model. RESULTS: Our model achieved a good level of discrimination in each component (c-statistics for men (women)-non-fatal coronary heart disease (CHD): 0.70 (0.74), non-fatal cerebrovascular disease (CBVD): 0.73 (0.76), fatal CVD: 0.77 (0.80), fatal non-CVD: 0.74 (0.72), survival after non-fatal CHD: 0.68 (0.67) and survival after non-fatal CBVD: 0.65 (0.66)). In general, our model predictions were comparable with observed event rates for a Scottish randomised statin trial population which has an overlapping follow-up period with SHHEC. After applying a calibration factor, our predictions of life expectancy closely match those published in recent national life tables. CONCLUSIONS: Our model can be used to estimate the impact of primary prevention interventions on life expectancy and can assess the impact of interventions on inequalities.
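The c-statistics quoted above measure discrimination: the probability that the model assigns a higher risk score to a person who has the event than to one who does not. A brute-force pairwise version (illustrative only, O(n²) and impractical at cohort scale) makes the definition concrete:

```python
def c_statistic(scores, outcomes):
    """Concordance (c-statistic): over all (event, non-event) pairs, the
    fraction where the event got the higher score; score ties count half."""
    pairs = conc = 0.0
    for s1, o1 in zip(scores, outcomes):
        for s0, o0 in zip(scores, outcomes):
            if o1 == 1 and o0 == 0:
                pairs += 1
                conc += 1.0 if s1 > s0 else 0.5 if s1 == s0 else 0.0
    return conc / pairs

# Toy risk scores and binary outcomes
auc = c_statistic([0.9, 0.8, 0.8, 0.3], [1, 0, 1, 0])   # 3.5 / 4 = 0.875
```

A value of 0.5 means no discrimination; the paper's component values of 0.65-0.80 indicate moderate to good discrimination by this measure.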


Subject(s)
Cardiovascular Diseases/epidemiology , Life Expectancy , Models, Cardiovascular , Primary Prevention/standards , Cardiovascular Diseases/economics , Cardiovascular Diseases/prevention & control , Cost-Benefit Analysis , Female , Humans , Male , Middle Aged , Morbidity/trends , Risk Factors , Socioeconomic Factors , Survival Rate/trends , United Kingdom/epidemiology
11.
Child Care Health Dev; 40(3): 337-45, 2014 May.
Article in English | MEDLINE | ID: mdl-23731277

ABSTRACT

OBJECTIVE: To determine whether there is an association between being smacked by your main caregiver in the first two years and emotional and behavioural problems at age four. DESIGN: Secondary analysis of data from the Growing Up in Scotland Prospective Study (GUS). SETTING: Scotland, UK. PARTICIPANTS: GUS birth cohort children, whose main caregiver had no concerns about their behaviour at 22 months. EXPOSURE: Ever smacked by main caregiver in first 22 months, as measured by caregiver self-report at 22 months. MAIN OUTCOME: Emotional and behavioural problems as measured by parental assessment and the Strengths and Difficulties Questionnaire (SDQ) at 46 months. RESULTS: Preschool children exposed to main caregiver smacking in the first two years were twice as likely to have emotional and behavioural problems as measured by parental assessment [odds ratio (OR) 2.5, 95% confidence interval (CI) 1.9-3.2; absolute risk reduction (ARR) 17.8%, 95% CI 12.1-23.5] and SDQ (OR 2.5, 95% CI 1.7-3.7; ARR 7.5%, 95% CI 3.7-11.5), as children never smacked by their main caregiver. The association remained significant after adjusting for child age and sex, caregiver age, sex, ethnicity, educational attainment and mental health status, sibling number, structural family transitions and socioeconomic status (adj. OR 2.4, 95% CI 1.8-3.2 for parental assessment and adj. OR 2.2, 95% CI 1.4-3.5 for SDQ). CONCLUSIONS: Parental use of physical punishment in the first two years may be a modifiable risk factor for emotional and behavioural difficulties in preschool children.
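The odds ratios and absolute risk differences above come from 2×2 exposure-outcome comparisons. A small sketch with invented counts (not the GUS data), using the Woolf log-scale interval for the OR:

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio with a Woolf (log-scale) 95% CI for a 2x2 table:
    a/b = outcome present/absent among exposed, c/d = same among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

def risk_difference(a: int, n1: int, c: int, n0: int) -> float:
    """Absolute difference between risks a/n1 (exposed) and c/n0 (unexposed)."""
    return a / n1 - c / n0

# Invented counts, not the study's data
or_, lo, hi = odds_ratio(40, 60, 20, 80)   # (40*80)/(60*20) = 8/3
rd = risk_difference(40, 100, 20, 100)     # 0.40 - 0.20 = 0.20
```

Note the abstract labels its absolute measure "risk reduction" even though smacked children had the higher risk; the quantity is a plain risk difference as computed here.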


Subject(s)
Affective Symptoms/etiology , Child Abuse/psychology , Child Behavior Disorders/etiology , Parenting/psychology , Punishment/psychology , Adolescent , Adult , Affective Symptoms/epidemiology , Caregivers/psychology , Child Behavior Disorders/epidemiology , Child, Preschool , Female , Humans , Infant , Male , Middle Aged , Parents/psychology , Prospective Studies , Scotland/epidemiology , Young Adult
12.
Br J Surg; 99(5): 680-7, 2012 May.
Article in English | MEDLINE | ID: mdl-22318673

ABSTRACT

BACKGROUND: This study examined trends for all first hospital admissions for peripheral artery disease (PAD) in Scotland from 1991 to 2007 using the Scottish Morbidity Record. METHODS: First admissions to hospital for PAD were defined as an admission to hospital (inpatient and day-case) with a principal diagnosis of PAD, with no previous admission to hospital (principal or secondary diagnosis) for PAD in the previous 10 years. RESULTS: From 1991 to 2007, 41,593 individuals were admitted to hospital in Scotland for the first time for PAD. Some 23,016 (55.3 per cent) were men (mean(s.d.) age 65.7(11.7) years) and 18,577 were women (aged 70.4(12.8) years). For both sexes the population rate of first admissions to hospital for PAD declined over the study interval: from 66.7 per 100,000 in 1991-1993 to 39.7 per 100,000 in 2006-2007 among men, and from 43.5 to 29.1 per 100,000 respectively among women. After adjustment, the decline was estimated to be 42 per cent in men and 27 per cent in women (rate ratio for 2007 versus 1991: 0.58 (95 per cent confidence interval 0.55 to 0.62) in men and 0.73 (0.68 to 0.78) in women). The intervention rate fell from 80.8 to 74.4 per cent in men and from 77.9 to 64.9 per cent in women. The proportion of hospital admissions as an emergency or transfer increased, from 23.9 to 40.7 per cent among men and from 30.0 to 49.5 per cent among women. CONCLUSION: First hospital admission for PAD in Scotland declined steadily and substantially between 1991 and 2007, with an increase in the proportion that was unplanned.


Subject(s)
Hospitalization/trends , Peripheral Arterial Disease/epidemiology , Aged , Female , Humans , Male , Peripheral Arterial Disease/complications , Peripheral Arterial Disease/surgery , Scotland/epidemiology , Sex Distribution
13.
Heart; 95(23): 1920-4, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19713201

ABSTRACT

BACKGROUND: Aldosterone has a key role in the pathophysiology of heart failure. In around 50% of such patients, aldosterone "escapes" from inhibition by drugs that interrupt the renin-angiotensin axis; such patients have a worse clinical outcome. Insulin resistance is a risk factor in heart failure and cardiovascular disease. The relation between aldosterone status and insulin sensitivity was investigated in a cohort of heart failure patients. METHODS: 302 patients with New York Heart Association (NYHA) class II-IV heart failure on conventional therapy were randomised in the ALiskiren Observation of heart Failure Treatment study (ALOFT), designed to test the safety of a directly acting renin inhibitor. Plasma aldosterone and 24-hour urinary aldosterone excretion, as well as fasting insulin and homeostasis model assessment of insulin resistance (HOMA-IR) were measured. Subjects with aldosterone escape and high urinary aldosterone were identified according to previously accepted definitions. RESULTS: 20% of subjects demonstrated aldosterone escape and 34% had high urinary aldosterone levels. At baseline, there was a positive correlation between fasting insulin and plasma aldosterone (r = 0.22, p<0.01) and urinary aldosterone (r = 0.19, p<0.03). Aldosterone escape and high urinary aldosterone subjects both demonstrated higher levels of fasting insulin (p<0.008, p<0.03), HOMA-IR (p<0.06, p<0.03) and insulin-glucose ratios (p<0.006, p<0.06) when compared to low aldosterone counterparts. All associations remained significant when adjusted for potential confounders. CONCLUSIONS: This study demonstrates a novel direct relation between aldosterone status and insulin resistance in heart failure. This observation merits further study and may identify an additional mechanism that contributes to the adverse clinical outcome associated with aldosterone escape.
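HOMA-IR, used above, has a standard closed form: fasting glucose (mmol/L) times fasting insulin (µU/mL), divided by 22.5. A one-line sketch with example values (the values are illustrative, not from the study):

```python
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5.
    Higher values indicate greater insulin resistance."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5

# A normoglycaemic, normoinsulinaemic example: 4.5 mmol/L glucose, 5 uU/mL insulin
value = homa_ir(4.5, 5.0)   # 22.5 / 22.5 = 1.0
```

The 22.5 constant normalises the index so that a typical healthy fasting state scores about 1.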


Subject(s)
Aldosterone/metabolism , Heart Failure/metabolism , Insulin Resistance/physiology , Aged , Diabetes Mellitus/metabolism , Fasting/blood , Female , Heart Failure/drug therapy , Homeostasis , Humans , Insulin/blood , Male , Middle Aged , Renin/antagonists & inhibitors
14.
Br J Ophthalmol; 93(1): 13-7, 2009 Jan.
Article in English | MEDLINE | ID: mdl-19098042

ABSTRACT

OBJECTIVES: Following a 3.7-fold increase in the rate of cataract surgery in the UK between 1989 and 2004, concern has been raised as to whether this has been accompanied by an excessive decline in the threshold such that some operations are inappropriate. The objective was to measure the impact of surgery on a representative sample of patients so as to determine whether or not overutilisation of surgery is occurring. DESIGN: Prospective cohort assessed before and 3 months after surgery. SETTING: Ten providers (four NHS hospitals, three NHS treatment centres, three independent sector treatment centres) from across England. PARTICIPANTS: 861 patients undergoing first eye (569) or second eye (292) cataract surgery provided preoperative data of whom 745 (87%) completed postoperative questionnaires. MAIN OUTCOME MEASURES: Patient-reported visual function (VF-14); general health status and quality of life (EQ5D); postoperative complications; overall view of the operation and its impact. RESULTS: Overall, visual function improved (mean VF-14 score increased from 83.2 (SD 17.3) to 93.7 (SD 13.2)). Self-reported general health status deteriorated (20.3% fair or poor before surgery compared with 25% afterwards) which was reflected in the mean EQ5D score (0.82 vs 0.79; p = 0.003). At least one complication was reported by 66 (8.9%) patients, though this probably overestimated the true incidence. If the appropriateness of surgery is based on an increase in VF-14 score of 5.5 (that corresponds to patients' reporting being "a little better"), 30% of operations would be deemed inappropriate. If an increase of 12.2 (patients' reports of being "much better") is adopted, the proportion inappropriate is 49%. Using a different approach to determining a minimally important difference, the proportion inappropriate would be closer to 20%. 
Although visual function (VF-14) scores were unchanged or deteriorated in 25% of patients, 93.1% rated the results of the operation as "good," "very good" or "excellent," and 93.5% felt their eye problem was "better." This partly reflects inadequacies in the validity of the VF-14. CONCLUSIONS: Improvement in the provision of cataract surgery has been accompanied by a reduction in the visual function threshold. However, methodological difficulties in measuring the impact of cataract surgery on visual function and quality of life mean it is impossible to determine whether or not overutilisation of cataract surgery is occurring.


Subject(s)
Cataract Extraction/statistics & numerical data , Health Services Misuse/statistics & numerical data , Aged , England/epidemiology , Epidemiologic Methods , Female , Humans , Male , Patient Satisfaction , Socioeconomic Factors , Treatment Outcome , Vision Tests , Visual Acuity/physiology
15.
Int Endod J; 41(1): 6-31, 2008 Jan.
Article in English | MEDLINE | ID: mdl-17931388

ABSTRACT

AIMS: (i) To carry out meta-analyses to quantify the influence of the clinical factors on the efficacy of primary root canal treatment and (ii) to identify the best treatment protocol based on the current evidence. METHODOLOGY: The evidence for the effect of each clinical factor on the success rate (SR) of primary root canal treatment was gathered in three different ways: (i) intuitive synthesis of reported findings from individual studies; (ii) weighted pooled SR by each factor under investigation was estimated using random-effect meta-analysis; (iii) weighted effect of the factor under investigation on SR were estimated and expressed as odds ratio for the dichotomous outcomes (success or failure) using fixed- and random-effects meta-analysis. Statistical heterogeneity amongst the studies was assessed by Cochran's (Q) test. Potential sources of statistical heterogeneity were investigated by exploring clinical heterogeneity using meta-regression models which included study characteristics in the regression models. RESULTS: Out of the clinical factors investigated, pre-operative pulpal and periapical status were most frequently investigated, whilst the intra-operative factors were poorly studied in the 63 studies. Four factors were found to have a significant effect on the primary root canal treatment outcome, although the data heterogeneity was substantial, some of which could be explained by some of the study characteristics. CONCLUSIONS: Four conditions (pre-operative absence of periapical radiolucency, root filling with no voids, root filling extending to 2 mm within the radiographic apex and satisfactory coronal restoration) were found to improve the outcome of primary root canal treatment significantly. 
Root canal treatment should therefore aim at achieving and maintaining access to apical anatomy during chemo-mechanical debridement, obturating the canal with densely compacted material to the apical terminus without extrusion into the apical tissues and preventing re-infection with a good quality coronal restoration.


Subject(s)
Root Canal Therapy , Clinical Protocols , Humans , Meta-Analysis as Topic , Root Canal Obturation , Root Canal Preparation , Treatment Outcome
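The random-effects pooling this review describes can be sketched with the standard DerSimonian-Laird estimator: compute a fixed-effect average of the study log odds ratios, use Cochran's Q to estimate the between-study variance tau², then re-weight. This is a generic illustration, not the authors' code, and the study inputs below are hypothetical.

```python
import math

def pool_log_odds_ratios(effects, variances):
    """DerSimonian-Laird random-effects pooling of study-level log odds ratios.

    effects   -- per-study log odds ratios
    variances -- per-study variances of those log odds ratios
    Returns (pooled_log_or, tau_squared, cochran_q).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                   # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    # Between-study variance (method-of-moments estimate), floored at zero.
    denom = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)
    w_star = [1.0 / (v + tau2) for v in variances]     # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2, q

# Two hypothetical studies with conflicting effects: Q exceeds its
# expectation (k - 1 = 1), so tau^2 > 0 and the pooled estimate sits
# between them with widened uncertainty.
pooled, tau2, q = pool_log_odds_ratios([0.0, 1.0], [0.1, 0.1])
print(round(pooled, 3), round(tau2, 3), round(q, 3))  # → 0.5 0.4 5.0
```

When tau² is zero (no excess heterogeneity) the random-effects weights collapse to the fixed-effect weights, which is why the two models can agree for homogeneous data.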
16.
Int Endod J ; 40(12): 921-39, 2007 Dec.
Article in English | MEDLINE | ID: mdl-17931389

ABSTRACT

AIMS: The aims of this study were (i) to conduct a comprehensive systematic review of the literature on the outcome of primary (initial or first-time) root canal treatment; and (ii) to investigate the influence of study characteristics on the estimated pooled success rates. METHODOLOGY: Longitudinal clinical studies investigating the outcome of primary root canal treatment, published up to the end of 2002, were identified electronically (MEDLINE and Cochrane database, 1966-December 2002, week 4). Four journals (International Endodontic Journal, Journal of Endodontics, Oral Surgery Oral Medicine Oral Pathology Endodontics Radiology and Dental Traumatology & Endodontics), the bibliographies of all relevant papers and review articles were hand-searched. Three reviewers (Y-LN, SR and KG) independently assessed and selected the studies against specified inclusion criteria, and extracted the data onto a pre-designed proforma. The inclusion criteria were: longitudinal clinical studies investigating root canal treatment outcome; only primary root canal treatment carried out on the teeth studied; sample size given; at least 6 months' postoperative review; success based on clinical and/or radiographic criteria (strict: absence of apical radiolucency; loose: reduction in the size of radiolucency); and overall success rate given or calculable from the raw data. The findings of individual studies were summarized, and pooled success rates by each potential influencing factor were calculated for this part of the study. RESULTS: Of the 119 articles identified, 63 studies published between 1922 and 2002 fulfilled the inclusion criteria and were selected for the review: six were randomized trials, seven were cohort studies and 48 were retrospective studies. The reported mean success rates ranged from 31% to 96% based on strict criteria, or from 60% to 100% based on loose criteria, with substantial heterogeneity in the estimates of pooled success rates. 
Apart from the radiographic criteria of success, none of the other study characteristics could explain this heterogeneity. Twenty-four factors (patient and operative) had been investigated in various combinations in the studies reviewed. The influence of the preoperative pulpal and periapical status of the teeth on treatment outcome was most frequently explored, but the influence of treatment technique was poorly investigated. CONCLUSIONS: The estimated weighted pooled success rates of treatments completed at least 1 year prior to review ranged between 68% and 85% when strict criteria were used. The reported success rates had not improved over the last four (or five) decades. The quality of evidence on treatment factors affecting the outcome of primary root canal treatment is suboptimal, with substantial variation in study designs. It would be desirable to standardize aspects of study design, data recording and the presentation format of outcome data in much-needed future outcome studies.


Subject(s)
Root Canal Therapy , Humans , Probability , Research Design , Treatment Outcome
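The simplest version of the pooled success rate this review reports can be illustrated by combining raw counts across studies with a normal-approximation confidence interval. The review's actual weighting scheme is more elaborate; this is a minimal sketch with hypothetical study counts.

```python
import math

def pooled_success_rate(studies, z=1.96):
    """Pool per-study success counts into a single rate with a
    normal-approximation 95% CI.

    studies -- list of (successes, sample_size) pairs.
    Returns (pooled_rate, ci_low, ci_high).
    """
    total_s = sum(s for s, n in studies)
    total_n = sum(n for s, n in studies)
    p = total_s / total_n
    se = math.sqrt(p * (1 - p) / total_n)  # binomial standard error
    return p, p - z * se, p + z * se

# Two hypothetical studies: 80/100 and 60/100 successes.
p, lo, hi = pooled_success_rate([(80, 100), (60, 100)])
print(f"{p:.3f} ({lo:.3f}-{hi:.3f})")  # → 0.700 (0.636-0.764)
```

Note that pooling raw counts weights each study by its sample size, which is one reason the strict and loose radiographic criteria produce such different pooled ranges: the definition of a "success" changes the counts before any weighting happens.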
17.
J Bone Joint Surg Br ; 89(7): 893-900, 2007 Jul.
Article in English | MEDLINE | ID: mdl-17673581

ABSTRACT

A postal questionnaire was sent to 10,000 patients more than one year after their total knee replacement (TKR). They were assessed using the Oxford knee score and were asked whether they were satisfied, unsure or unsatisfied with their TKR. The response rate was 87.4% (8231 of 9417 eligible questionnaires) and a total of 81.8% (6625 of 8095) of patients were satisfied. Multivariable regression modelling showed that patients with higher scores relating to the pain and function elements of the Oxford knee score had a lower level of satisfaction (p < 0.001), and that ongoing pain was a stronger predictor of this. Female gender and a primary diagnosis of osteoarthritis were found to be predictors of lower levels of patient satisfaction. Differences in the rate of satisfaction were also observed in relation to age, the American Society of Anesthesiologists grade and the type of prosthesis. This study has provided data on the Oxford knee score and the expected levels of satisfaction at one year after TKR. The results should act as a benchmark of practice in the United Kingdom and provide a baseline for peer comparison between institutions.


Subject(s)
Arthroplasty, Replacement, Knee/standards , Osteoarthritis, Knee/surgery , Pain Measurement , Patient Satisfaction , Adult , Aged , Aged, 80 and over , Arthroplasty, Replacement, Knee/statistics & numerical data , Female , Humans , Male , Middle Aged , Osteoarthritis, Knee/physiopathology , Quality of Life , Surveys and Questionnaires
18.
Gut ; 56(11): 1606-13, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17356039

ABSTRACT

BACKGROUND AND OBJECTIVE: Surgical mortality in the US is widely perceived to be superior to that in the UK. However, previous comparisons of surgical outcome in the two countries have often failed to take sufficient account of case-mix or examine long-term outcome. The standardised nature of liver transplantation practice makes it uniquely placed for undertaking reliable international comparisons of surgical outcome. The objective of this study is to undertake a risk-adjusted disease-specific comparison of both short- and long-term survival of liver transplant recipients in the UK and Ireland with that in the US. METHODS: A multicentre cohort study using two high quality national databases including all adults who underwent a first single organ liver transplant in the UK and Ireland (n = 5925) and the US (n = 41,866) between March 1994 and March 2005. The main outcome measures were post-transplant mortality during the first 90 days, 90 days to 1 year and beyond the first year, adjusted for recipient and donor characteristics. RESULTS: Risk-adjusted mortality in the UK and Ireland was generally higher than in the US during the first 90 days (HR 1.17; 95% CI 1.07 to 1.29), both for patients transplanted for acute liver failure (HR 1.27; 95% CI 1.01 to 1.60) and those transplanted for chronic liver disease (HR 1.18; 95% CI 1.07 to 1.31). Between 90 days and 1 year post-transplantation, no statistically significant differences in overall risk-adjusted mortality were noted between the two cohorts. Survivors of the first post-transplant year in the UK and Ireland had lower overall risk-adjusted mortality than those transplanted in the US (HR 0.88; 95% CI 0.81 to 0.96). This difference was observed among patients transplanted for chronic liver disease (HR 0.88; 95% CI 0.81 to 0.96), but not those transplanted for acute liver failure (HR 1.02; 95% CI 0.70 to 1.50). 
CONCLUSIONS: Whilst risk-adjusted mortality is higher in the UK and Ireland during the first 90 days following liver transplantation, it is higher in the US among those liver transplant recipients who survived the first post-transplant year. Our results are consistent with the notion that the US has superior acute perioperative care whereas the UK appears to provide better quality chronic care following liver transplantation surgery.


Subject(s)
Liver Diseases/surgery , Liver Transplantation/mortality , Postoperative Complications/mortality , Adult , Cohort Studies , Female , Hospital Mortality , Humans , Ireland/epidemiology , Liver Diseases/mortality , Male , Middle Aged , Survival Analysis , Tissue Donors , Tissue and Organ Procurement/standards , Treatment Outcome , United Kingdom/epidemiology , United States/epidemiology
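The study's period-specific hazard ratios come from Cox models adjusted for recipient and donor case-mix. As an unadjusted, back-of-envelope analogue, a crude incidence rate ratio within a follow-up window can be computed from event counts and person-time; the counts below are invented for illustration and do not reproduce the study's figures.

```python
import math

def rate_ratio(events_a, persontime_a, events_b, persontime_b, z=1.96):
    """Crude incidence rate ratio (group A vs group B) with a 95% CI
    computed on the log scale -- a rough, unadjusted analogue of a
    period-specific hazard ratio."""
    rr = (events_a / persontime_a) / (events_b / persontime_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of log rate ratio
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 50 deaths over 1000 person-years vs 40 over 1000.
rr, lo, hi = rate_ratio(50, 1000, 40, 1000)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → RR 1.25 (95% CI 0.82-1.89)
```

A confidence interval spanning 1.0, as here, corresponds to the study's "no statistically significant difference" finding for the 90-day-to-1-year window.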
19.
Int J Paediatr Dent ; 16(4): 257-62, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16759323

ABSTRACT

OBJECTIVE: The aim of this study was to estimate the prevalence and severity of dental caries in the primary dentition of young children in Ajman, UAE, and to investigate its association with sociodemographic characteristics and use of dental services. METHODS: A cluster-sampling approach was used to randomly select children aged 5 or 6 years enrolled in public or private schools. Clinical examinations for caries were conducted by a single examiner using World Health Organization criteria. Parents completed questionnaires on socioeconomic background and dental service utilization. Zero-inflated negative binomial (ZINB) regression modelling was used to identify risk markers and risk indicators for caries experience. RESULTS: The prevalence of dental caries in the sample was high (76.1%), and the average dmfs score was 10.2. Caries severity was greater among older children and among male children of less-educated mothers. Emirati (local) children had higher caries severity than others. Children with higher levels of caries visited the dentist more frequently than children whose visits were for check-ups only. CONCLUSIONS: Dental caries prevalence and severity in young children in Ajman are high, and socioeconomic characteristics and dental utilization are important determinants of their caries experience. There is an urgent need for oral health programmes targeting both the treatment and the underlying causes of dental caries in these children.


Subject(s)
DMF Index , Dental Care for Children/statistics & numerical data , Dental Caries/epidemiology , Age Factors , Arabs/statistics & numerical data , Child , Child, Preschool , Educational Status , Ethnicity/statistics & numerical data , Female , Humans , Income , Male , Mothers/education , Poverty , Prevalence , Risk Factors , Sex Factors , Socioeconomic Factors , Tooth, Deciduous/pathology , United Arab Emirates/epidemiology
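The ZINB model used here suits caries counts because many children have a dmfs of zero: the model mixes a point mass of "structural" zeros with an ordinary negative binomial count distribution. A minimal sketch of the probability mass function follows; the mixing proportion, mean and dispersion values are hypothetical, not fitted to the study's data.

```python
import math

def nb_pmf(y, mu, r):
    """Negative binomial pmf with mean mu and size (inverse dispersion) r."""
    log_p = (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
             + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))
    return math.exp(log_p)

def zinb_pmf(y, pi, mu, r):
    """Zero-inflated negative binomial: probability pi of a structural zero,
    otherwise an ordinary NB(mu, r) count."""
    if y == 0:
        return pi + (1 - pi) * nb_pmf(0, mu, r)
    return (1 - pi) * nb_pmf(y, mu, r)

# Hypothetical parameters: 30% structurally caries-free, mean dmfs 10.
probs = [zinb_pmf(y, 0.3, 10.0, 1.5) for y in range(200)]
print(round(sum(probs), 6))                                  # mass sums to ~1
print(zinb_pmf(0, 0.3, 10.0, 1.5) > nb_pmf(0, 10.0, 1.5))    # excess zeros
```

The zero-inflation term is what lets the model fit a sample where the proportion of zero counts exceeds anything a plain negative binomial could produce.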
20.
J Bone Joint Surg Br ; 88(6): 716-20, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16720761

ABSTRACT

New brands of joint prosthesis are released for general implantation with limited evidence of their long-term performance in patients. The CUSUM continuous monitoring method is a statistical testing procedure which could be used to provide prospective evaluation of brands as soon as implantation in patients begins and give early warning of poor performance. We describe the CUSUM and illustrate the potential value of this monitoring tool by applying it retrospectively to the 3M Capital Hip experience. The results show that if the clinical data and methodology had been available, the CUSUM would have given an alert to the underperformance of this prosthesis almost four years before the issue of a Hazard Notice by the Medical Devices Agency. This indicates that the CUSUM can be a valuable tool in monitoring joint prostheses, subject to timely and complete collection of data. Regional or national joint registries provide an opportunity for future centralised, continuous monitoring of all hip and knee prostheses using these techniques.


Subject(s)
Hip Prosthesis , Product Surveillance, Postmarketing/methods , Prosthesis Failure , Adult , Aged , Aged, 80 and over , Arthroplasty, Replacement, Hip/instrumentation , Female , Humans , Male , Middle Aged , Prosthesis Design , Retrospective Studies , Risk Assessment/methods , Time Factors
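The CUSUM procedure the paper applies accumulates a log-likelihood-ratio score over consecutive outcomes and signals when the running sum crosses a decision threshold. The sketch below is the unadjusted binomial form, with invented failure probabilities and threshold; registry applications such as the one described would normally use a risk-adjusted variant.

```python
import math

def cusum_alert(outcomes, p0, p1, h):
    """Log-likelihood-ratio CUSUM for a binary outcome (1 = failure).

    p0 -- acceptable failure probability
    p1 -- unacceptable failure probability to detect
    h  -- decision threshold
    Returns the index of the first alert, or None if none occurs.
    """
    w_fail = math.log(p1 / p0)               # score added on each failure
    w_ok = math.log((1 - p1) / (1 - p0))     # (negative) score on a success
    s = 0.0
    for i, failed in enumerate(outcomes):
        s = max(0.0, s + (w_fail if failed else w_ok))  # resets at zero
        if s > h:
            return i
    return None

# Hypothetical settings: 5% acceptable vs 10% unacceptable revision rate.
print(cusum_alert([0] * 20, 0.05, 0.10, 2.0))   # → None (no signal)
print(cusum_alert([1, 1, 1], 0.05, 0.10, 2.0))  # → 2 (alert on 3rd failure)
```

Because the sum resets at zero on good performance, the chart stays sensitive to a sustained run of failures rather than being diluted by a long earlier history, which is what allows the early warning the authors describe.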