1.
Eur Rev Med Pharmacol Sci ; 27(20): 9530-9539, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37916320

ABSTRACT

OBJECTIVE: The aim of this study was to compare the effectiveness of high- vs. low-intensity exercise training on muscle strength, physical function, and quality of life in post-COVID-19 patients with sarcopenia secondary to chronic kidney disease (CKD). PATIENTS AND METHODS: 82 subjects were randomized into 2 groups: a high-intensity group (HIG, n=42), which received high-intensity resistance training, and a low-intensity group (LIG, n=40), which performed low-intensity aerobic activities. Exercises were performed for 30 min, 3 times per week, for 6 weeks. A handheld dynamometer, pinch press, and the 1-min sit-to-stand (STS) test were used to assess muscle strength. The Modified Physical Performance Test (MPPT) and the Sarcopenia Quality of Life questionnaire (SAR-QoL) were used to assess function and quality of life, respectively. Measures were collected before and at the end of the treatment program. RESULTS: Participants were similar at baseline. Within-group comparisons demonstrated statistically significant improvement in both the HIG and the LIG in all outcome measures (p<0.001). Between-group comparisons revealed statistically significant differences with large effect sizes in the MPPT (p<0.001, d=1.28), handgrip (p<0.001, d=3.6), STS (p<0.001, d=2.38), and SAR-QoL (p<0.001, d=3.24), all in favor of the HIG. In contrast, pinch press strength improved comparably in both groups (p=0.09, d=0.36). CONCLUSIONS: High-intensity exercise is superior to low-intensity exercise with respect to muscle strength, physical function, and quality of life in post-COVID-19 patients with sarcopenia secondary to chronic kidney disease.
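The between-group results above are reported as Cohen's d effect sizes. As a sketch of how such a d is computed (the pooled-SD formula below is one common variant; the abstract does not state which was used, and all numeric values are illustrative, not the study's data):

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d for two independent groups using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical post-treatment handgrip values (kg), using only the study's
# group sizes (HIG n=42, LIG n=40); the means and SDs are invented.
d = cohens_d(mean1=28.0, mean2=22.0, sd1=3.0, sd2=3.2, n1=42, n2=40)
```

By the usual convention, d around 0.2 is a small effect, 0.5 medium, and 0.8 or more large, which is why the reported values above 1 count as large effects.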


Subject(s)
COVID-19 , Renal Insufficiency, Chronic , Resistance Training , Sarcopenia , Humans , Sarcopenia/therapy , Quality of Life , Hand Strength , COVID-19/complications , COVID-19/therapy , Muscle Strength/physiology , Exercise/physiology
2.
Int J Technol Assess Health Care ; 39(1): e3, 2023 Jan 09.
Article in English | MEDLINE | ID: mdl-36621844

ABSTRACT

The launch of innovative technologies has been credited with significant improvements in health indicators, but it comes at high financial cost, and the value of certain innovations may not be well documented. Health technology assessment (HTA) is a universally established process for assessing the incremental value of innovations. Despite its acknowledged value, almost one-third of countries around the globe have not yet established a formal HTA process in their health systems. The UAE is one of the pioneering countries worldwide in adopting innovative health technologies, which underscores the importance of exploring the key elements in the UAE's journey toward establishing HTA. Our study aims to provide academic insight that can support the ongoing endeavors to establish HTA in the UAE. This case study was guided by an analytical framework. Data were collected through document review and semistructured interviews, then analyzed using the codebook thematic analysis technique. The findings outline multiple facilitators of and challenges to the prospective process and show a multidimensional interlinkage between all identified elements. Markedly, leveraging the role of specialized academia and building genuine HTA knowledge are the areas that need the most attention. The originality of this research lies in analyzing the three health policy pillars (context, actors, and content) in a prospective HTA establishment process. The main practical implications of this study are to support global health organizations, HTA policy entrepreneurs, and academics in improving their strategies and designing more effective HTA policy learning programs.


Subject(s)
Health Policy , Technology Assessment, Biomedical , United Arab Emirates , Prospective Studies
3.
Pol J Vet Sci ; 25(3): 365-368, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36155597

ABSTRACT

Equine infectious anaemia (EIA), caused by the equine infectious anaemia virus (EIAV), is one of the most serious equine diseases worldwide, and information on its epizootiology in Saudi Arabia is scarce. Given the importance of the equine industry in Saudi Arabia, this cross-sectional study aims to provide information about the prevalence of EIAV based on serological surveillance of the equine population in the country. A total of 4,728 sera samples (4,523 horses and 205 donkeys) were collected between December 2017 and November 2019. All samples were tested using a commercially available EIAV ELISA, and all were negative for EIAV antibodies (95% confidence level). The results provide evidence that Saudi Arabia's equine populations (horses and donkeys) are currently free of EIAV, and suggest the need for continuous monitoring of EIAV and strict regulation when importing horses from other countries.
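A survey in which every sample tests negative still only supports an upper confidence bound on the true prevalence, not a prevalence of exactly zero. A minimal sketch of the standard exact calculation for zero positives out of n samples (the abstract does not state the authors' exact method; the well-known "rule of three" approximation 3/n gives nearly the same answer):

```python
def zero_positive_upper_bound(n, conf=0.95):
    """Exact one-sided upper confidence bound on prevalence when
    0 of n independent samples test positive: 1 - (1 - conf)**(1/n)."""
    return 1 - (1 - conf) ** (1 / n)

# 4,728 sera tested in the study, all negative
upper = zero_positive_upper_bound(4728)   # roughly 0.00063, i.e. below 0.07%
```

So even with zero positives, the data are compatible with a true prevalence of up to about 0.06-0.07%, which is why continued surveillance is still warranted.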


Subject(s)
Equine Infectious Anemia , Horse Diseases , Infectious Anemia Virus, Equine , Animals , Cross-Sectional Studies , Equidae , Equine Infectious Anemia/epidemiology , Horse Diseases/epidemiology , Horses , Saudi Arabia/epidemiology
4.
Health Policy ; 124(12): 1297-1309, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32962876

ABSTRACT

In this paper we outline and compare pharmaceutical pricing policies for in-patent prescription pharmaceuticals, with emphasis on external reference pricing (ERP), in eleven countries across the Middle East and North Africa (MENA) region and explore possible improvements in their pricing systems. Primary and secondary evidence was used to inform our analysis. Comparative analysis of ERP systems across countries followed an analytical framework distilling ERP into twelve salient features, while ERP system performance was benchmarked against a framework of best practice principles covering (a) objectives and scope, (b) administration and operations, (c) methods used, and (d) implementation. Results suggest that ERP is the dominant pricing method for in-patent pharmaceuticals. Although several good practice cases were identified, none of the eleven countries satisfies all best practice principles. ERP basket sizes vary significantly and are commonly composed using geographical proximity and low-price countries as criteria. Nine countries do not use mean or median prices, but resort to using the lowest. Exchange rate fluctuations are routinely used to arrive at price reductions in local currency. Significant opportunities exist for MENA countries to develop their ERP regimes to achieve greater compliance with best practice principles. In the short term, incremental changes to several salient ERP features could be implemented relatively easily, enhancing the functionality and performance of national ERP systems. Countries in the region can also focus on developing explicit value assessment systems and minimize their dependence on ERP over the longer term.


Subject(s)
Economics, Pharmaceutical , Pharmaceutical Preparations , Africa, Northern , Costs and Cost Analysis , Drug Costs , Humans , Middle East
5.
Expert Rev Pharmacoecon Outcomes Res ; 19(3): 245-250, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30626231

ABSTRACT

INTRODUCTION: Real-world evidence (RWE) is increasingly being used in coverage, reimbursement, and formulary decisions for medicines globally. Areas covered: The Middle East (ME) region is significantly behind in generating and using RWE in health policy decisions due to several factors that shaped the health sector over the past few decades. The trend, however, is changing due to several factors that are shaping the future of the healthcare industry in the region, among them rising healthcare costs, changing population and disease demographics, an increased focus on the quality of healthcare, digitization of medical data, increased demand for local clinical and economic data, and the overall greater influence of global trends in the healthcare industry. For the region to realize the benefit of RWE in healthcare decisions, it needs to overcome several challenges, including embracing the value that RWE brings to healthcare decisions, building trust between stakeholders, establishing the reliability and validity of databases used to generate RWE, enhancing technical capabilities, investing in local data generation, and conducting high-quality RWE studies while maintaining patient confidentiality. Expert commentary: We believe that the next decade will witness a significant increase in RWE generation in the region and that RWE will play a key role in driving efficiency in healthcare delivery.


Subject(s)
Decision Making , Delivery of Health Care/organization & administration , Health Policy/trends , Databases, Factual/standards , Delivery of Health Care/trends , Evidence-Based Medicine , Health Care Costs/trends , Health Care Sector/organization & administration , Humans , Middle East , Reproducibility of Results
6.
Rev Sci Tech ; 37(3): 985-997, 2018 12.
Article in English | MEDLINE | ID: mdl-30964454

ABSTRACT

A cross-sectional study was conducted in five regions in Saudi Arabia to investigate the epidemiology of Middle East respiratory syndrome coronavirus (MERS-CoV) infection in dromedary camels (Camelus dromedarius) during April and May 2015. Serum and nasal swab samples were tested for MERS-CoV antibodies and ribonucleic acid (RNA) using a recombinant enzyme-linked immunosorbent assay (rELISA) and real-time reverse-transcription polymerase chain reaction (rRT-PCR), respectively. The overall MERS-CoV antibody seroprevalence was 80.5%, whereas the overall viral RNA prevalence was 2.4%. The associations of risk factors with each prevalence were quantified using univariate and multivariate analyses. The multivariate models identified region, age, grazing system, exposure to wild animals and dung removal as factors significantly associated with seroprevalence (p ≤ 0.05). A higher seroprevalence was more likely to occur in camels from the Riyadh, Eastern, Northern and Makkah regions than those from the Jazan region; camels ≥ 4 years and 1-3 years of age (marginally significant) than calves < 1 year; and camels raised in zero grazing and semi-open grazing systems than those raised in an open grazing system. However, the presence of wild animals and daily dung removal were negatively associated with seroprevalence. On the other hand, region and sex were significantly associated with MERS-CoV RNA prevalence (p ≤ 0.05). A higher viral RNA prevalence was more likely to occur in camels from the Riyadh region and Eastern region (marginally significant) than in those from the Makkah region, and in male camels than female camels. In conclusion, the risk factors identified in this study can be considered to be predictors of MERS-CoV infection in camels and should be taken into account when developing an efficient and cost-effective control strategy.


Subject(s)
Coronavirus Infections , Middle East Respiratory Syndrome Coronavirus , Animals , Camelus , Coronavirus Infections/veterinary , Cross-Sectional Studies , Female , Male , Saudi Arabia , Seroepidemiologic Studies
7.
Transbound Emerg Dis ; 65(2): e494-e500, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29150916

ABSTRACT

Lumpy skin disease (LSD) is a highly infectious disease of cattle caused by a virus of the genus Capripoxvirus in the family Poxviridae, and is a major concern for the dairy industry in Saudi Arabia. In this study, an outbreak of LSD in cattle herds detected in Saudi Arabia in 2016 was investigated in five regions (Al-Hassa, Al-Sharqia, Al-Qassim, Riyadh and Al-Taif) from April to July 2016. Tissues from skin nodules were collected to characterize the virus by real-time polymerase chain reaction (rt-PCR). During this period, 64,109 cattle were examined; the morbidity, mortality and case fatality rates were 6%, 0.99% and 16.6%, respectively, based on 3,852 infected cases and 641 deaths. The highest number of infected animals was reported in Al-Hassa (2,825), followed by Al-Qassim (547), Riyadh (471), Al-Sharqia (6) and Al-Taif (3). The highest morbidity rates were observed in Al-Qassim (6.8%), Al-Hassa (6.2%), Riyadh (5.5%) and Al-Taif (0.96%), while the lowest was recorded in Al-Sharqia (0.27%). The highest mortality rate was observed in Al-Qassim (2.3%), followed by Al-Hassa (0.97%) and Riyadh (0.19%), with the lowest in Al-Sharqia and Al-Taif (0%). LSD virus was detected in all samples (n = 191) by real-time PCR analysis. The disease was observed in cattle regardless of previous vaccination with the locally used Romanian pox vaccine; the vaccination programme and vaccine efficacy should therefore be assessed under field conditions.
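The three reported rates follow directly from the counts given in the abstract; a quick arithmetic check (the slight difference from the reported 0.99% mortality is a rounding choice):

```python
# Counts reported in the abstract
examined = 64109   # cattle examined
cases = 3852       # infected cases
deaths = 641       # deaths

morbidity = cases / examined * 100      # about 6.0%  (reported: 6%)
mortality = deaths / examined * 100     # about 1.0%  (reported: 0.99%)
case_fatality = deaths / cases * 100    # about 16.6% (reported: 16.6%)
```

Note the different denominators: morbidity and mortality are rates per animal examined, while the case fatality rate is deaths per infected case.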


Subject(s)
Disease Outbreaks/veterinary , Lumpy Skin Disease/diagnosis , Lumpy Skin Disease/epidemiology , Lumpy skin disease virus/isolation & purification , Real-Time Polymerase Chain Reaction/veterinary , Animals , Cattle , Female , Lumpy Skin Disease/virology , Lumpy skin disease virus/genetics , Saudi Arabia/epidemiology , Vaccination/veterinary
10.
J Manag Care Spec Pharm ; 21(9): 742-52, 752a-752e, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26308222

ABSTRACT

BACKGROUND: Major depressive disorder is one of the most common and disabling mental health disorders and is associated with substantial costs in terms of direct health care utilization and workplace productivity. Cognitive dysfunction, which alone substantially increases health care costs, is commonly associated with major depressive disorder. However, the health care costs of cognitive dysfunction in the context of depressive disorder are unknown. Recovery from mood symptoms is not always associated with resolution of cognitive dysfunction. Thus, cognitive dysfunction may contribute to health care burden even with successful antidepressant therapy. OBJECTIVE: To compare health care utilization and costs for patients with a depressive disorder with and without cognitive dysfunction, at 3 and 6 months after initiation of antidepressant medication. METHODS: This was an observational study, combining a cross-sectional patient survey, administered during a telephone interview, with health care claims data from a large, geographically diverse U.S. health plan. Included patients had at least 1 pharmacy claim for an antidepressant medication between August 1 and September 30, 2012, and no claim for any antidepressant during the 6 months prior to the index date. In addition to other criteria assessed in the claims data, patients confirmed a diagnosis of depression or major depressive disorder and the absence of any exclusionary neurological diagnoses possibly associated with cognitive impairment. Eligible patients were administered validated cognitive function assessments of verbal episodic memory (Hopkins Verbal Learning Test-Revised, Delayed and Total); attention (Digit Span Forward Maximum Sequence Length); working memory (Digit Span Backward Maximum Sequence Length); and executive function (D-KEFS Letter Fluency Test). Based on comparison of scores with normative data, patients were assigned to cognitive dysfunction or cognitively normal cohorts.
All-cause (all diagnoses) and depressive disorder-related health care utilization and costs (all from a payer perspective) were assessed 6 months prior (baseline) to antidepressant initiation and 3 months and 6 months after (follow-up) initiation of antidepressant medication. Health care utilization and costs included ambulatory (office and hospital outpatient), emergency room, inpatient hospital, pharmacy, other medical (e.g., laboratory and diagnostics), and total (all categories combined). All-cause and depressive disorder-related total costs during the 3- and 6-month follow-up periods were modeled with generalized linear modeling with gamma distribution and log link, while adjusting for potential confounders (age, race, gender, education, employment, and comorbidities). RESULTS: Of the 13,537 patients who were mailed an invitation, 824 (6%) were eligible and agreed to participate. Of these, 563 patients provided informed consent, completed the interview, maintained eligibility, and were included in the 3-month calculations. Among these, 255 (45%) were classified as having cognitive dysfunction. Mean patient age was 41.3 (± 12.5) years; 80% were female. Most patients were white and employed. More patients in the cognitively normal cohort were white (P < 0.001) and employed full time (P = 0.029), had higher educational attainment (P < 0.001), and had fewer comorbidities (P = 0.007) than those in the cognitive dysfunction cohort. Over the first 3 months, patients with cognitive dysfunction had higher adjusted all-cause costs ($3,309 vs. $2,157, P = 0.002) and higher adjusted depressive disorder-related costs ($718 vs. $406, P < 0.001) than patients without cognitive dysfunction. At 6 months, data from 4 patients were removed from the analysis because of exclusionary diagnoses. Over 6 months, patients with cognitive dysfunction had higher adjusted all-cause costs ($4,793) than patients without cognitive dysfunction ($3,683, P = 0.034).
Over 6 months, depressive disorder-related costs did not significantly differ between patients with ($771) and without cognitive dysfunction ($594, P = 0.071). The main drivers of all-cause costs were office visits, outpatient hospital visits, and inpatient costs, and the main driver of depressive disorder-related costs was inpatient costs. CONCLUSIONS: Cognitive dysfunction was associated with higher adjusted all-cause and depressive disorder-related costs 3 months after initiation of an antidepressant medication. This difference persisted for all-cause costs through 6 months. Identification and treatment of cognitive dysfunction in patients with depressive disorder might reduce health care costs.
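The cost models described above are generalized linear models with a gamma distribution and log link, under which exponentiated coefficients are multiplicative effects on mean cost. A minimal sketch of that interpretation for the unadjusted, single-binary-covariate case, where the maximum-likelihood fit reduces to the group means (the cost values below are hypothetical, not the study's patient-level data):

```python
import math

# Hypothetical 6-month costs (USD); NOT the study's data.
costs_dysfunction = [3200.0, 4100.0, 2500.0, 3600.0]   # cognitive dysfunction cohort
costs_normal = [2000.0, 2400.0, 1900.0, 2300.0]        # cognitively normal cohort

mean_d = sum(costs_dysfunction) / len(costs_dysfunction)
mean_n = sum(costs_normal) / len(costs_normal)

# In a gamma GLM with log link and one binary covariate, the MLE
# reproduces the group means, so the fitted coefficients are simply:
beta0 = math.log(mean_n)             # intercept: log mean cost, reference group
beta1 = math.log(mean_d / mean_n)    # coefficient for cognitive dysfunction
cost_ratio = math.exp(beta1)         # multiplicative effect on mean cost
```

The gamma family with log link is a common choice for cost data because costs are positive and right-skewed, and the log link keeps predictions positive while making covariate effects multiplicative rather than additive.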


Subject(s)
Antidepressive Agents/therapeutic use , Cognition/drug effects , Depressive Disorder, Major/drug therapy , Health Care Costs , Adult , Antidepressive Agents/administration & dosage , Antidepressive Agents/economics , Cognition Disorders/drug therapy , Cognition Disorders/economics , Cross-Sectional Studies , Depressive Disorder, Major/economics , Depressive Disorder, Major/physiopathology , Female , Follow-Up Studies , Humans , Male , Middle Aged , Time Factors , United States
11.
Patient Prefer Adherence ; 9: 971-81, 2015.
Article in English | MEDLINE | ID: mdl-26185426

ABSTRACT

BACKGROUND: Patient satisfaction with treatment directly impacts adherence to medication. OBJECTIVE: The objective was to assess and compare treatment satisfaction with the Treatment Satisfaction Questionnaire for Medication (TSQM), gout-specific health-related quality of life (HRQoL) with the Gout Impact Scale (GIS), and generic HRQoL with the SF-12v2® Health Survey (SF-12) in patients with gout in a real-world practice setting. METHODS: This cross-sectional mail survey included gout patients enrolled in a large commercial health plan in the US. Patients were ≥18 years of age with a self-reported gout diagnosis who filled ≥1 prescription for febuxostat between April 26, 2012 and July 26, 2012 and were not taking any other urate-lowering therapies. The survey included the TSQM version II (TSQM vII, score 0-100, higher scores indicate better satisfaction), GIS (score 0-100, higher scores indicate worse condition), and SF-12 (physical component summary and mental component summary). Patients were stratified by self-report of currently experiencing a gout attack or not to assess the discriminant ability of the questionnaires. RESULTS: A total of 257 patients were included in the analysis (mean age, 54.9 years; 87% male). Patients with current gout attack (n=29, 11%) had worse scores than those without gout attack on most instrument scales. Mean differences between current attack and no current attack for the TSQM domains were: -20.6, effectiveness; -10.6, side effects; -12.1, global satisfaction (all P<0.05); and -6.1, convenience (NS). For the GIS, mean differences were: 30.5, gout overall concern; 14.6, gout medication side effects; 22.7, unmet gout treatment needs; 11.5, gout concern during attack (all P<0.05); and 7.9, well-being during attack (NS). Mean difference in SF-12 was -6.6 for physical component summary (P<0.05) and -2.9 for mental component summary (NS). Correlations between several TSQM and GIS scales were moderate.
CONCLUSION: The TSQM and GIS were complementary in evaluating the impact of gout flare on treatment satisfaction and HRQoL. Correlations between the two instruments supported the relationship between treatment satisfaction and HRQoL.

12.
Article in English | MEDLINE | ID: mdl-26088919

ABSTRACT

INTRODUCTION: External price referencing (EPR) is applied frequently to control pharmaceutical prices. Our objective was to analyse how EPR is used in Middle Eastern (ME) countries and to compare the price corridor for original pharmaceuticals to that for non-pharmaceutical services not subject to EPR. METHODS: We conducted a survey on EPR regulations and collected prices of 16 patented pharmaceuticals and 14 non-pharmaceutical services in seven ME countries. Maximum and minimum prices of each pharmaceutical and non-pharmaceutical technology were compared to mean prices in the countries studied, using market exchange rates. Factors influencing pharmaceutical prices were assessed by multivariate linear regression analysis. RESULTS: The average price corridor is narrower for pharmaceuticals (-39.8%; +35.9%) than for outpatient and hospital services (-81.7%; +96.3%). CONCLUSION: Our analysis revealed the importance of population size and EPR implementation on drug price levels; however, EPR results in higher pharmaceutical prices in lower-income countries compared to non-pharmaceutical services.
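The price-corridor figures above express the minimum and maximum country prices as percentage deviations from the cross-country mean. A sketch of that calculation on invented prices (the product and values are hypothetical, chosen so the corridor comes out to round numbers):

```python
# Hypothetical prices of one in-patent product across seven countries,
# converted at market exchange rates; invented for illustration.
prices = [80.0, 95.0, 100.0, 110.0, 120.0, 135.0, 60.0]

mean_price = sum(prices) / len(prices)
corridor_low = (min(prices) - mean_price) / mean_price * 100    # % below the mean
corridor_high = (max(prices) - mean_price) / mean_price * 100   # % above the mean
```

Here the corridor is (-40%; +35%); a narrower corridor, as reported for pharmaceuticals, means EPR compresses cross-country price dispersion relative to services that are priced without referencing.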


Subject(s)
Commerce/economics , Drug Costs/statistics & numerical data , Pharmaceutical Preparations/economics , Ambulatory Care/economics , Cost Control , Economics, Hospital/statistics & numerical data , Humans , Income , Linear Models , Middle East , Multivariate Analysis
13.
Arthritis Res Ther ; 17: 120, 2015 May 12.
Article in English | MEDLINE | ID: mdl-25963969

ABSTRACT

INTRODUCTION: To assess the comparative effectiveness of febuxostat and allopurinol in reducing serum urate (sUA) levels in a real-world U.S. managed care setting. METHODS: This retrospective study utilized 2009 to 2012 medical and pharmacy claims and laboratory data from a large U.S. commercial and Medicare Advantage health plan. Study patients had at least one medical claim with a diagnosis of gout, at least one filled prescription for febuxostat or allopurinol and at least one sUA measurement after the index prescription. Reduction in sUA was examined using propensity score-matched cohorts, matched on patient demographics (gender, age), baseline sUA, comorbidities, geographic region and insurance type. RESULTS: The study sample included 2,015 patients taking febuxostat and 14,025 taking allopurinol. At baseline, febuxostat users had a higher Quan-Charlson comorbidity score (0.78 vs. 0.53; P <0.001), but similar age and gender distribution. Mean (standard deviation (SD)) sUA levels following propensity score matching among treatment-naïve febuxostat vs. allopurinol users (n = 873 each) were: pre-index sUA, 8.86 (SD, 1.79) vs. 8.72 (SD, 1.63; P = 0.20); and post-index sUA, 6.53 (SD, 2.01) vs. 6.71 (SD, 1.70; P = 0.04), respectively. A higher proportion of febuxostat users attained sUA goals of <6.0 mg/dl (56.9% vs. 44.8%; P <0.001) and <5.0 mg/dl (35.5% vs. 19.2%; P <0.001), respectively. Time to achieve sUA goals of <6.0 mg/dl (346 vs. 397 days; P <0.001) and <5.0 mg/dl was shorter in febuxostat vs. allopurinol users (431 vs. 478 days; P <0.001), respectively. Similar observations were made for overall propensity score-matched cohorts that included both treatment-naïve and current users (n = 1,932 each).
CONCLUSIONS: Febuxostat was more effective than allopurinol at the currently used doses (40 mg/day for febuxostat in 83% users and 300 mg/day or lower for allopurinol in 97% users) in lowering sUA in gout patients as demonstrated by post-index mean sUA level, the likelihood of and the time to achieving sUA goals.
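The cohorts above were propensity score-matched on demographics, baseline sUA, comorbidities, region and insurance type. A minimal sketch of greedy 1:1 nearest-neighbour matching with a caliper, one common way to build such cohorts (the abstract does not state the exact algorithm used; the scores and IDs below are hypothetical, and in practice the scores would come from a logistic regression of treatment on the covariates):

```python
# Hypothetical propensity scores for treated (febuxostat) and
# control (allopurinol) patients; IDs and values are invented.
treated = {"t1": 0.30, "t2": 0.55, "t3": 0.72}
controls = {"c1": 0.28, "c2": 0.50, "c3": 0.70, "c4": 0.90}

def greedy_match(treated, controls, caliper=0.1):
    """Match each treated unit to the nearest unused control within the caliper."""
    available = dict(controls)
    pairs = {}
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs[t_id] = c_id        # accept the match
            del available[c_id]       # each control is used at most once
    return pairs

matched = greedy_match(treated, controls)
```

Greedy matching is order-dependent (here treated units are processed in score order), and controls with no treated counterpart within the caliper are simply dropped, which is consistent with the matched cohorts being smaller than the full sample.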


Subject(s)
Allopurinol/therapeutic use , Febuxostat/therapeutic use , Gout Suppressants/therapeutic use , Gout/drug therapy , Aged , Cohort Studies , Cost-Benefit Analysis , Creatinine/blood , Dose-Response Relationship, Drug , Drug Administration Schedule , Female , Gout/diagnosis , Humans , Male , Managed Care Programs/economics , Medicare/economics , Reference Values , Retrospective Studies , Severity of Illness Index , Treatment Outcome , United States , Uric Acid/blood
14.
Postgrad Med ; 126(2): 65-75, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24685969

ABSTRACT

BACKGROUND: Febuxostat is recommended as 1 of 2 first-line urate-lowering therapies (ULT) for treating gout in the 2012 American College of Rheumatology Guidelines. Several efficacy trials have compared febuxostat with allopurinol treatment, but real-world comparative data are limited. METHODS: We compared the effectiveness of the 2 agents in reaching the serum urate (sUA) goal (< 6 mg/dL) within 6 months (main endpoint), factors affecting the likelihood of reaching goal, and outcomes in allopurinol patients who were switched to febuxostat therapy after failing to reach the sUA goal. Data from the General Electric Electronic Medical Record database on adult patients with newly diagnosed gout who had started treatment with allopurinol or febuxostat in 2009 or thereafter were analyzed. Descriptive statistics, bivariate analyses, and logistic regressions were used. RESULTS: Allopurinol (n = 17,199) and febuxostat (n = 1,190) patients had a mean ± standard deviation (SD) age of 63.7 ± 13.37 years; most patients were men and white. Average daily medication doses (mg) in the first 6 months were 184.9 ± 96.7 and 48.4 ± 15.8 for allopurinol- and febuxostat-treated patients, respectively; 4.8% of allopurinol-treated patients switched to febuxostat, whereas 25.7% of febuxostat-treated patients switched to allopurinol. Febuxostat patients had lower estimated glomerular filtration rates and more diabetes mellitus or tophi at baseline (P < 0.05), and 29.2% and 42.2% of patients in the allopurinol and febuxostat groups, respectively, achieved goal sUA levels (P < 0.0001). Febuxostat was significantly more effective in bringing patients to the sUA goal (adjusted odds ratio, 1.73; 95% CI, 1.48-2.01). Older patients and women had a greater likelihood of reaching the sUA goal; however, patients with higher Charlson Comorbidity Index scores, black patients, and those with estimated glomerular filtration rates between 15 and 60 mL/min had a reduced likelihood of attaining goal (P < 0.05).
Among allopurinol-treated patients who were switched to febuxostat after failing to reach goal, 244 (48.3%) reached goal on febuxostat (median = 62.5 days), with an average 39% sUA level reduction achieved within 6 months. Patients who did not reach goal had a 14.3% sUA level reduction. CONCLUSIONS: The real-life data support the effectiveness of febuxostat in managing patients with gout.


Subject(s)
Allopurinol/therapeutic use , Gout Suppressants/therapeutic use , Gout/drug therapy , Thiazoles/therapeutic use , Uric Acid/blood , Adolescent , Adult , Aged , Aged, 80 and over , Biomarkers/blood , Comparative Effectiveness Research , Drug Administration Schedule , Febuxostat , Female , Follow-Up Studies , Gout/blood , Humans , Logistic Models , Male , Middle Aged , Retrospective Studies , Treatment Outcome , Young Adult
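As a rough consistency check (not part of the paper's analysis), the unadjusted odds ratio implied by the reported goal-attainment proportions (29.2% of 17,199 allopurinol patients vs. 42.2% of 1,190 febuxostat patients) can be recomputed with a standard 2×2 Wald approach. The counts below are reconstructed by rounding and are illustrative only:

```python
import math

# Approximate counts reconstructed from the reported proportions (illustrative only)
n_allo, n_feb = 17199, 1190
allo_goal = round(0.292 * n_allo)   # ~5022 allopurinol patients reached sUA < 6 mg/dL
feb_goal = round(0.422 * n_feb)     # ~502 febuxostat patients reached goal

a, b = feb_goal, n_feb - feb_goal        # febuxostat: reached / not reached
c, d = allo_goal, n_allo - allo_goal     # allopurinol: reached / not reached

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Wald standard error of ln(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"unadjusted OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The unadjusted OR comes out near 1.77, close to the reported adjusted OR of 1.73, as expected when covariate adjustment shifts the estimate only modestly.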
15.
J Rheumatol ; 40(7): 1166-72, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23678154

ABSTRACT

OBJECTIVE: To study the association between serum urate (sUA) level and the risk of incident kidney disease among US veterans with gouty arthritis. METHODS: From 2002 through 2011, adult male patients with gout who were free of kidney disease were identified in the Veterans Administration VISN 16 database and followed until incident kidney disease, death, or the last available observation. Cumulative hazard curves for time to kidney disease were estimated for patients with average sUA levels > 7 mg/dl (high) versus ≤ 7 mg/dl (low) based on Kaplan-Meier analyses, and groups were compared using a log-rank test. A Cox proportional hazards model with time-varying covariates was used to estimate unadjusted and adjusted hazard ratios for kidney disease. RESULTS: Eligible patients (n = 2116) were mostly white (53%), with an average age of 62.6 years, mean body mass index of 31.2 kg/m², and high baseline prevalence of hypertension (93%), hyperlipidemia (67%), and diabetes (20%). Mean follow-up time was 6.5 years. The estimated rates of all incident kidney disease in the low versus high sUA groups were 2% versus 4% at Year 1, 3% versus 6% at Year 2, and 5% versus 9% at Year 3, respectively (p < 0.0001). After adjustment, high sUA continued to predict a significantly higher risk of kidney disease development (HR 1.43, 95% CI 1.20-1.70). CONCLUSION: Male veterans with gout and sUA levels > 7 mg/dl had an increased incidence of kidney disease.


Subject(s)
Gout/epidemiology , Hyperuricemia/epidemiology , Kidney Diseases/epidemiology , Uric Acid/blood , Veterans , Aged , Body Mass Index , Comorbidity , Gout/blood , Humans , Hyperuricemia/blood , Incidence , Kidney Diseases/blood , Male , Middle Aged , Prevalence , Risk
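The time-to-kidney-disease analysis above rests on the Kaplan-Meier product-limit estimator. A minimal pure-Python sketch on synthetic follow-up data (not the study's data) shows the form of the calculation:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimates.

    times  : observation times (event or censoring)
    events : 1 if the event (e.g. incident kidney disease) occurred, 0 if censored
    Returns a list of (time, survival probability) at each event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n_t = at_risk
        # Handle ties: all observations at time t leave the risk set together
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_t   # product-limit step
            curve.append((t, surv))
    return curve

# Synthetic follow-up: years to event (1) or censoring (0)
curve = kaplan_meier([1, 2, 2, 3, 5], [1, 1, 0, 0, 1])
print(curve)
```

On this toy sample the survival estimate steps from 0.8 at year 1 to 0.6 at year 2 and 0 at year 5; a log-rank test, as used in the study, would then compare two such curves.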
16.
J Comp Pathol ; 146(2-3): 211-22, 2012.
Article in English | MEDLINE | ID: mdl-21741053

ABSTRACT

The pathogenesis and kinetics of oral infection by equine herpesvirus (EHV)-9 were studied in mice and hamsters. After oral inoculation with 10⁵ plaque-forming units (PFU) of virus, 1-week-old suckling hamsters showed neurological disease of varying severity from 72 hours post inoculation (hpi), and all of these animals had died by 96 hpi. Four-week-old ICR mice inoculated orally with 4 × 10⁴ PFU of virus showed no clinical signs but developed erosive and ulcerative gastritis from 36 hpi. Varying degrees of encephalitis were seen in infected mice and hamsters, and the hamsters also developed myelitis by 96 hpi. Immunohistochemistry performed on whole-body sections of suckling hamsters revealed the kinetics of viral spread to the central nervous system. EHV-9 antigen was detected initially in macrophages of the oral and lingual submucosa. At 36 hpi, virus antigen was detected in the nerve fibres and pseudounipolar neurons of the trigeminal ganglion, and at 96 hpi antigen was present in the myenteric plexuses of the intestine. Virus antigen was also detected in the liver, lungs and heart of affected animals. EHV-9 DNA was detected by polymerase chain reaction in the brain, blood and spinal cord of suckling hamsters at 36, 48 and 96 hpi. These findings show that EHV-9 may spread via the trigeminal nerve when mice and hamsters are inoculated orally with virus.


Subject(s)
Brain/virology , Encephalitis, Viral/virology , Herpesviridae Infections/virology , Varicellovirus/pathogenicity , Animals , Antigens, Viral , Brain/pathology , Cricetinae , Encephalitis, Viral/pathology , Herpesviridae Infections/pathology , Mesocricetus , Mice
17.
J Comp Pathol ; 145(2-3): 271-81, 2011.
Article in English | MEDLINE | ID: mdl-21459386

ABSTRACT

The kinetics of infection and pathogenicity of equine herpesvirus-9 (EHV-9) were studied in a hamster model. Five-week-old Syrian hamsters and 5-day-old suckling hamsters were inoculated intraperitoneally with 10⁵ and 4 × 10⁴ plaque-forming units of EHV-9, respectively. At 6 h post inoculation (hpi), EHV-9 antigens were detected by immunocytochemistry in peritoneal macrophages, which may be the primary site of virus attachment and propagation. At 12 hpi, viral antigen was observed in the abdominal nerves and ganglia (mainly the coeliac ganglia). Virus antigen was detected in the dorsal root (spinal) ganglia, in parts of the spinal cord (particularly the mid-lumbar area) and in the myenteric plexuses at 36, 48 and 72 hpi, respectively. At 96 hpi, virus antigen was detected in the most caudal part of the brain. Polymerase chain reaction conducted on samples of blood, spinal cord and brain revealed EHV-9 DNA in the spinal cord at 36 hpi and in the blood at 48 hpi and for 4 days after this initial detection. It is suggested that, after initial propagation in abdominal macrophages, EHV-9 infected the abdominal ganglia or myenteric plexuses and then travelled to the brain via the peripheral nerves and spinal cord. Examination of other organs also revealed the presence of EHV-9, suggesting that the virus may infect tissues other than those of the nervous system.


Subject(s)
Herpesviridae Infections/pathology , Herpesviridae Infections/virology , Varicellovirus , Animals , Antigens, Viral/analysis , Cricetinae , DNA, Viral , Disease Models, Animal , Kinetics , Mesocricetus , Polymerase Chain Reaction
18.
Ann Gen Psychiatry ; 10: 10, 2011 Apr 04.
Article in English | MEDLINE | ID: mdl-21463526

ABSTRACT

BACKGROUND: Because wide variations in mental health care utilization exist throughout the world, determining the long-term effectiveness of psychotropic medications in a real-world setting would be beneficial to physicians and patients. The purpose of this analysis was to describe the effectiveness of injectable risperidone long-acting therapy (RLAT) for schizophrenia across countries. METHODS: This was a pragmatic analysis of data from two prospective observational studies conducted in the US (Schizophrenia Outcomes Utilization Relapse and Clinical Evaluation [SOURCE]; ClinicalTrials.gov registration number for the SOURCE study: NCT00246194) and Spain, Australia, and Belgium (electronic Schizophrenia Treatment Adherence Registry [eSTAR]). Two separate analyses were performed to assess clinical improvement during the study and to estimate psychiatric hospitalization rates before and after RLAT initiation. Clinical improvement was evaluated using the Clinical Global Impressions-Severity (CGI-S) and Global Assessment of Functioning (GAF) scales, and change from baseline was evaluated using paired t tests. Psychiatric hospitalization rates were analyzed using incidence densities, and the bootstrap resampling method was used to examine differences between the pre-baseline and post-baseline periods. RESULTS: The initial sample comprised 3,069 patients (US, n = 532; Spain, n = 1,345; Australia, n = 784; and Belgium, n = 408). The clinical analysis included patients who completed 24 months of study participation: 39.3% (n = 209) in the US, 62.7% (n = 843) in Spain, 45.8% (n = 359) in Australia, and 64.2% (n = 262) in Belgium. Improvements compared with baseline were observed on both clinical assessments across countries (P < 0.001 at all post-baseline visits). The mean improvement was approximately 1 point on the CGI-S and 15 points on the GAF.
A total of 435 (81.8%), 1,339 (99.6%), 734 (93.6%), and 393 (96.3%) patients from the US, Spain, Australia, and Belgium, respectively, had ≥1 post-baseline visit and were included in the analysis of psychiatric hospitalization rates. Hospitalization rates decreased significantly in all countries regardless of hospitalization status at RLAT initiation (P < 0.0001) and decreased significantly in the US and Spain (P < 0.0001) when the analysis was limited to outpatients only. CONCLUSIONS: RLAT in patients with schizophrenia was associated with improvements in clinical and functional outcomes and decreased hospitalization rates in the US, Spain, Australia, and Belgium, despite differences in health care delivery systems.
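The hospitalization analysis above compares incidence densities (events per unit of person-time) before and after RLAT initiation, with bootstrap resampling used to test the difference. The sketch below illustrates the idea on synthetic patient-level data; all numbers are hypothetical and the resampling scheme is a simplified percentile bootstrap, not the study's exact method:

```python
import random

rng = random.Random(42)

# Hypothetical per-patient (hospitalizations, person-years) before and after initiation
pre = [(rng.randint(0, 3), 1.0) for _ in range(300)]    # higher event counts pre-baseline
post = [(rng.randint(0, 1), 1.0) for _ in range(300)]   # lower event counts post-baseline

def incidence_density(data):
    """Events per person-year for a sample of (events, person_time) pairs."""
    events = sum(e for e, _ in data)
    person_time = sum(t for _, t in data)
    return events / person_time

def bootstrap_ci(pre, post, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap CI for the post-minus-pre incidence-density difference."""
    diffs = []
    for _ in range(n_boot):
        pre_s = [rng.choice(pre) for _ in pre]     # resample patients with replacement
        post_s = [rng.choice(post) for _ in post]
        diffs.append(incidence_density(post_s) - incidence_density(pre_s))
    diffs.sort()
    lo = diffs[int(alpha / 2 * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

lo, hi = bootstrap_ci(pre, post)
print(f"95% bootstrap CI for rate difference: ({lo:.2f}, {hi:.2f})")
```

When the whole confidence interval lies below zero, as it does for this synthetic sample, the post-initiation hospitalization rate is significantly lower, which is the logic behind the reported P < 0.0001 decreases.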

19.
Curr Med Res Opin ; 26(4): 943-55, 2010 Apr.
Article in English | MEDLINE | ID: mdl-20163295

ABSTRACT

BACKGROUND: Inpatient care to manage relapse of patients with schizophrenia contributes greatly to the overall financial burden of treatment. The present study explores to what extent this is influenced by duration of illness. METHODS: Medical and pharmaceutical claims data for patients diagnosed with schizophrenia (ICD-9 295.xx) were obtained from the PharMetrics Integrated Database, a large, regionally representative US insurance claims database, for the period 1998-2007. Recently diagnosed (n = 970) and chronic patients (n = 2996) were distinguished based on ICD-9 295.xx classification, age and claims history relative to the first year (recently diagnosed) and the third year onwards (chronic) after the first index schizophrenia event. RESULTS: Medical resource use and costs during the year following the index schizophrenia event differed significantly between cohorts. A higher proportion of recently diagnosed patients were hospitalised compared with chronic patients (22.3% vs 12.4%; p < 0.0001), spending a greater mean number of days in hospital (5.1 days vs 3.0 days; p = 0.0065) and making more frequent use of emergency room (ER) resources during this time. The mean annual healthcare costs of recently diagnosed patients were also greater ($20,654 vs $15,489; p < 0.0001), with inpatient costs making up a higher proportion of total costs (62.9%) compared with chronic patients (38.5%). CONCLUSIONS: Treatment of recently diagnosed schizophrenia patients carries a considerably higher overall economic burden in the year following the first schizophrenia event than treatment of chronic patients. Since hospitalisations and ER visits are the largest components of this difference, measures to reduce the risk of relapse, particularly amongst recently diagnosed patients, such as improved adherence programmes, may lead to better clinical and economic outcomes in the management of schizophrenia.
LIMITATIONS: Only commercially insured patients and direct medical costs were included, therefore, results may underestimate the economic burden of schizophrenia.


Subject(s)
Health Care Costs , Health Services/economics , Health Services/statistics & numerical data , Schizophrenia/economics , Adult , Ambulatory Care/economics , Ambulatory Care/statistics & numerical data , Chronic Disease , Cohort Studies , Emergency Services, Psychiatric/economics , Emergency Services, Psychiatric/statistics & numerical data , Female , Hospitalization/economics , Humans , Logistic Models , Male , Middle Aged , Schizophrenia/therapy , Secondary Prevention , United States
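The reported hospitalisation gap (22.3% of 970 recently diagnosed vs. 12.4% of 2,996 chronic patients) can be checked with a standard pooled two-proportion z-test. This is an illustrative recomputation from the published summary figures, not the paper's logistic-model analysis:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic and two-sided normal p-value."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # P(|Z| > z) for a standard normal
    return z, p_value

# Hospitalisation proportions and cohort sizes from the abstract
z, p = two_proportion_z(0.223, 970, 0.124, 2996)
print(f"z = {z:.2f}, p = {p:.2g}")
```

The z statistic lands above 7, so the two-sided p-value is far below 0.0001, consistent with the significance level reported above.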
20.
Pharmacoeconomics ; 27(5): 421-30, 2009.
Article in English | MEDLINE | ID: mdl-19586079

ABSTRACT

BACKGROUND: Delayed coverage of pathogens including meticillin-resistant Staphylococcus aureus (MRSA) in pneumonia and bacteraemia has been associated with increased mortality and length of hospital stay (LOS). However, less is known about the impact of delayed appropriate coverage in complicated skin and skin-structure infections (cSSSIs). OBJECTIVE: To evaluate the clinical and economic outcomes associated with early versus late use of vancomycin in the management of patients hospitalized for cSSSIs. METHODS: Retrospective analysis was performed using a 2005 inpatient claims database covering >500 US hospitals. Using prescription claims, patients with primary or secondary cSSSI admissions were classified into three groups: 1 = early vancomycin monotherapy; 2 = early vancomycin combination therapy; 3 = late vancomycin therapy. Outcomes studied included LOS and inpatient hospital costs. One-way analysis of variance was used for unadjusted analysis, and multivariate regression methods were used to control for covariates. RESULTS: A total of 34,942 patients (27.78% of all patients with cSSSIs) were treated with vancomycin. Mean age was 54.7 years and 54.3% of the patients were male. Mean unadjusted total LOS was 8.46, 9.44 and 13.2 days, and hospital costs in 2005 values were USD 10,211.94, USD 12,361.94 and USD 18,344.00 for groups 1, 2 and 3, respectively. The in-hospital mortality rate was highest in group 3 (4.18%) and lowest in group 1 (1.75%). Generalized linear models controlling for potential confounding variables between early and late vancomycin use suggest that, among cSSSI patients, late vancomycin use is an independent predictor of longer LOS and higher costs. CONCLUSION: In this large inpatient database, later vancomycin use in patients with cSSSIs was significantly associated with longer LOS and higher total costs.


Subject(s)
Health Care Costs/statistics & numerical data , Skin Diseases, Bacterial/drug therapy , Vancomycin/administration & dosage , Vancomycin/economics , Anti-Bacterial Agents/administration & dosage , Cohort Studies , Databases as Topic , Drug Administration Schedule , Drug Therapy, Combination , Female , Hospitalization , Humans , Length of Stay/economics , Length of Stay/statistics & numerical data , Male , Methicillin-Resistant Staphylococcus aureus/drug effects , Middle Aged , Retreatment , Skin Diseases, Bacterial/economics , Skin Diseases, Bacterial/mortality , Staphylococcal Skin Infections/drug therapy , Time Factors , Treatment Outcome
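The unadjusted three-group LOS comparison above uses one-way analysis of variance. A minimal stdlib sketch of the F statistic on synthetic LOS data (hypothetical values chosen to mimic the group ordering, not the claims data) shows the calculation:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical LOS (days): early monotherapy, early combination, late vancomycin
early_mono = [7, 8, 9, 8, 10]
early_combo = [9, 10, 9, 11, 8]
late = [12, 14, 13, 15, 12]
f = one_way_anova_f([early_mono, early_combo, late])
print(f"F = {f:.2f}")
```

With the late-therapy group clearly longer-staying, the F statistic (about 22 here, against a 5% critical value of 3.89 for 2 and 12 degrees of freedom) indicates a significant group effect, mirroring the pattern in the abstract.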