Results 1 - 20 of 22
1.
J Sci Med Sport ; 22(12): 1314-1318, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31445950

ABSTRACT

OBJECTIVES: The vast majority of rugby union ('rugby') participants are community-based players; however, the majority of injury surveillance studies reported relate to the elite, professional game. A potential reason for this dearth of studies could be the perceived difficulty of using the consensus statement for injury recording at the community level. The aim of this study was to identify areas where the consensus statement could be adapted for easier and more appropriate implementation within the community setting. DESIGN: Round-table discussion. METHODS: All community-based injury surveillance issues were discussed during a 2-day facilitated round-table meeting, by an 11-person working group consisting of researchers currently active in rugby-related injury surveillance, sports medicine and sports science issues. The outcomes from the meeting were summarised in a draft guidance document that was then subjected to an extensive iterative review prior to producing methodological recommendations. RESULTS: Each aspect of the rugby-specific consensus statement was reviewed to determine whether it was feasible to implement the standards required in the context of non-elite rugby and the resources available within a community setting. Final recommendations are presented within a community-based injury report form. CONCLUSIONS: It is recommended that whenever possible the rugby-specific consensus statement for injury surveillance studies be used: this paper presents an adapted report form that can be used to record injury surveillance information in community rugby if suitable medical support is not available.


Subject(s)
Athletic Injuries/epidemiology , Epidemiological Monitoring , Football/injuries , Consensus , Guidelines as Topic , Humans , Incidence
2.
Addict Behav ; 86: 124-129, 2018 11.
Article in English | MEDLINE | ID: mdl-29884421

ABSTRACT

OBJECTIVE: To evaluate the impact of women-centered substance abuse treatment programming on outcomes among pregnant women with opioid use disorder (OUD). METHODS: We compared two retrospective cohorts of pregnant women with OUD on buprenorphine maintenance therapy who delivered an infant at the University of Pittsburgh from 2014 to 2016. Cohort 1 was composed of pregnant women who received women-centered OUD treatment services through the Pregnancy Recovery Program (PRC) and Cohort 2 was composed of pregnant women who received buprenorphine at OUD programs without women-centered services (non-PRC). Women-centered outcomes were defined as a) pregnancy-specific buprenorphine dosing, b) prenatal and postpartum care attendance, c) breastfeeding and d) highly effective contraception utilization. Chi-square and t-tests were used to compare outcomes between PRC and non-PRC patients. RESULTS: Among 248 pregnant women with OUD, 71 (28.6%) were PRC and 177 (71.4%) were non-PRC patients. PRC patients were significantly more likely to initiate buprenorphine during vs. prior to their pregnancy (81.4% vs. 44.2%; p < .01) and have a higher buprenorphine dose at the time of delivery (16.0 mg vs. 14.1 mg; p = .02) compared to non-PRC patients. Likewise, PRC patients were significantly more likely to attend their postpartum visit (67.9% vs. 52.6%; p = .05) and receive a long-acting reversible contraceptive (LARC) method (23.9% vs. 13.0%, p = .03) after delivery compared to non-PRC patients. Finally, PRC patients had a smaller percent decrease in the rate of breastfeeding during their delivery hospitalization (-14.7% vs. -37.1%). CONCLUSIONS: Incorporating women-centered services into OUD treatment programming may improve gender-specific outcomes among women with OUD.


Subject(s)
Analgesics, Opioid/therapeutic use , Buprenorphine/therapeutic use , Delivery of Health Care/methods , Opioid-Related Disorders/therapy , Patient-Centered Care/methods , Postpartum Period , Pregnancy Complications/therapy , Pregnant Women , Adult , Breast Feeding/statistics & numerical data , Contraception/statistics & numerical data , Contraceptive Effectiveness , Female , Humans , Opiate Substitution Treatment/methods , Postnatal Care/statistics & numerical data , Pregnancy , Prenatal Care/statistics & numerical data , Substance Abuse Treatment Centers , Young Adult
3.
Am J Obstet Gynecol ; 217(4): 459.e1-459.e6, 2017 10.
Article in English | MEDLINE | ID: mdl-28669739

ABSTRACT

BACKGROUND: Dose-adjusted plasma concentrations of buprenorphine are significantly decreased during pregnancy compared with the nonpregnant state. This observation suggests that pregnant women may need a higher dose of buprenorphine than nonpregnant individuals to maintain similar drug exposure (plasma concentrations over time after a dose). The current dosing recommendations for buprenorphine during pregnancy address the total daily dose of buprenorphine to be administered, but the frequency of dosing is not clearly addressed. Based on buprenorphine's long terminal half-life, once-daily or twice-daily dosing has generally been suggested. OBJECTIVE: The objective of the study was to assess the impact of dosing frequency on buprenorphine plasma concentration time course during pregnancy. STUDY DESIGN: We utilized 3 data sources to determine an optimal frequency for dosing of buprenorphine during pregnancy: data from a pharmacokinetic study of 14 pregnant and postpartum women on maintenance buprenorphine in a supervised clinical setting; data from pregnant women attending a buprenorphine clinic; and data from physiologically based pharmacokinetic modeling of buprenorphine pharmacokinetics in nonpregnant subjects. RESULTS: Among the 14 women participating in the pharmacokinetic study during and after pregnancy, plasma concentrations of buprenorphine were <1 ng/mL (the theoretical concentration required to prevent withdrawal symptoms) for 50-80% of the 12-hour dosing interval while at steady state. Among 62 women followed up in an opioid agonist treatment program, in which dosing frequency is determined in part by patient preference, 10 (16%) were on once-daily dosing, 10 (16%) were on twice-daily dosing, 28 (45%) were on thrice-daily dosing, and 14 (23%) were on four-times-daily dosing. A physiologically based pharmacokinetic model in nonpregnant subjects demonstrated that dosing frequency has an impact on the duration over which the plasma concentrations are below a specified plasma concentration threshold. CONCLUSION: A more frequent dosing interval (ie, three-times-daily or four-times-daily dosing) may be required in pregnant women to sustain plasma concentrations above the threshold of 1 ng/mL to prevent withdrawal symptoms and to improve adherence.
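The effect of dosing frequency on trough concentrations can be illustrated with a minimal one-compartment model at steady state. This is a deliberate simplification, not the physiologically based model used in the study, and the half-life, volume of distribution, and bioavailability below are hypothetical round numbers chosen only to show the qualitative effect:

```python
import math

def steady_state_trough(daily_dose_mg, n_doses, half_life_h=26.0, vd_l=1000.0, f=0.3):
    """Steady-state trough concentration (ng/mL) for a one-compartment model
    with instantaneous absorption. Illustrative only: parameters are
    hypothetical, not the study's PBPK model."""
    k = math.log(2) / half_life_h               # elimination rate constant (1/h)
    tau = 24.0 / n_doses                        # dosing interval (h)
    dose = daily_dose_mg / n_doses
    c_increment = (f * dose * 1e6) / (vd_l * 1000)   # mg -> ng, L -> mL
    # superposition at steady state, then decay over one interval to the trough
    c_ss_peak = c_increment / (1 - math.exp(-k * tau))
    return c_ss_peak * math.exp(-k * tau)

# same 16 mg total daily dose, split into 1-4 doses: troughs rise with frequency
for n in (1, 2, 3, 4):
    print(n, round(steady_state_trough(16.0, n), 2))
```

Under these assumptions the trough rises monotonically as the same daily dose is split into more frequent doses, which is the mechanism behind the thrice- and four-times-daily recommendation.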


Subject(s)
Buprenorphine/administration & dosage , Narcotic Antagonists/administration & dosage , Opiate Substitution Treatment , Opioid-Related Disorders/drug therapy , Administration, Sublingual , Adult , Buprenorphine/pharmacokinetics , Dose-Response Relationship, Drug , Evidence-Based Practice , Female , Humans , Narcotic Antagonists/pharmacokinetics , Pregnancy , Pregnancy Complications/drug therapy
4.
Am J Sports Med ; 45(2): 480-487, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28146395

ABSTRACT

BACKGROUND: Previous research has described general injury patterns in community-level rugby union, but specific information on time-loss head injuries has not been reported. PURPOSE: To establish the incidence and nature of significant time-loss head injuries in English community rugby match play, and to identify the injury risk for specific contact events. STUDY DESIGN: Descriptive epidemiology study. METHODS: Over 6 seasons, injury information was collected from 46 (2009-2010), 67 (2010-2011), 76 (2011-2012), 50 (2012-2013), 67 (2013-2014), and 58 (2014-2015) English community rugby clubs (Rugby Football Union levels 3-9) over a total of 175,940 hours of player match exposure. Club injury management staff reported information for all head injuries sustained during match play after which the player was absent for 8 days or more. Clubs were subdivided into semiprofessional (mean player age, 24.6 ± 4.7 years), amateur (24.9 ± 5.1 years), and recreational (25.6 ± 6.1 years) playing levels. Contact events from a sample of 30 matches filmed over seasons 2009-2010, 2010-2011, and 2011-2012 provided mean values for the frequency of contact events. RESULTS: The overall incidence for time-loss head injuries was 2.43 injuries per 1000 player match hours, with a higher incidence for the amateur (2.78; 95% CI, 2.37-3.20) compared with recreational (2.20; 95% CI, 1.86-2.53) (P = .032) playing level but not different from the semiprofessional (2.31; 95% CI, 1.83-2.79) playing level. Concussion was the most common time-loss head injury, with 1.46 per 1000 player match hours. The tackle event was associated with 64% of all head injuries and 74% of all concussions. There was also a higher risk of injuries per tackle (0.33 per 1000 events; 95% CI, 0.30-0.37) compared with all other contact events. CONCLUSION: Concussion was the most common head injury diagnosis, although it is likely that this injury was underreported. Continuing education programs for medical staff and players are essential for the improved identification and management of these injuries. With the majority of head injuries occurring during a tackle, an improved technique in this contact event through coach and player education may be effective in reducing these injuries.
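The headline rates are exposure-normalised counts. A sketch of the calculation, assuming Poisson-distributed injury counts with a normal-approximation 95% CI (the paper does not state which CI method it used), and using an injury count back-calculated from the reported overall rate purely for illustration:

```python
import math

def incidence_per_1000(n_injuries, exposure_hours):
    """Injury incidence per 1000 player match hours, with a
    normal-approximation 95% CI assuming Poisson counts.
    Illustrative sketch; not necessarily the paper's CI method."""
    rate = n_injuries / exposure_hours * 1000
    se = math.sqrt(n_injuries) / exposure_hours * 1000
    return rate, rate - 1.96 * se, rate + 1.96 * se

# ~428 head injuries is consistent with 2.43/1000 over 175,940 match hours
rate, lo, hi = incidence_per_1000(428, 175_940)
print(f"{rate:.2f} ({lo:.2f}-{hi:.2f}) per 1000 player match hours")
```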


Subject(s)
Craniocerebral Trauma/epidemiology , Football/injuries , Adolescent , Adult , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Brain Concussion/epidemiology , Brain Concussion/etiology , Craniocerebral Trauma/etiology , England/epidemiology , Humans , Incidence , Male , Middle Aged , Risk Factors , Time Factors , Young Adult
5.
Am J Sports Med ; 43(2): 475-81, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25512663

ABSTRACT

BACKGROUND: All rugby training activities carry an injury risk, but in the training environment these injury risks should be more controllable than during matches. PURPOSE: To (1) describe the incidence, severity, anatomic location, and type of youth rugby training injuries; (2) determine the injury events and type of training activities associated with injuries; and (3) compare 2 levels of play (professional academy vs school) within English youth rugby union. STUDY DESIGN: Cohort study; Level of evidence, 2. METHODS: A 2-season (2006-2007 and 2007-2008) study recorded exposure to training activities and time-loss injuries in male youth rugby union players (age range, 16-18 years) from 12 English Premiership academies (250 player-seasons) and 7 schools (222 player-seasons). Players from the Premiership academies, associated with the top-level professional clubs in England, represented the elite level of youth rugby; the school players were from established rugby-playing schools but were overall considered at a lower level of play. RESULTS: There was a trend for training injury incidence to be lower for the academy group (1.4/1000 player-hours; 95% CI, 1.0-1.7) compared with the school group (2.1/1000 player-hours; 95% CI, 1.4-2.9) (P = .06). Injuries to the ankle/heel and thigh were most common in academy players and injuries to the lumbar spine and ankle/heel region most common in school players. The training activities responsible for injury differed between the 2 groups: technical skills (scrummaging) for school players and contact skills (defense and ruck/maul drills) for academy players. CONCLUSION: For injury risk management in youth rugby, coaches of school players should focus on the development of the correct technique during practice of technical skills such as scrummaging, weight training, and skills training, and coaches of academy players should consider the extent to which contact drills are necessary during training.


Subject(s)
Ankle Injuries/epidemiology , Football/injuries , Physical Conditioning, Human/adverse effects , Schools/classification , Sprains and Strains/epidemiology , Adolescent , Athletic Injuries/epidemiology , Cohort Studies , England/epidemiology , Heel/injuries , Humans , Incidence , Lumbar Vertebrae/injuries , Male , Physical Conditioning, Human/methods , Prospective Studies , Risk Factors , Thigh/injuries , Torso/injuries , Upper Extremity/injuries
6.
Br J Sports Med ; 49(8): 541-6, 2015 Apr.
Article in English | MEDLINE | ID: mdl-24505041

ABSTRACT

AIM: Biomechanical studies of the rugby union scrum have typically been conducted using instrumented scrum machines, but a large-scale biomechanical analysis of live contested scrummaging is lacking. We investigated whether the biomechanical loading experienced by professional front row players during the engagement phase of live contested rugby scrums could be reduced using a modified engagement procedure. METHODS: Eleven professional teams (22 forward packs) performed repeated scrum trials for each of the three engagement techniques, outdoors, on natural turf. The engagement processes were the 2011/2012 (referee calls crouch-touch-pause-engage), 2012/2013 (referee calls crouch-touch-set) and 2013/2014 (props prebind with the opposition prior to the 'Set' command; PreBind) variants. Forces were estimated by pressure sensors on the shoulders of the front row players of one forward pack. Inertial Measurement Units were placed on an upper spine cervical landmark (C7) of the six front row players to record accelerations. Players' motion was captured by multiple video cameras from three viewing perspectives and analysed in transverse and sagittal planes of motion. RESULTS: The PreBind technique reduced biomechanical loading in comparison with the other engagement techniques, with engagement speed, peak forces and peak accelerations of upper spine landmarks reduced by approximately 20%. There were no significant differences between techniques in terms of body kinematics and average force during the sustained push phase. CONCLUSIONS: Using a scrum engagement process which involves binding with the opposition prior to the engagement reduces the stresses acting on players and therefore may represent a possible improvement for players' safety.


Subject(s)
Football/physiology , Athletic Injuries/etiology , Athletic Injuries/physiopathology , Biomechanical Phenomena/physiology , Cross-Sectional Studies , Football/injuries , Humans , Male , Posture/physiology , Pressure , Spinal Injuries/etiology , Spinal Injuries/physiopathology , Stress, Physiological/physiology , Video Recording
7.
Br J Sports Med ; 49(8): 520-8, 2015 Apr.
Article in English | MEDLINE | ID: mdl-24511085

ABSTRACT

OBJECTIVES: This cross-sectional study investigated the factors that may influence the physical loading on rugby forwards performing a scrum by studying the biomechanics of machine-based scrummaging under different engagement techniques and playing levels. METHODS: 34 forward packs from six playing levels performed repetitions of five different types of engagement techniques against an instrumented scrum machine under realistic training conditions. Applied forces and body movements were recorded in three orthogonal directions. RESULTS: The modification of the engagement technique altered the load acting on players. These changes were in a similar direction and of similar magnitude irrespective of the playing level. Reducing the dynamics of the initial engagement through a fold-in procedure decreased the peak compression force, the peak downward force and the engagement speed by more than 30%. For example, peak compression (horizontal) forces in the professional teams changed from 16.5 kN (baseline technique) to 8.6 kN (fold-in procedure). The fold-in technique also reduced the occurrence of combined high forces and head-trunk misalignment during the absorption of the impact, which was used as a measure of potential hazard, by more than 30%. Reducing the initial impact did not decrease the ability of the teams to produce sustained compression forces. CONCLUSIONS: De-emphasising the initial impact against the scrum machine decreased the mechanical stresses acting on forward players and may benefit players' welfare by reducing the hazard factors that may induce chronic degeneration of the spine.


Subject(s)
Football/physiology , Analysis of Variance , Biomechanical Phenomena/physiology , Body Weight/physiology , Cross-Sectional Studies , Female , Humans , Male , Movement/physiology
8.
Br J Sports Med ; 49(7): 425-33, 2015 Apr.
Article in English | MEDLINE | ID: mdl-24398223

ABSTRACT

As a collision sport, rugby union has a relatively high overall injury incidence, with most injuries being associated with contact events. Historically, the set scrum has been a focus of the sports medicine community due to the perceived risk of catastrophic spinal injury during scrummaging. The contemporary rugby union scrum is a highly dynamic activity but to this point has not been well characterised mechanically. In this review, we synthesise the available research literature relating to the medical and biomechanical aspects of the rugby union scrum, in order to (1) review the injury epidemiology of rugby scrummaging; (2) consider the evidence for specific injury mechanisms existing to cause serious scrum injuries and (3) synthesise the information available on the biomechanics of scrummaging, primarily with respect to force production. The review highlights that the incidence of acute injury associated with scrummaging is moderate but the risk per event is high. The review also suggests an emerging acknowledgement of the potential for scrummaging to lead to premature chronic degeneration injuries of the cervical spine and summarises the mechanisms by which these chronic injuries are thought to occur. More recent biomechanical studies of rugby scrummaging confirm that scrum engagement forces are high and multiplanar, but can be altered through modifications to the scrum engagement process which control the engagement velocity. As the set scrum is a relatively 'controlled' contact situation within rugby union, it remains an important area for intervention with a long-term goal of injury reduction.


Subject(s)
Football/injuries , Posture/physiology , Acute Disease , Age Factors , Athletic Injuries/etiology , Athletic Injuries/physiopathology , Biomechanical Phenomena , Chronic Disease , Football/statistics & numerical data , Humans , Professional Competence/standards , Spinal Injuries/etiology , Spinal Injuries/physiopathology , Time Factors
9.
J Diabetes Sci Technol ; 8(5): 945-50, 2014 Sep.
Article in English | MEDLINE | ID: mdl-24876448

ABSTRACT

Stress hyperglycemia and hypoglycemia are associated with increased morbidity and mortality in the critically ill. Intermittent, random blood glucose (BG) measurements can miss episodes of hyper- and hypoglycemia. The purpose of this study was to determine the accuracy of the Symphony® continuous glucose monitor (CGM) in critically ill cardiac surgery patients. Fifteen adult cardiac surgery patients were evaluated immediately postoperatively in the intensive care unit. Prelude® SkinPrep prepared the skin and a sensor was applied to 2 test sites on each subject to monitor interstitial fluid glucose. Reference BG was sampled at 30- to 60-minute intervals. The skin at the test sites was inspected for adverse effects. Accuracy of the retrospectively analyzed CGM data relative to reference BG values was determined using continuous glucose-error grid analysis (CG-EGA) and mean absolute relative difference (MARD). Using 570 Symphony CGM glucose readings paired with reference BG measurements, CG-EGA showed that 99.6% of the readings were within zones A and B. BG measurements ranged from 73 to 251 mg/dL. The MARD was 12.3%. No adverse device effects were reported. The Symphony CGM system is able to safely, continuously, and noninvasively monitor glucose in the transdermal interstitial fluid of cardiac surgery intensive care unit patients with accuracy similar to that reported with other CGM systems. Future versions of the system will need real-time data analysis, fast warm-up, and less frequent calibrations to be used in the clinical setting.
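The accuracy metric reported here, mean absolute relative difference (MARD), is the mean of |CGM reading − reference| / reference over all paired measurements, expressed as a percentage. A minimal sketch with made-up glucose pairs (the study's actual 570 pairs are not reproduced here):

```python
def mard(cgm_readings, reference_values):
    """Mean absolute relative difference (%) between paired CGM readings
    and reference blood glucose values. Inputs are illustrative."""
    diffs = [abs(c - r) / r for c, r in zip(cgm_readings, reference_values)]
    return 100 * sum(diffs) / len(diffs)

cgm = [110, 145, 190, 95]   # hypothetical sensor readings (mg/dL)
ref = [100, 150, 200, 100]  # hypothetical reference BG (mg/dL)
print(round(mard(cgm, ref), 1))  # → 5.8
```

A lower MARD means the sensor tracks reference glucose more closely; the 12.3% reported above is in the range typical of CGM systems of that era.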


Subject(s)
Biosensing Techniques/instrumentation , Blood Glucose/analysis , Monitoring, Physiologic/instrumentation , Adult , Aged , Aged, 80 and over , Critical Illness , Female , Humans , Hyperglycemia/blood , Hypoglycemia/blood , Male , Middle Aged , Reproducibility of Results
11.
A A Case Rep ; 3(9): 123-5, 2014 Nov 01.
Article in English | MEDLINE | ID: mdl-25611864

ABSTRACT

We describe a patient who developed a hypopharyngeal mass (in the setting of a cervical osteophyte) while taking clopidogrel and aspirin for coronary artery disease. He had a 2-month history of progressive dysphagia and hoarseness. Physical examination and computed tomography scan revealed a soft tissue retropharyngeal mass of unclear etiology yet with a stable airway. He was admitted to the intensive care unit for a 48-hour clopidogrel washout followed by surgery. A hematoma and cervical osteophyte were removed with scant bleeding. This case report emphasizes the need to consider the medication history of a patient when assessing the cause of an otherwise unexpected finding.

13.
Am J Sports Med ; 41(4): 749-55, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23380159

ABSTRACT

BACKGROUND: Numerous injury epidemiology studies have reported injury patterns in senior rugby union, but investigations in youth rugby are limited. PURPOSE: To describe the nature of injuries resulting from match play within the English youth rugby union, including a comparison between 2 levels of play within the same age group: professional academy versus school rugby. STUDY DESIGN: Cohort study; Level of evidence, 2. METHODS: A 2-season (2006-2007 and 2007-2008) study obtained information on injuries sustained in male youth rugby union players (age, 16-18 years) from 12 English Premiership academies (n = 250) and 7 schools (n = 222). Match exposure (player-hours) and injury details were recorded. RESULTS: Match injury incidence was 47 per 1000 player-hours for the academy and 35 per 1000 player-hours for the school groups; these rates were statistically different (P = .026). The most common injury site was the lower limb and the most common injury type was a ligament sprain, with injuries to the knee and shoulder region resulting in the greatest burden of injury for both groups. The tackle event was the most common cause of match injury for both academy (51% of injuries) and school (57% of injuries) groups. CONCLUSION: Overall, the incidence of injury for youth rugby was lower than for previous studies in senior rugby, but injury patterns (location, type) and causes were similar. The study confirmed that match injury incidence was significantly greater in elite academy youth rugby union than schools rugby. The results suggest that the specific focus for injury risk management in youth rugby should be on players' tackle technique and prevention strategies for knee and shoulder injuries.


Subject(s)
Athletic Injuries/epidemiology , Football/injuries , Adolescent , England/epidemiology , Humans , Incidence , Male , Schools
14.
Br J Sports Med ; 44(15): 1093-9, 2010 Dec.
Article in English | MEDLINE | ID: mdl-20961921

ABSTRACT

INTRODUCTION: Recent studies report the incidence and epidemiology of injury in professional rugby union; however, there is limited research in amateur and youth rugby. Injuries in youth rugby may have consequences for sports participation and physical development. The authors performed a prospective cohort study of injuries during youth community rugby. METHODS: An injury surveillance programme was established for the 2008-2009 season (9 months, 1636 player-hours) of an English community rugby club. The study included 210 players, all males, in Under 9 to Under 17 (U9-U17) age groups. These were categorised into mini, junior, pubertal and school participation age groupings. Injuries were defined according to the International Rugby Board consensus statements. RESULTS: There were 39 injuries reported (overall injury rate 24/1000 player-hours). Injury rates ranged from 0 to 49.3/1000 player-hours. More injuries occurred in junior (34.2/1000 player-hours) than in minis (11.9/1000 player-hours) (p<0.025). Higher numbers of moderate (20.6/1000 player-hours, p<0.005) and severe (9.5/1000 player-hours, p<0.05) injuries occurred in the U16-U17 age groups compared with younger age groups (U9-U10), where only minor injuries were reported. Most injuries occurred in the tackle (59%). The knee (4.9/1000 player-hours), shoulder (4.9/1000 player-hours) and head (4.3/1000 player-hours) were the most commonly affected areas. Concussion (1.8/1000 player-hours) accounted for half of the head injuries. CONCLUSIONS: Injuries in youth rugby occur infrequently, and injury rates are lower than in adult series. The risk of injury and severity of injury increase with age. This study highlights the need for further research into injury risk factors around puberty and the need for first aid provision.


Subject(s)
Football/injuries , Musculoskeletal System/injuries , Adolescent , Age Distribution , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Child , England/epidemiology , Humans , Incidence , Male , Prospective Studies , Recovery of Function , Risk Factors
15.
J Gastrointest Surg ; 14(7): 1081-9, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20354809

ABSTRACT

INTRODUCTION: Treatment options for patients with fecal incontinence (FI) are limited, and surgical treatments can be associated with high rates of infection and other complications. One treatment, sacral nerve stimulation (SNS), is approved for FI in Europe. A large multicenter trial was conducted in North America and Australia to assess the efficacy of SNS in patients with chronic fecal incontinence. The aim of this report was to analyze the infectious complication rates in that trial. METHODS: Adult patients with a history of chronic fecal incontinence were enrolled into this study. Those patients who fulfilled study inclusion/exclusion criteria and demonstrated greater than two FI episodes per week underwent a 2-week test phase of SNS. Patients who showed a > or = 50% reduction in incontinent episodes and/or days per week underwent chronic stimulator implantation. Adverse events were reported to the sponsor by investigators at each study site and then coded. All events coded as implant site infection were included in this analysis. RESULTS: One hundred twenty subjects (92% female, 60.5 +/- 12.5 years old) received a chronically implanted InterStim Therapy device (Medtronic, Minneapolis, MN, USA). Patients were followed for an average of 28 months (range 2.2-69.5). Thirteen of the 120 implanted subjects (10.8%) reported infection after the chronic system implant. One infection spontaneously resolved and five were successfully treated with antibiotics. Seven infections (5.8%) required surgical intervention, with infections in six patients requiring full permanent device explantation. The duration of the test stimulation implant procedure was similar between the infected group (74 min) and the non-infected group (74 min). The average duration of the chronic neurostimulator implant procedure was also similar between the infected (39 min) and non-infected group (37 min). Nine infections occurred within a month of chronic system implant and the remaining four infections occurred more than a year from implantation. While the majority (7/9) of the early infections were successfully treated with observation, antibiotics, or system replacement, all four of the late infections resulted in permanent system explantation. CONCLUSION: SNS for FI resulted in a relatively low infection rate. This finding is especially important because the only other Food and Drug Administration-approved treatment for end-stage FI, the artificial bowel sphincter, has a much higher reported infection rate. Combined with its published high therapeutic success rate, this treatment has a positive risk/benefit profile.


Subject(s)
Electric Stimulation Therapy/adverse effects , Electrodes, Implanted/adverse effects , Fecal Incontinence/therapy , Infections/etiology , Lumbosacral Plexus/physiology , Adult , Aged , Aged, 80 and over , Chronic Disease , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prospective Studies , Treatment Outcome
16.
Ann Surg ; 251(3): 441-9, 2010 Mar.
Article in English | MEDLINE | ID: mdl-20160636

ABSTRACT

BACKGROUND: Sacral nerve stimulation (SNS) has been approved for use in treating urinary incontinence in the United States since 1997, and in Europe for both urinary and fecal incontinence (FI) since 1994. The purpose of this study was to determine the safety and efficacy of sacral nerve stimulation in a large population under the rigors of a Food and Drug Administration-approved investigational protocol. METHODS: Candidates for SNS who provided informed consent were enrolled in this Institutional Review Board-approved multicentered prospective trial. Patients showing > or =50% improvement during test stimulation received chronic implantation of the InterStim Therapy (Medtronic; Minneapolis, MN). The primary efficacy objective was to demonstrate that > or =50% of subjects would achieve therapeutic success, defined as > or =50% reduction of incontinent episodes per week at 12 months compared with baseline. RESULTS: A total of 133 patients underwent test stimulation with a 90% success rate, and 120 (110 females) of a mean age of 60.5 years and a mean duration of FI of 6.8 years received chronic implantation. Mean follow-up was 28 (range, 2.2-69.5) months. At 12 months, 83% of subjects achieved therapeutic success (95% confidence interval: 74%-90%; P < 0.0001), and 41% achieved 100% continence. Therapeutic success was 85% at 24 months. Incontinent episodes decreased from a mean of 9.4 per week at baseline to 1.9 at 12 months and 2.9 at 2 years. There were no reported unanticipated adverse device effects associated with InterStim Therapy. CONCLUSION: Sacral nerve stimulation using InterStim Therapy is a safe and effective treatment for patients with FI.
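The trial's primary-endpoint rule reduces to a simple per-patient check. A sketch of that rule as stated in the abstract (the 9.4 → 1.9 episodes/week figures are the reported cohort means, used here only as example inputs, not an individual patient's data):

```python
def therapeutic_success(baseline_per_week, followup_per_week):
    """Primary-endpoint rule as described in the abstract:
    success = >=50% reduction in weekly incontinent episodes vs baseline."""
    if baseline_per_week <= 0:
        raise ValueError("baseline episode count must be positive")
    reduction = (baseline_per_week - followup_per_week) / baseline_per_week
    return reduction >= 0.5

print(therapeutic_success(9.4, 1.9))  # True: ~80% reduction
print(therapeutic_success(10.0, 6.0))  # False: only a 40% reduction
```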


Subject(s)
Electric Stimulation Therapy , Fecal Incontinence/therapy , Adult , Aged , Aged, 80 and over , Female , Humans , Lumbosacral Plexus , Male , Middle Aged , Prospective Studies
17.
J Diabetes Sci Technol ; 2(4): 595-602, 2008 Jul.
Article in English | MEDLINE | ID: mdl-19885235

ABSTRACT

BACKGROUND: We tested the hypothesis that glucose can be measured continuously and reliably in patients in diverse settings using a transdermal biosensor coupled to a permeated skin site. In addition, we compared a novel, abrasion-based skin permeation method to an ultrasound-based method for transdermal continuous glucose monitoring. METHOD: Transdermal continuous glucose monitors were applied to patients with diabetes (study I), patients undergoing cardiac surgery (study II), and healthy volunteers (study III). Reference blood glucose measurements were performed with glucometers or standard blood glucose analyzers. At the conclusion of the 24-hour study, data were postprocessed for comparison with the reference blood glucose values collected during the study period. RESULTS: Data were validated for 10 subjects for 12 hours in study I, 8 subjects for 24 hours in study II, and 6 subjects in study III. The transdermal continuous glucose monitors usually required 1 hour of warm-up. Depending on the study setting, single or multiple calibrations were applied to the datasets. Comparing predicted glucose versus reference blood glucose values, we found that study I yielded 89.6% in zone A and 9.0% in zone B in the Clarke error grid (222 data points), study II yielded 86.4% in zone A and 13.6% in zone B (147 data points), and study III yielded 89.9% in zone A and 10.1% in zone B (378 data points). CONCLUSIONS: Continuous transdermal glucose monitoring was demonstrated successfully in diverse clinical settings. The abrasion-based skin permeation method performed equivalently to the ultrasound-based method for transdermal glucose monitoring.

18.
J Anesth ; 20(4): 307-11, 2006.
Article in English | MEDLINE | ID: mdl-17072697

ABSTRACT

The physiologic properties of the B-type natriuretic peptide nesiritide include pulmonary, coronary, and renal arterial vasodilation and lusitropic effects on ventricular myocardium. These effects may be useful during cardiac surgery, particularly when myocardial function and cardiac output (CO) are compromised. Intraoperative hemodynamic data were collected retrospectively before and 5-15 min following completion of a nesiritide loading dose in 15 adult cardiac surgical patients with low CO associated with pulmonary hypertension, low left ventricular ejection fraction, diastolic dysfunction, or left ventricular assist device placement. In seven patients, prior alternative pharmacologic interventions had failed to improve CO, and fluid challenges were ineffective in six patients with diastolic dysfunction. Perioperative nesiritide administration (2 µg·kg⁻¹ loading dose, followed by 0.01 µg·kg⁻¹·min⁻¹ for a maximum of 24 h) resulted in a statistically significant median increase in CO of 35% (P = 0.0006). In conclusion, nesiritide was associated with increased CO in patients with low CO syndromes undergoing cardiac surgery, when other measures had failed. This novel agent may offer an additional option to inotropes and fluid challenges for these patients perioperatively. Randomized clinical trials are desirable to determine the risks and benefits of nesiritide and to elucidate its role for the cardiac anesthesiologist.
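The reported endpoint is the median of per-patient percentage change in CO before versus after the loading dose. A minimal sketch of that computation, using hypothetical CO values (L/min) rather than the study's patient data:

```python
from statistics import median

def median_percent_change(before, after):
    """Median of per-patient percentage change: 100 * (after - before) / before."""
    changes = [100.0 * (a - b) / b for b, a in zip(before, after)]
    return median(changes)

# Hypothetical pre/post cardiac output values (L/min) for four patients:
co_before = [3.0, 2.5, 3.2, 2.8]
co_after = [4.0, 3.5, 4.2, 3.8]
print(round(median_percent_change(co_before, co_after), 1))
```

Because the median is robust to outliers, it is a common summary for small hemodynamic samples like this 15-patient series.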


Subject(s)
Cardiac Output, Low/therapy , Natriuretic Agents/therapeutic use , Natriuretic Peptide, Brain/therapeutic use , Aged , Aged, 80 and over , Carbon Monoxide/analysis , Cardiac Output, Low/physiopathology , Cardiac Surgical Procedures , Echocardiography, Transesophageal , Female , Heart-Assist Devices , Humans , Hypertension/drug therapy , Male , Middle Aged , Monitoring, Physiologic/methods , Perioperative Care , Retrospective Studies
19.
Ann Thorac Surg ; 81(3): 880-5, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16488688

ABSTRACT

BACKGROUND: Only a few studies have examined risk factors for prolonged intensive care unit (ICU) stay in cardiac surgery patients, and their results conflict. The focus of this study was twofold: first, to evaluate preoperative, intraoperative, and postoperative risk factors for ICU stay greater than 3 days in a cardiac surgery patient population; second, to evaluate long-term survival in cardiac surgery patients with prolonged ICU stay. METHODS: Records from 2,683 cardiac surgery patients were retrospectively evaluated. Univariate and multivariate analyses for risk factors were performed for an ICU stay greater than 3 days. Thereafter, 2,563 patients were enrolled in a follow-up study for an observational time of 3 years after surgery. RESULTS: Mortality was dependent on renal, respiratory, and heart failure, as well as age, elevated APACHE II scores, and reexploration. Long-term survival analyses demonstrated a significantly lower survival in patients with longer ICU stay. However, the 6-month to 3-year long-term survival was comparable with survival in patients without prolonged ICU stay. CONCLUSIONS: Because of the increasing acuity of patients needing cardiac surgery, it is important to identify those at risk for a prolonged ICU course. It is therefore of paramount interest to implement measures throughout their entire hospital stay that would maximize organ function to improve survival and resource utilization.


Subject(s)
Cardiac Surgical Procedures/adverse effects , Intensive Care Units , Length of Stay , APACHE , Aged , Cardiac Surgical Procedures/mortality , Coronary Artery Bypass/adverse effects , Coronary Artery Bypass/mortality , Humans , Intraoperative Period , Middle Aged , Multivariate Analysis , Patient Selection , Postoperative Period , Preoperative Care , Retrospective Studies , Risk Factors , Survivors , Treatment Outcome
20.
Chem Commun (Camb) ; (5): 549-51, 2006 Feb 07.
Article in English | MEDLINE | ID: mdl-16432579

ABSTRACT

A new and remarkably facile sp³ C-O bond-forming reaction of β-hydroxyalkyl Rh porphyrins to form epoxides has been discovered and its mechanism investigated.


Subject(s)
Biomedical Research , Epoxy Compounds/chemistry , Hydroxides/chemistry , Porphyrins/chemistry , Rhodium/chemistry , Alkylation , Carbon/chemistry , Catalysis , Magnetic Resonance Spectroscopy , Molecular Structure , Oxygen/chemistry , Stereoisomerism