Results 1 - 20 of 69
1.
Med Sci Sports Exerc ; 55(2): 167-176, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36084228

ABSTRACT

OBJECTIVE: The objective of this blinded parallel-arm randomized controlled trial was to investigate the effect of resistance training (RT) on pain, maximal strength, and shoulder function in breast cancer survivors (BCS) with persistent pain after treatment. METHODS: Twenty BCS with self-reported pain ≥1.5 yr after treatment were randomized to an experimental group (EXP, n = 10), who performed a supervised progressive total-body heavy RT program 2 times per week for 12 wk, or a control group (CON, n = 10), who were instructed to continue their everyday life. Perceived pain intensity, pressure pain threshold (PPT) levels, one-repetition maximum (1RM), and active range of motion were collected pre- and postintervention and at 3-month follow-up. RESULTS: There was a significant 11% decrease in peak pain intensity (P < 0.05) for both groups, and a significant 48% increase in 1RM (P < 0.05) and a significant 35% increase in PPT levels (P < 0.001) for EXP, but not for CON. For EXP, maximal strength at follow-up was still significantly greater than at preintervention (P < 0.05), whereas PPT levels had reverted to baseline. There was no change in active range of motion (P > 0.05) and no change in arm circumference (P > 0.05). CONCLUSIONS: RT had a significant effect on the 1RM and PPT of BCS with persistent pain after treatment, demonstrating both a functional and an analgesic effect of progressive RT in this population. Strength was largely maintained after detraining, whereas PPT levels were not, indicating that the process of RT, rather than the gain in strength, may be associated with analgesia.


Subject(s)
Breast Neoplasms , Resistance Training , Humans , Female , Breast Neoplasms/therapy , Pain , Exercise Therapy , Analgesics/therapeutic use , Muscle Strength
2.
J Strength Cond Res ; 36(6): 1540-1547, 2022 Jun 01.
Article in English | MEDLINE | ID: mdl-33677460

ABSTRACT

ABSTRACT: Kristiansen, M, Thomsen, MJ, Nørgaard, J, Aaes, J, Knudsen, D, and Voigt, M. The effect of anodal transcranial direct current stimulation on quadriceps maximal voluntary contraction, corticospinal excitability, and voluntary activation levels. J Strength Cond Res 36(6): 1540-1547, 2022-Anodal transcranial direct current stimulation (a-tDCS) has previously been shown to improve maximal isometric voluntary contraction (MVIC), possibly through an upregulation of corticospinal excitability. Because muscle strength is an essential part of the performance of many sports, any ergogenic effect of a-tDCS on this parameter could potentially increase performance outcomes. The purpose of this study was to investigate the effect of a-tDCS on MVIC, voluntary activation levels (VALs), and corticospinal excitability, assessed by eliciting motor-evoked potentials (MEPs), in untrained subjects. Thirteen subjects completed 2 test sessions in which they received either a-tDCS or sham stimulation for 3 consecutive intervals of 10 minutes, separated by 5-minute breaks. Before and after each stimulation session, transcranial magnetic stimulation was used to elicit MEPs, and femoral nerve stimulation was used to assess VAL by measuring twitch torque during an MVIC test and in a relaxed state. Two-way analyses of variance with statistical significance set at p ≤ 0.05 were used to test for differences. A significant main effect was identified, as the MVIC pre-test (271.2 ± 56.6 Nm) was on average 4.1% higher compared to the post-test (260.6 ± 61.4 Nm) (p = 0.05). No significant differences were found in MEP, MVIC, or VAL as a result of stimulation type or time. In healthy subjects, the potential for improvement in corticospinal excitability may be negligible, which may in turn explain the lack of improvements in MEP, MVIC, and VAL after a-tDCS. 
The small decrease in MVIC under both conditions and the nonsignificant changes in MEP and VAL do not justify the use of a-tDCS in sporting contexts where the intent is to increase maximal isometric strength of the quadriceps muscle in healthy subjects.


Subject(s)
Transcranial Direct Current Stimulation , Evoked Potentials, Motor/physiology , Humans , Isometric Contraction/physiology , Muscle, Skeletal/physiology , Quadriceps Muscle , Transcranial Magnetic Stimulation
3.
PLoS One ; 16(7): e0254888, 2021.
Article in English | MEDLINE | ID: mdl-34270614

ABSTRACT

Anodal transcranial direct current stimulation (a-tDCS) has been shown to improve bicycle time to fatigue (TTF) tasks at 70-80% of VO2max and downregulate rating of perceived exertion (RPE). This study aimed to investigate the effect of a-tDCS on an RPE-clamp test, a 250-kJ time trial (TT) and motor evoked potentials (MEPs). Twenty participants volunteered for three trials: control, sham stimulation and a-tDCS. Transcranial magnetic stimulation was used to determine corticospinal excitability for 12 participants pre and post sham stimulation and a-tDCS. The a-tDCS protocol consisted of 13 minutes of stimulation (2 mA) with the anode placed above the Cz. The RPE-clamp test consisted of 5 minutes of ergometer bicycling at an RPE of 13 on the Borg scale, and the TT consisted of a 250-kJ (∼10 km) bicycle ergometer test. During each test, power output, heart rate and oxygen consumption were measured, while RPE was evaluated. MEPs increased significantly by 36% (±36%) post a-tDCS, compared with 8.8% (±31%) post sham stimulation (p = 0.037). No significant changes were found for any parameter in the RPE-clamp test or TT. The lack of improvement may be due to RPE being more controlled by afferent feedback during TT tests than during TTF tests. Based on the results of the present study, it is concluded that a-tDCS applied over Cz does not enhance self-paced cycling performance.


Subject(s)
Athletic Performance/physiology , Evoked Potentials, Motor/physiology , Motor Cortex/physiology , Pyramidal Tracts/physiology , Transcranial Direct Current Stimulation/methods , Adult , Bicycling/physiology , Female , Healthy Volunteers , Humans , Male , Transcranial Magnetic Stimulation , Young Adult
4.
Gait Posture ; 86: 319-326, 2021 05.
Article in English | MEDLINE | ID: mdl-33839426

ABSTRACT

BACKGROUND: The walk-to-run transition, which occurs during gradually increasing locomotion speed, has been addressed in research for at least eight decades. RESEARCH QUESTION: Why does the walk-to-run transition occur? In the present review, we focus on the reason for the transition rather than on its consequences, which have historically constituted the primary focus. METHODS: We scrutinize the related literature. RESULTS: We present a unifying conceptual framework of the dynamics of human locomotion. The framework unifies observations of the human walk-to-run transition to provide a common understanding. Further, the framework includes a schematic representation of the dynamic interaction between entities of subsystems of the human body during locomotion and the physical environment. We propose that the moving human body can behave as a non-linear complex dynamic system, which basically functions in a self-organized fashion during locomotion, and that the stride rate plays a key role in the transition. Finally, we propose that the coincidence between attractor stability and minimum energy turnover during locomotion is a consequence of the evolution of the phenotype of the adult human body and the dynamics of the acute process of self-organization during locomotion. SIGNIFICANCE: The novel insight from the present work contributes to the academic understanding of human locomotion, in particular the central behavioural phenomenon of the walk-to-run transition. Furthermore, this understanding is relevant for ongoing work within, for example, locomotion rehabilitation and the development of assistive devices. Regarding the latter, examples include devices within neurorobotics and exoskeletons, where a basic understanding of human locomotion increases the possibility of a successful combination of human and technology.


Subject(s)
Running/physiology , Walking/physiology , Adult , Biomechanical Phenomena , Female , Humans , Male
5.
J Mot Behav ; 53(3): 351-363, 2021.
Article in English | MEDLINE | ID: mdl-32525455

ABSTRACT

The present study investigated whether the duration of the first tapping bout, which could also be considered 'the priming', would play a role in the occurrence of the behavioral phenomenon termed repeated bout rate enhancement. Eighty-eight healthy individuals were recruited. Sixty-three of these demonstrated repeated bout rate enhancement, and they were assigned to two different groups, which performed either active or passive tapping as priming. The durations of the first tapping bouts, which acted as priming, were 20, 60, 120, and 180 s. Following the first bout, there was a 10 min rest and a subsequent 180 s tapping bout performed at a freely chosen tapping rate. Vertical displacement and tapping force data were recorded. Rate enhancement was elicited independently of the duration of the first bout in both groups. Rate enhancement occurred without concurrent changes in the magnitude of vertical displacement, time to peak force, and duration of the finger contact phase. The peak force was reduced when 180 s of tapping had been performed as priming. The increased tapping rate following priming by as little as 20 s of active or passive tapping, as observed here, is suggested to be a result of increased net excitability of the nervous system.


Subject(s)
Fingers , Motor Activity , Humans
6.
J Clin Rheumatol ; 27(8): e561-e567, 2021 Dec 01.
Article in English | MEDLINE | ID: mdl-33065628

ABSTRACT

ABSTRACT: Clinicians usually easily recognize cranial manifestations of giant cell arteritis (GCA) such as new-onset headache, jaw claudication, scalp tenderness, and abrupt changes in visual acuity or blindness; however, when presented with an aberrant clinical course, the diagnosis becomes more elusive. In addition to temporal arteries and other extracranial branches of the carotid arteries, large vessel vasculitis (LVV) can also affect other blood vessels including coronary arteries, the aorta with its major branches, intracranial blood vessels, and hepatic arteries. Over time, the scope of the symptoms typically associated with LVV has broadened and includes cases of fever of unknown origin accompanied by other constitutional symptoms that can mimic a range of neoplastic and infectious diseases. In up to half of patients with atypical LVV, liver enzyme level elevations with a cholestatic pattern have been observed. Alkaline phosphatase and γ-glutamyl transferase level elevations tend to be more prevalent in those LVV patients with vigorous inflammatory responses, particularly in those with fever and other nonspecific constitutional symptoms. These patients also have more profound anemia and thrombocytosis. With the exception of rare instances of vasculitides and granulomas affecting the liver tissue, liver biopsy is generally of little help and primarily shows nonspecific changes of fatty liver. In this article, we review 3 patients who were eventually diagnosed with atypical LVV. The diagnosis was confirmed with temporal artery biopsy in 2 patients and with positron emission tomography/computed tomography in 1 patient. The common hepatic abnormality observed in all patients was an elevated alkaline phosphatase level, which tended to respond rapidly to initiation of immunosuppressive treatment.


Subject(s)
Giant Cell Arteritis , Aorta , Giant Cell Arteritis/complications , Giant Cell Arteritis/diagnosis , Humans , Liver , Positron Emission Tomography Computed Tomography , Temporal Arteries
7.
Hum Mov Sci ; 68: 102520, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31654912

ABSTRACT

In this study we investigated motor variability in individuals who showed (responders) and who did not show (non-responders) a behavioural phenomenon termed repeated bout rate enhancement. The phenomenon is characterized by an increase of the freely chosen index finger tapping rate during the second of two consecutive tapping bouts. It was hypothesized that responders would perform (i) tapping with a lower magnitude, but more complex structure of variability than non-responders and (ii) bout 2 with a lower magnitude and increased complexity of variability than bout 1, as opposed to non-responders. Individuals (n = 102) performed two 3-min tapping bouts separated by 10 min rest. Kinetic and kinematic recordings were performed. Standard deviation (SD), coefficient of variation (CV), and sample entropy (SaEn), representing magnitude and complexity of variability, were computed. For responders, SaEn of vertical displacement of the index finger was higher than for non-responders (p = .046). Further, SaEn of vertical force and vertical displacement was higher in bout 2 than in bout 1 for responders (p < .001 and p = .006, respectively). In general, SD of vertical displacement was lower in bout 2 than in bout 1 (p < .001). SaEn of vertical force was higher in bout 2 than in bout 1 (p = .009). The present lower SD and higher SaEn values of vertical force and displacement time series in bout 2 as compared to bout 1 suggest differences in the dynamics of finger tapping. Further, it is possible that the increases in SaEn of vertical displacement reflected a greater adaptability in the dynamics of motor control among responders compared with non-responders.
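The magnitude and complexity measures named in this abstract (SD, CV, sample entropy) can be illustrated with a minimal from-scratch sketch. The commonly used sample-entropy parameters m = 2 and r = 0.2 × SD are assumed here, since the abstract does not state the paper's exact parameter choices:

```python
import math

def coefficient_of_variation(x):
    """CV: sample standard deviation normalized by the mean (magnitude of variability)."""
    mean = sum(x) / len(x)
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / (len(x) - 1))
    return sd / mean

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy (complexity of variability): -ln(A/B), where B counts pairs
    of length-m templates matching within tolerance r, and A the same for
    length m + 1 (Richman & Moorman's SampEn). r = r_factor * SD is a common
    convention; the paper's exact settings are an assumption here."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))
    r = r_factor * sd

    def count_matches(length):
        # Compare templates starting at i and j; both loops stop at n - m so
        # that the m and m+1 counts are taken over the same template set.
        c = 0
        for i in range(n - m):
            for j in range(i + 1, n - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    c += 1
        return c

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

Lower sample entropy indicates a more regular signal: a perfectly periodic series yields 0, which is consistent with reading higher SaEn values as greater complexity of the tapping time series.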


Subject(s)
Fingers/physiology , Psychomotor Performance/physiology , Adult , Biomechanical Phenomena/physiology , Entropy , Female , Humans , Kinetics , Male , Movement/physiology , Time Factors , Young Adult
8.
Hum Mov Sci ; 67: 102510, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31442623

ABSTRACT

The purpose of this study was to explore the level of inter- and intra-individual variability in the kinematic profiles of the back squat movement among skilled weightlifters. Ten competitive weightlifters volunteered for participation in this study. Barbell velocity (VBarbell) and angular velocity of the ankle (ωAnkle), knee (ωKnee) and hip joint (ωHip) were obtained by kinematic recording of six trials at 90% of 1RM in the back squat. Inter-individual variability was assessed by analysing inter-individual differences in the velocity curves through the statistical parametric mapping method. Intra-individual variability was assessed through a correlation analysis between the barbell velocity curves of each trial for each participant. Partial least squares regression analysis was performed to relate changes in intra-individual variability to movement and anthropometric characteristics. Significant inter- and intra-individual differences were identified in VBarbell, ωAnkle, ωKnee, and ωHip (p ≤ 0.05). Having a short trunk and thigh and a long shin, in combination with greater anterior-posterior displacement of the barbell and slower velocities during the acceleration phase, increased intra-individual movement consistency over movement variability. The results of the present study clearly demonstrate that skilled weightlifters display both significant inter- and intra-individual variability in the successful execution of the back squat.


Subject(s)
Movement/physiology , Posture/physiology , Weight Lifting/physiology , Acceleration , Adult , Analysis of Variance , Ankle Joint/physiology , Biomechanical Phenomena , Hip Joint/physiology , Humans , Knee Joint/physiology , Male , Psychomotor Performance/physiology , Thigh/physiology
9.
J Hepatobiliary Pancreat Sci ; 26(9): 393-400, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31211912

ABSTRACT

INTRODUCTION: Interferon (IFN) treatment for liver transplant (LT) recipients with hepatitis C virus (HCV) increases acute cellular rejection (ACR) and worsens graft and patient survival. It is unknown if direct-acting antivirals (DAAs) affect rejection rates or post-transplant survival. METHOD: The United Network for Organ Sharing STAR files of December 2017 (n = 25,916) were analyzed. RESULTS: Compared with non-HCV-LT, HCV-LT survival was worse in the IFN-era (2007-2008) and the IFN+DAA-era (2011), but not in the DAA-era (2014-2015). ACR6m has been less frequent in newer eras and was lower in HCV-LT than in non-HCV-LT in both the DAA-era (6.9% vs. 9.3%, P < 0.001) and the IFN+DAA-era (8.8% vs. 11.8%, P = 0.001), but not in the IFN-era (10.8% vs. 11.0%, P = 0.39). HCV-LT recipients who had ACR6m had worse 2-year survival than those without ACR6m in the IFN-era (80.0% vs. 88.4%, P < 0.0001) and in the IFN+DAA-era (81.4% vs. 89.2%, P < 0.01), but not in the DAA-era (90.4% vs. 93.2%, P = 0.085). A Cox proportional hazard model identified ACR6m as an independent risk factor for mortality in HCV-LT in the IFN-era (HR = 1.88, P ≤ 0.001) and in the IFN+DAA-era (HR = 1.84, P = 0.005), but not in the DAA-era (P = n.s.). CONCLUSIONS: Two-year survival of HCV-LT recipients was significantly better in the DAA-era; this was associated with a reduced rate and impact of ACR6m.


Subject(s)
Antiviral Agents/therapeutic use , Graft Rejection/epidemiology , Hepatitis C, Chronic/surgery , Liver Transplantation , Female , Graft Rejection/mortality , Graft Survival , Hepatitis C, Chronic/drug therapy , Hepatitis C, Chronic/mortality , Humans , Interferons/therapeutic use , Liver Transplantation/mortality , Male , Middle Aged , Recurrence , Retrospective Studies , Risk Factors , United States/epidemiology
10.
Gastroenterology ; 157(2): 472-480.e5, 2019 08.
Article in English | MEDLINE | ID: mdl-30998988

ABSTRACT

BACKGROUND & AIMS: Early liver transplantation (without requiring a minimum period of sobriety) for severe alcohol-associated hepatitis (AH) is controversial: many centers delay eligibility until a specific period of sobriety (such as 6 months) has been achieved. To inform ongoing debate and policy, we modeled long-term outcomes of early vs delayed liver transplantation for patients with AH. METHODS: We developed a mathematical model to simulate early vs delayed liver transplantation for patients with severe AH and different amounts of alcohol use after transplantation: abstinence, slip (alcohol use followed by sobriety), or sustained use. Mortality of patients before transplantation was determined by joint-effect model (based on Model for End-Stage Liver Disease [MELD] and Lille scores). We estimated life expectancies of patients receiving early vs delayed transplantation (6-month wait before placement on the waitlist) and life years lost attributable to alcohol use after receiving the liver transplant. RESULTS: Patients offered early liver transplantation were estimated to have an average life expectancy of 6.55 life years, compared with an average life expectancy of 1.46 life years for patients offered delayed liver transplantation (4.49-fold increase). The net increase in life expectancy from offering early transplantation was highest for patients with Lille scores of 0.50-0.82 and MELD scores of 32 or more. Patients who were offered early transplantation and had no alcohol use afterward were predicted to survive 10.85 years compared with 3.62 years for patients with sustained alcohol use after transplantation (7.23 life years lost). Compared with delayed transplantation, early liver transplantation increased survival times in all simulated scenarios and combinations of Lille and MELD scores. 
CONCLUSIONS: In a modeling study assuming carefully selected patients with AH, early liver transplantation, compared with delayed transplantation (6 months of abstinence from alcohol before transplantation), increased survival times of patients, regardless of the estimated risk of sustained alcohol use after transplantation. These findings support early liver transplantation for patients with severe AH. The net increase in life expectancy was maintained in all simulated extreme scenarios but should be confirmed in prospective studies. Sustained alcohol use after transplantation significantly reduced but did not eliminate the benefits of early transplantation. Strategies are needed to prevent and treat posttransplantation use of alcohol.


Subject(s)
End Stage Liver Disease/surgery , Hepatitis, Alcoholic/surgery , Liver Transplantation/methods , Models, Biological , Time-to-Treatment , Adult , Alcohol Abstinence , Alcohol Drinking/adverse effects , Alcohol Drinking/prevention & control , End Stage Liver Disease/diagnosis , End Stage Liver Disease/etiology , End Stage Liver Disease/mortality , Female , Hepatitis, Alcoholic/complications , Hepatitis, Alcoholic/diagnosis , Hepatitis, Alcoholic/mortality , Humans , Life Expectancy , Liver Transplantation/standards , Male , Middle Aged , Prospective Studies , Risk Assessment/methods , Severity of Illness Index , Survival Analysis , Time Factors , Treatment Outcome , Waiting Lists
11.
Hepatology ; 69(4): 1477-1487, 2019 04.
Article in English | MEDLINE | ID: mdl-30561766

ABSTRACT

Early liver transplant (LT) for alcohol-associated disease (i.e., without a specific sobriety period) is controversial but increasingly used. Using the multicenter American Consortium of Early Liver Transplantation for Alcoholic Hepatitis (ACCELERATE-AH) cohort, we aimed to develop a predictive tool to identify patients pretransplant with low risk for sustained alcohol use posttransplant to inform selection of candidates for early LT. We included consecutive ACCELERATE-AH LT recipients between 2012 and 2017. All had clinically diagnosed severe alcoholic hepatitis (AH), no prior diagnosis of liver disease or AH, and underwent LT without a specific sobriety period. Logistic and Cox regression, classification and regression trees (CARTs), and least absolute shrinkage and selection operator (LASSO) regression were used to identify variables associated with sustained alcohol use post-LT. Among 134 LT recipients for AH with median period of alcohol abstinence pre-LT of 54 days, 74% were abstinent, 16% had slips only, and 10% had sustained alcohol use after a median 1.6 (interquartile range [IQR]: 0.7-2.8) years follow-up post-LT. Four variables were associated with sustained use of alcohol post-LT, forming the Sustained Alcohol Use Post-LT (SALT) score (range: 0-11): >10 drinks per day at initial hospitalization (+4 points), multiple prior rehabilitation attempts (+4 points), prior alcohol-related legal issues (+2 points), and prior illicit substance abuse (+1 point). The C statistic was 0.76 (95% confidence interval [CI]: 0.68-0.83). A SALT score ≥5 had a 25% positive predictive value (95% CI: 10%-47%) and a SALT score of <5 had a 95% negative predictive value (95% CI: 89%-98%) for sustained alcohol use post-LT. In internal cross-validation, the average C statistic was 0.74. Conclusion: A prognostic score, the SALT score, using four objective pretransplant variables identifies candidates with AH for early LT who are at low risk for sustained alcohol use posttransplant. 
This tool may assist in the selection of patients with AH for early LT or in guiding risk-based interventions post-LT.
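Because the abstract states the point weights of the SALT score explicitly (range 0-11, with a decision threshold of 5), the score can be sketched as a small function. The function and parameter names below are mine, not the paper's:

```python
def salt_score(over_10_drinks_per_day, multiple_rehab_attempts,
               alcohol_legal_issues, illicit_substance_abuse):
    """Sustained Alcohol use post-LT (SALT) score, range 0-11, built from the
    four pretransplant variables and weights reported in the abstract."""
    score = 0
    if over_10_drinks_per_day:     # >10 drinks/day at initial hospitalization
        score += 4
    if multiple_rehab_attempts:    # multiple prior rehabilitation attempts
        score += 4
    if alcohol_legal_issues:       # prior alcohol-related legal issues
        score += 2
    if illicit_substance_abuse:    # prior illicit substance abuse
        score += 1
    return score

def high_risk(score, threshold=5):
    """Per the abstract: score >= 5 had 25% PPV, and score < 5 had 95% NPV,
    for sustained alcohol use post-LT."""
    return score >= threshold
```

Note that the high negative predictive value below the threshold, rather than the modest positive predictive value above it, is what makes the score useful for identifying low-risk candidates.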


Subject(s)
Alcohol Drinking , Hepatitis, Alcoholic/surgery , Liver Transplantation , Postoperative Complications , Adult , Cohort Studies , Female , Humans , Logistic Models , Male , Middle Aged , Risk Assessment
12.
Front Neurosci ; 12: 526, 2018.
Article in English | MEDLINE | ID: mdl-30108479

ABSTRACT

Voluntary rhythmic movements, such as locomotion and other cyclic tasks, are fundamental during everyday life. Patients with impaired neural or motor function often take part in rehabilitation programs, which include rhythmic movements. Therefore, it is imperative to have the best possible understanding of the control and behaviour of human voluntary rhythmic movements. A behavioural phenomenon termed repeated bout rate enhancement has been established as an increase of the freely chosen index finger tapping frequency during the second of two consecutive tapping bouts. The present study investigated whether the phenomenon would be elicited when the first bout consisted of imposed passive finger tapping or air tapping. These two forms of tapping were applied since they can be performed without descending drive (passive tapping) and without afferent feedback related to impact (air tapping), as compared with tapping on a surface. Healthy individuals (n = 33) performed 3-min tapping bouts separated by 10 min rest. Surface electromyographic, kinetic, and kinematic data were recorded. Supportive experiments were performed to measure, for example, the cortical sensory evoked potential (SEP) response during the three different forms of tapping. Results showed that tapping frequencies in the second of two consecutive bouts increased by 12.9 ± 14.8% (p < 0.001), 9.9 ± 6.0% (p = 0.001), and 16.8 ± 13.6% (p = 0.005) when the first bout had consisted of tapping, passive tapping, and air tapping, respectively. Rate enhancement occurred without an increase in muscle activation. Moreover, the rate enhancements occurred even though tapping, as compared with passive tapping and air tapping, resulted in different cortical SEP responses. Based on the present findings, it can be suggested that sensory feedback in an initial bout increases the excitability of the spinal central pattern generators involved in finger tapping, which may explain the phenomenon of repeated bout rate enhancement seen after a consecutive bout of finger tapping.

13.
Gastroenterology ; 155(2): 422-430.e1, 2018 08.
Article in English | MEDLINE | ID: mdl-29655837

ABSTRACT

BACKGROUND & AIMS: The American Consortium of Early Liver Transplantation for Alcoholic Hepatitis comprises 12 centers from 8 United Network for Organ Sharing regions studying early liver transplantation (LT) (without mandated period of sobriety) for patients with severe alcoholic hepatitis (AH). We analyzed the outcomes of these patients. METHODS: We performed a retrospective study of consecutive patients with a diagnosis of severe AH and no prior diagnosis of liver disease or episodes of AH, who underwent LT before 6 months of abstinence from 2006 through 2017 at 12 centers. We collected data on baseline characteristics, psychosocial profiles, level of alcohol consumption before LT, disease course and treatment, and outcomes of LT. The interval of alcohol abstinence was defined as the time between last drink and the date of LT. The primary outcomes were survival and alcohol use after LT, defined as slip or sustained. RESULTS: Among 147 patients with AH who received liver transplants, the median duration of abstinence before LT was 55 days; 54% received corticosteroids for AH and the patients had a median Lille score of 0.82 and a median Sodium Model for End-Stage Liver Disease score of 39. Cumulative patient survival percentages after LT were 94% at 1 year (95% confidence interval [CI], 89%-97%) and 84% at 3 years (95% CI, 75%-90%). Following hospital discharge after LT, 72% were abstinent, 18% had slips, and 11% had sustained alcohol use. The cumulative incidence of any alcohol use was 25% at 1 year (95% CI, 18%-34%) and 34% at 3 years (95% CI, 25%-44%) after LT. The cumulative incidence of sustained alcohol use was 10% at 1 year (95% CI, 6%-18%) and 17% at 3 years (95% CI, 10%-27%) after LT. In multivariable analysis, only younger age was associated with alcohol use following LT (P = .01). Sustained alcohol use after LT was associated with increased risk of death (hazard ratio, 4.59; P = .01).
CONCLUSIONS: In a retrospective analysis of 147 patients who underwent early LT (before 6 months of abstinence) for severe AH, we found that most patients survive for 1 year (94%) and 3 years (84%), similar to patients receiving liver transplants for other indications. Sustained alcohol use after LT was infrequent but associated with increased mortality. Our findings support the selective use of LT as a treatment for severe AH. Prospective studies are needed to optimize selection criteria, management of patients after LT, and long-term outcomes.


Subject(s)
Alcohol Abstinence/statistics & numerical data , Alcohol Drinking/epidemiology , Liver Diseases, Alcoholic/surgery , Liver Transplantation/statistics & numerical data , Patient Selection , Adult , Age Factors , Alcohol Drinking/adverse effects , Female , Follow-Up Studies , Humans , Incidence , Liver Diseases, Alcoholic/etiology , Liver Diseases, Alcoholic/mortality , Liver Transplantation/standards , Male , Middle Aged , Recurrence , Retrospective Studies , Risk Factors , Survival Analysis , Time Factors , Treatment Outcome , United States/epidemiology
14.
J Cancer Res Clin Oncol ; 144(3): 607-615, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29362916

ABSTRACT

PURPOSE: Non-melanoma skin cancer (NMSC) is the most common de novo malignancy in liver transplant (LT) recipients; it behaves more aggressively and increases mortality. We used decision tree analysis to develop a tool to stratify and quantify the risk of NMSC in LT recipients. METHODS: We performed Cox regression analysis to identify which predictive variables to enter into the decision tree analysis. Data were from the Organ Procurement Transplant Network (OPTN) STAR files of September 2016 (n = 102984). RESULTS: NMSC developed in 4556 of the 105984 recipients, a mean of 5.6 years after transplant. The 5/10/20-year rates of NMSC were 2.9/6.3/13.5%, respectively. Cox regression identified male gender, Caucasian race, age, body mass index (BMI) at LT, and sirolimus use as key predictive or protective factors for NMSC. These factors were entered into a decision tree analysis. The final tree stratified non-Caucasians as low risk (0.8%), and Caucasian males aged > 47 years with a BMI < 40 who did not receive sirolimus as high risk (7.3% cumulative incidence of NMSC). The predictions in the derivation set were almost identical to those in the validation set (r2 = 0.971, p < 0.0001). The cumulative incidences of NMSC in the low, moderate and high risk groups at 5/10/20 years were 0.5/1.2/3.3, 2.1/4.8/11.7 and 5.6/11.6/23.1%, respectively (p < 0.0001). CONCLUSIONS: The decision tree model accurately stratifies the long-term risk of developing NMSC after LT.
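Only two leaves of the published decision tree are spelled out in the abstract: non-Caucasians as low risk, and Caucasian males over 47 with BMI below 40 and no sirolimus as high risk. The following is therefore a hypothetical sketch, not the paper's full tree; all branches the abstract does not detail are grouped under "moderate" here:

```python
def nmsc_risk_group(caucasian, male, age, bmi, sirolimus):
    """Stratify NMSC risk after LT using the two leaves reported in the
    abstract; unspecified branches are lumped as 'moderate' (an assumption)."""
    if not caucasian:
        return "low"       # 0.8% cumulative incidence (reported leaf)
    if male and age > 47 and bmi < 40 and not sirolimus:
        return "high"      # 7.3% cumulative incidence (reported leaf)
    return "moderate"      # remaining branches, not detailed in the abstract
```

A stratification like this is the practical appeal of a decision tree over a Cox model: the risk group follows from a handful of yes/no splits that can be read off at the bedside.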


Subject(s)
Decision Support Techniques , Decision Trees , Liver Transplantation/adverse effects , Skin Neoplasms/etiology , Adult , Aged , Female , Humans , Immunosuppressive Agents/therapeutic use , Incidence , Liver Transplantation/statistics & numerical data , Male , Middle Aged , Neoplasms, Basal Cell/epidemiology , Neoplasms, Basal Cell/etiology , Neoplasms, Squamous Cell/epidemiology , Neoplasms, Squamous Cell/etiology , Risk Assessment , Risk Factors , Skin Neoplasms/epidemiology , Transplantation Conditioning/adverse effects , Transplantation Conditioning/statistics & numerical data
15.
Gait Posture ; 60: 71-75, 2018 02.
Article in English | MEDLINE | ID: mdl-29161625

ABSTRACT

The transition from walking to running has previously been predicted to occur at the point where the stride frequency starts getting closer to the running attractor than to the walking attractor. The two behavioural attractors were considered to be represented by the freely chosen stride frequencies during unrestricted treadmill walking and running. The aim of the present study was to determine the relative and absolute test-retest reliability of the predicted walk-to-run transition stride frequency. Healthy individuals (n = 25) performed walking and running on a treadmill in a day-to-day test-retest design. The two behavioural attractors were determined during walking and running at freely chosen velocities and stride frequencies. Subsequently, the walk-to-run transition stride frequency was predicted using camera recordings and a previously reported prediction equation. The walk-to-run transition occurred at a velocity of 7.7 ± 0.5 km h-1 on day 1 as well as on day 2. In addition, the predicted walk-to-run transition stride frequencies were 69.7 ± 3.3 strides min-1 and 70.5 ± 3.4 strides min-1 on day 1 and day 2, respectively (p = 0.08). A further comparison between the predicted walk-to-run transition stride frequencies on day 1 and day 2 showed an ICC(3,1) of 0.89, which indicated almost perfect relative reliability. The absolute reliability was reflected by a standard error of measurement (SEM%) of 1.6% and a smallest real difference (SRD%) of 4.4%. In conclusion, the predicted walk-to-run transition stride frequency can be considered reliable across days.
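The absolute reliability indices above follow from standard formulas, SEM = SD × sqrt(1 − ICC) and SRD = 1.96 × sqrt(2) × SEM, each expressed as a percentage of the mean. A minimal sketch (the exact SD and mean used in the paper are not given in the abstract, so the values in the usage note are assumptions):

```python
import math

def sem_percent(sd, icc, mean):
    """Standard error of measurement as a percentage of the mean:
    SEM = SD * sqrt(1 - ICC); SEM% = 100 * SEM / mean."""
    return 100.0 * sd * math.sqrt(1.0 - icc) / mean

def srd_percent(sem_pct):
    """Smallest real difference: SRD = 1.96 * sqrt(2) * SEM,
    i.e. roughly 2.77 times the SEM, in the same percentage units."""
    return 1.96 * math.sqrt(2.0) * sem_pct
```

Plugging in ICC = 0.89 with an assumed between-day SD of about 3.35 strides min-1 and a mean of about 70 strides min-1 roughly reproduces the reported 1.6% SEM% and 4.4% SRD%.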


Subject(s)
Exercise Test/methods , Gait/physiology , Running/physiology , Walking/physiology , Adult , Female , Humans , Male , Reproducibility of Results
16.
Sci Rep ; 7(1): 2010, 2017 05 17.
Article in English | MEDLINE | ID: mdl-28515449

ABSTRACT

It remains unclear why humans spontaneously shift from walking to running at a certain point during locomotion at gradually increasing velocity. We show that a calculated walk-to-run transition stride frequency (70.6 ± 3.2 strides min-1) agrees with a transition stride frequency (70.8 ± 3.1 strides min-1) predicted from the two stride frequencies applied during treadmill walking and running at freely chosen velocities and freely chosen stride frequencies. The agreement is based on Bland and Altman's statistics. We found no substantial mean relative difference between the two transition frequencies, i.e. -0.5% ± 4.2%, with limits of agreement of -8.7% and 7.7%. The particular two freely chosen stride frequencies used for prediction are considered behavioural attractors. Gait is predicted to shift from walking to running when the stride frequency starts getting closer to the running attractor than to the walking attractor. Previous research has primarily focussed on transition velocity and on optimisation theories based on minimisation of, e.g., energy turnover or biomechanical loading of the legs. Conversely, our data support that the central phenomenon of walk-to-run transition during human locomotion could be influenced by behavioural attractors in the form of stride frequencies spontaneously occurring during behaviourally unrestricted walking and running.
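The Bland-Altman agreement used above reduces to a mean relative difference (bias) plus limits of agreement at bias ± 1.96 SD of the paired relative differences. A self-contained sketch, assuming each difference is expressed as a percentage of the pair mean (the example data are illustrative, not from the study):

```python
import statistics

def bland_altman_relative(a, b):
    """Mean relative difference (%) and 95% limits of agreement between
    paired measurements, each difference taken as % of the pair mean."""
    rel = [200.0 * (x - y) / (x + y) for x, y in zip(a, b)]
    bias = statistics.mean(rel)
    spread = 1.96 * statistics.stdev(rel)
    return bias, bias - spread, bias + spread

# Illustrative paired data: calculated vs predicted transition stride
# frequencies (strides/min) for four hypothetical subjects.
calculated = [70.6, 68.0, 73.1, 69.5]
predicted = [70.8, 68.5, 72.4, 70.1]
bias, lo, hi = bland_altman_relative(calculated, predicted)
```

Identical measurement pairs give zero bias and zero-width limits of agreement, which is a convenient sanity check for the implementation.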


Subject(s)
Gait , Running , Walking , Biomechanical Phenomena , Humans , Locomotion
17.
J Sci Med Sport ; 20(9): 830-834, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28446388

ABSTRACT

OBJECTIVES: Achilles tendinitis, plantar fasciopathy and medial tibial stress syndrome injuries (APM-injuries) account for approximately 25% of the total number of running injuries amongst recreational runners. Reports on the association between static foot pronation and APM-injuries are contradictory. Possibly, dynamic measures of pronation display a stronger relationship with the risk of APM-injuries. Therefore, the purpose of the present study was to investigate whether running distance until the first APM-injury depended on foot balance during the stance phase in recreational male runners. DESIGN: Prospective cohort study. METHODS: Foot balance for both feet was measured during treadmill running at the fastest possible 5000-m running pace in 79 healthy recreational male runners. Foot balance was calculated by dividing the average medial pressure by the average lateral pressure. Feet were categorized into those presenting a higher lateral than medial shod pressure (LP) and those presenting a higher medial than lateral shod pressure (MP) during the stance phase. A time-to-event model was used to compare differences in injury incidence between the foot balance groups. RESULTS: Compared with the LP-group (n=59), the proportion of APM-injuries was greater in the MP-group (n=99) after 1500 km of running, resulting in a cumulative risk difference of 16 percentage points (95% CI = 3 to 28 percentage points, p=0.011). CONCLUSIONS: Runners displaying a higher medial shod pressure during the stance phase at baseline sustained a greater number of APM-injuries than those displaying a higher lateral shod pressure. Prospective studies including a larger number of runners are needed to confirm this relationship.
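The foot-balance index in METHODS is the ratio of average medial to average lateral shod pressure over the stance phase, with the ratio deciding MP vs LP membership. A hypothetical sketch of that calculation (the function name, the threshold at a ratio of exactly 1.0, and the sample pressures are assumptions, not the study's implementation):

```python
def classify_foot_balance(medial, lateral):
    """Foot balance = mean medial pressure / mean lateral pressure.
    Ratio > 1 -> medially loaded ('MP'); otherwise laterally loaded ('LP')."""
    ratio = (sum(medial) / len(medial)) / (sum(lateral) / len(lateral))
    return ratio, "MP" if ratio > 1.0 else "LP"

# Hypothetical stance-phase pressure samples (kPa) for one foot.
ratio, group = classify_foot_balance([120.0, 135.0, 128.0], [100.0, 110.0, 105.0])
print(group)  # -> MP
```

Note that the study's unit of analysis is the foot, not the runner (hence n=59 + n=99 feet from 79 runners), so each foot would be classified separately.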


Subject(s)
Pressure , Running/injuries , Shoes , Adult , Biomechanical Phenomena , Exercise Test , Foot , Humans , Male , Pronation , Prospective Studies
18.
Clin Transplant ; 31(5)2017 05.
Article in English | MEDLINE | ID: mdl-28295601

ABSTRACT

BACKGROUND: Idiopathic hyperammonemia syndrome (IHS) is an uncommon, often deadly complication of solid organ transplantation. IHS cases in solid organ transplantation appear to occur predominantly in lung transplant (LTx) recipients. However, to the best of our knowledge, the occurrence of IHS has not been systematically evaluated. We set out to identify all reported cases of IHS following nonliver solid organ transplantation. METHODS: Retrospective review of our institutional experience and systematic review of the literature. RESULTS: At our institution, six cases of IHS were identified among 844 nonliver solid organ transplants; five occurred following LTx (incidence 3.9% [lung] vs 0.1% [nonlung], P=.004). In the systematic review, 16 studies met the inclusion criteria, reporting on 32 cases of IHS. The majority of IHS cases in the literature (81%) were LTx recipients. The average peak reported ammonia level was 1039 µmol/L, occurring on average 14.7 days post-transplant. Mortality in previously reported IHS cases was 69%. A single-center experience suggested that, in addition to standard treatment for hyperammonemia, early initiation of high-intensity hemodialysis to remove ammonia was associated with increased survival. In the systematic review, mortality was 40% (four of 10) with intermittent hemodialysis, 75% (nine of 12) with continuous veno-venous hemodialysis, and 100% in the six subjects who did not receive renal replacement therapy to remove ammonia. Three reports identified infection with urease-producing organisms as a possible etiology of IHS. CONCLUSION: IHS is a rare but often fatal complication that primarily affects lung transplant recipients within the first 30 days after transplantation.


Subject(s)
Hyperammonemia/etiology , Lung Diseases/physiopathology , Organ Transplantation/adverse effects , Humans , Meta-Analysis as Topic , Prognosis , Retrospective Studies
19.
Transplantation ; 101(8): e249-e257, 2017 08.
Article in English | MEDLINE | ID: mdl-28282359

ABSTRACT

BACKGROUND: Locoregional therapy with curative intent (CLRT) followed by salvage liver transplantation (SLT) in case of hepatocellular carcinoma (HCC) recurrence is an alternative to primary liver transplantation (LT) in selected patients with HCC. METHODS: We performed a systematic review and meta-analysis of studies comparing the survival of patients treated with CLRT versus LT, stratified by the stage of liver disease, extent of cancer, and whether SLT was offered or not. RESULTS: We included 48 studies involving 9835 patients (5736 patients with CLRT and 4119 patients with primary LT). Five-year overall survival (OS) and disease-free survival (DFS) was worse for all categories of CLRT combined, than for primary LT (odds ratio [OR] for OS, 0.59; 95% confidence interval [CI], 0.48-0.71; P < 0.01). However, 5-year OS for CLRT and primary LT was not significantly different among patients with (i) Child-A cirrhosis and (ii) single HCC lesion, although DFS was worse. When SLT was offered after CLRT, intention-to-treat analysis showed no significant difference in 5-year OS (OR, 1.0; 95% CI, 0.6-1.7) between CLRT-SLT and primary LT, though noninferiority could not be shown. Only 32.5% patients with HCC recurrence after CLRT actually received SLT, as the rest were not medically eligible. Thus, the DFS was worse with CLRT-SLT (OR, 0.31; 95% CI, 0.2-0.6) compared with LT. CONCLUSIONS: CLRT-SLT may be offered as first-line therapy to patients with HCC and well-compensated cirrhosis instead of primary LT because it may lead to better utilization of donor liver. However, a large proportion of patients with HCC recurrence after CLRT may not be candidates for SLT.


Subject(s)
Carcinoma, Hepatocellular/surgery , Hepatectomy/methods , Liver Neoplasms/surgery , Liver Transplantation/methods , Salvage Therapy/methods , Humans
20.
PLoS One ; 12(1): e0168557, 2017.
Article in English | MEDLINE | ID: mdl-28060839

ABSTRACT

A constant coordination between the left and right leg is required to maintain stability during human locomotion, especially in a variable environment. The neural mechanisms underlying this interlimb coordination are not yet known. In animals, interneurons located within the spinal cord allow direct communication between the two sides without involvement of higher centers. These may also exist in humans, since sensory feedback elicited by tibial nerve stimulation on one (ipsilateral) side can affect muscle activation on the opposite (contralateral) side, provoking short-latency crossed responses (SLCRs). The current study investigated whether contralateral afferent feedback contributes to the mechanism controlling the SLCR in the human gastrocnemius muscle. Surface electromyography, kinematic, and kinetic data were recorded from subjects during normal walking and hybrid walking (with the legs moving in opposite directions). An inverse dynamics model was applied to estimate the firing rates of gastrocnemius muscle proprioceptors. During normal walking, a significant correlation was observed between the magnitude of the SLCR and the estimated muscle spindle secondary afferent activity (P = 0.04). Moreover, estimated spindle secondary afferent and Golgi tendon organ activity differed significantly (P ≤ 0.01) between the conditions in which opposite responses were observed, that is, normal walking (facilitation) and hybrid walking (inhibition). Contralateral sensory feedback, specifically from spindle secondary afferents, likely plays a significant role in generating the SLCR. This observation has important implications for our understanding of what future research should focus on to optimize locomotor recovery in patient populations.


Subject(s)
Feedback, Sensory , Muscle, Skeletal/innervation , Psychomotor Performance , Walking/physiology , Adult , Afferent Pathways , Electromyography , Female , Humans , Leg/physiology , Mechanoreceptors/physiology , Muscle, Skeletal/physiology , Reaction Time , Young Adult