Results 1 - 20 of 29
1.
Int J Surg ; 110(1): 144-150, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37800592

ABSTRACT

BACKGROUND: The detection of haemorrhage in trauma casualties may be delayed owing to compensatory mechanisms. This study aimed to evaluate whether the cardiovascular reserve index (CVRI) on arrival detects massive haemorrhage and predicts haemorrhage development in trauma casualties. METHODS: This was an observational prospective cohort study of adult casualties (≥18 years) who were brought to a single level-1 trauma centre, enrolled upon arrival and followed until discharge. Vital signs were monitored on arrival, from which the CVRI and shock index were retrospectively calculated (blinded to the caregivers). The outcome measure was the eventual haemorrhage classification group: massive haemorrhage on arrival (MHOA), defined by massive transfusion on arrival of ≥6 (O+) packed cell units; developing haemorrhage (DH), defined by a decrease in haemoglobin of >1 g/dl in consecutive tests; and no significant haemorrhage noted throughout the hospital stay. The means of each variable on arrival were compared across haemorrhage groups using analysis of variance. The authors evaluated the detection of MHOA in the entire population and the prediction of DH in the remainder (given that MHOA had already been detected and treated) using the C-statistic; strong prediction was predefined as an area under the curve (AUC) of at least 0.8 with P≤0.05. RESULTS: The study included 71 patients (after exclusions): 82% were male, and the average age was 37.7 years. The leading cause of injury was road accidents (61%). Thirty-nine (54%) patients required hospital admission. The distribution by haemorrhage classification was 5 (7%) MHOA, 5 (7%) DH, and 61 (86%) no significant haemorrhage. For detection of MHOA, CVRI and most other variables yielded strong predictive models (AUC 0.85-1.0). For prediction of DH on arrival, only lactate (AUC=0.88) and CVRI (AUC=0.82) yielded strong predictive models. CONCLUSIONS: CVRI provided strong prediction for detection of MHOA (AUC>0.8), as did most other variables. For DH, only CVRI (AUC=0.82) and serum lactate (AUC=0.88) were predictive; no other variable was. CVRI has advantages over lactate in that it is feasible in pre-hospital and mass casualty settings. Moreover, its repeatability enables detection of a deteriorating trend. The authors conclude that CVRI may be a useful additional tool in the evaluation of haemorrhage.
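The detection analysis above reduces to asking, for each arrival variable, how well it separates the haemorrhage groups. Below is a minimal sketch of that AUC screen, assuming a hypothetical per-patient table (the file name and columns such as `mhoa`, `dh` and `cvri` are illustrative); the CVRI is taken as a precomputed column because its formula is not given here, while the shock index uses the standard heart rate/systolic blood pressure ratio.

```python
# AUC screen of arrival variables for MHOA detection and DH prediction;
# file and column names are hypothetical, and CVRI is taken as precomputed.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("trauma_arrivals.csv")
df["shock_index"] = df["heart_rate"] / df["systolic_bp"]   # standard shock index: HR / SBP

# Detection of massive haemorrhage on arrival (MHOA) in the entire cohort
for var in ["cvri", "shock_index", "lactate"]:
    auc = roc_auc_score(df["mhoa"], df[var])
    auc = max(auc, 1 - auc)                                 # orientation-free AUC
    print(f"MHOA detection, {var}: AUC = {auc:.2f} ({'strong' if auc >= 0.8 else 'weak'})")

# Prediction of developing haemorrhage (DH) among the remaining casualties
rest = df[df["mhoa"] == 0]
for var in ["cvri", "shock_index", "lactate"]:
    auc = roc_auc_score(rest["dh"], rest[var])
    auc = max(auc, 1 - auc)
    print(f"DH prediction, {var}: AUC = {auc:.2f}")
```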


Subject(s)
Trauma Centers , Wounds and Injuries , Adult , Male , Humans , Retrospective Studies , Prospective Studies , Hemorrhage/diagnosis , Hemorrhage/etiology , Lactates , Wounds and Injuries/complications , Wounds and Injuries/diagnosis
2.
Isr Med Assoc J ; 25(2): 101-105, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36841977

ABSTRACT

BACKGROUND: Acute appendicitis (AA) is a medical emergency. The standard of care for AA has long been surgical appendectomy. Recently, non-operative management (NOM) has been considered, mainly for uncomplicated AA. OBJECTIVES: To evaluate AA NOM trends over two decades. METHODS: We conducted a retrospective cohort study based on Israel's National Hospital Discharges Database (NHDD). Inclusion criteria were AA admissions from 1 January 2000 to 31 December 2019 with either a primary discharge diagnosis of AA or a principal procedure of appendectomy. Predefined groups were children (5 to <18 years) and adults (≥18 years). We compared the last decade (2010-2019) with the previous one (2000-2009). RESULTS: The overall AA incidence rate over two decades was 126/100,000/year; it was higher in children (164/100,000/year) than in adults (113/100,000/year). Surgery was the predominant AA treatment in 91.9% of cases: 93.7% in children and 91.1% in adults. AA NOM rates increased from the previous decade (5.6%) to the last decade (10.2%): 3.2% vs. 9.1% in children and 6.8% vs. 10.7% in adults, respectively. Annual trends revealed a mild increase in AA NOM rates. Delayed appendectomy (within 90 days of AA NOM) occurred in 19.7% overall: 17.3% in adults and 26.3% in children. CONCLUSIONS: There was an increase in AA NOM rates during the last decade in the overall population. Since 2015, there has been a noticeable increase in AA NOM rates, probably associated with the World Society of Emergency Surgery Jerusalem guidelines. Surgery is still the predominant treatment for AA despite the increasing trend in NOM.


Subject(s)
Appendicitis , Adult , Child , Humans , Appendicitis/diagnosis , Appendicitis/epidemiology , Appendicitis/surgery , Retrospective Studies , Israel/epidemiology , Acute Disease , Hospitalization , Appendectomy
3.
Isr Med Assoc J ; 25(1): 27-31, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36718733

ABSTRACT

BACKGROUND: Bacterial meningitis (BM) remains a considerable cause of morbidity. OBJECTIVES: To evaluate BM incidence rate trends in diverse age groups. METHODS: We conducted a retrospective cohort study based on the Israeli national registry. Inclusion criteria were acute admissions from 2000 to 2019 with a primary diagnosis of BM. Predefined age groups were neonates (≤30 days), infants (31 days to 1 year), younger children (1 to <5 years), older children (5 to <18 years), and adults (≥18 years). Average annual incidence rates per 100,000/year were calculated for the entire period and by decade. Incidence rates for neonates and infants were calculated per 100,000 live births (LB). RESULTS: There were 3039 BM cases over two decades; 60% were adults. The overall BM incidence rate was 2.0/100,000/year; in neonates it was 5.4/100,000 LB/year and in infants 17.6/100,000 LB/year. The incidence rate in the first year of life (neonates and infants combined) was 23.0/100,000 LB/year; it was 1.5/100,000/year in younger children, 0.9/100,000/year in older children, and 1.8/100,000/year in adults. All age groups showed a decrease in incidence rate (last decade vs. previous) except neonates, in whom it increased by 34%. Younger and older children showed the most considerable decreases: 48% and 37%, respectively (last decade vs. previous). CONCLUSIONS: Adults accounted for the highest number of BM cases. The incidence rate was highest during the first year of life (neonates and infants combined). All age groups except neonates showed a decreasing trend. Younger and older children showed the most considerable decrease, most likely attributable to vaccination. The observed increase in the BM incidence rate in neonates may warrant consideration of a preventive strategy.
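The rate arithmetic behind the figures above is simple: cases divided by the person-years (or live-birth-years) at risk, scaled to 100,000. A minimal sketch follows; the counts and denominators are illustrative placeholders chosen only to give rates of the same order, not the registry data.

```python
# Average annual incidence per 100,000: cases / (population * years) * 100,000.
def avg_annual_incidence_per_100k(cases, population, years):
    return cases / (population * years) * 100_000

# Whole-population rate (illustrative: 3,000 cases, 7.5 million people, 20 years)
print(avg_annual_incidence_per_100k(3_000, 7_500_000, 20))    # -> 2.0 per 100,000/year

# Neonatal rate per 100,000 live births (illustrative: 190 cases, 175,000 LB/year, 20 years)
print(avg_annual_incidence_per_100k(190, 175_000, 20))        # -> ~5.4 per 100,000 LB/year
```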


Subject(s)
Meningitis, Bacterial , Infant , Child , Infant, Newborn , Adult , Humans , Adolescent , Incidence , Israel/epidemiology , Retrospective Studies , Meningitis, Bacterial/epidemiology , Morbidity
4.
Medicine (Baltimore) ; 101(36): e30555, 2022 Sep 09.
Article in English | MEDLINE | ID: mdl-36086698

ABSTRACT

To evaluate the daily output measures of the emergency department (ED) in association with seasonality and the day of the week. This was a retrospective cohort of ED visits to tertiary medical centers between 2016 and 2020. The unit of analysis was each day during the study period. The independent variables were season and day of the week. The dependent variables were the number of ED visits, admission and dropout rates, and duration to ED discharge. Means were compared using ANOVA. Statistical significance was set at P < .05. There were 1826 days and 792,000 visits; 58% of patients were female. The overall admission rate was 28%, the mean duration to discharge 3.8 hours, and the dropout rate 2%. Average daily visits by season ranged from 101% of the overall average in autumn to 97% in spring. Average daily visits differed significantly by day of the week, highest on Sunday (Israel's first working day of the week) at 124% of the overall daily average and lowest on Saturday (the weekly day off) at 70%. Saturdays had the highest admission rate (30% vs. 28% overall). There was a moderate dependency between the duration to ED discharge and the dropout rate (r2 = 0.19). Average daily visits were not affected by season but differed considerably by day of the week. Admission rates varied slightly by season but were similar by day of the week apart from Saturdays; this may be attributable to the case mix on Saturdays or to less restrictive admission when the number of visits is low. We recommend that each emergency department evaluate the dependence of its daily output measures on seasonality and day of the week for operational optimization.
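A minimal sketch of the day-level comparison described above, assuming a hypothetical table with one row per calendar day; the file and column names (`visits`, `duration_to_discharge_h`, `dropout_rate`) are illustrative, and the season mapping is a rough meteorological one rather than the authors' definition.

```python
# Day-level ANOVA of ED output measures; file and column names are hypothetical.
import pandas as pd
from scipy import stats

daily = pd.read_csv("ed_daily.csv", parse_dates=["date"])   # one row per calendar day
daily["weekday"] = daily["date"].dt.day_name()
daily["season"] = daily["date"].dt.month % 12 // 3          # rough mapping: 0=winter .. 3=autumn

# One-way ANOVA: do mean daily visits differ by day of the week? (likewise by season)
groups = [g["visits"].values for _, g in daily.groupby("weekday")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"Visits by weekday: F = {f_stat:.1f}, p = {p_val:.3g}")

# Dependency between duration to ED discharge and the dropout rate, reported as r^2
r, _ = stats.pearsonr(daily["duration_to_discharge_h"], daily["dropout_rate"])
print(f"r^2 = {r**2:.2f}")
```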


Subject(s)
Emergency Service, Hospital , Hospitalization , Female , Humans , Male , Patient Discharge , Retrospective Studies , Seasons
5.
Ann Med Surg (Lond) ; 79: 103933, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35860137

ABSTRACT

Background: According to the literature, there are sex allocation inequalities in liver transplantation (LT). Sex disparities in outcomes after LT have been debated. This study aimed to evaluate sex-specific outcomes after LT, specifically short-term mortality and long-term survival rates. Methods: This was a retrospective cohort of the entire LT series from 2010 to 2019 in a single center; inclusion criteria were adults (≥18 years of age) who underwent primary deceased-donor LT. The mortality rate was evaluated within 30 days and 6 months, and the survival rate at 1, 3, and 5 years after transplantation. Results: A total of 240 primary deceased-donor LTs (153 men and 87 women) were included. The mean age was 55.2 years in men and 51.6 years in women (p = 0.02). Hepatocellular carcinoma (HCC) was the direct indication in 32.7% of the men but only 17.4% of the women. The leading primary liver morbidities were viral hepatitis (B, C, and D) in 38.3% (n = 92) and nonalcoholic steatohepatitis (NASH) in 20.8% (n = 50) of patients. Thirty-day mortality was 14% overall and was significantly higher in men (18%) than in women (8%). Five-year survival rates were 64.9% in men and 78.3% in women. Multivariate logistic regression that included age, direct indication, MELD score, and primary liver morbidity revealed a statistically significant female-to-male odds ratio of 0.4 for both 30-day and 6-month mortality, as well as statistically significantly higher long-term survival in women. Conclusions: Our observations revealed better outcomes in women, namely lower short-term mortality and higher long-term survival. Given the consistency after stratification and in the multivariate analysis, this is unlikely to be attributable to confounders. Such findings of consistently better female outcomes have not been previously reported; hence, a multicenter study is encouraged.
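A minimal sketch of the kind of multivariate model described above (logistic regression of 30-day mortality on sex, adjusted for age, MELD, indication and primary morbidity), using statsmodels; the file and variable names are hypothetical and the authors' exact model specification may differ.

```python
# Logistic regression of 30-day mortality on sex, adjusted for age, MELD,
# indication and primary liver morbidity; file and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

lt = pd.read_csv("lt_cohort.csv")   # one row per transplant recipient

model = smf.logit(
    "death_30d ~ C(sex, Treatment(reference='male')) + age + meld "
    "+ C(indication) + C(primary_morbidity)",
    data=lt,
).fit()

print(np.exp(model.params))       # odds ratios; a female term near 0.4 would match the reported OR
print(np.exp(model.conf_int()))   # 95% confidence intervals on the odds-ratio scale
```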

6.
Isr J Health Policy Res ; 11(1): 2, 2022 01 05.
Article in English | MEDLINE | ID: mdl-34986880

ABSTRACT

BACKGROUND: In 2005, Clalit Health Services (CHS), the largest health maintenance organization in Israel, initiated an intervention program aimed at reducing the prevalence rate of infantile anemia (IA). This study evaluated the progress made during the intervention (2005-2014) and its yield 5 years after it ended (2019). METHODS: The CHS database was retrospectively reviewed twice yearly from 2005 to 2014, each time for a repeated sample of children aged 9 to 18 months covering the previous half-year interval, with a single additional sample in 2019. Data were collected on gender, ethnicity (Jewish/non-Jewish), socioeconomic class (SEC; low/intermediate/high), hemoglobin testing (yes/no), and hemoglobin level (if tested). Infants with documented or suspected hemoglobinopathy were excluded. RESULTS: At study initiation, the rate of performance of hemoglobin testing was 54.7%, and the IA prevalence rate was 7.8%. The performance rate was lower in the Jewish than in the non-Jewish subpopulation. The low-SEC subpopulation had a hemoglobin testing rate similar to that of the high-SEC subpopulation but double the IA prevalence rate. Overall, by the end of the intervention (2014), the performance rate had increased to 87.5% and the IA prevalence rate had decreased to 3.4%. In 2019, the performance rate was little changed from the end of the intervention (88%), and the IA prevalence was further reduced to 2.7%. The non-Jewish and low-SEC subpopulations showed the most improvement, which was maintained and even improved upon 5 years after the intervention ended. CONCLUSIONS: The 10-year IA intervention program introduced by CHS in 2005 led to a reduction in the IA prevalence rate to about 3.5% in all subpopulations evaluated. By program end, the results in the weaker subpopulations, which had the highest prevalence of IA at baseline, were not inferior to those in the stronger subpopulations. We recommended that the Israel Ministry of Health adopt the intervention countrywide, and we challenge other countries to consider similar interventions.


Subject(s)
Anemia , Ethnicity , Anemia/epidemiology , Anemia/prevention & control , Child , Hemoglobins , Humans , Infant , Israel/epidemiology , Retrospective Studies , Socioeconomic Factors
7.
J Pediatr ; 238: 296-304.e4, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34293373

ABSTRACT

OBJECTIVES: To evaluate the sex-specific effects of stimulants on body mass index (BMI) z score and height z score trajectories in children with attention-deficit/hyperactivity disorder (ADHD). STUDY DESIGN: A retrospective cohort study using the database of Israel Clalit Health Services was performed. Participants were 5- to 18-year-old insured patients with documentation of at least 2 consecutive prescriptions of stimulant drugs for ADHD. Participants were further compared with sex- and age-matched insured control patients without ADHD. RESULTS: A total of 4561 (66% boys) participants with ADHD were included. Of these, 2151 (70% boys) had follow-up data for ≥2 years of treatment. A decline of ≥1 SD in height and BMI z score was observed in 10.1% and 13.2% of the cohort, respectively. During ≥2 years of follow-up, boys had a greater decline in height z score (~0.2 SD) than girls (~0.06 SD). Boys' height z score continued to decline after 1 and ≥2 years, whereas girls' height z score declined after 1 year and then stabilized. The BMI z score trajectories of boys and girls were similar, showing a greater decline after 1 year followed by an incline after ≥2 years. Younger age at stimulant initiation, better adherence, longer treatment duration, and lower socioeconomic status were correlated with greater growth attenuation. The non-ADHD group (n = 4561, 66% boys) had baseline height and BMI z scores similar to those of children with ADHD before treatment initiation. Height and BMI z scores were greater in children without ADHD than in children with ADHD after 1 year of treatment (P < .001). CONCLUSIONS: These findings highlight the importance of growth monitoring accompanied by dietary counseling in children with ADHD treated with stimulants.
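Height and BMI z scores of the kind tracked above are usually derived from a published LMS growth reference (L, M, S values by sex and age). A minimal sketch of that transformation follows; the reference values shown are placeholders for illustration, not actual CDC or WHO entries.

```python
# Cole's LMS transformation used by LMS-based growth references:
# z = ((x/M)**L - 1) / (L*S), or ln(x/M)/S when L == 0.
from math import log

def lms_zscore(x, L, M, S):
    """z score of measurement x against an LMS reference entry (L, M, S)."""
    if L == 0:
        return log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Placeholder reference values for one sex/age bin (not actual CDC or WHO entries)
L_ref, M_ref, S_ref = -1.6, 16.6, 0.08
print(lms_zscore(18.2, L_ref, M_ref, S_ref))   # BMI of 18.2 expressed as a z score vs. the reference
```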


Subject(s)
Attention Deficit Disorder with Hyperactivity/drug therapy , Body Height , Body Mass Index , Central Nervous System Stimulants/therapeutic use , Adolescent , Age Factors , Child , Child, Preschool , Female , Humans , Israel , Male , Retrospective Studies , Risk Factors , Sex Factors , Socioeconomic Factors
8.
Isr Med Assoc J ; 23(4): 233-238, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33899356

ABSTRACT

BACKGROUND: Evaluation of children's anthropometrics poses challenges due to age-related changes. The main focus is on height and weight; however, since weight is height-dependent, body mass index (BMI) is the best surrogate measurement of adiposity. Israel has not developed national growth tables; therefore, researchers and clinicians use either World Health Organization (WHO) or U.S. Centers for Disease Control and Prevention (CDC) tables as benchmarks. OBJECTIVES: To evaluate the anthropometrics of Israeli children benchmarked against the CDC and WHO tables. METHODS: A retrospective review was conducted of the 1987-2003 birth cohort (age 4-18 years) from the Clalit Health Services databases. Anthropometrics were retrieved twice, at study entry and one year later, and the two time points were evaluated as separate cohorts. Gender-specific, age-matched median height and BMI were compared with the CDC and WHO height and BMI tables. RESULTS: The study consisted of 15,650 children, with a mean age at study entry of 9.5 years (range 4-18). Gender-specific median heights of the Israeli children were similar to the CDC and WHO values at younger ages but slightly shorter than the age-matched CDC and WHO values toward the age of final height, in both cohorts. In contrast, gender-specific median BMI was considerably and statistically significantly higher than the CDC and WHO values, consistently along the entire age range in both cohorts. CONCLUSIONS: Israeli children were slightly shorter toward the age of final height compared with the WHO and CDC references. However, BMI in Israeli children was significantly higher than in the CDC and WHO references, consistently along the age range, which raises an alarm regarding obesity patterns.


Subject(s)
Anthropometry/methods , Body Height , Body Mass Index , Obesity , Pediatrics , Adolescent , Age Factors , Child , Child Development , Cohort Studies , Female , Humans , Israel/epidemiology , Male , Obesity/diagnosis , Obesity/epidemiology , Obesity/prevention & control , Pediatrics/methods , Pediatrics/standards , Reference Standards , Sex Factors , World Health Organization
9.
Clin J Sport Med ; 31(3): 232-236, 2021 May 01.
Article in English | MEDLINE | ID: mdl-30585796

ABSTRACT

OBJECTIVE: To noninvasively explore the heat intolerance condition during exercise-heat stress by assessing cardiovascular (CV) performance. DESIGN: Prospective study of participants undergoing a standard heat-tolerance test (HTT). SETTING: Institutional study. PARTICIPANTS: Ninety-five young males: 16 heat-intolerant (HI) and 79 heat-tolerant (HT). INTERVENTIONS: Cardiovascular performance during the HTT was estimated from heart rate (HR) and blood pressure measurements. MAIN OUTCOME MEASURES: The sensitivity of the cardiovascular reserve index (CVRI) and the dynamic heart rate reserve (dHRR) index for predicting heat intolerance was compared. RESULTS: A significant difference in the CV reserve during exercise-heat stress was found between the HI and HT groups. Starting from a similar level, the reduction in the CV reserve at the HTT endpoint was much greater in the HI than in the HT individuals (P < 0.0001), as depicted by both the CVRI and the dHRR. This result indicates greater utilization of the CV reserve by HI individuals. The CVRI is likely to be a better predictor of heat intolerance than the dHRR because the partial area under the curve in the high-sensitivity (>90%) region of its receiver operating characteristic curve is higher (93.2 vs. 76.8). CONCLUSIONS: More than being a predictor, the CVRI may provide new clinical insight into heat intolerance because it noninvasively characterizes the efficiency of an individual's thermoregulatory mechanism and hints that an impaired CV reserve might underlie heat intolerance. The CVRI provides a noninvasive measurement of thermoregulation, which has long been awaited to enable on-field studies and dynamic monitoring of heat-exposed task forces.
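A minimal sketch of a partial-AUC computation restricted to the high-sensitivity region, the quantity compared above; the exact standardization behind the reported 93.2 vs. 76.8 values is an assumption, and the label and score arrays are placeholders.

```python
# Partial AUC restricted to the high-sensitivity (TPR >= 0.9) region of the
# ROC curve; arrays are placeholders and the authors' standardization is assumed.
import numpy as np
from sklearn.metrics import roc_curve

def pauc_high_sensitivity(y_true, score, tpr_min=0.9):
    """Area under the ROC curve over the region where sensitivity (TPR) >= tpr_min."""
    fpr, tpr, _ = roc_curve(y_true, score)
    fpr_entry = np.interp(tpr_min, tpr, fpr)        # FPR where the curve first reaches tpr_min
    keep = tpr >= tpr_min
    f = np.concatenate(([fpr_entry], fpr[keep]))
    t = np.concatenate(([tpr_min], tpr[keep]))
    return np.trapz(t, f)                           # integrate TPR over the restricted FPR range

# Placeholder data: 1 = heat-intolerant, 0 = heat-tolerant; score = reserve utilization
y = np.array([1, 1, 1, 0, 0, 0, 0, 0])
cvri_utilization = np.array([0.92, 0.85, 0.80, 0.40, 0.35, 0.55, 0.20, 0.30])
print(pauc_high_sensitivity(y, cvri_utilization))
```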


Subject(s)
Cardiovascular System , Exercise , Heat Stress Disorders , Adult , Heart Rate , Hot Temperature , Humans , Male , Prospective Studies , Young Adult
10.
Eur J Ophthalmol ; 30(6): 1268-1271, 2020 Nov.
Article in English | MEDLINE | ID: mdl-31353952

ABSTRACT

BACKGROUND: Cataract surgery is one of the most common elective surgeries. We present a novel approach of preoperative triage using community-based ophthalmologist referral letters for scheduling surgery, thus reducing both patient and physician time prior to surgery. Since most patients are not routinely examined in a preoperative clinic, day-of-surgery cancelations are a possibility. The aim of this study was to evaluate the efficiency of our triage system. METHODS: This was a historical prospective study in which the end point was day-of-surgery cancelation. The main outcome measure was the rate of cancelations that could have been prevented by a preoperative visit. Patients' records were reviewed for reasons for cancelation and demographics. RESULTS: During the study period, 1030 patients underwent cataract surgery; 171 patients (16.6%) were examined in the preoperative clinic. Forty-five patients (4.4%) were canceled on the day of surgery for various reasons. The main reason for cancelation (13 cases, 28.9%) was non-availability of the operating theater. In 20 cases (1.9% of total patients, 44.4% of cancelations), the cancelation could have been prevented by a preoperative clinic visit. CONCLUSION: Our results suggest that most cataract patients do not require a preoperative visit prior to the day of surgery. The cooperation of community-based ophthalmologists and the availability of senior surgeons in the operating theater allow for proper implementation of our system. Direct referral to surgery could reduce both costs and time to surgery and provide timely treatment for cataracts in a cost-aware environment.


Subject(s)
Cataract Extraction/methods , Hospitals , Operating Rooms/statistics & numerical data , Preoperative Care/methods , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Prospective Studies
11.
Med Hypotheses ; 122: 35-40, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30593418

ABSTRACT

BACKGROUND: The mechanisms of death in Sudden Infant Death Syndrome (SIDS) remain obscure. The SIDS Triple Risk Model assumes the coexistence of a subtle individual vulnerability, a critical developmental period, and stressors. Prone sleeping is a major risk factor but provides no clues regarding the mechanism of death. The leading assumed mechanisms of death are either an acute respiratory crisis or arrhythmia, but neither is supported by evidence; hence, both remain speculative. Postmortem findings do exist but are inconclusive in identifying the mechanism of death. WHAT IS THE PROPOSED HYPOTHESIS BASED ON?: 1. The stressors (suggested by the triple risk model) share a unified compensatory physiological response of a decrease in systemic vascular resistance (SVR) to facilitate a compensatory increase in cardiac output (CO). 2. The cardiovascular/cardiorespiratory control of the vulnerable infant during a critical developmental period may be impaired. 3. A severe decrease in SVR is associated with a hyperdynamic state, high-output failure and distributive shock. THE HYPOTHESIS: An infant exposed to one or more stressors responds normally with a decrease in SVR, which increases CO. Under normal circumstances, once the needs are met, both SVR and CO stabilize at a new steady state. The incompetent cardiovascular control of the vulnerable infant fails to stabilize SVR, which decreases in an uncontrolled manner. Accordingly, CO increases beyond the needs toward a hyperdynamic state, high-output heart failure and hyperdynamic shock. CONCLUSIONS: The proposed hypothesis provides an appropriate alternative to either respiratory crisis or arrhythmia, although neither of those speculations can be entirely excluded.


Subject(s)
Death , Shock/physiopathology , Sleep , Sudden Infant Death/etiology , Cardiac Output , Humans , Infant , Infant, Newborn , Models, Theoretical , Perfusion , Posture , Risk Factors , Sudden Infant Death/diagnosis
12.
Isr J Health Policy Res ; 7(1): 13, 2018 02 20.
Article in English | MEDLINE | ID: mdl-29463297

ABSTRACT

BACKGROUND: There is increasing agreement among medical educators regarding the importance of improving the integration between public health and clinical education, the understanding and implementation of epidemiological methods, and the ability to critically appraise medical literature. The Sackler School of Medicine at Tel-Aviv University revised its public health and preventive medicine curriculum during 2013-2014 according to the competency-based medical education (CBME) approach to training medical students. We describe the revised curriculum, which aimed to strengthen competencies in quantitative research methods, epidemiology, public health and preventive medicine, and health service organization and delivery. METHODS: We report the process undertaken to establish a relevant 6-year longitudinal curriculum and describe its contents, implementation, and continuous assessment and evaluation. RESULTS: Central competencies included: epidemiology and statistics for appraisal of the literature and implementation of research; the application of health promotion principles and health education strategies in disease prevention; the use of an evidence-based approach in clinical and public health decision making; the examination and analysis of disease trends at the population level; and knowledge of the structure of health systems and the role of the physician in these systems. Two new courses, in health promotion and in public health, were added to the curriculum, and the courses in statistics and epidemiology were merged. Annual evaluation of each course results in continuous revision of the syllabi as needed, while we continue to monitor the whole curriculum. CONCLUSIONS: The described revision of a 6-year medical school curriculum addresses currently identified needs in public health. Ongoing feedback from students and re-evaluation of syllabi by course teams are conducted annually. An analysis of students' written feedback and course evaluations from before and after the implementation of this intervention is under way to examine the effect of the new curriculum on the perceived clinical and research capacities of our 6-year students.


Subject(s)
Competency-Based Education , Education, Medical/trends , Epidemiology/education , Public Health/education , Students, Medical , Curriculum , Delivery of Health Care , Education, Medical/methods , Humans , Israel , Longitudinal Studies , Preventive Medicine/education
13.
Harefuah ; 156(4): 242-245, 2017 Apr.
Article in Hebrew | MEDLINE | ID: mdl-28551929

ABSTRACT

AIMS: To evaluate the in-hospital mortality rate within 24 hours in internal medicine wards and to evaluate whether it may be used as a quality indicator. BACKGROUND: The in-hospital mortality rate is an outcome measure that apparently reflects quality of care. There is debate over whether it may be considered a quality indicator, since it is difficult to compare different case mixes between hospitals. Research on mortality within 24 hours has not been published. METHODS: A historical prospective study was conducted including all internal medicine ward admissions to the Rabin Medical Center between 1 July 2014 and 30 June 2015. We evaluated in-hospital deaths and deaths within 7 days after discharge. We focused on deaths within 24 hours, the patients' characteristics, the primary diagnosis (which we assumed was the cause of death) and co-morbidity. The analysis included descriptive statistics and mortality rates and was performed with SPSS version 22. RESULTS: Overall, 25,414 patients were admitted to internal wards during the study period. There were 1,620 in-hospital deaths (6.37%), among which 164 deaths occurred within 24 hours (0.65% of admissions, 10.1% of in-hospital deaths). These patients were very old (median age 82 years), many were residents of nursing homes, and nearly all were brought to the hospital by ambulance. The most frequent primary diagnoses were sepsis (24%), pneumonia (22%), metastatic cancer (10%) and acute neurologic event (5%). CONCLUSIONS: The results exclude excessive in-hospital mortality within 24 hours. The patients' characteristics suggest that these deaths were expected and not preventable. DISCUSSION: There was no excessive mortality within 24 hours, the deaths were expected, and a seasonal modifying effect was evident. Together with the differing case mix between hospitals, this suggests that early in-hospital mortality is inadequate as a quality measure.


Subject(s)
Hospital Mortality , Internal Medicine , Quality of Health Care , Hospitalization , Hospitals , Humans , Prospective Studies
14.
Int J Cardiol ; 234: 33-37, 2017 May 01.
Article in English | MEDLINE | ID: mdl-28256325

ABSTRACT

OBJECTIVES: The cardiovascular reserve index (CVRI) has been empirically validated in diverse morbidities as a quantitative estimate of the reserve assumed by the cardiovascular reserve hypothesis. This work evaluates whether CVRI during exercise complies with the cardiovascular reserve hypothesis. DESIGN: Retrospective study based on a database of patients who underwent cardiopulmonary exercise testing (CPX) for diverse indications. METHODS: Each patient's physiological measurements were retrieved at four predefined CPX stages (rest, anaerobic threshold, peak exercise and after 2 min of recovery), and the CVRI was retrospectively calculated at each stage. RESULTS: Mean CVRI at rest was 0.81, significantly higher (p<0.001) than at all other stages. CVRI decreased with exercise, reaching an average at peak exercise of 0.35, significantly lower than at the other stages (p<0.001) and very similar regardless of exercise capacity (mean CVRI 0.33-0.37 in 4 groups classified by exercise capacity, p>0.05). CVRI after 2 min of recovery rose considerably, most in the group with the best exercise capacity and least in the group with the lowest exercise capacity. CONCLUSIONS: CVRI during exercise fits the pattern predicted by the cardiovascular reserve hypothesis: it decreased with exercise, reached a minimum at peak exercise and rose with recovery. The CVRI nadir at peak exercise, similar across groups classified by exercise capacity, complies with the assumed exhaustion threshold. The clinical utility of CVRI should be further evaluated.


Subject(s)
Cardiovascular Diseases , Exercise Test/methods , Heart/physiopathology , Myocardium/metabolism , Oxygen Consumption/physiology , Anaerobic Threshold/physiology , Cardiovascular Diseases/diagnosis , Cardiovascular Diseases/metabolism , Cardiovascular Diseases/physiopathology , Exercise Tolerance/physiology , Female , Humans , Male , Middle Aged , Retrospective Studies
15.
Ann Med Surg (Lond) ; 14: 1-7, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28070330

ABSTRACT

BACKGROUND: To estimate the cardiovascular reserve, we formulated the Cardiovascular Reserve Index (CVRI) based on physiological measurements. The aim of this study was to evaluate the pattern of the CVRI in haemorrhage-related haemodynamic deterioration in an animal model simulating combat injury. METHODS: Data were collected retrospectively from a research database of a swine exsanguination model in which serial physiological measurements were made under anaesthesia in 12 swine with haemorrhagic injury and 5 controls. We calculated the approximated CVRI (CVRIA). The course of haemodynamic deterioration was defined according to the cumulative blood loss until shock. The ability of heart rate (HR), mean arterial blood pressure (MABP), stroke volume (SV), cardiac output (CO), systemic vascular resistance (SVR) and the CVRIA to predict haemodynamic deterioration was evaluated according to three criteria: strength of association with the course of haemodynamic deterioration (r2 > 0.5); threshold for detection of haemodynamic deterioration; and the range over which the parameter remained consistently monotonic along the course of deterioration. RESULTS: Three parameters met the first criterion for prediction of haemodynamic deterioration: HR (r2 = 0.59), SV (r2 = 0.57) and CVRIA (r2 = 0.66). Results were negative for MABP (r2 = 0.27), CO (r2 = 0.33) and SVR (r2 = 0.02). The detection threshold of the CVRIA was 200-300 ml of blood loss, whereas HR, SV and CO showed delayed detection; MABP and the CVRIA exhibited a wide indicative range toward shock. CONCLUSIONS: The CVRIA met the preset criteria for a potential predictor of haemorrhage-related haemodynamic deterioration. Prospective studies are required to evaluate use of the CVRI in combat medicine. LEVEL OF EVIDENCE: Level III.
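A minimal sketch of the first acceptance criterion above (strength of association with the course of deterioration, r² > 0.5), assuming a hypothetical long-format table of serial measurements with one row per animal per bleed step; the file and column names are illustrative.

```python
# First acceptance criterion: r^2 of each parameter against cumulative blood
# loss must exceed 0.5. Table and column names are hypothetical.
import pandas as pd
from scipy import stats

m = pd.read_csv("swine_measurements.csv")   # one row per animal per bleed step

for param in ["hr", "mabp", "sv", "co", "svr", "cvri_a"]:
    slope, intercept, r, p, se = stats.linregress(m["cum_blood_loss_ml"], m[param])
    verdict = "meets criterion" if r**2 > 0.5 else "fails"
    print(f"{param}: r^2 = {r**2:.2f} ({verdict})")
```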

16.
Med Hypotheses ; 82(6): 694-9, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24679381

ABSTRACT

BACKGROUND: Heart failure (HF) and shock are incompletely understood, inconclusively defined and lack a single conclusive test. The processes that precede and trigger the clinical manifestations are occult. The relationships between the different types of shock and HF, and between each HF type and its matched shock type, are poorly understood. THE ASSUMED HYPOTHESIS: We suggest that HF and shock are attributable to a momentary cardiovascular performance reserve - "the reserve". The reserve is controlled through an assumed central physiological mechanism that continuously detects it and responds accordingly - "the reserve control". The assumed reserve is maximal at rest and decreases with aerobic activity. When it decreases to a given threshold, the reserve control alerts by inducing the manifestations of dyspnea and fatigue, enforcing a decrease in activity, following which the manifestations resolve. HF is a condition of low reserve at baseline; hence, fatigue and dyspnea are frequently experienced following mild activity. Shock is assumed to occur when the cardiovascular reserve deteriorates below a sustainable limit, at which the reserve control induces a salvage-sacrifice response, preserving vital-organ perfusion while impairing effective microcirculatory perfusion in non-vital organs, where it causes cellular hypoxia followed by the familiar devastating cascade of events seen in shock. DISCUSSION AND CONCLUSIONS: The proposed hypothesis may comprehensively explain the heart failure-shock puzzle, which no alternative theory has succeeded in doing. It provides the missing link between the different types of HF, the different types of shock, and between HF and shock. The hypothesis poses a great challenge to prove but opens new research and clinical possibilities.


Subject(s)
Cardiovascular Physiological Phenomena , Fatigue/physiopathology , Heart Failure/physiopathology , Models, Biological , Shock/physiopathology , Humans
17.
Med Hypotheses ; 82(1): 57-9, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24280559

ABSTRACT

Diabetic foot is traditionally attributed to a triad of neuropathy, ischemia and infection. Cellular hypoxia in the diabetic foot can be attributed neither to occlusive large-artery disease (the large arteries are mostly patent) nor to the so-called diabetic small-vessel disease (where such occlusion has never been proved). The physiological findings that accompany the cellular hypoxia are confusing: elevated local blood flow and high oxygen saturation in both the tissue and its collecting veins. It is well known that some tissues (e.g. skin) are wired with two types of capillaries: true capillaries, also known as exchange capillaries, where nutrient and gas exchange takes place, and metarteriole thoroughfare channels, also known as shunting capillaries. We hypothesize that in the diabetic foot, tissue blood flow is rerouted through the metarteriole thoroughfare channels, bypassing the exchange capillaries. Hence, nutrient and gas exchange is disabled and the tissue cells become hypoxic regardless of the tissue blood flow. As a result of the shunt, arterial oxygen is not consumed and the oxygen saturation in the collecting veins remains high. The proposed hypothesis suggests that mal-perfusion rather than hypo-perfusion is the underlying cause of cellular hypoxia in the diabetic foot. This hypothesis complies with the findings of patent arteries proximal to the affected site, normal to elevated tissue blood flow, and high oxygen saturation in the affected tissue and its collecting veins.


Subject(s)
Capillaries/physiology , Cell Hypoxia/physiology , Diabetic Foot/etiology , Models, Biological , Diabetic Foot/physiopathology , Humans , Oxygen Consumption/physiology , Regional Blood Flow/physiology
18.
J Immigr Minor Health ; 16(1): 35-43, 2014 Feb.
Article in English | MEDLINE | ID: mdl-23765036

ABSTRACT

Immigrant mortality studies reveal conflicting results, which have been attributed to diversity in immigrant definitions, different classifications, and a lack of appropriate comparisons. This work studied the mortality patterns of the immigrant groups absorbed in Israel. Short-term mortality was evaluated by comparing the Standardized Mortality Rate (SMR) of the first year after immigration to the SMR of the second to fifth years. Long-term mortality was evaluated by comparing recent immigrant cohorts to cohorts of immigrants who had been residents for 5 and 10 years. Stratification was by source-country classification and gender. Data were derived from the Israel National Population Registry and were analyzed anonymously. Immigrants from developed and developing countries had the highest SMR in the first year, which decreased considerably in both the short and the long term. Immigrants from mid-developed countries had a stable SMR in the short term, followed by only a modest decrease in the long term. Ethiopian immigrants exhibited an exceptionally low SMR in the first year, after which it increased but remained relatively low. Mortality patterns of different immigrant groups differ even under similar definitions, conditions, and periods. Only immigrants from developed and developing countries presented the expected pattern of excess short-term mortality that consistently decreased with time. Unique mortality patterns were discovered in two groups: immigrants from mid-developed countries presented stable mortality, attributable to isolation and delayed adaptation, and Ethiopian immigrants presented low mortality, attributable to pre-migration natural selection.
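A minimal sketch of the SMR arithmetic underlying the comparisons above: observed deaths in an immigrant cohort divided by the deaths expected under the reference population's stratum-specific rates. The file layout, column names and stratification variables are assumptions.

```python
# Minimal SMR sketch: observed / expected deaths, with expected deaths computed
# from the reference population's age- and sex-specific rates. Files and columns
# are hypothetical.
import pandas as pd

cohort = pd.read_csv("immigrant_cohort_person_years.csv")   # person_years, observed_deaths by age_group/sex
ref = pd.read_csv("reference_mortality_rates.csv")          # ref_rate (deaths per person-year) by age_group/sex

merged = cohort.merge(ref, on=["age_group", "sex"])
expected = (merged["person_years"] * merged["ref_rate"]).sum()
observed = merged["observed_deaths"].sum()
print(f"SMR = {observed / expected:.2f}")   # >1: excess mortality vs. the reference; <1: lower
```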


Subject(s)
Emigrants and Immigrants , Mortality/ethnology , Mortality/trends , Adult , Female , Humans , Israel/epidemiology , Male , Registries , Retrospective Studies
19.
Ear Nose Throat J ; 92(10-11): E6, 2013.
Article in English | MEDLINE | ID: mdl-24170477

ABSTRACT

Esthesioneuroblastoma (ENB) is a rare tumor of the olfactory mucosa. We treated a 50-year-old man with an ENB in the right ethmoid sinus who had been diagnosed 16 years earlier with syndrome of inappropriate antidiuretic hormone secretion (SIADH) of unknown cause. When the ENB was surgically removed, the patient's osmoregulation returned to normal; that is, his SIADH resolved completely, which suggested that the SIADH was paraneoplastic in nature. These events prompted us to review the literature to determine whether there is an association between ENB and SIADH in general, and with long-standing SIADH preceding the diagnosis of ENB in particular. Based on our review and an extrapolation of data, we have estimated that 1,300 cases of ENB have occurred since the tumor was first described in 1924. Of these, SIADH was reported in 26 cases, including ours, representing an estimated prevalence of 2% (although we believe this is an underestimation of the true prevalence). Of the 26 cases, SIADH had already been present in 14 patients (54%) prior to their diagnosis of ENB, for a median duration of 3.5 years. We recommend that patients with newly diagnosed ENB be evaluated for SIADH. In those who are SIADH-positive, resolution of the SIADH should be expected once the ENB has been removed; if this does not occur, one should suspect that the ENB was not completely removed. If the SIADH resolves but later recurs during follow-up, a relapse should be suspected. In long-standing SIADH of unknown etiology, nasal sinus imaging should be considered.


Subject(s)
Esthesioneuroblastoma, Olfactory/diagnosis , Inappropriate ADH Syndrome/etiology , Paranasal Sinus Neoplasms/diagnosis , Paraneoplastic Syndromes/etiology , Esthesioneuroblastoma, Olfactory/complications , Esthesioneuroblastoma, Olfactory/surgery , Humans , Male , Middle Aged , Paranasal Sinus Neoplasms/complications , Paranasal Sinus Neoplasms/surgery
20.
Int J Surg ; 11(5): 400-6, 2013.
Article in English | MEDLINE | ID: mdl-23499901

ABSTRACT

BACKGROUND: Excess adverse events may be attributable to poor surgical performance but also to case mix, which is controlled for through the Standardized Incidence Ratio (SIR). SIR calculations can be complicated, resource-consuming, and infeasible in some settings. This article suggests a novel method for SIR approximation. METHODS: In order to evaluate a potential SIR surrogate measure, we predefined acceptance criteria. We developed a new measure, the Approximate Risk Index (ARI). The "Number Needed for Event" (NNE) is the theoretical number of patients needed "to produce" one adverse event. ARI is defined as the quotient of Ge, the group of patients needed for no observed events, by Ga, the total number of patients treated. Our evaluation compared 2500 surgical units and over 3 million heterogeneous-risk surgical patients generated through a computerized simulation. The surgical units' data were computed for SIR and ARI to evaluate compliance with the predefined criteria. Approximation was evaluated by correlation analysis, and performance-prediction capability by Receiver Operating Characteristic (ROC) analysis. RESULTS: ARI strongly correlates with SIR (r2 = 0.87, p < 0.05). ARI prediction of excessive risk yielded an excellent ROC curve (area under the curve > 0.9) with 87% sensitivity and 91% specificity. DISCUSSION AND CONCLUSIONS: ARI provides a good approximation of SIR and excellent prediction capability. ARI is simple and cost-effective, as it requires thorough risk evaluation only of the patients with adverse events. ARI can provide a crucial screening, performance-evaluation and quality-control tool. The ARI method may suit other clinical and epidemiological settings in which a relatively small fraction of the entire population is affected.
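Because the abstract's definition of Ge is terse, the sketch below encodes one plausible reading: each adverse-event patient contributes an NNE of 1/p_i (the theoretical number of patients needed to produce one such event at that patient's predicted risk), Ge sums these contributions, and ARI = Ge/Ga. The paper's exact definition may differ, and the risks below are placeholders.

```python
# One plausible reading of the ARI computation (an assumption, not the paper's
# verified formula): Ge = sum of 1/p_i over the adverse-event patients, ARI = Ge / Ga.
def approximate_risk_index(adverse_event_risks, total_patients_treated):
    """ARI = Ge / Ga, with Ge built only from the adverse-event patients' predicted risks."""
    ge = sum(1.0 / p for p in adverse_event_risks)   # summed NNE of the observed events
    return ge / total_patients_treated

# Illustrative surgical unit: 1,000 operations, 12 adverse events with these predicted risks
risks = [0.02, 0.05, 0.01, 0.10, 0.03, 0.04, 0.02, 0.08, 0.01, 0.05, 0.03, 0.06]
print(f"ARI = {approximate_risk_index(risks, 1000):.2f}")
# Under this reading, values above 1 behave like SIR > 1 (more events than the case mix predicts).
```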


Subject(s)
General Surgery/standards , Risk Assessment/methods , Surgical Procedures, Operative/standards , Clinical Competence/standards , Clinical Competence/statistics & numerical data , Cohort Studies , General Surgery/organization & administration , General Surgery/statistics & numerical data , Humans , Perioperative Period/statistics & numerical data , Postoperative Complications/epidemiology , Predictive Value of Tests , ROC Curve , Surgical Procedures, Operative/adverse effects , Surgical Procedures, Operative/statistics & numerical data , Treatment Outcome