Results 1 - 20 of 110
1.
Ir Med J ; 109(9): 466, 2016 Oct 12.
Article in English | MEDLINE | ID: mdl-28125180

ABSTRACT

In Ireland, Warfarin is the primary anticoagulant prescribed in the secondary prevention of provoked DVT. We completed a comprehensive cost analysis of a trial group of 24 patients treated with Rivaroxaban (between November 2013 and December 2014) versus a control group treated with Warfarin (between January 2008 and November 2013). The groups were matched for gender (3/7 M/F ratio), DVT type (5 proximal, 19 distal DVTs), provoking factor (20 traumatic, 4 atraumatic), and age. We calculated the cost for each group based on drug administration and clinic costs (labour, sample analysis, and additional costs). Warfarin patients attended clinic a mean of 14.58 times; Rivaroxaban patients attended 2.92 times. Overall, the cost per patient on Rivaroxaban was €273.30 versus €260.68 on Warfarin. This excludes patient costs, which would further increase the cost of Warfarin therapy.
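
As a rough illustration of how per-patient figures of this kind are assembled (drug acquisition cost plus clinic visits multiplied by a per-visit cost covering labour, sample analysis and additional costs), the sketch below uses the visit counts reported above; the unit costs in it are hypothetical placeholders, not the figures from the study.

```python
# Minimal sketch of a per-patient cost comparison of the kind described above.
# Visit counts are from the abstract; the unit costs are HYPOTHETICAL placeholders.

def per_patient_cost(drug_cost: float, visits: float, cost_per_visit: float) -> float:
    """Total cost = drug acquisition cost + visits * per-visit clinic cost."""
    return drug_cost + visits * cost_per_visit

warfarin = per_patient_cost(drug_cost=20.0, visits=14.58, cost_per_visit=16.5)      # hypothetical unit costs
rivaroxaban = per_patient_cost(drug_cost=230.0, visits=2.92, cost_per_visit=16.5)   # hypothetical unit costs

print(f"Warfarin:    EUR {warfarin:.2f} per patient")
print(f"Rivaroxaban: EUR {rivaroxaban:.2f} per patient")
```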


Subject(s)
Anticoagulants/economics , Factor Xa Inhibitors/economics , Rivaroxaban/economics , Venous Thrombosis/drug therapy , Warfarin/economics , Anticoagulants/administration & dosage , Costs and Cost Analysis , Drug Costs , Factor Xa Inhibitors/administration & dosage , Female , Humans , Ireland , Male , Rivaroxaban/administration & dosage , Secondary Prevention/economics , Venous Thrombosis/etiology , Warfarin/administration & dosage
2.
Anaesth Intensive Care ; 35(3): 406-8, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17591137

ABSTRACT

We present the use of transtracheal jet ventilation in two uncooperative patients with a difficult airway. Although transtracheal jet ventilation is considered a last-resort option in the difficult airway algorithm, it can be a valuable tool in selected difficult airway situations. Transtracheal jet ventilation can effectively maintain arterial oxygenation and provide extra time for attempts to intubate the trachea, either directly or fibreoptically.


Subject(s)
Airway Obstruction/therapy , High-Frequency Jet Ventilation/methods , Intubation, Intratracheal/methods , Accidents, Traffic , Adult , High-Frequency Jet Ventilation/instrumentation , Humans , Male , Middle Aged
3.
Sci Total Environ ; 372(1): 247-55, 2006 Dec 15.
Article in English | MEDLINE | ID: mdl-17095051

ABSTRACT

In this overland flow simulation experiment, the relationships between flow path length, flow rate and the concentration of different P fractions were investigated. Overland flow was simulated using a 3 m x 0.12 m laboratory flume. To remove the impact of rainfall on P lost in overland flow, simulated rainfall was not used during these experiments. Instead, overland flow was generated by pumping water into the flume at the surface of the grass sod. The experimental setup allowed for variation in flow path length and flow rate between and during experimental runs. The results demonstrated that an increase in flow path length caused an increase in Total Dissolved P (TDP), Dissolved Reactive P (DRP) and Total Reactive P (TRP) concentrations in overland flow (p<0.01), while an increase in flow rate resulted in a decrease in the concentration of these P fractions in overland flow due to dilution (p<0.01). Total P (TP), Particulate P (PP) and Dissolved Organic P were not affected by the variables tested during this study. When flow path length was increased in conjunction with flow rate, there was an increase in TDP, DRP, and TRP concentrations despite the impact of greater dilution. The results indicate that variations in flow path length during a rainfall event may play a role in determining the concentration of dissolved P fractions in overland flow at the field scale.


Subject(s)
Phosphorus/analysis , Water Movements , Poaceae , Rain , Soil
4.
Brain Res ; 1041(2): 181-97, 2005 Apr 18.
Article in English | MEDLINE | ID: mdl-15829227

ABSTRACT

The way in which the cerebellum influences the output of the motor cortex is not known. The aim of this study was to establish whether information about force, velocity or duration of movement is encoded in cerebellar thalamic discharge and could therefore be involved in the modulation of motor cortical activity. Extracellular single cell recordings were made from the cerebellar thalamus (66 neurones) and VPLc (49 neurones) of four conscious macaques performing simple wrist movements with various load and gain conditions imposed. A significant correlation (Spearman's; P<0.05) was found between movement duration and the duration of neuronal discharge of most cerebellar thalamic neurones (65%), the velocity of movement and rate of neuronal discharge of some cerebellar thalamic neurones (23%), but not between force of movement and rate of neuronal discharge of any cerebellar thalamic neurones. Similar relationships were found between the activity of VPLc neurones and these movement parameters. The strength of the correlations increased when many cells were grouped and analysed as an ensemble, suggesting that populations of cerebellar thalamic (and VPLc) neurones can encode a signal with higher fidelity than single neurones alone. The ensemble data confirmed that the most robust association was between the duration of neuronal discharge and movement duration. We propose that the cerebellum does not provide the motor cortex with specific information about movement force or velocity, but rather that its major role is in activating many motor cortical regions for a specific duration, thus influencing the timing of complex movements involving many muscles and joints.
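
For readers who want to reproduce this kind of analysis, the snippet below is a minimal sketch of a Spearman rank correlation test of the sort named in the abstract (duration of neuronal discharge versus movement duration, P<0.05), run on synthetic data; it is not the authors' code or data.

```python
# Illustrative Spearman rank correlation on synthetic data, mimicking the test
# named in the abstract; NOT the authors' recordings.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
movement_duration = rng.uniform(0.2, 1.0, size=66)                    # seconds, one value per trial
discharge_duration = movement_duration + rng.normal(0, 0.1, size=66)  # correlated, noisy "neural" measure

rho, p = spearmanr(movement_duration, discharge_duration)
print(f"Spearman rho = {rho:.2f}, P = {p:.3g}")
if p < 0.05:
    print("Significant correlation between discharge duration and movement duration")
```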


Subject(s)
Cerebellum/physiology , Macaca/physiology , Movement/physiology , Neural Pathways/physiology , Thalamic Nuclei/physiology , Wrist/physiology , Action Potentials/physiology , Animals , Biomechanical Phenomena , Cerebellum/anatomy & histology , Macaca/anatomy & histology , Macaca fascicularis , Macaca mulatta , Macaca nemestrina , Models, Neurological , Motor Cortex/physiology , Muscle Contraction/physiology , Muscle, Skeletal/innervation , Muscle, Skeletal/physiology , Neural Pathways/anatomy & histology , Neurons/physiology , Signal Processing, Computer-Assisted , Synaptic Transmission/physiology , Thalamic Nuclei/anatomy & histology , Time Factors , Wrist/innervation , Wrist Joint/physiology
5.
Reproduction ; 126(4): 481-7, 2003 Oct.
Article in English | MEDLINE | ID: mdl-14525530

ABSTRACT

Overnourishing adolescent ewes throughout pregnancy promotes maternal tissue synthesis at the expense of placental growth, which in turn leads to a major decrease in lamb birth weight. As maternal dietary intakes are inversely related to peripheral progesterone concentrations in these adolescent dams, it was hypothesized that sub-optimal progesterone concentrations in overnourished dams may compromise the growth of the differentiating conceptus, resulting in fewer uterine caruncles being occupied and, hence, fewer placentomes formed. This hypothesis was tested by supplementing overnourished adolescent dams with exogenous progesterone during early pregnancy and determining the impact on pregnancy outcome at term. Embryos recovered from superovulated adult ewes inseminated by a single sire were transferred in singleton to the uterus of peripubertal adolescent recipients. After transfer of embryos, ewes were offered a moderate or high amount of a complete diet (n=11 per group). A further high intake group received a progesterone supplement each day from day 5 to day 55 of gestation (term=145 days) to restore circulating progesterone concentrations to moderate values throughout the first third of pregnancy (n=11). For ewes establishing pregnancies (n=7 per group), live weight gain during the first 100 days of gestation was 66+/-4, 323+/-17 and 300+/-7 g per day, body condition score at term was 2.1+/-0.05, 3.0+/-0.08 and 3.1+/-0.07 units, and the duration of gestation after spontaneous delivery was 148+/-1.7, 144+/-0.8 and 143+/-0.8 days for the moderate intake, high intake and high intake plus progesterone groups, respectively. At delivery, fetal cotyledon mass (136+/-12.1 versus 57+/-8.2 g, P<0.001) and lamb birth weight (5164+/-151 versus 2893+/-381 g, P<0.001) were higher in moderate intake than in high intake dams. Progesterone supplementation restored circulating concentrations to moderate values during the first third of gestation. Lamb birth weight in the high intake plus progesterone group (4150+/-389 g) was intermediate between the high intake (P<0.02) and moderate intake (P<0.05) groups, but this change in birth weight was not associated with corresponding changes in fetal cotyledon mass (76+/-10.3 g). Moreover, the number of fetal cotyledons was similar in all three groups. Thus, progesterone did not directly affect the growth of the fetal cotyledon but may have influenced placental vascularity, blood flow or nutrient transfer capacity or, alternatively, the development of the embryonic inner cell mass.


Subject(s)
Overnutrition/veterinary , Pregnancy Complications/veterinary , Progesterone/administration & dosage , Sheep Diseases/drug therapy , Animals , Animals, Newborn , Birth Weight , Embryonic and Fetal Development/drug effects , Female , Overnutrition/drug therapy , Placenta/drug effects , Placentation , Pregnancy , Pregnancy Complications/drug therapy , Progesterone/blood , Sexual Maturation/physiology , Sheep
6.
Eur J Neurosci ; 18(5): 1175-88, 2003 Sep.
Article in English | MEDLINE | ID: mdl-12956716

ABSTRACT

Previously, we described the extent of sprouting that axons of the rat substantia nigra pars compacta (SNpc) undergo to grow new synapses and re-innervate the dorsal striatum 16 weeks after partial lesions. Here we provide insights into the timing of events related to the re-innervation of the dorsal striatum by regenerating dopaminergic nigrostriatal axons over a 104-week period after partial SNpc lesioning. The density of dopamine transporter and tyrosine hydroxylase immunoreactive axonal varicosities (terminals) decreased by up to 80% at 4 weeks after lesioning but returned to normal by 16 weeks, unless SNpc lesions were greater than 75%. Neuronal tracer injections into the SNpc revealed a 119% increase in axon fibres (4 mm rostral to the SNpc) along the medial forebrain bundle 4 weeks after lesioning. SNpc cells underwent phenotypic changes. Four weeks after lesioning, the proportion of SNpc neurons that expressed tyrosine hydroxylase fell from 90% to 38%, but it returned to 78% by 32 weeks. We discuss these phenotype changes in the context of neurogenesis. Dopamine levels, which were significantly reduced in rats with medium (30-75%) lesions, returned to normal by 16 weeks, whereas recovery was not observed if lesions were larger than 75%. Finally, rotational behaviour of animals in response to amphetamine was examined. The clear rightward turning bias observed after 2 weeks recovered by 16 weeks in animals with medium (30-75%) lesions but was still present when lesions were larger. These studies provide insights into the processes that regulate sprouting responses in the central nervous system following injury.


Subject(s)
Dopamine/metabolism , Membrane Glycoproteins , Nerve Regeneration/physiology , Nerve Tissue Proteins , Oxidopamine/toxicity , Substantia Nigra/drug effects , Sympatholytics/toxicity , Animals , Axons/physiology , Behavior, Animal , Biotin/pharmacokinetics , Cell Count , Dextrans/pharmacokinetics , Dopamine Plasma Membrane Transport Proteins , Dose-Response Relationship, Drug , Immunohistochemistry , Male , Medial Forebrain Bundle/drug effects , Medial Forebrain Bundle/metabolism , Membrane Transport Proteins/metabolism , Neurons/metabolism , Rats , Rats, Wistar , Rotation , Substantia Nigra/injuries , Substantia Nigra/physiology , Time Factors , Tyrosine 3-Monooxygenase/metabolism
7.
Reproduction ; 122(3): 347-57, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11597301

ABSTRACT

Human adolescent mothers have an increased risk of delivering low birth weight and premature infants with high mortality rates within the first year of life. Studies using a highly controlled adolescent sheep paradigm demonstrate that, in young growing females, the hierarchy of nutrient partitioning during pregnancy is altered to promote growth of the maternal body at the expense of the gradually evolving nutrient requirements of the gravid uterus and mammary gland. Thus, overnourishing adolescent dams throughout pregnancy results in a major restriction in placental mass, and leads to a significant decrease in birth weight relative to adolescent dams receiving a moderate nutrient intake. High maternal intakes are also associated with increased rates of spontaneous abortion in late gestation and, for ewes delivering live young, with a reduction in the duration of gestation and in the quality and quantity of colostrum accumulated prenatally. As the adolescent dams are of equivalent age at the time of conception, these studies indicate that nutritional status during pregnancy rather than biological immaturity predisposes the rapidly growing adolescents to adverse pregnancy outcome. Nutrient partitioning between the maternal body and gravid uterus is putatively orchestrated by a number of endocrine hormones and, in this review, the roles of both maternal and placental hormones in the regulation of placental and fetal growth in this intriguing adolescent paradigm are discussed. Impaired placental growth, particularly of the fetal component of the placenta, is the primary constraint to fetal growth during late gestation in the overnourished dams and nutritional switch-over studies indicate that high nutrient intakes during the second two-thirds of pregnancy are most detrimental to pregnancy outcome. In addition, it may be possible to alter the nutrient transport function of the growth-restricted placenta in that the imposition of a catabolic phase during the final third of pregnancy in previously rapidly growing dams results in a modest increase in lamb birth weight.


Subject(s)
Pregnancy in Adolescence , Adolescent , Animals , Female , Gestational Age , Hormones/physiology , Humans , Infant, Low Birth Weight , Infant, Newborn , Infant, Premature , Nutritional Physiological Phenomena , Placenta/physiology , Pregnancy , Pregnancy Outcome , Sheep
8.
Arch Intern Med ; 161(14): 1751-8, 2001 Jul 23.
Article in English | MEDLINE | ID: mdl-11485508

ABSTRACT

BACKGROUND: The results of in-hospital resuscitations may depend on a variety of factors related to the patient, the environment, and the extent of resuscitation efforts. We studied these factors in a large tertiary referral hospital with a dedicated certified resuscitation team responding to all cardiac arrests. METHODS: Statistical analysis of 445 prospectively recorded resuscitation records of patients who experienced cardiac arrest and received advanced cardiac life support resuscitation. We also report the outcomes of an additional 37 patients who received limited resuscitation efforts because of advance directives prohibiting tracheal intubation, chest compressions, or both. MAIN OUTCOME MEASURES: Survival immediately after resuscitation, at 24 hours, at 48 hours, and until hospital discharge. RESULTS: Overall, 104 (23%) of 445 patients who received full advanced cardiac life support survived to hospital discharge. Survival was highest for patients with primary cardiac disease (30%), followed by those with infectious diseases (15%), with only 8% of patients with end-stage diseases surviving to hospital discharge. Neither sex nor age affected survival. Longer resuscitations, increased epinephrine and atropine administration, multiple defibrillations, and multiple arrhythmias were all associated with poor survival. Patients who experienced arrests on a nursing unit or intensive care unit had better survival rates than those in other hospital locations. Survival for witnessed arrests (25%) was significantly better than for nonwitnessed arrests (7%) (P =.005). There was a disproportionately high incidence of nonwitnessed arrests during the night (12 AM to 6 AM) in unmonitored beds, resulting in uniformly poor survival to hospital discharge (0%). None of the patients whose advance directives limited resuscitation survived. CONCLUSIONS: Very ill patients in unmonitored beds are at increased risk for a nonwitnessed cardiac arrest and poor resuscitation outcome during the night. Closer vigilance of these patients at night is warranted. The outcome of limited resuscitation efforts is very poor.


Subject(s)
Cardiopulmonary Resuscitation/mortality , Heart Arrest/mortality , Adult , Aged , Female , Hospital Mortality , Humans , Male , Middle Aged , Multivariate Analysis , Ohio/epidemiology , Prospective Studies , Resuscitation Orders , Risk , Risk Factors , Survival Analysis , Time Factors , Treatment Outcome
10.
Anesthesiol Clin North Am ; 18(4): 919-51, 2000 Dec.
Article in English | MEDLINE | ID: mdl-11094698

ABSTRACT

Organ viability in renal transplantation is a product of the management of the donor, the allograft, and the recipient. Short- and long-term outcome is influenced by perioperative fluid and drug treatment, and the function and viability of the transplanted kidney seem to be optimized if graft perfusion is maximized through mild hypervolemia. At the same time, intraoperative fluids must be balanced carefully against the cardiovascular problems frequently encountered in patients with uremia. Close intraoperative monitoring, optimization of intravascular fluid volume status to maximize kidney perfusion, and prompt correction of electrolyte disturbances (especially potassium) are key to the short- and long-term success of renal transplants.


Subject(s)
Anesthesia/methods , Kidney Transplantation , Graft Survival , Humans , Immunosuppressive Agents/adverse effects , Immunosuppressive Agents/therapeutic use , Intraoperative Complications , Postoperative Care , Preoperative Care
11.
Exp Brain Res ; 133(4): 514-31, 2000 Aug.
Article in English | MEDLINE | ID: mdl-10985686

ABSTRACT

Three monkeys were trained to perform stereotyped wrist movements to track a target (phase 1). Changing the gain between the wrist movement and visual display required the monkey to adapt its wrist movement. This adaptation consisted of progressive reduction of movement amplitude over a number of trials (phase 2) until a stereotyped movement accommodating the new gain was learned (phase 3). The experiment's aim was to investigate whether cerebellar thalamic neuronal discharge (ND) changed during motor adaptation and whether this change was related to scaling of kinematic parameters or movement error. Extracellular single-cell recordings were made from "wrist-related" neurones in the cerebellar thalamus (59) and the nucleus ventro-posterior lateralis caudalis (VPLc) (37) of each monkey while they performed the movement paradigm. Neurones were selected for further analysis (37/59 cerebellar thalamic and 23/37 VPLc) if phase-1 movements were stereotyped and motor adaptation occurred in phase 2 (according to statistical definitions). When the gain initially changed, there were positional errors in the form of overshoot. Adaptation to the new gain was achieved by a variety of strategies, including modification of the amplitude of kinematic parameters and positional error in addition to reduction of time to peak velocity and movement time. During stereotyped movements, most cerebellar thalamic neurones fired before movement onset and before VPLc neurones. During adaptation, this order of onset of firing was reversed, and cerebellar thalamic neurones discharged after VPLc neurones and close to the onset of movement. During motor adaptation, the mean rate of phasic ND rose in a large proportion of cerebellar thalamic and VPLc neurones, and the proportion of cerebellar thalamic neurones that encoded a signal about positional error and movement amplitude also increased. In addition, there is set-related activity in the discharge of a majority of cerebellar thalamic and VPLc neurones. This does not appear to be specifically related to motor adaptation, but is related to the movement amplitude. We have discussed the role of the cerebello-thalamo-cortical pathway in error detection in the light of the similarities between discharge patterns of cerebellar thalamic and VPLc neurones. We speculate that, when learned movements are performed, the discharge of cerebellar thalamic neurones occurs before movement, perhaps representing an efference copy of the intended movement. During adaptation, this signal is gated out, and later-arriving peripheral afferent input dominates cerebellar thalamic discharge.


Subject(s)
Cerebellum/physiology , Cues , Movement/physiology , Neurons/physiology , Thalamus/physiology , Action Potentials/physiology , Animals , Macaca fascicularis , Macaca nemestrina , Male
12.
Anesthesiology ; 92(3): 902-3, 2000 Mar.
Article in English | MEDLINE | ID: mdl-10719983
13.
Placenta ; 21(1): 100-8, 2000 Jan.
Article in English | MEDLINE | ID: mdl-10692257

ABSTRACT

The aim was to investigate the consequences of nutritionally mediated placental growth restriction on fetal organ growth, conformation, body composition and endocrine status during late gestation. Embryos recovered from superovulated adult ewes inseminated by a single sire were transferred in singleton to the uterus of peripubertal adolescent recipients. Post-transfer, adolescent dams were offered a high (H) or moderate (M) level of a complete diet to promote rapid or moderate maternal growth rates, respectively (n=7 per group). After day 100 of gestation, the feed intake of the M dams was adjusted weekly to maintain body condition score. Liveweight gain during the first 100 days of gestation was 301+/-24 and 90+/-4.6 g/day for the H and M groups, respectively. Maternal plasma concentrations of insulin, IGF-I and urea were significantly higher and non-esterified fatty acid concentrations significantly lower in H compared with M dams prior to slaughter on day 128 of gestation. At this stage of gestation, total placentome weight was 50 per cent lower in H compared with M groups (P< 0.001) and was associated with a 37 per cent reduction in fetal weight (P< 0.01). All variables of fetal conformation and absolute fetal organ weights, with the exception of the adrenal glands, were lower (P< 0.05) in the fetuses from H intake dams. However, relative fetal organ weights expressed as g/kg fetal body weight, with the exception of the gut, were not influenced by maternal dietary intake. Furthermore, fetal weight, but not maternal nutritional group, was predictive of individual organ weight for all organs dissected. Together these results imply that growth restriction in the fetuses derived from H intake dams was largely symmetrical. Fetal plasma concentrations of insulin, IGF-I and glucose were attenuated (P< 0.05) in fetuses from H compared with M groups. The lower fetal body weight in the former group was associated with a reduction in absolute but not relative crude protein (P< 0.01) and fat content (P< 0.05). Total fetal liver glycogen content, but not concentration, was reduced (P< 0.05) in H versus M groups. The lower mass of both the placenta and fetal liver was due to a reduction in cell number rather than an alteration in cell size. Thus, over-nourishing adolescent sheep is associated with a major restriction in placental growth which mediates a gradual slowing of fetal growth during the final third of pregnancy.


Subject(s)
Embryonic and Fetal Development , Placentation , Animal Nutritional Physiological Phenomena , Animals , Blood Glucose/metabolism , Body Composition , Eating , Embryonic and Fetal Development/physiology , Endocrine Glands/physiology , Female , Fetal Blood/metabolism , Gestational Age , Insulin/blood , Insulin-Like Growth Factor I/metabolism , Maternal-Fetal Exchange , Placenta/physiology , Pregnancy , Sheep , Urea/blood
14.
Anesth Analg ; 90(2): 388-92, 2000 Feb.
Article in English | MEDLINE | ID: mdl-10648327

ABSTRACT

UNLABELLED: We studied 20 patients over the age of 65 yr undergoing prolonged peripheral vascular surgery under continuous lidocaine epidural anesthesia, anticipating that the increased hepatic metabolism caused by small-dose IV dopamine would lower plasma lidocaine concentrations. Subjects were randomly assigned, in double-blinded fashion, to receive either a placebo IV infusion or dopamine, 2 microg kg(-1) min(-1), during and for 5 h after surgery. Five minutes after the IV infusion was started, 20 mL of 2% lidocaine was injected through the epidural catheter. One-half hour later, a continuous epidural infusion of 2% lidocaine at 10 mL/h was begun. The epidural infusion was temporarily decreased to 5 mL/h or 5 mL boluses were added to maintain a T8 analgesic level. Arterial blood samples were analyzed for plasma lidocaine concentrations regularly during and for 5 h after surgery. Plasma lidocaine concentrations increased continuously during the epidural infusion and, despite wide individual variation, were similar for the two groups throughout the observation period. During the observation period, the mean maximal plasma lidocaine concentration was 5.8 +/- 2.3 microg/mL in the control group and 5.7 +/- 1.2 microg/mL in the dopamine group. However, the mean hourly lidocaine requirement during surgery was significantly different, 242 +/- 72 mg/h for control and 312 +/- 60 mg/h for dopamine patients (P < 0.03). At the end of Hour 4, the last period when all 20 patients were still receiving the epidural lidocaine infusion, the total lidocaine requirement was significantly different, 1088 +/- 191 mg for the control group and 1228 +/- 168 mg for the dopamine group (P < 0.05). Despite very large total doses of epidural lidocaine (1650 +/- 740 mg for control patients and 1940 +/- 400 mg for dopamine patients), mean maximal plasma concentrations remained below 6 microg/mL, and no patient exhibited signs or symptoms of toxicity. We conclude that small-dose IV dopamine increased epidural lidocaine requirements, presumably as a consequence of increased metabolism. IMPLICATIONS: We tested dopamine, a drug that increases liver metabolism of the local anesthetic lidocaine, to determine whether it would prevent excessively large amounts of lidocaine in the blood during prolonged epidural anesthesia in elderly patients. Dopamine did not alter the blood levels of lidocaine, but it did increase the lidocaine dose requirement to maintain adequate epidural anesthesia.
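
As a rough sanity check on the reported comparison of hourly lidocaine requirements (242 +/- 72 versus 312 +/- 60 mg/h, P < 0.03), the sketch below recomputes a two-sample t-test from the summary statistics alone. The split of 10 patients per arm is an assumption (the abstract reports only 20 patients in two groups), and the original statistical test used by the authors is not stated.

```python
# Two-sample t-test recomputed from the published summary statistics.
# Group sizes of 10 per arm are ASSUMED; Welch's correction is a choice, not the authors' stated method.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=242, std1=72, nobs1=10,   # control group, mg/h
                            mean2=312, std2=60, nobs2=10,   # dopamine group, mg/h
                            equal_var=False)                # Welch's t-test
print(f"t = {t:.2f}, two-sided P = {p:.3f}")  # roughly consistent with the reported P < 0.03
```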


Subject(s)
Adjuvants, Anesthesia , Anesthesia, Epidural , Anesthetics, Local , Dopamine , Lidocaine , Vascular Surgical Procedures , Adjuvants, Anesthesia/administration & dosage , Aged , Anesthetics, Local/administration & dosage , Anesthetics, Local/adverse effects , Anesthetics, Local/blood , Dopamine/administration & dosage , Double-Blind Method , Female , Humans , Lidocaine/administration & dosage , Lidocaine/adverse effects , Lidocaine/blood , Male
15.
Mil Med ; 164(11): 780-4, 1999 Nov.
Article in English | MEDLINE | ID: mdl-10578588

ABSTRACT

The percentage of penetrating eye injuries in war has increased significantly in this century relative to the total number of combat injuries. With the increasing use of fragmentation weapons, and possibly laser weapons, on the battlefield in the future, the rate of eye injuries may exceed the 13% of total military injuries recorded in Operations Desert Storm/Shield. Experience with eye injuries during the Iran-Iraq War (1980-1988) showed that retained foreign bodies and posterior segment injuries now carry an improved prognosis as a result of modern diagnostic and treatment modalities, which is encouraging for future military ophthalmic surgery. Even so, advances in ophthalmic surgery are insignificant compared with the increase in penetrating eye injuries on the battlefield. Eye armor, such as visors that flip up and down and protect the eyes from laser injury, needs to be developed. Similar eye protection is being developed in civilian sportswear. Penetrating eye injury in the civilian sector is becoming much closer to the military model and is now comparable for several reasons.


Subject(s)
Eye Injuries, Penetrating/etiology , Warfare , Adult , Child , Eye Injuries, Penetrating/epidemiology , Eye Injuries, Penetrating/surgery , Eye Protective Devices , Female , Humans , Male , Middle East , Military Personnel , United States
16.
J Neurosci Methods ; 91(1-2): 123-33, 1999 Sep 15.
Article in English | MEDLINE | ID: mdl-10522831

ABSTRACT

In a previous paper (Churchward PR, Butler EG, Finkelstein DI, Aumann TD, Sudbury A, Horne MK. J Neurosci Methods 1997;76:203-210), we showed that a simple back-propagation neural network could reliably model visual inspection by human observers in detecting the point of change of neuronal discharge patterns. The data for that study were deliberately chosen so that the point of change was readily detected and there would be high concordance between human observers. We wished to extend this investigation by comparing a variety of automatic analysis methods on more complex data sets. Two automatic analysis methods are discussed in this paper. The knowledge-based spike train analysis (KBSTA) was designed to emulate the detection of bursts by human observers. The self-organizing feature map (SOFM) spike train analysis determined a burst by classifying the patterns of neuronal discharge. Neuronal discharge was recorded from the motor thalamus and nucleus ventralis posterior lateralis caudalis (VPLc) of a monkey performing consecutive trials of skilled wrist movements. Recordings were made from 36 neurons whose discharge patterns were related to wrist movement. Three hundred and sixty trials performed during the recording of these 36 neurons were chosen at random and used to compare the three methods: KBSTA, SOFM, and visual inspection. The main results of this study show that, for the 360 trials, the three detection methods gave very similar results in detecting the onset and offset of neuronal bursts. The SOFM method is not the best first approach for detecting a burst, but it does provide independent evidence to support the KBSTA and visual inspection methods. In conclusion, we propose the KBSTA method as a practical, automatic technique to identify bursts of neuronal discharge.
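
To make the burst-detection task concrete, the sketch below shows a simple interspike-interval threshold rule for locating the onset and offset of a burst in a spike train. It is only an illustration of the problem being solved, not the KBSTA or SOFM algorithm described in the paper; the threshold and minimum spike count are arbitrary illustrative values.

```python
# Toy burst detector: finds the first run of spikes whose consecutive
# interspike intervals (ISIs) are all short. NOT the KBSTA or SOFM method.
import numpy as np

def detect_burst(spike_times, max_isi=0.02, min_spikes=3):
    """Return (onset, offset) of the first run of >= min_spikes spikes
    whose consecutive ISIs are all <= max_isi seconds, or None."""
    spike_times = np.asarray(spike_times)
    short = np.diff(spike_times) <= max_isi
    start = 0
    while start < len(short):
        if short[start]:
            end = start
            while end < len(short) and short[end]:
                end += 1
            if end - start + 1 >= min_spikes:      # spikes in run = intervals + 1
                return spike_times[start], spike_times[end]
            start = end
        else:
            start += 1
    return None

spikes = [0.10, 0.35, 0.36, 0.372, 0.381, 0.395, 0.70, 1.02]  # seconds
print(detect_burst(spikes))  # -> (0.35, 0.395): onset and offset of the burst
```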


Subject(s)
Action Potentials/physiology , Movement/physiology , Neurons/physiology , Thalamic Nuclei/physiology , Animals , Macaca mulatta , Neural Networks, Computer , Observer Variation , Poisson Distribution , Thalamic Nuclei/cytology , Time Factors
17.
Anesth Analg ; 89(2): 384-9, 1999 Aug.
Article in English | MEDLINE | ID: mdl-10439752

ABSTRACT

UNLABELLED: Anticipated technical difficulty is one factor that can influence the anesthesiologist's decision to perform neuraxial (spinal or epidural) blockade. Problems during the procedure may be associated with patient dissatisfaction, neurologic sequelae, or hematoma. We designed this study of 595 neuraxial blocks to determine whether any patient characteristics would be useful in predicting a difficult neuraxial block. Before the procedure, the following data were noted: demographic data, body habitus (normal, thin, muscular, obese), spinal landmarks (good = easily palpable spinous processes, poor = difficult to palpate spinous processes, none = unable to positively identify spinous processes), and spinal anatomy (assessed by inspection and examination as normal or deformed). We noted the technique, approach, needle type, needle gauge, etc. We also recorded whether the procedure was completed at the first (first-level success) or second spinal level and the total number of new skin punctures (attempts) necessary to complete the procedure. Of all the factors considered, the quality of landmarks best correlated with technical difficulty as measured by both first-level success and number of attempts. Abnormal spinal anatomy correlated with difficulty as measured by number of attempts. Body habitus also correlated with difficulty, but only as measured by number of attempts. There was no association between either measure of difficulty and any of the following: age, sex, spinal versus epidural, approach, needle type, needle gauge, or training level of the provider. Thoracic epidurals were less difficult than lumbar epidurals by both measures of difficulty. We conclude that body habitus does not seem to be the best predictor of technical difficulty. An examination of the patient's back for the quality of landmarks and obvious anatomical deformity better predicts the ease or difficulty of neuraxial block. Other factors seem to have little or no influence on the difficulty of neuraxial block procedures. IMPLICATIONS: We studied a number of factors, including equipment, technique, and patient characteristics, that may indicate the ease or difficulty of performing neuraxial (spinal and epidural) blocks. Of these factors, only patient characteristics had significant predictive value. We found that an examination of the patient's back for the quality of landmarks and obvious anatomical deformity better predicts the ease or difficulty of neuraxial block than does body habitus.


Subject(s)
Anesthesia, Epidural , Anesthesia, Spinal , Nerve Block , Adolescent , Adult , Aged , Aged, 80 and over , Anesthesiology/education , Anthropometry , Catheterization , Educational Status , Female , Humans , Male , Middle Aged , Needles , Prospective Studies , Risk Factors
18.
Biol Reprod ; 61(1): 101-10, 1999 Jul.
Article in English | MEDLINE | ID: mdl-10377037

ABSTRACT

The aim was to investigate whether placental growth and hence pregnancy outcome could be altered by switching adolescent dams from a high to a moderate nutrient intake, and vice-versa, at the end of the first trimester. Embryos recovered from adult ewes inseminated by a single sire were transferred in singleton to peripubertal adolescents. After transfer, adolescent ewes were offered a high (H, n = 33) or moderate (M, n = 32) level of a diet calculated to promote rapid or moderate maternal growth rates, respectively. At Day 50 of gestation, half the ewes had their dietary intakes switched, yielding 4 treatment groups: HH, MM, HM, and MH. A subset of ewes were killed at Day 104 of gestation to determine maternal body composition in relation to growth of the products of conception. Maternal body composition measurements revealed that the higher live weight in the high-intake dams was predominantly due to an increase in body fat deposition, with a less pronounced increase in body protein. At Day 104, HH and MH groups (high intake during second trimester) compared with MM and HM groups (moderate intake during second trimester) had a lower (p < 0.002) total fetal cotyledon weight; but fetal weight, conformation, and individual organ weights were not significantly influenced by maternal dietary intake. In ewes delivering live young at term, a high plane of nutrition from the end of the first trimester (HH and MH groups) compared with moderate levels (MM and HM groups) was associated with a reduction in gestation length (p < 0.009), total placental weight (p < 0.002), total fetal cotyledon weight (p < 0.001), and mean fetal cotyledon weight per placenta (p < 0.001). Fetal cotyledon number was dependent on maternal dietary intake during the first trimester only and was lower (p < 0.007) in HH and HM ewes compared to MM and MH ewes. The inhibition of fetal cotyledon growth in HH and MH groups was associated with a major decrease (p < 0.001) in lamb birth weight at term relative to the MM and HM groups. Thus, reducing maternal dietary intake from a high to a moderate level at the end of the first trimester stimulates placental growth and enhances pregnancy outcome, and increasing maternal dietary intake at this time point has a deleterious effect on placental development and fetal growth.


Subject(s)
Diet , Embryonic and Fetal Development , Gestational Age , Placenta/physiology , Pregnancy Outcome , Animals , Body Composition , Body Weight , Embryo Transfer , Female , Insulin/blood , Insulin-Like Growth Factor I/analysis , Maternal Age , Organ Size , Placenta/anatomy & histology , Pregnancy , Sheep
20.
J Reprod Fertil Suppl ; 54: 385-99, 1999.
Article in English | MEDLINE | ID: mdl-10692870

ABSTRACT

Inappropriate maternal nutrient intake at key developmental timepoints during ovine pregnancy has a profound influence on the outcome of pregnancy and aspects of postnatal productivity. However, the responses to alterations in maternal nutrition in adult sheep are often highly variable and inconsistent between studies. The growing adolescent sheep provides a new, robust and nutritionally sensitive paradigm with which to study the causes, consequences and reversibility of prenatal growth restriction. Overnourishing the adolescent dam to promote rapid maternal growth throughout pregnancy results in a major restriction in placental mass, and leads to a significant decrease in birthweight relative to moderately fed, normally growing adolescents of equivalent gynaecological age. Maternal insulin and IGF-I concentrations are increased from an early stage of gestation in overnourished adolescent dams, and these hormones ensure that the anabolic drive required to promote maternal tissue synthesis is initiated at a time when the nutrient requirements of the gravid uterus are low. The major restriction in fetal growth in rapidly growing dams occurs irrespective of high concentrations of essential nutrients in the maternal circulation and suggests that the small size or altered metabolic and transport capacity of the placenta is the primary constraint to fetal growth. The decrease in placental weight in the overnourished animals reflects a significant reduction in both fetal cotyledon number and mean cotyledon weight. The role of nutritionally mediated alterations in progesterone and the components of the IGF system in this early-pregnancy placental phenomenon is being investigated. Nutritional switch-over studies have demonstrated that reducing maternal nutrient intake at the end of the first third of pregnancy can stimulate placental growth and enhance pregnancy outcome, but increasing nutrient intake at this time has a deleterious effect on placental development and fetal growth.


Subject(s)
Animal Nutritional Physiological Phenomena , Embryonic and Fetal Development , Sexual Maturation/physiology , Sheep/physiology , Animals , Female , Gestational Age , Insulin/physiology , Placentation , Pregnancy , Somatomedins/physiology