Results 1 - 9 of 9
1.
Toxicon ; 235: 107324, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37838003

ABSTRACT

Proatheris superciliaris, the lowland swamp viper, has a limited distribution along lakeshores and rivers in Malawi, southern Tanzania, and central Mozambique. Its venom is known to be procoagulant. Only five P. superciliaris bites have been reported, all inflicted by captive snakes, and none was fatal. Here we present a case of sudden death following a bite by Proatheris superciliaris in rural Malawi that cannot be attributed to envenoming. A healthy 32-year-old woman was planting rice in a flooded paddy field when she suddenly told her sister in a quiet voice that she had been bitten by a snake. She then collapsed face-upwards into the ankle-deep water. She remained motionless while her sister and uncle carried her out of the rice paddy onto dry land a few meters away. The victim did not regain consciousness. Her uncle heard one exhalation but no further breathing. The snake responsible was killed by a friend. Although the venom of this species can cause life-threatening coagulopathy, this woman's death occurred too rapidly to be attributable to envenoming. Only two explanations seem plausible: anaphylaxis, or vasovagal shock triggered by fear. The victim died within minutes of the bite, closely observed by her anxious relatives, but showed no features of anaphylaxis. In Malawi, as in much of sub-Saharan Africa, many people are reportedly terrified of snakes, believing that bites by almost any species can cause rapid death. In this case, death occurred less than 2 min after a bite from Proatheris superciliaris. We believe that the cause of death was most likely a severe vasovagal attack in response to the fear and pain of the snakebite, triggering vasodilatation, bradycardia, and hypotension that led to cardiac arrest.


Subject(s)
Anaphylaxis , Snake Bites , Viperidae , Humans , Animals , Female , Adult , Malawi , Wetlands , Death, Sudden , Fear , Antivenins
2.
J Environ Manage ; 230: 94-101, 2019 Jan 15.
Article in English | MEDLINE | ID: mdl-30273788

ABSTRACT

Decision triggers are defined thresholds in the status of monitored variables that indicate when to undertake management, and avoid undesirable ecosystem change. Decision triggers are frequently recommended to conservation practitioners as a tool to facilitate evidence-based management practices, but there has been limited attention paid to how practitioners are integrating decision triggers into existing monitoring programs. We sought to understand whether conservation practitioners' use of decision triggers was influenced by the type of variables in their monitoring programs. We investigated this question using a practitioner-focused workshop involving a structured discussion and review of eight monitoring programs. Among our case studies, direct measures of biodiversity (e.g. native species) were more commonly monitored, but less likely to be linked to decision triggers (10% with triggers) than measures being used as surrogates (54% with triggers) for program objectives. This was because decision triggers were associated with management of threatening processes, which were often monitored as a surrogate for a biodiversity asset of interest. By contrast, direct measures of biodiversity were more commonly associated with informal decision processes that led to activities such as management reviews or external consultation. Workshop participants were in favor of including more formalized decision triggers in their programs, but were limited by incomplete ecological knowledge, lack of appropriately skilled staff, funding constraints, and/or uncertainty regarding intervention effectiveness. We recommend that practitioners consider including decision triggers for discussion activities (such as external consultation) in their programs as more than just early warning points for future interventions, particularly for direct measures. Decision triggers for discussions should be recognized as a critical feature of monitoring programs where information and operational limitations inhibit the use of decision triggers for interventions.
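To make the idea concrete, the sketch below encodes a decision trigger as a monitored variable, a threshold, and a named activity. This is a minimal illustration only; the variable names, threshold values, and activities are hypothetical and are not drawn from the eight case-study programs.

```python
# Minimal sketch of a decision trigger: a threshold on a monitored variable
# that, when crossed, prompts a named management or discussion activity.
# All names and values below are hypothetical illustrations.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DecisionTrigger:
    variable: str     # monitored variable (often a surrogate threat measure)
    threshold: float  # status value at which the activity is triggered
    activity: str     # intervention or discussion activity to initiate

    def evaluate(self, observed: float) -> Optional[str]:
        """Return the triggered activity if the observed status crosses the threshold."""
        return self.activity if observed >= self.threshold else None

# A formal intervention trigger on a surrogate (threat) measure, and a
# discussion trigger on a direct biodiversity measure:
intervention = DecisionTrigger("invasive_predator_index", 4.0, "begin predator control")
discussion = DecisionTrigger("native_bird_decline_pct", 20.0, "convene management review")

print(intervention.evaluate(5.2))  # 'begin predator control'
print(discussion.evaluate(12.0))   # None - no trigger crossed
```

Framing discussion triggers in the same explicit form as intervention triggers is what the abstract recommends: the activity attached to a threshold need not be a field intervention.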


Subject(s)
Biodiversity , Decision Making , Environmental Monitoring , Humans , Uncertainty
3.
Science ; 349(6255): 1452, 2015 Sep 25.
Article in English | MEDLINE | ID: mdl-26404815
4.
PLoS One ; 10(6): e0127693, 2015.
Article in English | MEDLINE | ID: mdl-26029890

ABSTRACT

There is interest in large-scale and unbiased monitoring of biodiversity status and trend, but there are few published examples of such monitoring being implemented. The New Zealand Department of Conservation is implementing a monitoring program that involves sampling selected biota at the vertices of an 8-km grid superimposed over the 8.6 million hectares of public conservation land that it manages. The introduced brushtail possum (Trichosurus vulpecula) is a major threat to some biota and is one taxon that the Department wishes to monitor and report on. A pilot study revealed that the traditional method of monitoring possums using leg-hold traps set for two nights, termed the Trap Catch Index, was a constraint on the cost and logistical feasibility of the monitoring program. A phased implementation of the monitoring program was therefore conducted to collect data for evaluating the trade-off between possum occupancy-abundance estimates and the costs of sampling for one night rather than two nights. Reducing trapping effort from two nights to one night along four trap-lines reduced the estimated costs of monitoring by 5.8% due to savings in labour, food, and allowances; it had a negligible effect on estimated national possum occupancy but resulted in slightly higher and less precise estimates of relative possum abundance. Monitoring possums for one night rather than two nights would provide an annual saving of NZ$72,400, with 271 fewer field days required for sampling. Possums occupied 60% (95% credible interval: 53-68) of sampling locations on New Zealand's public conservation land, with a mean relative abundance (Trap Catch Index) of 2.7% (2.0-3.5). Possum occupancy and abundance were higher in forest than in non-forest habitats. Our case study illustrates the need to evaluate relationships between sampling design, cost, and occupancy-abundance estimates when designing and implementing large-scale occupancy-abundance monitoring programs.
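The trade-off can be illustrated with a simplified version of the Trap Catch Index and the cost figures stated above. This is a sketch under stated assumptions: the sprung-trap correction used in the national protocol is omitted, the catch numbers are invented, and the total two-night cost is only implied by the reported 5.8% and NZ$72,400 figures.

```python
# Simplified sketch of the Trap Catch Index (possums per 100 trap-nights) and
# the one-night vs two-night cost comparison reported in the abstract. The
# sprung-trap correction used in the national protocol is omitted, and the
# catch numbers below are invented for illustration.
def trap_catch_index(possums_caught: int, traps: int, nights: int) -> float:
    """Possums caught per 100 trap-nights (no correction for sprung traps)."""
    return 100.0 * possums_caught / (traps * nights)

# A hypothetical line of 10 leg-hold traps catching 3 possums:
print(trap_catch_index(3, traps=10, nights=2))  # 15.0 over two nights
print(trap_catch_index(3, traps=10, nights=1))  # 30.0 over one night

# Cost trade-off stated in the abstract: dropping the second trap night saved
# an estimated 5.8% of monitoring costs, about NZ$72,400 per year. The total
# two-night cost below is therefore implied, not reported directly.
two_night_cost = 72_400 / 0.058
one_night_cost = two_night_cost - 72_400
print(f"Two-night design: NZ${two_night_cost:,.0f}; one-night: NZ${one_night_cost:,.0f}")
```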


Subject(s)
Conservation of Natural Resources/economics , Cost-Benefit Analysis , Introduced Species , Trichosurus/physiology , Animals , Australia , Biodiversity , Geography , New Zealand , Pilot Projects
5.
Conserv Biol ; 27(1): 74-82, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23020670

ABSTRACT

Predation on native fauna by non-native invasive mammals is widely documented, but effects of predation at the population level are rarely measured. Eradication of invasive mammals from islands has led to recovery of native biota, but the benefits of controlling invasive mammal populations in settings where eradication is not feasible are less understood. We used various combinations of aerially delivered toxic bait and control measures on the ground to reduce abundances of invasive rats (Rattus rattus) to low levels over large areas on mainland New Zealand and then monitored the abundance of invertebrates on replicated treatment sites to compare with abundances on similar nontreatment sites. We also assessed rat diet by examining stomach contents. Abundance of the rats' most-consumed invertebrate prey item, the large-bodied Auckland tree weta (Hemideina thoracica), increased 3-fold on treatment sites where we maintained rats at <4/ha for approximately 3 years, compared with the nontreatment sites. Auckland tree weta also increased in abundance on sites where rats were controlled with a single aerial-poisoning operation, but rat abundance subsequently increased on these sites and tree weta abundance then declined. Nevertheless, our data suggest that biennial reduction of rat abundances may be sufficient to allow increases in tree weta populations. Other invertebrates that were consumed less often (cave weta [Rhaphidophoridae], spiders [Araneae], and cockroaches [Blattodea]) showed no systematic changes in abundance following rat control. Our results suggest that the significant threat to recruitment and individual survival that predation by rats poses for tree weta can be mitigated by wide-scale aerial pest control.


Subject(s)
Conservation of Natural Resources , Introduced Species , Rats , Rodent Control , Animals , Feeding Behavior , Invertebrates , New Zealand , Population Density , Population Dynamics , Trees
6.
Ecol Lett ; 14(10): 1035-42, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21806747

ABSTRACT

Invasive species are frequently the target of eradication or control programmes to mitigate their impacts. However, manipulating single species in isolation can lead to unexpected consequences for other species, with outcomes such as mesopredator release demonstrated both theoretically and empirically in vertebrate assemblages with at least two trophic levels. Less is known about the consequences of species removal in more complex assemblages, where a greater number of interacting invaders increases the potential for selective species removal to result in unexpected changes in community structure. Using a replicated Before-After Control-Impact field experiment with a four-species assemblage of invasive mammals, we show that species interactions in the community are dominated by competition rather than predation. There was no measurable response of two mesopredators (rats and mice) following control of the top predator (stoats), but there was competitive release of rats following removal of a herbivore (possums), and competitive release of mice following removal of rats.
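As a rough illustration of the Before-After Control-Impact logic behind these comparisons, the sketch below computes the BACI contrast for a single response variable. The abundance indices are invented for illustration and the study's actual analysis was more involved.

```python
# Minimal sketch of the Before-After Control-Impact (BACI) contrast for one
# response variable. The abundance indices below are invented for illustration
# and are not data from the study.
import numpy as np

def baci_effect(control_before, control_after, impact_before, impact_after) -> float:
    """BACI interaction: mean change on impact (removal) sites minus mean change on control sites."""
    change_impact = np.mean(impact_after) - np.mean(impact_before)
    change_control = np.mean(control_after) - np.mean(control_before)
    return float(change_impact - change_control)

# Hypothetical rat abundance indices before and after possum removal:
control_before, control_after = [2.1, 1.8, 2.4], [2.0, 2.2, 2.1]
impact_before, impact_after = [2.0, 2.3, 1.9], [3.4, 3.8, 3.1]

effect = baci_effect(control_before, control_after, impact_before, impact_after)
print(f"BACI effect: {effect:.2f}")  # a positive value is consistent with competitive release
```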


Subject(s)
Competitive Behavior/physiology , Ecosystem , Introduced Species , Predatory Behavior/physiology , Animals , Biodiversity , Mice , Population Density , Rats
7.
Eur J Gastroenterol Hepatol ; 16(5): 487-94, 2004 May.
Article in English | MEDLINE | ID: mdl-15097042

ABSTRACT

OBJECTIVES: To assess the effectiveness of a centralised upper-gastrointestinal haemorrhage (UGIH) unit. METHODS: The UK Audit of acute UGIH resulted in the formulation of a simple numerical scoring system. The Rockall score categorises patients by risk factors for death and allows case-mix comparisons. A total of 900 consecutive patients admitted to a UGIH unit between October 1995 and July 1998 were analysed prospectively. Patients were given an initial Rockall score and, if endoscopy was performed, a complete score. This method of risk stratification allowed the proportion of deaths in our study to be compared with that in the National Audit using risk-standardised mortality ratios. RESULTS: The distribution of both initial and complete Rockall scores was significantly higher in our study than in the National Audit. A total of 73 (8.1%) patients died, compared with the National Audit mortality of 14%. Risk-standardised mortality ratios using both initial and complete Rockall scores were significantly lower in our study than in the National Audit. CONCLUSION: A specialised UGIH unit is associated with a lower proportion of deaths from UGIH, despite a higher proportion of high-risk patients than in the National Audit. This lower mortality therefore cannot be attributed to a more favourable case mix and demonstrates that further improvements in mortality for UGIH can be made.
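The risk-standardised comparison works by dividing observed deaths by the deaths expected if the unit's Rockall-score case mix had experienced the National Audit's stratum-specific mortality rates. The sketch below assumes hypothetical per-stratum counts and reference rates; only the unit's 900 patients and 73 deaths are taken from the abstract.

```python
# Hedged sketch of a risk-standardised mortality ratio (SMR): observed deaths
# divided by the deaths expected if the unit's Rockall-score case mix had
# experienced the reference (National Audit) mortality rates. The per-stratum
# counts and reference rates below are hypothetical; only the 900 patients and
# 73 deaths come from the abstract.
def standardised_mortality_ratio(observed_deaths: int,
                                 patients_per_stratum: dict,
                                 reference_mortality: dict) -> float:
    """Observed deaths / expected deaths under reference stratum-specific rates."""
    expected = sum(n * reference_mortality[stratum]
                   for stratum, n in patients_per_stratum.items())
    return observed_deaths / expected

# Hypothetical case mix of the 900 patients by grouped Rockall score, with
# illustrative National Audit mortality rates per group:
patients = {"score 0-2": 300, "score 3-5": 400, "score 6+": 200}
audit_mortality = {"score 0-2": 0.02, "score 3-5": 0.11, "score 6+": 0.30}

smr = standardised_mortality_ratio(73, patients, audit_mortality)
print(f"Risk-standardised mortality ratio = {smr:.2f}")  # < 1: fewer deaths than expected
```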


Subject(s)
Gastroenterology/standards , Gastrointestinal Hemorrhage/therapy , Hospital Departments/standards , Medical Audit , Acute Disease , Aged , Aged, 80 and over , Duodenal Ulcer/complications , Duodenal Ulcer/therapy , Female , Gastrointestinal Hemorrhage/mortality , Humans , Male , Middle Aged , Prospective Studies , Risk Assessment , Stomach Ulcer/complications , Stomach Ulcer/therapy
9.
Semin Dial ; 15(3): 146-8, 2002.
Article in English | MEDLINE | ID: mdl-12100452

ABSTRACT

Increasing the prevalence of arteriovenous (AV) fistulas is crucial to decreasing the incidence and costs of dialysis access failure. Despite almost uniform agreement in the dialysis community on the need to increase AV fistulas, U.S. fistula prevalence has only increased modestly since the publication of the Dialysis Outcomes Quality Initiative (DOQI) clinical practice guidelines in 1997. Fistula rates of 28% in incident patients and 27% in prevalent patients [Health Care Financing Administration (HCFA) clinical performance measures project data for 1999] do not approach the fistula rates achieved by various focused U.S. programs, nor those routinely observed in Europe. Systemic barriers that limit the availability and funding of both pre-end-stage renal disease (ESRD) care and preoperative imaging, coupled with financial disincentives, lack of accountability, and educational deficiencies, impede progress toward increased fistula placement. Improvements in AV fistula prevalence require a realistic appraisal and correction of the system problems hindering achievement of this goal. The DOQI and Kidney Disease Outcomes Quality Initiative (K/DOQI) were excellent first steps; however, implementation will require modification of other structures that impact on patient care delivery.


Subject(s)
Arteriovenous Shunt, Surgical , Renal Dialysis , Arteriovenous Shunt, Surgical/statistics & numerical data , Humans , Kidney Failure, Chronic/therapy , Renal Dialysis/economics , United States