1.
J Environ Qual; 44(2): 382-90, 2015 Mar.
Article in English | MEDLINE | ID: mdl-26023957

ABSTRACT

Nutrient enrichment of water resources has degraded coastal waters throughout the world, including in the United States (e.g., Chesapeake Bay, Gulf of Mexico, and Neuse Estuary). Agricultural nonpoint sources have significant impacts on water resources, so nutrient management planning is the primary tool recommended to reduce nutrient losses from agricultural fields. Its effectiveness, however, requires that nutrient management plans actually be used by farmers, and there is little literature describing how farmers make nutrient management decisions. Here, two case studies are described that address this gap: (i) a synthesis of the National Institute of Food and Agriculture Conservation Effects Assessment Project and (ii) field surveys from three nutrient-impaired river basins/watersheds in North Carolina (the Neuse, Tar-Pamlico, and Jordan Lake drainage areas). Results indicate that farmers generally did not fully apply nutrient management plans or follow basic soil test recommendations even when they had them. Farmers were hesitant to apply N at university-recommended rates because they did not trust the recommendations, viewed abundant N as insurance, or relied on recommendations made by fertilizer dealers. Exceptions were noted where watershed education, technical support, and funding focused on nutrient management by easing management demands, working actively and consistently with a small group of farmers, and allocating significant resources to fund agency personnel and cost-share payments to farmers. Without better dialogue with farmers and meaningful investment in strategies that reward farmers for taking what they perceive as risks associated with nutrient reduction, little progress toward true adoption of nutrient management will be made.

2.
J Environ Qual; 39(1): 85-96, 2010.
Article in English | MEDLINE | ID: mdl-20048296

ABSTRACT

Nonpoint source (NPS) watershed projects often fail to meet expectations for water quality improvement because of lag time, the time elapsed between adoption of management changes and the detection of measurable improvement in water quality in the target water body. Even when management changes are well-designed and fully implemented, water quality monitoring efforts may not show definitive results if the monitoring period, program design, and sampling frequency are not sufficient to address the lag between treatment and response. The main components of lag time include the time required for an installed practice to produce an effect, the time required for the effect to be delivered to the water resource, the time required for the water body to respond to the effect, and the ability of the monitoring program to measure the response. The objectives of this review are to explore the characteristics of lag time components, to present examples of lag times reported from a variety of systems, and to recommend ways for managers to cope with the lag between treatment and response. Important processes influencing lag time include hydrology, vegetation growth, transport rate and path, hydraulic residence time, pollutant sorption properties, and ecosystem linkages. The magnitude of lag time is highly site and pollutant specific, but may range from months to years for relatively short-lived contaminants such as indicator bacteria, years to decades for excessive P levels in agricultural soils, and decades or more for sediment accumulated in river systems. Groundwater travel time is also an important contributor to lag time and may introduce a lag of decades between changes in agricultural practices and improvement in water quality. Approaches to deal with the inevitable lag between implementation of management practices and water quality response lie in appropriately characterizing the watershed, considering lag time when selecting, siting, and monitoring management measures, selecting appropriate indicators, and designing effective monitoring programs to detect the water quality response.
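
The review's component view of lag time lends itself to a simple back-of-the-envelope estimate. The sketch below is not from the article; it treats the components as sequential and additive and uses entirely hypothetical durations for a made-up watershed. As the review stresses, real values are highly site- and pollutant-specific.

# Hypothetical, illustrative lag-time estimate for a nonpoint source watershed project.
# Component durations are invented placeholders; the review reports lags ranging from
# months (indicator bacteria) to decades (soil P, stored sediment, groundwater travel).
lag_components_years = {
    "practice_effect": 1.0,       # time for the installed practice to produce an effect
    "delivery_to_water": 5.0,     # transport of that effect to the target water body
    "waterbody_response": 2.0,    # time for the water body itself to respond
    "monitoring_detection": 3.0,  # time for monitoring to detect the response with confidence
}

for name, years in lag_components_years.items():
    print(f"  {name}: {years:.1f} y")
print(f"Illustrative total lag: {sum(lag_components_years.values()):.1f} years")

Even with modest values like these, the sum can exceed the monitoring window of a typical project, which is the review's central argument for designing monitoring programs around the expected lag.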


Subject(s)
Environmental Monitoring/methods; Water Pollution/prevention & control; Water Supply/standards; Water/analysis; Water/standards; Time Factors
3.
J Environ Qual; 35(4): 1088-100, 2006.
Article in English | MEDLINE | ID: mdl-16738394

ABSTRACT

Contamination by bacteria is a leading cause of impairment in U.S. waters, particularly in areas of livestock agriculture. We evaluated the effectiveness of several practices in reducing Escherichia coli levels in runoff from fields receiving liquid dairy (Bos taurus) manure. Runoff trials were conducted on replicated hay and silage corn (Zea mays L.) plots using simulated rainfall. Levels of E. coli in runoff were approximately 10^4 to 10^6 organisms per 100 mL, representing a significant pollution potential. Practices tested were manure storage, delay between manure application and rainfall, manure incorporation by tillage, and increased hayland vegetation height. Storage of manure for 30 d or more consistently and dramatically lowered E. coli counts in our experiments, with longer storage providing greater reductions. Manure E. coli declined by >99% after approximately 90 d of storage. On average, levels of E. coli in runoff were 97% lower from plots receiving 30-d-old manure and >99% lower from plots receiving 90-d-old manure than from plots where fresh manure was applied. Runoff from hayland and cornland plots where manure was applied 3 d before rainfall contained approximately 50% fewer E. coli than did runoff from plots that received manure 1 d before rainfall. Hayland vegetation height alone did not significantly affect E. coli levels in runoff, but interactions with rainfall delay and manure age were observed. Manure incorporation alone did not significantly affect E. coli levels in cornland plot runoff, but incorporation could reduce bacteria export by reducing field runoff, and an interaction with rainfall delay was observed. Extended storage that avoids additions of fresh manure, combined with application several days before runoff, incorporation on tilled land, and higher vegetation on hayland at application, could substantially reduce microorganism loading from agricultural land.
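
For a rough sense of what the reported reductions mean in absolute terms, the sketch below (not part of the study) applies each reported percentage reduction, one at a time, to an assumed baseline runoff concentration of 10^5 E. coli per 100 mL, roughly the middle of the observed range. The reductions are not combined, since the study does not report a combined model.

# Hypothetical illustration: apply each reported reduction independently to an
# assumed baseline of 1e5 E. coli per 100 mL (mid-range of the runoff trials).
baseline_per_100ml = 1e5

reported_reductions = {
    "30-d-old manure vs. fresh": 0.97,   # ~97% lower runoff E. coli
    "90-d-old manure vs. fresh": 0.99,   # reported as >99% lower; treated here as 99%
    "3-d vs. 1-d rainfall delay": 0.50,  # ~50% fewer E. coli in runoff
}

for practice, reduction in reported_reductions.items():
    remaining = baseline_per_100ml * (1.0 - reduction)
    print(f"{practice}: roughly {remaining:,.0f} organisms per 100 mL remaining")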


Subject(s)
Escherichia coli/physiology; Manure/analysis; Soil Pollutants/analysis; Water Microbiology; Water Movements; Agriculture; Animals; Colony Count, Microbial; Environmental Monitoring; Escherichia coli/isolation & purification; Manure/microbiology; Rain; Time Factors