
WATER MANAGEMENT

1.0 WATER SOURCES, IMPURITIES, AND CHEMISTRY

Abundant supplies of fresh water are essential to the development of industry. Enormous quantities are required for the cooling of products and equipment, for process needs, for boiler feed, and for sanitary and potable water supply.
THE PLANETARY WATER CYCLE
Industry is a small participant in the global water cycle. The finite amount of water on the planet participates in a very complicated recycling scheme that provides for its reuse. This recycling of water is termed the "Hydrologic Cycle". Evaporation under the influence of sunlight takes water from a liquid to a gaseous phase. The water may condense in clouds as the temperature drops in the upper atmosphere. Wind transports the water over great distances before releasing it in some form of precipitation. As the water condenses and falls to the ground, it absorbs gases from the environment. This is the principal cause of acid rain and acid snow.
WATER AS A SOLVENT
Pure water (H2O) is colorless, tasteless, and odorless. It is composed of hydrogen and oxygen. Because water becomes contaminated by the substances with which it comes into contact, it is not available for use in its pure state. To some degree, water can dissolve every naturally occurring substance on the earth. Because of this property, water has been termed a "universal solvent." Although beneficial to mankind, the solvency power of water can pose a major threat to industrial equipment. Corrosion reactions cause the slow dissolution of metals by water. Deposition reactions, which produce scale on heat transfer surfaces, represent a change in the solvency power of water as its temperature is varied. The control of corrosion and scale is a major focus of water treatment technology.
WATER IMPURITIES
Water impurities include dissolved and suspended solids. Calcium bicarbonate is a soluble salt. A solution of calcium bicarbonate is clear, because the calcium and bicarbonate are present as atomic-sized ions which are not large enough to reflect light. Some soluble minerals impart a color to the solution. Soluble iron salts produce pale yellow or green solutions; some copper salts form intensely blue solutions. Although colored, these solutions are clear. Suspended solids are substances that are not completely soluble in water and are present as particles. These particles usually impart a visible turbidity to the water. Dissolved and suspended solids are present in most surface waters. Seawater is very high in soluble sodium chloride; suspended sand and silt make it slightly cloudy.
Surface Water
The ultimate course of rain or melting snow depends on the nature of the terrain over which it flows. In areas consisting of hard-packed clay, very little water penetrates the ground. In these cases, the water generates "runoff." The runoff collects in streams and rivers. The rivers empty into bays and estuaries, and the water ultimately returns to the sea, completing one major phase of the hydrologic cycle. As water runs off along the surface, it stirs up and suspends particles of sand and soil, creating silt in the surface water. In addition, the streaming action erodes rocky surfaces, producing more sand. As the surface water cascades over rocks, it is aerated. The combination of oxygen, inorganic nutrients leached from the terrain, and sunlight supports a wide variety of life forms in the water, including algae, fungi, bacteria, small crustaceans, and fish. Often, river beds are lined with trees, and drainage areas feeding the rivers are forested. Leaves and pine needles constitute a large percentage of the biological content of the water. After it dissolves in the water, this material becomes a major cause of fouling of ion exchange resin used in water treatment.
The physical and chemical characteristics of surface water contamination vary considerably over time. A sudden storm can cause a dramatic short-term change in the composition of a water supply. Over a longer time period, surface water chemistry varies with the seasons. During periods of high rainfall, high runoff occurs. This can have a favorable or unfavorable impact on the characteristics of the water, depending on the geochemistry and biology of the terrain.
Surface water chemistry also varies over multi-year or multidecade cycles of drought and rainfall. Extended periods of drought severely affect the availability of water for industrial use. Where rivers discharge into the ocean, the incursion of salt water up the river during periods of drought presents additional problems. Industrial users must take surface water variability into account when designing water treatment plants and programs.


Groundwater
Water that falls on porous terrains, such as sand or sandy loam, drains or percolates into the ground. In these cases, the water encounters a wide variety of mineral species arranged in complex layers, or strata. The minerals may include granite, gneiss, basalt, and shale. In some cases, there may be a layer of very permeable sand beneath impermeable clay. Water often follows a complex three-dimensional path in the ground. The science of groundwater hydrology involves the tracking of these water movements.
In contrast to surface supplies, ground waters are relatively free from suspended contaminants, because they are filtered as they move through the strata. The filtration also removes most of the biological contamination. Some ground waters with high iron content contain sulfate-reducing bacteria. These are a source of fouling and corrosion in industrial water systems.
Groundwater chemistry tends to be very stable over time. A groundwater may contain an undesirable level of scale forming solids, but due to its fairly consistent chemistry it may be treated effectively.
Mineral Reactions
As groundwater encounters different minerals, it dissolves them according to their solubility characteristics. In some cases chemical reactions occur, enhancing mineral solubility. A good example is the reaction of groundwater with limestone. Water percolating from the surface contains atmospheric gases. One of these gases is carbon dioxide, which forms carbonic acid when dissolved in water. The decomposition of organic matter beneath the surface is another source of carbon dioxide. Limestone is a mixture of calcium and magnesium carbonate. The mineral, which is basic, is only slightly soluble in neutral water.
The slightly acidic groundwater reacts with basic limestone in a neutralization reaction that forms a salt and a water of neutralization. The salt formed by the reaction is a mixture of calcium and magnesium bicarbonate. Both bicarbonates are quite soluble. This reaction is the source of the most common deposition and corrosion problems faced by industrial users. The calcium and magnesium (hardness) form scale on heat transfer surfaces if the groundwater is not treated before use in industrial cooling and boiler systems. In boiler feed water applications, the thermal break-down of the bicarbonate in the boiler leads to high levels of carbon dioxide in condensate return systems. This can cause severe system corrosion.
Structurally, limestone is porous. That is, it contains small holes and channels called "interstices." A large formation of limestone can hold vast quantities of groundwater in its structure. Limestone formations that contain these large quantities of water are called aquifers, a term derived from Latin roots meaning water-bearing.
If a well is drilled into a limestone aquifer, the water can be withdrawn continuously for decades and used for domestic and industrial applications. Unfortunately, the water is very hard, due to the neutralization/dissolution reactions described above. This necessitates extensive water treatment for most uses.
CHEMICAL REACTIONS
Numerous chemical tests must be conducted to ensure effective control of a water treatment program. Most of these tests are addressed in detail in Chapters 39-71. Because of their significance in many systems, three tests, pH, alkalinity, and silica, are discussed here as well.
pH Control
Good pH control is essential for effective control of deposition and corrosion in many water systems. Therefore, it is important to have a good understanding of the meaning of pH and the factors that affect it. Pure H2O exists as an equilibrium between the acid species, H+ (more correctly expressed as a protonated water molecule, the hydronium ion, H3O+), and the hydroxide ion, OH⁻. In neutral water the acid concentration equals the hydroxide concentration, and at room temperature both are present at 10⁻⁷ gram equivalents (or moles) per liter. The "p" function is used in chemistry to handle very small numbers; it is the negative logarithm of the number being expressed. Water that has 10⁻⁷ gram equivalents per liter of hydrogen ions is said to have a pH of 7. Thus, a neutral solution exhibits a pH of 7. As the hydrogen ion concentration varies, the concentration of OH⁻ must also vary, but in the opposite direction, such that the product of the two remains constant. Confusion regarding pH arises from two sources:
the inverse nature of the function
the pH meter scale
It is important to remember that as the acid concentration increases, the pH value decreases.
The pH meter can be a source of confusion, because the pH scale on the meter is linear, extending from 0 to 14 in even increments. Because pH is a logarithmic function, a change of 1 pH unit corresponds to a 10-fold change in acid concentration. A decrease of 2 pH units represents a 100-fold change in acid concentration.
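As a minimal illustration of the "p" function and of the tenfold change per pH unit described above, the following Python sketch converts between hydrogen ion concentration and pH (the function names are illustrative):

```python
import math

def ph_from_h_concentration(h_moles_per_liter):
    """pH is the negative base-10 logarithm of the hydrogen ion concentration."""
    return -math.log10(h_moles_per_liter)

def h_concentration_from_ph(ph):
    """Invert the "p" function to recover the hydrogen ion concentration (mol/L)."""
    return 10.0 ** (-ph)

# Neutral water at room temperature: 1e-7 mol/L of H+ corresponds to pH 7.
print(ph_from_h_concentration(1e-7))                                 # about 7.0
# A decrease of one pH unit means roughly a tenfold increase in acid concentration.
print(h_concentration_from_ph(6.0) / h_concentration_from_ph(7.0))   # about 10
```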
Alkalinity
Alkalinity tests are used to control lime-soda softening processes and boiler blow down and to predict the potential for calcium scaling in cooling water systems. For most water systems, it is important to recognize the sources of alkalinity and maintain proper alkalinity control.
Carbon dioxide dissolves in water as a gas. The dissolved carbon dioxide reacts with solvent water molecules and forms carbonic acid according to the following reaction:
CO2 + H2O ⇌ H2CO3
Only a trace amount of carbonic acid is formed, but it is acidic enough to lower pH from the neutral point of 7. Carbonic acid is a weak acid, so it does not lower pH below 4.3. However, this level is low enough to cause significant corrosion of system metals. If the initial loading of CO2 is held constant and the pH is raised, a gradual transformation into the bicarbonate ion, HCO3⁻, occurs. The transformation is complete at pH 8.3. Further elevation of the pH forces a second transformation, into carbonate, CO3²⁻. The three species (carbonic acid, bicarbonate, and carbonate) can be converted from one to another by changing the pH of the water.
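As a minimal illustration of these pH-driven transformations, the following Python sketch classifies the dominant carbon dioxide species using the 4.3 and 8.3 end points quoted above; real speciation is a continuous equilibrium, so the sharp cutoffs are a simplification:

```python
def dominant_carbonate_species(ph):
    """Rough classification of the dominant dissolved CO2 species, using the
    transition points quoted in the text (pH 4.3 and 8.3). Actual speciation
    is a continuous equilibrium, so this is only an approximation."""
    if ph <= 4.3:
        return "carbonic acid (H2CO3)"
    elif ph <= 8.3:
        return "bicarbonate (HCO3-)"
    else:
        return "carbonate (CO3 2-)"

for ph in (4.0, 6.5, 9.0):
    print(ph, dominant_carbonate_species(ph))
```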
The pH variations caused by the addition of acid (or caustic) can be reduced through "buffering." When acid (or caustic) is added to water containing carbonate/bicarbonate species, the pH of the system does not change as quickly as it does in pure water. Much of the added acid (or caustic) is consumed as the carbonate/bicarbonate (or bicarbonate/carbonic acid) ratio is shifted.
Alkalinity is the ability of a natural water to neutralize acid (i.e., to reduce the pH depression expected from a strong acid by the buffering mechanism mentioned above). Confusion arises in that alkaline pH conditions exist at a pH above 7, whereas alkalinity in natural water exists at a pH above 4.4. Alkalinity is measured by a double titration; acid is added to a sample to the Phenolphthalein end point (pH 8.3) and the Methyl Orange end point (pH 4.4). Titration to the Phenolphthalein end point (the P-alkalinity) measures OH¯ and ½CO3²¯; titration to the Methyl Orange end point (the M-alkalinity) measures OH¯, CO3²¯ and HCO3¯.
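These two titration definitions can be combined to estimate the individual alkalinity species. The following Python sketch applies the conventional approximation that hydroxide and bicarbonate alkalinity do not coexist in the same sample; that assumption goes beyond the text and the result is only an estimate:

```python
def alkalinity_species(p_alk, m_alk):
    """Estimate hydroxide, carbonate, and bicarbonate alkalinity (all expressed
    as CaCO3) from the P and M titration results described above, using the
    conventional assumption that hydroxide and bicarbonate do not coexist."""
    if p_alk == 0:
        return {"OH": 0, "CO3": 0, "HCO3": m_alk}
    if p_alk < m_alk / 2:
        return {"OH": 0, "CO3": 2 * p_alk, "HCO3": m_alk - 2 * p_alk}
    if p_alk == m_alk / 2:
        return {"OH": 0, "CO3": 2 * p_alk, "HCO3": 0}
    if p_alk < m_alk:
        return {"OH": 2 * p_alk - m_alk, "CO3": 2 * (m_alk - p_alk), "HCO3": 0}
    return {"OH": m_alk, "CO3": 0, "HCO3": 0}

# Example: P = 30 mg/L and M = 100 mg/L as CaCO3 -> 60 carbonate, 40 bicarbonate.
print(alkalinity_species(30, 100))
```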
Silica
When not properly controlled, silica forms highly insulating, difficult-to-remove deposits in cooling systems, boilers, and turbines. An understanding of some of the possible variations in silica testing is valuable. Most salts, although present as complicated crystalline structures in the solid phase, assume fairly simple ionic forms in solution. Silica exhibits complicated structures even in solution.
Silica exists in a wide range of structures, from a simple silicate to a complicated polymeric material. The polymeric structure can persist when the material is dissolved in surface waters. The size of the silica polymer can be substantial, ranging up to the colloidal state. Colloidal silica is rarely present in ground waters. It is most commonly present in surface waters during periods of high runoff.
The polymeric form of silica does not produce color in the standard molybdate-based colorimetric test for silica. This form of silica is termed "nonreactive." The polymeric form of silica is not thermally stable and when heated in a boiler reverts to the basic silicate monomer, which is reactive with molybdate.
As a result, molybdate testing of a boiler feed water may reveal little or no silica, while boiler blow down measurements show a level of silica that is above control limits. High boiler water silica and low feed water values are often a first sign that colloidal silica is present in the makeup. One method of identifying colloidal silica problems is the use of atomic emission or absorption to measure feed water silica. This method, unlike the molybdate chemistry, measures total silica irrespective of the degree of polymerization.
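As a rough illustration of this diagnostic logic, the following Python sketch flags possible colloidal silica when the measured boiler water silica greatly exceeds what the molybdate-reactive feed water value would predict; the function name, the cycles-of-concentration input, and the tolerance factor are illustrative assumptions, not values from the text:

```python
def colloidal_silica_suspected(feedwater_reactive_silica, boiler_water_silica,
                               cycles_of_concentration, tolerance=1.2):
    """Flag possible colloidal (nonreactive) silica in the makeup when the
    measured boiler water silica is well above what the molybdate-reactive
    feed water value, concentrated in the boiler, would predict.
    Concentrations in mg/L; the cycles-of-concentration input and the 1.2
    tolerance factor are illustrative assumptions."""
    expected = feedwater_reactive_silica * cycles_of_concentration
    return boiler_water_silica > tolerance * expected

# Example: 0.1 mg/L reactive silica in the feed water at 10 cycles predicts
# about 1 mg/L in the boiler; a measured 5 mg/L suggests colloidal silica.
print(colloidal_silica_suspected(0.1, 5.0, 10))   # True
```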

2.0 WATER TREATMENT

Water treatment describes those processes used to make water more acceptable for a desired end use, such as drinking water supply, industrial processes, and medical applications. The goal of any water treatment process is to remove existing contaminants from the water, or to reduce their concentration, so that the water becomes fit for its intended use. One such use is returning used water to the natural environment without adverse ecological impact. Treatment of water for drinking purposes may involve solids separation using physical processes, such as settling and filtration, and chemical processes, such as disinfection and coagulation. Biological processes are also employed in the treatment of wastewater; these include, for example, aerated lagoons, activated sludge, and slow sand filters.
Potable water purification
Water purification is the removal of contaminants from untreated water to produce drinking water that is pure enough for its intended use, most commonly human consumption. Substances that are removed during the process of drinking water treatment include bacteria, algae, viruses, fungi, minerals such as iron, manganese and sulphur, and man-made chemical pollutants including fertilisers.
It is important to take measures to make water of desirable quality available at the consumer end, which requires protecting the treated water during conveyance and distribution. It is common practice to maintain a residual disinfectant in the treated water to control any bacteriological contamination that occurs after treatment.
World Health Organisation (WHO) guidelines are generally followed throughout the world for drinking water quality requirements. In addition to the WHO guidelines, each country, territory, or water supply body may have its own guidelines to ensure that consumers have access to safe drinking water.
Processes for drinking water
A combination of the following processes is used for municipal drinking water treatment worldwide (a simple sketch of the treatment train follows the list):
Pre-chlorination - for algae control and arresting any biological growth
Aeration - along with pre-chlorination for removal of dissolved iron and manganese
Coagulation - for flocculation
Coagulant aids also known as polyelectrolytes - to improve coagulation and for thicker floc formation
Sedimentation - for solids separation, that is, removal of suspended solids trapped in the floc
Filtration - for removal of carried over floc
Disinfection - for killing bacteria
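The treatment train listed above can be represented as a simple ordered configuration. The following Python sketch is only an illustrative data structure; no plant-specific parameters are implied:

```python
# The municipal treatment train listed above, represented as an ordered
# sequence of (process, purpose) steps. Plant-specific parameters (doses,
# detention times, and so on) would be added per design.
TREATMENT_TRAIN = [
    ("pre-chlorination", "algae control and arresting biological growth"),
    ("aeration", "removal of dissolved iron and manganese"),
    ("coagulation", "flocculation"),
    ("coagulant aids (polyelectrolytes)", "improved coagulation and thicker floc"),
    ("sedimentation", "removal of suspended solids trapped in the floc"),
    ("filtration", "removal of carried-over floc"),
    ("disinfection", "killing bacteria"),
]

for process, purpose in TREATMENT_TRAIN:
    print(f"{process}: {purpose}")
```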
3.0 WASTEWATER TREATMENT

Many industries use large volumes of water in their manufacturing operations. Because some of this water becomes contaminated, it requires treatment before discharge.
Improvements in determining the effects of industrial waste discharges have led to the adoption of stringent environmental laws, which define the degree of treatment necessary to protect water quality. Discharge permits, issued under the National Pollutant Discharge Elimination System (NPDES), regulate the amount of pollutants that an industry can return to the water source. The permitted quantities are designed to ensure that other users of the water will have a source that meets their needs, whether these needs are for municipal water supply, industrial or agricultural uses, or fishing and recreation. Consideration is given to the feasibility of removing a pollutant, as well as the natural assimilative capacity of the receiving stream. This assimilative capacity varies with the type and amount of pollutant.
Wastewater treatment plants are designed to convert liquid wastes into an acceptable final effluent and to dispose of solids removed or generated during the process. In most cases, treatment is required for both suspended and dissolved contaminants. Special processes are required for the removal of certain pollutants, such as phosphorus or heavy metals.
Wastewater can be recycled for reuse in plant processes to reduce disposal requirements. This practice also reduces water consumption.
POLLUTANTS
Organic Compounds
The amount of organic material that can be discharged safely is defined by the effect of the material on the dissolved oxygen level in the water. Organisms in the water use the organic matter as a food source. In a biochemical reaction, dissolved oxygen is consumed as the end products, water and carbon dioxide, are formed. Atmospheric oxygen can replenish the dissolved oxygen supply, but only at a slow rate. When the organic load causes oxygen consumption to exceed this resupply, the dissolved oxygen level drops, leading to the death of fish and other aquatic life. Under extreme conditions, when the dissolved oxygen concentration reaches zero, the water may turn black and produce foul odors, such as the "rotten egg" smell of hydrogen sulfide. Organic compounds are normally measured as chemical oxygen demand (COD) or biochemical oxygen demand (BOD).
Nutrients
Nitrogen and phosphorus are essential to the growth of plants and other organisms. However, nitrogen compounds can have the same effect on a water source as carbon-containing organic compounds. Certain organisms use nitrogen as a food source and consume oxygen.
Phosphorus is a concern because of algae blooms that occur in surface waters due to its presence. During the day, algae produce oxygen through photosynthesis, but at night they consume oxygen.
Solids
Solids discharged with a waste stream may settle immediately at the discharge point or may remain suspended in the water. Settled solids cover the bottom-dwelling organisms, causing disruptions in population and building a reservoir of oxygen-consuming materials. Suspended solids increase the turbidity of the water, thereby inhibiting light transmittance.
Deprived of a light source, photosynthetic organisms die. Some solids can coat fish gills and cause suffocation.
Acids and Alkalies
The natural buffering system of a water source is exhausted by the discharge of acids and alkalies. Aquatic life is affected by the wide swings in pH as well as the destruction of bicarbonate alkalinity levels.
Metals
Certain metals are toxic and affect industrial, agricultural, and municipal users of the water source. Metals can cause product quality problems for industrial users. Large quantities of discharged salts necessitate expensive removal by downstream industries using the receiving stream for boiler makeup water.
REMOVAL OF INSOLUBLE CONTAMINANTS
Various physical methods may be used for the removal of wastewater contaminants that are insoluble in water, such as suspended solids, oil, and grease. Ordinarily, water-soluble contaminants are chemically converted to an insoluble form to allow removal by physical methods. Essentially, biological waste treatment is this conversion of soluble contaminants to insoluble forms.
Gravity Separation
Most waste treatment systems employ a gravity separation step for suspended particle or oil removal.
The settling rate of a particle is defined in terms of "free" versus "hindered" settling. A free settling particle's motion is not affected by that of other particles, the vessel's walls, or turbulent currents. A particle has a hindered settling rate if there is any interference from these effects.
The free settling of a discrete particle in a rising fluid can be described as the resolution of several forces--gravity, the drag exerted on the particle, and the buoyant force as described by Archimedes' principle. The particle's velocity increases until it reaches a terminal velocity as determined by these forces. The terminal velocity is then:
v = sqrt[ 2 g mP (ρP - ρf) / (Ac Cd ρP ρf) ]
where:
v = velocity, ft/sec
g = acceleration due to gravity, ft/sec²
mP = mass of the particle, lb
ρP = density of the particle, lb/ft³
ρf = density of the fluid, lb/ft³
Ac = cross-sectional area of the particle exposed to the direction of motion, ft²
Cd = drag coefficient, a function of particle geometry
Gravity settling is employed primarily for removal of inorganic suspended solids, such as grit and sand. Therefore, in the approximation of the drag coefficient, it is assumed that particles are spherical. Further, if a Reynolds number of less than 2.0 is assumed, the settling velocity of a discrete particle can be described by Stokes' settling equation:
v = g dP² (ρP - ρf) / (18 μ)
where:
dP = particle diameter, ft
μ = fluid viscosity, lb/ft-sec
The terminal velocity of a particle in the "free" settling zone is a function of its diameter, the density difference between the particle and the fluid, and the fluid viscosity.
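As a worked illustration of Stokes' settling equation, the following Python sketch computes the terminal velocity of a small spherical particle in the units used above; the particle size, densities, and viscosity in the example are illustrative assumptions:

```python
def stokes_settling_velocity(dp_ft, rho_p, rho_f, mu, g=32.2):
    """Terminal velocity (ft/sec) of a small spherical particle in the free
    settling zone, per Stokes' equation above: v = g*dP^2*(rhoP - rhof)/(18*mu).
    Densities in lb/ft^3, viscosity in lb/(ft*sec), g in ft/sec^2.
    Valid only for Reynolds numbers below about 2."""
    return g * dp_ft ** 2 * (rho_p - rho_f) / (18.0 * mu)

# Illustrative values only: a 1e-4 ft (about 30 micron) sand grain
# (density about 165 lb/ft^3) in water at 68 degF
# (density about 62.4 lb/ft^3, viscosity about 6.7e-4 lb/(ft*sec)).
v = stokes_settling_velocity(1e-4, 165.0, 62.4, 6.7e-4)
print(f"settling velocity = {v:.5f} ft/sec")   # roughly 0.003 ft/sec
```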
The equipment employed for gravity separation for waste treatment is normally either a rectangular basin with moving bottom scrapers for solids removal or a circular tank with a rotating bottom scraper. Rectangular tanks are normally sized to decrease horizontal fluid velocity to approximately 1 ft/min (0.51 cm/s). Their lengths are three to five times their width, and their depths are 3-8 ft (0.91-2.44 m).
Circular clarifiers are ordinarily sized according to surface area, because velocity must be reduced below the design particle's terminal velocity. The typical design provides a rise rate of 600-800 gpd/ft² (0.28-0.38 L/s/m²).
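As an illustration of sizing on rise rate, the following Python sketch estimates the required clarifier surface area from the design flow; the flow value and the mid-range rise rate chosen are illustrative assumptions:

```python
def circular_clarifier_area(flow_gpd, rise_rate_gpd_per_ft2=700.0):
    """Surface area (ft^2) needed to keep the rise rate within the 600-800
    gpd/ft^2 range quoted above. Flow is in gallons per day. This ignores
    depth, inlet design, and peaking factors that a full design would include."""
    return flow_gpd / rise_rate_gpd_per_ft2

# Illustrative flow of 1,000,000 gpd at a mid-range 700 gpd/ft^2
# requires roughly 1,430 ft^2 of clarifier surface.
print(round(circular_clarifier_area(1_000_000)))
```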
When wastewater contains appreciable amounts of hydrocarbons, removal of these contaminants becomes a problem. Oil is commonly lower in density than water; therefore, if it is not emulsified, it can be floated in a separate removal stage or in a dual-purpose vessel that allows sedimentation of solids. For example, the refining industry uses a rectangular clarifier with a surface skimmer for oil and a bottom rake for solids as standard equipment. This design, specified by the
American Petroleum Institute, is designated as an API separator. The basic principles governing the separation of oil from water by gravity differential are also expressed by Stokes' Law.
Air Flotation
Where the density differential is not sufficient to separate oil and oil-wetted solids, air flotation may be used to enhance oil removal. In this method, air bubbles are attached to the contaminant particles, increasing the apparent density difference between the particles and the water.
Dissolved air flotation (DAF) is a method of introducing air to a side stream or recycle stream at elevated pressures in order to create a supersaturated stream. When this stream is introduced into the waste stream, the pressure is reduced to atmospheric, and the air is released as small bubbles. These bubbles attach to contaminants in the waste, decreasing their effective density and aiding in their separation.
The most important operational parameters for contaminant removal by dissolved air flotation are:
air pressure
recycle or slip stream flow rate
influent total suspended solids (TSS), including oil and grease
bubble size
dispersion
Air pressure, recycle, and influent TSS are normally related in an air-to-solids (A/S) ratio expressed as:
A/S = K Sa (f P - 1) R / (SS Q)
where:
K = a constant, approximately 1.3
Sa = the solubility of air at standard conditions, mL/L
f = air dissolved/Sa, usually 0.5-0.8
P = operating pressure, atm
R = recycle rate, gpm
SS = influent suspended solids, mg/L
Q = wastewater flow, gpm
The A/S ratio is most important in determining effluent TSS. Recycle flow and pressure can be varied to maintain an optimal A/S ratio. Typical values are 0.02-0.06.
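The following Python sketch evaluates the A/S relationship above for a set of illustrative operating values; the air solubility, saturation fraction, pressure, recycle, solids, and flow used in the example are assumptions, not design values from the text:

```python
def air_to_solids_ratio(sa_ml_per_l, f, p_atm, recycle_gpm, ss_mg_per_l,
                        flow_gpm, k=1.3):
    """DAF air-to-solids ratio from the relationship above:
    A/S = K * Sa * (f*P - 1) * R / (SS * Q),
    with Sa the air solubility (mL/L), f the fraction of saturation actually
    dissolved, P the operating pressure (atm, absolute), R the recycle rate
    and Q the wastewater flow (both gpm), and SS the influent solids (mg/L)."""
    return k * sa_ml_per_l * (f * p_atm - 1.0) * recycle_gpm / (ss_mg_per_l * flow_gpm)

# Illustrative values only: Sa = 18.7 mL/L, f = 0.6, P = 4 atm,
# R = 150 gpm recycle, SS = 200 mg/L, Q = 1000 gpm.
ratio = air_to_solids_ratio(18.7, 0.6, 4.0, 150.0, 200.0, 1000.0)
print(f"A/S = {ratio:.3f}")   # about 0.026, within the typical 0.02-0.06 range
```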
In a DAF system, the supersaturated stream may be the entire influent, a slip stream, fresh water, or a recycle stream.
Recycle streams are most common, because pressurization of a high-solids stream through a pump stabilizes and disperses oil and oil-wetted solids; pressurizing a clarified recycle stream instead avoids this emulsification.
As in gravity settling, air flotation units are designed for a surface loading rate that is a function of the waste flow and rise velocity of the contaminants floated by air bubbles. The retention time is a function of tank depth.
DAF units can be rectangular in design but are usually circular, resembling a primary clarifier or thickener. They are often single-stage units.
Induced air flotation (IAF) is another method of decreasing particle density by attaching air bubbles to the particles; however, the method of generating the air bubbles differs. A mechanical action is employed to create the air bubbles and their contact with the waste contaminants. The most common methods use high-speed agitators or recycle a slip stream through venturi nozzles to entrain air into the wastewater.
In contrast to DAF units, IAF units are usually rectangular and incorporate four or more air flotation stages in series. The retention time per stage is significantly less than in DAF circular tanks.
As in gravity settling, the diameter of the particle plays an important role in separation. Polyelectrolytes may be used to increase effective particle diameters. Polymers are also used to destabilize oil-water emulsions, thereby allowing the free oil to be separated from the water. Polymers do this by charge neutralization, which destabilizes an oil globule surface and allows it to contact other oil globules and air bubbles. Emulsion breakers, surfactants, or surface-active agents are also used in air flotation to destabilize emulsions and increase the effectiveness of the air bubbles.
Filtration
Filtration is employed in waste treatment wherever suspended solids must be removed. In practice, filtration is most often used to polish wastewater following treatment. In primary waste treatment, filters are often employed to remove oil and suspended solids prior to biological treatment. More commonly, filters are used following biological treatment, prior to final discharge or reuse.
Filters used for waste treatment may be designed with single-, dual-, or multimedia and may be of the pressure or gravity type.
REMOVAL OF SOLUBLE CONTAMINANTS
pH Adjustment-Chemical Precipitation
Often, industrial wastewaters contain high concentrations of metals, many of which are soluble at a low pH.
Adjustment of pH precipitates these metals as metal oxides or metal hydroxides. The pH must be carefully controlled to minimize the solubility of the contaminant. Some metals, such as zinc, form amphoteric hydroxides and redissolve at a high pH. Chemicals used for pH adjustment include lime, sodium hydroxide, and soda ash.
Chemical precipitation of soluble ions often occurs as the result of pH adjustment. Contaminants are removed either by chemical reaction leading to precipitation or by adsorption of ions on an already formed precipitate.
Biological Oxidation-Biochemical Reactions
One of the most common ways to convert soluble organic matter to insoluble matter is through biological oxidation.
Soluble organics metabolized by bacteria are converted to carbon dioxide and bacterial floc, which can be settled from solution.
Various microorganisms feed on dissolved and suspended organic compounds. This natural biodegradation can occur in streams and lakes. If the assimilative capacity of the stream is surpassed, the reduced oxygen content can cause asphyxiation of fish and other higher life forms. This natural ability of microorganisms to break down complex organics can be harnessed to remove materials within the confines of the waste plant, making wastewater safe for discharge.
The biodegradable contaminants in water are usually measured in terms of biochemical oxygen demand (BOD). BOD is actually a measure of the oxygen consumed by microorganisms as they assimilate organics.
Bacteria metabolize oxygen along with certain nutrients and trace metals to form cellular matter, energy, carbon dioxide, water, and more bacteria. This process may be represented in the form of a chemical reaction:
Food (organic compounds) + Microorganisms + Oxygen + Nutrients → Cellular matter + More microorganisms + Carbon dioxide + Water + Energy
The purity of the water depends on minimizing the amount of "food" (organic compounds) that remains after treatment.
Therefore, biological waste treatment facilities are operated to provide an environment that will maximize the health and metabolism of microorganisms. An integral part of the biological process is the conversion of soluble organic material into insoluble materials for subsequent removal.
Open Lagoon Biological Oxidation
Where organic loads are low and sufficient land area is available, open lagoons may be used for biological treatment.
Lagoons provide an ideal habitat for microorganisms. Natural infiltration of oxygen is sufficient for biological oxidation if the organic loading is not too high. However, mechanical aeration is often used to increase the ability to handle a higher loading.
Lagoons are nothing more than long-term retention basins. Ordinarily shallow in depth, they depend on surface area, wind, and wave action for oxygen transfer from the atmosphere. Depending on the influent BOD loading and oxygen transfer, lagoons may be aerobic or anaerobic. Lagoons are used primarily for low BOD wastes or as polishing units after other biological operations.
Aerated Lagoons
As BOD loading increases, naturally occurring surface oxygen transfer becomes insufficient to sustain aerobic bacteria. It then becomes necessary to control the environment artificially by supplying supplemental oxygen. Oxygen, as air, is introduced either by mechanical agitators or by blowers and subsurface aerators. Because energy must be expended, the efficiency of the oxygen transfer is a consideration. Therefore, although unaerated lagoons are typically 3-5 ft (0.91-1.52 m) deep, allowing large surface areas for natural transfer, aerated lagoons are usually 10-15 ft (3.05-4.57 m) deep in order to provide a longer, more difficult path for oxygen to escape unconsumed. Aerated lagoons also operate with higher dissolved oxygen content.
Facultative Lagoons
Lagoons without mechanical aeration are usually populated by facultative organisms. These organisms have the ability to survive with or without oxygen. A lagoon designed specifically to be facultative is slightly deeper than an unaerated lagoon. Influent suspended solids and solids created by the metabolism of the aerobic bacteria settle to the bottom of the lagoon, where they undergo further decomposition in an anaerobic environment.
Activated Sludge Oxidation
According to the reaction presented previously, control of contaminant oxidation at high BOD loadings requires a bacteria population matched to the amount of food available. This need is the basis for the activated sludge process.
In the activated sludge process, reactants, food, and microorganisms are mixed in a controlled environment to optimize BOD removal. The process incorporates the return of concentrated microorganisms to the influent waste.
When bacteria are separated from wastewater leaving an aeration basin and reintroduced to the influent, they continue to thrive. The recirculated bacteria continue to oxidize wastewater contaminants, and if present in sufficient quantity, produce relatively low BOD effluent water.
Because the activated sludge process incorporates the return of concentrated microorganisms, it must include a process for microorganism concentration and removal. This process includes an aeration stage and a sedimentation stage. Because suspended solids are considered wastewater contaminants, the sedimentation stage accomplishes two functions: concentration of bacteria and removal of solids.
The operating parameters that affect the performance of any activated sludge process are BOD, microorganisms, dissolved oxygen, retention time, nutrient concentration, and the external influences of temperature and pH. In order to understand the various activated sludge designs, it is necessary to examine the relationship between available food and bacteria population.
Initially, excess food is present; therefore, the bacteria reproduce in a geometric fashion. This is termed the "log growth phase." As the population increases and food decreases, a plateau is reached in population. From the inflection point on the curve to the plateau, population is increasing but at a decreasing rate. This is called the "declining growth phase."
Once the plateau is crossed, the bacteria are actively competing for the remaining food. The bacteria begin to metabolize stored materials, and the population decreases. This area of the curve is termed "endogenous respiration." Eventually, the bacteria population and BOD are at a minimum.
Because activated sludge is a continuous, steady-state process, each plant operates at some specific point on this curve, as determined by the oxidation time provided. The point of operation determines the remaining bacteria population and BOD effluent.
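The growth phases described above can be visualized with a simple batch simulation. The following Python sketch uses Monod kinetics with endogenous decay; the kinetic model and every numerical value are assumptions chosen only to reproduce the qualitative log growth, declining growth, and endogenous phases:

```python
def simulate_batch_growth(s0=500.0, x0=20.0, mu_max=0.25, ks=50.0,
                          yield_coeff=0.5, decay=0.02, dt=0.1, hours=120):
    """Batch food (S, as BOD mg/L) and biomass (X, mg/L) curves using simple
    Monod kinetics with endogenous decay. All coefficients are illustrative.
    While food is plentiful the population grows roughly exponentially (log
    growth); as food runs out, growth slows (declining growth); once food is
    exhausted, decay dominates and the population falls (endogenous phase)."""
    steps_per_report = 100            # report every 10 simulated hours at dt = 0.1
    s, x, history = s0, x0, []
    n_steps = int(round(hours / dt))
    for step in range(n_steps + 1):
        if step % steps_per_report == 0:
            history.append((round(step * dt), round(s, 1), round(x, 1)))
        growth = mu_max * s / (ks + s) * x            # Monod growth rate, mg/(L*h)
        x += (growth - decay * x) * dt                # growth minus endogenous decay
        s = max(s - (growth / yield_coeff) * dt, 0.0) # food consumed per unit growth
    return history

for t, bod, biomass in simulate_batch_growth():
    print(f"t = {t:>3} h   BOD = {bod:>6}   biomass = {biomass:>6}")
```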
Optimization of an activated sludge plant requires the integration of mechanical, operational, and chemical approaches for the most practical overall program. Mechanical problems can include excessive hydraulic loading, insufficient aeration, and short-circuiting. Operational problems may include spills and shock loads, pH shocks, failure to maintain correct mixed liquor concentration, and excessive sludge retention in the clarifier.
Various chemical treatment programs are described below.
Sedimentation. Because activated sludge depends on microorganism recirculation, sedimentation is a key stage. The settleability of the biomass is a crucial factor. As bacteria multiply and generate colonies, they excrete natural biopolymers. These polymers and the slime layer that encapsulates the bacteria influence the flocculation and settling characteristics of bacteria colonies. Newly formed colonies in the log growth phase are relatively non-settleable. At the end of the declining growth phase and the first part of the endogenous phase, natural flocculation is at an optimum. As the endogenous phase continues, colonies break up and floc particles are dispersed, decreasing the biomass settleability.
Although microbes are eventually able to break down most complex organics and can tolerate very poor environments, they are very intolerant of sudden changes in pH, dissolved oxygen, and the organic compounds that normally upset an activated sludge system. These upsets normally result in poor BOD removal and excessive carryover of suspended solids (unsettled microorganisms) in the final effluent.
Aeration. Aeration is a critical stage in the activated sludge process. Several methods of aeration are used:
High Rate Aeration. High rate aeration operates in the log growth phase. Excess food is provided, by recirculation, to the biomass population. Therefore, the effluent from this design contains appreciable levels of BOD (i.e., the oxidation process is not carried to completion). Further, the settling characteristics of the biomass produced are poor. High sludge return rates are necessary to offset poor settling and to maintain the relatively high biomass population. Poor settling increases the suspended solids content of the effluent. The relatively poor effluent produced limits this design to facilities which need only pretreatment before discharge to a municipal system. The advantage of high rate aeration is low capital investment (i.e., smaller tanks and basins due to the short oxidation time).
Conventional Aeration. The most common activated sludge design used by municipalities and industry operates in the endogenous phase, in order to produce an effluent acceptable in BOD and TSS levels. Conventional aeration represents a "middle of the road" approach because its capital and operating costs are higher than those of the high rate process, but lower than those of the extended aeration plants. The conventional plant operates in the area of the BOD curve where further oxidation time produces little reduction in BOD. Natural flocculation is optimum, so the required sedimentation time for removal of suspended solids from the effluent is minimized.
Extended Aeration. Extended aeration plants operate in the endogenous phase, but use longer periods of oxidation to reduce effluent BOD levels. This necessitates higher capital and operating costs (i.e., larger basins and more air). In conjunction with lower BOD, extended aeration produces a relatively high suspended solids effluent when optimum natural settling ranges are exceeded.
Extended aeration designs may be necessary to meet effluent BOD requirements when the influent is relatively concentrated in BOD or the wastes are difficult to biodegrade. Because extended aeration operates on the declining side of the biomass population curve, net production of excess solids is minimized. Therefore, savings in sludge handling and disposal costs may offset the higher plant capital and operating costs required for extended aeration.
Step Aeration/Tapered Aeration. In a plug flow basin, the head of the basin receives the waste in its most concentrated form. Therefore, metabolism and oxygen demand are greatest at that point. As the waste proceeds through the basin, the rate of oxygen uptake (respiration rate) decreases, reflecting the advanced stage of oxidation.
Tapered aeration and step aeration reduce this inherent disadvantage. Tapered aeration provides more oxygen at the head of the basin and slowly reduces oxygen supply to match demand as the waste flows through the basin. This results in better control of the oxidation process and reduced air costs.
Step aeration modifies the introduction of influent waste. The basin is divided into several stages, and raw influent is introduced to each stage proportionately. All return microorganisms (sludge) are introduced at the head of the basin. This design reduces aeration time to 3-5 hr, while BOD removal efficiency is maintained. The shorter aeration time reduces capital expenses because a smaller basin can be used. Operating costs are similar to those of a conventional plant.
Contact Stabilization. Due to the highly efficient sorptive capabilities of activated biomass, the time necessary for the biomass to "capture" the colloidal and soluble BOD is approximately 30 min to 1 hr. Oxidation of fresh food requires the normal aeration time of 4-8 hr. In the contact stabilization design, relatively quick sorption time reduces aeration tank volume requirements. The influent waste is mixed with return biomass in the initial aeration tank (or contact tank) for 30-90 min. The entire flow goes to sedimentation, where the biomass and its captured organics are separated and returned to a reaeration tank. In the reaeration tank the wastes undergo metabolism at a high biomass population. The system is designed to reduce tank volume by containing the large majority of flow for a short period of time.
This process is not generally as efficient in BOD removal as the conventional plant process, due to mixing limitations in the contact basin. Operating costs are equivalent. Due to the unstabilized state of the biomass at sedimentation, flocculation is inferior. Suspended solids in the effluent are problematic.
Because this design exposes only a portion of the active biomass to the raw effluent at a time, it is less susceptible to feed variations and toxicants. For this reason it can be beneficial for treatment of industrial wastes.
Pure Oxygen Sludge Processes. Oxygen supply and transfer often become limiting factors in industrial waste treatment. As the name implies, pure oxygen activated sludge processes supply oxygen (90-99% O2) to the biomass instead of air. The increased partial pressure increases transfer rates and provides several advantages. Comparable or higher BOD removal efficiencies are maintained at higher BOD influent loadings and shorter retention times. Generally, aeration time is 2-3 hr. A further advantage is the production of lower net solids per pound of BOD removed. Therefore, sludge disposal costs are reduced.
The units are usually enclosed. Normally, three or four concrete box stages in series are provided for aeration. The raw wastewater, return biomass, and pure oxygen enter the first stage. Wastewater passes from stage to stage in the underflow.
The atmosphere flows over the open surface of each stage to the last stage, from which it is vented to control the oxygen content. Oxygen purity and the demand for oxygen decline through the stages. Each stage contains a mechanical agitator for mixing and oxygen transfer. By design, each stage is completely mixed. After aeration, the waste flows to a conventional sedimentation stage. BOD and TSS removals are usually somewhat better than in a conventional aeration system.
Chemical Treatment Programs. The following additives represent a variety of chemical programs that may be used to address problems and improve system efficiency.
Essential Nutrients. Nutrients, particularly nitrogen and phosphorus, may be added to ensure complete digestion of organic contaminants.
Polymers. Polymer feeding improves the settling of suspended solids. Cationic polymers can increase the settling rate of bacterial floc and improve capture of dispersed floc and cell fragments. This more rapid concentration of solids minimizes the volume of recycle flow so that the oxygen content of the sludge is not depleted. Further, the wasted sludge is usually more concentrated and requires less treatment for eventual dewatering. Polymers may also be used on
a temporary basis to improve the removal of undesirable organisms, such as filamentous bacteria or yeast infestations, that cause sludge bulking or carryover of floating clumps of sludge.
Oxidizing Agents. Peroxide, chlorine, or other agents may be used for the selective oxidation of troublesome filamentous bacteria.
Antifoam Agents. Antifoam agents may be used to control excessive foam.
Coagulants. In addition to antifoam agents, coagulants may be fed continuously to improve efficiency, or to address particularly difficult conditions. They may also be used intermittently to compensate for hydraulic peak loads or upset conditions.
Fixed Media Biological Oxidation
In contrast to activated sludge, in which the biomass is in a fluid state, fixed media oxidation passes influent wastewater across a substructure laden with fixed biomass. The parameters for healthy microorganisms remain the same, except for the manner in which food and microorganisms come into contact.
Fixed media designs allow a biological slime layer to grow on a substructure continually exposed to raw wastewater. As the slime layer grows in thickness, oxygen transfer to the innermost layers is impeded. Therefore, fixed media designs develop aerobic, facultative, and anaerobic bacteria as a function of the thickness of the slime layer. Eventually, either because of size and wastewater shear or the death of the microorganisms, some of the slime layer sloughs off. In a continuous process, this constantly sloughing material is carried to a sedimentation stage, where it is removed. There are no provisions to recycle the microorganisms, because return sludge would plug the fixed media structure. In fact, media plugging and lack of oxygen transfer are the primary difficulties encountered with fixed media designs. Plugging problems can be alleviated by increased wastewater shear. This is normally accomplished by recycling a portion of the effluent wastewater.
Trickling Filters. Trickling filters are not really filters but a filter-like form of fixed media oxidation. Wastewater is sprayed over a bed of stones, 3-5 in. in diameter. Bed depths range from 5 to 7 ft. Because air contact is the sole means of oxygen transfer, microorganisms become more oxygen deficient as depth increases.
Trickling filters can be classified by hydraulic loading as low-rate, high-rate, or roughing. Due to inherent oxygen transfer difficulties, even low rate filters cannot achieve the BOD removal possible in conventional activated sludge systems.
Industrial trickling filters are usually followed by an activated sludge unit. They may be used as a pretreatment step before discharge to a municipal sewage system.
Biological Towers. Another form of fixed media filter uses synthetic materials in grid fashion as a substructure for biological growth. The high porosity available with artificially designed media alleviates the oxygen transfer problems of trickling filters and allows greater bed depths. Bed depths of up to 20 ft (6.1 m) with adequate oxygen allow longer contact and consequently better BOD removal.
Biodiscs. Biodiscs are a recently developed form of fixed media oxidation. The media is fixed to a rotating shaft that exposes the media alternately to food (wastewater) and oxygen (atmosphere). Design parameters include speed of rotation, depth of the wastewater pool, porosity of the synthetic media, and number of series and parallel stages. These units circumvent the oxygen limitations of the trickling filter and therefore provide BOD removal comparable to conventional activated sludge systems. Solids produced are easily settled in the sedimentation stage, providing acceptable TSS levels in the effluent. Little operational attention is required.
SOLID WASTE HANDLING
Wastewater treatment is a concentration process in which waterborne contaminants are removed from the larger wastewater stream and concentrated in a smaller side stream. The side stream is too large to be disposed of directly, so further concentration processes are required. These processes are called "solid waste handling" operations.
Stabilization/Digestion
Sludge stabilization is a treatment technique applied to biological sludge to reduce its odor-causing or toxic properties.
This treatment often reduces the amount of solids as a side effect. Anaerobic and aerobic digestion, lime treatment, chlorine oxidation, heat treatment, and composting fall into this category.
Anaerobic Digestion. Anaerobic digestion takes place in an enclosed tank. The biochemical reactions take place in the following phases:
Organics + Acid-forming organisms → Volatile acids
Volatile acids + Methane formers → Methane + Carbon dioxide
Sludge solids are decreased due to the conversion of biomass to methane and carbon dioxide. The methane can be recovered for its heating value.
Aerobic Digestion. Aerobic digestion is the separate aeration of sludge in an open tank. Oxidation of biodegradable matter, including cell mass, occurs under these conditions. As in anaerobic digestion, there is a decrease in sludge solids, and the sludge is well stabilized with respect to odor formation. Capital costs are lower than those of anaerobic digestion, but operating costs are higher, and there is no by-product methane production.
Lime Treatment. Stabilization by lime treatment does not result in a reduction of organic matter. Adding enough lime to maintain the pH of the sludge above 11.0 for 1-14 days is considered sufficient to destroy most bacteria.
Composting. A natural digestion process, composting usually incorporates sludge material that later will be applied to farmland. Sludge is combined with a bulking material, such as other solid wastes or wood chips, and piled in windrows.
Aeration is provided by periodic turning of the sludge mass or by mechanical aerators. The energy produced by the decomposition reaction can bring the waste temperature to 140-160°F (60-71°C), destroying pathogenic bacteria. At the end of the composting period, the bulking material is separated, and the stabilized sludge is applied to land or sent to a landfill.
Sludge Conditioning
Typically, sludge from a final liquid-solids separation unit may contain from 1 to 5% total suspended solids. Because of the cost savings associated with handling smaller volumes of sludge, there is an economic incentive to remove additional water. Dewatering equipment is designed to remove water in a much shorter time span than nature would by gravity.
Usually, an energy gradient is used to promote rapid drainage. This requires frequent conditioning of the sludge prior to the dewatering step.
Conditioning is necessary due to the nature of the sludge particles. Both inorganic and organic sludge consist of colloidal (less than 1 μm), intermediate, and large particles (greater than 200 μm). The large particles, or flocs, are usually compressible. Under an energy gradient, these large flocs compress and prevent water from escaping. The small particles also participate in this mechanism, plugging the pores of the sludge cake. The pressure drop through the sludge cake, due to the decrease in porosity and pore sizing, exceeds available energy, and dewatering ceases.
The purpose of sludge conditioning is to provide a rigid sludge structure of a porosity and pore size sufficient to allow drainage. Biological sludges are conditioned with FeCl3, lime, and synthetic cationic polymers, either separately or in combination. Heat conditioning and low-pressure oxidation are also used for biological sludges. Inorganic sludges are conditioned with FeCl3, lime, and either cationic or anionic polymers.
Dewatering
Belt Filter Press. Belt filter presses have been used in Europe since the 1960's and in the United States since the early 1970's. They were initially designed to dewater paper pulp and were subsequently modified to dewater sewage sludge.
Belt filter presses are designed on the basis of a very simple concept. Sludge is sandwiched between two tensioned porous belts and passed over and under rollers of various diameters. At a constant belt tension, rollers of decreasing diameters exert increasing pressure on the sludge, thus squeezing out water. Although many different designs of belt filter presses are available, they all incorporate a polymer conditioning unit, a gravity drainage zone, a compression (low-pressure) zone, and a shear (high-pressure) zone.
Polymer Conditioning Unit. Polymer conditioning can take place in a small tank, in a rotating drum attached to the top of the press, or in the sludge line. Usually, the press manufacturer supplies a polymer conditioning unit with the belt filter press.
Gravity Drainage Zone. The gravity drainage zone is a flat or slightly inclined belt, which is unique to each press model. In this section, sludge is dewatered by the gravity drainage of free water. The gravity drainage zone should increase the solids concentration of the sludge by 5-10%. If the sludge does not drain well in this zone, the sludge can squeeze out from between the belts or the belt mesh can become blinded. The effectiveness of the gravity drainage zone is a function of sludge type, quality, and conditioning, along with the screen mesh and the design of the drainage zone.
Compression (Low-Pressure) Area. The compression, or low-pressure, area is the point at which the sludge is "sandwiched" between the upper and lower belts. A firm sludge cake is formed in this zone in preparation for the shear forces encountered in the high-pressure zone.
Shear (High-Pressure) Zone. In the shear, or high-pressure, zone, forces are exerted on the sludge by the movement of the upper and lower belts, relative to each other, as they go over and under a series of rollers with decreasing diameters.
Some manufacturers have an independent high-pressure zone which uses belts or hydraulic cylinders to increase the pressure on the sludge, producing a drier cake. A dry cake is particularly important for plants that use incineration as the final disposal.
Dewatering belts are usually woven from monofilament polyester fibers. Various weave combinations, air permeabilities, and particle retention capabilities are available. These parameters greatly influence press performance.
Usually, cationic polymers are used for sludge conditioning. A two-polymer system is often used on a belt filter press to improve cake release from the upper dewatering belt. The polymer must be selected carefully to ensure optimum performance.
Odors are controlled by proper ventilation, by ensuring that the sludge does not turn septic, and by the use of added chemicals, such as potassium permanganate or ferric sulfate, to neutralize the odor-causing chemicals.
Screw Press. A recent development in sludge dewatering equipment, used primarily in the pulp and paper industry, is the screw press. Screw presses are most effective for primary sludges, producing cake solids of 50-55%, but are also appropriate for primary and secondary blended sludges.
Sludge is conditioned and thickened prior to dewatering. The conditioned sludge enters one end of the machine. A slowly rotating screw, analogous to the conveyor in a solid bowl centrifuge, conveys and compresses the solids.
The screw has the same outer diameter and pitch for the entire length of the press. In some models, the diameter of the screw shaft increases toward the discharge end of the screw press to enhance dewatering. The compression ratio (the ratio of free space at the inlet to the space at the discharge end of the screw) is selected according to the nature of the material to be dewatered and the dewatering requirement. Dewatered cake is discharged as it is pressed against the spring or hydraulically loaded cone mounted at the end of the screw press.
The drum of the screw press consists of a fine strainer screen, a thicker punched holding plate, and a reinforcement rib.
Filtrate is collected in the collecting pan located under the screw press, and the cake is transported to the next stage.
Vacuum Filters. Vacuum filtration uses various porous materials as filter media, including cloth, steel mesh, and tightly wound coil springs. Under an applied vacuum, the porous medium retains the solids, but allows water to pass through.
The relative importance of cake dryness, filtrate quality, and filter cake yield can vary from one system to another.
A decrease in drum speed allows more time for drying of the sludge to increase cake dryness. However, this also decreases the filter cake yield, defined as pounds of dry solids per hour per square foot of filter area. Polymers can help produce a drier cake without the problem of a lower filter cake yield. Synthetic polymers improve cake dryness by agglomerating sludge particles that may hinder the removal of water. This agglomeration also increases the solids capture across the unit, which results in a higher-quality filtrate.
Centrifuges. Centrifugal force, 3500-6000 times the force of gravity, is used to increase the sedimentation rate of solid sludge particles.
The most common centrifuge found in waste treatment dewatering applications is the continuous bowl centrifuge. The two principal elements of a continuous solid bowl centrifuge are the rotating bowl and the inner screw conveyor. The bowl acts as a settling vessel; the solids settle due to centrifugal force from its rotating motion. The screw conveyor picks up the solids and conveys them to the discharge port.
Often, operation of centrifugal dewatering equipment is a compromise between centrate quality, cake dryness, and sludge throughput. For example, an increase in solids throughput reduces clarification capacity, causing a decrease in solids capture. At the same time, the cake is drier due to the elimination of fine particles that become entrained in the centrate. The addition of polymers, with their ability to agglomerate fine particles, can result in increased production rates without a loss in centrate quality.
Polymers are usually fed inside the bowl because shear forces may destroy flocs if they are formed prior to entry. Also, large particles settle rapidly in the first stage of the bowl. Thus, economical solids recovery can be achieved through internal feeding of polymers after the large particles have settled.
Plate and Frame Press. A plate and frame filter press is a batch operation consisting of vertical plates held in a frame.
A filter cloth is mounted on both sides of each plate. Sludge pumped into the unit is subjected to pressures of up to 25 psig (1.76 kg/cm²) as the plates are pressed together. When the chamber between individual plates has filled with sludge and the filtrate flow ceases, the dewatering cycle is complete. This cycle usually lasts up to 2 hr.
Because of the high pressures, blinding of the filter cloth by small sludge particles can occur. A filter precoat (e.g., diatomaceous earth) can be used to prevent filter blinding. Proper chemical conditioning of the sludge reduces or eliminates the need for precoat materials. At 5-10 psig, polymers can produce a rigid floc and eliminate fine particles. At greater pressures, the effectiveness of synthetic polymers is reduced; therefore, inorganic chemicals, such as ferric chloride and lime, are often used instead of polymers.
Sludge Drying Beds. Sludge drying beds consist of a layer of sand over a gravel bed. Underdrains spaced throughout the system collect the filtrate, which usually is returned to the wastewater plant.
Water is drained from the sludge cake by gravity through the sand and gravel bed. This process is complete within the first 2 days. All additional drying occurs by evaporation, which takes from 2 to 6 weeks. For this reason, climatic conditions, such as frequency and rate of precipitation, wind velocity, temperature, and relative humidity, play an important role in the operation of sludge drying beds. Often, these beds are enclosed to aid in dewatering. Chemical conditioning also reduces the time necessary to achieve the desired cake solids.
Sludge Disposal
Disposal of the sludge generated by wastewater treatment plants is dependent on government regulations (such as the
Resource Conservation and Recovery Act), geographical location, and sludge characteristics, among other things. Final disposal methods include reclamation, incineration, land application, and landfill.
Reclamation. Because of costs associated with the disposal of wastewater sludge, each waste stream should be evaluated for its reclamation potential. Energy value, mineral content, raw material makeup, and by-product markets for each sludge should be evaluated. Examples include burning of digester gas to run compressors, recalcination of lime sludge to recover CaO, return of steel mill thickener sludge to the sinter plant, and marketing of by-product metallic salts for wastewater treatment use.
Incineration. Biological sludge can be disposed of by incineration; the carbon, nitrogen, and sulfur are removed as gaseous by-products, and the inorganic portion is removed as ash. Old landfill sites are filling up and new ones are becoming increasingly difficult to obtain. Therefore, waste reduction through incineration is becoming a favored disposal practice.
Several combustion methods are available, including hogged fuel boilers, wet air oxidation and kiln, multiple hearth furnace, and fluidized bed combustion processes.
Sludge incineration is a two-step process involving drying and combustion. Incineration of waste sludge usually requires auxiliary fuel to maintain temperature and evaporate the water contained in the sludge. It is critically important to maintain low and relatively constant sludge moisture.
Land Application. Sludge produced from biological oxidation of industrial wastes can be used for land application as a fertilizer or soil conditioner. A detailed analysis of the sludge is important in order to evaluate toxic compound and heavy metal content, leachate quality, and nitrogen concentration.
Soil, geology, and climate characteristics are all important considerations in determining the suitability of land application, along with the type of crops to be grown on the sludge-amended soil. Sludge application rates vary according to all of these factors.
Landfill. Landfill is the most common method of industrial wastewater treatment plant sludge disposal.
Care must be taken to avoid pollution of groundwater. The movement and consequent recharge of groundwater is a slow process, so a level of contamination that would be minor in a stream or river can result in irreversible, long-term pollution of the groundwater. Many states require impermeable liners, defined as having a permeability of 10⁻⁷ cm/sec, in landfill disposal sites. This requirement limits liners to a few natural clays and commercial plastic liners. In addition to impermeable liners, leachate collection and treatment systems are typically required for new and remediated landfills.
Steps can be taken to reduce leachate and leachate contamination. Decreasing the moisture in the sludge removes water that would eventually be available as leachate. Proper consideration of the hydraulics of the landfill site can capture more rainfall as runoff and eliminate ponding and its contribution to leachate.
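Darcy's law gives a feel for what a 10⁻⁷ cm/sec liner implies in practice. The sketch below assumes an illustrative leachate head and liner thickness (neither is specified here) and converts the resulting flux into a leachate volume per hectare per year.

```python
# Rough Darcy's-law estimate of seepage through a clay liner; head and thickness are assumed values.
K_CM_PER_S = 1e-7             # liner permeability required by many states
HEAD_CM = 30.0                # assumed depth of leachate ponded on the liner
THICKNESS_CM = 90.0           # assumed liner thickness

gradient = (HEAD_CM + THICKNESS_CM) / THICKNESS_CM         # hydraulic gradient across the liner
flux_cm_per_yr = K_CM_PER_S * gradient * 3600 * 24 * 365   # Darcy flux, cm of water per year

# 1 ha = 1e8 cm2 and 1 m3 = 1e6 cm3, so multiplying by 100 converts cm/yr to m3 per hectare per year.
leakage_m3_per_ha_yr = flux_cm_per_yr * 1e8 / 1e6
print(f"Seepage about {flux_cm_per_yr:.1f} cm/yr, i.e. roughly {leakage_m3_per_ha_yr:.0f} m3 per hectare per year")
```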
ENVIRONMENTAL REGULATIONS
Many governmental regulations have been established in recent years for the protection of the environment. The Clean
Water Act and the Resource Conservation and Recovery Act are among the most significant.
Clean Water Act
The Clean Water Act (CWA) of 1972 established regulations for wastewater discharge, provided funding for Publicly Owned Treatment Works (municipal waste treatment plants), and authorized the National Pollutant Discharge Elimination System (NPDES) to regulate and issue wastewater discharge permits for industrial and municipal plants.
Resource Conservation and Recovery Act (RCRA)
The Resource Conservation and Recovery Act (RCRA) of 1976 provided regulations for management of hazardous solid wastes, cleanup of hazardous waste sites, waste minimization, underground storage, and groundwater monitoring.

4.0 Wastewater as a Resource
4.1 Introduction
In many arid and semi-arid regions of the world water has become a limiting factor,
particularly for agricultural and industrial development. Water resources planners are
continually looking for additional sources of water to supplement the limited resources
available to their region. Several countries of the Eastern Mediterranean region, for
example, where precipitation is in the range of 100-200 mm a⁻¹, rely on a few perennial
rivers and small underground aquifers that are usually located in mountainous regions.
Drinking water is usually supplied through expensive desalination systems, and more
than 50 per cent of the food demand is satisfied by importation.
In such situations, source substitution appears to be the most suitable alternative to
satisfy less restrictive uses, thus allowing high quality waters to be used for domestic
supply. In 1958, the United Nations Economic and Social Council provided a
management policy to support this approach by stating that "no higher quality water,
unless there is a surplus of it, should be used for a purpose that can tolerate a lower
grade" (United Nations, 1958). Low quality waters such as wastewater, drainage waters
and brackish waters should, whenever possible, be considered as alternative sources for
less restrictive uses.
Agricultural use of water resources is of great importance due to the high volumes that
are necessary. Irrigated agriculture will play a dominant role in the sustainability of crop
production in years to come. By the year 2000, further reduction in the extent of
exploitable water resources, together with competing claims for water for municipal and
industrial use, will significantly reduce the availability of water for agriculture. The use of
appropriate technologies for the development of alternative sources of water is, probably,
the single most adequate approach for solving the global problem of water shortage,
together with improvements in the efficiency of water use and with adequate control to
reduce water consumption.
Figure 4.1 Types of wastewater use (After WHO, 1989)
4.2 Types of reuse
Water is a renewable resource within the hydrological cycle. The water recycled by
natural systems provides a clean and safe resource which is then deteriorated by
different levels of pollution depending on how, and to what extent, it is used. Once used,
however, water can be reclaimed and used again for different beneficial uses. The
quality of the once-used water and the specific type of reuse (or reuse objective) define
the levels of subsequent treatment needed, as well as the associated treatment costs.
The basic types of reuse are indicated in Figure 4.1 and described in more detail below
(WHO, 1989).
4.2.1 Agriculture and aquaculture
On a world-wide basis wastewater is the most widely used low-quality water, particularly
for agriculture and aquaculture. The rest of this chapter concentrates on this type of
reuse because of the large volumes used, the associated health risks and the
environmental concerns. Other types of reuse are only discussed briefly in the following
sub-sections.
4.2.2 Urban
In urban areas, reclaimed wastewater has been used mainly for non-potable applications
(Crook et al., 1992) such as:
• Irrigation of public parks, recreation centres, athletic fields, school yards and playing
fields, and edges and central reservations of highways.
• Irrigation of landscaped areas surrounding public, residential, commercial and
industrial buildings.
• Irrigation of golf courses.
• Ornamental landscapes and decorative water features, such as fountains, reflecting
pools and waterfalls.
• Fire protection.
• Toilet and urinal flushing in commercial and industrial buildings.
The disadvantages of urban non-potable reuse are usually related to the high costs
involved in the construction of dual water-distribution networks, operational difficulties
and the potential risk of cross-connection. Costs, however, should be balanced with the
benefits of conserving potable water and eventually of postponing, or eliminating, the
need for the development of additional sources of water supply.
Potable urban reuse can be performed directly or indirectly. Indirect potable reuse
involves allowing the reclaimed water (or, in many instances, raw wastewater) to be
retained and diluted in surface or groundwaters before it is collected and treated for
human consumption. In many developing countries unplanned, indirect potable reuse is
performed on a large scale, when cities are supplied from sources receiving substantial
volumes of wastewater. Often, only conventional treatment (coagulation-flocculation-clarification, filtration and disinfection) is provided and therefore significant long-term
health effects may be expected from organic and inorganic trace contaminants which
remain in the water supplied.
Direct potable reuse takes place when the effluent from a wastewater reclamation plant
is connected to a drinking-water distribution network. Treatment costs are very high
because the water has to meet very stringent regulations which tend to be increasingly
restrictive, both in terms of the number of variables to be monitored as well as in terms
of tolerable contaminant limits.
Presently, only the city of Windhoek, Namibia is performing direct potable reuse during
dry periods. The Goreangab Reclamation Plant constructed in 1968 is currently being
enlarged to treat about 14,000 m³ d⁻¹ by 1997 in order to further augment supplies to the
city of Windhoek (Van Der Merwe et al., 1994).
4.2.3 Industry
The most common uses of reclaimed water by industry are:
• Evaporative cooling water, particularly for power stations.
• Boiler-feed water.
• Process water.
• Irrigation of grounds surrounding the industrial plant.
The use of reclaimed wastewater by industry is a potentially large market in developed
as well as in developing and rapidly industrialising countries. Industrial reuse is highly
cost-effective for industries where the process does not require water of potable quality
and where industries are located near urban centres where secondary effluent is readily
available for reuse.
4.2.4 Recreation and landscape enhancement
The use of reclaimed wastewater for recreation and landscape enhancement ranges
from small fountains and landscaped areas to full, water-based recreational sites for
swimming, boating and fishing. As for other types of reuse, the quality of the reclaimed
water for recreational uses should be determined by the degree of body contact
estimated for each use. In large impoundments, however, where aesthetic appearance
is considered important it may be necessary to control nutrients to avoid eutrophication.
4.3 Implementing or upgrading agricultural reuse systems
Land application of wastewater is an effective water pollution control measure and a
feasible alternative for increasing resources in water-scarce areas. The major benefits of
wastewater reuse schemes are economic, environmental and health-related. During the
last two decades the use of wastewater for irrigation of crops has been substantially
increased (Mara and Cairncross, 1989) due to:
• The increasing scarcity of alternative water resources for irrigation.
• The high costs of fertilisers.
• The assurances that health risks and soil damage are minimal, if the necessary
precautions are taken.
• The high costs of advanced wastewater treatment plants needed for discharging
effluents to water bodies.
• The socio-cultural acceptance of the practice.
• The recognition by water resource planners of the value of the practice.
Economic benefits can be gained by income generation and by an increase in
productivity. Substantial increases in income will accrue in areas where cropping was
previously limited to rainy seasons. A good example of economic recovery associated
with the availability of wastewater for irrigation is the Mesquital Valley in Mexico (see
Case Study VII) where agricultural income has increased from almost zero at the turn of
the century when wastewater was made available to the region, to about 16 million
Mexican Pesos per hectare in 1990 (CNA, 1993). The practice of excreta or wastewater
fed aquaculture has also been a substantial source of income in many countries such as
India, Bangladesh, Indonesia and Peru. The East Calcutta sewage fisheries in India, the
largest wastewater use system involving aquaculture in the world (about 3,000 ha in
1987), produces 4-9 t ha⁻¹ a⁻¹ of fish, which is supplied to the local market (Edwards,
1992). Economic benefits of wastewater/excreta-fed aquaculture can also be found
elsewhere (Bartone, 1985; Bartone et al., 1990; Ikramullah, 1994).
Studies carried out in several countries have shown that crop yields can increase if
wastewater irrigation is provided and properly managed. Table 4.1 shows the results of
field experiments made in Nagpur, India, by the National Environmental Research
Institute (NEERI), which investigated the effects of wastewater irrigation on crops
(Shende, 1985).
Effluents from conventional wastewater treatment systems, with typical concentrations of 15 mg l⁻¹ total N and 3 mg l⁻¹ P, applied at the usual irrigation rate of about 2 m a⁻¹, provide application rates of N and P of 300 and 60 kg ha⁻¹ a⁻¹, respectively. Such nutrient inputs
can reduce, or even eliminate, the need for commercial fertilisers. The application of
wastewater provides, in addition to nutrients, organic matter that acts as a soil
conditioner, thereby increasing the capacity of the soil to store water. The increase in
productivity is not the only benefit because more land can be irrigated, with the
possibility of multiple planting seasons (Bartone and Arlosoroff, 1987).
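The application rates quoted above follow directly from multiplying the nutrient concentration by the annual irrigation volume, since 1 mg l⁻¹ is numerically 1 g m⁻³ and one hectare is 10,000 m². A minimal check:

```python
def nutrient_load_kg_per_ha_yr(concentration_mg_per_l, irrigation_depth_m_per_yr):
    """Annual nutrient application: concentration (mg/l = g/m3) times irrigation volume per hectare."""
    volume_m3_per_ha_yr = irrigation_depth_m_per_yr * 10_000     # 1 ha = 10,000 m2
    return concentration_mg_per_l * volume_m3_per_ha_yr / 1_000  # grams to kilograms

print(nutrient_load_kg_per_ha_yr(15, 2))   # total N: 300.0 kg/ha/yr
print(nutrient_load_kg_per_ha_yr(3, 2))    # total P:  60.0 kg/ha/yr
```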
Environmental benefits can also be gained from the use of wastewater. The factors that
may lead to the improvement of the environment when wastewater is used rather than
being disposed of in other ways are:
• Avoiding the discharge of wastewater into surface waters.
• Preserving groundwater resources in areas where over-use of these resources in agriculture is causing salt intrusion into the aquifers.
• The possibility of soil conservation by humus build-up and by the prevention of land
erosion.
• The aesthetic improvement of urban conditions and recreational activities by means of
irrigation and fertilisation of green spaces such as gardens, parks and sports facilities.
Despite these benefits, some potential negative environmental effects may arise in
association with the use of wastewater. One negative impact is groundwater
contamination. The main problem is associated with nitrate contamination of
groundwaters that are used as a source of water supply. This may occur when a highly
porous unsaturated layer above the aquifer allows the deeper percolation of nitrates in
the wastewater. Provided there is a deep, homogeneous, unsaturated layer above the
aquifer which is capable of retaining nitrate, there is little chance of contamination. The
uptake of nitrogen by crops may reduce the possibility of nitrate contamination of
groundwaters, but this depends on the rate of uptake by plants and the rate of
wastewater application to the crops.
Build up of chemical contaminants in the soil is another potential negative effect.
Depending on the characteristics of the wastewater, extended irrigation may lead to the
build up of organic and inorganic toxic compounds and increases in salinity within the
unsaturated layers. To avoid this possibility irrigation should only use wastewater of
predominantly domestic origin. Adequate soil drainage is also of fundamental
importance in minimising soil salinisation.
Extended irrigation may create habitats for the development of disease vectors, such as
mosquitoes and snails. If this is likely, integrated vector control techniques should be
applied to avoid the transmission of vector-borne diseases.
Indirect health-related benefits can occur because wastewater irrigation systems may
contribute to increased food production and thus to improving health, quality of life and
social conditions. However, potential negative health effects must be considered by
public health authorities and by institutions managing wastewater reuse schemes
because farm workers, the consumers of crops and, to some extent, nearby dwellers
can be exposed to the risk of transmission of communicable diseases.
4.3.1 Policy and planning
The use of wastewater constitutes an important element of a water resources policy and
strategy. Many nations, particularly those in the arid and semi-arid regions such as the
Middle Eastern countries, have adopted (in principle) the use of treated wastewater as
an important concept in their overall water resources policy and planning. A judicious
wastewater use policy transforms wastewater from an environmental and health liability
to an economic and environmentally sound resource (Kandiah, 1994a).
Governments must be prepared to establish and to control wastewater reuse within a
broader framework of a national effluent use policy, which itself forms part of a national
plan for water resources. Lines of responsibility and cost-allocation principles should be
worked out between the various sectors involved, i.e. local authorities responsible for
wastewater treatment and disposal, farmers who will benefit from effluent use schemes,
and the state which is concerned with the provision of adequate water supplies, the
protection of the environment and the promotion of public health. To ensure long-term
sustainability, sufficient attention must be given to the social, institutional and
organisational aspects of effluent use in agriculture and aquaculture.
The planning of wastewater-use programmes and projects requires a systematic
approach. Box 4.1 gives a system framework to support the characterisation of basic
conditions and the identification of possibilities and constraints to guide the planning
phase of the project (Biswas, 1988).
Government policy on effluent use in agriculture has a deciding effect on the
achievement of control measures through careful selection of the sites and the crops
that may be irrigated with treated effluent. A decision to make treated effluent available
to farmers for unrestricted irrigation removes the possibility of taking advantage of
careful selection of sites, irrigation techniques and crops, and thereby of limiting the
health risks and minimising the environmental impacts. However, if a government does not apply crop selection but allows unrestricted irrigation with effluent only in specific controlled areas, public access to those areas can be prevented, and some degree of control is thereby achieved. The greatest security against health risk and adverse environmental impact
arises from limiting effluent use to restricted irrigation on controlled areas to which the
public has no access.
It has been suggested that the procedures involved in preparing plans for effluent
irrigation schemes are similar to those used in most forms of resource planning, i.e. in
accordance with the main physical, social and economic dimensions summarised in
Figure 4.2. The following key issues or tasks are likely to have a significant effect on the
ultimate success of effluent irrigation schemes:
• The organisational and managerial provisions made to administer the resource, to
select the effluent-use plan and to implement it.
• The importance attached to public health considerations and to the levels of risk taken.
• The choice of single-use or multiple-use strategies.
• The criteria adopted in evaluating alternative reuse proposals.
• The level of appreciation of the scope for establishing a forest resource.
Adopting a mix of effluent use strategies normally has the advantages of allowing
greater flexibility, increased financial security and more efficient use of wastewater
throughout the year, whereas a single-use strategy gives rise to seasonal surpluses of
effluent for unproductive disposal.
4.3.2 Legal and regulatory issues
The use of wastewater, particularly for irrigation of crops, is associated with two main
types of legal issues:
• Establishment of a legal status of wastewater and the delineation of a legal regime for
its use. This may include the development of new, or the amendment of existing,
legislation; creation of new institutions or the allocation of new powers to existing
institutions; attributing roles of, and relationships between, national and local
government in the sector; and public health, environmental and agricultural legislation
such as standards and codes of practice for reuse.
• Securing tenure for the users, particularly in relation to rights of access to and
ownership of waste, and including public regulation of its use. Legislation should also
include land tenure, without which security of access to wastewater is worthless.
The delineation of a legal regime for wastewater management should address the
following aspects (WHO, 1990):
• A definition of what is meant by wastewater.
• The ownership of wastewater.
• A system of licensing of wastewater use.
• Protection of other users of the water resources that may be adversely affected by the
loss of return flows into the system arising from the use of wastewater.
• Restrictions for the protection of public and environmental health with respect to
intended use of the wastewater, treatment conditions and final quality of wastewater,
and conditions for the siting of wastewater treatment facilities.
• Cost allocation and pricing.
• Enforcement mechanisms.
• Disposal of the sludges which result from wastewater treatment processes.
• Institutional arrangements for the administration of relevant legislation.
• The interface of this legal regime with the general legal regime for the management of
water resources, particularly the legislation for water and environmental pollution control
and the legislation governing the provision of water supply and sewerage services to the
public, including the relevant responsible institutions.
At the operational level, regulatory actions are applied and enforced through guidelines,
standards and codes of practice (see Chapters 2 and 5).
Guidelines
One of the many functions of the World Health Organization (WHO) is to propose
regulations and to make recommendations with respect to international health matters.
Guidelines for the safe use of wastewater, produced as part of this function are intended
to provide background and guidance to governments for risk management decisions
related to the protection of public health and to the preservation of the environment.
It must be stressed that guidelines are not intended for absolute and direct application in
every country. They are of an advisory nature and are based on the state of the art in scientific research and epidemiological findings. They are aimed at establishing a basis for the assessment of health risks and, as such, they provide a common background
against which national or regional standards can be derived (Hespanhol and Prost,
1994).
Agriculture. The Scientific Group on Health Guidelines for the Use of Wastewater in
Agriculture and Aquaculture, held in Geneva in 1987 (WHO, 1989) established the basic
criteria for health protection of the groups at risk from agricultural reuse systems and
recommended the microbiological guidelines shown in Table 4.2. These criteria and
guidelines were the result of a long preparatory process and the epidemiological
evidence available at the time. They are related to the category of crops, the reuse
conditions, the exposed groups and the appropriate wastewater treatment systems, in
order to achieve the required microbiological quality.
Aquaculture. The use of wastewater or excreta to fertilise ponds for fish production has
been associated with a number of infections caused by excreted pathogens, including
invasion of fish muscle by bacteria and high pathogen concentrations in the digestive
tract and the intra-peritoneal fluid of the fish. Limited experimental and field data on
health effects of excreta or wastewater fertilised aquaculture are available and, therefore,
the Scientific Group Meeting recommended the following tentative guidelines:
• A geometric mean of less than 10³ faecal coliform per 100 ml for fish pond water, to ensure that bacterial invasion of fish muscle is prevented. The same guideline value should be maintained for pond water in which edible aquatic vegetables (macrophytes) are grown, because in many areas they are eaten raw. This can be achieved by treating the wastewater supplied to the ponds to a concentration of 10³-10⁴ faecal coliforms per 100 ml, assuming that the pond will allow one order of magnitude dilution of the incoming wastewater (see the short calculation after this list).
• Total absence of trematode eggs, to prevent trematode infections such as clonorchiasis, fasciolopsiasis and schistosomiasis. This can be readily achieved by stabilisation pond treatment.
• High standards of hygiene during fish handling and gutting, to prevent contamination of the fish flesh by the intra-peritoneal fluid of the fish.
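The dilution assumption in the first of these points can be made explicit with a short calculation: if the pond provides one order of magnitude reduction, wastewater supplied at 10³-10⁴ faecal coliforms per 100 ml gives pond water at roughly 10²-10³ per 100 ml, within the tentative guideline. A minimal sketch:

```python
import math

def pond_concentration(influent_fc_per_100ml, log10_reduction=1.0):
    """Faecal coliform level expected in the pond, assuming a fixed log10 reduction/dilution."""
    return influent_fc_per_100ml / 10 ** log10_reduction

for influent in (1e3, 1e4):   # wastewater supplied to the pond, per 100 ml
    pond = pond_concentration(influent)
    print(f"influent 10^{math.log10(influent):.0f} per 100 ml -> pond about 10^{math.log10(pond):.0f} per 100 ml")
```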
The chemical quality of treated domestic effluents used for irrigation is also of particular
importance. Several variables are relevant to agriculture in relation to the yield and
quality of crops, the maintenance of soil productivity and the protection of the
environment. These variables are total salt concentration, electrical conductivity, sodium
adsorption ratio (SAR), toxic ions, trace elements and heavy metals. A thorough
discussion of this subject is available in FAO (1985).
Standards and Codes of Practice. Standards are legal impositions enacted by means of
laws, regulations or technical procedures. They are established by countries by adapting
guidelines to their own national priorities and by taking into account their own technical,
economic, social, cultural and political characteristics and constraints (see Chapter 5).
They are established by competent national authorities by adopting a risk-benefit
approach. This implies that the standards produced will consider not only health-related
concerns but also a wide range of economic and social consequences. At any time,
national standards can be changed or modified whenever new scientific evidence or new
technologies become available, or in response to changes in national priorities or
tendencies.
Standards are, in many countries, complemented by codes of practice which provide
guidance for the construction, operation and maintenance and surveillance of
wastewater use schemes. Codes of practice should be prepared according to local
conditions, but the following basic elements are frequently included:
• Crops allowed under crop restriction policies.
• Wastewater treatment and effluent quality.
• Wastewater distribution network.
• Irrigation methods.
• Operation and maintenance.
• Human exposure control.
• Monitoring and surveillance.
• Reporting.
• Charges and fines.
4.3.3 Institutional arrangements
Wastewater-use projects at national level touch on the responsibilities of several
ministries and government agencies. For adequate operation and minimisation of
administrative conflicts, the following ministries should be involved from the planning
phase onwards:
• Ministry of Agriculture and Fisheries: overall project planning; management of state-owned
land; installation and operation of an irrigation infrastructure; agricultural and
aquacultural extension, including training; and control of marketing.
• Ministry of Health: surveillance of effluent quality according to local standards; health
protection and disease surveillance; responsibility for human exposure control, such as
vaccination, control of anaemia and diarrhoeal diseases (see section 4.4); and health
education.
• Ministry of Water Resources: integration of wastewater use projects into overall water
resources planning and management.
• Ministry of Public Works and Water Authorities: wastewater or excreta collection and
treatment.
• Ministry of Finance/Economy/Planning: economic and financial appraisal of projects;
and cost/benefit analysis, financing, criteria for subsidising, etc.
According to national arrangements, other ministries such as those concerned with
environmental protection, land tenure, rural development, co-operatives and women's
affairs may also be involved (Mara and Cairncross, 1989).
Countries starting activities involving wastewater use for the first time can benefit greatly
from the establishment of an executive body, such as an inter-agency technical standing
committee, which is under the aegis of a leading ministry (Agriculture or Water
Resources) and which takes responsibility for sector development, planning and
management. Alternatively, existing organisations may be given responsibility for the
sector (or parts of it), for example a National Irrigation Board might be responsible for
wastewater use in agriculture and a National Fisheries Board might be responsible for
the aquacultural use of excreta and wastewater. Such organisations should then coordinate
a committee of representatives from the different agencies having sectoral
responsibilities. The basic responsibilities of inter-agency committees are:
• Developing a coherent national or regional policy for wastewater use and monitoring its
implementation.
• Defining the division of responsibilities between the respective ministries and agencies
involved and the arrangements for collaboration between them.
• Appraising proposed reuse schemes, particularly from the point of view of public health
and environmental protection.
• Overseeing the promotion and enforcement of national legislation and codes of
practice.
• Developing a rational staff development policy for the sector.
In countries with a regional or federal administration, such arrangements for inter-agency
collaboration are even more important at regional or state level. Whereas the general
framework of waste-use policy and standards may be defined at national level, the
regional body will have to interpret and add to these, taking into account local conditions.
In Mexico, the National Water Commission (CNA), which is attached to the Ministry of
Agriculture and Water Resources, administers the water resources of the country and,
as such, is the institution in charge of the planning, administration and control of all
wastewater use schemes at national level. Other governmental departments, such as
the Ministry of Health, the Ministry of the Environment and the Ministry of Social
Development, also participate according to specific interests within their own field of
activity. At regional level, the State government is also integrated with the administration
of local schemes. In the Mesquital Valley, for example, the State of Hidalgo collaborates
with the local agency of CNA for the operation and maintenance of the irrigation districts
as well as for monitoring, surveillance and enforcement actions. In the Mesquital Valley
there is also a strong participation by the private sector, dealing with the administration
of small irrigation units integrated into co-operative systems.
4.3.4 Economic and financial aspects
Economic appraisal of wastewater irrigation projects should be based on the incremental
costs and benefits accrued from the practice. One procedure adopted in many projects is
to discount marginal benefits and costs to their present value at a real discount rate and to
design the system carefully in order that the benefit/cost ratio is greater than 1. Another
procedure consists of determining the internal rate of return of the project and confirming
that it is competitive (Forero, 1993).
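As an illustration of the first procedure, the sketch below discounts hypothetical incremental benefit and cost streams at an assumed real discount rate and reports the benefit/cost ratio; every figure is invented for the example and would in practice come from the appraisal itself.

```python
def present_value(flows, rate):
    """Discount a stream of annual amounts (years 1..n) to present value at a real rate."""
    return sum(f / (1 + rate) ** (year + 1) for year, f in enumerate(flows))

# Hypothetical incremental figures for a wastewater irrigation scheme, in currency units per year.
years = 10
benefits = [120_000] * years                  # extra crop income, fertiliser and freshwater savings
costs = [400_000] + [40_000] * (years - 1)    # first-year works plus annual operation and maintenance
rate = 0.08                                   # assumed real discount rate

bc_ratio = present_value(benefits, rate) / present_value(costs, rate)
print(f"Benefit/cost ratio at a real discount rate of {rate:.0%}: {bc_ratio:.2f}")
```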
The financial evaluation can be done by comparison with one of the following
hypothetical scenarios, each of which is configured with different benefits and costs:
• No agriculture at all.
• No irrigation at all (rain-fed agriculture).
• Irrigation with water from an alternative source without fertiliser application.
• Irrigation with water from an alternative source with fertiliser application.
Costs. The following costs must be considered in a wastewater irrigation project
(Papadopoulos, 1990):
• Wastewater treatment costs, including land and site preparation, civil engineering
works, system design, materials and equipment.
• Irrigation costs, including water handling, storage, conveyance and distribution.
• On-farm costs, associated with institutional build-up, including facilities and training,
measures for public health protection, hygiene facilities for field workers, and use of
lower value crops associated with specific wastewater application.
• Operation and maintenance costs, including additional energy consumption, labour,
protective clothing for field workers, supplementary fertiliser if needed, management and
overhead costs, and monitoring and testing.
It is of fundamental importance that only marginal costs are taken into account in the
appraisal. For example, only the additional costs required to attain local effluent
standards for reuse should be considered (if they are needed). Costs associated with
treatment systems for environmental protection (which would be implemented anyway),
should not be accounted in the economic evaluation of reuse systems. In the same way,
irrigation and on-farm costs that should be considered are solely the supplementary
costs accrued in association with the use of wastewater rather than any other
conventional source of water.
Benefits. Direct benefits are relatively easy to evaluate. In agriculture or aquaculture
systems they can be directly evaluated, for example in terms of the increase in crop
production and yields, savings in fertiliser costs and saving in freshwater supply. By
contrast, indirect benefits are complex and difficult to quantify properly. Among the many
other benefits that attract decision-making officials who are able to foresee the health
and environmental advantages of wastewater use in agriculture are:
• The improved nutritional status of poor populations through increased food availability.
• The increase in jobs and settlement opportunities.
• The development of new recreation areas.
• Reduced damage to the urban environment.
• Protection of groundwater resources from depletion.
• Protection of freshwater resources against pollution and their conservation.
• Erosion control, reduced desertification, etc.
The indirect benefits are "non-monetary issues" and, unfortunately, they are not taken
into account when performing economic appraisals of projects involving wastewater
use. However, the environmental enhancement provided by wastewater use, particularly
in terms of preservation of water resources, improvement of the health status of poor
populations in developing countries, the possibility of providing a substitute for
freshwater in water-scarce areas, and the incentive provided for the construction of
urban sewerage works, are extremely relevant. They are also sufficiently important to
make the cost/benefit analysis purely subsidiary when taking a decision on the
implementation of wastewater reuse systems, particularly in developing and rapidly
industrialising countries.
Cost recovery. Adopting an adequate policy for the pricing of water is of fundamental
importance in the sustainability of wastewater reuse systems. The incremental cost basis,
which allocates only the marginal costs associated with reuse, seems to be a fair criterion
for adoption in developing countries, where wastewater reuse is assumed to be a social
benefit. A charge in the form of tariffs, or fees, based on the volumes of treated
wastewater distributed, or in terms of hours of distribution, has been used in many
countries. Where the volumes are very large and the distribution network covers a wide
area, as in the Mesquital Valley in Mexico, the charges are made to farmers in relation to
the individual areas being irrigated.
Subsidising reuse systems may be necessary in the early stages of system
implementation, particularly when the associated costs are very large. This would avoid
any discouragement to farmers arising from the permitted use of the treated wastewater.
In order to determine the necessity of governmental support for the cost-recovery
scheme it would be advisable to investigate the willingness and the ability of the farmers
to pay for the services. The easiest way to collect fees is by imposing charges that are
payable just after the harvest season.
4.3.5 Socio-cultural aspects
Public acceptance of the use of wastewater or excreta in agriculture and aquaculture is
influenced by socio-cultural and religious factors. In the Americas, Africa and Europe, for
example, there is a strong objection to the use of excreta as fertiliser, whereas in some
areas of Asia, particularly in China, Japan and Java, the practice is performed regularly
and regarded as economical and ecologically sound.
In most parts of the world, however, there is no cultural objection to the use of
wastewater, particularly if it is treated. Wastewater use is well accepted where other
sources of water are not readily available, or for economic reasons. Wastewater is used
for the irrigation of crops in several Islamic countries provided that the impurities
(najassa) are removed. This results, however, from economic need rather than cultural
preference. According to Koranic edicts, the practice of reuse is accepted religiously
provided impure water is transformed to pure water (tahur) by the following methods
(Farooq and Ansari, 1983): self-purification, addition of pure water in sufficient quantity to
dilute the impurities, or removal of the impurities by the passage of time or by physical
effects.
Due to the wide variability in cultural beliefs, human behaviour and religious dogmas,
acceptance or refusal of the practice of wastewater reuse within a specific culture is not
always applicable everywhere. A complete assessment of local socio-cultural contexts
and religious beliefs is always necessary as a preliminary step to implementing reuse
projects (Cross, 1985).
4.3.6 Monitoring and evaluation
As mentioned before (see section 4.3.3), projects and programmes associated with the
use of wastewater should be led and co-ordinated by inter-agency committees under the
aegis of a leading ministry. This entity should also be in charge of monitoring and
evaluation programmes and should have the legal powers to enforce compliance with
local legislation.
Monitoring activities for wastewater use projects are of two different types. Process
control monitoring is carried out to provide data to support the operation and optimisation
of the system, in order to achieve successful project performance. It includes the
monitoring of treatment plants, water distribution systems, water application equipment,
environmental aspects (such as salinisation, drainage waters, water logging), agricultural
aspects (such as productivity and yield) and health-related problems (such as the
development of disease vectors and health problems associated with the use of
wastewater). In addition to providing data for process control, this level of monitoring
generates information for project revision and updating as well as for further research and
development. Responsibility for process control monitoring belongs to the operating
agency (for example, a state agency or a municipal sewerage board) which is part of the
inter-agency committee.
Compliance monitoring is required to meet regulatory requirements and should not be
performed by the same agency in charge of process control monitoring. This
responsibility should instead be assigned to an enforcement agency that possesses legal
powers to enforce compliance with quality standards, codes of practice and other
pertinent legislation. The responsibility for compliance monitoring is usually granted to
Ministries of Health because health problems are of prime importance for wastewater
use systems (see section 4.4).
A successful monitoring programme should be cost effective (only essential data should
be collected and analysed); it should provide adequate coverage (only representative
sectors of the system should be covered); it must be reliable (representative sampling,
accurate analysis with adequate analytical quality control, appropriate storing, handling
and reporting of information); and it should be timely, in order to provide operators and
decision-making officials with fresh and up-to-date information that allows the application
of prompt remedial measures during critical situations.
4.3.7 Public awareness and participation
To achieve general acceptance of reuse schemes, it is of fundamental importance that
active public involvement is obtained from the planning phase to the full implementation
process. Public involvement starts with early contact with potential users, leading to the
formation of an advisory committee and the holding of public workshops on potential
reuse schemes. The continuous exchange of information between authorities and the
public representatives ensures that the adoption of a specific water reuse programme
will fulfil real user needs and generally-recognised community goals for health, safety,
ecological concerns, programme cost, etc. (Crook et al., 1992).
Acceptance of reuse systems depends on the degree to which the responsible agencies
succeed in providing the concerned public with a clear understanding of the complete
programme; the knowledge of the quality of the treated wastewater and how it is to be
used; confidence in the local management of the public utilities and on the application of
locally accepted technology; assurance that the reuse application being considered will
involve minimal health risks and minimal detrimental effects to the environment; and
assurance, particularly for agricultural uses, of the sustainability of supply and suitability
of the reclaimed wastewater for the intended crops.
Figure 4.3 provides a flow chart for establishing programmes to involve the concerned community with all phases of wastewater use projects, from the planning phase to full
implementation of the project, and Table 4.3 presents a series of tools to address,
educate and inform the public at different levels of involvement.
4.4 Technical aspects of health protection
Health protection in wastewater use projects can be provided by the integrated
application of four major measures: wastewater treatment, crop selection and restriction,
wastewater irrigation techniques and human exposure control.
4.4.1 Wastewater treatment
Wastewater treatment systems were first developed in response to the adverse
conditions caused by the discharge of raw effluents to water bodies. With this approach,
treatment is aimed at the removal of biodegradable organic compounds, suspended and
floatable material, nutrients and pathogens. However, the criteria for wastewater
treatment intended for reuse in irrigation differ considerably. While it is intended that
pathogens are removed to the maximum extent possible, some of the biodegradable
organic matter and most of the nutrients available in the raw wastewater need to be
maintained.
Conventional primary and secondary treatments
Raw domestic wastewater contains between 10⁷ and 10⁹ faecal coliform per 100 ml.
Conventional treatment systems, such as plain sedimentation, bio-filtration, aerated
lagoons and activated sludge, which are designed particularly for removal of organic
matter, are not able to remove pathogens in order to produce an effluent that meets the
WHO guideline for bacterial quality (≤ 1,000 faecal coliform per 100 ml). In the same way,
they are not generally effective in helminth removal. More research and adaptive work is
required to improve the effectiveness of conventional systems in removing helminth eggs.
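The gap between raw sewage quality and the WHO bacterial guideline can be expressed as a required log10 reduction, which makes clear why conventional secondary treatment alone falls short. A minimal calculation:

```python
import math

def required_log_reduction(raw_fc_per_100ml, guideline_fc_per_100ml=1_000):
    """log10 units of faecal coliform removal needed to reach the guideline value."""
    return math.log10(raw_fc_per_100ml / guideline_fc_per_100ml)

for raw in (1e7, 1e9):   # typical range for raw domestic wastewater
    print(f"raw sewage at 10^{math.log10(raw):.0f} per 100 ml -> "
          f"{required_log_reduction(raw):.0f} log units of removal required")
```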
Waste stabilisation ponds
Ponding systems are the preferred technology to provide effluents for reuse in
agriculture and aquaculture, particularly in warm climates and whenever land is available
at reasonable cost (Mara, 1976; Arthur, 1983; Bartone, 1991). Ponding systems
integrating anaerobic, facultative and maturation units, with an overall average retention
time of 10-50 days (depending on temperature), can produce effluents that meet the
WHO guidelines for both bacterial and helminth quality.
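Although no design method is prescribed here, a commonly cited first-order model for faecal coliform die-off in ponds operated in series is attributed to Marais (1974): N_e = N_i / Π(1 + k_B θ_i), with k_B ≈ 2.6 × 1.19^(T-20) per day. The sketch below applies it to a hypothetical anaerobic-facultative-maturation series at 25 °C; the retention times and temperature are assumptions for illustration, and the model says nothing about helminth removal.

```python
def marais_effluent_fc(influent_fc_per_100ml, retention_days_per_pond, temp_c):
    """First-order faecal coliform die-off through ponds in series (after Marais, 1974)."""
    kb = 2.6 * 1.19 ** (temp_c - 20)        # die-off rate constant, per day
    fc = influent_fc_per_100ml
    for theta in retention_days_per_pond:   # each pond treated as completely mixed
        fc /= 1 + kb * theta
    return fc

# Hypothetical series: anaerobic (2 d) + facultative (8 d) + two maturation ponds (5 d each) at 25 degC.
effluent = marais_effluent_fc(1e7, [2, 8, 5, 5], temp_c=25)
print(f"Predicted effluent: about {effluent:.0f} faecal coliform per 100 ml")
```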
The FAO Irrigation and Drainage Paper No. 47 Wastewater
Treatment in Agriculture (FAO, 1985) also provides a good review of wastewater
treatment systems which are recommended for wastewater use schemes. The following
advantages are the reasons why stabilisation ponds are an adequate treatment system
for the conditions prevailing in developing countries:
• Lower construction, operation and maintenance costs.
• No energy requirements.
• High ability to absorb organic and hydraulic loads.
• Ability to treat a wide variety of industrial and agricultural wastes.
Disinfection
Disinfection of wastewater through the application of chlorine has never been completely
successful in practice, due to the high costs involved and the difficulty of maintaining an
adequate, uniform and predictable level of disinfection efficiency. Effluents from well-operated conventional treatment systems, treated with 10-30 mg l⁻¹ of chlorine and a
contact time of 30-60 minutes, provide a good reduction of excreted bacteria, but have
no capacity for removing helminth eggs and protozoa. As a well-designed and well-operated stabilisation pond system will provide an effluent with less than 1,000 faecal coliform
per 100 ml and less than one egg of intestinal nematodes per litre, there is usually no
need for disinfection of pond effluents intended for reuse.
Storage reservoirs
Water demand for irrigation occurs mainly in the dry season or during particular periods
of the year. Wastewater intended for irrigation can, therefore, be stored in large, natural
or specially constructed reservoirs, which provide further natural treatment, particularly in
terms of bacteria and helminth removal. Such reservoirs have been used in Mexico and
Israel (Shuval et al., 1986).
There are insufficient field data available to formulate an adequate design criterion for
storage reservoirs, but pathogen removal depends on retention time and on the
possibility of having the reservoir divided into compartments. The greater the retention
time and the larger the number of compartments in series, the higher the efficiency of
pathogen removal. A design recommendation, based particularly on data available from
natural storage reservoirs operating in the Mesquital Valley, Mexico, is to provide a
minimum hydraulic average retention time of 10 days, and to assume two orders of
magnitude reduction in both faecal coliform and helminth eggs. Thus, the wastewater entering the reservoir should contain no more than 10² eggs per litre and not more than 10⁵ faecal coliform per 100 ml, in order that the WHO guidelines for unrestricted irrigation are attained in the stored water.
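The influent limits quoted above follow from working the assumed two orders of magnitude reduction backwards from the guideline values for unrestricted irrigation (no more than one intestinal nematode egg per litre and 1,000 faecal coliform per 100 ml). A minimal sketch:

```python
def max_influent(guideline_value, log10_reduction=2.0):
    """Highest influent level that still meets the guideline after the assumed reduction."""
    return guideline_value * 10 ** log10_reduction

print(f"Nematode eggs: up to {max_influent(1):.0f} per litre entering the reservoir")
print(f"Faecal coliform: up to {max_influent(1_000):.0f} per 100 ml entering the reservoir")
```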
Tertiary treatment
Tertiary or advanced treatment systems are used to improve the physico-chemical
quality of biological secondary effluents. Several unit operations and unit processes,
such as coagulation-flocculation-settling-sand filtration, nitrification and denitrification,
carbon adsorption, ion exchange and electro-dialysis, can be added to follow secondary
treatment in order to obtain high quality effluents. None of these units are recommended
for use in developing countries when treating wastewater for reuse, due to the high
capital and operational costs involved and the need for highly skilled personnel for
operation and maintenance.
If the objective is to improve effluents of biological plants (particularly in terms of bacteria
and helminths), for the irrigation of crops or for aquaculture, a more appropriate option is
to add one or two "polishing" ponds as a tertiary treatment. If land is not available for that
purpose, horizontal or vertical-flow roughing filtration units (which have been used for
pre-treatment of turbid waters prior to slow-sand filtration) may be considered. These
units, which are low cost and occupy a relatively small area, have been shown to be very
effective for the treatment of secondary effluents and remove a considerable proportion
of intestinal nematodes. Detailed information on the design, operation and removal
efficiencies of roughing filters can be found elsewhere (Wegelin, 1986; Wegelin et al.,
1991).
Sludge treatment
The excess sludge produced by biological treatment plants is valuable as a source of
plant nutrient as well as a soil conditioner. It can also be used in agriculture or to fertilise
aquaculture ponds. However, biological treatment processes concentrate organic and
inorganic contaminants as well as pathogens in the excess sludge. Given the availability
of nutrients and moisture, helminth eggs can survive and remain viable for periods close
to one year. If adequate care is taken during the handling process, raw sludge can be
applied to agricultural land in trenches and covered with a layer of earth. This should be
done before the planting season starts and care should be taken that no tuberous plants,
such as beets or potatoes, are planted along the trenches.
The following treatment methods can be applied to make sludges safe for use in
agriculture or aquaculture:
• Storage for 6-12 months at ambient temperature in hot climates.
• Mesophilic (around 35 °C) anaerobic digestion, which removes 90-95 per cent of total
parasite eggs, but only 30-40 per cent of Ascaris eggs (Gunnerson and Stuckey, 1986).
• Thermophilic (around 55 °C) anaerobic digestion for about 13 days ensures total
inactivation of all pathogens. Continuous reactors can allow pathogens to by-pass the
removal process and therefore the digestion process should be performed under batch
conditions (Strauss, 1985).
• Forced-aeration co-composting of sludge with domestic solid waste or some other
organic bulking agent, such as wood chips, for 30 days at 55-60 °C followed by
maturation for 2-4 months at ambient temperature, will produce a stable, pathogen-free
compost (Obeng and Wright, 1987).
4.4.2 Crop selection
According to the WHO guidelines (see Table 4.2) wastewater of a high microbiological
quality is needed for the irrigation of certain crops, particularly crops eaten uncooked.
Nevertheless, a lower quality is acceptable for irrigation of certain types of crop and
corresponding levels of exposure to the groups at risk, because lower quality waters will
affect consumers and other exposed groups such as field workers and crop handlers.
For example, crops which are normally cooked, such as potatoes, or industrial crops
such as cotton and sisal, do not require a high quality wastewater for irrigation.
Crops can be grouped into two broad categories according to the group of persons likely
to be exposed and the degree to which health protection measures are required:
Category A. Protection required for consumers, agricultural workers and the general
public. This category includes crops likely to be eaten uncooked, spray-irrigated fruits,
sports fields, public parks and lawns.
Category B. Protection required for agricultural workers only, because there would be no
microbiological health risks associated with the consumption of the crops if they were
irrigated with wastewater (there is no risk to consumers because crops in this category
are not eaten raw, or they are processed before they reach the consumer). This category
includes cereal crops, industrial crops, food crops for canning, fodder crops, pastures
and trees. Some vegetable crops may be included in this category if they are not eaten
raw (potatoes and peas), or if they grow well above the ground (chillies, tomatoes and
green beans). In such cases it is necessary to ensure that the crop is not contaminated
by sprinkler irrigation or by falling to the ground, and that contamination of kitchen
utensils by such crops, before cooking, does not give rise to health risks.
The practice of crop restriction implies that the crops that are allowed to be irrigated with
wastewater are restricted to those specified under category B. This category protects
consumers but additional protective measures are necessary for farm workers.
Although it appears simple and straightforward, in practice it is very difficult to implement
and to enforce crop restriction policies. A crop restriction policy is effective for health
protection only if it is fully implemented and enforced. It requires a strong institutional
framework and the capacity to monitor and to control compliance with the established
crop restriction regulations. Farmers should be advised of the importance and necessity
of the restriction policy and be assisted in developing a balanced mix of crops which
makes full use of the available partially treated wastewater. The likelihood of
succeeding is greater where:
• A law-abiding society exists or the restriction policy is strongly enforced.
• A public body controls the allocation of wastewater under a strong central management.
• There is adequate demand for the crops allowed under the policy and they fetch a
reasonable price.
• There is little market pressure in favour of crops in category A.
Crop restriction does not provide health protection in aquaculture schemes, because fish
and macrophytes grown in wastewater or excreta-fertilised ponds are, in many places,
eaten uncooked. An alternative and promising approach, already practised in many parts
of the world, is to grow duckweed (Lemna sp.) in wastewater-fed ponds. The duckweed
is then collected and dried, and fed to high-value fish grown in freshwater ponds. The
same approach can be used to produce fishmeal for animal feed (or for fish food) by
growing the fish to be used for the production of fishmeal in wastewater ponds.
4.4.3 Irrigation techniques
The different methods used by farmers to irrigate crops can be grouped under five
headings (Kandiah, 1994b):
• Flood irrigation: water is applied over the entire field to infiltrate into the soil (e.g. wild
flooding, contour flooding, borders, basins).
• Furrow irrigation: water is applied between ridges (e.g. level and graded furrows,
contour furrows, corrugations). Water reaches the ridge (where the plant roots are
concentrated) by capillary action.
• Sprinkler irrigation: water is applied in the form of a spray and reaches the soil in much
the same way as rain (e.g. portable and solid set sprinklers, travelling sprinklers, spray
guns, centre-pivot systems).
• Sub-surface irrigation: water is applied beneath the root zone in such a manner that it
wets the root zone by capillary rise (e.g. subsurface canals, buried pipes).
• Localised irrigation: water is applied around each plant or group of plants so that only
the root zone gets wet (e.g. drip irrigation, bubblers, micro-sprinklers).
The type of irrigation method selected depends on water supply conditions, climate, soil,
the crops to be grown, the cost of irrigation methods and the ability of the farmer to
manage the system.
There is considerable scope for reducing the negative effects of wastewater use in
irrigation through the selection of appropriate irrigation methods. The choice of method is
governed by the following technical factors:
• Type of crops to be irrigated.
• The wetting of foliage, fruits and aerial parts.
• The distribution of water, salts and contaminants in the soil.
• The ease with which high soil-water potential can be maintained.
• The efficiency of application.
• The potential to contaminate farm workers and the environment.
Border irrigation (as well as basin or any other flood irrigation method) involves complete
coverage of the soil surface with treated wastewater and is not normally an efficient
method of irrigation. This system contaminates root crops and vegetable crops growing near the
ground and, more than any other method, exposes field workers to the pathogen content
of wastewater. Thus, with respect to both health and water conservation, border
irrigation with wastewater is not satisfactory.
Furrow irrigation does not wet the entire soil surface, and can reduce crop contamination,
because plants are grown on ridges. Complete health protection cannot be guaranteed
and the risk of contamination of farm workers is potentially medium to high, depending
on the degree of automation of the process. If the treated wastewater is transported
through pipes and delivered into individual furrows by means of gated pipes, the risk to
irrigation workers is minimal. To avoid surface ponding of stagnant wastewater, which
may induce the development of disease vectors, levelling of the land should be carried
out carefully and appropriate land gradients should be provided.
Sprinkler, or spray, irrigation methods are generally more efficient in water use because
greater uniformity of application can be achieved. However, such overhead irrigation
methods can contaminate ground crops, fruit trees and farm workers. In addition,
pathogens contained in the wastewater aerosol can be transported downwind and create
a health hazard to nearby residents. Generally, mechanised or automated systems have
relatively high capital costs and low labour costs compared with manually-operated
sprinkler systems. Rough levelling of the land is necessary for sprinkler systems in order
to prevent excessive head loss and to achieve uniformity of wetting. Sprinkler systems
are more affected by the quality of the water than surface irrigation systems, primarily as
a result of clogging of the orifices in the sprinkler heads but also due to sediment
accumulation in pipes, valves and distribution systems. There is also the potential for
leaf burn and phytotoxicity if the wastewater is saline and contains excessive toxic
elements. Secondary treatment systems that meet the WHO microbiological guidelines
have generally been found to produce an effluent suitable for distribution through
sprinklers, provided that the wastewater is not too saline. Further precautionary
measures, such as treatment with sand filters or micro-strainers and enlargement of the
nozzle orifice to diameters not less than 5 mm, are often adopted.
Localised irrigation, particularly when the soil surface is covered with plastic sheeting or
other mulch, uses effluent more efficiently. It produces higher crop yields and certainly
provides the greatest degree of health protection to farm workers and consumers.
However, trickle and drip irrigation systems are expensive and require a high quality of
treated wastewater in order to prevent clogging of the orifices through which water is
released into the soil. A relatively new technique called "bubbler irrigation", which was
developed for localised irrigation of tree crops, avoids the need for small orifices. This
system therefore requires less treatment of the wastewater, but needs careful setting up for
successful application.
When compared with other systems, the main advantages of trickle irrigation are:
• Increased crop growth and yield achieved by optimising the water, nutrients and air
regimes in the root zone.
• High irrigation efficiency because there is no canopy interception, wind drift or
conveyance losses, and minimal drainage loss.
• Minimal contact between farm workers and wastewater.
• Low energy requirements because the trickle system requires a water pressure of only
100-300 kPa (1-3 bar).
• Low labour requirements because the trickle system can be easily automated, even to
allow combined irrigation and fertilisation.
In addition to the high capital costs of trickle irrigation systems, another limiting factor in
their use is that they are mostly suited to the irrigation of crops planted in rows.
Relocation of subsurface systems can be prohibitively expensive.
Special field management practices that may be required when wastewater irrigation is
performed include pre-planting irrigation, blending of wastewater with other water
supplies, and alternating treated wastewater with other sources of supply.
The amount of wastewater to be applied depends on the rate of evapo-transpiration from
the plant surface, which is determined by climatic factors and can therefore be estimated
with reasonable accuracy, using meteorological data. An extensive review of this subject
is available in FAO (1984).
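The arithmetic behind this estimate is simple once reference evapotranspiration is known. The following is a minimal sketch in Python, assuming the standard FAO crop-coefficient approach (ETc = Kc x ET0); the crop coefficient, reference evapotranspiration, rainfall and efficiency values are illustrative only and are not taken from FAO (1984).

    # Illustrative estimate of the wastewater irrigation requirement from
    # meteorological data, assuming the FAO crop-coefficient approach.
    # All numerical values are hypothetical examples, not design figures.

    def irrigation_requirement_mm(et0_mm_day, kc, effective_rain_mm_day,
                                  application_efficiency):
        """Gross daily irrigation depth (mm/day) needed to replace crop water use."""
        etc = kc * et0_mm_day                             # crop evapotranspiration, ETc = Kc * ET0
        net_need = max(etc - effective_rain_mm_day, 0.0)  # net irrigation requirement
        return net_need / application_efficiency          # allow for field application losses

    # Example: a mid-season field crop, reference ET0 of 6 mm/day, no effective
    # rainfall, furrow irrigation at roughly 60% application efficiency.
    depth = irrigation_requirement_mm(et0_mm_day=6.0, kc=1.15,
                                      effective_rain_mm_day=0.0,
                                      application_efficiency=0.6)
    volume_m3_per_ha = depth * 10                         # 1 mm over 1 ha = 10 m3
    print(f"Gross irrigation need: {depth:.1f} mm/day ({volume_m3_per_ha:.0f} m3/ha per day)")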
4.4.4 Human exposure control
The groups of people most at risk from the use of
wastewater in agriculture are agricultural field workers and their families, crop handlers,
consumers of crops, meat and milk originating from wastewater-irrigated fields, and
those living near wastewater-irrigated fields. The basic methods for eliminating or
minimising exposure depend on the target groups. Agricultural field workers and crop
handlers have higher potential risks mainly associated with parasitic infections.
Protection can be achieved by:
• The use of appropriate footwear to reduce hookworm infection.
• The use of gloves (particularly crop handlers).
• Health education.
• Personal hygiene.
• Immunisation against typhoid fever and hepatitis A and B.
• Regular chemotherapy for intense nematode infections in children and the control of
anaemia.
• Provision of adequate medical facilities to treat diarrhoeal diseases.
Protection of consumers can be achieved by:
• Cooking of vegetables and meat and boiling milk.
• High standards of personal and food hygiene.
• Health education campaigns.
• Meat inspection, where there is risk of tapeworm infections.
• Ceasing the application of wastes at least two weeks before cattle are allowed to graze
(where there is a risk of bovine cysticercosis).
• Ceasing the irrigation of fruit trees two weeks before the fruits are picked, and not
allowing fruits to be picked up from the ground.
• Provision of information on the location of wastewater-irrigated fields together with the
posting of warning notices along the edges of the fields.
There is no epidemiological evidence that aerosols from sprinklers cause significant risks
of pathogen contamination to people living near wastewater irrigated fields. However, in
order to allow a reasonable margin of safety and to minimise the nuisance caused by
odours, a minimum distance of 100 m should be kept between sprinkler-irrigated fields
and houses and roads.
4.4.5 Integrated measures for health protection
To planners and decision makers, wastewater treatment appears to be a more
straightforward and "visible" measure for health protection, second only to crop
restriction. Both measures, however, are relatively difficult to implement fully. The first is
limited by costs and operational problems and the second by lack of adequate markets
for allowable crops or by legal and institutional constraints. The application of single,
isolated measures will not, however, provide full protection to the groups at risk and may
entail high costs of implementation and maintenance. Crop restriction, for example, if
applied alone provides protection to consumers of crops but not to field workers.
To analyse the various measures in an integrated fashion aimed at the optimisation of a
health protection scheme, a generalised model has been proposed (Mara and
Cairncross, 1989; WHO, 1989). This model was conceived to help in decision making,
by revealing the range of options for protecting agricultural workers and the crop-
consuming public, and by allowing flexibility in responses to different situations. Each
situation can be considered separately and the most appropriate option chosen after
taking into account economic, cultural and technical factors.
The graphical conception of the model is shown in Figure 4.4. It was assumed that
pathogens flow to the centre of the circle going through the five concentric rings
representing wastewater or excreta, irrigated field or wastewater-fed fishpond, crops,
field workers and consumers of crops. The thick black ring represents a barrier beyond
which pathogens should not go if the health of the groups at risk is to be protected. The
level of contamination of wastewater, field or crop, or the level of risk to consumers or
workers, is indicated by the intensity of the shading. White areas in the three outer bands
indicate zero or no significant level of contamination and, in the inner rings, they indicate
a presumed absence of risk to human health, thereby indicating that the strategy will
lead to the safe use of wastewater. If no protective measures are taken, both field
workers and consumers will be at the highest risk of contamination. Assuming that a
policy of crop restriction is enforced (regime A in Figure 4.4) consumers will be safe but
workers will still be at high risk. Regime B assumes that application of wastewater is
made through sub-surface or localised irrigation, thereby avoiding crop contamination and,
consequently, maintaining both workers and consumers virtually free of contamination.
If human exposure control is the single protective measure taken, both consumers and
field workers will still be subjected to the same level of risk because such measures are
rarely fully effective in practice. Regime D assumes partial treatment of wastewater
through ponding (D-I) or conventional systems (D-II). Stabilisation ponds with an
average retention time of 8-10 days are able to remove a significant proportion of
helminth eggs, thus providing protection to field workers. However, the reduction of
bacteria present is not sufficient to meet WHO guidelines and hence the risk to
consumers remains high. Since conventional treatment systems are not efficient at
helminth removal there will be some remaining risk for both consumers and field workers.
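The scale of this problem can be illustrated with a rough calculation. The sketch below assumes the widely used first-order (Marais) model for faecal coliform die-off in ponds treated as completely mixed reactors in series; the influent count, temperature and retention time are illustrative only, and actual designs should be based on local data.

    # Rough illustration of why a pond with a short retention time can remove
    # helminth eggs (by sedimentation) yet leave faecal coliform (FC) numbers
    # above the guideline level for unrestricted irrigation.
    # Assumes first-order (Marais) die-off in completely mixed ponds in series;
    # all input values are illustrative.

    def fc_effluent(influent_fc_per_100ml, retention_days, temperature_c):
        """Predicted effluent FC count for ponds in series."""
        k_t = 2.6 * 1.19 ** (temperature_c - 20)  # die-off rate constant (per day)
        n = influent_fc_per_100ml
        for t in retention_days:
            n /= 1 + k_t * t                      # removal in each pond
        return n

    # Example: raw wastewater at 1e7 FC/100 ml, a single pond with 10 days
    # retention at 25 deg C.
    effluent = fc_effluent(1e7, [10], 25)
    print(f"Predicted effluent quality: {effluent:.1e} FC/100 ml")
    # About 1.6e5 FC/100 ml -- well above the 1000 FC/100 ml level for
    # unrestricted irrigation, consistent with the residual risk to consumers.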
The regimes E, F and G are examples of the many possible associations of protective
measures. Regime E integrates partial wastewater treatment with crop restriction, thus
providing a large margin of protection to consumers of crops. However, full protection of
field workers can be achieved only if the treatment is made through well-designed
systems of stabilisation ponds. In regime F, human exposure control is integrated with
partial treatment which may lead to complete protection of workers but some low level of
risk remaining to consumers of the crops. The association of crop restriction with human
exposure control (regime G) provides full protection to consumers but some risk remains
to field workers. Finally, regime H provides full wastewater treatment allowing for
complete protection to both field workers and consumers.
The feasibility and efficacy of any combination of protective measures will depend on
several local factors which must be considered carefully before a final choice is made.
Some factors to be considered are the availability of institutional, human and financial
resources, the existing technological level (engineering and agronomic practices), socio-
cultural aspects, and the prevalent pattern of excreta-related diseases.
4.5 Conclusions and recommendations
The incorporation of wastewater use planning into national water resource and
agricultural planning is important, especially where water shortages exist. This is not only
to protect sources of high quality waters but also to minimise wastewater treatment costs,
safeguard public health and to obtain the maximum agricultural and aquacultural benefit
from the nutrients that wastewater contains. Wastewater use may well help reduce costs,
especially if it is envisaged before new treatment works are built, because the standards
of effluents required for various types of use may result in costs lower than those for
normal environmental protection. It also provides the possibility of recovering the
resources invested in sewerage and represents a very efficient way of postponing
investment of new resources in water supply (Laugeri, 1989).
The use of wastewater has been practised in many parts of the world for centuries.
Whenever water of good quality is not available or is difficult to obtain, low quality waters
such as brackish waters, wastewater or drainage waters are spontaneously used,
particularly for agricultural or aquacultural purposes. Unfortunately, this form of
unplanned and, in many instances unconscious, reuse is performed without any
consideration of adequate health safeguards, environmentally sound practices or basic
agronomic and on-farm principles.
Authorities, particularly the Ministries of Health and Agriculture, should investigate
current wastewater reuse practices and take gradual steps for upgrading health and
agronomic practices. This preliminary survey provides the basis for the clear definition of
reuse priorities and the establishment of national strategies for reuse.
The implementation of an inter-sectoral institutional framework is the next step that
should be taken. This entity should be able to deal with technological, health and
environmental, economic and financial, and socio-cultural issues. It should also assign
responsibilities and should create capacity for operation and maintenance of treatment,
distribution and irrigation systems, as well as for monitoring, surveillance and the
enforcement of effluent standards and codes of practice.
In countries with little or no experience of planned reuse, it is advisable to implement
and operate a pilot project. This experimental unit should include treatment,
distribution and irrigation systems and provide the basis for the establishment of
national standards and codes of practice, which can then be fully adapted to local
conditions and skills. Once the experimental phase has been completed, the system can
be transformed into a demonstration and training project able to disseminate the local
experience to neighbouring countries.

5.0 Guidelines for wastewater reuse in agriculture and aquaculture: recommended revisions based on new research evidence
1 Introduction
There has been an increasing interest in reuse of wastewater in agriculture over the last few
decades due to increased demand for freshwater. Population growth, increased per capita
use of water, the demands of industry and of the agricultural sector all put pressure on water
resources. Treatment of wastewater provides an effluent of sufficient quality that it should be
put to beneficial use and not wasted (Asano, 1998). The reuse of wastewater has been
successful for irrigation of a wide array of crops, and increases in crop yields of 10-30%
have been reported (cited in Asano, 1998). In addition, the reuse of treated wastewater for
irrigation and industrial purposes can be used as a strategy to release freshwater for domestic
use, and to improve the quality of river waters used for abstraction of drinking water (by
reducing disposal of effluent into rivers). Wastewater is used extensively for irrigation in
certain countries: for example, 67% of the total effluent in Israel, 25% in India and 24% in
South Africa is reused for irrigation through direct planning, though unplanned reuse is
considerably greater.
During the last decade, there has been growing concern that the world is moving towards a
water crisis (Falkenmark, 1989). There is increasing water scarcity in dry climate regions, for
example, in Africa and South Asia, and there are major political implications of water scarcity
in some regions e.g. Middle East (Murakami, 1995). Water quantity and quality issues are
both of concern. Recycling of wastewater is one of the main options when looking for new
sources of water in water scarce regions. The guidelines or standards required to remove
health risks from the use of wastewater and the amount and type of wastewater treatment
needed to meet the guidelines are both contentious issues. The cost of treating wastewater
to high microbiological standards can be so prohibitive that use of untreated wastewater is
allowed to occur unregulated.
In the last ten years, new epidemiological, microbiological and risk assessment studies have
evaluated the validity of the WHO (1989) Guidelines and other standards, and explored other
issues of concern e.g. the need for a viral guideline. This report will review evidence
supporting current guidelines for wastewater reuse and the main studies that have
contributed to an evaluation of the WHO (1989) Guidelines. Particular emphasis will be given
to studies funded by DFID; these include epidemiological studies done by London School of
Hygiene and Tropical Medicine in collaboration with the Instituto Nacional de la Nutricion in
Mexico City and microbiological studies done by Leeds University with colleagues in
Laboratorio Nacional de Engenharia Civil, Lisboa, Portugal, and Estacao Experimental de
Tratamentos Biologicos de Esgotos Sanitarios, Universidade Federal da Paraiba,
Campina Grande, Paraiba, Brasil. The implications of the studies for the setting of
international guidelines for the use of wastewater in agriculture and aquaculture will be
considered, along with the wastewater treatment and other health protection measures
needed to achieve the guidelines. In a companion document, the implications of the studies
for the evaluation of country standards in Mexico will be considered; this case study will be of
interest to other Less Developed Countries considering the formulation or review of
standards for wastewater use.
2 Agricultural Reuse
2.1 Background
Guidelines for the reuse of effluents, considering methods of wastewater treatment and
health safeguards were developed by the World Health Organisation in 1971 (WHO, 1971).
These focused on defining appropriate levels of treatment needed for different types of
reuse. It was considered that available treatment technologies and use of chlorination could
achieve a bacteriological quality of 100 coliform organisms per 100ml, and this would give
rise to only a limited health risk if used for the unrestricted irrigation of food crops. When
these guidelines were revised (WHO, 1989), more epidemiological and microbiological
evidence concerning health risks related to use of untreated and treated wastewater was
available, and the guidelines were modified accordingly.
The new guidelines have been controversial, particularly relaxation of the guideline for
unrestricted irrigation to 1000 faecal coliforms* per 100ml (geometric mean). Criticisms have
included the use of ‘partial’ epidemiological studies in developing countries, ignoring the
acquired immunity of the population involved, and ignoring the health risk assessment
methodology used as a foundation for developing drinking water quality standards (Shelef,
1991). Concern has been expressed over the lack of sensitivity of epidemiological methods
to detect disease transmission that may not lead to apparent infection in exposed individuals
but to secondary transmission from them to cause illness in susceptible individuals (Rose,
1986). Most regulatory agencies in the USA have chosen not to use epidemiological studies
as the basis for determining water quality standards (Crook, 1998). The transmission of viral
infections through treated wastewater use in industrialised countries has been a particular
issue, also related to the relative inefficiency of disinfection processes in removing viruses in
comparison with bacteria. Concern has also been expressed over the transmission of
emerging parasite infections such as Cryptosporidium, Giardia and Cyclospora which are not
easily removed by conventional treatment processes. On the other hand, many countries
have welcomed the guidance from WHO, and standards in many countries have been based
on WHO (1989) Guidelines e.g. France, Mexico.
2.2 Current WHO Guidelines and standards for selected countries
WHO (1989) Guidelines for the safe use of wastewater in agriculture took into account all
available epidemiological and microbiological data. The faecal coliform guideline (e.g.
≤1000 FC/100ml for food crops eaten raw) was intended to protect against risks from
bacterial infections, and the newly introduced intestinal nematode egg guideline was
intended to protect against helminth infections (the eggs also serving as indicator
organisms for all of the large settleable pathogens, including amoebic cysts). The exposed
group that each guideline was intended to protect, and the wastewater treatment expected
to achieve the required microbiological guideline, were clearly stated. Waste stabilisation
ponds were advocated as being both effective at the removal of pathogens and the most
cost-effective treatment technology in many circumstances.
In contrast, US-EPA (1992) has recommended the use of much stricter guidelines for
wastewater use in the USA. The elements of the guidelines applicable to reuse in
agriculture are summarised in Table 2. For irrigation of crops likely to be eaten uncooked,
no detectable faecal coliforms/100ml are allowed (compared to ≤1000 FC/100ml for WHO),
and for irrigation of commercially processed crops, fodder crops, etc, the guideline is
≤200 FC/100ml (where only a nematode egg guideline is set by WHO). No nematode egg
guideline is set (… & Olivieri, 1998). California has some of the strictest standards,
requiring <2.2 total coliforms/100ml for irrigation of food crops (through secondary
treatment followed by filtration and disinfection), and <23 TC/100ml for irrigation of
pasture and landscape impoundments (through secondary treatment and disinfection)
(Crook, 1998). Standards in use in many countries (e.g. Israel, Oman) have been
influenced by standards in the US.
* The term ‘faecal coliforms’ is used herein as it is the term most commonly used and
understood in the wastewater reuse sector. It may be interpreted as being broadly
equivalent to the term ‘thermotolerant coliforms’. A preferred usage would be
‘thermotolerant coliforms/Escherichia coli’; this would allow the eventual use of E. coli as
the preferred (and exclusively faecal) coliform bacterium.
2.3 Summary of evidence supporting WHO (1989) Guidelines
Shuval et al (1986) reviewed all the available epidemiological evidence on the health
effects of agricultural use of wastewater. Their main conclusions were reported in the
technical report of the WHO Guidelines (1989). They are summarised here, with some
supporting details.
2.3.1 Effects of use of untreated wastewater
2.3.1.1 Effects on farm workers or wastewater treatment plant workers
Use of untreated wastewater for crop irrigation causes significant excess infection with
intestinal nematodes in farm workers, in areas where such infections are endemic. In
India, sewage farm workers had a significant excess of Ascaris and hookworm infections,
compared with farm workers irrigating with clean water (Krishnamoorthi et al, 1973). The
intensity of the infections (number of worms per person) and the effects of infection were
also higher, e.g. the sewage farm workers suffered more from anaemia, one of the
symptoms of severe hookworm infection. There is some evidence that sewer workers may
be at increased risk of protozoan infections such as amoebiasis and giardiasis (Dolby et
al, 1980, Knobloch et al, 1983) but other studies have not found such an effect (Clark et
al,1984). There are no reliable data on the impact of amoebiasis on farm workers in contact
with untreated wastewater.
Cholera can be transmitted to farm workers if they irrigate with raw wastewater coming
from an urban area where a cholera epidemic is occurring. This was the case in the
outbreak of cholera in Jerusalem in 1970, where cholera is not normally endemic and the
level of immunity to cholera was low (Fattal et al, 1986a).
There is limited evidence of increased bacterial and viral infections among wastewater
irrigation workers or wastewater treatment plant workers exposed to untreated wastewater
or wastewater aerosols. Sewage treatment plant workers from 3 cities in the USA did not
have excess gastrointestinal illness (compared to controls) but inexperienced workers had
more gastrointestinal symptoms than experienced workers or controls (municipal
workers); however, these were mild and transitory, and there was no consistent evidence
of increased parasitic, bacterial or viral infections from stool examinations or antibody
surveys (Clark et al, 1981). In a follow up study, there were no excess seroconversions to
Norwalk virus or rotavirus in the inexperienced workers with gastroenteritis, but
inexperienced workers had higher rates of antibody to Norwalk virus (Clark et al, 1985).

2.3.1.2 Effects on consumers of vegetable crops
Irrigation of edible crops with untreated wastewater can result in the transmission of
intestinal nematode infections and bacterial infections. The transmission of Ascaris and
Trichuris infections through consumption of wastewater irrigated salad crops has been
demonstrated in Egypt (Khalil, 1931) and Jerusalem (Fattal et al, 1994), where the
infections fell to very low levels when wastewater irrigation was stopped.
Transmission of cholera can occur to consumers of vegetable crops irrigated with
untreated wastewater, as during the outbreak of cholera in Jerusalem in 1970. It appears
that typhoid can also be transmitted through this route, as seen in Santiago, Chile, where
the excess of typhoid fever in Santiago compared with the rest of Chile, and in the
summer irrigation months, has been attributed to irrigation with river water containing
untreated wastewater (Ferrecio et al, 1984, Shuval et al, 1986). In both cases,
transmission has occurred in communities with relatively high sanitation levels where
transmission through common routes such as contaminated drinking water and poor
personal hygiene has been diminished substantially.
Cattle grazing on pasture irrigated with raw wastewater can become heavily infected with
the larval stage of the tapeworm Taenia saginata (Cysticercus bovis), as has occurred in
Australia. There is no epidemiological evidence of human infection through the
consumption of raw or undercooked meat from such cattle, but the risk of infection
through this route probably exists.
Many outbreaks of enteric infection have been associated with wastewater contaminated
foods, but of the very few which were associated with wastewater irrigation, untreated
wastewater was used in all but two cases (Bryan, 1977).
2.3.2 Effects of use of treated wastewater
2.3.2.1 Effects on farm workers or nearby populations
There is very limited risk of infection among workers using partially treated wastewater for
irrigation. At Muskegon, USA, workers exposed to partially treated wastewater (from
aeration basins and storage lagoons) had no increase in clinical illness or infection with
enteroviruses. Only highly exposed workers (nozzle cleaners) had excess antibodies to
one enterovirus but no seroconversion and no excess in clinical illness (Linneman et al,
1984).
Sprinkler irrigation with partially treated wastewater can create aerosols containing small
numbers of excreted viruses and bacteria but there is no conclusive evidence of disease
transmission through this route. Several studies in Kibbutzim in Israel have addressed this
question. Here, wastewater is partially treated in oxidation ponds before use for irrigation.
The first study (Katzenelson et al, 1976) suggested increases in salmonellosis, shigellosis,
typhoid fever and infectious hepatitis in farmers and their families working on or living near
fields sprinkler irrigated with effluent from oxidation ponds (retention 5-7 days), but the
study was methodologically flawed. The second study (Fattal et al, 1986b) found a twofold
excess risk of clinical ’enteric’ disease in young children (0-4 years) living within 600-
1000m from sprinkler irrigated fields, but this was in the summer irrigation months only,
with no excess illness found on an annual basis. The third study (Fattal et al, 1986c and
Shuval et al, 1989) found that episodes of enteric disease were similar in Kibbutzim most
exposed to treated wastewater aerosols (sprinkler irrigation within 300-600m of residential
areas) and those not exposed to wastewater in any form. The wastewater was partially
treated in ponds with 5-10 days retention, reaching a quality of 10^4-10^5 coliforms/100ml.
No excess of enteric disease was seen in wastewater contact workers or their families, nor
in the general population living near the fields. This prospective study is
considered to be conclusive, having a superior epidemiological design.
However, it does seem that transmission of enteric viral pathogens to populations living
near fields sprinkler irrigated with partially treated wastewater can occur under some
circumstances, though this may not result in significant excess clinical infection. In a
seroepidemiological study associated with the third Israeli study (Fattal et al, 1986c and
Shuval et al, 1989) the results suggested that a non-endemic strain of ECHO 4 virus,
which was causing a national epidemic in urban areas, was transmitted to rural
communities through aerosols produced by sprinkler irrigation of wastewater, though no
excess clinical disease was detected (Fattal et al,1987). The fact that no similar excess of
the other viral antibodies studied was found suggests that exposure to wastewater
aerosols does not lead to an excess in enteroviral infection under non-epidemic conditions.
2.3.2.2 Effects on consumers of vegetable crops
When vegetables are irrigated with treated wastewater rather than raw wastewater, there
is some evidence from Germany that transmission of Ascaris infection is drastically
reduced. In Berlin in 1949, where wastewater was treated using sedimentation and
biological oxidation prior to irrigation, rates of Ascaris infection were very low, whereas in
Darmstadt where untreated wastewater was used to irrigate vegetable and salad crops,
the majority of the population was infected (Baumhogger,1949 and Krey,1949). Rates
were highest in the suburb where wastewater irrigation was practiced, suggesting farm
workers and their families were infected more through direct contact than consumption.
2.4 New evidence of health risks from epidemiological and microbiological studies in Mexico
2.4.1 Study area
Raw wastewater coming from Mexico City to the Mezquital valley, Hidalgo, is used to irrigate
a restricted range of crops, mainly cereal and fodder crops through flood irrigation
techniques. Some of the wastewater passes through storage reservoirs and the quality of the
wastewater is improved before use; this is equivalent to partial treatment. The effluent from
the first reservoir (retention time 1-7 months, depending on the time of year) meets the WHO
Guideline for restricted irrigation (category B), even though a small amount of raw
wastewater enters the effluent prior to irrigation (quality 10^5 FC/100ml and <1 nematode
egg/litre). Effluent from the second reservoir is retained for an additional 2-6 months (>3
months of combined retention), and the quality improved further (quality 10^3 - 10^4 FC/100ml
and no detectable nematode eggs). Part of the effluent from the first reservoir enters the river
and is abstracted downstream to irrigate a large area of vegetable and salad crops, many of
which are eaten raw; the river water is essentially partially treated wastewater (quality 10^4
FC/100ml). These crops are sold in the local markets and eaten by the rural populations in
local villages, including those near the second reservoir. In a nearby area, vegetables are
irrigated with borehole water.
2.4.2 Results: risks to farm workers related to restricted irrigation and effect of
wastewater treatment
2.4.2.1 Exposure to raw wastewater
Farm workers and their children in contact with raw wastewater through irrigation or play
have a significantly higher prevalence of Ascaris infection than those in a control group,
who practice rain-fed agriculture (Fig 1a). The excess infection is greater in children than
in adults (Blumenthal et al, 1996, Peasey, 2000). Young children (aged 1-4 yrs) also have
a significantly higher rate of diarrhoeal disease (Fig 1b) (Cifuentes et al, 1993).
2.4.2.2 Exposure to partially treated wastewater
Contact with wastewater which has been retained in one reservoir before use (<1
nematode egg/l and 10^5 FC/100ml) results in excess Ascaris infection in children, but not in
adults, where the prevalence was reduced to a similar level to the control group (Fig 1c)
(Blumenthal et al, 1996). Children aged 5-14 years also have significantly higher rates of
diarrhoeal disease (Fig 1d) (Cifuentes et al, 1993, Blumenthal et al, 2000a).
When wastewater has been retained in two reservoirs in series before use (no nematode
eggs detected, geometric mean 4x10^3 FC/100ml, maximum 10^5 FC/100ml) direct contact
results in very little excess Ascaris infection in any age group (Fig 1e) (Cifuentes et al,
1994, Cifuentes, 1998). However, there is a significant excess of diarrhoeal disease in
children aged 5-14 years (Fig 1f), and a four-fold increase in seroresponse to Human
Norwalk-like Virus/Mexico in adults with high levels of contact with the effluent from the
second reservoir (Annex A, Table 1c) compared with those with no contact with this
effluent (Blumenthal et al, 1998, Blumenthal et al, 2000b).
Retention of water in two reservoirs in series, producing water of average quality 10^3
FC/100ml and no detectable nematode eggs, is therefore adequate to protect the children
of farmworkers from Ascaris infection but not against increased diarrhoeal disease.
2.4.3 Risks to consumers related to unrestricted irrigation
Risks from bacterial and viral infections related to the consumption of specific vegetables
(ie. courgette, cauliflower, cabbage, carrots, green tomato, red tomato, onion, chilli,
lettuce, radish, cucumber and coriander) and to total consumption of raw vegetables
irrigated with partially treated wastewater (average quality 10^4 FC/100ml) were
investigated. Consumers (of all ages) had no excess of diarrhoeal disease,
and no excess infection as measured by serological response to Human Norwalk-like
Virus/ Mexico (Hu/NLV/Mx), or Enterotoxigenic Escherichia coli (ETEC) related to their
total consumption of raw vegetables, that is, the number of raw vegetables eaten each
week (Blumenthal et al, 1998, Blumenthal et al, 2000b).
However, there was an excess of diarrhoeal disease in those in the exposed area who ate
increased amounts of onion compared with those who ate very little (Fig 2a). The effect
was seen particularly in adults and children under 5 years of age. There were also higher
levels of serological response to Hu/NLV/Mx in school-aged children who ate green
tomato (Fig 2b) and in adults who ate salsa (containing green tomato). The increase in
diarrhoeal disease associated with eating increased amounts of raw chillies (Fig 2c) was
not related to use of partially-treated wastewater as the chillies eaten by the study
population were grown in raw wastewater. Only the risks from eating onion and green
tomato can be associated with using partially treated wastewater in irrigation. In the final
analysis, consumption of onion, or green tomato, once a week or more was associated
with at least a two-fold increase in diarrhoea or Hu/NLV/MX respectively. Enteroviruses
were found on onions at harvest, giving support to this epidemiological evidence. The
effects described were seen after allowance was made for other risk factors for diarrhoeal
disease. No excess serological response to enterotoxigenic E. coli was related to raw
vegetable consumption. Consumption of vegetable crops irrigated with water of quality
10^4 FC/100ml therefore causes a significant risk of enteric infection in consumers.
2.5 New evidence of health risks from studies in other sites
In this section, new studies that shed light on the appropriateness of the WHO (1989)
Guidelines are reviewed (evidence from studies that were not fully published at the time of
the WHO Scientific Group meeting in 1987 is included).
2.5.1 Effects on farm workers or wastewater treatment plant workers
Evidence of the beneficial effect of wastewater treatment, and particularly of the positive
effect of wastewater storage in reservoirs, was found in the Lubbock Infection Surveillance
Study, a study of farm workers and residents living near the Lubbock land treatment
system in Texas, USA. Here, a rural community was exposed to sprinkler application of
partially treated wastewater from a much larger urban community (Camann et al, 1986).
For the first year, mainly primary effluent and trickling filter effluent was used to irrigate
cereals and industrial crops (quality 10^6 FC/100ml and virus 100-1000 pfu/l), and in the
second year, the effluent was stored in reservoirs before use (quality 10^3-10^4 FC/100ml
and virus <10pfu/l) (Camann et al, 1988).
There was no clear association between self-reported clinical illness episodes and
exposure to wastewater (Camann et al, 1986). However, in the data on seroconversion to
viral infections, a high degree of aerosol exposure was related to a slightly higher rate of
viral infections (risk ratio of 1.5-1.8). A dose-response relationship was observed over the
four irrigation seasons; the episodes of viral infection associated with wastewater
exposure mainly occurred in the first year, before the reservoirs had come into use. More
supporting evidence was found for the role of the wastewater aerosol route of exposure
than for direct contact with wastewater. Of the many infection episodes observed, few
were conclusively associated with wastewater exposure and none resulted in serious
illness. However, the authors could not determine whether wastewater exposure or
identified alternative explanations were the actual risk factors for the enteric viral
infections. Analysis of clinical viral infection data (from faecal specimens) also showed
that aerosol exposure (high) was associated with new viral infections in the summer of the
first year of irrigation, but the effect was of borderline significance (p=0.06)(Camann and
Moore, 1987). However, when allowance was made for alternative risk factors, eating at
local restaurants was identified as an alternative explanation for the viral infection
episodes. In a specific study of rotavirus infection, wastewater spray irrigation had no
detectable effect on the incidence of infection (Ward et al, 1989). Altogether, the results
do suggest that aerosol exposure to wastewater of quality 10^3-10^4 FC/100ml does not
result in excess infection with enteric viruses. There is some evidence that exposure to
wastewater of quality 10^6 FC/100ml results in excess viral infection (but not disease) but
this is not conclusive.
A new study of wastewater treatment plant workers (Khuder et al, 1998) suggests that
they have a significantly higher prevalence of gastroenteritis and gastrointestinal
symptoms than controls (college maintenance and oil refinery workers). There was no
association between extent of exposure and prevalence of symptoms. However, these
results are not reliable since workers were asked about symptoms over the previous 12
months (retrospectively). The previous studies (Clark et al, 1981 and 1985, see Annex A)
are more credible, involving ongoing collection of illness information and human samples
(prospectively).
2.5.2 Effects on consumers of vegetable crops
No further epidemiological studies have been located which assess the risk of enteric
infections to consumers of vegetable crops irrigated with treated wastewater.
2.5.2.1 Evidence from microbiological studies of crops irrigated with treated
wastewater
Studies on bacterial contamination of vegetable crops
Work in Portugal during 1985 - 1989 (Vaz da Costa Vargas et al., 1996) explored the
effect of the irrigation of salad crops with treated wastewater of various qualities. When
poor quality trickling filter effluent (10^6 FC per 100 ml) was used to spray-irrigate lettuces,
the initial level of indicator bacteria on the lettuces (10^6 FC/100g) reflected the
bacteriological quality of the irrigation water and exceeded the ICMSF (1974)
recommendations for foodstuffs eaten raw (<10^5 FC per 100 g fresh weight, preferably
<10^3 FC per 100 g). Once irrigation ceased, FC levels fell within 7 to 12 days to levels
similar to those seen on lettuces irrigated with fresh water. Final levels were below the
recommendations of ICMSF (1974) and the quality was better than that of lettuces on sale
in the local markets (10^6 FC per 100 g) irrigated with surface waters.
In studies of drip and furrow irrigation of lettuces and radishes with waste stabilisation
pond effluent which had a FC count slightly higher than the WHO recommendation of
1000 per 100 ml (1700-5000 FC per 100 ml geometric mean count), crop contamination
levels varied considerably. Under dry weather conditions they were, at worst, of the orders
of 10^3 and 10^4 Escherichia coli per 100g for radishes and lettuces respectively, and
salmonellae were always absent. The quality was better than that of locally sold lettuces
(which had a geometric mean FC count, based on 172 samples, of 1 x 10^6 per 100g) and
fell within the recommendations of ICMSF (1974). However, when rainfall occurred, E. coli
numbers increased and salmonellae were isolated from lettuce surfaces (Bastos and
Mara, 1995).
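Because guideline and monitoring values in this field are expressed as geometric means, compliance is judged on a log scale rather than as an arithmetic average of sample counts. A minimal Python sketch, using hypothetical weekly sample counts, is shown below.

    # Geometric mean of faecal coliform counts, as used when comparing
    # monitoring results against the WHO guideline of 1000 FC/100 ml for
    # unrestricted irrigation. The sample values below are hypothetical.
    import math

    def geometric_mean(counts_per_100ml):
        logs = [math.log10(c) for c in counts_per_100ml]
        return 10 ** (sum(logs) / len(logs))

    samples = [400, 2500, 900, 1300, 600, 3100]   # weekly FC counts per 100 ml
    gm = geometric_mean(samples)
    verdict = "meets" if gm <= 1000 else "exceeds"
    print(f"Geometric mean: {gm:.0f} FC/100 ml ({verdict} the 1000 FC/100 ml guideline)")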
Experiments in the UK assessed the effect of irrigation with final effluent from a conventional
treatment plant (10^5-10^6 FC/100ml). When furrow irrigation was used, the quality of
lettuces in covered plots improved to acceptable levels (10^3 FC/100g) within 3 days of
cessation of irrigation, and the lettuces were E. coli-free after 9 days. However, results
indicated that crops in uncovered plots were recontaminated with bacteria from
contaminated soils after significant rainfall, and regrowth of E. coli on crop surfaces was
observed. Radishes were prone to low-level, long-term contamination with E. coli (up to 20 days).
These studies show that irrigating salad crops with effluent from conventional treatment
plants can result in unacceptable levels of bacterial contamination of crops (unless a
period of cessation of irrigation occurs before harvest), whereas use of better quality
effluents from waste stabilisation ponds results in acceptable levels of bacterial
contamination.
Studies in Israel have investigated the use of effluent from wastewater storage reservoirs
in unrestricted irrigation of vegetable and salad crops (Armon et al, 1994). When
vegetables were irrigated with poor quality effluent (up to 10^7 FC/100ml of eluant solution),
high levels of faecal indicator bacteria were detected (up to 10^5 FC/100ml). However,
when vegetables were irrigated with better quality effluent (0-200 FC/100ml) from a
storage reservoir with a lower organic loading, faecal coliform levels on crops were
generally very low, less than 10^3 FC/100ml and often lower (the data presented do not
allow for greater specificity about the levels), with a maximum of 10^4 FC/100ml. The
authors concluded that it is necessary to treat wastewater effluents to an extent that no
residual contaminants are detected on the irrigated crops, but could alternatively be
interpreted as showing that use of treated wastewater meeting WHO (1989) Guideline
levels results in acceptable levels (ICMSF, 1974) of bacterial contamination on crops.
Studies on contamination of vegetable crops with nematode eggs
Experimental studies in NE Brazil and Leeds, UK, investigated the consumer risk from
nematode infection (Ascaris lumbricoides and Ascaridia galli respectively) from
wastewater-irrigated lettuces (Ayres et al., 1992; Stott et al., 1994). In Brazil, when raw
wastewater (>100 nematode eggs/l) was used to spray-irrigate lettuce, harvested crops
were contaminated with mean values of up to 60 eggs/plant after 5 weeks of irrigation.
Irrigation with effluent from the anaerobic pond of a series of waste stabilisation ponds
(>10 eggs/l) reduced levels of nematode contamination on lettuce to around 0.6
eggs/plant at harvest and produced a better quality of lettuce than that sold in the local
market. When facultative pond effluent (<0.5 eggs/l) was used for irrigation, no eggs were
detected on crops. Lettuces irrigated with maturation pond effluent (0 eggs/l) were also
not contaminated, despite the uncovered plants growing in heavily contaminated soil
containing >1200 Ascaris eggs/100g, indicating that neither irrigation nor rainfall resulted
in recontamination of crops.
In the UK trials, spray-irrigation of lettuce with poor quality wastewater (50 nematode
eggs/l) resulted in contamination of around 2.2 eggs/plant at harvest. Improving the
wastewater quality to 10 eggs/l resulted in reduced levels of nematode contamination on
lettuce to a maximum of 1.5 eggs/plant. When wastewater at the WHO quality of ≤1
egg/l was used for irrigation, very slight contamination was found on a few plants at
around 0.3 eggs/plant. However, no transmission of A. galli infection was found from
wastewater irrigated crops using animal studies although the infective dose is very low at
less than 5 embryonated eggs.
The results collectively show that irrigation with wastewater of WHO (1989) Guideline
quality resulted in no contamination of lettuce at harvest (0.5 eggs/l) or very slight
contamination on a few plants (6%) with eggs that were either degenerate or not infective.
However, a few nematode eggs on harvested plants were viable, but not yet embryonated
(20% A. lumbricoides on >100 eggs/l irrigated crops; <0.1 A. galli eggs/plant irrigated with
1-10 eggs/l) and so crops with a long shelf life can represent a potential risk to consumers
as these eggs might have time to become infective.
2.5.2.2 Evidence from risk assessment studies
Asano and Sakaji (1990) used the risk assessment methodology described by Haas
(1983) to estimate the risks of consumption of market-garden produce irrigated with water
containing 1 enteric virus in 40 litres (the Arizona standard). An individual’s annual risk of
infection was between 10^-4 (ie. one case per 10,000 persons) and 10^-8 (though when
100ml of such water is accidentally ingested the risk of infection is between 10^-3 and 10^-7).
Asano et al (1992) estimated the risk of infection with 3 enteric viruses (poliovirus 1 and 3,
echovirus 12) related to use of chlorinated tertiary effluents and four scenarios of
exposure to wastewater; (i) irrigation of market-garden produce, (ii) irrigation of golf
courses, (iii) recreational uses of water and (iv) groundwater recharge. They used
estimates of the amount of water ingested via the various scenarios, for example, 1
ml/day for 2 days per week all year by golfers handling and cleaning golf balls, 10ml per
day for consumers of food crops. Allowance was made for viral reduction in the
environment, for example, through stopping irrigation of crops 2 weeks before harvest.
The annual risk of infection related to consuming irrigated market-garden produce was
between 10^-6 and 10^-11 when the effluent contained one viral unit in 100 litres, and between
10^-4 and 10^-9 when water with a maximum concentration of 111 viral units/100 litres was
used. The risk from the irrigation of golf courses is higher, between 10^-2 and 10^-5. Even
when unchlorinated secondary effluents were investigated (data taken from plants in
California), risk assessment showed that for food crop irrigation and groundwater
recharge, the annual risk of viral infection was less than 10^-4 more than 95% of the time
(Tanaka et al,1998). For golf courses, the risks are at acceptable levels when chlorinated
secondary effluent (3.9 log removal) is used (10^-4 - 10^-6) but not when it is not chlorinated
(10^-1 - 10^-2). The estimated risks are higher when treated wastewater is used in recreational
impoundments used for swimming.
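The annual figures quoted in these studies come from compounding a small per-exposure infection risk over the number of exposures in a year. The sketch below shows that arithmetic in Python, using an exponential dose-response model with an illustrative dose-response parameter; it does not reproduce the exact doses or models used by Asano et al (1992) or Tanaka et al (1998).

    # Compounding a per-exposure infection risk into an annual risk, as done in
    # the wastewater risk assessment studies discussed above. The dose-response
    # parameter, virus concentration and ingested volume are illustrative only.
    import math

    def per_event_risk(dose, r):
        """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
        return 1 - math.exp(-r * dose)

    def annual_risk(p_event, events_per_year):
        """Probability of at least one infection over a year of independent exposures."""
        return 1 - (1 - p_event) ** events_per_year

    # Example: 1 enteric virus per 40 litres of irrigation water, 10 ml ingested
    # per day on food crops, 365 exposure days, r = 0.01 (hypothetical).
    viruses_per_litre = 1 / 40
    dose_per_day = viruses_per_litre * 0.010      # viruses ingested per day
    p_day = per_event_risk(dose_per_day, r=0.01)
    print(f"Annual infection risk: {annual_risk(p_day, 365):.1e}")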
More recently, Shuval et al (1997) used the drinking water model for infection risk
developed by Haas et al. (1993) and combined this with laboratory data on the degree of
viral contamination of vegetables irrigated with wastewaters of various qualities. The
annual risk of becoming infected with hepatitis A from eating cucumbers which had been
irrigated with untreated wastewater was 10^-3, but when the cucumbers were irrigated with
treated wastewaters containing ≤1000 FC per 100 ml the risk was 10^-6 - 10^-7; for rotavirus
infection the risk was 10^-5 - 10^-6. Data from waste stabilisation ponds in northeast Brazil
(Oragui et al, 1987) suggest that rotavirus numbers are likely to be less than 30 per 100
litres when the faecal coliform content is below 10^4 per 100ml. The results of these studies
are therefore consistent with those obtained by Asano et al (1992).
2.6 Discussion and implications of results for international guidelines
and policy concerning wastewater use in agriculture
2.6.1 Implications of the results for international guidelines for safe use of
wastewater in agriculture
2.6.1.1 Approaches to setting microbiological guidelines
There are currently several alternative approaches to establishing microbiological guidelines for wastewater reuse, which have different outcomes as their objective:
I The absence of faecal indicator organisms in the wastewater,
II No measurable excess cases in the exposed population, and
III A model-generated risk which is below a defined acceptable risk
Their assumptions appear to be as follows:
I The absence of faecal indicator organisms in the wastewater
In this approach, there should be no detectable indicators of faecal pollution in the
wastewater. This approach is based on the premise that it is impractical to monitor
reclaimed water for all the pathogenic microorganisms of concern, and that use of
surrogate parameters, such as faecal indicator organisms, is acceptable. Total and faecal
coliforms* are the most commonly used indicator organisms and these are often used in
conjunction with specified wastewater treatment requirements. Where this occurs, the
assumption is made that the need for expensive and time-consuming monitoring of
treated water for pathogenic microorganisms is eliminated. In practice, this approach has
led to guidelines which require zero faecal coliforms per 100ml for the irrigation of crops
to be eaten raw, in association with a requirement for secondary treatment, filtration and
disinfection. USEPA/USAID (1992) have taken this approach and consequently have
recommended very strict guidelines for wastewater use in the USA.
II No measurable excess cases in the exposed population – the epidemiological
perspective
The objective here is that there should be no actual risk of infection - that is, no
measurable excess risk of infection attributable to wastewater reuse, based on scientific
evidence, especially from epidemiological studies. This approach was adopted in setting
the 1989 WHO guidelines (Table 1), where epidemiological evidence was used (where
available) and supported by information from microbiological studies. Allowance can be
made for local epidemiological, socio-cultural and environmental factors and the
guidelines modified accordingly.
III. A model-generated risk which is below a defined acceptable risk
In this approach an acceptable risk of infection is first defined, as in the case of microbial
contamination of drinking water supplies, for example, for which the USEPA has set
an annual risk of 10^-4 per person (Haas et al, 1993). Once the acceptable annual risk has
been established by the regulator, a quantitative microbial risk assessment (QMRA)
model is then used to generate an estimated annual risk of infection based on exposure
assessment (including data on the concentrations of microorganisms in wastewater, the
quantity of treated wastewater remaining on crop surfaces following irrigation, pathogen-
indicator ratios, and pathogen die-off between food crop harvest and consumption) and
“dose-response” data (i.e. data from human infection trials on pathogen dose and
resulting infection, if any). A microbiological quality guideline would then be set so that
the QMRA model produces an estimate of annual risk which is below the regulator’s
acceptable annual risk. This risk assessment approach is especially powerful when the
acceptable risk is below the level that can be measured in most epidemiological studies
(unless extremely large populations are studied).
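In practice the calculation is run backwards, from the regulator's acceptable annual risk to a water quality value. The sketch below illustrates the idea under assumed inputs (an exponential dose-response model and illustrative exposure and pathogen-to-indicator figures); it is not the QMRA procedure of any particular agency.

    # Working backwards from an acceptable annual infection risk to an indicator
    # guideline, as in approach III. The dose-response parameter, exposure
    # assumptions and pathogen:faecal coliform ratio are illustrative only.
    import math

    ACCEPTABLE_ANNUAL_RISK = 1e-4    # e.g. the USEPA drinking-water benchmark
    EXPOSURES_PER_YEAR = 150         # days on which irrigated produce is eaten
    LITRES_PER_EXPOSURE = 0.010      # treated wastewater ingested per exposure (10 ml)
    R = 0.01                         # exponential dose-response parameter (hypothetical)
    PATHOGENS_PER_FC = 1e-5          # assumed pathogens per faecal coliform

    # Step 1: the per-exposure risk that keeps the compounded annual risk acceptable.
    p_event = 1 - (1 - ACCEPTABLE_ANNUAL_RISK) ** (1 / EXPOSURES_PER_YEAR)

    # Step 2: invert the dose-response model, P = 1 - exp(-R * dose).
    max_dose = -math.log(1 - p_event) / R          # allowable pathogens per exposure

    # Step 3: convert the allowable dose into an indicator guideline.
    max_pathogens_per_litre = max_dose / LITRES_PER_EXPOSURE
    max_fc_per_100ml = max_pathogens_per_litre / PATHOGENS_PER_FC / 10
    print(f"Implied guideline: {max_fc_per_100ml:.0f} FC/100 ml")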
These three approaches are considered further elsewhere (Blumenthal et al, 2000c).
In our assessment of the implications of the evidence on the health risks from wastewater
use for international guidelines, we combine approach II with approach III. We use
evidence from studies since 1989 (including evidence from studies that were not fully
published at the time of the WHO Scientific Group meeting in 1987) to evaluate the 1989
WHO guidelines, and propose alternative guidelines where the evidence supports a
change (Table 2). We use empirical epidemiological evidence where it is available, as
these studies measure the result of real exposures that occur over time, and do not
depend on the use of estimates of mean daily microbial doses and dose-response
analyses based on experiments with healthy volunteers where the data are extrapolated
to provide low dose estimates. Epidemiological studies are particularly useful in highly
endemic areas for enteric diseases where risks of infection are high enough to be easily
measurable with current techniques. Where the epidemiological evidence is incomplete
we have used evidence from microbiological studies. Quantified microbial risk assessment
studies are particularly useful in areas with a low endemicity of enteric diseases, where
risks of infection are low, and where regular monitoring of pathogens in wastewater occurs
and produces good data sets for use in exposure assessment. The evidence is strongest
where both approaches lead to the same conclusions. If different results are obtained,
further analysis of the studies should help to identify weaknesses and parts of the
methodology that need improvement. We believe that this is a rational approach, which
is likely to be cost-effective in most settings. It does not achieve a ‘no risk’ scenario, and a
low level of risk may remain (below that which is detectable through most epidemiological
studies). Using the ‘no risk’ approach (approach I), however, and setting a standard of 0
FC/100ml results in very high additional costs per case of infectious disease averted
compared with a standard of 1000 FC/100ml (section 2.5, Shuval et al, 1997). Individual
countries may wish to spend money on reducing risks to these very low levels, but it is not
necessary for international guidelines to encourage other countries to do so.
A fourth approach may be considered in future in the setting of country standards. This is
necessary because the implications of wastewater reuse for many infections are not best
addressed through using infection as the assessment criteria. Using the ‘burden of
disease’ associated with wastewater reuse would be a better assessment criterion. For
example, for diarrhoeal disease this would take into account the incidence of diarrhoeal
disease, the number of hospitalisations and the mortality associated with wastewater
reuse. However, at present there are no data on which to base such an assessment.
Instead, a ‘disease control’ approach could be adopted (see section 2.6.1.3).
2.6.1.2 Proposed revised guidelines based on using the epidemiological perspective
In light of the epidemiological and microbiological studies reviewed above, it is possible to
evaluate the WHO (1989) Guidelines, and propose alternative guidelines where the
evidence supports a change.
Unrestricted irrigation - Category A
The results of studies of consumer risks do not provide any evidence to suggest a need to
change the WHO faecal coliform guideline of ≤10^3 FC/100ml for irrigation of vegetable and
salad crops eaten uncooked (Category A1). Epidemiological studies in an area in Mexico
where enteric infections are endemic suggest that risks of enteric infections are
significant, but low, when the guideline is exceeded by a factor of 10 (Blumenthal et al,
1998, Blumenthal et al, 2000b). There was no risk associated with the total consumption
of raw vegetables but consumption of onions, eaten by the majority of the study
population, was associated with at least a two-fold increase in diarrhoeal disease.
Microbiological studies also suggest that a guideline of ≤10^3 FC/100ml is appropriate in
hot climates, where crops irrigated with water just exceeding the guideline value fell within
the quality recommendations of ICMSF (1974) (Vaz da Costa Vargas et al, 1996).
Recontamination of crops in uncovered plots after significant rainfall, however, suggests
that a stricter guideline may be necessary in countries where significant rainfall occurs
during the growing season. However, risk assessment studies in Israel (Shuval et al,
1997) have indicated that the annual risk of enteric virus and bacterial infection from
eating lettuce irrigated with water meeting the WHO Guideline level ranges from 10⁻⁵
(rotavirus) and 10⁻⁶ (hepatitis A virus) to 10⁻⁹ (cholera). Data from risk assessment in the
USA (Asano et al, 1992) support these conclusions, finding the annual risk of infection
from enteric viruses was between 10⁻⁴ and 10⁻⁹ when water with a maximum viral
concentration of 111 units per 100 litres was used to irrigate market garden produce. Data
from waste stabilisation ponds in northeast Brazil (Vaz da Costas Vargas et al, 1996)
suggest that rotavirus numbers are likely to be less than 30 per 100 litres when the faecal
coliform content is below 10⁴ per 100ml. However, other enteric viruses such as
adenovirus may significantly outnumber rotaviruses and enteroviruses, possibly by an
order of magnitude (30). It can therefore be extrapolated from these data that use of water
meeting the WHO guideline level of 1000 FC per 100 ml is likely to produce an annual risk
of viral infection of less than 10⁻⁴. Since the US microbial standards for drinking water are
based on the criterion that human populations should not be subjected to a risk of
infection by enteric disease greater than 10⁻⁴, the WHO (1989) wastewater reuse
guidelines would appear to offer a similar level of protection. Furthermore, additional
treatment to an FC level more stringent than 1000 per 100 ml is not cost-effective; for
example, Shuval et al. (1997) showed that the cost per case of hepatitis A avoided by
irrigation with zero FC per 100 ml (as recommended by USEPA and USAID, 1992), rather
than with 1000 FC per 100 ml, was of the order of US$3-30 million.
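The order-of-magnitude annual risk figures quoted above come from quantitative microbial risk assessment calculations of the general form sketched below. This is a minimal Python illustration only; the viral concentration, ingestion volume, number of exposures and dose-response parameter are hypothetical placeholders chosen for the example, not the values used by Shuval et al. (1997) or Asano et al. (1992), and a beta-Poisson model is often substituted for the exponential model shown here.

    import math

    def exponential_dose_response(dose, r):
        """Single-exposure infection probability under the exponential model: P = 1 - exp(-r * dose)."""
        return 1.0 - math.exp(-r * dose)

    def annual_risk(p_single, exposures_per_year):
        """Combine independent single-exposure risks over a year: P_a = 1 - (1 - p)^n."""
        return 1.0 - (1.0 - p_single) ** exposures_per_year

    # Illustrative (hypothetical) inputs -- not the values used in the published studies.
    virus_per_100ml = 0.001          # assumed viral concentration in the irrigation water
    ml_ingested_per_serving = 10.0   # assumed residual irrigation water ingested with a salad portion
    r = 0.01                         # assumed exponential dose-response parameter

    dose = virus_per_100ml * (ml_ingested_per_serving / 100.0)   # organisms ingested per serving
    p_serving = exponential_dose_response(dose, r)
    p_year = annual_risk(p_serving, exposures_per_year=150)      # assumed servings per year
    print(f"per-serving risk ~ {p_serving:.1e}, annual risk ~ {p_year:.1e}")

The resulting annual risk is the quantity compared against benchmarks such as the 10⁻⁴ annual risk of enteric infection underlying the US drinking water standards mentioned above.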
The nematode egg guideline of ≤1 nematode egg/litre appears to be adequate to protect
consumers of cultivated vegetables spray-irrigated with effluent of consistent quality and
at high temperatures, but not necessarily consumers of vegetables surface-irrigated with
such effluent at lower temperatures. Studies have shown that lettuces spray-irrigated with
water of ≤1 nematode egg/litre (mean maximum temperatures exceeding 28°C) were not
contaminated (when quality <0.5 eggs/litre) or only lightly contaminated at harvest, and
any eggs present were not infective (Annex B; Ayres et al, 1992; Stott et al, 1994).
However, since a few eggs on the harvested plants were viable, crops with a long shelf
life represent a potential risk to consumers. Epidemiological studies of wastewater-related
risk factors for Ascaris infection in central Mexico showed an increase in Ascaris
infection among men consuming crops surface-irrigated with raw wastewater
compared to those who did not eat such crops, but there was no increased risk
when crops were irrigated with sedimented wastewater (from a reservoir) with ≤1
nematode egg per litre. However, children under 15 years who ate crops from local fields
had a two-fold increase in Ascaris infection compared with those who did not eat such
crops, when either raw wastewater or sedimented wastewater was used in irrigation
(Peasey, 2000). The increased risk in these circumstances may have been influenced by
the irrigation method (surface, rather than spray), and the lower mean temperature (due to
high altitude and semi-desert conditions). It would be sensible, therefore, to adopt a
stricter guideline of ≤0.1 eggs per litre to prevent transmission of Ascaris infection in
circumstances where conditions favour the survival of helminth eggs (lower temperatures,
surface irrigation), and also to allow for the risks to farmworkers involved in cultivating the
vegetable crops (see below). In situations where crops with a short shelf life are grown in
hot and dry conditions, and where workers are adequately protected from infection
through direct contact with wastewater or soil, the original guideline of ≤1 nematode egg
per litre would appear to be adequate. However, use of the revised guideline may be
considered prudent even in these circumstances, adding a greater margin of safety.
Restricted irrigation - Category B
In the WHO (1989) guidelines there was no faecal coliform guideline for restricted
irrigation due to the lack of evidence of a risk of bacterial and viral infections to farm
workers and nearby residents. Recent evidence of enteric infections in farming families in
direct contact with partially treated wastewater (Mexico) and in populations living near
sprinkler-irrigated fields (USA) when the water quality exceeds 10⁶ FC/100ml suggests
that a faecal coliform guideline should now be added. Data from Israel (Shuval et al, 1989)
and Lubbock, USA (Camann et al, 1986) on situations where spray/sprinkler irrigation is
used suggest that a level of ≤10⁵ FC/100ml would protect both farm workers and nearby
population groups from infection via direct contact or wastewater aerosols (Category B1).
However, data from Mexico on a situation where flood irrigation is used showed that there
was a significant excess of diarrhoeal disease in children aged 5-14 years, and a four-fold
increase in seroresponse to Human Norwalk-like Virus/Mexico in adults with high levels of
contact with the effluent from two sequential storage reservoirs (containing partially
treated wastewater with 10³-10⁴ FC per 100ml) compared with those with no contact with
this effluent (Blumenthal et al, 1998, Blumenthal et al, 2000b). There was also an excess
of diarrhoeal disease in adults (OR=1.5) but this did not reach statistical significance (p=0.12),
probably due to the limited sample size. A reduced guideline level of ≤10³ FC per 100ml
would be safer where adult farmworkers are engaged in flood or furrow irrigation
(Category B2 in Table 2) and where children are regularly exposed. This would also help to
reduce the risks from epidemic infections which could be transmitted to effluent-irrigating
communities from an outbreak in the source community (Fattal et al, 1987). Where there are
insufficient resources to provide treatment to reach this stricter guideline, a guideline of
≤10⁵ FC per 100ml should be supplemented by other health protection measures (for example,
health education concerning avoidance of direct contact with wastewater, and the importance
of handwashing with soap after wastewater contact).
The nematode egg guideline of ≤1 nematode egg/litre does not appear to sufficiently protect
farm workers and their families, especially children (under 15 years of age). This is
particularly the case where wastewater treatment systems produce an effluent of variable
quality, where the partially treated wastewater may be contaminated with small quantities of
raw wastewater, and where children of farm workers come into direct contact with the
effluent. In such a situation in Mexico, children in contact with effluent from a storage
reservoir which met the WHO Guideline (even though it was contaminated with small
quantities of raw wastewater) had increased prevalence and intensity of Ascaris infection.
When the effluent had been stored in two reservoirs and no nematode eggs were
detectable, there was very little excess Ascaris infection in any age group (Cifuentes,
1998, Blumenthal et al, 2000a). Similar situations would arise where raw wastewater is
allowed to bypass conventional treatment plants, especially during periods of peak flow,
allowing untreated wastewater containing nematode eggs (where nematode infections are
endemic) into the effluent that is reused for agriculture. Since this is often the case in
reality, a stricter guideline of ≤0.1 eggs per litre is required for restricted irrigation where
children are exposed to irrigation water (Category B3). This would also be useful in
circumstances where stable treatment systems, such as waste stabilisation ponds, are in
use, and workers may come into contact with the soil, since eggs can accumulate in the soil.
Notes to Table 2:
a In specific cases, local epidemiological, sociocultural and environmental factors should be
taken into account and the guidelines modified accordingly.
b Ascaris and Trichuris species and hookworms; the guideline is also intended to protect
against risks from parasitic protozoa.
c During the irrigation season (if the wastewater is treated in WSP or WSTR which have been
designed to achieve these egg numbers, then routine effluent quality monitoring is not required).
d During the irrigation season (faecal coliform counts should preferably be done weekly, but
at least monthly).
e A more stringent guideline (≤200 faecal coliforms per 100 ml) is appropriate for public
lawns, such as hotel lawns, with which the public may come into direct contact.
f This guideline can be increased to ≤1 egg per litre if (i) conditions are hot and dry and
surface irrigation is not used, or (ii) wastewater treatment is supplemented with anthelmintic
chemotherapy campaigns in areas of wastewater re-use.
g In the case of fruit trees, irrigation should cease two weeks before fruit is picked, and no
fruit should be picked off the ground. Spray/sprinkler irrigation should not be used.
2.6.1.3 Implications of a disease control approach in the setting of country standards
Where economic constraints limit the level of wastewater treatment that can be provided,
a country may choose disease control as the objective: a certain risk of infection is
accepted and the aim is to prevent unacceptable levels of disease, or to protect only the
most vulnerable groups (e.g. young children). The implications of the studies for this
approach are less clear, due to the paucity of disease data, and are discussed below.
Unrestricted irrigation
If the objective is to prevent clinical enteric disease (and not enteric infection), the studies
in Mexico suggest that it may be possible to set a faecal coliform guideline for unrestricted
irrigation of 10⁴ FC/100ml in areas where enteric infections are endemic, immunity to viral
infections exists and crops are eaten locally. At this level, the serological studies in Mexico
suggest there was transmission of viral infection but do not necessarily reflect a significant
increase of disease. Risks of diarrhoeal disease were related to the consumption of onion
and green tomato but not of other crops. If the guideline were set at this lower level, crop
restrictions could be added, e.g. to prevent the growing of onions. However, it may be prudent
to keep the guideline at 10³ FC/100ml in order to (i) prevent infections causing national
epidemics from being transmitted to rural communities through sprinkler irrigation of
partially-treated wastewater (Fattal et al, 1987), and (ii) allow crops to be grown which may
be exported to countries where enteric infections are not highly endemic.
A nematode egg guideline for unrestricted irrigation of ≤1 nematode egg/litre may be
adequate where crops with a short shelf life are grown (eg. salad crops) and wild plants
are not eaten. The very few viable eggs that are likely to be present would have less
chance of developing to infectivity in these circumstances. It is possible that a relaxed
guideline of 10 eggs/litre may be considered to be adequate if the goal is to prevent high
intensities of helminth infections (worm load) rather than infection itself.
Restricted irrigation
In highly endemic areas, if the objective were to prevent enteric disease in vulnerable
children (under 5s) and not necessarily in older children, a faecal coliform guideline of
≤10⁵ FC/100ml would be adequate. Contact with wastewater of 10⁵ FC/100ml led to
increased diarrhoeal disease in older children but not in young children (Cifuentes, 1995,
Table A1 section b, Annex A). School-aged children involved in farming activities would
need to be protected using other measures, and children discouraged from playing in the
fields. Where there is a difficulty in doing this, a relaxation of the guideline would not be
recommended.
Where the goal is to prevent high intensities of helminth infections, it is conceivable that a
less strict nematode egg guideline and additional health protection measures could be
used. In Mexico, the current standard for restricted irrigation is 5 eggs/litre, designed to be
achievable by conventional treatment plants. There is currently no epidemiological
evidence, however, on which to base such a relaxed guideline. In fact, data from Mexico
suggest that intensities of infection in school-aged children are as high when they are
exposed to wastewater of ≤1 egg/litre as to raw wastewater (Table A2, Annex A),
suggesting that a stricter standard is necessary if treatment is the only health protection
measure used. It is possible that a relaxed guideline could be used if it is supplemented
by other measures, such as twice-yearly chemotherapy for school-aged children (who
have the highest intensity infections) (see section 2.6). This would only be suitable in
countries where anti-parasite campaigns exist and can be successfully extended to cover
areas where wastewater is used in agriculture. Disease control would be dependent on
chemotherapy regularly reducing intensities of infection, which can easily return to pretreatment
levels after 6 months.
2.6.1.4 Risks from enteric viruses and parasitic protozoa – are specific guidelines necessary?
Protection against risks from enteric viruses through a viral guideline
The faecal coliform guideline in most guidelines and standards for wastewater reuse is
intended to address risks of enteric infections due to both bacterial and viral pathogens
yet it may not be adequate to protect against viral infections because (i) conventional
treatment processes involving disinfection are much less efficient in removing viruses than
indicator bacteria – and, as improved (molecular) techniques for viral detection have
become available, this becomes even more apparent (Blackmer et al. 2000), and (ii)
median infectious doses for enteric viruses are very low (below 50 infectious particles) in
comparison with those for most enteric bacteria (Haas et al. 1993, and Schwartzbrod,
1995). A further point is that wastewater virology is a rapidly expanding research area,
with the range of routinely considered faecal viruses being extended to include, for
example, adenoviruses and astroviruses (Chaperon et al. 2000), and these may survive
longer in treated wastewaters than enteroviruses.
There are few data available on the risks of viral infection from either direct contact or crop
consumption. Nevertheless, the following currently available findings have implications for
the evaluation of current guidelines with respect to viral risks:
(1) Use of risk assessment approaches has indicated that (a) when the concentration of
viruses (poliovirus 3, echovirus 12 and poliovirus 1) in chlorinated tertiary effluent was
a maximum of 111 pfu per 100 ml, the estimated annual risk of enteroviral infection
from spray irrigation of food crops was 10⁻⁴–10⁻⁷ (Asano et al. 1992); (b) use of
chlorinated secondary effluents (3.9 log virus removal) to irrigate food crops resulted
in an estimated annual risk of enteroviral infection to consumers of 10⁻⁷–10⁻⁹ and
even the use of unchlorinated secondary effluents resulted in an estimated annual risk
of enteroviral infection of 10⁻³–10⁻⁵ (Tanaka et al. 1998); and (c) use of effluent of
1000 FC per 100 ml to irrigate salad crops resulted in an order-of-magnitude estimate
for the annual risk of viral infection of less than 10⁻⁴ (Shuval et al. 1997). However,
these studies are recognised to have deficiencies (see Section 1, part III) compared to
more advanced QMRA techniques.
(2) Epidemiological studies have indicated that (a) when there was spray irrigation with
effluent containing fewer than 10⁵ FC per 100 ml, there was no significant risk of
enteroviral infection to the surrounding population (Shuval et al. 1989, and Camann et
al. 1986); and (b) when there was surface irrigation with effluent of 10³-10⁴ FC per
100ml, there was a significant risk of infection with Norwalk-like virus (Hu/NLV/MX) to
farmworkers with high levels of contact with the wastewater (Blumenthal et al. 2000b);
however (c) when there was surface irrigation with effluent of 10⁴ FC per 100ml there
was little risk of infection with Hu/NLV/MX associated with consumption of vegetable
crops eaten raw (Blumenthal et al, 2000c).
Taken together, these results suggest that (i) use of tertiary treatment plus disinfection
may not be needed to protect against viral risks from consumption of vegetable crops
eaten raw, and that (ii) the faecal coliform guideline of ≤1000 FC per 100ml is adequate
and no extra viral guideline is currently justified.
Adequacy of protection against risks from parasitic protozoa by the nematode egg guideline
There is increasing concern about the role of wastewater in the environmental
transmission of protozoan pathogens such as Giardia, Cryptosporidium and Cyclospora.
The 1989 WHO guidelines assumed that if helminth egg levels were reduced to the level
of the helminth egg guideline, then other “easily settlable” pathogens such as protozoan
(oo)cysts would also be reduced to levels that did not cause excess infection in exposed
populations. However, recent studies have shown that the removal of helminth eggs does
not correlate with that of protozoan (oo)cysts (Stott et al. 1997, Grimason et al. 1993 and
Alouini, 1998). There is evidence that protozoan (oo)cysts are not effectively removed by
conventional wastewater treatment processes, with reported efficiencies varying from 26-
100% (Bukhari et al. 1997, Sykora et al. 1990, and Robertson). In addition, the infectious
dose can be low; human feeding studies have shown that the median infectious dose for
Giardia is between 10 and 100 cysts, and for Cryptosporidium between 30 and 1000
oocysts (Cooper and Olivieri, 1998).
Most of the evidence on water-related outbreaks of enteric protozoan diseases indicates
that they are associated with ingestion of contaminated drinking water, immersion in
recreational waters (Craun, 1990, Fricker and Crabb, 1998, and Ortega et al. 1998) and
consumption of contaminated foods (Smith, 1993, and Rose and Slifko, 1999). There are
few data on the importance of wastewater reuse in agriculture, particularly the use of
treated wastewater, in the transmission of parasitic protozoan infection, and these other
routes of transmission and poor domestic hygiene are probably more important, especially
in developing countries. Even though oocysts of both Cryptosporidium parvum and
Cyclospora cayetanensis have been detected on market vegetables in an endemic area
(Ortega et al. 1997), there is no epidemiological evidence to implicate direct use of
wastewater used for irrigation as a risk factor for either pathogen.
Epidemiological studies done in Mexico have shown that there is a small risk of amoebic
infection (OR=1.3) in those in contact with untreated wastewater but not in those in
contact with settled wastewater retained in two reservoirs before use, which meets the
WHO nematode egg guideline (Cifuentes, 1995). Initial analysis indicated that there was
no risk of Giardia intestinalis in agricultural workers and their families related to contact
with raw wastewater, but a small risk related to contact with wastewater retained in two
reservoirs (Cifuentes et al. 1991/2). However, when these data were analysed further,
allowing for the effect of other transmission routes, the risk related to contact with the
reservoir effluent did not remain significant (Cifuentes et al, 2000). A study in India has
also shown that there was no significant risk of Giardia infection in agricultural workers
using untreated or treated wastewater, compared to controls (Sehgal and Mahanjan,
1991).
These studies indicate that there is at present no evidence to suggest that use of treated
wastewater meeting the WHO nematode egg guideline for irrigation results in an
increased risk of parasitic protozoan infection, or in a risk that exceeds acceptable levels,
and therefore no evidence to support the establishment of a separate guideline for
protozoa. However, it may be that risks from protozoan parasites are of greater public
health importance in industrialised countries than the risks from helminthic infections.
2.6.2 Implications for wastewater treatment and other health protection measures
There are a number of health protection measures that can be adopted, including
wastewater treatment, crop restrictions, irrigation techniques, human exposure control and
chemotherapeutic interventions. In practice these are usually used in combination, and
not singly. The most commonly used combination is partial wastewater treatment plus
crop restrictions, and this is reflected in the wastewater guidelines (Table 5). Partial
wastewater treatment can, however, be combined with one or more of the other
measures.
2.6.2.1 Wastewater treatment
A full discussion of wastewater treatment methods appropriate to meet the proposed
revised guidelines for wastewater reuse (Table 5) is given in Annex C. The main points
are summarised here.
When wastewater is treated with the intention of using the effluent for agricultural irrigation
and not disposal in receiving waters, the important quality criteria are those relevant to
human health rather than environmental criteria and those related to the health of fish in
receiving waters. Therefore, faecal coliform removal and nematode egg removal are more
important than BOD removal. In many situations, the most cost-effective wastewater
treatment option is waste stabilization ponds (WSP), as suggested in WHO (1989). The
advantages of WSP are low cost, simplicity of construction, operation and maintenance,
and high efficiency especially with respect to the removal of nematode eggs and faecal
bacteria. Properly designed (Mara, 1997; Mara and Pearson, 1998), WSP can easily
meet the helminthological and bacteriological quality requirements for both restricted and
unrestricted irrigation (Table 5 and Table 6). There are many existing WSP that do not
achieve these qualities (see, for example, Maynard et al., 1999), but they may not have
been so designed or are overloaded or poorly maintained.
Land availability or the cost of land can limit the use of WSPs, especially when dealing
with effluent from large cities (population > 1 million), or in countries where lower
temperatures mean that longer retention times, and therefore larger land areas, are
required to meet the FC guideline for unrestricted irrigation. For example, for a flow of
1000 m³ per day of a wastewater with a BOD₅ of 350 mg/l and a faecal coliform count of
5×10⁷ FC/100ml, the total pond area required to produce an effluent containing ≤1000
FC/100ml would be 8,000 m² at 25°C, 13,700 m² at 20°C, and 25,400 m² at 15°C.
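The pond areas quoted above follow from first-order faecal coliform die-off kinetics of the kind used in WSP design (a Marais-type model, as adopted in Mara, 1997). The short Python sketch below illustrates the arithmetic only for a series of maturation-type ponds with illustrative retention times; it ignores the die-off and area of the anaerobic and facultative stages and the detailed design procedure of Annex C, so it will not reproduce the exact figures given in the text.

    def kb(temp_c):
        """First-order faecal coliform die-off rate constant (per day), kB = 2.6 * 1.19^(T - 20)."""
        return 2.6 * 1.19 ** (temp_c - 20.0)

    def effluent_fc(influent_fc, retention_days, temp_c):
        """FC remaining after a series of completely mixed ponds: Ne = Ni / prod(1 + kB * theta_i)."""
        ne = influent_fc
        k = kb(temp_c)
        for theta in retention_days:
            ne /= 1.0 + k * theta
        return ne

    def pond_area_m2(flow_m3_per_day, retention_days, depth_m=1.5):
        """Total mid-depth pond area for the series (area = Q * theta / depth for each pond)."""
        return sum(flow_m3_per_day * theta / depth_m for theta in retention_days)

    # Illustrative run: 1000 m3/day and three maturation-type ponds of 6 days each (assumed values).
    # The anaerobic and facultative stages, and their die-off and area, are not modelled here.
    retention = [6.0, 6.0, 6.0]
    print(effluent_fc(5e7, retention, temp_c=25.0))   # ~9e2 FC/100ml in the final effluent
    print(pond_area_m2(1000.0, retention))            # ~12,000 m2 for the maturation ponds alone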
Wastewater storage and treatment reservoirs (WSTR) are particularly useful in arid and
semi-arid regions where agricultural production is limited by the quantity of water,
including treated wastewater, for irrigation, since WSTR permit the whole year’s
wastewater to be used for irrigation, rather than just that produced during the irrigation
season. Recent research in Brazil has shown that sequential batch-fed WSTRs (at pilot
scale) can remove faecal coliforms to less than 1000 FC/100ml by three weeks into the
rest phase (Mara et al, 1996), whereas single WSTRs in Israel produce an effluent
suitable for restricted irrigation. Sequential storage reservoirs in the Mezquital Valley in
Mexico produce an effluent with a mean quality of 10³ FC/100ml, the quality varying
depending on the retention time which varies according to irrigation demand (Cifuentes,
1995).
In situations where conventional treatment is being considered, it is essential to assess
the cost of operation, maintenance and personnel training, all of which are considerably
higher than for non-conventional treatment systems. Conventional wastewater treatment
systems (such as activated sludge, trickling filters) can only achieve a 2 log10 unit
reduction of faecal coliforms, so they do not meet the microbiological requirements for
agricultural reuse unless supplemented by tertiary treatment processes. They can be used
in circumstances where WSP are not suitable; extended aeration plants, such as oxidation
ditches, are generally the best option in this case as their costs are lowest (Arthur, 1983).
Conventional secondary STPs are better at removing helminth eggs due to the retention
time in primary and secondary sedimentation. Data from Mexico suggest that these are
reduced to around 3 eggs/litre by advanced primary treatment. Filtration can be used to
reduce the egg levels further, but this can add a significant extra cost to the plant.
However, maturation ponds (sometimes called “polishing” ponds in this context) can be
used to upgrade conventional effluents prior to either restricted or unrestricted irrigation.
Reservoirs can also be used for this purpose e.g. in the USA, reservoirs have produced a
2-3 log reduction in faecal coliform levels in trickling filter effluent, from 10⁶ to 10³-10⁴
FC/100ml (Moore et al, 1988).
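As a rough illustration of how log-unit removals across successive treatment stages translate into effluent quality, the short sketch below applies hypothetical removal figures (chosen only to mirror the ranges quoted above) to an assumed raw-sewage faecal coliform count.

    def effluent_concentration(influent_fc_per_100ml, log_removals):
        """Apply successive log10 removals (one entry per treatment stage) to an influent FC count."""
        return influent_fc_per_100ml / (10 ** sum(log_removals))

    # Hypothetical illustration: raw sewage at 1e7 FC/100ml, ~2 log units in a conventional
    # secondary plant, then ~2.5 log units in a maturation ("polishing") pond.
    print(effluent_concentration(1e7, [2.0, 2.5]))   # ~3.2e2 FC/100ml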
Sludge from conventional treatment plants or WSP must be treated or disposed of
carefully as pathogens are concentrated there. Helminth eggs can survive and remain
viable for nearly 12 months. Sludge can be injected into the subsoil or placed in furrows
and covered with a layer of earth before the planting season, and no tuberous crops
planted along such trenches. Alternatively there are a variety of treatment methods to
make sludge safe including storage for 6-12 months at ambient temperature in hot
climates, anaerobic digestion and forced aeration co-composting of sludge (Hespanhol,
1997).
The wastewater treatment system chosen needs to be able to deal with large differences
in seasonal flows of wastewater, including peaks during the rainy season; WSP can do
this. Bypassing conventional treatment plants with untreated or semi-treated wastewater
which is then used in agriculture is a particular source of health risks. Issues of the
training of treatment plant personnel and the running costs of each system, as well as
treatment efficiency, should guide the choice of treatment facility.
It is important when defining wastewater treatment policies to remember that treatment is
not the only measure available to protect health; crop selection and restriction, different
irrigation techniques and human exposure control are equally important health protection
measures. These non-treatment options should be considered as part of an integrated
approach to health protection where wastewater irrigation policies are being proposed or
modified.
2.6.2.2 Crop restriction
Crop restriction is often practiced in conjunction with wastewater treatment so that lower
quality effluents can be used to irrigate non-vegetable crops (see Table 5). Although this
appears straightforward, in practice it is often difficult to enforce. It can only be done
effectively where a public body controls the use of wastewater and laws providing for crop
restriction are strictly enforced, where there is adequate demand for the crops allowed
under crop restrictions, and where there is little market pressure in favour of excluded
crops (i.e. salad and other crops eaten uncooked). Crop restriction requires much less
costly wastewater treatment and may be favoured for this reason alone (but wastewater
treatment engineers need to discuss this clearly with the appropriate regulatory agency
and local farmers). Wastewater irrigation with crop restrictions is practiced in Mexico,
Chile and Peru. In Chile, wastewater from the city of Santiago was used to irrigate salad
crops and vegetables until 1992. However, as part of a national campaign to prevent and
control cholera, crop restriction was enforced, together with a general hygiene education
program. The result was a reduction of over 90% in cholera cases attributable to the
consumption of salad crops or vegetables (Monreal, 1992).
Crop restriction is not effective in controlling health risks from indirect reuse, where
wastewater-contaminated surface waters are used directly by the farmers and do not
come under the control of public bodies. Much unrestricted irrigation actually uses
wastewater-contaminated surface waters rather than wastewater itself (either untreated or
treated) and constitutes a particular challenge to the regulatory and public health
authorities.
2.6.2.3 Irrigation technique
The irrigation technique can be chosen to reduce the amount of human exposure to the
wastewater. In general, health risks are greatest when spray/sprinkler irrigation is used, as
this distributes contamination over the surface of crops and exposes nearby population
groups to aerosols containing bacteria and viruses (the opposite occurs with nematode
eggs, which tend to be washed off during spray irrigation (Annex B)). This technique
should be avoided where possible, and if used, stricter effluent standards apply (see
Table 5). Flood and furrow irrigation exposes field workers to the greatest risk, especially
if earth moving is done by hand and without protection. Localised irrigation (including drip,
trickle and bubbler irrigation) can give the greatest degree of health protection by reducing
the exposure of workers to the wastewater. A period of cessation of irrigation before
harvest (1-2 weeks) can allow die-off of bacteria and viruses such that the quality of
irrigated crops improves to levels seen in crops irrigated with fresh water, as shown by
Vaz da Costas Vargas et al (1996). However, it is not practical in unregulated
circumstances since farmers will probably not cease irrigation of leafy salad crops 5 days
or more before harvest. Replacing partially-treated wastewater with fresh water for a week
or so before harvest is not a reliable way of improving crop quality since re-contamination
of the crops from the soil has been found to occur (Vaz da Costas Vargas et al, 1996).
Use of cessation of irrigation before harvest is more viable with fodder crops which do not
need to be harvested at their freshest, and could enable the use of lower quality effluents.
2.6.2.4 Human exposure control
The groups potentially most at risk from wastewater reuse in agriculture are the farm
workers, their families, crop handlers, consumers of crops, and those living near
wastewater-irrigated areas. The approach required to minimize exposure depends on the
target group. Farm workers and their families have higher potential risks of parasitic
infections. Protection can be achieved by low-contaminating irrigation techniques (as
above), together with the wearing of protective clothing (e.g. footwear for farmers and gloves for
crop handlers); improving levels of hygiene, both occupationally and in the home, can also
help to control human exposure. Provision of adequate water supplies for consumption (to
avoid consumption of wastewater) and for hygiene purposes (e.g. for handwashing) is
important. Consumers can be protected by cooking vegetables, and by high standards of
personal and food hygiene.
Studies are needed to see whether hygiene promotion could be included in the work of
agricultural extension services or of the health authorities where wastewater reuse
occurs, e.g. to promote handwashing with soap after irrigation. It is possible that
health promotion programmes could be linked to existing health-related services. For
example, in Mexico, hygiene promotion could be linked to the desparasitation campaigns the
National Vaccination Council carries out among 2-14 year olds in previously designated
high-risk areas, and to their health education programmes for women (Peasey et al., 1999).
The national diarrhoea control programme in Mexico has already increased sales of ORS
ten fold in 11 years (Gutierrez et al., 1996); promotion of ORS was linked to the childhood
immunization programme.
The effectiveness of current promotional techniques in environmental health, however, is
not very encouraging, as few have had an impact on behaviour change or health status
(Cave and Curtis, 1999). Better intervention design is needed and only a few specific
behaviours should be targeted. Behaviour change can be slow and require intensive or
prolonged intervention. In addition, the promotion of specific protective hygiene behaviours
is generally now thought more effective when tackled separately from disease and risk.
Studies have demonstrated that such behaviours are more easily and efficiently modified for
social and cultural reasons, rather than through a fear of possible illness (Curtis and Kanki,
1998).
2.6.2.5 Chemotherapeutic intervention
Chemotherapy, especially for helminth infections, can be considered in countries where
the Ministry of Health is involved in periodic anthelminthic campaigns in areas of high
infection levels, as occurs in Mexico. Areas where inadequately treated wastewater is
reused (directly or indirectly) could be targeted, along with known areas of high
prevalence of helminth infections. Treatment of children every 4 to 6 months is needed to
prevent infection reaching pre-treatment intensities of infection. Adults and children from
farming families could be particularly targeted.
The use of regular chemotherapy programmes and human exposure control, including
hygiene promotion, should be considered as interim measures in cases where no
wastewater treatment is provided or where there is a time delay before treatment plants
can be built.
3 Aquacultural Reuse
3.1 Background and WHO Guidelines
Fish farming is becoming increasingly important as a source of income for farmers, as fish
is a high-value crop and consumer demand for fish is increasing. Interest in wastewater-fed
fish farming is based on its cost-effectiveness and on the potential for resource recovery
from the investment in wastewater treatment, e.g. through the use of effluent from waste
stabilisation ponds in fish ponds.
Tentative effluent guidelines for aquaculture were proposed by WHO (1989) following a
review of the literature on the survival of pathogens in and on fish by Strauss (1985). A
tentative bacterial guideline was set at ≤10³ faecal coliforms per 100ml (geometric mean)
for fishpond water, which can be achieved by treating the wastewater feed water to 10³-
10⁴ FC/100ml. This was to protect against the risk of bacterial infections and was aimed at
ensuring that the invasion of fish muscle was prevented. A helminth quality guideline was
set at the absence of viable trematode eggs, aimed at preventing the transmission of
trematode infections such as schistosomiasis, fasciolopsiasis and clonorchiasis.
New data are available to allow assessment of the bacterial guideline. The validity of the
trematode egg guideline will not be reviewed here.
3.2 Summary of evidence supporting WHO (1989) tentative guidelines
The main evidence used to support the WHO (1989) Bacterial Guideline was evidence on
the quality of fish grown in wastewater fed fishponds of different qualities. Strauss (1985)
concluded that:
(1) Invasion of fish muscle by bacteria is very likely to occur when the fish are grown
in ponds containing >10⁴/100ml and >10⁵/100ml faecal coliforms and salmonellae
respectively. The potential for muscle invasion increases with the duration of
exposure of the fish to the contaminated water.
(2) There is some evidence to suggest that there is little accumulation of enteric
organisms and pathogens on, or penetration into, edible fish tissue when the
faecal coliform concentration in the fishpond water is <10³/100ml.
(3) Even at lower contamination levels, high pathogen concentrations may be present
in the digestive tract and the intraperitoneal fluid of the fish.
There were no epidemiological data on the health effects to populations consuming fish
raised in wastewater fed fishponds.
3.3 New evidence of health risks from studies in Indonesia
The use of human excreta in aquaculture is a traditional practice in most of the highland areas of
West Java, Indonesia, and occurs through latrines overhanging the 'home garden' fishpond.
These ponds are generally small (on average about 200 m²) and are usually situated
alongside the houses, although some are larger and run on a commercial basis. A cross-sectional
study of the risk of diarrhoeal disease associated with the use of excreta in such
fishponds was carried out in West Java (Blumenthal et al, 1991/92; Abisudjak, in
preparation).
The population was exposed through consuming fish originating in an excreta-fed pond, but
also in many other ways. Several types of exposure were identified: domestic exposure
(from the use of water which originated from excreta-fed fishponds for bathing and
washing of kitchen utensils or food), recreational exposure (from contact with pond water
while playing or swimming) and defecation exposure (from use of fishpond latrines for
defecation). Three study groups were set up. The exposed group included those with
domestic and defecation exposure; the semi-exposed group included those with defecation
exposure but no domestic exposure; and the non-exposed (control) group included those
without domestic or defecation exposure. The effect of defecation exposure on the rate of
diarrhoeal disease was determined by comparing the exposed and semi-exposed
groups, and the effect of domestic exposure was determined by comparing the exposed
and control groups. Recreational and occupational exposure occurred in both exposed and
semi-exposed groups and consumer exposure in all groups. Multivariate analysis was used
to estimate the risks associated with each exposure.
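The adjusted odds ratios reported below were obtained by multiple logistic regression. As an illustration of the general form of such an analysis (not the actual analysis of Blumenthal et al., 1991/92), the following Python sketch fits a logistic model to hypothetical data with the exposures and confounders named in the text and converts the coefficients to odds ratios with 95% confidence intervals.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per child under 5, a binary diarrhoea outcome, the three
    # exposures of interest and two of the confounders mentioned in the text.
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "diarrhoea": rng.integers(0, 2, n),
        "domestic_exposure": rng.integers(0, 2, n),
        "recreational_exposure": rng.integers(0, 2, n),
        "consumer_exposure": rng.integers(0, 2, n),
        "crowding": rng.integers(1, 5, n),
        "age_months": rng.integers(0, 60, n),
    })

    model = smf.logit(
        "diarrhoea ~ domestic_exposure + recreational_exposure + consumer_exposure"
        " + crowding + age_months",
        data=df,
    ).fit(disp=False)

    # Adjusted odds ratios with 95% confidence intervals (exponentiated coefficients).
    summary = pd.concat([np.exp(model.params).rename("OR"), np.exp(model.conf_int())], axis=1)
    print(summary)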
The quality of fishpond water used by selected households in the study was markedly worse
than well water, with an overall geometric mean faecal coliform count of 3.9×10⁴ FC/100
ml. Fishponds were classified by size and by source of excreta but it was not possible to
define the quantity of excreta input to each pond. Although there was a great range in water
quality, there was no evidence that smaller fishponds were more contaminated than large
ones, or that directly excreta-fed ponds were more contaminated than indirectly excreta-fed
ponds.
The one-week prevalence of diarrhoea in children under 5 years was 12.1%, 7.6% and 7.9%
in the exposed, semi-exposed and non-exposed groups respectively, and was significantly
different between exposure groups. The prevalence in those over 5 years was 1.4%, 1.2%
and 1.4% and showed no difference between exposure groups.
For children under 5 years, a multiple logistic regression analysis was carried out to examine
the effect of the exposures after allowing for several potential confounding factors (crowding,
age, keeping of food and treatment of kept food). There was no risk of diarrhoea related to
defecation exposure (odds ratio=0.81). There was a two-fold increase in diarrhoea related to
recreational exposure (OR=1.91), a 1.6-fold increase related to domestic exposure and a
1.4-fold increase associated with consumer exposure (although the latter was of borderline
significance). When the risk related to consumption of fish was explored separately in the
three study areas, there was a two-fold increase in diarrhoea related to consumption in the
control area (OR=2.35, 95% C.I. 1.01-5.29), a 1.5-fold increase in the semi-exposed area
(which was not statistically significant), and no increase in the exposed area.
The results show that recreational and domestic contact with water from excreta-fed
fishponds with a mean quality of 4×10⁴ faecal coliforms per 100ml causes an excess risk in exposed
children under 5 years of age, but not in persons over 5 years of age. Consumption of fish
from such ponds is a risk to persons living in areas with no ponds and with less exposure to
contamination.
3.4 Discussion and implications of studies for international guidelines
The epidemiological study in West Java indicates that exceeding the WHO tentative
guideline level by 40-fold results in excess risk for vulnerable population groups such as
young children in this situation, but does not invalidate the tentative guideline, which could
be around the right level.
The ponds where the fish were raised were neither commercial fish farms nor maturation
ponds in a waste stabilisation pond series. They were fertilised with excreta from overhang
latrines, either on the pond itself or on a pond further up the hillside. In such circumstances
the risk could be higher than in wastewater-fed fishponds where the influent is treated effluent
(e.g. from WSP) and the fish are not in contact with raw wastewater or excreta. It could therefore
represent a ‘worst case’ scenario. In fish farms or combined WSP/aquaculture systems
the risks from consumption of the fish and contact with the pond water are relevant
(equivalent to recreational contact above).
Studies of the microbiological quality of fish raised in wastewater-fed aquaculture systems
have been used to recommend criteria for acceptable bacterial levels in fishpond water and
fish muscle. Buras et al (1987) raised fish (Tilapia and silver carp) in experimental ponds over
a whole growing season and concluded that the ‘threshold concentration’ (i.e. the
concentration that caused the appearance of bacteria in muscles) was 1×10⁴ bacteria/100ml
based on SPCs (standard plate counts). The role of faecal coliforms as adequate indicators
of fish contamination was questioned, as they were not always detected in the muscles of
fish whereas other bacteria were recovered; the use of bacteria (SPC) as an indicator was
proposed. However, it is useful to review the level of faecal coliforms for comparative
purposes; at this threshold concentration, the level of faecal coliforms in the water was
around 3×10² FC/100ml. Moscoso and Florez (1991), however, found that when Tilapia
were grown in a combined WSP/aquaculture system in Peru, bacteria penetrated the fish
muscle when the water exceeded 10⁵ FC/100ml, and concluded that the maximum level of
faecal coliforms in the pond water should be 1×10⁴ FC/100ml. This would be achievable by a
maximum concentration of 1×10⁵ FC/100ml in the effluent of the WSP used to feed the
aquaculture pond. These two studies therefore come to different conclusions regarding the
threshold concentration.
Current guidelines or standards for the microbiological quality of fish (reviewed in Strauss,
1995, and Leon and Moscoso, 1996) show that the standard plate count is used in
conjunction with an E.coli or coliform level in most cases. The rejectable levels set for the
quality of fish were 10⁶ SPC/g and 500 E.coli per gram (ICMSF, 1995), 10⁷ SPC/g
(FAO/IAEA/WHO, 1989), 5×10⁴ SPC/g and 0.7×10³ coliforms/g (USA, in Leon and Moscoso,
1996), and 10⁵ SPC/g and 10¹ E.coli/g (Sweden, in Strauss, 1995). These levels are less
strict than those proposed by Buras (1987) for fish raised in excreta-fed systems, who
recommended that the total aerobic bacterial concentration in fish muscle should not exceed
50 bacteria/g. This is probably because ICMSF regulations are for fish contaminated mainly
by handling and were not set up to include fish raised in excreta-fed systems (Edwards,
1992). Many regulatory agencies do not specify microbiological standards for freshly caught
fish, but specify standards for processed products, thereby ensuring adequate personal and
institutional hygiene during transport, processing and marketing, and treatment for
conservation of raw, unprocessed products prior to sale (Strauss, 1995).
For the use of wastewater in aquaculture, it seems appropriate for guidelines to specify the
water quality that is acceptable for aquaculture, taking into account both the likely
microbiological quality of the fish grown in such water and the likely health effects to
consumers of the fish and workers in contact with the fishpond water. It is important to note
that the concentration of bacteria in the digestive tract is always higher than that in the fish
muscle, and there is therefore potential for cross-contamination of fish muscle during gutting
and preparation of the fish. Evidence from the epidemiological studies can take this latter risk
into account. The study in Indonesia (above) shows that the water quality needs to be below
10⁴ FC/100ml before the risks are reduced to acceptable levels. On balance, there appears
to be sufficient evidence to suggest that the tentative faecal coliform guideline of ≤10³
FC/100ml (WHO, 1989) for the fishpond water is the right order of magnitude, and insufficient
data to warrant a reduction of this level to 10² FC/100ml or a relaxation to 10⁴ FC/100ml. This
implies that the quality of the feed water can be around 10⁴-10⁵ FC/100ml, depending on the
size of the fishpond and the amount of dilution that occurs. However, the water quality should
stay constant over the growing season, since large fluctuations in the quality of the
influent water reduce the quality of the fish (Buras et al, 1987). The water quality
should therefore be monitored weekly if fluctuations in its quality are likely. In
future, it would be useful to consider adding a bacterial guideline for the quality of the
wastewater (SPC/100ml) and for the quality of fish (SPC/g).
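The statement that feed water of around 10⁴-10⁵ FC/100ml can yield fishpond water of ≤10³ FC/100ml, depending on pond size and dilution, can be illustrated with a simple steady-state mass balance. The Python sketch below treats the fishpond as a single completely mixed volume with Marais-type first-order die-off (as in the WSP sketch above); the pond volume, feed flow and temperature are illustrative assumptions, not values from the studies cited.

    def fishpond_fc(feed_fc_per_100ml, pond_volume_m3, feed_flow_m3_per_day, temp_c=25.0):
        """Steady-state FC level in a completely mixed fishpond with first-order die-off:
        C_pond = C_feed / (1 + kB * theta), where theta = V / Q and kB = 2.6 * 1.19^(T - 20)."""
        kb = 2.6 * 1.19 ** (temp_c - 20.0)
        theta_days = pond_volume_m3 / feed_flow_m3_per_day
        return feed_fc_per_100ml / (1.0 + kb * theta_days)

    # Illustrative (assumed) values: feed water at 5e4 FC/100ml into a 2000 m3 pond
    # receiving 100 m3/day, i.e. a 20-day nominal retention time.
    print(fishpond_fc(5e4, pond_volume_m3=2000.0, feed_flow_m3_per_day=100.0))   # ~4e2 FC/100ml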
Wastewater treatment is not enough. Attention should also be paid to protecting aquaculture
workers and populations living nearby the ponds from contact with the pond water, and to
ensuring that high standards of hygiene are maintained during fish handling and gutting. The
use of health promotion programmes, by the Fisheries Department or by the health services,
to address such behaviours needs further research (as for wastewater reuse in agriculture,
section 2.6.2.4).
The implications of guidelines at this level are that wastewater (or excreta/septage) needs to
undergo some form of treatment before it can be used in fishponds. Guidance on the design
of WSP for wastewater-fed aquaculture is given in Annex C. Anaerobic and facultative ponds
are designed on the basis of surface nitrogen loading and the facultative pond effluent
discharged into the fishpond. Checks are made to see that the fishpond does not contain
more than 1000 FC/100ml (Mara et al, 1993, Mara, 1997). If the quality is >1000 FC/100ml,
the retention time in the fishpond should be increased or a maturation pond could be added
to the WSP. All the trematode eggs settle out in the anaerobic and facultative ponds. Where
effluent from conventional secondary treatment plants is used, the quality of the effluent may
need to be improved by use of a polishing pond prior to the effluent being discharged into a
fishpond.
4 Conclusions
The review of recent studies on the health effects of wastewater reuse in agriculture has
led to an evaluation of the WHO (1989) Guidelines and to recommendations for revised
microbiological guidelines for wastewater use in agriculture and aquaculture. The
conclusions are as follows:-
1. For unrestricted irrigation, there is evidence to support the validity of the faecal
coliform guideline of ≤1000 FC/100ml and no evidence to suggest that it needs to be
revised. It is supported by data from epidemiological, microbiological and risk
assessment studies. However, there is epidemiological evidence that the nematode
egg guideline of ≤1 egg/litre is not adequate in conditions which favour the survival of
nematode eggs (lower mean temperatures, surface irrigation) and needs to be revised
to ≤0.1 egg/litre where those conditions apply.
2. For restricted irrigation, there is evidence to support the need for a faecal coliform
guideline to protect farm workers, their children, and nearby populations from enteric
viral and bacterial infections. The appropriate guideline will depend on which irrigation
method is used and who is exposed. For example, if adult farmworkers are exposed
through spray/sprinkler irrigation, a guideline of ≤10⁵ FC per 100ml is necessary. A
reduced guideline of ≤10³ FC per 100ml is warranted where adult farmworkers are
engaged in flood or furrow irrigation, and where children under 15 years are regularly
exposed (through farm work or play). Where there are insufficient resources to meet
this stricter guideline, a guideline of ≤10⁵ FC per 100ml should be supplemented by
other health protection measures. The nematode egg guideline of ≤1 egg per litre is
adequate if no children are exposed, but a revised guideline of ≤0.1 egg per litre is
recommended if children are in contact with the wastewater through irrigation or play.
3. The risks to exposed populations are dependent on the irrigation method used. Health
risks from irrigated crops are greatest when spray/sprinkler irrigation is used, and risks
to field workers are greatest when flood or furrow irrigation is used. The proposed
guidelines take these risks into account.
4. The evidence reviewed did not support the need for a separate guideline to
specifically protect against enteroviral infections, but there were insufficient data to
evaluate the need for a specific guideline for parasitic protozoa.
5. There are three different approaches for establishing microbiological quality guidelines
and standards for treated wastewater reuse in agriculture which have different
objectives as their outcome: (I) the absence of faecal indicator organisms in the
wastewater, (II) no measurable excess cases in the exposed population, and (III) a
model generated estimated risk below a defined acceptable risk. The above
conclusions were based on use of approach II, using empirical epidemiological studies
supplemented by microbiological studies on pathogen transmission, in conjunction
with approach III, using model-based quantitative microbial risk assessment for
selected pathogens.
6. The use of a disease control approach can be considered for the setting of country
standards, especially where economic constraints limit the level of wastewater
treatment that can be provided. Here, the aim would be to protect populations against
excess disease rather than excess infection. This could result in the relaxation of
microbiological guidelines and the use of other health protection measures to
supplement wastewater treatment.
7. The revised microbiological guidelines can be met through the use of waste
stabilisation ponds, wastewater storage and treatment reservoirs, or through
conventional treatment processes. When using WSP, the revised guidelines usually
require the use of one or more maturation ponds after the anaerobic and facultative
ponds. Sequential batch-fed storage and treatment reservoirs can be designed
to meet the guidelines for unrestricted and restricted irrigation. When conventional
treatment processes are used, secondary treatment, filtration and disinfection are often
needed to meet the revised guidelines. The cost and difficulty in operating and
maintaining conventional treatment plants to the level needed to meet the guidelines
means that they are not recommended where WSP and WSTR can be used.
8. Crop restriction, irrigation technique, human exposure control and chemotherapeutic
intervention should all be considered as health protection measures to be used in
conjunction with partial wastewater treatment. In some cases, community
interventions using health promotion programmes and/or regular chemotherapy
programmes could be considered, in particular where no wastewater treatment is
provided or where there is a time delay before treatment plants can be built.
9. Regarding wastewater use in aquaculture, evidence from epidemiological studies shows
that the faecal coliform guideline needs to be below 10⁴ FC/100ml. There appears to be
sufficient evidence to suggest that the tentative faecal coliform guideline of ≤10³
FC/100ml (WHO, 1989) for the fishpond water is the right order of magnitude, and
insufficient data to warrant a reduction of this level to 10² FC/100ml or a relaxation to 10⁴
FC/100ml. This implies that the quality of the feed water can be around 10⁴-10⁵
FC/100ml, depending on the size of the fishpond and the amount of dilution that occurs.
In future, it would be useful to consider adding a bacterial guideline for the quality of the
wastewater (SPC/100ml) and for the quality of fish (SPC/g). This will address concerns
over the adequacy of faecal coliforms as indicators of health risks from waste-fed
aquaculture.
10. In order to meet the faecal coliform guideline, wastewater (or excreta/septage) needs to
undergo some form of treatment before it can be used in fishponds. Where WSP are
used, effluent from the facultative pond or first maturation pond can be discharged into
the fishpond (depending on the effluent quality and size of the fishpond). Where effluent
from conventional secondary treatment plants is used, the quality of the effluent may
need to be improved by use of a polishing pond prior to the effluent being discharged into
a fishpond.
Annex A: Epidemiological studies of wastewater reuse in Mexico
A series of epidemiological studies were conducted in Mexico to assess, firstly, the
occupational and recreational risks associated with exposure to wastewater of different
qualities, and secondly, the risks of consuming vegetable crops irrigated with partially
treated wastewater. In the first set of studies, infections (from helminths, protozoa and
diarrhoeal disease) in persons from farming families in direct contact (through irrigation or
play) with effluent from storage reservoirs or raw wastewater, were compared with
infections in a control group of farming families engaged in rain-fed agriculture. In the
studies on consumer risks, infections with diarrhoeal disease, Human Norwalk-like
Virus/Mx and Enterotoxigenic E. coli (LT) in persons from a rural population eating raw
vegetables irrigated with partially treated wastewater were compared with infections in
persons (in the same area) not eating these vegetables. Comparison was also made with
infections in persons in a nearby area where vegetables were irrigated with borehole
water. In all studies, the effects of wastewater exposure were assessed after adjustment
for many other potential confounding factors (including socio-economic factors, water
supply, sanitation and hygiene practices).
1.1 Study area
Raw wastewater coming from Mexico City to the Mezquital valley, Hidalgo, is used to irrigate
a restricted range of crops, mainly cereal and fodder crops, through flood irrigation
techniques. Some of the wastewater passes through storage reservoirs and the quality of the
wastewater is improved before use; this is equivalent to partial treatment. The effluent from
the first reservoir (retention time 1-7 months, depending on the time of year) met the WHO
guideline for restricted irrigation (category B), even though a small amount of raw wastewater
enters the effluent prior to irrigation. Some effluent from the first reservoir passes into the
second reservoir and is retained for an additional 2-6 months (>3 months of combined
retention), and the quality is improved further. Local farming populations are exposed to the
wastewater and effluent through activities associated with irrigation, domestic use (for
cleaning, not for drinking) and play. Part of the effluent from the first reservoir enters the river
and is abstracted downstream to irrigate a large area of vegetable and salad crops, many of
which are eaten raw; the river water is essentially partially treated wastewater. These crops
are sold in the local markets and eaten by the rural populations in local villages, including
those near the second reservoir. In a nearby area, vegetables were irrigated with borehole
water.
1.2 Wastewater quality
Untreated wastewater contained a high concentration of faecal coliforms (10⁶-10⁸
FC/100ml) and nematode eggs (90-135 eggs per litre). Retention in a single reservoir
reduced the number of helminth ova substantially, to a mean of ≤1 egg/litre (so meeting
the WHO Guideline for restricted irrigation), whereas faecal coliform levels were reduced
to 10⁵ FC/100ml (average over the irrigation period) or 10⁴ FC/100ml, with annual
variations depending on factors such as rainfall. The concentration of helminth ova
remained below 1 ovum/litre (monthly monitoring) even after a small amount of raw
wastewater entered the effluent downstream of the reservoir. Retention in the second
reservoir reduced the faecal coliform concentration further (mean 4×10³ FC/100ml) and
no helminth ova were detected. Faecal coliform levels varied over the year depending on
the retention time in each reservoir which varied according to demand for irrigation water.
The geometric mean quality of the river water at the point where it is abstracted for use in
irrigation was 4x104 FC/100ml, with little variation occurring over the year. Enterovirus and
hepatitis A virus were present for most of the year (95% and 69% monthly samples
respectively), whereas rotavirus was detected during the peak months for rotavirus cases.
Limited data on virus levels on crops at harvest showed that enterovirus was detected on all crops tested (onion, radish, lettuce, cauliflower and coriander) whereas hepatitis A
virus was detected on lettuce, radish and onion (on which rotavirus was also detected).
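The reductions described above are easier to compare when expressed as log10 reductions, and the river-water figure is a geometric mean of monthly samples. The short Python sketch below illustrates both calculations; the concentration values are indicative figures taken from the text (with a mid-range value assumed for raw wastewater) and the monthly readings are hypothetical, so this is not a reconstruction of the study's monitoring data.

import math

# Illustrative only: concentrations are the figures quoted in the text, with a
# mid-range value assumed for raw wastewater.
raw = 1e7            # raw wastewater, ~10^6-10^8 FC/100ml
reservoir_1 = 1e5    # first reservoir effluent, ~10^5 FC/100ml over the irrigation period
reservoir_2 = 4e3    # second reservoir effluent, mean 4x10^3 FC/100ml

def log_reduction(c_in, c_out):
    """Log10 reduction in faecal coliform concentration between two points."""
    return math.log10(c_in / c_out)

print(f"First reservoir:  {log_reduction(raw, reservoir_1):.1f} log10 units")
print(f"Second reservoir: {log_reduction(reservoir_1, reservoir_2):.1f} log10 units")

def geometric_mean(samples):
    """Geometric mean, the averaging used for the river-water quality figure."""
    return math.exp(sum(math.log(x) for x in samples) / len(samples))

monthly_fc = [2e4, 5e4, 3e4, 8e4, 4e4]   # hypothetical monthly FC/100ml readings
print(f"Geometric mean river quality: {geometric_mean(monthly_fc):,.0f} FC/100ml")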
1.3 Risks to workers related to restricted irrigation and effect of wastewater treatment
1.3.1 Exposure to raw wastewater
Exposure to raw wastewater over one year (following chemotherapy) was associated with
a significantly increased prevalence (percentage) and intensity of Ascaris infection (mean
egg load) in all age groups (Table A1 section a). Exposure was related to a 20-fold
increase in infection in children (compared to the control group) and a 10-fold increase in
adults (Blumenthal et al., 1996; Peasey, 2000). Increased morbidity, as shown by
increased wheezing and difficulty in breathing, was detected among those with higher-
intensity infections. The riskiest specific behaviour for adults was irrigating
chillies (a 6-fold increase), which was done by furrow irrigation and involved moving earth
by hand or by spade. For children, the riskiest behaviour was eating local plants
(irrigated with wastewater). Exposure to raw wastewater was shown to account for over
80% of Ascaris infection in the community.
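The fold increases quoted above are prevalence (risk) ratios, and the figure of over 80% of Ascaris infection accounted for by exposure is, in effect, a population attributable fraction. The sketch below illustrates both measures using the standard Levin formula with hypothetical prevalence and exposure values; the study's own adjusted estimates were derived from regression models, not this simple arithmetic.

def relative_risk(prev_exposed, prev_control):
    """Prevalence in the exposed group divided by prevalence in the control group."""
    return prev_exposed / prev_control

def attributable_fraction(p_exposed, rr):
    """Levin's population attributable fraction: the share of all cases in the
    community attributable to the exposure."""
    return p_exposed * (rr - 1) / (1 + p_exposed * (rr - 1))

# Hypothetical prevalences chosen only to reproduce a 20-fold increase.
rr_children = relative_risk(prev_exposed=0.40, prev_control=0.02)
paf = attributable_fraction(p_exposed=0.5, rr=rr_children)   # assume half the community is exposed
print(f"Relative risk in children: {rr_children:.0f}")
print(f"Population attributable fraction: {paf:.0%}")        # ~90% with these inputs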
1.3.2 Exposure to partially treated wastewater
Exposure over one year to wastewater which had been retained in one reservoir resulted in a
14-fold increase in Ascaris infection in children (especially those aged 5-14 years) and a
much smaller increase (3-fold) in infection in adults (Table A1 section b) (Peasey, 2000;
Peasey et al., 2000). For adults, planting chillies was associated with increased
infection. The intensity of Ascaris infection in adults was reduced to the level in the control
group, but in children it was similar to levels in the raw wastewater group (Table A2)
(Blumenthal et al., 1996). Older children (aged 5-14 years) also had significantly higher
rates of diarrhoeal disease (Table A1 section b) (Cifuentes, 1995; Blumenthal et al.,
2000a).
When wastewater was retained in two reservoirs in series, direct contact with the effluent
resulted in very little excess Ascaris infection in any age group (Cifuentes et al., 1994). In
those over 5 years, the prevalence was twice as high as in the control group, but the
excess infection was less than 1% (Table A1 section c). Initially, no excess of diarrhoeal
disease related to exposure to this water was found (Cifuentes et al., 1994; Cifuentes,
1998) compared to the level in the control group, where rain-fed agriculture was
practised. However, in a later study, when children with contact with the effluent from the
second reservoir were compared with children from the same population but with no
contact with the effluent, a two-fold or greater increase in diarrhoeal disease was found in
children aged 5-14 years, and a four-fold increase in seroresponse to Human Norwalk-like
Virus/Mexico was found in adults with high levels of contact (Table A1 section c)
(Blumenthal et al., 1998; Blumenthal et al., 2000b).
Retention of water in two reservoirs in series, producing water of average quality 4×10³
FC/100ml with no detectable nematode eggs, is therefore adequate to protect the children
of farm workers from Ascaris infection, but not from increased diarrhoeal disease.
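A prevalence that is twice as high as in the control group can still represent less than 1% excess infection when the underlying prevalence is low; the ratio and the absolute difference answer different questions. The brief sketch below, using hypothetical prevalences chosen only to reproduce that pattern, makes the distinction explicit.

# Hypothetical prevalences: a prevalence ratio of 2 that nevertheless corresponds
# to less than 1% excess infection.
p_exposed, p_control = 0.016, 0.008   # 1.6% vs 0.8% Ascaris prevalence

prevalence_ratio = p_exposed / p_control   # 2.0 ("twice as high")
risk_difference = p_exposed - p_control    # 0.008, i.e. 0.8% excess infection

print(f"Prevalence ratio: {prevalence_ratio:.1f}")
print(f"Excess infection: {risk_difference:.1%}")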
1.4 Risks to consumers related to unrestricted irrigation
In the above studies, there was some evidence that eating local plants (wild greens such
as wild spinach) was associated with an increased risk of Ascaris infection in children (2-14
years) in families exposed to raw wastewater and to effluent from one reservoir.
Risks of bacterial and viral infections related to the consumption of specific cultivated
vegetables (i.e. courgette, cauliflower, cabbage, carrots, green tomato, red tomato, onion,
chilli, lettuce, radish, cucumber and coriander), and to total consumption of raw vegetables
irrigated with partially treated wastewater (quality 10⁴ FC/100ml), were investigated in a
separate study. The results indicated that consumers of all ages had no excess infection with symptomatic diarrhoeal disease, and no excess serological response (defined as a 50% increase in antibody titre over one year) to Human Norwalk-like Virus/Mexico or Enterotoxigenic E. coli (ETEC), related to their total consumption of raw vegetables, that is, the frequency of eating raw vegetables (Blumenthal et al., 1998; Blumenthal et al., 2000b). However, there was a two-fold or greater excess of diarrhoeal disease in those who ate increased amounts of onion compared with those who ate very little. The effect was seen particularly in adults and in children under 5 years of age. Similar results were found for the consumption of chillies.
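The serological endpoint used above is operational: a rise of 50% or more in antibody titre over one year. The small sketch below simply shows how paired titres would be classified under that definition; the titre values and the helper function are hypothetical.

SERORESPONSE_THRESHOLD = 1.5   # follow-up titre must be at least 1.5 x baseline (a 50% rise)

def is_seroresponse(baseline_titre, followup_titre):
    """True if the antibody titre rose by 50% or more over the follow-up period."""
    return followup_titre >= SERORESPONSE_THRESHOLD * baseline_titre

# Hypothetical paired titres (baseline, one year later) for three individuals.
paired_titres = [(100, 160), (200, 220), (50, 75)]
responders = sum(is_seroresponse(b, f) for b, f in paired_titres)
print(f"Seroresponders: {responders} of {len(paired_titres)}")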
Data on the consumption of foods prepared from raw vegetables also supported these results, since foods containing chilli or onion were associated with increased infection.
Frequently eating ‘salsa’ (chilli sauce) was associated with increased diarrhoea in adults and older children, an increased seroresponse to Human Norwalk-like Virus/Mexico and a
significant rise in antibody titre to ETEC in children (1-14 years). Consumption of
‘picadillo’ (chopped onions, chilli and red tomato) by adults was associated with
increased diarrhoea, whereas frequent consumption of ‘guacamole’ was associated with a
significant rise in antibody titre to ETEC in children (1-14 years). There were also higher
levels of serological response to Human Norwalk-like Virus/Mexico in school-aged
children who ate green tomato, but this effect was not seen in other age groups. No excess serological response to ETEC was related to individual raw vegetable consumption; the increased seroresponse related to eating foods prepared from raw vegetables could be due to contamination introduced via the chillies, but it could also have been introduced during preparation, with bacteria multiplying to reach an infective dose during storage.
Data on the source of vegetables show that the chillies eaten by the study population
were grown in raw wastewater, so the risk of diarrhoea associated with eating chillies was related to raw wastewater irrigation. Therefore, it is only the risks from eating onion and possibly green tomato that can be associated with using partially treated wastewater for irrigation. However, since 83% of adults and 56% of children under 5 years of age ate onion more than once a month, the majority of the study population had a two-fold or greater risk of diarrhoea. Enteroviruses were found on onions at harvest, giving support to this epidemiological evidence.
In contrast, we have evidence that eating some other raw vegetables was associated with a decrease in diarrhoea. The evidence is strongest for eating carrots, which was associated with a 60% or greater reduction in diarrhoea in all age groups. Protective effects of 50% or greater were also related to eating red tomato, salad and the total amount of raw vegetables eaten by the older children. Consuming a high number of foods containing raw vegetables was also associated with a 75% reduction in seroresponse to Human Norwalk-like Virus/Mexico.
In summary, these results indicate that there is a year-round potential for transmission of enteric infections through consumption of vegetable crops irrigated with water of quality
10⁴ FC/100ml, and that consumption of some vegetables is associated with a significant risk of enteric infection in consumers in the rural population studied. However, the risks associated with consumption of some vegetables, particularly onion, may be balanced by the protective effects associated with consumption of other vegetables.
In the communities studied, factors other than the consumption of contaminated vegetables
are equally or more important as risk factors for, or protective factors against, these infections.
In particular, there is evidence supporting the importance of hygiene behaviour: hand
washing is protective against symptomatic diarrhoea and Human Norwalk-like Virus/Mexico
infection, especially in adults and when soap is used (Table A6). There is also evidence of
risk associated with drinking water from public supplies; chlorination of drinking water supplies
in the area is often inadequate, so the water is effectively untreated.
Prevention of contamination in the home is also important.
