THEME PAPERS

 

Guidelines for the microbiological quality of treated wastewater used in agriculture: recommendations for revising WHO guidelines

 

Directives relatives à la qualité microbiologique des eaux résiduaires épurées employées dans l’agriculture : recommandations en faveur de la révision des directives OMS

 

Directrices relativas a la calidad microbiológica de las aguas residuales tratadas empleadas en la agricultura: recomendaciones para revisar las directrices de la OMS

 

 

Ursula J. Blumenthal (I); D. Duncan Mara (II); Anne Peasey (III); Guillermo Ruiz-Palacios (IV); Rebecca Stott (V)

(I) Senior Lecturer, Department of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine, Keppel Street, London WC1E 7HT, England
(II) Professor, School of Civil Engineering, University of Leeds, Leeds, England
(III) Research Fellow, Department of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine, London, England
(IV) Professor and Head, Department of Infectious Diseases, National Institute of Nutrition, Mexico City, Mexico
(V) Research Fellow, Department of Civil Engineering, University of Portsmouth, Portsmouth, England



ABSTRACT

Three different approaches for establishing guidelines for the microbiological quality of treated wastewater that is reused for agriculture are reviewed. These approaches have different objectives as their outcomes: the absence of faecal indicator organisms in the wastewater, the absence of a measurable excess of cases of enteric disease in the exposed population, and a model-generated estimated risk below a defined acceptable risk. If the second approach (using empirical epidemiological studies supplemented by microbiological studies of the transmission of pathogens) is used in conjunction with the third approach (using a model-based quantitative risk assessment for selected pathogens), a powerful tool is produced that aids the development of regulations. This combined approach is more cost-effective than the first approach and adequately protects public health.

The guideline limit for faecal coliform bacteria in unrestricted irrigation (< 1000 faecal coliform bacteria/100 ml) is valid, but for restricted irrigation a limit of < 10⁵ faecal coliform bacteria/100 ml is recommended when adult farmworkers are exposed to spray irrigation. A limit of < 10³ faecal coliform bacteria/100 ml is recommended if flood irrigation is used or children are exposed. The guideline limit for nematode eggs for both types of irrigation is adequate except when conditions favour the survival of nematode eggs and where children are exposed; in these cases it should be reduced from < 1 egg/l to < 0.1 egg/l.

Keywords: water microbiology, standards; Enterobacteriaceae; maximum allowable concentration; agriculture; World Health Organization; guidelines


RÉSUMÉ

On examine ici trois approches différentes pour l’élaboration des directives relatives à la qualité microbiologique et aux normes de réutilisation agricole des eaux usées après épuration. Ces approches ont des objectifs différents du point de vue des résultats : l’absence de micro-organismes fécaux indicateurs dans les eaux usées ; l’absence d’un excès de cas d’infections entériques mesurable dans la population exposée et un risque, estimé à partir d’un modèle, inférieur au risque défini comme acceptable.

Cette analyse confirme que la première approche (absence de micro-organismes fécaux indicateurs) est non seulement inutilement prudente et coûteuse pour protéger la santé publique, mais aussi difficilement applicable. Si la deuxième approche (études épidémiologiques empiriques complétées par des études microbiologiques sur la transmission des germes pathogènes) est employée en conjonction avec la troisième (évaluation du risque quantitatif basée sur un modèle applicable à des germes pathogènes choisis), on obtient un instrument puissant qui va aider à l’élaboration des réglementations. Cette approche combinée a également un rapport coût/efficacité plus avantageux que la première approche et assure une bonne protection de la santé publique.

Notre évaluation des données récentes de la recherche basée sur cette approche combinée indique qu’il convient de réviser les directives OMS de 1989. Concernant l’irrigation sans restriction, rien ne permet de penser qu’il faille réviser la limite pour les coliformes fécaux, qui doit être < 1000 bactéries/100 ml. Celle-ci est étayée par des études épidémiologiques, microbiologiques et d’évaluation des risques. Toutefois, certaines données épidémiologiques montrent que la limite pour les œufs de nématodes (< 1 œuf/l) ne convient pas dans des conditions favorables à la survie des œufs de nématodes (températures moyennes plus basses et irrigation de surface) et qu’il faut alors la remplacer par une concentration < 0,1 œuf/l.

En ce qui concerne l’irrigation restreinte, une directive relative à l’exposition aux coliformes fécaux s’impose afin de protéger les agriculteurs, leurs enfants et les populations avoisinantes des infections entériques virales et bactériennes. La directive appropriée dépendra de la méthode d’irrigation utilisée et des personnes exposées. Par exemple, si les agriculteurs adultes sont exposés du fait d’une irrigation par aspersion, il faut que la concentration en coliformes fécaux soit < 10⁵/100 ml. Une limite de < 10³ bactéries/100 ml est justifiée lorsque les agriculteurs adultes pratiquent l’irrigation par rigoles d’infiltration ou par gravité et lorsque des enfants de moins de 15 ans sont régulièrement exposés dans le cadre des travaux ou de leurs jeux. Lorsqu’il n’y a pas suffisamment de ressources pour satisfaire à cette limite plus stricte, on peut compléter la limite de < 10⁵ coliformes fécaux/100 ml par d’autres mesures de protection sanitaire. La limite pour les œufs de nématodes (< 1 œuf/l) convient si aucun enfant n’est exposé, mais une limite révisée de < 0,1 œuf/l est recommandée si des enfants sont en contact avec les eaux résiduaires à l’occasion des travaux d’irrigation ou dans le cadre de leurs jeux.

Les risques courus par les populations dépendent de la méthode d’irrigation employée. Les risques sanitaires présentés par les cultures irriguées sont maximaux lorsque l’on utilise l’irrigation par aspersion, et le risque pour les agriculteurs est le plus élevé lorsqu’on utilise l’irrigation par gravité ou par rigoles d’infiltration. Les directives proposées ici tiennent compte de ces risques.

Les données analysées n’ont pas indiqué la nécessité d’élaborer une directive précise distincte pour protéger contre les infections virales, pas plus qu’elles n’ont indiqué la nécessité d’une directive particulière pour les protozoaires parasites.


RESUMEN

Se examinan tres enfoques distintos para elaborar directrices acerca de la calidad microbiológica de las aguas residuales tratadas en agricultura y las normas para su reutilización. Dichos enfoques apuntan a distintos resultados: la ausencia de microorganismos indicadores de contaminación fecal en las aguas residuales; la ausencia de excesos medibles de casos de enfermedades gastrointestinales en la población expuesta, y un riesgo estimado, generado mediante un modelo, inferior al riesgo definido como aceptable.

El estudio confirma que el primer enfoque (la ausencia de microorganismos indicadores de contaminación fecal) es no sólo un instrumento innecesariamente conservador y caro para proteger la salud del público, sino también un instrumento muy difícilmente viable en la práctica. Combinando el segundo enfoque (estudios epidemiológicos empíricos complementados por estudios microbiológicos sobre la transmisión de patógenos) con el tercero (una evaluación cuantitativa del riesgo basada en un modelo para determinados patógenos) se obtiene una poderosa herramienta de ayuda al desarrollo de normas de regulación. Ese enfoque combinado es también más eficaz en relación con el costo que el primero, y protege adecuadamente la salud pública.

Nuestra evaluación de los datos de investigación basados en ese enfoque combinado muestra que es necesario revisar las directrices de la OMS de 1989. En lo que respecta al riego sin restricción, nada indica que haya que revisar el límite de bacterias coliformes fecales establecido en las directrices, a saber, < 1000 bacterias coliformes fecales/100 ml. Respaldan ese límite los datos aportados por estudios epidemiológicos, microbiológicos y de evaluación del riesgo. No obstante, existen indicios epidemiológicos de que el límite establecido para los huevos de nematodo (< 1 huevo/l) es inadecuado en las condiciones que favorecen la supervivencia de esos huevos (temperaturas medias inferiores y riego de superficie), por lo que debería ser revisado para reducirlo a < 0,1 huevos/l en tales condiciones.

En cuanto al riego restringido, algunos datos apuntan a la necesidad de establecer límites orientativos para la exposición a bacterias coliformes fecales a fin de proteger a los agricultores, a sus hijos y a las poblaciones vecinas de infecciones gastrointestinales víricas y bacterianas. El valor idóneo del límite orientativo dependerá del método de riego empleado y de las personas que resulten expuestas. Por ejemplo, para los agricultores adultos expuestos a través del riego por aspersión, es necesario un límite orientativo de < 10⁵ bacterias coliformes fecales/100 ml. Es igualmente deseable un límite más reducido, < 10³ bacterias coliformes fecales/100 ml, para los agricultores adultos que participan en actividades de riego por inundación o por surcos, así como para las situaciones en que menores de 15 años se ven expuestos regularmente en los trabajos que realizan o en sus juegos. Cuando no se disponga de recursos suficientes para observar ese límite orientativo más estricto, el límite de < 10⁵ bacterias coliformes fecales/100 ml deberá complementarse con otras medidas de protección de la salud. El límite orientativo para los huevos de nematodo (< 1 huevo/l) es suficiente si no hay niños expuestos, pero se recomienda un límite revisado de < 0,1 huevos/l cuando hay niños expuestos a las aguas residuales a causa del riego o de sus actividades lúdicas.

Los riesgos para las poblaciones dependen del método de riego empleado. Los riesgos para la salud a partir de los cultivos de regadío son máximos en el caso del riego por aspersión, mientras que en el caso de los trabajadores el mayor riesgo es el asociado al riego por inundación o por surcos. Las directrices propuestas tienen en cuenta esos riesgos.

Los datos examinados no indican que sea necesario elaborar por separado una directriz específica para proteger contra las infecciones víricas, y no hay tampoco indicios suficientes para justificar la elaboración de una directriz específica referente a los protozoos parásitos.


 

 

Introduction

During the past decade, there has been growing concern that the world is moving towards a water crisis. Water is increasingly scarce in dry climate regions (for example, Africa and South Asia), and water scarcity has major political implications in some regions (for example, the Middle East). Issues of both water quantity and water quality are of concern. The reuse of wastewater is one of the main options being considered as a new source of water in regions where water is scarce. The standards required for the safe use of wastewater, and the amount and type of wastewater treatment needed, are contentious. The cost of treating wastewater to conform to high microbiological standards can be so prohibitive that in many developing countries the use of untreated wastewater is effectively unregulated.

Here we discuss the different approaches that have been used to establish or evaluate guidelines on the microbiological quality of treated wastewater that is used to irrigate crops. We also review the evidence from epidemiological, microbiological and risk assessment studies published since the 1989 WHO guidelines (Table 1) and make recommendations for the revision of these guidelines taking this new evidence into account (1).

Approaches to setting microbiological guidelines

There are currently several alternative approaches to establishing microbiological guidelines for reusing wastewater. These have different outcomes as their objectives: the absence of faecal indicator bacteria in the wastewater, the absence of excess cases of enteric disease in the exposed population, and a model-generated risk that is below a defined acceptable risk.

The absence of faecal indicator bacteria in the wastewater

With this approach, there should be no detectable indicators of faecal pollution in the wastewater. This is based on the premise that it is impractical to monitor reclaimed water for all pathogenic microorganisms of concern and that the use of surrogate parameters, such as faecal indicator organisms, is acceptable. "Faecal coliforms" are the indicator bacteria most commonly used in discussions of wastewater reuse; they are broadly equivalent to "thermotolerant coliforms". The preferred grouping would be "thermotolerant coliforms/Escherichia coli", which would eventually allow E. coli to be used as the preferred, and exclusively faecal, indicator bacterium (2).

Guidelines in use in the United States

Total coliform and faecal coliform counts are often used in conjunction with specified requirements for treating wastewater, and in such cases it is assumed that the need for expensive and time-consuming monitoring of treated water for pathogenic microorganisms is eliminated. In practice, however, this approach has led to guidelines that require zero faecal coliform bacteria/100 ml for water used to irrigate crops that are eaten raw, in addition to a requirement for secondary treatment, filtration and disinfection. The United States Environmental Protection Agency (USEPA) and the US Agency for International Development have taken this approach, and consequently have recommended strict guidelines for wastewater use (3). For unrestricted irrigation (that is, for uses that include crops likely to be eaten uncooked), no detectable faecal coliform bacteria are allowed in 100 ml (compared with the 1989 WHO guideline limit of < 1000 faecal coliform bacteria/100 ml), and for irrigation of commercially processed and fodder crops the guideline limit is < 200 faecal coliform bacteria/100 ml (whereas WHO sets only a guideline limit on nematode eggs for such crops). In the USA, the setting of actual standards is the responsibility of individual states; different states take different approaches (some specify treatment processes, others specify water quality standards) and a range of standards is in use (4). For unrestricted irrigation of food crops these range from 10 to 1000 faecal coliform bacteria/100 ml for surface irrigation and from 2.2 to 200 faecal coliform bacteria/100 ml for spray irrigation. Most regulatory agencies in the United States have chosen not to use epidemiological studies as the basis for determining water quality standards (5). California has some of the strictest standards, requiring < 2.2 total coliform bacteria/100 ml for irrigation of food crops (to be achieved through secondary treatment followed by filtration and disinfection) and < 23 total coliform bacteria/100 ml for irrigation of pasture and landscaped areas (through secondary treatment and disinfection) (6). Standards in several countries (for example, Israel and Oman) have been influenced by American standards, especially the California standards.

Limitations

The main criticism of this approach is that it may be unnecessarily strict and could result in high costs per case of infectious disease averted. In a preliminary analysis, Shuval et al. estimated that the cost per case of hepatitis A avoided by irrigating with water containing zero faecal coliform bacteria/100 ml, rather than 1000 faecal coliform bacteria/100 ml, was of the order of US$ 3–30 million (7). This expense might be justifiable in industrialized countries with low levels of endemic enteric disease, but it cannot be justified in countries with higher levels of endemic infection, where enteric disease is more often transmitted through poor hygiene and sanitation than through wastewater reuse and where resources for preventive health care are limited.

No measurable excess cases in the exposed population: epidemiological perspective

The objective of this approach is that there should be no actual risk of infection — that is, there should be no measurable excess risk of infection attributable to the reuse of wastewater as evaluated using scientific evidence, especially from epidemiological studies. This was the approach adopted in the 1989 WHO guidelines, for which epidemiological evidence was used (when available); epidemiological evidence was supported by information from microbiological studies (Table 1).

The advantage of epidemiological studies is their ability to assess the risk of infection by observing the exposure and infection that actually occur in human populations that reuse wastewater. Studies of the effects of exposure to wastewater of differing quality (either occupationally or through the consumption of crops) can be used to assess the level at which no excess infection occurs in the exposed population. However, results from any given study are generally specific to the time and place of that study. Extrapolation of the results to other times and other locations — as is necessary when they are used for regulation — depends on making assumptions about changes to variables, such as contact with wastewater, which might affect the outcome. It is preferable to carry out studies of similar exposures in a number of locations representing different conditions. In setting national standards, allowances can be made for local epidemiological, sociocultural and environmental factors, and the guidelines can be modified accordingly, especially where local epidemiological studies have been carried out. The limitations of some past epidemiological studies can be overcome by using larger sample sizes, by controlling for other risk factors for the infection that could act as confounding factors, and by using appropriate comparison groups.

The WHO guidelines have been controversial, particularly the relaxation of the guideline for unrestricted irrigation to a geometric mean of < 1000 faecal coliform bacteria/100 ml. The use of epidemiological studies in developing countries has been criticized because some populations have acquired immunity to many enteric infections; the adequacy of the studies has also been criticized, as has the lack of a health risk assessment methodology (8). Concern has been expressed about the lack of sensitivity of epidemiological methods to detect transmission of disease that does not lead to illness in exposed individuals but may lead to secondary transmission that causes illness in susceptible individuals (9). The ability of the guideline limit on nematode eggs to protect adequately against protozoan parasites, and of the guideline limit on faecal coliforms to protect adequately against viruses, has also been questioned, since neither protozoan (oo)cysts nor viruses are easily removed by conventional treatment processes or disinfection.

Many countries have welcomed the guidance from WHO; standards in several countries have been based on the 1989 guidelines, including those in several countries in the European Union (10). France used a similar approach in setting guidelines, which were published in 1991. These are similar to those of WHO in defining analogous water categories (called A, B and C in the WHO guidelines; Table 1) and microbiological limits, but complement them with strict rules of application (11). For example, for category A in the French guidelines, the quality requirement must be complemented by the use of irrigation techniques that avoid wetting fruit and vegetables, and for irrigation of golf courses and open landscaped areas, spray irrigation must be performed outside public opening hours.

Some countries have modified the microbiological criteria to suit local epidemiological and economic circumstances: Mexico, for example, introduced a standard of < 5 nematode eggs/l, and for unrestricted irrigation introduced a daily mean of < 2000 faecal coliform bacteria/100 ml and a monthly mean of < 1000 faecal coliform bacteria/100 ml (12, 13). These were designed to be sufficient to protect at-risk groups and to be achievable with the technology and resources available (14).

A model-generated risk that is below a defined acceptable risk

In this approach an acceptable risk of infection is first defined — for example, for the microbial contamination of drinking-water supplies, the USEPA has set an acceptable annual risk of infection of 10⁻⁴ per person (15). Once the acceptable annual risk has been established by the regulator, a quantitative microbial risk assessment (QMRA) model is used to generate an estimated annual risk of infection; this is based on an assessment of exposure (including data on the concentrations of microorganisms in wastewater, the quantity of treated wastewater remaining on crops after irrigation, the ratio of pathogens to indicator organisms and the percentage of pathogen die-off between the time the food crop is harvested and the time it is consumed) and on dose–response data (that is, data from human infection trials on pathogen dose and resulting infection, if any). A microbiological quality guideline limit would then be set so that the model produces an estimated annual risk below the regulator's acceptable annual risk. This risk assessment approach is especially powerful when the acceptable risk is below the level that can be measured in most epidemiological studies unless extremely large populations are studied. It is also useful in estimating the risks from a particular level of exposure to a pathogen before a problem has occurred — for example, before it leads to an outbreak of infection.
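To make this procedure concrete, here is a minimal sketch in Python of the QMRA chain described above: an exposure assessment feeding a β-Poisson dose–response model, with the per-exposure risk converted into an annual risk by assuming independent exposure events. Every numerical value is an illustrative assumption of ours, not a figure from the cited studies.

```python
# Minimal QMRA sketch: exposure -> per-event infection risk -> annual risk.
# All numerical values are illustrative assumptions, not data from the
# studies cited in the text.

def beta_poisson_risk(dose, alpha, n50):
    """Per-exposure infection probability under the beta-Poisson model,
    parameterized by the median infectious dose n50."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

def annual_risk(p_single, events_per_year):
    """Annual infection risk, assuming independent exposure events."""
    return 1.0 - (1.0 - p_single) ** events_per_year

# Hypothetical exposure assessment for crop consumption:
viruses_per_litre = 10.0     # pathogen concentration in the effluent
volume_litres = 0.01         # 10 ml of wastewater remaining on the crop
survival_fraction = 1e-4     # 99.99% die-off before consumption
dose = viruses_per_litre * volume_litres * survival_fraction

# Dose-response constants of the order reported for rotavirus (17);
# treat them as placeholders rather than recommended values.
p_single = beta_poisson_risk(dose, alpha=0.253, n50=6.17)
print(f"annual risk: {annual_risk(p_single, events_per_year=150):.1e}")
```

A regulator working in this framework would then choose the microbiological quality limit so that the computed annual risk stays below the acceptable value (for example, 10⁻⁴).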

Development of current QMRA techniques

QMRA techniques were used by Asano et al. to assess the annual risks of viral infection resulting from wastewater reuse and to evaluate the California wastewater reclamation criteria (16). Asano et al. used the β-Poisson dose–response model, an assumed constant quantity (10 ml) of treated wastewater remaining on crops after irrigation and an assumed constant value of 99.99% for viral die-off between final irrigation and crop harvest; they found that the annual risk of viral infection resulting from spray irrigation of food crops never exceeded 10⁻⁴, even in the worst-case scenario (irrigation with chlorinated tertiary treated wastewater containing 111 viral particles/100 l) (17). A more sophisticated QMRA procedure was adopted by Tanaka et al. to determine the expected annual risk of enteroviral infection resulting from food crop irrigation with four unchlorinated secondary effluents containing variable concentrations of viruses (18). Using cumulative distribution functions of virus concentrations and Monte Carlo simulations (500 trials), these authors found that the expected annual risk ranged from 10⁻³ to 10⁻⁵. Chlorination reduced these values to between 10⁻⁷ and 10⁻⁹.
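The refinement introduced by Tanaka et al. can be sketched in the same way: rather than a single fixed concentration, virus levels are drawn from a fitted distribution and the resulting spread of annual risks is summarized. In the hedged sketch below, the lognormal parameters, ingestion volume and die-off are invented for illustration; a real study would fit them to monitoring data.

```python
# Sketch of a Monte Carlo QMRA in the spirit of Tanaka et al. (18).
import math
import random

def beta_poisson_risk(dose, alpha=0.253, n50=6.17):
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

random.seed(1)                      # reproducible illustration
trials, events_per_year = 500, 150
annual_risks = []
for _ in range(trials):
    conc = random.lognormvariate(math.log(0.1), 1.5)  # viruses/l (assumed distribution)
    dose = conc * 0.01 * 1e-2       # 10 ml ingested, 99% die-off (assumed)
    p1 = beta_poisson_risk(dose)
    annual_risks.append(1.0 - (1.0 - p1) ** events_per_year)

annual_risks.sort()
print(f"expected annual risk {sum(annual_risks) / trials:.1e}; "
      f"95th percentile {annual_risks[int(0.95 * trials)]:.1e}")
```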

These techniques were also applied by Shuval et al. to estimate the risks associated with the consumption of lettuce irrigated with raw and treated wastewaters containing 10⁷ and 10³ faecal coliform bacteria/100 ml, respectively, in order to evaluate the WHO wastewater guidelines (7). Using the β-Poisson dose–response model, but assuming a constant (measured) quantity (11 ml) of treated wastewater remaining on the lettuce surfaces after irrigation, a constant virus:faecal coliform occurrence ratio of 10⁻⁵ and a constant pathogen die-off of 10⁻³ between crop harvest and consumption, these authors found annual risks of hepatitis and rotaviral diarrhoea of 10⁻⁶–10⁻⁷ and 10⁻⁵–10⁻⁶, respectively, when the wastewater was treated to 1000 faecal coliform bacteria/100 ml. Although this study can be criticized for using constant values (rather than values based on parameter probability density functions and multitrial Monte Carlo simulations), it serves well in producing order-of-magnitude estimates of risk. However, it was an assessment of the risks associated with a given microbiological quality and an evaluation of an existing guideline limit, rather than (as required here) the derivation of a microbiological guideline limit from an accepted level of risk.
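The order-of-magnitude arithmetic behind these estimates can be reproduced in a few lines. The first four constants below follow the assumptions just described; the number of servings per year is an illustrative assumption of ours.

```python
# Back-of-envelope check of the reasoning in Shuval et al. (7).
fc_per_ml = 1000 / 100         # effluent at the WHO limit of 1000 FC/100 ml
residual_ml = 11               # measured wastewater remaining on lettuce
virus_per_fc = 1e-5            # assumed virus:faecal coliform ratio
survival = 1e-3                # fraction surviving between harvest and consumption
servings_per_year = 100        # our own illustrative assumption

dose_per_serving = fc_per_ml * residual_ml * virus_per_fc * survival
annual_dose = dose_per_serving * servings_per_year
print(f"~{dose_per_serving:.0e} virus per serving, ~{annual_dose:.0e} per year")
# ~1e-06 virus per serving, ~1e-04 per year: at such low doses the infection
# probability is roughly proportional to dose, which is consistent with the
# reported annual risks of the order 1e-5 to 1e-7.
```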

Disadvantages of current procedures

Current QMRA procedures have a number of disadvantages that should be addressed by future studies. These include the use of oral challenge data obtained only from healthy adult volunteers and not from more vulnerable groups, such as children; such data are commonly based on extremely small numbers of volunteers. Furthermore, the median infectious dose depends greatly on whether the participants challenged have been previously exposed to the pathogen. The extrapolation of the results of high-dose oral challenge studies to environmental exposure at low doses is also problematic. The absence of sufficiently large data sets on pathogen monitoring limits the use of the methods. Furthermore, the definition of acceptable risk itself poses problems: what is acceptable is likely to vary according to the level of endemic infection, the importance of other transmission routes, economic circumstances and the regulator's perception of society's expectations. Haas argues for the adoption of an annual risk of 10⁻³ (19), but even this may be too conservative, given that in many countries actual rates of infection are considerably higher — for example, in England the annual rate for all infectious intestinal disease is 0.2 per person (20).

More sophisticated and more realistic risk assessment techniques should be applied to the reuse of wastewater; this is beginning to occur in other sectors — for example, for drinking-water (21, 22) and food (23). These assessments should include models that take into account epidemiological variables, such as transmission and immune status, and which characterize risk at the population level rather than at the individual level (24, 25). Such applications, together with a system of quality assurance for the risks assessed (26), would lead to greatly increased confidence in the assessments of the risks and to the guidelines for reuse that are based on them.

 

Proposed revisions to the 1989 WHO guidelines

In our assessment of the implications for international guidelines of the evidence on health risks from wastewater use, we combine the second approach (in which risk is assessed using epidemiological studies supplemented by microbiological studies on the transmission of pathogens) with the third (which uses a model-based QMRA for selected pathogens). We use evidence from studies since 1989 (including evidence from studies that had not been published at the time of the WHO Scientific Group meeting in 1987) to evaluate the 1989 guidelines, and propose alternative guidelines in cases in which the evidence supports a change (Table 2). We use empirical epidemiological evidence when it is available; these studies measure real exposures that occur over time and do not depend on estimates of mean daily microbial doses or on dose–response analyses based on experiments with healthy volunteers from which data are extrapolated to estimate the effects of low doses. Epidemiological studies are particularly useful in areas where enteric diseases are highly endemic and where the risk of infection is high enough to be easily measurable with current techniques. In cases in which the epidemiological evidence is incomplete we have used evidence from microbiological studies. QMRA studies are particularly useful in areas where enteric disease is not highly endemic, where risks of infection are low, and where regular monitoring of pathogens in wastewater occurs and produces good data sets for use in exposure assessment. The evidence is strongest in cases in which both approaches lead to the same conclusions. If different results are obtained, further analysis of the studies should help identify weaknesses and aspects of the methodology that need improvement. This is a rational approach, which is likely to be cost-effective in most settings.

Unrestricted irrigation

Faecal coliform guideline limit < 1000 per 100 ml. Epidemiological studies were performed in a rural area in central Mexico where river water containing partially treated wastewater was used to irrigate vegetables that were eaten by the local population (27). Risks of bacterial and viral infections associated with the consumption of specific vegetables (cabbages, carrots, green tomatoes, red tomatoes, onions, chillies, lettuce, radishes, cucumbers and coriander) and with the total consumption of raw vegetables irrigated with partially treated wastewater (average quality 10⁴ faecal coliform bacteria/100 ml) were investigated. The sample size was sufficient to detect a 15% increase in serological response between exposure categories and a 3% difference in the prevalence of diarrhoea between exposure categories among those aged over 5 years (a rough illustration of such a sample-size calculation is sketched below). There was no excess diarrhoeal disease (as measured in a cross-sectional study) among vegetable consumers of all ages related to their total consumption of raw vegetables (that is, the number of raw vegetables eaten each week). There was also no excess infection with human Norwalk-like virus/Mexico (Hu/NLV/MX) or enterotoxigenic Escherichia coli (as measured by serological response over one year) associated with total consumption of raw vegetables. However, consumption of onions, eaten by the majority of the study population, was associated with at least a twofold increase in diarrhoeal disease (3.5% in adults). Enteroviruses were found on onions at harvest, supporting this epidemiological evidence. Consumption of green tomatoes was associated with a twofold increase (16%) in serological response to Hu/NLV/MX in schoolchildren. These effects were observed after controlling for other risk factors. The results suggest that the risk of enteric infection is significant but low when the guideline limit is exceeded by a factor of 10.
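As a rough illustration of what a "sufficient" sample size means here, the sketch below computes the number of subjects per exposure group needed to detect a given difference between two proportions using the standard normal approximation. The significance level, power and baseline prevalence are our own assumptions, not the study's actual design values.

```python
# Approximate sample size per group for comparing two proportions.
# alpha, power and the prevalences are assumed for illustration only.
import math

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.84):
    """z_alpha: two-sided 5% significance; z_beta: 80% power."""
    pbar = (p1 + p2) / 2
    term = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
            + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
    return math.ceil(term ** 2 / (p1 - p2) ** 2)

# e.g. detecting a 3-percentage-point rise in diarrhoea prevalence (5% -> 8%)
print(n_per_group(0.05, 0.08))    # about 1060 subjects per exposure group
```

Detecting small absolute differences in prevalence therefore requires cohorts of the order of a thousand subjects per group, which is why sample size is emphasized above.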

Validity of the WHO faecal coliform guideline limit in hot climates. Microbiological studies in Portugal have shown that where crops were irrigated with water just exceeding the guideline limit of 1000 faecal coliform bacteria/100 ml, the crop still fell within the quality recommendations of the International Commission on Microbiological Specifications for Foods (ICMSF) (< 10⁵ faecal coliform bacteria/100 g fresh weight for vegetables eaten uncooked) (28), suggesting that the WHO guideline limit is appropriate in hot climates (29).

In studies of drip and furrow irrigation of radish and lettuce with effluent from a series of waste stabilization ponds (used for wastewater treatment) with a geometric mean count of 1700–5000 faecal coliform bacteria/100 ml (slightly higher than the WHO recommended limit of 1000/100 ml), crop contamination levels varied considerably. In dry weather they were of the order of 10³ and 10⁴ E. coli/100 g for radish and lettuce, respectively, but salmonellae were always absent. The quality was better than that of locally sold lettuce (which, based on 172 samples, had a geometric mean count of 1×10⁶/100 g) and fell within the ICMSF guidelines. However, when it rained, E. coli numbers increased and salmonellae were isolated from lettuce (30), suggesting that a stricter guideline may be necessary in countries where significant rainfall occurs during the growing season.

Risk assessment studies in Israel. Risk assessment studies in Israel (7) used the drinking-water model of Haas et al. to assess infection risk (17). This was combined with laboratory data on the degree of viral contamination of lettuce and cucumber irrigated with wastewater of differing quality. The annual risk of infection with hepatitis A from eating lettuce that had been irrigated with untreated wastewater was estimated at 10⁻³, but when the lettuce was irrigated with treated wastewater meeting the WHO guideline limit of 1000 faecal coliform bacteria/100 ml the estimated risk was in the range 10⁻⁵–10⁻⁷; for rotavirus infection the predicted risk ranged from 10⁻⁵ to 10⁻⁶, and for cholera the risk was 10⁻⁶. The results of these studies are consistent with those obtained by Asano et al. (16), who estimated the risk of infection with three enteric viruses (poliovirus 1 and 3, and echovirus 12) associated with the use of chlorinated tertiary effluents to irrigate horticultural produce. The annual risk of infection associated with consuming irrigated market-garden produce was estimated to be between 10⁻⁶ and 10⁻¹¹ when the effluent contained 1 viral unit/100 l and between 10⁻⁴ and 10⁻⁹ when wastewater with a maximum concentration of 111 viral units/100 l was used.

Data from waste stabilization ponds in north-east Brazil suggest that rotavirus numbers are likely to be <30/100 l when the faecal coliform content is below 10⁴/100 ml (31); however, other enteric viruses, such as adenoviruses, may significantly outnumber rotaviruses and enteroviruses (32). Extrapolation from these data therefore indicates that using wastewater that meets the WHO guideline limit of 1000 faecal coliform bacteria/100 ml is likely to produce an annual risk of viral infection of <10⁻⁴. Even when unchlorinated secondary effluents were investigated, risk assessments using data from wastewater treatment plants in California showed that for food crop irrigation the estimated annual risk of enteroviral infection was 10⁻³–10⁻⁵ (18). The American microbial standards for drinking-water are based on the assumption that humans should not be subjected to a risk of infection by enteric disease that is >10⁻⁴; the WHO guidelines appear to offer a similar level of protection.

Taken together, the results of these studies of the risks to consumers of irrigated crops do not provide any evidence of a need to change the WHO guideline limit of 1000 faecal coliform bacteria/100 ml for irrigation of vegetable and salad crops eaten uncooked (category A1, Table 2).

Nematode egg guideline limit < 1 egg/l. This guideline limit seems to be adequate to protect those who consume cultivated vegetables that are spray-irrigated with effluent of consistent quality at high temperatures (>35°C), but it does not necessarily protect those who eat vegetables that are surface-irrigated with such effluent at lower temperatures (mean temperature 15°C). Experimental studies in north-east Brazil and Leeds, England, investigated the risk of nematode infection (Ascaris lumbricoides and Ascaridia galli, respectively) from lettuce irrigated with treated wastewater (33, 34). In Brazil, the wastewater was treated in a series of waste stabilization ponds comprising anaerobic, facultative and maturation ponds. When effluent from the facultative pond (<0.5 egg/l) was used for spray irrigation, no eggs were detected on crop surfaces. Lettuce irrigated with maturation pond effluent (0 eggs/l) was also not contaminated, despite being grown during wet weather in heavily contaminated soil (>1200 Ascaris eggs/100 g). In the trials in England, spray irrigation of lettuce with water containing 10 eggs/l resulted in a maximum of 1.5 eggs/plant, and when wastewater with 1 egg/l was used for irrigation, only very slight contamination was found (0.3 egg/plant). Thus, irrigation with wastewater that meets the WHO quality guideline limit resulted in no contamination of lettuce at harvest, or very slight contamination of a few plants (6%) with eggs that were either degenerate or not infective. However, a few nematode eggs on harvested plants were viable but not embryonated (20% of A. lumbricoides eggs in crops irrigated with water containing >100 eggs/l; <0.1 A. galli egg/plant in crops irrigated with 1–10 eggs/l). Crops with a long shelf-life might represent a potential risk to consumers if the eggs had time to become infective.

Epidemiological studies of risk factors for Ascaris infection. Epidemiological studies in central Mexico of wastewater-related risk factors for Ascaris infection showed that there was an increase in infection among men who ate crops that had been surface-irrigated with raw wastewater compared with men who did not eat such crops; there was no increased risk when crops were irrigated with sedimented wastewater (from a reservoir) containing < 1 nematode egg/l. However, children younger than 15 years who ate crops from local fields irrigated with either raw wastewater or sedimented wastewater had a twofold increase in Ascaris infection compared with those who did not eat such crops (35). The increased risk in these circumstances may have been influenced by the irrigation method (surface rather than spray) and the lower mean temperature (caused by high altitude and semi-desert conditions).

It would be sensible, therefore, to adopt a stricter guideline limit of 0.1 egg/l to prevent transmission of Ascaris infection in circumstances where conditions favour the survival of helminth eggs (at lower temperatures and when surface irrigation is used); this stricter guideline limit would also address the risks to farmworkers who cultivate the vegetables (see below). In situations in which crops with a short shelf-life are grown in hot and dry conditions, and where workers are adequately protected from direct contact with wastewater or soil, the original guideline limit of 1 nematode egg/l seems adequate. However, using the revised guideline limit may be prudent even in these circumstances, adding a greater margin of safety.

Restricted irrigation

Faecal coliform guideline limits. The WHO guidelines did not include a limit for faecal coliform bacteria for restricted irrigation because there was a lack of evidence of a risk of bacterial and viral infections to farmworkers and nearby residents. However, recent evidence indicates that a guideline limit should now be added. Data from prospective epidemiological studies in Israel (36) and the USA (37) of situations in which spray or sprinkler irrigation was used suggest that a limit of < 10⁵ faecal coliform bacteria/100 ml would protect both farmworkers and the nearby population from infection transmitted through direct contact with, or aerosols from, wastewater (category B1, Table 2).

Shuval et al. (36) showed that episodes of enteric disease were similar in Israeli kibbutzim (communal farming settlements) most exposed to effluent from waste stabilization ponds as aerosols from sprinkler irrigation (10⁴–10⁵ faecal coliform bacteria/100 ml) and in those not exposed to wastewater effluents. This was the case both for workers who had contact with wastewater and their families and for the general population living near the fields.

In Lubbock, Texas, USA, a rural community was exposed to sprinkler application of partially treated wastewater that came from a much larger urban community (37). In the first year, mainly primary effluent and trickling filter effluent were used to irrigate cereals and industrial crops (quality 10⁶ faecal coliform bacteria/100 ml and 100–1000 plaque-forming units (pfu) of enteroviruses/l), and in the second year the effluent was stored in reservoirs before use (quality 10³–10⁴ faecal coliform bacteria/100 ml and <10 pfu/l). There was no clear association between self-reported episodes of clinical illness and exposure to wastewater. However, in the data on seroconversion to viral infections, a high degree of aerosol exposure was related to a slightly higher rate of viral infections (risk ratio 1.5–1.8); this effect was strongest in the first year (quality 10⁶ faecal coliform bacteria/100 ml), before the reservoirs had come into use. However, when allowance was made for alternative risk factors, eating at local restaurants was identified as an alternative explanation for viral infection.

Analysis of clinical data on viral infection (from faecal specimens) also showed that high aerosol exposure was associated with new viral infections in the summer of the first year of irrigation, but the effect was of borderline significance (P = 0.06) (38). In a specific study of rotavirus infection, wastewater spray irrigation had no detectable effect on the incidence of infection (39). Taken together, these results suggest that aerosol exposure to wastewater of a quality of 10³–10⁴ faecal coliform bacteria/100 ml does not result in excess infection with enteric viruses. There is some evidence that exposure to wastewater of a quality of 10⁶ faecal coliform bacteria/100 ml results in excess viral infection (but not disease), but this is not conclusive, since eating at local restaurants was an alternative explanation in this case.

However, data from Mexico, from an area where flood and furrow irrigation are used, suggest that where school-aged rural children and adults come into direct contact, during irrigation or play, with partially treated wastewater originating from an urban area, there may still be a risk of diarrhoeal disease at a quality of 10³–10⁴ faecal coliform bacteria/100 ml. Early studies indicated that there was an increased risk of diarrhoeal disease among those over 5 years of age (particularly children aged 5–14 years) in contact with partially treated wastewater retained in one reservoir and containing 10⁵ faecal coliform bacteria/100 ml, compared with a control group who practised rain-fed farming (40, 41).

Later studies found a significant excess of diarrhoeal disease in children aged 5–14 years, and a fourfold increase in serological response to human Norwalk-like virus/Mexico in adults, among those who had had a high level of contact with the effluent from two sequential storage reservoirs (containing partially treated wastewater with 10³–10⁴ faecal coliform bacteria/100 ml) compared with those who had had no contact with this effluent (27). There was also an excess of diarrhoeal disease in adults (odds ratio = 1.5), but this did not reach significance (P = 0.12), probably because of the sample size. A stricter guideline limit of < 10³ faecal coliform bacteria/100 ml would be safer when adult farmworkers are engaged in flood or furrow irrigation (category B2, Table 2) and when children are regularly exposed (category B3, Table 2). This would also help to reduce the risks from epidemic infections, which could be transmitted from an outbreak in the source community to communities that use the effluent for irrigation (42).

In cases in which there are insufficient resources to provide treatment to reach this stricter standard, a guideline limit of 105 faecal coliform bacteria/100 ml should be supplemented by other health protection measures (for example, health education about avoiding direct contact with wastewater and the importance of hand washing with soap after contact with wastewater).

Nematode egg guideline limits. In the studies performed in Mexico, the guideline limit for nematode eggs of < 1 egg/l seemed insufficient to protect farmworkers and their families, especially children under 15 years of age. This is particularly true where wastewater treatment systems produce an effluent of variable quality, where the partially treated wastewater may be contaminated with small quantities of raw wastewater and where children of farmworkers come into direct contact with the effluent. Children who came into contact with effluent from a storage reservoir that met the WHO criteria, even though it was contaminated with small quantities of raw wastewater, had an increased prevalence and intensity of Ascaris infection (43, 44). Contact with wastewater that had been retained in a reservoir before use (< 1 nematode egg/l) resulted in excess Ascaris infection in children but not in adults, in whom the prevalence was similar to that in the control group (35, 43). When wastewater had been retained in two reservoirs in series before use and no nematode eggs had been detected, direct contact resulted in little excess Ascaris infection in any age group (45). Retention of wastewater in two reservoirs in series, producing water of an average quality of 10³ faecal coliform bacteria/100 ml and no detectable nematode eggs, is therefore adequate to protect the children of farmworkers from Ascaris infection.

Similar situations can arise when raw wastewater is allowed to bypass conventional treatment plants, especially during periods of overflow after storms, which allows untreated wastewater containing nematode eggs (in areas where nematode infections are endemic) into the effluent that is reused for agriculture. Because this often occurs, a stricter guideline limit of 0.1 egg/l is required for use in restricted irrigation where children are exposed to the irrigation water or the soil (category B3, Table 2).

Guidance on the type and extent of wastewater treatment needed to meet these microbiological guidelines and on other health protection measures (such as the method of applying wastewater, instructing fieldworkers to wear footwear and to use good hygiene practices when handling and preparing crops) is given in the WHO guidelines and considered further in other studies (46, 47).

Risks from viruses and protozoa: are specific guidelines necessary?

The faecal coliform standard in most guidelines for wastewater reuse is intended to address the risks of enteric infections caused by both bacterial and viral pathogens, yet it may not provide adequate protection against viral infections because conventional treatment processes that use disinfection are much less efficient in removing viruses than in removing indicator bacteria; this is becoming even more apparent as improved molecular techniques for viral detection become available (48). Additionally, the median infectious doses for enteric viruses are very low (<50 infectious particles) in comparison with those for most enteric bacteria (17, 49). Furthermore, wastewater virology is a rapidly expanding area of research, and the range of faecal viruses that are routinely considered is being extended to include, for example, adenoviruses and astroviruses (50), which may survive longer in treated wastewaters than enteroviruses.

Few data are available on the risks of viral infection from either direct contact with wastewater or the consumption of crops. Nevertheless, the findings described below have implications for the evaluation of current guidelines with respect to viral risks.

Risk assessment. The use of risk assessment approaches has shown that when the concentration of viruses (poliovirus 3, echovirus 12 and poliovirus 1) in chlorinated tertiary effluent reaches a maximum of 111 pfu/100 l, the estimated annual risk of enteroviral infection from spray irrigation of food crops is in the range 10⁻⁴–10⁻⁷ (16). The use of chlorinated secondary effluents (3.9 log virus removal from untreated wastewater) to irrigate food crops results in an estimated annual risk of enteroviral infection of 10⁻⁷–10⁻⁹, and even the use of unchlorinated secondary effluents resulted in an estimated annual risk of enteroviral infection of 10⁻³–10⁻⁵ (18). The use of effluent containing 1000 faecal coliform bacteria/100 ml to irrigate salad crops resulted in an order-of-magnitude estimate for the annual risk of viral infection of <10⁻⁴ (7). However, these studies are recognized to have deficiencies (see above) when compared with those using more advanced modelling techniques.

Epidemiological studies. Epidemiological studies have indicated that when effluent containing fewer than 10⁵ faecal coliform bacteria/100 ml was used in spray irrigation, there was no significant risk of enteroviral infection to the surrounding population (36, 37). When there was surface irrigation with effluent containing 10³–10⁴ faecal coliform bacteria/100 ml, there was a significant risk of infection with human Norwalk-like virus/Mexico (Hu/NLV/MX) among farmworkers who had high levels of contact with the wastewater (27); however, when there was surface irrigation with effluent containing 10⁴ faecal coliform bacteria/100 ml there was little risk of infection with this virus associated with eating raw vegetable crops (27).

Taken together, these results suggest that tertiary treatment plus disinfection may not be necessary to protect against viral risks from the consumption of raw vegetables, that the faecal coliform guideline limit of 1000 faecal coliform bacteria/100 ml is adequate, and that no additional viral guideline limit is currently justified.

Adequacy of protection against risks from protozoa by the nematode egg guideline limit. There is increasing concern about the role of wastewater in the environmental transmission of protozoan pathogens such as Giardia, Cryptosporidium and Cyclospora. The WHO guidelines assumed that if the number of helminth eggs was reduced to the guideline limit, then other pathogens, such as protozoan (oo)cysts, would also be reduced to levels that do not cause excess infection. However, studies have shown that the removal of helminth eggs does not correlate with that of protozoan (oo)cysts (51–53). There is evidence that protozoan (oo)cysts are not effectively removed by conventional wastewater treatment processes, with reported removal efficiencies varying from 26% to 100% (54–56). In addition, the infectious dose can be low: human oral challenge studies have shown that the median infectious dose for Giardia is between 10 and 100 cysts and that for Cryptosporidium is between 30 and 1000 oocysts (4).
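To show what such median infectious doses imply, the sketch below applies the one-parameter exponential dose–response model, a standard QMRA simplification (17), with the parameter r fitted so that the median infectious dose (N50) infects half of those exposed. The single-(oo)cyst dose is a hypothetical exposure; the exponential form is used for illustration, not as a fitted model for these organisms.

```python
# Exponential dose-response: P(d) = 1 - exp(-r * d), with r chosen so
# that the median infectious dose N50 gives a 50% infection probability.
import math

def exponential_risk(dose, n50):
    r = math.log(2) / n50         # ensures P(n50) = 0.5
    return 1.0 - math.exp(-r * dose)

# Risk from ingesting a single cyst/oocyst across the reported N50 ranges:
for organism, n50 in [("Giardia", 10), ("Giardia", 100),
                      ("Cryptosporidium", 30), ("Cryptosporidium", 1000)]:
    print(f"{organism} (N50 = {n50}): {exponential_risk(1, n50):.4f}")
```

Even a single ingested (oo)cyst can carry an infection risk of a few per cent at the lower end of these ranges, which underlines the importance of treatment removal efficiency.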

Most of the evidence indicates that water-related outbreaks of enteric protozoan disease are associated with ingestion of contaminated drinking-water, immersion in recreational waters (57–59) and consumption of contaminated foods (60, 61). Few data are available on the importance of wastewater reuse in agriculture — particularly the use of treated wastewater — in the transmission of parasitic protozoan infection, and these other routes of transmission and poor domestic hygiene are probably more important, especially in developing countries. Even though (oo)cysts of both Cryptosporidium parvum and Cyclospora cayetanensis have been detected on vegetables in markets in an endemic area (62), there is no epidemiological evidence to directly implicate the wastewater used for irrigation as a risk factor for either pathogen.

Epidemiological studies in Mexico have shown that there is a small risk of amoebic infection (odds ratio = 1.3) among those who are in contact with untreated wastewater, but not among those in contact with settled wastewater retained in two reservoirs before use, which conforms to the WHO guideline limit on nematode eggs (41). Initial analysis indicated that there was no risk of infection with Giardia intestinalis among agricultural workers and their families who had contact with raw wastewater, but a small risk was associated with contact with wastewater retained in two reservoirs (63). However, when these data were analysed further, controlling for the effect of other transmission routes, the risk related to contact with the reservoir effluent did not remain significant (E. Cifuentes, personal communication). A study in India has also shown that there was no significant risk of Giardia infection in agricultural workers using untreated or treated wastewater when compared with controls (64).

These studies indicate that there is no evidence to suggest that the use of treated wastewater for irrigation that meets the WHO guideline limit for nematode eggs causes an increase in the risk of parasitic protozoan infection, and therefore there is no evidence to support the need to establish a separate guideline limit for protozoa. Nevertheless, in some countries the risk of infection with protozoan parasites may be of greater public health importance than the risk of infection with helminths.

 

Conclusions

This review of the three main approaches for establishing microbiological quality guidelines and standards for the reuse of treated wastewater in agriculture confirms that the first approach (based on the absence of faecal indicator organisms in the wastewater) is an unnecessarily conservative and expensive instrument for public health protection. Use of the second approach (in which risk is assessed using epidemiological studies supplemented by microbiological studies on the transmission of pathogens) in conjunction with the third (which uses a model-based quantitative risk assessment for selected pathogens) produces a powerful tool that aids the development of regulations. It is also more cost-effective than the first approach, yet it adequately protects public health.

Our appraisal of recent research evidence based on this combined approach indicates that there is a need to revise the 1989 WHO guidelines. For unrestricted irrigation, there is no evidence to suggest a need to revise the faecal coliform guideline limit of < 1000 faecal coliform bacteria/100 ml. This guideline limit is supported by data from epidemiological, microbiological and risk assessment studies. However, there is epidemiological evidence that the guideline limit for nematode eggs (< 1 egg/l) is not adequate in conditions that favour the survival of nematode eggs (lower mean temperatures and the use of surface irrigation), and it needs to be revised to < 0.1 egg/l in these conditions.

For restricted irrigation, there is evidence to support the need for a guideline limit for exposure to faecal coliform bacteria to protect farmworkers, their children and nearby populations from enteric viral and bacterial infections. The appropriate guideline limit will depend on which irrigation method is used and who is exposed. For example, if adult farmworkers are exposed to spray or sprinkler irrigation, a guideline limit of < 10⁵ faecal coliform bacteria/100 ml is necessary. A reduced guideline limit of < 10³ faecal coliform bacteria/100 ml is warranted when adult farmworkers are engaged in flood or furrow irrigation and when children under 15 years of age are regularly exposed through work or play. Where there are insufficient resources to meet this stricter guideline limit, a guideline limit of < 10⁵ faecal coliform bacteria/100 ml should be supplemented by other health protection measures. The guideline limit for nematode eggs (< 1 egg/l) is adequate if no children are exposed, but a revised guideline limit of < 0.1 egg/l is recommended if children are in contact with wastewater or soil through irrigation or play.

The evidence reviewed does not support the need for a separate specific guideline limit to protect against viral infections, and there was insufficient evidence to support the need for a specific guideline limit for parasitic protozoa.

The risks to populations depend on the irrigation method used. Health risks from irrigated crops are greatest when spray or sprinkler irrigation is used, and the risk to fieldworkers is greatest when flood or furrow irrigation is used. The proposed guidelines take these risks into account. However, other potential sources of crop contamination should also be considered, such as crop handling, transportation and the sale of produce in unhygienic markets.

 

Acknowledgements

We would like to thank the Department for International Development (DFID), United Kingdom, and Water and Sanitation in Developing Countries (SANDEC), Switzerland, for funding WELL (Water and Environmental Health at London and Loughborough) study no. 68, on which this paper is based.

 

References

1. Health guidelines for the use of wastewater in agriculture and aquaculture. Report of a WHO Scientific Group. Geneva, World Health Organization, 1989 (WHO Technical Report Series, No. 778).        

2. Edberg SC et al. Escherichia coli: the best biological drinking water indicator for public health protection. Journal of Applied Microbiology, 2000, 88: 106S–116S.        

3. US Environmental Protection Agency/US Agency for International Development. Guidelines for water reuse. Washington, DC, Environmental Protection Agency, Office of Wastewater Enforcement and Compliance, 1992 (technical report no. EPA/625/R-92/004).        

4. Cooper RC, Olivieri AW. Infectious disease concerns in wastewater reuse. In: Asano T, ed. Wastewater reclamation and reuse. Lancaster, PA, Technomic Publishing, 1998: 489–520.        

5. Crook J. Water reclamation and reuse criteria. In: Asano T, ed. Wastewater reclamation and reuse. Lancaster, PA, Technomic Publishing, 1998: 489–520.        

6. State of California. Wastewater reclamation criteria. Berkeley, CA, Department of Health Services, 1978. (California administrative code, Title 22, Division 4, Environmental Health.)        

7. Shuval H et al. Development of a risk assessment approach for evaluating wastewater reuse standards for agriculture. Water Science and Technology, 1997, 35(11/12): 15–20.        

8. Shelef G. Wastewater reclamation and water resources management. Water Science and Technology, 1991, 24 (4): 251–265.        

9. Rose J. Microbial aspects of wastewater reuse for irrigation. CRC Critical Reviews in Environmental Control, 1986, 16: 231–256.        

10. Bontoux L. The regulatory status of wastewater reuse in the European Union. In: Asano T, ed. Wastewater reclamation and reuse. Lancaster, PA, Technomic Publishing, 1998: 1463–1476.        

11. Bontoux L, Courtois G. The French wastewater reuse experience. In: Asano T, ed. Wastewater reclamation and reuse. Lancaster, PA, Technomic Publishing, 1998: 489–520.        

12. Norma Oficial Mexicana. Que establece los límites máximos permisibles de contaminantes en las descargas de aguas residuales en aguas y bienes nacionales [Establishing the maximum permissible limits for pollutants in wastewater discharges into national waters and assets]. Diario Oficial de la Federación, 1997: 68–85 (in Spanish).

13. Norma Oficial Mexicana. Que establece los límites máximos permisibles de contaminantes en las descargas de aguas residuales en aguas y bienes nacionales, publicado el 6 de enero de 1997 — Aclaración [Clarification of the standard published on 6 January 1997]. Diario Oficial de la Federación, 1997: 38–41 (in Spanish).

14. Peasey A et al. A review of policy and standards for wastewater reuse in agriculture: a Latin American perspective. London, Water and Environmental Health at London and Loughborough Resource Centre, London School of Hygiene and Tropical Medicine, and Water, Engineering and Development Centre (WEDC), Loughborough University, 1999 (WELL study no. 68, part II).

15. Haas CN et al. Risk assessment of virus in drinking water. Risk Analysis, 1993, 13: 545–552.        

16. Asano T et al. Evaluation of the California wastewater reclamation criteria using enteric virus monitoring data. Water Science and Technology, 1992, 26(7/8): 1513–1524.        

17. Haas CN et al. Quantitative microbial risk assessment. New York, John Wiley & Sons, 1999.

18. Tanaka H et al. Estimating the safety of wastewater reclamation and reuse using enteric virus monitoring data. Water Environment Research, 1998, 70(1): 39–51.

19. Haas CN. Viewpoint: acceptable health risk. Journal of the American Water Works Association, 1996, 88 (12): 8.        

20. Wheeler JG et al. Study of infectious intestinal disease in England: rates in the community, presenting to general practice, and reported to national surveillance. British Medical Journal, 1999, 318: 1046–1050.        

21. Eisenberg JN et al. Quantifying water pathogen risk in an epidemiological framework. Risk Analysis, 1996, 16: 549– 563.        

22. Eisenberg JN et al. An analysis of the Milwaukee Cryptosporidium outbreak based on a dynamic model of disease transmission. Epidemiology, 1998, 9: 255–263.

23. van Gerwen SJC et al. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety. Journal of Applied Microbiology, 2000, 88: 938–951.

24. Eisenberg JN et al. Quantifying water pathogen risk in an epidemiological framework. Risk Analysis, 1996, 16: 549– 563.        

25. ILSI Risk Science Institute Pathogen Risk Assessment Working Group. A conceptual framework to assess the risks of human disease following exposure to pathogens. Risk Analysis, 1996, 16: 841–847.        

26. MacGill SM et al. Towards quality assurance of assessed waterborne risks. Water Research, 2000, 34: 1050–1056.        

27. Blumenthal UJ et al. Consumer risks from enteric infections and heavy metals through agricultural reuse of wastewater, Mexico. London, London School of Hygiene and Tropical Medicine, 1998 (Final Report, DFID research project no. R5468).        

28. International Commission on Microbiological Specifications for Foods. Microorganisms in food 2 — sampling for microbiological analysis: principles and scientific applications. Toronto, University of Toronto Press, 1974.        

29. Vaz da Costa Vargas S et al. Bacteriological aspects of wastewater irrigation. Leeds, University of Leeds, Department of Civil Engineering, 1996 (Tropical Public Health Engineering research monograph no. 8).        

30. Bastos RKX, Mara DD. The bacteriological quality of salad crops drip and furrow irrigated with waste stabilization pond effluent: an evaluation of the WHO guidelines. Water Science and Technology, 1995, 31 (12): 425–430.        

31. Oragui JI et al. The removal of excreted bacteria and viruses in deep waste stabilization ponds in northeast Brazil. Water Science and Technology, 1987, 19 (Rio): 569–573.        

32. Crabtree KD et al. Waterborne adenovirus — a risk assessment. Water Science and Technology, 1997, 35: 1–6.        

33. Ayres RM et al. Contamination of lettuces with nematode eggs by spray irrigation with treated and untreated wastewater. Water Science and Technology, 1992, 26 (7–8): 1615–1623.        

34. Stott R et al. An experimental evaluation of potential risks to human health from parasitic nematodes in wastewaters treated in waste stabilisation ponds and used for crop irrigation. Leeds, University of Leeds, Department of Civil Engineering, 1994 (Tropical Public Health Engineering research monograph no. 6).        

35. Peasey AE. Human exposure to Ascaris infection through wastewater reuse in irrigation and its public health significance [PhD thesis]. London, University of London, 2000.        

36. Shuval HI et al. Transmission of enteric disease associated with wastewater irrigation: a prospective epidemiological study. American Journal of Public Health,1989, 79: 850–852.        

37. Camann DE et al. The Lubbock land treatment system research and demonstration project. Vol 4. Lubbock Infection Surveillance Study (LISS). North Carolina, United States Environmental Protection Agency, 1986 (project summary USEPA/600/S2-86/027d).        

38. Camann DE, Moore BE. Viral infections based on clinical sampling at a spray irrigation site. In: Implementing water reuse. Denver, CO, AWWA Research Foundation, 1988: 847–855.        

39. Ward RL et al. Effect of wastewater spray irrigation on rotavirus infection rates in an exposed population. Water Research, 1989, 23: 1503–1509.        

40. Cifuentes E et al. Problemas de salud asociados al riego agrícola con agua residual en México [The health problems associated with irrigation with wastewater in Mexico]. Salud Pública de México, 1993, 35: 614–619 (in Spanish).

41. Cifuentes E. Impact of wastewater irrigation on intestinal infections in a farming population in Mexico: the Mezquital valley [PhD thesis]. London, University of London, 1995.        

42. Fattal B et al. Viral antibodies in agricultural populations exposed to aerosols from wastewater irrigation during a viral disease outbreak. American Journal of Epidemiology, 1987, 125: 899–906.        

43. Blumenthal UJ et al. Evaluation of the WHO nematode egg guidelines for restricted and unrestricted irrigation. Water Science and Technology, 1996, 33(10/11): 277–283.        

44. Blumenthal UJ et al. The risk of enteric infections associated with wastewater reuse: the effect of season and degree of storage of wastewater. Transactions of the Royal Society of Tropical Medicine and Hygiene, 2000, in press.        

45. Cifuentes E. The epidemiology of enteric infections in agricultural communities exposed to wastewater irrigation: perspectives for risk control. International Journal of Environmental Health Research, 1998, 8: 203–213.

46. Blumenthal UJ et al. Generalised model of the effect of different control measures in reducing health risks from waste reuse. Water Science and Technology, 1989, 21(Brighton): 567–577.        

47. Blumenthal UJ et al. Guidelines for wastewater reuse in agriculture and aquaculture: recommended revisions based on new research evidence. London, WELL Resource Centre, London School of Hygiene and Tropical Medicine, and Water, Engineering and Development Centre (WEDC), Loughborough University, 1999 (WELL study no. 68, part I).        

48. Blackmer F et al. Use of integrated cell culture–PCR to evaluate the effectiveness of poliovirus inactivation by chlorine. Applied and Environmental Microbiology, 2000, 66: 2267–2268.        

49. Schwartzbrod L. Effect of human viruses on public health associated with the use of wastewater and sewage sludge in agriculture and aquaculture. Geneva, World Health Organization, 1995 (WHO/EOS/95.19).

50. Chaperon CD et al. Detection of astroviruses, enteroviruses, and adenovirus types 40 and 41 in surface waters collected and evaluated by the Information Collection Rule and an integrated cell culture–nested PCR procedure. Applied and Environmental Microbiology, 2000, 66: 2520–2525.        

51. Stott R et al. A survey of the microbial quality of wastewaters in Ismailia, Egypt, and the implications for wastewater reuse. Water Science and Technology, 1997, 35: 211–217.        

52. Grimason AM et al. Occurrence and removal of Cryptosporidium spp. oocysts and Giardia spp. cysts in Kenyan waste stabilisation ponds. Water Science and Technology, 1993, 27: 97–104.        

53. Alouini Z. Fate of parasite eggs and cysts in the course of wastewater treatment cycle of the Cherguia station in Tunis. La Houille Blanche, 1998, 53: 60–64.        

54. Bukhari Z et al. Occurrence of Cryptosporidium spp. oocysts and Giardia spp. cysts in sewage influents and effluents from treatment plants in England. Water Science and Technology, 1997, 35: 385–390.        

55. Sykora J et al. Giardia cysts in raw and treated sewage. In: Logdson GS, ed. Controlling waterborne giardiasis. New York, American Society of Civil Engineers, 1988: 22–33.        

56. Robertson LJ. Removal and destruction of intestinal parasitic protozoa by sewage treatment processes. International Journal of Environmental Health Research, 1999, 9: 85–96.        

57. Craun GF. Waterborne giardiasis. In: Meyer EA, ed. Human parasitic diseases. Vol. 3. Giardiasis. Amsterdam, Elsevier Science, 1990: 267–293.        

58. Fricker CR, Crabb JH. Waterborne cryptosporidiosis: detection methods and treatment options. Advances in Parasitology, 1998, 40: 242–278.        

59. Ortega YR et al. Cyclospora cayetanensis. Advances in Parasitology, 1998, 40: 399–418.

60. Smith JL. Cryptosporidium and Giardia as agents of foodborne diseases. Journal of Food Protection, 1993, 56: 451–461.        

61. Rose JB, Slifko TR. Giardia, Cryptosporidium and Cyclospora and their impact on food: a review. Journal of Food Protection, 1999, 62: 1059–1070.        

62. Ortega YR et al. Isolation of Cryptosporidium parvum and Cyclospora cayetanensis from vegetables collected in markets of an endemic region of Peru. American Journal of Tropical Medicine and Hygiene, 1997, 57: 683–686.        

63. Cifuentes E et al. Health impact evaluation of wastewater reuse in Mexico. Public Health Reviews, 1991-92, 19: 243–250.        

64. Sehgal R, Mahajan R. Occupational risk in sewage works in India. Lancet, 1991, 338: 1404–1405.        

 

 

Correspondence

Ursula J. Blumenthal
Department of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine
Keppel Street, London WC1E 7HT, England
E-mail: ursula.blumenthal@lshtm.ac.uk
