Canoeist Pathogenic Illness Guide
4.1 - Outline of Water Treatment Procedures
4.2 - Pathogen Reduction within the Sewage Treatment Process
4.3 - Pathogens within other Waste Water Flows
4.4 - Overall Pathogen / Indicator River Loadings
Forward to Chapter 5 - Legal Regulation
There are a variety of routes by which pathogenic microorganisms can find their way into a watercourse and pose a potential risk to the recreational watersports participant. Some of these flows are well known, such as via sewage works, but many other routes are more obscure. In order to devise a policy to manage or reduce the risk from pathogens, which may involve extensive capital investment if major water treatment is involved, it is important that the relative risk from all of these possibly infectious routes is clearly understood. Figure 1 shows a basic schematic outline of the major flows of pathogenic microorganisms into a typical urban, lowland river such as the Trent or Thames.
The output from domestic facilities such as toilets, sinks, baths and washing machines is routed through sewers to sewage treatment works, and then to a river or to the sea for dilution and disposal (1).
The output from factories and other industrial systems may be connected to the normal sewage system. Such outputs may alternatively be discharged directly into the river, often after some form of "in-house" water treatment (2). In either case the authorities will regulate the quality of the output.
In many areas, especially older housing estates, surface water run-off of rainfall from roofs and paved areas can also be carried into the sewers. This is known as a combined sewerage system. In more modern areas it is common for this water to run through a separate storm water system directly into the river (3). This has the benefit of not overloading the sewage works in a heavy rainstorm, but the disadvantage of putting storm water directly into the river without treatment. It is not unusual for storm water systems to carry some foul water through illegally/incorrectly connected plumbing pipework.
Sewage works will be designed to carry a certain level of flow, depending on the size of the area being served. In a heavy rainstorm the volume of flow through the plant will increase dramatically, and it is not economically viable to build a plant which will provide full treatment to these high storm flows. In such conditions the plant will divert sewage overflow to the river, although in certain cases it will have received some limited treatment before diversion (4).
Finally, waste water and pathogens can flow into the river through a range of other routes which have not been designed or built by man. This could be run-off from fields and car parks, or seagulls feeding on waste tips and then roosting on areas of water. This flow could be directly into the river, via a natural stream, or into the man-made storm water system.
The sewage treatment process is shown in figure 2.
Sewage is given a preliminary treatment to remove grit and larger solid objects such as rags. This is primarily to protect the machinery within the plant from damage. Primary treatment involves settlement in large tanks or lagoons where solid (faecal) matter settles for collection and disposal. The liquid, still containing some suspended solids and many dissolved pollutants, moves on to Secondary treatment. In this stage biological processes are used to break down the complex organic molecules suspended or dissolved in the effluent.
After further settlement to remove additional suspended solids, the liquid is separated for disposal to river. A range of Tertiary treatments may be given before disposal, depending on the quality of the effluent and the needs of the receiving river.
The full treatment is given only when flow through the sewage plant is below its design maximum flow rate. In higher flows, due normally to rainfall in a combined system, full treatment is given only to about three times the normal "dry weather flow" (DWF - a nominal figure derived from population in the catchment, land area, predicted rainfall, etc.). Excess flow is diverted to storm overflows after primary treatment, and discharged directly to the river. For larger rainstorms (typically 6 times the DWF) the flow may be diverted around the entire treatment system.
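The flow-splitting described above can be sketched as a simple calculation. The function name is illustrative, and the 3× and 6× DWF thresholds are the typical figures quoted here; real consents vary from works to works.

```python
def route_flow(flow, dwf):
    """Illustrative flow-splitting at a sewage works.

    flow, dwf: flow rates in the same units (e.g. Ml/day).
    Up to 3x DWF receives full treatment; the next band up to
    6x DWF receives primary treatment then overflows to the
    river; anything above 6x DWF bypasses the works entirely.
    """
    full = min(flow, 3 * dwf)             # full treatment
    storm = min(max(flow - 3 * dwf, 0),   # primary treatment only,
                3 * dwf)                  # then storm overflow
    bypass = max(flow - 6 * dwf, 0)       # diverted around the works
    return full, storm, bypass

# A storm flow of 8x DWF at a works with a DWF of 10 Ml/day:
print(route_flow(80, 10))  # (30, 30, 20)
```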
Crude sewage can be assumed to contain 20 million faecal coliforms per 100 ml (Grantham, R., 1992). To reduce this to the typical Bathing Water Directive level of 2,000 per 100 ml requires a dilution of at least 10,000-fold if no sewage treatment is given.
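The dilution figure follows directly from the two concentrations quoted:

```python
crude = 20_000_000  # faecal coliforms per 100 ml in crude sewage (Grantham, 1992)
target = 2_000      # typical Bathing Water Directive level per 100 ml

dilution = crude / target
print(dilution)  # 10000.0 - i.e. at least a 10,000-fold dilution
```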
Traditionally in the UK, marine sewage disposal consisted of simple maceration or screening, then disposal through sea outfalls where sufficient dilution was assumed. This has often been found to be insufficient, and more modern systems include a combination of limited sewage treatment to provide some reduction of pathogens, and then disposal through long sea outfalls where modelling or testing has proved the levels of dispersion required.
On inland rivers, large scale dilution is not available because of the lower water flows in rivers. Grantham (ibid) quotes a typical river dilution factor of ten, and the Trent at Holme Pierrepont will contain 40% treated effluent in summer low flow conditions (Fewtrell, L., 1994). Sewage treatment for discharge to inland rivers will therefore need to include effective disinfection procedures if the receiving waterway is to be safe for recreation.
Conventional primary and secondary sewage treatment will give rise to a reduction of one to two orders of magnitude (90 - 99%) in the level of bacterial indicators. This is due to microorganisms binding to solids within the effluent which then settle out, and to the morbidity of enteric bacteria in an environment which is very hostile.
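The equivalence between "orders of magnitude" and percentage removal quoted above can be checked directly; the helper below is purely illustrative:

```python
def percent_removed(log_reduction):
    """Convert a log10 ("orders of magnitude") reduction to a
    percentage removal: 1 log = 90%, 2 log = 99%, 3 log = 99.9%."""
    return (1 - 10 ** -log_reduction) * 100

print(percent_removed(1))  # 90.0
print(percent_removed(2))  # 99.0
```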
It is common for treated effluent to be discharged to the river after only these two stages of treatment.
If reductions in pathogen/indicator levels achieved by primary and secondary treatment, dispersal, dilution, and natural disinfection within the river do not allow the receiving watercourse to meet the required standard, then an additional stage of treatment may be required within the sewage treatment process.
There are a range of disinfection processes available for use in sewage treatment works. In America chlorination of discharge has been a standard procedure for many years, although this has been done to protect downriver abstraction for drinking water or crop irrigation as much as for recreational purposes. This practice is beginning to be questioned, however, due to the potential environmental damage caused by chlorine and chlorine by-products.
Within the UK, sewage effluent disinfection is much less common, and the policy of the regulators is being developed on a case by case basis (Grantham, ibid). It is clear however that Ultra Violet Disinfection is likely to become the method of choice, and Microfiltration may be a useful process if full scale trials prove the potential of small pilot schemes.
ULTRA VIOLET DISINFECTION: This method of disinfection is based on the fact that cell DNA and RNA absorb light in the range 255-289 nm. UV irradiation (which, at 254 nm, is close to this range) will therefore be absorbed by the cell's genetic material, and damage or rearrange the genetic information. This makes the cell unable to replicate (Gross, T. S. C. and Davis, M. K. 1993).
A trial of UV treatment on treated sewage in Jersey (Gross, T. S. C. and Davis, M. K. 1992) found it possible to reduce mean total coliforms from 654,583 per 100 ml down to 203 per 100 ml. Faecal coliforms were reduced from 56,333 to 61 per 100 ml. The effluent would pass the EC Bathing Water Directive without further dilution.
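Back-calculating from these figures gives the log reductions achieved by the Jersey plant; the small helper below is illustrative only:

```python
import math

def log_reduction(before, after):
    """Log10 reduction between two counts (same units)."""
    return math.log10(before / after)

# Jersey UV trial (Gross and Davis, 1992), counts per 100 ml:
print(round(log_reduction(654_583, 203), 1))  # 3.5 logs, total coliforms
print(round(log_reduction(56_333, 61), 1))    # 3.0 logs, faecal coliforms
```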
The effectiveness of the process does however depend on the ability of the UV light to pass through the effluent, and suspended solids will affect UV transmission. If UV disinfection is to be used for partially treated sewage, or for storm water flows, many banks of treatment modules may be needed to provide sufficient capacity when the flow rate or turbidity increases. UV systems are however modularised, so it is possible to provide this standby capacity.
Trials by Job, G. D. et al (1995) at a number of SW Water sewage works found that UV disinfection was effective against a range of pathogens, both bacterial and viral. Typically a log reduction of three (ie 99.9%) could be achieved. They found however that the systems were prone to fouling, or to loss of transmittance due to high suspended solids, unless the effluent to be treated was of a high quality.
MICROFILTRATION: Renovexx is a patented filtration system in which primary or secondary settled effluent is dosed with a coagulant and then pumped through a woven fabric curtain. When the coagulate settles onto the fabric it forms a filter medium which is fine enough to remove a range of microorganisms including bacteria and viruses. Log reductions of up to 4 have been recorded in trials (Gosling, P. and Realey, G., 1992).
The filtration process has the benefit that it will also improve other effluent qualities, such as BOD and suspended solids, unlike UV disinfection, which may need a good effluent quality as input if it is to work effectively.
Natural waters provide a hostile environment for enteric microorganisms, whether these are pathogens (eg Salmonella, Shigella) or faecal indicators (eg coliforms or faecal streptococci). Morbidity is ascribed to two main causes: lack of nutrients, and radiation induced damage from the ultraviolet component of sunlight (Godfree, A., 1993).
It is known that long term storage of fresh water results in significant reductions in pathogenic and indicator bacteria. In fact the storage of water in large reservoirs is an important preliminary treatment in the disinfection of public drinking water supplies. Godfree reports trials by the Metropolitan Water Board which showed that 7 to 10 weeks of storage produced E. coli reductions of 95.7 to 99.8% in spring, 90.7 to 99.7% in summer, and 85.7 to 98.2% in winter. The lower reductions in winter were ascribed to the lower intensity of sunlight. It was also shown that bacterial morbidity was reduced in low water temperatures.
Most research into natural radiation damage has however been carried out in marine environments, as this factor is critical in the dispersion/dilution modelling referred to above.
Natural disinfection is frequently used in enclosed reservoirs or lakes used for recreation. If the inflow of fresh pathogens can be prevented, then any body of water will naturally cleanse itself over a period of a few months.
The fate of viruses within the aquatic system has some similarities to the fate of indicator and pathogenic bacteria. Viruses are unable to multiply outside their host cells, and once they are released into the environment their numbers also decline, although the rate of decline is environmentally determined. In certain favourable conditions some viruses can survive for months. Neutral pH, the presence of particulate or organic matter, moisture, and low temperature have all been found to favour survival.
Temperature is critically important in controlling morbidity, with increased temperature resulting in decreased survival. In marine waters the time for a 99.9% reduction has been found to be <5 days at 37°C, 2 - 9 days at 25°C, and 40 - 90 days at 4°C. Increased temperature will also increase the growth rate of other microorganisms which may be antiviral, both in the river and the sewage works.
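These 99.9%-reduction times can be turned into survival estimates for other exposure periods if a simple log-linear die-off is assumed - an idealisation introduced here for illustration, not a claim made by the studies cited:

```python
def surviving_fraction(days, t999):
    """Fraction of viruses surviving after `days`, given the time
    t999 (days) for a 99.9% (3-log) reduction, assuming simple
    log-linear (first-order) die-off."""
    return 10 ** (-3 * days / t999)

# After 10 days, using round-number T99.9 values within the
# ranges quoted for marine waters:
print(surviving_fraction(10, 5))   # warm water (~37°C): about 1e-6
print(surviving_fraction(10, 65))  # cold water (~4°C):  about 0.35
```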
Viral inactivation by sunlight has been poorly studied, with a possible dual action again depending on the direct effect of UV light on viral DNA, or the indirect effects of biological pigments in the water or the grazing effects of microfauna.
In general, the survival of viruses is reported to be longer than that of the coliform indicator bacteria typically used to index the presence of pathogens.
Waste water from many industrial processes which is not routed into the normal sewage system will be discharged directly into the river, although the regulating authority may require various forms of treatment before discharge.
The risk from pathogenic microorganisms should be small, as few industrial processes have the potential to release pathogens. Those that do will be regulated, resulting in a risk appreciably lower than that from other sources such as sewage works.
It may be possible for industrial processes to release chemicals or other substances which can cause skin irritation or gastrointestinal symptoms. These releases will frequently be as a result of plant breakdown or incorrect operation, rather than planned, authorised discharge.
During recent years there has been a growing recognition that stormwater run-off from impermeable surfaces can be a major source of indicator organisms and pathogens in urban receiving waters. As the quality of effluent from sewage works improves due to the work put in hand under the Urban Waste Water Treatment Directive, the deleterious effect of non-point source inputs, such as storm water run-off, is becoming more apparent.
Ellis, B. (1993) reports studies of urban surface run-off which show that such effluent typically has a quality similar to treated sewage effluent, as far as bacterial parameters are concerned, with geometric mean faecal coliform and faecal streptococci counts of the order of 10⁴ to 10⁵. Rotaviruses appear to be most prevalent in the winter and early spring period, with enterovirus levels lowest in the summer. The bacterial loading did however fluctuate dramatically both within and between storm events, often by several orders of magnitude.
When studying the sources of these pathogens, street drains from a street market area (rodent and other animal faeces, litter and refuse) and a poorly maintained residential area (street dirt, pets and illegal cross connections) were found to be major contributors.
Later work reported by Ellis pointed to the effect of pathogenic microorganisms binding to sediments in sewer pipes and drains. Bacterial and enterovirus survival can be increased by between two and four fold compared with survival rates in the water column. A rainstorm event could then flush this pathogen-laden sediment into the river. Ellis reports:
"very low, and particularly initial, volumes of surface run-off are likely to yield substantial bacterial loads to end of pipe. A recent Water Research Centre study, conducted on behalf of the NRA, concluded that an average daily rainfall in excess of 0.3 mm will generate an FC load equivalent to treated sewage effluent. Other work has also shown that an effective rainfall run off of only 12-15 mm will lead to the violation of the EC Guide-line limits in most urban receiving waters."
One Canadian study draws a clear distinction between storm generated flows which may still contain predominantly human enteric pathogens from urban sources, and other storm flows which may contain large numbers of non-human, animal associated enteric viruses and bacteria (O'Shea, M. L. and Field, R. 1992). In the latter case a reliance on standard indicator methods may prove ineffective, due to the inability of these methods to distinguish human from non-human, and possibly non-disease causing, sources (eg vegetation, soil and animals). In one study of bathing in a lake fed by run-off from a forested watershed, with no input from human sanitary systems, a positive relationship was found between bathing and illness symptoms, but standard measures of indicator organisms showed no statistical relationship with the illness rates.
Both O'Shea (1992) and Phillip (1991) report that such animal-source pathogens are more likely to cause ear, eye, skin and upper respiratory tract infections than gastrointestinal infections.
There are a range of management techniques available to reduce pathogen flows in storm water (Ellis, B., ibid). Systematic "wrong connection" surveys in residential areas are now being undertaken by some Water Authorities, but they are costly in manpower and time. Impermeable paved areas and water channels can be replaced by grassed areas or other vegetation, either on flat surfaces or in carefully engineered channels and gulleys. Storm water can then soak into the ground where it will undergo natural biological treatment. Such measures also have aesthetic benefits, and reduce flood flow volumes in rivers.
Disinfection or settlement as an end-of-pipe treatment may also be possible, but as a second best option (prevention being better than cure).
It is possible for storm water events to cause flow of pathogens into watercourses by routes other than storm water culverts and discharges.
Wyer, M. D. et al (1994) report that after the commissioning of a UV disinfection plant at the sewage treatment works in Jersey (see Gross, T. S. C. and Davis, M. K. (1992) above), expected reductions in indicator levels were not achieved, especially after rainfall. The problem was traced to high bacterial loading in a number of small streams running into the sea close to bathing beaches.
Similarly at Holme Pierrepont, Nottingham, the Regatta course has no input flow apart from small brooks and streams, and was assumed to be clean for recreational use due to natural disinfection processes. When high levels of bacterial indicators were unexpectedly found, the source was traced to a local sewage pumping station, which was overloaded due to recent urban development, and overflowed at certain peak times into a brook feeding the Regatta Course (Davies, J. 1993).
Both examples show one value of using indicator bacteria testing. Although such tests may not be an accurate index of the exact risk of infection, they at least indicate that sewage derived microorganisms may be present in the water, which should be a trigger for further investigation.
Diffuse run-off from agricultural and urban areas into the many small streams and culverts which eventually feed the rivers can produce large increases in bacterial loadings. Controlling or limiting such a geographically widespread phenomenon will be a very difficult management problem.
Because of the immense number of variables discussed above, attempts to model or describe with any accuracy the relationship between the overall level of pathogens or biological indicators in a river and the environmental conditions face great difficulties. Nevertheless a number of studies have measured overall bacterial levels in rivers over a number of years, and some consistent patterns have emerged.
Measurements of the Wyre, Duddon and Grizedale rivers between 1974 and 1984 (White, W. R. and Godfree, A. F., 1985) all showed consistent maximal counts in late summer or early autumn. A one year study of the Ribble, a more mature semi-urban river, showed similar short term maxima in the late summer, with a seasonal pattern of high counts in winter and lower in summer. The August maximum was associated with a sample taken on the day after a 42.3 mm storm event, after eleven previous dry days (Godfree 1993).
The microbiological sampling carried out at Holme Pierrepont as part of the risk assessment (see Figure 3; data from Neal, 1996) also gives a chance to study microorganism changes with season. Within an erratic distribution there is a similar clear pattern: low summer mean levels, higher winter mean levels, and isolated autumn maxima associated with storm events.