Water treatment technologies are driven primarily by three factors: the discovery of new, rarer contaminants; the promulgation of new water quality standards; and cost. For many years, chemical clarification, granular media filtration, and chlorination were virtually the only treatment processes used in municipal water treatment. Recently, however, the water industry’s approach to treatment has changed dramatically, and water utilities have begun to seriously consider alternatives to the traditional filtration/chlorination treatment approach. This paper identifies and discusses some of these “advanced” technologies.
Technology Development and Implementation Process
For a new technology to be considered it must have advantages over traditional treatment processes. These can include lower capital and operations and maintenance costs, higher efficiency, easier operation, better effluent water quality, and lower waste production. Nevertheless, for a water treatment technology to be accepted and implemented at large municipal scale, it must be demonstrated in stages. Understanding this process is necessary in order to properly plan and introduce a new technology to municipal water treatment. A typical sequence of these stages might be summarized as follows:
Stage 1: Successful demonstration in another field.
Stage 2: Testing and development at bench- and pilot-scale levels (1 to 50 gpm).
Stage 3: Verification at demonstration-scale level (>100 gpm).
Stage 4: Multiple successful installations and operations at small full-scale level (0.5 to 5 MGD).
Stage 5: Implementation at a large-scale municipal water treatment plant.
Two important milestones must be achieved in parallel with the above stages: obtaining regulatory approval and reducing costs to competitive levels. Commonly, regulatory approval is necessary by the end of the demonstration-scale verification stage (stage 3) and prior to installation at small full-scale plants (stage 4). However, for a new technology to reach full acceptance (stage 5), its cost must be competitive with that of other, more conventional processes that achieve the same objective.
The duration of each of the above stages can vary greatly depending on the technology being considered, the urgency of implementing it, how long its cost takes to reach competitive levels, and the significance of its role in the overall water treatment train. The last factor differs from the others in that it recognizes the difference between a technology proposed as an alternative to an essential component of water treatment (filtration, for example) and one proposed to replace a less critical component such as a pump, automation, chemical feed, taste-and-odor control, or preoxidation.
A wide range of water treatment technologies have been developed or are currently in development. This paper focuses on technologies that can be applied in municipal water treatment plants. Such a technology should meet the following criteria:
- The technology can be scaled to large applications (i.e., > 5 MGD).
- The technology can be cost competitive with existing technologies at large scale.
- The technology can produce water that meets regulatory requirements.
- The technology has a high degree of reliability.
In this paper the following technologies are screened and evaluated: membrane filtration (low pressure and high pressure), ultraviolet irradiation, advanced oxidation, ion-exchange, and biological filtration. Many of these technologies are certainly not new to the water industry. However, either their application has been limited or they were introduced to the water industry so recently that many questions remain unanswered about their large-scale application.
Membrane Filtration Technology
There are two classes of membrane treatment systems that should be discussed: low-pressure membrane systems (such as microfiltration and ultrafiltration) and high-pressure membrane systems (such as nanofiltration and reverse osmosis). Low-pressure membranes, including microfiltration (MF) and ultrafiltration (UF), are operated at pressures ranging from 10 to 30 psi, whereas high-pressure membranes, including nanofiltration (NF) and reverse osmosis (RO), are operated at pressures ranging from 75 to 250 psi. Figure 1 shows a schematic of the pore size of each membrane system as compared to the size of common water contaminants.
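The two membrane classes and their operating ranges can be captured in a small reference table. The sketch below is illustrative only: the MF and UF pore sizes are the nominal values cited later in this paper, the NF and RO pore sizes are assumed order-of-magnitude figures, and pure size exclusion is only a rough model of how membranes actually reject contaminants.

```python
# Membrane classes described in the text. Operating pressures are the
# ranges given above; NF/RO pore sizes are assumed for illustration.
MEMBRANE_CLASSES = {
    "MF": {"pressure_psi": (10, 30),   "nominal_pore_um": 0.2},
    "UF": {"pressure_psi": (10, 30),   "nominal_pore_um": 0.01},
    "NF": {"pressure_psi": (75, 150),  "nominal_pore_um": 0.001},   # assumed
    "RO": {"pressure_psi": (100, 250), "nominal_pore_um": 0.0001},  # assumed
}

def classes_excluding(particle_um):
    """Membrane classes whose nominal pore size is smaller than the given
    particle -- a pure size-exclusion approximation."""
    return sorted(name for name, props in MEMBRANE_CLASSES.items()
                  if props["nominal_pore_um"] < particle_um)

# A Cryptosporidium oocyst (~4-6 um) is excluded by all four classes,
# while a ~0.05-um virus passes an MF membrane on size alone.
print(classes_excluding(4.0))   # ['MF', 'NF', 'RO', 'UF']
print(classes_excluding(0.05))  # ['NF', 'RO', 'UF']
```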
The idea of using low-pressure membrane filtration for surface water treatment began developing in the early 1980s. Low-pressure membranes had already been used in the food-processing industry as a nonchemical means of disinfection. Early studies clearly showed that both MF membranes (with a nominal pore size of 0.2 µm) and UF membranes (with a nominal pore size of 0.01 µm) are highly capable of removing particulate matter (turbidity) and microorganisms. In fact, the research results showed that, when it came to these contaminants, membrane-treated water was of much better quality than that produced by the best conventional filtration plants. Figure 2 shows an example plot of turbidity removal by an MF membrane. The majority of treated-water samples had a turbidity level near the detection limit of the on-line turbidimeter (less than 0.05 Nephelometric Turbidity Units (NTU)). In addition, membrane filtration (both MF and UF) proved to be an “absolute barrier” to Giardia cysts and Cryptosporidium oocysts when the membrane fibers and fittings were intact. Finally, the particular UF membranes tested also proved to act as absolute barriers to viruses because of their nominal pore size of 0.01 µm.
As a surface water treatment technology, low-pressure membrane filtration has several advantages over conventional filtration and chlorination. These include a smaller waste stream, lower chemical usage, a smaller footprint, greater pathogen reduction, no disinfection byproduct formation, and a higher degree of automation. For a while it was also believed that low-pressure membrane filtration was highly susceptible to excursions in raw water turbidity. However, pilot- and full-scale operational data have demonstrated that low-pressure membranes can treat turbidity excursions as high as several hundred NTUs with manageable impacts on process operation and efficiency. All of the above advantages greatly favor membrane filtration over conventional filtration with chlorine.
On the other hand, because of their porous structure, low-pressure membranes are ineffective for the removal of dissolved organic matter. Therefore, color-causing organic matter, taste-and-odor-causing compounds such as Geosmin and methylisoborneol, and anthropogenic chemicals can pass through the membranes into treated water. This limits the applicability of low-pressure membrane filtration to surface water sources where the removal of organic matter is not required. One UF membrane system has overcome this limitation by introducing powdered activated carbon as part of the system. Powdered activated carbon injected into the influent water to the membrane is retained on the concentrate side of the membrane and disposed of with the waste stream. This approach is certain to expand the domain of low-pressure membrane applications in surface water treatment, especially at sites where organic removal is only occasionally required.
With all of these positive aspects, there were several obstacles that low-pressure membrane filtration had to overcome. First, for several years the cost of membrane filtration systems at “municipal” scale (i.e., greater than 1 MGD) was prohibitively high. Second, membrane filtration did not have regulatory acceptance and required extensive evaluation on a case-by-case basis. Third, information on its reliability in large-scale municipal applications was not available.
The cost of low-pressure membranes has decreased dramatically, which has made the technology more attractive to water utilities for full-scale implementation. In addition, a number of water utilities recognized the benefits that low-pressure membrane systems provided and decided to undergo the regulatory approval process to install these systems at relatively small and cost-effective scales. This has opened the door for the installation of increasingly larger low-pressure membrane plants. Until recently, all MF and UF plants in the United States and around the world had capacities of less than 3 MGD. The first large-scale MF plant (5 MGD) went on-line in San Jose, California, after undergoing significant testing to obtain California Department of Health Services approval. Since then the application of low-pressure membrane filtration has been on the rise, and the recent profile of low-pressure membrane installations shows steady growth in cumulative plant capacity. Today, membrane filtration is rapidly becoming accepted as a reliable water treatment technology. Membrane system construction costs are believed to be comparable to conventional plant construction costs up to a capacity of 20 MGD, and this upper ceiling is rapidly rising. In fact, membrane plants with capacities ranging from 30 to as high as 60 MGD are being considered in the United States.
As noted earlier, included in this category are nanofiltration (NF) and reverse osmosis (RO) membranes. NF membranes are actually thin-film composite RO membranes that were developed specifically to cover the pore-size range between RO membranes (<1 nm) and UF membranes (>2 nm), hence the name nanofiltration. Thin-film composite membranes are discussed later in this paper. The result was a type of membrane that operates at higher flux and lower pressure than traditional cellulose acetate RO membranes. In fact, NF membranes are sometimes referred to as “loose” RO membranes and are typically used when the high sodium rejection achieved by RO membranes is not required but divalent ions (such as calcium and magnesium) are to be removed. Nevertheless, NF membranes are viewed by the water industry as a separate class of membranes from RO membranes and are discussed in this paper as such. NF membranes are commonly operated at pressures ranging from 75 to 150 psi. They have been used successfully for groundwater softening since they achieve greater than 90 percent rejection of divalent ions such as calcium and magnesium. Several NF membrane-softening plants are currently in operation, with a combined total capacity greater than 60 MGD. Because most commercially available NF membranes have molecular weight cutoff values ranging from 200 to 500 daltons, they are also capable of removing greater than 90 percent of the natural organic matter present in the water. Therefore, they are also excellent candidates for the removal of color and, more importantly, disinfection byproduct precursor material.
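The softening arithmetic implied by the greater-than-90-percent divalent-ion rejection quoted above is straightforward. The sketch below assumes a hypothetical feed hardness of 300 mg/L as CaCO3 and treats rejection as a single fractional value:

```python
# Worked example of NF softening at 90 percent divalent-ion rejection.
# The 300 mg/L feed hardness is a hypothetical value for illustration.
def permeate_concentration(feed_mg_l, rejection):
    """Concentration left in the permeate at a given fractional rejection."""
    return feed_mg_l * (1.0 - rejection)

feed_hardness = 300.0  # mg/L as CaCO3 (assumed hard groundwater)
print(round(permeate_concentration(feed_hardness, 0.90), 2))  # 30.0
```

At 90 percent rejection, a quite hard feed water drops into the soft range; at the higher rejections many NF elements achieve, the permeate hardness falls further still.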
Currently, NF membranes are being considered as a total organic carbon (TOC) removal technology in surface water treatment. The idea is to install NF membranes downstream of media filtration in order to maintain a very low solids-loading rate on the membranes. Although NF membranes have been designated as one of the two best available technologies for meeting Stage 2 of the Disinfectants/Disinfection Byproducts Rule, they have not been applied for surface water treatment at full scale. To date, pilot studies conducted to evaluate the applicability of NF membrane filtration downstream of media filtration during surface water treatment have produced mixed results. One study clearly demonstrated that the fouling rate of NF membranes downstream of conventional filtration was two times higher than that of NF membranes downstream of MF or UF membranes. This was supported by another study, which showed that conventional filtration pretreatment did not reduce the fouling rate of NF membranes to acceptable levels. Nevertheless, the Information Collection Rule includes data gathering on the applicability of NF membrane filtration for TOC removal from surface water sources. The majority of the data will be from bench-scale testing, which does not include information on long-term operational design and reliability, but some data will be obtained from pilot-testing programs. These data will provide additional input into the viability of NF membranes for surface water treatment.
RO membranes have long been used for desalination of seawater around the world. These membranes can consistently remove about 99 percent of the total dissolved solids (TDS) present in the water, including monovalent ions such as chloride, bromide, and sodium. However, for a long time these membranes were predominantly made from cellulose acetate and required operating pressures at or greater than 250 psi. Recent innovations in RO membrane manufacturing have produced a new class of RO membranes, called thin-film composite membranes, that can achieve higher rejection of inorganic and organic contaminants than cellulose acetate RO membranes while operating at substantially lower pressures (100 to 150 psi). In addition, cellulose acetate RO membranes commonly require acid addition to lower the pH of the water to a range of 5.5 to 6.0 to avoid hydrolysis of the membrane material. Thin-film composite RO membranes do not hydrolyze at neutral or high pH and therefore do not require pH depression with acid addition. It should be noted that pH depression to prevent the precipitation of salts (such as CaCO3) on the membrane surface may still be necessary in some cases, depending on the quality of the water being treated and the availability of suitable antiscalants.
Thin-film composite RO membranes are currently being evaluated for water reclamation. Results from ongoing pilot studies have shown that thin-film composite RO membranes can achieve greater than 90 to 95 percent rejection of nitrate and nitrite, compared to 50 to 70 percent removal with cellulose acetate RO membranes. The same pilot studies also show that the TOC concentration in the effluent of thin-film composite RO membranes can be as low as 25 to 50 µg/L.
Because of their existing applications for water softening and seawater desalination, high-pressure membrane treatment is currently accepted by the regulatory community and the water industry as a reliable technology. The main obstacle to increased application of high-pressure membranes in municipal water treatment is their high cost. Because of the current modular design of membrane systems, economies of scale are not realized for large treatment plants. However, several membrane manufacturers are currently modifying their membrane system designs to make them economically attractive at large scale.
Two-Stage Membrane Filtration
From the above discussion it is apparent that low-pressure membranes are highly effective for particulate removal, while high-pressure membranes are effective for dissolved matter removal (both organic and inorganic). Conceptually, combination of the two membrane systems in series (MF or UF followed by NF or RO) would provide a comprehensive treatment process train that is capable of removing the vast majority of dissolved and suspended material present in water. Such a treatment train is commonly termed “two-stage membrane filtration.” Other names include “integrated membrane systems” or “dual-stage membrane filtration.” The only material that is believed to pass through such a treatment train includes low-molecular-weight organic chemicals. However, compared to existing treatment, a two-stage membrane filtration process (possibly coupled with powdered activated carbon addition) would produce far superior water quality. The main concern about such highly treated water is that it may be more corrosive. Special corrosion inhibition measures for low-TDS waters of this kind require further development.
Several studies have been conducted to evaluate two-stage membrane systems for surface water treatment. The results of these studies have clearly shown that MF or UF membranes are excellent pretreatment processes to NF or RO membranes and that the combined particulate removal and organic removal capabilities of this treatment scheme produce excellent water quality that complies with existing and forthcoming regulatory requirements.
The primary obstacle that a two-stage membrane treatment system needs to overcome is its cost. One study estimated the capital cost of a 40-gpm, two-stage membrane system at Rs. 75,000/kld. The capital unit cost of a large-scale, two-stage membrane system may range from Rs. 38,500 to Rs. 56,000/kld of capacity. This is still substantially higher than the cost of conventional treatment, which is estimated at Rs. 19,500 to 28,500/kld.
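To put the unit costs above in perspective, the short sketch below multiplies them out for a hypothetical plant capacity (the 10,000-kld figure is chosen purely for illustration):

```python
# Capital unit costs quoted in the text (Rs per kld of capacity).
TWO_STAGE_RS_PER_KLD = (38_500, 56_000)      # two-stage membrane system
CONVENTIONAL_RS_PER_KLD = (19_500, 28_500)   # conventional treatment

def capital_cost_range_rs(capacity_kld, unit_cost_range):
    """Low/high capital cost for a plant of the given capacity (kld)."""
    low, high = unit_cost_range
    return capacity_kld * low, capacity_kld * high

capacity = 10_000  # kld, hypothetical plant size
print(capital_cost_range_rs(capacity, TWO_STAGE_RS_PER_KLD))     # (385000000, 560000000)
print(capital_cost_range_rs(capacity, CONVENTIONAL_RS_PER_KLD))  # (195000000, 285000000)
```

Even at the low end, the two-stage train costs roughly 1.4 to 2 times as much as conventional treatment of the same capacity, which is the cost gap the text identifies as the main obstacle.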
Membrane filtration technology is rapidly becoming accepted in the water treatment industry. Low-pressure membrane filtration (MF and UF) is now replacing conventional filtration for surface water treatment at several locations in the United States. High-pressure membrane filtration (both NF and RO) is used primarily for softening and TDS reduction but is being evaluated for the removal of natural organic matter in water treatment. The main obstacle to large-scale implementation of membrane filtration is its capital cost. Ongoing innovations in the design of large-scale membrane systems are continually lowering their capital cost and making them increasingly cost competitive with conventional treatment processes.
Ultraviolet Irradiation Technology
Ultraviolet (UV) irradiation technology is primarily used in the water and wastewater treatment industry as a disinfection process that capitalizes on the germicidal effect of UV light in the wavelength range of 250 to 270 nm. The process is commonly designed such that water flows in a narrow region around a series of UV lamps, and the microorganisms in the water are inactivated through exposure to the UV light. The process is compact since the time of exposure (which translates into hydraulic retention time) is commonly measured in seconds. The process works on the principle that UV energy disrupts the DNA of microorganisms and prevents them from reproducing. UV irradiation technology has been used at approximately 50,000 drinking water facilities. However, the vast majority of these facilities are either transient-non-community or non-transient-non-community groundwater systems serving fewer than 3,000 people each. These are facilities that provide water to restaurants, highway rest areas, airports, schools, camps, factories, rest homes, and hospitals. In fact, UV disinfection technology in drinking water treatment is currently promoted only for small-scale groundwater systems. However, the process can certainly be scaled up to large applications since it is currently applied at large wastewater treatment plants for final effluent disinfection; the largest wastewater treatment UV system in use has a peak design capacity of 265 MGD. For water treatment systems, a minimum UV dose is commonly specified. The National Sanitation Foundation (NSF) standard for Class A UV systems (i.e., those that can be used as point-of-use (POU) and point-of-entry (POE) treatment devices) requires that they emit a minimum UV dose of 38 mW-sec/cm2, the dose determined to inactivate Bacillus subtilis spores.
Adopted minimum UV doses for pretreated drinking water range from 16 mW-sec/cm2 to 30 mW-sec/cm2. All of these doses are based on the requirement to inactivate bacteria and viruses but not protozoans. There is limited information on the ability of UV irradiation to inactivate Giardia cysts. One study evaluated the UV inactivation of Giardia lamblia cysts obtained from infected humans and gerbils; the results of testing conducted in distilled water are shown in Figure 4. The results show that a UV dose of approximately 40 mW-sec/cm2 achieved 0.5-log inactivation of Giardia lamblia, whereas a UV dose of 180 mW-sec/cm2 was required to achieve 2-log inactivation of Giardia cysts. Another study showed that a UV dose of 63 mW-sec/cm2 achieved 0.5-log inactivation of Giardia cysts, also in distilled water. One guidance document for the application of UV technology specifies a dose of 140 mW-sec/cm2 as the requirement to meet a standard of 2.2 coliforms/100 mL in reclaimed water.
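A rough sense of the dose-response reported above can be obtained by linear interpolation between the two Giardia data points (40 mW-sec/cm2 for 0.5-log and 180 mW-sec/cm2 for 2-log inactivation). Real UV dose-response curves are not linear, so this is only a reading of the trend in Figure 4:

```python
# Linear interpolation between the two reported Giardia data points.
# Illustrative only: actual dose-response behavior is not linear.
def giardia_log_inactivation(dose_mws_cm2):
    d1, log1 = 40.0, 0.5    # 40 mW-sec/cm2 -> 0.5-log inactivation
    d2, log2 = 180.0, 2.0   # 180 mW-sec/cm2 -> 2-log inactivation
    return log1 + (dose_mws_cm2 - d1) * (log2 - log1) / (d2 - d1)

print(giardia_log_inactivation(40.0))   # 0.5
print(giardia_log_inactivation(110.0))  # 1.25 (midpoint of the two doses)
```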
The capital cost of a UV treatment system for a 1.5-MGD plant has been estimated at Rs. 32,65,600 (at a UV dose of 16 to 30 mW-sec/cm2), for an average capital unit cost of about Rs. 925/kld of capacity. The operations and maintenance cost of such a system is estimated at Rs. 117/kld of water treated. On the other hand, UV disinfection is commonly used in large-scale wastewater treatment plants. There, the cost of a 12-MGD UV treatment system designed to meet water reclamation standards is estimated at Rs. 10.5 to 14.2 crores. This estimate is for a medium-pressure UV system treating water with 55 percent transmittance (approximately 0.260 cm-1 UV-254 absorbance) and applying a dose of 140 mW-sec/cm2. This is equivalent to a cost range of Rs. 925 to 1,140/kld of capacity. These cost values indicate that the application of UV technology to large-scale water treatment is cost competitive. If UV irradiation can be proven effective against Cryptosporidium at reasonable doses (<200 mW-sec/cm2), it will become an attractive alternative to ozone, which is currently believed to be one of the few effective disinfectants for inactivating Cryptosporidium.
There are four types of UV technologies of interest to the water industry: low-pressure, low-intensity (LP-LI) UV technology; low-pressure, medium-intensity (LP-MI) UV technology; medium-pressure, high-intensity (MP-HI) UV technology; and pulsed-UV (PUV) technology. Approximately 90 percent of the UV installations in North America have LP-LI UV technology, with some dating back to the 1970s. The power output of LP-LI UV lamps commonly varies from 40 to 85 W. Another unique characteristic of low-pressure lamps is that they emit a monochromatic light at a wavelength of 254 nm. EPA’s design manual is specifically based on and tailored to LP-LI UV technology. The primary advantage of LP-LI UV lamps is their high efficiency. The primary disadvantage is their low power, which results in the need for a large number of lamps for a small plant. For example, a typical secondary wastewater effluent would require approximately 40 LP-LI UV lamps per MGD of peak capacity. Considering that a significant labor effort is required to clean and maintain UV lamps, the application of LP-LI UV technology at large scale is not desirable.
LP-MI UV lamps are identical to LP-LI UV lamps with the exception of a higher power output (170 W compared with 40 to 85 W). Therefore, a typical secondary wastewater effluent would now require only 20 to 24 lamps per MGD of capacity. This makes LP-MI UV technology more applicable to medium-size water treatment facilities than LP-LI UV technology.
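The lamp counts quoted above scale roughly inversely with lamp power if total installed UV power is held constant. Taking roughly 40 LP-LI lamps per MGD at 85 W (the upper end of the LP-LI range) as an assumed reference point, a sketch of that scaling:

```python
import math

def lamps_per_mgd(lamp_watts, ref_lamps=40, ref_watts=85):
    """Estimate lamps needed per MGD by holding total lamp wattage constant
    relative to the ~40-lamp, 85-W LP-LI reference cited in the text."""
    return math.ceil(ref_lamps * ref_watts / lamp_watts)

print(lamps_per_mgd(85))   # 40  (LP-LI reference case)
print(lamps_per_mgd(170))  # 20  (LP-MI, matching the 20-24 figure above)
```

The same inverse scaling explains why the multi-kilowatt MP-HI lamps discussed next reduce lamp counts by another order of magnitude.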
MP-HI UV lamps operate at substantially higher gas pressure inside the lamps compared to low-pressure UV lamps and are characterized by a power output that varies from 5 to 30 kW. Contrary to low-pressure lamps, which produce all of their light at approximately 254 nm, medium-pressure lamps produce a polychromatic light, of which only 25 percent is in the germicidal wavelength range of 200 to 300 nm. However, because of the higher power output of MP-HI UV lamps, UV disinfection systems using this technology are substantially smaller than those using LP-LI UV technology, simply because significantly fewer lamps are needed. This technology has been used in small-scale water treatment and industrial applications. However, it was not introduced to the municipal wastewater market until 1994. Currently, more than 270 MP-HI UV systems are in operation, with 70 of them operating at municipal wastewater treatment plants. One drawback of MP-HI UV technology is its low power efficiency compared to low-pressure technology. Another drawback is its high capital cost: a low-pressure UV lamp typically costs about $500, whereas a medium-pressure UV lamp costs about Rs. 3,52,000. Nevertheless, considering the substantial savings in the number of lamps, both capital and operations and maintenance costs of large-scale MP-HI UV systems are lower than those of LP-LI UV systems. In fact, the general assumption in the industry is that UV systems with peak flows greater than 10 MGD should utilize MP-HI UV technology in order to keep the number of UV lamps at a manageable level.
Low- and medium-pressure UV technologies are past the research stage and have been accepted as reliable disinfection technologies. In fact, specific LP-LI UV doses are listed in the Surface Water Treatment Guidance Manual for the inactivation of viruses in water. The cost of UV systems is also not prohibitive since the technology is less expensive than ozone and many other disinfection processes.
The newest UV technology under development is pulsed-UV technology. In this process the energy is stored in a capacitor and then released to the lamp in a short, high-intensity pulse. The duration between pulses is approximately 30 milliseconds, and each pulse lasts for less than 1 millisecond. The intensity of each pulse is believed to be about 10^7 mW/cm2. One manufacturer of this technology claims that the high energy emitted with each pulse is far more effective for the inactivation of microorganisms than the same level of energy emitted over an extended period of time. Figure 5 shows a plot of the inactivation rate of the MS2 bacterial virus with pulsed-UV and LP-LI UV systems. The results suggest that, for the same UV dose, pulsed-UV systems may achieve approximately one additional log of MS2 virus kill compared to that achieved by traditional LP-LI UV. However, questions remain about the ability to accurately measure the UV dose emitted by a pulsed-UV system. Regardless of the type of UV technology used, the obstacles to application of this technology in municipal water treatment can be summarized as follows.
Limited Information on Giardia and Cryptosporidium Inactivation Capability
As noted above, there are limited data on the inactivation of Giardia with UV technology. Two studies have shown that a UV dose of 40 to 65 mW-sec/cm2 is required to achieve 0.5-log inactivation of Giardia lamblia and that 180 mW-sec/cm2 is required to achieve 2-log inactivation. Data on UV inactivation of Cryptosporidium oocysts are even scarcer than those for Giardia cysts. Recent data presented at the 1998 AWWA Water Quality Technology Conference in San Diego, California, suggested that, using a mouse-infectivity model, a UV dose of 20 mW-sec/cm2 is sufficient to achieve 4-log inactivation of Cryptosporidium oocysts. However, when excystation was used to measure oocyst viability, the results indicated that only 1-log inactivation of Cryptosporidium oocysts was achieved with UV doses as high as 170 mW-sec/cm2. If these results can be validated by others, they seem to emphasize the notion that UV irradiation does not kill an organism but only limits its ability to reproduce and thus infect a host organism.
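The significance of the 4-log versus 1-log discrepancy above is easier to see in terms of surviving fractions, since each log of inactivation is a factor of ten:

```python
# "Log inactivation" translates to surviving organism fractions as 10**(-log).
def surviving_fraction(log_inactivation):
    """Fraction of organisms still viable after a given log inactivation."""
    return 10.0 ** (-log_inactivation)

# 4-log leaves 1 oocyst in 10,000 viable; 1-log leaves 1 in 10 viable --
# a 1,000-fold difference between the two assay results discussed above.
print(surviving_fraction(4))
print(surviving_fraction(1))
```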
No Significant Oxidation Capability
One of the added benefits of disinfection with ozone, chlorine, or chlorine dioxide is the ability of each to also act as an oxidant for color, taste, and odor control. Unfortunately, disinfection with UV irradiation does not provide this added benefit because UV light, even with hydrogen peroxide addition, is not a strong oxidant. As such, even if Giardia inactivation with UV light is proven to be feasible, the process will be limited to water systems that do not rely on the disinfectant for any color, taste, or odor oxidation.
Susceptible to Scale Buildup
A UV treatment process is comprised of a series of UV lamps enclosed inside quartz sleeves. UV light passes through the quartz and into the water. Because of the high energy emitted by the UV lamps, the temperature of the quartz sleeve can rise substantially, causing precipitation of various scales on the surface of the sleeve, blocking the passage of UV light into the water and dramatically reducing the efficiency of the process. The scales are commonly caused by the precipitation of calcium, iron, or magnesium salts. Several UV light manufacturers have developed continuous cleaning mechanisms to prevent scale buildup. However, this problem still plagues the majority of UV systems. Considering that natural waters can vary greatly in calcium, iron, and magnesium content, this issue may be a significant obstacle to widespread implementation of UV irradiation technology in water treatment.
Severely Impaired by Particulate Matter
UV treatment systems rely on the ability of UV light to reach and inactivate the target microorganism. However, particulate matter present in the water can shield microorganisms from the UV light and thus render the process ineffective. The authors are aware of no documented correlation between suspended solids content and process efficiency. Until such a correlation is developed and accepted, UV application in surface water treatment is confined to postfiltration, where the solids content is negligible.
Limited Process Reliability
Despite the claims of UV manufacturers, field-scale UV systems commonly experience failures in various components. Their heavy reliance on sensitive electrical components, such as capacitors and ballasts, makes them vulnerable to a high incidence of failure.
In summary, UV irradiation is a promising disinfection technology for large-scale water treatment applications. It is compact and cost effective. However, more information on UV inactivation of Giardia and Cryptosporidium is required before it can be considered a reliable process for meeting current and upcoming water disinfection requirements.
Advanced Oxidation Technology
The term “Advanced Oxidation Processes” was first used to describe processes that produce hydroxyl radicals (OH) for the oxidation of organic and inorganic water impurities. Advanced Oxidation Processes include a number of processes; however, three main ones are discussed herein: ozone, ozone with hydrogen peroxide addition, and UV irradiation with hydrogen peroxide addition. Advanced Oxidation Processes can have multiple uses in water treatment. Examples include oxidation of synthetic organic chemicals, color, taste-and-odor-causing compounds, sulfide, iron, and manganese, as well as destruction of disinfection byproduct precursors prior to the addition of chlorine. However, it has been demonstrated that Advanced Oxidation Processes may not be good candidates (i.e., cost effective) for disinfection byproduct precursor destruction. In this paper the application of each of the above processes in municipal water treatment is briefly discussed, and some of the challenges facing each process are presented.
There are numerous published books, peer-reviewed articles, and proceedings papers on the application of ozonation in drinking water treatment. These cover the fundamental chemical principles of ozone reactions in water to produce hydroxyl radicals, general ozone applications in water treatment, and the design of ozone treatment systems.
The application of ozone in water treatment has increased, especially for color removal, taste-and-odor control, and disinfection. With the increased pressure to reduce chlorination byproduct formation and the need to inactivate increasingly resistant pathogens, many utilities are looking to ozone as their primary disinfection process. Ozone also has unique benefits over most other disinfectants, including taste-and-odor control and the ability to inactivate Cryptosporidium. The promulgation of new disinfection and disinfection byproduct regulations resulted in a dramatic increase in the development, design, and construction of ozonation processes in new and existing plants. Approximately 40,000 ozone water treatment plants were in operation, although the number of ozone plants with greater than 1-MGD capacity was estimated at only 114.
It is fair to assume that ozone is no longer considered an “emerging” water treatment technology, since it has been applied in large municipal treatment plants such as the 600-MGD Aqueduct Filtration Plant (3.82 kg/min ozone capacity) and a 300-MGD water treatment plant (4.78 kg/min ozone capacity). However, the design of ozone systems for water disinfection is currently undergoing a noticeable change. Ozone disinfection systems have historically been designed to achieve low levels of Giardia inactivation. This required ozone contactors with “conventional” design criteria, including an average hydraulic retention time of 8 to 12 minutes and ozone doses ranging from 0.5 to 2 mg/L for an average water quality. At such low doses the ozone-residual concentration is virtually nondetectable in the effluent of the ozone contactor. However, with utilities planning to use ozone for Cryptosporidium inactivation, process engineers need to consider redesigning ozone systems to meet a far higher CT (residual concentration × contact time) requirement, which translates into higher ozone doses and/or longer contact times. Optimization of ozone system designs for Cryptosporidium inactivation can help keep ozone application in water treatment cost-effective. In addition, testing at pilot and demonstration levels is showing that applying ozone to conventionally designed contactors at the doses required for Cryptosporidium inactivation results in substantially higher ozone-residual levels in the contactor effluent. This residual ozone quickly volatilizes into the atmosphere as the water exits the closed ozone-contactor environment to the open downstream environment.
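The redesign pressure can be seen in the CT arithmetic itself. The sketch below illustrates the calculation only; the residual, T10 contact time, and required-CT values are illustrative assumptions, not regulatory figures.

```python
def achieved_ct(residual_mg_l: float, t10_min: float) -> float:
    """CT credit = average disinfectant residual (mg/L) x T10 contact time (min)."""
    return residual_mg_l * t10_min

# Conventional contactor (illustrative): a low dose leaving ~0.2 mg/L
# average residual over an 8-minute T10 contact time.
conventional = achieved_ct(0.2, 8.0)      # 1.6 mg-min/L

# Hypothetical Cryptosporidium requirement (illustrative only): 10 mg-min/L.
required_ct = 10.0

# The gap must be closed with higher doses and/or longer contact times.
shortfall_factor = required_ct / conventional
```

A shortfall factor well above 1 is what drives both the higher residuals in the contactor effluent and the quenching problem discussed next.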
Therefore, quenching this residual ozone before the water exits the contactor is necessary to minimize operator exposure to unhealthy levels of ozone in the atmosphere. This task appears to be more challenging than earlier thought. Options include air stripping the ozone in the last chamber of the ozone contactor and adding a reducing agent (hydrogen peroxide, thiosulfate, or bisulfite) to the last chamber of the contactor. Air stripping requires installation of a separate air-stripping system, and it is not clear what air-to-water ratios are required to achieve effective stripping of the ozone residual. Studies have shown that quenching the ozone residual with hydrogen peroxide is not always effective; preliminary results indicate that the reaction between residual ozone and hydrogen peroxide is substantially slower in lower-alkalinity waters. There is limited information on the effectiveness of thiosulfate or bisulfite for quenching ozone residuals in water.
It should be noted that one of the main obstacles to wider use of ozonation in municipal drinking water treatment is the potential formation of bromate (BrO3–), a possible human carcinogen, when the water being treated contains bromide. In general, bromide concentrations greater than 50 µg/L may result in bromate formation at levels greater than the maximum contaminant level (MCL) of 10 µg/L. At this time the only demonstrated bromate formation control strategy is to depress the water pH in the ozone contactor to less than 6.5 to 7. Additional work is needed to control bromate formation during ozonation of bromide-containing waters.
Rule-of-thumb costs for ozone systems are currently estimated at Rs. 1,30,000 to Rs. 1,98,000 per kg/min of ozone capacity. Therefore, for a 12-MGD treatment plant requiring an ozone dose of 5 mg/L, the capital cost of the ozone treatment system, including the ozone equipment and the concrete ozone contactor, is estimated at Rs. 6.52 crores to Rs. 9.92 crores. This cost range is equivalent to a unit capital cost of Rs. 5.48 to Rs. 7.78 per 1,000 kg/min of capacity.
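The ozone production capacity behind such sizing follows directly from plant flow times applied dose. A minimal mass-balance sketch for the 12-MGD, 5-mg/L example above (the gallon-to-litre conversion factor is the only added constant):

```python
US_GAL_TO_L = 3.785411784  # litres per US gallon

def ozone_capacity_kg_per_day(flow_mgd: float, dose_mg_per_l: float) -> float:
    """Required ozone production = plant flow x applied ozone dose."""
    litres_per_day = flow_mgd * 1e6 * US_GAL_TO_L
    return litres_per_day * dose_mg_per_l / 1e6   # mg/day -> kg/day

cap_kg_d = ozone_capacity_kg_per_day(12, 5)       # ~227 kg/day
cap_kg_min = cap_kg_d / 1440                      # ~0.16 kg/min
```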
Ozone with Hydrogen Peroxide Addition
When hydrogen peroxide (H2O2) is added to ozonated water, it reacts with molecular ozone and accelerates the formation of hydroxyl radicals. The goal of an ozone-H2O2 process is therefore to increase the concentration of hydroxyl radicals, which are stronger oxidizers than molecular ozone, at the cost of rapidly reducing the concentration of molecular ozone. For this reason, hydrogen peroxide is added when ozone is used as an oxidation process but not as a disinfection process, since disinfection relies on maintaining a high concentration of molecular ozone.
The ozone-H2O2 process is used for the destruction of taste-and-odor-causing compounds, color removal, and destruction of micropollutants such as volatile organic compounds, pesticides, and herbicides. Stoichiometric analysis suggests that the optimum H2O2-to-ozone ratio is approximately 0.3:1 (mg/mg). However, pilot- and full-scale studies have shown that the optimum ratio is more on the order of 0.5:1 to 0.6:1 mg/mg.
Currently, the conventional design of an ozone-H2O2 treatment process is one in which hydrogen peroxide is fed as a liquid to the influent water and an ozone-rich gas is fed through fine-bubble diffusers at the bottom of a contactor.
Considering the complexity of the reaction chemistry between ozone, hydrogen peroxide, natural organic matter, and other water constituents, it is not clear whether such a conventional design is the optimum design for an ozone-H2O2 treatment system. Innovations in engineering design may be able to improve the efficiency of the process at lower ozone and/or hydrogen peroxide doses.
UV Irradiation with Hydrogen Peroxide Addition
In the presence of UV light, hydrogen peroxide decomposes to form hydroxyl radicals. Addition of hydrogen peroxide to the influent of a UV irradiation process is currently being used for the destruction of micropollutants in groundwater, but it can also serve the same purposes as other AOPs, including the destruction of taste-and-odor-causing compounds and the removal of color. The reaction between UV and hydrogen peroxide to form hydroxyl radicals is substantially slower than that between ozone and hydrogen peroxide. However, in many groundwater remediation efforts, the simplicity of a UV irradiation system has been favored over the complexity of an ozone generation and feed system. Owing to the slow hydroxyl-radical formation reaction in UV-H2O2 systems, the process must be operated with an excess concentration of hydrogen peroxide (a 5 to 20 mg/L hydrogen peroxide residual). Therefore, for this process to be used in drinking water treatment, either the process should be modified to utilize less hydrogen peroxide or a treatment process should be installed downstream to quench the hydrogen peroxide residual to acceptable levels (<0.5 mg/L) before the water is put into the distribution system. The options available for quenching the hydrogen peroxide residual include chlorine, thiosulfate, sulfite, and granular-activated carbon.
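The scale of the quenching problem can be sketched with simple stoichiometry. The 1:1 molar HOCl + H2O2 reaction assumed below is an illustration; a real plant would also have to account for other chlorine demand in the water:

```python
MW_CL2 = 70.9    # g/mol, chlorine expressed as Cl2
MW_H2O2 = 34.01  # g/mol, hydrogen peroxide

def chlorine_to_quench_mg_per_l(h2o2_residual_mg_per_l: float,
                                target_mg_per_l: float = 0.5) -> float:
    """Chlorine (as Cl2) needed to bring an H2O2 residual down to the target,
    assuming a 1:1 molar HOCl + H2O2 reaction and no other chlorine demand."""
    excess = max(h2o2_residual_mg_per_l - target_mg_per_l, 0.0)
    return excess * MW_CL2 / MW_H2O2

# A 5 mg/L residual (the low end of the 5-20 mg/L range above):
dose = chlorine_to_quench_mg_per_l(5.0)   # roughly 9 mg/L as Cl2
```

Even at the low end of the residual range, the implied chlorine dose is large, which is why modifying the process to use less peroxide is an attractive alternative.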
Ion Exchange Technology
Ion exchange technology has been used in the chemical and environmental engineering fields for a long time. However, its use has been mostly limited to water softening (Ca2+ and Mg2+ removal), either at the water treatment plant or as a point-of-use process, and to industrial applications such as the production of fully demineralized water. With new limits being set on several inorganic chemicals, however, ion exchange is finding new applications in water treatment. Some of the primary candidates for removal by ion exchange include nitrate, arsenic, selenium, barium, radium, lead, fluoride, and chromate. Surveys conducted in the early 1980s showed that 400 communities exceeded the nitrate MCL of 10 mg/L as nitrogen and 400 communities exceeded the fluoride MCL of 4 mg/L. A contaminant recently discovered in groundwater is perchlorate (ClO4–), a component of solid rocket fuel. Ion exchange is ideal for the removal of perchlorate from contaminated groundwater.
The technology is commonly designed as a fixed-bed process in which a synthetic resin is packed. As water passes through the resin bed, contaminant ions present in the water are exchanged with ions on the resin surface, thus removing the contaminant ions from the water and concentrating them on the resin. The resin is periodically regenerated to remove the contaminants from the resin surface and replenish it with the original exchange ion. There are four primary types of ion exchange resins: strong acid cationic, weak acid cationic, strong base anionic, and weak base anionic.
Table 1 lists the ions that can be removed by each type of resin, the resin regeneration requirements, and the operating pH range for each resin type. As their names indicate, strong acid cationic and weak acid cationic resins are used to remove cations from water (e.g., Ca2+, Mg2+, Ra2+, Ba2+, Pb2+), while strong base anionic and weak base anionic resins are used to remove anions from water (e.g., NO3–, SO42-, ClO4–, HAsO42-, SeO32-, etc.). Strong acid cationic resins operate over a wide range of pH values (1 to 14), whereas weak acid cationic resins can only operate at pH values greater than 7. During water softening, strong acid cationic resins can remove both carbonate and noncarbonate hardness, whereas weak acid cationic resins can only remove carbonate hardness. On the other hand, weak acid cationic resins are easier to regenerate than strong acid cationic resins and do not increase sodium concentrations as strong acid cationic resins do.
TABLE 1 Types and Characteristics of Ion Exchange Resins

| Resin Type | Functional Group | Ions Removed | Regeneration | Operating pH Range |
| --- | --- | --- | --- | --- |
| Strong acid cationic | Sulfonate (RSO3–) | Total hardness, Mg2+, Ra2+, Ba2+, Pb2+, etc. | HCl or NaCl | 1 to 14 |
| Weak acid cationic | Carboxylate (RCOO–) | Carbonate hardness, Mg2+, Ra2+, Ba2+, Pb2+, etc. | HCl | >7 |
| Strong base anionic | Quaternary amine (RN(CH3)3+) | NO3–, SO42-, ClO4–, HAsO42-, SeO32-, etc. | NaOH or NaCl | 1 to 13 |
| Weak base anionic | Tertiary amine (RN(CH3)2H+) | NO3–, SO42-, ClO4–, HAsO42-, SeO32-, etc. | NaOH or Ca(OH)2 | <6 |
The cost of ion exchange is competitive with that of other inorganics-removal processes, such as lime softening, high-pH precipitation, and high-pressure membranes (e.g., RO membranes). For example, the capital cost of ion exchange treatment for nitrate removal from groundwater is estimated at Rs. 9 to Rs. 10 per kg/d. However, application of ion exchange at large scale is problematic because of the waste stream produced by the process. The volume of the waste stream is not large, amounting to only 2 to 5 percent of the water volume treated; however, it contains a high concentration of acid (HCl), base (NaOH), or salt (NaCl), ranging from 1 to 3 M. In addition, the waste stream contains a high concentration of the contaminant removed from the water (e.g., NO3–, HAsO42-, Pb2+). The disposal of a waste stream containing these components is the primary obstacle to widespread implementation of ion exchange at large-scale water treatment plants. Plants in coastal areas may have the option of disposing of this stream into the ocean, but no cost-effective disposal options exist for inland plants.
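The waste-stream burden quoted above amounts to simple percentage arithmetic; a minimal sketch, where the 1-MGD plant size is illustrative and the 2 to 5 percent range comes from the text:

```python
def brine_volume_gal_per_day(treated_gal_per_day: float,
                             waste_fraction: float) -> float:
    """Regeneration waste volume as a fraction (0.02-0.05) of treated flow."""
    return treated_gal_per_day * waste_fraction

plant_flow = 1_000_000                              # 1-MGD plant (illustrative)
low = brine_volume_gal_per_day(plant_flow, 0.02)    # ~20,000 gal/day of 1-3 M brine
high = brine_volume_gal_per_day(plant_flow, 0.05)   # ~50,000 gal/day
```

Tens of thousands of gallons per day of concentrated brine, laced with the removed contaminant, is why inland disposal remains the bottleneck.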
Biological Filtration
All of the technologies discussed above are physical and/or chemical processes. In fact, the water treatment industry has depended almost solely on physical and/or chemical processes to meet water quality goals. Utilization of biological processes in water treatment has been frowned upon by the industry because of concern about introducing microorganisms into the water. However, this barrier has been broken by the introduction of biological filtration as the most effective process for producing biologically stable water. This development was specifically driven by concern about the increase in the concentration of biodegradable organic matter that results from ozonating natural waters: higher biodegradable organic matter levels may increase the potential for biological regrowth in the distribution system. Implementing biological filtration in the water treatment plant therefore reduces biodegradable organic matter concentrations in the water before it is introduced into the distribution system. Several plants currently use biological filtration after ozonation. In fact, Stage 1 of the Disinfectants/Disinfection Byproducts (D/DBP) Rule requires water utilities to implement biological filtration for biodegradable organic matter removal if ozone is used at the treatment plant.
There are several unanswered questions about the design and operation of biological filtration, such as what filter media type and size to use and what minimum empty bed contact time can be used while maintaining satisfactory biodegradable organic matter removal.
Pilot studies conducted by various researchers have concluded that either granular-activated carbon or anthracite, rather than sand, is required as the attachment medium for the biofilm. Clearly, anthracite is substantially less expensive than granular-activated carbon. Anthracite has been shown to be equivalent to granular-activated carbon as a biological filtration medium when used in warm climates. However, it may not be satisfactory in cold climates, as studies have shown that a higher granular-activated carbon surface area, compared to that of anthracite, is required to maintain an active biofilm when treating cold water. The concentration of biomass on the surface of biologically active granular-activated carbon filters was approximately three to eight times greater than that on the surface of biologically active anthracite filters. Figures 6 and 7 show examples of the impact of media type, temperature, and empty bed contact time on the performance of biological filtration. Figure 6 shows that under warm-temperature conditions (10 to 15°C), the biological removal of oxalate (a byproduct of ozonating natural water) by granular-activated carbon or anthracite filters is virtually identical. Figure 6 also shows that under warm-temperature conditions the majority of the oxalate removal occurs within two minutes of empty bed contact time. Figure 7 shows that only under cold-water temperature conditions (1 to 3°C) was the biological removal of oxalate by granular-activated carbon filters substantially higher than that by anthracite filters. Figure 7 also shows that under cold-water temperature conditions an empty bed contact time in excess of 10 minutes may be required to achieve high removal of oxalate. The general trend in the design of biological filters is to include a shallow sand layer (6 to 12 inches) under the granular-activated carbon or anthracite media.
This sand layer serves as a partial barrier against the breakthrough of biomass into the filter-effluent water. Biological filters are operated in the same way as conventional dual-media filters, with the exception that no chlorine or chloramine is present in the filter influent. However, there are unanswered questions about the proper backwashing procedure for biofilters. Some plants have determined that intermittent addition of 4 to 5 mg/L of chlorine to the backwash water (about once every third backwashing) can help control the biological culture in the filter, prevent the growth of multicellular organisms, and increase filter run lengths by reducing the headloss buildup rate. Additional work is required to address operational issues of biofiltration processes.
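The empty bed contact time figures discussed above follow directly from bed depth and filtration rate. A minimal sketch, with illustrative depth and loading-rate values:

```python
def ebct_minutes(bed_depth_m: float, loading_rate_m_per_h: float) -> float:
    """Empty bed contact time = media depth / superficial filtration velocity."""
    return bed_depth_m / (loading_rate_m_per_h / 60.0)

# Illustrative values: a 1.8-m media bed at a 10 m/h loading rate.
warm = ebct_minutes(1.8, 10.0)        # ~10.8 min

# Cold water may need >10 min of EBCT, forcing deeper beds or lower rates:
cold_ok = ebct_minutes(2.4, 10.0) > 10
```

This is why the cold-water EBCT requirement translates directly into larger filter boxes or reduced plant throughput.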
The use of biofiltration in drinking water treatment opens the door to new and innovative applications of this process. Biofiltration can be used for the biological reduction of various inorganic contaminants such as nitrate, bromate, perchlorate, chlorate, and selenate. However, its use for these applications still requires a substantial amount of research and engineering and is far from being ready for implementation at large municipal scale.
Historically, the water industry has adopted new technologies at a slow, incremental pace. Recently, however, there has been a rapid entry of new technologies that continue to be developed, tested, demonstrated, and introduced into the municipal water treatment market. Some of these technologies are membrane filtration, UV irradiation, advanced oxidation, ion exchange, and biological filtration. These are certainly not the only technologies being considered by the water treatment industry, but they have come a long way toward demonstrating their reliability and applicability to large-scale municipal water treatment plants. As the cost of these technologies continues to decrease, their applicability will steadily increase.
There is almost no contaminant that cannot be removed from water; the question is one of cost. As alternative water resources become increasingly scarce, the need for innovative and cost-effective treatment technologies will rise steadily.