Water: Monitoring & Assessment

5.7 Nitrates

What are nitrates and why are they important?

Nitrates are a form of nitrogen, which is found in several different forms in terrestrial and aquatic ecosystems. These forms of nitrogen include ammonia (NH3), nitrates (NO3), and nitrites (NO2). Nitrates are essential plant nutrients, but in excess amounts they can cause significant water quality problems. Together with phosphorus, nitrates in excess amounts can accelerate eutrophication, causing dramatic increases in aquatic plant growth and changes in the types of plants and animals that live in the stream. This, in turn, affects dissolved oxygen, temperature, and other indicators. Excess nitrates can cause hypoxia (low levels of dissolved oxygen) and can become toxic to warm-blooded animals at higher concentrations (10 mg/L or higher) under certain conditions. The natural level of ammonia or nitrate in surface water is typically low (less than 1 mg/L); in the effluent of wastewater treatment plants, it can range up to 30 mg/L.

Sources of nitrates include wastewater treatment plants, runoff from fertilized lawns and cropland, failing on-site septic systems, runoff from animal manure storage areas, and industrial discharges that contain corrosion inhibitors.

Sampling and equipment considerations

Nitrates from land sources reach rivers and streams more quickly than other nutrients like phosphorus because they dissolve in water more readily than phosphates, which tend to bind to soil particles. As a result, nitrates serve as a better indicator of possible sewage or manure pollution during dry weather.

Water that is polluted with nitrogen-rich organic matter might show low nitrates. Decomposition of the organic matter lowers the dissolved oxygen level, which in turn slows the rate at which ammonia is oxidized to nitrite (NO2) and then to nitrate (NO3). Under such circumstances, it might be necessary to also monitor for nitrites or ammonia, which are considerably more toxic to aquatic life than nitrate. (See Standard Methods sections 4500-NH3 and 4500-NO2 for the appropriate ammonia and nitrite methods; APHA, 1992.)

Water samples to be tested for nitrate should be collected in glass or polyethylene containers that have been prepared by using Method B in the introduction.

Volunteer monitoring programs usually use two methods for nitrate testing: the cadmium reduction method and the nitrate electrode. The more commonly used cadmium reduction method produces a color reaction that is then measured either by comparison to a color wheel or by use of a spectrophotometer. A few programs also use a nitrate electrode, which can measure in the range of 0 to 100 mg/L nitrate. A newer colorimetric immunoassay technique for nitrate screening is also now available and might be applicable for volunteers.

Cadmium Reduction Method

The cadmium reduction method is a colorimetric method that involves contact of the nitrate in the sample with cadmium particles, which cause nitrates to be converted to nitrites. The nitrites then react with another reagent to form a red color whose intensity is proportional to the original amount of nitrate. The red color is then measured either by comparison to a color wheel with a scale in milligrams per liter that increases with the increase in color hue, or by use of an electronic spectrophotometer that measures the amount of light absorbed by the treated sample at a 543-nanometer wavelength. The absorbance value is then converted to the equivalent concentration of nitrate by using a standard curve. Methods for making standard solutions and standard curves are presented at the end of this section.

This curve should be created by the program advisor before each sampling run. The curve is developed by making a set of standard concentrations of nitrate, reacting them and developing the corresponding color, and then plotting the absorbance value for each concentration against concentration. A standard curve could also be generated for the color wheel.

Use of the color wheel is appropriate only if nitrate concentrations are greater than 1 mg/L. For concentrations below 1 mg/L, a spectrophotometer should be used. Matching the color of a treated sample at low concentrations to a color wheel (or cubes) can be very subjective and can lead to variable results. Color comparators can, however, be effectively used to identify sites with high nitrates.

This method requires that the samples being treated are clear. If a sample is turbid, it should be filtered through a 0.45-micron filter. Be sure to test whether the filter is nitrate-free. If copper, iron, or other metals are present in concentrations above several mg/L, the reaction with the cadmium will be slowed down and the reaction time will have to be increased.

The reagents used for this method are often prepackaged for different ranges, depending on the expected concentration of nitrate in the stream. For example, the Hach Company provides reagents for the following ranges: low (0 to 0.40 mg/L), medium (0 to 4.5 mg/L), and high (0 to 30 mg/L). You should determine the appropriate range for the stream being monitored.
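If a program keeps track of the nitrate concentrations typically seen at its sites, the choice of reagent range can also be scripted. The following is a minimal Python sketch, not part of the standard procedure; it assumes the three Hach ranges listed above and simply picks the narrowest range that still covers the expected concentration.

    # Pick the narrowest prepackaged reagent range that covers an expected
    # nitrate concentration, using the Hach ranges listed above.
    RANGES_MG_L = [        # (range name, upper limit in mg/L)
        ("low", 0.40),
        ("medium", 4.5),
        ("high", 30.0),
    ]

    def choose_range(expected_mg_l: float) -> str:
        """Return the narrowest range that covers the expected concentration."""
        for name, upper in RANGES_MG_L:
            if expected_mg_l <= upper:
                return name
        return "above high range; dilute the sample or use another method"

    # Example: a stream that usually runs around 2 mg/L nitrate nitrogen.
    print(choose_range(2.0))   # prints "medium"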

Nitrate Electrode Method

A nitrate electrode (used with a meter) is similar in function to a dissolved oxygen meter. It consists of a probe with a sensor that measures nitrate activity in the water; this activity affects the electric potential of a solution in the probe. This change is then transmitted to the meter, which converts the electric signal to a scale that is read in millivolts. The millivolts are then converted to mg/L of nitrate by plotting them from a standard curve (see above). The accuracy of the electrode can be affected by high concentrations of chloride or bicarbonate ions in the sample water. Fluctuating pH levels can also affect the reading by the meter.

Nitrate electrodes and meters are expensive compared to field kits that employ the cadmium reduction method. (The expense is comparable, however, if a spectrophotometer is used rather than a color wheel.) Meter/probe combinations run between $700 and $1,200, including a long cable to connect the probe to the meter. If the program has a pH meter that displays readings in millivolts, it can be used with a nitrate probe, and no separate nitrate meter is needed. Results are read directly as milligrams per liter.

Although nitrate electrodes and spectrophotometers can be used in the field, they have certain disadvantages. These devices are more fragile than the color comparators and are therefore more at risk of breaking in the field. They must be carefully maintained and must be calibrated before each sample run and, if you are doing many tests, between samplings. This means that samples are best tested in the lab. Note that samples to be tested with a nitrate electrode should be at room temperature, whereas color comparators can be used in the field with samples at any temperature.

How to collect and analyze samples

The procedures for collecting and analyzing samples for nitrate consist of the following tasks:

TASK 1 Prepare the sample containers

If factory-sealed, disposable Whirl-pak® bags are used for sampling, no preparation is needed. Reused sample containers (and all glassware used in this procedure) must be cleaned before the first run and after each sampling by following the method described on page 128 under Method B. Remember to wear latex gloves.

TASK 2 Prepare before leaving for the sampling site

Refer to section 2.3 - Safety Considerations for details on confirming sampling date and time, safety considerations, checking supplies, and checking weather and directions. In addition to the standard sampling equipment and apparel, the following equipment is needed when analyzing nitrate nitrogen in the field:

  • Color comparator or field spectrophotometer with sample tubes (for reading absorbance of the sample)
  • Reagent powder pillows (reagents to turn the water red)
  • Deionized or distilled water to rinse the sample tubes between uses
  • Wash bottle to hold rinse water
  • Clearly labeled waste bottle with a secure lid to hold used cadmium particles; the bottle should be returned to the lab, where the cadmium will be properly disposed of
  • Mixing container with a mark at the sample volume (usually 25 mL) to hold and mix the sample
  • Clean, lint-free wipes to clean and dry the sample tubes

TASK 3 Collect the sample

Refer to Task 2 in Chapter 5 - Water Quality Conditions for details on collecting a sample using screw-cap bottles or Whirl-pak® bags.

TASK 4 Analyze the sample in the field

Cadmium Reduction Method With a Spectrophotometer

The following is the general procedure to analyze a sample using the cadmium reduction method with a spectrophotometer. However, this should not replace the manufacturer's directions if they differ from the steps provided below:

  1. Pour the first field sample into the sample cell test tube and insert it into the sample cell of the spectrophotometer.
  2. Record the bottle number on the lab sheet.
  3. Place the cover over the sample cell. Read the absorbance or concentration of this sample and record it on the field data sheet.
  4. Pour the sample back into the waste bottle for disposal at the lab.

Cadmium Reduction Method With a Color Comparator

To analyze a sample using the cadmium reduction method with a color comparator, follow the manufacturer's directions and record the concentration on the field data sheet.

TASK 5 Return the samples and the field data sheets to the lab/drop-off point for analysis

Samples being sent to a lab for analysis must be tested for nitrates within 48 hours of collection. Keep samples in the dark and on ice or refrigerated.

TASK 6 Determine results (for spectrophotometer absorbance or nitrate electrode) in lab

Preparation of Standard Concentrations

Cadmium Reduction Method With a Spectrophotometer

First determine the range you will be testing (low, medium, or high). The lower end of each range is set by the detection limit of your spectrophotometer; the high end is the endpoint of the range you are using. Use a nitrate nitrogen standard solution of appropriate strength for the range in which you are working. A 1-mg/L nitrate nitrogen (NO3-N) solution would be suitable for low-range (0 to 1.0 mg/L) tests. A 100-mg/L standard solution would be appropriate for medium- and high-range tests. In the following example, it is assumed that a set of standards for a 0 to 5.0 mg/L range is being prepared.

Example:

  1. Set out six 25-mL volumetric flasks (one for each standard). Label the flasks 0.0, 1.0, 2.0, 3.0, 4.0, and 5.0.
  2. Pour 30 mL of a 25-mg/L nitrate nitrogen standard solution into a 50-mL beaker.
  3. Use 1-, 2-, 3-, 4-, and 5-mL Class A volumetric pipets to transfer the corresponding volumes of the 25-mg/L nitrate nitrogen standard solution to the 25-mL volumetric flasks as follows:

     Standard Solution (mg/L)    mL of Nitrate Nitrogen Standard Solution
     0.0                         0
     1.0                         1
     2.0                         2
     3.0                         3
     4.0                         4
     5.0                         5

  4. Fill each flask to the 25-mL line with distilled, deionized water and mix. (A quick check of the resulting concentrations is sketched after this list.)
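The concentrations in the table follow from the standard dilution relation C1V1 = C2V2. The short Python sketch below, using the stock concentration, flask volume, and pipet volumes from the example above, is one way for a program advisor to double-check the arithmetic before a sampling run.

    # Dilution check for the 0 to 5.0 mg/L nitrate nitrogen standards
    # (25-mg/L stock, 25-mL volumetric flasks, as in the example above).
    STOCK_MG_L = 25.0                       # concentration of the stock standard, mg/L
    FLASK_ML = 25.0                         # final volume of each volumetric flask, mL
    PIPET_VOLUMES_ML = [0, 1, 2, 3, 4, 5]   # mL of stock transferred to each flask

    for v_stock in PIPET_VOLUMES_ML:
        # C1 * V1 = C2 * V2  ->  C2 = C1 * V1 / V2
        final_conc = STOCK_MG_L * v_stock / FLASK_ML
        print(f"{v_stock} mL of stock diluted to {FLASK_ML:.0f} mL -> {final_conc:.1f} mg/L NO3-N")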

Analysis of the Cadmium Reduction Method Standard Concentrations

Use the following procedure to analyze the standard concentrations.

  1. Add reagent powder pillows to the nitrate nitrogen standard concentrations.
  2. Shake each tube vigorously for at least 3 minutes.
  3. For each tube, wait at least 10 minutes but not more than 20 minutes to proceed.
  4. "Zero" the spectrophotometer using the 0.0 standard concentration and following the manufacturer's directions. Record the absorbance as "0" in the absorbance column on the lab sheet. Rinse the sample cell three times with distilled water.
  5. Read and record the absorbance of the 1.0-mg/L standard concentration.
  6. Rinse the sample cell test tube three times with distilled or deionized water. Avoid touching the lower part of the sample cell test tube. Wipe with a clean, lint-free wipe. Be sure that the lower part of the sample cell test tube is clean and free of smudges or water droplets.
  7. Repeat steps 5 and 6 for each of the remaining standards (2.0, 3.0, 4.0, and 5.0 mg/L).
  8. Prepare a calibration curve and convert absorbance to mg/L as follows:
    • Make an absorbance versus concentration graph on graph paper: (a) Make the vertical (y) axis and label it "absorbance." Mark this axis in 1.0 increments from 0 as high as the graph paper will allow. (b) Make the horizontal (x) axis and label it "concentration: mg/L as nitrate nitrogen." Mark this axis with the concentrations of the standards: 0.0, 1.0, 2.0, 3.0, 4.0, and 5.0.
    • Plot the absorbance of the standard concentrations on the graph.
    • Draw a "best fit" straight line through these points. The line should touch (or almost touch) each of the points. If it doesn't, the results of this procedure are not valid.
    • For each sample, locate the absorbance on the "y" axis, read over horizontally to the line, and then move down to read the concentration in mg/L as nitrate nitrogen.
    • Record the concentration on the lab sheet in the appropriate column.
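
The graphing procedure in step 8 can also be carried out numerically. The Python sketch below is one way to do so; the absorbance values are made-up placeholders, not measured data, and should be replaced with the readings from the lab sheet. It fits a least-squares line to the standard readings and then converts a sample absorbance to mg/L nitrate nitrogen by inverting that line.

    import numpy as np

    # Absorbance readings for the standard concentrations.
    # The absorbances below are placeholders; substitute the values from the lab sheet.
    standards_mg_l = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    absorbances = np.array([0.00, 0.10, 0.21, 0.30, 0.41, 0.50])

    # Fit a "best fit" straight line: absorbance = slope * concentration + intercept.
    slope, intercept = np.polyfit(standards_mg_l, absorbances, 1)

    # Check how well the line fits; if any point sits far off the line,
    # the results of the run are not valid.
    worst = np.max(np.abs(absorbances - (slope * standards_mg_l + intercept)))
    print(f"slope = {slope:.4f}, intercept = {intercept:.4f}, worst deviation = {worst:.3f}")

    def absorbance_to_mg_l(sample_absorbance: float) -> float:
        """Convert a sample absorbance to mg/L nitrate nitrogen using the fitted line."""
        return (sample_absorbance - intercept) / slope

    # Example: convert one field sample's absorbance reading.
    print(f"sample at 0.25 absorbance -> {absorbance_to_mg_l(0.25):.2f} mg/L NO3-N")

The fitted line replaces the hand-drawn best-fit line; as with the paper graph, a poor fit means the standards should be re-run.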

For Nitrate Electrode

Standards are prepared using nitrate standard solutions of 100 and 10 mg/L as nitrate nitrogen (NO3-N). All references to concentrations and results in this procedure will be expressed as mg/L as NO3-N. Eight standard concentrations will be prepared:

100.0, 10.0, 1.0, 0.8, 0.4, 0.32, 0.2, and 0.12 mg/L

Use the following procedure:

  1. Set out eight 25-mL volumetric flasks (one for each standard). Label the flasks 100.0, 10.0, 1.0, 0.8, 0.4, 0.32, 0.2, and 0.12.
  2. To make the 100.0-mg/L standard, pour 25 mL of the 100-mg/L nitrate standard solution into the flask labeled 100.0.
  3. To make the 10.0-mg/L standard, pour 25 mL of the 10-mg/L nitrate standard solution into the flask labeled 10.0.
  4. To make the 1.0-mg/L standard, use a 10- or 5-mL pipet to measure 2.5 mL of the 10-mg/L nitrate standard solution into the flask labeled 1.0. Fill the flask with 22.5 mL distilled, deionized water to the fill line. Rinse the pipet with deionized water.
  5. To make the 0.8-mg/L standard, use a 10- or 5-mL pipet or a 2-mL volumetric pipet to measure 2 mL of the 10-mg/L nitrate standard solution into the flask labeled 0.8. Fill the flask with about 23 mL distilled, deionized water to the fill line. Rinse the pipet with deionized water.
  6. To make the 0.4-mg/L standard, use a 10- or 5-mL pipet or a 1-mL volumetric pipet to measure 1 mL of the 10-mg/L nitrate standard solution into the flask labeled 0.4. Fill the flask with about 24 mL distilled, deionized water to the fill line. Rinse the pipet with deionized water.
  7. To make the 0.32-, 0.2-, and 0.12-mg/L standards, follow step 4 to make a 25-mL volume of 1.0-mg/L standard solution. Transfer this to a beaker. Pipet the following volumes of the 1.0-mg/L solution into the appropriately labeled volumetric flasks (a quick check of the resulting concentrations is sketched after this list):

     Standard Solution (mg/L)    mL of 1.0-mg/L Standard Solution
     0.32                        8
     0.20                        5
     0.12                        3

     Fill each flask to the fill line with distilled, deionized water. Rinse pipets with deionized water.
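As with the spectrophotometer standards, the arithmetic behind this two-stage dilution can be double-checked with C1V1 = C2V2. The Python sketch below uses the volumes and concentrations from the steps above to verify that the 1.0-mg/L intermediate and the 8-, 5-, and 3-mL transfers yield the 0.32-, 0.20-, and 0.12-mg/L standards.

    # Check the two-stage dilution for the low-range nitrate electrode standards.
    FLASK_ML = 25.0   # final volume of each volumetric flask, mL

    # Stage 1: 2.5 mL of the 10-mg/L standard diluted to 25 mL (step 4).
    intermediate_mg_l = 10.0 * 2.5 / FLASK_ML
    print(f"intermediate standard: {intermediate_mg_l:.2f} mg/L NO3-N")

    # Stage 2: portions of the 1.0-mg/L intermediate diluted to 25 mL (step 7).
    for v_ml in (8, 5, 3):
        final_mg_l = intermediate_mg_l * v_ml / FLASK_ML
        print(f"{v_ml} mL of intermediate diluted to {FLASK_ML:.0f} mL -> {final_mg_l:.2f} mg/L NO3-N")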

Analysis of the Nitrate Electrode Standard Concentrations

Use the following procedure to analyze the standard concentrations.

  1. List the standard concentrations (100.0, 10.0, 1.0, 0.8, 0.4, 0.32, 0.2, and 0.12) under "bottle #" on the lab sheet.
  2. Prepare a calibration curve and convert to mg/L as follows:
    • Plot absorbance or mV readings for the 100-, 10-, and 1-mg/L standards on semi-logarithmic graph paper, with concentration on the logarithmic (x) axis and the absorbance or millivolts (mV) on the linear (y) axis. For the nitrate electrode curve, a straight line with a slope of 58 ± 3 mV per decade at 25°C should result. That is, the readings for the 10- and 100-mg/L standard solutions should differ by 58 ± 3 mV.
    • Plot absorbance or mV readings for the 1.0-, 0.8-, 0.4-, 0.32-, 0.2-, and 0.12-mg/L standards on semi-logarithmic graph paper, with concentration on the logarithmic (x) axis and the millivolts (mV) on the linear (y) axis. For the nitrate electrode, the result here should be a curved line since the response of the electrode at these low concentrations is not linear.
    • For the nitrate electrode, recalibrate several times daily by checking the mV readings of the 10-mg/L and 0.4-mg/L standards and adjusting the calibration control on the meter until the meter again displays the readings recorded for those standards on the calibration curve.
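
For programs that prefer to compute the high-range conversion rather than read it off graph paper, the Python sketch below is one possible approach; the mV readings are made-up placeholders, not measured data. It fits a line to mV versus log10(concentration) for the 1-, 10-, and 100-mg/L standards, reports the slope so it can be checked against the expected 58 ± 3 mV per decade, and converts a sample reading to mg/L NO3-N. Readings below about 1 mg/L should still be taken from the hand-drawn curved portion of the plot, since the electrode response there is not linear.

    import math

    # mV readings for the high-range standards; the values below are placeholders.
    # Substitute the readings recorded on the lab sheet.
    standards_mg_l = [1.0, 10.0, 100.0]
    readings_mv = [154.0, 96.0, 38.0]   # electrode output typically drops as nitrate rises

    # Least-squares fit of mV against log10(concentration).
    x = [math.log10(c) for c in standards_mg_l]
    x_mean = sum(x) / len(x)
    y_mean = sum(readings_mv) / len(readings_mv)
    numerator = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, readings_mv))
    denominator = sum((xi - x_mean) ** 2 for xi in x)
    slope = numerator / denominator
    intercept = y_mean - slope * x_mean
    print(f"slope = {slope:.1f} mV per decade (magnitude should be within 58 +/- 3 mV at 25 C)")

    def mv_to_mg_l(sample_mv: float) -> float:
        """Convert an electrode reading (mV) to mg/L NO3-N using the fitted line."""
        return 10 ** ((sample_mv - intercept) / slope)

    # Example: convert one sample reading.
    print(f"sample at 70.0 mV -> {mv_to_mg_l(70.0):.1f} mg/L NO3-N")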

References

APHA. 1992. Standard methods for the examination of water and wastewater. 18th ed. American Public Health Association, Washington, DC.

