This fifth National Directory of Volunteer Environmental Monitoring Programs is a tribute to the energy, imagination, and dedication of tens of thousands of volunteer monitors across the country. Inspired by the belief that everyone--not just professionals with specialized degrees--can study the natural world and collect meaningful data, trained volunteer monitors spend countless hours in the field making careful observations and measurements.
From the first edition (published in 1988), the Directory has focused primarily on monitoring of aquatic environments--rivers, lakes, estuaries, and wetlands. This fifth edition is no exception. The overwhelming majority of the 772 groups listed here monitor water bodies. Yet many do so in the context of the watershed as a whole, which means that they monitor not just the water body itself but also the vegetation, wildlife, and land uses in the surrounding landscape.
The survey conducted for this Directory represents the second time that detailed nationwide information about volunteer monitoring activities has been gathered. The first was the survey conducted for the previous (fourth) edition. That edition, published in January 1994, included 517 programs. (The first three editions were more modest in scope; they provided contact information and program descriptions but did not quantify information or create a database.)
In gathering the information for this edition, we were more impressed than ever by the broad scope of monitoring activities. Volunteer monitors--or citizen scientists, as they are sometimes called--have not been afraid to venture into almost any branch of environmental science. The pages of this Directory offer many examples of volunteers performing biological studies--monitoring stream insects, watching out for invasive species, banding birds, identifying aquatic plants, even observing the behavior of harbor seals. An even larger number of volunteers are engaged in chemistry as they carry out basic water quality tests (dissolved oxygen is measured by over two-thirds of the programs listed). Other volunteers are delving into the field of public health, testing for bacteria in swimming areas or monitoring shellfish for paralytic shellfish poisoning, while still others study the physical side of aquatic systems--for example, measuring stream flow rate, channel shape, or streambed particle size.
Interpreting the survey data
As volunteers know, the conclusions drawn from monitoring data can only be as good as the data. The same principle applies to interpreting the survey data presented in this Directory. Although the survey form asked for seemingly straightforward information--what parameters do you monitor, where does funding come from, how many volunteers are involved--it was not always easy for respondents to provide accurate answers. Volunteer monitoring programs are continually changing and evolving as new monitoring activities are added, new volunteers sign up, funding sources change, and so on. Essentially, the survey was asking respondents to hit a moving target.
Moreover, real life rarely fits neatly into the kinds of categories seen on survey forms. To give just one example, volunteer monitoring programs very often partner with other groups on collaborative projects--so how does each partner determine which activities and volunteers to count as "theirs" for the purpose of filling out the form?
Finally, no survey, no matter how extensive, could succeed in capturing every volunteer monitoring program that's out there. Thus, not only are our measurements imprecise, but our dataset is incomplete. Yet in spite of these inevitable shortcomings, the dataset does yield useful conclusions--as long as we bear in mind that the picture we are seeing was painted with a very broad brush.
Survey data on the Web
The Directory database is also available on the Environmental Protection Agency's volunteer monitoring Website at http://yosemite.epa.gov/water/volmon.nsf. This electronic database is updatable--users can visit the Website to revise program listings or add new volunteer monitoring programs.
Note that the electronic database contains some additional information not included in the printed Directory--for example, it lists all the counties in which a program is active, and indicates whether the program has an approved quality assurance/quality control plan for monitoring. The electronic database also contains many additions and updates that were made after the hardcopy directory went to press.
Volunteer monitoring around the U.S.
A glance at the maps below shows that volunteer monitoring activity is not evenly distributed around the country, but is concentrated in the Northeast and Great Lakes regions, along the West Coast, and (to a somewhat lesser extent) along the Gulf Coast.
Not surprisingly, the distribution of volunteer monitoring programs tends to follow the distribution of water in the U.S. The least active region--the Great Plains--is also the driest, with low rainfall and limited surface water.
The distribution of programs also reflects the history of volunteer monitoring: volunteer lake and stream monitoring programs got started in the Northeast and Great Lakes regions, and many of those pioneering programs are still going strong after 20 or more years. (Some examples: the Izaak Walton League of America's Save Our Streams Program, started in Maryland in 1969; Maryland's Save Our Streams/Adopt-A-Stream program, founded in 1970; Maine's Volunteer Lake Monitoring Program, 1971; Minnesota's Citizen Lake Monitoring Program, 1973; the Michigan Cooperative Lakes Monitoring Program, 1974; and the New Hampshire Lakes Lay Monitoring Program, 1978).
Growth of volunteer monitoring
The Directory chronicles the continued growth of volunteer environmental monitoring. This fifth edition includes more programs than ever before (772, compared to 517 in the 1994 edition). It's also the first edition to list volunteer monitoring programs for every state in the Union. And while only five states in the last Directory had 25 or more programs, this edition includes eleven states with at least that many.
As Chart 1 and Table 1 show, rivers continue to be the environment monitored by the largest number of volunteer monitoring programs. Five hundred eighty-five programs--just over three-fourths of the respondents--include river monitoring in their activities.
Even though lakes still come in a distant second to rivers, more programs are monitoring lakes now than in the previous survey (Table 1). There has also been an increase in the percentage of programs that monitor wetlands, while estuaries are monitored by about the same proportion of programs as before.
Air, coral reef, and land were new choices added to the survey form this time. As the graph shows, volunteer programs are beginning to make forays into monitoring air and coral reefs, but these activities are still quite rare.
Monitoring on land, on the other hand, is becoming fairly common. Activities such as construction site inspections, land use mapping, and storm drain monitoring can identify land-based sources of pollution to a water body. Surveys of terrestrial wildlife are another way to monitor the landscape. For example, monitoring bird or amphibian populations in the area surrounding a wetland can help assess how well the entire ecosystem is functioning.
Programs that extend their monitoring into the surrounding landscape are demonstrating a whole-watershed approach. Further evidence of a watershed approach is seen in Table 2, which shows that over half of the programs (411, or 53%) monitor more than one environment (e.g., estuary plus river/stream, or lake plus wetland plus land), and 27% monitor three or more. This diversification reflects an awareness that all parts of a watershed are connected; that to gauge the health of, for example, a lake, you need to look not just at the lake itself but at the upstream tributaries and the surrounding land uses.
Table 3 shows the number of volunteer monitoring programs that participate in each monitoring activity listed on the survey form. Bear in mind that the form asked respondents to list all parameters monitored. Thus, programs that monitor more than one environment (and the majority do, as we saw in Table 2) would have checked any parameter that was tested for any of those environments. Since so many programs include river monitoring, parameters that are widely monitored in rivers--for example, macroinvertebrates--tend to rank high in the table.
The "big three" parameters--temperature, dissolved oxygen (DO), and pH--also came in at the top in the previous survey. These three parameters are relatively easy to measure and are important indicators of the ability of any surface water--river, lake, estuary, or wetland--to support aquatic life.
The low number of programs that measure pesticides, metals, and hydrocarbons probably reflects the unavailability of simple, reliable, low-cost methods. It seems likely that many more volunteer groups would want to test for these pollutants if appropriate methods were available. Since they are not, volunteer programs monitor the biological response of organisms such as macroinvertebrates, aquatic vegetation, fish, and other wildlife. The abundance, diversity, and/or condition of these organisms will reflect the overall health of the system and suggest whether toxic levels of pollutants are present.
River and lake parameters
To get a sense of which parameters are most commonly monitored for the different types of water bodies, it is helpful to examine separately those programs that monitor just a single environment. Table 4 shows the parameters tested by the 245 programs that monitor exclusively rivers and streams, and Table 5 shows those monitored by the 77 groups that monitor lakes only.
In comparing Tables 4 and 5, some clear differences can be seen. For example, for the "lakes-only" programs Secchi transparency, chlorophyll, exotic/invasive species, and aquatic vegetation all rank very high (1, 5, 8, and 9, respectively), but these same parameters rank quite low (17, 36, 31, and 25) for programs that monitor only rivers. Meanwhile, macroinvertebrate monitoring, which is the third most popular activity for the river groups, comes in at number 21 for the lake programs. Similar differences in methods are also seen in professional monitoring programs.
Why these differences? Mainly they arise from the fundamental differences between rivers and lakes. In general, rivers flow and tend to be shallow, while the water in a lake is relatively still and deep. As a result, lakes and rivers have some different problems--and even when they have the same problems, sometimes different methods must be used.
In many lakes, a major concern is excessive growth of algae and aquatic plants, caused by nutrient overenrichment. Thus, most lake programs monitor chlorophyll (a measure of algal growth) and aquatic vegetation. (Algal and plant overgrowth is usually less of a problem in rivers, where nutrients are swept along instead of accumulating in one place.)
The Secchi disk--one of the most widely used monitoring tools in the world--is extremely popular among lake monitors because it's a quick, simple, low-cost way to measure water clarity. Many river monitors are also interested in water clarity, but often they can't use a Secchi disk, either because the current is too strong for the disk to hang straight down and/or because the water is too shallow.
Bottom-dwelling macroinvertebrates (primarily aquatic insect larvae) are an ideal parameter for river and stream monitoring because they integrate impacts over time. Even after pollutants themselves have been flushed downstream, their effects can still be seen in the invertebrate community. At present, lake macroinvertebrate monitoring techniques suitable for volunteers have not been developed (methods used by professionals require dredging or diving).
Of course many water quality concerns are the same for both rivers and lakes, and this is reflected in the fact that five parameters (water temperature, pH, DO, nitrogen, and phosphorus) rank in the top 10 in both tables.
Estuary and wetland parameters
It is somewhat difficult to draw firm conclusions about the parameters monitored by estuary-only and wetland-only programs, for the simple reason that our datasets are too small. Of the 144 programs in the Directory that monitor estuaries, a mere 13 confine themselves exclusively to estuaries; all the rest monitor at least one other environment. And only 9 programs monitor just wetlands.
Yet even with these small datasets, some interesting patterns emerge. The top 10 parameters for the 13 estuaries-only programs are water temperature, DO, and salinity (all tied for first place), Secchi transparency, pH, nitrogen, turbidity, chlorophyll, rainfall, and aquatic vegetation. Except for salinity, these are remarkably similar to the top 10 parameters monitored by the lakes-only programs. This makes sense because estuaries, which are by definition semi-enclosed, often resemble lakes--that is, they are broad, open bodies of water, usually deeper than rivers, that lack a swiftly moving current. In estuaries where shellfish are harvested, bacteria (which just missed the "top 10" list) are another very important parameter.
For the nine wetlands-only programs, the 10 most commonly measured parameters are aquatic vegetation, birds, flow/water level, wildlife, exotic/invasive species, amphibians, habitat assessments, pH, terrestrial vegetation, and water temperature. These are quite different from the "top 10" list for any of the other water bodies. Wetlands, which consist of both land and water, are biologically complex and rich. They often contain large areas of emergent or submerged aquatic plants. Water may be shallow, or present only at certain times of year, making it difficult to perform standard water quality tests. The choice of wetland parameters is also influenced by the tradition of assessing wetlands in terms of how well they perform various "functions," such as providing flood control or wildlife habitat.
Other activities
Most volunteer monitoring programs participate in other activities besides just monitoring. The survey form asked specifically about debris cleanups (checked off by 46% of respondents), restoration (checked by 32%), and storm drain stenciling (checked by 21%). A number of respondents also wrote in other activities--for example, building and maintaining interpretive trails and bike paths; tracking compliance with permits; raising salmon; maintaining a telephone hotline; and a variety of community outreach activities, such as slide shows, public speaking, and community festivals.
Data uses and users
The survey form asked respondents to fill in a matrix of data uses and users; the compiled results from all the surveys are shown in Table 6.
Charts 2 and 3 present these results in a slightly different way. Chart 2 shows the total number of programs that checked a particular user for any use (regardless of whether they checked that user for one use or for 16). Similarly, Chart 3 shows the number of programs that checked a particular data use at least once. (For example: Table 6 tells us that 202 programs' data are used by state government for education and 143 programs' data are used by state government for watershed planning. Chart 2 tells us that 430 programs' data are used by state government for something.)
The No. 1 user of volunteer monitoring data turns out to be monitoring programs themselves--85% of respondents checked "our program" for at least one data use (Chart 2). Moreover, a look at Table 6 reveals that "our program" was the primary user in all but four categories of data use (the exceptions are land use decisions, enforcement, shellfish bed closures, and state 305(b) report).
This result is interesting because discussions about "ensuring that volunteer data are used" sometimes jump quickly to identifying potential users outside the monitoring group--state environmental agencies, local planning commissions, universities, and so forth--and considering how volunteer monitoring data can best meet their needs. The survey is a reminder that the first and foremost question to consider is how the monitoring group itself will use the data.
All of this is not to say that other data users are not important--they are. In fact, use of volunteer data by state government, by local government, and by community organizations was each reported by more than half the programs.
As Chart 3 shows, education is the clear front-runner in the "data uses" category, with 647, or 84%, of programs reporting this use. The next three most common uses--establishing baseline conditions (67%), screening for problems (61%), and research (53%)--pretty well define the fundamental purposes of any type of monitoring. Professionals and volunteers alike monitor an environment to characterize it, find out if anything is wrong, and answer specific questions.
Taken together, the top four data uses tell us that volunteer monitoring data are being widely used to keep communities, elected officials, and resource management agencies informed about the condition of local water bodies and the problems that need to be addressed.
Next on the list of uses are advocacy (48%) and community organizing (46%). These go hand in hand with education, but take it a step further--beyond simply informing people to mobilizing them to take action. The following four uses--watershed planning, assessing nonpoint source pollution, planning restoration projects, and land use decisions--illustrate the role of volunteer monitoring data in guiding many kinds of local resource management decisions.
Section 305(b) of the Clean Water Act requires all states to submit to Congress a biennial assessment of the quality of their waters. EPA has told states that they may incorporate quality-assured data collected by trained volunteers into the 305(b) report. According to the survey results, many states are doing just that: 107 programs, representing 35 states, reported that their data are used in their states' 305(b) reports. (In the 1994 Directory, 53 programs in 27 states reported this use.)
Data quality assurance
Ensuring data quality goes hand in hand with having the data used, and a good way to document data quality is to have a written plan for quality assurance/quality control (QA/QC). Forty-four percent of respondents indicated that they do have such a plan, with 27% reporting that the plan is state-approved, and 18% that it is EPA-approved. (Individual programs' responses regarding their QA/QC plans are not included in this Directory, but may be found on the electronic database.)
Number of volunteers
Volunteer monitoring groups tend to be small: as Table 7 shows, programs with 50 or fewer volunteers account for the majority of groups in the Directory (53%). However, compared to the last survey, there seems to be a trend toward slightly larger programs. Whereas only 21% of programs in the previous directory had more than 100 volunteers, now 35% have over 100 (Table 8). Also, the median number of volunteers per program increased from 25 to 40. (Note: In calculating the above statistics, teachers and students were counted along with other volunteers.)
How many volunteer monitors are currently active in the U.S.? This is a very slippery number to get hold of. The sum of all volunteers (including teachers and students) reported by all the programs in the Directory is 462,209. This includes 175,006 participants in the Center for Marine Conservation's 1997 International Coastal Cleanup, which is by far the largest single program listed. (The second-largest is Kentucky Water Watch, with 33,147 total volunteers, and the third-largest is Cornell Laboratory of Ornithology, with 12,000.) By comparison, the 1994 edition had a total of 346,313 volunteers, including 161,000 from the Coastal Cleanup.
But this figure of 462,209 leaves out a lot of volunteer monitors. In fact, some of the very largest programs--the big regional and national networks--were the least able to estimate the size of their volunteer corps. For example, the National Audubon Society left the item blank, saying they "couldn't even guess" at the total number of volunteers in all their projects nationwide--though they do know it's more than 70,000 (Audubon's Christmas Bird Count alone involves 50,000 volunteers). The Rivers Project reported 3,000 teachers but left a blank for the number of students, explaining that they don't keep track of that figure. Conservatively estimating 15 students per teacher, we can conclude that at least 45,000 Rivers Project students went uncounted in our total.
So we could revise our total by adding in 70,000 for Audubon and 45,000 for the Rivers Project. That gives us 577,209 volunteers--more than half a million. Yet even this figure is so rough that all we can say for certain is that the actual number must be considerably greater.
The survey results confirm the tremendous popularity of environmental monitoring in classrooms. Over half (52%) of programs in the Directory include teachers, students, or both among their volunteer monitors. And of the 462,209 total volunteer monitors counted in the database, 12,027 (3%) are teachers and 197,364 (43%) are students.
Budgets and funding
Volunteer monitoring has a reputation for being cost-effective, and the survey results validate this idea. Nearly one-fifth (19%) of the programs reported rock-bottom annual budgets of $100 or less, while 44% came in at $1,000 or less. The median annual budget was just $2,000.
The survey form listed eight possible funding sources and asked respondents to check all that provided them with financial support. (The survey did not ask for information about the amount of funding provided by each source.) Chart 4, which summarizes the results, is rather remarkable for its uniformity--that is, we don't see any one funding source being checked off by a huge majority of groups, nor any source checked by only a few. Instead, support for volunteer monitoring seems quite evenly spread out among a number of funding sources.
Government emerges as a very important funding source, with three of the top four sources being state, local, and federal government (in that order). But volunteer monitoring is also a "bootstrap" operation; 30% of organizations receive support from their own members and 25% conduct grassroots fundraising. In addition, contributions from members and local communities are probably included in the broad and rather vague category of "donations."
Foundations rank toward the lower end but still are a source of funding for 29% of the programs. Businesses are providing support to about one-fifth of groups; this may be an untapped resource that more programs should consider approaching.
As any financial advisor will tell you, diversification is the key to financial security. Looking at Table 10, we can see that 31% of volunteer monitoring programs are in the potentially risky position of having just one source of funding. Sixty-nine percent have two or more sources, 44% checked three or more, and a fortunate 6% enjoy very broad support, with six or more different sources of funding.
How the survey was conducted
The survey form (reproduced below) was initially disseminated as part of the Spring 1997 issue of "The Volunteer Monitor" newsletter, which was mailed to some 10,000 subscribers and also distributed at conferences and meetings. Later, survey forms were mailed to over 2,000 names from various national and state listings of volunteer monitoring programs.
The information from the survey forms was entered into an electronic database (FoxPro), from which the information in this Directory was generated. The database is also posted on EPA's volunteer monitoring Website (www.epa.gov/owow/monitoring/volunteer/).
In cases where handwritten survey responses were difficult to read, we attempted to verify the information by phone or letter. If no response was received, we made our best guess or else left information out. Some program descriptions were edited for clarity or length. We apologize if we inadvertently introduced any errors.
Survey for National Volunteer Monitoring Directory
|Name of person completing questionnaire: ________________________________________
|Monitoring program name, exactly as you want it listed in the Directory. Note: Programs will be listed alphabetically. Use the first line below for the name most people will look for -- e.g., if "Friends of Fox Lake" has a monitoring project called "Citizen Watch," they should list Friends of Fox Lake on the first line and Citizen Watch on the second line.
|Affiliation, if you are part of a national, statewide, or regional network (e.g., Izaak Walton League, Texas Watch):
|Monitoring program coordinator(s): ____________________________________________
|Does your program serve as an "umbrella" organization for smaller monitoring groups? Y N
|NOTE: The questions below refer only to the portion of your program devoted to volunteer monitoring.
|# Active volunteers (excluding school classes):
|For programs that work with schools: # Teachers: ______ # Students: ______
|Approx. annual monitoring budget: $___________
|Year monitoring began: 19____
|Sources of funding or in-kind support:
|__ fed. gov't
|__ state gov't
|__ local gov't
|__ grassroots fundraising (events, solicitations, etc.)
|Does program have a written QA (quality assurance) plan? Y N
Is it state-approved? Y N
EPA-approved? Y N
|Does program have monitoring-related publications you are willing to share with, or sell to, other groups? Y N
|Counties in which you monitor. This information will be used to locate your monitoring activities in EPA's "Surf Your Watershed" Web site. Please list ALL counties in which you monitor, by both county name and state (attach extra sheet if needed).
|Program description. Please tell us what you would most like people to know about your program (e.g., water bodies and watersheds monitored; major monitoring projects and related activities; international projects). Space is limited! Please be brief!
Understanding the entries
The entries in the Directory are arranged alphabetically within each state. The hypothetical entry below shows the type of information included:
Volunteer Monitoring Program (1990)
P.O. Box 1234, Anytown, USA 00000
ph 000-123-1234 • email firstname.lastname@example.org
Coordinator Dr. Secchi; Marty Monitor
river/stream, lake/pond, estuary Volunteers 25, + 2 teachers/30 students
Phys/chem water temp., pH, DO, BOD, Secchi, turbidity, nitrogen, phosphorus, TSS/TDS, hardness, alkalinity, flow/water level
Biological macroinvert., habitat assessments, bacteria, chlorophyll, aquatic veg., shellfish, wildlife, exotic/invasive spp.
Other activities debris cleanup, land use surveys, photo surveys, stream channel morph., storm drain stenciling, construction site inspec., restoration
Data users our program, community org's, fed., state, and local gov't, univ. scientists
Data uses educ., community organizing, screen for problems, estab. baseline conditions, nonpoint source assessment, BMP evaluation, plan restoration, state 305(b) report
Funding sources fed., state, and local gov't, foundations, donations, grassroots fundraising
Annual budget ~$500
Affiliation Water Partners
Abbreviations and explanations:
BOD - biochemical oxygen demand
BMP - best management practices
DO - dissolved oxygen
TSS/TDS - total suspended solids, total dissolved solids
State 305(b) report - an assessment of a state's waters, which states are required to submit to Congress biennially
Volunteers - number of volunteers currently involved in monitoring. In the above example, the program's volunteer monitors consist of 2 teachers, 30 students, and 25 other volunteers.
Order of elements in entries
The elements in the entries follow a set order, which was generated by the computer database. For example, in the list of environments monitored, river/stream always comes before estuary, and estuary always comes before wetland. Similarly, for the physical/chemical parameters, water temperature always comes before pH.
Thus the order of elements has no meaning--it does not reflect the relative importance of various activities for an individual program. Even if a program monitors 20 lakes and just one river, river/stream will come first in the list of environments monitored. Or if 90 percent of a program's funding comes from foundations and just 10 percent from local government, local government will still be listed first.
For programs with more than one coordinator, the order of the names was generated by the computer. This order may or may not match the order on the original survey form, and no significance should be attached to it.
National Directory of Volunteer Environmental Monitoring Programs (fifth edition, 1998)
Provides contact and basic program information on volunteer monitoring programs nationwide.