PART 58--AMBIENT AIR QUALITY SURVEILLANCE

Information provided for informational purposes only. Note: EPA no longer updates this information, but it may be useful as a reference or resource.
[Code of Federal Regulations]
[Title 40, Volume 5, Parts 53 to 59]
[Revised as of July 1, 2000]
From the U.S. Government Printing Office via GPO Access
[CITE: 40CFR58]

[Page 205-291]

                   TITLE 40--PROTECTION OF ENVIRONMENT

               CHAPTER I--ENVIRONMENTAL PROTECTION AGENCY

PART 58--AMBIENT AIR QUALITY SURVEILLANCE

                      Subpart A--General Provisions

Sec.
58.1  Definitions.
58.2  Purpose.
58.3  Applicability.

                     Subpart B--Monitoring Criteria

58.10  Quality assurance.
58.11  Monitoring methods.
58.12  Siting of instruments or instrument probes.
58.13  Operating schedule.
58.14  Special purpose monitors.

       Subpart C--State and Local Air Monitoring Stations (SLAMS)

58.20  Air quality surveillance: plan content.
58.21  SLAMS network design.
58.22  SLAMS methodology.
58.23  Monitoring network completion.
58.24  [Reserved]
58.25  System modification.
58.26  Annual State air monitoring report.
58.27  Compliance date for air quality data reporting.
58.28  SLAMS data submittal.

           Subpart D--National Air Monitoring Stations (NAMS)

58.30  NAMS network establishment.
58.31  NAMS network description.
58.32  NAMS approval.
58.33  NAMS methodology.
58.34  NAMS network completion.
58.35  NAMS data submittal.
58.36  System modification.

     Subpart E--Photochemical Assessment Monitoring Stations (PAMS)

58.40  PAMS network establishment.
58.41  PAMS network description.
58.42  PAMS approval.
58.43  PAMS methodology.
58.44  PAMS network completion.
58.45  PAMS data submittal.
58.46  System modification.

                 Subpart F--Air Quality Index Reporting

58.50  Index reporting.

                      Subpart G--Federal Monitoring

58.60  Federal monitoring.
58.61  Monitoring other pollutants.

Appendix A to Part 58--Quality Assurance Requirements for State and
          Local Air Monitoring Stations (SLAMS)
Appendix B to Part 58--Quality Assurance Requirements for Prevention of
          Significant Deterioration (PSD) Air Monitoring
Appendix C to Part 58--Ambient Air Quality Monitoring Methodology
Appendix D to Part 58--Network Design for State and Local Air Monitoring
          Stations (SLAMS), National Air Monitoring Stations (NAMS), and
          Photochemical Assessment Monitoring Stations (PAMS)
Appendix E to Part 58--Probe and Monitoring Path Siting Criteria for
          Ambient Air Quality Monitoring
Appendix F to Part 58--Annual SLAMS Air Quality Information
Appendix G to Part 58--Uniform Air Quality Index (AQI) and Daily
          Reporting

    Authority: 42 U.S.C. 7410, 7601(a), 7613, and 7619.

    Source: 44 FR 27571, May 10, 1979, as amended at 59 FR 41628, Aug.
12, 1994.

                      Subpart A--General Provisions

Sec. 58.1  Definitions.

    As used in this part, all terms not defined herein have the meaning
given them in the Act:
    Act means the Clean Air Act as amended (42 U.S.C. 7401, et seq.).
    Administrator means the Administrator of the Environmental
Protection Agency (EPA) or his or her authorized representative.
    Aerometric Information Retrieval System (AIRS)-Air Quality Subsystem
(AQS) is EPA's computerized system for storing and reporting information
relating to ambient air quality data.
    Annual State air monitoring report is an annual report, prepared by
control agencies and submitted to EPA for approval, that consists of an
annual data summary report for all pollutants and a detailed report
describing any proposed changes to their air quality surveillance
network.
    CO means carbon monoxide.
    Community Monitoring Zone (CMZ) means an optional averaging area
with established, well defined boundaries, such as county or census
block, within a MPA that has relatively uniform concentrations of annual
PM2.5 as defined by appendix D of this part. Two or more core
SLAMS and other monitors within a CMZ that meet certain requirements as
set forth in Appendix D of this part may be averaged for making
comparisons to the annual PM2.5 NAAQS.
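    The averaging described above can be illustrated with a short sketch
(Python; the monitor labels and the NAAQS level shown are assumed for the
example only): the CMZ value compared to the annual PM2.5 NAAQS is taken
here as the arithmetic mean of the annual means of the eligible monitors.

    # Illustrative sketch only. Assumes each eligible monitor in a CMZ
    # reports an annual mean PM2.5 concentration in ug/m3; the CMZ value
    # compared to the annual NAAQS is the arithmetic mean of those values.
    def cmz_annual_average(annual_means):
        """Average the annual PM2.5 means of the monitors in one CMZ."""
        if len(annual_means) < 2:
            raise ValueError("averaging assumes two or more eligible monitors")
        return sum(annual_means) / len(annual_means)

    # Hypothetical monitor values and an assumed annual NAAQS level.
    monitors = {"core_slams_1": 14.2, "core_slams_2": 15.8, "slams_3": 13.9}
    annual_naaqs_ugm3 = 15.0
    average = cmz_annual_average(list(monitors.values()))
    print(f"CMZ average: {average:.1f} ug/m3 "
          f"({'meets' if average <= annual_naaqs_ugm3 else 'exceeds'} the assumed level)")
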
    Consolidated Metropolitan Statistical Area (CMSA) means the most
recent area as designated by the U.S. Office of Management and Budget
and population figures from the Bureau of the Census. The Department of
Commerce provides that within metropolitan complexes of 1 million or
more population, separate component areas are defined if specific
criteria are met. Such areas are designated primary metropolitan
statistical areas (PMSAs), and any area containing PMSAs is designated a
CMSA.
    Core PM2.5 SLAMS means community-oriented monitoring
sites representative of community-wide exposures that are the basic
component sites of the PM2.5 SLAMS regulatory network. Core
PM2.5 SLAMS include community-oriented SLAMS monitors, and
sites collocated at PAMS.
    Corrected concentration pertains to the result of an accuracy or
precision assessment test of an open path analyzer in which a high-
concentration test or audit standard gas contained in a short test cell
is inserted into the optical measurement beam of the instrument. When
the pollutant concentration measured by the analyzer in such a test
includes both the pollutant concentration in the test cell and the
concentration in the atmosphere, the atmospheric pollutant concentration
must be subtracted from the test measurement to obtain the corrected
concentration test result. The corrected concentration is equal to the
measured concentration minus the average of the atmospheric pollutant
concentrations measured (without the test cell) immediately before and
immediately after the test.
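    In quantitative terms, the relationship above can be sketched as
follows (Python; the numbers are illustrative only):

    # Sketch of the corrected-concentration arithmetic described above.
    # Any consistent concentration units may be used.
    def corrected_concentration(measured_with_cell, ambient_before, ambient_after):
        """Test-cell measurement minus the average of the ambient readings
        taken immediately before and immediately after the test."""
        return measured_with_cell - (ambient_before + ambient_after) / 2.0

    # Example: 480 ppb measured with the test cell in the beam, with 42 and
    # 46 ppb ambient readings before and after, gives 436 ppb corrected.
    print(corrected_concentration(480.0, 42.0, 46.0))  # 436.0
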
    Correlated acceptable continuous (CAC) PM analyzer means an optional
fine particulate matter analyzer that can be used to supplement a
PM2.5 reference or equivalent sampler, in accordance with the
provisions of Sec. 58.13(f).
    Effective concentration pertains to testing an open path analyzer
with a high-concentration calibration or audit standard gas contained in
a short test cell inserted into the optical measurement beam of the
instrument. Effective concentration is the equivalent ambient-level
concentration that would produce the same spectral absorbance over the
actual atmospheric monitoring path length as produced by the high-
concentration gas in the short test cell. Quantitatively, effective
concentration is equal to the actual concentration of the gas standard
in the test cell multiplied by the ratio of the path length of the test
cell to the actual atmospheric monitoring path length.
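    As a worked sketch of the ratio described above (Python; illustrative
numbers only):

    # effective = cell concentration * (cell path length / monitoring path length)
    def effective_concentration(cell_concentration, cell_path_m, monitoring_path_m):
        """Equivalent ambient-level concentration producing the same absorbance
        over the monitoring path as the standard gas does in the short test cell."""
        return cell_concentration * (cell_path_m / monitoring_path_m)

    # Example: a 10 ppm standard in a 0.1 m test cell, viewed over a 200 m
    # monitoring path, is equivalent to 0.005 ppm (5 ppb) in ambient air.
    print(effective_concentration(10.0, 0.1, 200.0))  # 0.005
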
    Equivalent method means a method of sampling and analyzing the
ambient air for an air pollutant that has been designated as an equivalent
method in accordance with part 53 of this chapter; it does not include a
method for which an equivalent method designation has been canceled in
accordance with Sec. 53.11 or Sec. 53.16 of this chapter.
    Indian Governing Body means the governing body of any tribe, band,
or group of Indians subject to the jurisdiction of the United States and
recognized by the United States as possessing power of self-government.
    Indian Reservation means any Federally recognized reservation
established by treaty, agreement, executive order, or act of Congress.
    Local agency means any local government agency, other than the State
agency, which is charged with the responsibility for carrying out a
portion of the plan.
    Meteorological measurements means measurements of wind speed, wind
direction, barometric pressure, temperature, relative humidity, and
solar radiation.
    Metropolitan Statistical Area (MSA) means an area as designated by the
most recent decennial U.S. Census of Population Report.
    Monitor is a generic term for an instrument, sampler, analyzer, or
other device that measures or assists in the measurement of atmospheric
air pollutants and which is acceptable for use in ambient air
surveillance under the provisions of appendix C to this part, including
both point and open path analyzers that have been designated as either
reference or equivalent methods under part 53 of this chapter and air
samplers that are specified as part of a manual method that has been
designated as a reference or equivalent method under part 53 of this
chapter.
    Monitoring path for an open path analyzer is the actual path in
space between two geographical locations over which the pollutant
concentration is measured and averaged.
    Monitoring path length of an open path analyzer is the length of the
monitoring path in the atmosphere over which the average pollutant
concentration measurement (path-averaged concentration) is determined.
See also, optical measurement path length.
    Monitoring Planning Area (MPA) means a contiguous geographic area
with established, well defined boundaries, such as a metropolitan
statistical area, county or State, having a common area that is used for
planning monitoring locations for PM2.5. MPAs may cross State
boundaries, such as the Philadelphia PA-NJ MSA, and be further
subdivided into community monitoring zones. MPAs are generally oriented
toward areas with populations greater than 200,000, but for convenience,
those portions of a State that are not associated with MSAs can be
considered as a single MPA. MPAs must be defined, where applicable, in a
State PM monitoring network description.
    NAMS means National Air Monitoring Station(s). Collectively the NAMS
are a subset of the SLAMS ambient air quality monitoring network.
    NO2 means nitrogen dioxide.
    NO means nitrogen oxide.
    NOX means oxides of nitrogen and is defined as the sum of the
concentrations of NO2 and NO.
    O3 means ozone.
    Open path analyzer is an automated analytical method that measures
the average atmospheric pollutant concentration in situ along one or
more monitoring paths having a monitoring path length of 5 meters or
more and that has been designated as a reference or equivalent method
under the provisions of part 53 of this chapter.
    Optical measurement path length is the actual length of the optical
beam over which measurement of the pollutant is determined. The path-
integrated pollutant concentration measured by the analyzer is divided
by the optical measurement path length to determine the path-averaged
concentration. Generally, the optical measurement path length is:
    (1) Equal to the monitoring path length for a (bistatic) system
having a transmitter and a receiver at opposite ends of the monitoring
path;
    (2) Equal to twice the monitoring path length for a (monostatic)
system having a transmitter and receiver at one end of the monitoring
path and a mirror or retroreflector at the other end; or
    (3) Equal to some multiple of the monitoring path length for more
complex systems having multiple passes of the measurement beam through
the monitoring path.
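    The relationships above can be summarized in a short sketch (Python;
the analyzer geometry and numbers are illustrative only):

    # The optical measurement path length is the monitoring path length times
    # the number of passes the beam makes through it: 1 (bistatic), 2
    # (monostatic with a retroreflector), or a larger multiple (multi-pass).
    def optical_measurement_path_length(monitoring_path_m, passes=1):
        return monitoring_path_m * passes

    def path_averaged_concentration(path_integrated, monitoring_path_m, passes=1):
        """Divide the path-integrated concentration by the optical path length."""
        return path_integrated / optical_measurement_path_length(monitoring_path_m, passes)

    # Example: a monostatic analyzer over a 250 m monitoring path (500 m
    # optical path) integrating 30 ppb*m yields a 0.06 ppb path average.
    print(path_averaged_concentration(30.0, 250.0, passes=2))  # 0.06
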
    PAMS means Photochemical Assessment Monitoring Stations.
    Particulate matter monitoring network description, required by
Sec. 58.20(f), means a detailed plan, prepared by control agencies and
submitted to EPA for approval, that describes their PM2.5 and
PM10 air quality surveillance network.
    Pb means lead.
    Plan means an implementation plan, approved or promulgated pursuant
to section 110 of the Clean Air Act.
    PM2.5 means particulate matter with an aerodynamic
diameter less than or equal to a nominal 2.5 micrometers as measured by
a reference method based on 40 CFR part 50, Appendix L, and designated
in accordance with part 53 of this chapter or by an equivalent method
designated in accordance with part 53 of this chapter.
    PM10 means particulate matter with an aerodynamic
diameter less than or equal to a nominal 10 micrometers as measured by a
reference method based on appendix J of part 50 of this chapter and
designated in accordance with part 53 of this chapter or by an
equivalent method designated in accordance with part 53 of this chapter.
    Point analyzer is an automated analytical method that measures
pollutant concentration in an ambient air sample extracted from the
atmosphere at a specific inlet probe point and that has been designated
as a reference or equivalent method in accordance with part 53 of this
chapter.
    Population-oriented monitoring (or sites) applies to residential
areas, commercial areas, recreational areas, industrial areas, and other
areas where a substantial number of people may spend a significant
fraction of their day.
    Primary Metropolitan Statistical Area (PMSA) is a separate component
of a consolidated metropolitan statistical area. For the purposes of
this part, PMSA is used interchangeably with MSA.
    Probe is the actual inlet where an air sample is extracted from the
atmosphere for delivery to a sampler or point analyzer for pollutant
analysis.
    PSD station means any station operated for the purpose of
establishing the effect on air quality of the emissions from a proposed
source for purposes of prevention of significant deterioration as
required by Sec. 51.24(n) of part 51 of this chapter.
    Reference method means a method of sampling and analyzing the
ambient air for an air pollutant that will be specified as a reference
method in an appendix to part 50 of this chapter, or a method that has
been designated as a reference method in accordance with this part; it
does not include a method for which a reference method designation has
been canceled in accordance with Sec. 53.11 or Sec. 53.16 of this
chapter.
    Regional Administrator means the Administrator of one of the ten EPA
Regional Offices or his or her authorized representative.
    SAROAD site identification form is one of the several forms in the
SAROAD system. It is the form which provides a complete description of
the site (and its surroundings) of an ambient air quality monitoring
station.
    SLAMS means State or Local Air Monitoring Station(s). The SLAMS make
up the ambient air quality monitoring network which is required by
Sec. 58.20 to be provided for in the State's implementation plan. This
definition places no restrictions on the use of the physical structure
or facility housing the SLAMS. Any combination of SLAMS and any other
monitors (Special Purpose, NAMS, PSD) may occupy the same facility or
structure without affecting the respective definitions of those
monitoring stations.
    SO2 means sulfur dioxide.
    Special Purpose Monitor (SPM) is a generic term used for all
monitors, other than SLAMS, NAMS, PAMS, and PSD monitors, included in an
agency's monitoring network or used in a special study, whose
data are officially reported to EPA.
    State agency means the air pollution control agency primarily
responsible for development and implementation of a plan under the Act.
    Storage and Retrieval of Aerometric Data (SAROAD) system is a
computerized system which stores and reports information relating to
ambient air quality. The SAROAD system has been replaced with the AIRS-
AQS system; however, the SAROAD data reporting format continues to be
used by some States and local air pollution
agencies as an interface to AIRS on an interim basis.
    Traceable means that a local standard has been compared and
certified, either directly or via not more than one intermediate
standard, to a National Institute of Standards and Technology (NIST)-
certified primary standard such as a NIST-Traceable Reference Material
(NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS).
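    The constraint above (certification either directly or through no more
than one intermediate standard) can be sketched as a simple check (Python;
the standard labels are hypothetical):

    # Sketch only: verifies that the certification chain from a local standard
    # up to a NIST-certified primary standard (e.g., an NTRM or GMIS) passes
    # through at most one intermediate standard.
    def is_traceable(chain):
        """chain lists standards from the local standard up to the NIST primary."""
        if len(chain) < 2 or not chain[-1].startswith("NIST"):
            return False
        return len(chain[1:-1]) <= 1  # intermediates between the two ends

    print(is_traceable(["local_CO_cylinder", "NIST_NTRM"]))                      # True
    print(is_traceable(["local_CO_cylinder", "transfer_std", "NIST_GMIS"]))      # True
    print(is_traceable(["local_CO_cylinder", "xfer_1", "xfer_2", "NIST_NTRM"]))  # False
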
    TSP (total suspended particulates) means particulate matter as
measured by the method described in appendix B of part 50 of this
chapter.
    Urban area population means the population defined in the most
recent decennial U.S. Census of Population Report.
    VOC means volatile organic compounds.

[44 FR 27571, May 10, 1979, as amended at 48 FR 2529, Jan. 20, 1983; 51
FR 9586, Mar. 19, 1986; 52 FR 24739, July 1, 1987; 58 FR 8467, Feb. 12,
1993; 59 FR 41628, 41629, Aug. 12, 1994; 60 FR 52319, Oct. 6, 1995; 62
FR 38830, July 18, 1997; 63 FR 7714, Feb. 17, 1998]

Sec. 58.2  Purpose.

    (a) This part contains criteria and requirements for ambient air
quality monitoring and requirements for reporting ambient air quality
data and information. The monitoring criteria pertain to the following
areas:
    (1) Quality assurance procedures for monitor operation and data
handling.
    (2) Methodology used in monitoring stations.
    (3) Operating schedule.
    (4) Siting parameters for instruments or instrument probes.
    (b) The requirements pertaining to provisions for an air quality
surveillance system in the State Implementation Plan are contained in
this part.
    (c) This part also acts to establish a national ambient air quality
monitoring network for the purpose of providing timely air quality data
upon which to base national assessments and policy decisions. This
network will be operated by the States and will consist of certain
selected stations from the States' SLAMS networks. These selected
stations will remain as SLAMS and will continue to meet any applicable
requirements on SLAMS. The stations, however, will also be designated as
National Air Monitoring Stations (NAMS) and will be subject to
additional data reporting and monitoring methodology requirements as
contained in subpart D of this part.
    (d) This section also acts to establish a Photochemical Assessment
Monitoring Stations (PAMS) network as a subset of the State's SLAMS
network for the purpose of enhanced monitoring in O3
nonattainment areas listed as serious, severe, or extreme. The PAMS
network will be subject to the data reporting and monitoring methodology
requirements as contained in subpart E of this part.
    (e) Requirements for the daily reporting of an index of ambient air
quality, to ensure that the population of major urban areas is informed
daily of local air quality conditions, are also included in this part.

[44 FR 27571, May 10, 1979, as amended at 58 FR 8467, Feb. 12, 1993]

Sec. 58.3  Applicability.

    This part applies to:
    (a) State air pollution control agencies.
    (b) Any local air pollution control agency or Indian governing body
to which the State has delegated authority to operate a portion of the
State's SLAMS network.
    (c) Owners or operators of proposed sources.

                     Subpart B--Monitoring Criteria

Sec. 58.10  Quality assurance.

    (a) Appendix A to this part contains quality assurance criteria to
be followed when operating the SLAMS network.
    (b) Appendix B to this part contains the quality assurance criteria
to be followed by the owner or operator of a proposed source when
operating a PSD station.

Sec. 58.11  Monitoring methods.

    Appendix C to this part contains the criteria to be followed in
determining acceptable monitoring methods or instruments for use in
SLAMS.

Sec. 58.12  Siting of instruments or instrument probes.

    Appendix E to this part contains criteria for siting instruments or
instrument probes for SLAMS.

Sec. 58.13  Operating schedule.

    Ambient air quality data collected at any SLAMS must be collected as
follows:
    (a) For continuous analyzers--consecutive hourly averages except
during:
    (1) Periods of routine maintenance,
    (2) Periods of instrument calibration, or
    (3) Periods or seasons exempted by the Regional Administrator.
    (b) For manual methods (excluding PM10 samplers,
PM2.5 samplers, and PAMS VOC samplers), at least one 24-hour
sample must be obtained every sixth day except during periods or seasons
exempted by the Regional Administrator.
    (c) For PAMS VOC samplers, samples must be obtained as specified in
sections 4.3 and 4.4 of appendix D to this part. Area-specific PAMS
operating schedules must be included as part of the network description
required by Sec. 58.40 and must be approved by the Administrator.
    (d) For PM10 samplers--a 24-hour sample must be taken a
minimum of every third day, except during periods or seasons exempted by
the Regional Administrator.
    (e) For PM2.5 samplers, a 24-hour sample is required
every day for certain core SLAMS, including certain PAMS, as described in
section 2.8.1.3 of appendix D of this part, except during seasons or
periods of low PM2.5 as otherwise exempted by the Regional
Administrator. A waiver of the everyday sampling schedule for SLAMS may
be granted by the Regional Administrator or designee, and for NAMS by
the Administrator or designee, for 1 calendar year from the time a
PM2.5 sequential sampler (FRM or Class I equivalent) has been
approved by EPA. A 24-hour sample must be taken a minimum of every third
day for all other SLAMS, including NAMS, as described in section 2.8.1.3
of appendix D of this part, except when exempted by the Regional
Administrator in accordance with forthcoming EPA guidance. During
periods for which exemptions to every third day or every day sampling
are allowed for core PM2.5 SLAMS, a minimum frequency of one
in 6-day sampling is still required. However, alternative sampling
frequencies are allowed for SLAMS sites that are principally intended
for comparisons to the 24-hour NAAQS. Such modifications must be
approved by the Regional Administrator.
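    The minimum schedules in paragraphs (a) through (e) of this section can
be summarized in a short sketch (Python; the category labels are
illustrative, and the exemptions and waivers available from the Regional
Administrator are not modeled):

    # Simplified summary of the minimum operating schedules in Sec. 58.13.
    MINIMUM_SCHEDULE = {
        "continuous_analyzer": "consecutive hourly averages",
        "manual_method_other": "one 24-hour sample every 6th day",
        "pm10_sampler": "one 24-hour sample at least every 3rd day",
        "pm2_5_core_slams": "one 24-hour sample every day",
        "pm2_5_other_slams": "one 24-hour sample at least every 3rd day",
    }

    def minimum_schedule(category):
        """Look up the default minimum sampling schedule for a monitor category."""
        return MINIMUM_SCHEDULE.get(category, "see Sec. 58.13 and appendix D")

    print(minimum_schedule("pm2_5_core_slams"))  # one 24-hour sample every day
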
    (f) Alternatives to everyday sampling at sites with correlated
acceptable continuous analyzers. (1) Certain PM2.5 core SLAMS
sites located in monitoring planning areas (as described in section 2.8
of appendix D of this part) are required to sample every day with a
reference or equivalent method operating in accordance with part 53 of
this chapter and section 2 of appendix C of this part. However, in
accordance with the monitoring priority as defined in paragraph (f)(2)
of this section, established by the control agency and approved by EPA,
a core SLAMS monitor may operate with a reference or equivalent method
on a 1 in 3-day schedule and produce data that may be compared to the
NAAQS, provided that it is collocated with an acceptable continuous fine
particulate PM analyzer that is correlated with the reference or
equivalent method. If the alternative sampling schedule is selected by
the control agency and approved by EPA, the alternative schedule shall
be implemented on January 1 of the year in which everyday sampling is
required. The selection of correlated acceptable continuous PM analyzers
and procedures for correlation with the intermittent reference or
equivalent method shall be in accordance with procedures approved by the
Regional Administrator. Unless the continuous fine particulate analyzer
satisfies the requirements of section 2 of appendix C of this part,
however, the data derived from the correlated acceptable continuous
monitor are not eligible for direct comparisons to the NAAQS in
accordance with part 50 of this chapter.
    (2) A Metropolitan Statistical Area (MSA) (or primary metropolitan
statistical area) with greater than 1 million population and high
concentrations of PM2.5 (greater than or equal to 80 percent
of the NAAQS) shall be a Priority 1 PM monitoring area. Other monitoring
planning areas may be designated
as Priority 2 PM monitoring areas.
    (3) Core SLAMS having a correlated acceptable continuous analyzer
collocated with a reference or equivalent method in a Priority 1 PM
monitoring area may operate on the 1 in 3 sampling frequency only after
reference or equivalent data are collected for at least 2 complete
years.
    (4) In all monitoring situations with a correlated acceptable
continuous alternative, FRM samplers or filter-based equivalent
analyzers should preferably accompany the correlated acceptable
continuous monitor.

[44 FR 27571, May 10, 1979, as amended at 52 FR 24739, July 1, 1987; 58
FR 8467, Feb. 12, 1993; 62 FR 38831, July 18, 1997; 63 FR 7714, Feb. 17,
1998]
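    The Priority 1/Priority 2 distinction in paragraph (f)(2) of Sec. 58.13
reduces to a simple test, sketched below (Python; the inputs are
illustrative, with the area's PM2.5 level expressed as a fraction of the
NAAQS):

    # Sketch of Sec. 58.13(f)(2): Priority 1 if the MSA or PMSA has more than
    # 1 million people and PM2.5 at or above 80 percent of the NAAQS;
    # otherwise the monitoring planning area may be designated Priority 2.
    def pm_monitoring_priority(population, pm25_fraction_of_naaqs):
        if population > 1_000_000 and pm25_fraction_of_naaqs >= 0.80:
            return "Priority 1"
        return "Priority 2"

    print(pm_monitoring_priority(2_400_000, 0.92))  # Priority 1
    print(pm_monitoring_priority(750_000, 0.95))    # Priority 2
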

Sec. 58.14  Special purpose monitors.

    (a) Except as specified in paragraph (b) of this section, any
ambient air quality monitoring station other than a SLAMS or PSD station
from which the State intends to use the data as part of a demonstration
of attainment or nonattainment or in computing a design value for
control purposes of the National Ambient Air Quality Standards (NAAQS)
must meet the requirements for SLAMS as described in Sec. 58.22 and,
after January 1, 1983, must also meet the requirements for SLAMS
described in Sec. 58.13 and Appendices A and E of this part.
    (b) Based on the need, in transitioning to a PM2.5
standard that newly addresses the ambient impacts of fine particles, to
encourage a sufficiently extensive geographical deployment of
PM2.5 monitors and thus hasten the development of an adequate
PM2.5 ambient air quality monitoring infrastructure,
PM2.5 NAAQS violation determinations shall not be exclusively
made based on data produced at a population-oriented SPM site during the
first 2 complete calendar years of its operation. However, a notice of
NAAQS violations resulting from population-oriented SPMs shall be
reported to EPA in the State's annual monitoring report and be
considered by the State in the design of its overall SLAMS network;
these population-oriented SPMs should be considered to become a
permanent SLAMS during the annual network review in accordance with
Sec. 58.25.
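    The timing in paragraph (b) can be sketched as follows (Python; this is
one reading of "first 2 complete calendar years," offered for illustration
only):

    # Illustrative reading of Sec. 58.14(b): a monitor starting mid-year
    # completes its first full calendar year the following year, so data
    # produced in its first two complete calendar years would not, by
    # themselves, support a violation determination.
    from datetime import date

    def first_year_after_two_complete_years(start_date):
        first_full_year = (start_date.year
                           if (start_date.month, start_date.day) == (1, 1)
                           else start_date.year + 1)
        return first_full_year + 2

    print(first_year_after_two_complete_years(date(1998, 7, 15)))  # 2001
    print(first_year_after_two_complete_years(date(1999, 1, 1)))   # 2001
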
    (c) Any ambient air quality monitoring station other than a SLAMS or
PSD station from which the State intends to use the data for SIP-related
functions other than as described in paragraph (a) of this section is
not necessarily required to comply with the requirements for a SLAMS
station under paragraph (a) of this section but must be operated in
accordance with a monitoring schedule, methodology, quality assurance
procedures, and probe or instrument-siting specifications approved by
the Regional Administrator.

[62 FR 38832, July 18, 1997]

       Subpart C--State and Local Air Monitoring Stations (SLAMS)

Sec. 58.20  Air quality surveillance: plan content.

    By January 1, 1980, the State shall adopt and submit to the
Administrator a revision to the plan which will:
    (a) Provide for the establishment of an air quality surveillance
system that consists of a network of monitoring stations designated as
State and Local Air Monitoring Stations (SLAMS) which measure ambient
concentrations of those pollutants for which standards have been
established in part 50 of this chapter. SLAMS (including NAMS)
designated as PAMS will also obtain ambient concentrations of speciated
VOC and NOX, and meteorological measurements. PAMS may
therefore be located at existing SLAMS or NAMS sites when appropriate.
    (b) Provide for meeting the requirements of appendices A, C, D, and
E to this part.
    (c) Provide for the operation of at least one SLAMS per criteria
pollutant except Pb during any stage of an air pollution episode as
defined in the plan.
    (d) Provide for the review of the air quality surveillance system on
an annual basis to determine if the system meets the monitoring
objectives defined in appendix D of this part. Such review must identify
needed modifications to the network such as
termination or relocation of unnecessary stations or establishment of
new stations that are necessary. For PM2.5, the review must
identify needed changes to core SLAMS, monitoring planning areas, the
chosen community monitoring approach including optional community
monitoring zones, SLAMS, or SPMs.
    (e) Provide for having a SLAMS network description available for
public inspection and submission to the Administrator upon request. The
network description must be available at the time of plan revision
submittal and must contain the following information for each SLAMS:
    (1) The AIRS site identification form for existing stations.
    (2) The proposed location for scheduled stations.
    (3) The sampling and analysis method.
    (4) The operating schedule.
    (5) The monitoring objective and spatial scale of representativeness
as defined in appendix D to this part.
    (6) A schedule for: (i) Locating, placing into operation, and making
available the AIRS site identification form for each SLAMS which is not
located and operating at the time of plan revision submittal, (ii)
implementing quality assurance procedures of appendix A to this part for
each SLAMS for which such procedures are not implemented at the time of
plan revision submittal, and (iii) resiting each SLAMS which does not
meet the requirements of appendix E to this part at the time of plan
revision submittal.
    (f) Provide for having a PM monitoring network description available
for public inspection which must provide for monitoring planning areas,
and the community monitoring approach involving core monitors and
optional community monitoring zones for PM2.5. The PM
monitoring network description for PM10 and PM2.5
must be submitted to the Regional Administrator for approval by July 1,
1998, and must contain the following information for each PM SLAMS and
PM2.5 SPM:
    (1) The AIRS site identification form for existing stations.
    (2) The proposed location for scheduled stations.
    (3) The sampling and analysis method.
    (4) The operating schedule.
    (5) The monitoring objective, spatial scale of representativeness,
and additionally for PM2.5, the monitoring planning area,
optional community monitoring zone, and the site code designation to
identify which site will be identified as core SLAMS; and SLAMS or
population-oriented SPMs, if any, that are microscale or middle scale in
their representativeness as defined in appendix D of this part.
    (6) A schedule for:
    (i) Locating, placing into operation, and making available the AIRS
site identification form for each SLAMS which is not located and
operating at the time of plan revision submittal.
    (ii) Implementing quality assurance procedures of appendix A of this
part for each SLAMS for which such procedures are not implemented at the
time of plan revision submittal.
    (iii) Resiting each SLAMS which does not meet the requirements of
appendix E of this part at the time of plan revision submittal.
    (g) Provide for having a list of all PM2.5 monitoring
locations including SLAMS, NAMS, PAMS and population-oriented SPMs, that
are included in the State's PM monitoring network description and are
intended for comparison to the NAAQS, available for public inspection.
    (h) Within 9 months after:
    (1) February 12, 1993; or
    (2) Date of redesignation or reclassification of any existing
O3 nonattainment area to serious, severe, or extreme; or
    (3) The designation of a new area and classification to serious,
severe, or extreme, affected States shall adopt and submit a plan
revision to the Administrator.
    (i) The plan revision will provide for the establishment and
maintenance of PAMS. Each PAMS site will provide for the monitoring of
ambient concentrations of criteria pollutants (O3,
NO2), and non-criteria pollutants (NOX, NO, and
speciated VOC) as stipulated in section 4.2 of appendix D, and
meteorological measurements. The PAMS network is part of the SLAMS
network, and the plan provisions in
paragraphs (a) through (h) of this section will apply to the revision.
Since NAMS sites are also part of the SLAMS network, some PAMS sites may
be coincident with NAMS sites and may be designated as both PAMS and
NAMS.

[44 FR 27571, May 10, 1979, as amended at 46 FR 44164, Sept. 3, 1981; 52
FR 24740, July 1, 1987; 58 FR 8467, Feb. 12, 1993; 59 FR 41628, Aug. 12,
1994; 62 FR 38832, July 18, 1997]

Sec. 58.21  SLAMS network design.

    The design criteria for SLAMS contained in appendix D to this part
must be used in designing the SLAMS network. The State shall consult
with the Regional Administrator during the network design process. The
final network design will be subject to the approval of the Regional
Administrator.

Sec. 58.22  SLAMS methodology.

    Each SLAMS must meet the monitoring methodology requirements of
appendix C to this part at the time the station is put into operation as
a SLAMS.

Sec. 58.23  Monitoring network completion.

    With the exception of the PM10 monitoring networks that
shall be in place by March 16, 1998 and with the exception of the
PM2.5 monitoring networks as described in paragraph (c) of
this section:
    (a) Each station in the SLAMS network must be in operation, be sited
in accordance with the criteria in appendix E to this part, and be
located as described on the station's AIRS site identification form, and
    (b) The quality assurance requirements of appendix A to this part
must be fully implemented.
    (c) Each PM2.5 station in the SLAMS network must be in
operation in accordance with the minimum requirements of appendix D of
this part, be sited in accordance with the criteria in appendix E of
this part, and be located as described on the station's AIRS site
identification form, according to the following schedule:
    (1) Within 1 year after September 16, 1997, at least one required
core PM2.5 SLAMS site in each MSA with population greater
than 500,000, plus one site in each PAMS area, (plus at least two
additional SLAMS sites per State) must be in operation.
    (2) Within 2 years after September 16, 1997, all other required
SLAMS, including all required core SLAMS, required regional background
and regional transport SLAMS, continuous PM monitors in areas with
greater than 1 million population, and all additional required
PM2.5 SLAMS must be in operation.
    (3) Within 3 years after September 16, 1997, all additional sites
(e.g., sites classified as SLAMS/SPM to complete the mature network)
must be in operation.

[44 FR 27571, May 10, 1979, as amended at 52 FR 24740, July 1, 1987; 59
FR 41628, Aug. 12, 1994; 62 FR 38832, July 18, 1997]

Sec. 58.24  [Reserved]

Sec. 58.25  System modification.

    The State shall annually develop and implement a schedule to modify
the ambient air quality monitoring network to eliminate any unnecessary
stations or to correct any inadequacies indicated by the result of the
annual review required by Sec. 58.20(d). The State shall consult with
the Regional Administrator during the development of the schedule to
modify the monitoring program. The final schedule and modifications will
be subject to the approval of the Regional Administrator. Nothing in
this section will preclude the State, with the approval of the Regional
Administrator, from making modifications to the SLAMS network for
reasons other than those resulting from the annual review.

Sec. 58.26  Annual State air monitoring report.

    (a) The State shall submit to the Administrator (through the
appropriate Regional Office) an annual summary report of all the ambient
air quality monitoring data from all monitoring stations designated
State and Local Air Monitoring Stations (SLAMS). The annual report must
be submitted by July 1 of each year for data collected from January 1 to
December 31 of the previous year.
    (b) The SLAMS annual data summary report must contain:
    (1) The information specified in appendix F,
    (2) The location, date, pollution source, and duration of each
incident of air pollution during which ambient levels of a pollutant
reached or exceeded the level specified by Sec. 51.16(a) of this chapter
as a level which could cause significant harm to the health of persons.
    (c) The senior air pollution control officer in the State or his
designee shall certify that the annual summary report is accurate to the
best of his knowledge.
    (d) For PM monitoring and data--
    (1) The State shall submit a summary to the appropriate Regional
Office (for SLAMS) or Administrator (through the Regional Office) (for
NAMS) that details proposed changes to the PM Monitoring Network
Description, in accordance with the annual network review
requirements in Sec. 58.25. This shall discuss the existing PM networks,
including modifications to the number, size or boundaries of monitoring
planning areas and optional community monitoring zones; number and
location of PM10 and PM2.5 SLAMS; number and
location of core PM2.5 SLAMS; alternative sampling
frequencies proposed for PM2.5 SLAMS (including core
PM2.5 SLAMS and PM2.5 NAMS), core PM2.5
SLAMS to be designated PM2.5 NAMS; and PM10 and
PM2.5 SLAMS to be designated PM10 and
PM2.5 NAMS respectively.
    (2) The State shall submit an annual summary to the appropriate
Regional Office of all the ambient air quality monitoring PM data from
all special purpose monitors that are described in the State's PM
monitoring network description and are intended for SIP purposes. These
include those population-oriented SPMs that are eligible for comparison
to the PM2.5 NAAQS. The State shall certify the data in
accordance with paragraph (c) of this section.
    (e) The Annual State Air Monitoring Report shall be submitted to the
Regional Administrator by July 1 or by an alternative annual date to be
negotiated between the State and Regional Administrator. The Region
shall provide review and approval/disapproval within 60 days. After 3
years following September 16, 1997, the schedule for submitting the
required annual revised PM2.5 monitoring network description
may be altered based on a new schedule determined by the Regional
Administrator. States may submit an alternative PM monitoring network
description in which they request exemptions from specific required
elements of the network design (e.g., required number of core sites,
other SLAMS, sampling frequency, etc.). After 3 years following
September 16, 1997 or once a monitoring area has been determined to
violate the NAAQS, then changes to an MPA monitoring network affecting
the violating locations shall require public review and notification.

[44 FR 27571, May 10, 1979, as amended at 51 FR 9586, Mar. 19, 1986; 62
FR 38833, July 18, 1997; 63 FR 7714, Feb. 17, 1998]

Sec. 58.27  Compliance date for air quality data reporting.

    The annual air quality data reporting requirements of Sec. 58.26
apply to data collected after December 31, 1980. Data collected before
January 1, 1981, must be reported under the reporting procedures in
effect before the effective date of subpart C of this part.

Sec. 58.28  SLAMS data submittal.

    The State shall submit all of the SLAMS data according to the same
data submittal requirements as defined for NAMS in Sec. 58.35. The
State shall also submit any portion or all of the SLAMS data to the
appropriate Regional Administrator upon request.

[59 FR 41628, Aug. 12, 1994]

           Subpart D--National Air Monitoring Stations (NAMS)

Sec. 58.30  NAMS network establishment.

    (a) By January 1, 1980, with the exception of PM10 and
PM2.5 samplers, which shall be by July 1, 1998, the State
shall:
    (1) Establish, through the operation of stations or through a
schedule for locating and placing stations into operation, that portion
of a National Ambient Air Quality Monitoring Network which is in that
State, and
    (2) Submit to the Administrator (through the appropriate Regional
Office) a description of that State's portion of the network.
    (b) Hereinafter, the portion of the national network in any State
will be referred to as the NAMS network.
    (c) The stations in the NAMS network must be stations from the SLAMS
network required by Sec. 58.20.
    (d) The requirements of appendix D to this part must be met when
designing the NAMS network. The process of designing the NAMS network
must be part of the process of designing the SLAMS network as explained
in appendix D to this part.

[44 FR 27571, May 10, 1979, as amended at 46 FR 44164, Sept. 3, 1981; 52
FR 24740, July 1, 1987; 62 FR 38833, July 18, 1997]

Sec. 58.31  NAMS network description.

    The NAMS network description required by Sec. 58.30 must contain the
following for all stations, existing or scheduled:
    (a) The AIRS site identification number for existing stations.
    (b) The proposed location for scheduled stations.
    (c) Identity of the urban area represented.
    (d) The sampling and analysis method.
    (e) The operating schedule.
    (f) The monitoring objective, spatial scale of representativeness,
and for PM2.5, the monitoring planning area and community
monitoring zones, as defined in appendix D of this part.
    (g) A schedule for:
    (1) Locating, placing into operation, and submitting the AIRS site
identification form for each NAMS which is not located and operating at
the time of network description submittal,
    (2) Implementing quality assurance procedures of appendix A to this
part for each NAMS for which such procedures are not implemented at the
time of network description submittal, and
    (3) Resiting each NAMS which does not meet the requirements of
appendix E to this part at the time of network description submittal.

[44 FR 27571, May 10, 1979, as amended at 59 FR 41628, Aug. 12, 1994; 62
FR 38833, July 18, 1997; 63 FR 7714, Feb. 17, 1998; 64 FR 3034, Jan. 20,
1999]

Sec. 58.32  NAMS approval.

    The NAMS network required by Sec. 58.30 is subject to the approval
of the Administrator. Such approval will be contingent upon completion
of the network description as outlined in Sec. 58.31 and upon
conformance to the NAMS design criteria contained in appendix D to this
part.

Sec. 58.33  NAMS methodology.

    Each NAMS must meet the monitoring methodology requirements of
appendix C to this part applicable to NAMS at the time the station is
put into operation as a NAMS.

Sec. 58.34  NAMS network completion.

    With the exception of PM10 samplers, which shall be by 1
year after September 16, 1997, and PM2.5, which shall be by 3
years after September 16, 1997:
    (a) Each NAMS must be in operation, be sited in accordance with the
criteria in Appendix E to this part, and be located as described in the
AIRS database; and
    (b) The quality assurance requirements of appendix A to this part
must be fully implemented for all NAMS.

[44 FR 27571, May 10, 1979, as amended at 46 FR 44164, Sept. 3, 1981; 52
FR 24740, July 1, 1987; 59 FR 41628, Aug. 12, 1994; 62 FR 38833, July
18, 1997; 64 FR 3034, Jan. 20, 1999]

Sec. 58.35  NAMS data submittal.

    (a) The requirements of this section apply to those stations
designated as both SLAMS and NAMS by the network description required by
Secs. 58.20 and 58.30.
    (b) The State shall report to the Administrator all ambient air
quality data for SO2, CO, O3, NO2, Pb,
PM10, and PM2.5, and information specified by the
AIRS Users Guide (Volume II, Air Quality Data Coding, and Volume III,
Air Quality Data Storage) to be coded into the AIRS-AQS format. Such air
quality data and information must be submitted directly to the AIRS-AQS
via either electronic transmission or magnetic tape, in the format of
the AIRS-AQS, and in accordance with the quarterly schedule described in
paragraph (c) of this section.
    (c) The specific quarterly reporting periods are January 1-March 31,
April 1-June 30, July 1-September 30, and October 1-December 31. The data and
information reported for each reporting period must:
    (1) Contain all data and information gathered during the reporting
period.
    (2) Be received in the AIRS-AQS within 90 days after the end of the
quarterly reporting period. For example, the data for the reporting
period January 1-March 31, 1994 are due on or before June 30, 1994.
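    The 90-day deadline can be expressed as a short sketch (Python; the
quarter-end dates follow the reporting periods listed in paragraph (c) of
this section, and the calculation is chosen to reproduce the June 30, 1994
example above):

    # Sketch of the Sec. 58.35(c) submittal deadline: data are due in the
    # AIRS-AQS within 90 days after the end of the quarterly reporting period,
    # counted from the first day after the period ends.
    from datetime import date, timedelta

    QUARTER_END = {1: (3, 31), 2: (6, 30), 3: (9, 30), 4: (12, 31)}

    def submittal_deadline(year, quarter):
        month, day = QUARTER_END[quarter]
        period_end = date(year, month, day)
        return period_end + timedelta(days=1) + timedelta(days=90)

    print(submittal_deadline(1994, 1))  # 1994-06-30
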
    (d) Air quality data submitted for each reporting period must be
edited, validated, and entered into the AIRS-AQS for updating (within
the time limits specified in paragraph (c) of this section) pursuant to
appropriate AIRS-AQS procedures. The procedures for editing and
validating data are described in the AIRS Users Guide, Volume II Air
Quality Data Coding.
    (e) This section does not permit a State to exempt those SLAMS which
are also designated as NAMS from all or any of the reporting
requirements applicable to SLAMS in Sec. 58.26.

[44 FR 27571, May 10, 1979, as amended at 46 FR 44164, Sept. 3, 1981; 51
FR 9586, Mar. 19, 1986; 52 FR 24740, July 1, 1987; 59 FR 41628, Aug. 12,
1994; 62 FR 38833, July 18, 1997]

Sec. 58.36  System modification.

    During the annual SLAMS Network Review specified in Sec. 58.20, any
changes to the NAMS network identified by the EPA and/or proposed by the
State and agreed to by the EPA will be evaluated. These modifications
should address changes invoked by a new census and changes to the
network due to changing air quality levels, emission patterns, etc. The
State shall be given one year (until the next annual evaluation) to
implement the appropriate changes to the NAMS network.

[51 FR 9586, Mar. 19, 1986]

     Subpart E--Photochemical Assessment Monitoring Stations (PAMS)

    Source: 58 FR 8468, Feb. 12, 1993, unless otherwise noted.

Sec. 58.40  PAMS network establishment.

    (a) In addition to the plan revision, the State shall submit a
photochemical assessment monitoring network description including a
schedule for implementation to the Administrator within 6 months after:
    (1) February 12, 1993; or
    (2) Date of redesignation or reclassification of any existing
O3 nonattainment area to serious, severe, or extreme; or
    (3) The designation of a new area and classification to serious,
severe, or extreme O3 nonattainment.

The network description will apply to all serious, severe, and extreme
O3 nonattainment areas within the State. Some O3
nonattainment areas may extend beyond State or Regional boundaries. In
instances where PAMS network design criteria as defined in appendix D to
this part require monitoring stations located in different States and/or
Regions, the network description and implementation schedule should be
submitted jointly by the States involved. When appropriate, such
cooperation and joint network design submittals are preferred. Network
descriptions shall be submitted through the appropriate Regional
Office(s). Alternative networks, including different monitoring
schedules, periods, or methods, may be submitted, but they must include
a demonstration that they satisfy the monitoring data uses and fulfill
the PAMS monitoring objectives described in sections 4.1 and 4.2 of
appendix D to this part.
    (b) For purposes of plan development and approval, the stations
established or designated as PAMS must be stations from the SLAMS
network or become part of the SLAMS network required by Sec. 58.20.
    (c) The requirements of appendix D to this part applicable to PAMS
must be met when designing the PAMS network.

Sec. 58.41  PAMS network description.

    The PAMS network description required by Sec. 58.40 must contain the
following:
    (a) Identification of the monitoring area represented.
    (b) The AIRS site identification number for existing stations.
    (c) The proposed location for scheduled stations.
    (d) Identification of the site type and location within the PAMS
network design for each station as defined in appendix D to this part
except that during any year, a State may choose to submit detailed
information for the site scheduled to begin operation during that year's
PAMS monitoring season, and defer submittal of detailed information on
the remaining sites until succeeding years. Such deferred network design
phases should be submitted to EPA for approval no later than January 1
of the first year of scheduled operation. As a minimum, general
information on each deferred site should be submitted each year until
final approval of the complete network is obtained from the
Administrator.
    (e) The sampling and analysis method for each of the measurements.
    (f) The operating schedule for each of the measurements.
    (g) An O3 event forecasting scheme, if appropriate.
    (h) A schedule for implementation. This schedule should include the
following:
    (1) A timetable for locating and submitting the AIRS site
identification form for each scheduled PAMS that is not located at the
time of submittal of the network description;
    (2) A timetable for phasing-in operation of the required number and
type of sites as defined in appendix D to this part; and
    (3) A schedule for implementing the quality assurance procedures of
appendix A to this part for each PAMS.

[58 FR 8468, Feb. 12, 1993, as amended at 64 FR 3035, Jan. 20, 1999]

Sec. 58.42  PAMS approval.

    The PAMS network required by Sec. 58.40 is subject to the approval
of the Administrator. Such approval will be contingent upon completion
of each phase of the network description as outlined in Sec. 58.41 and
upon conformance to the PAMS network design criteria contained in
appendix D to this part.

Sec. 58.43  PAMS methodology.

    PAMS monitors must meet the monitoring methodology requirements of
appendix C to this part applicable to PAMS.

Sec. 58.44  PAMS network completion.

    (a) The complete, operational PAMS network will be phased in as
described in appendix D to this part over a period of 5 years after:
    (1) February 12, 1993; or
    (2) Date of redesignation or reclassification of any existing
O3 nonattainment area to serious, severe, or extreme; or
    (3) The designation of a new area and classification to serious,
severe, or extreme O3 nonattainment.
    (b) The quality assurance criteria of appendix A to this part must
be implemented for all PAMS.

Sec. 58.45  PAMS data submittal.

    (a) The requirements of this section apply only to those stations
designated as PAMS by the network description required by Sec. 58.40.
    (b) All data shall be submitted to the Administrator in accordance
with the format, reporting periods, reporting deadlines, and other
requirements as specified for NAMS in Sec. 58.35.
    (c) The State shall report NO and NOX data consistent
with the requirements of Sec. 58.35 for criteria pollutants.
    (d) The State shall report VOC data and meteorological data within 6
months following the end of each quarterly reporting period.

Sec. 58.46  System modification.

    (a) Any proposed changes to the PAMS network description will be
evaluated during the annual SLAMS Network Review specified in
Sec. 58.20. Changes proposed by the State must be approved by the
Administrator. The State will be allowed 1 year (until the next annual
evaluation) to implement the appropriate changes to the PAMS network.
    (b) PAMS network requirements are mandatory only for serious,
severe, and extreme O3 nonattainment areas. When any such
area is redesignated to attainment, the State may revise its PAMS
monitoring program subject to approval by the Administrator.

                 Subpart F--Air Quality Index Reporting

Sec. 58.50  Index reporting.

    (a) The State shall report to the general public through prominent
notice an air quality index in accordance with the requirements of
appendix G to this part.
    (b) Reporting is required by all Metropolitan Statistical Areas with
a population exceeding 350,000.
    (c) The population of a Metropolitan Statistical Area for purposes
of index reporting is the most recent decennial U.S. census population.

[64 FR 42547, Aug. 4, 1999]

                      Subpart G--Federal Monitoring

    Source: 44 FR 27571, May 10, 1979. Redesignated at 58 FR 8467, Feb.
12, 1993.

Sec. 58.60  Federal monitoring.

    The Administrator may locate and operate an ambient air monitoring
station if the State fails to locate, or schedule to be located, during
the initial network design process or as a result of the annual review
required by Sec. 58.20(d):
    (a) A SLAMS at a site which is necessary in the judgment of the
Regional Administrator to meet the objectives defined in appendix D to
this part, or
    (b) A NAMS at a site which is necessary in the judgment of the
Administrator for meeting EPA national data needs.

Sec. 58.61  Monitoring other pollutants.

    The Administrator may promulgate criteria similar to that referenced
in subpart B of this part for monitoring a pollutant for which a
National Ambient Air Quality Standard does not exist. Such an action
would be taken whenever the Administrator determines that a nationwide
monitoring program is necessary to monitor such a pollutant.

  Appendix A to Part 58--Quality Assurance Requirements for State and
                  Local Air Monitoring Stations (SLAMS)

1. General Information.
    1.1 This appendix specifies the minimum quality assurance/quality
control (QA/QC) requirements applicable to SLAMS air monitoring data
submitted to EPA. State and local agencies are encouraged to develop and
maintain quality assurance programs more extensive than the required
minimum.
    1.2 To assure the quality of data from air monitoring measurements,
two distinct and important interrelated functions must be performed. One
function is the control of the measurement process through broad quality
assurance activities, such as establishing policies and procedures,
developing data quality objectives, assigning roles and
responsibilities, conducting oversight and reviews, and implementing
corrective actions. The other function is the control of the measurement
process through the implementation of specific quality control
procedures, such as audits, calibrations, checks, replicates, routine
self-assessments, etc. In general, the greater the control of a given
monitoring system, the better will be the resulting quality of the
monitoring data. The results of quality assurance reviews and
assessments indicate whether the control efforts are adequate or need to
be improved.
    1.3 Documentation of all quality assurance and quality control
efforts implemented during the data collection, analysis, and reporting
phases is important to data users, who can then consider the impact of
these control efforts on the data quality (see reference 1 of this
appendix). Both qualitative and quantitative assessments of the
effectiveness of these control efforts should identify those areas most
likely to impact the data quality and to what extent.
    1.4 Periodic assessments of SLAMS data quality are required to be
reported to EPA. To provide national uniformity in this assessment and
reporting of data quality for all SLAMS networks, specific assessment
and reporting procedures are prescribed in detail in sections 3, 4, and
5 of this appendix. On the other hand, the selection and extent of the
QA and QC activities used by a monitoring agency depend on a number of
local factors such as the field and laboratory conditions, the
objectives for monitoring, the level of the data quality needed, the
expertise of assigned personnel, the cost of control procedures,
pollutant concentration levels, etc. Therefore, the quality system
requirements, in section 2 of this appendix, are specified in general
terms to allow each State to develop a quality assurance program that is
most efficient and effective for its own circumstances while achieving
the Ambient Air Quality Program's data quality objectives.
2. Quality System Requirements.
    2.1 Each State and local agency must develop a quality system
(reference 2 of this appendix) to ensure that the monitoring results:
    (a) Meet a well-defined need, use, or purpose.
    (b) Satisfy customers' expectations.
    (c) Comply with applicable standards and specifications.
    (d) Comply with statutory (and other) requirements of society.
    (e) Reflect consideration of cost and economics.
    (f) Implement a quality assurance program consisting of policies,
procedures, specifications, standards, and documentation necessary to:
    (1) Provide data of adequate quality to meet monitoring objectives,
and
    (2) Minimize loss of air quality data due to malfunctions or out-of-
control conditions. This quality assurance program must be described in
detail, suitably documented in accordance with Agency requirements
(reference 4 of this appendix), and approved by the appropriate Regional
Administrator, or the Regional Administrator's designee. The Quality
Assurance Program will be reviewed during the systems audits described
in section 2.5 of this appendix.
    2.2 Primary requirements and guidance documents for developing the
quality assurance program are contained in references 2 through 7 of
this appendix, which also contain many suggested and required
procedures, checks, and control specifications. Reference 7 of this
appendix describes specific guidance for the development of a QA Program
for SLAMS. Many specific quality control checks and specifications for
methods are included in the respective reference methods described in
part 50 of this chapter or in the respective equivalent method
descriptions available from EPA (reference 8 of this appendix).
Similarly, quality control procedures related to specifically designated
reference and equivalent method analyzers are contained in the
respective operation or instruction manuals associated with those
analyzers. Quality assurance guidance for meteorological systems at PAMS
is contained in reference 9 of this appendix. Quality assurance
procedures for VOC, NOx (including NO and NO2),
O3, and carbonyl measurements at PAMS must be consistent with
reference 15 of this appendix. Reference 4 of this appendix includes
requirements for the development of quality assurance project plans,
quality assurance and control programs, and systems audits demonstrating
attainment of the requirements.
    2.3 Pollutant Concentration and Flow Rate Standards.
    2.3.1 Gaseous pollutant concentration standards (permeation devices
or cylinders of compressed gas) used to obtain test concentrations for
CO, SO2, NO, and NO2 must be traceable to either a
National Institute of Standards and Technology (NIST) NIST-Traceable
Reference Material (NTRM) or a NIST-certified Gas Manufacturer's
Internal Standard (GMIS), certified in accordance with one of the
procedures given in reference 10 of this appendix.
    2.3.2 Test concentrations for O3 must be obtained in
accordance with the UV photometric calibration procedure specified in 40
CFR part 50, appendix D, or by means of a certified ozone transfer
standard. Consult references 11 and 12 of this appendix for guidance on
primary and transfer standards for O3.
    2.3.3 Flow rate measurements must be made by a flow measuring
instrument that is traceable to an authoritative volume or other
applicable standard. Guidance for certifying some types of flowmeters is
provided in reference 7 of this appendix.
    2.4 National Performance Audit Program (NPAP). Agencies operating
SLAMS are required to participate in EPA's NPAP. These audits are
described in reference 7 of this appendix. For further instructions,
agencies should contact either the appropriate EPA Regional QA
Coordinator at the appropriate EPA Regional Office location, or the NPAP
Coordinator, Emissions Monitoring and Analysis Division (MD-14), U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711.
    2.5 Systems Audit Programs. Systems audits of the ambient air
monitoring programs of agencies operating SLAMS shall be conducted at
least every 3 years by the appropriate EPA Regional Office. Systems
audit programs are described in reference 7 of this appendix. For
further instructions, agencies should contact either the appropriate EPA
Regional QA Coordinator or the Systems Audit QA Coordinator, Office of
Air Quality Planning and Standards, Emissions Monitoring and Analysis
Division (MD-14), U.S. Environmental Protection Agency, Research
Triangle Park, NC 27711.
3. Data Quality Assessment Requirements.
    3.0.1 All ambient monitoring methods or analyzers used in SLAMS
shall be tested periodically, as described in this section, to
quantitatively assess the quality of the SLAMS data. Measurement
uncertainty is estimated for both automated and manual methods.
Terminology associated with measurement uncertainty is found within
this appendix and includes:
    (a) Precision. A measurement of mutual agreement among individual
measurements of the same property usually under prescribed similar
conditions, expressed generally in terms of the standard deviation;
    (b) Accuracy. The degree of agreement between an observed value and
an accepted reference value; accuracy includes a combination of random
error (precision) and systematic error (bias) components which are due
to sampling and analytical operations;
    (c) Bias. The systematic or persistent distortion of a measurement
process which causes errors in one direction. The individual
results of these tests for each method or analyzer shall be reported to
EPA as specified in section 4 of this appendix. EPA will then calculate
quarterly assessments of measurement uncertainty applicable to the SLAMS
data as described in section 5 of this appendix. Data assessment results
should be reported to EPA only for methods and analyzers approved for
use in SLAMS monitoring under appendix C of this part.
    3.0.2 Estimates of the data quality will be calculated on the basis
of single monitors and reporting organizations and may also be
calculated for each region and for the entire Nation. A reporting
organization is defined as a State, subordinate organization within a
State, or other organization that is responsible for a set of stations
that monitors the same pollutant and for which data quality assessments
can be pooled. States must define one or more reporting organizations
for each pollutant such that each monitoring station in the State SLAMS
network is included in one, and only one, reporting organization.
    3.0.3 Each reporting organization shall be defined such that
measurement uncertainty among all stations in the organization can be
expected to be reasonably homogeneous, as a result of common factors.
    (a) Common factors that should be considered by States in defining
reporting organizations include:
    (1) Operation by a common team of field operators.
    (2) Common calibration facilities.
    (3) Oversight by a common quality assurance organization.
    (4) Support by a common laboratory or headquarters.
    (b) Where there is uncertainty in defining the reporting
organizations or in assigning specific sites to reporting organizations,
States shall consult with the appropriate EPA Regional Office. All
definitions of reporting organizations shall be subject to final
approval by the appropriate EPA Regional Office.
    3.0.4 Assessment results shall be reported as specified in section 4
of this appendix. Table A-1 of this appendix provides a summary of the
minimum data quality assessment requirements, which are described in
more detail in the following sections.
    3.1 Precision of Automated Methods Excluding PM2.5.
    3.1.1 Methods for SO2, NO2, O3 and
CO. A one-point precision check must be performed at least once every 2
weeks on each automated analyzer used to measure SO2,
NO2, O3 and CO. The precision check is made by
challenging the analyzer with a precision check gas of known
concentration (effective concentration for open path analyzers) between
0.08 and 0.10 ppm for SO2, NO2, and O3
analyzers, and between 8 and 10 ppm for CO analyzers. To check the
precision of SLAMS analyzers operating on ranges higher than 0 to 1.0
ppm SO2, NO2, and O3, or 0 to 100 ppm
for CO, use precision check gases of appropriately higher concentration
as approved by the appropriate Regional Administrator or their designee.
However, the results of precision checks at concentration levels other
than those specified above need not be reported to EPA. The standards
from which precision check test concentrations are obtained must meet
the specifications of section 2.3 of this appendix.
    3.1.1.1 Except for certain CO analyzers described below, point
analyzers must operate in their normal sampling mode during the
precision check, and the test atmosphere must pass through all filters,
scrubbers, conditioners and other components used during normal ambient
sampling and as much of the ambient air inlet system as is practicable.
If permitted by the associated operation or instruction manual, a CO
point analyzer may be temporarily modified during the precision check to
reduce vent or purge flows, or the test atmosphere may enter the
analyzer at a point other than the normal sample inlet, provided that
the analyzer's response is not likely to be altered by these deviations
from the normal operational mode. If a precision check is made in
conjunction with a zero or span adjustment, it must be made prior to
such zero or span adjustments. Randomization of the precision check with
respect to time of day, day of week, and routine service and adjustments
is encouraged where possible.
    3.1.1.2 Open path analyzers are tested by inserting a test cell
containing a precision check gas concentration into the optical
measurement beam of the instrument. If possible, the normally used
transmitter, receiver, and as appropriate, reflecting devices should be
used during the test, and the normal monitoring configuration of the
instrument should be altered as little as possible to accommodate the
test cell for the test. However, if permitted by the associated
operation or instruction manual, an alternate local light source or an
alternate optical path that does not include the normal atmospheric
monitoring path may be used. The actual concentration of the precision
check gas in the test cell must be selected to produce an effective
concentration in the range specified in section 3.1.1. Generally, the
precision test concentration measurement will be the sum of the
atmospheric pollutant concentration and the precision test
concentration. If so, the result must be corrected to remove the
atmospheric concentration contribution. The corrected concentration is
obtained by subtracting the average of the atmospheric concentrations
measured by the open path instrument under test immediately before and
immediately after the precision check test from the precision test
concentration measurement. If the difference between these before and
after measurements is greater than 20 percent of the effective
concentration of the test gas, discard the test result and repeat the
test. If possible, open path analyzers should be tested during periods
when the atmospheric pollutant concentrations are relatively low and
steady.
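    For illustration only (this sketch is not part of this appendix, and
the function and variable names are illustrative), the correction and
validity test described above for open path precision checks can be
expressed as follows in Python:

def corrected_precision_measurement(test_reading, ambient_before,
                                    ambient_after, effective_conc):
    """Correct an open path precision check reading for the ambient
    contribution, following the procedure described in section 3.1.1.2.

    Returns the corrected concentration, or None when the before and
    after ambient readings differ by more than 20 percent of the
    effective test concentration, in which case the check must be
    repeated.
    """
    drift = abs(ambient_after - ambient_before)
    if drift > 0.20 * effective_conc:
        return None  # discard the test result and repeat the test
    ambient_avg = (ambient_before + ambient_after) / 2.0
    return test_reading - ambient_avg
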
    3.1.1.3 Report the actual concentration (effective concentration for
open path analyzers) of the precision check gas and the corresponding
concentration measurement (corrected concentration, if applicable, for
open path analyzers) indicated by the analyzer. The percent differences
between these concentrations are used to assess the precision of the
monitoring data as described in section 5.1 of this appendix.
    3.1.2 Methods for Particulate Matter Excluding PM2.5. A
one-point precision check must be performed at least once every 2 weeks
on each automated analyzer used to measure PM10. The
precision check is made by checking the operational flow rate of the
analyzer. If a precision flow rate check is made in conjunction with a
flow rate adjustment, it must be made prior to such flow rate
adjustment. Randomization of the precision check with respect to time of
day, day of week, and routine service and adjustments is encouraged
where possible.
    3.1.2.1 Standard procedure: Use a flow rate transfer standard
certified in accordance with section 2.3.3 of this appendix to check the
analyzer's normal flow rate. Care should be used in selecting and using
the flow rate measurement device such that it does not alter the normal
operating flow rate of the analyzer. Report the actual analyzer flow
rate measured by the transfer standard and the corresponding flow rate
measured, indicated, or assumed by the analyzer.
    3.1.2.2 Alternative procedure:
    3.1.2.2.1 It is permissible to obtain the precision check flow rate
data from the analyzer's internal flow meter without the use of an
external flow rate transfer standard, provided that:
    3.1.2.2.1.1 The flow meter is audited with an external flow rate
transfer standard at least every 6 months.
    3.1.2.2.1.2 Records of at least the three most recent flow audits of
the instrument's internal flow meter over at least several weeks confirm
that the flow meter is stable, verifiable and accurate to ±4%.
    3.1.2.2.1.3 The instrument and flow meter give no indication of
improper operation.
    3.1.2.2.2 With suitable communication capability, the precision
check may thus be carried out remotely. For this procedure, report the
set-point flow rate as the actual flow rate along with the flow rate
measured or indicated by the analyzer flow meter.
    3.1.2.2.3 For either procedure, the percent differences between the
actual and indicated flow rates are used to assess the precision of the
monitoring data as described in section 5.1 of this appendix (using flow
rates in lieu of concentrations).
    3.2 Accuracy of Automated Methods Excluding PM2.5.
    3.2.1 Methods for SO2, NO2, O3, or
CO.
    3.2.1.1 Each calendar quarter (during which analyzers are operated),
audit at least 25 percent of the SLAMS analyzers that monitor for
SO2, NO2, O3, or CO such that each
analyzer is audited at least once per year. If there are fewer than four
analyzers for a pollutant within a reporting organization, randomly
reaudit one or more analyzers so that at least one analyzer for that
pollutant is audited each calendar quarter. Where possible, EPA strongly
encourages more frequent auditing, up to an audit frequency of once per
quarter for each SLAMS analyzer.
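    For illustration only (not part of this appendix; the function name
is illustrative), the minimum number of gaseous analyzers to audit in a
given quarter under this paragraph can be computed as follows in Python:

import math

def analyzers_to_audit_per_quarter(total_analyzers):
    """Minimum number of analyzers of one pollutant to audit in a
    quarter: at least 25 percent, so that every analyzer is audited at
    least once per year; if a reporting organization has fewer than
    four analyzers for the pollutant, at least one analyzer is
    (re)audited every quarter."""
    if total_analyzers <= 0:
        return 0
    if total_analyzers < 4:
        return 1  # randomly reaudit so one audit occurs each quarter
    return math.ceil(total_analyzers / 4)

    For example, a reporting organization with ten ozone analyzers would
audit at least three of them each calendar quarter.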
    3.2.1.2 (a) The audit is made by challenging the analyzer with at
least one audit gas of known concentration (effective concentration for
open path analyzers) from each of the following ranges applicable to the
analyzer being audited:

------------------------------------------------------------------------
                                          Concentration Range, PPM
            Audit Level           --------------------------------------
                                     SO2, O3        NO2           CO
------------------------------------------------------------------------
1................................    0.03-0.08    0.03-0.08          3-8
2................................    0.15-0.20    0.15-0.20        15-20
3................................    0.35-0.45    0.35-0.45        35-45
4................................    0.80-0.90  ...........        80-90
------------------------------------------------------------------------

    (b) NO2 audit gas for chemiluminescence-type
NO2 analyzers must also contain at least 0.08 ppm NO.
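    The audit level table above can be treated as a simple lookup. The
following Python sketch (illustrative only, not part of this appendix)
checks whether a proposed audit gas concentration falls within the
tabulated range for a given pollutant and audit level:

# Concentration ranges in ppm, keyed by pollutant and audit level,
# taken from the table in section 3.2.1.2; NO2 has no level 4 range.
AUDIT_RANGES = {
    "SO2": {1: (0.03, 0.08), 2: (0.15, 0.20), 3: (0.35, 0.45), 4: (0.80, 0.90)},
    "O3":  {1: (0.03, 0.08), 2: (0.15, 0.20), 3: (0.35, 0.45), 4: (0.80, 0.90)},
    "NO2": {1: (0.03, 0.08), 2: (0.15, 0.20), 3: (0.35, 0.45)},
    "CO":  {1: (3, 8), 2: (15, 20), 3: (35, 45), 4: (80, 90)},
}

def audit_gas_in_range(pollutant, level, concentration_ppm):
    """Return True if the audit gas concentration lies within the
    tabulated range for the given pollutant and audit level."""
    low, high = AUDIT_RANGES[pollutant][level]
    return low <= concentration_ppm <= high
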
    3.2.1.3 NO concentrations substantially higher than 0.08 ppm, as may
occur when using some gas phase titration (GPT) techniques, may lead to
audit errors in chemiluminescence analyzers due to inevitable minor NO-
NOx channel imbalance. Such errors may be atypical of routine
monitoring errors to the extent that such NO concentrations exceed
typical ambient NO concentrations at the site. These errors may be
minimized by modifying the GPT technique to lower the NO concentrations
remaining in the NO2 audit gas to levels closer to typical
ambient NO concentrations at the site.
    3.2.1.4 To audit SLAMS analyzers operating on ranges higher than 0
to 1.0 ppm for SO2, NO2, and O3 or 0 to
100 ppm for CO, use audit gases of appropriately higher concentration as
approved by the appropriate Regional Administrator or the
Administrator's designee. The results of audits at concentration levels
other than those shown in the above table need not be reported to EPA.
    3.2.1.5 The standards from which audit gas test concentrations are
obtained must meet
the specifications of section 2.3 of this appendix. The gas standards
and equipment used for auditing must not be the same as the standards
and equipment used for calibration or calibration span adjustments. The
auditor should not be the operator or analyst who conducts the routine
monitoring, calibration, and analysis.
    3.2.1.6 For point analyzers, the audit shall be carried out by
allowing the analyzer to analyze the audit test atmosphere in its normal
sampling mode such that the test atmosphere passes through all filters,
scrubbers, conditioners, and other sample inlet components used during
normal ambient sampling and as much of the ambient air inlet system as
is practicable. The exception provided in section 3.1 of this appendix
for certain CO analyzers does not apply for audits.
    3.2.1.7 Open path analyzers are audited by inserting a test cell
containing the various audit gas concentrations into the optical
measurement beam of the instrument. If possible, the normally used
transmitter, receiver, and, as appropriate, reflecting devices should be
used during the audit, and the normal monitoring configuration of the
instrument should be modified as little as possible to accommodate the
test cell for the audit. However, if permitted by the associated
operation or instruction manual, an alternate local light source or an
alternate optical path that does not include the normal atmospheric
monitoring path may be used. The actual concentrations of the audit gas
in the test cell must be selected to produce effective concentrations in
the ranges specified in section 3.2 of this appendix. Generally,
each audit concentration measurement result will be the sum of the
atmospheric pollutant concentration and the audit test concentration. If
so, the result must be corrected to remove the atmospheric concentration
contribution. The corrected concentration is obtained by subtracting the
average of the atmospheric concentrations measured by the open path
instrument under test immediately before and immediately after the audit
test (or preferably before and after each audit concentration level)
from the audit concentration measurement. If the difference between the
before and after measurements is greater than 20 percent of the
effective concentration of the test gas standard, discard the test
result for that concentration level and repeat the test for that level.
If possible, open path analyzers should be audited during periods when
the atmospheric pollutant concentrations are relatively low and steady.
Also, the monitoring path length must be reverified to within ±3
percent to validate the audit, since the monitoring path
length is critical to the determination of the effective concentration.
    3.2.1.8 Report both the actual concentrations (effective
concentrations for open path analyzers) of the audit gases and the
corresponding concentration measurements (corrected concentrations, if
applicable, for open path analyzers) indicated or produced by the
analyzer being tested. The percent differences between these
concentrations are used to assess the accuracy of the monitoring data as
described in section 5.2 of this appendix.
    3.2.2 Methods for Particulate Matter Excluding PM2.5.
    3.2.2.1 Each calendar quarter, audit the flow rate of at least 25
percent of the SLAMS PM10 analyzers such that each
PM10 analyzer is audited at least once per year. If there are
fewer than four PM10 analyzers within a reporting
organization, randomly re-audit one or more analyzers so that at least
one analyzer is audited each calendar quarter. Where possible, EPA
strongly encourages more frequent auditing, up to an audit frequency of
once per quarter for each SLAMS analyzer.
    3.2.2.2 The audit is made by measuring the analyzer's normal
operating flow rate, using a flow rate transfer standard certified in
accordance with section 2.3.3 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used to
calibrate the analyzer. However, both the calibration standard and the
audit standard may be referenced to the same primary flow rate or volume
standard. Great care must be used in auditing the flow rate to be
certain that the flow measurement device does not alter the normal
operating flow rate of the analyzer. Report the audit (actual) flow rate
and the corresponding flow rate indicated or assumed by the sampler. The
percent differences between these flow rates are used to calculate
accuracy (PM10) as described in section 5.2 of this appendix.
    3.3 Precision of Manual Methods Excluding PM2.5.
    3.3.1 For each network of manual methods other than for
PM2.5, select one or more monitoring sites within the
reporting organization for duplicate, collocated sampling as follows:
for 1 to 5 sites, select 1 site; for 6 to 20 sites, select 2 sites; and
for over 20 sites, select 3 sites. Where possible, additional collocated
sampling is encouraged. For purposes of precision assessment, networks
for measuring TSP and PM10 shall be considered separately
from one another. PM10 and TSP sites having annual mean
particulate matter concentrations among the highest 25 percent of the
annual mean concentrations for all the sites in the network must be
selected or, if such sites are impractical, alternative sites approved
by the Regional Administrator may be selected.
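    For illustration only (not part of this appendix; the function name
is illustrative), the number of collocated sites required by this
paragraph can be computed as follows in Python:

def collocated_sites_required(network_sites):
    """Collocated sampling sites required for a manual method network
    other than PM2.5: 1 site for 1 to 5 sites, 2 sites for 6 to 20
    sites, and 3 sites for more than 20 sites."""
    if network_sites <= 0:
        return 0
    if network_sites <= 5:
        return 1
    if network_sites <= 20:
        return 2
    return 3
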
    3.3.2 In determining the number of collocated sites required for
PM10, monitoring networks for lead should be treated
independently from networks for particulate matter, even though the
separate networks may share one or more common samplers.
However, a single pair of samplers collocated at a common-sampler
monitoring site that meets the requirements for both a collocated lead
site and a collocated particulate matter site may serve as a collocated
site for both networks.
    3.3.3 The two collocated samplers must be within 4 meters of each
other, and particulate matter samplers must be at least 2 meters apart
to preclude airflow interference. Calibration, sampling, and analysis
must be the same for both collocated samplers and the same as for all
other samplers in the network.
    3.3.4 For each pair of collocated samplers, designate one sampler as
the primary sampler whose samples will be used to report air quality for
the site, and designate the other as the duplicate sampler. Each
duplicate sampler must be operated concurrently with its associated
routine sampler at least once per week. The operation schedule should be
selected so that the sampling days are distributed evenly over the year
and over the seven days of the week. A six-day sampling schedule is
required. Report the measurements from both samplers at each collocated
sampling site. The calculations for evaluating precision between the two
collocated samplers are described in section 5.3 of this appendix.
    3.4 Accuracy of Manual Methods Excluding PM2.5. The
accuracy of manual sampling methods is assessed by auditing a portion of
the measurement process.
    3.4.1 Procedures for PM10 and TSP.
    3.4.1.1 Procedures for flow rate audits for PM10. Each
calendar quarter, audit the flow rate of at least 25 percent of the
PM10 samplers such that each PM10 sampler is
audited at least once per year. If there are fewer than four
PM10 samplers within a reporting organization, randomly
reaudit one or more samplers so that one sampler is audited each
calendar quarter. Audit each sampler at its normal operating flow rate,
using a flow rate transfer standard certified in accordance with section
2.3.3 of this appendix. The flow rate standard used for auditing must
not be the same flow rate standard used to calibrate the sampler.
However, both the calibration standard and the audit standard may be
referenced to the same primary flow rate standard. The flow audit should
be scheduled so as to avoid interference with a scheduled sampling
period. Report the audit (actual) flow rate and the corresponding flow
rate indicated by the sampler's normally used flow indicator. The
percent differences between these flow rates are used to calculate
accuracy and bias as described in section 5.4.1 of this appendix.
    3.4.1.2 Great care must be used in auditing high-volume particulate
matter samplers having flow regulators because the introduction of
resistance plates in the audit flow standard device can cause abnormal
flow patterns at the point of flow sensing. For this reason, the flow
audit standard should be used with a normal filter in place and without
resistance plates in auditing flow-regulated high-volume samplers, or
other steps should be taken to assure that flow patterns are not
perturbed at the point of flow sensing.
    3.4.2 SO2 Methods.
    3.4.2.1 Prepare audit solutions from a working sulfite-
tetrachloromercurate (TCM) solution as described in section 10.2 of the
SO2 Reference Method (40 CFR part 50, appendix A). These
audit samples must be prepared independently from the standardized
sulfite solutions used in the routine calibration procedure. Sulfite-TCM
audit samples must be stored between 0 and 5 °C and expire 30 days
after preparation.
    3.4.2.2 Prepare audit samples in each of the concentration ranges of
0.2-0.3, 0.5-0.6, and 0.8-0.9 µg SO2/ml. Analyze an audit sample in
each of the three ranges at least once each day that samples are
analyzed and at least twice per calendar quarter. Report the audit
concentrations (in µg SO2/ml) and the corresponding indicated
concentrations (in µg SO2/ml). The percent differences between these
concentrations are used to
calculate accuracy as described in section 5.4.2 of this appendix.
    3.4.3 NO2 Methods. Prepare audit solutions from a working
sodium nitrite solution as described in the appropriate equivalent
method (see reference 8 of this appendix). These audit samples must be
prepared independently from the standardized nitrite solutions used in
the routine calibration procedure. Sodium nitrite audit samples expire
in 3 months after preparation. Prepare audit samples in each of the
concentration ranges of 0.2-0.3, 0.5-0.6, and 0.8-0.9 µg NO2/ml. Analyze
an audit sample in each of the three ranges at least once each day that
samples are analyzed and at least twice per calendar quarter. Report the
audit concentrations (in µg NO2/ml) and the corresponding indicated
concentrations (in µg NO2/ml). The percent differences between these
concentrations are used to calculate accuracy as described in section
5.4.2 of this appendix.
    3.4.4 Pb Methods.
    3.4.4.1 For the Pb Reference Method (40 CFR part 50, appendix G),
the flow rates of the high-volume Pb samplers shall be audited as part
of the TSP network using the same procedures described in section 3.4.1
of this appendix. For agencies operating both TSP and Pb networks, 25
percent of the total number of high-volume samplers are to be audited
each quarter.
    3.4.4.2 Each calendar quarter, audit the Pb Reference Method
analytical procedure using glass fiber filter strips containing a known
quantity of Pb. These audit sample strips are prepared by depositing a
Pb solution on unexposed glass fiber filter strips of dimensions
1.9 cm by 20.3 cm (3/4 inch by 8 inch) and allowing them to dry
thoroughly. The audit samples must be prepared using batches of reagents
different from those used to calibrate the Pb analytical equipment being
audited. Prepare audit samples in the following concentration ranges:

------------------------------------------------------------------------
                                             Pb             Equivalent
                Range                  Concentration,       Ambient Pb
                                          µg/strip        Concentration,
                                                            µg/m3 \1\
------------------------------------------------------------------------
1...................................           100-300           0.5-1.5
2...................................          600-1000           3.0-5.0
------------------------------------------------------------------------
\1\ Equivalent ambient Pb concentration in µg/m3 is based on sampling at
  1.7 m3/min for 24 hours on a 20.3 cm x 25.4 cm (8 inch x 10 inch)
  glass fiber filter.

    3.4.4.3 Audit samples must be extracted using the same extraction
procedure used for exposed filters.
    3.4.4.4 Analyze three audit samples in each of the two ranges each
quarter samples are analyzed. The audit sample analyses shall be
distributed as much as possible over the entire calendar quarter. Report
the audit concentrations (in µg Pb/strip) and the corresponding measured
concentrations (in µg Pb/strip) using unit code 77. The
percent differences between the concentrations are used to calculate
analytical accuracy as described in section 5.4.2 of this appendix.
    3.4.4.5 The accuracy of an equivalent Pb method is assessed in the
same manner as for the reference method. The flow auditing device and Pb
analysis audit samples must be compatible with the specific requirements
of the equivalent method.
    3.5 Measurement Uncertainty for Automated and Manual
PM2.5 Methods. The goal for acceptable measurement
uncertainty has been defined as 10 percent coefficient of variation (CV)
for total precision and ±10 percent for total bias
(reference 14 of this appendix).
    3.5.1 Flow Rate Audits.
    3.5.1.1 Automated methods for PM2.5. A one-point
precision check must be performed at least once every 2 weeks on each
automated analyzer used to measure PM2.5. The precision check
is made by checking the operational flow rate of the analyzer. If a
precision flow rate check is made in conjunction with a flow rate
adjustment, it must be made prior to such flow rate adjustment.
Randomization of the precision check with respect to time of day, day of
week, and routine service and adjustments is encouraged where possible.
    3.5.1.1.1 Standard procedure: Use a flow rate transfer standard
certified in accordance with section 2.3.3 of this appendix to check the
analyzer's normal flow rate. Care should be used in selecting and using
the flow rate measurement device such that it does not alter the normal
operating flow rate of the analyzer. Report the actual analyzer flow
rate measured by the transfer standard and the corresponding flow rate
measured, indicated, or assumed by the analyzer.
    3.5.1.1.2 Alternative procedure: It is permissible to obtain the
precision check flow rate data from the analyzer's internal flow meter
without the use of an external flow rate transfer standard, provided
that the flow meter is audited with an external flow rate transfer
standard at least every 6 months; records of at least the three most
recent flow audits of the instrument's internal flow meter over at least
several weeks confirm that the flow meter is stable, verifiable and
accurate to ±4%; and the instrument and flow meter give no
indication of improper operation. With suitable communication
capability, the precision check may thus be carried out remotely. For
this procedure, report the set-point flow rate as the actual flow rate
along with the flow rate measured or indicated by the analyzer flow
meter.
    3.5.1.1.3 For either procedure, the differences between the actual
and indicated flow rates are used to assess the precision of the
monitoring data as described in section 5.5 of this appendix.
    3.5.1.2 Manual methods for PM2.5. Each calendar quarter,
audit the flow rate of each SLAMS PM2.5 analyzer. The audit
is made by measuring the analyzer's normal operating flow rate, using a
flow rate transfer standard certified in accordance with section 2.3.3
of this appendix. The flow rate standard used for auditing must not be
the same flow rate standard used to calibrate the analyzer. However,
both the calibration standard and the audit standard may be referenced
to the same primary flow rate or volume standard. Great care must be
used in auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the analyzer.
Report the audit (actual) flow rate and the corresponding flow rate
indicated or assumed by the sampler. The procedures used to calculate
measurement uncertainty PM2.5 are described in section 5.5 of
this appendix.
    3.5.2 Measurement of Precision using Collocated Procedures for
Automated and Manual Methods of PM2.5.
    (a) For PM2.5 sites within a reporting organization each
EPA designated Federal reference method (FRM) or Federal equivalent
method (FEM) must:
    (1) Have 25 percent of the monitors collocated (values of .5 and
greater round up).
    (2) Have at least 1 collocated monitor (if the total number of
monitors is less than 4). The first collocated monitor must be a
designated FRM monitor.
    (b) In addition, monitors selected must also meet the following
requirements:
    (1) A monitor designated as an EPA FRM shall be collocated with a
monitor having the same EPA FRM designation.
    (2) For each monitor designated as an EPA FEM, 50 percent of the
designated monitors
shall be collocated with a monitor having the same method designation
and 50 percent of the monitors shall be collocated with an FRM monitor.
If an odd number of collocated monitors is required, the additional
monitor shall be an FRM. An example of this procedure is found in table
A-2 of this appendix.
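    For illustration only (not part of this appendix; the function names
and the rounding expression are illustrative), the collocation counts
described in paragraphs (a) and (b) of this section can be sketched in
Python as follows:

def pm25_collocated_monitors(monitor_count):
    """Collocated monitors required for one FRM or FEM method
    designation: 25 percent of the monitors, with values of .5 and
    greater rounding up, and at least one collocated monitor when any
    monitors of that designation are deployed."""
    if monitor_count <= 0:
        return 0
    return max(1, int(monitor_count * 0.25 + 0.5))

def fem_collocation_split(collocated_count):
    """For an FEM designation, split its collocated monitors evenly
    between pairing with the same designation and pairing with an FRM;
    when the count is odd, the additional monitor is paired with an
    FRM."""
    with_same_method = collocated_count // 2
    with_frm = collocated_count - with_same_method
    return with_same_method, with_frm

    For example, ten monitors of a single FEM designation would require
three collocated monitors, two paired with FRM monitors and one paired
with a monitor of the same FEM designation.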
    (c) For PM2.5 sites during the initial deployment of the
SLAMS network, special emphasis should be placed on those sites in areas
likely to be in violation of the NAAQS. Once areas are initially
determined to be in violation, the collocated monitors should be
deployed according to the following protocol:
    (1) Eighty percent of the collocated monitors should be deployed at
sites with concentrations greater than or equal to ninety percent of the
annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area); one
hundred percent if all sites have concentrations above either NAAQS, and
each area determined to be in violation should be represented by at
least one collocated monitor.
    (2) The remaining 20 percent of the collocated monitors should be
deployed at sites with concentrations less than ninety percent of the
annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area).
    (3) If an organization has no sites at concentration ranges greater
than or equal to ninety percent of the annual PM2.5 NAAQS (or 24-hour
NAAQS if that is affecting the area), 60 percent of the collocated
monitors should be deployed at those sites with the annual mean PM2.5
concentrations (or 24-hour NAAQS if that is affecting the area) among
the highest 25 percent for all PM2.5 sites in the network.
    3.5.2.1 In determining the number of collocated sites required for
PM2.5, monitoring networks for visibility should not be
treated independently from networks for particulate matter, as the
separate networks may share one or more common samplers. However, for
class I visibility areas, EPA will accept visibility aerosol mass
measurement instead of a PM2.5 measurement if the latter
measurement is unavailable. Any PM2.5 monitoring site which
does not have a monitor which is an EPA federal reference or equivalent
method is not required to be included in the number of sites which are
used to determine the number of collocated monitors.
    3.5.2.2 The two collocated samplers must be within 4 meters of each
other, and particulate matter samplers must be at least 2 meters apart
(1 meter apart for samplers having flow rates less than 200 liters/min.)
to preclude airflow interference. Calibration, sampling, and analysis
must be the same for both collocated samplers and the same as for all
other samplers in the network.
    3.5.2.3 For each pair of collocated samplers, designate one sampler
as the primary sampler whose samples will be used to report air quality
for the site, and designate the other as the duplicate sampler. Each
duplicate sampler must be operated concurrently with its associated
primary sampler. The operation schedule should be selected so that the
sampling days are distributed evenly over the year and over the 7 days
of the week and therefore, a 6-day sampling schedule is required. Report
the measurements from both samplers at each collocated sampling site.
The calculations for evaluating precision between the two collocated
samplers are described in section 5.5 of this appendix.
    3.5.3 Measurement of Bias using the FRM Audit Procedures for
Automated and Manual Methods of PM2.5.
    3.5.3.1 The FRM audit is an independent assessment of the total
measurement system bias. These audits will be performed under the
National Performance Audit Program (section 2.4 of this appendix) or a
comparable program. Twenty-five percent of the SLAMS monitors within
each reporting organization will be assessed with an FRM audit each
year. Additionally, every designated FRM or FEM within a reporting
organization must:
    (a) Have at least 25 percent of each method designation audited,
including collocated sites (even those collocated with FRM instruments);
values of .5 and greater round up.
    (b) Have at least one monitor audited.
    (c) Be audited at a frequency of four audits per year.
    (d) Have all FRM or FEM samplers subject to an FRM audit at least
once every 4 years. Table A-2 illustrates the procedure mentioned above.
    3.5.3.2 For PM2.5 sites during the initial deployment of
the SLAMS network, special emphasis should be placed on those sites in
areas likely to be in violation of the NAAQS. Once areas are initially
determined to be in violation, the FRM audit program should be
implemented according to the following protocol:
    (a) Eighty percent of the FRM audits should be deployed at sites
with concentrations greater than or equal to ninety percent of the
annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area); one
hundred percent if all sites have concentrations above either NAAQS, and
each area determined to be in violation should implement an FRM audit at
a minimum of one monitor within that area.
    (b) The remaining 20 percent of the FRM audits should be implemented
at sites with concentrations less than ninety percent of the annual
PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area).
    (c) If an organization has no sites at concentration ranges greater
than or equal to ninety percent of the annual PM2.5 NAAQS (or 24-hour
NAAQS if that is affecting the area), 60 percent of the FRM audits
should be implemented at those sites with the annual mean PM2.5
concentrations (or 24-hour NAAQS if that is affecting the area) among
the highest 25 percent for all PM2.5 sites in the network. Additional
information concerning the FRM audit program is contained in reference 7
of this appendix. The calculations for evaluating bias between the
primary monitor and the FRM audit are described in section 5.5.
4. Reporting Requirements.
    (a) For each pollutant, prepare a list of all monitoring sites and
their AIRS site identification codes in each reporting organization and
submit the list to the appropriate EPA Regional Office, with a copy to
AIRS-AQS. Whenever there is a change in this list of monitoring sites in
a reporting organization, report this change to the Regional Office and
to AIRS-AQS.
    4.1 Quarterly Reports. For each quarter, each reporting organization
shall report to AIRS-AQS directly (or via the appropriate EPA Regional
Office for organizations not direct users of AIRS) the results of all
valid precision, bias and accuracy tests it has carried out during the
quarter. The quarterly reports of precision, bias and accuracy data must
be submitted consistent with the data reporting requirements specified
for air quality data as set forth in Sec. 58.35(c). EPA strongly
encourages early submittal of the QA data in order to assist the State
and Local agencies in controlling and evaluating the quality of the
ambient air SLAMS data. Each organization shall report all QA/QC
measurements. Report results from invalid tests, from tests carried out
during a time period for which ambient data immediately prior or
subsequent to the tests were invalidated for appropriate reasons, and
from tests of methods or analyzers not approved for use in SLAMS
monitoring networks under appendix C of this part. Such data should be
flagged so that it will not be utilized for quantitative assessment of
precision, bias and accuracy.
    4.2 Annual Reports.
    4.2.1 When precision, bias and accuracy estimates for a reporting
organization have been calculated for all four quarters of the calendar
year, EPA will calculate and report the measurement uncertainty for the
entire calendar year. These limits will then be associated with the data
submitted in the annual SLAMS report required by Sec. 58.26.
    4.2.2 Each reporting organization shall submit, along with its
annual SLAMS report, a listing by pollutant of all monitoring sites in
the reporting organization.
5. Calculations for Data Quality Assessment.
    (a) Calculations of measurement uncertainty are carried out by EPA
according to the following procedures. Reporting organizations should
report the data for individual precision, bias and accuracy tests as
specified in sections 3 and 4 of this appendix even though they may
elect to perform some or all of the calculations in this section on
their own.
    5.1 Precision of Automated Methods Excluding PM2.5.
Estimates of the precision of automated methods are calculated from the
results of biweekly precision checks as specified in section 3.1 of this
appendix. At the end of each calendar quarter, an integrated precision
probability interval for all SLAMS analyzers in the organization is
calculated for each pollutant.
    5.1.1 Single Analyzer Precision.
    5.1.1.1 The percent difference (di) for each precision
check is calculated using equation 1, where Yi is the
concentration indicated by the analyzer for the I-th precision check and
Xi is the known concentration for the I-th precision check,
as follows:

                               Equation 1
        di = [(Yi - Xi) / Xi] x 100

    5.1.1.2 For each analyzer, the quarterly average (dj) is
calculated with equation 2, and the standard deviation (Sj)
with equation 3, where n is the number of precision checks on the
instrument made during the calendar quarter. For example, n should be 6
or 7 if precision checks are made biweekly during a quarter. Equations
2 and 3 follow:

                               Equation 2
        dj = (1/n) SUM(i=1 to n) di

                               Equation 3
        Sj = sqrt{ [n SUM(di^2) - (SUM di)^2] / [n(n - 1)] }
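
    As an illustration only (not part of this appendix; names are
illustrative), the single-analyzer calculation of equations 1 through 3,
as reconstructed above, can be sketched in Python:

import statistics

def single_analyzer_precision(known, indicated):
    """Quarterly precision statistics for one analyzer.

    known (Xi) and indicated (Yi) are parallel lists of precision check
    gas concentrations and analyzer responses for the quarter. Returns
    (dj, Sj): the quarterly average percent difference (equation 2) and
    its standard deviation (equation 3)."""
    d = [(y - x) / x * 100.0 for x, y in zip(known, indicated)]  # eq. 1
    return statistics.mean(d), statistics.stdev(d)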

    5.1.2 Precision for Reporting Organization.
    5.1.2.1 For each pollutant, the average of averages (D) and the
pooled standard deviation (Sa) are calculated for all
analyzers audited for the pollutant during the quarter, using either
equations 4 and 5 or 4a and 5a, where k is the number of analyzers
audited within the reporting organization for a single pollutant, as
follows:

                               Equation 4
        D = (1/k) SUM(j=1 to k) dj

                               Equation 4a
        D = [SUM(j=1 to k) nj dj] / [SUM(j=1 to k) nj]

                               Equation 5
        Sa = sqrt[ (S1^2 + S2^2 + ... + Sk^2) / k ]

                               Equation 5a
        Sa = sqrt{ [SUM(j=1 to k) (nj - 1) Sj^2] / [SUM(j=1 to k) (nj - 1)] }

    5.1.2.2 Equations 4 and 5 are used when the same number of precision
checks are made for each analyzer. Equations 4a and 5a are used to
obtain a weighted average and a weighted standard deviation when
different numbers of precision checks are made for the analyzers.
    5.1.2.3 For each pollutant, the 95 Percent Probability Limits for
the precision of a reporting organization are calculated using equations
6 and 7, as follows:

                               Equation 6
        Upper 95 Percent Probability Limit = D + 1.96 Sa

                               Equation 7
        Lower 95 Percent Probability Limit = D - 1.96 Sa
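
    As an illustration only (not part of this appendix), the pooling in
equations 4a and 5a and the probability limits in equations 6 and 7, as
reconstructed above, can be sketched in Python; the weighted forms are
used so the sketch also covers the equal-n case:

import math

def organization_precision_limits(analyzer_stats):
    """analyzer_stats is a list of (n, dj, Sj) tuples, one per
    analyzer: the number of precision checks, the quarterly average
    percent difference, and its standard deviation. Assumes at least
    one analyzer with two or more checks. Returns
    (D, Sa, lower_limit, upper_limit)."""
    total_checks = sum(n for n, _, _ in analyzer_stats)
    D = sum(n * dj for n, dj, _ in analyzer_stats) / total_checks  # eq. 4a
    Sa = math.sqrt(sum((n - 1) * sj ** 2 for n, _, sj in analyzer_stats)
                   / sum(n - 1 for n, _, _ in analyzer_stats))     # eq. 5a
    return D, Sa, D - 1.96 * Sa, D + 1.96 * Sa                     # eqs. 6, 7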

    5.2 Accuracy of Automated Methods Excluding PM2.5.
Estimates of the accuracy of automated methods are calculated from the
results of independent audits as described in section 3.2 of this
appendix. At the end of each calendar quarter, an integrated accuracy
probability interval for all SLAMS analyzers audited in the reporting
organization is calculated for each pollutant. Separate probability
limits are calculated for each audit concentration level in section 3.2
of this appendix.
    5.2.1 Single Analyzer Accuracy. The percentage difference
(di) for each audit concentration is calculated using
equation 1, where Yi is the analyzer's indicated
concentration measurement from the I-th audit check and Xi is
the actual concentration of the audit gas used for the I-th audit check.
    5.2.2 Accuracy for Reporting Organization.
    5.2.2.1 For each audit concentration level of a particular
pollutant, the average (D) of the individual percentage differences
(di) for all n analyzers audited during the quarter is
calculated using equation 8, as follows:

                               Equation 8
        D = (1/n) SUM(i=1 to n) di

    5.2.2.2 For each concentration level of a particular pollutant, the
standard deviation (Sa) of all the individual percentage
differences for all n analyzers audited during the quarter is
calculated, using equation 9, as follows:

                               Equation 9
        Sa = sqrt{ [n SUM(di^2) - (SUM di)^2] / [n(n - 1)] }

    5.2.2.3 For reporting organizations having four or fewer analyzers
for a particular pollutant, only one audit is required each quarter. For
such reporting organizations, the audit results of two consecutive
quarters are required to calculate an average and a standard deviation,
using equations 8 and 9. Therefore, the reporting of probability limits
shall be on a semiannual (instead of a quarterly) basis.
    5.2.2.4 For each pollutant, the 95 Percent Probability Limits for
the accuracy of a reporting organization are calculated at each audit
concentration level using equations 6 and 7.
    5.3 Precision of Manual Methods Excluding PM2.5.
Estimates of precision of manual methods are calculated from the results
obtained from collocated samplers as described in section 3.3 of this
appendix. At the end of each calendar quarter, an integrated precision
probability interval for all collocated samplers operating in the
reporting organization is calculated for each manual method network.
    5.3.1 Single Sampler Precision.
    5.3.1.1 At low concentrations, agreement between the measurements of
collocated samplers, expressed as percent differences,
may be relatively poor. For this reason, collocated measurement pairs
are selected for use in the precision calculations only when both
measurements are above the following limits:
    (a) TSP: 20 µg/m3.
    (b) SO2: 45 µg/m3.
    (c) NO2: 30 µg/m3.
    (d) Pb: 0.15 µg/m3.
    (e) PM10: 20 µg/m3.
    5.3.1.2 For each selected measurement pair, the percent difference
(di) is calculated, using equation 10, as follows:

                               Equation 10
        di = [(Yi - Xi) / ((Yi + Xi) / 2)] x 100

where:

Yi is the pollutant concentration measurement obtained from
the duplicate sampler; and
Xi is the concentration measurement obtained from the primary
sampler designated for reporting air quality for the site.

    (a) For each site, the quarterly average percent difference
(dj) is calculated from equation 2 and the standard deviation
(Sj) is calculated from equation 3, where n= the number of
selected measurement pairs at the site.
    5.3.2 Precision for Reporting Organization.
    5.3.2.1 For each pollutant, the average percentage difference (D)
and the pooled standard deviation (Sa) are calculated, using
equations 4 and 5, or using equations 4a and 5a if different numbers of
paired measurements are obtained at the collocated sites. For these
calculations, the k of equations 4, 4a, 5 and 5a is the number of
collocated sites.
    5.3.2.2 The 95 Percent Probability Limits for the integrated
precision for a reporting organization are calculated using equations 11
and 12, as follows:

                               Equation 11
        Upper 95 Percent Probability Limit = D + 1.96 (Sa / sqrt(2))

                               Equation 12
        Lower 95 Percent Probability Limit = D - 1.96 (Sa / sqrt(2))
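
    As an illustration only (not part of this appendix; names are
illustrative), the site-level collocated precision calculation of
section 5.3.1, using the minimum concentrations listed above and
equations 10, 2, and 3 as reconstructed, can be sketched in Python:

import statistics

# Minimum concentrations (µg/m3) below which a collocated pair is not
# used in the precision calculations (section 5.3.1.1).
MINIMUM_CONC = {"TSP": 20, "SO2": 45, "NO2": 30, "Pb": 0.15, "PM10": 20}

def collocated_site_precision(pollutant, primary, duplicate):
    """primary (Xi) and duplicate (Yi) are parallel lists of
    concentrations. Pairs with either value at or below the minimum
    concentration are dropped. Returns (dj, Sj) for the site, or None
    if fewer than two usable pairs remain."""
    cutoff = MINIMUM_CONC[pollutant]
    d = [(y - x) / ((y + x) / 2.0) * 100.0           # equation 10
         for x, y in zip(primary, duplicate)
         if x > cutoff and y > cutoff]
    if len(d) < 2:
        return None
    return statistics.mean(d), statistics.stdev(d)   # equations 2, 3

    The organization-level probability limits then follow from equations
4 (or 4a), 5 (or 5a), 11, and 12 as reconstructed above.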

    5.4 Accuracy of Manual Methods Excluding PM2.5. Estimates
of the accuracy of manual methods are calculated from the results of
independent audits as described in section 3.4 of this appendix. At the
end of each calendar quarter, an integrated accuracy probability
interval is calculated for each manual method network operated by the
reporting organization.
    5.4.1 Particulate Matter Samplers other than PM2.5
(including reference method Pb samplers).
    5.4.1.1 Single Sampler Accuracy. For the flow rate audit described
in section 3.4.1 of this appendix, the percentage difference
(di) for each audit is calculated using equation 1, where
Xi represents the known flow rate and Yi
represents the flow rate indicated by the sampler.
    5.4.1.2 Accuracy for Reporting Organization. For each type of
particulate matter measured (e.g., TSP/Pb), the average (D) of the
individual percent differences for all similar particulate matter
samplers audited during the calendar quarter is calculated using
equation 8. The standard deviation (Sa) of the percentage
differences for all of the similar particulate matter samplers audited
during the calendar quarter is calculated using equation 9. The 95
Percent Probability Limits for the integrated accuracy for the reporting
organization are calculated using equations 6 and 7. For reporting
organizations having four or fewer particulate matter samplers of one
type, only one audit is required each quarter, and the audit results of
two consecutive quarters are required to calculate an average and a
standard deviation. In that case, probability limits shall be reported
semi-annually rather than quarterly.
    5.4.2 Analytical Methods for SO2, NO2, and Pb.
    5.4.2.1 Single Analysis-Day Accuracy. For each of the audits of the
analytical methods for SO2, NO2, and Pb described
in sections 3.4.2, 3.4.3, and 3.4.4 of this appendix, the percentage
difference (dj) at each concentration level is calculated
using equation 1, where Xj represents the known value of the
audit sample and Yj represents the value of SO2,
NO2, or Pb indicated by the analytical method.
    5.4.2.2 Accuracy for Reporting Organization. For each analytical
method, the average (D) of the individual percent differences at each
concentration level for all audits during the calendar quarter is
calculated using equation 8. The standard deviation (Sa) of
the percentage differences at each concentration level for all audits
during the calendar quarter is calculated using equation 9. The 95
Percent Probability Limits for the accuracy for the reporting
organization are calculated using equations 6 and 7.
    5.5 Precision, Accuracy and Bias for Automated and Manual
PM2.5 Methods.
    (a) Reporting organizations are required to report the data that
will allow assessments of the following individual quality control
checks and audits:
    (1) Flow rate audit.
    (2) Collocated samplers, where the duplicate sampler is not an FRM
device.
    (3) Collocated samplers, where the duplicate sampler is an FRM
device.
    (4) FRM audits.
    (b) EPA uses the reported results to derive precision, accuracy and
bias estimates according to the following procedures.
    5.5.1 Flow Rate Audits. The reporting organization shall report both
the audit standard flow rate and the flow rate indicated by the sampling
instrument. These results are used by EPA to calculate flow rate
accuracy and bias estimates.
    5.5.1.1 Accuracy of a Single Sampler - Single Check (Quarterly)
Basis (di). The percentage difference (di) for a single flow rate audit
is calculated using equation 13,
where Xi represents the audit standard flow rate (known) and
Yi represents the indicated flow rate, as follows:

                               Equation 13
        di = [(Yi - Xi) / Xi] x 100

    5.5.1.2 Bias of a Single Sampler - Annual Basis (Dj). For
an individual particulate sampler j, the average (Dj) of the
individual percentage differences (di) during the calendar
year is calculated using equation 14, where nj is the number
of individual percentage differences produced for sampler j during the
calendar year, as follows:

                               Equation 14
        Dj = (1/nj) SUM(i=1 to nj) di

    5.5.1.3 Bias for Each EPA Federal Reference and Equivalent Method
Designation Employed by Each Reporting Organization - Quarterly Basis
(Dk,q). For method designation k used by the reporting
organization, quarter q's single sampler percentage differences
(di) are averaged using equation 15, where nk,q is
the number of individual percentage differences produced for method
designation k in quarter q, as follows:

                               Equation 15
        Dk,q = (1/nk,q) SUM(i=1 to nk,q) di

    5.5.1.4 Bias for Each Reporting Organization - Quarterly Basis
(Dq). For each reporting organization, quarter q's single
sampler percentage differences (di) are averaged using
equation 16, to produce a single average for each reporting
organization, where nq is the total number of single sampler
percentage differences for all federal reference or equivalent methods
of samplers in quarter q, as follows:

                               Equation 16
        Dq = (1/nq) SUM(i=1 to nq) di

    5.5.1.5 Bias for Each EPA Federal Reference and Equivalent Method
Designation Employed by Each Reporting Organization - Annual Basis
(Dk). For method designation k used by the reporting
organization, the annual average percentage difference, Dk,
is derived using equation 17, where Dk,q is the average
reported for method designation k during the qth quarter, and
nk,q is the number of the method designation k's monitors
that were deployed during the qth quarter, as follows:

                               Equation 17
        Dk = [SUM(q=1 to 4) nk,q Dk,q] / [SUM(q=1 to 4) nk,q]

    5.5.1.6 Bias for Each Reporting Organization - Annual Basis (D). For
each reporting organization, the annual average percentage difference,
D, is derived using equation 18, where Dq is the average
reported for the reporting organization during the qth quarter, and
nq is the total number of monitors that were deployed during the
qth quarter. A single annual average is produced for each reporting
organization. Equation 18 follows:

                               Equation 18
        D = [SUM(q=1 to 4) nq Dq] / [SUM(q=1 to 4) nq]
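
    As an illustration only (not part of this appendix; the weighting
follows the reconstruction of equations 14 through 18 above), the
quarterly and annual bias aggregation for flow rate audits can be
sketched in Python:

def average_percent_difference(percent_diffs):
    """Average of single-check percentage differences; equations 14,
    15 and 16 all have this form, differing only in which checks are
    included."""
    return sum(percent_diffs) / len(percent_diffs)

def annual_weighted_bias(quarterly_results):
    """Annual bias as a monitor-weighted average of quarterly averages
    (equations 17 and 18). quarterly_results is a list of (n, D)
    pairs: the number of monitors deployed in the quarter and the
    quarterly average percentage difference."""
    total_monitors = sum(n for n, _ in quarterly_results)
    return sum(n * d for n, d in quarterly_results) / total_monitors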

    5.5.2 Collocated Samplers, Where the Duplicate Sampler is not an FRM
Device. (a) At low concentrations, agreement between the measurements of
collocated samplers may be relatively poor. For this reason, collocated
measurement pairs are selected for use in the precision calculations
only when both measurements are above the following limits:
              PM2.5: 6 µg/m3.
(b) Collocated sampler results are used to assess measurement system
precision. A collocated sampler pair consists of a primary sampler (used
for routine monitoring) and a duplicate sampler (used as a quality
control check). Quarterly precision estimates are
calculated by EPA for each pair of collocated samplers and for each
method designation employed by each reporting organization. Annual
precision estimates are calculated by EPA for each primary sampler, for
each EPA Federal reference method and equivalent method designation
employed by each reporting organization, and nationally for each EPA
Federal reference method and equivalent method designation.
    5.5.2.1 Percent Difference for a Single Check (di). The
percentage difference, di, for each check is calculated by
EPA using equation 19, where Xi represents the concentration
produced from the primary sampler and Yi represents the concentration
reported for the duplicate sampler, as follows:

                               Equation 19
        di = [(Yi - Xi) / ((Xi + Yi) / 2)] x 100

    5.5.2.2 Coefficient of Variation (CV) for a Single Check
(CVi). The coefficient of variation, CVi, for each
check is calculated by EPA by dividing the absolute value of the
percentage difference, di, by the square root of two as shown
in equation 20, as follows:

                               Equation 20
        CVi = |di| / sqrt(2)

    5.5.2.3 Precision of a Single Sampler - Quarterly Basis
(CVj,q).
    (a) For particulate sampler j, the individual coefficients of
variation (CVi) during the quarter are pooled using equation 21, where
nj,q is the number of pairs of measurements from collocated samplers
during the quarter, as follows:

                               Equation 21
        CVj,q = sqrt[ (1/nj,q) SUM(i=1 to nj,q) CVi^2 ]

    (b) The 90 percent confidence limits for the single sampler's CV are
calculated by EPA using equations 22 and 23, where X^2(0.05,df) and
X^2(0.95,df) are the 0.05 and 0.95 quantiles of the chi-square (X^2)
distribution with nj,q degrees of freedom, as follows:

                               Equation 22
        Upper 90 Percent Confidence Limit = CVj,q x sqrt[ nj,q / X^2(0.05,df) ]

                               Equation 23
        Lower 90 Percent Confidence Limit = CVj,q x sqrt[ nj,q / X^2(0.95,df) ]
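
    As an illustration only (not part of this appendix; the confidence
limit form follows the reconstruction of equations 22 and 23 above, and
scipy is assumed to be available for the chi-square quantiles), the
single-sampler precision calculation can be sketched in Python:

import math
from scipy.stats import chi2

def pooled_cv(percent_diffs):
    """Pool single-check coefficients of variation for one sampler.
    percent_diffs are the signed percent differences di between the
    duplicate and primary samplers; CVi = |di| / sqrt(2) (equation 20)
    and the CVs are pooled as a root mean square (equation 21)."""
    cvs = [abs(d) / math.sqrt(2.0) for d in percent_diffs]
    return math.sqrt(sum(cv ** 2 for cv in cvs) / len(cvs))

def cv_confidence_limits(cv, df):
    """90 percent confidence limits for a pooled CV, using the 0.05
    and 0.95 chi-square quantiles with df degrees of freedom."""
    lower = cv * math.sqrt(df / chi2.ppf(0.95, df))
    upper = cv * math.sqrt(df / chi2.ppf(0.05, df))
    return lower, upper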

    5.5.2.4 Precision of a Single Sampler - Annual Basis. For
particulate sampler j, the individual coefficients of variation,
CVi, produced during the calendar year are pooled using
equation 21, where nj is the number of checks made during the
calendar year. The 90 percent confidence limits for the single sampler's
CV are calculated by EPA using equations 22 and 23, where X2
0.05,df and X2 0.95,df are the 0.05 and
0.95 quantiles of the chi-square (X2) distribution with
nj degrees of freedom.
    5.5.2.5 Precision for Each EPA Federal Reference Method and
Equivalent Method Designation Employed by Each Reporting Organization -
Quarterly Basis (CVk,q).
    (a) For each method designation k used by the reporting
organization, the quarter's single sampler coefficients of variation,
CVj,qs, obtained from equation 21, are pooled using equation
24, where nk,q is the number of collocated primary monitors
for the designated method (but not collocated with FRM samplers) and
nj,q is the number of degrees of freedom associated with
CVj,q, as follows:

                               Equation 24
[GRAPHIC] [TIFF OMITTED] TR18JY97.163

    (b) The number of method CVs produced for a reporting organization
will equal the number of different method designations having more than
one primary monitor employed by the organization during the quarter.
(When exactly one monitor of a specified designation is used by a
reporting organization, it will be collocated with an FRM sampler.)
    5.5.2.6 Precision for Each Method Designation Employed by Each
Reporting Organization - Annual Basis (CVk). For each method

[[Page 231]]

designation k used by the reporting organization, the quarterly
estimated coefficients of variation, CVk,q, are pooled using
equation 25, where nk,q is the number of collocated primary
monitors for the designated method during the qth quarter and also the
number of degrees of freedom associated with the quarter's precision
estimate for the method designation, CVk,q, as follows:

                               Equation 25
[GRAPHIC] [TIFF OMITTED] TR18JY97.164
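
    The sketch below illustrates the pooling described in sections 5.5.2.5
and 5.5.2.6. Equations 24 and 25 are omitted graphics; the
degrees-of-freedom-weighted root mean square used here is a common pooling
convention and is offered only as an assumption, not as the codified formula.

import math

def pool_cvs(cv_values, dof_values):
    # Assumed pooling: weight each CV estimate by its degrees of freedom.
    weighted = sum(df * cv * cv for cv, df in zip(cv_values, dof_values))
    return math.sqrt(weighted / sum(dof_values))

# Hypothetical quarterly CVs for three collocated samplers of one method
# designation, each with its associated degrees of freedom.
print(pool_cvs([4.5, 5.2, 3.9], [6, 5, 6]))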

    5.5.3 Collocated Samplers, Where the Duplicate Sampler is an FRM
Device. At low concentrations, agreement between the measurements of
collocated samplers may be relatively poor. For this reason, collocated
measurement pairs are selected for use in the precision calculations
only when both measurements are above the following limits:
PM2.5: 6 µg/m3. These duplicate sampler
results are used to assess measurement system bias. Quarterly bias
estimates are calculated by EPA for each primary sampler and for each
method designation employed by each reporting organization. Annual bias
estimates are calculated by EPA for each primary monitor, for
each method designation employed by each reporting organization, and
nationally for each method designation.
    5.5.3.1 Accuracy for a Single Check (d'i). The percentage
difference, d'i, for each check is calculated by EPA using
equation 26, where Xi represents the concentration produced
from the FRM sampler, taken as the true value, and Yi
represents the concentration reported for the primary sampler, as follows:

                               Equation 26
[GRAPHIC] [TIFF OMITTED] TR18JY97.165

    5.5.3.2 Bias of a Single Sampler - Quarterly Basis
(D'j,q).
    (a) For particulate sampler j, the average of the individual
percentage differences during the quarter q is calculated by EPA using
equation 27, where nj,q is the number of checks made for
sampler j during the calendar quarter, as follows:

                               Equation 27
[GRAPHIC] [TIFF OMITTED] TR18JY97.166

    (b) The standard error, s'j,q, of sampler j's percentage
differences for quarter q is calculated using equation 28, as follows:

                               Equation 28
[GRAPHIC] [TIFF OMITTED] TR17FE98.007

    (c) The 95 Percent Confidence Limits for the single sampler's bias
are calculated using equations 29 and 30 where t0.975,df is
the 0.975 quantile of Student's t distribution with df =
nj,q-1 degrees of freedom, as follows:

                               Equation 29
[GRAPHIC] [TIFF OMITTED] TR18JY97.168

                               Equation 30
[GRAPHIC] [TIFF OMITTED] TR18JY97.169
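
    The following sketch illustrates the quarterly bias calculation of
section 5.5.3.2. Equations 27 through 30 are omitted graphics; the mean and
the Student's t quantile follow the text, while the standard error is taken
here as the sample standard deviation divided by the square root of the
number of checks, which is an assumption.

import statistics
from scipy.stats import t

def quarterly_bias(d_values):
    # Mean percent difference (equation 27), an assumed s/sqrt(n) standard
    # error (equation 28), and t-based 95 percent limits (equations 29, 30).
    n = len(d_values)
    mean_bias = statistics.mean(d_values)
    std_err = statistics.stdev(d_values) / n ** 0.5
    t_crit = t.ppf(0.975, df=n - 1)
    return mean_bias, mean_bias - t_crit * std_err, mean_bias + t_crit * std_err

# Hypothetical percent differences d'_i for one sampler in one quarter.
print(quarterly_bias([2.1, -0.5, 1.8, 3.0, 0.7, 1.2]))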

    5.5.3.3 Bias of a Single Sampler - Annual Basis (D'j).
    (a) For particulate sampler j, the mean bias for the year is derived
from the quarterly bias estimates, D'j,q, using equation 31,
where the variables are as defined for equations 27 and 28, as follows:

                               Equation 31
[GRAPHIC] [TIFF OMITTED] TR18JY97.170

    (b) The standard error of the above estimate, se'j, is
calculated using equation 32, as follows:

[[Page 232]]

                               Equation 32
[GRAPHIC] [TIFF OMITTED] TR18JY97.171

    (c) The 95 Percent Confidence Limits for the single sampler's bias
are calculated using equations 33 and 34, where t0.975,df is
the 0.975 quantile of Student's t distribution with df =
(nj,1 + nj,2 + nj,3 +
nj,4-4) degrees of freedom, as follows:

                               Equation 33
[GRAPHIC] [TIFF OMITTED] TR18JY97.172

                               Equation 34
[GRAPHIC] [TIFF OMITTED] TR18JY97.173
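
    The sketch below illustrates one reading of the annual bias calculation
in section 5.5.3.3. Equations 31 through 34 are omitted graphics; the annual
mean is shown as a check-count-weighted average of the quarterly estimates,
the standard error se'j from equation 32 is treated as a precomputed input,
and the degrees of freedom follow the (nj,1 + nj,2 + nj,3 + nj,4 - 4) form
stated in the text. All of these treatments are assumptions.

from scipy.stats import t

def annual_bias(quarterly_bias, checks_per_quarter, se_j):
    # Assumed weighted mean of the quarterly biases and t-based 95 percent
    # limits with df = total checks minus 4, as stated in the text.
    n_total = sum(checks_per_quarter)
    d_j = sum(d * n for d, n in zip(quarterly_bias, checks_per_quarter)) / n_total
    t_crit = t.ppf(0.975, df=n_total - 4)
    return d_j, d_j - t_crit * se_j, d_j + t_crit * se_j

# Hypothetical quarterly bias estimates, check counts, and standard error.
print(annual_bias([1.5, 2.0, -0.4, 1.1], [6, 6, 5, 6], se_j=0.8))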

    5.5.3.4 Bias for a Single Reporting Organization (D') - Annual
Basis. The reporting organization's mean bias is calculated using
equation 35, where variables are as defined in equations 31 and 32, as
follows:

                               Equation 35
[GRAPHIC] [TIFF OMITTED] TR18JY97.174

    5.5.4 FRM Audits. FRM Audits are performed once per quarter for
selected samplers. The reporting organization reports concentration data
from the primary sampler. Calculations for FRM Audits are similar to
those for collocated samplers having FRM samplers as duplicates. The
calculations differ because only one check is performed per quarter.
    5.5.4.1 Accuracy for a Single Sampler, Quarterly Basis
(di). The percentage difference, di, for each
check is calculated using equation 26, where Xi represents
the concentration produced from the FRM sampler and Yi
represents the concentration reported for the primary sampler. For
quarter q, the bias estimate for sampler j is denoted Dj,q.
    5.5.4.2 Bias of a Single Sampler - Annual Basis (D'j).
For particulate sampler j, the mean bias for the year is derived from
the quarterly bias estimates, Dj,q, using equation 31, where
nj,q equals 1 because one FRM audit is performed per quarter.
    5.5.4.3 Bias for a Single Reporting Organization - Annual Basis
(D'). The reporting organization's mean bias is calculated using equation
35, where variables are as defined in equations 31 and 32.

                   References in Appendix A of Part 58

    (1) Rhodes, R.C. Guideline on the Meaning and Use of Precision and
Accuracy Data Required by 40 CFR part 58, Appendices A and B. EPA-600/4-
83/023. U.S. Environmental Protection Agency, Research Triangle Park, NC
27711, June, 1983.
    (2) American National Standard--Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental
Technology Programs. ANSI/ASQC E4-1994. January 1995. Available from
American Society for Quality Control, 611 East Wisconsin Avenue,
Milwaukee, WI 53202.
    (3) EPA Requirements for Quality Management Plans. EPA QA/R-2.
August 1994. Available from U.S. Environmental Protection Agency, ORD
Publications Office, Center for Environmental Research Information
(CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268.
    (4) EPA Requirements for Quality Assurance Project Plans for
Environmental Data Operations. EPA QA/R-5. August 1994. Available from
U.S. Environmental Protection Agency, ORD Publications Office, Center
for Environmental Research Information (CERI), 26 W. Martin Luther King
Drive, Cincinnati, OH 45268.
    (5) Guidance for the Data Quality Objectives Process. EPA QA/G-4.
September 1994. Available from U.S. Environmental Protection Agency, ORD
Publications Office, Center for Environmental Research Information
(CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268.
    (6) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume 1--A Field Guide to Environmental Quality Assurance.
EPA-600/R-94/038a. April 1994. Available from U.S. Environmental
Protection Agency, ORD Publications Office, Center for Environmental
Research Information (CERI), 26 W. Martin Luther King Drive, Cincinnati,
OH 45268.
    (7) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II--Ambient Air Specific Methods EPA-600/R-94/038b.
Available from U.S. Environmental Protection Agency, ORD Publications
Office, Center for Environmental Research Information (CERI), 26 W.
Martin Luther King Drive, Cincinnati, OH 45268.
    (7a) Copies of section 2.12 of the Quality Assurance Handbook for
Air Pollution Measurement Systems, are available from Department E (MD-
77B), U.S. EPA, Research Triangle Park, NC 27711.
    (8) List of Designated Reference and Equivalent Methods. Available
from U.S. Environmental Protection Agency, National Exposure Research
Laboratory, Quality Assurance Branch, MD-77B, Research Triangle Park, NC
27711.
    (9) Technical Assistance Document for Sampling and Analysis of Ozone
Precursors.

[[Page 233]]

Atmospheric Research and Exposure Assessment Laboratory, U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711. EPA
600/8-91-215. October 1991.
    (10) EPA Traceability Protocol for Assay and Certification of
Gaseous Calibration Standards. EPA-600/R-93/224. September 1993.
Available from U.S. Environmental Protection Agency, ORD Publications
Office, Center for Environmental Research Information (CERI), 26 W.
Martin Luther King Drive, Cincinnati, OH 45268.
    (11) Paur, R.J. and F.F. McElroy. Technical Assistance Document for
the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057. U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711,
September, 1979.
    (12) McElroy, F.F. Transfer Standards for the Calibration of Ambient
Air Monitoring Analyzers for Ozone. EPA-600/4-79-056. U.S. Environmental
Protection Agency, Research Triangle Park, NC 27711, September, 1979.
    (13) Musick, D.R. The Ambient Air Precision and Accuracy Program:
1995 Annual Report. EPA-454/R97001. U.S. Environmental Protection
Agency, Research Triangle Park, NC 27711, February 1997.
    (14) Papp, M.L., J.B. Elkins, D.R. Musick, and M.J. Messner. Data
Quality Objectives for the PM2.5 Monitoring Data. U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711. In
preparation.
    (15) Photochemical Assessment Monitoring Stations Implementation
Manual. EPA-454/B-93-051, U.S. Environmental Protection Agency, Research
Triangle Park, NC 27711, March 1994.

                          Table A-1 to Appendix A--Minimum Data Assessment Requirements
----------------------------------------------------------------------------------------------------------------
        Method           Assessment Method           Coverage          Minimum Frequency     Parameters Reported
----------------------------------------------------------------------------------------------------------------
Precision:
    Automated Methods  Response check at      Each analyzer          Once per 2 weeks       Actual concentration
     for SO2, NO2,      concentration                                                        \2\ and measured
     O3, and CO         between .08 and .10                                                  concentration \3\
                        ppm (8 & 10 ppm for
                        CO) \2\

    Manual Methods:    Collocated samplers    1 site for 1-5 sites   Once every six days    Particle mass
     All methods                              2 sites for 6-20                               concentration
     except PM2.5                              sites                                         indicated by
                                              3 sites >20 sites                              sampler and by
                                               (sites with highest                           collocated sampler
                                               conc.)
Accuracy:
    Automated Methods  Response check at      1. Each analyzer       1. Once per year       Actual concentration
     for SO2, NO2,     .03-.08 ppm \1\ \2\    2. 25% of analyzers    2. Each calendar        \2\ and measured
     O3, and CO        .15-.20 ppm \1\ \2\     (at least 1)           quarter                (indicated)
                       .35-.45 ppm \1\ \2\                                                   concentration \3\
                       .80-.90 ppm \1\ \2\ (if                                               for each level
                        applicable)

    Manual Methods     Check of analytical    Analytical system      Each day samples are   Actual concentration
     for SO2, and NO2   procedure with audit                          analyzed, at least     and measured
                        standard solutions                            twice per quarter      (indicated)
                                                                                             concentration for
                                                                                             each audit solution

    TSP, PM10          Check of sampler flow  1. Each sampler        1. Once per year       Actual flow rate and
                        rate                  2. 25% of samplers     2. Each calendar        flow rate indicated
                                               (at least 1)           quarter                by the sampler

    Lead               1. Check of sample     1. Each sampler        1. Include with TSP    1. Same as for TSP
                        flow rate as for TSP
                       2. Check of            2. Analytical system   2. Each quarter        2. Actual
                        analytical system                                                    concentration and
                        with Pb audit strips                                                 measured
                                                                                             (indicated)
                                                                                             concentration of
                                                                                             audit samples
                                                                                             (µg Pb/
                                                                                             strip)
PM2.5
    Manual and         Collocated samplers    25% of SLAMS           Once every six days    1. Particle mass
     Automated                                 (monitors with Conc                           concentration
     Methods-                                  affecting NAAQS                               indicated by
     Precision                                 violation status)                             sampler and by
                                                                                             collocated sampler
                                                                                            2. 24-hour value for
                                                                                             automated methods
    Manual and         1. Check of sampler    Every SLAMS monitor    1. Automated--once     1. Actual flow rate
     Automated          flow rate                                     every 2 weeks;         and flow rate
     Methods-Accuracy                                                 Manual--each           indicated by
     and Bias                                                         calendar quarter (4/   sampler
                                                                      year)

[[Page 234]]

                       2. Audit with          .....................  2. Minimum 4           2. Particle mass
                        reference method                              measurements per       concentration
                                                                      year                   indicated by
                                                                                             sampler and by
                                                                                             audit reference
                                                                                             sampler
----------------------------------------------------------------------------------------------------------------
\1\ Concentration times 100 for CO.
\2\ Effective concentration for open path analyzers.
\3\ Corrected concentration, if applicable, for open path analyzers.

Table A-2 to Appendix A--Summary of PM2.5 Collocation and Audits Procedures As an Example of a Typical Reporting
         Organization Needing 43 Monitors, Having Procured FRMs and Three Other Equivalent Method Types
----------------------------------------------------------------------------------------------------------------
                                                                             No. of Collocated
  Method Designation     Total No. of        Total No.       No. of Collocated  Monitors of Same   No. of Independent
                           Monitors          Collocated            FRMs              Type              FRM Audits
----------------------------------------------------------------------------------------------------------------
FRM                            25                 6                 6                n/a                6
Type A                         10                 3                 2                 1                 3
Type C                          2                 1                 1                 0                 1
Type D                          6                 2                 1                 1                 2
----------------------------------------------------------------------------------------------------------------

[62 FR 38833, July 18, 1997; 63 FR 7714, 7715, Feb. 17, 1998]

Appendix B to Part 58--Quality Assurance Requirements for Prevention of
             Significant Deterioration (PSD) Air Monitoring

1. General Information
    This appendix specifies the minimum quality assurance requirements
for the control and assessment of the quality of the PSD ambient air
monitoring data submitted to EPA by an organization operating a network
of PSD stations. Such organizations are encouraged to develop and
maintain quality assurance programs more extensive than the required
minimum.
    Quality assurance of air monitoring systems includes two distinct
and important interrelated functions. One function is the control of the
measurement process through the implementation of policies, procedures,
and corrective actions. The other function is the assessment of the
quality of the monitoring data (the product of the measurement process).
In general, the greater the effort and effectiveness of the control of a
given monitoring system, the better will be the resulting quality of the
monitoring data. The results of data quality assessments indicate
whether the control efforts need to be increased.
    Documentation of the quality assessments of the monitoring data is
important to data users, who can then consider the impact of the data
quality in specific applications (see Reference 1). Accordingly,
assessments of PSD monitoring data quality are required to be made and
reported periodically by the monitoring organization.
    To provide national uniformity in the assessment and reporting of
data quality among all PSD networks, specific assessment and reporting
procedures are prescribed in detail in sections 3, 4, 5, and 6 of this
appendix.
    In contrast, the control function encompasses a variety of policies,
procedures, specifications, standards, and corrective measures which
affect the quality of the resulting data. The selection and extent of
the quality control activities--as well as additional quality assessment
activities--used by a monitoring organization depend on a number of
local factors such as the field and laboratory conditions, the
objectives of the monitoring, the level of the data quality needed, the
expertise of assigned personnel, the cost of control procedures,
pollutant concentration levels, etc. Therefore, the quality assurance
requirements, in section 2 of this appendix, are specified in general
terms to allow each organization to develop a quality control system
that is most efficient and effective for its own circumstances.
    For purposes of this appendix, ``organization'' is defined as a
source owner/operator, a government agency, or their contractor that
operates an ambient air pollution monitoring network for PSD purposes.

2. Quality Assurance Requirements

    2.1 Each organization must develop and implement a quality assurance
program consisting of policies, procedures, specifications, standards
and documentation necessary to:
    (1) Provide data of adequate quality to meet monitoring objectives
and quality assurance requirements of the permit-granting authority, and

[[Page 235]]

    (2) Minimize loss of air quality data due to malfunctions or out-of-
control conditions.
    This quality assurance program must be described in detail, suitably
documented, and approved by the permit-granting authority. The Quality
Assurance Program will be reviewed during the system audits described in
section 2.4.
    2.2 Primary guidance for developing the Quality Assurance Program is
contained in References 2 and 3, which also contain many suggested
procedures, checks, and control specifications. Section 2.0.9 of
Reference 3 describes specific guidance for the development of a Quality
Assurance Program for automated analyzers. Many specific quality control
checks and specifications for manual methods are included in the
respective reference methods described in part 50 of this chapter or in
the respective equivalent method descriptions available from EPA (see
Reference 4). Similarly, quality control procedures related to
specifically designated reference and equivalent analyzers are contained
in their respective operation and instruction manuals. This guidance,
and any other pertinent information from appropriate sources, should be
used by the organization in developing its quality assurance program.
    As a minimum, each quality assurance program must include
operational procedures for each of the following activities:
    (1) Selection of methods, analyzers, or samplers;
    (2) Training;
    (3) Installation of equipment;
    (4) Selection and control of calibration standards;
    (5) Calibration;
    (6) Zero/span checks and adjustments of automated analyzers;
    (7) Control checks and their frequency;
    (8) Control limits for zero, span and other control checks, and
respective corrective actions when such limits are surpassed;
    (9) Calibration and zero/span checks for multiple range analyzers
(see section 2.6 of appendix C of this part);
    (10) Preventive and remedial maintenance;
    (11) Recording and validating data;
    (12) Data quality assessment (precision and accuracy);
    (13) Documentation of quality control information.
    2.3 Pollutant Standards.
    2.3.1 Gaseous standards (permeation tubes, permeation devices or
cylinders of compressed gas) used to obtain test concentrations for CO,
SO2, and NO2 must be traceable to either a
National Institute of Standards and Technology (NIST) gaseous Standard
Reference Material (SRM) or an NIST/EPA-approved commercially available
Certified Reference Material (CRM). CRM's are described in Reference 5,
and a list of CRM sources is available from Quality Assurance Division
(MD-77), Atmospheric Research and Exposure Assessment Laboratory, U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711. A
recommended protocol for certifying gaseous standards against an SRM or
CRM is given in section 2.0.7 of Reference 3. Direct use of a CRM as a
working standard is acceptable, but direct use of an NIST SRM as a
working standard is discouraged because of the limited supply and
expense of SRM's.
    2.3.2 Test concentrations for ozone must be obtained in accordance
with the UV photometric calibration procedure specified in appendix D of
part 50 of this chapter, or by means of a certified ozone transfer
standard. Consult References 6 and 7 for guidance on primary and
transfer standards for ozone.
    2.3.3. Flow measurement must be made by a flow measuring instrument
that is traceable to an authoritative volume or other standard. Guidance
for certifying various types of flowmeters is provided in Reference 3.
    2.4 Performance and System Audit Programs. The organization
operating a PSD monitoring network must participate in EPA's national
performance audit program. The permit granting authority, or EPA, may
conduct system audits of the ambient air monitoring programs of
organizations operating PSD networks. See section 1.4.16 of reference 2
and section 2.0.11 of reference 3 for additional information about these
programs. Organizations should contact either the appropriate EPA
Regional Quality Control Coordinator or the Quality Assurance Branch,
AREAL/RTP, at the address given in reference 3 for instructions for
participation.

3. Data Quality Assessment Requirements

    All ambient monitoring methods or analyzers used in PSD monitoring
shall be tested periodically, as described in this section 3, to
quantitatively assess the quality of the data being routinely collected.
The results of these tests shall be reported as specified in section 6.
Concentration standards used for the tests must be as specified in
section 2.3. Additional information and guidance in the technical
aspects of conducting these tests may be found in Reference 3 or in the
operation or instruction manual associated with the analyzer or sampler.
Concentration measurements reported from analyzers or analytical systems
must be derived by means of the same calibration curve and data
processing system used to obtain the routine air monitoring data. Table
B-1 provides a summary of the minimum data quality assessment
requirements, which are described in more detail in the following
sections.
    3.1 Precision of Automated Methods. A one-point precision check must
be carried out at least once every two weeks on each automated analyzer
used to measure SO2, NO2, O3, and CO.
The precision check is made

[[Page 236]]

by challenging the analyzer with a precision check gas of known
concentration (effective concentration for open path analyzers) between
0.08 and 0.10 ppm for SO2, NO2, and O3
analyzers, and between 8 and 10 ppm for CO analyzers. The standards from
which precision check test concentrations are obtained must meet the
specifications of section 2.3. Except for certain CO analyzers described
below, point analyzers must operate in their normal sampling mode during
the precision check, and the test atmosphere must pass through all
filters, scrubbers, conditioners and other components used during normal
ambient sampling and as much of the ambient air inlet system as is
practicable. If permitted by the associated operation or instruction
manual, a CO point analyzer may be temporarily modified during the
precision check to reduce vent or purge flows, or the test atmosphere
may enter the analyzer at a point other than the normal sample inlet,
provided that the analyzer's response is not likely to be altered by
these deviations from the normal operational mode.
    Open path analyzers are tested by inserting a test cell containing a
precision check gas concentration into the optical measurement beam of
the instrument. If possible, the normally used transmitter, receiver,
and, as appropriate, reflecting devices should be used during the test,
and the normal monitoring configuration of the instrument should be
altered as little as possible to accommodate the test cell for the test.
However, if permitted by the associated operation or instruction manual,
an alternate local light source or an alternate optical path that does
not include the normal atmospheric monitoring path may be used. The
actual concentration of the precision check gas in the test cell must be
selected to produce an ``effective concentration'' in the range
specified above. Generally, the precision test concentration measurement
will be the sum of the atmospheric pollutant concentration and the
precision test concentration. If so, the result must be corrected to
remove the atmospheric concentration contribution. The ``corrected
concentration'' is obtained by subtracting the average of the
atmospheric concentrations measured by the open path instrument under
test immediately before and immediately after the precision check test
from the precision test concentration measurement. If the difference
between these before and after measurements is greater than 20 percent
of the effective concentration of the test gas, discard the test result
and repeat the test. If possible, open path analyzers should be tested
during periods when the atmospheric pollutant concentrations are
relatively low and steady.
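    The arithmetic for the open path correction described above is simple
enough to summarize in a short sketch; the function name and values are
illustrative only and follow the text of this section rather than any
codified formula.

def corrected_concentration(measured, ambient_before, ambient_after,
                            effective_conc):
    # Subtract the average of the before and after ambient readings from the
    # precision check measurement, and flag the check for repetition if the
    # ambient level drifted by more than 20 percent of the effective
    # concentration of the test gas.
    ambient_avg = (ambient_before + ambient_after) / 2.0
    drift_ok = abs(ambient_after - ambient_before) <= 0.20 * effective_conc
    return measured - ambient_avg, drift_ok

# Hypothetical open path SO2 precision check, concentrations in ppm.
print(corrected_concentration(0.145, 0.050, 0.054, effective_conc=0.09))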
    If a precision check is made in conjunction with a zero or span
adjustment, it must be made prior to such zero or span adjustment. The
difference between the actual concentration (effective concentration for
open path analyzers) of the precision check gas and the corresponding
concentration measurement (corrected concentration, if applicable, for
open path analyzers) indicated by the analyzer is used to assess the
precision of the monitoring data as described in section 4.1. Report
data only from automated analyzers that are approved for use in the PSD
network.
    3.2 Accuracy of Automated Methods. Each sampling quarter, audit each
analyzer that monitors for SO2, NO2,
O3, or CO at least once. The audit is made by challenging the
analyzer with at least one audit gas of known concentration (effective
concentration for open path analyzers) from each of the following ranges
that fall within the measurement range of the analyzer being audited:

------------------------------------------------------------------------
                                     Concentration range, ppm
            Audit level             --------------------------     CO
                                       SO2, O3,       NO2,
------------------------------------------------------------------------
1..................................    0.03-0.08    0.03-0.08     3-8
2..................................    0.15-0.20    0.15-0.20   15-20
3..................................    0.36-0.45    0.35-0.45   35-45
4..................................    0.80-0.90  ...........   80-90
------------------------------------------------------------------------

NO2 audit gas for chemiluminescence-type NO2
analyzers must also contain at least 0.08 ppm NO.
    Note: NO concentrations substantially higher than 0.08 ppm, as may
occur when using some gas phase titration (GPT) techniques, may lead to
audit errors in chemiluminescence analyzers due to inevitable minor NO-
NOX channel imbalance. Such errors may be atypical of routine
monitoring errors to the extent that such NO concentrations exceed
typical ambient NO concentrations. These errors may be minimized by
modifying the GPT technique to lower the NO concentrations remaining in
the NO2 audit gas to levels closer to typical ambient NO
concentrations at the site.
    The standards from which audit gas test concentrations are obtained
must meet the specifications of section 2.3. Working and transfer
standards and equipment used for auditing must be different from the
standards and equipment used for calibration and spanning. The auditing
standards and calibration standards may be referenced to the same NIST,
SRM, CRM, or primary UV photometer. The auditor must not be the
operator/analyst who conducts the routine monitoring, calibration and
analysis.
    For point analyzers, the audit shall be carried out by allowing the
analyzer to analyze the audit test atmosphere in the same manner as
described for precision checks in section 3.1. The exception given in
section 3.1 for certain CO analyzers does not apply for audits.

[[Page 237]]

    Open path analyzers are audited by inserting a test cell containing
an audit gas concentration into the optical measurement beam of the
instrument. If possible, the normally used transmitter, receiver, and,
as appropriate, reflecting devices should be used during the audit, and
the normal monitoring configuration of the instrument should be modified
as little as possible to accommodate the test cell for the audit.
However, if permitted by the associated operation or instruction manual,
an alternate local light source or an alternate optical path that does
not include the normal atmospheric monitoring path may be used. The
actual concentrations of the audit gas in the test cell must be selected
to produce ``effective concentrations'' in the range specified in this
section 3.2. Generally, each audit concentration measurement result will
be the sum of the atmospheric pollutant concentration and the audit test
concentration. If so, the result must be corrected to remove the
atmospheric concentration contribution. The ``corrected concentration''
is obtained by subtracting the average of the atmospheric concentrations
measured by the open path instrument under test immediately before and
immediately after the audit test (or preferably before and after each
audit concentration level) from the audit concentration measurement. If
the difference between these before and after measurements is greater
than 20 percent of the effective concentration of the test gas
standards, discard the test result for that concentration level and
repeat the test for that level. If possible, open path analyzers should
be audited during periods when the atmospheric pollutant concentrations
are relatively low and steady. Also, the monitoring path length must be
reverified to within ±3 percent to validate the audit, since
the monitoring path length is critical to the determination of the
effective concentration.
    The differences between the actual concentrations (effective
concentrations for open path analyzers) of the audit test gas and the
corresponding concentration measurements (corrected concentrations, if
applicable, for open path analyzers) indicated by the analyzer are used
to assess the accuracy of the monitoring data as described in section
4.2. Report data only from automated analyzers that are approved for use
in the PSD network.
    3.3 Precision of Manual Methods.
    3.3.1 TSP and PM10 Methods. For a given organization's
monitoring network, one sampling site must have collocated samplers. A
site with the highest expected 24-hour pollutant concentration must be
selected. The two samplers must be within 4 meters of each other but at
least 2 meters apart to preclude airflow interference. Calibration,
sampling and analysis must be the same for both collocated samplers as
well as for all other samplers in the network. The collocated samplers
must be operated as a minimum every third day when continuous sampling
is used. When a less frequent sample schedule is used, the collocated
samplers must be operated at least once each week. For each pair of
collocated samplers, designate one sampler as the sampler which will be
used to report air quality for the site and designate the other as the
duplicate sampler. The differences in measured concentration
(µg/m3) between the two collocated samplers are used to
calculate precision as described in section 5.1.
    3.3.2 Pb Method. The operation of collocated samplers at one
sampling site must be used to assess the precision of the reference or
an equivalent Pb method. The procedure to be followed for Pb methods is
the same as described in 3.3.1 for the TSP method. If approved by the
permit granting authority, the collocated TSP samplers may serve as the
collocated lead samplers.
    3.4 Accuracy of Manual Methods.
    3.4.1 TSP and PM10 Methods. Each sampling quarter, audit
the flow rate of each sampler at least once. Audit the flow at the
normal flow rate, using a certified flow transfer standard (see
reference 2). The flow transfer standard used for the audit must not be
the same one used to calibrate the flow of the sampler being audited,
although both transfer standards may be referenced to the same primary
flow or volume standard. The difference between the audit flow
measurement and the flow indicated by the sampler's flow indicator is
used to calculate accuracy, as described in paragraph 5.2.
    Great care must be used in auditing high-volume samplers having flow
regulators because the introduction of resistance plates in the audit
device can cause abnormal flow patterns at the point of flow sensing.
For this reason, the orifice of the flow audit device should be used
with a normal glass fiber filter in place and without resistance plates
in auditing flow regulated high-volume samplers, or other steps should
be taken to assure that flow patterns are not perturbed at the point of
flow sensing.
    3.4.2 Pb Method. For the reference method (appendix G of part 50 of
this chapter) during each sampling quarter audit the flow rate of each
high-volume Pb sampler at least once. The procedure to be followed for
lead methods is the same as described in section 3.4.1 for the TSP
method.
    For each sampling quarter, audit the Pb analysis using glass fiber
filter strips containing a known quantity of lead. Audit samples are
prepared by depositing a Pb solution on 1.9 cm by 20.3 cm (\3/4\ inch by
8 inch) unexposed glass fiber filter strips and allowing them to dry
thoroughly. The audit samples must be prepared using reagents different
from those

[[Page 238]]

used to calibrate the Pb analytical equipment being audited. Prepare
audit samples in the following concentration ranges:

------------------------------------------------------------------------
                                                      Equivalent ambient
                                  Pb concentration,   Pb concentration \1\,
             Ranges                  µg Pb/strip           µg/m\3\
------------------------------------------------------------------------
1..............................  100 to 300........  0.5 to 1.5.
2..............................  600 to 1,000......  3.0 to 5.0.
------------------------------------------------------------------------
\1\ Equivalent ambient Pb concentration in µg/m\3\ is based on
  sampling at 1.7 m\3\/min for 24 hours on a 20.3 cm x 25.4 cm (8 inch
  x 10 inch) glass fiber filter.

    Audit samples must be extracted using the same extraction procedure
used for exposed filters.
    Analyze at least one audit sample in each of the two ranges each day
that samples are analyzed. The difference between the audit
concentration (in µg Pb/strip) and the analyst's measured
concentration (in µg Pb/strip) is used to calculate accuracy as
described in section 5.4.
    The accuracy of an equivalent method is assessed in the same manner
as the reference method. The flow auditing device and Pb analysis audit
samples must be compatible with the specific requirements of the
equivalent method.

4. Calculations for Automated Methods

    4.1 Single Analyzer Precision. Each organization, at the end of each
sampling quarter, shall calculate and report a precision probability
interval for each analyzer. Directions for calculations are given below
and directions for reporting are given in section 6. If monitoring data
are invalidated during the period represented by a given precision
check, the results of that precision check shall be excluded from the
calculations. Calculate the percentage difference (di) for
each precision check using equation 1.
[GRAPHIC] [TIFF OMITTED] TC09NO91.023

where:

Yi = analyzer's indicated concentration from the i-th
precision check
Xi = known concentration of the test gas used for the i-th
precision check.

For each instrument, calculate the quarterly average (dj),
equation 2, and the standard deviation (Sj), equation 3.
[GRAPHIC] [TIFF OMITTED] TC09NO91.024

[GRAPHIC] [TIFF OMITTED] TC09NO91.025

where n is the number of precision checks on the instrument made during
the sampling quarter. For example, n should be 6 or 7 if span checks
are made biweekly during a quarter.
    Calculate the 95 percent probability limits for precision using
equations 4 and 5.

Upper 95 Percent Probability
Limit = dj+1.96 Sj
                                                               (4)
Lower 95 Percent Probability
Limit = dj-1.96 Sj
                                                               (5)
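
    A short Python sketch of this calculation follows. Equations 4 and 5 are
given in the text; equation 1 is an omitted graphic, so expressing the
percent difference relative to the known concentration Xi is an assumption.

import statistics

def precision_probability_limits(indicated, known):
    # Assumed equation 1: percent difference relative to the known value.
    d = [100.0 * (y - x) / x for y, x in zip(indicated, known)]
    d_mean = statistics.mean(d)            # equation 2
    s = statistics.stdev(d)                # equation 3
    return d_mean - 1.96 * s, d_mean + 1.96 * s   # equations 5 and 4

# Hypothetical biweekly precision checks for one analyzer over a quarter,
# indicated versus known test gas concentrations in ppm.
indicated = [0.093, 0.088, 0.091, 0.095, 0.089, 0.092]
known = [0.090] * 6
print(precision_probability_limits(indicated, known))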

    4.2 Single Analyzer Accuracy. Each organization, at the end of each
sampling quarter, shall calculate and report the percentage difference
for each audit concentration for each analyzer audited during the
quarter. Directions for calculations are given below (directions for
reporting are given in section 6).
    Calculate and report the percentage difference (di) for
each audit concentration using equation 1 where Yi is the
analyzer's indicated concentration from the i-th audit check and Xi
is the known concentration of the audit gas used for the i-th audit
check.

5. Calculations for Manual Methods

    5.1 Single Instrument Precision for TSP, Pb and PM10.
Estimates of precision for ambient air quality particulate measurements
are calculated from results obtained from collocated samplers as
described in section 3.3. At the end of each sampling quarter, calculate
and report a precision probability interval, using the weekly results from
the collocated samplers. Directions for calculations are given below,
and directions for reporting are given in section 6.
    For the paired measurements obtained as described in sections 3.3.1
and 3.3.2, calculate the percent difference (di) using
equation 1a, where Yi is the concentration of pollutant
measured by the duplicate sampler, and Xi is the
concentration measured by the sampler reporting air quality for the
site. Calculate the quarterly average percent difference
(dj), equation 2; standard deviation (Sj),
equation 3; and upper and lower 95 percent probability limits for
precision, equations 6 and 7.
[GRAPHIC] [TIFF OMITTED] TC09NO91.026

                                                                    (1a)

Upper 95 percent probability
limit = dj+1.96 Sj/√2
                                                                     (6)
Lower 95 percent probability
limit = dj-1.96 Sj/√2
                                                                     (7)
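
    The sketch below illustrates this collocated-sampler calculation. The
1.96 Sj/√2 limits follow equations 6 and 7 directly; equation 1a is an
omitted graphic, so taking the percent difference relative to the reporting
sampler's value Xi is an assumption.

import math
import statistics

def collocated_precision_limits(duplicate, reporting):
    # Assumed equation 1a: percent difference relative to the reporting
    # sampler's concentration; limits per equations 6 and 7.
    d = [100.0 * (y - x) / x for y, x in zip(duplicate, reporting)]
    d_mean = statistics.mean(d)
    half_width = 1.96 * statistics.stdev(d) / math.sqrt(2.0)
    return d_mean - half_width, d_mean + half_width

# Hypothetical quarterly collocated TSP results in micrograms per cubic meter.
print(collocated_precision_limits([41.0, 55.2, 38.7, 60.1],
                                  [40.2, 56.0, 39.5, 58.8]))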

    5.2 Single Instrument Accuracy for TSP and PM10. Each
organization, at the end of each sampling quarter, shall calculate and
report the percentage difference for each high-volume or PM10
sampler audited during

[[Page 239]]

the quarter. Directions for calculation are given below and directions
for reporting are given in section 6.
    For the flow rate audit described in section 3.4, let Xi
represent the known flow rate and Yi represent the indicated
flow rate. Calculate the percentage difference (di) using
equation 1.
    5.3 Single Instrument Accuracy for Pb. Each organization, at the end
of each sampling quarter, shall calculate and report the percentage
difference for each high-volume lead sampler audited during the quarter.
Directions for calculation are given in 5.2 and directions for reporting
are given in section 6.
    5.4 Single-Analysis-Day Accuracy for Pb. Each organization, at the
end of each sampling quarter, shall calculate and report the percentage
difference for each Pb analysis audit during the quarter. Directions for
calculations are given below and directions for reporting are given in
section 6.
    For each analysis audit for Pb described in section 3.4.2, let
Xi represent the known value of the audit sample and Yi
the indicated value of Pb. Calculate the percentage difference
(di) for each audit at each concentration level using
equation 1.

6. Organization Reporting Requirements.

    At the end of each sampling quarter, the organization must report
the following data assessment information:
    (1) For automated analyzers--precision probability limits from
section 4.1 and percentage differences from section 4.2, and
    (2) For manual methods--precision probability limits from section
5.1 and percentage differences from sections 5.2 and 5.3. The precision
and accuracy information for the entire sampling quarter must be
submitted with the air monitoring data. All data used to calculate
reported estimates of precision and accuracy including span checks,
collocated sampler and audit results must be made available to the
permit granting authority upon request.

                               Table B-1--Minimum PSD Data Assessment Requirements
----------------------------------------------------------------------------------------------------------------
                                                                                                  Parameters
             Method                Assessment method       Coverage            Frequency           reported
----------------------------------------------------------------------------------------------------------------
Precision:
  Automated Methods for SO2,      Response check at   Each analyzer.....  Once per 2 weeks..  Actual
   NO2, O3, and CO.                concentration                                               concentration \2\ &
                                   between .08 & .10                                           measured
                                   ppm (8 & 10 ppm                                             concentration \3\.
                                   for CO) \2\.
  TSP, PM10, Lead...............  Collocated          Highest             Once per week or    Two concentration
                                   samplers.           concentration       every 3rd day for   measurements.
                                                       site in             continuous
                                                       monitoring          sampling.
                                                       network.
Accuracy:
  Automated Methods for SO2,      Response check at:  Each analyzer.....  Once per sampling   Actual
   NO2, O3, and CO.                .03-.08 ppm;\1\ \2\                      quarter.            concentration \2\ &
                                   .15-.20 ppm;\1\ \2\                                          measured
                                   .35-.45 ppm;\1\ \2\                                          (indicated)
                                   .80-.90 ppm;\1\ \2\                                          concentration \3\
                                   (if applicable).                                            for each level.
  TSP, PM10.....................  Sampler flow check  Each sampler......  Once per sampling   Actual flow rate
                                                                           quarter.            and flow rate
                                                                                               indicated by the
                                                                                               sampler.
  Lead..........................  1. Sample flow      1. Each sampler...  1. Once/quarter...  1. Same as for
                                   rate check..       2. Analytical       2. Each quarter Pb   TSP.
                                  2. Check             system.             samples are        2. Actual
                                   analytical system                       analyzed.           concentration &
                                   with Pb audit                                               measured
                                   strips.                                                     concentration of
                                                                                               audit samples
                                                                                               (µg Pb/
                                                                                               strip).
----------------------------------------------------------------------------------------------------------------
\1\ Concentration shown times 100 for CO.
\2\ Effective concentration for open path analyzers.
\3\ Corrected concentration, if applicable, for open path analyzers.

References

    1. Rhodes, R.C. Guideline on the Meaning and Use of Precision and
Accuracy Data Required by 40 CFR part 58, appendices A and B. EPA-600/4-
83-023. U.S. Environmental Protection Agency, Research Triangle Park, NC
27711, June, 1983.
    2. ``Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume I--Principles.'' EPA-600/9-76-005. March 1976. Available
from U.S. Environmental Protection Agency, Atmospheric Research and
Exposure Assessment Laboratory (MD-77), Research Triangle Park, NC
27711.
    3. ``Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II--Ambient Air Specific Methods.'' EPA-600/4-77-027a.
May 1979. Available from U.S. Environmental Protection Agency,
Atmospheric Research and Exposure Assessment Laboratory (MD-77), Research
Triangle Park, NC 27711.

[[Page 240]]

    4. ``List of Designated Reference and Equivalent Methods.''
Available from U.S. Environmental Protection Agency, Department E (MD-
77), Research Triangle Park, NC 27711.
    5. Hughes, E.E. and J. Mandel. A Procedure for Establishing
Traceability of Gas Mixtures to Certain National Bureau of Standards
SRM's. EPA-600/7-81-010. U.S. Environmental Protection Agency, Research
Triangle Park, NC 27711, May, 1981. (Joint NBS/EPA Publication)
    6. Paur, R.J. and F.F. McElroy. Technical Assistance Document for
the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057. U.S.
Environmental Protection Agency, Atmospheric Research and Exposure
Assessment Laboratory (MD-77), Research Triangle Park, NC 27711,
September, 1979.
    7. McElroy, F.F. Transfer Standards for the Calibration of Ambient
Air Monitoring Analyzers for Ozone. EPA-600/4-79-056. U.S. Environmental
Protection Agency, Atmospheric Research and Exposure Assessment
Laboratory (MD-77), Research Triangle Park, NC 27711, September, 1979.

[44 FR 27571, May 10, 1979; 44 FR 65070, Nov. 9, 1979; 44 FR 72592, Dec.
14, 1979, as amended at 46 FR 44168, Sept. 3, 1981; 48 FR 2530, Jan. 20,
1983; 51 FR 9596, Mar. 19, 1986; 52 FR 24741, July 1, 1987; 59 FR 41628,
41629, Aug. 12, 1994; 60 FR 52321, Oct. 6, 1995]

    Appendix C to Part 58--Ambient Air Quality Monitoring Methodology

1.0 Purpose
    This appendix specifies the monitoring methods (manual methods or
automated analyzers) which must be used in State ambient air quality
monitoring stations.

2.0 State and Local Air Monitoring Stations (SLAMS)

    2.1 Except as otherwise provided in this appendix, a monitoring
method used in a SLAMS must be a reference or equivalent method as
defined in Sec. 50.1 of this chapter.
    2.2 Substitute PM10 samplers.
    2.2.1 For purposes of showing compliance with the NAAQS for
particulate matter, a high volume TSP sampler described in 40 CFR part
50, appendix B, may be used in a SLAMS in lieu of a PM10
monitor as long as the ambient concentrations of particles measured by
the TSP sampler are below the PM10 NAAQS. If the TSP sampler
measures a single value that is higher than the PM10 24-hour
standard, or if the annual average of its measurements is greater than
the PM10 annual standard, the TSP sampler operating as a
substitute PM10 sampler must be replaced with a
PM10 monitor. For a TSP measurement above the 24-hour
standard, the TSP sampler should be replaced with a PM10
monitor before the end of the calendar quarter following the quarter in
which the high concentration occurred. For a TSP annual average above
the annual standard, the PM10 monitor should be operating by
June 30 of the year following the exceedance.
    2.2.2 In order to maintain historical continuity of ambient
particulate matter trends and patterns for PM10 NAMS that
were previously TSP NAMS, the TSP high volume sampler must be operated
concurrently with the PM10 monitor for a one-year period
beginning with the PM10 NAMS start-up date. The operating
schedule for the TSP sampler must be at least once every 6 days
regardless of the PM10 sampling frequency.
    2.3 Any manual method or analyzer purchased prior to cancellation of
its reference or equivalent method designation under Sec. 53.11 or
Sec. 53.16 of this chapter may be used in a SLAMS following cancellation
for a reasonable period of time to be determined by the Administrator.
    2.4 Approval of non-designated PM2.5 methods operated at
specific individual sites. A method for PM2.5 that has not
been designated as a reference or equivalent method as defined in
Sec. 50.1 of this chapter may be approved for use for purposes of
section 2.1 of this appendix at a particular SLAMS under the following
stipulations.
    2.4.1 The method must be demonstrated to meet the comparability
requirements (except as provided in this section 2.4.1) set forth in
Sec. 53.34 of this chapter in each of the four seasons at the site at
which it is intended to be used. For purposes of this section 2.4.1, the
requirements of Sec. 53.34 of this chapter shall apply except as
follows:
    2.4.1.1 The method shall be tested at the site at which it is
intended to be used, and there shall be no requirement for tests at any
other test site.
    2.4.1.2 For purposes of this section 2.4, the seasons shall be
defined as follows: Spring shall be the months of March, April, and May;
summer shall be the months of June, July, and August; fall shall be the
months of September, October, and November; and winter shall be the
months of December, January, and February, except when alternate seasons
are approved by the Administrator.
    2.4.1.3 No PM10 samplers shall be required for the test,
as determination of the PM2.5/PM10 ratio at the
test site shall not be required.
    2.4.1.4 The specifications given in table C-4 of part 53 of this
chapter for Class I methods shall apply, except that there shall be no
requirement for any minimum number of sample sets with Rj greater than
40 µg/m3 for 24-hour samples or greater than 15
µg/m3 average concentration collected over a 48-hour
period.
    2.4.2 The monitoring agency wishing to use the method must develop
and implement appropriate quality assurance procedures for the method.

[[Page 241]]

    2.4.3 The monitoring agency wishing to use the method must develop
and implement appropriate procedures for assessing and reporting the
precision and accuracy of the method comparable to the procedures set
forth in appendix A of this part for designated reference and equivalent
methods.
    2.4.4 The assessment of network operating precision using collocated
measurements with reference method ``audit'' samplers required under
section 3 of appendix A of this part shall be carried out semi-annually
rather than annually (i.e., monthly audits with assessment
determinations each 6 months).
    2.4.5 Requests for approval under this section 2.4 must meet the
general submittal requirements of sections 2.7.1 and 2.7.2.1 of this
appendix and must include the requirements in sections 2.4.5.1 through
2.4.5.7 of this appendix.
    2.4.5.1 A clear and unique description of the site at which the
method or sampler will be used and tested, and a description of the
nature or character of the site and the particulate matter that is
expected to occur there.
    2.4.5.2 A detailed description of the method and the nature of the
sampler or analyzer upon which it is based.
    2.4.5.3 A brief statement of the reason or rationale for requesting
the approval.
    2.4.5.4 A detailed description of the quality assurance procedures
that have been developed and that will be implemented for the method.
    2.4.5.5 A detailed description of the procedures for assessing the
precision and accuracy of the method that will be implemented for
reporting to AIRS.
    2.4.5.6 Test results from the comparability tests as required in
section 2.4.1 through 2.4.1.4 of this appendix.
    2.4.5.7 Such further supplemental information as may be necessary or
helpful to support the required statements and test results.
    2.4.6 Within 120 days after receiving a request for approval of the
use of a method at a particular site under this section 2.4 and such
further information as may be requested for purposes of the decision,
the Administrator will approve or disapprove the method by letter to the
person or agency requesting such approval.
    2.5 Approval of non-designated methods under Sec. 58.13(f). An
automated (continuous) method for PM2.5 that is not
designated as either a reference or equivalent method as defined in
Sec. 50.1 of this chapter may be approved under Sec. 58.13(f) for use at
a SLAMS for the limited purposes of Sec. 58.13(f). Such an analyzer that
is approved for use at a SLAMS under Sec. 58.13(f), identified as
correlated acceptable continuous (CAC) monitors, shall not be considered
a reference or equivalent method as defined in Sec. 50.1 of this chapter
by virtue of its approval for use under Sec. 58.13(f), and the
PM2.5 monitoring data obtained from such a monitor shall not
be otherwise used for purposes of part 50 of this chapter.
    2.6 Use of Methods With Higher, Nonconforming Ranges in Certain
Geographical Areas.
    2.6.1 [Reserved]
    2.6.2 Nonconforming Ranges. An analyzer may be used (indefinitely)
on a range which extends to concentrations higher than two times the
upper limit specified in table B-1 of part 53 of this chapter if:
    2.6.2.1 The analyzer has more than one selectable range and has been
designated as a reference or equivalent method on at least one of its
ranges, or has been approved for use under section 2.5 (which applies to
analyzers purchased before February 18, 1975);
    2.6.2.2 The pollutant intended to be measured with the analyzer is
likely to occur in concentrations more than two times the upper range
limit specified in table B-1 of part 53 of this chapter in the
geographical area in which use of the analyzer is proposed; and
    2.6.2.3 The Administrator determines that the resolution of the
range or ranges for which approval is sought is adequate for its
intended use. For purposes of this section (2.6), ``resolution'' means
the ability of the analyzer to detect small changes in concentration.
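    For illustration only, the threshold in section 2.6.2 (a range extending to concentrations higher than two times the table B-1 upper limit) reduces to a simple comparison; the numeric arguments below are placeholders, not values taken from table B-1 of part 53.

```python
# Sketch of the section 2.6.2 threshold test for a nonconforming range.
def extends_beyond_two_times_limit(range_upper_limit: float,
                                   table_b1_upper_limit: float) -> bool:
    """True if the selected range extends higher than 2x the table B-1 upper limit."""
    return range_upper_limit > 2.0 * table_b1_upper_limit

print(extends_beyond_two_times_limit(1.0, 0.5))  # False (exactly two times)
print(extends_beyond_two_times_limit(1.5, 0.5))  # True
```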
    2.6.3 Requests for approval under section 2.6.2 must meet the
submittal requirements of section 2.7. Except as provided in subsection
2.7.3, each request must contain the information specified in subsection
2.7.2 in addition to the following:
    2.6.3.1 The range or ranges proposed to be used;
    2.6.3.2 Test data, records, calculations, and test results as
specified in subsection 2.7.2.2 for each range proposed to be used;
    2.6.3.3 An identification and description of the geographical area
in which use of the analyzer is proposed;
    2.6.3.4 Data or other information demonstrating that the pollutant
intended to be measured with the analyzer is likely to occur in
concentrations more than two times the upper range limit specified in
table B-1 of part 53 of this chapter in the geographical area in which
use of the analyzer is proposed; and
    2.6.3.5 Test data or other information demonstrating the resolution
of each proposed range that is broader than that permitted by section
2.5.
    2.6.4 Any person who has obtained approval of a request under this
section (2.6.2) shall assure that the analyzer for which approval was
obtained is used only in the geographical area identified in the request
and only while operated in the range or ranges specified in the request.
    2.7 Requests for Approval; Withdrawal of Approval.

[[Page 242]]

    2.7.1 Requests for approval under sections 2.4, 2.6.2, or 2.8 of
this appendix must be submitted to: Director, National Exposure
Research Laboratory, Department E, (MD-77B), U.S. Environmental
Protection Agency, Research Triangle Park, North Carolina 27711.
    2.7.2 Except as provided in section 2.7.3, each request must
contain:
    2.7.2.1 A statement identifying the analyzer (e.g., by serial
number) and the method of which the analyzer is representative (e.g., by
manufacturer and model number); and
    2.7.2.2 Test data, records, calculations, and test results for the
analyzer (or the method of which the analyzer is representative) as
specified in subpart B, subpart C, or both (as applicable) of part 53 of
this chapter.
    2.7.3 A request may concern more than one analyzer or geographical
area and may incorporate by reference any data or other information
known to EPA from one or more of the following:
    2.7.3.1 An application for a reference or equivalent method
determination submitted to EPA for the method of which the analyzer is
representative, or testing conducted by the applicant or by EPA in
connection with such an application;
    2.7.3.2 Testing of the method of which the analyzer is
representative at the initiative of the Administrator under Sec. 53.7 of
this chapter; or
    2.7.3.3 A previous or concurrent request for approval submitted to
EPA under this section (2.7).
    2.7.4 To the extent that such incorporation by reference provides
data or information required by this section (2.7) or by sections 2.4,
2.5, or 2.6, independent data or duplicative information need not be
submitted.
    2.7.5 After receiving a request under this section (2.7), the
Administrator may request such additional testing or information or
conduct such tests as may be necessary in his judgment for a decision on
the request.
    2.7.6 If the Administrator determines, on the basis of any
information available to him, that any of the determinations or
statements on which approval of a request under this section (2.7) was
based are invalid or no longer valid, or that the requirements of
section 2.4, 2.5, or 2.6, as applicable, have not been met, he may
withdraw the approval after affording the person who obtained the
approval an opportunity to submit information and arguments opposing
such action.
    2.8 Modifications of Methods by Users.
    2.8.1 Except as otherwise provided in this section (2.8), no
reference method, equivalent method, or alternative method may be used
in a SLAMS if it has been modified in a manner that will, or might,
significantly alter the performance characteristics of the method
without prior approval by the Administrator. For purposes of this
section (2.8), ``alternative method'' means an analyzer the use of which
has been approved under section 2.4, 2.5, or 2.6 of this appendix or
some combination thereof.
    2.8.2 Requests for approval under this section (2.8) must meet the
submittal requirements of sections 2.7.1 and 2.7.2.1 of this appendix.
    2.8.3 Each request submitted under this section (2.8) must include:
    2.8.3.1 A description, in such detail as may be appropriate, of the
desired modification;
    2.8.3.2 A brief statement of the purpose(s) of the modification,
including any reasons for considering it necessary or advantageous;
    2.8.3.3 A brief statement of belief concerning the extent to which
the modification will or may affect the performance characteristics of
the method; and
    2.8.3.4 Such further information as may be necessary to explain and
support the statements required by sections 2.8.3.2 and 2.8.3.3.
    2.8.4 Within 75 days after receiving a request for approval under
this section (2.8) and such further information as he may request for
purposes of his decision, the Administrator will approve or disapprove
the modification in question by letter to the person or agency
requesting such approval.
    2.8.5 A temporary modification that will or might alter the
performance characteristics of a reference, equivalent, or alternative
method may be made without prior approval under this section (2.8) if
the method is not functioning or is malfunctioning, provided that parts
necessary for repair in accordance with the applicable operation manual
cannot be obtained within 45 days. Unless such temporary modification is
later approved under section 2.8.4, the temporarily modified method
shall be repaired in accordance with the applicable operation manual as
quickly as practicable but in no event later than 4 months after the
temporary modification was made, unless an extension of time is granted
by the Administrator. Unless and until the temporary modification is
approved, air quality data obtained with the method as temporarily
modified must be clearly identified as such when submitted in accordance
with Sec. 58.28 or Sec. 58.35 of this chapter and must be accompanied by
a report containing the information specified in section 2.8.3. A
request that the Administrator approve a temporary modification may be
submitted in accordance with sections 2.8.1 through 2.8.4. In such cases
the request will be considered as if a request for prior approval had
been made.
    2.9 Use of IMPROVE Samplers at a SLAMS. ``IMPROVE'' samplers may be
used in SLAMS for monitoring of regional background and regional
transport concentrations of fine particulate matter. The IMPROVE
samplers were developed for use in the Interagency Monitoring of
Protected Visual Environments (IMPROVE) network to characterize all of
the major components and

[[Page 243]]

many trace constituents of the particulate matter that impair visibility
in Federal Class I Areas. These samplers are routinely operated at about
70 locations in the United States. IMPROVE samplers consist of four
sampling modules that are used to collect twice weekly 24-hour duration
simultaneous samples. Modules A, B, and C collect PM2.5 on
three different filter substrates that are compatible with a variety of
analytical techniques, and module D collects a PM10 sample.
PM2.5 mass and elemental concentrations are determined by
analysis of the 25 mm diameter stretched Teflon filters from module A.
More complete descriptions of the IMPROVE samplers and the data they
collect are available elsewhere (references 4, 5, and 6 of this
appendix).
    *    *    *    *    *

3.0 National Air Monitoring Stations (NAMS)

    3.1 Methods used in those SLAMS which are also designated as NAMS to
measure SO2, CO, NO2, or O3 must be
automated reference or equivalent methods (continuous analyzers).

4.0 Photochemical Assessment Monitoring Stations (PAMS)

    4.1 Methods used for O3 monitoring at PAMS must be
automated reference or equivalent methods as defined in Sec. 50.1 of
this chapter.
    4.2 Methods used for NO, NO2 and NOX
monitoring at PAMS should be automated reference or equivalent methods
as defined for NO2 in Sec. 50.1 of this chapter. If
alternative NO, NO2 or NOX monitoring
methodologies are proposed, such techniques must be detailed in the
network description required by Sec. 58.40 and subsequently approved by
the Administrator.
    4.3 Methods for meteorological measurements and speciated VOC
monitoring are included in the guidance provided in references 2 and 3.
If alternative VOC monitoring methodology (including the use of new or
innovative technologies), which is not included in the guidance, is
proposed, it must be detailed in the network description required by
Sec. 58.40 and subsequently approved by the Administrator.

5.0 Particulate Matter Episode Monitoring

    5.1 For short-term measurements of PM10 during air
pollution episodes (see Sec. 51.152 of this chapter) the measurement
method must be:
    5.1.1 Either the ``Staggered PM10'' method or the
``PM10 Sampling Over Short Sampling Times'' method, both of
which are based on the reference method for PM10 and are
described in reference 1; or
    5.1.2 Any other method for measuring PM10:
    5.1.2.1 Which has a measurement range or ranges appropriate to
accurately measure air pollution episode concentrations of
PM10,
    5.1.2.2 Which has a sample period appropriate for short-term
PM10 measurements, and
    5.1.2.3 For which a quantitative relationship to a reference or
equivalent method for PM10 has been established at the use
site. Procedures for establishing a quantitative site-specific
relationship are contained in reference 1.
    5.2 Quality Assurance. PM10 methods other than the
reference method are not covered under the quality assessment
requirements of appendix A. Therefore, States must develop and implement
their own quality assessment procedures for those methods allowed under
this section 5. These quality assessment procedures should be similar or
analogous to those described in section 3 of appendix A for the
PM10 reference method.

6.0 References

    1. Pelton, D. J. Guideline for Particulate Episode Monitoring
Methods, GEOMET Technologies, Inc., Rockville, MD. Prepared for U.S.
Environmental Protection Agency, Research Triangle Park, NC. EPA
Contract No. 68-02-3584. EPA 450/4-83-005. February 1983.
    2. Technical Assistance Document For Sampling and Analysis of Ozone
Precursors. Atmospheric Research and Exposure Assessment Laboratory,
U.S. Environmental Protection Agency, Research Triangle Park, NC 27711.
EPA 600/8-91-215. October 1991.
    3. Quality Assurance Handbook for Air Pollution Measurement Systems:
Volume IV. Meteorological Measurements. Atmospheric Research and
Exposure Assessment Laboratory, U.S. Environmental Protection Agency,
Research Triangle Park, NC 27711. EPA 600/4-90-0003. August 1989.
    4. Eldred, R.A., Cahill, T.A., Wilkenson, L.K., et al.,
Measurements of fine particles and their chemical components in the
IMPROVE/NPS networks, in Transactions of the International Specialty
Conference on Visibility and Fine Particles, Air and Waste Management
Association: Pittsburgh, PA, 1990; pp 187-196.
    5. Sisler, J.F., Huffman, D., and Latimer, D.A.; Spatial and
temporal patterns and the chemical composition of the haze in the United
States: An analysis of data from the IMPROVE network, 1988-1991, ISSN
No. 0737-5253-26, National Park Service, Ft. Collins, CO, 1993.
    6. Eldred, R.A., Cahill, T.A., Pitchford, M., and Malm, W.C.;
IMPROVE--a new remote area particulate monitoring system for visibility
studies, Proceedings of the 81st Annual

[[Page 244]]

Meeting of the Air Pollution Control Association, Dallas, Paper 88-54.3,
1988.

[44 FR 27571, May 10, 1979, as amended at 44 FR 37918, June 29, 1979; 44
FR 65070, Nov. 9, 1979; 51 FR 9597, Mar. 19, 1986; 52 FR 24741, 24742,
July 1, 1987; 58 FR 8469, Feb. 12, 1993; 59 FR 41628, Aug. 12, 1994; 62
FR 38843, July 18, 1997]

Appendix D to Part 58--Network Design for State and Local Air Monitoring
     Stations (SLAMS), National Air Monitoring Stations (NAMS), and
           Photochemical Assessment Monitoring Stations (PAMS)

    1. SLAMS Monitoring Objectives and Spatial Scales
    2. SLAMS Network Design Procedures
    2.1 Background Information for Establishing SLAMS
    2.2 Substantive Changes in SLAMS/NAMS Network Design Elements
    2.3 Sulfur Dioxide (SO2) Design Criteria for SLAMS
    2.4 Carbon Monoxide (CO) Design Criteria for SLAMS
    2.5 Ozone (O3) Design Criteria for SLAMS
    2.6 Nitrogen Dioxide (NO2) Design Criteria for SLAMS
    2.7 Lead (Pb) Design Criteria for SLAMS
    2.8 Particulate Matter Design Criteria for SLAMS
    3. Network Design for National Air Monitoring Stations (NAMS)
    3.1 [Reserved]
    3.2 Sulfur Dioxide (SO2) Design Criteria for NAMS
    3.3 Carbon Monoxide (CO) Design Criteria for NAMS
    3.4 Ozone (O3) Design Criteria for NAMS
    3.5 Nitrogen Dioxide (NO2) Design Criteria for NAMS
    3.6 Lead (Pb) Design Criteria for NAMS
    3.7 Particulate Matter Design Criteria for NAMS
    4. Network Design for Photochemical Assessment Monitoring Stations
(PAMS)
    5. Summary
    6. References

    1. SLAMS Monitoring Objectives and Spatial Scales

    The purpose of this appendix is to describe monitoring objectives
and general criteria to be applied in establishing the State and Local
Air Monitoring Stations (SLAMS) networks and for choosing general
locations for new monitoring stations. It also describes criteria for
determining the number and location of National Air Monitoring Stations
(NAMS), Photochemical Assessment Monitoring Stations (PAMS), and core
stations for PM2.5. These criteria will also be used by EPA
in evaluating the adequacy of the SLAMS/NAMS/PAMS and core
PM2.5 networks.
    The network of stations that comprise SLAMS should be designed to
meet a minimum of six basic monitoring objectives. These basic
monitoring objectives are:
    (1) To determine highest concentrations expected to occur in the
area covered by the network.
    (2) To determine representative concentrations in areas of high
population density.
    (3) To determine the impact on ambient pollution levels of
significant sources or source categories.
    (4) To determine general background concentration levels.
    (5) To determine the extent of Regional pollutant transport among
populated areas; and in support of secondary standards.
    (6) To determine the welfare-related impacts in more rural and
remote areas (such as visibility impairment and effects on vegetation).
    It should be noted that this appendix contains no criteria for
determining the total number of stations in SLAMS networks, except in
areas where Pb concentrations currently exceed or have exceeded the Pb
NAAQS during any one quarter of the most recent eight quarters. The
optimum size of a particular SLAMS network involves trade-offs among
data needs and available resources that EPA believes can best be
resolved during the network design process.
    This appendix focuses on the relationship between monitoring
objectives and the geographical location of monitoring stations.
Included are a rationale and set of general criteria for identifying
candidate station locations in terms of physical characteristics which
most closely match a specific monitoring objective. The criteria for
more specifically siting the monitoring station, including spacing from
roadways and vertical and horizontal probe and path placement, are
described in appendix E of this part.
    To clarify the nature of the link between general monitoring
objectives and the physical location of a particular monitoring station,
the concept of spatial scale of representativeness of a monitoring
station is defined. The goal in siting stations is to correctly match
the spatial scale represented by the sample of monitored air with the
spatial scale most appropriate for the monitoring objective of the
station.
    Thus, spatial scale of representativeness is described in terms of
the physical dimensions of the air parcel nearest to a monitoring
station throughout which actual pollutant concentrations are reasonably
similar. The scale of representativeness of most interest for the
monitoring objectives defined above are as follows:
    Microscale--defines the concentrations in air volumes associated
with area dimensions

[[Page 245]]

ranging from several meters up to about 100 meters.
    Middle Scale--defines the concentration typical of areas up to
several city blocks in size with dimensions ranging from about 100
meters to 0.5 kilometer.
    Neighborhood Scale--defines concentrations within some extended area
of the city that has relatively uniform land use with dimensions in the
0.5 to 4.0 kilometers range.
    Urban Scale--defines the overall, citywide conditions with
dimensions on the order of 4 to 50 kilometers. This scale would usually
require more than one site for definition.
    Regional Scale--defines usually a rural area of reasonably
homogeneous geography and extends from tens to hundreds of kilometers.
    National and Global Scales--these measurement scales represent
concentrations characterizing the nation and the globe as a whole.
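    For illustration only, the scales of representativeness above can be read as bands of characteristic dimension; the classifier below restates those bands, with boundary handling chosen arbitrarily.

```python
# Sketch mapping a station's characteristic dimension (km) to a spatial scale.
def spatial_scale(dimension_km: float) -> str:
    if dimension_km < 0.1:
        return "microscale"            # several meters up to about 100 meters
    if dimension_km < 0.5:
        return "middle scale"          # about 100 meters to 0.5 kilometer
    if dimension_km < 4.0:
        return "neighborhood scale"    # 0.5 to 4.0 kilometers
    if dimension_km < 50.0:
        return "urban scale"           # on the order of 4 to 50 kilometers
    return "regional scale or larger"  # tens to hundreds of kilometers and beyond

print(spatial_scale(2.5))   # neighborhood scale
print(spatial_scale(20.0))  # urban scale
```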
    Proper siting of a monitoring station requires precise specification
of the monitoring objective which usually includes a desired spatial
scale of representativeness. For example, consider the case where the
objective is to determine maximum CO concentrations in areas where
pedestrians may reasonably be exposed. Such areas would most likely be
located within major street canyons of large urban areas and near
traffic corridors. Stations located in these areas are most likely to
have a microscale of representativeness since CO concentrations
typically peak nearest roadways and decrease rapidly as the monitor is
moved from the roadway. In this example, physical location was
determined by consideration of CO emission patterns, pedestrian
activity, and physical characteristics affecting pollutant dispersion.
Thus, spatial scale of representativeness was not used in the selection
process but was a result of station location.
    In some cases, the physical location of a station is determined from
joint consideration of both the basic monitoring objective, and a
desired spatial scale of representativeness. For example, to determine
CO concentrations which are typical over a reasonably broad geographic
area having relatively high CO concentrations, a neighborhood scale
station is more appropriate. Such a station would likely be located in a
residential or commercial area having a high overall CO emission density
but not in the immediate vicinity of any single roadway. Note that in
this example, the desired scale of representativeness was an important
factor in determining the physical location of the monitoring station.
    In either case, classification of the station by its intended
objective and spatial scale of representativeness is necessary and will
aid in interpretation of the monitoring data.
    Table 1 illustrates the relationship between the six basic
monitoring objectives and the scales of representativeness that are
generally most appropriate for that objective.

     Table 1--Relationship Among Monitoring Objectives and Scale of
                           Representativeness
------------------------------------------------------------------------
           Monitoring Objective               Appropriate Siting Scales
------------------------------------------------------------------------
Highest concentration.....................  Micro, Middle, neighborhood
                                             (sometimes urban \1\)
Population................................  Neighborhood, urban
Source impact.............................  Micro, middle, neighborhood
General/background........................  Neighborhood, urban,
                                             regional
Regional transport........................  Urban/regional
Welfare-related impacts...................  Urban/regional
------------------------------------------------------------------------
\1\ Urban denotes a geographic scale applicable to both cities and rural
  areas
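    For illustration only, Table 1 can be restated as a lookup from monitoring objective to the siting scales generally considered appropriate; the key and value strings below are informal labels, not regulatory terms.

```python
# Sketch of Table 1 as a dictionary lookup.
APPROPRIATE_SCALES = {
    "highest concentration": ["micro", "middle", "neighborhood", "urban (sometimes)"],
    "population": ["neighborhood", "urban"],
    "source impact": ["micro", "middle", "neighborhood"],
    "general/background": ["neighborhood", "urban", "regional"],
    "regional transport": ["urban", "regional"],
    "welfare-related impacts": ["urban", "regional"],
}

print(APPROPRIATE_SCALES["population"])  # ['neighborhood', 'urban']
```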

    Open path analyzers can often be used effectively and advantageously
to provide better monitoring representation for population exposure
monitoring and general or background monitoring in urban and
neighborhood scales of representation. Such analyzers may also be able
to provide better area coverage or operational advantages in high
concentration and source-impact monitoring in middle scale and possibly
microscale areas. However, siting of open path analyzers for the latter
applications must be carried out with proper regard for the specific
monitoring objectives and for the path-averaging nature of these
analyzers. Monitoring path lengths need to be commensurate with the
intended scale of representativeness and located carefully with respect
to local sources or potential obstructions. For short-term/high-
concentration or source-oriented monitoring, the monitoring path may
need to be further restricted in length and be oriented approximately
radially with respect to the source in the downwind direction, to
provide adequate peak concentration sensitivity. Alternatively, multiple
(e.g., orthogonal) paths may be used advantageously to obtain both wider
area coverage and peak concentration sensitivity. Further discussion on
this topic is included in section 2.2 of this appendix.
    Subsequent sections of this appendix describe in greater detail the
most appropriate scales of representativeness and general monitoring
locations for each pollutant.

2. SLAMS Network Design Procedures

    The preceding section of this appendix has stressed the importance
of defining the objectives for monitoring a particular pollutant. Since
monitoring data are collected to ``represent'' the conditions in a
section or subregion of a geographical area, the previous section
included a discussion of the scale of representativeness of a monitoring
station. The use of this physical basis for locating stations allows for
an objective approach to network design.

[[Page 246]]

    The discussion of scales in sections 2.3 through 2.8 of this
appendix does not include all of the possible scales for each pollutant.
The scales that are discussed are those that are felt to be most
pertinent for SLAMS network design.
    In order to evaluate a monitoring network and to determine the
adequacy of particular monitoring stations, it is necessary to examine
each pollutant monitoring station individually by stating its monitoring
objective and determining its spatial scale of representativeness. This
will do more than insure compatibility among stations of the same type.
It will also provide a physical basis for the interpretation and
application of the data. This will help to prevent mismatches between
what the data actually represent and what the data are interpreted to
represent. It is important to note that SLAMS are not necessarily
sufficient for completely describing air quality. In many situations,
diffusion models must be applied to complement ambient monitoring, e.g.,
determining the impact of point sources or defining boundaries of
nonattainment areas.
    Information such as emissions density, housing density,
climatological data, geographic information, traffic counts, and the
results of modeling will be useful in designing regulatory networks. Air
pollution control agencies have shown the value of screening studies,
such as intensive studies conducted with portable samplers, in designing
networks. In many cases, in selecting sites for core PM2.5 or
carbon monoxide SLAMS, and for defining the boundaries of
PM2.5 optional community monitoring zones, air pollution
control agencies will benefit from using such studies to evaluate the
spatial distribution of pollutants.
    2.1 Background Information for Establishing SLAMS. Background
information that must be considered in the process of selecting SLAMS
from the existing network and in establishing new SLAMS includes
emission inventories, climatological summaries, and local geographical
characteristics. Such information is to be used as a basis for the
judgmental decisions that are required during the station selection
process. For new stations, the background information should be used to
decide on the actual location considering the monitoring objective and
spatial scale while following the detailed procedures in References 1
through 4.
    Emission inventories are generally the most important type of
background information needed to design the SLAMS network. The emission
data provide valuable information concerning the size and distribution
of large point sources. Area source emissions are usually available for
counties but should be subdivided into smaller areas or grids where
possible, especially if diffusion modeling is to be used as a basis for
determining where stations should be located. Sometimes this must be
done rather crudely, for example, on the basis of population or housing
units. In general, the grids should be smaller in areas of dense
population than in less densely populated regions.
    Emission inventory information for point sources should be generally
available for any area of the country for annual and seasonal averaging
times. Specific information characterizing the emissions from large
point sources for the shorter averaging times (diurnal variations, load
curves, etc.) can often be obtained from the source. Area source
emission data by season, although not available from the EPA, can be
generated by apportioning annual totals according to degree days.
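    For illustration only, apportioning an annual area-source total to seasons according to degree days amounts to scaling by each season's share of the annual degree-day total; the emission total and degree-day values below are hypothetical.

```python
# Sketch of degree-day apportionment of an annual area-source emission total.
ANNUAL_EMISSIONS_TONS = 1200.0                         # hypothetical annual total
HEATING_DEGREE_DAYS = {"winter": 2600, "spring": 900,  # hypothetical seasonal
                       "summer": 100, "fall": 1100}    # heating degree days

total_dd = sum(HEATING_DEGREE_DAYS.values())
seasonal_emissions = {season: ANNUAL_EMISSIONS_TONS * dd / total_dd
                      for season, dd in HEATING_DEGREE_DAYS.items()}

for season, tons in seasonal_emissions.items():
    print(f"{season}: {tons:.0f} tons")
```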
    Detailed area source data are also valuable in evaluating the
adequacy of an existing station in terms of whether the station has been
located in the desired spatial scale of representativeness. For example,
it may be the desire of an agency to have an existing CO station
measuring in the neighborhood scale.
    By examining the traffic data for the area and examining the
physical location of the station with respect to the roadways, a
determination can be made as to whether or not the station is indeed
measuring the air quality on the desired scale.
    The climatological summaries of greatest use are the frequency
distributions of wind speed and direction. The wind rose is an easily
interpreted graphical presentation of the directional frequencies. Other
types of useful climatological data are also available, but generally
are not as directly applicable to the site selection process as are the
wind statistics.
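    For illustration only, the directional frequencies summarized by a wind rose can be tabulated by binning observed wind directions into compass sectors; the hourly observations below are invented values.

```python
# Sketch of the directional-frequency tabulation underlying a wind rose.
from collections import Counter

SECTORS = ["N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE",
           "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW"]

def sector(direction_deg: float) -> str:
    """Map a wind direction in degrees to one of 16 compass sectors."""
    index = int((direction_deg % 360) / 22.5 + 0.5) % 16
    return SECTORS[index]

observations = [10, 15, 200, 210, 225, 190, 350, 5, 270, 265]  # hypothetical hourly directions
counts = Counter(sector(d) for d in observations)
for s, n in counts.most_common():
    print(f"{s}: {100 * n / len(observations):.0f}%")
```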
    In many cases, the meteorological data originating from the most
appropriate (not necessarily the nearest) national weather service (NWS)
airport station in the vicinity of the prospective siting area will
adequately reflect conditions over the area of interest, at least for
annual and seasonal averaging times. In developing data in complex
meteorological and terrain situations, diffusion meteorologists should
be consulted. NWS stations can usually provide most of the relevant
weather information in support of network design activities anywhere in
the country. Such information includes joint frequency distributions of
winds and atmospheric stability (stability-wind roses).
    The geographical material is used to determine the distribution of
natural features, such as forests, rivers, lakes, and manmade features.
Useful sources of such information may include road and topographical
maps, aerial photographs, and even satellite photographs. This
information may include the terrain and land-use setting of the
prospective monitor siting area, the proximity of larger water bodies,
the distribution of pollutant sources in the area, the location of

[[Page 247]]

NWS airport stations from which weather data may be obtained, etc. Land
use and topographical characteristics of specific areas of interest can
be determined from U.S. Geological Survey (USGS) maps and land use maps.
Detailed information on urban physiography (building/street dimensions,
etc.) can be obtained by visual observations, aerial photography, and
also surveys to supplement the information available from those sources.
Such information could be used in determining the location of local
pollutant sources in and around the prospective station locations.
    2.2 Substantive Changes in SLAMS/NAMS Network Design Elements. Two
important purposes of the SLAMS monitoring data are to examine and
evaluate overall air quality within a certain region, and to assess the
trends in air pollutant levels over several years. The EPA believes that
one of the primary tools for providing these characterizations is an
ambient air monitoring program which implements technically
representative networks. The design of these networks must be carefully
evaluated not only at their outset, but at relatively frequent intervals
thereafter, using an appropriate combination of other important
technical tools, including: dispersion and receptor modeling, saturation
studies, point and area source emissions analyses, and meteorological
assessments. The impetus for these subsequent reexaminations of
monitoring network adequacy stems not only from the need to evaluate the
effect that changes in the environment may pose, but also from the
recognition that new and/or refined tools and techniques for use in
impact assessments are continually emerging and available for
application.
    Substantive changes to an ambient air monitoring network are both
inevitable and necessary; however, any changes in any substantive aspect
of an existing SLAMS network or monitoring site that might affect the
continuity or comparability of pollutant measurements over time must be
carefully and thoroughly considered. Such substantive changes would
include cessation of monitoring at an existing site, relocation of an
existing site, a change in the type of monitoring method used, any
change in the probe or path height or orientation that might affect
pollutant measurements, any significant changes in calibration
procedures or standards, any significant change in operational or
quality assurance procedures, any significant change in the sources or
the character of the area in the vicinity of a monitoring site, or any
other change that could potentially affect the continuity or
comparability of monitoring data obtained before and after the change.
    In general, these types of changes should be made cautiously with
due consideration given to the impact of such changes on the network/
site's ability to meet its intended goals. Some of these changes will be
inevitable (such as when a monitoring site will no longer be available
and the monitor must be relocated, for example). Other changes may be
deemed necessary and advantageous, after due consideration of their
impact, even though they may have a deleterious effect on the long-term
comparability of the monitoring data. In these cases, an effort should
be made to quantify, if possible, or at least characterize, the nature
or extent of the effects of the change on the monitoring data. In all
cases, the changes and all information pertinent to the effect of the
change should be properly and completely documented for evaluation by
trends analysts.
    The introduction of open path methods to the SLAMS monitoring
network may seem relatively straightforward, given the kinds of
technical analyses required in this appendix. However, given the
uncertainties attendant to these analyses and the critical nature and
far-reaching regulatory implications of some sites in the current SLAMS
network composed of point monitors, there is a need to `bridge' between
databases generated by these different candidate methods to evaluate and
promote continuity in understanding of the historical representativeness
of the database.
    Concurrent, nominally collocated monitoring must be conducted in all
instances where an open path analyzer is effectively intended to replace
a criteria pollutant point monitor which meets either of the following:
    1. Data collected at the site represents the maximum concentration
for a particular nonattainment area; or
    2. Data collected at the site is currently used to characterize the
development of a nonattainment area State implementation plan.
    The Regional Administrator, the Administrator, or their appropriate
designee may also require collocated monitoring at other sites which
are, based on historical technical data, significant in assessing air
quality in a particular area. The term of this requirement is determined
by the Regional Administrator (for SLAMS), Administrator (for NAMS), or
their appropriate designee. The recommended minimum term consists of one
year (or one season of maximum pollutant concentration) with a maximum
term indexed to the subject pollutant NAAQS compliance interval (e.g.,
three calendar years for ozone). The requirement involves concurrent
monitoring with both the open path analyzer and the existing point
monitor during this term. Concurrent monitoring with more than one point
analyzer with an open path analyzer using one or more measurement paths
may also be advantageous to confirm adequate peak concentration
sensitivity or to optimize the location and length of the monitoring
path or paths.

[[Page 248]]

    All or some portion of the above requirement may be waived by the
Regional Administrator (for SLAMS), the Administrator (for NAMS), or
their designee in response to a request, based on accompanying technical
information and analyses, or in certain unavoidable instances caused by
logistical circumstances.
    These requirements for concurrent monitoring also generally apply to
situations where the relocation of any SLAMS site, using either a point
monitor or an open path analyzer, within an area is being contemplated.
    2.3 Sulfur Dioxide (SO2) Design Criteria for SLAMS. The
spatial scales for SO2 SLAMS monitoring are the middle,
neighborhood, urban, and regional scales. Because of the nature of
SO2 distributions over urban areas, the middle scale is the
most likely scale to be represented by a single measurement in an urban
area, but only if the undue effects from local sources (minor or major
point sources) can be eliminated. Neighborhood scales would be those
most likely to be represented by single measurements in suburban areas
where the concentration gradients are less steep. Urban scales would
represent areas where the concentrations are uniform over a larger
geographical area. Regional scale measurements would be associated with
rural areas.
    Middle Scale--Some data uses associated with middle scale
measurements for SO2 include assessing the effects of control
strategies to reduce urban concentrations (especially for the 3-hour and
24-hour averaging times) and monitoring air pollution episodes.
    Neighborhood Scale--This scale applies in areas where the SO2
concentration gradient is relatively flat (mainly suburban areas
surrounding the urban center) or in large sections of small cities and
towns. In general, these areas are quite homogeneous in terms of
SO2 emission rates and population density. Thus, neighborhood
scale measurements may be associated with baseline concentrations in
areas of projected growth and in studies of population responses to
exposure to SO2. Also concentration maxima associated with
air pollution episodes may be uniformly distributed over areas of
neighborhood scale, and measurements taken within such an area would
represent neighborhood, and to a limited extent, middle scale
concentrations.
    Urban Scale--Data from this scale could be used for the assessment
of air quality trends and the effect of control strategies on urban
scale air quality.
    Regional Scale--These measurements would be applicable to large
homogeneous areas, particularly those which are sparsely populated. Such
measurements could provide information on background air quality and
interregional pollutant transport.
    After the spatial scale has been selected to meet the monitoring
objectives for each station location, the procedures found in reference
2 should be used to evaluate the adequacy of each existing SO2
station and must be used to relocate an existing station or to locate
any new SLAMS stations. The background material for these procedures
should consist of emission inventories, meteorological data, wind roses,
and maps for population and topographical characteristics of specific
areas of interest. Isopleth maps of SO2 air quality as
generated by diffusion models (reference 5 of this appendix) are useful for the general
determination of a prospective area within which the station is
eventually placed.
    2.4 Carbon Monoxide (CO) Design Criteria for SLAMS. Micro, middle,
and neighborhood scale measurements are necessary station
classifications for SLAMS since most people are exposed to CO
concentrations in these scales. Carbon monoxide maxima occur primarily
in areas near major roadways and intersections with high traffic density
and poor atmospheric ventilation. As these maxima can be predicted by
ambient air quality modeling, a large fixed network of CO monitors is
not required. Long-term CO monitoring should be confined to a limited
number of micro and neighborhood scale stations in large metropolitan
areas to measure maximum pollution levels and to determine the
effectiveness of control strategies.
    Microscale--Measurements on this scale would represent distributions
within street canyons, over sidewalks, and near major roadways. The
measurements at a particular location in a street canyon would be
typical of one high concentration area which can be shown to be a
representation of many more areas throughout the street canyon or other
similar locations in a city. This is a scale of measurement that would
provide valuable information for devising and evaluating ``hot spot''
control measures.
    Middle Scale--This category covers dimensions from 100 meters to 0.5
kilometer. In certain cases discussed below, it may apply to regions
that have a total length of several kilometers. In many cases of
interest, sources and land use may be reasonably homogeneous for long
distances along a street, but very inhomogeneous normal to the street.
This is the case with strip development and freeway corridors. Included
in this category are measurements to characterize the CO concentrations
along the urban features just enumerated. When a location is chosen to
represent conditions in a block of street development, then the
characteristic dimensions of this scale are tens of meters by hundreds
of meters. If an attempt is made to characterize street-side conditions
throughout the downtown area or along an extended stretch of freeway,
the dimensions may be tens of meters by kilometers.

[[Page 249]]

    The middle scale would also include the parking lots and feeder
streets associated with indirect sources which attract significant
numbers of pollutant emitters, particularly autos. Shopping centers,
stadia, and office buildings are examples of indirect sources.
    Neighborhood Scale--Measurements in this category would represent
conditions throughout some reasonably homogeneous urban subregions, with
dimensions of a few kilometers and generally more regularly shaped than
the middle scale. Homogeneity refers to CO concentration, but it
probably also applies to land use. In some cases, a location carefully
chosen to provide neighborhood scale data, might represent not only the
immediate neighborhood, but also neighborhoods of the same type in other
parts of the city. These kinds of stations would provide information
relating to health effects because they would represent conditions in
areas where people live and work. Neighborhood scale data would provide
valuable information for developing, testing, and revising concepts and
models that describe the larger scale concentration patterns, especially
those models relying on spatially smoothed emission fields for inputs.
These types of measurements could also be used for interneighborhood
comparisons within or between cities.
    After the spatial scale has been determined to meet the monitoring
objectives for each location, the location selection procedures, as
shown in reference 3 should be used to evaluate the adequacy of each
existing CO station and must be used to relocate an existing station or
to locate any new SLAMS stations. The background material necessary for
these procedures may include the average daily traffic on all streets in
the area, wind roses for different hours of the day, and maps showing
one-way streets, street widths, and building heights. If the station is
to typify the area with the highest concentrations, the streets with the
greatest daily traffic should be identified. If some streets are one-
way, those streets that have the greatest traffic during the afternoon
and evening hours should be selected as tentative locations, because the
periods of high traffic volume are usually of greatest duration through
the evening hours. However, the strength of the morning inversion has to
be considered along with the traffic volume and pattern when seeking
areas with the highest concentrations. Traffic counters near the
stations will provide valuable data for interpreting the observed CO
concentrations.
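    For illustration only, the screening step described above (identifying the streets with the greatest afternoon and evening traffic as tentative microscale locations) is a simple ranking of traffic counts; the street names and counts below are hypothetical.

```python
# Sketch ranking candidate streets by afternoon/evening traffic volume.
candidate_streets = [
    {"name": "Main St (one-way)", "pm_traffic": 28500},
    {"name": "5th Ave", "pm_traffic": 31200},
    {"name": "Industrial Blvd", "pm_traffic": 17800},
]

for street in sorted(candidate_streets, key=lambda s: s["pm_traffic"], reverse=True):
    print(f'{street["name"]}: {street["pm_traffic"]} vehicles (afternoon/evening)')
```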
    Monitors should not be placed in the vicinity of possible anomalous
source areas. Examples of such areas include toll gates on turnpikes,
metered freeway ramps, and drawbridge approaches. Additional information
on network design may be found in reference 3.
    2.5 Ozone (O3) Design Criteria for SLAMS. Ozone is not
directly emitted into the atmosphere but results from complex
photochemical reactions involving organic compounds, oxides of nitrogen,
and solar radiation.
    The relationships between primary emissions (precursors) and
secondary pollutants (O3) tend to produce large separations
spatially and temporally between the major sources and the areas of high
oxidant pollution. This suggests that the meteorological transport
process and the relationships between sources and sinks need to be
considered in the development of the network design criteria and
placement of monitoring stations, especially in measuring peak
concentration levels.
    The principal spatial scales for SLAMS purposes based on the
monitoring objectives are neighborhood, urban, regional, and to a lesser
extent, middle scale. Since ozone requires appreciable formation time,
the mixing of reactants and products occurs over large volumes of air,
and this reduces the importance of monitoring small scale spatial
variability.
    Middle Scale--Measurement in this scale would represent conditions
close to sources of NOX such as roads where it would be
expected that suppression of O3 concentrations would occur.
Trees also may have a strong scavenging effect on O3 and may
tend to suppress O3 concentrations in their immediate
vicinity. Measurements at these stations would represent conditions over
relatively small portions of the urban area.
    Neighborhood Scale--Measurements in this category represent
conditions throughout some reasonably homogeneous urban subregion, with
dimensions of a few kilometers. Homogeneity refers to pollutant
concentrations. Neighborhood scale data will provide valuable
information for developing, testing, and revising concepts and models
that describe urban/regional concentration patterns. They will be useful
to the understanding and definition of processes that take periods of
hours to occur and hence involve considerable mixing and transport.
Under stagnation conditions, a station located in the neighborhood scale
may also experience peak concentration levels within the urban areas.
    Urban Scale--Measurement in this scale will be used to estimate
concentrations over large portions of an urban area with dimensions of
several kilometers to 50 or more kilometers. Such measurements will be
used for determining trends, and designing area-wide control strategies.
The urban scale stations would also be used to measure high
concentrations downwind of the area having the highest precursor
emissions.
    Regional Scale--This scale of measurement will be used to typify
concentrations over large portions of a metropolitan area and

[[Page 250]]

even larger areas with dimensions of as much as hundreds of kilometers.
Such measurements will be useful for assessing the ozone that is
transported into an urban area. Data from such stations may be useful in
accounting for the ozone that cannot be reduced by control strategies in
that urban area.
    The location selection procedure continues after the spatial scale
is selected based on the monitoring objectives. The appropriate network
design procedures as found in reference 4, should be used to evaluate
the adequacy of each existing O3 monitor and must be used to
relocate an existing station or to locate any new O3 SLAMS
stations. The first step in the siting procedure would be to collect the
necessary background material, which may consist of maps, emission
inventories for nonmethane hydrocarbons and oxides of nitrogen
(NOX), climatological data, and existing air quality data for
ozone, nonmethane hydrocarbons, and NO2/NO.
    For locating a neighborhood scale station to measure typical city
concentrations, a reasonably homogeneous geographical area near the
center of the region should be selected which is also removed from the
influence of major NOX sources. For an urban scale station to
measure the high concentration areas, the emission inventories should be
used to define the extent of the area of important nonmethane
hydrocarbons and NOX emissions. The most frequent wind speed
and direction for periods of important photochemical activity should be
determined. Then the prospective monitoring area should be selected in a
direction from the city that is most frequently downwind during periods
of photochemical activity. The distance from the station to the upwind
edge of the city should be about equal to the distance traveled by air
moving for 5 to 7 hours at wind speeds prevailing during periods of
photochemical activity. Prospective areas for locating O3
monitors should always be outside the area of major NOX emissions.
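    For illustration only, the downwind siting distance described above is the distance traveled by air over 5 to 7 hours at the prevailing wind speed; the 3 m/s speed below is an assumed value, not guidance.

```python
# Worked example of the 5- to 7-hour transport distance for an urban scale O3 station.
def transport_distance_km(wind_speed_m_s: float, hours: float) -> float:
    """Distance (km) traveled by air at the given speed over the given time."""
    return wind_speed_m_s * 3600.0 * hours / 1000.0

prevailing_speed = 3.0  # m/s, hypothetical prevailing speed during photochemical activity
for hours in (5, 7):
    print(f"{hours} h at {prevailing_speed} m/s -> "
          f"{transport_distance_km(prevailing_speed, hours):.0f} km downwind")
```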
    In locating a neighborhood scale station which is to measure high
concentrations, the same procedures used for the urban scale are
followed except that the station should be located closer to the areas
bordering on the center city or slightly further downwind in an area of
high density population.
    For regional scale background monitoring stations, the most frequent
wind associated with important photochemical activity should be
determined. The prospective monitoring area should be upwind for the
most frequent direction and outside the area of city influence.
    Since ozone levels decrease significantly in the colder parts of the
year in many areas, ozone is required to be monitored at NAMS and SLAMS
monitoring sites only during the ``ozone season'' as designated in the
AIRS files on a State by State basis and described below:

                    Ozone Monitoring Season By State
------------------------------------------------------------------------
              State                   Begin month          End month
------------------------------------------------------------------------
Alabama.........................  March.............  October.
Alaska..........................  April.............  October.
Arizona.........................  January...........  December.
Arkansas........................  March.............  November.
California......................  January...........  December.
Colorado........................  March.............  September.
Connecticut.....................  April.............  September.
Delaware........................  April.............  October.
District of Columbia............  April.............  October.
Florida.........................  March.............  October.
Georgia.........................  March.............  October.
Hawaii..........................  January...........  December.
Idaho...........................  April.............  October.
Illinois........................  April.............  October.
Indiana.........................  April.............  September.
Iowa............................  April.............  October.
Kansas..........................  April.............  October.
Kentucky........................  March.............  October.
Louisiana.......................  January...........  December.
Maine...........................  April.............  September.
Maryland........................  April.............  October.
Massachusetts...................  April.............  September.
Michigan........................  April.............  September.
Minnesota.......................  April.............  October.
Mississippi.....................  March.............  October.
Missouri........................  April.............  October.
Montana.........................  June..............  September.
Nebraska........................  April.............  October.
Nevada..........................  January...........  December.
New Hampshire...................  April.............  September.
New Jersey......................  April.............  October.
New Mexico......................  January...........  December.
New York........................  April.............  October.
North Carolina..................  April.............  October.
North Dakota....................  May...............  September.
Ohio............................  April.............  October.
Oklahoma........................  March.............  November.
Oregon..........................  May...............  September.
Pennsylvania....................  April.............  October.
Puerto Rico.....................  January...........  December.
Rhode Island....................  April.............  September.
South Carolina..................  April.............  October.
South Dakota....................  June..............  September.
Tennessee.......................  March.............  October.
Texas AQCR 4,5,7,10,11..........  January...........  December.
Texas AQCR 1, 2, 3, 6, 8, 9, 12.  March.............  October.
Utah............................  May...............  September.
Vermont.........................  April.............  September.
Virginia........................  April.............  October.
Washington......................  May...............  September.
West Virginia...................  April.............  October.
Wisconsin.......................  April 15..........  October 15.
Wyoming.........................  April.............  October.
American Samoa..................  January...........  December.
Guam............................  January...........  December.
Virgin Islands..................  January...........  December.
------------------------------------------------------------------------

Additional discussion on the procedures for siting ozone stations may be
found in reference 4.
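    For illustration only, the table above can be used as a lookup to test whether a given month falls within a State's ozone monitoring season; only a few States are transcribed below, and part-month entries (e.g., Wisconsin) are not handled by this simple whole-month sketch.

```python
# Sketch of an ozone-season lookup (begin and end months inclusive) for a few States.
OZONE_SEASON = {
    "Alabama": (3, 10),   # March through October
    "Arizona": (1, 12),   # January through December
    "Colorado": (3, 9),   # March through September
    "Montana": (6, 9),    # June through September
    "New York": (4, 10),  # April through October
}

def in_ozone_season(state: str, month: int) -> bool:
    begin, end = OZONE_SEASON[state]
    return begin <= month <= end

print(in_ozone_season("Colorado", 10))  # False
print(in_ozone_season("Arizona", 12))   # True
```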
    2.6 Nitrogen Dioxide (NO2) Design Criteria for SLAMS. The
typical spatial scales of representativeness associated with nitrogen
dioxide monitoring based on monitoring objectives are middle,
neighborhood, and urban. Since nitrogen dioxide is primarily formed in

[[Page 251]]

the atmosphere from the oxidation of NO, large volumes of air and mixing
times usually reduce the importance of monitoring on small scale spatial
variability especially for long averaging times. However, there may be
some situations where NO2 measurements would be made on the
middle scale for both long- and short-term averages.
    Middle Scale--Measurements on this scale would cover dimensions from
about 100 meters to 0.5 kilometer. These measurements would characterize
the public exposure to NO2 in populated areas. Also monitors
that are located closer to roadways than the minimum distances specified
in table 3 of appendix E of this part, would be represented by
measurements on this scale.
    Neighborhood and Urban Scales--The same considerations as discussed
in section 2.5 for O3 would also apply to NO2.
    After the spatial scale is selected based on the monitoring
objectives, then the siting procedures as found in reference 4 should be
used to evaluate the adequacy of each existing NO2 station
and must be used to relocate an existing station or to locate any new
NO2 SLAMS stations. The siting procedures begin with
collecting the background material. This background information may
include the characteristics of the area and its sources under study,
climatological data to determine where concentration maxima are most
likely to be found, and any existing monitoring data for NO2.
    For neighborhood or urban scales, the emphasis in site selection
will be in finding those areas where long-term averages are expected to
be the highest. Nevertheless, it should be expected that the maximum
NO2 concentrations will occur in approximately the same
locations as the maximum total oxides of nitrogen concentrations. The
best course would be to locate the station somewhat further downwind
beyond the expected point of maximum total oxides of nitrogen to allow
more time for the formation of NO2. The dilution of the
emissions further downwind from the source should be considered along
with the need for reaction time for NO2 formation in locating
stations to measure peak concentration. If dispersion is favorable,
maximum concentrations may occur closer to the emission sources than the
locations predicted from oxidation of NO to NO2 alone. This
will occur downwind of sources based on winter wind direction or in
areas where there are high ozone concentrations and high density
NO2 emissions such as on the fringe of the central business
district or further downwind. The distance and direction downwind would
be based on ozone season wind patterns.
    Once the major emissions areas and wind patterns are known, areas of
potential maximum NO2 levels can be determined. Nitrogen
dioxide concentrations are likely to decline rather rapidly outside the
urban area. Therefore, the best location for measuring NO2
concentrations will be in neighborhoods near the edge of the city.
    2.7  Lead (Pb) Design Criteria for SLAMS. Presently, less than 1
percent of the Nation's Pb air pollution emissions originate from on-
road mobile source exhaust. The majority of Pb emissions come from point
sources, such as metals processing facilities, waste disposal and
recycling, and fuel combustion (reference 19 of this appendix). The
SLAMS networks are used to assess the air quality impacts of Pb point
sources, and to determine the broad population exposure from any Pb
source. The most important spatial scales to effectively characterize
the emissions from point sources are the micro, middle, and neighborhood
scales. For purposes of establishing monitoring stations to represent
large homogeneous areas other than the above scales of
representativeness, urban or regional scale stations may also be needed.
    Microscale--This scale would typify areas in close proximity to lead
point sources. Emissions from point sources such as primary and
secondary lead smelters and primary copper smelters may, under
fumigation conditions, result in high ground-level
concentrations at the microscale. In such cases, the microscale
would represent an area impacted by the plume with dimensions extending
up to approximately 100 meters. Data collected at microscale stations
provide information for evaluating and developing ``hot-spot'' control
measures.
    Middle Scale--This scale generally represents Pb air quality levels
in areas up to several city blocks in size with dimensions on the order
of approximately 100 meters to 500 meters. The middle scale may, for
example, include schools and playgrounds in center city areas which are
close to major Pb point sources. Pb monitors in such areas are desirable
because of the higher sensitivity of children to exposures of elevated
Pb concentrations (reference 7 of this appendix). Emissions from point
sources frequently impact areas in which a single site may be located
to measure concentrations representative of the middle spatial scale.
    Neighborhood Scale--The neighborhood scale would characterize air
quality conditions throughout some relatively uniform land use areas
with dimensions in the 0.5 to 4.0 kilometer range. Stations of this
scale would provide monitoring data in areas representing conditions
where children live and play. Monitoring in such areas is important
since this segment of the population is more susceptible to the effects
of Pb. Where a neighborhood site is located away from immediate Pb
sources, the site may be very useful in representing typical air quality
values for a larger residential area, and therefore suitable for
population exposure and trends analyses.

[[Page 252]]

    Urban Scale--Such stations would be used to represent ambient Pb
concentrations over an entire metropolitan area with dimensions in the 4
to 50 kilometer range. An urban scale station would be useful for
assessing trends in citywide air quality and the effectiveness of larger
scale air pollution control strategies.
    Regional Scale--Measurements from these stations would characterize
air quality levels over areas having dimensions of 50 to hundreds of
kilometers. This large scale of representativeness, rarely used in Pb
monitoring, would be most applicable to sparsely populated areas and
could provide information on background air quality and inter-regional
pollutant transport.
    Monitoring for ambient Pb levels is required for all major urbanized
areas where Pb levels have been shown or are expected to be of concern
due to the proximity of Pb point source emissions. Sources emitting five
tons per year or more of actual point and fugitive Pb emissions would
generally be candidates for lead ambient air monitoring. Modeling may be
needed to determine if a source has the potential to exceed the
quarterly lead National Ambient Air Quality Standards (NAAQS). The total
number and type of stations for SLAMS are not prescribed but must be
determined on a case-by-case basis. As a minimum, there must be two
stations in any area where Pb concentrations currently exceed or have
exceeded the Pb NAAQS during any one quarter of the most recent eight
quarters. Where the Pb air quality violations are widespread or the
emissions density, topography, or population locations are complex and
varied, there may be a need to establish more than two Pb ambient air
monitoring stations. The EPA Regional Administrator may specify more
than two monitoring stations if it is found that two stations are
insufficient to adequately determine if the Pb standard is being
attained and maintained. The Regional Administrator may also specify
that stations be located in areas outside the boundaries of the
urbanized areas.
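    For illustration only, the candidate-source threshold and the
minimum-station rule described above can be summarized in the following
sketch (Python). The function names, the input structure, and the
default quarterly level of 1.5 ug/m3 are assumptions for illustration
and are not part of this part.

    # Illustrative sketch only; names and the assumed NAAQS level are
    # not regulatory text.
    def pb_monitoring_candidate(actual_pb_emissions_tpy):
        """Sources emitting five tons per year or more of actual point
        and fugitive Pb emissions would generally be candidates for
        ambient Pb monitoring."""
        return actual_pb_emissions_tpy >= 5.0

    def minimum_pb_stations(quarterly_avg_pb, pb_naaqs=1.5):
        """Return the minimum number of Pb SLAMS implied by the
        two-station rule, given quarterly average Pb concentrations
        (ug/m3) for the most recent eight quarters.  A result of 0 means
        only that the two-station minimum does not apply; the Regional
        Administrator may still specify more stations."""
        recent = quarterly_avg_pb[-8:]
        return 2 if any(q > pb_naaqs for q in recent) else 0

    # Example: one quarter above the assumed 1.5 ug/m3 quarterly level.
    print(pb_monitoring_candidate(7.2))                                   # True
    print(minimum_pb_stations([0.4, 0.6, 1.7, 0.9, 0.5, 0.8, 1.1, 0.7]))  # 2
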
    For the required minimum of two stations discussed above, at least
one of the stations must be a category (a) type station; the second may
be either category (a) or (b), depending upon the extent of the point
source's impact and the existence of residential neighborhoods
surrounding the source. When the source is located in an area that is
subject to the NAMS requirements of section 3 of this appendix, it is
preferred that the NAMS site be used to describe the population's
exposure and that the second SLAMS site be used as a category (a) site.
Both of these categories of stations are defined in section 3.
    To locate monitoring stations, it will be necessary to obtain
background information such as point source emissions inventories,
climatological summaries, and local geographical characteristics. Such
information should be used to identify areas that are most suitable to
the particular monitoring objective and spatial scale of
representativeness desired. References 9 and 10 of this appendix provide
additional guidance on locating sites to meet specific urban area
monitoring objectives and should be used in locating new stations or
evaluating the adequacy of existing stations.
    After locating each Pb station and, to the extent practicable,
taking into consideration the collective impact of all Pb sources and
surrounding physical characteristics of the siting area, a spatial scale
of representativeness must be assigned to each station.
    2.8 Particulate Matter Design Criteria for SLAMS.
    As with other pollutants measured in the SLAMS network, the first
step in designing the particulate matter network is to collect the
necessary background information. Various studies in references 11, 12,
13, 14, 15, and 16 of section 6 of this appendix have documented the
major source categories of particulate matter and their contribution to
ambient levels in various locations throughout the country.
    2.8.0.1 Sources of background information include regional and
traffic maps and aerial photographs showing topography, settlements,
major industries, and highways. These maps and photographs would be used
to identify areas of the type that are of concern to the particular
monitoring objective. After potentially suitable monitoring areas for
particulate matter have been identified on a map, modeling may be used
to provide an estimate of particulate matter concentrations throughout
the area of interest. After completing the first step, existing
particulate matter stations should be evaluated to determine their
potential as candidates for SLAMS designation. Stations meeting one or
more of the six basic monitoring objectives described in section 1 of
this appendix must be classified into one of the five scales of
representativeness (micro, middle, neighborhood, urban and regional) if
the stations are to become SLAMS. In siting and classifying particulate
matter stations, the procedures in references 17 and 18 of section 6 of
this appendix should be used.
    2.8.0.2 The most important spatial scales to effectively
characterize the emissions of particulate matter from both mobile and
stationary sources are the middle scales for PM10 and
neighborhood scales for both PM10 and PM2.5. For
purposes of establishing monitoring stations to represent large
homogeneous areas other than the above scales of representativeness and
to characterize regional transport, urban or regional scale stations
would also be needed. Most PM2.5 monitoring in urban areas
should be representative of a neighborhood scale.

[[Page 253]]

    2.8.0.3 Microscale--This scale would typify areas such as downtown
street canyons and traffic corridors where the general public would be
exposed to maximum concentrations from mobile sources. In some
circumstances, the microscale is appropriate for particulate stations;
core SLAMS on the microscale should, however, be limited to urban sites
that are representative of long-term human exposure and of many such
microenvironments in the area. In general, microscale particulate matter
sites should be located near inhabited buildings or locations where the
general public can be expected to be exposed to the concentration
measured. Emissions from stationary sources such as primary and
secondary smelters, power plants, and other large industrial processes
may, under certain plume conditions, likewise result in high ground
level concentrations at the microscale. In the latter case, the
microscale would represent an area impacted by the plume with dimensions
extending up to approximately 100 meters. Data collected at microscale
stations provide information for evaluating and developing hot spot
control measures. Unless these sites are indicative of population-
oriented monitoring, they may be more appropriately classified as SPMs.
    2.8.0.4 Middle Scale--Much of the measurement of short-term public
exposure to coarse fraction particles (PM10) is on this scale
and on the neighborhood scale; for fine particulate, much of the
measurement is on the neighborhood scale. People moving through downtown
areas, or living near major roadways, encounter particles that would be
adequately characterized by measurements of this spatial scale. Thus,
measurements of this type would be appropriate for the evaluation of
possible short-term exposure public health effects of particulate matter
pollution. In many situations, monitoring sites that are representative
of micro-scale or middle-scale impacts are not unique and are
representative of many similar situations. This can occur along traffic
corridors or other locations in a residential district. In this case,
one location is representative of a neighborhood of small scale sites
and is appropriate for evaluation of long-term or chronic effects. This
scale also includes the characteristic concentrations for other areas
with dimensions of a few hundred meters such as the parking lot and
feeder streets associated with shopping centers, stadia, and office
buildings. In the case of PM10, unpaved or seldom swept
parking lots associated with these sources could be an important source
in addition to the vehicular emissions themselves.
    2.8.0.5 Neighborhood Scale--Measurements in this category would
represent conditions throughout some reasonably homogeneous urban
subregion with dimensions of a few kilometers and of generally more
regular shape than the middle scale. Homogeneity refers to the
particulate matter concentrations, as well as the land use and land
surface characteristics. Much of the PM2.5 exposure is
expected to be associated with this scale of measurement. In some cases,
a location carefully chosen to provide neighborhood scale data would
represent not only the immediate neighborhood but also neighborhoods of
the same type in other parts of the city. Stations of this kind provide
good information about trends and compliance with standards because they
often represent conditions in areas where people commonly live and work
for periods comparable to those specified in the NAAQS. In general, most
PM2.5 monitoring in urban areas should have this scale. A
PM2.5 monitoring location is assumed to be representative of
a neighborhood scale unless the monitor is adjacent to a recognized
PM2.5 emissions source or is otherwise demonstrated to be
representative of a smaller spatial scale by an intensive monitoring
study. This category also may include industrial and commercial
neighborhoods especially in districts of diverse land use where
residences are interspersed.
    2.8.0.6 Neighborhood scale data could provide valuable information
for developing, testing, and revising models that describe the larger-
scale concentration patterns, especially those models relying on
spatially smoothed emission fields for inputs. The neighborhood scale
measurements could also be used for neighborhood comparisons within or
between cities. This is the most likely scale of measurements to meet
the needs of planners.
    2.8.0.7 Urban Scale--This class of measurement would be made to
characterize the particulate matter concentration over an entire
metropolitan or rural area ranging in size from 4 to 50 km. Such
measurements would be useful for assessing trends in area-wide air
quality, and hence, the effectiveness of large scale air pollution
control strategies. Core PM2.5 SLAMS may have this scale.
    2.8.0.8 Regional Scale--These measurements would characterize
conditions over areas with dimensions of as much as hundreds of
kilometers. As noted earlier, using representative conditions for an
area implies some degree of homogeneity in that area. For this reason,
regional scale measurements would be most applicable to sparsely
populated areas with reasonably uniform ground cover. Data
characteristics of this scale would provide information about larger
scale processes of particulate matter emissions, losses and transport.
Especially in the case of PM2.5, transport contributes to
particulate concentrations and may affect multiple urban and State
entities with large populations such as in the Eastern United States.
Development of effective pollution control strategies requires an
understanding

[[Page 254]]

at regional geographical scales of the emission sources and atmospheric
processes that are responsible for elevated PM2.5 levels and
may also be associated with elevated ozone and regional haze.
    2.8.1 Specific Design Criteria for PM2.5.
    2.8.1.1 Monitoring Planning Areas.
    Monitoring planning areas (MPAs) shall be used to conform to the
community-oriented monitoring approach used for the PM2.5
NAAQS given in part 50 of this chapter. MPAs are required to correspond
to all metropolitan statistical areas (MSAs) with population greater
than 200,000, and all other areas determined to be in violation of the
PM2.5 NAAQS.1 MPAs for other designated parts of
the State are optional. All MPAs shall be defined on the basis of
existing, delineated mapping data such as State boundaries, county
boundaries, zip codes, census blocks, or census block groups.
---------------------------------------------------------------------------

    \1\The boundaries of MPAs do not necessarily have to correspond to
those of MSAs, and existing intrastate or interstate air pollution
planning districts may be utilized.
---------------------------------------------------------------------------

    2.8.1.2 PM2.5 Monitoring Sites within the State's PM
Monitoring Network Description.
    2.8.1.2.1 The minimum required number, type of monitoring sites, and
sampling requirements for PM2.5 are based on monitoring
planning areas described in the PM monitoring network description and
proposed by the State in accordance with Sec. 58.20.
    2.8.1.2.2 Comparisons to the PM2.5 NAAQS may be based on
data from SPMs in addition to SLAMS (including NAMS, core SLAMS, and
collocated PM2.5 sites at PAMS) that meet the requirements
of Sec. 58.13 and Appendices A, C, and E of this part and that are
included in the PM monitoring network description. For comparison to the annual
NAAQS, the monitors should be neighborhood scale community-oriented
locations. Special purpose monitors that meet part 58 requirements will
be exempt from NAAQS comparisons with the PM2.5 NAAQS for the
first 2 calendar years of their operation to encourage PM2.5
monitoring initially. After this time, however, any SPM that records a
violation of the PM2.5 NAAQS must be seriously considered as
a potential SLAMS site during the annual SLAMS network review in
accordance with Sec. 58.25. If such SPMs are not established as a SLAMS,
the agency must document in its annual report the technical basis for
excluding it as a SLAMS.
    2.8.1.2.3 The health-effects data base that served as the basis for
selecting the new PM2.5 standards relied on a spatial average
approach that reflects average community-oriented area-wide PM exposure
levels. Under this approach, the most effective way to reduce total
population risk is by lowering the annual distributions of ambient 24-
hour PM2.5 concentrations, as opposed to controlling peak 24-
hour concentrations on individual days. The annual standard selected by
EPA will generally be the controlling standard for lowering both short-
and long-term PM2.5 concentrations on an area-wide basis. To
be consistent with this rationale, therefore, PM2.5 data
collected from SLAMS and special purpose monitors that are
representative not of area-wide conditions but rather of relatively
unique population-oriented microscale, localized hot spot, or unique
population-oriented middle-scale impact sites are eligible for
comparison only to the 24-hour PM2.5 NAAQS.
However, in instances where certain population-oriented micro- or
middle-scale PM2.5 monitoring sites are determined by the EPA
Regional Administrator to collectively identify a larger region of
localized high ambient PM2.5 concentrations, data from these
population-oriented sites would be eligible for comparison to the annual
NAAQS.
    2.8.1.2.4 Within each MPA, the responsible air pollution control
agency shall install core SLAMS, other required SLAMS, and as many
additional PM2.5 stations as are judged necessary to satisfy
the SLAMS requirements and monitoring objectives of this appendix.
    2.8.1.3 Core Monitoring Stations for PM2.5.
    Core monitoring stations or sites are a subset of the SLAMS network
for PM2.5 that are sited to represent community-wide air
quality. These core sites include sites to be collocated at PAMS.
    2.8.1.3.1 Within each monitoring planning area, the responsible air
pollution control agency shall install the following core
PM2.5 SLAMS:
    (a) At least two core PM2.5 SLAMS per MSA with population
greater than 500,000, sampling every day unless exempted by the Regional
Administrator, including at least one station in a population-oriented
area of expected maximum concentration, at least one station in an
area of poor air quality, and at least one additional core monitor
collocated at a PAMS site in each PAMS area.2
---------------------------------------------------------------------------

    \2\The core monitor to be collocated at a PAMS site shall not be
considered a part of the PAMS as described in section 4 of this
appendix, but shall instead be considered to be a component of the
particular MPA PM2.5 network.
---------------------------------------------------------------------------

    (b) At least one core PM2.5 SLAMS per MSA with population
greater than 200,000 and less than or equal to 500,000 sampling every
third day.
    (c) Additional core PM2.5 SLAMS per MSA with population
greater than 1 million, sampling every third day, as specified in the
following table:

[[Page 255]]

   Table 1--Required Number of Core SLAMS According to MSA Population
------------------------------------------------------------------------
           MSA Population            Minimum Required No. of Core Sites1
------------------------------------------------------------------------
>1 M                                 3
------------------------------------------------------------------------
1 Core SLAMS at PAMS are in addition to these numbers.

    2.8.1.3.2 The site situated in the area of expected
maximum concentration is analogous to NAMS ``category a.'' 3
This will henceforth be termed a category a core SLAMS site. The site
located in the area of poor air quality with high population density or
representative of maximum population impact is analogous to NAMS,
``category b.'' This second site will be called a category b core SLAMS
site.
---------------------------------------------------------------------------

    \3\The measured maximum concentrations at core population-oriented
sites should be consistent with the averaging time of the NAAQS.
Therefore, sites with high concentrations only for shorter averaging
times (e.g., 1-hour) should not be category ``a'' core SLAMS monitors.
---------------------------------------------------------------------------

    2.8.1.3.3 Those MPAs that are substantially impacted by several
different and geographically disjoint local sources of fine particulate
should have separate core sites to monitor each influencing source
region.
    2.8.1.3.4 Within each monitoring planning area, one or more required
core SLAMS may be exempted by the Regional Administrator. This may be
appropriate in areas where the highest concentration is expected to
occur at the same location as the area of maximum or sensitive
population impact, or areas with low concentrations (e.g., highest
concentrations are less than 80 percent of the NAAQS). When only one
core monitor for PM2.5 is included in an MPA or optional CMZ,
however, a ``category a'' core site is strongly preferred to determine
community-oriented PM2.5 concentrations in areas of high
average PM2.5 concentration.
    2.8.1.3.5 More than the minimum number of core SLAMS should be
deployed as necessary in all MPAs. Except for the core SLAMS described
in section 2.8.1.3.1 of this appendix, the additional core SLAMS must
only comply with the minimum sampling frequency for SLAMS specified in
Sec. 58.13(e).
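    For illustration only, the minimum core-station counts of section
2.8.1.3.1 and Table 1 of this appendix can be sketched as follows
(Python). The names are hypothetical, exemptions under section 2.8.1.3.4
are not modeled, and the Table 1 increment is not hard-coded because
only the >1 M row of that table is reproduced above.

    # Illustrative sketch only; not regulatory text.
    def minimum_core_pm25_slams(msa_population, pams_areas=0):
        """Return minimum core PM2.5 SLAMS counts for one MSA."""
        everyday = 0
        every_third_day = 0
        if msa_population > 500_000:
            # (a) two core SLAMS sampling every day, plus one additional
            # core monitor collocated at a PAMS site in each PAMS area.
            everyday = 2 + pams_areas
        elif msa_population > 200_000:
            # (b) one core SLAMS sampling every third day.
            every_third_day = 1
        if msa_population > 1_000_000:
            # (c) additional every-third-day core SLAMS as specified in
            # Table 1 of this appendix (left to Table 1, not hard-coded).
            pass
        return {"every_day": everyday, "every_third_day": every_third_day}

    print(minimum_core_pm25_slams(750_000, pams_areas=1))
    # {'every_day': 3, 'every_third_day': 0}
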
    2.8.1.3.6 A subset of the core PM2.5 SLAMS shall be
designated NAMS as discussed in section 3.7 of this appendix. The
selection of core monitoring sites in relation to MPAs and CMZs is
discussed further in section 2.8.3 of this appendix.
    2.8.1.3.7 Core monitoring sites shall represent neighborhood or
larger spatial scales. A monitor established in the ambient air in or
near a populated area that meets the appropriate 40 CFR part 58 criteria
(i.e., the requirements of Sec. 58.13, Sec. 58.14, and Appendices A, C,
and E of this part) can be presumed to be representative of at least a
neighborhood scale, is eligible to be called a core site, and shall
produce data that are eligible for comparison to both the 24-hour and
annual PM2.5 NAAQS. If the site is adjacent to a dominating
local source, or can be shown to have average 24-hour concentrations
representative of a smaller spatial scale, then the site would only be
compared to the 24-hour PM2.5 NAAQS.
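    For illustration only, the comparison-eligibility logic of section
2.8.1.3.7 can be sketched as follows (Python). The boolean inputs are
stand-ins for determinations that are made in practice by the monitoring
agency and the Regional Administrator.

    # Illustrative sketch only; not regulatory text.
    def eligible_naaqs_comparisons(meets_part_58_criteria,
                                   in_or_near_populated_area,
                                   adjacent_to_dominating_source,
                                   shown_smaller_scale):
        """Return the set of PM2.5 NAAQS to which the site's data are
        eligible for comparison."""
        if not (meets_part_58_criteria and in_or_near_populated_area):
            return set()        # cannot be presumed neighborhood scale
        if adjacent_to_dominating_source or shown_smaller_scale:
            return {"24-hour"}  # smaller-scale site: 24-hour NAAQS only
        return {"24-hour", "annual"}

    print(eligible_naaqs_comparisons(True, True, False, False))
    # {'24-hour', 'annual'} (set ordering may vary)
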
    2.8.1.3.8 Continuous fine particulate monitoring at core SLAMS. At
least one continuous fine particulate analyzer (e.g., beta attenuation
analyzer; tapered-element, oscillating microbalance (TEOM);
transmissometer; nephelometer; or other acceptable continuous fine
particulate monitor) shall be located at a core monitoring
PM2.5 site in each metropolitan area with a population
greater than 1 million. These analyzers shall be used to provide
improved temporal resolution to better understand the processes and
causes of elevated PM2.5 concentrations and to facilitate
public reporting of PM2.5 air quality and will be in
accordance with appropriate methodologies and QA/QC procedures approved
by the Regional Administrator.
    2.8.1.4 Other PM2.5 SLAMS Locations.
    In addition to the required core sites described in section 2.8.1.3
of this appendix, the State shall also install and operate, on an every-
third-day sampling schedule, at least one SLAMS to monitor regional
background and at least one SLAMS to monitor regional transport. These
monitoring stations may be at community-oriented sites, and this
requirement may be satisfied by a corresponding SLAMS monitor in an area
having similar air quality in another State. The State shall also be
required to establish additional SLAMS sites based on the total
population outside the MSA(s) associated with monitoring planning areas
that contain required core SLAMS. There shall be one such additional
SLAMS for each 200,000 people. These minimum required SLAMS may be
deployed anywhere in the State to satisfy the SLAMS monitoring
objectives described in section 1 of this appendix, including monitoring
of small-scale impacts which may not be community-oriented and
monitoring of regional transport. Other SLAMS may also be established
and are encouraged in a State PM2.5 network.
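    For illustration only, the population-based arithmetic above can be
sketched as follows (Python). Floor division is assumed because the
rounding convention is not spelled out here, and the regional background
and transport sites are counted separately.

    # Illustrative sketch only; not regulatory text.
    def additional_pm25_slams(population_outside_core_msas):
        """One additional SLAMS for each 200,000 people outside the
        MSA(s) associated with MPAs that contain required core SLAMS."""
        return population_outside_core_msas // 200_000

    print(additional_pm25_slams(1_150_000))   # 5
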
    2.8.1.5 Additional PM2.5 Analysis Requirements.
    (a) Within 1 year after September 16, 1997, chemical speciation will
be required at approximately 25 PM2.5 core sites collocated
at PAMS sites (1 type 2 site per PAMS area)

[[Page 256]]

and at approximately 25 other core sites for a total of approximately 50
sites. The selection of these sites will be performed by the
Administrator in consultation with the Regional Administrator and the
States. Chemical speciation is encouraged at additional sites. At a
minimum, chemical speciation to be conducted will include analysis for
elements, selected anions and cations, and carbon. Samples for required
speciation will be collected using appropriate monitoring methods and
sampling schedule in accordance with procedures approved by the
Administrator.
    (b) Air pollution control agencies shall archive PM2.5
filters from all other SLAMS sites for a minimum of one year after
collection. These filters shall be made available for supplemental
analyses at the request of EPA or to provide information to State and
local agencies on the composition of PM2.5. The filters
shall be archived in accordance with procedures approved by the
Administrator.
    2.8.1.6 Community Monitoring Zones.
    2.8.1.6.1 The CMZs describe areas within which two or more core
monitors may be averaged for comparison with the annual PM2.5
NAAQS. This averaging approach as specified in 40 CFR part 50, appendix
N, is directly related to epidemiological studies used as the basis for
the PM2.5 NAAQS. A CMZ should characterize an area of
relatively similar annual average air quality (i.e., the average
concentrations at individual sites shall not exceed the spatial average
by more than 20 percent) and exhibit similar day to day variability
(e.g., the monitoring sites should not have low correlations, say less
than 0.6). Moreover, the entire CMZ should principally be affected by
the same major emission sources of PM2.5.
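    For illustration only, the homogeneity screens above (individual
annual means within 20 percent of the spatial average, and pairwise
day-to-day correlations of at least 0.6) can be sketched as follows
(Python). The input structure and function names are hypothetical, and
the sketch assumes equal-length daily series with some variability at
each site.

    # Illustrative sketch only; not regulatory text.
    from math import sqrt
    from itertools import combinations

    def pearson(x, y):
        """Pearson correlation of two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def cmz_homogeneous(daily_by_site, max_departure=0.20, min_corr=0.6):
        """True if no site's annual mean exceeds the spatial average of
        the site means by more than 20 percent and no pair of sites has
        a day-to-day correlation below 0.6."""
        means = {s: sum(v) / len(v) for s, v in daily_by_site.items()}
        spatial_avg = sum(means.values()) / len(means)
        if any(m > spatial_avg * (1 + max_departure) for m in means.values()):
            return False
        return all(pearson(daily_by_site[a], daily_by_site[b]) >= min_corr
                   for a, b in combinations(daily_by_site, 2))

    # Example with two hypothetical sites of similar level and variability.
    print(cmz_homogeneous({"A": [10.0, 12.0, 15.0, 11.0, 14.0],
                           "B": [11.0, 13.0, 16.0, 12.0, 15.0]}))   # True
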
    2.8.1.6.2 Each monitoring planning area may have one or more CMZs,
which may or may not cover the entire MPA. In metropolitan statistical
areas (MSAs) for which MPAs are required, the CMZs may completely cover
the entire MSA. When more than one CMZ is described within an MPA, CMZs
shall not overlap in their geographical coverage. Any area in the
ambient air may become a CMZ.
    2.8.1.6.3 As PM2.5 networks are first established, core
sites would be used individually for making comparisons to the annual
PM2.5 NAAQS. As these networks evolve, individual monitors
may not be adequate by themselves to characterize the annual average
community-wide air quality. This is especially true for areas with sharp
gradients in annual average air quality. Therefore, CMZs with multiple
core SLAMS or other eligible sites, as described in section 2.8.1.2 of
this appendix, may be established for the purposes of providing improved
estimates of community-wide air quality and for making comparisons to
the annual NAAQS. This CMZ approach is subject to the constraints of
section 2.8.1.6.1 of this appendix.
    2.8.1.6.4 The spatial representativeness of individual monitoring
sites should be considered in the design of the network and in
establishing the boundaries of CMZs. Communities within the MPA with the
highest PM2.5 concentrations must have a high priority for
PM2.5 monitoring. Until a sufficient number of monitoring
stations or CMZs are established, however, the monitored air quality in
all parts of the MPA may not be precisely known. It would therefore be
desirable to design the placement of monitors so that those portions of
the MPAs without monitors could be characterized as having average
concentrations less than the monitored portions of the network.
    2.8.1.7 Selection of Monitoring Locations Within MPAs or CMZs.
    2.8.1.7.1 Figure 1 of this appendix illustrates a hypothetical
monitoring planning area and shows the location of monitors in relation
to population and areas of poor air quality. Figure 2 of this appendix
shows the same hypothetical MPA as Figure 1 of this appendix and
illustrates potential community monitoring zones and the location of
core monitoring sites within them.
    2.8.1.7.2 In Figure 1 of this appendix, a hypothetical monitoring
planning area is shown representing a typical Eastern US urban area.
The ellipses represent zones with relatively high population and poor
air quality, respectively. Concentration isopleths are also depicted.
The highest population density is indicated by the urban icons, while
the area of worst air quality is presumed to be near the industrial
symbols. The monitoring area should have at least one core monitor to
represent community wide air quality in each sub-area affected by
different emission sources. Each monitoring planning area with
population greater than 500,000 is required to have at least two core
population-oriented monitors that will sample every day (with PAMS areas
requiring three) and may have as many other core SLAMS, other SLAMS, and
SPMs as necessary. All SLAMS should generally be population-oriented,
while the SPMs can focus more on other monitoring objectives, e.g.,
identifying source impacts and the area boundaries with maximum
concentration. Ca denotes ``category a'' core SLAMS site
(community-oriented site in area of expected maximum concentration); it
is shown within the populated area and closest to the area with highest
concentration. Cb denotes a ``category b'' core SLAMS site
(area of poor air quality with high population density or representative
of maximum population impact); it is shown in the area of poor air
quality, closest to highest population density. All other core SLAMS in
this MPA are denoted by ``C.'' S denotes other SLAMS

[[Page 257]]

sites (monitoring for any objective: maximum concentration, population
exposure, source impact, background, regional transport, or support of
the secondary NAAQS). P denotes a Special Purpose Monitor (a
specialized monitor that, for example, may use a non-reference sampler).
Finally, note that all SPMs would be subject to the 2-year moratorium
against data comparison to the NAAQS for the first 2 complete calendar
years of their operation.
    2.8.1.7.3 A Monitoring Planning Area may have one or more community
monitoring zones (CMZ) for aggregation of data from eligible SLAMS and
SPM sites for comparison to the annual NAAQS. The planning area has
large gradients of average air quality and, as shown in Figure 2, may be
assigned three CMZs: an industrial zone, a downtown central business
district (CBD), and a residential area. (If there is not a large
difference between downtown concentrations and other residential areas,
a separate CBD zone would not be appropriate.)

[[Page 258]]

[GRAPHIC] [TIFF OMITTED] TR17FE98.008

[[Page 259]]

[GRAPHIC] [TIFF OMITTED] TR17FE98.009

    2.8.1.7.4 Figure 3 of this appendix illustrates how CMZs and
PM2.5 monitors might be located in a hypothetical MPA typical
of a Western State. Western States, with more localized sources of PM
and larger geographic areas, could require a different mix of SLAMS and
SPM monitors and may need

[[Page 260]]

more total monitors. As the networks are deployed, the available
monitors may not be sufficient to completely represent all geographic
portions of the Monitoring Planning Area. Due to the distribution of
pollution and population and because of the number and spatial
representativeness of monitors, the MPAs and CMZs may not cover the
entire State.
[GRAPHIC] [TIFF OMITTED] TR18JY97.176

    2.8.1.7.5 Figure 4 of this appendix shows how the MPAs, CMZs, and
PM2.5 monitors might be distributed within a hypothetical
State. Areas of the State included within MPAs are shown within heavy
solid lines. Two MPAs are illustrated. Areas in the State outside the
MPAs will also include monitors, but this monitoring coverage may be
limited. This portion of the State may also be represented by CMZs
(shown by areas enclosed within dotted lines). The monitors that are
intended for comparison to the NAAQS are indicated by X. Furthermore,
eligible monitors within a CMZ could be averaged for comparison to the
annual NAAQS or examined individually for comparison to both NAAQS. Both
within the MPAs and in the remainder of the State, some special study
monitors might not satisfy applicable 40 CFR part 58 requirements and
will not be eligible for comparison to the NAAQS.

[[Page 261]]

[GRAPHIC] [TIFF OMITTED] TR18JY97.177

    2.8.2 Substitute PM Monitoring Sites.
    2.8.2.1 Section 2.2 of appendix C of this part describes conditions
under which TSP samplers can be used as substitutes for PM10.
This provision is intended to be used when PM10
concentrations are expected to be very low and substitute TSP samplers
can be used to satisfy the minimum number of PM10 samplers
needed for an adequate PM10 network.
    2.8.2.2 If data produced by a substitute PM sampler exceed the
concentration levels described in appendix C of this part, then the need
to convert the sampler to a PM10 or
PM2.5 sampler shall be considered in the PM monitoring
network review. If the State does not believe that a PM10 or
PM2.5 sampler should be sited, the State shall submit
documentation to EPA as part of its annual PM report to justify this
decision. If a PM site is not designated as a substitute site in the PM
monitoring network description, then high concentrations at this site
would not necessarily cause this site to become a PM2.5 or
PM10 site, whichever is indicated.
    2.8.2.3 Consistent with Sec. 58.1, combinations of SLAMS
PM10 or PM2.5 monitors and other monitors may
occupy the same structure without any mutual effect on the regulatory
definition of the monitors.

3. Network Design for National Air Monitoring Stations (NAMS)

    The NAMS must be stations selected from the SLAMS network with
emphasis given to urban and multisource areas. Areas to be monitored
must be selected based on urbanized population and pollutant
concentration levels. Generally, a larger number of NAMS are needed in
more polluted urban and multisource areas. The network design criteria
discussed below reflect these concepts. However, it should be emphasized
that deviations from the NAMS network design criteria may be necessary
in a few cases. Thus, these design criteria are not a set of rigid rules
but rather a guide for achieving a proper distribution of monitoring
sites on a national scale.
    The primary objective for NAMS is to monitor in the areas where the
pollutant concentration and the population exposure are expected to be
the highest consistent with the averaging time of the NAAQS.
Accordingly, the NAMS fall into two categories:
    Category (a): Stations located in area(s) of expected maximum
concentrations, generally microscale for CO, microscale or middle scale
for Pb, middle scale or neighborhood scale for population-oriented
particulate matter, urban or regional scale for regional transport
PM2.5, neighborhood scale for SO2 and NO2, and urban scale
for O3.

[[Page 262]]

    Category (b): Stations which combine poor air quality with a high
population density but not necessarily located in an area of expected
maximum concentrations (neighborhood scale, except urban scale for
NO2). Category (b) monitors would generally be representative
of larger spatial scales than category (a) monitors.
    For each urban area where NAMS are required, both categories of
monitoring stations must be established. In the case of Pb and
SO2, if only one NAMS is needed, then category (a) must be
used. The analysis and interpretation of data from NAMS should consider
the distinction between these types of stations as appropriate.
    The concept of NAMS is designed to provide data for national policy
analyses/trends and for reporting to the public on major metropolitan
areas. It is not the intent to monitor in every area where the NAAQS are
violated. On the other hand, the data from SLAMS should be used
primarily for nonattainment decisions/analyses in specific geographical
areas. Since the NAMS are stations from the SLAMS network, station
locating procedures for NAMS are part of the SLAMS network design
process.
    3.1 [Reserved]
    3.2 Sulfur Dioxide (SO2) Design Criteria for NAMS. It is
desirable to have a greater number of NAMS in the more polluted and
densely populated urban and multisource areas. The data in table 3 show
the approximate number of permanent stations needed in urban areas to
characterize the national and regional SO2 air quality trends
and geographical patterns. Under these criteria, the number of
NAMS in areas where urban populations exceed 1,000,000 and
concentrations also exceed the primary NAAQS ranges from 6 to 10, while
in areas where the SO2 problem is minor, only one or two
(or no) monitors are required. For those cases where more than one
station is required for an urban area, there should be at least one
station for category (a) and category (b) objectives as discussed in
section 3. Where three or more stations are required, the mix of
category (a) and (b) stations is determined on a case-by-case basis. The
actual number and location of the NAMS must be determined by EPA
Regional Offices and the State Agency, subject to the approval of EPA
Headquarters, Office of Air Quality Planning and Standards (OAQPS).

                              Table 3--SO2 National Air Monitoring Station Criteria
                                   [Approximate number of stations per area] a
----------------------------------------------------------------------------------------------------------------
                                                                     High            Medium            Low
                     Population category                       concentration b  concentration c  concentration d
----------------------------------------------------------------------------------------------------------------
>1,000,000                                                          6-10                 4-8            2-4
500,000 to 1,000,000                                                 4-8                 2-4            1-2
250,000 to 500,000                                                   3-4                 1-2            0-1
100,000 to 250,000                                                   1-2                 0-1              0
----------------------------------------------------------------------------------------------------------------
a Selection of urban areas and actual number of stations per area will be jointly determined by EPA and the
  State agency.
b High concentration--exceeding the level of the primary NAAQS.
c Medium concentration--exceeding 60 percent of the level of the primary NAAQS or 100 percent of the secondary
  NAAQS.
d Low concentration--less than 60 percent of the level of the primary NAAQS or 100 percent of the secondary
  NAAQS.
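    For illustration only, Table 3 can be read as the following lookup
(Python). The ranges are guidance rather than a formula, the actual
number of stations is determined jointly by EPA and the State agency,
and boundary populations are assigned to the lower category here because
the table's population ranges meet at their endpoints.

    # Illustrative sketch only; not regulatory text.
    SO2_NAMS_TABLE3 = {
        # (population floor, ceiling): ((high), (medium), (low)) ranges
        (1_000_000, None):    ((6, 10), (4, 8), (2, 4)),
        (500_000, 1_000_000): ((4, 8),  (2, 4), (1, 2)),
        (250_000, 500_000):   ((3, 4),  (1, 2), (0, 1)),
        (100_000, 250_000):   ((1, 2),  (0, 1), (0, 0)),
    }

    def so2_nams_range(population, concentration):
        """Return the (min, max) approximate number of SO2 NAMS for an
        urban area; concentration is 'high', 'medium', or 'low' as
        defined in the footnotes to Table 3."""
        col = {"high": 0, "medium": 1, "low": 2}[concentration]
        for (lo, hi), ranges in SO2_NAMS_TABLE3.items():
            if population > lo and (hi is None or population <= hi):
                return ranges[col]
        return (0, 0)   # below 100,000: Table 3 specifies no range

    print(so2_nams_range(1_200_000, "high"))   # (6, 10)
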

    The estimated number of SO2 NAMS which would be required
nationwide ranges from approximately 200 to 300. This number of NAMS
SO2 monitors is sufficient for national trend purposes due to
the low background SO2 levels, and the fact that air quality
is very sensitive to SO2 emission changes. The actual number
of stations in any specific area depends on local factors such as
meteorology, topography, urban and regional air quality gradients, and
the potential for significant air quality improvements or degradation.
The greatest density of stations should be where urban populations are
large and where pollution levels are high. Fewer NAMS are necessary in
the western States since concentrations are seldom above the NAAQS in
their urban areas. Exceptions to this are in the areas where an expected
shortage of clean fuels indicates that ambient air quality may be
degraded by increased SO2 emissions. In such cases, a minimum
number of NAMS is required to provide EPA with a proper national
perspective on significant changes in air quality.
    As with TSP, the worst air quality in an urban area is to be used as
the basis for determining the required number of SO2 NAMS
(see table 3). This includes SO2 air quality levels within
populated parts of urbanized areas that are affected by one or two
point sources of SO2, if the impact of the source(s) extends
over a reasonably broad geographic scale (neighborhood or larger).
Maximum SO2 air quality levels in remote unpopulated areas
should be excluded as a basis for selecting NAMS regardless of the
sources affecting the concentration levels. Such remote areas are more
appropriately monitored by SLAMS or SPM networks and/or characterized by
diffusion model calculations as necessary.
    3.3 Carbon Monoxide (CO) Design Criteria for NAMS. Information is
needed on ambient CO levels in major urbanized areas where CO levels
have been shown or inferred to be a significant concern. At the national
level, EPA will not routinely require data from as many stations as are
required for PM10, and

[[Page 263]]

perhaps SO2, since CO trend stations are principally needed
to assess the overall air quality progress resulting from the emission
controls required by the Federal motor vehicle control program (FMVCP)
and other local controls.
    Although State and local air programs may require extensive
monitoring to document and measure the local impacts of CO emissions and
emission controls, an adequate national perspective is possible with as
few as two stations per major urban area. The two categories for which
CO NAMS would be required are: (a) Peak concentration areas such as are
found around major traffic arteries and near heavily traveled streets in
downtown areas (micro scale); and (b) neighborhoods where concentration
exposures are significant (middle scale, neighborhood scale).
    The peak concentration station (micro scale) is usually found near
heavily traveled downtown streets (street canyons), but could be found
along major arterials (corridors), either near intersections or at low
elevations which are influenced by downslope drainage patterns under low
inversion conditions. The peak concentration station should be located
so that it is representative of several similar source configurations in
the urban area, where the general population has access. Thus, it should
reflect one of many potential peak situations which occur throughout the
urban area. It is recognized that this does not measure air quality
which represents large geographical areas. Thus, a second type of
station on the neighborhood scale is necessary to provide data
representative of the high concentration levels which exist over large
geographical areas.
    The category (b) station (middle scale or neighborhood scale) should be
located in areas with a stable, high population density, projected
continuity of neighborhood character, and high traffic density. The
stations should be located where no major zoning changes, new highways,
or new shopping centers are being considered. The station should be
where a significant CO pollution problem exists, but not be unduly
influenced by any one line source. Rather, it should be more
representative of the overall effect of the sources in a significant
portion of the urban area.
    Because CO is generally associated with heavy traffic and population
clusters, an urbanized area with a population greater than 500,000 is
the principal criterion for identifying the urban areas for which pairs
of NAMS for this pollutant will be required. The criterion is based on
the judgment that stations in urban areas with greater than 500,000
population would provide sufficient data for national analysis and
national reporting to Congress and the public. Also, it has generally
been shown that major CO problems are found in areas greater than
500,000 population.
    3.4 Ozone (O3) Design Criteria for NAMS. The criterion
for selecting locations for ozone NAMS is any urbanized area having a
population of more than 200,000. This population cutoff is used since
the sources of hydrocarbons are both mobile and stationary and are more
diverse. Also, because of local and national control strategies and the
complex chemical process of ozone formation and transport, more sampling
stations than for CO are needed on a national scale to better understand
the ozone problem. This selection criterion is based entirely on
population and will include those relatively highly populated areas
where most of the oxidant precursors originate.
    Each urban area will generally require only two ozone NAMS. One
station would be representative of maximum ozone concentrations
(category (a), urban scale) under the wind transport conditions as
discussed in section 2.5. The exact location should balance local
factors affecting transport and buildup of peak O3 levels
with the need to represent population exposure. The second station
(category (b), neighborhood scale), should be representative of high
density population areas on the fringes of the central business district
along the predominant summer/fall daytime wind direction. This latter
station should measure peak O3 levels under light and
variable or stagnant wind conditions. Two ozone NAMS stations will be
sufficient in most urban areas since spatial gradients for ozone
generally are not as sharp as for other criteria pollutants.
    3.5 Nitrogen Dioxide (NO2) Criteria for NAMS. Nitrogen
dioxide NAMS will be required in those areas of the country which have a
population greater than 1,000,000. These areas will have two NO2
NAMS. It is expected that stations in these major metropolitan areas
would provide sufficient data for a national analysis; moreover,
NO2 problems generally occur in areas of greater than 1,000,000
population.
    Within urban areas requiring NAMS, two permanent monitors are
sufficient. The first station (category (a), middle scale or
neighborhood scale) would be to measure the photochemical production of
NO2 and would best be located in that part of the urban area
where the emission density of NOX is the highest. The second
station (category (b), urban scale) would be to measure the NO2
produced from the reaction of NO with O3 and should be
located downwind of the areas of peak NOX emissions.
    3.6  Lead (Pb) Design Criteria for NAMS. In order to achieve the
national monitoring objective, one NAMS site must be located in one of
the two cities with the greatest population in each of the following ten
regions of the country (the choice of which of the two metropolitan areas
should
have the lead NAMS requirement is made by the Administrator or

[[Page 264]]

the Administrator's designee using the recommendation of the Regional
Administrators or the Regional Administrators' designee):

Table 1.--EPA Regions & Two Current Largest MSA/CMSAs (Using 1995 Census
                                  Data)
------------------------------------------------------------------------
            Region (States)                   Two Largest MSA/CMSAs
------------------------------------------------------------------------
I (Connecticut, Massachusetts, Maine,    Boston-Worcester-Lawrence CMSA,
 New Hampshire, Rhode Island, Vermont).   Hartford, CT MSA.
II (New Jersey, New York, Puerto Rico,   New York-Northern New Jersey-
 U.S. Virgin Islands).                    Long Island, CMSA, San Juan-
                                          Caguas-Arecibo, PR CMSA.
III (Delaware, Maryland, Pennsylvania,   Washington-Baltimore CMSA,
 Virginia, West Virginia, Washington,     Philadelphia-Wilmington-
 DC).                                     Atlantic City CMSA.
IV (Alabama, Florida, Georgia,           Miami-Fort Lauderdale CMSA,
 Kentucky, Mississippi, North Carolina,   Atlanta, GA MSA.
 South Carolina, Tennessee).
V (Illinois, Indiana, Michigan,          Chicago-Gary-Kenosha CMSA,
 Minnesota, Ohio, Wisconsin).             Detroit-Ann Arbor-Flint CMSA.
VI (Arkansas, Louisiana, New Mexico,     Dallas-Fort Worth CMSA, Houston-
 Oklahoma, Texas).                        Galveston-Brazoria CMSA.
VII (Iowa, Kansas, Missouri, Nebraska).  St. Louis MSA, Kansas City MSA.
VIII (Colorado, Montana, North Dakota,   Denver-Boulder-Greeley CMSA,
 South Dakota, Utah, Wyoming).            Salt Lake City-Ogden MSA.
IX (American Samoa, Arizona,             Los Angeles-Riverside-Orange
 California, Guam, Hawaii, Nevada).       County CMSA, San Francisco-
                                          Oakland-San Jose CMSA.
X (Alaska, Idaho, Oregon, Washington)..  Seattle-Tacoma-Bremerton CMSA,
                                          Portland-Salem CMSA.
------------------------------------------------------------------------

    In addition, one NAMS site must be located in each of the MSA/CMSAs
where one or more violations of the quarterly Pb NAAQS have been
recorded over the previous eight quarters. If a violation of the
quarterly Pb NAAQS is measured at a monitoring site outside of an MSA/
CMSA, one NAMS site must be located within the county in a populated
area, apart from the Pb source, to assess area wide Pb air pollution
levels. These NAMS sites should represent the maximum Pb concentrations
measured within the MSA/CMSA, city, or county that is not directly
affected from a single Pb point source. Further, in order that on-road
mobile source emissions may continue to be verified as not contributing
to lead NAAQS violations, roadside ambient lead monitors should be
considered as viable NAMS site candidates. A NAMS site may be a
microscale or middle scale category (a) station, located adjacent to a
major roadway (e.g., >30,000 ADT), or a neighborhood scale category (b)
station that is located in a highly populated residential section of the
MSA/CMSA or county where the traffic density is high. Data from these
sites will be used to assess general conditions for large MSA/CMSAs and
other populated areas as a marker for national trends, and to confirm
continued attainment of the Pb NAAQS. In some cases, the MSA/CMSA
subject to the latter lead NAMS requirement due to a violating point
source will be the same MSA/CMSA subject to the lead NAMS requirement
based upon its population. For these situations, the total minimum
number of required lead NAMS is one.
    3.7 Particulate Matter Design Criteria for NAMS.
    3.7.1 Table 4 indicates the approximate number of permanent stations
required in MSAs to characterize national and regional PM10
air quality trends and geographical patterns. The number of
PM10 stations in areas where MSA populations exceed 1,000,000
must be in the range from 2 to 10 stations, while in low population
urban areas, no more than two stations are required. A range of
monitoring stations is specified in table 4 because sources of
pollutants and local control efforts can vary from one part of the
country to another and therefore, some flexibility is allowed in
selecting the actual number of stations in any one locale.
    3.7.2 Through promulgation of the NAAQS for PM2.5, the
number of PM10 SLAMS is expected to decrease, but
requirements to maintain PM10 NAMS remain in effect. The
PM10 NAMS are retained to provide trends data, to support
national assessments and decisions, and in some cases to continue
demonstration that a NAAQS for PM10 is maintained as a
requirement under a State Implementation Plan.
    3.7.3 The PM2.5 NAMS shall be a subset of the core
PM2.5 SLAMS and other SLAMS intended to monitor for regional
transport. The PM2.5 NAMS are planned as long-term monitoring
stations concentrated in metropolitan areas. A target range of 200 to
300 stations shall be designated nationwide. The largest metropolitan
areas (those with a population greater than approximately one million)
shall have at least one PM2.5 NAMS station.
    3.7.4 The number of total PM2.5 NAMS per Region will be
based on recommendations of the EPA Regional Offices, in concert with
their State and local agencies, in accordance with the network design
goals described in sections 3.7.5 through 3.7.7 of this appendix. The
selected stations should represent the range of conditions occurring in
the Regions

[[Page 265]]

and will consider factors such as total number or type of sources,
ambient concentrations of particulate matter, and regional transport.
    3.7.5 The approach for PM2.5 NAMS is intended to give
State and local agencies maximum flexibility while apportioning a
limited national network. By advancing a range of monitors per Region,
EPA intends to balance the national network with respect to geographic
area and population. Table 5 presents the target number of
PM2.5 NAMS per Region to meet the national goal of 200 to 300
stations. These numbers consider a variety of factors such as Regional
differences in metropolitan population, population density, land area,
sources of particulate emissions, and the numbers of PM10
NAMS.
    3.7.6 States will be required to establish approximately 50 NAMS
sites for routine chemical speciation of PM2.5. These sites
will include those collocated at approximately 25 PAMS sites and
approximately 25 other core SLAMS sites to be selected by the
Administrator. After 5 years of data collection, the Administrator may
exempt some sites from collecting speciated data. The number of NAMS
sites at which speciation will be performed each year and the number of
samples per year will be determined by the Administrator.
    3.7.7 Since emissions associated with the operation of motor
vehicles contribute to urban area particulate matter levels,
consideration of the impact of these sources must be included in the
design of the NAMS network, particularly in MSAs greater than 500,000
population. In certain urban areas, particulate emissions from motor
vehicle diesel exhaust currently are or are expected to be a significant
contributor to ambient particulate matter levels. The actual number of NAMS
and their locations must be determined by EPA Regional Offices and the
State agencies, subject to the approval of the Administrator as required
by Sec. 58.32. The Administrator's approval is necessary to ensure that
individual stations conform to the NAMS selection criteria and that the
network as a whole is sufficient in terms of number and location for
purposes of national analyses.

                             Table 4--PM10 National Air Monitoring Station Criteria
                                  [Approximate Number of Stations per MSA] \1\
----------------------------------------------------------------------------------------------------------------
                                                                High              Medium       Low Concentration
                  Population Category                    Concentration \2\  Concentration \3\         \4\
----------------------------------------------------------------------------------------------------------------
>1,000,000.............................................           6-10                4-8                2-4
500,000-1,000,000......................................           4-8                 2-4                1-2
250,000-500,000........................................           3-4                 1-2                0-1
100,000-250,000........................................           1-2                 0-1                0
----------------------------------------------------------------------------------------------------------------
1 Selection of urban areas and actual number of stations per area will be jointly determined by EPA and the
  State agency.
2 High concentration areas are those for which ambient PM10 data show ambient concentrations exceeding either
  PM10 NAAQS by 20 percent or more.
3 Medium concentration areas are those for which ambient PM10 data show ambient concentrations exceeding 80
  percent of the PM10 NAAQS.
4 Low concentration areas are those for which ambient PM10 data show ambient concentrations less than 80 percent
  of the PM10 NAAQS.
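    For illustration only, the concentration categories defined in the
footnotes to Table 4 can be expressed as the following classifier
(Python). The ratio argument is hypothetical shorthand for the highest
ambient PM10 concentration divided by the corresponding PM10 NAAQS, and
a ratio of exactly 0.80 falls between the table's medium and low
definitions, so it is assigned to medium here.

    # Illustrative sketch only; not regulatory text.
    def pm10_concentration_category(max_ratio_to_naaqs):
        """Classify an area using the Table 4 footnotes."""
        if max_ratio_to_naaqs >= 1.20:
            return "high"    # exceeds either PM10 NAAQS by 20 percent or more
        if max_ratio_to_naaqs >= 0.80:
            return "medium"  # exceeds 80 percent of the PM10 NAAQS
        return "low"         # less than 80 percent of the PM10 NAAQS

    print(pm10_concentration_category(0.95))   # medium
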


            Table 5--Goals for Number of PM2.5 NAMS by Region
------------------------------------------------------------------------
                                                   Percent of National
      EPA Region           Number of NAMS \1\             Total
------------------------------------------------------------------------
1                       15 to 20                 6 to 8
2                       20 to 30                 8 to 12
3                       20 to 25                 8 to 10
4                       35 to 50                 14 to 20
5                       35 to 50                 14 to 20
6                       25 to 35                 10 to 14
7                       10 to 15                 4 to 6
8                       10 to 15                 4 to 6
9                       25 to 40                 10 to 16
10                      10 to 15                 4 to 6
------------------------------------------------------------------------
    Total               205-295                  100
------------------------------------------------------------------------
\1\ Each region will have one to three NAMS having the monitoring of
  regional transport as a primary objective.

4. Network Design for Photochemical Assessment Monitoring Stations
(PAMS)

    In order to obtain more comprehensive and representative data on
O3 air pollution, the 1990 Clean Air Act Amendments require
enhanced monitoring for ozone (O3), oxides of nitrogen (NO,
NO2, and NOX), and monitoring for VOC in
O3 nonattainment areas classified as serious, severe, or
extreme. This will be accomplished through the establishment of a
network of Photochemical Assessment Monitoring Stations (PAMS).
    4.1 PAMS Data Uses. Data from the PAMS are intended to satisfy
several coincident needs related to attainment of the National Ambient
Air Quality Standards (NAAQS), SIP control strategy development and
evaluation, corroboration of emissions tracking, preparation of trends
appraisals, and exposure assessment.
    (a) NAAQS attainment and control strategy development. Like SLAMS
and NAMS data, PAMS data will be used for monitoring O3
exceedances and providing input for attainment/nonattainment decisions.
In addition,

[[Page 266]]

PAMS data will help resolve the roles of transported and locally emitted
O3 precursors in producing an observed exceedance and may be
utilized to identify specific sources emitting excessive concentrations
of O3 precursors and potentially contributing to observed
exceedances of the O3 NAAQS. The PAMS data will enhance the
characterization of O3 concentrations and provide critical
information on the precursors which cause O3, thereby
extending the database available for future attainment demonstrations.
These demonstrations will be based on photochemical grid modeling and
other approved analytical methods and will provide a basis for
prospective mid-course control strategy corrections. PAMS data will
provide information concerning (1) which areas and episodes to model to
develop appropriate control strategies; (2) boundary conditions required
by the models to produce quantifiable estimates of needed emissions
reductions; and (3) the evaluation of the predictive capability of the
models used.
    (b) SIP control strategy evaluation. The PAMS will provide data for
SIP control strategy evaluation. Long-term PAMS data will be used to
evaluate the effectiveness of these control strategies. Data may be used
to evaluate the impact of VOC and NOX emission reductions on
air quality levels for O3 if the data are reviewed following the
time period during which control measures were implemented. Speciation
of measured VOC data will allow determination of which organic species
are most affected by the emissions reductions and assist in developing
cost-effective, selective VOC reductions and control strategies. A State
or local air pollution control agency can therefore ensure that
strategies which are implemented in their particular nonattainment area
are those which are best suited for that area and achieve the most
effective emissions reductions (and therefore largest impact) at the
least cost.
    (c) Emissions tracking. PAMS data will be used to corroborate the
quality of VOC and NOX emission inventories. Although a
perfect mathematical relationship between emission inventories and
ambient measurements does not yet exist, a qualitative assessment of the
relative contributions of various compounds to the ambient air can be
roughly compared to current emission inventory estimates to evaluate the
accuracy of the emission inventories. In addition, PAMS data which are
gathered year round will allow tracking of VOC and NOX
emission reductions, provide additional information necessary to support
Reasonable Further Progress (RFP) calculations, and corroborate
emissions trends analyses. While the regulatory assessments of progress
will be made in terms of emission inventory estimates, the ambient data
can provide independent trends analyses and corroboration of these
assessments which either verify or highlight possible errors in
emissions trends indicated by inventories. The ambient assessments,
using speciated data, can gauge the accuracy of estimated changes in
emissions. The speciated data can also be used to assess the quality of
the speciated VOC and NOX emission inventories for input
during photochemical grid modeling exercises and identify potential
urban air toxic pollutant problems which deserve closer scrutiny.
    The speciated VOC data will be used to determine changes in the
species profile, resulting from the emission control program,
particularly those resulting from the reformulation of fuels.
    (d) Trends. Long-term PAMS data will be used to establish speciated
VOC, NOX, and limited toxic air pollutant trends, and
supplement the O3 trends database. Multiple statistical
indicators will be tracked, including O3 and its precursors
during the events encompassing the days during each year with the
highest O3 concentrations, the seasonal means for these
pollutants, and the annual means at representative locations.
    The more PAMS that are established in and near nonattainment areas,
the more effective the trends data will become. As the spatial
distribution and number of O3 and O3 precursor
monitors improves, trends analyses will be less influenced by instrument
or site location anomalies. The requirement that surface meteorological
monitoring be established at each PAMS will help maximize the utility of
these trends analyses by comparisons with meteorological trends, and
transport influences. The meteorological data can also help interpret
the ambient air pollution trends by taking meteorological factors into
account.
    (e) Exposure assessment. PAMS data will be used to better
characterize O3 and toxic air pollutant exposure to
populations living in serious, severe, or extreme areas. Annual mean
toxic air pollutant concentrations will be calculated to help estimate
the average risk to the population associated with individual VOC
species, which are considered toxic, in urban environments.
    4.2 PAMS Monitoring Objectives. Unlike the SLAMS and NAMS design
criteria which are pollutant specific, PAMS design criteria are site
specific. Concurrent measurements of O3, NOX,
speciated VOC, and meteorology are obtained at PAMS. Design criteria for
the PAMS network are based on selection of an array of site locations
relative to O3 precursor source areas and predominant wind
directions associated with high O3 events. Specific
monitoring objectives are associated with each location. The overall
design should enable characterization of precursor emission sources
within the area, transport of O3 and its precursors into and
out of the area, and the photochemical processes related to
O3 nonattainment, as well as developing an

[[Page 267]]

initial, though limited, urban air toxic pollutant database. Specific
objectives that must be addressed include assessing ambient trends in
O3, NO, NO2, NOX, VOC (including
carbonyls), and VOC species, determining spatial and diurnal variability
of O3, NO, NO2, NOX, and VOC species
and assessing changes in the VOC species profiles that occur over time,
particularly those occurring due to the reformulation of fuels. A
maximum of five PAMS sites are required in an affected nonattainment
area depending on the population of the Metropolitan Statistical Area/
Consolidated Metropolitan Statistical Area (MSA/CMSA) or nonattainment
area, whichever is larger. Specific monitoring objectives associated
with each of these sites result in four distinct site types. Note that
detailed guidance for the locating of these sites may be found in
reference 19.
    Type (1) sites are established to characterize upwind background and
transported O3 and its precursor concentrations entering the
area and will identify those areas which are subjected to overwhelming
transport. Type (1) sites are located in the predominant morning upwind
direction from the local area of maximum precursor emissions during the
O3 season and at a distance sufficient to obtain urban scale
measurements as defined in section 1 of this appendix. Typically, type
(1) sites will be located near the edge of the photochemical grid model
domain in the predominant morning upwind direction from the city limits
or fringe of the urbanized area. Depending on the boundaries and size of
the nonattainment area and the orientation of the grid, this site may be
located outside of the nonattainment area. The appropriate predominant
morning wind direction should be determined from historical wind data
occurring during the period 7 a.m. to 10 a.m. on high O3 days
or on those days which exhibit the potential for producing high
O3 levels, i.e., O3-conducive days as described in
reference 25. Alternate schemes for specifying this morning wind
direction may be submitted as a part of the network description required
by Secs. 58.40 and 58.41. Data measured at type (1) sites will be used
principally for the following purposes:
    • Future development and evaluation of control strategies,
    • Identification of incoming pollutants,
    • Corroboration of NOX and VOC emission inventories,
    • Establishment of boundary conditions for future photochemical grid
modeling and mid-course control strategy changes, and
    • Development of incoming pollutant trends.
    Type (2) sites are established to monitor the magnitude and type of
precursor emissions in the area where maximum precursor emissions are
expected to impact and are suited for the monitoring of urban air toxic
pollutants. Type (2) sites are located immediately downwind of the area
of maximum precursor emissions and are typically placed near the
downwind boundary of the central business district to obtain
neighborhood scale measurements. The appropriate downwind direction
should be obtained similarly to that for type (1) sites. Additionally, a
second type (2) site may be required depending on the size of the area,
and should be placed in the second-most predominant morning wind
direction as noted previously. Data measured at type (2) sites will be
used principally for the following purposes:
    • Development and evaluation of imminent and future control
strategies,
    • Corroboration of NOX and VOC emission inventories,
    • Augmentation of RFP tracking,
    • Verification of photochemical grid model performance,
    • Characterization of O3 and toxic air pollutant exposures
(appropriate site for measuring toxic emissions impact),
    • Development of pollutant trends, particularly toxic air pollutants
and annual ambient speciated VOC trends to compare with trends in annual
VOC emission estimates, and
    • Determination of attainment with the NAAQS for NO2 and O3.
    Type (3) sites are intended to monitor maximum O3
concentrations occurring downwind from the area of maximum precursor
emissions. Locations for type (3) sites should be chosen so that urban
scale measurements are obtained. Typically, type (3) sites will be
located 10 to 30 miles downwind from the fringe of the urban area. The
downwind direction should also be determined from historical wind data,
but should be identified as those afternoon winds occurring during the
period 1 p.m. to 4 p.m. on high O3 days or on those days
which exhibit the potential for producing high O3 levels.
Alternate schemes for specifying this afternoon wind direction may also
be submitted as a part of the network description required by
Secs. 58.40 and 58.41. Data measured at type (3) sites will be used
principally for the following purposes:
    • Determination of attainment with the NAAQS for O3 (this site may
coincide with an existing maximum concentration O3 monitoring site),
    • Evaluation of future photochemical grid modeling applications,
    • Future development and evaluation of control strategies,
    • Development of pollutant trends, and
    • Characterization of O3 pollutant exposures.
    Type (4) sites are established to characterize the extreme downwind
transported O3 and its precursor concentrations exiting the
area and will identify those areas which are potentially contributing to
overwhelming transport in other areas. Type (4) sites are

[[Page 268]]

located in the predominant afternoon downwind direction, as determined
for the type (3) site, from the local area of maximum precursor
emissions during the O3 season and at a distance sufficient
to obtain urban scale measurements as defined elsewhere in this
appendix. Typically, type (4) sites will be located near the downwind
edge of the photochemical grid model domain. Alternate schemes for
specifying the location of this site may be submitted as a part of the
network description required by Secs. 58.40 and 58.41. Data measured at
type (4) sites will be used principally for the following purposes:
    • Development and evaluation of O3 control strategies,
    • Identification of emissions and photochemical products leaving the
area,
    • Establishment of boundary conditions for photochemical grid
modeling,
    • Development of pollutant trends,
    • Background and upwind information for other downwind areas, and
    • Evaluation of photochemical grid model performance.
    States choosing to submit an individual network description for each
affected nonattainment area, irrespective of its proximity to other
affected areas, must fulfill the requirements for isolated areas as
described in section 4 of this appendix and illustrated, as an example,
by Figure 5. States containing areas which experience significant impact
from long-range transport or are proximate to other nonattainment areas
(even in other States) should collectively submit a network description
which contains alternative sites to those that would be required for an
isolated area. Such a submittal should, as a guide, be based on the
example provided in Figure 6, but must include a demonstration that the
design satisfies the monitoring data uses and fulfills the PAMS
monitoring objectives described in sections 4.1 and 4.2 of this
appendix.

[[Page 269]]

[GRAPHIC] [TIFF OMITTED] TR18JY97.178

[[Page 270]]

[GRAPHIC] [TIFF OMITTED] TR18JY97.179

    Alternative PAMS network designs should, on a site-by-site basis,
provide those data necessary to enhance the attainment/nonattainment
database for criteria pollutants and explain the origins of overwhelming
O3 transport. The alternative PAMS data should be usable for
the corroboration and verification of O3 precursor emissions
inventories and should comprise a qualitative (if not quantitative)
measure of the accuracy of RFP calculations. The data should be
sufficient to evaluate the effectiveness of the implemented
O3 control strategies and should provide data necessary to
establish photochemical grid modeling boundary conditions and necessary
inputs including appropriate meteorological parameters, and provide
measurements which can serve as model evaluation tools. Further,
utilizing its PAMS database (alternative or not), a State should be able
to draw conclusions regarding population exposure and conduct trends
analyses for both criteria and non-criteria pollutants. Overall, the
PAMS network should serve as one of several complementary means,
together with modeling and analysis of other data bases (e.g.,
inventories) and availability of control technology, etc., for States to
justify the modification of existing control programs, design

[[Page 271]]

new programs, and evaluate future courses of actions for O3
control.
    4.3 Monitoring Period. PAMS precursor monitoring will be conducted
annually throughout the months of June, July and August (as a minimum)
when peak O3 values are expected in each area; however,
precursor monitoring during the entire O3 season for the area
is preferred. Alternate precursor monitoring periods may be submitted
for approval as a part of the PAMS network description required by
Sec. 58.40. Changes to the PAMS monitoring period must be identified
during the annual SLAMS Network Review specified in Sec. 58.20. PAMS
O3 monitors must adhere to the O3 monitoring
season specified in section 2.5 of appendix D. To ensure a degree of
national consistency, monitoring for the 1993 season should commence as
follows:
    One in 3-day sampling--June 3, 1993.
    One in 6-day sampling--June 6, 1993.
    These monitoring dates will thereby be coincident with the
previously-established, intermittent schedule for particulate matter.
States initiating sampling earlier (or later) than June 3, 1993 should
adjust their schedules to coincide with this national schedule.
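
    As a worked example of this schedule, the sketch below generates the
one-in-3-day national sampling dates in Python; the function name and the
August 31 end date (the minimum June through August period) are
assumptions made for illustration.

    from datetime import date, timedelta

    def sampling_dates(start, every_n_days, end):
        """Yield intermittent sampling dates from start through end,
        spaced every_n_days apart."""
        d = start
        while d <= end:
            yield d
            d += timedelta(days=every_n_days)

    # First few one-in-3-day dates beginning June 3, 1993.
    schedule = sampling_dates(date(1993, 6, 3), 3, date(1993, 8, 31))
    print([d.isoformat() for d in schedule][:4])
    # ['1993-06-03', '1993-06-06', '1993-06-09', '1993-06-12']
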
    4.4 Minimum Monitoring Network Requirements. The minimum required
number and type of monitoring sites and sampling requirements are based
on the population of the affected MSA/CMSA or nonattainment area
(whichever is larger). The MSA/CMSA basis for monitoring network
requirements was chosen because it typically is the most representative
of the area which encompasses the emissions sources contributing to
nonattainment. The MSA/CMSA emissions density can also be effectively
and conveniently portrayed by the surrogate of population. Additionally,
a network which is adequate to characterize the ambient air of an MSA/
CMSA often must extend beyond the boundaries of such an area (especially
for O3 and its precursors); therefore, the use of smaller
geographical units (such as counties or nonattainment areas which are
smaller than the MSA/CMSA) for monitoring network design purposes is
inappropriate. Various sampling requirements are imposed according to
the size of the area to accommodate the impact of transport on the
smaller MSAs/CMSAs, to account for the spatial variations inherent in
large areas, to satisfy the differing data needs of large versus small
areas due to the intractability of the O3 nonattainment
problem, and to recognize the potential economic impact of
implementation on State and local government. Population figures must
reflect the most recent decennial U.S. census population report.
Specific guidance on determining network requirements is provided in
reference 19. Minimum network requirements are outlined in table 2.

        Table 2--PAMS Minimum Monitoring Network Requirements \1\
------------------------------------------------------------------------
                                              Minimum         Minimum
  Population of MSA/CMSA or     Required   speciated VOC     carbonyl
    nonattainment area \2\     site type     sampling        sampling
                                  \3\      frequency \4\   frequency \4\
------------------------------------------------------------------------
Less than 500,000............  (1)        A or C          ..............
                               (2)        A or C          D or F \5\
500,000 to 1,000,000.........  (1)        A or C          ..............
                               (2)        B               E
                               (3)        A or C          ..............
1,000,000 to 2,000,000.......  (1)        A or C          ..............
                               (2)        B               E
                               (2)        B               E
                               (3)        A or C          ..............
More than 2,000,000..........  (1)        A or C          ..............
                               (2)        B               E
                               (2)        B               E
                               (3)        A or C          ..............
                               (4)        A or C          ..............
------------------------------------------------------------------------
\1\ O3 and NOX (including NO and NO2) monitoring should be continuous
  measurements.
\2\ Whichever area is larger.
\3\ See Figure 5.
\4\ Frequency Requirements are as follows: A--Eight 3-hour samples every
  third day and one additional 24-hour sample every sixth day during the
  monitoring period; B--Eight 3-hour samples, every day during the
  monitoring period and one additional 24-hour sample every sixth day
  year-round; C--Eight 3-hour samples on the 5 peak O3 days plus each
  previous day, eight 3-hour samples every sixth day, and one additional
  24-hour sample every sixth day, during the monitoring period; D--Eight
  3-hour samples every third day during the monitoring period; E--Eight
  3-hour samples every day during the monitoring period; F--Eight 3-hour
  samples on the 5 peak O3 days plus each previous day and eight 3-hour
  samples every sixth day during the monitoring period. (NOTE: multiple
  samples taken on a daily basis must begin at midnight and consist of
  sequential, non-overlapping sampling periods.)
\5\ Carbonyl sampling frequency must match the chosen speciated VOC
  frequency.
Note that the use of Frequencies C or F requires the submittal of an
  ozone event forecasting scheme.
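
    Table 2 is, in effect, a lookup from area population to the minimum
required site types and sampling frequency codes. The following minimal
Python sketch encodes that lookup; the function name is an illustrative
assumption, the frequency codes A through F are those defined in footnote
4, and the handling of populations falling exactly on a category boundary
is illustrative only.

    def minimum_pams_sites(population):
        """Minimum required PAMS site types with their speciated VOC and
        carbonyl sampling frequency codes (table 2)."""
        if population < 500_000:
            return [(1, "A or C", None), (2, "A or C", "D or F")]
        if population <= 1_000_000:
            return [(1, "A or C", None), (2, "B", "E"),
                    (3, "A or C", None)]
        if population <= 2_000_000:
            return [(1, "A or C", None), (2, "B", "E"), (2, "B", "E"),
                    (3, "A or C", None)]
        return [(1, "A or C", None), (2, "B", "E"), (2, "B", "E"),
                (3, "A or C", None), (4, "A or C", None)]

    # An area of 1,300,000 requires one type (1), two type (2), and one
    # type (3) site.
    print(minimum_pams_sites(1_300_000))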

    For purposes of network implementation and transition, EPA
recommends the following priority order for the establishment of sites:
    • The type (2) site which provides the most comprehensive data
concerning O3 precursor emissions and toxic air pollutants,
    • The type (3) site which provides a maximum O3 measurement and
total conversion of O3 precursors,
    • The type (1) site which delineates the effect of incoming
precursor emissions and concentrations of O3 and provides upwind
boundary conditions,
    • The type (4) site which provides extreme downwind boundary
conditions, and
    • The second type (2) site which provides comprehensive data
concerning O3 precursor emissions and toxic air pollutants in the
second-most predominant morning wind direction on high O3 days.
    Note also that O3 event (peak day) monitoring will
require the development of a scheme for forecasting such high
O3 days or will necessitate the stipulation of what
meteorological conditions constitute a potential high O3 day;
monitoring could then be

[[Page 272]]

triggered only via meteorological projections. The O3 event
forecasting and monitoring scheme should be submitted as a part of the
network description required by Secs. 58.40 and 58.41 and should be
reviewed during each annual SLAMS Network Review specified in
Sec. 58.20.
    4.5 Transition Period. A variable period of time is proposed for
phasing in the operation of all required PAMS. Within 1 year after (1)
February 12, 1993, (2) the date of redesignation or reclassification of
any existing O3 nonattainment area to serious, severe, or
extreme, or (3) the designation of a new area and classification to
serious, severe, or extreme O3 nonattainment, a minimum of
one type (2) site must be operating. Operation of the remaining sites
must, at a minimum, be phased in at the rate of one site per year during
subsequent years as outlined in the approved PAMS network description
provided by the State.
    4.6 Meteorological Monitoring. In order to support monitoring
objectives associated with the need for various air quality analyses,
model inputs and performance evaluations, meteorological monitoring
including wind measurements at 10 meters above ground is required at
each PAMS site. Monitoring should begin with site establishment. In
addition, upper air meteorological monitoring is required for each PAMS
area. Upper air monitoring should be initiated as soon as possible, but
no later than 2 years after (1) February 12, 1993, (2) the date of
redesignation or reclassification of any existing O3
nonattainment area to serious, severe, or extreme, or (3) the
designation of a new area and classification to serious, severe, or
extreme O3 nonattainment. The upper air monitoring site may
be located separately from the type (1) through (4) sites, but the
location should be representative of the upper air data in the
nonattainment area. Upper air meteorological data must be collected
during those days specified for monitoring by the sampling frequencies
in table 2 of section 4.4 of this appendix in accordance with current
EPA guidance.

5. Summary.

    Table 6 of this appendix shows, by pollutant, all of the spatial
scales that are applicable for SLAMS and the required spatial scales for
NAMS. There may also be some situations, as discussed later in appendix
E of this part, where additional scales may be allowed for NAMS
purposes.

                    Table 6--Summary of Spatial Scales for SLAMS and Required Scales for NAMS
----------------------------------------------------------------------------------------------------------------
                                                          Scales Applicable for SLAMS
        Spatial Scale        -----------------------------------------------------------------------------------
                                  SO2         CO          O3          NO2         Pb         PM10        PM2.5
----------------------------------------------------------------------------------------------------------------
Micro.......................                                                       
Middle......................                                  
Neighborhood................                                  
Urban.......................                                         
Regional....................                                                

                                                           Scales Required for NAMS

Micro.......................                                                      1
Middle......................                                                             1
Neighborhood................                                  
Urban.......................                                                             2
Regional....................                                                                           2
----------------------------------------------------------------------------------------------------------------
\1\ Only permitted if representative of many such micro-scale environments in a residential district (for middle
  scale, at least two).
\2\ Either urban or regional scale for regional transport sites.

6. References

    1. Ludwig, F. L., J. H. S. Kealoha, and E. Shelar. Selecting Sites
for Monitoring Total Suspended Particulates. Stanford Research
Institute, Menlo Park, CA. Prepared for U.S. Environmental Protection
Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-
018. June 1977, revised December 1977.
    2. Ball, R. J. and G. E. Anderson. Optimum Site Exposure Criteria
for SO2 Monitoring. The Center for the Environment and Man,
Inc., Hartford, CT. Prepared for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-013. April
1977.
    3. Ludwig, F. L. and J. H. S. Kealoha. Selecting Sites for Carbon
Monoxide Monitoring. Stanford Research Institute, Menlo Park, CA.
Prepared for U.S. Environmental Protection Agency, Research Triangle
Park, NC. EPA Publication No. EPA-450/3-75-077. September 1975.
    4. Ludwig, F. L. and E. Shelar. Site Selection for the Monitoring of
Photochemical Air Pollutants. Stanford Research Institute, Menlo Park,
CA. Prepared for U.S. Environmental Protection Agency, Research Triangle
Park, NC. EPA Publication No. EPA-450/3-78-013. April 1978.
    5. Guideline on Air Quality Models. OAQPS, U.S. Environmental
Protection

[[Page 273]]

Agency, Research Triangle Park, NC. OAQPS No. 1.2-080. April 1978.
    6. Lead Guideline Document, U. S. Environmental Protection Agency,
Research Triangle Park, NC. EPA-452/R-93-009.
    7. Air Quality Criteria for Lead. Office of Research and
Development, U.S. Environmental Protection Agency, Washington, DC. EPA-
600/8-83-028 aF-dF, 1986, and supplements EPA-600/8-89/049F, August
1990. (NTIS document numbers PB87-142378 and PB91-138420.)
    8. Johnson, D. E., et al. Epidemiologic Study of the Effects of
Automobile Traffic on Blood Lead Levels, Southwest Research Institute,
Houston, TX. Prepared for U.S. Environmental Protection Agency, Research
Triangle Park, NC. EPA-600/1-78-055. August 1978.
    9. Optimum Site Exposure Criteria for Lead Monitoring. PEDCo
Environmental, Inc., Cincinnati, OH. Prepared for U.S. Environmental
Protection Agency, Research Triangle Park, NC. EPA Contract No. 68-02-
3013. (May 1981.)
    10. ``Guidance for Conducting Ambient Air Monitoring for Lead Around
Point Sources,'' Office of Air Quality Planning and Standards, U.S.
Environmental Protection Agency, Research Triangle Park, NC EPA-454/R-
92-009, May 1997.
    11. Cooper, J.A., et al. Summary of the Portland Aerosol
Characterization Study. (Presented at the 1979 Annual Air Pollution
Control Association Meeting, Cincinnati, OH. APCA 79-24.4).
    12. Bradway, R.M. and F.A. Record. National Assessment of the Urban
Particulate Problem. Volume 1. Prepared for U.S. Environmental
Protection Agency, Research Triangle Park, NC. EPA-450/3-76-024. July
1976.
    13. U.S. Environmental Protection Agency, Air Quality Criteria for
Particulate Matter and Sulfur Oxides, Volume 2. Environmental Criteria
and Assessment Office, Research Triangle Park, NC. December 1981.
    14. Watson, J.G., et al. Analysis of Inhalable and Fine Particulate
Matter Measurements. Prepared for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA-450/4-81-035. December 1981.
    15. Record, F.A. and L.A. Baci. Evaluation of the Contribution of
Wind Blown Dust from the Desert to Levels of Particulate Matter in Desert
Communities. GCA Technology Division, Bedford, MA. Prepared for U.S.
Environmental Protection Agency, Research Triangle Park, NC. EPA-450/2-
80-078. August 1980.
    16. Goldstein, E.A. and M. Paly. The Diesel Problem in New York City.
Project on the Urban Environment. Natural Resources Defense Council,
Inc., New York, NY. April 1985.
    17. Koch, R.C. and H.E. Rector. Optimum Network Design and Site
Exposure Criteria for Particulate Matter. GEOMET Technologies, Inc.,
Rockville, MD. Prepared for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Contract No. 68-02-3584. EPA 450/4-87-
009. May 1987.
    18. Watson et al. Guidance for Network Design and Optimum Site
Exposure for PM2.5 and PM10. Prepared for U.S.
Environmental Protection Agency, Research Triangle Park, NC.
    19. National Air Pollutant Emissions Trends, 1900-1995, Office of
Air Quality Planning and Standards, U. S. Environmental Protection
Agency, Research Triangle Park, NC. EPA-454/R96-007, October 1996,
updated annually.

[44 FR 27571, May 10, 1979]

    Editorial Note: For Federal Register citations affecting appendix D
to part 58, see the List of CFR Sections Affected in the Finding Aids
section of this volume.

    Effective Date Note: At 60 FR 52323, October 6, 1995, appendix D
to part 58 was amended in part by adding Section 2.2. This section
contains information collection and recordkeeping requirements and will
not become effective until approval has been given by the Office of
Management and Budget.

  Appendix E to Part 58--Probe and Monitoring Path Siting Criteria for
                     Ambient Air Quality Monitoring

    1. Introduction
    2. Sulfur Dioxide (SO2), Ozone (O3), and
Nitrogen Dioxide (NO2)
    2.1 Horizontal and Vertical Placement
    2.2 Spacing from Minor Sources (Applicable to SO2 and
O3 Monitoring Only)
    2.3 Spacing From Obstructions
    2.4 Spacing From Trees
    2.5 Spacing From Roadways (Applicable to O3 and
NO2 Only)
    2.6 Cumulative Interferences on a Monitoring Path
    2.7 Maximum Monitoring Path Length
    3. [Reserved]
    4. Carbon Monoxide (CO)
    4.1 Horizontal and Vertical Placement
    4.2 Spacing From Obstructions
    4.3 Spacing From Roadways
    4.4 Spacing From Trees and Other Considerations
    4.5 Cumulative Interferences on a Monitoring Path
    4.6 Maximum Monitoring Path Length
    5.-6. [Reserved]
    7. Lead (Pb)
    7.1 Vertical Placement
    7.2 Spacing From Obstructions
    7.3 Spacing From Roadways
    7.4 Spacing From Trees and Other Considerations
    8. Particulate Matter (PM10 and PM2.5)
    8.1 Vertical Placement

[[Page 274]]

    8.2 Spacing From Obstructions
    8.3 Spacing From Roadways
    8.4 Other Considerations
    9. Probe Material and Pollutant Sample Residence Time
    10. Photochemical Assessment Monitoring Stations (PAMS)
    10.1 Horizontal and Vertical Placement
    10.2 Spacing From Obstructions
    10.3 Spacing From Roadways
    10.4 Spacing From Trees
    11. Discussion and Summary
    12. Summary
    13. References

1. Introduction
    This appendix contains specific location criteria applicable to
ambient air quality monitoring probes and monitoring paths after the
general station siting has been selected based on the monitoring
objectives and spatial scale of representation discussed in appendix D
of this part. Adherence to these siting criteria is necessary to ensure
the uniform collection of compatible and comparable air quality data.
    The probe and monitoring path siting criteria discussed below must
be followed to the maximum extent possible. It is recognized that there
may be situations where some deviation from the siting criteria may be
necessary. In any such case, the reasons must be thoroughly documented
in a written request for a waiver that describes how and why the
proposed siting deviates from the criteria. This documentation should
help to avoid later questions about the validity of the resulting
monitoring data. Conditions under which the EPA would consider an
application for waiver from these siting criteria are discussed in
section 11 of this appendix.
    The spatial scales of representation used in this appendix, i.e.,
micro, middle, neighborhood, urban, and regional, are defined and
discussed in appendix D of this part. The pollutant-specific probe and
monitoring path siting criteria generally apply to all spatial scales
except where noted otherwise. Specific siting criteria that are phrased
with a ``must'' are defined as requirements and exceptions must be
approved through the waiver provisions. However, siting criteria that
are phrased with a ``should'' are defined as goals to meet for
consistency but are not requirements.

2. Sulfur Dioxide (SO2), Ozone (O3), and Nitrogen
Dioxide (NO2)

    Open path analyzers may be used to measure SO2,
O3, and NO2 at SLAMS/NAMS sites for middle,
neighborhood, urban, and regional scale measurement applications.
Additional information on SO2, NO2, and
O3 monitor siting criteria may be found in references 11 and
13.
    2.1 Horizontal and Vertical Placement. The probe or at least 80
percent of the monitoring path must be located between 3 and 15 meters
above ground level. The probe or at least 90 percent of the monitoring
path must be at least 1 meter vertically or horizontally away from any
supporting structure, walls, parapets, penthouses, etc., and away from
dusty or dirty areas. If the probe or a significant portion of the
monitoring path is located near the side of a building, then it should
be located on the windward side of the building relative to the
prevailing wind direction during the season of highest concentration
potential for the pollutant being measured.
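    Expressed numerically, the section 2.1 limits for a point probe are a
height of 3 to 15 meters above ground and at least 1 meter of clearance
from any supporting structure. The brief Python sketch below checks those
two limits; the function name and arguments are assumptions made for
illustration, and the open path variant (80 or 90 percent of the
monitoring path) is omitted for brevity.

    def probe_placement_ok(height_m, clearance_from_structure_m):
        """True if a point probe meets the section 2.1 criteria:
        3-15 m above ground and at least 1 m from any structure."""
        return (3.0 <= height_m <= 15.0
                and clearance_from_structure_m >= 1.0)

    print(probe_placement_ok(4.0, 1.5))   # True
    print(probe_placement_ok(2.5, 1.5))   # False: probe is too low
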
    2.2 Spacing from Minor Sources (Applicable to SO2 and
O3 Monitoring Only). Local minor sources of SO2
can cause inappropriately high concentrations of SO2 in the
vicinity of probes and monitoring paths for SO2. Similarly,
local sources of nitric oxide (NO) and ozone-reactive hydrocarbons can
have a scavenging effect causing unrepresentatively low concentrations
of O3 in the vicinity of probes and monitoring paths for
O3. To minimize these potential interferences, the probe or
at least 90 percent of the monitoring path must be away from furnace or
incineration flues or other minor sources of SO2 or NO,
particularly for open path analyzers because of their potential for
greater exposure over the area covered by the monitoring path. The
separation distance should take into account the heights of the flues,
type of waste or fuel burned, and the sulfur content of the fuel. It is
acceptable, however, to monitor for SO2 near a point source
of SO2 when the objective is to assess the effect of this
source on the represented population.
    2.3 Spacing From Obstructions. Buildings and other obstacles may
possibly scavenge SO2, O3, or NO2. To
avoid this interference, the probe or at least 90 percent of the
monitoring path must have unrestricted airflow and be located away from
obstacles so that the distance from the probe or monitoring path is at
least twice the height that the obstacle protrudes above the probe or
monitoring path. Generally, a probe or monitoring path located near or
along a vertical wall is undesirable because air moving along the wall
may be subject to possible removal mechanisms. A probe must have
unrestricted airflow in an arc of at least 270 degrees around the inlet
probe, or 180 degrees if the probe is on the side of a building. This
arc must include the predominant wind direction for the season of
greatest pollutant concentration potential. A sampling station having a
probe located closer to an obstacle than this criterion allows should be
classified as middle scale rather than neighborhood or urban scale,
since the measurements from such a station would more closely represent
the middle scale. A monitoring path must be clear of all trees, brush,
buildings, plumes, dust, or other optical obstructions, including
potential obstructions that may

[[Page 275]]

move due to wind, human activity, growth of vegetation, etc. Temporary
optical obstructions, such as rain, particles, fog, or snow, should be
considered when siting an open path analyzer. Any of these temporary
obstructions that are of sufficient density to obscure the light beam
will affect the ability of the open path analyzer to continuously
measure pollutant concentrations.
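    The obstacle-spacing rule above is a twice-the-protrusion test. The
minimal Python sketch below applies it for a point probe; the function
name and arguments are illustrative assumptions, and the 270 degree
unrestricted-airflow requirement is not checked here.

    def obstacle_clearance_ok(distance_m, obstacle_height_m, probe_height_m):
        """True if the horizontal distance to an obstacle is at least
        twice the height the obstacle protrudes above the probe."""
        protrusion = obstacle_height_m - probe_height_m
        if protrusion <= 0:
            return True  # obstacle does not rise above the probe
        return distance_m >= 2.0 * protrusion

    # An obstacle 6 m tall near a 4 m probe protrudes 2 m, so 5 m of
    # separation is sufficient (the minimum is 4 m).
    print(obstacle_clearance_ok(5.0, 6.0, 4.0))  # True
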
    Special consideration must be devoted to the use of open path
analyzers due to their inherent potential sensitivity to certain types
of interferences, or optical obstructions. While some of these potential
interferences are comparable to those to which point monitors are
subject, there are additional sources of potential interferences which
are altogether different in character. Transient, but significant
obscuration of especially longer measurement paths could be expected to
occur as a result of certain prevailing meteorological conditions (e.g.,
heavy fog, rain, snow) and/or aerosol levels that are of a sufficient
density to prevent the open path analyzer's light transmission. If
certain compensating measures are not otherwise implemented at the onset
of monitoring (e.g., shorter path lengths, higher light source
intensity), data recovery during periods of greatest primary pollutant
potential could be compromised. For instance, if heavy fog or high
particulate levels are coincident with periods of projected NAAQS-
threatening pollutant potential, the representativeness of the resulting
data record in reflecting maximum pollutant concentrations may be
substantially impaired despite the fact that the site may otherwise
exhibit an acceptable, even exceedingly high overall valid data capture
rate.
    In seeking EPA approval for inclusion of a site using an open path
analyzer into the formal SLAMS/NAMS or PSD network, monitoring agencies
must submit an analysis which evaluates both obscuration potential for a
proposed path length for the subject area and the effect this potential
is projected to have on the representativeness of the data record. This
analysis should include one or more of the following elements, as
appropriate for the specific circumstance: climatological information,
historical pollutant and aerosol information, modeling analysis results,
and any related special study results.
    2.4 Spacing From Trees. Trees can provide surfaces for
SO2, O3, or NO2 adsorption or reactions
and obstruct wind flow. To reduce this possible interference, the probe
or at least 90 percent of the monitoring path should be 20 meters or
more from the drip line of trees. If a tree or trees could be considered
an obstacle, the probe or 90 percent of the monitoring path must meet
the distance requirements of section 2.3 and be at least 10 meters from
the drip line of the tree or trees. Since the scavenging effect of trees
is greater for O3 than for other criteria pollutants, strong
consideration of this effect must be given to locating an O3
probe or monitoring path to avoid this problem.
    2.5 Spacing From Roadways (Applicable to O3 and
NO2 Only). In siting an O3 analyzer, it is
important to minimize destructive interferences from sources of NO,
since NO readily reacts with O3. In siting NO2
analyzers for neighborhood and urban scale monitoring, it is important
to minimize interferences from automotive sources. Table 1 provides the
required minimum separation distances between a roadway and a probe and
between a roadway and at least 90 percent of a monitoring path for
various ranges of daily roadway traffic. A sampling station having a
point analyzer probe located closer to a roadway than allowed by the
table 1 requirements should be classified as middle scale rather than
neighborhood or urban scale, since the measurements from such a station
would more closely represent the middle scale. If an open path analyzer
is used at a site, the monitoring path(s) must not cross over a roadway
with an average daily traffic count of 10,000 vehicles per day or more.
For those situations where a monitoring path crosses a roadway with
fewer than 10,000 vehicles per day, one must consider the entire segment
of the monitoring path in the area of potential atmospheric interference
from automobile emissions. Therefore, this calculation must include the
length of the monitoring path over the roadway plus any segments of the
monitoring path that lie in the area between the roadway and the minimum
separation distance, as determined from table 1. The sum of these
distances must not be greater than 10 percent of the total monitoring
path length.

   Table 1--Minimum Separation Distance Between Roadways and Probes or
Monitoring Paths for Monitoring Neighborhood--and Urban--Scale Ozone and
                            Nitrogen Dioxide
------------------------------------------------------------------------
                                                      Minimum separation
  Roadway average daily traffic, vehicles per day    distance,\1\ meters
------------------------------------------------------------------------
≤10,000............................................               10
15,000.............................................               20
20,000.............................................               30
40,000.............................................               50
70,000.............................................              100
≥110,000...........................................              250
------------------------------------------------------------------------
\1\ Distance from the edge of the nearest traffic lane. The distance for
  intermediate traffic counts should be interpolated from the table
  values based on the actual traffic count.
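
    Footnote 1 of table 1 calls for interpolating the minimum separation
distance for intermediate traffic counts, and the preceding paragraph
limits roadway-affected segments of an open monitoring path to 10 percent
of the path length. A minimal Python sketch of both calculations follows;
the function names are illustrative assumptions and the breakpoints are
the table 1 values.

    TABLE_1 = [(10_000, 10), (15_000, 20), (20_000, 30),
               (40_000, 50), (70_000, 100), (110_000, 250)]  # (ADT, m)

    def min_separation_o3_no2(adt):
        """Minimum probe-to-roadway separation in meters, linearly
        interpolated from table 1 for intermediate traffic counts."""
        if adt <= TABLE_1[0][0]:
            return float(TABLE_1[0][1])
        if adt >= TABLE_1[-1][0]:
            return float(TABLE_1[-1][1])
        for (a_lo, d_lo), (a_hi, d_hi) in zip(TABLE_1, TABLE_1[1:]):
            if a_lo <= adt <= a_hi:
                return d_lo + (adt - a_lo) / (a_hi - a_lo) * (d_hi - d_lo)

    def path_crossing_ok(affected_segment_m, total_path_m):
        """True if roadway-affected path segments stay within 10 percent
        of the total monitoring path length."""
        return affected_segment_m <= 0.10 * total_path_m

    print(min_separation_o3_no2(30_000))  # 40.0 m (midway 20,000-40,000)
    print(path_crossing_ok(45.0, 500.0))  # True: 45 m is 9 percent of 500 m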

    2.6 Cumulative Interferences on a Monitoring Path. The cumulative
length or portion of a monitoring path that is affected by minor
sources, obstructions, trees, or roadways must not exceed 10 percent of
the total monitoring path length.
    2.7 Maximum Monitoring Path Length. The monitoring path length must
not exceed 1

[[Page 276]]

kilometer for analyzers in neighborhood, urban, or regional scale. For
middle scale monitoring sites, the monitoring path length must not
exceed 300 meters. In areas subject to frequent periods of dust, fog,
rain, or snow, consideration should be given to a shortened monitoring
path length to minimize loss of monitoring data due to these temporary
optical obstructions. For certain ambient air monitoring scenarios using
open path analyzers, shorter path lengths may be needed in order to
ensure that the monitoring station meets the objectives and spatial
scales defined for SLAMS in appendix D. Therefore, the Regional
Administrator or the Regional Administrator's designee may require
shorter path lengths, as needed on an individual basis, to ensure that
the SLAMS meet the appendix D requirements. Likewise, the Administrator
or the Administrator's designee may specify the maximum path length used
at monitoring stations designated as NAMS or PAMS as needed on an
individual basis.
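
    The path-length ceilings in this section are 1 kilometer for
neighborhood, urban, or regional scale and 300 meters for middle scale.
A minimal Python sketch of that limit follows; the dictionary and
function names are assumptions made for illustration.

    MAX_PATH_M = {"middle": 300, "neighborhood": 1000,
                  "urban": 1000, "regional": 1000}

    def path_length_ok(scale, path_length_m):
        """True if an open path analyzer's path length is within the
        section 2.7 limit for its spatial scale."""
        return path_length_m <= MAX_PATH_M[scale]

    print(path_length_ok("middle", 250))         # True
    print(path_length_ok("neighborhood", 1200))  # False: over 1 kilometer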

3. [Reserved]

4. Carbon Monoxide (CO)

    Open path analyzers may be used to measure CO at SLAMS/NAMS sites
for middle or neighborhood scale measurement applications. Additional
information on CO monitor siting criteria may be found in reference 12.
    4.1 Horizontal and Vertical Placement. Because of the importance of
measuring population exposure to CO concentrations, air should be
sampled at average breathing heights. However, practical factors require
that the inlet probe be higher. The required height of the inlet probe
for CO monitoring is therefore 3 ± 1/2 meters for a
microscale site, which is a compromise between representative breathing
height and prevention of vandalism. The recommended 1 meter range of
heights is also a compromise to some extent. For consistency and
comparability, it would be desirable to have all inlets at exactly the
same height, but practical considerations often prevent this. Some
reasonable range must be specified and 1 meter provides adequate leeway
to meet most requirements.
    For the middle and neighborhood scale stations, the vertical
concentration gradients are not as great as for the microscale station.
This is because the diffusion from roads is greater and the
concentrations would represent larger areas than for the microscale.
Therefore, the probe or at least 80 percent of the monitoring path must
be located between 3 and 15 meters above ground level for middle and
neighborhood scale stations. The probe or at least 90 percent of the
monitoring path must be at least 1 meter vertically or horizontally away
from any supporting structure, walls, parapets, penthouses, etc., and
away from dusty or dirty areas. If the probe or a significant portion of
the monitoring path is located near the side of a building, then it
should be located on the windward side of the building relative to both
the prevailing wind direction during the season of highest concentration
potential and the location of sources of interest, i.e., roadways.
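    The height criteria in this section amount to a band of 2.5 to 3.5
meters (3 ± 1/2 m) for a microscale CO inlet and 3 to 15 meters for
middle and neighborhood scale probes. The minimal Python sketch below
expresses those bands; the function name and scale labels are
illustrative assumptions.

    def co_inlet_height_ok(scale, height_m):
        """True if a CO inlet probe height meets the section 4.1 bands:
        3 +/- 1/2 m for microscale, 3-15 m for middle/neighborhood."""
        if scale == "micro":
            return 2.5 <= height_m <= 3.5
        if scale in ("middle", "neighborhood"):
            return 3.0 <= height_m <= 15.0
        raise ValueError("CO siting is described here for micro, middle, "
                         "and neighborhood scales only")

    print(co_inlet_height_ok("micro", 3.2))          # True
    print(co_inlet_height_ok("neighborhood", 18.0))  # False: above 15 m
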
    4.2 Spacing From Obstructions. Buildings and other obstacles may
restrict airflow around a probe or monitoring path. To avoid this
interference, the probe or at least 90 percent of the monitoring path
must have unrestricted airflow and be located away from obstacles so
that the distance from the probe or monitoring path is at least twice
the height that the obstacle protrudes above the probe or monitoring
path. A probe or monitoring path located near or along a vertical wall
is undesirable because air moving along the wall may be subject to
possible removal mechanisms. A probe must have unrestricted airflow in
an arc of at least 270 degrees around the inlet probe, or 180 degrees if
the probe is on the side of a building. This arc must include the
predominant wind direction for the season of greatest pollutant
concentration potential. A monitoring path must be clear of all trees,
brush, buildings, plumes, dust, or other optical obstructions, including
potential obstructions that may move due to wind, human activity, growth
of vegetation, etc. Temporary optical obstructions, such as rain,
particles, fog, or snow, should be considered when siting an open path
analyzer. Any of these temporary obstructions that are of sufficient
density to obscure the light beam will affect the ability of the open
path analyzer to continuously measure pollutant concentrations.
    Special consideration must be devoted to the use of open path
analyzers due to their inherent potential sensitivity to certain types
of interferences, or optical obstructions. While some of these potential
interferences are comparable to those to which point monitors are
subject, there are additional sources of potential interferences which
are altogether different in character. Transient, but significant
obscuration of especially longer measurement paths could be expected to
occur as a result of certain prevailing meteorological conditions (e.g.,
heavy fog, rain, snow) and/or aerosol levels that are of a sufficient
density to prevent the open path analyzer's light transmission. If
certain compensating measures are not otherwise implemented at the onset
of monitoring (e.g., shorter path lengths, higher light source
intensity), data recovery during periods of greatest primary pollutant
potential could be compromised. For instance, if heavy fog or high
particulate levels are coincident with periods of projected NAAQS-
threatening pollutant potential, the representativeness of the resulting
data record in reflecting maximum pollutant concentrations may be
substantially impaired despite the fact that the site may otherwise
exhibit

[[Page 277]]

an acceptable, even exceedingly high overall valid data capture rate.
    In seeking EPA approval for inclusion of a site using an open path
analyzer into the formal SLAMS/NAMS or PSD network, monitoring agencies
must submit an analysis which evaluates both obscuration potential for a
proposed path length for the subject area and the effect this potential
is projected to have on the representativeness of the data record. This
analysis should include one or more of the following elements, as
appropriate for the specific circumstance: climatological information,
historical pollutant and aerosol information, modeling analysis results,
and any related special study results.
    4.3 Spacing From Roadways. Street canyon and traffic corridor
stations (microscale) are intended to provide a measurement of the
influence of the immediate source on the pollution exposure of the
population. In order to provide some reasonable consistency and
comparability in the air quality data from microscale stations, a
minimum distance of 2 meters and a maximum distance of 10 meters from
the edge of the nearest traffic lane must be maintained for these CO
monitoring inlet probes. This should give consistency to the data, yet
still allow flexibility of finding suitable locations.
    Street canyon/corridor (microscale) inlet probes must be located at
least 10 meters from an intersection and preferably at a midblock
location. Midblock locations are preferable to intersection locations
because intersections represent a much smaller portion of downtown space
than do the streets between them. Pedestrian exposure is probably also
greater in street canyon/corridors than at intersections. Also, the
practical difficulty of positioning sampling inlets is less at midblock
locations than at the intersection. However, the final siting of the
monitor must meet the objectives and intent of appendix D, sections 2.4,
3, 3.3, and appendix E, section 4.
    In determining the minimum separation between a neighborhood scale
monitoring station and a specific line source, the presumption is made
that measurements should not be substantially influenced by any one
roadway. Computations were made to determine the separation distance,
and table 2 provides the required minimum separation distance between
roadways and a probe or 90 percent of a monitoring path. Probes or
monitoring paths that are located closer to roads than this criterion
allows should not be classified as a neighborhood scale, since the
measurements from such a station would closely represent the middle
scale. Therefore, stations not meeting this criterion should be
classified as middle scale.

   Table 2--Minimum Separation Distance Between Roadways and Probes or
   Monitoring Paths for Monitoring Neighborhood Scale Carbon Monoxide
------------------------------------------------------------------------
                                                               Minimum
                                                             separation
                                                            distance \1\
                                                             for probes
      Roadway average daily traffic, vehicles per day        or 90% of a
                                                             monitoring
                                                                path
                                                              (meters)
------------------------------------------------------------------------
  ≤10,000..................................................            10
  15,000...................................................            25
  20,000...................................................            45
  30,000...................................................            80
  40,000...................................................           115
  50,000...................................................           135
  ≥60,000..................................................           150
------------------------------------------------------------------------
\1\ Distance from the edge of the nearest traffic lane. The distance for
  intermediate traffic counts should be interpolated from the table
  values based on the actual traffic count.
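
    As with table 1, footnote 1 here calls for interpolating intermediate
traffic counts, and the microscale rules above fix a 2 to 10 meter band
from the nearest traffic lane and at least 10 meters from an
intersection. The minimal Python sketch below expresses both; the
function names are illustrative assumptions and the breakpoints are the
table 2 values.

    TABLE_2_CO = [(10_000, 10), (15_000, 25), (20_000, 45), (30_000, 80),
                  (40_000, 115), (50_000, 135), (60_000, 150)]  # (ADT, m)

    def microscale_co_siting_ok(dist_to_lane_m, dist_to_intersection_m):
        """True if a street canyon/corridor CO probe is 2-10 m from the
        nearest traffic lane and at least 10 m from an intersection."""
        return (2.0 <= dist_to_lane_m <= 10.0
                and dist_to_intersection_m >= 10.0)

    def min_separation_co(adt):
        """Neighborhood scale minimum separation in meters, interpolated
        from table 2 for intermediate traffic counts."""
        if adt <= TABLE_2_CO[0][0]:
            return float(TABLE_2_CO[0][1])
        if adt >= TABLE_2_CO[-1][0]:
            return float(TABLE_2_CO[-1][1])
        for (a_lo, d_lo), (a_hi, d_hi) in zip(TABLE_2_CO, TABLE_2_CO[1:]):
            if a_lo <= adt <= a_hi:
                return d_lo + (adt - a_lo) / (a_hi - a_lo) * (d_hi - d_lo)

    print(microscale_co_siting_ok(4.0, 25.0))  # True
    print(min_separation_co(25_000))           # 62.5 m (midway 45-80 m)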

    4.4 Spacing From Trees and Other Considerations. Since CO is
relatively nonreactive, the major factor concerning trees is as
obstructions to normal wind flow patterns. For middle and neighborhood
scale stations, trees should not be located between the major sources of
CO, usually vehicles on a heavily traveled road, and the monitor. The
probe or at least 90 percent of the monitoring path must be 10 meters or
more from the drip line of trees which are between the probe or the
monitoring path and the road and which extend at least 5 meters above
the probe or monitoring path. For microscale stations, no trees or
shrubs should be located between the probe and the roadway.
    4.5 Cumulative Interferences on a Monitoring Path. The cumulative
length or portion of a monitoring path that is affected by obstructions,
trees, or roadways must not exceed 10 percent of the total monitoring
path length.
    4.6 Maximum Monitoring Path Length. The monitoring path length must
not exceed 1 kilometer for analyzers used for neighborhood scale
monitoring applications, or 300 meters for middle scale monitoring
applications. In areas subject to frequent periods of dust, fog, rain,
or snow, consideration should be given to a shortened monitoring path
length to minimize loss of monitoring data due to these temporary
optical obstructions. For certain ambient air monitoring scenarios using
open path analyzers, shorter path lengths may be needed in order to
ensure that the monitoring station meets the objectives and spatial
scales defined for SLAMS in appendix D. Therefore, the Regional
Administrator or the Regional Administrator's designee may require
shorter path lengths, as needed on an individual basis, to ensure that
the SLAMS meet the appendix D requirements. Likewise, the Administrator
or the Administrator's designee may specify the maximum path length used

[[Page 278]]

at monitoring stations designated as NAMS or PAMS as needed on an
individual basis.

5.-6. [Reserved]

7. Lead (Pb)

    7.1  Vertical Placement. Optimal placement of the sampler inlet for
Pb monitoring should be at breathing height level. However, practical
factors such as prevention of vandalism, security, and safety
precautions must also be considered when siting a Pb monitor. Given
these considerations, the sampler inlet for microscale Pb monitors must
be 2-7 meters above ground level. The lower limit was based on a
compromise between ease of servicing the sampler and the desire to avoid
unrepresentative conditions due to re-entrainment from dusty surfaces.
The upper limit represents a compromise between the desire to have
measurements which are most representative of population exposures and a
consideration of the practical factors noted above.
    For middle or larger spatial scales, increased diffusion results in
vertical concentration gradients which are not as great as for the small
scales. Thus, the required height of the air intake for middle or larger
scales is 2-15 meters.
    7.2 Spacing From Obstructions. The sampler must be located away from
obstacles such as buildings, so that the distance between obstacles and
the sampler is at least twice the height that the obstacle protrudes
above the sampler.
    A minimum of 2 meters of separation from walls, parapets, and
penthouses is required for rooftop samplers. No furnace or incinerator
flues should be nearby. The height and type of flues and the type,
quality, and quantity of waste or fuel burned determine the separation
distances. For example, if the emissions from the chimney have high lead
content and there is a high probability that the plume would impact on
the sampler during most of the sampling period, then other buildings/
locations in the area that are free from the described sources should be
chosen for the monitoring site.
    There must be unrestricted airflow in an arc of at least 270 deg.
around the sampler. Since the intent of the category (a) site is to
measure the maximum concentrations from a road or point source, there
must be no significant obstruction between a road or point source and
the monitor, even though other spacing from obstruction criteria are
met. The predominant direction for the season with the greatest
pollutant concentration potential must be included in the 270 deg. arc.
    7.3 Spacing from Roadways. This criterion applies only to those Pb
sites designed to assess lead concentrations from mobile sources.
Numerous studies have shown that ambient Pb levels near mobile sources
are a function of the traffic volume and are most pronounced at ADT
>30,000 within the first 15 meters on the downwind side of the
roadways. (1, 16-19) Therefore, stations to measure the peak
concentration from mobile sources should be located at the distance most
likely to produce the highest concentrations. For the microscale
station, the location must be between 5 and 15 meters from the major
roadway. For the middle scale station, a range of acceptable distances
from the major roadway is shown in table 4. This table also includes
separation distances between a roadway and neighborhood or larger scale
stations. These distances are based upon the data of reference 16 which
illustrates that lead levels remain fairly constant after certain
horizontal distances from the roadway. As depicted in the above
reference, this distance is a function of the traffic volume.

 Table 3--Separation Distance Between Pb Stations and Roadways (Edge of
                          Nearest Traffic Lane)
------------------------------------------------------------------------
                                    Separation distance between roadways
                                            and stations, meters
                                   -------------------------------------
   Roadway average daily traffic                            Neighborhood
         vehicles per day                         Middle        urban
                                    Microscale     scale      regional
                                                                scale
------------------------------------------------------------------------
10,000............................       5-15   \1\>15-50        \1\>50
20,000............................       5-15   >15-75              >75
40,000............................       5-15   >15-100            >100
------------------------------------------------------------------------
\1\ Distances should be interpolated based on traffic flow.

    7.4. Spacing from trees and other considerations. Trees can provide
surfaces for deposition or adsorption of Pb particles and obstruct
normal wind flow patterns. For microscale and middle scale category (a)
sites there must not be any tree(s) between the source of the Pb and the
sampler. For neighborhood scale category (b) sites, the sampler should
be at least 20 meters from the drip line of trees. The sampler must,
however, be placed at least 10 meters from the drip line of trees which
could be classified as an obstruction, i.e., the distance between the
tree(s) and the sampler is less than the height that the tree protrudes
above the sampler.

8. Particulate Matter (PM10 and PM2.5)

    8.1 Vertical Placement. Although there are limited studies on the
PM10 concentration gradients around roadways or other ground
level sources, References 1, 2, 4, 18 and 19 of this appendix show a
distinct variation in the distribution of TSP and Pb levels near
roadways. TSP, which is greatly affected by gravity, has large
concentration gradients, both horizontal and vertical, immediately

[[Page 279]]

adjacent to roads. Lead, being predominately sub-micron in size, behaves
more like a gas and exhibits smaller vertical and horizontal gradients
than TSP. PM10, being intermediate in size between these two
extremes, exhibits dispersion properties of both gas and settleable
particulates and does show vertical and horizontal gradients. (30)
Similar to monitoring for other pollutants, optimal placement of the
sampler inlet for PM10 monitoring should be at breathing
height level. However, practical factors such as prevention of
vandalism, security, and safety precautions must also be considered when
siting a PM10 monitor. Given these considerations, the
sampler inlet for microscale PM10 monitors must be 2-7 meters
above ground level. The lower limit was based on a compromise between
ease of servicing the sampler and the desire to avoid re-entrainment
from dusty surfaces. The upper limit represents a compromise between the
desire to have measurements which are most representative of population
exposures and a consideration of the practical factors noted above.
Although microscale or middle scale stations are not the preferred
spatial scale for PM2.5 sites, there are situations where
such sites are representative of several locations within an area where
large segments of the population may live or work (e.g., central
business district of a metropolitan area). In these cases, the sampler
inlet for such microscale PM2.5 stations must also be 2-7
meters above ground level.
    For middle or larger spatial scales, increased diffusion results in
vertical concentration gradients that are not as great as for the
microscale. Thus, the required height of the air intake for middle or
larger scales is 2-15 meters.
    8.2 Spacing From Obstructions. If the sampler is located on a roof
or other structure, then there must be a minimum of 2 meters separation
from walls, parapets, penthouses, etc. No furnace or incineration flues
should be nearby. This separation distance from flues is dependent on
the height of the flues, type of waste or fuel burned, and quality of
the fuel (ash content). In the case of emissions from a chimney
resulting from natural gas combustion, as a precautionary measure, the
sampler should be placed at least 5 meters from the chimney.
    On the other hand, if fuel oil, coal, or solid waste is burned and
the stack is sufficiently short so that the plume could reasonably be
expected to impact on the sampler intake a significant part of the time,
other buildings/locations in the area that are free from these types of
sources should be considered for sampling. Trees provide surfaces for
particulate deposition and also restrict airflow. Therefore, the
sampler should be placed at least 20 meters from the dripline and must
be at least 10 meters from the dripline when the tree(s) acts as an
obstruction.
    The sampler must also be located away from obstacles such as
buildings, so that the distance between obstacles and the sampler is at
least twice the height that the obstacle protrudes above the sampler
except for street canyon sites. Sampling stations that are located
closer to obstacles than this criterion allows should not be classified
as neighborhood, urban, or regional scale, since the measurements from
such a station would closely represent middle scale stations. Therefore,
stations not meeting the criterion should be classified as middle scale.
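
    The spacing criterion in the preceding paragraph can be checked with
a short sketch (Python; the function name and the example numbers are
illustrative assumptions only):

def obstacle_spacing_ok(distance_m, obstacle_height_m, sampler_height_m):
    """Obstacle must be at least twice as far from the sampler as the
    height it protrudes above the sampler."""
    protrusion = obstacle_height_m - sampler_height_m
    if protrusion <= 0:        # obstacle does not rise above the sampler
        return True
    return distance_m >= 2 * protrusion

# A 12 m building located 15 m from a 5 m sampler inlet protrudes 7 m,
# so at least 14 m of separation is required; 15 m meets the criterion.
print(obstacle_spacing_ok(15, 12, 5))   # True
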
    There must be unrestricted airflow in an arc of at least 270 deg.
around the sampler except for street canyon sites. Since the intent of
the category (a) site is to measure the maximum concentrations from a
road or point source, there must be no significant obstruction between a
road or point source and the monitor, even though other spacing from
obstruction criteria are met. The predominant direction for the season
with the greatest pollutant concentration potential must be included in
the 270 deg. arc.
    8.3 Spacing From Roads. Since emissions associated with the
operation of motor vehicles contribute to urban area particulate matter
ambient levels, spacing from roadway criteria are necessary for ensuring
national consistency in PM sampler siting.
    The intent is to locate category (a) NAMS sites in areas of highest
concentrations whether it be from mobile or multiple stationary sources.
If the area is primarily affected by mobile sources and the maximum
concentration area(s) is judged to be a traffic corridor or street
canyon location, then the monitors should be located near roadways with
the highest traffic volume and at separation distances most likely to
produce the highest concentrations. For the microscale traffic corridor
station, the location must be between 5 and 15 meters from the major
roadway. For the microscale street canyon site the location must be
between 2 and 10 meters from the roadway. For the middle scale station,
a range of acceptable distances from the roadway is shown in Figure 2.
This figure also includes separation distances between a roadway and
neighborhood or larger scale stations by default. Any station, 2 to 15
meters high, and further back than the middle scale requirements will
generally be neighborhood, urban or regional scale. For example,
according to Figure 2, if a PM sampler is primarily influenced by
roadway emissions and that sampler is set back 10 meters from a 30,000
ADT road, the station should be classified as microscale if the
sampler height is between 2 and 7 meters. If the sampler height is
between 7 and 15 meters, the station should be classified as middle
scale. If the sampler is 20 meters from the same road, it will be
classified as middle scale; if 40 meters, neighborhood scale; and if
110 meters, urban scale.
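
    The worked example above can also be written out as a sketch
(Python). Figure 2 itself is not reproduced in this text, so the
distance breakpoints below are hypothetical placeholders chosen only to
reproduce the quoted 30,000 ADT example, not the actual curves of
Figure 2:

def scale_for_30000_adt(setback_m, inlet_height_m):
    """Spatial scale for a sampler near a 30,000 ADT road (example only)."""
    if setback_m <= 15:
        return "micro" if 2 <= inlet_height_m <= 7 else "middle"
    if setback_m <= 30:        # placeholder boundary, not from Figure 2
        return "middle"
    if setback_m <= 100:       # placeholder boundary, not from Figure 2
        return "neighborhood"
    return "urban"

print(scale_for_30000_adt(10, 5))    # micro
print(scale_for_30000_adt(10, 10))   # middle (sampler 7-15 meters high)
print(scale_for_30000_adt(40, 5))    # neighborhood
print(scale_for_30000_adt(110, 5))   # urban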

[[Page 280]]

[GRAPHIC] [TIFF OMITTED] TC01JY92.003

    It is important to note that the separation distances shown in
Figure 2 are measured from the edge of the nearest traffic lane of the
roadway presumed to have the most influence on the site. In general,
this presumption is an oversimplification of the usual urban settings
which normally have several streets that impact a given site. The
effects

[[Page 281]]

of surrounding streets, wind speed, wind direction and topography should
be considered along with Figure 2 before a final decision is made on the
most appropriate spatial scale assigned to the sampling station.
    8.4 Other Considerations. For those areas that are primarily
influenced by stationary source emissions as opposed to roadway
emissions, guidance on locating monitoring sites may be found in the
guideline document Optimum Network Design and Site Exposure Criteria for
Particulate Matter. (29)
    Stations should not be located in an unpaved area unless there is
vegetative ground cover year round, so that the impact of windblown
dust will be kept to a minimum.

9. Probe Material and Pollutant Sample Residence Time

    For the reactive gases, SO2, NO2, and
O3, special probe material must be used for point analyzers.
Studies (20-24) have been conducted to determine the
suitability of materials such as polypropylene, polyethylene, polyvinyl
chloride, Tygon, aluminum, brass, stainless steel, copper, Pyrex glass
and Teflon for use as intake sampling lines. Of the above materials,
only Pyrex glass and Teflon have been found to be acceptable for use as
intake sampling lines for all the reactive gaseous pollutants.
Furthermore, the EPA (25) has specified borosilicate glass or
FEP Teflon as the only acceptable probe materials for delivering test
atmospheres in the determination of reference or equivalent methods.
Therefore, borosilicate glass, FEP Teflon, or their equivalent must be
used for existing and new NAMS or SLAMS.
---------------------------------------------------------------------------

    20-29 See References at end of this appendix.
---------------------------------------------------------------------------

    For VOC monitoring at those SLAMS designated as PAMS, FEP Teflon is
unacceptable as the probe material because of VOC adsorption and
desorption reactions on the FEP Teflon. Borosilicate glass, stainless
steel, or their equivalent are the acceptable probe materials for VOC and
carbonyl sampling. Care must be taken to ensure that the sample
residence time is 20 seconds or less.
    No matter how nonreactive the sampling probe material is initially,
after a period of use reactive particulate matter is deposited on the
probe walls. Therefore, the time it takes the gas to transfer from the
probe inlet to the sampling device is also critical. Ozone in the
presence of NO will show significant losses even in the most inert probe
material when the residence time exceeds 20 seconds. (26) Other
studies (27, 28) indicate that a 10-second or less residence
time is easily achievable. Therefore, sampling probes for reactive gas
monitors at SLAMS or NAMS must have a sample residence time less than 20
seconds.
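
    The residence time can be estimated from the probe geometry and
sampling flow rate, as in the following sketch (Python; the formula and
the example dimensions are illustrative assumptions, not taken from
this appendix):

import math

def residence_time_seconds(probe_length_m, inner_diameter_m, flow_lpm):
    """Residence time = internal probe volume / volumetric flow rate."""
    volume_m3 = math.pi * (inner_diameter_m / 2.0) ** 2 * probe_length_m
    flow_m3_per_s = flow_lpm / 1000.0 / 60.0    # liters/minute -> m^3/second
    return volume_m3 / flow_m3_per_s

# Example: a 5 m probe with a 6 mm inner diameter sampled at 1 L/min
# gives roughly 8.5 seconds, comfortably below the 20-second limit.
print(round(residence_time_seconds(5.0, 0.006, 1.0), 1))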

10. Photochemical Assessment Monitoring Stations (PAMS)

    10.1 Horizontal and Vertical Placement. The probe or at least 80
percent of the monitoring path must be located 3 to 15 meters above
ground level. This range provides a practical compromise for finding
suitable sites for the multipollutant PAMS. The probe or at least 90
percent of the monitoring path must be at least 1 meter vertically or
horizontally away from any supporting structure, walls, parapets,
penthouses, etc., and away from dusty or dirty areas.
    10.2 Spacing From Obstructions. The probe or at least 90 percent of
the monitoring path must be located away from obstacles and buildings
such that the distance between the obstacles and the probe or the
monitoring path is at least twice the height that the obstacle protrudes
above the probe or monitoring path. There must be unrestricted airflow
in an arc of at least 270 deg. around the probe inlet. Additionally, the
predominant wind direction for the period of greatest pollutant
concentration (as described for each site in section 4.2 of appendix D)
must be included in the 270 deg. arc. If the probe is located on the
side of a building, 180 deg. clearance is required. A monitoring path
must be clear of all trees, brush, buildings, plumes, dust, or other
optical obstructions, including potential obstructions that may move due
to wind, human activity, growth of vegetation, etc. Temporary optical
obstructions, such as rain, particles, fog, or snow, should be
considered when siting an open path analyzer. Any of these temporary
obstructions that are of sufficient density to obscure the light beam
will affect the ability of the open path analyzer to continuously
measure pollutant concentrations.
    Special consideration must be devoted to the use of open path
analyzers due to their inherent potential sensitivity to certain types
of interferences, or optical obstructions. While some of these potential
interferences are comparable to those to which point monitors are
subject, there are additional sources of potential interferences which
are altogether different in character. Transient, but significant
obscuration of especially longer measurement paths could be expected to
occur as a result of certain prevailing meteorological conditions (e.g.,
heavy fog, rain, snow) and/or aerosol levels that are of a sufficient
density to prevent the open path analyzer's light transmission. If
certain compensating measures are not otherwise implemented at the onset
of monitoring (e.g., shorter path lengths, higher light source
intensity), data recovery during periods of greatest primary pollutant
potential could be compromised. For instance, if heavy fog or high
particulate levels are coincident with periods of projected NAAQS-
threatening pollutant potential, the representativeness of the resulting
data record

[[Page 282]]

in reflecting maximum pollutant concentrations may be substantially
impaired despite the fact that the site may otherwise exhibit an
acceptable, even exceedingly high overall valid data capture rate.
    In seeking EPA approval for inclusion of a site using an open path
analyzer into the formal SLAMS/NAMS or PSD network, monitoring agencies
must submit an analysis which evaluates both obscuration potential for a
proposed path length for the subject area and the effect this potential
is projected to have on the representativeness of the data record. This
analysis should include one or more of the following elements, as
appropriate for the specific circumstance: climatological information,
historical pollutant and aerosol information, modeling analysis results,
and any related special study results.
    10.3 Spacing From Roadways. It is important in the probe and
monitoring path siting process to minimize destructive interferences
from sources of NO since NO readily reacts with O3. Table 4
below provides the required minimum separation distances between
roadways and PAMS (excluding upper air measuring stations):

         Table 4--Separation Distance Between PAMS and Roadways
                     [Edge of Nearest Traffic Lane]
------------------------------------------------------------------------
                                                               Minimum
                                                              separation
                                                               distance
                                                               between
      Roadway average daily traffic, vehicles per day          roadways
                                                                 and
                                                             stations in
                                                              meters \1\
------------------------------------------------------------------------
10,000.....................................................          >10
15,000.....................................................           20
20,000.....................................................           30
40,000.....................................................           50
70,000.....................................................          100
>110,000...................................................          250
------------------------------------------------------------------------
\1\ Distance from the edge of the nearest traffic lane. The distance for
  intermediate traffic counts should be interpolated from the table
  based on the actual traffic flow.

    10.4 Spacing From Trees. Trees can provide surfaces for adsorption
and/or reactions to occur and can obstruct normal wind flow patterns. To
minimize these effects at PAMS, the probe or at least 90 percent of the
monitoring path should be placed at least 20 meters from the drip line
of trees. Since the scavenging effect of trees is greater for
O3 than for the other criteria pollutants, strong
consideration of this effect must be given in locating the PAMS probe or
monitoring path to avoid this problem. Therefore, the probe or at least
90 percent of the monitoring path must be at least 10 meters from the
drip line of trees.

11. Waiver Provisions

    It is believed that most sampling probes or monitors can be located
so that they meet the requirements of this appendix. New stations, with
rare exceptions, can be located within the limits of this appendix.
However, some existing stations may not meet these requirements and yet
still produce useful data for some purposes. EPA will consider a written
request from the State Agency to waive one or more siting criteria for
some monitoring stations providing that the State can adequately
demonstrate the need (purpose) for monitoring or establishing a
monitoring station at that location. For establishing a new station, a
waiver may be granted only if both of the following criteria are met:
    The site can be demonstrated to be as representative of the
monitoring area as it would be if the siting criteria were being met.
    The monitor or probe cannot reasonably be located so as to meet the
siting criteria because of physical constraints (e.g., inability to
locate the required type of station the necessary distance from roadways
or obstructions).
    However, for an existing station, a waiver may be granted if either
of the above criteria is met.
    Cost benefits, historical trends, and other factors may be used to
add support to the above; however, they will not, in themselves, be
acceptable reasons for granting a waiver. Written requests for waivers
must be submitted to the Regional Administrator. For those SLAMS also
designated as NAMS or PAMS, the request will be forwarded to the
Administrator.

12. Summary

    Table 5 presents a summary of the general requirements for probe and
monitoring path siting criteria with respect to distances and heights.
It is apparent from table 5 that different elevation distances above the
ground are shown for the various pollutants. The discussion in the text
for each of the pollutants described reasons for elevating the monitor,
probe, or monitoring path. The differences in the specified range of
heights are based on the vertical concentration gradients. For CO, the
gradients in the vertical direction are very large for the microscale,
so a small range of heights has been used. The upper limit of 15 meters
was specified for consistency between pollutants and to allow the use of
a single manifold or monitoring path for monitoring more than one
pollutant.

[[Page 283]]

                                              Table 5--Summary of Probe and Monitoring Path Siting Criteria
--------------------------------------------------------------------------------------------------------------------------------------------------------
                                                                                         Horizontal and
                                                              Height from ground to    vertical distance     Distance from trees       Distance from
                                         Scale [maximum          probe or 80% of        from supporting       to probe or 90% of    roadways to probe or
             Pollutant                   monitoring path        monitoring path A    structures B to probe    monitoring path A      monitoring path A
                                         length, meters]            (meters)          or 90% of monitoring         (meters)               (meters)
                                                                                        path A  (meters)
--------------------------------------------------------------------------------------------------------------------------------------------------------
SO2 C,D,E,F........................  Middle [300m]           3-15..................  >1...................  >10..................  N/A.
                                      Neighborhood, Urban,
                                      and Regional [1km].
CO D,E,G...........................  Micro; Middle [300m]    3+/-0.5; 3-15.........  >1...................  >10..................  2-10; See table 2 for
                                      Neighborhood [1km].                                                                           middle and
                                                                                                                                    neighborhood scales.
O3 C,D,E...........................  Middle [300m]           3-15..................  >1...................  >10..................  See table 1 for all
                                      Neighborhood, Urban,                                                                          scales.
                                      and Regional [1km].
Ozone precursors (for PAMS) C,D,E..  Neighborhood and Urban  3-15..................  >1...................  >10..................  See table 4 for all
                                     [1 km]................                                                                         scales.
NO2 C,D,E..........................  Middle [300m]           3-15..................  >1...................  >10..................  See table 1 for all
                                      Neighborhood and                                                                              scales.
                                      Urban [1km].
Pb C,D,E,F,H.......................  Micro; Middle,          2-7 (Micro); 2-15 (All  >2 (All scales,        >10 (All scales).....  5-15 (Micro); See
                                      Neighborhood, Urban     other scales).          horizontal distance                           table 3 for all
                                      and Regional.                                   only).                                        other scales.
PM-10 C,D,E,F,H....................  Micro; Middle,          2-7 (Micro); 2-15 (All  >2 (All scales,        >10 (All scales).....  2-10 (Micro); See
                                      Neighborhood, Urban     other scales).          horizontal distance                           Figure 2 for all
                                      and Regional.                                   only).                                        other scales.
--------------------------------------------------------------------------------------------------------------------------------------------------------
N/A--Not applicable.
A Monitoring path for open path analyzers is applicable only to middle or neighborhood scale CO monitoring and all applicable scales for monitoring SO2,
  O3, O3 precursors, and NO2.
B When probe is located on a rooftop, this separation distance is in reference to walls, parapets, or penthouses located on roof.
C Should be >20 meters from the dripline of tree(s) and must be 10 meters from the dripline when the tree(s) act as an obstruction.
D Distance from sampler, probe, or 90% of monitoring path to obstacle, such as a building, must be at least twice the height the obstacle protrudes
  above the sampler, probe, or monitoring path. Sites not meeting this criterion may be classified as middle scale (see text).
E Must have unrestricted airflow 270 deg. around the probe or sampler; 180 deg. if the probe is on the side of a building.
F The probe, sampler, or monitoring path should be away from minor sources, such as furnace or incineration flues. The separation distance is dependent
  on the height of the minor source's emission point (such as a flue), the type of fuel or waste burned, and the quality of the fuel (sulfur, ash, or
  lead content). This criterion is designed to avoid undue influences from minor sources.
G For microscale CO monitoring sites, the probe must be >10 meters from a street intersection and preferably at a midblock location.
H For collocated Pb and PM-10 samplers, a 2-4 meter separation distance between collocated samplers must be met.

13. References

    1. Bryan, R.J., R.J. Gordon, and H. Menck. Comparison of High Volume
Air Filter Samples at Varying Distances from Los Angeles Freeway.
University of Southern California, School of Medicine, Los Angeles, CA.
(Presented at 66th Annual Meeting of Air Pollution Control Association.
Chicago, IL., June 24-28, 1973. APCA 73-158.)
    2. Teer, E.H. Atmospheric Lead Concentration Above an Urban Street.
Master of Science Thesis, Washington University, St. Louis, MO. January
1971.
    3. Bradway, R.M., F.A. Record, and W.E. Belanger. Monitoring and
Modeling of Resuspended Roadway Dust Near Urban Arterials. GCA
Technology Division, Bedford, MA. (Presented at 1978 Annual Meeting of
Transportation Research Board, Washington, DC. January 1978.)
    4. Pace, T.G., W.P. Freas, and E.M. Afify. Quantification of
Relationship Between Monitor Height and Measured Particulate Levels in
Seven U.S. Urban Areas. U.S. Environmental Protection Agency, Research
Triangle Park, NC. (Presented at 70th Annual Meeting of Air Pollution
Control Association, Toronto, Canada, June 20-24, 1977. APCA 77-13.4.)
    5. Harrison, P.R. Considerations for Siting Air Quality Monitors in
Urban Areas. City of Chicago, Department of Environmental Control,
Chicago, IL. (Presented at 66th Annual Meeting of Air Pollution Control
Association, Chicago, IL., June 24-28, 1973. APCA 73-161.)
    6. Study of Suspended Particulate Measurements at Varying Heights
Above Ground.

[[Page 284]]

Texas State Department of Health, Air Control Section, Austin, TX. 1970.
p.7.
    7. Rodes, C.E. and G.F. Evans. Summary of LACS Integrated Pollutant
Data. In: Los Angeles Catalyst Study Symposium. U.S. Environmental
Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-
600/4-77-034. June 1977.
    8. Lynn, D.A., et al. National Assessment of the Urban Particulate
Problem: Volume 1, National Assessment. GCA Technology Division,
Bedford, MA. U.S. Environmental Protection Agency, Research Triangle
Park, NC. EPA Publication No. EPA-450/3-75-024. June 1976.
    9. Pace, T.G. Impact of Vehicle-Related Particulates on TSP
Concentrations and Rationale for Siting Hi-Vols in the Vicinity of
Roadways. OAQPS, U.S. Environmental Protection Agency, Research Triangle
Park, NC. April 1978.
    10. Ludwig, F.L., J.H. Kealoha, and E. Shelar. Selecting Sites for
Monitoring Total Suspended Particulates. Stanford Research Institute,
Menlo Park, CA. Prepared for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-018. June
1977, revised December 1977.
    11. Ball, R.J. and G.E. Anderson. Optimum Site Exposure Criteria for
SO2 Monitoring. The Center for the Environment and Man, Inc.,
Hartford, CT. Prepared for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-013. April
1977.
    12. Ludwig, F.L. and J.H.S. Kealoha. Selecting Sites for Carbon
Monoxide Monitoring. Stanford Research Institute, Menlo Park, CA.
Prepared for U.S. Environmental Protection Agency, Research Triangle
Park, NC. EPA Publication No. EPA-450/3-75-077. September 1975.
    13. Ludwig, F.L. and E. Shelar. Site Selection for the Monitoring of
Photochemical Air Pollutants. Stanford Research Institute, Menlo Park,
CA. Prepared for U.S. Environmental Protection Agency, Research Triangle
Park, NC. EPA Publication No. EPA-450/3-78-013. April 1978.
    14. Lead Analysis for Kansas City and Cincinnati, PEDCo
Environmental, Inc., Cincinnati, OH. Prepared for U.S. Environmental
Protection Agency, Research Triangle Park, NC. EPA Contract No. 66-02-
2515, June 1977.
    15. Barltrap, D. and C. D. Strelow. Westway Nursery Testing Project.
Report to the Greater London Council. August 1976.
    16. Daines, R. H., H. Moto, and D. M. Chilko. Atmospheric Lead: Its
Relationship to Traffic Volume and Proximity to Highways. Environ. Sci.
and Technol., 4:318, 1970.
    17. Johnson, D. E., et al. Epidemiologic Study of the Effects of
Automobile Traffic on Blood Lead Levels, Southwest Research Institute,
Houston, TX. Prepared for U.S. Environmental Protection Agency, Research
Triangle Park, NC. EPA-600/1-78-055, August 1978.
    18. Air Quality Criteria for Lead. Office of Research and
Development, U.S. Environmental Protection Agency, Washington, DC EPA-
600/8-83-028 aF-dF, 1986, and supplements EPA-600/8-89/049F, August
1990. (NTIS document numbers PB87-142378 and PB91-138420.)
    19. Lyman, D. R. The Atmospheric Diffusion of Carbon Monoxide and
Lead from an Expressway, Ph.D. Dissertation, University of Cincinnati,
Cincinnati, OH. 1972.
    20. Wechter, S.G. Preparation of Stable Pollutant Gas Standards
Using Treated Aluminum Cylinders. ASTM STP. 598:40-54, 1976.
    21. Wohlers, H.C., H. Newstein and D. Daunis. Carbon Monoxide and
Sulfur Dioxide Adsorption On and Desorption From Glass, Plastic and
Metal Tubings. J. Air Poll. Con. Assoc. 17:753, 1976.
    22. Elfers, L.A. Field Operating Guide for Automated Air Monitoring
Equipment. U.S. NTIS. p. 202, 249, 1971.
    23. Hughes, E.E. Development of Standard Reference Material for Air
Quality Measurement. ISA Transactions, 14:281-291, 1975.
    24. Altshuller, A.D. and A.G. Wartburg. The Interaction of Ozone
with Plastic and Metallic Materials in a Dynamic Flow System. Intern.
Jour. Air and Water Poll., 4:70-78, 1961.
    25. CFR Title 40 part 53.22, July 1976.
    26. Butcher, S.S. and R.E. Ruff. Effect of Inlet Residence Time on
Analysis of Atmospheric Nitrogen Oxides and Ozone, Anal. Chem., 43:1890,
1971.
    27. Slowik, A.A. and E.B. Sansone. Diffusion Losses of Sulfur
Dioxide in Sampling Manifolds. J. Air. Poll. Con. Assoc., 24:245, 1974.
    28. Yamada, V.M. and R.J. Charlson. Proper Sizing of the Sampling
Inlet Line for a Continuous Air Monitoring Station. Environ. Sci. and
Technol., 3:483, 1969.
    29. Koch, R.C. and H.E. Rector. Optimum Network Design and Site
Exposure Criteria for Particulate Matter, GEOMET Technologies, Inc.,
Rockville, MD. Prepared for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Contract No. 68-02-3584. EPA 450/4-87-
009. May 1987.
    30. Burton, R.M. and J.C. Suggs. Philadelphia Roadway Study.
Environmental Monitoring Systems Laboratory, U.S. Environmental
Protection Agency, Research Triangle Park, N.C. EPA-600/4-84-070
September 1984.
    31. Technical Assistance Document For Sampling and Analysis of Ozone
Precursors. Atmospheric Research and Exposure Assessment Laboratory,
U.S. Environmental Protection Agency, Research Triangle Park, NC 27711.
EPA 600/8-91-215. October 1991.
    32. Quality Assurance Handbook for Air Pollution Measurement
Systems: Volume IV. Meteorological Measurements. Atmospheric

[[Page 285]]

Research and Exposure Assessment Laboratory, U.S. Environmental
Protection Agency, Research Triangle Park, NC 27711. EPA 600/4-90-0003.
August 1989.
    33. On-Site Meteorological Program Guidance for Regulatory Modeling
Applications. Office of Air Quality Planning and Standards, U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711. EPA
450/4-87-013. June 1987.

[44 FR 27571, May 10, 1979; 44 FR 72592, Dec. 14, 1979, as amended at 46
FR 44170, Sept. 3, 1981; 51 FR 9598, Mar. 19, 1986; 52 FR 24744--24748,
July 1, 1987; 52 FR 27286, July 20, 1987; 58 FR 8474, 8475, Feb. 12,
1993; 60 FR 52324, Oct. 6, 1995; 62 FR 38854, July 18, 1997; 64 FR 3036,
Jan. 20, 1999]

       Appendix F to Part 58--Annual SLAMS Air Quality Information

    1. General
    2. Required Information
    2.1 Sulfur Dioxide (SO2)
    2.1.1 Site and Monitoring Information
    2.1.2 Annual Summary Statistics
    2.2 Total Suspended Particulates (TSP)
    2.2.1 Site and Monitoring Information
    2.2.2 Annual Summary Statistics
    2.2.3 Episode and Other Unscheduled Sampling Data
    2.3 Carbon Monoxide (CO)
    2.3.1 Site and Monitoring Information
    2.3.2 Annual Summary Statistics
    2.4 Nitrogen Dioxide (NO2)
    2.4.1 Site and Monitoring Information
    2.4.2 Annual Summary Statistics
    2.5 Ozone (O3)
    2.5.1 Site and Monitoring Information
    2.5.2 Annual Summary Statistics
    2.6 Lead (Pb)
    2.6.1 Site and Monitoring Information
    2.6.2 Annual Summary Statistics
    2.7 Particulate Matter (PM10)
    2.7.1 Site and Monitoring Information
    2.7.2 Annual Summary Statistics
    2.7.3 Annual Summary Statistics
    2.7.4 Episode and Other Unscheduled Sampling Data

1. General

    This appendix describes information to be compiled and submitted
annually to EPA for each ambient monitoring station in the SLAMS Network
in accordance with Sec. 58.26. The annual summary statistics that are
described in section 2 below shall be construed as only the minimum
statistics needed by EPA to assess national air quality
status. They will be used by EPA to convey information to a variety of
interested parties including environmental groups, Federal agencies, the
Congress, and private citizens upon request. As the need arises, EPA may
issue modifications to these minimum requirements to reflect changes in
EPA policy concerning the National Ambient Air Quality Standards
(NAAQS).
    As indicated in Sec. 58.26(c), the contents of the SLAMS annual
report shall be certified by the senior air pollution control officer in
the State to be accurate to the best of his knowledge. In addition, the
manner in which the data were collected must be certified to have
conformed to the applicable quality assurance, air monitoring
methodology, and probe siting criteria given in appendices A, C, and E
to this part. A certified statement to this effect must be included with
the annual report. As required by Sec. 58.26(a), the report must be
submitted by July 1 of each year for data collected during the period
January 1 to December 31 of the previous year.
    EPA recognizes that most air pollution control agencies routinely
publish air quality statistical summaries and interpretive reports. EPA
encourages State and local agencies to continue publication of such
reports and recommends that they be expanded, where appropriate, to
include analysis of air quality trends, population exposure, and
pollutant distributions. At their discretion, State and local agencies
may wish to integrate the SLAMS report into routine agency publications.

2. Required Information

    This paragraph describes air quality monitoring information and
summary statistics which must be included in the SLAMS annual report.
The required information is itemized below by pollutant. Throughout this
appendix, the time of occurrence refers to the ending hour. For example,
the ending hour of an 8-hour CO average from 12:01 a.m. to 8:00 a.m.
would be 8:00 a.m.
    For the purposes of range assignments the following rounding
convention will be used. The air quality concentration should be rounded
to the number of significant digits used in specifying the concentration
intervals. The digit to the right of the last significant digit
determines the rounding process. If this digit is greater than or equal
to 5, the last significant digit is rounded up. The insignificant digits
are truncated. For example, 100.5 ug/m3 rounds to 101 ug/
m3 and 0.1245 ppm rounds to 0.12 ppm.
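
    The rounding convention above can be sketched as follows (Python;
the helper name is an assumption, and "significant digits" is read here
as the number of decimal places used in the range tables):

def round_for_ranges(value, decimals):
    """Keep `decimals` places; round up if the next digit is 5 or more,
    otherwise truncate the insignificant digits."""
    scaled = value * 10 ** decimals
    first_dropped = int(scaled * 10) % 10   # digit right of last kept digit
    kept = int(scaled)                      # truncate remaining digits
    if first_dropped >= 5:
        kept += 1
    return kept / 10 ** decimals

print(round_for_ranges(100.5, 0))    # 101.0  (100.5 ug/m3 rounds to 101)
print(round_for_ranges(0.1245, 2))   # 0.12   (0.1245 ppm rounds to 0.12)
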
    2.1 Sulfur Dioxide (SO2)
    2.1.1 Site and Monitoring Information. City name (when applicable),
county name and street address of site location. AIRS-AQS site code.
AIRS-AQS monitoring method code. Number of hourly observations. (1)
Number of daily observations. (2)
    2.1.2 Annual Summary Statistics. Annual arithmetic mean (ppm).
Highest and second highest 24-hour averages (3) (ppm) and dates of
occurrence. Highest and second highest 3-hour averages (1, 3) (ppm) and
dates and times (1) (ending hour) of occurrence. Number of exceedances
of the 24-hour primary

[[Page 286]]

NAAQS. (3) Number of exceedances of the 3-hour secondary NAAQS. (3)
Number of 24-hour average concentrations (4) in ranges:

------------------------------------------------------------------------
                       Range                           Number of values
------------------------------------------------------------------------
0.00 to 0.04 (ppm).................................  ...................
0.05 to 0.08.......................................  ...................
0.09 to 0.12.......................................  ...................
0.13 to 0.16.......................................  ...................
0.17 to 0.20.......................................  ...................
0.21 to 0.24.......................................  ...................
0.25 to 0.28.......................................  ...................
Greater than .28...................................  ...................
------------------------------------------------------------------------

    2.2 Total Suspended Particulates (TSP)
    2.2.1 Site and Monitoring Information. City name (when applicable),
county name and street address of site location. AIRS-AQS site code.
Number of daily observations.
    2.2.2 Annual Summary Statistics. Annual arithmetic mean (ug/
m \3\) as specified in appendix K of part 50. Daily TSP values exceeding
the level of the 24-hour PM10 NAAQS and dates of occurrence.
If more than 10 occurrences, list only the 10 highest daily values.
Sampling schedule used such as once every six days, once every three
days, etc. Number of additional sampling days beyond sampling schedule
used. Number of 24-hour average concentrations in ranges:

------------------------------------------------------------------------
                       Range                           Number of values
------------------------------------------------------------------------
  0 to 50 (ug/m \3\)...............................  ...................
  51 to 100........................................  ...................
  101 to 150.......................................  ...................
  151 to 200.......................................  ...................
  201 to 250.......................................  ...................
  251 to 300.......................................  ...................
  301 to 400.......................................  ...................
  Greater than 400.................................  ...................
------------------------------------------------------------------------

    2.2.3 Episode and Other Unscheduled Sampling Data. List episode
measurements, other unscheduled sampling data, and dates of occurrence.
List the regularly scheduled sample measurements and date of occurrence
that preceded the episode or unscheduled measurement.
    2.3 Carbon Monoxide (CO)
    2.3.1 Site and Monitoring Information. City name (when applicable),
county name and street address of site location. AIRS-AQS site code.
AIRS-AQS monitoring method code. Number of hourly observations.
    2.3.2 Annual Summary Statistics. Highest and second highest 1-hour
values (ppm) and date and time of occurrence. Highest and second highest
8-hour averages (3) (ppm) and date and time of occurrence (ending hour).
Number of exceedances of the 1-hour primary NAAQS. Number of exceedances
of the 8-hour average primary NAAQS. (3) Number of 8-hour average
concentrations (4) in ranges:

------------------------------------------------------------------------
                       Range                           Number of values
------------------------------------------------------------------------
0 to 4 (ppm).......................................  ...................
5 to 8.............................................  ...................
9 to 12............................................  ...................
13 to 16...........................................  ...................
17 to 20...........................................  ...................
21 to 24...........................................  ...................
25 to 28...........................................  ...................
Greater than 28....................................  ...................
------------------------------------------------------------------------

    2.4 Nitrogen Dioxide (NO2)
    2.4.1 Site and Monitoring Information. City name (when applicable),
county name, and street address of site location. AIRS-AQS site code.
AIRS-AQS monitoring method code. Number of hourly observations. (1)
Number of daily observations. (2)
    2.4.2 Annual Summary Statistics. Annual arithmetic mean (ppm).
Highest and second highest hourly averages (3) (ppm) and their dates and
time of occurrence. Highest and second highest 24-hour averages (2)
(ppm) and their dates of occurrence. Number of hourly average
concentrations in ranges. (1)

------------------------------------------------------------------------
                       Range                           Number of values
------------------------------------------------------------------------
.0 to .04 (ppm)....................................  ...................
.05 to .08.........................................  ...................
.09 to .12.........................................  ...................
.13 to .16.........................................  ...................
.17 to .20.........................................  ...................
.21 to .24.........................................  ...................
.25 to .28.........................................  ...................
Greater than 0.28..................................  ...................
------------------------------------------------------------------------

    2.5 Ozone (O3)
    2.5.1 Site and Monitoring Information. City name (when applicable),
county name and street address of site location. AIRS-AQS site code.
AIRS-AQS monitoring method code. Number of hourly observations.
    2.5.2 Annual Summary Statistics. Four highest daily maximum hour
values (ppm) and their dates and time of occurrence. Number of
exceedances of the daily maximum 1-hour primary NAAQS. Number of daily
maximum hour concentrations in ranges:

------------------------------------------------------------------------
                       Range                           Number of values
------------------------------------------------------------------------
0 to .04 (ppm).....................................  ...................
.05 to .08.........................................  ...................
.09 to .12.........................................  ...................
.13 to .16.........................................  ...................
.17 to .20.........................................  ...................
.21 to .24.........................................  ...................
.25 to .28.........................................  ...................
Greater than .28...................................  ...................
------------------------------------------------------------------------

    2.6 Lead (Pb).
    2.6.1 Site and Monitoring Information. City name (when applicable),
county name, and street address of site location. AIRS-AQS site code.
AIRS-AQS monitoring method code. Sampling interval of submitted data,
e.g., twenty-four hour or quarterly composites.
    2.6.2 Annual Summary Statistics. The four quarterly arithmetic
averages given to two decimal places for the year together with the

[[Page 287]]

number of twenty-four hour samples included in the average, as in the
following format:

------------------------------------------------------------------------
                                                              Quarterly
                                                             arithmetic
              Quarter                  Number of 24-hour       average
                                             samples         (ug/
                                                               m \3\)
------------------------------------------------------------------------
Jan.-March.........................  .....................  ............
April-June.........................  .....................  ............
July-Sept..........................  .....................  ............
Oct.-Dec...........................  .....................  ............
------------------------------------------------------------------------

    2.7 Particulate Matter (PM10)
    2.7.1 Site and Monitoring Information. City name (when applicable),
county name, and street address of site location. AIRS-AQS site code.
Number of daily observations.
    2.7.2 Annual Summary Statistics. Annual arithmetic mean (ug/
m \3\) as specified in appendix K of part 50. All daily PM10
values above the level of the 24-hour PM10 NAAQS and dates of
occurrence. Sampling schedule used such as once every six days, once
every three days, etc. Number of additional sampling days beyond
sampling schedule used. Number of 24-hour average concentrations in
ranges:

------------------------------------------------------------------------
                       Range                           Number of values
------------------------------------------------------------------------
0 to 25 (ug/m \3\)................................  ...................
26 to 50...........................................  ...................
51 to 75...........................................  ...................
76 to 100..........................................  ...................
101 to 125.........................................  ...................
126 to 150.........................................  ...................
151 to 175.........................................  ...................
176 to 200.........................................  ...................
Greater than 200...................................  ...................
------------------------------------------------------------------------

    2.7.3 Annual Summary Statistics. Annual arithmetic mean (ug/
m3) as specified in 40 CFR part 50, appendix N. All daily PM-
fine values above the level of the 24-hour PM-fine NAAQS and dates of
occurrence. Sampling schedule used such as once every 6 days, everyday,
etc. Number of 24-hour average concentrations in ranges:

------------------------------------------------------------------------
                       Range                           Number of Values
------------------------------------------------------------------------
0 to 15 (ug/m3)....................................  ...................
16 to 30...........................................  ...................
31 to 50...........................................  ...................
51 to 70...........................................  ...................
71 to 90...........................................  ...................
91 to 110..........................................  ...................
Greater than 110...................................  ...................
------------------------------------------------------------------------

    2.7.4 Episode and Other Unscheduled Sampling Data. List episode
measurements, other unscheduled sampling data, and dates of occurrence.
List the regularly scheduled sample measurements and date of occurrence
that preceded the episode or unscheduled measurement.

    Footnotes

    1. Continuous methods only.
    2. Manual or intermittent methods only.
    3. Based on nonoverlapping values computed according to procedures
described in reference (1) or on individual intermittent measurements.
    4. Based on overlapping running averages for continuous measurements
as described in reference (1) or on individual measurement for
intermittent methods.
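
    The distinction drawn in footnotes 3 and 4 between nonoverlapping
and overlapping running averages can be sketched as follows (Python;
the function names are illustrative assumptions):

def overlapping_averages(hourly, window=8):
    """Running averages that advance one hour at a time."""
    return [sum(hourly[i:i + window]) / window
            for i in range(len(hourly) - window + 1)]

def nonoverlapping_averages(hourly, window=8):
    """Block averages over consecutive, non-intersecting windows."""
    return [sum(hourly[i:i + window]) / window
            for i in range(0, len(hourly) - window + 1, window)]

hours = list(range(1, 17))                   # 16 hourly values
print(len(overlapping_averages(hours)))      # 9 running 8-hour averages
print(len(nonoverlapping_averages(hours)))   # 2 block 8-hour averages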

Reference

    1. ``Guidelines for the Interpretation of Air Quality Standards''
U.S. Environmental Protection Agency, Office of Air Quality Planning and
Standards, Research Triangle Park, NC 27711. OAQPS No. 1.2-008,
February, 1977.

[44 FR 27571, May 10, 1979, as amended at 46 FR 44171, Sept. 3, 1981; 51
FR 9600, Mar. 19, 1986; 52 FR 24748, 24749, July 1, 1987; 59 FR 41628,
Aug. 12, 1994; 62 FR 38854, July 18, 1997]

    Appendix G to Part 58--Uniform Air Quality Index (AQI) and Daily
                                Reporting

                          General Requirements

    1. What is the AQI?
    2. Why report the AQI?
    3. Must I report the AQI?
    4. What goes into my AQI report?
    5. Is my AQI report for my MSA only?
    6. How do I get my AQI report to the public?
    7. How often must I report the AQI?
    8. May I make exceptions to these reporting requirements?

                               Calculation

    9. How does the AQI relate to air pollution levels?
    10. Where do I get the pollutant concentrations to calculate the
AQI?
    11. Do I have to forecast the AQI?
    12. How do I calculate the AQI?

                   Background and Reference Materials

    13. What additional information should I know?

                          General Requirements

                           1. What Is the AQI?

    The AQI is a tool that simplifies reporting air quality to the
general public. The AQI incorporates into a single index concentrations
of 5 criteria pollutants: ozone (O3), particulate matter
(PM), carbon monoxide (CO), sulfur dioxide (SO2), and
nitrogen dioxide (NO2). The scale of the index is divided
into general categories that are associated with health messages.

                         2. Why Report the AQI?

    The AQI offers various advantages:
    a. It is simple to create and understand.
    b. It conveys the health implications of air quality.

[[Page 288]]

    c. It promotes uniform use throughout the country.

                        3. Must I Report the AQI?

    You must report the AQI daily if yours is a metropolitan statistical
area (MSA) with a population over 350,000.

                    4. What Goes Into My AQI Report?

    i. Your AQI report must contain the following:
    a. The reporting area(s) (the MSA or subdivision of the MSA).
    b. The reporting period (the day for which the AQI is reported).
    c. The critical pollutant (the pollutant with the highest index
value).
    d. The AQI (the highest index value).
    e. The category descriptor and index value associated with the AQI
and, if you choose to report in a color format, the associated color.
Use only the following descriptors and colors for the six AQI
categories:

                        Table 1.--AQI Categories
------------------------------------------------------------------------
            For this AQI             Use this descriptor  And this color \1\
------------------------------------------------------------------------
0 to 50..........................  ``Good''............  Green.
------------------------------------------------------------------------
51 to 100........................  ``Moderate''........  Yellow.
------------------------------------------------------------------------
101 to 150.......................  ``Unhealthy for       Orange.
                                    Sensitive Groups''.
------------------------------------------------------------------------
151 to 200.......................  ``Unhealthy''.......  Red.
------------------------------------------------------------------------
201 to 300.......................  ``Very Unhealthy''..  Purple.
------------------------------------------------------------------------
301 and above....................  ``Hazardous''.......  Maroon.\1\
------------------------------------------------------------------------
\1\ Specific colors can be found in the most recent reporting guidance
  (Guideline for Public Reporting of Daily Air Quality--Air Quality
  Index (AQI)).

    f. The pollutant specific sensitive groups for any reported index
value greater than 100. Use the following sensitive groups for each
pollutant:

------------------------------------------------------------------------
 When this pollutant has an index value  Report these sensitive groups
             above 100 * * *                         * * *
------------------------------------------------------------------------
Ozone..................................  Children and people with asthma
                                          are the groups most at risk.
------------------------------------------------------------------------
PM2.5..................................  People with respiratory or
                                          heart disease, the elderly and
                                          children are the groups most
                                          at risk.
------------------------------------------------------------------------
PM10...................................  People with respiratory disease
                                          are the group most at risk.
------------------------------------------------------------------------
CO.....................................  People with heart disease are
                                          the group most at risk.
------------------------------------------------------------------------
SO2....................................  People with asthma are the
                                          group most at risk.
------------------------------------------------------------------------
NO2....................................  Children and people with
                                          respiratory disease are the
                                          groups most at risk.
------------------------------------------------------------------------

    ii. When appropriate, your AQI report may also contain the
following:
    a. Appropriate health and cautionary statements.
    b. The name and index value for other pollutants, particularly those
with an index value greater than 100.
    c. The index values for sub-areas of your MSA.
    d. Causes for unusual AQI values.
    e. Actual pollutant concentrations.
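
    For illustration only (not part of this appendix), a reporting program
might assemble the minimal report described in paragraph i of this section
as follows. The function name, dictionary, and data values in this Python
sketch are hypothetical; the sensitive-group text is taken from the table
above:

    # Illustrative sketch only: compose a minimal daily AQI report from
    # the required elements of paragraph 4.i.
    SENSITIVE_GROUPS = {
        "ozone": "Children and people with asthma are the groups most "
                 "at risk.",
        "PM2.5": "People with respiratory or heart disease, the elderly "
                 "and children are the groups most at risk.",
        "PM10": "People with respiratory disease are the group most at "
                "risk.",
        "CO": "People with heart disease are the group most at risk.",
        "SO2": "People with asthma are the group most at risk.",
        "NO2": "Children and people with respiratory disease are the "
               "groups most at risk.",
    }

    def minimal_report(area, period, critical_pollutant, aqi, descriptor):
        text = ("%s, the AQI for %s is %d, which is %s, due to %s."
                % (period, area, aqi, descriptor.lower(),
                   critical_pollutant))
        if aqi > 100:
            text += " " + SENSITIVE_GROUPS[critical_pollutant]
        return text

    print(minimal_report("my city", "Today", "ozone", 203,
                         "Very Unhealthy"))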

                  5. Is My AQI Report for My MSA Only?

    Generally, your AQI report applies to your MSA only. However, if a
significant air quality problem exists (AQI greater than 100) in areas
significantly impacted by your MSA but not in it (for example,
O3 concentrations are often highest downwind and outside an
urban area), you should identify these areas and report the AQI for
these areas as well.

              6. How Do I Get My AQI Report to the Public?

    You must furnish the daily report to the appropriate news media
(radio, television, and newspapers). You must make the daily report
publicly available at one or more places of public access, or by any
other means, including a recorded phone message, a public Internet site,
or facsimile transmission. When the AQI value is greater than 100, it is
particularly critical that the reporting to the various news media be as
extensive as possible. At a minimum, it should include notification to
the media with the largest market coverages for the area in question.

                   7. How Often Must I Report the AQI?

    You must report the AQI at least 5 days per week. Exceptions to this
requirement are in section 8 of this appendix.

        8. May I Make Exceptions to These Reporting Requirements?

    i. If the index value for a particular pollutant remains below 50
for a season or year, then you may exclude the pollutant from your
calculation of the AQI in section 12.
    ii. If all index values remain below 50 for a year, then you may
report the AQI at your discretion. In subsequent years, if pollutant

[[Page 289]]

levels rise to where the AQI would be above 50, then the AQI must be
reported as required in sections 3, 4, 6, and 7 of this appendix.

                               Calculation

           9. How Does the AQI Relate to Air Pollution Levels?

    For each pollutant, the AQI transforms ambient concentrations to a
scale from 0 to 500. The AQI is keyed as appropriate to the national
ambient air quality standards (NAAQS) for each pollutant. In most cases,
the index value of 100 is associated with the numerical level of the
short-term standard (i.e., averaging time of 24 hours or less) for each
pollutant. Different approaches are taken for NO2, for which
no short-term standard has been established, and for PM2.5,
for which the annual standard is the principal vehicle for protecting
against short-term concentrations. The index value of 50 is associated
with the numerical level of the annual standard for a pollutant, if
there is one, at one-half the level of the short-term standard for the
pollutant, or at the level at which it is appropriate to begin to
provide guidance on cautionary language. Higher categories of the index
are based on increasingly serious health effects and increasing
proportions of the population that are likely to be affected. The index
is related to other air pollutant concentrations through linear
interpolation based on these levels. The AQI is equal to the highest of
the numbers corresponding to each pollutant. For the purposes of
reporting the AQI, the sub-indexes for PM10 and
PM2.5 are to be considered separately. The pollutant
responsible for the highest index value (the reported AQI) is called the
``critical'' pollutant.
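
    For illustration only (not part of this appendix), the ``highest
sub-index'' rule can be expressed in a few lines of Python. The sub-index
values shown are hypothetical; note that PM10 and PM2.5 are kept as
separate sub-indexes:

    # Illustrative sketch only: the reported AQI is the maximum of the
    # pollutant sub-indexes; the pollutant producing it is the
    # "critical" pollutant.
    sub_indexes = {"ozone": 203, "PM10": 128, "PM2.5": 47, "CO": 12}

    critical_pollutant = max(sub_indexes, key=sub_indexes.get)
    aqi = sub_indexes[critical_pollutant]
    print(critical_pollutant, aqi)  # ozone 203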

  10. Where Do I Get the Pollutant Concentrations To Calculate the AQI?

    You must use concentration data from population-oriented State/Local
Air Monitoring Station (SLAMS) or parts of the SLAMS required under 40
CFR 58.20 for each pollutant except PM. For PM, you need only calculate
and report the AQI on days for which you have measured air quality data
(e.g., particulate monitors often report values only every sixth day).
You may use particulate measurements from monitors that are not
reference or equivalent methods (for example, continuous PM10
or PM2.5 monitors) if you can relate these measurements by
statistical linear regression to reference or equivalent method
measurements.
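
    For illustration only (not part of this appendix), the statistical
linear regression mentioned above can be an ordinary least-squares fit of
collocated measurements. The collocated data and the adjust function in
this Python sketch are hypothetical:

    # Illustrative sketch only: relate continuous (non-reference) PM10
    # readings to collocated reference-method measurements with a linear
    # regression, then use the fitted line to adjust new readings.
    import numpy as np

    continuous = np.array([38.0, 55.0, 72.0, 90.0, 120.0])  # non-reference
    reference = np.array([35.0, 52.0, 70.0, 86.0, 115.0])   # reference method

    slope, intercept = np.polyfit(continuous, reference, 1)

    def adjust(raw_value):
        """Estimate the reference-equivalent concentration."""
        return slope * raw_value + intercept

    print(round(adjust(100.0), 1))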

                   11. Do I Have to Forecast the AQI?

    You should forecast the AQI to provide timely air quality
information to the public, but this is not required. If you choose to
forecast the AQI, then you may consider both long-term and short-term
forecasts. You can forecast the AQI at least 24 hours in advance using
the most accurate and reasonable procedures considering meteorology,
topography, availability of data, and forecasting expertise. The
document ``Guideline for Developing an Ozone Forecasting Program'' (the
Forecasting Guidance) will help you start a forecasting program. You can
also issue short-term forecasts by predicting 8-hour ozone values from
1-hour ozone values using methods suggested in the Reporting Guidance,
``Guideline for Public Reporting of Daily Air Quality.''

                     12. How Do I Calculate the AQI?

    i. The AQI is the highest value calculated for each pollutant as
follows:
    a. Identify the highest concentration among all of the monitors
within each reporting area and truncate the pollutant concentration to
one more significant digit than is used to express the level of the
NAAQS for that pollutant. This is equivalent to the rounding conventions
used in the NAAQS.
    b. Using Table 2, find the two breakpoints that contain the
concentration.
    c. Using Equation 1, calculate the index.
    d. Round the index to the nearest integer.

[[Page 290]]

                                         Table 2.--Breakpoints for the AQI
--------------------------------------------------------------------------------------------------------------------------
                                   These breakpoints * * *                                   Equal these AQIs * * *
--------------------------------------------------------------------------------------------------------------------------
  O3 (ppm)     O3 (ppm)          PM2.5       PM10    CO (ppm)    SO2 (ppm)   NO2 (ppm)    AQI           Category
   8-hour     1-hour \1\        (µg/m3)    (µg/m3)
--------------------------------------------------------------------------------------------------------------------------
0.000-0.064  ...........       0.0-15.4       0-54    0.0-4.4  0.000-0.034      (2)       0-50   Good.
0.065-0.084  ...........      15.5-40.4     55-154    4.5-9.4  0.035-0.144      (2)     51-100   Moderate.
0.085-0.104  0.125-0.164      40.5-65.4    155-254   9.5-12.4  0.145-0.224      (2)    101-150   Unhealthy for sensitive
                                                                                                  groups.
0.105-0.124  0.165-0.204   65.5-150.4\4\   255-354  12.5-15.4  0.225-0.304      (2)    151-200   Unhealthy.
0.125-0.374  0.205-0.404  150.5-250.4\4\   355-424  15.5-30.4  0.305-0.604  0.65-1.24  201-300   Very unhealthy.
(3)........  0.405-0.504  250.5-350.4\4\   425-504  30.5-40.4  0.605-0.804  1.25-1.64  301-400   Hazardous.
(3)........  0.505-0.604  350.5-500.4\4\   505-604  40.5-50.4  0.805-1.004  1.65-2.04  401-500   Hazardous.
--------------------------------------------------------------------------------------------------------------------------
1 Areas are generally required to report the AQI based on 8-hour ozone values. However, there are a small number of areas
  where an AQI based on 1-hour ozone values would be more precautionary. In these cases, in addition to calculating the
  8-hour ozone index value, the 1-hour ozone index value may be calculated, and the maximum of the two values reported.
2 NO2 has no short-term NAAQS and can generate an AQI only above an AQI value of 200.
3 8-hour O3 values do not define higher AQI values (301 or greater). AQI values of 301 or higher are calculated with
  1-hour O3 concentrations.
4 If a different SHL for PM2.5 is promulgated, these numbers will change accordingly.

[[Page 291]]

    ii. If the concentration is equal to a breakpoint, then the index is
equal to the corresponding index value in Table 2. However, Equation 1
can still be used. The results will be equal. If the concentration is
between two breakpoints, then calculate the index of that pollutant with
Equation 1. Note also that in some areas, the AQI based on 1-hour
O3 values will be more precautionary than the AQI based on 8-hour values
(see footnote 1 to Table 2). In these cases, you may use 1-hour values
as well as 8-hour values to calculate index values and then use the
maximum index value as the AQI for O3.
                               Equation 1

    Ip = [(IHi - ILo)/(BPHi - BPLo)] x (Cp - BPLo) + ILo

Where:

Ip = the index value for pollutantp
Cp = the truncated concentration of
          pollutantp
BPHi = the breakpoint that is greater than or equal to
          Cp
BPLo = the breakpoint that is less than or equal to
          Cp
IHi = the AQI value corresponding to BPHi
ILo = the AQI value corresponding to BPLo.

    iii. If the concentration is larger than the highest breakpoint in
Table 2 then you may use the last two breakpoints in Table 2 when you
apply Equation 1.
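
    For illustration only (not part of this appendix), the following
Python sketch applies Equation 1 with a subset of the Table 2 breakpoints
(8-hour ozone, 1-hour ozone, and PM10). The names BREAKPOINTS and
sub_index are hypothetical, and concentrations are assumed to be already
truncated as described in paragraph i.a of this section:

    # Illustrative sketch only: Equation 1 with a subset of Table 2.
    # Each row is (BP_Lo, BP_Hi, I_Lo, I_Hi).
    BREAKPOINTS = {
        "O3_8hr": [(0.000, 0.064, 0, 50), (0.065, 0.084, 51, 100),
                   (0.085, 0.104, 101, 150), (0.105, 0.124, 151, 200),
                   (0.125, 0.374, 201, 300)],
        "O3_1hr": [(0.125, 0.164, 101, 150), (0.165, 0.204, 151, 200),
                   (0.205, 0.404, 201, 300), (0.405, 0.504, 301, 400),
                   (0.505, 0.604, 401, 500)],
        "PM10": [(0, 54, 0, 50), (55, 154, 51, 100), (155, 254, 101, 150),
                 (255, 354, 151, 200), (355, 424, 201, 300),
                 (425, 504, 301, 400), (505, 604, 401, 500)],
    }

    def sub_index(pollutant, conc):
        """Equation 1: interpolate between the bracketing Table 2 rows."""
        table = BREAKPOINTS[pollutant]
        if conc < table[0][0]:
            return None  # below the lowest breakpoint for this pollutant
        for bp_lo, bp_hi, i_lo, i_hi in table:
            if bp_lo <= conc <= bp_hi:
                return round((i_hi - i_lo) / (bp_hi - bp_lo)
                             * (conc - bp_lo) + i_lo)
        # Paragraph iii (one reading): above the highest breakpoint, apply
        # Equation 1 using the last two breakpoints in the table.
        bp_lo, bp_hi, i_lo, i_hi = table[-1]
        return round((i_hi - i_lo) / (bp_hi - bp_lo) * (conc - bp_lo) + i_lo)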

                                 Example

    iv. Using Table 2 and Equation 1, calculate the index value for each
of the pollutants measured and select the one that produces the highest
index value for the AQI. For example, if you observe a PM10
value of 210 µg/m3, a 1-hour O3 value of
0.156 ppm, and an 8-hour O3 value of 0.130 ppm, then do this:
    a. Find the breakpoints for PM10 at 210 µg/m3 as 155 µg/m3 and 254
µg/m3, corresponding to index values 101 and 150;
    b. Find the breakpoints for 1-hour O3 at 0.156 ppm as
0.125 ppm and 0.164 ppm, corresponding to index values 101 and 150;
    c. Find the breakpoints for 8-hour O3 at 0.130 ppm as
0.125 ppm and 0.374 ppm, corresponding to index values 201 and 300;
    d. Apply Equation 1 for 210 µg/m3, PM10:

    I = [(150 - 101)/(254 - 155)] x (210 - 155) + 101 = 128

    e. Apply Equation 1 for 0.156 ppm, 1-hour O3:
    I = [(150 - 101)/(0.164 - 0.125)] x (0.156 - 0.125) + 101 = 140

    f. Apply Equation 1 for 0.130 ppm, 8-hour O3:
    I = [(300 - 201)/(0.374 - 0.125)] x (0.130 - 0.125) + 201 = 203

    g. Find the maximum, 203. This is the AQI. The minimal AQI report
would read:
    v. Today, the AQI for my city is 203, which is very unhealthy, due to
ozone. Children and people with asthma are the groups most at risk.
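
    For illustration only (not part of this appendix), the arithmetic of
this example can be checked directly with Equation 1 in a few lines of
Python:

    # Illustrative check of the example using Equation 1.
    i_pm10 = round((150 - 101) / (254 - 155) * (210 - 155) + 101)   # 128
    i_o3_1hr = round((150 - 101) / (0.164 - 0.125)
                     * (0.156 - 0.125) + 101)                       # 140
    i_o3_8hr = round((300 - 201) / (0.374 - 0.125)
                     * (0.130 - 0.125) + 201)                       # 203
    print(max(i_pm10, i_o3_1hr, i_o3_8hr))  # 203, the reported AQI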

                   Background and Reference Materials

             13. What Additional Information Should I Know?

    The EPA has developed a computer program to calculate the AQI for
you. The program works with Windows 95, it prompts for inputs, and it
displays all the pertinent information for the AQI (the index value,
color, category, sensitive group, health effects, and cautionary
language). The EPA has also prepared a brochure on the AQI that explains
the index in detail (The Air Quality Index), Reporting Guidance
(Guideline for Public Reporting of Daily Air Quality) that provides
associated health effects and cautionary statements, and Forecasting
Guidance (Guideline for Developing an Ozone Forecasting Program) that
explains the steps necessary to start an air pollution forecasting
program. You can download the program and the guidance documents at
www.epa.gov/airnow.

[64 FR 42547, Aug. 4, 1999]
