5 Source Identification and Apportionment Methods
Pages 143-208

From page 143...
... Speciated rollback models are relatively simple, spatially averaged models that take changes in pollutant concentrations to be directly proportional to changes in regional emissions of these pollutants or their precursors. Receptor-oriented methods and models infer source contributions by characterizing atmospheric aerosol samples, often using chemical elements or compounds in those samples as tracers for the presence of material from particular kinds of sources.
From page 144...
... Because these models require pollutant concentrations only as initial and boundary conditions for a simulation, they can be used to predict the effects of sources before they are built. The members of the committee do not aim to give advice on how to choose a single best source apportionment technique for analyzing a given visibility problem.
From page 145...
... First, we provide criteria for evaluating the relative merits of source identification and apportionment methods in the context of a national program to protect visibility. We then evaluate various methods, roughly in order of increasing resources required for their application: simple source identification methods; speciated rollback models; receptor models, including chemical mass balance models and regression analysis; models for transport only and for transport with linear chemistry (these are simplified mechanistic models that are either receptor or source oriented)
From page 146...
... CRITERIA FOR EVALUATING SOURCE IDENTIFICATION AND APPORTIONMENT METHODS A national visibility protection program could employ many alternative modeling methods. Source apportionment studies are generally best conducted through the successive use of simple screening models followed by more precise methods.
From page 147...
... formulation, these assumptions should be made to capture the essence of the problem at hand rather than to oversimplify the problem to the extent that there is little assurance that source-receptor relationships are represented correctly. The same criteria apply to mechanistic models for predicting the optical properties of the atmosphere described by Mie theory.
From page 148...
... probably will not provide the particle size distribution data needed to perform a Mie theory light-scattering calculation. Input Data Requirements The data required for application of a particular approach to source apportionment should be understood and obtainable in a practical sense.
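To suggest why size distribution data are needed, the sketch below estimates an extinction coefficient by integrating an efficiency over an assumed lognormal size distribution. It substitutes van de Hulst's anomalous diffraction approximation for a full Mie calculation, and the refractive index, number concentration, and distribution parameters are all assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative (not full Mie) estimate of the light-extinction coefficient
# for an assumed lognormal particle size distribution, using van de Hulst's
# anomalous diffraction approximation for the extinction efficiency.
wavelength_um = 0.55        # green light
m_real = 1.5                # assumed real refractive index
number_conc_cm3 = 1000.0    # assumed total particle number concentration
geo_mean_diam_um = 0.3      # assumed geometric mean diameter
geo_std_dev = 1.8           # assumed geometric standard deviation

def q_ext_ada(diameter_um):
    """Anomalous diffraction approximation to the extinction efficiency."""
    x = np.pi * diameter_um / wavelength_um      # size parameter
    rho = 2.0 * x * (m_real - 1.0)               # phase shift parameter
    return 2.0 - (4.0 / rho) * np.sin(rho) + (4.0 / rho**2) * (1.0 - np.cos(rho))

# Discretize the lognormal number distribution dN/dlnD over 0.01-10 um.
diam = np.logspace(-2, 1, 500)
dlnD = np.log(diam[1]) - np.log(diam[0])
dNdlnD = (number_conc_cm3 / (np.sqrt(2.0 * np.pi) * np.log(geo_std_dev))
          * np.exp(-0.5 * (np.log(diam / geo_mean_diam_um) / np.log(geo_std_dev))**2))

# b_ext = integral of Q_ext * (pi/4) D^2 * dN/dlnD dlnD; 1 um^2/cm^3 = 1 Mm^-1.
cross_section_um2 = q_ext_ada(diam) * (np.pi / 4.0) * diam**2
b_ext_Mm = np.sum(cross_section_um2 * dNdlnD * dlnD)
print(f"approximate b_ext = {b_ext_Mm:.0f} Mm^-1")
```

Changing the assumed geometric mean diameter or standard deviation changes the computed extinction substantially, which is the practical reason size distribution data are needed before optical effects can be estimated.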
From page 149...
... Other modeling methods (such as speciated rollback models)
From page 150...
... Some receptor-oriented models can apportion emissions from existing sources but cannot readily predict the effects of emissions from new ones. Some mechanistic models are better than others at predicting the effects of changes in the elevation of emissions.
From page 151...
... Before a source apportionment method is selected, it should be known how many people, how much time, and how much money are required to start and maintain an assessment of source contributions to visibility impairment in Class I areas. Otherwise, it is unlikely that a regulatory or research program would be established with the amount of support needed to do the work correctly.
From page 152...
... Some simple source apportionment systems, such as plume blight models, might be applicable for use by each of these agencies and could be used nationwide by different agencies acting independently. On the other hand, regional haze analyses that extend over several states and incorporate several Class I areas within a single analytical framework would need large amounts of data and might require a more unified approach to visibility regulation than has been taken to date.
From page 153...
... In addition to being able to identify the source contributions to a regional visibility problem, the method should be capable of being matched to an analysis of the least expensive way to meet a particular visibility improvement goal. Some source apportionment methods, particularly linear methods, are readily linked to cost optimization calculations.
From page 154...
... The following discussion should therefore be viewed as a guide for successively unraveling various aspects of a visibility problem rather than for selecting a best single method of analysis. Source Identification Methods In some cases, sources of visibility impairment in Class I areas can be identified directly by simple empirical methods.
From page 155...
... Although fire lookouts often photograph Class I area visibility conditions each day for state and federal land management agencies, few agencies require written records of the observations of the sources of visibility impairment. In Oregon, observations and photographs taken by fire lookouts were used to develop prima facie evidence that agricultural burning was severely impairing visibility within the Eagle Cap and Central Oregon Cascade wilderness areas (Oregon Department of Environmental Quality, 1990).
From page 156...
... Speciated Rollback Models As defined in Appendix C, speciated rollback models are simple, spatially averaged, conservation-of-mass models disaggregated according to the major chemical components of aerosols (Trijonis et al., 1975,
From page 157...
... Speciated rollback modeling is a relatively simple method of apportioning visibility-impairing pollutants among sources in a region. Technical Adequacy Two assumptions must be met for a speciated rollback model to be valid.
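As a purely illustrative sketch of the proportionality that speciated rollback assumes, the Python fragment below scales each aerosol component above an assumed background by the ratio of future to current emissions of its controlling precursor. All species names and numerical values are hypothetical.

```python
# Hypothetical illustration of a speciated rollback calculation.
# Ambient fine-particle component concentrations (ug/m^3) -- assumed values.
ambient = {"sulfate": 4.0, "nitrate": 1.5, "organics": 2.5}
# Natural background concentrations (ug/m^3) -- assumed values.
background = {"sulfate": 0.2, "nitrate": 0.1, "organics": 0.5}
# Ratio of proposed future emissions to current emissions of each
# component's controlling precursor (SO2, NOx, organics) -- assumed values.
emission_ratio = {"sulfate": 0.7, "nitrate": 0.9, "organics": 1.0}

def rollback(ambient, background, emission_ratio):
    """Linear speciated rollback: the concentration of each species above
    background changes in proportion to its precursor emissions."""
    return {
        species: background[species]
        + (ambient[species] - background[species]) * emission_ratio[species]
        for species in ambient
    }

print(rollback(ambient, background, emission_ratio))
# -> {'sulfate': 2.86, 'nitrate': 1.36, 'organics': 2.5}
```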
From page 158...
... the use of linear rollback models to apportion secondary particles could lead to poor control strategies. For example, much sulfate production probably occurs in clouds as a result of oxidation by H2O2.
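A stylized numerical illustration, with invented numbers, of why in-cloud oxidation can defeat the linearity assumed by rollback: if aqueous sulfate formation is limited by the available H2O2 rather than by SO2, halving SO2 emissions may leave sulfate nearly unchanged.

```python
# Stylized, hypothetical illustration of oxidant-limited sulfate formation.
# In-cloud oxidation by H2O2 proceeds until either reagent is exhausted, so
# sulfate produced is roughly capped by whichever is scarcer.

def cloud_sulfate(so2_ppb, h2o2_ppb):
    """Sulfate formed (ppb-equivalent) in a deliberately simplified picture
    where production is limited by the less-abundant of SO2 and H2O2."""
    return min(so2_ppb, h2o2_ppb)

base = cloud_sulfate(so2_ppb=10.0, h2o2_ppb=2.0)       # -> 2.0
controlled = cloud_sulfate(so2_ppb=5.0, h2o2_ppb=2.0)  # -> 2.0
print(base, controlled)
# Halving SO2 leaves sulfate unchanged here, whereas a linear rollback
# model would predict a 50 percent reduction.
```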
From page 159...
... composition from emission inventory to air-quality model to optics model. One advantage of speciated rollback models is that their input data requirements usually can be met based on data gathered by most air-pollution-control
From page 160...
... Uncertainty in the primary organic aerosol emissions inventory is essentially independent of the uncertainty in the SO2 emissions inventory. Administrative Feasibility Perhaps the greatest advantage of the speciated rollback approach is the elimination of many administrative problems associated with more complex source apportionment techniques.
From page 161...
... It would help to have national guidelines for the identification of controlling precursor species, the collection of aerosol composition data, the compilation of speciated emission inventories, and the estimation of background concentrations. The simplicity of the speciated rollback model
From page 162...
... The speciated rollback model is more appropriate for analyzing air quality averaged over many receptor sites or for considering a widespread problem such as regional haze.
From page 163...
... The main complexity (keeping track of each pollutant within the emissions inventories, air-quality data bases, and light extinction budgets) is consistent throughout the analysis. Summary The speciated rollback model
From page 164...
... Chemical Mass Balance Receptor Models Receptor-oriented models, which are discussed in detail in Appendix C, infer source contributions to visibility impairment at a particular site based on atmospheric aerosol samples from that site. Chemical elements and compounds in ambient samples are often used as tracers for the presence of emissions from particular source types (e.g., knowledge of the lead content of the particles emitted from automobiles burning leaded gasoline historically has been used to compute the amount of motor vehicle exhaust aerosol present in an ambient aerosol sample that contains lead)
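A minimal sketch of the chemical mass balance calculation: ambient species concentrations are modeled as a linear combination of source composition profiles, and the source contributions are recovered by (here non-negative) least squares. The profiles and ambient values below are invented, and a real CMB application would weight the fit by measurement uncertainties.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical chemical mass balance (CMB) illustration.
# Columns of F are source composition profiles: mass fraction of each
# chemical species (rows: Si, Pb, Se, OC) per unit mass of primary
# particulate matter emitted by each source type.
sources = ["soil dust", "motor vehicles", "coal combustion"]
F = np.array([
    [0.30, 0.01, 0.00],   # Si
    [0.00, 0.15, 0.00],   # Pb (leaded-gasoline era tracer)
    [0.00, 0.00, 0.02],   # Se
    [0.05, 0.40, 0.10],   # OC
])
# Ambient concentrations of the same species (ug/m^3) -- assumed values.
c = np.array([0.92, 0.30, 0.010, 1.00])

# Solve c ~= F s for non-negative source contributions s, the primary
# particle mass attributed to each source type (ug/m^3).
s, residual = nnls(F, c)
for name, contribution in zip(sources, s):
    print(f"{name:16s} {contribution:5.2f} ug/m^3")
```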
From page 165...
... has been used to apportion light extinction by applying additional assumptions about relative humidity and the physical and optical characteristics of particles, these assumptions, which are needed to link the CMB model to atmospheric optics, require full validation before they are likely to be accepted for regulatory use. Until recently, a major disadvantage of the CMB model has been the scarcity of the required input data about the chemical composition of emissions.
From page 170...
... works well even when it is used for areas with varied topography, whereas mechanistic models for aerosol transport can be difficult to apply in complex terrain. The CMB model uses chemical element tracers rather than meteorologic flow fields to determine source-receptor relationships.
From page 171...
... They are compatible with the personnel, skills, and budgets of many regulatory agencies. Expansion of CMB models to apportion secondary particles could be constrained by resource limitations if tracer injection were a required element of the modeling protocol.
From page 172...
... Balance The resources required to apply the CMB model and the usefulness of the method are balanced attractively when the model is used to apportion primary emissions based on ambient and source aerosol chemical composition data.
From page 173...
... If these deficiencies have been overcome in a particular case and if the source contributions to concentrations of the atmospheric particles have been estimated reliably, then those results can be linked to atmospheric optical properties at the same level of detail as is possible for rollback and CMB models. Many extinction budget studies used to
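For context, an extinction budget of the kind referred to here multiplies each species concentration by a dry scattering or absorption efficiency, with a relative-humidity growth factor applied to hygroscopic species. The coefficients in the sketch below are loosely patterned on reconstructed-extinction practice (for example, the IMPROVE formulation) and, like the concentrations, should be read as illustrative assumptions.

```python
# Illustrative reconstructed light-extinction budget.  Efficiency
# coefficients and concentrations are assumptions chosen for illustration.

def reconstructed_bext(conc_ugm3, f_rh, rayleigh_Mm=10.0):
    """Return total extinction (Mm^-1) and a per-component budget."""
    budget = {
        "ammonium sulfate": 3.0 * f_rh * conc_ugm3["ammonium sulfate"],
        "ammonium nitrate": 3.0 * f_rh * conc_ugm3["ammonium nitrate"],
        "organics": 4.0 * conc_ugm3["organics"],
        "elemental carbon": 10.0 * conc_ugm3["elemental carbon"],
        "fine soil": 1.0 * conc_ugm3["fine soil"],
        "coarse mass": 0.6 * conc_ugm3["coarse mass"],
        "Rayleigh scattering": rayleigh_Mm,
    }
    return sum(budget.values()), budget

total, budget = reconstructed_bext(
    {"ammonium sulfate": 2.0, "ammonium nitrate": 0.5, "organics": 1.5,
     "elemental carbon": 0.2, "fine soil": 0.5, "coarse mass": 3.0},
    f_rh=2.1,   # assumed relative-humidity growth factor
)
for component, b in budget.items():
    print(f"{component:20s} {b:5.1f} Mm^-1")
print(f"{'total':20s} {total:5.1f} Mm^-1")
```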
From page 174...
... More fundamentally, it is not clear that reliable endemic signatures even exist for some source types. For example, Figure 5-1 shows that copper smelters emit selenium and arsenic and thus may be difficult to distinguish from coal combustion in the Southwest by either regression models or CMB models.
From page 175...
... In common with CMB and other approaches based on the deconvolution of ambient composition, regression analysis is unimpeded by complex terrain or diverse source configurations. Unlike CMB analysis, the time resolution available from conventional regression models is limited by the fact that the regression relationship is averaged over all the observations used in its derivation.
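A minimal sketch of the regression approach, using synthetic data: observed particle scattering is regressed on tracer concentrations across many sampling periods, and the fitted coefficients act as source-specific extinction efficiencies. Because one set of coefficients is fitted to all observations, the resulting apportionment is an average over the whole data set, which is the time-resolution limitation noted above.

```python
import numpy as np

# Synthetic illustration of tracer regression for apportioning extinction.
# Each row is one sampling period: tracer concentrations (ug/m^3) for two
# assumed source types, plus observed particle scattering (Mm^-1).
rng = np.random.default_rng(0)
n = 100
tracer_smelter = rng.uniform(0.0, 0.05, n)     # e.g., arsenic
tracer_dust = rng.uniform(0.0, 2.0, n)         # e.g., silicon
true_b = 5.0 + 800.0 * tracer_smelter + 3.0 * tracer_dust
b_sp = true_b + rng.normal(0.0, 2.0, n)        # measurement noise

# Ordinary least squares: b_sp ~ intercept + coefficients * tracers.
X = np.column_stack([np.ones(n), tracer_smelter, tracer_dust])
coef, *_ = np.linalg.lstsq(X, b_sp, rcond=None)
print("intercept, smelter coefficient, dust coefficient:", np.round(coef, 1))
# The single fitted coefficient per tracer applies to the data set as a
# whole, so the apportionment is an average over all sampling periods.
```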
From page 176...
... Appendix C, demonstrates what results when emissions from an unrecognized or poorly characterized source are proportional to the ambient correlation of those emissions with other emissions, a statistic that is by its nature unknown. Although the bias is hard to quantify, regression models tend to overestimate the importance of sources that have reliable and easily measurable chemical signatures, whether endemic or injected.
From page 177...
... Administrative Feasibility The data base needed for regression analysis requires a considerable investment of resources in field measurements of source and ambient aerosol chemical composition. CMB models also face this problem.
From page 178...
... Resource pressures can adversely affect the equity of an experimental design. Regression analysis tends to overestimate the importance of sources with reliable and easily measurable chemical signatures.
From page 179...
... Models for Transport Only and Transport with Linear Chemistry Two other sets of models for source identification and apportionment are those that assess only the transport of pollutants without regard to the chemical or physical processes that affect the pollutants and those that assess transport coupled with a simple linear chemical transformation process. This family of models includes techniques that range from air parcel trajectory analysis through models based on prediction of pollutant concentrations via solution of integral or differential equations that govern atmospheric transport and dilution.
From page 180...
... and Samson (1986) showed that the median error for trajectories drawn from National Weather Service data in the eastern United States after 72 hours of simulation was about 350 km.
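For orientation, a back-trajectory is simply a backward time integration of an air parcel's position through an analyzed wind field. The sketch below uses a made-up analytic wind field and a fixed time step, whereas operational trajectory models interpolate gridded wind analyses in space and time; errors in those analyses produce distance errors of the magnitude cited above.

```python
import numpy as np

# Minimal backward air-parcel trajectory: step the parcel position backward
# through a synthetic, analytically defined horizontal wind field.
# Operational models instead interpolate gridded wind analyses.

def wind(x_km, y_km, t_hours):
    """Hypothetical wind field (km/h): westerly flow with weak shear
    and a slowly varying meridional component."""
    u = 20.0 + 0.01 * y_km                        # eastward component
    v = 5.0 * np.sin(t_hours / 12.0 * np.pi)      # northward component
    return np.array([u, v])

def back_trajectory(x0_km, y0_km, hours=72, dt=1.0):
    """Integrate backward in time from the receptor location with an
    explicit Euler step."""
    pos = np.array([x0_km, y0_km], dtype=float)
    path = [pos.copy()]
    t = 0.0
    for _ in range(int(hours / dt)):
        pos -= wind(pos[0], pos[1], t) * dt       # step backward in time
        t -= dt
        path.append(pos.copy())
    return np.array(path)

path = back_trajectory(0.0, 0.0)
print("parcel position 72 h upwind of the receptor (km):", np.round(path[-1], 1))
```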
From page 181...
... The assumption that pollutants are well mixed could be adequate in the warmer months in the eastern United States where visibility-reducing material is often mixed by convective eddies during episodes of regional haze. However, this assumption might not be appropriate for the complex terrain of the western United States or for plumes from single sources.
From page 182...
... This makes them useful for evaluating plume blight. Eulerian grid models and Lagrangian particle models for transport alone are about equally useful when applied to regional haze problems where individual plumes have merged to form a widespread
From page 183...
... Transport modeling calculations can be used within each of the regulatory frameworks discussed earlier. Transport models that predict pollutant concentrations could be useful in predicting the effects of new sources.
From page 184...
... However, ensemble trajectory analysis that yields estimates of linear source-receptor relationships, as well as Eulerian or Lagrangian transport models that compute pollutant concentrations and can incorporate linear chemistry, can be combined with such economic optimization models (see Harley et al., 1989). Flexibility Trajectory modeling as described here is a receptor-oriented tool used to diagnose the likely sources of observed pollutant concentrations.
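When the source-receptor relationship is linear, the least-cost control problem can be posed as a linear program: choose emission reductions that achieve a required extinction improvement at each receptor at minimum cost. The transfer coefficients, costs, and targets in the sketch below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical least-cost control calculation for a linear source-receptor
# relationship.  transfer[i, j] is the extinction (Mm^-1) contributed at
# receptor i per unit emission (kton/yr) from source region j.
transfer = np.array([[0.8, 0.3, 0.1],
                     [0.2, 0.6, 0.4]])
current_emissions = np.array([100.0, 80.0, 60.0])   # kton/yr
cost_per_kton = np.array([2.0, 1.0, 3.0])           # $M per kton reduced
required_improvement = np.array([20.0, 15.0])       # Mm^-1 at each receptor

# Decision variables: emission reductions r_j, with 0 <= r_j <= current.
# Minimize total cost subject to transfer @ r >= required_improvement.
result = linprog(
    c=cost_per_kton,
    A_ub=-transfer,                 # rewrite the >= constraint in <= form
    b_ub=-required_improvement,
    bounds=list(zip(np.zeros(3), current_emissions)),
    method="highs",
)
print("optimal reductions (kton/yr):", np.round(result.x, 1))
print("minimum cost ($M):", round(result.fun, 1))
```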
From page 185...
... See Appendix C for further discussion of mechanistic models. Models that are based on physical and chemical atmospheric processes can be used to determine source-receptor relationships and to assess the merits of various pollution control strategies.
From page 186...
... However, these models are hard to develop because of mathematical difficulties and difficulties associated with imperfect understanding of atmospheric aerosol physics and chemistry. Complete models for predicting particle size distributions and chemical composition are not available for regulatory use.
From page 187...
... If models that predict bulk fine-particle chemical properties (but not size distributions) are to be used as part of a visibility study, and if the accuracy generally expected of mechanistic models is maintained, then the translation of pollutant concentration predictions into effects on visibility will probably need to be supported by size distribution measurements made at the time and place studied.
From page 188...
... Although sulfur and NOx emissions and ambient concentrations are widely available for the United States, data on ammonia, organic particle precursors, and primary fine-particle species and size distributions are much less comprehensive or reliable (Placet et al., 1990). Acquiring accurate input data for primary particle size distributions is technically difficult and expensive for regional models, and the lack of such information frequently limits the accuracy of model
From page 189...
... Procedures for selecting and weighting representative episodes need to be developed for visibility modeling. Mechanistic models can respond to the full range of geographic settings likely to be encountered, subject to the availability of suitable input data.
From page 190...
... Administrative Feasibility The administrative feasibility of using mechanistic visibility models for regulatory purposes will be governed by the resources that these models require. Mechanistic models require substantially more computer and personnel resources than do other source apportionment methods.
From page 191...
... Mechanistic models are well suited for assessing a wide variety of emissions control strategies. Because emissions must be clearly described as input data to the model, it is generally straightforward to simulate the effect on emissions and air quality of specific source controls.
From page 192...
... Balance Mechanistic models demand a balance between field data acquisition and computational analysis. Problems can arise because both of these activities require a high level of effort.
From page 193...
... The accuracy of current models is limited by incomplete understanding of atmospheric aerosol phenomena, especially particle size distribution. Consequently, mechanistic modeling for source apportionment of visibility impairment is still in its infancy.
From page 194...
... The details of particle size distribution and morphology are likely to reflect such influences. The contents of an air parcel then depend on its dilution history, and transport and transformation are inextricably coupled.
From page 195...
... For example, contributions from fugitive soil or road dust sources are easily identified by chemical mass balance methods; transport models that use fluid mechanics have great difficulty representing fugitive dust sources correctly because accurate emissions inventories for these sources are hard to assemble. On the other hand, particle formation can be tracked by certain mechanistic models for atmospheric transport and chemical reaction, whereas chemical mass balance models that are fully satisfactory for secondary-particle source apportionment have not been developed.
From page 196...
... Widespread haze and plume blight are idealized categories which do not exhaust the range of possible forms of visibility impairment. They can be viewed as extreme cases between which there is a continuum of possibilities for visibility impairment.
From page 197...
... As discussed elsewhere in the report (see Chapter 7), if significant progress is to be made in preventing and remedying visibility impairment in Class I areas, source contributions not just to plume blight but also to regional haze must be considered.
From page 198...
... Since plume blight occurs near sources, plume blight models treat advection and dispersion much more simply. The standard EPA models rely on standard Gaussian plume parameterizations developed to predict ground-level effects (EPA, 1970).
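For reference, the Gaussian plume relation underlying such parameterizations gives the concentration downwind of a continuous point source in terms of dispersion coefficients that grow with distance. The power-law dispersion coefficients in the sketch below are placeholders chosen for illustration, not those of any particular regulatory stability class.

```python
import numpy as np

# Gaussian plume concentration from a continuous point source, including
# ground reflection.  The dispersion-coefficient power laws below are
# illustrative placeholders, not a specific stability-class scheme.

def sigma_y(x_m):
    return 0.08 * x_m ** 0.9      # assumed horizontal dispersion (m)

def sigma_z(x_m):
    return 0.06 * x_m ** 0.85     # assumed vertical dispersion (m)

def plume_concentration(x_m, y_m, z_m, Q_gps=100.0, u_mps=5.0, H_m=150.0):
    """Concentration (g/m^3) at (x, y, z) downwind of a stack with
    effective height H, emission rate Q, and wind speed u."""
    sy, sz = sigma_y(x_m), sigma_z(x_m)
    crosswind = np.exp(-0.5 * (y_m / sy) ** 2)
    vertical = (np.exp(-0.5 * ((z_m - H_m) / sz) ** 2)
                + np.exp(-0.5 * ((z_m + H_m) / sz) ** 2))   # image source
    return Q_gps / (2.0 * np.pi * u_mps * sy * sz) * crosswind * vertical

# Centerline concentration at plume height, 5 km downwind:
print(plume_concentration(5000.0, 0.0, 150.0))
```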
From page 199...
... As noted above, current regulatory programs rely heavily on plume blight models for judging the visibility effects of proposed new sources. However, a proposed source's potential for plume blight does not necessarily correlate well with its potential to form widespread single-source haze or with its contribution to regional haze.
From page 200...
... The simpler plume blight models are most appropriate for use in cases (often near the source) where light extinction within the plume is due to primary particles and NO2.
From page 201...
... Administrative Feasibility The personnel resources required for the use of simple Gaussian plume blight models are readily available to most air-pollution control agencies. Indeed, most agencies already perform calculations using these models as part of the new-source review process.
From page 202...
... Balance The balance between data collection and data analysis is not likely to be distorted when using simple Gaussian plume blight models, as the data and personnel resources needed for their use are modest. Programs that involve reactive plume models must be carefully structured.
From page 203...
... For that reason, even in single-source siting decisions, it may be necessary to consider effects on visibility at spatial scales greater than those of plume blight models or reactive plume models. The clear way to do this is to introduce the proposed new source or modified existing source into an appropriate multiple-source regional-scale model chosen from those described in Appendix C and in the regional haze section of this chapter.
From page 204...
... In many cases complex mechanistic models will describe a broad range of physical and chemical processes associated with the major gaseous and particulate pollutants. As they analyze visibility impairment, these models can simultaneously determine the concentrations, fluxes, and effects of primary emissions as well as secondary oxidants, acids, and aerosols.
From page 205...
... Speciated rollback models linked to light extinction budget calculations represent perhaps the only complete system of analysis that can be used for regional haze source apportionment throughout the United
From page 206...
... Their completion and testing should be pursued as a high priority. Some mechanistic models are available that present a partial picture of the effects of source emissions on pollutant concentrations.
From page 207...
... In summary, we will consider methods for source apportionment that are either available or could be put together from available components. Following our emphasis on a nested approach in which models of increasing difficulty and accuracy are chosen, the most attractive systems are judged to be as follows:

Regional haze assessment:
· Speciated rollback models;
· Hybrid combinations of chemical mass balance receptor models with secondary-particle models;
· Mechanistic transport and secondary-particle formation models used with measured particle size distribution data to facilitate light-scattering calculations.
From page 208...
... First, fully developed mechanistic models for the chemical composition, size distribution, and optical properties of atmospheric particles and gases should be created and tested. Two types of mechanistic models are needed: an advanced reactive plume aerosol process model for analysis of single-source problems close to the source and a grid-based multiple-source regional model

