Dirty Air Down Under

Introduction
Like their counterparts in Seattle or Southern California, residents of Brisbane, Australia, regard the rapidly growing population of their city with some concern. The area's warm climate, beautiful beaches and rainforests have made it a population magnet in recent years, and the current population of 1.5 million is expected to continue expanding at a furious pace. Brisbane's local government rightly asks just how much growth the city can withstand before the quality of the air is affected. Brisbane City Council, one of the largest local governments in the world, has enlisted scientists to study the region's airshed, the "box" of atmosphere above a selected region, and to help project answers to various "what if?" scenarios, such as how "x" number of autos on local roadways or the addition of a power plant would affect air quality.
In 1992, the Environmental Aerosol Laboratory (EAL), which specializes in environmental aerosol science with a particular focus on fine and ultrafine particles, was established within the Faculty of Science at the Queensland University of Technology (QUT). The EAL conducts fundamental and applied research in the complex, interdisciplinary scientific field of atmospheric particles and their impact on human health. The atmospheric chemistry and modeling sub-group of the EAL conducts research in two interrelated areas: experimental and modeling studies of the ultrafine particles found in smog, to understand the chemical and photochemical transformation of pollutants; and computational modeling of the chemistry, transport and distribution of air pollutants over a large district (airshed).
Background
For several decades, ozone (the major component of photochemical smog) has been used as a measure of the air pollution that results from reactions of hydrocarbons and nitrogen oxides, which are emitted mostly by vehicles and industry, in the presence of ultraviolet radiation. While ozone measurements are relatively straightforward indicators of smog, other products of these reactions have been found to cause considerable negative health effects in humans -- ultrafine particles, for example. Particulate matter (PM) is typically measured as PM10 and PM2.5, the mass concentrations of particles smaller than 10 micrometers and 2.5 micrometers, respectively (the thickness of a human hair is approximately 100 micrometers), but ultrafine particles formed in photochemical reactions are typically smaller than 0.3 micrometers. It is the large numbers and small sizes of these particles that result in their deposition in the alveoli of our lungs, rather than being filtered by our respiratory system. Because these particles provide a mechanism by which carcinogenic and toxic substances can be delivered to the depths of our lungs, it is important to understand how they are formed and how their formation may be minimized. It is also important to understand how ultrafine particulate matter is dispersed across an airshed, so that new roads or industry are not positioned where they will affect existing residential areas, and so that new residential areas are not developed in a pollution "hot spot." To do this, air movements in three dimensions (north-south, east-west and vertical) in the airshed need to be understood.
The steps of data processing, the subsequent statistical calculations and the generation of graphs are not, in themselves, research. Research is the interpretation of those statistics and graphs and the drawing of logical deductions that can be used to improve a model's performance or to conceptualize an important process that the model has simulated. Reducing the time needed to run the model, calculate the associated statistics and prepare visualizations of the data (plots) therefore results in a faster pace of research.
Studies of the formation of ultrafine particles have increased in the last decade due to the availability of new technology, such as high-resolution Scanning Mobility Particle Sizers (TSI Inc.), greater computing power and better computer software. Increases in available computing power have meant that both data-intensive smog chamber studies and airshed modeling studies can become more complex and thus more computationally demanding. Such studies require further computer-intensive calculations to investigate and validate the performance of the models. The vast quantities of data that result from experiments and models are almost impossible to analyze without high-performance graphing and data analysis tools, such as Origin from OriginLab Corp.
Analysis Methodology
Smog Chamber Experiments
The smog chamber used in the studies of fine particulate formation is an 18.1-cubic-meter Teflon-walled enclosed room with two banks of UV lamps, designed and built by the Commonwealth Scientific and Industrial Research Organisation (CSIRO). The chamber is cleaned thoroughly, so that it contains no particles, and known amounts of nitrogen oxides and toluene (a component of gasoline) are injected and well mixed. The UV lamps are then turned on at intensities that simulate noon sunlight, and the chamber is monitored for up to eight hours. Gaseous pollutants are monitored by continuous measurement or by taking samples at regular intervals, with analytical techniques used to measure concentrations of nitrogen oxides, ozone and the many hydrocarbon products found in smog. Scanning Mobility Particle Sizers classify and count particles between 0.003 and 0.7 micrometers as a function of time.
These experiments typically generate up to 30,000 data points of various data types in a single eight-hour experiment. Most data validation and analysis is performed on a desktop computer. Subsequent analysis focuses on developing computer models that simulate the underlying chemistry of the experimental findings. Some of this modeling is performed on a 60-processor Silicon Graphics computer, and subsequent analysis and comparison of experimental and modeled data is performed on a desktop computer using Origin graphing and data analysis software.
Figure 1 illustrates how the concentration (number concentration and volume concentration) of particles changes in a photochemical system. The number concentration is important because it describes how many particles are in one cubic centimeter. The volume concentration describes the actual amount (effectively, the mass) of particles formed. The plot shows that although the total number of particles per cubic centimeter (i.e., the total area of each number distribution) does not seem to increase a great deal over the second hour of this experiment, and in fact starts to decrease, the volume (mass) of particles does increase. This occurs because chemicals condense onto the existing particles rather than creating new particles, so the existing particles get bigger and heavier. In this way, particle size and total volume increase over time. The decrease in total aerosol number after 90 minutes is due to coagulation, in which two or more particles stick together, reducing the total number of particles but having no effect on the total volume/mass.
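The relationship between the two concentration measures can be sketched numerically. The snippet below, using invented bin values rather than the chamber data shown in Figure 1, computes a total volume concentration as the sum of N_i x (pi/6) x d_i^3 over the size bins, and shows why condensation (particles growing with their number fixed) raises the volume concentration so sharply: volume scales with the cube of diameter.

```python
import math

# Hypothetical size-distribution bins (diameters in micrometers) and
# number concentrations (particles per cubic centimeter). These are
# illustrative values only, not measured chamber data.
diameters_um = [0.01, 0.05, 0.1, 0.3]
number_conc = [5000.0, 2000.0, 500.0, 50.0]  # particles/cm^3

def volume_concentration(diameters, numbers):
    """Total particle volume per cm^3: sum of N_i * (pi/6) * d_i^3
    (treating each particle as a sphere of diameter d_i)."""
    return sum(n * (math.pi / 6.0) * d ** 3
               for n, d in zip(numbers, diameters))

total_number = sum(number_conc)  # particles/cm^3
total_volume = volume_concentration(diameters_um, number_conc)  # um^3/cm^3

# Condensation: every particle doubles in diameter, the number of
# particles is unchanged, yet the volume concentration grows 8-fold.
grown_volume = volume_concentration([2.0 * d for d in diameters_um],
                                    number_conc)
```

Coagulation works the other way around: merging two particles into one halves the number concentration for that pair while conserving their combined volume, which is exactly the post-90-minute behavior described above.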
Air Shed Modeling
The air shed studies often use the same computing resources to generate and run simulation models. It is impossible to physically measure every detail of every process that occurs in a system such as an airshed, be it the formation of gaseous pollution or particles, or the movement and chemistry of pollution across the air shed. However, it is possible to measure some of these processes, or parts of them, such as the wind speed and direction, as well as concentrations of some pollutants at several specific locations. Computer models are developed to fill in the gaps and to help explain what is actually going on, as well as how one process can affect another.
The breadth of data being analyzed in this multivariate system is such that automated analysis tools are critical to timely interpretation of modeled air shed data and to keeping the research progressing at a meaningful pace. The air shed model generates up to 10 gigabytes of raw data and can take up to six days to simulate just one month. The data of interest are extracted from this raw data set, and Origin (using either its interpreted language, LabTalk, or its compiled, Origin-aware C language) automatically creates dozens of plots that compare the model output against measured data or against other runs of the model. These plots are reviewed, and decisions about model parameters are made accordingly.
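The core of such a batch comparison -- looping over monitoring sites and scoring model output against observations before any plots are drawn -- can be sketched as follows. This is a minimal Python illustration with invented site names and ozone values, not the Origin/LabTalk scripts described above; the bias and root-mean-square error (RMSE) statistics are standard model-evaluation measures assumed here for the example.

```python
import math

# Hypothetical hourly ozone series (arbitrary units) for two monitoring
# sites; real runs would extract these series from the raw model output.
observed = {"SiteA": [20.0, 35.0, 50.0, 42.0],
            "SiteB": [15.0, 25.0, 30.0, 28.0]}
modeled = {"SiteA": [18.0, 40.0, 55.0, 40.0],
           "SiteB": [12.0, 22.0, 35.0, 30.0]}

def bias(obs, mod):
    """Mean of (model - observation): positive means over-prediction."""
    return sum(m - o for o, m in zip(obs, mod)) / len(obs)

def rmse(obs, mod):
    """Root-mean-square error between model and observation."""
    return math.sqrt(sum((m - o) ** 2 for o, m in zip(obs, mod)) / len(obs))

# One pass over every site, as an automated batch job would do before
# handing the numbers to the plotting stage.
stats = {site: (bias(observed[site], modeled[site]),
                rmse(observed[site], modeled[site]))
         for site in observed}
```

In the workflow the article describes, a script like this would run once per model configuration, so that a six-day simulation is followed by minutes, not days, of evaluation.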
Windroses are used to present wind speed and wind direction data that have been collected over a period of time, so that the dominant wind pattern for a particular area can be determined. Windroses are also useful because they present a large quantity of data in a simple graphical plot. In Figure 2, the windrose plots compare observed wind data against wind data calculated from a computer model. The time period is one month and the data are hourly averages. The length of each "arm" is proportional to the fractional frequency at which that windspeed (and below) was observed from that direction. Different colors on each "arm" indicate the windspeed. For example, in the left windrose, most winds came from the south to southwest, but winds with higher windspeeds came from the west. The generation of this plot is entirely automated. First, raw wind data are processed externally, and a matrix of the counts with respect to windspeed and direction is imported into Origin. Similar windrose plots are generated and formatted (names, dates, percentages, etc.) automatically for up to 18 different sites at which wind data are monitored.
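The externally processed counts matrix behind each windrose can be built as in the sketch below. This is an illustrative Python version with invented hourly records, not the actual processing code used by the laboratory; the 16-sector, 22.5-degree binning and the speed-class edges are common windrose conventions assumed for the example.

```python
# Hypothetical hourly wind records: (direction in degrees from north,
# speed in m/s). Illustrative values, not Brisbane monitoring data.
records = [(200.0, 3.2), (210.0, 5.1), (225.0, 2.0), (270.0, 8.4),
           (180.0, 1.1), (195.0, 4.4), (265.0, 7.9), (350.0, 2.5)]

SPEED_BINS = [2.0, 4.0, 6.0, float("inf")]  # upper edges of speed classes, m/s
N_SECTORS = 16                              # 22.5-degree direction sectors

def windrose_counts(records):
    """Count observations into a direction-sector x speed-class matrix,
    the table a windrose plot is drawn from."""
    counts = [[0] * len(SPEED_BINS) for _ in range(N_SECTORS)]
    for direction, speed in records:
        # Shift by half a sector so sector 0 is centered on north
        # (348.75 to 11.25 degrees), the usual windrose convention.
        sector = int(((direction + 11.25) % 360.0) / 22.5)
        for k, upper in enumerate(SPEED_BINS):
            if speed <= upper:
                counts[sector][k] += 1
                break
    return counts

counts = windrose_counts(records)
```

Dividing each cell by the total number of records gives the fractional frequencies that set the arm lengths and color segments described above.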
Conclusion
By incorporating results from smog chamber research into air shed modeling tools, atmospheric researchers aim to simulate the chemistry, transport and distribution of pollutants, as well as the formation and growth of ultrafine particles, within regional air sheds. The pace of scientific discovery in this field has been improved dramatically by new technologies and computational tools, both hardware and software: even 10 years ago, the cost and limitations of available computing power were such that these aerosol studies would have proceeded much more slowly.
Aaron Wiegand, PhD, is currently employed as a postdoctoral research fellow at the Center for Instrumental and Developmental Chemistry, Queensland University of Technology, Brisbane, Australia, to undertake research into the formation of secondary particulates. He wishes to acknowledge the support of Environment Australia, the Australian Commonwealth's Department of the Environment and Heritage, the CSIRO, Brisbane City Council and the Environmental Aerosol Laboratory, QUT and his immediate supervisor, Dr Neville Bofinger. He can be reached at <a href="mailto:[email protected]">[email protected]</a>.
e-sources
Brisbane City Council -- www.brisbane.qld.gov.au/council_at_work/environment/air/
Environmental Aerosol Laboratory -- www.sci.qut.edu.au/physci/cmhp/aerosol/
Queensland University of Technology -- www.qut.edu.au
TSI Inc. -- www.tsi.com
OriginLab Corp. -- www.OriginLab.com
Commonwealth Scientific and Industrial Research Organisation (CSIRO) -- www.csiro.com
This article originally appeared in the October 2002 issue of Environmental Protection, Vol. 13, No. 9, p. 22.