2005-01-28 Assessment of the Speciated PM Network (Initial Draft, November 2004)

  1. Assessment of the Speciated PM Network (Initial Draft, November 2004)
     Washington University, St. Louis; CIRA/NPS VIEWS Team
  2. Contents
     - Background and Approach
       - Aerosol Speciation and the NCore Network
       - Aerosol Species in IMPROVE and SPEC Networks
       - Technical Approach to the Network Assessment
     - Temporal Data Coverage
       - Long-Term Trend Data: Fine Mass, SO4, K (1988-2003)
       - Recent Trends (1999-2003): Fine Mass, Sulfate, Nitrate, OC (IMPROVE), OC_NIOSH (SPEC), EC (IMPROVE), EC_NIOSH (SPEC)
     - Spatial Data Coverage
       - Total Data Stock Maps: Fine Mass, SO4, NO3, OCf, OCf_NIOSH, EUf
       - Fine Mass, Sulfate, Nitrate, Europium
     - Information Value of Stations: Error for SO4
       - Estimation Error: Single Day
       - Estimation Error: Long-Term Average
     - Temporal and Spatial SO4 Characterization
     - Assessment Summary
  3. Aerosol Speciation and the NCore Network
     - NCore is intended to characterize the pollutant pattern for many applications
     - Speciation provides the data for aerosol source apportionment
     - In NCore, 'core species' are measured at Level-1 and Level-2 sites
     - Currently (2003), there are more than 350 speciation sites
     - The challenge is to assess the evolution and status of the speciation network
  4. Technical Approach to the Network Assessment
     - This draft Network Assessment is a collaboration of the CAPITA and CIRA groups
     - CIRA has created an integrated speciation database as part of the RPO VIEWS project
     - CAPITA has applied the analysis tools of DataFed to the VIEWS database
     - The results of the assessment analysis are presented (this PPT) to the NCore team
     - Guided by the evaluation and feedback from NCore, the assessment is revised
     [Diagram: speciated data flow and processing. IMPROVE and EPA SPEC data enter the CIRA/VIEWS database, flow via CIRA and DataFed tools and processes to the CAPITA/DataFed database, and feed the Network Assessment PPT; the EPA NCore process provides evaluation and feedback.]
  5. Aerosol Species Monitored: Snapshot for Aug 19, 2003
     - IMPROVE monitors over 30 species
     - EPA monitors over 40 species
     - About 25 species are reported by both
     [Chart: data counts by species for the IMPROVE and SPECIATION networks]
  6. Long-Term Monitoring: Fine Mass, SO4, K
     - Long-term speciated monitoring began in 1988 with the IMPROVE network
     - Starting in 2000, the IMPROVE and EPA networks have expanded
     - By 2003, the IMPROVE + EPA species were sampled at 350 sites
     - In 2003, the FRM/IMPROVE PM25 network was reporting data from over 1200 sites
     [Charts: long-term station counts for Fine Mass, Sulfate, and Potassium]
  7. Fine Mass, Sulfate, Nitrate Monitoring (1999-2003)
     - The daily valid station count for sulfate has increased from 50 to 350
     - About 250 sites sample every 3rd day, 350 sites every 6th day
     [Charts: daily valid station counts for Fine Mass, Sulfate, and Nitrate, with every-3rd-day and every-6th-day sampling schedules marked]
  8. OC (IMPROVE), OC_NIOSH (SPEC), 1999-2003
     - Organic Carbon measurements at the IMPROVE (OCf) and EPA (OCf_NIOSH) sites are not fully compatible
     - Since 2000, IMPROVE OCf monitoring has increased from 50 to 150 sites
     - Since 2001, the EPA network has grown from 20 to over 200 sites
     - (Need to separate STN?? Which are the STN site codes?)
     [Charts: station counts for IMPROVE OCf and EPA OCf_NIOSH]
  9. EC (IMPROVE), EC_NIOSH (SPEC), 1999-2003
     - Same pattern as Organic Carbon ... redundant??
     [Charts: station counts for IMPROVE ECf and EPA ECf_NIOSH]
  10. Data Stock: Fine Mass, SO4
      - The data stock is the accumulated data resource
      - The IMPROVE sites that began in 1988 have over 1500 samples (red circles)
      - The more recent EPA sites have 400 or fewer samples
      [Maps: data stock for IMPROVE + EPA Fine Mass and Sulfate]
  11. Data Stock: OCf, OCf_NIOSH, NO3, EUf
      - The IMPROVE data stock for OCf and NO3f is over 1500 samples; for EPA OCf_NIOSH it is under 400
      - Europium is only measured at the EPA sites, with ~400 samples or fewer
      [Maps: data stock for IMPROVE OCf, IMPROVE/EPA NO3f, EPA OCf_NIOSH, and EPA Europium]
  12. Evolution of Spatial Data Coverage: Fine Mass, 1998-2003
      - Before 1998, IMPROVE provided much of the PM2.5 data (non-FRM EPA PM25 not included here)
      - In the 1990s, the mid-section of the US was not covered
      - By 2003, the PM2.5 sites (1200+) covered most of the US
      [Maps: annual site coverage, 1998-2003]
  13. Evolution of Spatial Data Coverage: Fine Sulfate, 1998-2003
      - Before 1998, IMPROVE provided much of the PM2.5 sulfate data
      - In the 1990s, the mid-section of the US was not covered
      - By 2003, the IMPROVE and EPA sulfate sites (350+) covered most of the US
      [Maps: annual site coverage, 1998-2003]
  14. Evolution of Spatial Data Coverage: Nitrate, 1998-2003
      - Same pattern as Sulfate
      [Maps: annual site coverage, 1998-2003]
  15. Evolution of Spatial Data Coverage: Europium, 1998-2003
      - Europium and many other trace elements are only measured in the EPA network
      - Starting in 2001, the EPA network expanded to over 200 sites
  16. Site Information Value
      - The information value of a site can be measured by how much it reduces the uncertainty of concentration estimates
      - If the concentration at a site can be estimated accurately from auxiliary data (estimation error is low), then the information content of that site is low
      - If the estimate is poor (estimation error is high), then the information content of that site is high, since having that site reduces the concentration uncertainty
      - Thus, estimation error is a measure of the information content of a specific site
      Estimation Error Calculation
      - Cross-validation estimates the information value of individual stations by
        - removing each monitor, one at a time
        - estimating the concentration at the removed monitor location by the 'best available' spatial extrapolation scheme
        - calculating the error as a difference (Estimated - Measured) or ratio (Estimated/Measured)
      - The 'best available' estimated concentration is calculated using
        - de-clustering by the method of S. Falke
        - interpolation using 1/r^4 weighting
        - the nearest 10 stations within a 600 km search radius
      - In the following examples, the error is estimated for two extreme cases:
        - a single day with significant spatial gradients (Aug 19, 2003)
        - the grand-average concentration field with a smooth pattern
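The leave-one-out scheme above can be sketched in a few lines. This is a minimal illustration, not the production DataFed code: it omits the S. Falke de-clustering step, and the `haversine_km` helper, function names, and sample coordinates are all hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def idw_estimate(target, stations, power=4, k=10, max_km=600.0):
    """Inverse-distance-weighted estimate at `target` from the nearest
    k stations within max_km, using 1/r^power weighting (slide's scheme)."""
    lat0, lon0 = target
    neighbors = []
    for (lat, lon), value in stations:
        d = haversine_km(lat0, lon0, lat, lon)
        if 0 < d <= max_km:
            neighbors.append((d, value))
    neighbors.sort()          # closest first
    neighbors = neighbors[:k]
    if not neighbors:
        return None
    wsum = sum(1.0 / d ** power for d, _ in neighbors)
    return sum(v / d ** power for d, v in neighbors) / wsum

def cross_validate(stations, **kw):
    """Leave-one-out: remove each station, estimate its concentration from
    the rest, and return (estimated, measured) pairs for error analysis."""
    results = []
    for i, (loc, measured) in enumerate(stations):
        rest = stations[:i] + stations[i + 1:]
        est = idw_estimate(loc, rest, **kw)
        if est is not None:
            results.append((est, measured))
    return results
```

From the (estimated, measured) pairs, the difference (Estimated - Measured) and ratio (Estimated/Measured) maps follow directly.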
  17. SO4 Estimation Error: Single Day (Aug 19, 2003)
      - The measured concentration contour plot is shown for all data
      - The estimated contour uses only values estimated from neighboring sites
      - The difference map indicates +/- 5 ug/m3 estimation errors (need to check)
      - The ratio map shows the error to be over 50%
      [Maps: Measured, Estimated, Difference, and Ratio]
  18. Estimation Error: Long-Term Average (2000-2003)
      - The long-term average concentration pattern is smoother than for a single day
      - The concentration error (difference) is below +/- 1 ug/m3 for most stations in the East
      - The error/measured ratio is below 20% except at a few (<5) sites
      - In the West, the errors are higher due to varying site elevation and topographic barriers
      [Maps: Measured, Estimated, Difference, and Ratio]
  19. Average SO4: Estimated vs. Measured
      - The Estimated-Measured correlation at Eastern sites is r^2 ~ 0.8
      - For the Western US sites, the correlation is only r^2 ~ 0.5
  20. Temporal and Spatial SO4 Characterization
      - For SO4, the temporal coverage is every 3 or 6 days
      - The typical 'transport distance' between 3-day samples (at 5 m/s wind speed) is about 1500 km
      - On the other hand, the characteristic distance between sites is about 160 km (total area 9x10^6 km^2, 350 sites)
      - Thus, the spatial sampling provides at least 10-fold more 'characterization' than the every-3rd-day temporal sampling
      [Maps: 1-day and 3-day transport distances]
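The back-of-envelope numbers above can be reproduced directly; the wind speed, area, and site count are the slide's own assumed values. Note that 5 m/s over 3 days gives ~1300 km before rounding to the slide's ~1500 km figure.

```python
# Transport distance covered between every-3rd-day samples,
# at the slide's assumed 5 m/s wind speed.
wind_speed_m_s = 5.0
seconds_per_day = 86400
transport_km = wind_speed_m_s * seconds_per_day * 3 / 1000.0  # ~1300 km

# Characteristic spacing of 350 sites over ~9e6 km^2:
# the side of the square each site occupies on average.
area_km2 = 9.0e6
n_sites = 350
spacing_km = (area_km2 / n_sites) ** 0.5  # ~160 km

# Spatial sampling resolves the field roughly this many times
# more finely than the 3-day temporal sampling does.
ratio = transport_km / spacing_km
```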
  21. Speciation Network Assessment Summary (Initial Draft, November 2004)
      - Since 2000, speciated aerosol monitoring has grown from 50 to 350 sites
      - IMPROVE and EPA sites have accumulated 1500 and 400 data points, respectively
      - By 2003, the spatial coverage of speciated sampling was high throughout the US
      - For long-term SO4 averages, the estimation error over the East was below 1 ug/m3
      - For a specific day with strong SO4 gradients, the error was up to 5 ug/m3
      - The 350 sites provide at least 10-fold more 'characterization' than the every-3rd-day sampling
  22. AIRNOW PM25 vs. ASOS RH-Corrected Bext
      [Maps: AIRNOW PM25 and ASOS RHBext for July 21, 22, and 23, 2004]
  23. Quebec Smoke, July 7, 2002
      [Map: satellite optical depth and surface ASOS RHBext]
  24. A Note to the NCore Implementation Managers: From Ad Hoc to Continuous Network Assessment
      - By design, NCore will be changing in response to evolving conditions
      - Nudging NCore toward desired goals requires assessment and feedback
      - FRM mass and speciation monitoring is now ready to be 'monitored'
      - The indicators can be calculated from the VIEWS integrated database
      - Many assessment tools (maps, charts, CrossVal) are developed or feasible
      - ... so, it may be time to consider ...
      - Automated Network Assessment as a routine part of monitoring
  25. Continuous Speciated Network Assessment: A Feasibility Pilot Project
      - Currently, network assessments are done intermittently with ad hoc tools
      - Network status and trends monitoring is now possible with web-based tools
      - A 'pilot feasibility project' could aid the design of operational network assessment
      - Such automatic feedback would contribute to the agility of the monitoring network
      [Diagram: the IMPROVE and EPA SPEC monitoring networks feed the CIRA/VIEWS and CAPITA/DataFed databases; analysis tools produce both the Network Assessment PPT and an Automatic Assessment WebTool, which, together with many other factors, drives network adjustment.]
  26. Data Life Cycle: Acquisition Phase to Usage Phase
      - A 'force' is needed to move data from one-shot to reusable form
      - External force: contracts
      - Internal force: humanitarian motives, benefits
  27. The Researcher/Analyst's Challenge
      "The researcher cannot get access to the data; if he can, he cannot read them; if he can read them, he does not know how good they are; and if he finds them good he cannot merge them with other data."
      Information Technology and the Conduct of Research: The User's View, National Academy Press, 1989
  28. Data Flow Resistances
      The data flow process is hampered by a number of resistances:
      - The user does not know what data are available
      - The available data are poorly described (metadata)
      - There is a lack of QA/QC information
      - Incompatible data cannot be combined and fused
      These resistances can be overcome through a distributed system that catalogs and standardizes the data, allowing easy access for data manipulation and analysis.
