Slide 1: Validating Satellite Land Surface Temperature Products for GOES-R and JPSS Missions
Yunyue Yu, Mitchell Goldberg, Ivan Csiszar
NOAA/NESDIS, Center for Satellite Applications and Research
Slide 2: Motivation - LST validation needs
LST products have been derived from different sensors for decades:
- POES
  - NOAA AVHRR (TIROS-N to NOAA-19) since 1978
  - ESA ATSR/AATSR since 1991
  - EOS MODIS since 1999
  - MetOp AVHRR since 2006
- GOES
  - US GOES Imagers since 1975
  - Meteosat MVIRI since 1995, then SEVIRI since 2004
- Future LST products
  - JPSS/NPP
  - GOES-R
- LST climate data record
Slide 3: Motivation - LST validation issues
LST validation difficulties:
- In situ data limitation
  - Measurement difficulty
- Cloud contamination effects
  - Particularly partial or thin cloudy pixels
- Spatial and temporal variations
  - Spot vs. pixel difference
  - Sub-pixel heterogeneity
  - Accurate match-up process (different sampling rates and sampling times)
- Others
  - e.g., angle effects
Figure: surface heterogeneity shown in a 4 km x 4 km Google map (1 km x 1 km center box) around the Bondville station area.
Slide 4: Approach - Strategy
- Use existing ground observation data
  - Cost considerations
  - Site representativeness and selection: characterization analysis
- Match-up dataset generation
  - Stringent cloud filtering: additional measures
  - Data pair quality control
- Site-to-pixel model development
  - Synthetic pixel analysis using high-resolution sensor data
  - Proxy data testing
  - Real satellite data evaluation
- Validation methodology
  - Direct comparisons
  - Indirect comparisons
Slide 5: Approach
Synthetic pixel analysis using ASTER data: an integrated approach for site representativeness analysis and site-to-pixel model development.
- Quantitatively characterize the sub-pixel heterogeneity and decide whether a ground site adequately represents the satellite pixel. The sub-pixels may be generated from the pixels of a higher-resolution satellite.
- For a pixel that is relatively homogeneous, analyze the statistical relationship between the ground-site sub-pixel and the surrounding sub-pixels: {T(x, y, t)} ~ T(x0, y0, t0).
- Establish a relationship between the target pixel and its sub-pixels (an up-scaling model), e.g., Tpixel = T(x, y) + dT (possibly time dependent); a numerical sketch follows this slide.
Figure: the synthetic pixel / sub-pixel model, showing ASTER pixels, the site pixel, and a MODIS pixel.
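To make the up-scaling idea concrete, here is a minimal Python sketch, assuming the high-resolution sub-pixel LSTs for one coarse pixel are already available as a NumPy array. The simple areal mean used for Tpixel, and all names here, are illustrative choices, not the authors' implementation.

```python
import numpy as np

def upscale_stats(subpixel_lst, site_row, site_col):
    """Characterize the sub-pixel heterogeneity of one coarse pixel and relate
    the ground-site sub-pixel T(x0, y0) to the coarse-pixel LST."""
    t_site = subpixel_lst[site_row, site_col]    # T(x0, y0): sub-pixel containing the ground site
    t_pixel = np.nanmean(subpixel_lst)           # simple areal mean as the coarse-pixel LST, Tpixel
    spread = np.nanstd(subpixel_lst)             # sub-pixel heterogeneity (K)
    delta_t = t_pixel - t_site                   # dT in the up-scaling model Tpixel = T(x0, y0) + dT
    return {"T_site": float(t_site), "T_pixel": float(t_pixel),
            "subpixel_std": float(spread), "delta_T": float(delta_t)}

# Toy example: a 13 x 13 block of 90 m sub-pixel LSTs inside one ~1 km pixel
rng = np.random.default_rng(0)
block = 300.0 + rng.normal(0.0, 1.5, size=(13, 13))
print(upscale_stats(block, site_row=6, site_col=6))
```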
Slide 6: Approach
A site characterization simulation model: synthesizing a VIIRS pixel from higher-resolution ASTER TIR (90 m) scene pixels.
- Each synthetic pixel encloses the target ground site, but the distance between the ground site and the center of the synthetic pixel varies, mimicking the possible overpassing VIIRS swaths.
- The distance of every synthetic pixel center from the ground site is within one pixel size (~1 km).
- Different colors are used for the 9 synthetic pixels, and the center of each pixel is marked with a small numbered square of the corresponding color.
- The numbers on the squares are the pixel IDs used in the subsequent analysis.
Figure legend: colored squares mark the ground site and the synthetic VIIRS pixels.
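A sketch of how such synthetic pixels could be assembled from an ASTER LST grid, under the assumption (taken from the later statistics slides) that one ~1 km pixel is approximated by a 13 x 13 block of 90 m ASTER pixels. The particular grid of center shifts is illustrative only.

```python
import numpy as np

ASTER_RES_M = 90.0   # ASTER TIR pixel size (m)
BLOCK = 13           # 13 x 13 ASTER pixels approximate one ~1 km pixel

def synthetic_pixels(aster_lst, site_row, site_col, shifts_m=(-405.0, 0.0, 405.0)):
    """Build 9 synthetic ~1 km pixels around a ground site from an ASTER LST grid.

    Each synthetic pixel is a BLOCK x BLOCK mean whose center is shifted around
    the ground site, so the site stays inside every synthetic pixel while the
    pixel center moves, mimicking different overpass geometries.  The +/-405 m
    shift grid is an illustrative choice, not taken from the slides.
    """
    half = BLOCK // 2
    results = {}
    pixel_id = 0
    for dy in shifts_m:
        for dx in shifts_m:
            r = site_row + int(round(dy / ASTER_RES_M))
            c = site_col + int(round(dx / ASTER_RES_M))
            window = aster_lst[r - half:r + half + 1, c - half:c + half + 1]
            results[pixel_id] = float(np.nanmean(window))
            pixel_id += 1
    return results   # {pixel ID: synthetic-pixel mean LST in K}

# Toy example on a synthetic 200 x 200 ASTER-like scene
rng = np.random.default_rng(1)
scene = 295.0 + rng.normal(0.0, 2.0, size=(200, 200))
print(synthetic_pixels(scene, site_row=100, site_col=100))
```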
Slide 7: Approach
Quantification of the difference between the synthetic pixel and the ground measurement (the evaluation quantities and the site-to-pixel model are given as equations on the original slide).
Note:
- Sub-pixel heterogeneity
- Systematic bias between ASTER and ground measurements
Symbols: Tsat = satellite pixel LST; Tg = ground-site LST; Tc = central sub-pixel LST.
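Reading this slide together with the later Results tables, the evaluation amounts to comparing the ground LST against the synthetic-pixel and nearest sub-pixel LSTs over all matched scenes. The sketch below computes the mean/STD statistics that those tables report; the symbol mapping (Ts, Ta, Tc) follows the definitions given on the Desert Rock slide, and the function itself is illustrative, not the authors' code.

```python
import numpy as np

def site_pixel_difference_stats(ts, ta, tc):
    """Mean and standard deviation of the ground-to-pixel differences that are
    tabulated on the Results slides.

    ts : 1-D array of ground-site (SURFRAD) LSTs, one per matched scene
    ta : 1-D array of synthetic-pixel average LSTs for the same scenes
    tc : 1-D array of LSTs of the ASTER sub-pixel nearest to the site
    """
    ts, ta, tc = map(np.asarray, (ts, ta, tc))
    stats = {}
    for name, diff in (("Ts - Ta", ts - ta), ("Tc - Ta", tc - ta), ("Ts - Tc", ts - tc)):
        stats[name] = {"mean": float(np.mean(diff)), "std": float(np.std(diff))}
    return stats

# Toy example with three matched scenes (illustrative numbers only)
print(site_pixel_difference_stats(ts=[300.2, 295.1, 288.4],
                                  ta=[302.0, 296.8, 290.1],
                                  tc=[301.5, 296.0, 289.7]))
```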
Slide 8: Data Sets
Data acquisition:
- Inventory preparation of clear-sky ASTER scenes passing over 23 ground sites (SURFRAD and CRN) during 2001-2007.
- Collection of the clear-sky ASTER data sets associated with six SURFRAD sites and two CRN sites:
  - AST_04, AST_05, AST_08 and L1B (in total, about 2000 ASTER swath scenes).
- Collection of the ground data from the six SURFRAD sites and the two CRN sites.
- Collection of MODIS LST product data (MOD11_L2):
  - All swaths passing over the SURFRAD sites in 2001.
  - All swaths corresponding to the ASTER scenes during 2001-2007.
- Collection of narrow-band emissivity data sets:
  - UW-Madison Baseline Fit Emissivity Database.
  - North American ASTER Land Surface Emissivity Database (NAALSED).
- Collection of NCEP reanalysis TPW data sets.

Slide 9: Data Sets
- Satellite LST: MODIS LST, ASTER LST, and GOES LST.
- Ground LST: derived from SURFRAD site measurements.
- Candidate ground sites and database.
Datasets used:
- SURFRAD data.
- ASTER data, data period 2001-2007 (ASTER data courtesy of Shunlin Liang).
Table: Matched ASTER Data.
Slide 10: Data Sets
Ground-site broadband emissivity.
- Broadband emissivity is regressed from a well-developed narrowband emissivity database, the UW-Madison Baseline Fit Emissivity Database (Seemann et al., 2008).
- Regression coefficients: a = 0.2122, b = 0.3859, c = 0.4029 (Wang, 2004).
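A small sketch of the regression as quoted above. The coefficients are the ones on the slide; the assignment of the three narrowband emissivities to window channels near 8.5, 11 and 12 um follows Wang et al. and is inferred here rather than stated on the slide.

```python
def broadband_emissivity(e_85um, e_11um, e_12um):
    """Broadband emissivity regressed from three narrowband emissivities,
    using the coefficients quoted on the slide (a=0.2122, b=0.3859, c=0.4029).

    The mapping of the inputs to channels near 8.5, 11 and 12 um is an
    assumption based on Wang et al., not stated on the slide.
    """
    a, b, c = 0.2122, 0.3859, 0.4029
    return a * e_85um + b * e_11um + c * e_12um

# Example with typical vegetated-surface narrowband emissivities
print(broadband_emissivity(0.955, 0.982, 0.985))   # about 0.978
```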
Slide 11: Processing
General components of the validation processing (flow chart):
- Satellite data reader and ground data reader.
- Geolocation match-up and time match-up, producing the match-up datasets.
- Satellite cloud mask and ground data mask.
- Satellite LST calculation/extraction and ground LST estimation/extraction.
- Manual cloud control.
- Direct comparison; synthetic analysis and correction; indirect comparison.
- Statistical analysis and outputs (plots, tables, etc.).
Slide 12: Processing
Sample match-up flow chart:
- Inputs: satellite data (with its cloud mask) and SURFRAD data.
- Geolocation match-up, then time match-up (within 5 minutes).
- Spatial difference test: brightness-temperature standard deviations over 3x3 pixels; visible check over 0.5 deg.
- Channel brightness-temperature difference tests: (Ts, T10.7um), (T10.7um, T3.9um), (T10.7um, T12um).
- Time-series smoothness check (if available) on the upwelling and downwelling irradiances, as an additional cloud filter.
- Manual tuning, producing the matched dataset.
Note: this flow chart is specific to the GOES Imager; a similar procedure is or will be applied to the ASTER and MODIS/VIIRS data.
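A sketch of how a subset of these screening steps could be coded. Only the 5-minute time window comes from the slide; the spatial-STD and channel-difference thresholds below are placeholders, and the (Ts, T10.7um) test is omitted.

```python
import numpy as np

def passes_matchup_screening(sat_time_s, ground_time_s, bt11_3x3, bt11, bt39, bt12,
                             max_dt_s=300.0, max_spatial_std=1.0, max_split_window=4.0):
    """Sketch of the match-up screening; thresholds other than the 5-minute
    window are illustrative placeholders, not values from the slides.

    bt11_3x3 : 3 x 3 block of longwave-window brightness temperatures around
               the site pixel (K), used for the spatial homogeneity test
    bt11, bt39, bt12 : site-pixel brightness temperatures in the ~11, 3.9 and
               12 um channels (K), used for the channel-difference tests
    """
    if abs(sat_time_s - ground_time_s) > max_dt_s:        # time match-up (< 5 min)
        return False
    if np.std(bt11_3x3) > max_spatial_std:                # spatial BT homogeneity (3x3 STD)
        return False
    if abs(bt11 - bt12) > max_split_window:               # split-window channel difference
        return False
    if abs(bt11 - bt39) > 2.0 * max_split_window:         # longwave vs midwave channel difference
        return False
    return True

# Example: a candidate match-up 3 minutes apart over a homogeneous clear scene
block = np.full((3, 3), 291.0) + np.arange(9).reshape(3, 3) * 0.05
print(passes_matchup_screening(1000.0, 1180.0, block, bt11=291.2, bt39=293.0, bt12=289.8))
```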
Slide 13: Processing
Data processing/analysis:
- Clear-sky case analysis
  - Cloud and clear-sky climatology analysis (for site selection)
  - ASTER clear-sky swath selection from the ASTER inventory in the Warehouse Inventory Search Tool
- Ground broadband emissivity regression analysis
- SURFRAD LST estimation from PIR measurements (a sketch follows this list)
- Spatial and temporal match-up among ground sites, ASTER scenes and MODIS scenes
  - Geolocation mapping of ASTER pixels as the sub-pixels of a MODIS pixel
- Quality control and enhanced cloud filtering
  - Processing of ASTER LST QC information
  - Processing of ASTER emissivity QC information
  - Processing of MODIS cloud masks
  - Processing of MODIS LST QC information
  - Surface observations
- Statistical testing
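For the "SURFRAD LST estimation from PIR measurements" item, the sketch below uses the standard inversion of the surface longwave budget from pyrgeometer irradiances; the exact form used by the authors is not shown on the slides, so treat this as an illustration.

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant (W m-2 K-4)

def surfrad_lst(lw_up, lw_down, broadband_emissivity):
    """Ground LST from SURFRAD pyrgeometer (PIR) irradiances by inverting the
    surface longwave budget:

        LST = [ (F_up - (1 - eps) * F_down) / (eps * sigma) ] ** 0.25

    lw_up, lw_down : upwelling / downwelling longwave irradiance (W m-2)
    broadband_emissivity : site broadband emissivity (e.g., from the slide 10 regression)
    """
    eps = broadband_emissivity
    return ((lw_up - (1.0 - eps) * lw_down) / (eps * SIGMA)) ** 0.25

# Example with illustrative clear-sky values
print(surfrad_lst(lw_up=465.0, lw_down=320.0, broadband_emissivity=0.97))   # roughly 302 K
```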
Slide 14: Processing
Data processing/analysis (continued):
- Site representativeness analysis and site-to-pixel difference characterization
  - Semi-variance analysis
  - Synthetic analysis
- Site-to-pixel model testing
  - Testing with all the MODIS Terra LST swaths passing over SURFRAD sites in 2001
  - Testing with the MODIS LST swaths corresponding to the ASTER scenes during 2001-2007
- VIIRS LST case studies on the NPP Land PEATE platform
- Development of VIIRS LST algorithm modules for flexible offline testing and algorithm improvement
- Visualization tools for the analysis and results display

Slide 15: Results
Additional cloud filtering is needed to obtain a high-quality satellite-ground match-up dataset.
Figure: left, ASTER cloud-free dataset; right, possible cloud contamination (cloud indicated).
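Slide 12 and the statement above both point to cloud screening beyond the operational cloud mask. One simple form such an extra filter could take, hinted at by the time-series smoothness check on the SURFRAD irradiances, is sketched below; the window length and threshold are placeholders, not values from the presentation.

```python
import numpy as np

def smoothness_cloud_flag(lw_down, window=5, max_local_std=5.0):
    """Illustrative time-series smoothness check for extra cloud screening.

    lw_down : 1-D array of downwelling longwave irradiance samples (W m-2)
              around the satellite overpass time
    Returns a boolean array, True where the local standard deviation exceeds
    max_local_std, hinting at broken cloud.  Window length and threshold are
    placeholders, not values from the slides.
    """
    n = len(lw_down)
    half = window // 2
    flags = np.zeros(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        flags[i] = np.std(lw_down[lo:hi]) > max_local_std
    return flags

# Example: a smooth clear-sky record with a short cloudy bump in the middle
series = np.concatenate([np.full(10, 310.0), [332.0, 355.0, 338.0], np.full(10, 312.0)])
print(smoothness_cloud_flag(series))
```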
Slide 16: Results
Comparison of the temperatures calculated from the synthetic-pixel average (top right), the center pixel (bottom left), and the nearest pixel (bottom right) with the ground-site temperature. The different colors represent the 9 synthetic pixels shown previously.
For this particular site, the location of the ground site within the satellite pixel does not have a significant impact on the validation process, simply because the land surface thermal emission at Desert Rock is fairly homogeneous.
SURFRAD station: Desert Rock.
Slide 17: Results
Sample statistical analysis result for the Desert Rock site (Site = Desert Rock, NV): impact of the pixel-location bias relative to the ground site. All values in K.
Symbols: Ts = LST of the SURFRAD site; Ta = average LST over 13x13 ASTER pixels; Tc = LST of the ASTER pixel nearest to the site.
Ts - Tc (independent of the synthetic-pixel case): Mean = -1.81, STD = 2.46.

Case    | Tc - Ta Mean | Tc - Ta STD' | Ts - Ta Mean | Ts - Ta STD
0       | 0.04         | 0.69         | -1.78        | 2.13
1       | -0.01        | 0.60         | -1.82        | 2.26
2       | 0.08         | 0.61         | -1.74        | 2.20
3       | 0.20         | 0.92         | -1.61        | 1.99
4       | 0.06         | 0.96         | -1.75        | 2.03
5       | -0.24        | 0.98         | -2.05        | 2.18
6       | -0.34        | 0.80         | -2.15        | 2.30
7       | -0.26        | 0.65         | -2.07        | 2.40
8       | -0.16        | 0.60         | -1.97        | 2.37
Average | -0.07        | 0.76         | -1.88        | 2.21

The Ta and Ts difference is tested by comparing its spatial structure to the site's geographic structure. The difference matches the site's geographic features well, which implies that the synthetic-pixel temperature calculation is reasonable.
Slide 18: Results
Sample statistical analysis result for the Bondville site (Site = Bondville, IL): impact of the pixel-location bias relative to the ground site. All values in K.
Ts - Tc: Mean = -0.59, STD = 2.01.

Case    | Tc - Ta Mean | Tc - Ta STD' | Ts - Ta Mean | Ts - Ta STD
0       | -0.07        | 0.92         | -0.66        | 2.04
1       | -0.14        | 1.04         | -0.73        | 2.01
2       | -0.05        | 1.07         | -0.64        | 2.05
3       | -0.05        | 1.27         | -0.64        | 2.17
4       | -0.03        | 1.15         | -0.68        | 2.10
5       | -0.09        | 1.10         | -0.60        | 2.14
6       | -0.001       | 0.97         | -0.62        | 2.12
7       | -0.03        | 0.95         | -0.77        | 2.05
8       | -0.18        | 0.97         | -0.77        | 2.02
Average | -0.09        | 1.05         | -0.80        | 2.08
Slide 19: Results
Sample statistical analysis result for the Boulder site (Site = Boulder, CO): impact of the pixel-location bias relative to the ground site. All values in K.
Ts - Tc: Mean = -0.77, STD = 2.60.

Case    | Tc - Ta Mean | Tc - Ta STD' | Ts - Ta Mean | Ts - Ta STD
0       | -0.07        | 0.58         | -0.84        | 2.62
1       | -0.38        | 0.85         | -1.15        | 2.61
2       | -0.27        | 0.91         | -1.03        | 2.30
3       | -0.14        | 0.84         | -0.91        | 2.27
4       | -0.03        | 0.61         | -0.80        | 2.54
5       | -0.10        | 0.61         | -0.67        | 2.64
6       | -0.00        | 0.69         | -0.77        | 2.75
7       | -0.10        | 0.70         | -0.87        | 2.80
8       | -0.25        | 0.70         | -1.02        | 2.70
Average | -0.13        | 0.72         | -0.90        | 2.58
Slide 20: Results
Sample scatter plots show the linear relationship between satellite LST and ground LST:
Tsat = A Tg + B + e
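As an illustration of how A and B in this relation could be estimated from the match-up pairs (this is not the authors' code, and the input arrays below are synthetic stand-ins), a short least-squares fit:

```python
import numpy as np

# Least-squares estimate of the site-to-pixel relation quoted above,
#     Tsat = A * Tg + B + e,
# on illustrative synthetic pairs; real inputs would be the matched satellite
# and SURFRAD LSTs from the match-up dataset.
rng = np.random.default_rng(2)
t_ground = 270.0 + 40.0 * rng.random(200)                       # matched ground LSTs (K)
t_sat = 0.98 * t_ground + 4.0 + rng.normal(0.0, 1.5, size=200)  # stand-in "satellite" LSTs (K)

A, B = np.polyfit(t_ground, t_sat, 1)
residuals = t_sat - (A * t_ground + B)
print(f"A = {A:.3f}, B = {B:.2f} K, residual STD = {residuals.std():.2f} K")
```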
Slide 21: Results
Site-to-pixel statistical relationship (figure).
Slide 22: Results
Site-to-pixel statistical relationship - MODIS proxy:
- The standard deviation of the difference between real MODIS pixel LST and SURFRAD LST is consistent with the standard deviation of the difference between the synthetic-pixel LST and SURFRAD LST.
- The synthetic-pixel LST is generally much warmer (by about 1.5 K) than the SURFRAD LST, while the MODIS LST is slightly cooler than the SURFRAD LST.
- Implication: high-resolution ASTER data can be used for site representativeness analysis, but further verification of the ASTER LST quality seems essential before estimating the site-to-pixel model parameters (e.g., A, B, C) and applying the model to real cases.

Slide 23: Results - Summary
- A synthetic-pixel analysis model was created for analyzing the temperature heterogeneity of a ground site. ASTER data are used to generate synthetic VIIRS pixel LSTs, which are compared with the SURFRAD site data.
- LST from SURFRAD measurements may be used as a good reference for VIIRS/ABI LST cal/val if the measurements are of high quality and the in-situ LST estimate is accurate enough.
- Directional variations are small, so a small geo-referencing bias (within 1 km) may not be an issue for VIIRS/ABI LST cal/val at the above SURFRAD sites.
- Application of the site-to-pixel model depends on the ASTER LST data quality.
- The directional variation of the potential sub-pixel heterogeneity is found to be consistent with the physical topographic features, even though it is small.
- The limited datasets do not allow us to characterize the seasonal variation of the heterogeneity, which would be more desirable than a simple mean difference; more datasets are expected. A difference of about 1 K seems unavoidable in practice.
Slide 24: Future Plan
- Analysis over time scales of interest, e.g., seasonal variation
- Analysis over sites of different surface types
- Up/down-scaling models
- Emissivity uncertainties
- Test of the scaling model using real satellite data
