Validation of different nowcasting models based on the Meteosat Second Generation satellite data

The number of extreme meteorological events has increased in the last few years, and the trend is expected to continue in the near future. These events generally develop very quickly and on a small scale; nevertheless, they may cause severe damage and also affect human safety. For these reasons it is very important to monitor and anticipate these natural hazards by means of advanced techniques, which must be able to:

● detect the event as soon as possible

● track the behaviour of the event

● predict the short-term development of the event

StormTrack is a novel multispectral algorithm, developed by Geo-K, for the detection, tracking and short-term forecasting of convective objects.

The model is already operational, and several web services based on its output are available. This work presents the results of the validation activity carried out over the last year.

The StormTrack validation involves comparison with benchmarks such as Rapid Development Thunderstorms (RDT), developed by Meteo-France, and with the lightning activity observed over Europe and South Africa.

  1. Validation of different nowcasting models based on the Meteosat Second Generation satellite data
     M. de Rosa (1), M. Picchiani (1,2), M. Sist (1,2), F. Del Frate (2)
     (1) GEO-K srl, via del Politecnico, Rome, Italy
     (2) "Tor Vergata" University of Rome, Department of Civil Engineering and Computer Science, via del Politecnico, Rome, Italy
     EUMETSAT Conference 2015, Toulouse, 2015-09-23
  2. Outline
     • Nowcasting
     • The StormTrack model
     • The validation
     • The benchmark
     • Validation at South Africa (case studies)
     • Validation at Europe (Summer 2015)
     • Summary
     • Future tasks
  3. Nowcasting
     • Very short-term weather prediction (within a few hours) over a certain area
     • Prediction of extreme weather events like thunderstorms, floods, hurricanes, tornadoes
     • Near-real-time computation: runs in parallel with the observations (weather stations, soundings, satellite images, weather radar)
     • Very useful for outdoor activities, air traffic control, agrometeorology
  4. The actors
     • StormTrack
     • The benchmark
     • The ground truth
     • The validation tool
  5. The StormTrack model
     • MSG as the unique data source
     • (Early) detection of convective objects
     • Tracking of the detected objects
     • Cell lifecycle monitoring
     • Temporal and spatial extrapolation of the detected objects
     • High computational efficiency and reliability
     • Easy to use
  6. The StormTrack model (STK)
     • Cell Detector (CDT)
     • Cell Tracker (CTK)
     • MSG Nowcasting Engine (MNE)
     • CTWriter
  7. The StormTrack model: JSON Public APIs [architecture diagram]
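     For readers who want to consume such JSON public APIs, a minimal sketch follows; the endpoint URL and field names are invented for illustration, since the slides do not document the actual interface.

         # Purely hypothetical sketch of querying a JSON nowcasting API;
         # the URL and field names are invented and are NOT the actual
         # StormTrack interface.
         import json
         from urllib.request import urlopen

         def fetch_cells(url="https://example.org/stormtrack/cells.json"):
             """Fetch detected-cell objects from a (hypothetical) endpoint."""
             with urlopen(url) as resp:
                 cells = json.load(resp)  # assumed: a list of JSON objects
             for cell in cells:
                 # Assumed per-object properties: id, centroid, area
                 print(cell.get("id"), cell.get("centroid"), cell.get("area"))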
  8. The algorithm: CDT
     • Uses channels 5, 6 and 9
     • BTD6,9: cloud-base detection (early detection)
     • BTD5,9: cloud-top detection (Kolios and Feidas, 2010)
     • Connected components
     • Convex approximation
     • Object definition (properties)
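     A minimal sketch of this detection chain, assuming 2-D brightness-temperature arrays for the three channels; the threshold values (and the signs of the BTD tests) are illustrative placeholders, not the operational CDT settings.

         # Sketch of a CDT-style chain: BTD tests, connected components,
         # convex approximation. Thresholds are illustrative placeholders.
         from scipy import ndimage
         from skimage.morphology import convex_hull_image

         def detect_cells(bt5, bt6, bt9, btd69_thr=-10.0, btd59_thr=-15.0):
             """bt5, bt6, bt9: 2-D brightness-temperature arrays (K) for
             MSG channels 5, 6 and 9. Returns a label image of cells."""
             btd69 = bt6 - bt9   # cloud-base test (early detection)
             btd59 = bt5 - bt9   # cloud-top test (Kolios and Feidas, 2010)
             mask = (btd69 > btd69_thr) | (btd59 > btd59_thr)
             labels, n = ndimage.label(mask)      # connected components
             for i in range(1, n + 1):            # convex approximation
                 hull = convex_hull_image(labels == i)
                 labels[hull & (labels == 0)] = i # fill only unlabelled pixels
             return labels, n

     The final "object definition" step could then read per-object properties (area, centroid, bounding box, ...) off the label image, e.g. with skimage.measure.regionprops(labels).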
  9. The Validation
     • MET is a set of verification tools developed by the Developmental Testbed Center (DTC) for use by the numerical weather prediction (NWP) community, to help it assess and evaluate the performance of numerical weather predictions.
     • The primary goal of MET development is to provide a state-of-the-art verification package to the NWP community. By "state-of-the-art" we mean that MET incorporates newly developed and advanced verification methodologies, including new methods for diagnostic and spatial verification and new techniques provided by the verification and modelling communities.
     • Several tools are part of the MET package; the MODE (object-oriented validation) tool has been chosen for the validation of the StormTrack algorithm.
     Don't reinvent the wheel: write new routines? New validation techniques?
  10. The Validation [chart: POD and FAR scores on a 0 to 1 scale]
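     For reference, the POD and FAR scores plotted here and throughout the following slides follow the standard contingency-table definitions; a minimal sketch (the actual scores in the slides come from the MET/MODE tool):

         # Standard contingency-table definitions of the two scores.
         def pod(hits, misses):
             """Probability of Detection: fraction of observed events detected."""
             return hits / (hits + misses)

         def far(hits, false_alarms):
             """False Alarm Ratio: fraction of forecasts that did not verify."""
             return false_alarms / (hits + false_alarms)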
  11. The benchmark
     • RDT, Rapid Development Thunderstorms, has been (and is being) developed by Meteo-France in the framework of the EUMETSAT SAF in support of Nowcasting.
     • Using mainly geostationary satellite data, it provides information on clouds related to significant convective systems, from the meso-alpha scale (200 to 2000 km) down to smaller scales (a few pixels).
     • The RDT algorithm includes three steps:
       – The detection of cloud systems
       – The tracking of cloud systems
       – The discrimination of convective cloud objects
     • Only objects flagged as convective have been taken into account.
     RDT: the state of the art in storm detection
  12. Validation at South Africa
     • 6 case studies during summer 2014 (12 UTC to 18 UTC)
     • Lightning network: accuracy 500 m, CG strikes, 19 detectors
     • Strikes 5 minutes before/after the MSG slot time (see the sketch below)
     • RDT setup: MSG HRIT, NWP, lightning data (thanks to SAWS)
     • StormTrack setup: MSG HRIT
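     A sketch of the strike/slot matching rule stated above: keep only the lightning strikes falling in a window around each MSG slot time. The record layout is an assumption for illustration.

         # Match lightning strikes to an MSG slot time. Record layout
         # (timestamp, lat, lon) is assumed for illustration.
         from datetime import timedelta

         def strikes_for_slot(strikes, slot_time, before_min=5, after_min=5):
             """strikes: iterable of (timestamp, lat, lon); slot_time: datetime."""
             lo = slot_time - timedelta(minutes=before_min)
             hi = slot_time + timedelta(minutes=after_min)
             return [s for s in strikes if lo <= s[0] <= hi]

         # The Europe validation (slide 27) uses an asymmetric window:
         # strikes_for_slot(strikes, slot, before_min=5, after_min=10)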
  13. South Africa case study: 2014/12/16
  14. Event statistics [chart: strikes per slot and strike density (strikes/px)]
  15. StormTrack vs RDT: PODs [charts: POD STK vs POD RDT, with strikes and strike density (strikes/px); annotation: higher lightning activity]
  16. StormTrack vs RDT: FARs [charts: FAR STK vs FAR RDT, with strikes and strike density (strikes/px); annotation: higher lightning activity]
  17. South Africa case study: 2014/12/08
  18. Event statistics [chart: strikes per slot and strike density (strikes/px)]
  19. StormTrack vs RDT: PODs [chart: POD STK vs POD RDT]
  20. StormTrack vs RDT: FARs [charts: FAR STK vs FAR RDT, with strikes and strike density (strikes/px)]
  21. South Africa case study: 2014/11/10
  22. Event statistics [chart: strikes per slot and strike density (strikes/px)]
  23. StormTrack vs RDT: PODs [charts: POD STK vs POD RDT, with strikes and strike density (strikes/px)]
  24. StormTrack vs RDT: FARs [charts: FAR STK vs FAR RDT, with strikes and strike density (strikes/px)]
  25. Summary: South Africa case studies [chart: mean StormTrack and RDT POD and FAR, 12:00 to 17:00 UTC]
     • Lower StormTrack FAR (0.2 vs 0.3)
     • Higher RDT POD in the morning
     • Higher StormTrack POD in the afternoon
  26. Validation at South Africa: Summarising
     • Mean StormTrack accuracy (POD): 0.5
       – 0.41 in the morning
       – 0.61 in the afternoon
     • Mean RDT accuracy (POD): 0.54
       – 0.58 in the morning
       – 0.51 in the afternoon
     • Mean StormTrack FAR: 0.2
       – 0.2 in the morning
       – 0.2 in the afternoon
     • Mean RDT FAR: 0.32
       – 0.27 in the morning
       – 0.38 in the afternoon
  27. Validation at Europe
     • Mid June 2015 to mid September 2015 (00 UTC to 21 UTC)
     • ATDNet lightning data (sampled every 5 minutes)
     • Strikes 5 minutes before and 10 minutes after the MSG slot time
     • RDT setup: MSG HRIT, NWP, lightning data (thanks to AEMET)
     • StormTrack setup: MSG HRIT
  28. Validation at Europe (image credits: Jochen Kerkmann, EUMETSAT)
  29. Validation at Europe
     • Over 5000 samples collected (mid June 2015 to mid September 2015) for StormTrack
     • Over 3000 samples collected (mid June 2015 to mid August 2015) for RDT
     • No filters on the ground data: investigating the models' sensitivity
     [charts: strikes and strike-density distributions]
  30. Validation at Europe: Scores vs Strikes [charts]
     • StormTrack discriminant: 250 strikes, POD > FAR
     • RDT discriminant: 130 strikes
  31. Validation at Europe: Scores vs Matched objects area [charts]
     • StormTrack discriminant: 750 pixels
     • RDT discriminant: 360 pixels
  32. Validation at Europe: Scores vs Strikes density [charts]
     • StormTrack discriminant: 3.5 strikes/px
     • RDT discriminant: 1.9 strikes/px
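     Slides 30 to 32 each report a "discriminant" per model. One plausible reading, consistent with the "POD > FAR" note on slide 30, is the smallest predictor value above which the model's POD exceeds its FAR; a sketch under that assumption (the sample layout and the method are guesses, not the authors' procedure):

         # Guess at estimating a "discriminant": the smallest predictor
         # value (strike count, matched area, or strike density) above
         # which mean POD exceeds mean FAR across the validation samples.
         import numpy as np

         def discriminant(values, pods, fars):
             """values, pods, fars: 1-D NumPy arrays, one entry per sample."""
             order = np.argsort(values)
             v, p, f = values[order], pods[order], fars[order]
             for i in range(len(v)):
                 if p[i:].mean() > f[i:].mean():  # POD > FAR above v[i]
                     return v[i]
             return None                           # POD never exceeds FAR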
  33. Validation at Europe: Hourly Scores [charts: hourly POD and FAR for StormTrack and RDT; annotations: higher STK POD, lower STK FAR]
  34. Validation at Europe: Summarising
     • Mean StormTrack accuracy (POD): 0.6
     • Mean RDT accuracy (POD): 0.7
     • Mean StormTrack FAR: 0.54
     • Mean RDT FAR: 0.49
     • Sensitivity:
       – Strikes: STK 250, RDT 130
       – Matched area: STK 750, RDT 360 (px)
       – Strikes density: STK 3.5, RDT 1.9 (strikes/px)
     STK suitable for extreme events; RDT suitable under varied conditions
  35. Our nowcasting equipment
     • Dish: 110 cm
     • LNB: Inverto
     • Reception station: 8 GB RAM, quad-core 3 GHz, TBS 6983 DVB-S2 card, Windows 7, BDADataEx DVB-S2 card software
     • STK hardware: i7 quad-core 3.2 GHz, 8 GB RAM; full-disk (FD) processing time: 7 minutes; Linux CentOS 6.0
  36. Summary
     • Validation over South Africa (6 case studies) and Europe (Summer 2015)
     • MET framework for validation (object-oriented, using the MODE tool)
     • StormTrack setup: MSG HRIT
     • RDT (benchmark) setup: MSG HRIT, NWP, lightning data
     • Good results in the afternoon on the South Africa case studies
     • Over Europe, better RDT POD but comparable FAR; still, a good STK POD on average (with no ground data or NWP used)
     • STK flexible and "light" (less than 2 GB RAM needed on the full disk)
     • Issues with cold-base clouds
  37. Future tasks
     • Validation over Europe will go on (dividing the region into sub-regions)
     • Automatic trajectory extrapolation ready but not yet operational (15/30 minutes ahead)
     • RSS integration
     • Neural nets to improve POD and reduce FAR (pruning using other MSG channels)
     • Public API development to share data (first version online)
     • App and IoT (Internet of Things) integration
  38. Acknowledgements
     • Estelle de Coning (SAWS) and SAWS for the RDT and the lightning data
     • Cecilia Marcos (AEMET), Ana Sánchez Piqué (AEMET) and AEMET for the hints about MET, the RDT and the lightning data
     • The Italian Air Force Meteorological Office for the support
  39. Thanks for your attention
     michele.derosa@geo-k.it
     Questions?
