2006-03-21 Work Group Meeting on IT Techniques, Tools and Philosophies for Model Intercomparison: Presentation Transcript
Work Group Meeting on IT Techniques, Tools and Philosophies for Model Intercomparison
Ispra, JRC, March 14, 2006
Topics: data handling approaches, software tools, participant involvement, data dissemination, integration issues
Program:
09:00-09:15 Welcome/Background (Dentener)
09:15-10:00 Thunis/Cuvelier: CityDelta/EuroDelta
10:00-10:45 Michael Schulz/Stefan Kinne: AeroCom
10:45-11:00 Coffee
11:00-11:45 Charles Doutriaux: Ocean modelling
11:45-12:30 Rudolf Husar: American aerosol intercomparisons
12:30-14:00 Lunch + discussion
14:00-14:30 Christiane Textor: GEMS
14:30-15:15 Stefano Galmarini: Radioactivity alert system
15:15-15:30 Tea
15:30-17:00 Discussion + summarizing recommendations
Meeting Background & Purpose F. Dentener, JRC, Ispra
There is an increasing number of model intercomparison studies
These cover different modeling scales (global, regional, local), pollutants (gases, aerosols, metals, radioactivity) and comparison methodologies
Intercomparison participants may not be familiar with similar work elsewhere
A common problem for all these studies is the intense use of information technologies for bringing together, homogenizing and comparing the models
Present the major intercomparison studies with emphasis on IT issues
Discuss IT issues and solutions (e.g., common formats, conventions)
Recommend actions for joint IT-related efforts
Past Model Intercomparison Studies
The Euro- and City-Delta Model Intercomparisons P. Thunis, K. Cuvelier, JRC, Ispra
EuroDelta: evaluates the uncertainty of source-receptor relationships used in air quality policy (CHIMERE, REM, EMEP, MATCH, LOTOS, TM5)
CityDelta: includes sub-grid effects in a Europe-wide health impact assessment for PM and O3 (CALGRID, MUSE, CAMX, TRANSCHIM, MUSCAT, EUROS, OFIS, MOCAGE, STEM, REM, LOTOS, CHIMERE, EPISODE, EMEP)
Workflow: original participant data (terabyte scale) → pre-processing at JRC → processed data (20 GB), with an analysis tool on the Web
AeroCom – Aerosol Model Comparison C. Textor, S. Guibert, S. Kinne, J. Penner, M. Schulz, F. Dentener
Central model database (~2TB), public web interface to images, joint papers
http://nansen.ipsl.jussieu.fr/AEROCOM/data.html
Proposal for cooperation: "Atmospheric Tracer Model Intercomparison Tools Initiative"
GEMS - C. Textor Global Earth-system Monitoring using Space and in-situ data
Goal: Build validated, operational assimilation system for atmospheric composition and dynamics, by 2008.
Integrated Project co-funded by the EC: 17 M€, 31 consortium members, 4 years (started in March 2005)
Reg. AQ: Ensemble forecasts
Model Evaluation Methods?
“Eyeball” methods
Basic statistical evaluation
Sophisticated skill scores
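The "basic statistical evaluation" tier above can be illustrated with a minimal sketch, assuming paired model and observation values; the metric set chosen here (bias, RMSE, correlation, normalised mean bias) is a common choice, not one prescribed at the meeting:

```python
import numpy as np

def basic_scores(model, obs):
    """Common evaluation statistics for paired model/observation values."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    bias = np.mean(model - obs)                  # mean bias
    rmse = np.sqrt(np.mean((model - obs) ** 2))  # root-mean-square error
    r = np.corrcoef(model, obs)[0, 1]            # Pearson correlation
    nmb = np.sum(model - obs) / np.sum(obs)      # normalised mean bias
    return {"bias": bias, "rmse": rmse, "r": r, "nmb": nmb}

scores = basic_scores([1.0, 2.0, 3.0], [1.5, 2.0, 2.5])
```

Sophisticated skill scores (the third tier) would build on the same paired-data layout.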
Model Evaluation Tools?
CDO, MetView?, CDAT..
Towards Automated Model Output Analysis Charles Doutriaux
Python based system
Added packages by community
From central to distributed resources:
Data, visualization, supercomputers
Collaboration and data sharing
Analyst focus on work, not mechanics
AutoMOD: Automated Model Diagnostic facility (CDAT, ESG)
Data adheres to standards
NetCDF format, CF compliant
Data pre-processed via code
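The "pre-processed via code" step can be sketched as a toy harmoniser that maps each group's native variable names onto agreed standard names and converts units before ingestion. The mapping tables and names below are illustrative assumptions, not AutoMOD's actual configuration:

```python
# Toy pre-processor: harmonise native variable names and units before ingestion.
# NAME_MAP and UNIT_FACTORS are invented for illustration.

NAME_MAP = {          # native name -> agreed standard name
    "o3_vmr": "mole_fraction_of_ozone_in_air",
    "temp": "air_temperature",
}
UNIT_FACTORS = {      # (standard name, native unit) -> factor to target unit
    ("mole_fraction_of_ozone_in_air", "ppb"): 1e-9,   # ppb -> mol/mol
    ("air_temperature", "K"): 1.0,
}

def harmonise(record):
    """record: {"name", "units", "values"} in a group's native form."""
    std_name = NAME_MAP.get(record["name"], record["name"])
    factor = UNIT_FACTORS.get((std_name, record["units"]), 1.0)
    units = "mol/mol" if std_name == "mole_fraction_of_ozone_in_air" else record["units"]
    return {"name": std_name, "units": units,
            "values": [v * factor for v in record["values"]]}

out = harmonise({"name": "o3_vmr", "units": "ppb", "values": [40.0, 55.0]})
```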
ENSEMBLE: Reconciliation of Disparate National Medium/Long Range Dispersion Forecasts Stefano Galmarini, JRC, Ispra
Following Chernobyl: in case of an accidental radioactive release, dispersion, concentrations and deposition need to be forecast over the following days (60 h)
National services use long-range transport (LRTP) models
DataFed: Federated Data System for Air Quality R. B. Husar, Washington University
AQ info is distributed over many ‘dimensions’: geography, content, agency, …
Info content includes: emissions, ambient & satellite data, and models
Info is provided and consumed by different agencies (NASA, NOAA, EPA, …)
Providers have different access protocols, formats, and information usage
Standardization is a key need for agile IT systems
Non-intrusive mediators can achieve virtual standardization
Technologies are currently available for dynamic NETWORKING
Eager to cooperate, share networked data, tools for model comparison
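The "non-intrusive mediator" idea above can be sketched as thin wrapper functions that map each provider's native output into one common record, without changing the providers themselves. The provider formats and field names below are invented for illustration:

```python
# Sketch of "virtual standardisation": mediators wrap heterogeneous provider
# outputs into one common record shape. Formats below are hypothetical.

def from_csv_row(row):                    # e.g. an agency's CSV feed
    site, val = row.split(",")
    return {"site": site, "pm25": float(val)}

def from_json_obj(obj):                   # e.g. another agency's JSON service
    return {"site": obj["station_id"], "pm25": obj["value_ugm3"]}

MEDIATORS = {"csv": from_csv_row, "json": from_json_obj}

def fetch(kind, payload):
    """Uniform access point: same record shape regardless of the provider."""
    return MEDIATORS[kind](payload)

a = fetch("csv", "STL001,12.5")
b = fetch("json", {"station_id": "STL001", "value_ugm3": 12.5})
```

Consumers only ever see the common record, which is what makes the standardisation "virtual": no provider has to change anything.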
Proposal for cooperation
Atmospheric Tracer Model Intercomparison Tools Initiative
Accelerate the analysis of models and the feedback to model participants
Jointly develop intercomparison tools through transformation, integration and adaptation of existing tools, and development of new ones
Make it easier for participants to join the analysis of an intercomparison
Specific Objectives to which the Network should contribute for any intercomparison
Model quality control
In the first place, tutorial institutions are identified which provide general support for the planning and implementation of the initiative (JRC, LSCE/IPSL + ??). A steering committee is put in place to develop the initiative and report to the tutorial institutions.
A work plan is to be elaborated by early summer 2006 to structure and prioritise the actions to be undertaken.
A pilot project is the support of the planned first phase of the HTAP intercomparison, reusing and integrating tools prepared for EuroDelta, ACCENT and AeroCom.
Present the initiative at forthcoming meetings (e.g. AeroCom, 17-19 October; HTAP workshops; etc.) and promote its existence through publications (IGAC newsletter? +?)
The work plan aims to lay out a detailed implementation plan: what steps are taken, who is doing them, and when they will be ready.
Identify the formatting standards needed (units; CF or less strict??)
Identify standard names for specific tracer variables (report to CF)
Identify the standard interface specifications between any of the tools and any of the databases (see slide 5), so that different tools can call and address each database
Develop a standard protocol form to be used for different intercomparisons and their subparts (reference + compliance with check tools)
Define standards for file names and image names for databases
Develop compliance test tools to test whether data and tools fulfill the standards set under 1-5
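A compliance test tool of the kind described above could start as simply as checking that every variable carries the metadata the agreed standards require. The required-attribute list below is an assumption for illustration, not the initiative's actual standard:

```python
# Minimal compliance checker sketch: verify each variable's metadata dict
# carries the required attributes. REQUIRED_ATTRS is an illustrative choice.

REQUIRED_ATTRS = ("standard_name", "units", "long_name")

def check_variable(var):
    """Return a list of problems found for one variable's metadata dict."""
    problems = []
    for attr in REQUIRED_ATTRS:
        if attr not in var or not var[attr]:
            problems.append(f"missing or empty attribute: {attr}")
    return problems

good = {"standard_name": "air_temperature", "units": "K",
        "long_name": "Air temperature"}
bad = {"units": ""}
```

A real version would read the attributes from NetCDF files and also test tools against the agreed interface specifications.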
Implement a pilot database, based on AutoMOD, for testing the different existing tools (AeroCom catalogues, etc.)
Build a web-based repository where users can find tools to:
- prepare CF-compliant model output (Fortran routines)
- rename and reformat files (NCO examples)
- do specific diagnostics (regional budgets, aerosol size fractions, etc.)
- do regridding compliant with the standards
- handle/replace/identify missing data
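Two of the repository's tool categories, regridding and missing-data handling, can be sketched together with a toy block-average regridder that ignores missing values. This is an illustration only, not a conservative-regridding implementation:

```python
import numpy as np

def block_mean(field, factor=2):
    """Coarsen a 2-D field by averaging factor x factor blocks, ignoring NaNs."""
    ny, nx = field.shape
    blocks = field.reshape(ny // factor, factor, nx // factor, factor)
    return np.nanmean(blocks, axis=(1, 3))  # NaNs (missing data) are skipped

fine = np.array([[1.0, 3.0, 5.0, 7.0],
                 [1.0, 3.0, np.nan, 7.0],   # one missing value
                 [2.0, 2.0, 4.0, 4.0],
                 [2.0, 2.0, 4.0, 4.0]])
coarse = block_mean(fine)                   # 2x2 coarsened field
```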
Develop check tools for the physical plausibility of data (budgets, units, order of magnitude, ocean/land contrast)
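An order-of-magnitude check of this kind can be as simple as flagging values outside an expected physical range per variable; the ranges below are illustrative assumptions, not agreed thresholds:

```python
# Sketch of a physical-plausibility check: flag values outside an expected
# order-of-magnitude range. PLAUSIBLE_RANGE entries are illustrative only.

PLAUSIBLE_RANGE = {
    "air_temperature": (150.0, 350.0),   # K, generous atmospheric bounds
    "surface_pressure": (3e4, 1.1e5),    # Pa
}

def flag_implausible(name, values):
    """Return the values that fall outside the expected range for `name`."""
    lo, hi = PLAUSIBLE_RANGE[name]
    return [v for v in values if not (lo <= v <= hi)]

bad = flag_implausible("air_temperature", [288.0, 12.0, 301.5])
```

Budget and ocean/land-contrast checks would follow the same pattern, just with aggregated quantities instead of raw values.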
Develop observational-data comparison tools that interface with the model data as defined under 1-4
Develop tools to produce higher-order analysis results (ensemble averages, spatial correlation)
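The two higher-order products named above can be sketched in a few lines, assuming all models are already on a common grid (which is what the regridding standards are for):

```python
import numpy as np

def ensemble_mean(fields):
    """fields: array shaped (n_models, ny, nx); mean across the model axis."""
    return np.mean(fields, axis=0)

def pattern_correlation(a, b):
    """Pearson correlation between two 2-D fields over all grid cells."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

models = np.array([
    [[1.0, 2.0], [3.0, 4.0]],   # model A on a common 2x2 grid
    [[2.0, 3.0], [4.0, 5.0]],   # model B
])
mean_field = ensemble_mean(models)
r0 = pattern_correlation(models[0], mean_field)
```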
Develop a documentation tool to keep track of model version characteristics