Fuqing Zhang, Professor, Department of Meteorology and Department of Statistics; Director, Penn State Center for Advanced Data Assimilation and Predictability Techniques;
Pennsylvania State University - November 2017 UCAR Congressional Briefing
Numerous studies have found an average increase in extreme precipitation for both the U.S. and Northern Hemisphere mid-latitude land areas, consistent with the expectations arising from the observed increase in greenhouse gas concentrations (now more than 40% above pre-industrial levels). However, there are important regional variations in these trends that are not fully explained. These trend studies are typically based on direct analyses of observational station data. Such analyses confront multiple challenges, such as incomplete data and uneven spatial coverage of stations. Central scientific questions related to this general finding are: Are there changes in weather system phenomenology that are contributing to this observed increase? What is the contribution of increases in atmospheric water vapor? There are also questions related to application of potential future changes in planning. Because of the rarity (by definition) of extreme events, trends are mostly found only when aggregating over space. When would we expect to see a signal at the local level? What are the uncertainties surrounding future changes and their potential incorporation into future design? Further development of statistical/mathematical methods, or innovative application of existing methods, is desirable to aid scientists in exploring these central scientific questions. This talk will describe characteristics of the observation record and the issues surrounding the above questions.
The document discusses NOAA's Climate Data Record (CDR) Program, which leverages over 30 years of satellite data to provide trusted climate information. The CDR Program collects data from multiple decades using similar satellite systems, and develops fundamental CDRs by applying the best analysis methods across different data sets and instruments. This reveals climate information within archived satellite data.
WP Technical Paper - Inter-annual variability of wind speed in South Africa - Matthew Behrens
This study analyzed wind speed data from 26 ground stations in South Africa as well as MERRA and ERA-Interim reanalysis datasets to determine an appropriate figure for inter-annual variability (IAV) of wind speeds in the region. The study found a mean IAV of 4.3% across South Africa, lower than previous conservative assumptions. No clear relationship was found between long-term and short-term ground station data or between ground stations and reanalysis datasets in terms of IAV. The minimum data periods needed to determine a representative IAV were determined to be 8 years for ground stations, 15 years for MERRA, and 13 years for ERA-Interim. Guidelines are presented for assessing site-
This document describes research mapping forest height, biomass, and carbon across the contiguous United States from 2000-2007 using satellite and field data. A National Biomass and Carbon Dataset was created in 2000 by combining SRTM radar data, Landsat optical imagery, and field plot measurements. This provided 30m resolution estimates of height, biomass, and carbon. ALOS PALSAR dual-polarization SAR data from 2007 was later used to update biomass estimates. Both datasets were validated against field measurements and showed strong correlations at regional to national scales. The research demonstrated the ability to map important forest variables over large areas by combining different remote sensing and field data sources.
Reanalysis Datasets for Solar Resource Assessment - 2014 SPI Gwendalyn Bender
This document evaluates the use of reanalysis datasets for long-term solar resource assessment as an alternative to satellite datasets that only go back 15 years. It finds that while second-generation reanalysis datasets have improved resolution compared to first-generation, they still do not resolve clouds as well as needed. Bias correcting the datasets reduced errors but not enough to recommend them over established satellite methods. More development is needed before reanalysis datasets can reliably extend energy estimates beyond satellite coverage.
The document discusses a methodology for improving wind speed forecasts through synergizing outputs from two numerical weather prediction (NWP) models - the Global Environmental Multiscale model (GEM) and the North American Mesoscale model (NAM). Wind speed measurements from four meteorological towers are used to evaluate the individual NWP models and their combined forecasts. Results show the combined GEM-NAM forecasts reduce root mean square error by up to 20% compared to the individual models, indicating improved forecast accuracy through optimal combination of the two NWP models.
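The combination step can be sketched as a least-squares blend of two forecast series. This is a minimal illustration, not the paper's actual method: the weighting scheme and all numbers below are assumptions.

```python
import numpy as np

def blend_weight(f1, f2, obs):
    """Least-squares weight w for the blend w*f1 + (1-w)*f2 against obs."""
    d = f1 - f2
    w = float(np.dot(obs - f2, d) / np.dot(d, d))
    return min(max(w, 0.0), 1.0)   # keep the blend a convex combination

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

rng = np.random.default_rng(0)
truth = 8 + 2 * np.sin(np.linspace(0, 6, 200))   # synthetic wind speed (m/s)
gem = truth + rng.normal(0.5, 1.0, 200)          # stand-in for model 1 errors
nam = truth + rng.normal(-0.4, 1.2, 200)         # stand-in for model 2 errors

w = blend_weight(gem, nam, truth)
blend = w * gem + (1 - w) * nam
```

Because w = 0 and w = 1 recover the individual models, the fitted blend can never have a higher training RMSE than either model alone.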
This document discusses using satellite imagery to detect agricultural smoke pollution. It describes an algorithm to calculate aerosol optical thickness from SeaWiFS satellite data. A case study is presented analyzing smoke over Kansas from agricultural burning on April 10-13, 2003. Surface PM2.5 measurements are compared to the satellite-derived aerosol optical thickness values to correlate smoke pollution levels.
This document discusses using satellite and surface sensors to detect agricultural smoke. It describes using SeaWiFS satellite data to obtain surface reflectance and aerosol optical thickness (AOT) through an algorithm. A case study of agricultural fires in Kansas in April 2003 is presented, with images showing smoke patterns from Rayleigh corrected satellite data and AOT, along with surface PM2.5 measurements and wind vectors. The study correlates PM2.5 data with AOT values from satellite images at matching latitudes and longitudes.
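The PM2.5-versus-AOT comparison amounts to correlating two collocated series. A toy sketch with made-up values (the study's actual station data are not reproduced here):

```python
import numpy as np

# Hypothetical collocated values: PM2.5 (ug/m^3) at surface stations and
# satellite-derived AOT (unitless) at the matching latitude/longitude.
pm25 = np.array([12.0, 35.0, 80.0, 22.0, 60.0, 15.0])
aot = np.array([0.10, 0.35, 0.80, 0.20, 0.55, 0.12])

# Pearson correlation coefficient between the two series
r = float(np.corrcoef(pm25, aot)[0, 1])
```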
1) NREL is a national laboratory operated by the Alliance for Sustainable Energy, LLC that focuses on energy efficiency and renewable energy.
2) The presentation discusses options for quantifying solar resource from measurements including horizontal and inclined surfaces, and methods for transposing horizontal irradiance data to plane of array irradiance.
3) It notes that isotropic models used to approximate this transposition can underestimate plane of array irradiance by 5-20% compared to using anisotropic physics models that better simulate cloud conditions and solar radiances.
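The isotropic transposition the third point refers to can be sketched as follows. The input values are illustrative, and real assessments typically use an anisotropic model such as Perez instead.

```python
import math

def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
    """Isotropic-sky plane-of-array irradiance (W/m^2). Anisotropic models
    add circumsolar and horizon-brightening terms on top of this."""
    tilt = math.radians(tilt_deg)
    aoi = math.radians(aoi_deg)
    beam = dni * max(math.cos(aoi), 0.0)              # direct on the tilted plane
    sky = dhi * (1 + math.cos(tilt)) / 2              # isotropic sky diffuse
    ground = ghi * albedo * (1 - math.cos(tilt)) / 2  # ground-reflected
    return beam + sky + ground

# Illustrative clear-ish conditions on a 30-degree tilted array
poa = poa_isotropic(dni=700, dhi=150, ghi=600, tilt_deg=30, aoi_deg=25)
```

The isotropic sky term treats diffuse radiance as uniform across the sky dome, which is the simplification that causes the 5-20% underestimate mentioned above under bright circumsolar conditions.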
This document summarizes a presentation about quantifying uncertainty when tuning satellite-derived solar irradiance data to ground measurements. It discusses how the length of ground data used for tuning impacts uncertainty levels in the tuned satellite data. Analysis of multiple sites showed that uncertainty decreases as more months of ground data are used, as it reduces the effects of seasonal variations. With less than a year of ground data, seasonal effects can amplify uncertainty, but results stabilize with a year or more of data.
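A minimal version of the tuning idea, under the assumption that tuning is a single ratio of ground to satellite means over the overlap period (the presentation's actual method may differ): with only a few months the ratio absorbs seasonal effects, while a full year largely cancels them.

```python
import numpy as np

months = np.arange(24)
season = 1 + 0.3 * np.sin(2 * np.pi * months / 12)   # synthetic seasonal cycle
ground = 5.0 * season                                # ground irradiance (arbitrary units)
# Satellite estimate: 10% high overall, with its own small seasonal error
satellite = 1.1 * ground * (1 + 0.05 * np.sin(2 * np.pi * months / 12 + 1))

def tuning_factor(n_months):
    """Ratio of ground to satellite means over the first n_months."""
    return float(ground[:n_months].mean() / satellite[:n_months].mean())

short = tuning_factor(3)    # seasonally contaminated estimate
full = tuning_factor(12)    # seasonal effects largely average out
```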
Reanalysis Datasets for Solar Resource Assessment - Presented at ASES 2014 - Gwendalyn Bender
The document compares the accuracy of two second generation reanalysis datasets (MERRA and ERA-Interim) to satellite-derived solar data for solar resource assessment. It finds that MERRA has less error than ERA-Interim or satellite data alone. A bias correction method is applied to improve MERRA, reducing errors further. However, in cloudier locations, one year of training data is insufficient for the correction. While reanalysis datasets provide long-term global data, satellite observations currently provide more accurate short-term solar estimates.
This document discusses challenges and opportunities for using machine learning and data mining techniques on big climate data. It describes various types of climate and Earth observation data available from satellites and models. Research highlights are presented on using pattern mining to track ocean eddies, extreme value theory to study heatwaves and rainfall, and relationship mining to study seasonal hurricane activity. The challenges of analyzing multi-scale, heterogeneous climate data are also discussed.
Estimating Fire Weather Indices Via Semantic Reasoning Over Wireless Sensor N... - IJwest
Wildfires are frequent, devastating events in Australia that regularly cause significant loss of life and widespread property damage. Fire weather indices are a widely-adopted method for measuring fire danger and they play a significant role in issuing bushfire warnings and in anticipating demand for bushfire management resources. Existing systems that calculate fire weather indices are limited due to low spatial and temporal resolution. Localized wireless sensor networks, on the other hand, gather continuous sensor data measuring variables such as air temperature, relative humidity, rainfall and wind speed at high resolutions. However, using wireless sensor networks to estimate fire weather indices is a challenge due to data quality issues, lack of standard data formats and lack of agreement on thresholds and methods for calculating fire weather indices. Within the scope of this paper, we propose a standardized approach to calculating Fire Weather Indices (a.k.a. fire danger ratings) and overcome a number of the challenges by applying Semantic Web Technologies to the processing of data streams from a wireless sensor network deployed in the Springbrook region of South East Queensland. This paper describes the underlying ontologies, the semantic reasoning and the Semantic Fire Weather Index (SFWI) system that we have developed to enable domain experts to specify and adapt rules for calculating Fire Weather Indices. We also describe the Web-based mapping interface that we have developed, that enables users to improve their understanding of how fire weather indices vary over time within a particular region. Finally, we discuss our evaluation results that indicate that the proposed system outperforms state-of-the-art techniques in terms of accuracy, precision and query performance.
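As a concrete instance of the indices discussed, one widely used rating is McArthur's Forest Fire Danger Index in the Noble et al. (1980) regression form. The paper's own rule set may differ, and the input values below are illustrative.

```python
import math

def ffdi(temp_c, rel_humidity, wind_kmh, drought_factor):
    """McArthur Mk5 Forest Fire Danger Index, Noble et al. (1980) form.
    Inputs: air temperature (deg C), relative humidity (%), wind speed
    (km/h) and drought factor (0-10)."""
    return 2.0 * math.exp(-0.45 + 0.987 * math.log(drought_factor)
                          - 0.0345 * rel_humidity
                          + 0.0338 * temp_c
                          + 0.0234 * wind_kmh)

# A hot, dry, windy day with a high drought factor lands well into the
# upper danger categories
index = ffdi(temp_c=38, rel_humidity=12, wind_kmh=45, drought_factor=9.5)
```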
The document summarizes the development of satellite modeling for the National Solar Radiation Database (NSRDB) to provide accurate surface solar radiation data. It describes the evolution from empirical to physical models using satellite measurements and ancillary data as inputs to radiative transfer models. Validation shows the new 2005-2012 dataset has a mean bias error of less than 5% for GHI and DNI compared to surface measurements, though uncertainty remains for cloudy cases. Future work aims to improve the model with higher resolution data and better representation of aerosols and surfaces.
IRJET - Intelligent Weather Forecasting using Machine Learning Techniques - IRJET Journal
This document discusses using machine learning techniques for intelligent weather forecasting. It proposes multi-target regression and recurrent neural network (RNN) models trained on historical weather data from Bangalore to predict future conditions such as temperature, humidity, and precipitation. The data are preprocessed before being fed to the models. The models are evaluated on their ability to predict short-term weather accurately, helping users such as farmers and commuters without relying on expensive equipment.
1. The document discusses using sky imagers for short-term solar forecasting, as traditional methods lack sufficient spatial and temporal resolution for small-scale applications.
2. The proposed sky imager forecast model involves 7 steps: image analysis, cloud detection, cloud projection, shadow projection, irradiance modeling, predicting cloud motion to generate forecasts, and PV power modeling.
3. Accurate cloud detection, projection, and shadow projection are challenging due to issues like cloud inhomogeneity, perspective errors with distance from camera, and sensitivity to errors in estimated cloud base height.
Ilkay Altintas from the San Diego Supercomputer Center gave this talk at the HPC User Forum.
"WIFIRE is an integrated system for wildfire analysis, with specific regard to changing urban dynamics and climate. The system integrates networked observations such as heterogeneous satellite data and real-time remote sensor data, with computational techniques in signal processing, visualization, modeling, and data assimilation to provide a scalable method to monitor such phenomena as weather patterns that can help predict a wildfire's rate of spread."
Watch the video: https://wp.me/p3RLHQ-inQ
Learn more: https://wifire.ucsd.edu/
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Weather forecasting is the application of science and technology to predict the state of the atmosphere at a future time for a given location. Nowadays, accurate forecasting of atmospheric conditions is a major challenge for meteorologists, and poor forecasts have a significant impact on daily life. This motivates research on forecasting weather events in Ethiopia.
Air quality challenges and business opportunities in China: Fusion of environ... - CLIC Innovation Ltd
MMEA (The Measurement, Monitoring and Environmental Efficiency Assessment) research program final seminar presentation by Dr. Ari Karppinen, Finnish Meteorological Institute
CSP Training series: solar resource assessment 2/2 - Leonardo ENERGY
Fifth session of the 2nd Concentrated Solar Power Training dedicated to solar resource assessment.
* DNI Variability, Frequency Distributions
* Typical Meteorological Years
* DNI measurements: broadband vs. spectral, and their limitations
* What is circumsolar radiation and why should we care in CSP/CPV?
* How much diffuse irradiance can be used in concentrators?
* How to measure and model the circumsolar irradiance?
* Spectral irradiance standards and their use for PV/CPV rating
* The AM1.5 direct standard spectrum: Why did it change? Why AM1.5?
* Use of the SMARTS radiative code to evaluate clear-sky spectral irradiances
* Sources of measured spectral irradiance data
* Spectral effects on silicon and multijunction cells and their dependence on climate
This document describes an approach to guide a robot or person carrying a methane sniffer to locate the source of a methane leak. It involves simulating methane distribution based on wind patterns and obstacles, calculating the probability of different leak location hypotheses given sensor readings, and using expected information gain to determine the next best location to move to in order to gather more information. The approach was tested in a virtualization demo using real methane concentration and wind data.
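A toy 1-D version of the hypothesis-update loop described above. The paper's plume simulation and information-gain planner are more elaborate; the likelihood model below is an assumption for illustration only.

```python
import math

sites = [0.0, 1.0, 2.0, 3.0]    # candidate leak locations (toy 1-D layout)
belief = [0.25] * 4             # uniform prior over hypotheses

def expected_reading(sensor_x, leak_x):
    """Toy plume: concentration decays exponentially with distance."""
    return math.exp(-abs(sensor_x - leak_x))

def update(prior, sensor_x, reading, sigma=0.1):
    """Bayesian update of the belief with a Gaussian error model."""
    post = [p * math.exp(-(reading - expected_reading(sensor_x, s)) ** 2
                         / (2 * sigma ** 2))
            for p, s in zip(prior, sites)]
    z = sum(post)
    return [p / z for p in post]

true_leak = 2.0
for sensor_x in (0.5, 2.5):                           # two measurement positions
    reading = expected_reading(sensor_x, true_leak)   # noise-free toy reading
    belief = update(belief, sensor_x, reading)

best = sites[max(range(len(sites)), key=lambda i: belief[i])]
```

The full method would additionally score candidate next positions by the expected entropy reduction of the belief before moving the sensor.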
Earth observation systems now allow for accurate global monitoring and measurement due to advances in satellite technology. High resolution satellite images are more widely available at lower costs, enabling detailed land use data collection. Key elements of measurement include both remote sensing via satellites as well as ground measurements for calibration and analysis. Spatial data is organized using GIS databases to extrapolate measurements and characterize land use heterogeneity globally.
Weather prediction technology is a global, big data enterprise. This talk will describe the huge quantities of information that make modern weather prediction possible, from satellite and radar data to surface observations and the output from numerical weather prediction models. The role of smartphones and other mobile devices for distributing forecasts and weather information will be discussed and the future of weather prediction will be outlined.
This document discusses GPS technology and proposes solutions to some of its problems. It begins with background on the initial military and civilian uses of GPS. It then explains how GPS works using triangulation of signals from satellites. The document notes that while GPS has become more popular and affordable with products like Garmin, problems exist regarding availability, strength, accuracy, and security. Solutions proposed include using wireless networking principles to improve availability during "handoffs"; applying quantum cryptography to encryption for better security; and using repeaters to strengthen signals and reduce initialization times, improving accuracy. Experiments are suggested to test these solutions in real-world trials.
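The triangulation idea can be illustrated with a 2-D trilateration sketch. Real GPS solves in 3-D and estimates a receiver clock bias as a fourth unknown; the anchor layout below is made up.

```python
import numpy as np

def trilaterate(anchors, dists):
    """Recover a position from distances to known anchors by linearizing:
    subtracting the first circle equation from the others yields a linear
    system 2*(a_i - a_0) . x = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2."""
    a0, d0 = anchors[0], dists[0]
    A, b = [], []
    for ai, di in zip(anchors[1:], dists[1:]):
        A.append(2 * (ai - a0))
        b.append(d0**2 - di**2 + np.dot(ai, ai) - np.dot(a0, a0))
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # known positions
truth = np.array([3.0, 4.0])                                # unknown receiver
dists = np.linalg.norm(anchors - truth, axis=1)             # measured ranges
est = trilaterate(anchors, dists)
```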
The document describes a student research project that used seismic data from 10 stations near Neal Hot Springs, Oregon to detect small local earthquakes through numerical cross correlation. The student created filtered templates from 3 known earthquakes to scan 20 months of seismic data recorded between 2011-2012. This process identified 9 additional earthquakes, beyond the original 3 that had been visually identified. Finding more small earthquakes will help interpret local fault activity and assess if nearby geothermal exploration affects seismicity. The research demonstrates applying new computational techniques to gain higher resolution from existing seismic data and better understand subsurface geology.
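The matched-filter detection the project applied can be sketched with normalized cross-correlation on a synthetic trace. The template shape, noise level, and threshold below are illustrative, not values from the project.

```python
import numpy as np

def detect(template, data, threshold=0.7):
    """Indices where the normalized cross-correlation with the template
    exceeds the threshold (correlation values lie in [-1, 1])."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    hits = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        s = w.std()
        if s > 0 and np.sum(t * (w - w.mean())) / s > threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(2)
# Decaying-sinusoid "earthquake" template buried twice in noise
template = np.sin(np.linspace(0, 4 * np.pi, 50)) * np.exp(-np.linspace(0, 3, 50))
data = rng.normal(0, 0.1, 500)
data[120:170] += template           # clear event
data[360:410] += 0.5 * template     # weaker event, still above threshold
hits = detect(template, data)
```

Scanning months of continuous data this way finds events too small to pick out visually, which is how the project grew its catalog from 3 to 12 earthquakes.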
CSP Training series: solar resource assessment 1/2 - Leonardo ENERGY
The key factors that can explain inconsistencies and large disagreements between solar resource maps include:
1. Differences in the cloud data sources or periods used to create the maps
2. Inconsistent aerosol data used in the models
3. Use of long-term monthly average aerosol data versus mean daily data
4. Reliance on empirical algorithms that may degrade in accuracy for some areas
5. Lack of validation against actual ground-measured DNI data, which is limited
Proper validation against available ground measurements and consistency in input data are important to improve agreement between solar resource maps. The scarcity of DNI data makes validation challenging.
MO3.L10 - STATUS OF PRE-LAUNCH ACTIVITIES FOR THE NPOESS COMMUNITY COLLABORAT... - grssieee
The document summarizes the status of pre-launch validation activities for the NPP satellite. It discusses that validation teams are continuing work to characterize sensor data records and environmental data records in preparation for post-launch validation. Team leads provide experience from past missions and are working with stakeholders and experts to refine algorithms and calibration. Activities include analysis of test data, preparation of validation tools, and coordination between sensor and data record teams.
One-day wave forecasts based on artificial neural networks - Jonathan D'Cruz
The document summarizes a study that uses artificial neural networks (ANNs) to generate 24-hour wave forecasts from wave buoy data at 6 locations. The ANNs are trained on over 12 years of wave height data from the buoys and forecast wave heights up to 24 hours ahead. They produce reliable 6-12 hour forecasts, but longer-range forecasts tend to underestimate peak heights or delay their timing. Real-time predictions starting in April 2005 showed similar trends.
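The forecasting setup can be illustrated with a sliding-window regression. The study used neural networks; the least-squares autoregression below is a dependency-free stand-in for the same input-window idea, and the wave series is synthetic.

```python
import numpy as np

def fit_ar(series, k):
    """Least-squares fit of h[t] from the k previous values plus an intercept."""
    X = np.array([series[i:i + k] for i in range(len(series) - k)])
    y = series[k:]
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    return coef

def predict(coef, window):
    """One-step-ahead prediction from the last k observed values."""
    return float(np.dot(coef[:-1], window) + coef[-1])

t = np.arange(600)
heights = 2 + np.sin(2 * np.pi * t / 50)   # synthetic swell cycle (m)
coef = fit_ar(heights[:500], k=5)
pred = predict(coef, heights[495:500])     # forecast of heights[500]
```

Multi-hour forecasts chain such one-step predictions, which is where errors compound and peaks get underestimated or delayed, as the study observed.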
Scott McIntosh, Director, High Altitude Observatory, National Center for Atmospheric Research, Boulder, Colorado
June 2016 - UCAR Congressional Briefing on Predicting Space Weather
Video of this presentation will be available soon.
1) NREL is a national laboratory operated by the Alliance for Sustainable Energy, LLC that focuses on energy efficiency and renewable energy.
2) The presentation discusses options for quantifying solar resource from measurements including horizontal and inclined surfaces, and methods for transposing horizontal irradiance data to plane of array irradiance.
3) It notes that isotropic models used to approximate this transposition can underestimate plane of array irradiance by 5-20% compared to using anisotropic physics models that better simulate cloud conditions and solar radiances.
This document summarizes a presentation about quantifying uncertainty when tuning satellite-derived solar irradiance data to ground measurements. It discusses how the length of ground data used for tuning impacts uncertainty levels in the tuned satellite data. Analysis of multiple sites showed that uncertainty decreases as more months of ground data are used, as it reduces the effects of seasonal variations. With less than a year of ground data, seasonal effects can amplify uncertainty, but results stabilize with a year or more of data.
Reanalysis Datasets for Solar Resource Assessment Presented at ASES 2014Gwendalyn Bender
The document compares the accuracy of two second generation reanalysis datasets (MERRA and ERA-Interim) to satellite-derived solar data for solar resource assessment. It finds that MERRA has less error than ERA-Interim or satellite data alone. A bias correction method is applied to improve MERRA, reducing errors further. However, in cloudier locations, one year of training data is insufficient for the correction. While reanalysis datasets provide long-term global data, satellite observations currently provide more accurate short-term solar estimates.
This document discusses challenges and opportunities for using machine learning and data mining techniques on big climate data. It describes various types of climate and Earth observation data available from satellites and models. Research highlights are presented on using pattern mining to track ocean eddies, extreme value theory to study heatwaves and rainfall, and relationship mining to study seasonal hurricane activity. The challenges of analyzing multi-scale, heterogeneous climate data are also discussed.
Estimating Fire Weather Indices Via Semantic Reasoning Over Wireless Sensor N...IJwest
Wildfires are frequent, devastating events in Australia that regularly cause significant loss of life and widespread property damage. Fire weather indices are a widely-adopted method for measuring fire danger and they play a significant role in issuing bushfire warnings and in anticipating demand for bushfire management resources. Existing systems that calculate fire weather indices are limited due to low spatial and temporal resolution. Localized wireless sensor networks, on the other hand, gather continuous sensor data measuring variables such as air temperature, relative humidity, rainfall and wind speed at high resolutions. However, using wireless sensor networks to estimate fire weather indices is a challenge due to data quality issues, lack of standard data formats and lack of agreement on thresholds and methods for calculating fire weather indices. Within the scope of this paper, we propose a standardized approach to calculating Fire Weather Indices (a.k.a. fire danger ratings) and overcome a number of the challenges by applying Semantic Web Technologies to the processing of data streams from a wireless sensor network deployed in the Springbrook region of South East Queensland. This paper describes the underlying ontologies, the semantic reasoning and the Semantic Fire Weather Index (SFWI) system that we have developed to enable domain experts to specify and adapt rules for calculating Fire Weather Indices. We also describe the Web-based mapping interface that we have developed, that enables users to improve their understanding of how fire weather indices vary over time within a particular region. Finally, we discuss our evaluation results that indicate that the proposed system outperforms state-of-the-art techniques in terms of accuracy, precision and query performance.
The document summarizes the development of satellite modeling for the National Solar Radiation Database (NSRDB) to provide accurate surface solar radiation data. It describes the evolution from empirical to physical models using satellite measurements and ancillary data as inputs to radiative transfer models. Validation shows the new 2005-2012 dataset has a mean bias error of less than 5% for GHI and DNI compared to surface measurements, though uncertainty remains for cloudy cases. Future work aims to improve the model with higher resolution data and better representation of aerosols and surfaces.
IRJET - Intelligent Weather Forecasting using Machine Learning TechniquesIRJET Journal
This document discusses using machine learning techniques to forecast weather intelligently. It proposes using multi-target regression and recurrent neural network (RNN) models trained on historical weather data from Bangalore to predict future weather conditions like temperature, humidity, and precipitation. The data is first preprocessed before being fed to the models. The models are evaluated to accurately predict weather in the short term to help people like farmers and commuters without relying on expensive equipment.
1. The document discusses using sky imagers for short-term solar forecasting, as traditional methods lack sufficient spatial and temporal resolution for small-scale applications.
2. The proposed sky imager forecast model involves 7 steps: image analysis, cloud detection, cloud projection, shadow projection, irradiance modeling, predicting cloud motion to generate forecasts, and PV power modeling.
3. Accurate cloud detection, projection, and shadow projection are challenging due to issues like cloud inhomogeneity, perspective errors with distance from camera, and sensitivity to errors in estimated cloud base height.
Ilkay Altintas from the San Diego Supercomputer Center gave this talk at the HPC User Forum.
"WIFIRE is an integrated system for wildfire analysis, with specific regard to changing urban dynamics and climate. The system integrates networked observations such as heterogeneous satellite data and real-time remote sensor data, with computational techniques in signal processing, visualization, modeling, and data assimilation to provide a scalable method to monitor such phenomena as weather patterns that can help predict a wildfire's rate of spread."
Watch the video: https://wp.me/p3RLHQ-inQ
Learn more: https://wifire.ucsd.edu/https://wifire.ucsd.edu/
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Weather forecasting is the application of science and technology to predict the state of the atmosphere for a future time and a given location. Now days, forecasting of accurate atmospheric conditions is the major challenge for the meteorologist and poor forecasting has significant impact on our daily lives. This brings the necessity to make research works on forecasting of the weather events with respect to Ethiopia.
Air quality challenges and business opportunities in China: Fusion of environ... - CLIC Innovation Ltd
MMEA (The Measurement, Monitoring and Environmental Efficiency Assessment) research program final seminar presentation by Dr. Ari Karppinen, Finnish Meteorological Institute
CSP Training series: solar resource assessment 2/2 - Leonardo ENERGY
Fifth session of the 2nd Concentrated Solar Power Training dedicated to solar resource assessment.
* DNI Variability, Frequency Distributions
* Typical Meteorological Years
* DNI measurements: broadband vs. spectral, and their limitations
* What is circumsolar radiation and why should we care in CSP/CPV?
* How much diffuse irradiance can be used in concentrators?
* How to measure and model the circumsolar irradiance?
* Spectral irradiance standards and their use for PV/CPV rating
* The AM1.5 direct standard spectrum: Why did it change? Why AM1.5?
* Use of the SMARTS radiative code to evaluate clear-sky spectral irradiances
* Sources of measured spectral irradiance data
* Spectral effects on silicon and multijunction cells and their dependence on climate
This document describes an approach to guide a robot or person carrying a methane sniffer to locate the source of a methane leak. It involves simulating methane distribution based on wind patterns and obstacles, calculating the probability of different leak location hypotheses given sensor readings, and using expected information gain to determine the next best location to move to in order to gather more information. The approach was tested in a virtualization demo using real methane concentration and wind data.
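The search loop described above can be sketched as follows: a grid of candidate leak locations, a toy forward model (concentration decays with distance; wind and obstacles are omitted), Bayesian updates from noisy readings, and a greedy proxy for expected information gain. All names, the grid, and the sensor model are illustrative, not the document's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
grid = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float)
true_leak = np.array([7.0, 3.0])
noise = 0.05

def expected_reading(pos, leak):
    """Toy plume model: concentration falls off with distance from the leak."""
    return 1.0 / (1.0 + np.linalg.norm(pos - leak))

belief = np.full(len(grid), 1.0 / len(grid))   # uniform prior over hypotheses
pos = np.array([0.0, 0.0])
for _ in range(15):
    z = expected_reading(pos, true_leak) + rng.normal(0.0, noise)
    mu = np.array([expected_reading(pos, h) for h in grid])
    belief *= np.exp(-0.5 * ((z - mu) / noise) ** 2)   # Gaussian likelihood
    belief /= belief.sum()
    # Move to where the surviving hypotheses disagree most about the next
    # sensor reading (a cheap stand-in for expected information gain).
    preds = np.array([[expected_reading(p, h) for h in grid] for p in grid])
    mean_pred = preds @ belief
    var_pred = ((preds - mean_pred[:, None]) ** 2) @ belief
    pos = grid[np.argmax(var_pred)]

estimate = grid[np.argmax(belief)]
print(estimate)  # concentrates at or near the true leak (7, 3)
```

Replacing the variance proxy with the exact expected entropy reduction gives the information-gain criterion the document describes.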
Earth observation systems now allow for accurate global monitoring and measurement due to advances in satellite technology. High resolution satellite images are more widely available at lower costs, enabling detailed land use data collection. Key elements of measurement include both remote sensing via satellites as well as ground measurements for calibration and analysis. Spatial data is organized using GIS databases to extrapolate measurements and characterize land use heterogeneity globally.
Weather prediction technology is a global, big data enterprise. This talk will describe the huge quantities of information that make modern weather prediction possible, from satellite and radar data to surface observations and the output from numerical weather prediction models. The role of smartphones and other mobile devices for distributing forecasts and weather information will be discussed and the future of weather prediction will be outlined.
This document discusses GPS technology and proposes solutions to some of its problems. It begins with background on the initial military and civilian uses of GPS. It then explains how GPS works using triangulation of signals from satellites. The document notes that while GPS has become more popular and affordable with products like Garmin, problems exist regarding availability, strength, accuracy, and security. Solutions proposed include using wireless networking principles to improve availability during "handoffs"; applying quantum cryptography to encryption for better security; and using repeaters to strengthen signals and reduce initialization times, improving accuracy. Experiments are suggested to test these solutions in real-world trials.
The document describes a student research project that used seismic data from 10 stations near Neal Hot Springs, Oregon to detect small local earthquakes through numerical cross correlation. The student created filtered templates from 3 known earthquakes to scan 20 months of seismic data recorded between 2011-2012. This process identified 9 additional earthquakes, beyond the original 3 that had been visually identified. Finding more small earthquakes will help interpret local fault activity and assess if nearby geothermal exploration affects seismicity. The research demonstrates applying new computational techniques to gain higher resolution from existing seismic data and better understand subsurface geology.
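The template-matching step can be sketched as a normalized cross-correlation scan; the decaying-sinusoid "earthquake", the noise level, and the 0.8 threshold below are all illustrative choices, not the student's actual parameters.

```python
import numpy as np

# Matched-filter event detection: slide a known-earthquake template along a
# continuous trace and flag windows whose normalized cross-correlation
# exceeds a threshold. Synthetic data stands in for the seismic records.
rng = np.random.default_rng(3)
template = np.sin(np.linspace(0, 6 * np.pi, 50)) * np.exp(-np.linspace(0, 4, 50))
trace = rng.normal(0, 0.1, 2000)
for onset in (300, 1200):                    # two hidden "earthquakes"
    trace[onset:onset + 50] += template

def ncc(trace, tmpl):
    """Normalized cross-correlation of tmpl at every offset of trace."""
    n = len(tmpl)
    t0 = (tmpl - tmpl.mean()) / tmpl.std()
    out = np.empty(len(trace) - n + 1)
    for i in range(len(out)):
        w = trace[i:i + n]
        out[i] = np.dot((w - w.mean()) / (w.std() + 1e-12), t0) / n
    return out

cc = ncc(trace, template)
detections = np.flatnonzero(cc > 0.8)
print(detections)  # clusters around offsets 300 and 1200
```

Clusters of adjacent detections around each onset would normally be collapsed to one event by keeping the local correlation maximum.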
CSP Training series: solar resource assessment 1/2 - Leonardo ENERGY
The key factors that can explain inconsistencies and large disagreements between solar resource maps include:
1. Differences in the cloud data sources or periods used to create the maps
2. Inconsistent aerosol data used in the models
3. Use of long-term monthly average aerosol data versus mean daily data
4. Reliance on empirical algorithms that may degrade in accuracy for some areas
5. Lack of validation against actual ground-measured DNI data, which is limited
Proper validation against available ground measurements and consistency in input data are important to improve agreement between solar resource maps. The scarcity of DNI data makes validation challenging.
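Validation against ground stations usually comes down to a handful of agreement statistics. A minimal sketch with invented DNI values (any real comparison would use quality-controlled pyrheliometer records):

```python
import numpy as np

# Compare modeled DNI against (scarce) ground measurements using bias and
# RMSE, two statistics solar resource map providers commonly report.
ground = np.array([820., 640., 910., 500., 760.])   # pyrheliometer DNI, W/m^2
modeled = np.array([850., 600., 930., 540., 800.])  # satellite-derived DNI

bias = np.mean(modeled - ground)                 # systematic offset
rmse = np.sqrt(np.mean((modeled - ground) ** 2))
rel_bias = 100 * bias / ground.mean()            # percent of mean DNI
print(round(bias, 1), round(rmse, 1), round(rel_bias, 1))  # -> 18.0 34.9 2.5
```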
MO3.L10 - STATUS OF PRE-LAUNCH ACTIVITIES FOR THE NPOESS COMMUNITY COLLABORAT... - grssieee
The document summarizes the status of pre-launch validation activities for the NPP satellite. It discusses that validation teams are continuing work to characterize sensor data records and environmental data records in preparation for post-launch validation. Team leads provide experience from past missions and are working with stakeholders and experts to refine algorithms and calibration. Activities include analysis of test data, preparation of validation tools, and coordination between sensor and data record teams.
One-day wave forecasts based on artificial neural networks - Jonathan D'Cruz
The document summarizes a study that uses artificial neural networks (ANNs) to generate 24-hour wave forecasts based on wave buoy data from 6 locations. It trains ANNs using over 12 years of wave-height data from the buoys as input and forecasts wave heights up to 24 hours ahead as output. The ANNs generate reliable 6-12 hour forecasts, but longer-term forecasts tend to underestimate peak heights or delay their timing. Real-time predictions starting in April 2005 showed similar trends.
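The buoy setup can be sketched with lagged wave heights as model inputs; here a linear autoregression stands in for the study's ANN, and a synthetic swell cycle stands in for the 12-year buoy record.

```python
import numpy as np

# "hs" is significant wave height in metres, sampled hourly; the model maps
# the last 24 h of observations directly to the value 24 h ahead.
rng = np.random.default_rng(4)
t = np.arange(1000)
hs = 2.0 + 0.8 * np.sin(2 * np.pi * t / 120) + rng.normal(0, 0.05, 1000)

lags, horizon = 24, 24
n = len(hs) - lags - horizon + 1
X = np.column_stack([hs[j:j + n] for j in range(lags)] + [np.ones(n)])
y = hs[lags + horizon - 1:]                     # wave height 24 h later

w, *_ = np.linalg.lstsq(X, y, rcond=None)
err = np.sqrt(np.mean((X @ w - y) ** 2))
print(err)  # in-sample RMSE, a small fraction of the 0.8 m swell amplitude
```

An ANN replaces the single weight vector `w` with nonlinear layers, which is what lets it capture asymmetries in wave growth and decay that a linear map misses.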
Scott McIntosh, Director, High Altitude Observatory, National Center for Atmospheric Research, Boulder, Colorado
June 2016 - UCAR Congressional Briefing on Predicting Space Weather
Video of this presentation will be available soon.
TU2.L10 - NEXT-GENERATION GLOBAL PRECIPITATION PRODUCTS AND THEIR APPLICATIONS - grssieee
The document summarizes the goals and capabilities of the upcoming Global Precipitation Measurement (GPM) mission. GPM will provide next-generation global precipitation data products through a constellation of passive microwave sensors calibrated to the GPM Core Observatory's radar and radiometer. This will improve accuracy for light rain and snow and provide higher resolution and more frequent observations. Ground validation efforts and applications research are important to maximize the scientific and societal benefits of GPM precipitation data.
C5.05: Fit for Purpose Marine Observations - Boris Kelly-Gerreyn - Blue Planet Symposium
This document discusses ensuring marine observations are fit for purpose. It notes the Bureau collects over 2 million marine observations per year. Marine observations are crucial for numerous applications including weather forecasting, climate modeling, and informed decision making. The Bureau uses techniques like observing system experiments and forecast sensitivity to observations to evaluate how well observations improve forecasts and determine which observation types have the most impact. Preliminary results found buoys and satellite temperature and humidity profiles significantly improve forecasts. Going forward the Bureau plans to apply these techniques to upgraded forecast models to guide network planning and data assimilation research.
CFD down-scaling and online measurements for short-term wind power forecasting - Jean-Claude Meteodyn
Generally speaking, forecast systems are classified as follows: intraday (very short term) forecasting is commonly stochastic, using online measurements, while extraday (short term) forecasting is usually deterministic, based on NWP data. This work aims to break down these classifications by proposing a single tool that unifies all of these techniques.
The Pacific Research Platform: A Regional-Scale Big Data Analytics Cyberinfra... - Larry Smarr
National Ocean Exploration Forum 2017
Ocean Exploration in a Sea of Data
Calit2’s Qualcomm Institute
University of California, San Diego
October 21, 2017
Using the Pacific Research Platform for Earth Sciences Big Data - Larry Smarr
Grand Challenge Lecture
Big Data and the Earth Sciences: Grand Challenges Workshop
Calit2’s Qualcomm Institute
University of California, San Diego
May 31, 2017
FourCastNet is a global weather forecasting model that provides accurate short- to medium-range predictions at an unprecedented high resolution of 0.25° using a data-driven deep learning approach. FourCastNet matches or exceeds the accuracy of ECMWF's numerical weather prediction model IFS for variables like surface winds and precipitation. It generates week-long forecasts nearly 45,000 times faster than IFS, enabling large ensembles for improved probabilistic forecasting. FourCastNet uses an adaptive Fourier neural operator with a vision transformer backbone to make the first high-resolution global weather predictions from a deep learning model.
Modern weather forecasting relies on numerical weather prediction (NWP) models that integrate systems of equations governing atmospheric processes. NWP models have evolved from early conceptual models to high-resolution global and regional models run on supercomputers. Continuous observations from satellites and other sensors are assimilated using data assimilation techniques to initialize models. Ensemble modeling addresses forecast uncertainty. Though limited by incomplete understanding and observations, NWP provides increasingly accurate forecasts out to around two weeks.
The document summarizes the SCI-HI experiment which aims to observe hydrogen during the Cosmic Dawn period using the 21-cm emission line. The experiment has deployed a preliminary instrument on Isla Guadalupe to collect data. Analysis of the initial data involved calibration using a Milky Way galaxy model and foreground removal through polynomial fitting. Further improvements to the instrument are planned, such as using a gas generator instead of batteries and deploying to more remote sites, to help better understand the signal from the first stars.
The document proposes the GOAL&GO architecture, which would provide global observations from Lagrange point, pole-sitter, and geosynchronous orbits using small, low-cost spacecraft. This revolutionary concept could monitor Earth's response to climate change and meet needs for disaster monitoring and relief through frequent imaging of the entire globe. The system is designed to evolve over 10-20 years using simple, proven technologies on multiple spacecraft to provide flexible, low-cost Earth observations.
The document summarizes the past 10 years of studying volcanoes using InSAR techniques from spaceborne radar systems and looks ahead to future developments. Key points include: 1) InSAR has advanced from initial imaging to reliable time series analyses of deformation; 2) New radar systems provide higher resolution data at different frequencies but coverage remains limited; and 3) Future missions like DESDynI-R are designed for volcanology but funding and policies remain challenges to fully utilizing the technique.
NPOESS Transition to the Joint Polar Satellite System (JPSS) and Defense Weat... - grssieee
The document discusses the transition of the NPOESS environmental monitoring program to two new programs - JPSS led by NOAA/NASA and DWSS led by the DoD. It outlines the status of the NPP satellite, ground systems, calibration/validation plans, and data readiness for users. The transition aims to ensure continuity of critical weather and climate observations through international collaboration between the US and European agencies.
This document summarizes a spatial analysis of tornado risk in East Tennessee. It analyzes tornado start point density using kernel density estimation within 25 miles of each point, focusing on the area covered by the NWS Morristown office. Statistics show areas of anomalously high density northeast of the French Broad River Valley have density values above the mean. While no definitive conclusions can be drawn, the results provide some support for previous research finding higher tornado risk near terrain features like river valleys. Future modeling studies are suggested to better understand how terrain may influence tornado formation.
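The density step can be sketched directly: a Gaussian kernel density estimate over tornado start points, with a fixed bandwidth standing in for the 25-mile search radius. The coordinates and the two evaluation points are illustrative, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(5)
cluster = rng.normal([35.0, -83.5], 0.1, size=(30, 2))     # dense cluster
background = rng.uniform([34.0, -85.0], [36.5, -82.0], size=(20, 2))
starts = np.vstack([cluster, background])                  # tornado start points

def kde(points, grid, bandwidth=0.2):
    """Gaussian KDE: mean kernel weight of all points at each grid location."""
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2)).mean(axis=1)

grid = np.array([[35.0, -83.5], [36.4, -82.1]])            # cluster vs fringe
dens = kde(starts, grid)
print(dens[0] > dens[1])   # density peaks at the cluster -> True
```

Cells whose density exceeds the mean by some margin are the "anomalously high density" areas the analysis maps.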
TRACKING ANALYSIS OF HURRICANE GONZALO USING AIRBORNE MICROWAVE RADIOMETER - jmicro
Airborne microwave radiometry is receiving considerable attention for remote sensing as a complement to satellites because it can provide high-accuracy, real-time data. Airborne hurricane tracking, developed by the NASA Marshall Space Flight Center, provides high-resolution measurements by flying specially equipped aircraft that use synthetic thinned-array radiometry to capture critical quantities such as hurricane eye location, wind speed, and pressure. This paper presents an analysis of best-track positions for Hurricane Gonzalo based on data collected by airborne microwave radiometry; significant insight comes from comparing the airborne data with surface observations from ship reports. The main task is to estimate the peak intensity and minimum central pressure of Gonzalo from 12 to 19 October 2014, based on a blend of SFMR flight-level winds and pressure retrievals from observed brightness temperatures. The Stepped-Frequency Microwave Radiometer (SFMR), developed at the Langley Research Center, is designed to measure wind speed at the ocean surface and rainfall rates within the storm accurately and continuously. The work also reports the locations, pressures, and wind speeds that are critical for predicting a hurricane's growth and movement, supporting future monitoring of hurricane disasters, and presents a concept for an airborne stepped-frequency microwave radiometer. The objective of this research is a tracking analysis based on comparing satellite, ship, and airborne reports to obtain higher accuracy. The system operates at four spaced frequencies between 4 GHz and 7 GHz and provides wide measurements within ±45° incidence angle.
Hurricane Gonzalo (2014) serves as the example; the best results for retrieved wind speed, locations, and pressure are presented. Several national projects have been developed for Earth observation, such as fire, hurricane, and border surveillance. This work describes efficient high-resolution C-band, four-frequency techniques, the valuable information provided by the airborne system, and the approach to predicting the growth and movement of hurricanes. Passive microwave remote sensing from space at C band has the advantage of penetrating the atmosphere. The airborne system can operate fully polarimetrically in the C, X, S, L, and P bands, covering wavelengths from 3 to 85 cm; its measurement modes include single-channel operation by wavelength and polarization.
Dr. Thomas Zurbuchen discusses research frontiers in space weather. He notes that space weather is as important to space researchers as cancer is to biologists. His presentation covers the role of universities in fundamental research, modeling challenges like the University of Michigan's model, new distributed sensor network architectures using fleets of nanosatellites, and the educational challenge of training students in developing new technologies and analyzing data.
Similar to Big Science, Big Data, and Big Computing for Advanced Hurricane Prediction (20)
Gokhan Danabasoglu, Senior Scientist and Community Earth System Model Chief Scientist, National Center for Atmospheric Research (NCAR)
UCAR Congressional Briefing - April 2018
Ben Kirtman, Director, Cooperative Institute for Marine & Atmospheric Studies, University of Miami, Rosenstiel School of Marine and Atmospheric Science
UCAR Congressional Briefing - April 2018
The document discusses how subseasonal to seasonal forecasts can help drive analytics in the cattle industry. It provides examples of how forecasts of drought in the Southern Plains and Hurricane Harvey helped cattle producers plan grazing and reduce exposure to weather risks. Livestock Wx translates weather and climate information for cattle producers and uses probabilistic precipitation forecasts from multiple models to assess risk on timescales from monthly to interannual. Subseasonal to seasonal predictions have economic value for the cattle industry by helping with tactical planning for issues like forage growth, pest outbreaks, and cattle weights.
Alicia Karspeck, Climate Scientist and Associate Director of Research Partnerships, Jupiter Technology Systems, Inc.
UCAR Congressional Briefing - April 2018
Rebecca Morss, Senior Scientist and Deputy Director, Mesoscale and Microscale Meteorology Laboratory of the National Center for Atmospheric Research (NCAR) - November 2017 UCAR Congressional Briefing
Today's wildfires are far outside the historic range of variability, with lasting consequences for our forests and open lands. Understanding the three fundamental forces governing fire behavior - fuel, topography, and weather - remains key to predicting how fires may behave in coming decades.
Fire modeling occurs over many scales. Many disparate communities are involved, from foresters and ecologists to engineers, atmospheric scientists, and remote sensing specialists. There are challenges obtaining good observational data. Researchers are working to build better physical models to help understand how wildfires spread, to fill in missing data, and to improve observations from a variety of sources.
Wildland fires are exceedingly complex phenomena. No human can integrate all the interacting factors in real-time. More sophisticated tools are needed that capture interactions between the fire and the local atmosphere. Research is yielding emerging wildfire decision support technologies that are primed to be transitioned to operations.
Edward P. Clark, Director, Geo-Intelligence Office of Water Prediction, National Water Center - September 2016
UCAR Congressional Briefing
The National Water Model is a collaborative effort, with hydrography data developed by the USGS and EPA, modeling framework developed by NCAR, and testing and deployment supported by CUAHSI and the National Water Center. The National Water Center - opened in 2015 - is the nation's first facility dedicated to water forecasts, research, and collaboration across federal water science and management agencies.
David J. Gochis, Scientist, National Center for Atmospheric Research - September 2016
UCAR Congressional Briefing
WRF-Hydro, a powerful NCAR-based computer model developed by a collaborative community drawing from academia, federal labs, and private industry, is the first nationwide operational system to provide continuous predictions of water levels and potential flooding in rivers and streams from coast to coast. NOAA's new Office of Water Prediction selected it last year as the core of the agency's new National Water Model. WRF-Hydro is designed to provide adaptable modeling for assimilation and prediction of precipitation, soil moisture, snowpack, groundwater, streamflow, and inundation.
Richard P. Hooper, Executive Director, Consortium for the Advancement of Hydrologic Science, Inc. (CUAHSI) - September 2016
UCAR Congressional Briefing
The new National Water Model gives the university research community a framework for collaboration that supports interdisciplinary research. This new framework will also help move research to operations, providing a testbed for different process representations and connecting different scales for a more integrated view.
Ryan E. Emanuel (Lumbee), Associate Professor, Department of Forestry and Environmental Resources, North Carolina State University - September 2016
UCAR Congressional Briefing
Water influences ecological processes and patterns; ecosystems influence water quantity and quality; ecohydrology focuses on these water-life interactions.
UCAR Congressional Briefing
John McHenry, Chief Scientist, Advanced Meteorological Systems, Baron Services - September 2016
UCAR Congressional Briefing
Commercial weather companies partner with research community and government agencies to develop and deploy critical weather intelligence with the goal of reducing harm to people and property. The newly deployed National Water Model has significant promise for reducing flood-related disaster risks.
Vice Admiral (Retired) Conrad Lautenbacher, CEO, GeoOptics Inc., Pasadena, CA
June 2016 - UCAR Congressional Briefing on Space Weather Prediction
Video of this presentation will be available shortly.
Bruce Carmichael, Director, Aviation Applications Program, National Center for Atmospheric Research
February 2016 - UCAR Congressional Briefing on Aviation Weather Safety
Video of this presentation: https://president.ucar.edu/government-relations/washington-update/3594/aviation-weather-safety-ucar-congressional-briefing
Julienne Stroeve discusses the need for improved forecasting of changing sea ice conditions months to years in advance. She notes that summer sea ice has been declining rapidly, opening new shipping routes in the Arctic. However, forecasts have large uncertainties and are needed at shorter time scales and with higher spatial resolution. Improving processes like melt ponds and ocean upwelling in models could enhance forecasts, as could initializing forecasts with ice thickness maps. A dedicated polar prediction effort like the proposed Year of Polar Prediction could bring together international experts to advance this important area.
Dr. David W. Titley, Rear Adm USN (ret) - Sept 2015
UCAR Congressional Briefing
The Arctic is changing, and not in a vacuum. It's time to get ready to address our security, access, and sovereignty.
Video of this presentation: https://president.ucar.edu/government-relations/washington-update/851/state-arctic-ucar-congressional-briefing
More from UCAR - Atmospheric & Earth System Science (18)
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf - Selcen Ozturkcan
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
Sexuality - Issues, Attitude and Behaviour - Applied Social Psychology - Psyc... - PsychoTech Services
A proprietary approach developed by bringing together the best of learning theories from Psychology, design principles from the world of visualization, and pedagogical methods from over a decade of training experience, that enables you to: Learn better, faster!
Microbial interaction
Microorganisms interact with each other and can be physically associated with other organisms in a variety of ways.
One organism can live on the surface of another as an ectobiont, or within another organism as an endobiont.
Microbial interactions may be positive, such as mutualism, proto-cooperation, and commensalism, or negative, such as parasitism, predation, or competition.
Types of microbial interaction
Positive interaction: mutualism, proto-cooperation, commensalism
Negative interaction: Ammensalism (antagonism), parasitism, predation, competition
I. Mutualism:
It is defined as a relationship in which each organism in the interaction benefits from the association. It is an obligatory relationship in which the mutualist and the host are metabolically dependent on each other.
A mutualistic relationship is highly specific: one member of the association cannot be replaced by another species.
Mutualism requires close physical contact between the interacting organisms.
A mutualistic relationship allows organisms to exist in habitats that could not be occupied by either species alone.
A mutualistic relationship between organisms allows them to act as a single organism.
Examples of mutualism:
i. Lichens:
Lichens are an excellent example of mutualism.
They are associations of specific fungi with certain genera of algae. In a lichen, the fungal partner is called the mycobiont and the algal partner the phycobiont.
II. Syntrophism:
It is an association in which the growth of one organism either depends on, or is improved by, a substrate provided by another organism.
In syntrophism, both organisms in the association benefit.
Compound A → utilized by population 1 → Compound B → utilized by population 2 → Compound C → utilized by both populations 1 and 2 → Products
In this theoretical example of syntrophism, population 1 is able to utilize and metabolize compound A, forming compound B, but cannot metabolize beyond compound B without the cooperation of population 2. Population 2 is unable to utilize compound A, but it can metabolize compound B, forming compound C. Together, populations 1 and 2 can carry out the metabolic reactions leading to an end product that neither population could produce alone.
Examples of syntrophism:
i. Methanogenic ecosystem in sludge digester
The methane produced by methanogenic bacteria depends on interspecies hydrogen transfer from other fermentative bacteria.
Anaerobic fermentative bacteria utilize carbohydrates to generate CO2 and H2, which are then used by methanogenic bacteria (Methanobacter) to produce methane.
ii. Lactobacillus arabinosus and Enterococcus faecalis:
In minimal medium, Lactobacillus arabinosus and Enterococcus faecalis are able to grow together but not alone.
A synergistic relationship occurs between E. faecalis and L. arabinosus, in which E. faecalis requires folic acid supplied by L. arabinosus, while L. arabinosus requires phenylalanine supplied by E. faecalis.
(June 12, 2024) Webinar: Development of PET theranostics targeting the molecu... - Scintica Instrumentation
Targeting Hsp90 and its pathogen Orthologs with Tethered Inhibitors as a Diagnostic and Therapeutic Strategy for cancer and infectious diseases with Dr. Timothy Haystead.
BIRDS DIVERSITY OF SOOTEA BISWANATH ASSAM.ppt.pptx - goluk9330
Ahota Beel, nestled in Sootea, Biswanath, Assam, is celebrated for its extraordinary diversity of bird species. This wetland sanctuary supports a myriad of avian residents and migrants alike. Visitors can admire the elegant flights of migratory species such as the Northern Pintail and Eurasian Wigeon, alongside resident birds including the Asian Openbill and Pheasant-tailed Jacana. With its tranquil scenery and varied habitats, Ahota Beel offers a perfect haven for birdwatchers to appreciate and study the vibrant birdlife that thrives in this natural refuge.
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub... - Sérgio Sacani
Context. The observation of several L-band emission sources in the S cluster has led to a rich discussion of their nature. However, a definitive answer to the classification of the dusty objects requires an explanation for the detection of compact Doppler-shifted Brγ emission. The ionized hydrogen in combination with the observation of mid-infrared L-band continuum emission suggests that most of these sources are embedded in a dusty envelope. These embedded sources are part of the S-cluster, and their relationship to the S-stars is still under debate. To date, the question of the origin of these two populations has been vague, although all explanations favor migration processes for the individual cluster members. Aims. This work revisits the S-cluster and its dusty members orbiting the supermassive black hole SgrA* on bound Keplerian orbits from a kinematic perspective. The aim is to explore the Keplerian parameters for patterns that might imply a nonrandom distribution of the sample. Additionally, various analytical aspects are considered to address the nature of the dusty sources. Methods. Based on the photometric analysis, we estimated the individual H−K and K−L colors for the source sample and compared the results to known cluster members. The classification revealed a noticeable contrast between the S-stars and the dusty sources. To fit the flux-density distribution, we utilized the radiative transfer code HYPERION and implemented a young stellar object Class I model. We obtained the position angle from the Keplerian fit results; additionally, we analyzed the distribution of the inclinations and the longitudes of the ascending node. Results. The colors of the dusty sources suggest a stellar nature consistent with the spectral energy distribution in the near- and mid-infrared domains. Furthermore, the evaporation timescales of dusty and gaseous clumps in the vicinity of SgrA* are much shorter (≲2 yr) than the epochs covered by the observations (≈15 yr).
In addition to the strong evidence for the stellar classification of the D-sources, we also find a clear disk-like pattern following the arrangements of S-stars proposed in the literature. Furthermore, we find a global intrinsic inclination for all dusty sources of 60 ± 20°, implying a common formation process. Conclusions. The pattern of the dusty sources manifested in the distribution of the position angles, inclinations, and longitudes of the ascending node strongly suggests two different scenarios: the main-sequence stars and the dusty stellar S-cluster sources share a common formation history or migrated with a similar formation channel in the vicinity of SgrA*. Alternatively, the gravitational influence of SgrA* in combination with a massive perturber, such as a putative intermediate mass black hole in the IRS 13 cluster, forces the dusty objects and S-stars to follow a particular orbital arrangement. Key words: stars: black holes - stars: formation - Galaxy: center - galaxies: star formation
Anti-Universe And Emergent Gravity and the Dark Universe - Sérgio Sacani
Recent theoretical progress indicates that spacetime and gravity emerge together from the entanglement structure of an underlying microscopic theory. These ideas are best understood in Anti-de Sitter space, where they rely on the area law for entanglement entropy. The extension to de Sitter space requires taking into account the entropy and temperature associated with the cosmological horizon. Using insights from string theory, black hole physics and quantum information theory we argue that the positive dark energy leads to a thermal volume law contribution to the entropy that overtakes the area law precisely at the cosmological horizon. Due to the competition between area and volume law entanglement the microscopic de Sitter states do not thermalise at sub-Hubble scales: they exhibit memory effects in the form of an entropy displacement caused by matter. The emergent laws of gravity contain an additional ‘dark’ gravitational force describing the ‘elastic’ response due to the entropy displacement. We derive an estimate of the strength of this extra force in terms of the baryonic mass, Newton’s constant and the Hubble acceleration scale a0 = cH0, and provide evidence for the fact that this additional ‘dark gravity force’ explains the observed phenomena in galaxies and clusters currently attributed to dark matter.
GBSN - Biochemistry (Unit 6) Chemistry of Proteins
Big Science, Big Data, and Big Computing for Advanced Hurricane Prediction
1. Big Science, Big Data and Big Computing for Advanced Hurricane Monitoring & Prediction
Fuqing Zhang
Director, Center for Advanced Data Assimilation and Predictability Techniques
Professor, Department of Meteorology and Atmospheric Science
Pennsylvania State University
Research sponsored by NSF, ONR, NOAA and NASA
FROM RESEARCH TO INDUSTRY: How Earth system science enables private sector innovation
2. Track forecasts have improved drastically over the past 25 years: a 3-day forecast today is as accurate as a 1-day forecast was in 1989. Intensity forecast accuracy has remained generally stagnant over that same period, except for the last few years, thanks to the Hurricane Forecast Improvement Program (HFIP) led by NOAA.
National Hurricane Center Official TC Forecast Errors
3. NSF HPC Computing at the Texas Advanced Computing Center
System Name: Ranger
Operating System: Linux
Number of Cores: 62,976
Total Memory: 123 TB
Peak Performance: 579.4 TFlops
Total Disk: 1.73 PB (shared)
HFIP Allocation: 30M SUs (July 1, 2008 to March 31, 2009)
Goals of NOAA’s Hurricane Forecast Improvement Project (HFIP)
• Reduce average track and intensity forecast errors by half for days 1 through 5
• Significantly increase the probability of detection for rapid intensity change and decrease the false alarm ratio
• Extend the lead time for hurricane forecasts out to day 7
My team’s HFIP effort is co-funded by ONR, NSF and NASA
4. How can we provide better input to the hurricane models?
High-resolution observations from Hurricane Hunters and UAVs provide crucial airborne in-flight measurements: dropsondes, Doppler radar winds, …
Data assimilation: the process of generating initial conditions for weather prediction models by combining the background model estimate with all applicable up-to-date observations.
(Images: NOAA P-3 aircraft and NASA Global Hawk UAV.)
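The blending that data assimilation performs can be sketched in its simplest scalar form, where the error variances of the background and the observation set the weights. The numbers below are hypothetical and purely illustrative:

```python
# Illustrative scalar data assimilation: combine a model background
# estimate with an observation, weighting each by its error variance.
# All numbers are hypothetical, chosen only for demonstration.

def assimilate(background, obs, var_b, var_o):
    """Return the analysis that optimally blends background and observation."""
    gain = var_b / (var_b + var_o)   # weight: trust the less uncertain source more
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b     # analysis error variance is reduced
    return analysis, var_a

# Background wind estimate of 40 m/s; dropsonde observation of 46 m/s
analysis, var_a = assimilate(40.0, 46.0, var_b=9.0, var_o=3.0)
print(analysis, var_a)  # → 44.5 2.25
```

The analysis lands closer to the more accurate observation, and its error variance (2.25) is smaller than either input's, which is why ingesting more good observations pays off.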
5. PSU WRF-EnKF Hurricane Analysis & Prediction System
with advanced assimilation of airborne Doppler Radar Vr
Evaluated for all 100+ P3 TDR missions during 2008-2012
The PSU system uses NCAR’s WRF model; the TDR assimilation methodology has now been adopted by NOAA.
(F. Zhang and Y. Weng 2015, Bulletin of the American Meteorological Society)
(Figure: PSU WRF-EnKF hurricane intensity error in knots.)
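The ensemble Kalman filter (EnKF) behind this system estimates background error covariances from an ensemble of forecasts and uses them to spread each observation's information through the model state. A schematic, textbook-style stochastic EnKF update for a tiny hypothetical state is sketched below; this is an illustration of the technique, not the PSU WRF-EnKF implementation:

```python
import numpy as np

# Schematic stochastic (perturbed-observation) EnKF update for a tiny
# hypothetical state vector: [max wind (kt), warm-core T (C), min SLP (hPa)].
# All values are made up for illustration.

rng = np.random.default_rng(0)
n_ens, n_state = 20, 3
ens = rng.normal([40.0, 25.0, 950.0], [3.0, 2.0, 4.0], size=(n_ens, n_state))

H = np.array([1.0, 0.0, 0.0])   # observe the first state variable (e.g. a wind obs)
obs, obs_var = 46.0, 2.0

x_mean = ens.mean(axis=0)
X = ens - x_mean                           # ensemble anomalies (n_ens x n_state)
Hx = ens @ H                               # ensemble mapped to observation space
P_xh = X.T @ (Hx - Hx.mean()) / (n_ens - 1)  # state-obs cross-covariance
P_hh = np.var(Hx, ddof=1)                  # background variance in obs space

K = P_xh / (P_hh + obs_var)                # Kalman gain (n_state,)

# Each member assimilates a noisy copy of the observation
obs_perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n_ens)
ens = ens + np.outer(obs_perturbed - Hx, K)

print(ens.mean(axis=0))
```

Note how the gain vector `K` also updates the unobserved variables (temperature, pressure) through their sampled covariance with the observed wind; that covariance spreading is what makes assimilating sparse airborne Doppler data so effective.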
10. PSU WRF-EnKF Harvey Forecast with GOES-R Assimilation
in comparison with WRF(NoDA), operational HWRF & best track
Research Supported by ONR, NASA, NOAA and NSF
11. FV3 Prediction of Hurricane Harvey with a 3-km nested domain
PSU WRF-EnKF GOES-R assimilation used for FV3 initial vortex
FV3 3km-nest simulated radar reflectivity (left) vs. observations (right) at Landfall
(Panel labels: NEXRAD radar sites KCRP, KHGX, KEWX, KGRK.)
12. FV3 Prediction of Hurricane Harvey with a 3-km nested domain
PSU WRF-EnKF GOES-R assimilation used for FV3 initial vortex
FV3-forecasted (left) vs. observed event total rainfall (right) with 6-day lead time
(Panel labels: NEXRAD radar sites KCRP, KHGX, KEWX, KGRK, KLCH, KPOE, KSHV, KLIX. Forecast point maximum of 40+ inches; observed point maximum of 50+ inches.)
13. Promises of US’s Next-Generation Global Prediction System: FV3
Comparison with the EC model on 1-year-mean 500 mb anomaly correlations out to 10-day forecast lead time:
• FV3 with the EC initial condition (IC) is comparable to EC through days 1-7 and better thereafter;
• FV3 with the current GFS initial condition is considerably worse than either run with the EC IC;
• The US forecast is inferior mostly because of a poorer IC: its data assimilation ingests less data.
(Figure: anomaly correlation coefficient (ACC) versus forecast lead time in hours.)
Courtesy of Linus Magnusson at ECMWF and S.-J. Lin at NOAA/GFDL
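The anomaly correlation coefficient used to score these 500 mb forecasts is the correlation between forecast and verifying-analysis anomalies, both taken relative to climatology. A minimal sketch with synthetic fields (the heights and error magnitudes below are invented for illustration):

```python
import numpy as np

# Minimal sketch of the anomaly correlation coefficient (ACC):
# correlate forecast and verifying-analysis anomalies from climatology.

def acc(forecast, analysis, climatology):
    f = forecast - climatology   # forecast anomaly
    a = analysis - climatology   # verifying (analysis) anomaly
    f = f - f.mean()             # centered ACC: remove the domain-mean bias
    a = a - a.mean()
    return (f * a).sum() / np.sqrt((f * f).sum() * (a * a).sum())

rng = np.random.default_rng(1)
clim = np.full(100, 5500.0)                    # climatological 500 mb height (m)
truth = clim + rng.normal(0.0, 50.0, 100)      # synthetic verifying analysis
fcst = truth + rng.normal(0.0, 20.0, 100)      # forecast with modest error

score = acc(fcst, truth, clim)
print(round(score, 3))   # close to 1 for a skillful forecast
```

An ACC of 1 means the forecast anomaly pattern matches the analysis perfectly; in practice, scores above roughly 0.6 are conventionally taken as the threshold of useful skill.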
14. From Research to Industry
Concluding Remarks: Invest in big-data hurricane science
• There is great potential and need for improving hurricane prediction through
collaborative research and sustained federal investment in data, science and personnel
• Future hurricane prediction advances may come from the following areas:
• Observations: advanced observing systems such as those from airborne dropsondes,
Doppler radar and weather satellites
• Model: cutting-edge higher-resolution weather prediction models with more accurate
numerics and physics
• Data assimilation: comprehensive algorithms and methodologies that can more
effectively ingest existing and future observations into state-of-the-science models
• Computing: high-performance computing facilities that can perform advanced analysis
and forecasting in a timely manner.
15. Predictability and Error Sources of Hurricane Intensity Forecasts
(Emanuel and Zhang 2016, Journal of the Atmospheric Sciences)
(Figure: intensity error growth experiments comparing a +3 kt initial intensity error, shear only, initial vortex + shear, track, initial + track, and an initial inner-core moisture error.)
16. What “big data” means for hurricane research: ensembles
With real-time EnKF assimilation of airborne Doppler winds
(Zhang and Weng 2015, BAMS)
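One way an ensemble turns many member forecasts into forecast value is by reducing them to a best estimate, a flow-dependent uncertainty, and event probabilities. A minimal sketch with hypothetical intensity members (the values and ensemble size are invented for illustration):

```python
import numpy as np

# Sketch: an ensemble forecast turns one prediction into a distribution.
# Each member's peak-intensity forecast (kt) below is hypothetical.

members = np.array([110., 118., 95., 125., 104., 112., 99., 121., 108., 115.])

mean = members.mean()                # best single estimate
spread = members.std(ddof=1)         # flow-dependent uncertainty
prob_major = (members >= 96).mean()  # P(major hurricane), Cat 3+ is >= 96 kt

print(f"mean={mean:.1f} kt, spread={spread:.1f} kt, P(Cat3+)={prob_major:.0%}")
```

Real systems run dozens of members of a full numerical model, so the "big data" is the full four-dimensional ensemble of model states, but the reduction to mean, spread, and probabilities follows the same pattern.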
17. FV3 Prediction of Hurricane Harvey with a 3-km nested domain
PSU WRF-EnKF GOES-R assimilation used for FV3 initial vortex
in comparison with FV3(GFS), operational GFS, EC & best track
Ongoing PSU collaboration with NOAA/GFDL