The document discusses techniques for detecting land cover changes using time series satellite imagery. It describes segmentation-based and predictive model-based time series data mining techniques. The recursive merging algorithm is a segmentation-based technique that merges similar segments over time to identify change points. The yearly delta (YD) algorithm and variability distribution (VD) algorithm are predictive model-based techniques. The YD algorithm identifies changes as large differences from the previous year's values, while the VD algorithm also considers natural variability patterns at each location to identify significant changes. The VD algorithm is shown to perform better than YD for landscapes with diverse vegetation that experience different levels of natural variability over time.
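As a rough illustration (not the authors' implementation), the contrast between the two predictive tests can be sketched as follows: a YD-style test compares each year-over-year difference against a fixed threshold, while a VD-style test scales the threshold by each location's own historical variability. The function names and the 2-sigma default are illustrative choices, not from the source.

```python
import numpy as np

def yearly_delta_flags(series, threshold):
    """YD-style test: flag any year-over-year change larger than a
    fixed threshold, regardless of how variable the location is."""
    deltas = np.abs(np.diff(series))
    return deltas > threshold

def variability_flags(series, z_threshold=2.0):
    """VD-style test: flag changes that are large relative to the
    location's own year-to-year variability (a simple z-score test)."""
    deltas = np.diff(series)
    sigma = np.std(deltas)
    if sigma == 0:
        return np.zeros(deltas.size, dtype=bool)
    return np.abs(deltas) > z_threshold * sigma
```

For a naturally variable pixel, the fixed-threshold test fires on ordinary fluctuations, while the variability-scaled test stays quiet, which is the behavior motivating the VD comparison above.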
This document discusses challenges and opportunities for using machine learning and data mining techniques on big climate data. It describes various types of climate and Earth observation data available from satellites and models. Research highlights are presented on using pattern mining to track ocean eddies, extreme value theory to study heatwaves and rainfall, and relationship mining to study seasonal hurricane activity. The challenges of analyzing multi-scale, heterogeneous climate data are also discussed.
We present a survey of computational and applied mathematical techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of climate science problems that these computational improvements would allow to be investigated in greater depth include capturing the remote forcings of localized hydrological extreme events, accurately representing cloud features across a range of spatial and temporal scales, and running large parallel ensembles of simulations to explore model sensitivities and uncertainties more effectively.
Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
The document provides information about the National Centre for Medium Range Weather Forecasting (NCMRWF) in India. Some key points:
1. NCMRWF's mission is to develop advanced numerical weather prediction systems for India and neighboring regions to improve forecast reliability and accuracy.
2. NCMRWF operates global and regional forecast models and an ensemble prediction system. It assimilates various satellite, radiosonde, and surface observations into these models.
3. NCMRWF provides weather forecasts and other products to various government agencies and sectors like agriculture, energy, and disaster management in India.
This document summarizes a study on the 2010 Pakistan floods. The study used an atmospheric general circulation model (AGCM) with different SST forcings to separate the roles of ENSO-related and ENSO-unrelated SSTs. The model reproduced the observed rainfall anomalies and atmospheric circulation over South Asia when forced with the real 2010 SSTs. Experiments showed that both the Pacific La Niña SSTs and Indian Ocean SST anomalies contributed to the extreme rainfall: the Pacific SSTs modulated the large-scale monsoon circulation, while positive Indian Ocean SST anomalies induced northward moisture transport into northwest India and Pakistan, leading to heavy rainfall. The study highlights the importance of monitoring Indian Ocean variability for improving extended-range prediction of heavy rainfall events over the adjacent subtropics.
Measuring Water from the Sky: Basin-wide ET Monitoring and Application (Iwl Pcu)
This document discusses using remote sensing to measure evapotranspiration (ET) at the basin scale. It introduces ETWatch, an operational remote sensing model for estimating ET. ETWatch uses inputs like net radiation, soil heat flux, aerodynamic roughness, and atmospheric boundary layer parameters derived from remote sensing data. The document outlines validation of ETWatch estimates against field measurements. It also describes applications of ETWatch for water balance studies and identifying target ET for sustainable water consumption in basins like the Hai Basin in China.
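The residual energy-balance idea behind operational ET models of this kind can be sketched generically. The formulation below is the textbook surface-energy-balance residual, not ETWatch's actual algorithm, and the latent-heat-of-vaporization constant is approximate.

```python
def latent_heat_flux(net_radiation, soil_heat_flux, sensible_heat_flux):
    """Latent heat flux (W/m^2) as the residual of the surface energy
    balance: LE = Rn - G - H."""
    return net_radiation - soil_heat_flux - sensible_heat_flux

def flux_to_et_mm_per_day(latent_heat_wm2, lambda_v=2.45e6):
    """Convert a daily-mean latent heat flux (W/m^2) to an ET depth in
    mm/day; lambda_v is an approximate latent heat of vaporization
    (J/kg), and 1 kg/m^2 of water equals a 1 mm depth."""
    seconds_per_day = 86400
    return latent_heat_wm2 * seconds_per_day / lambda_v
```

For example, a daily-mean net radiation of 400 W/m², soil heat flux of 50 W/m², and sensible heat flux of 150 W/m² leaves 200 W/m² for evaporation, which is the quantity the remote-sensing inputs listed above are used to constrain.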
This is the presentation I gave for my thesis defense. It's entitled "Using bioclimatic envelope modelling to incorporate spatial and temporal dynamics of climate change into conservation planning".
This document discusses coupling land surface models (LSM) with radiative transfer models (RTM) for assimilating microwave brightness temperature observations over India. It explains that the community microwave emission model is used as the forward operator to relate LSM states like soil moisture to brightness temperatures. An ensemble Kalman filter is used to update LSM states by combining forecasts with observations. Results show assimilating brightness temperatures improves Noah LSM soil moisture simulations compared to open-loop runs. Issues regarding biases and parameter estimation for the RTM are discussed.
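A minimal sketch of the ensemble Kalman filter update described here, with a generic forward operator standing in for the radiative transfer model that maps soil moisture to brightness temperature. The function names and the stochastic (perturbed-observation) variant are illustrative assumptions, not details from the document.

```python
import numpy as np

def enkf_update(ensemble, observation, obs_error_var, forward_operator):
    """Stochastic EnKF update for one scalar observation.
    ensemble: (n_members, n_state) array of forecast states.
    forward_operator: maps a state vector to a predicted observation."""
    n = ensemble.shape[0]
    hx = np.array([forward_operator(m) for m in ensemble])    # predicted obs
    x_anom = ensemble - ensemble.mean(axis=0)
    hx_anom = hx - hx.mean()
    pxy = x_anom.T @ hx_anom / (n - 1)                 # state-obs covariance
    pyy = hx_anom @ hx_anom / (n - 1) + obs_error_var  # innovation variance
    gain = pxy / pyy                                   # Kalman gain (vector)
    # perturb the observation once per member (stochastic EnKF)
    perturbed = observation + np.random.randn(n) * np.sqrt(obs_error_var)
    return ensemble + np.outer(perturbed - hx, gain)
```

With a small observation error, the analysis ensemble mean moves most of the way from the forecast toward the state implied by the observation, which is how assimilation pulls an open-loop run toward the data.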
The document discusses the impact of climate change on snowmelt runoff in the Tamakoshi River basin in Nepal. It summarizes that rising temperatures due to climate change are causing Himalayan glaciers to retreat faster than the global average. This study uses a snowmelt runoff model to simulate snowmelt and runoff in the Tamakoshi basin, finding that stream flow is increasing with higher snowmelt contributions from rising temperatures. The model accurately simulates observed discharge data. Climate change simulations show stream flow and winter flow increasing approximately 3% and 8% respectively for every 1 degree Celsius of warming from increased melting of snow and glaciers in the basin.
Development and Applications of Fire Danger Rating Systems in Southeast Asia (CIFOR-ICRAF)
This document discusses the development and application of fire danger rating systems (FDRS) in Southeast Asia. It describes the Canadian Forest Fire Danger Rating System (CFFDRS) and how it was adapted for use in Southeast Asia through a project from 1999-2004. The project involved technical adaptation of the FDRS including fuel modeling, calibration of indices, and mapping of fuel types for the region. It also describes how the FDRS can support early warning of fire risk, monitoring of fire danger levels, and mitigation activities through interpretation of the fire weather index. The goal is to integrate FDRS products with fire suppression planning to allow more anticipatory mobilization of resources and improved fire management.
Numerous studies have found an average increase in extreme precipitation for both the U.S. and Northern Hemisphere mid-latitude land areas, consistent with the expectations arising from the observed increase in greenhouse gas concentrations (now more than 40% above pre-industrial levels). However, there are important regional variations in these trends that are not fully explained. These trend studies are typically based on direct analyses of observational station data. Such analyses confront multiple challenges, such as incomplete data and uneven spatial coverage of stations. Central scientific questions related to this general finding are: Are there changes in weather system phenomenology that are contributing to this observed increase? What is the contribution of increases in atmospheric water vapor? There are also questions related to application of potential future changes in planning. Because of the rarity (by definition) of extreme events, trends are mostly found only when aggregating over space. When would we expect to see a signal at the local level? What are the uncertainties surrounding future changes and their potential incorporation into future design? Further development of statistical/mathematical methods, or innovative application of existing methods, is desirable to aid scientists in exploring these central scientific questions. This talk will describe characteristics of the observation record and the issues surrounding the above questions.
This document discusses creating an icing climatology using downscaling techniques from a weather modeling project. It examines modeling icing events and ice loads, and comparing modeled results to measurements which sometimes show large differences. Two approaches to the climatology are considered: statistically downscaling long-term low-resolution model runs, or modeling shorter representative periods at high resolution. Both have advantages and drawbacks regarding accuracy and representation of climatological conditions. More research is needed to determine the best approach.
In the first part of the talk, we present a sensitivity analysis of a novel sea ice model. neXtSIM is a continuous Lagrangian numerical model that uses an elasto-brittle rheology to simulate the ice response to external forces. The response of the model is evaluated in terms of simulated ice drift distances from the initial position and from the mean position of the ensemble. The simulated ice drift is decomposed into advective and diffusive parts, which are characterized separately in both space and time and compared with a free-drift model, i.e. one in which the ice rheology plays no role. Overall, the large-scale response of neXtSIM is correlated with the ice thickness and wind velocity fields, while the free-drift model response is mostly correlated with the wind velocity pattern alone. The seasonal variability of the model sensitivity shows the role of ice compactness and rheology at both local and Arctic scales: the ice drift simulated by neXtSIM in summer is close to free drift, while the more compact and solid winter ice pack shows significantly different mechanical and drift behavior. In contrast to the free-drift model, neXtSIM reproduces the Lagrangian sea ice diffusion regimes found in observed trajectories. The forecast capability of neXtSIM is also evaluated against a large set of real buoy trajectories; we find that neXtSIM performs better in simulating sea ice drift, both in terms of forecast error and as a tool to assist search-and-rescue operations. Adaptive meshes, such as the one used in neXtSIM, are used to model a wide variety of physical phenomena. Some of these models, in particular those of sea ice movement, use a remeshing process to remove and insert mesh points at various points in their evolution.
This represents a challenge in developing compatible data assimilation schemes, as the dimension of the state space we wish to estimate can change over time when these remeshings occur.
In the second part of the talk, we highlight the challenges that such a modeling framework represents for data assimilation setup. We then describe a remeshing scheme for an adaptive mesh in one dimension. The development of advanced data assimilation methods that are appropriate for such a moving and remeshed grid is presented. Finally we discuss the extension of these techniques to two-dimensional models, like neXtSIM.
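For intuition, a one-dimensional "remeshing" of a model state can be as simple as interpolating the state onto the new node set. The sketch below uses linear interpolation and is only a toy stand-in for the scheme described in the talk.

```python
import numpy as np

def remesh_1d(old_nodes, old_values, new_nodes):
    """Transfer a piecewise-linear model state from one 1-D mesh to
    another by linear interpolation. The new mesh may have a different
    number of nodes, which is exactly what changes the dimension of the
    state space and complicates data assimilation."""
    return np.interp(new_nodes, old_nodes, old_values)
```

Here the assimilation difficulty is visible already: a filter built for a fixed-size state vector has no natural covariance between a 3-node state and its 5-node remeshed counterpart.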
Climate downscaling aims to bridge the scale gap between global climate models (GCMs) and local decision-making needs. There are two main downscaling methods: statistical downscaling establishes empirical relationships between large-scale GCM outputs and local variables, while dynamical downscaling uses regional climate models nested within GCMs at higher resolution. Both methods make assumptions about stationary relationships between scales, and dynamical downscaling is more computationally expensive. Downscaling can provide added value like improved regional precipitation simulations, but choosing appropriate domains and bias-correction techniques is important. Statistical downscaling is presently more suitable than dynamical downscaling for seasonal forecasts.
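A minimal sketch of the statistical-downscaling idea: an empirical linear relationship fit over a training period and applied to GCM output. A single predictor is assumed here for brevity; real schemes use many predictors and more careful validation.

```python
import numpy as np

def fit_downscaling(gcm_predictor, station_obs):
    """Fit local = a * large_scale + b over a training period
    (the simplest form of an empirical transfer function)."""
    a, b = np.polyfit(gcm_predictor, station_obs, 1)
    return a, b

def apply_downscaling(a, b, gcm_series):
    """Apply the trained relationship to new GCM output. Note this
    assumes the cross-scale relationship is stationary in time, the
    key assumption flagged in the text."""
    return a * np.asarray(gcm_series) + b
```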
Master's course defense presentation in Water Resource Management and GIS (Tooryalay Ayoubi)
1) The document summarizes a master's thesis that used the SWAT hydrological model within a GIS to simulate surface runoff in the Panjshir watershed in Afghanistan.
2) Key results included monthly and daily surface runoff predictions that matched observed discharge data with R2 values of 0.815 for calibration and 0.817 for validation.
3) The study also found that land use changes between 1993-2010 increased total water yield in the watershed, with average annual changes ranging from 1.2-4.5% between scenarios.
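Skill scores like those reported above can be computed for any simulated/observed discharge pair in a few lines. The functions below are the standard definition of R2 (squared Pearson correlation) plus the Nash-Sutcliffe efficiency often reported alongside it; they are generic formulas, not code from the thesis.

```python
import numpy as np

def r_squared(observed, simulated):
    """Squared Pearson correlation, the R2 commonly reported for
    hydrological-model calibration and validation."""
    r = np.corrcoef(observed, simulated)[0, 1]
    return r ** 2

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
    simulation is no better than the observed mean."""
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

One caveat worth noting: a uniformly biased simulation can still score R2 = 1, which is why efficiency measures like NSE are usually reported alongside it.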
Physical processes in the Earth system are modeled with mathematical representations called parameterizations. This talk will describe some of the conceptual approaches and mathematics used to describe physical parameterizations, focusing on cloud parameterizations, including the path from physical laws to discrete representations in coarse-scale models. Clouds illustrate several complexities and techniques common to many physical parameterizations, including the problem of multiple scales and sub-grid-scale variability. Mathematical methods for treating the sub-grid scale will be discussed, as will inexactness and indeterminacy in both weather and climate, including the problems of indeterminate parameterizations and inexact initial conditions. Different mathematical methods, including stochastic methods, will be described and discussed, with examples from contemporary Earth system models.
Extinction of Millimeter Wave on Two Dimensional Slices of Foam-Covered Sea-s... (IJSRED)
This document discusses millimeter wave (mmW) extinction due to its interaction with layers of air bubbles (sea foam) on the ocean surface. It presents the following key points:
1) A numerical model using the split-step Fourier method was used to evaluate mmW attenuation through sea-foam layers of varying thickness, as a function of frequency, polarization, and incidence angle.
2) Estimates of the effective dielectric constant of sea foam layers were calculated for different foam configurations and WindSat frequencies based on a two-dimensional model of randomly packed air bubbles coated with thin seawater layers.
3) The parabolic wave equation method, which approximates solutions to Maxwell's equations, was discussed as an efficient way to model mmW propagation through the foam layers.
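The split-step Fourier scheme named in point 1 can be sketched for a scalar parabolic (paraxial) wave equation: alternate a diffraction step applied in the spatial-frequency domain with a refraction/absorption step applied in real space. The grid, wavenumber, and index values below are illustrative, not the paper's configuration; a complex refractive index produces the attenuation being estimated.

```python
import numpy as np

def split_step_fourier(u0, dx, dz, n_steps, k0, refractive_index):
    """March a scalar field through a medium by alternating a free-space
    diffraction step (spatial-frequency domain) with a refraction or
    absorption step (real space), i.e. a split-step solution of the
    parabolic wave equation. A complex index gives attenuation."""
    u = np.asarray(u0, dtype=complex)
    kx = 2.0 * np.pi * np.fft.fftfreq(u.size, d=dx)
    diffract = np.exp(-1j * kx ** 2 * dz / (2.0 * k0))         # paraxial propagator
    refract = np.exp(1j * k0 * (refractive_index - 1.0) * dz)  # medium phase/loss
    for _ in range(n_steps):
        u = np.fft.ifft(diffract * np.fft.fft(u))
        u = refract * u
    return u
```

With a purely real index the scheme conserves the L2 norm of the field; an imaginary index component removes energy step by step, which is the extinction effect the study quantifies.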
Over the past several years, the authors who create the U.S. Drought Monitor have gained access to an increasing amount of geospatial data to help with their analysis. This includes high resolution precipitation data from the Advanced Hydrologic Prediction Service, soil moisture estimates, vegetation indices, and station data from various networks. The authors now have access to this data in GIS format, allowing them to directly incorporate the information into their drought depiction process. While some needs still remain, such as more in-situ soil moisture observations, the availability of these datasets has greatly improved the authors' ability to accurately portray drought conditions in the U.S.
This problem represents an interesting opportunity for scientists and statisticians to collaborate since the problem is too big for either community. The science is not well established, although fairly sophisticated ice flow models exist. They are even becoming relevant to explain some of the complexity seen in observational data. At the same time, the complex phenomena we see in observations may not be particularly relevant to assessing the risks of significant increases in sea level rise over the near future. The talk will review what we have learned about this problem through the PISCEES SciDAC project. This problem is rich with challenges and opportunities, particularly for realigning how our two communities engage each other. The talk will review the computational, scientific, and mathematical "reality checks" that might stop any reasonable person from considering this topic further. I then will point out how each of these challenges could be mitigated if these different perspectives were better integrated.
This document summarizes a remote sensing study of the urban heat island effect in the St. Louis metropolitan area. The study used Landsat 7 satellite data to analyze land surface temperature, vegetation abundance, and land use patterns. Surface temperature maps and transects showed higher temperatures in urban areas compared to rural surroundings. Statistical analysis found significant positive correlations between temperature and developed land use, and significant negative correlations between temperature and vegetation. Land use types also had significantly different average temperatures based on pairwise t-tests.
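The statistics mentioned here, pixel-wise correlations and pairwise t-tests between land-use classes, are standard. Hedged stand-ins for them might look like the following (function names and sample values are illustrative, not from the study):

```python
import numpy as np

def pearson_r(x, y):
    """Pixel-wise Pearson correlation, e.g. between land surface
    temperature and a vegetation index such as NDVI."""
    return np.corrcoef(x, y)[0, 1]

def welch_t(a, b):
    """Welch's t statistic for comparing mean temperatures of two
    land-use classes with possibly unequal variances."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    se2 = a.var(ddof=1) / a.size + b.var(ddof=1) / b.size
    return (a.mean() - b.mean()) / np.sqrt(se2)
```

A negative `pearson_r` between temperature and vegetation, and a large `welch_t` between developed and vegetated classes, would be the quantitative form of the findings summarized above.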
This document analyzes the visual elements of a rhythm game. It identifies key elements like choosing a song and difficulty level on the front page before playing. During gameplay, the player matches gestures like taps and holds to targets that move across the screen. The player's avatar and performance are represented through visual effects. Scores and a battery-like icon provide feedback on the player's response and progress. Overall, the game uses an abstract, futuristic graphic style with planetary and stellar wireframes to set a sci-fi themed universe.
This document summarizes IBM MQ basics, including messaging, queuing, asynchronous and synchronous messaging, message segmentation, message types, persistent vs. non-persistent messages, message descriptors, queue managers, queues, process definitions, channels, and more. Key points: MQ uses queuing to enable program-to-program communication by sending and receiving messages without a direct connection between the programs. Messages have a data part and a descriptor part, and can be persistent or non-persistent. Queue managers connect applications to queues via channels and provide the message queuing interface.
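The asynchronous, decoupled messaging pattern described here can be illustrated with Python's standard-library queue. This is an analogy only, not the IBM MQ API: a producer puts messages, each with a descriptor part and a data part, on a queue that a consumer drains independently.

```python
import queue
import threading

def make_message(msg_id, persistent, body):
    """A message with a descriptor (metadata) part and a data part,
    mirroring the descriptor/payload split described above."""
    return {"descriptor": {"id": msg_id, "persistent": persistent},
            "data": body}

def producer(q, n):
    # The sender never talks to the receiver directly; it only puts
    # messages on the queue and moves on (asynchronous messaging).
    for i in range(n):
        q.put(make_message(i, persistent=True, body=f"payload-{i}"))

def consumer(q, n, out):
    for _ in range(n):
        msg = q.get()        # blocks until a message is available
        out.append(msg["data"])
        q.task_done()
```

Run the producer and consumer in separate threads and the decoupling is visible: neither side knows whether the other is currently running, only that the queue holds the messages in between.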
This document analyzes the visual elements of a rhythm game. It identifies key elements like choosing a song and difficulty level on the front page before playing. During gameplay, the player matches gestures like taps and holds to targets that move across the screen. The player's avatar and performance are represented through visual effects. Scores and a battery-like icon provide feedback on the player's response and progress. Overall, the game uses an abstract, futuristic graphic style with planetary and stellar wireframes to set a sci-fi themed universe.
This document summarizes IBM MQ basics including messaging, queuing, asynchronous and synchronous messaging, message segmentation, message types, persistent vs non-persistent messages, message descriptors, MQ managers, queues, process definitions, channels, and more. Key points are that MQ uses queuing to enable program-to-program communication by sending and receiving messages without a direct connection. Messages have data and descriptor parts, and can be persistent or non-persistent. MQ managers connect applications to queues using channels and provide message queuing interfaces.
The document outlines an approach called "One-Minute-Leadership" which focuses on spending brief but impactful moments on important leadership and management tasks. These include setting goals with employees in under a minute, providing one-minute coaching, praising, and reprimanding. The approach emphasizes catching employees doing things right, addressing problems immediately but also reassuring employees of their value. The goal is to develop happy and high-performing people and organizations through these small but focused investments of a leader or manager's time.
This document provides an overview of facts about the state of Texas. It discusses the history and founding of Texas, key geographical details like location and size. It also outlines important facts about the state capital of Austin, the state flag, climate, population, and major industries. Finally, it shares information about top tourist attractions in Texas, including the University of Texas at Austin and famous sports teams like the Dallas Cowboys, as well as notable people from Texas.
This document provides information about Wiley Publishing and their online book and reference work offerings. It discusses Wiley's history and publishing portfolios. It describes the different formats of online books and reference works available on Wiley Online Library. It provides pricing details and purchase options for online books, reference works, and subject collections. It highlights several major reference works and describes Wiley's other online resources like Current Protocols, The Cochrane Library, and ELS.
This document summarizes critical responses to Joseph Conrad's novella Heart of Darkness from three time periods: 1) The early responses when it was published in 1902 which praised it as a literary achievement but ignored its treatment of race. 2) Chinua Achebe's 1977 critique that condemned the novella as racist. 3) Edward Said's 1993 defense of Conrad, contextualizing him as a product of his colonial era. It argues the critiques were influenced by their historical contexts, with early readers less sensitive to race issues and Achebe/Said reflecting post-colonial perspectives.
eLibrary USA is a collection of 30 databases containing full-text journal articles, ebooks, essays, encyclopedias, reports, and documentary films on topics like learning English, business, current issues, and health. Users can access the databases through the URL http://elibraryusa.state.gov using their email address as their username. The system will assign a password and the databases can be accessed from any internet connection at any time.
Mise en scene refers to elements that establish the time and location of a film scene. Key elements include the setting, costumes, props, makeup, lighting, and positioning of actors. Together these elements provide context and help tell the story through visual means rather than dialogue. For example, westerns are typically set in the American Midwest to reflect the historical setting of cowboys, while sci-fi films may be set in space to feature other worlds or space travel. Costumes also help indicate the time period of a scene or character traits, and lighting, body language, and positioning can convey mood or a character's role.
North Dakota is known as the "Mother Land of all Farmers" and has a predominantly rural landscape with 90% of its land being used for farming. Some of the major industries include agriculture, mining, oil production, and manufacturing food products. North Dakota has 17 state parks and is home to events like craft shows, festivals, and marathons. The state symbols include the American elm tree, western meadowlark bird, and rough rider nickname.
The document provides background information on various British youth subcultures from the 1950s and 1960s to help prepare for an assessment on representations of youth. It discusses the rise of teenage culture following World War 2 and the Teddy Boys subculture of the 1950s. It then covers the mods and rockers, two conflicting subcultures of the 1960s, and how media coverage of fights between them led to a moral panic. Theories that could be applied are also mentioned, such as Cohen's theory of moral panic and Hebdige's theory of subcultures. Students are instructed to compare representations of youth from the 1960s and contemporary society, drawing on examples from music, film, and news articles.
The document summarizes several theories about media representations of youth:
- Giroux and Acland argue that media representations are created by adults without understanding youth reality, portraying them in stereotypical ways to control behavior.
- Hebdige says media limits representations of youth to "trouble" or "fun" without showing diversity.
- Cohen describes how media can create "moral panics" by portraying youth as "folk devils" when they challenge social norms.
- Gramsci discusses how media upholds norms of the middle class as the standard.
- Philo and Gerbner examine how media links negative stereotypes to social class and cultivates perceptions that youth are
The document provides algorithms and flowcharts for:
1) Finding the average of two numbers by inputting the numbers, adding them, and dividing the sum by 2.
2) Summing the even numbers between 1 and 20 by initializing a sum and counter, adding the counter if it is even, and incrementing the counter until it reaches 20.
3) Finding the sum of the first 50 natural numbers by initializing a sum and counter, adding the counter to the sum, and repeating until the sum reaches 50.
Drought monitoring, Precipitation statistics, and water balance with freely a...AngelosAlamanos
The aim of this study is to showcase and discuss these new technologies for hydrometeorological studies. Six of NASA’s web-repositories that can be used to freely download and
visualise such spatial and/or time-series factors are listed and explained with examples for Ireland: ways
to access hydrological, meteorological, soil, vegetation and socio-economic data are shown, and
estimations of various precipitations statistics, anomalies, and water balance are presented for monthly
and seasonal analyses. The advantages, disadvantages and limitations of the satellite datasets are
discussed to provide useful recommendations about their proper use, based on purpose, scale, precision,
time requirement, and modelling-expansion criteria.
Andy Jarvis PARASID Near Real Time Monitoring Of Habitat Change Using A Neur...CIAT
Brown bag presentation for TNC in Washington 24th September 2009 on the PARASID habitat monitoring tool. Authored by Andy Jarvis, Louis Reymondin and Jerry Touval.
The document discusses the goals and activities of the Year of Polar Prediction (YOPP) in improving polar prediction models through enhanced observational data from field sites. It describes YOPP's efforts to standardize data collection and model output across sites to facilitate direct comparisons between observations and multiple models. This includes developing common file formats, defining essential climate variables to be collected, and making both observation and model output available through a central data portal. The goals are to evaluate model performance against observations to identify areas for model improvement and advance polar prediction capabilities.
Andy Jarvis - Parasid Near Real Time Monitoring Of Habitat Change Using A Neu...CIAT
Presentation on the PARASID tool, a habitat monitoring system for Latin America, developed jointly by TNC and CIAT. Presented in a meeting with IDEAM, Bogota, Colombia on 19th September 2009.
1) The document outlines a method for documenting longitudinal site histories of land use and land management practices and scoring the responses of native vegetation communities over time.
2) A case study applying this method examines the transformation of open grassy woodlands in New South Wales through compiling historical land use and disturbance records and scoring changes in vegetation structure, composition, and regenerative capacity.
3) The method aims to provide insights on landscape transformation over decades and centuries that can inform future land use and conservation decisions by making visible the impacts of past and present land management practices.
Urban Landuse/ Landcover change analysis using Remote Sensing and GISHarshvardhan Vashistha
This document provides an overview of land cover and land use change detection. It discusses techniques for detecting changes through analyzing satellite images over time. Methods include visual interpretation, image rationing, classification, and indices. Factors to consider include the objective, type of change to detect, and type of changes of interest, like land use, forest, or urban changes. Applications include monitoring land use change, deforestation, fires, wetlands, urban growth, and environmental changes. Proper selection of methods and data depends on the scale and specifics of the changes being examined.
18 06 entso-e ieee pan european system adequacyLaurent Schmitt
The next Clean Energy transition requires to revisit the European System Adequacy approach for a proper coordination of the needed generation transition. The enclosed IEEE presentation provides an overview of the associated principles as coordinated through ENTSO-E.
Andy Jarvis Parasid Near Real Time Monitoring Of Habitat Change Using A Neura...CIAT
Presentation for the TNC Science Cabinet on the PARASID habitat monitoring tool, authored by Andy Jarvis and Louis Reymondin of CIAT and Jerry Touval of TNC. Presented on the 25th September 2009.
This document discusses climate change projections and their role in developing adaptation pathways. It notes that the IPCC provides the scientific basis for climate policies and that climate models at global and regional scales can provide climate change information and projections. It emphasizes that adaptation strategies should consider both current climate variability and potential future climate changes, as the risks may evolve over time. It also highlights lessons from Southeast Asia including the need for coordinated regional guidance, engagement with users, and probabilistic projections of climate extremes.
6.1.1 Methodologies for climate rational for adaptation- CC ProjectionsNAP Events
This document discusses climate change projections and their role in developing adaptation pathways. It notes that the IPCC provides the scientific basis for climate policies and that climate models at global and regional scales can provide climate change information and projections. It emphasizes that adaptation strategies should consider both current climate variability and potential future climate changes, as the risks may evolve over time. Finally, it discusses lessons learned from climate projection efforts in Southeast Asia, including the need for regional coordination, guidance on best practices, and tools to access and analyze climate model output.
The document describes a term project applying GIS techniques to study glacial retreat in the Buni Zum area of Chitral, Pakistan. A group of 5 students will use Landsat imagery from 2006 and 2013 to identify changes in snow cover, vegetation cover, and water in the study area over this period. They will use various image processing techniques like thresholding, classification, band ratios, and indices to analyze changes. Preliminary results found around a 6% decrease in glacier area, indicating rapid glacial depletion. The results were overlaid on a DEM contour map to analyze melting patterns across elevation ranges.
- AERO is a wind erosion modeling framework that simulates horizontal and vertical mass flux of windblown particles on the plot scale based on inputs of meteorological, soil, and vegetation data.
- It was developed to provide land managers with a decision support tool and researchers with a platform to study aeolian processes across different land cover types.
- AERO can leverage data from existing US monitoring programs to assess wind erosion impacts of management actions, land conditions, drought, and other disturbances at various spatial scales.
eMAST aims to integrate data from TERN and other sources to model ecosystems at all scales in Australia from 2013-2015. This will be done using data assimilation, model evaluation and optimization tools to further ecosystem science and help address questions about topics like carbon, water, climate change, fire, and biodiversity. Key products being delivered include high resolution climate and productivity datasets as well as tools for data analysis, interpolation and modeling. Progress includes the development and delivery of ANUClimate climate datasets and the ePiSaT model for estimating primary productivity across Australia using flux tower and satellite data.
Storm Prediction data analysis using R/SASGautam Sawant
• Performed data cleaning and analysis using R, SAS to predict financial loss caused due to storms also predict when a storm will occur depending upon previous storm data
• Implemented algorithms like Logistic Regression, Multiple Regression, Linear Discriminant Analysis, PCA to obtain insights from the Storm Dataset from 1950-2007
This document provides information on estimating mean areal rainfall using different techniques like the arithmetic method, Thiessen polygon method, and isohyetal method. It also discusses frequency analysis of point rainfall, including plotting position formulae and developing intensity-duration-frequency curves. Finally, it covers hydrological abstractions like interception loss and depression storage. Interception loss depends on factors like canopy storage capacity, rainfall duration and intensity, while depression storage varies based on land use and storm characteristics.
This document describes a study that developed a method for mapping annual land cover and land use (LULC) in the Dry Chaco ecoregion of South America using MODIS satellite imagery. Reference data was collected by visually interpreting high-resolution QuickBird imagery in Google Earth at random sample points. LULC was classified into 8 classes using predictor variables derived from MODIS time series data and a Random Forests classifier. Annual LULC maps from 2001 to 2007 were produced at 250m resolution and assessed for accuracy. The maps showed rapid deforestation related to expansion of soybean and pasture agriculture.
This study assessed the impacts of climate change and land use change on watershed hydrology using the SWAT model. The MIROC3.2 climate model and CLUE-s land use change model were used to project future conditions. SWAT was calibrated and validated, then run using projected increases in temperature and variability in precipitation as well as predicted decreases in forest/agriculture and increases in urban/grassland. The results indicated higher evapotranspiration and surface runoff by 2080, as well as increased groundwater recharge and streamflow, affecting dam and reservoir management.
This document summarizes research assessing ecosystem services over large areas in Scotland. Satellite data was integrated with other data to model various ecosystem services indicators at high spatial resolution, including net primary productivity, crop production, livestock density, water services, nutrient retention, and biodiversity. Hotspots with high levels of multiple ecosystem services were identified. Tools were developed to help stakeholders explore tradeoffs and advise on sustainable land management and land use change options based on ecosystem services priorities and spatial context. Limitations included lack of data on biodiversity's role and need for frequent monitoring to assess change over time.
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODELgerogepatton
As digital technology becomes more deeply embedded in power systems, protecting the communication
networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3)
represents a multi-tiered application layer protocol extensively utilized in Supervisory Control and Data
Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control functionalities.
Robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation because
of the interconnection of these networks, which makes them vulnerable to a variety of cyberattacks. To
solve this issue, this paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion
detection in smart grids. The proposed approach is a combination of the Convolutional Neural Network
(CNN) and the Long-Short-Term Memory algorithms (LSTM). We employed a recent intrusion detection
dataset (DNP3), which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, to
train and test our model. The results of our experiments show that our CNN-LSTM method is much better
at finding smart grid intrusions than other deep learning algorithms used for classification. In addition,
our proposed approach improves accuracy, precision, recall, and F1 score, achieving a high detection
accuracy rate of 99.50%.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon
reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been
referred to as the "New Great Game." This research centres on the power struggle, considering
geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil
politics, and conventional and nontraditional security are all explored and explained by the researcher.
Using Mackinder's Heartland, Spykman Rimland, and Hegemonic Stability theories, examines China's role
in Central Asia. This study adheres to the empirical epistemological method and has taken care of
objectivity. This study analyze primary and secondary research documents critically to elaborate role of
china’s geo economic outreach in central Asian countries and its future prospect. China is thriving in trade,
pipeline politics, and winning states, according to this study, thanks to important instruments like the
Shanghai Cooperation Organisation and the Belt and Road Economic Initiative. According to this study,
China is seeing significant success in commerce, pipeline politics, and gaining influence on other
governments. This success may be attributed to the effective utilisation of key tools such as the Shanghai
Cooperation Organisation and the Belt and Road Economic Initiative.
Low power architecture of logic gates using adiabatic techniquesnooriasukmaningtyas
The growing significance of portable systems to limit power consumption in ultra-large-scale-integration chips of very high density, has recently led to rapid and inventive progresses in low-power design. The most effective technique is adiabatic logic circuit design in energy-efficient hardware. This paper presents two adiabatic approaches for the design of low power circuits, modified positive feedback adiabatic logic (modified PFAL) and the other is direct current diode based positive feedback adiabatic logic (DC-DB PFAL). Logic gates are the preliminary components in any digital circuit design. By improving the performance of basic gates, one can improvise the whole system performance. In this paper proposed circuit design of the low power architecture of OR/NOR, AND/NAND, and XOR/XNOR gates are presented using the said approaches and their results are analyzed for powerdissipation, delay, power-delay-product and rise time and compared with the other adiabatic techniques along with the conventional complementary metal oxide semiconductor (CMOS) designs reported in the literature. It has been found that the designs with DC-DB PFAL technique outperform with the percentage improvement of 65% for NOR gate and 7% for NAND gate and 34% for XNOR gate over the modified PFAL techniques at 10 MHz respectively.
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesChristina Lin
Traditionally, dealing with real-time data pipelines has involved significant overhead, even for straightforward tasks like data transformation or masking. However, in this talk, we’ll venture into the dynamic realm of WebAssembly (WASM) and discover how it can revolutionize the creation of stateless streaming pipelines within a Kafka (Redpanda) broker. These pipelines are adept at managing low-latency, high-data-volume scenarios.
International Conference on NLP, Artificial Intelligence, Machine Learning an...gerogepatton
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, Artificial Intelligence, Machine Learning, and their applications. The conference seeks substantial contributions across all key domains of NLP, Artificial Intelligence, Machine Learning, and their practical applications, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
We have compiled the most important slides from each speaker's presentation. This year’s compilation, available for free, captures the key insights and contributions shared during the DfMAy 2024 conference.
A SYSTEMATIC RISK ASSESSMENT APPROACH FOR SECURING THE SMART IRRIGATION SYSTEMSIJNSA Journal
The smart irrigation system represents an innovative approach to optimize water usage in agricultural and landscaping practices. The integration of cutting-edge technologies, including sensors, actuators, and data analysis, empowers this system to provide accurate monitoring and control of irrigation processes by leveraging real-time environmental conditions. The main objective of a smart irrigation system is to optimize water efficiency, minimize expenses, and foster the adoption of sustainable water management methods. This paper conducts a systematic risk assessment by exploring the key components/assets and their functionalities in the smart irrigation system. The crucial role of sensors in gathering data on soil moisture, weather patterns, and plant well-being is emphasized in this system. These sensors enable intelligent decision-making in irrigation scheduling and water distribution, leading to enhanced water efficiency and sustainable water management practices. Actuators enable automated control of irrigation devices, ensuring precise and targeted water delivery to plants. Additionally, the paper addresses the potential threat and vulnerabilities associated with smart irrigation systems. It discusses limitations of the system, such as power constraints and computational capabilities, and calculates the potential security risks. The paper suggests possible risk treatment methods for effective secure system operation. In conclusion, the paper emphasizes the significant benefits of implementing smart irrigation systems, including improved water conservation, increased crop yield, and reduced environmental impact. Additionally, based on the security analysis conducted, the paper recommends the implementation of countermeasures and security approaches to address vulnerabilities and ensure the integrity and reliability of the system. 
By incorporating these measures, smart irrigation technology can revolutionize water management practices in agriculture, promoting sustainability, resource efficiency, and safeguarding against potential security threats.
Introduction- e - waste – definition - sources of e-waste– hazardous substances in e-waste - effects of e-waste on environment and human health- need for e-waste management– e-waste handling rules - waste minimization techniques for managing e-waste – recycling of e-waste - disposal treatment methods of e- waste – mechanism of extraction of precious metal from leaching solution-global Scenario of E-waste – E-waste in India- case studies.
Advanced control scheme of doubly fed induction generator for wind turbine us...IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
3. What is Meteorology and Oceanography?
◦ The study of spatial and temporal variations of atmospheric, oceanographic, and land parameters over long time periods.
◦ It helps in the prediction of disasters, preventing loss of life and property.
What is data mining?
◦ The process of extracting implicit, previously unknown, and potentially useful information from huge amounts of data.
4. Techniques and their applications:
◦ Anomaly detection: detection of land cover change; outlier values of precipitation.
◦ Association rule mining: finding associations between oceanographic parameters and cyclone intensification.
◦ Pattern mining: understanding natural events. For example, eddies sustain energy for weeks or months and are therefore manifested as connected groups of gradually increasing or decreasing time series.
◦ Classification: detection of the water fraction per flood pixel.
◦ Regression: detection of the forest cover per pixel.
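The anomaly-detection row above can be illustrated with a minimal sketch: flagging outlier precipitation values by their z-score. The data, threshold, and statistic are illustrative choices, not taken from the talk.

```python
import numpy as np

def precipitation_outliers(series, z_thresh=3.0):
    """Flag values whose deviation from the series mean exceeds z_thresh
    standard deviations -- a simple anomaly-detection baseline."""
    series = np.asarray(series, dtype=float)
    sigma = series.std()
    if sigma == 0:
        return np.zeros(series.shape, dtype=bool)
    z = (series - series.mean()) / sigma
    return np.abs(z) > z_thresh

# Twelve monthly precipitation totals (mm) with one extreme month
rain = [80, 75, 90, 85, 78, 82, 400, 88, 79, 81, 84, 77]
print(np.where(precipitation_outliers(rain, z_thresh=2.5))[0])  # -> [6]
```

Real precipitation outlier detection would account for seasonality before scoring deviations; this sketch only shows the basic idea.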
5. Eddies are swirls of ocean currents.
◦ They play a significant role in the transport of water, heat, salt, and nutrients.
◦ (Figure: the green swirl is an ocean eddy.)
◦ Gradually decreasing segments of the time series enclosed between the red and green lines are signatures of an eddy.
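The eddy signature described above can be sketched as a scan for sufficiently long monotonically decreasing runs in a sea-surface-height series. The run-length threshold and the sample data are invented for illustration; operational eddy tracking is far more involved.

```python
def decreasing_runs(ssh, min_len=4):
    """Return (start, end) index pairs of runs where the sea-surface-height
    series decreases monotonically for at least min_len points."""
    runs, start = [], 0
    for i in range(1, len(ssh)):
        if ssh[i] >= ssh[i - 1]:          # monotone decrease broken here
            if i - start >= min_len:
                runs.append((start, i - 1))
            start = i
    if len(ssh) - start >= min_len:       # run reaching the end of the series
        runs.append((start, len(ssh) - 1))
    return runs

# Synthetic SSH samples: a gradual decline between indices 1 and 6
ssh = [0.20, 0.30, 0.28, 0.25, 0.21, 0.15, 0.10, 0.12, 0.14]
print(decreasing_runs(ssh))  # -> [(1, 6)]
```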
6. This is challenging for the following reasons:
◦ Not concrete objects: spatio-temporal phenomena are not concrete objects but patterns evolving over space and time, whereas objects in traditional data mining are concrete, i.e. either present or absent.
  Transactions: an item is either present or absent (0 or 1).
  Hurricanes: continuous, gradual evolution; a hurricane does not simply appear and disappear.
◦ Uncertainty: caused by measurement biases; some values may be missing due to cloud cover.
◦ Diversity: heterogeneity in space and time; data may come from different sources at different spatial and temporal resolutions.
◦ Variability: values captured for the same location at short time intervals may differ due to local climatic variations.
7. Data retrieved from remote sensing satellites comes as data products in different data formats.
The standard format for most data products is HDF; other formats include NetCDF and KML.
One data product contains data related to one parameter.
Authenticated users can download Indian satellite data from mosdac.gov.in, the MOSDAC website of ISRO.
MOSDAC disseminates data for around 20 parameters, including:
o Normalized Difference Vegetation Index (NDVI)
o Land Surface Temperature (LST)
o Aerosol Optical Depth (AOD)
o Cloud Liquid Water (CLW)
o Mean Sea Surface (MSS)
8. HDF is a container for storing a variety of scientific data. It is composed of two primary types of objects:
◦ Groups: a grouping structure containing one or more HDF objects, together with supporting metadata.
◦ Datasets: a multidimensional array of data elements, together with supporting metadata.
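The group/dataset structure can be seen in a short sketch using the third-party `h5py` package (the file name, group name, dataset name, and attribute below are hypothetical examples, not from MOSDAC):

```python
import h5py
import numpy as np

# Write one group holding one dataset plus an attribute, then read it back.
with h5py.File("demo.h5", "w") as f:
    grp = f.create_group("ndvi")                          # group
    dset = grp.create_dataset("monthly_2006",
                              data=np.zeros((12, 2, 2)))  # dataset: 12 months on a 2x2 grid
    dset.attrs["units"] = "dimensionless"                 # supporting metadata

with h5py.File("demo.h5", "r") as f:
    print(f["ndvi/monthly_2006"].shape)   # -> (12, 2, 2)
    print(f["ndvi/monthly_2006"].attrs["units"])
```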
12. Anomaly detection
◦ Land cover change detection
◦ Outlier precipitation detection
◦ Outlier time interval detection
Classification
◦ Detection of the water fraction per flood pixel
Regression
◦ Detection of the forest cover per pixel
13. Aim:
◦ To find locations that undergo a significant and sudden change during a particular time period.
◦ The time at which the change occurs is also determined.
Importance:
◦ Helps in mapping damage following natural disasters such as fires, droughts, and floods.
14. Land cover change detection methods:
◦ Bitemporal methods
◦ Time series data mining techniques
(Red in the original diagram marks the techniques focused on here.)
15. Bitemporal methods:
◦ Image differencing
◦ Image ratioing
◦ Principal component analysis
◦ Change vector analysis
16. Time series data mining techniques:
◦ Predictive model based
 - Yearly Delta algorithm
 - Variability Distribution algorithm
 - Vegetation Independent Variability Distribution algorithm
◦ Segmentation based
 - Top down approach
 - Bottom up approach
 - Recursive merging algorithm
17. Bitemporal methods vs. time series data mining methods:
◦ Bitemporal methods compare two time instants; time series methods analyze the vegetation time series at each location and identify changes within it.
◦ Bitemporal methods do not provide the time of change; time series methods do.
◦ Bitemporal methods have low computational complexity; time series methods have high complexity, since a long time series must be analyzed.
Segmentation based vs. predictive model based approaches:
◦ Segmentation based: the time series is partitioned into homogeneous segments, and the boundaries between segments may be change points.
◦ Predictive model based: a model is constructed from part of the time series and used to predict future time points; time points that differ sufficiently from the prediction are considered change points.
19. Time series data mining methods
Segmentation based
◦ Recursive merging algorithm
Predictive model based
◦ Yearly Delta algorithm
◦ Variability distribution algorithm
◦ Vegetation independent variability distribution algorithm
20. Input : Monthly composited EVI (Enhanced Vegetation Index) dataset for
the state of California for years 2000-2006.
Output : Detection of land cover changes
◦ Forest fires
◦ Conversion to farming
◦ Construction or logging
21. Algorithm: The pixel time series is analyzed as follows:
1. Let {b1, b2, ..., bn} be the list of annual EVI sums, where each bi is the sum of the vegetation index values over all months of year i.
2. The two consecutive segments with the most similar annual EVI sums are merged.
 • Suppose b1 and b2 are the most similar EVI sums; at the end of this step the list will be {(b1+b2)/2, b3, ..., bn}, with one element less.
 • Merge cost s1 = dist(b1, b2)
3. Step 2 is applied recursively until the list contains one element.
4. The list of merge costs will be s1, s2, ..., sn-1.
5. The change score for a location (pixel) is the ratio of the largest to the smallest merge cost:

  change score = max_{i=1..n-1} s_i / min_{i=1..n-1} s_i

6. Pixels are ranked on the basis of their change score, and some number of top-ranked pixels are considered changes.
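The steps above can be sketched in pure Python. The absolute difference as the merge cost, the tie-breaking order, and the sample values are illustrative assumptions; the max/min cost ratio follows the slide's change-score formula.

```python
# Sketch of the recursive merging change score for one pixel.
# Merge cost = |b_i - b_{i+1}| (an assumption; the slides only say dist(b1, b2)).

def change_score(annual_evi_sums):
    """Return max(merge cost) / min(merge cost) for one pixel's time series."""
    segments = list(annual_evi_sums)
    costs = []
    while len(segments) > 1:
        # Find the pair of consecutive segments with the most similar sums.
        diffs = [abs(segments[i] - segments[i + 1]) for i in range(len(segments) - 1)]
        i = diffs.index(min(diffs))
        costs.append(diffs[i])                        # record merge cost s_i
        merged = (segments[i] + segments[i + 1]) / 2  # replace the pair by its mean
        segments[i:i + 2] = [merged]
    return max(costs) / min(costs) if min(costs) > 0 else float("inf")

# A series with one abrupt drop scores much higher than a stable one.
stable = [5.0, 5.1, 4.9, 5.0, 5.1]
burned = [5.0, 5.1, 1.0, 1.1, 1.0]
print(change_score(stable), change_score(burned))
```

The one abrupt drop in `burned` produces a single large merge cost, so its max/min ratio dominates the stable series.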
23. The change score is calculated so as to take the type of vegetation into account:
◦ a very small change can be a change point for stable forests
◦ a large change may not be a change point for high-variability regions such as grasslands
This helps reduce false positives.
Limitations:
◦ The minimum merge cost is taken as the variability value due to local climatic changes.
◦ But the minimum cost may occur very rarely and have been captured by chance.
24. Time series data mining methods
Segmentation based
◦ Recursive merging algorithm
Predictive model based
◦ Yearly Delta algorithm
◦ Variability distribution algorithm
◦ Vegetation independent variability distribution algorithm
25. Input: MODIS EVI data for California and Yukon
◦ Data for California is at 250m spatial resolution for years 2006-
2008.
◦ Data for Yukon is at 1km spatial resolution for years 2004-2008.
◦ Time series for each pixel is analyzed independently
Output:
◦ Land cover change locations (pixels)
◦ Time at which change occurred
Validation: High-quality fire data generated from an independent source is used for validation.
26. Algorithm:
◦ The previous year is taken as the model.
◦ A change score is assigned to each time step as the difference between the mean annual EVI of the current year and that of the previous year:

  change score(i) = annual EVI(year i) - annual EVI(year i-1)

◦ The maximum change score across all time steps is the YD score for a location:

  YD score = max_{i=1..n-1} change score(i)

◦ Top-ranked pixels according to YD score are called change points.
Limitations:
◦ Does not make use of information about natural variation in EVI.
◦ Only the single largest change of a time series is considered, yet one time series may undergo multiple changes during a given period.
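The YD score reduces to a one-line computation. The slides do not state a sign convention, so this sketch uses the absolute year-to-year difference; the sample values are illustrative.

```python
# Minimal sketch of the Yearly Delta (YD) score defined above.

def yd_score(annual_evi):
    """annual_evi: list of mean annual EVI values, one per year."""
    return max(abs(annual_evi[i] - annual_evi[i - 1]) for i in range(1, len(annual_evi)))

# An abrupt drop (e.g. a fire between years 2 and 3) dominates the score.
print(yd_score([5.0, 5.0, 2.0, 2.0]))  # 3.0
```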
28. (Example time series.) A change occurs in year 2005 due to natural variations: although the difference in annual EVI is high, it is not very high when compared with the mean variability score.
29. Time series data mining methods
Segmentation based
◦ Recursive merging algorithm
Predictive model based
◦ Yearly Delta algorithm
◦ Variability distribution algorithm
◦ Vegetation independent variability distribution algorithm
30. Algorithm:
◦ Each annual segment in the first k years is in turn considered a model, and the remaining k-1 annual values are treated as the observed values.
◦ The mean Manhattan distance is computed for each of the k model years, giving a distribution of variability scores for that location.
◦ A modified score, the VD score, is used:

  VD score = YD score - μ

where μ is the mean of the variability distribution, estimated using the maximum likelihood estimation method.
Special features:
◦ Makes use of information about natural variation in EVI.
◦ Any year whose annual EVI deviates significantly from the mean annual EVI of the k years should be discarded.
Limitation:
◦ Some vegetation types, such as open shrubs, have large variation in the spread of annual variability.
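A hedged sketch of the VD score follows: the YD score minus the mean μ of a per-pixel variability distribution built from the first k annual values. The exact model-year/observation pairing used by Mithal et al. is simplified here to pairwise Manhattan distances among the first k years.

```python
# Sketch of the Variability Distribution (VD) score; pairing scheme is an
# assumption, not the paper's exact construction.

def vd_score(annual_evi, k=3):
    yd = max(abs(annual_evi[i] - annual_evi[i - 1]) for i in range(1, len(annual_evi)))
    base = annual_evi[:k]
    variability = [abs(base[i] - base[j]) for i in range(k) for j in range(i + 1, k)]
    mu = sum(variability) / len(variability)   # mean of the variability distribution
    return yd - mu

# The natural year-to-year noise in the first k years discounts the raw YD score.
print(vd_score([5.0, 4.0, 5.0, 1.0]))
```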
31. Scatter plot of mean variability against YD score for forest cover (courtesy: Mithal et al. [6]). Lines of constant YD score and constant VD score separate the change points. Since only one vegetation type (forests) is considered, YD also performs well here.
32. Scatter plot of mean variability against YD score for savannas (courtesy: Mithal et al. [6]). Savannas consist of trees, shrubs, grasses, etc. Different vegetation types have different threshold change scores for a change to be considered real; therefore VD performs better than the YD algorithm.
33. Scatter plot of mean variability against YD score for shrublands (courtesy: Mithal et al. [6]). Open shrublands show a different spread of variability at different locations even though the vegetation type is the same, so both YD and VD produce many false positives.
34. Time series data mining methods
Segmentation based
◦ Recursive Merging Algorithm
Predictive model based
◦ Yearly Delta Algorithm
◦ Variability Distribution Algorithm
◦ Vegetation Independent Variability Distribution Algorithm
35. Algorithm:
◦ The mean and standard deviation of the variability score distribution are estimated as maximum likelihood estimates of the distribution.
◦ A new score, the VID score, is used:

  VID score = (YD score - μ) / σ

Salient features:
◦ Takes into account the spread of the variability score distribution and therefore reduces the false alarm rate.
◦ A high VID score implies a lower false positive rate, and vice versa.
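The VID score extends the VD sketch by normalizing with σ. This uses the same simplified variability distribution as before; σ is the population (1/n) standard deviation, matching the maximum likelihood estimate discussed on the following slides.

```python
import statistics

# Sketch of the Vegetation Independent variability Distribution (VID) score.
# The variability-distribution construction is a simplified assumption.

def vid_score(annual_evi, k=3):
    yd = max(abs(annual_evi[i] - annual_evi[i - 1]) for i in range(1, len(annual_evi)))
    base = annual_evi[:k]
    variability = [abs(base[i] - base[j]) for i in range(k) for j in range(i + 1, k)]
    mu = statistics.mean(variability)
    sigma = statistics.pstdev(variability)   # 1/n form, i.e. the MLE of sigma
    return (yd - mu) / sigma if sigma > 0 else float("inf")

print(vid_score([5.0, 4.0, 5.0, 1.0]))
```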
36. (Figure: variability score curves for pixels 1 and 2 plotted against mean annual EVI, with the regions of the variability score that indicate change marked for each pixel.) Both pixels correspond to the shrub vegetation type, whose spread of variability scores varies from location to location and from time to time.
37. Maximum likelihood estimation (MLE)
Every model is specified by its parameters. MLE is a parameter estimation method that finds the parameter values of a model that best fit the data.
Since fluctuations in the variability score for a particular vegetation type are normally distributed at a location, the parameters are calculated for a normal distribution; the mean and the standard deviation are the parameters of the normal distribution.
Calculation of the mean and standard deviation using MLE:
◦ Let f(y|w) denote the probability density function (PDF) specifying the probability of observing data vector y given the parameter vector w.
◦ If the individual observations yi are independent of each other, then by the theory of probability the PDF of the data y = (y1, ..., yn) given w is the product of the individual PDFs:

  f(y = (y1, ..., yn) | w) = f1(y1|w) · f2(y2|w) · ... · fn(yn|w)
38. The PDF of one observation is

  P(x_i) = (1 / √(2πσ²)) · exp(-(x_i - μ)² / (2σ²))

The PDF of n independent observations is

  f(x_1, ..., x_n | μ, σ) = (2πσ²)^(-n/2) · exp(-Σ_{i=1}^{n} (x_i - μ)² / (2σ²))

Taking the log of both sides:

  ln f = -(n/2) ln(2π) - n ln(σ) - (1 / (2σ²)) Σ_{i=1}^{n} (x_i - μ)²
39. For the data to best fit the model, the parameter vector should maximize the PDF, so the partial derivative of the log PDF with respect to each component parameter must be zero:

  ∂(ln f)/∂μ = (1/σ²) Σ_{i=1}^{n} (x_i - μ) = 0   =>   μ = (1/n) Σ x_i

  ∂(ln f)/∂σ = -n/σ + (1/σ³) Σ_{i=1}^{n} (x_i - μ)² = 0   =>   σ² = (1/n) Σ (x_i - μ)²
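The closed-form MLE solutions (sample mean, and mean squared deviation with 1/n rather than 1/(n-1)) can be checked numerically on a small dataset:

```python
import math

# Numerical check of the normal-distribution MLE: μ is the sample mean and
# σ² the mean squared deviation (note the 1/n: the MLE of the variance is biased).

def mle_normal(xs):
    n = len(xs)
    mu = sum(xs) / n
    sigma2 = sum((x - mu) ** 2 for x in xs) / n
    return mu, sigma2

mu, sigma2 = mle_normal([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(mu, math.sqrt(sigma2))  # 5.0 2.0
```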
40. Comparison of the three predictive model based methods:

Yearly Delta (YD) algorithm:
◦ Does not consider the type of vegetation: the same YD score may be an actual change for forests but not for savannas or shrublands.
◦ Considers neither the average change score value (μ) nor the degree of variability in it (σ).
◦ YD score = max_{i=1..n-1} (annual EVI of current year - annual EVI of previous year)

Variability Distribution (VD) method:
◦ Considers the type of vegetation: the same VD score may be an actual change for regions such as savannas (low variation in variability value) but not for shrublands (high variation in variability value).
◦ Considers only the average change score value (μ).
◦ VD score = YD score - μ

Vegetation Independent variability Distribution (VID) algorithm:
◦ Considers the type of vegetation; the VID score works for all vegetation types.
◦ Considers both the average change score value (μ) and the degree of variability in it (σ).
◦ VID score = (YD score - μ) / σ
41. (Evaluation metric; the formula appears as a figure in the original slide.) Here TPn = true positives, FPn = false positives, and M = total number of pixels considered.
42. (Green line: YD score; red line: VD score; black line: VID score.)
VD and VID give better results than YD.
Reason: the graph corresponds to the forest region only. The MODIS forest map used to identify forest pixels is inaccurate and includes some shrubs and agricultural land labeled as forests.
43. (Green line: YD score; red line: VD score; black line: VID score.)
VID performs slightly worse than VD.
Reason: the initial years selected to model variability may contain noise, so the mean variability for that location is modeled as high and changes in later years go undetected.
44. (Green line: YD score; red line: VD score; black line: VID score.)
VID performs best.
Reason: shrubs are the dominant land cover type in California, and they show high variability in the spread of the variability score due to their higher sensitivity to climatic variations.
45. (Green line: YD score; red line: VD score; black line: VID score.)
YD performs exceptionally poorly and VID exceptionally well.
Reason: high variability in the spread of the variability score across different locations with shrub vegetation.
46. Anomaly detection
◦ Land cover change detection
◦ Outlier precipitation detection
◦ Outlier time interval detection
Detection of water fraction per flood pixel
Detection of forest cover per pixel
47. Input:
◦ South American precipitation dataset in the NetCDF geoscience format:
 - Number of year periods: 10
 - Year range: 1995-2004
 - Grid size: 2.5º × 2.5º
 - Number of latitudes: 31
 - Number of longitudes: 23
 - Total grids: 713
Output:
◦ The top k = 5 outliers are found for every year.
◦ A total of 155 outlier sequences were found over the 10-year period.
◦ Running time of the algorithm: 229 s.
48. Aim:
◦ To find outliers and track their position over time.
Method description:
◦ The top k outliers are found for every year using the Exact-Grid Top-k algorithm.
◦ Outliers are tracked using the OutStretch algorithm.
◦ The generated outlier sequences are analyzed.
Finding the outliers (Exact-Grid Top-k algorithm):
◦ The concept of discrepancy is used.
◦ A discrepancy value is assigned to each rectangular region using Kulldorff's scan statistic.
◦ The top-k outliers are kept for further processing, as this is necessary in order to track the outliers.
49. How is the discrepancy calculated?
◦ Two parameters are required:
 - a measurement m (number of incidences of an event)
 - a baseline b (total population at risk)
◦ The measurement M and baseline B for the whole dataset U are calculated as

  M = Σ_{p∈U} m(p),   B = Σ_{p∈U} b(p)

◦ The normalized measurement m_R and baseline b_R for a region R are calculated as

  m_R = (1/M) Σ_{p∈R} m(p),   b_R = (1/B) Σ_{p∈R} b(p)

◦ The discrepancy score of the region (the shaded area in the original figure) is given by Kulldorff's scan statistic:

  d(m_R, b_R) = m_R log(m_R / b_R) + (1 - m_R) log((1 - m_R) / (1 - b_R))

◦ For the figure's example, M = 6, B = 16, m_R = 4/6, and b_R = 4/16.
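The discrepancy formula can be evaluated directly on the slide's worked example:

```python
import math

# Kulldorff's discrepancy for a candidate region R, as in the formula above.

def kulldorff(m_r, b_r):
    """d(m_R, b_R) = m_R ln(m_R/b_R) + (1 - m_R) ln((1 - m_R)/(1 - b_R))."""
    return m_r * math.log(m_r / b_r) + (1 - m_r) * math.log((1 - m_r) / (1 - b_r))

# The slide's example: M = 6, B = 16, m_R = 4/6, b_R = 4/16.
print(kulldorff(4 / 6, 4 / 16))  # ≈ 0.3836
```

A region whose share of incidences (m_R) exceeds its share of the population at risk (b_R) gets a positive score, which is what makes it a candidate outlier.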
50. OutStretch algorithm:
◦ The region around each outlier from the previous year is stretched on every side.
◦ Each outlier in the current year is examined to see whether it lies within the region consisting of the stretched region plus the previous year's outlier region.
◦ If it does, it is added to the child list of the previous year's outlier.
RecurseNode algorithm:
◦ All sequences starting at the root node of the trees and ending at a leaf node are fetched.
(Figure: outlier region of the previous year surrounded by the stretched region.)
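The containment test at the heart of OutStretch can be sketched as follows. The rectangle representation, stretch amount `s`, and function names are illustrative, not taken from the paper.

```python
# Simplified sketch of the OutStretch containment test: a previous-year
# outlier rectangle is stretched by s cells on each side, and a current-year
# outlier becomes its child if it lies inside the stretched region.
# Rectangles are (x1, y1, x2, y2).

def stretch(rect, s):
    x1, y1, x2, y2 = rect
    return (x1 - s, y1 - s, x2 + s, y2 + s)

def contains(outer, inner):
    ox1, oy1, ox2, oy2 = outer
    ix1, iy1, ix2, iy2 = inner
    return ox1 <= ix1 and oy1 <= iy1 and ix2 <= ox2 and iy2 <= oy2

def link_children(prev_outliers, curr_outliers, s=1):
    """Map each previous-year outlier to the current-year outliers it spawns."""
    tree = {p: [] for p in prev_outliers}
    for c in curr_outliers:
        for p in prev_outliers:
            if contains(stretch(p, s), c):
                tree[p].append(c)
    return tree

# A nearby outlier continues the sequence; a distant one does not.
print(link_children([(2, 2, 4, 4)], [(1, 2, 4, 4), (10, 10, 11, 11)]))
```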
51. (Figure: forest built by applying the OutStretch algorithm recursively.) The nodes (1,1), (2,2) and (3,2) correspond to one tracked outlier sequence.
52. Anomaly detection
◦ Land cover change detection
◦ Outlier precipitation detection
◦ Outlier time interval detection
Detection of water fraction per flood pixel
Detection of forest cover per pixel
53. Input:
◦ Sea surface temperature (SST) data for the Equatorial Pacific Ocean.
◦ The data consists of SST measurements from 44 sensors in the Pacific Ocean.
◦ Each sensor has a time series of 1440 data points.
Output:
◦ Time intervals in which a spatial neighborhood has shown abnormal behavior.
54. Terms:
Spatial distance (sd): distance between two locations p and q based on their spatial coordinates:

  sd(p, q) = √((s_px - s_qx)² + (s_py - s_qy)²)

Measurement distance (md): distance between two locations based on the difference between their m measurement features:

  md(p, q) = √(Σ_{a=1}^{m} (s_pa - s_qa)²)

Spatial neighborhood: a cluster of locations such that the spatial distance (sd) and measurement distance (md) between every two locations are less than the respective threshold values.
Sum of squared error (SSE): a measure of the degree of abnormality of an interval:

  SSE(int) = Σ_{bn=1}^{BN} (val_bn - μ)²

where val_bn is each temporal reading in the base interval and μ is the mean of the temporal readings.
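The three measures can be sketched directly. Locations here are plain dicts with coordinates and a feature vector; this layout is illustrative, and the Euclidean form of md is an assumption reconstructed from the slide's fragmentary formula.

```python
import math

# Sketches of the spatial distance, measurement distance, and SSE defined above.

def spatial_distance(p, q):
    # Euclidean distance between the (x, y) coordinates of two locations.
    return math.sqrt((p["x"] - q["x"]) ** 2 + (p["y"] - q["y"]) ** 2)

def measurement_distance(p, q):
    # Distance over the m measurement features of two locations.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p["features"], q["features"])))

def sse(interval_values):
    # Sum of squared deviations from the interval mean (degree of abnormality).
    mu = sum(interval_values) / len(interval_values)
    return sum((v - mu) ** 2 for v in interval_values)

print(spatial_distance({"x": 0.0, "y": 0.0}, {"x": 3.0, "y": 4.0}))  # 5.0
print(sse([1.0, 2.0, 3.0]))  # 2.0
```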
55. Aim:
◦ To find time intervals in which spatial neighborhoods are likely to show abnormal behavior.
Algorithm:
◦ The time series is first divided into a set of equal-size base temporal intervals.
◦ Spatial neighborhoods are found for every base interval.
◦ Each spatial node in every base interval is analyzed and classified as 1 if it shows abnormal behavior, 0 otherwise.
◦ The count of spatial nodes classified as 1 is found for every base interval; this count is called the vote count.
◦ A threshold mv is then applied: intervals with votes > mv are classified as 1, the others as 0.
56. ◦ Consecutive base intervals with the same binary classification are merged to form larger intervals.
◦ The mean value of each edge is calculated for every interval.
◦ Spatial neighborhoods are calculated for each interval using the mean edge values.
57. Agglomerative temporal intervals for SST data
Location: 0ºN latitude, 110ºW longitude
Time period: 10-day period from 01/01/2004 to 01/10/2004
Number of measurements: approx. 1400
58. Neighborhood (a) represents cooler water; neighborhood (b) represents warmer water; neighborhoods (c) and (d) represent moderate water.
• The edge clustering is validated against a satellite image of SST: light regions represent cooler temperatures, dark regions warmer temperatures.
59. Neighborhood quality for each interval: the SSE of neighborhood (a) shows an interesting pattern between intervals 16 and 19, going from high to low and then back from low to high.
60. Neighborhood (a) has more spread during the 16th interval than during the 17th interval.
61. Input:
◦ Land cover type
◦ 8-day composited surface reflectance for the NIR band (CH1) and VIS band (CH2)
◦ CH2 - CH1
◦ CH2 / CH1
◦ NDVI dataset
◦ Data from before the flooding in the Mississippi basin is used as the training dataset
◦ Data from after the Mississippi basin flooding of June 17-19, 2008 is used for testing
Output:
◦ The best attribute (R) for classification, namely CH2 - CH1.
◦ The threshold values of the best attribute (R) for pure water and pure land.
Validation data:
◦ 30 m spatial resolution Landsat TM imagery is used for validation purposes.
62. Aim:
◦ To find the fraction of water in flood pixels, which at the coarse resolution of the MODIS dataset are usually water mixed with land cover features.
Method description:
◦ A decision tree approach is used to find
 - the best parameter (predictor) R for differentiating between land and water, and
 - the threshold values of the predictor R for pure water (R_water) and pure land (R_land).
◦ The water fraction per pixel is found by comparing the actual predictor value with its values for pure water and pure land:

  R_mix = WF · R_water + (1 - WF) · R_land

  WF = ((R_land - R_mix) / (R_land - R_water)) × 100
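The linear mixing model rearranges directly into the water fraction; the endpoint values in this sketch reuse the experiment's decision-tree thresholds purely for illustration.

```python
# Inversion of R_mix = WF*R_water + (1 - WF)*R_land for the water fraction WF.

def water_fraction(r_mix, r_water, r_land):
    """Percent of a coarse pixel covered by water, from the predictor R."""
    return (r_land - r_mix) / (r_land - r_water) * 100.0

# Using the experiment's thresholds (pure land R > 9.17, pure water R <= 2.91)
# as illustrative endpoints, a pixel halfway between gives roughly 50%.
print(water_fraction(6.04, 2.91, 9.17))
```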
63. Experimental results:
Some of the rules used for deciding the threshold values:
◦ (CH2 - CH1) > 9.17 -> class Land
◦ (CH2 - CH1) <= 2.91 -> class Water
The correlation between TM and MODIS water fractions is 0.97, with a bias of 4.47% and a standard deviation of 4.4%.
65. Anomaly detection
◦ Land cover change detection
◦ Outlier precipitation detection
◦ Outlier time interval detection
Detection of water fraction per flood pixel
Detection of forest cover per pixel
66. Input:
◦ Land surface temperature from the monthly composited MOD11C3 product
◦ NDVI and EVI from the monthly composited MOD13C2 product
◦ Land cover type from the yearly MCD12C1 product
Output:
◦ Fraction of forest cover per pixel
Validation:
◦ Forest cover information from PRODES data at 90 m resolution, available in GeoTIFF format, is used for validation purposes.
67. Aim:
◦ To find the forest cover fraction per pixel for the coarse-resolution MODIS dataset.
◦ Data values for parameters like NDVI, EVI, and land surface temperature are available per pixel, so each value is affected by the vegetation cover of every point within that pixel.
◦ The same parameter value may correspond to different fractions of forest cover, depending on the vegetation type of the whole pixel area.
Algorithm:
◦ A modification of the Leeuwen et al. approach.
◦ The Leeuwen et al. approach fits a single logistic regression model for all vegetation types.
◦ The improved algorithm considers vegetation type and fits an independent logistic regression model for each vegetation type.
68. Leeuwen et al. approach
Terms:
p_it : fraction of forest cover for pixel i in year t (generated from the analysis of high-resolution Landsat TM images)
X_it : vector of MODIS observations for pixel i in year t
β : vector of model parameters for pixel i in year t (estimated from a set of training data)
The vectors X_it and β each have three components:
◦ the first corresponds to a constant intercept term,
◦ the second to an NDVI measurement,
◦ and the third to an LST measurement.
Model:

  ln(p_it / (1 - p_it)) = β^T X_it
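The logistic model inverts to a prediction of the forest fraction from an observation vector X_it = [1, NDVI, LST]. The β values in this sketch are made up for illustration; in the paper they are estimated from training data.

```python
import math

# Sketch of the Leeuwen et al. logistic model, solved for p_it.

def forest_fraction(x, beta):
    z = sum(b * xi for b, xi in zip(beta, x))   # beta^T X
    return 1.0 / (1.0 + math.exp(-z))           # invert ln(p/(1-p)) = z

beta = [-2.0, 6.0, -0.05]                        # hypothetical fitted parameters
print(forest_fraction([1.0, 0.8, 20.0], beta))   # high NDVI, moderate LST -> high p
```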
69. Learning independent regression models requires segmenting the observation space into multiple categories.
Segmentation is done by partitioning the feature space, an n-dimensional space with one feature per axis.
Features are selected for their ability to differentiate between vegetation types.
For example, forests show high inter-annual NDVI and EVI means and a low inter-annual LST mean, while the intra-annual variance of NDVI, EVI and LST is low.
Therefore, the mean (μ) and variance (σ²) are selected as features.
70. Vegetation type distribution in the feature space (μ, σ²) of NDVI:
◦ Forests show a high inter-annual mean and low intra-annual variance.
◦ Farms show high intra-annual variance due to crop cycles.
◦ Grasslands show high intra-annual variance and a high inter-annual mean.
◦ Water locations show high intra-annual variance and a low inter-annual mean.
71. Analysis of the partition corresponding to the forest vegetation type. (Figure: scatter plot of the residual of the baseline approach against the residual of the vegetation-specific approach; the residual is the difference between the actual and predicted values.) The residual of the vegetation-specific approach has lower magnitude than that of the baseline model, so the vegetation-specific approach outperforms the baseline.
72. Analysis of the partition corresponding to the cropland vegetation type: the residual of the vegetation-specific model is again lower in magnitude than that of the baseline model.
74. Various research works related to anomaly detection and to estimating water fraction or forest cover per pixel have been discussed.
Most of these works are pixel-based and do not consider the spatial neighborhood of a pixel.
Domain knowledge is also required along with data mining techniques.
Future work should address these limitations.
76. [1] Jonathan T. Overpeck, Gerald A. Meehl, Sandrine Bony, David R. Easterling, "Climate Data Challenges in the 21st Century," in Science, 2011.
[2] James H. Faghmous and Vipin Kumar, "Spatio-temporal Data Mining for Climate Data: Advances, Challenges and Opportunities," in Data Mining and Knowledge Discovery for Big Data, 2014.
[3] Donglian Sun, Yunyue Yu, Mitchell D. Goldberg, "Deriving Water Fraction and Flood Maps From MODIS Images Using Decision Tree Approach," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2011.
[4] Shyam Boriah, Vipin Kumar, Michael Steinbach, Christopher Potter, Steven Klooster, "Land Cover Change Detection: A Case Study," in Knowledge Discovery in Databases Proceedings, 2008.
[5] Elizabeth Wu, Wei Liu, Sanjay Chawla, "Spatio-Temporal Outlier Detection in Precipitation Data," in Knowledge Discovery in Databases Proceedings, 2008.
[6] Hong Yeon Cho, Ji Hee Oh, Kyeong Ok Kim, and Jae Seol Shim, "Outlier Detection and Missing Data Filling Methods for Coastal Water Temperature Data," in Journal of Coastal Research, 2013.
77. [7] C. T. Dhanya and D. Nagesh Kumar, "Data Mining for Evolution of Association Rules for Droughts and Floods in India Using Climate Inputs," in Journal of Geophysical Research, 2009.
[8] Ruixin Yang, Jiang Tang, and Donglian Sun, "Association Rule Data Mining Applications for Atlantic Tropical Cyclone Intensity Changes," in Journal of American Meteorological Society, 2011.
[9] James H. Faghmous, Yashu Chamber, Shyam Boriah, Stefan Liess, Vipin Kumar, "A Novel and Scalable Spatio-temporal Technique for Ocean Eddy Monitoring," in Association for the Advancement of Artificial Intelligence, 2012.
[10] Imran Maqsood, Muhammad Riaz Khan, and Ajith Abraham, "An Ensemble of Neural Networks for Weather Forecasting," in Neural Computing & Applications, 2004.
[11] Agboola A. H., Gabriel A. J., Aliyu E. O., Alese B. K., "Development of a Fuzzy Logic Based Rainfall Prediction Model," in International Journal of Engineering and Technology, 2013.
[12] Christopher G. Healey, "On the Use of Perceptual Cues and Data Mining for Effective Visualization of Scientific Datasets," in Proceedings Graphics Interface, 1998.
78. [13] Wenwen Li, Chaowei Yang, Donglian Sun, "Mining Geophysical Parameters through Decision Tree Analysis to Determine Correlation with Tropical Cyclone Development," in Computers & Geosciences, 2008.
[14] Pinky Saikia Dutta and Hitesh Tahbilder, "Prediction of Rainfall Using Data Mining Technique over Assam," in Indian Journal of Computer Science and Engineering (IJCSE), 2014.
[15] Anuj Karpatne, Mace Blank, Michael Lau, Shyam Boriah, Karsten Steinhaeuser, Michael Steinbach and Vipin Kumar, "Importance of Vegetation Type in Forest Cover," in Intelligent Data Understanding, 2012.
[16] James H. Faghmous, Mathew Le, Muhammed Uluyol, Vipin Kumar and Snigdhansu Chatterjee, "A Parameter-free Spatio-temporal Pattern Mining Model to Catalog Ocean Dynamics," in IEEE 13th International Conference on Data Mining, 2013.
[17] Rie Honda and Osamu Konishi, "Temporal Rule Discovery for Time-Series Satellite Images and Integration with RDB," in Principles of Data Mining and Knowledge Discovery, Lecture Notes in Computer Science, 2001.
79. [18] Pol R. Coppin and Marvin E. Bauer, "Change Detection in Forest Ecosystems with Remote Sensing Digital Imagery," in Remote Sensing Reviews, 1996.
[19] Varun Mithal, Ashish Garg, Ivan Brugere, Shyam Boriah, Vipin Kumar, Michael Steinbach, Christopher Potter, Steven Klooster, "Incorporating Natural Variation Into Time Series-Based Land Cover Change Identification," in Proceedings of the NASA Conference on Intelligent Data Understanding, 2011.
[20] Varun Mithal, Shyam Boriah, Ashish Garg, Michael Steinbach, Vipin Kumar, "Monitoring Global Forest Cover Using Data Mining," in Journal of the Association for Computing Machinery, Volume V, 2010.
[21] D. Agarwal, A. McGregor, J. M. Phillips, S. Venkatasubramanian, and Z. Zhu, "Spatial Scan Statistics: Approximations and Performance Study," in Knowledge Discovery in Databases Proceedings, 2006.
[22] Michael P. McGuire, Vandana P. Janeja, Aryya Gangopadhyay, "Spatiotemporal Neighborhood Discovery for Sensor Data," in Knowledge Discovery in Databases Proceedings, 2008.
[23] Thijs T. van Leeuwen, Andrew J. Frank, Yufang Jin, Padhraic Smyth, Michael L. Goulden, Guido R. van der Werf and James T. Randerson, "Optimal Use of Land Surface Temperature Data to Detect Changes in Tropical Forest Cover," in Journal of Geophysical Research: Biogeosciences, 2011.