Presentation by Julian Ramirez-Villegas.
CCAFS workshop titled "Using Climate Scenarios and Analogues for Designing Adaptation Strategies in Agriculture," 19-23 September in Kathmandu, Nepal.
A watershed is defined as the area of land that drains to a particular point along a stream. The boundary of a watershed is defined by the highest elevations surrounding the stream, and a drop of water falling outside this boundary will drain to another watershed. This document outlines steps to automatically delineate watershed areas and stream networks in ArcGIS using tools like flow direction, flow accumulation, and watershed. The process involves converting DEM data to raster format, removing sinks, and generating channels, stream links, and watersheds based on flow patterns and accumulation thresholds.
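The D8 logic behind the flow direction step can be sketched in a few lines. This is an illustrative pure-Python version, not ArcGIS code, and the 3x3 DEM is a made-up example; the direction codes (1 = east, 2 = southeast, ..., 128 = northeast) do follow the ArcGIS D8 convention.

```python
# Minimal D8 flow-direction sketch: each cell points to its steepest-descent
# neighbour. Illustrative only; ArcGIS's Flow Direction tool does this at scale.

# D8 codes for the 8 neighbours: E, SE, S, SW, W, NW, N, NE
D8 = [(0, 1, 1), (1, 1, 2), (1, 0, 4), (1, -1, 8),
      (0, -1, 16), (-1, -1, 32), (-1, 0, 64), (-1, 1, 128)]

def flow_direction(dem):
    """Return a grid of D8 codes; 0 marks a sink (no downhill neighbour)."""
    rows, cols = len(dem), len(dem[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            best_drop, best_code = 0.0, 0
            for dr, dc, code in D8:
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # diagonal neighbours are sqrt(2) cell widths away
                    dist = 2 ** 0.5 if dr and dc else 1.0
                    drop = (dem[r][c] - dem[nr][nc]) / dist
                    if drop > best_drop:
                        best_drop, best_code = drop, code
            out[r][c] = best_code
    return out

dem = [[9, 8, 7],
       [8, 7, 2],
       [7, 6, 5]]
print(flow_direction(dem))
```

Cells with code 0 are the sinks the workflow removes before this step; everything else drains toward the low corner of the example grid.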
The document discusses the process of creating a terrain model and delineating floodplains from LiDAR data using ArcGIS and HEC-GeoRAS/HEC-RAS software. Key steps include building a terrain from LiDAR point clouds, breaklines and other feature classes, converting the terrain to a raster, using HEC-GeoRAS to generate cross-section geometry from the raster for hydraulic modeling in HEC-RAS, and mapping floodplains back in GIS from HEC-RAS results. Issues like missing LiDAR data or incorrect elevations may require additional field surveying.
1) The document outlines the steps to delineate a catchment area using software like Global Mapper and WMS. It begins with preparing DEM and satellite imagery data and getting GPS coordinates for an outlet point.
2) The initial catchment area is delineated in WMS and streams are verified. The catchment area is then visualized. Relevant data like drainage networks are downloaded and imported.
3) Flow direction and accumulation are calculated using TOPAZ tools in Global Mapper. Contour lines and other data are displayed to finalize the delineated catchment area containing the outlet point.
03 Sajjad Ali - QGIS: Working with Raster, by TOUSEEF3347
QGIS uses the GDAL library to support over 100 raster formats. Raster layers can be loaded into QGIS by clicking the Load Raster icon or selecting Add Raster Layer from the Layer menu. The Raster Calculator performs calculations on existing raster pixel values to create a new output raster layer; its dialog provides fields for the input raster layers, the operators, and the result layer.
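The Raster Calculator's job can be illustrated without QGIS at all: evaluate an expression pixel by pixel over aligned input layers and write the result to a new layer. The sketch below uses plain Python lists in place of GDAL-backed bands, and the layer values and thresholds are invented.

```python
# Toy raster calculator: apply a pixel-wise expression to equally sized
# input rasters (2D lists) and produce a new output raster.

def raster_calc(expr, *layers):
    """Apply expr to corresponding pixels of aligned raster layers."""
    return [[expr(*pixels) for pixels in zip(*row_group)]
            for row_group in zip(*layers)]

elevation = [[95, 120], [140, 101]]
slope = [[3, 12], [25, 2]]

# roughly the expression ("elevation@1" > 100) AND ("slope@1" < 20) as a mask
mask = raster_calc(lambda e, s: 1 if e > 100 and s < 20 else 0, elevation, slope)
print(mask)
```

The same pattern covers single-layer reclassification: pass one layer and a one-argument expression.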
This document discusses using FME for working with raster data. It provides an overview of raster types and formats supported by FME, as well as common raster workflows and transformations. These include format conversion, loading/extracting to databases, processing via resampling or cell operations, enriching GIS data by converting between raster and vector, 3D workflows like draping, and consuming/publishing raster to the web. The document demonstrates several raster workflows in FME including generating reports from LAS point clouds, writing raster images to Excel, creating KML for tracking storm paths over time, and modeling flood inundation from DEM and river vector data.
This document describes how to create a Unix space management report using SAS. Key steps include:
1) Using Unix commands like df, du, and find within a SAS program to report on space capacity, usage, and availability at the volume and directory level.
2) Employing SAS/Graph and ODS to output the results into a PDF report with bar charts and area plots.
3) Automating the reporting process using cron jobs on Unix or the Enterprise Guide job scheduler to regularly execute the SAS program.
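The df-style numbers from step 1 can also be read from a standard library. This is a Python sketch rather than the SAS program the document describes: `shutil.disk_usage` reports the same capacity, usage, and availability totals at the volume level.

```python
# Stdlib sketch of a volume-level space report (illustrative; the document's
# workflow wraps the Unix df/du/find commands inside a SAS program instead).
import shutil

def volume_report(path="/"):
    usage = shutil.disk_usage(path)
    return {
        "total_gb": round(usage.total / 1024**3, 1),
        "used_gb": round(usage.used / 1024**3, 1),
        "free_gb": round(usage.free / 1024**3, 1),
        "pct_used": round(100 * usage.used / usage.total, 1),
    }

print(volume_report("/"))
```

Directory-level usage (the du part) would need a walk over the tree; the volume totals above are a single system call.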
This document outlines the steps to derive watersheds in Central Celebes, Indonesia from SRTM data using 3D modeling software and ArcGIS tools:
1) Load and preprocess SRTM data to create a DEM raster, including cropping, filling blanks, converting projections, and saving as a GeoTIFF.
2) Perform terrain analysis in ArcGIS to calculate flow direction, flow accumulation, stream definition, stream segmentation, and catchment grid delineation using the ArcHydro tools.
3) View the resulting watersheds and hydrological networks in 3D in ArcScene by setting the appropriate layer properties and vertical exaggeration.
This document discusses new formats and transformers for point cloud data, including the CARIS Spatial Archive format and Mojang Minecraft format. It introduces the PointCloudSorter, PointCloudMerger, and PointCloudStatisticsCalculator transformers for sorting, merging, and calculating statistics on point clouds. An example workspace is described that demonstrates using point clouds for classification, biomass calculation, and feature extraction, with the ability to output rasters and vectors. The PointCloudStatisticsCalculator is demonstrated on a Port Coquitlam building example.
This document discusses cluster analysis techniques in R. It mentions kmeans clustering and using KML files in R. The document focuses on using different cluster analysis methods in the R programming language.
This document describes the 7 main steps to create a watershed from SRTM DEM data using ArcGIS software: 1) Create a DEM from SRTM data and reproject it; 2) Remove sinks in the DEM; 3) Generate a flow direction raster; 4) Generate a flow accumulation raster; 5) Generate a stream channel raster; 6) Generate stream links; and 7) Generate the watershed polygons. Each step involves using different ArcGIS hydrology tools on the output of the previous step to delineate watershed boundaries.
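Step 4, flow accumulation, can be sketched as follows. This illustrative Python version stores flow direction as a (row, col) step per cell rather than ArcGIS's D8 codes, and assumes the directions contain no loops, which holds for a sink-filled DEM.

```python
# Flow accumulation in miniature: count how many upstream cells drain
# through each cell. (0, 0) marks an outlet cell.

def flow_accumulation(fdir):
    rows, cols = len(fdir), len(fdir[0])
    acc = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # walk downstream from (r, c), crediting every cell passed through
            dr, dc = fdir[r][c]
            cr, cc = r, c
            while (dr, dc) != (0, 0):
                cr, cc = cr + dr, cc + dc
                acc[cr][cc] += 1
                dr, dc = fdir[cr][cc]
    return acc

# every cell drains rightward to the last column, which drains down to the outlet
fdir = [[(0, 1), (0, 1), (1, 0)],
        [(0, 1), (0, 1), (1, 0)],
        [(0, 1), (0, 1), (0, 0)]]
print(flow_accumulation(fdir))
```

Thresholding this accumulation grid is exactly how step 5 turns accumulation into a stream channel raster: cells above the threshold are channels.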
This document discusses flood mapping and summarizes the key inputs and processes. It notes that more accurate flood maps are needed and describes using precipitation data, rainfall-runoff models, hydraulic models, and terrain data to create flood maps. Issues with importing data and a lack of ArcGIS 10 support are mentioned. Future work on real-time flood mapping by interpolating water surface elevations from stage data is also discussed.
This document provides an overview of Hadoop MapReduce. It begins with an introduction to MapReduce and defines it as the processing component of Apache Hadoop that processes data in parallel across a distributed environment. The document then discusses two main advantages of MapReduce: 1) parallel processing, which makes data processing fast, and 2) data locality where processing is moved to the data rather than moving large amounts of data. It also provides an example of how MapReduce can be used to efficiently count words in a document by splitting the work across nodes and aggregating the results.
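The word-count example can be written out as a single-machine Python sketch, obviously without Hadoop's distribution and fault tolerance, which are the point in production; the shuffle here is just a dictionary grouping intermediate (word, 1) pairs by key.

```python
# MapReduce word count in miniature: map each split to (word, 1) pairs,
# shuffle by key, then reduce each group by summing its values.
from collections import defaultdict

def map_phase(split):
    # emit an intermediate (key, value) pair for every word in this split
    return [(word.lower(), 1) for word in split.split()]

def reduce_phase(pairs):
    groups = defaultdict(list)      # the "shuffle": group values by key
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

splits = ["the quick brown fox", "the lazy dog", "the fox"]
intermediate = [pair for split in splits for pair in map_phase(split)]
print(reduce_phase(intermediate))
```

In Hadoop each split's map runs on the node holding that split (data locality), and reducers pull only the keys assigned to them.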
Flood Map Desktop (FMD) is free software that helps users create digital flood insurance rate maps (DFIRMs) using GIS. It allows users to set up geodatabases and project settings to manage flood mapping data and produce maps that meet FEMA standards. Final deliverables include FEMA-compliant GIS layers, PDFs and PNGs of the maps, and required metadata. FMD has improved efficiency for DNR's mapping program by streamlining data creation, management, and standardization compared to using FEMA's online Mapping Information Platform.
Map Reduce Introduction (Google white papers), by Archith777
This document describes MapReduce, a programming model for large-scale data processing across distributed systems. It explains that MapReduce exploits large sets of commodity computers to execute processes in a distributed manner and offers high availability. The core operations in MapReduce are the Map and Reduce functions. Map processes input key-value pairs to generate intermediate outputs, while Reduce merges all intermediate values with the same key. MapReduce handles scheduling tasks across machines and rerunning tasks if failures occur, simplifying programming for large-scale data problems.
Wind Force & Direction from Ships' Captains' Logs, by Andrew Zolnai
This document discusses shipping and weather data from 1662-1855 recorded in captains' logs. It contains over 290,000 data points across 120 weather parameters. The data has been added to GIS databases and published online to allow exploration and comparison of shipping routes and weather patterns over time to support climate research. Tools like time sliders and filtering by nationality facilitate investigation of this large dataset. Lessons include joining data losslessly, sharing openly to enable reuse and collaboration, and using the data to further climate studies.
North Energy is an oil and gas exploration company based in Norway with offices in four locations. They use GIS software ArcGIS to integrate and analyze exploration data from their interpretation software Petrel. This includes directly importing grids, seismic attributes, and maps into ArcGIS to visualize results. Well data is also joined to attribute data to analyze lithology and reservoir properties. They have implemented a Portal for ArcGIS to publish maps and data for sharing internally and to manage their prospect inventory using a prospect uploader tool.
This document outlines the steps taken to determine material loss in the Grasberg area of Papua caused by private company exploration using a 3D analysis technique called cut and fill. The analysis involved generating elevation data points from SRTM data, converting the points to a vector file, creating a TIN surface, and executing a cut and fill between two TINs to calculate the volume of material loss in cubic meters.
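The cut-and-fill computation reduces to differencing two surfaces and multiplying by cell area. The sketch below does this on regular elevation grids rather than the TINs the document used, and the elevations and 30 m cell size are invented.

```python
# Cut volume in miniature: total the elevation drop between a "before" and
# an "after" surface, times the area each cell represents.

def cut_volume(before, after, cell_area):
    """Volume of material removed (m^3): positive elevation drops x cell area."""
    total = 0.0
    for row_b, row_a in zip(before, after):
        for zb, za in zip(row_b, row_a):
            if zb > za:             # only count cells where the surface was lowered
                total += (zb - za) * cell_area
    return total

before = [[120.0, 121.0], [119.0, 118.0]]
after = [[115.0, 121.0], [117.0, 118.0]]
print(cut_volume(before, after, cell_area=30 * 30))  # 30 m SRTM-sized cells
```

The symmetric "fill" volume is the same sum over cells where the surface rose instead.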
The document outlines objectives to perform an uncertainty analysis of flood performance in the Timis Bega catchment, develop flood hazard maps for a 2005 extreme event, test a novel uncertainty analysis method, and study applying cloud computing to the modeling and analysis. It will use numerical modeling with HEC-HMS, HEC-RAS, and SOBEK software and uncertainty analysis methods like Bayesian and GLUE approaches to analyze reservoir and polder performance, develop lead time curves, and create flood hazard maps to improve flood management in the catchment. Cloud computing will also be explored to improve the uncertainty analysis for complex water systems.
This document summarizes a time series analysis of airline sales data from 1949 to 1961 using SAS software to forecast sales for 1961. It describes preparing the data by checking for volatility, non-stationarity, and seasonality. Several ARIMA models were fitted and the best model with p=0 and q=3 was selected using error metrics. Forecasts were made for 1961 and graphically compared to actual sales, with the aim of predicting airline sales for planning purposes.
This document discusses sales forecasting for an airline using time series modeling. It describes preparing the data by checking for volatility, non-stationarity, and seasonality. Several time series models are identified and compared using information criteria. The best model is found to be ARIMA(0,1,3) based on lowest MAPE error. Forecasts are generated for the next 12 months and graphically represented along with the actual historical sales values.
This document discusses data visualization techniques using R and provides 24 examples of different types of plots and graphs that can be created, including calendar heatmaps, bivariate density plots, hexagonal binning, regression plots, jittering, square tiles, table plots, mosaic plots, tree maps, bar plots, pie charts, dashboards, images of matrices, Monte Carlo simulations, parametric curves, area charts, bubble charts, box plots, word clouds, R color palettes, time plots, path plots, conditioning plots, and moving scatterplots. The examples demonstrate how to create informative yet easy to understand graphs using R for data analysis and visualization.
The document discusses bringing mixed integer linear programming (MILP) online for path planning of unmanned aerial vehicles (UAVs). It outlines challenges with using traditional MILP for dynamic online path planning, including inability to react to changes and slow solve times. It then presents a solution of using geographic coordinate conversions and a receding horizon approach to discretize the problem and allow incremental re-solving as new information becomes available. This allows MILP to be used for online dynamic path planning of UAVs while addressing its limitations for such applications.
Sales Forecasting of an Airline Company Using Time Series Analysis, by Ashish Ranjan
The document describes using time series analysis in SAS to forecast airline sales for the year 1961 based on monthly sales data from 1949-1960. Key steps included: checking for non-stationarity and seasonality, transforming the data using logs, selecting a development and validation sample, identifying the best ARIMA model using minimum BIC and AIC/SBC averages, generating forecasts from multiple models and selecting the model with minimum MAPE, and producing a final forecast for 1961 sales.
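The final selection step, picking the model with minimum MAPE over the validation sample, can be sketched as below. The actual values and candidate forecasts are illustrative, not the study's fitted ARIMA output.

```python
# Model selection by minimum mean absolute percentage error (MAPE):
# score each candidate's forecasts against the validation actuals and
# keep the lowest-error model.

def mape(actual, forecast):
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def best_model(actual, candidates):
    """candidates: model name -> forecast series; returns (name, mape)."""
    scored = {name: mape(actual, fc) for name, fc in candidates.items()}
    name = min(scored, key=scored.get)
    return name, scored[name]

actual = [112, 118, 132, 129]
candidates = {
    "ARIMA(0,1,3)": [110, 119, 130, 131],
    "ARIMA(1,1,1)": [105, 125, 140, 120],
}
print(best_model(actual, candidates))
```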
This document outlines the implementation of a real-time numerical weather prediction (NWP) forecasting system in Southern Africa. The objectives were to implement the NWP model, validate the model data using other weather data sources, and disseminate the graphical forecasts on a website. The model was set up using common physics and dynamics options. The domain covered parts of Botswana, South Africa, Zimbabwe, Lesotho, and Swaziland. The model data was validated against temperature data from meteorological services and showed similar maximum and minimum temperatures. The graphical forecast data was then disseminated through an intranet website. In conclusion, the project objectives were met and the model captured weather patterns relatively well, though it could be improved through further comparison with observations.
The document analyzes agricultural change in Chile's Aconcagua Valley between 1989 and 2010 using Landsat imagery. It classified changes in normalized difference vegetation index (NDVI) values to identify new agricultural areas, which were extracted as polygons. The largest increase was in vineyards. Assuming the new agriculture was all wine grapes, the analysis estimated the area of new vineyards as 9598 acres, and calculated the potential water usage as between 558 million and 892 million liters annually depending on vineyard yield. Concerns about increased agriculture and water scarcity in the valley were also noted.
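The per-pixel calculation underlying this kind of change detection can be sketched as follows. The reflectance values and the 0.2 change threshold below are invented for illustration, not the study's calibration.

```python
# NDVI change detection per pixel: NDVI = (NIR - red) / (NIR + red) from the
# Landsat near-infrared and red bands; flag a cell as new agriculture when
# its NDVI rose by more than a threshold between the two dates.

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def new_agriculture(nir89, red89, nir10, red10, threshold=0.2):
    """True where NDVI increased by more than `threshold` between the dates."""
    return ndvi(nir10, red10) - ndvi(nir89, red89) > threshold

# a bare-soil pixel in 1989 that is vigorously vegetated by 2010
print(new_agriculture(nir89=0.25, red89=0.20, nir10=0.50, red10=0.10))
```

Applying the flag across a whole scene and vectorizing the True regions yields the new-agriculture polygons the analysis extracted.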
This document discusses accessing and working with spatial data in R. It covers reading vector and raster data formats like shapefiles and GeoTIFFs using packages like maptools and rgdal. It also discusses writing spatial data, coordinate reference systems (CRS), and loose vs tight coupling with GIS interfaces.
To use precipitation grids outside the US in HEC-HMS, two things are required: 1) loading the precipitation grids into a DSS file and 2) creating a file that associates grid cells with subbasins using GeoHMS. While GeoHMS was designed for the conterminous US, it can be used outside the US by modifying projection files to tell GeoHMS the watershed is in the US coordinate system and generating gridcell files using this "lie". Grids must also be loaded to DSS properly by understanding how row and column numbers relate to coordinates.
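The row/column-to-coordinate relationship the last sentence refers to can be sketched as below. The origin and 2000 m cell size are invented, and real DSS grids have their own indexing conventions, so treat this as the general idea only.

```python
# A grid cell addressed by integer (row, col) maps to a projected coordinate
# via the grid origin and cell size; rows here count upward from y_origin.

def cell_to_coords(row, col, x_origin, y_origin, cell_size):
    """Center coordinates of grid cell (row, col)."""
    x = x_origin + (col + 0.5) * cell_size
    y = y_origin + (row + 0.5) * cell_size
    return x, y

def coords_to_cell(x, y, x_origin, y_origin, cell_size):
    """Inverse mapping: which cell contains point (x, y)?"""
    return int((y - y_origin) // cell_size), int((x - x_origin) // cell_size)

origin = dict(x_origin=500_000.0, y_origin=1_000_000.0, cell_size=2000.0)
x, y = cell_to_coords(3, 7, **origin)
print((x, y), coords_to_cell(x, y, **origin))
```

Getting this mapping wrong is exactly how precipitation ends up loaded over the wrong subbasins, which is why the document stresses understanding it before writing grids to DSS.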
The document discusses spatial analysis and visualization software called S-PLUS SpatialStats. It provides an overview of the software's capabilities including tools for analyzing different types of spatial data like point patterns, lattice data, and geostatistical data. It also describes additional S-PLUS modules that can be used for tasks like spatial statistics, linking S-PLUS with GIS software, and environmental statistics.
This document summarizes a presentation on assessing the accuracy of LiDAR data using ArcGIS 10.1. The goal was to determine whether ArcGIS could accurately assess LiDAR data by comparing it to check points using 8 statistics. It discusses the history of LiDAR, how it is handled in ArcGIS, and compares LAS datasets to terrain datasets. The code calculates residuals and statistics to output accuracy measurements that show whether the data meets ASPRS and USGS guidelines. In conclusion, ArcGIS can be used to visually inspect LiDAR, but other software is needed for full analysis capabilities.
1) Stratosphere is a distributed data processing system that extends the MapReduce model by supporting more operators and advanced data flow graphs composed of operators.
2) It has components like a query parser, compiler, and optimizer that translate queries into execution plans composed of operators like Map, Reduce, Join, Cross, CoGroup, and Union.
3) Stratosphere supports arbitrary data flows while MapReduce only supports MapReduce, and Stratosphere has better performance through in-memory processing and pipelining compared to MapReduce which always writes to disk.
R is a free software environment for statistical computing and graphics. It can be used for spatial data analysis and GIS tasks. Spatial data such as points, polygons, and raster files can be imported and analyzed in R using specialized packages. Two case studies demonstrated using R for spatial interpolation of temperature data, LiDAR data processing to create digital elevation models, and developing online viewers for spatial datasets. R allows for reproducible analysis through scripting and has numerous packages that implement statistical procedures, graphics, and interfaces with GIS software like GRASS and ArcGIS.
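The temperature-interpolation case study would typically use a method such as inverse distance weighting; in R that is a short call through gstat or a similar package. Below is a minimal pure-Python IDW sketch with invented station coordinates and temperatures.

```python
# Inverse-distance-weighted interpolation: estimate a value at (x, y) as a
# weighted average of station values, weights falling off with distance.
import math

def idw(x, y, stations, power=2):
    """Interpolate at (x, y) from (sx, sy, value) stations."""
    num = den = 0.0
    for sx, sy, value in stations:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return value        # exactly on a station: use its value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

stations = [(0.0, 0.0, 10.0), (10.0, 0.0, 20.0), (0.0, 10.0, 14.0)]
print(round(idw(5.0, 0.0, stations), 2))
```

Running this over every cell of a grid yields an interpolated raster surface, which is what the case study produced for temperature.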
Reeves: Modelling & Estimating Forest Structure Attributes Using LiDARCOGS Presentations
This document summarizes a research project using LiDAR data and permanent sample plot (PSP) field data to generate models for estimating forest attributes. The following steps were completed: 1) LiDAR data was acquired and processed, 2) LiDAR metrics were extracted for PSP locations, 3) models were developed in R relating forest attributes (average height, basal area, biomass) to LiDAR metrics, 4) models were applied to a larger LiDAR dataset, and 5) results were assessed. Models achieved R-squared values from 0.73-0.80 and error margins from 10-28% depending on the attribute. The document discusses limitations and conclusions.
Scripts have been developed to facilitate two-way exchange of geometric and hydraulic data between GRASS GIS and the Hec-RAS hydraulic model. The scripts allow extracting cross-sections from a DEM and exporting them to Hec-RAS via a standard text file format. Additional data like riverbanks and levees can be included. After hydraulic simulation in Hec-RAS, water level results can be imported back into GRASS GIS for flood risk mapping and analysis. The open source workflow provides an alternative to proprietary tools and allows processing of high-resolution data directly in GRASS for hydraulic modeling.
Free and open source software for remote sensing and GISNopphawanTamkuan
This document provides information on various free geospatial data sources and products available online. It summarizes datasets including aerial imagery, digital elevation models, Landsat, MODIS, Sentinel satellite imagery and products, OpenStreetMap, and other vector and raster data that can be used for applications such as agriculture, climate monitoring, disaster management and more. Many of the datasets are hosted by government agencies and scientific organizations looking to make Earth observation data openly available.
Mapping Toolbox provides tools for analyzing, visualizing, and mapping geographic data. It allows users to import vector and raster data formats, customize data through operations like subsetting and trimming, and perform geospatial analyses. The toolbox enables 2D and 3D map displays with imported data and base map layers. It offers functions for digital terrain analysis, geodesy calculations, map projections, and other geographic utilities.
This document provides a tutorial on using HEC-GeoRAS, ArcGIS, and HEC-RAS to create flood inundation maps for steady and unsteady flow conditions. It discusses the software and data requirements, and provides step-by-step instructions for preprocessing data in HEC-GeoRAS and ArcGIS, running HEC-RAS simulations, and postprocessing the results to create flood extent polygons. The tutorial demonstrates the full workflow for both 1D steady state and 1D unsteady simulations.
Precise Attitude Determination Using a Hexagonal GPS PlatformCSCJournals
In this paper, a method of precise attitude determination using GPS is proposed. We use a hexagonal antenna platform of 1 m diameter (called the wheel) and post-processing algorithms to calculate attitude, where we focus on yaw to prove the concept. The first part of the algorithm determines an initial absolute position using single point positioning. The second part involves double differencing (DD) the carrier phase measurements for the received GPS signals to determine relative positioning of the antennas on the wheel. The third part consists of Direct Computation Method (DCM) or Implicit Least Squares (ILS) algorithms which, given sufficiently accurate knowledge of the fixed body frame coordinates of the wheel, takes in relative positions of all the receivers and produces the attitude. Field testing results presented in this paper will show that an accuracy of 0.05 degrees in yaw can be achieved. The results will be compared with a theoretical error, which is shown by Monte Carlo simulation to be < 0.001 degrees. The improvement to the current state-of-the-art is that current methods require either very large baselines of several meters to achieve such accuracy or provide errors in yaw that are orders of magnitude greater.
Data Science Meetup: DGLARS and Homotopy LASSO for Regression ModelsColleen Farrelly
Short overview of two regression model extensions using differential geometry and homotopy continuation. Case study involves an open-source dataset that can be found on my ResearchGate page, along with the R code used in the analysis. Contains a short reference section for readers interested in learning more about the methods.
The concepts related of the New Model of River Adige, and especially an analysys of the existing OMS components ready and their interpretation on the basis of travel time approaches
This document outlines a project to develop a web-based application for visualizing and analyzing big agricultural data to help with crop management. The project aims to integrate geospatial and meteorological data sources with remote sensing imagery to monitor crop and rangeland conditions. Key objectives are to identify crop and grazing areas, incorporate climate/weather data, analyze the data convergence to assess crop/rangeland states, and identify areas needing close monitoring. The final product will be a web application allowing users to display layers, zoom/pan, search/query features, measure distances, and more to facilitate analysis of big data for improved crop and resource management.
The document discusses MapReduce programs for analyzing weather data. It describes:
1) The MapReduce framework which breaks jobs into map and reduce tasks to process large datasets in parallel across clusters.
2) A sample weather dataset from NOAA containing records with temperature and other weather readings from stations.
3) An example MapReduce program to find the maximum recorded temperature each year from the data using map tasks to extract temperatures and reduce tasks to find the yearly maximum values.
This document provides instructions for calculating vegetation indices from Landsat 5 TM and Landsat 7 ETM+ data using ArcGIS. It describes a multi-step process to: 1) reclassify Landsat digital number data to exclude null values, 2) convert Landsat 5 TM data to the Landsat 7 ETM+ format, 3) calculate radiance values, 4) calculate reflectance values using sun elevation angles and earth-sun distances, and 5) enforce positive reflectance values by setting negatives to zero. This allows vegetation indices to be accurately calculated from the reflectance data.
1. A single map can be analyzed using summary statistics, measurements of feature aspects, and buffers.
2. There are several ways to calculate summary statistics in ArcGIS including selection statistics, attribute table statistics, and summarization.
3. Measurements that can be made on geographic features include number, area, length, shape, fragmentation, and distance calculations.
2. The tool
- Entirely coded as an R package
- Optimised for large datasets with GRASS GIS (experimental)
- Example data at 0.5 degrees (~100 km), globally, for 24 GCMs and the SRES-A1B emission scenario, but any other data can be integrated
- Implemented using the raster, rgdal, sp, and maptools packages, so it is easy to handle GIS formats and export outputs
- Dissimilarity is calculated via two measures (CCAFS and Hallegatte); uncertainty is provided as the SD and CV among individual GCMs, but R is flexible
- Calculations can be done, and outputs generated, for any geographic region at any resolution
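The dissimilarity idea above can be illustrated with a minimal, package-agnostic sketch in base R. This is not the analogues package's own API (its functions and arguments are not shown on this slide); the monthly temperature data here are invented for illustration only:

```r
# Illustrative only: a simplified climate dissimilarity between a
# reference site and a set of candidate sites, using invented data.
set.seed(1)
ref   <- rnorm(12, mean = 25, sd = 3)              # 12 monthly means, reference site
sites <- matrix(rnorm(12 * 5, 25, 3), ncol = 12)   # 5 candidate sites, 12 months each

# Euclidean distance across months; smaller = more analogous climate
dissim <- apply(sites, 1, function(s) sqrt(sum((s - ref)^2)))

# In the package, this measure would be computed once per GCM, and
# uncertainty summarised as the SD and CV across GCMs; here we simply
# rank the candidate sites from most to least analogous.
ranking <- order(dissim)
```

The CCAFS measure additionally weights variables (e.g. by rainfall seasonality) before aggregating, but the ranking logic is the same.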
3. What do you need? Set up: just download and install
- R >= 2.13.0 (http://www.r-project.org) and the packages raster, sp, rgdal, maps, spgrass6, stringr, maptools, foreign, lattice, akima, plotrix, rimage, and XML
- GRASS GIS >= 6.4 (http://grass.fbk.eu/) (experimental)
- Quantum GIS >= 1.6 (http://www.qgis.org/) (optional)
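The package list above can be installed in one call; note that this mirrors the slide's era, and some of these packages (e.g. spgrass6, rimage, rgdal, maptools) have since been archived or retired on CRAN:

```r
# Sketch: installing the dependencies listed on this slide from CRAN.
# Requires a network connection; on modern R installations several of
# these packages are no longer available from CRAN.
install.packages(c("raster", "sp", "rgdal", "maps", "spgrass6",
                   "stringr", "maptools", "foreign", "lattice",
                   "akima", "plotrix", "rimage", "XML"))
```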
4. What do you need? Set up: just download and install the package code from http://code.google.com/p/ccafs-analogues/
7. At least one input variable, for a given area, with any time-step (from a whole year to daily)
8. Data that are uniform in spatial coverage (i.e. extent) and resolution
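These input requirements can be checked up front with the raster package before any dissimilarity is computed. A minimal sketch, assuming hypothetical file names (`current_tmean.nc`, `future_tmean.nc`) for two gridded temperature datasets:

```r
# Sketch: verify that two input datasets share the same extent and
# resolution, as slide 8 requires. File names are hypothetical.
library(raster)

current <- stack("current_tmean.nc")   # e.g. 12 monthly mean-temperature layers
future  <- stack("future_tmean.nc")    # same variable under a future scenario

# compareRaster() stops with an error on mismatch; res = TRUE also
# checks cell size, not just extent and row/column counts.
compareRaster(current, future, extent = TRUE, res = TRUE)
```

If the grids differ, `resample()` or `crop()` can be used to bring them onto a common extent and resolution before analysis.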