This document discusses using advanced technologies to more effectively utilize historic exploration data from the Van Deemen Gold Project. It summarizes the exploration history of the project and describes efforts to compile data from over 200 drill holes and more than 1,000 sample pulps. Statistical analysis of geochemical and color data from the samples was used to infer drill hole lithology and to build an implicit geologic model. Further work is recommended to better understand the relationship between alteration signatures and gold mineralization.
Red Hen Systems develops hardware and software for airborne and ground-based emissions detection surveys. They surveyed parts of Pennsylvania in 2014 using a helicopter equipped with sensors including a gas analyzer. Their goal was to quantify potential methane emissions from abandoned oil/gas wells and pipelines. They developed an algorithm to classify methane concentration points without user input. It extracts features from concentration data and uses techniques like wavelet analysis and principal component analysis. It defines significant points of interest, determines boundaries between concentrations, and reduces cluster overlap. Their source-finding method uses sensor locations and wind direction to project emission sources onto pipeline and well pad infrastructure, locating sources to within 5 feet. The presentation concluded by noting the method was adopted as an industry standard.
Use of UAV Assessment in Navajo Nation Abandoned Uranium Mine Cleanup (UAS Colorado)
Presentation by Jim Oliver of Crestone Environmental, LLC on uranium mine cleanup efforts for the Navajo Nation. Presented at the September Rocky Mountain UAS Professionals Meetup at the Wings over the Rockies Air & Space Museum in Denver, Colorado.
The City of Oklahoma City established its Primary Control Network in 1990 based on the NAD83 (1986) datum. The network originally included around 450 control monuments across the city. Over the past 12 years, survey crews have used and tested the network extensively and found that positions checked well within network tolerances. By 2005, approximately 197 control points remained. From 2005 to 2008, the city conducted GPS surveys on the control points and processed the results through OPUS to compare against the original published positions. The horizontal coordinates matched closely, but vertical discrepancies of up to 1 foot were found. Additional GPS surveys from 2006 to 2008 helped further refine the positions of the remaining 197 control points.
Fugro Survey performs geophysical surveys and site surveys in Norwegian waters to identify hazards for offshore drilling. They use seismic data to interpret shallow soils and identify features like shallow gas. An amplitude anomaly workflow in ArcGIS is used to standardize mapping and visualizing interpreted seismic amplitude anomalies from site surveys in a geodatabase. This allows the data to be easily incorporated into reports, presentations, web maps, and 3D visualizations.
Geocap Seismic/Oil & Gas for ArcGIS - Oil and Gas seminar October 10th (Geodata AS)
This document describes Geocap Seismic/Oil & Gas for ArcGIS software. It can be used by geoscientists for exploration and field development planning, creating high quality maps, calculating oil and gas volumes, assessing subsurface risk, and input to well planning and production forecasting. The software includes modules for seismic visualization and interpretation, and modeling petrophysical properties to calculate oil and gas volumes initially in place. It integrates seismic data, wells, and interpretations in ArcGIS for geoscience analysis and decision making.
Andrew Sitek has over 37 years of experience in geophysics. He has extensive experience processing and quality controlling 2D and 3D seismic data using various software, including ProMAX, Focus, and Geodepth. He has also worked on depth imaging, interpretation, and velocity modeling. Sitek has worked for various oil and gas companies as a geophysicist and senior geophysicist, and has also operated as an independent consultant. He is fluent in Russian and Polish and holds an M.Sc. in Applied Geophysics from the University of Mining and Metallurgy in Krakow, Poland.
Use FME To Efficiently Create National-Scale Vector Contours From High-Resolu... (Safe Software)
TerraLogik was contracted by NAV CANADA to generate a 1:500,000-scale topographic base from 1:50,000-scale data for use in aeronautical navigation charts. TerraLogik used the raster analysis tools in FME, coupled with custom Python code and the GDAL open-source library, to create generalized contours at 1:500,000 scale from 1:50,000-scale DEMs covering Canada and the Northern United States. See how the WorkspaceRunner was used to perform this process, reducing processing time from 24 hours to 1 hour per chart.
TxDOT has used GPS since 1984 and now operates 182 reference stations across Texas to support high-precision applications like surveying, construction, and photogrammetry. The agency provides real-time kinematic corrections from these stations to help engineers, contractors and others perform positioning tasks. TxDOT also collects static data from the stations for control networks and shares data with other organizations like NGS and NOAA.
HRSC Technologies: Using MiHpt for Rapid In-Situ Contaminant and Hydrostratig... (ASC-HRSC)
This document discusses high-resolution site characterization (HRSC) technologies for rapid in-situ contaminant and hydrostratigraphic characterization. It provides an overview of various direct push technologies used for HRSC including the hydraulic profiling tool (HPT), membrane interface probe (MIP), optical image profiler (OIP), and their applications. Combining the MIP and HPT into a single tool called MiHpt allows for continuous, real-time profiling of hydrocarbons and hydrogeologic properties. Example projects demonstrating HRSC technologies are described for a former dry cleaner site in Monterey, California and a U.S. Army site on Kwajalein Atoll.
This document contains information about an individual's education and work experience. It includes:
1) Details of an M.Sc. in Geophysics and a B.Sc. in Mining Engineering, along with the individual's work as a senior exploration geophysicist at SKCE on various 3D land seismic projects.
2) A list of the individual's engineering projects with SKCE, including data acquisition, processing, and interpretation projects for 2D and 3D land seismic surveys, downhole surveys, and refraction surveys.
3) The individual's computer skills including experience with seismic processing, modeling, and interpretation software, as well as Microsoft Office, and their language skills studying subjects in English and passing intensive
HRSC Techniques: High-Resolution Hydrogeologic Characterization
Presentation given at the Remediation Workshops in:
* Oakland, CA - January 25, 2017
* Sacramento, CA - January 26, 2017
* Long Beach, CA - February 7, 2017
* Los Angeles (Rosemead), CA - February 8, 2017
The document summarizes the Hydrology Project - II being implemented by the Central Ground Water Board. It provides details on project costs, expenditures, activities completed and in progress. Key points:
- Total project cost has increased from original Rs. 27.95 crore to Rs. 66.32 crore as per revised cost estimates. Expenditure so far is Rs. 38.45 crore against a target of Rs. 51.64 crore.
- Activities include upgrading hardware, procuring software, conducting training programs, pilot aquifer mapping studies, piezometer construction, and developing a decision support system.
- The pilot aquifer mapping studies have characterized aquifer systems in six areas and
VIIe - Global Soil Organic Carbon Sequestration Potential Map - GSOCseq (Soils FAO-GSP)
The document discusses developing a global soil organic carbon sequestration potential map (GSOCseq) using two approaches. The top-down approach uses climate change scenarios to project SOC stocks over time without and with sustainable soil management. The bottom-up approach uses process modeling calibrated with soil profile observations to estimate baseline SOC stocks and potential under different scenarios. Preliminary results show potential SOC sequestration ranges from 60-245 petagrams for RCP2.6 and 82-325 petagrams for RCP8.5 by 2100 depending on management practices. The top-down approach uses empirical relationships between management factors and SOC stock changes to assess mitigation potential from sustainable soil practices.
2017 ASPRS-RMR Big Data Track: Using NASA's AppEEARS to Slice and Dice Big Ea... (GIS in the Rockies)
This document summarizes the AppEEARS tool, which was developed by NASA's LP DAAC to allow users to easily subset, reformat, and analyze large Earth observation datasets. AppEEARS provides interactive subsetting of spatial, temporal, and variable subsets of datasets. It outputs the subsets in common file formats like GeoTIFF and NetCDF while maintaining metadata and provenance. The document describes several use cases where researchers were able to efficiently extract relevant data for studies on vegetation productivity, population changes, snow zones, and wildfire impacts using AppEEARS. It highlights how the tool eliminates much of the data processing workload and enables more focus on analysis.
Development of a soil carbon map for the United Republic of Tanzania (ExternalEvents)
This presentation was given during the Workshop on Soil Carbon Mapping of the Global Soil Partnership (GSP), held at FAO headquarters on 23 November 2016. The presentation was made by Bas Kempen from ISRIC, the Netherlands.
The document summarizes the creation of a GIS system to coordinate environmental cleanup and property transfer efforts at the Volunteer Army Ammunition Plant in Chattanooga, TN. The plant produced over 3 billion pounds of TNT from 1942-1977, leaving widespread soil and groundwater contamination. A multi-agency team was established in 2000 but lacked a unified information system. In 2002, the Tennessee Department of Environment and Conservation began digitizing historical maps and aerial photos into a GIS to identify areas of concern and facilitate data sharing between agencies. This enhanced remediation efforts and allowed over 285 acres to be transferred for redevelopment, saving an estimated $300,000 compared to traditional regulatory processes.
The integration between data and conventional monitoring system in order to u... (Lanteri Luca)
Arpa Piemonte manages an inventory of landslides in the Piedmont region of Italy called SIFRAP. They are updating the inventory by integrating satellite data, conventional monitoring data, and other sources. Satellite interferometry data from 1992-2010 covering over 25,000 km2 was processed and identified over 2.4 million measurement points. This data helped detect 35 new landslides, update boundaries of existing landslides, and determine activity states. Monitoring data from over 300 sites was also incorporated. The integrated data improves understanding of landslide kinematics and activity. The updated inventory provides important information for land use planning and hazard management.
This document summarizes a project to acquire high-resolution aerial imagery for the entire state of Texas through a buy-up of the USDA's National Agriculture Imagery Program (NAIP). Key points:
- The state of Texas, through the Texas Natural Resources Information System (TNRIS), contracted with North West Geomatics to acquire 0.5m resolution imagery for the entire 275,000 square mile state by augmenting the standard 1m resolution NAIP data.
- Flights were conducted in summer 2008 and winter 2009 using a Leica ADS40 sensor, providing 0.75m resolution multispectral and 0.375m panchromatic imagery that was resampled to meet NAIP
Available Software Tools for Land Use GHG Inventories and Project Carbon Bala... (World Agroforestry, ICRAF)
This document discusses several existing software tools that can be used for land use greenhouse gas (GHG) inventories and project carbon balance verification:
- COMET-VR is a web-based decision support tool that provides rapid assessments of GHG impacts from land use and management scenarios.
- ALU (Agriculture and Land Use) is a national GHG inventory software that supports reporting to the UNFCCC and guides users through the inventory process using IPCC methods.
- GEFSOC is a system for regional/national assessments of soil carbon dynamics that can address a variety of issues at large scales with flexibility. It links with other data and models.
This document discusses comparisons of vertical profile measurements of greenhouse gases from an intensive campaign using AirCore samplers at Sodankylä, Finland. It finds that AirCore is a cost-effective tool for stratospheric measurements, with uncertainties of 0.15-0.2 ppm for CO2 and 4-7 ppb for CO. Differences between AirCore profiles were mostly within these uncertainties, though tubing coatings could cause larger CO2 differences. Altitude registration uncertainty was typically around 2 hPa. Future work includes a campaign in Kiruna, Sweden to further improve accuracy of CO measurements and altitude registration.
The document summarizes seismic acquisition and processing parameters for a marine seismic survey conducted off the coast of Newfoundland, Canada between July and October 2015. The survey covered 14,599 km using a 24-bit streamer towed behind the vessel Atlantic Explorer. Initial processing was done by PGS and included noise attenuation, wavefield separation, and extrapolation. Further processing will be done by Arcis to produce outputs such as pre-stack migrated gathers, raw and processed migrations, velocities, and angle stacks.
Case study cumbria_university_Utility survey (Nick Blenkarn)
The survey aimed to map all detectable buried utility services and update building footprints at the University of Cumbria's campuses in Lancaster and Carlisle. The Severn Partnership Utility team spent 4 weeks surveying Lancaster and 2 weeks in Carlisle using techniques like GPR and electromagnetic location to detect pipes, cables, and culverts. The final deliverables included mapped utilities overlaid on a topographic base map in AutoCAD, a utility detection report, and a manhole schedule detailing methods, data quality, and recommendations.
The document summarizes the status of the Galileo satellite navigation system operated by the European Union and European Space Agency. It describes the constellation plans including the initial and full operational capabilities. It provides details on the satellites, ground infrastructure, atomic clocks, signals, services, and system performance. The status of the initial in-orbit validation satellites and deployment of the full operational capability satellites is outlined.
The document summarizes the use of GIS in the archaeological excavations for the N18 Oranmore to Gort road project in Ireland. Key points:
- Eachtra Archaeological Projects was commissioned to provide archaeological services for the project in 4 phases, including surveys, test excavations, full excavations, and post-excavation analysis.
- GIS was used throughout the project from the planning phase through excavation, post-excavation analysis, and dissemination of results. A relational database and geo-database were created to store all excavation data.
- Benefits of the GIS system included improved planning, on-site excavation recording, collaboration in post-excav
Mark Thomas_A digital soil mapping approach for regolith thickness in the com... (TERN Australia)
This document summarizes research on modeling regolith depth in the Mt Lofty Ranges of South Australia. Regolith includes all weathered material above bedrock and plays an important role in hydrology, biology, energy transfer, biogeochemistry, land use, and more. While some regolith maps exist, coverage is limited. The researchers collected over 700 depth measurements and used environmental data like topography, climate, and geology in a regression model to predict regolith depth across the 128,000 hectare study area. Their goal is to develop a consistent national regolith map to support biophysical modeling. Future work includes testing the approach in other regions and integrating results to create a comprehensive national map.
This document discusses geoscience division services that provide analysis of core samples using cutting-edge technology. The analysis includes programmed pyrolysis to determine hydrocarbons, organic carbon, and thermal maturity. X-ray diffraction is used to determine mineralogy, brittleness, and formation tops. X-ray fluorescence provides elemental composition. This precise data helps with exploration by identifying pay zones and reservoirs, and with production by optimizing well placement and completions. The division produces high-quality data faster than conventional labs using standardized procedures and experienced professionals. Case studies show how the analysis helped clients by locating unanticipated pay zones and reservoirs.
The document summarizes improvements made by the US Census Bureau to the spatial accuracy of geographic data products over time. It describes how the Topologically Integrated Geographic Encoding and Referencing (TIGER) system was initially developed for the 1990 Census and relied on relative locations. Technological advances in the late 1990s/early 2000s allowed for enhanced geocoding using GPS. The MAF/TIGER Enhancement Program aimed to improve spatial accuracy from 2002-2008. Address canvassing in 2009 further updated the data through field collection. The 2010 Census then released updated TIGER/Line shapefiles and other geographic reference products incorporating these improvements.
Monitoring measuring and verification, Gonzalo Zambrano, University of Alberta (Global CCS Institute)
This document summarizes Gonzalo Zambrano's presentation on monitoring, measuring and verification (MMV) for CO2 storage projects. It discusses the Aquistore project, which aims to demonstrate safe CO2 storage in deep saline formations in Saskatchewan, Canada. The Aquistore project involves injecting CO2 into a saline formation over 3 km underground and uses various surface and downhole techniques to monitor the CO2 plume and ensure containment. These include 3D and time-lapse seismic surveys, a permanent seismic array, tiltmeters, GPS, and soil gas and groundwater monitoring.
The DART project aims to improve the detection of archaeological residues using remote sensing techniques. It will analyze factors that influence contrasts between residues and surrounding soil over time and space. Through data collection, modeling, and tool development, DART seeks to determine optimal conditions and sensors for detecting residues. The consortium includes academic, heritage, and industry partners who will work on data analysis, decision support tools, and project evaluation over 3 years with a budget of £800k. The goal is to strengthen remote sensing approaches and heritage management.
Item 5: Introduction to the Global Spectral Calibration Library (Soils FAO-GSP)
First plenary meeting on spectroscopy
Virtual | 23 - 25 September 2020
Lucrezia Caon, FAO
Richard Ferguson, USDA, United States of America
Fenny van Egmond, ISRIC, Netherlands
This document summarizes a study that tested objective and subjective methods for measuring land area, soil fertility, and crop production in Ethiopia. The study involved collecting over 3,700 soil samples from 1,799 fields across 85 areas, which were tested using several methods including spectral and conventional analysis. The objective data showed variation in soil properties within and between areas. Comparisons found farmers' subjective assessments of soil quality did not capture the full variation and sometimes overestimated quality. The results suggest spectral soil analysis could improve soil data collection but challenges include cost, lab capacity, and scaling to different regions and crop cycles.
De erfgoedradar: kansen voor vrijwilligers (The heritage radar: opportunities for volunteers) by Verbeek B., Seinen P., Werkgr... (Onroerend Erfgoed)
Study day, 13 June 2018: the role of geophysical research in the archaeology process.
Presentation of the lecture "De erfgoedradar: kansen voor vrijwilligers" (The heritage radar: opportunities for volunteers) by Verbeek B., Seinen P., Werkgroep Innovatieve Meettechnieken tbv Archeologie (WIMA) (Netherlands).
On February 12, 2013, the Canada Mining Innovation Council held its 2nd Annual Signature Event, a mining conference bringing representatives from industry, government, academia, and other sectors together in Toronto to discuss the role of innovation in the industry's future. The VP and Chief Geologist of Global Exploration at Barrick, Francois Robert, and the Research Director for CMIC, Alan Galley, shared the plans, programs and projects being carried out by CMIC's Exploration Innovation Consortium.
The document discusses the need for a shared earth model to manage big data from subsea and subsurface exploration. It notes requirements like managing large volumes of seismic, bathymetry and other data; being open, scalable, and cost-efficient; and enabling sharing and collaboration. A shared earth model is needed for exploration, development, production planning, environmental management, and delineating maritime boundaries. It proposes using ArcGIS and Geocap tools to integrate subsurface data and interpretations with GIS, allowing visualization and sharing of big data while avoiding data duplication.
Seeing the Unseen - Improving aerial archaeological prospection (davstott)
The document summarizes the DART project which aims to better understand how archaeological features interact with their environment to improve detection techniques. It discusses using spectroradiometry to measure spectral profiles across archaeological linear features over time. Preliminary flights captured imagery using sensors like CASI and thermal. Challenges included drought conditions reducing vegetation marks. Further work involves analyzing spectral data to identify diagnostic features and building a knowledge system to predict contrast in new and archive imagery.
Using gravity to target gold at Tampia Hill, Western Australia (Kenex Ltd)
The discovery of the Tampia Hill orogenic gold deposit in the wheatbelt of Western Australia has sparked interest in this under-explored region of the state. The deposit is hosted within a granulite facies greenstone belt, with mineralisation mostly hosted in mafic gneiss, which has been intruded by undeformed and unmetamorphosed granite.
A lack of outcrop in the project area has meant that geophysics has been vital for interpretation of the geology. A recent gravity and magnetic survey has allowed the most detailed interpretation of the underlying lithology and structures to date, and has highlighted previously unknown areas of mafic gneiss with a signature similar to that at Tampia Hill.
In order to extract the most useful information from the survey, spatial statistical analyses were conducted on the gravity survey data. The analyses map features within the gravity data that can be used to identify areas of known gold mineralisation. The results confirm that the gravity data not only provides critical geological information, but will also allow the identification of high-priority targets for future exploration using spatial data modelling techniques.
This document summarizes three case studies that used remote sensing and GIS techniques to analyze land use and land cover change over time. The first case study analyzed changes from 1990-2010 in Hawalbagh, India using Landsat imagery. It found increases in built-up land and decreases in barren land. The second studied coastal Egypt from 1987-2001 using Landsat, identifying 8 land cover classes. The third examined Simly watershed, Pakistan from 1992-2012 using Landsat and SPOT data, finding increases in agriculture and decreases in vegetation. All three used supervised classification and post-classification comparison to analyze land use/cover changes.
The document summarizes Michael Ndhlovu's attachment activities at the Ayrshire Mine in Banket, Zimbabwe. The mine conducts both underground and surface gold mining. Michael assisted with various exploration activities including geological mapping, soil geochemistry, trenching, diamond drilling, percussion drilling, core logging and sampling, open pit mining, and Datamine training. He gained experience in project management, fieldwork skills, and applying geology in an industrial setting while working with environmental responsibility.
The National Soil Information System of Sudan (ExternalEvents)
The document summarizes the steps taken to establish the National Soil Information System of Sudan (SUSIS) from 2013 to 2016. It involved data collection from archives and field work, data entry and harmonization, recruiting consultants, purchasing equipment, setting up a geoportal, and training on open source software and data. Various maps were produced, including soil maps, property maps on texture, carbon, and threats. Field data was merged with existing data to create national soil maps and identify gaps for further data collection and analysis. The system aims to make soil data accessible online through a geoportal.
Similar to Using Advanced Technologies to More Effectively Utilize Historic Exploration Data
Taking AI to the Next Level in Manufacturing.pdf (ssuserfac0301)
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectors (DianaGray10)
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations, for seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
The Department of Veteran Affairs (VA) invited Taylor Paschal, Knowledge & Information Management Consultant at Enterprise Knowledge, to speak at a Knowledge Management Lunch and Learn hosted on June 12, 2024. All Office of Administration staff were invited to attend and received professional development credit for participating in the voluntary event.
The objectives of the Lunch and Learn presentation were to:
- Review what KM ‘is’ and ‘isn’t’
- Understand the value of KM and the benefits of engaging
- Define and reflect on your “what’s in it for me?”
- Share actionable ways you can participate in Knowledge Capture & Transfer
High performance Serverless Java on AWS - GoTo Amsterdam 2024 (Vadym Kazulkin)
Java has been one of the most popular programming languages for many years, but it used to have a hard time in the Serverless community. Java is known for its high cold start times and high memory footprint compared to other programming languages like Node.js and Python. In this talk I'll look at the general best practices and techniques we can use to decrease memory consumption and cold start times for Java Serverless development on AWS, including GraalVM (Native Image) and AWS's own offering SnapStart, based on Firecracker microVM snapshot and restore and CRaC (Coordinated Restore at Checkpoint) runtime hooks. I'll also provide a lot of benchmarking on Lambda functions, trying out various deployment package sizes, Lambda memory settings, Java compilation options, and HTTP (a)synchronous clients, and measure their impact on cold and warm start times.
Skybuffer SAM4U tool for SAP license adoption (Tatiana Kojar)
Manage and optimize your license adoption and consumption with SAM4U, a free SAP software asset management tool for customers.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application... (Alex Pruden)
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
Fueling AI with Great Data with Airbyte Webinar (Zilliz)
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
"Scaling RAG Applications to serve millions of users", Kevin GoedeckeFwdays
How we managed to grow and scale a RAG application from zero to thousands of users in 7 months. Lessons from technical challenges around managing high load for LLMs, RAGs and Vector databases.
Northern Engraving | Modern Metal Trim, Nameplates and Appliance Panels (Northern Engraving)
What began over 115 years ago as a supplier of precision gauges to the automotive industry has evolved into being an industry leader in the manufacture of product branding, automotive cockpit trim and decorative appliance trim. Value-added services include in-house Design, Engineering, Program Management, Test Lab and Tool Shops.
Conversational agents, or chatbots, are increasingly used to access all sorts of services using natural language. While open-domain chatbots - like ChatGPT - can converse on any topic, task-oriented chatbots - the focus of this paper - are designed for specific tasks, like booking a flight, obtaining customer support, or setting an appointment. Like any other software, task-oriented chatbots need to be properly tested, usually by defining and executing test scenarios (i.e., sequences of user-chatbot interactions). However, there is currently a lack of methods to quantify the completeness and strength of such test scenarios, which can lead to low-quality tests, and hence to buggy chatbots.
To fill this gap, we propose adapting mutation testing (MuT) for task-oriented chatbots. To this end, we introduce a set of mutation operators that emulate faults in chatbot designs, an architecture that enables MuT on chatbots built using heterogeneous technologies, and a practical realisation as an Eclipse plugin. Moreover, we evaluate the applicability, effectiveness and efficiency of our approach on open-source chatbots, with promising results.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor IvaniukFwdays
At this talk we will discuss DDoS protection tools and best practices, discuss network architectures, and look at what AWS has to offer. Also, we will look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022. We'll see what techniques helped to keep the web resources available for Ukrainians and how AWS improved DDoS protection for all customers based on the Ukraine experience.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency (ScyllaDB)
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
This talk will cover ScyllaDB Architecture from the cluster-level view and zoom in on data distribution and internal node architecture. In the process, we will learn the secret sauce used to get ScyllaDB's high availability and superior performance. We will also touch on the upcoming changes to ScyllaDB architecture, moving to strongly consistent metadata and tablets.
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc... (DanBrown980551)
This LF Energy webinar took place June 20, 2024. It featured:
-Alex Thornton, LF Energy
-Hallie Cramer, Google
-Daniel Roesler, UtilityAPI
-Henry Richardson, WattTime
In response to the urgency and scale required to effectively address climate change, open source solutions offer significant potential for driving innovation and progress. Currently, there is a growing demand for standardization and interoperability in energy data and modeling. Open source standards and specifications within the energy sector can also alleviate challenges associated with data fragmentation, transparency, and accessibility. At the same time, it is crucial to consider privacy and security concerns throughout the development of open source platforms.
This webinar will delve into the motivations behind establishing LF Energy’s Carbon Data Specification Consortium. It will provide an overview of the draft specifications and the ongoing progress made by the respective working groups.
Three primary specifications will be discussed:
-Discovery and client registration, emphasizing transparent processes and secure and private access
-Customer data, centering around customer tariffs, bills, energy usage, and full consumption disclosure
-Power systems data, focusing on grid data, inclusive of transmission and distribution networks, generation, intergrid power flows, and market settlement data
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an... (Jason Yip)
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
"Choosing proper type of scaling", Olena SyrotaFwdays
Imagine an IoT processing system that is already quite mature and production-ready, whose client coverage is growing, and for which scaling and performance are life-and-death questions. The system has Redis, MongoDB, and stream processing based on ksqlDB. In this talk, we will first analyze scaling approaches and then select the proper ones for our system.
3. How many of you have used …
• Brunton
• Planimeter
• Plane Table/Alidade
• Hip Chain (Topofil)
• Light Table
• GPS
• Portable XRF
• Leapfrog
• Mobile Field Device
4. What do these items have in common?
• They are all tools a geologist has used in the past 50 years to gain insight into the questions:
• What is the tonnage/grade of a deposit?
• Is it economic?
5. Economics defined by
• Price of gold
• Total mine, leach and recovery costs
• Gold recovery
• Government regulations
• Permitting
• Environmental
• Socio-economic considerations
7. Times have changed
• Drilling costs
• Permitting issues
• Geochemical Analytical methods
• Statistical Analysis
• Modeling
• Lots of data – much of it incomplete
10. Geology
AREA IV - Gold associated with a brecciated, low-sulphidation detachment fault system.
11. Mineralization
Gold mineralization at the Van Deemen project occurs primarily in gently-dipping zones of quartz-sericite-hematite-pyrite clay alteration of brecciated Precambrian gneiss. Mineralization consists of very fine free gold with silica.
12. Gold Zones
• 202+ drill holes
• 3 distinct gold zones
• Brecciated crystalline assemblage along the Van Deemen Fault
16. What’s Left in 2011
• 2 detailed 1”=100’ geologic maps from Fisher-Watt
• 5 cross sections from Fisher-Watt
• 140 Drill hole locations
• Drill logs from 1987 drilling by Fisher-Watt
• Assay Certificates for Au only for all drilling
• Amselco data
• Miscellaneous reports and maps
• Historic resource of 32,000 oz Au
17. 2011 – 2013 International Star, Inc
• Data compilation
• Field check of drill hole locations using survey-grade GPS
• Check resource – 140 holes
Resource result – inferred resource of 1,294,442 tons at 0.034 oz/t Au for 44,011 contained ounces – confirmed the historical resource.
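(As a quick check, contained ounces = tons × grade: 1,294,442 tons × 0.034 oz/t Au ≈ 44,011 oz, matching the stated figure.)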
18. 2013 – 2017 International Star, Inc
• 2013 – Obtain over 1000 pulps from 1986 drilling and surface sampling
• Reassay pulps – confirm Au values
• Bulk sample for met testing
• 2015 – obtain additional information from AZGS archives including 44 hole locations
• Resource with known data – 202 drill holes
• Engage mine engineer – preliminary mine cost
• Is it economic? – at $1200 Au, yes
• Commissioned SEC Guide 7 technical report
20. 2017 – Current Newton and Boyle
• Incomplete technical report
• In-situ resource has increased contained ounces to 97,000 oz. No geology.
• Deposit inside ACEC area, which requires a mine plan of operations for any additional drilling
• Missing data for 100 Kunkes holes and an unknown number drilled by Red Dog and Frisco
• No drill logs for 1986 drilling (55 holes)
• 5 cross sections
• Over 1000 pulps from 1986 drilling and surface sampling
21. Moving Forward
• Finish technical report
• Find a way to include geology in the resource
– Use drill hole pulps to assign lithologic signatures to samples
– Use additional information to create geologic model
– Assign Munsell color signature to pulps
– Look for Au signature
22. 1986 DDH Pulp Analysis
• Pulps were rebagged prior to analysis
• 932 samples + standards and blanks analyzed using an Olympus Delta handheld XRF in Geochem mode
• 38 elements analyzed
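A natural first step with a batch like this is to confirm that the inserted standards and blanks behaved. Below is a minimal QC sketch in Python, assuming the XRF readings were exported to a CSV with a sample_type column; the file name, column names, certified values, and detection limits are all hypothetical placeholders, not values from the project.

```python
# Minimal QC sketch for a handheld-XRF batch with inserted standards and
# blanks. All names and numbers below are assumed placeholders.
import pandas as pd

df = pd.read_csv("vandeemen_pulps_xrf.csv")          # hypothetical export

standards = df[df["sample_type"] == "standard"]
blanks = df[df["sample_type"] == "blank"]

# Certified concentrations (ppm) for the control standard -- assumed values.
certified = {"Fe": 41200.0, "Cu": 520.0, "Zn": 310.0}
for element, value in certified.items():
    drift = (standards[element] - value).abs() / value
    n_bad = int((drift > 0.10).sum())                # flag >10% drift
    if n_bad:
        print(f"{element}: {n_bad} standard readings exceed 10% drift")

# Blanks should sit near the detection limit -- assumed limits (ppm).
detection_limit = {"Fe": 50.0, "Cu": 5.0, "Zn": 5.0}
for element, limit in detection_limit.items():
    n_high = int((blanks[element] > 3 * limit).sum())
    if n_high:
        print(f"{element}: {n_high} blank readings above 3x detection limit")
```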
23. Munsell Color Assignment
• Took a picture of each pulp sample with a smartphone
• Used the Soil Analysis Pro app to assign a color to the sample
• Created a database with DDH info + color
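For readers who want to reproduce the color-assignment step without the app, here is a rough sketch of the same idea: average the RGB of a central crop of each pulp photo and snap it to the nearest chip in a small Munsell lookup table. The chip RGB values below are illustrative only, not authoritative conversions, and matching in raw RGB is a crude proxy for a perceptual color space.

```python
# Rough stand-in for the app's color assignment. Chip values and the photo
# file name are hypothetical.
import numpy as np
from PIL import Image

MUNSELL_RGB = {                      # hypothetical chip subset
    "10R 4/6": (146, 78, 66),
    "5YR 5/6": (168, 112, 75),
    "10YR 6/4": (181, 147, 109),
    "2.5Y 7/2": (188, 174, 151),
}

def average_rgb(path, crop_frac=0.5):
    """Mean RGB over a central crop, avoiding bag edges and shadows."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    dx, dy = int(w * crop_frac / 2), int(h * crop_frac / 2)
    crop = img.crop((w // 2 - dx, h // 2 - dy, w // 2 + dx, h // 2 + dy))
    return np.asarray(crop).reshape(-1, 3).mean(axis=0)

def nearest_munsell(rgb):
    """Return the chip name closest to rgb by Euclidean distance."""
    names, chips = zip(*MUNSELL_RGB.items())
    dists = np.linalg.norm(np.array(chips, dtype=float) - rgb, axis=1)
    return names[int(np.argmin(dists))]

print(nearest_munsell(average_rgb("pulp_VD-001.jpg")))   # hypothetical photo
```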
28. Statistical Analysis – Discriminant Analysis
The discriminant analysis technique creates a set of rules to assign a sample to one of a set of groups. To derive these rules, a training dataset is used for which the group memberships are known ahead of time.
• 14 drill holes with lithology from cross sections – the training set
• 289 lithology values
• 4 lith classifications: Qt (3 – eliminated), Tba (8), Pca (211), Pcs (67)
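The analysis was run in ioGAS; as a rough illustration of the same workflow, here is a sketch using scikit-learn's linear discriminant analysis as a stand-in. The CSV name and column names are hypothetical, and the element list is assumed to be the 38 XRF channels.

```python
# Sketch of the classification step: learn group rules from intervals of
# known lithology, then assign the unknowns. Names are placeholders.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

df = pd.read_csv("vandeemen_pulps_xrf.csv")
elements = [c for c in df.columns if c not in ("sample_id", "hole_id", "lith")]

# Training set: the 289 intervals with known lithology; Qt dropped
# because only 3 samples carried that label.
train = df[df["lith"].isin(["Tba", "Pca", "Pcs"])]
unknown = df[df["lith"].isna()]

lda = LinearDiscriminantAnalysis()
lda.fit(train[elements], train["lith"])              # learn the group rules

# Assign each remaining sample to Tba, Pca, or Pcs.
df.loc[unknown.index, "lith_pred"] = lda.predict(unknown[elements])
print(df["lith_pred"].value_counts())
```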
30. Discriminant Analysis – Results
ioGAS classified 640 samples with unknown lithologies into 3 groups based on the training set parameters:
• Tba – 239 items
• Pca – 145 items
• Pcs – 256 items
33. Discriminant Analysis – Check Results
Checked against the known database of lithologies and the Munsell color designations:
• The discriminant function only classifies against groupings it knows
• Several holes showed discrepancies – in Area II near the QMP intrusive
• Area I Pca/Pcs order reversed
• Added Qal values from the Munsell color chart
• 32 drill holes added to the database
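One way to quantify this kind of check, continuing the hypothetical scikit-learn sketch above (reusing its df and elements), is to score out-of-fold predictions on the intervals whose lithology is known and inspect the confusion matrix:

```python
# Continuation of the LDA sketch: out-of-fold predictions on the known
# intervals show where the classifier disagrees with the logged lithology.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import cross_val_predict

known = df[df["lith"].isin(["Tba", "Pca", "Pcs"])]
pred = cross_val_predict(LinearDiscriminantAnalysis(),
                         known[elements], known["lith"], cv=5)

print(confusion_matrix(known["lith"], pred, labels=["Tba", "Pca", "Pcs"]))
print(classification_report(known["lith"], pred))
```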
34. Implicit Modeling of Geology
• Surface geology, DDH lithology
• Using Leapfrog
• Added in cross sections
37. What’s Next
• See if color and lithologic signature can be used to predict Au mineralization
• Alteration signatures
• Update resource to include geologic domains
• Preliminary pit design
38. Conclusions
• Try to find as much information as possible so you don’t have to regenerate it
• Statistical techniques need to follow standard practice
• Check your data to confirm results
• Implicit modeling needed the addition of cross sections to refine the model
These techniques provide tools to aid interpretation; they are not a replacement for boot leather on the ground – field observation and understanding of the geology and mineralization of the deposit.
39. Contact Information
Barbara Carroll, CPG
GeoGRAFX GIS Services
1790 E. River Rd., Suite 213
Tucson, AZ 85718
Tel. 520.275.6173
bcarroll@geografxworld.com
Editor's Notes
Good afternoon. First off, I’d like to thank the co-authors, Clark Arnold and Steve Van Nort for their kind assistance and work that has gone into the Van Deemen project.
The topic of today's talk, as you can see, is Using Advanced Technologies to More Effectively Utilize Historic Exploration Data.
We’ll touch briefly on techniques that were commonly used in the 80s on a project and then fast forward to what we can do with that same information today.
But before we begin, I’d like to ask you all a couple of questions….…
So – raise your hands – how many of you have used a Brunton
Planimeter
Plane Table/Alidade
Hip Chain/Topofil
Light Table
GPS
Portable XRF
Leapfrog
Mobile field device – like your smartphone – to collect strike/dip readings, or for field mapping
OK, that gives me a good idea of where we all are.
And what do these items have in common???
They are all tools a geologist has used in the past 50 years or so to gain insight into the question…
What is the tonnage/grade of a deposit?
And is it economic?
Normally you'd define economics, in our case, by the price of gold, the mining costs, and % recovery. But these days that's just part of the story; you also have to consider government regulations, permitting, access, environmental concerns, and socioeconomic considerations.
Technology is changing how we work – this is a screen grab of my phone showing historic surface samples anomalous in Au as colored dots, so I can see the geochem values at a specific location. Or I can turn on another layer with a scan of a geologic map, walk up to the outcrop, and actually read the notes from the geologist who had been on the ground in the 80s. Or I can send claim location information from my office to a crew while they are still in the field.
But I digress… I think we all have stories about how we are using the current technology
Thirty years ago, costs for reverse circulation drilling were in the range of $8.50/foot, so 1,000 feet would cost $8,500 for drilling alone. Times have changed. It can now cost over $35,000 to drill that same footage, assuming there are no permitting issues.
Multi-element trace geochemical analysis is a standard tool in detailed and regional geochemical exploration. In the past, geologists would treat the top 10% of elemental analyses as anomalous. Currently, affordable trace element geochemistry provides the necessary tools to interpret bedrock geology and mineralization systems, yet a lot of 46-element ICP data resides in company files with limited interpretation.
Modern computational tools can rapidly process 3D geological datasets and assist in generating 3D geological models. Implicit modelling generates geological models directly from drill-hole data, using mathematical interpolation functions to generate 3D isosurfaces instead of manually linking hand-digitized 2D cross-sections.
Many deposits have seen some historic development, whether it is surface sampling, geophysics, drilling, or production. Most current historic data sets are incomplete, with missing drill hole locations, drill logs, assay certificates, level plans or cross sections. How do you incorporate the past work on a project by people who understood the deposit into a coherent package so you can make an informed decision on the potential viability of the project without having to redrill the deposit?
Which leads us into today’s talk – has anyone been to the Van Deemen?? I continue to be amazed at the number of people who have visited the site at some point.
The Van Deemen project is located in the northern Black Mountains, Mohave County, Arizona, 50 mi northwest of Kingman (population ~28,000). The Black Mountains are historically one of the most prolific gold-producing mountain ranges in Arizona, yielding some 2.5 million ounces of gold. That number is most likely higher with the current production from the Moss Mine.
The Van Deemen property is characterized by Precambrian gneiss and schist which is overlain by Middle Tertiary volcanics and sediments. These two rock formations were brought into contact by a regional, low-angle detachment fault with its attendant breccia zone. The top and bottom of this zone is in many cases a sharp contact, with broken, but not brecciated, volcanic and sedimentary rocks above and intensely chloritized, moderately to poorly broken Precambrian gneiss and schist below.
This picture is from what is referred to as area IV – you can see the detachment fault on the bottom with the yellowish sericite alteration, overlain by Tertiary volcanics.
Gold mineralization at the Van Deemen occurs primarily in gently dipping zones of quartz-sericite-hematite-pyrite clay alteration of brecciated Precambrian gneiss. The alteration zones are spatially associated with rocks generally exhibiting an open style of brecciation. Stacked sheets of quartz breccia are often present in the gold zones, sometimes forming at the fault contact with the upper plate, and other times forming irregular lenses in the faulted gneiss.
These quartz breccias often contain mixed fragment types including brecciated chunks of vein quartz. The matrix supporting the breccia fragments appears to be made up of finely pulverized rock flour subsequently replaced by fine-grained quartz. In these quartz breccia zones, sulphides (pyrite and arsenopyrite) occur in and near late-stage fractures.
Mineralization consists of very fine free gold with silica.
There are three distinct gold zones at the Van Deemen prospect: Areas II, III, and IV. Each gold zone occurs within the brecciated crystalline assemblage along the Van Deemen Fault. However, it is also evident that all the gold zones have a pronounced northeasterly trend. In Areas I, III, and IV the northeasterly trend of the gold zones appears to be related to a northeasterly-trending high-angle deformation zone, especially in Areas III and IV. Area I is an exploration target and is off the map.
Gold production on the Van Deemen Mine is believed to date from the 1930s. A section of mineralized rock was mined by open cut. Approximately 350 feet of exploratory adits, along with two shallow shafts, were driven in what is now the central part of the property.
More recently, the Van Deemen area has been actively explored for both copper and gold. Copper exploration was conducted mostly in the 1970s and was directed toward deciphering a highly faulted and sliced Laramide (?) quartz monzonite porphyry copper system. The Van Deemen area again received attention in 1979/1980, but this time as a gold play rather than copper, and it has essentially been explored for gold since then. As you can see, there has been extensive exploration by experienced mining people and companies.
With all the previous work that’s been done on the property, you’d think there would be quite a large data base to draw from…
Technology in use in the 80s included
Using Cooper Aerial to fly the area
Creating a local mine grid based on a known 0,0 point – these were x,y coordinates with no projection information
High quality detailed geologic mapping at 1”=100’ by Bud Hillemeyer, Jim Faulds
Detachment Model represented current thinking at the time.
F/W produced computer generated maps with topography, geochemical samples
Geochem analysis was usually limited to Au, Ag, Cu, Zn, Pb, W, Sn – in this case we had Au assays and only limited Ag assays from Amselco
Sectional resource hand generated using a 50-foot area of influence
Some of the techniques we will be discussing later were actually developed in the 80s; for example, Clark Arnold developed techniques as a consultant for Freeport, and I developed statistical techniques for Duval during that time.
Historic data available for this case study at the Van Deemen gold deposit near Kingman, Arizona was incomplete, with missing drill hole locations, assays, drill logs, and cross sections. What we did have:
2 detailed 1"=100' geologic maps from Fisher-Watt
5 cross sections from Fisher-Watt
140 Drill hole locations
Drill logs from 1987 drilling by Fisher-Watt
Assay Certificates for Au only for all drilling
Amselco data
Miscellaneous reports and maps
Historic resource of 32,000 oz Au
Cross sections were generated by hand
This slide shows 4 of the 5 sections we received hung in space so we could check continuity of lithology
2011 technology relies on computer assistance, everything from data entry to resource estimation. GIS has replaced the draftsman and light table. GPS units have replaced plane tables and alidades. Multi-element geochem packages are the industry standard. Drilling costs have increased. The price of gold has gone up from $288 in 1989 to $1,500 in 2011. Deposits that were not worth looking at in the 80s are economic at the 2011 gold price.
International Star, a publicly traded company on the OTC markets took out an option on the property in 2011. Their focus was to demonstrate that the historical exploration data confirmed that the project merits additional work.
Towards that end, they hired GeoGRAFX to create a geological database and calculate a resource using the 140 drill holes that were available at the time, applying the same parameters that had been used in the 1980s resource. The 2013 check resource served to confirm the historic resource, and ILST elected to move forward with the project.
In 2013 they obtained over 1000 pulps from the 1986 drilling and surface sampling. The pulps were used to confirm the historic drill hole assays. ILST also collected a bulk sample to check historic metallurgical results.
In 2015, additional information was recovered from the Arizona Geological Survey Archives. This included reports, geochemical data, and drill hole location maps for the 1986 and 1989 holes, which added 44 holes to the data base.
With the additional information, ILST was able to regenerate the resource and engage a mine engineer, Joe Bardswich, who has worked on several other projects in the Black Mountains, to create a preliminary mine cost analysis.
Is the project economic? At $1,200/oz Au, yes. Based on that information, ILST commissioned a SEC Guide 7 Technical Report as an aid in raising funds to move the Van Deemen towards production.
This is the block model with the additional drill holes showing a 0.014 cutoff. You can see that quite a bit of the deposit is either at or near surface.
In 2017 ILST defaulted on the claims. Three of us who had worked on the project for ILST elected to move the project forward.
What do we have and how do we move the project forward??
We have a new company made up of a geologist, a mine engineer, and a business manager
Incomplete technical report
In-situ resource has increased contained ounces to 97,000 oz. No geology. The reason we could increase the resource was the additional drilling information; the variography showed continuity of 65 feet in a horizontal direction and 35 feet in the vertical, rather than the 50 feet that had been previously used.
Deposit inside ACEC area which requires a mine plan of operations for any additional drilling
Missing data for 100 Kunkes holes and unknown number drilled by Red Dog and Frisco
No drill logs for 1986 drilling
5 cross sections
Over 1000 pulps from 1986 drilling and surface sampling
N&B is looking for funding to move towards production.
Need to
Finish Technical Report
Find a way to include geology in the resource
Use drill hole pulps to assign lithologic signatures to samples
No map with surface sample IDs was found, so the surface sample pulps were ignored
Use additional information to create geologic model
Assign Munsell color signature to pulps
We focused on the pulps
Pulps were rebagged prior to further work
932 Samples + Standards and Blanks analyzed using Olympus Delta Handheld XRF in Geochem Mode
38 elements analyzed
The Munsell color code was designed as a visual color system for classification of rocks, sediments, and soils. Steve Van Nort had found this to be useful at Picacho.
Took picture of each pulp sample with smartphone
Used Soil Analysis Pro app to assign color to the sample
Created db with ddh info + color
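A minimal sketch (in Python) of that database step, assuming hypothetical file and column names; the real database pairs each drill-hole sample interval with the color the app returned:

import pandas as pd

# Hypothetical inputs: drill-hole sample intervals, and the Munsell color
# the Soil Analysis Pro app returned for each pulp photo.
ddh = pd.read_csv("ddh_samples.csv")     # hole_id, from_ft, to_ft, sample_id
colors = pd.read_csv("pulp_colors.csv")  # sample_id, munsell (e.g. "10YR 6/4")

# Join the color onto the drill-hole records to create the working database.
db = ddh.merge(colors, on="sample_id", how="left")
db.to_csv("ddh_with_color.csv", index=False)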
The main goal of the statistical analysis was to find a way to use the Geochem analysis from the pXRF to classify the data by known lithology.
I used ioGAS™ software developed for geochemical data analysis. Its advantage over standard statistical packages such as SPSS, Statistica, or Excel spreadsheet add-ons is that it has additional useful tools such as classification diagrams and, specifically, the tool I was interested in: Discriminant Projection Analysis. It also reads pXRF data and integrates with the Discover software platform I was using to manage the data.
As with any statistical application, there are certain rules that must be followed to optimize your results. For example, for a data set to be used for statistical analysis, it is assumed to be taken from a random population, to be normally distributed, and to represent the population we are describing.
24 elements with most values above detection limit (including Au from 1980s assays)
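A minimal sketch of that screening step in Python, assuming a hypothetical CSV export of the pXRF results in which below-detection values are blank (read in as NaN):

import pandas as pd

# Hypothetical pXRF export: one row per pulp, one column per element.
xrf = pd.read_csv("pxrf_results.csv")
elements = [c for c in xrf.columns if c != "sample_id"]

# Keep elements reported above detection limit in most samples;
# the 50% cutoff is an assumption for illustration.
frac_detected = xrf[elements].notna().mean()
keep = frac_detected[frac_detected > 0.5].index.tolist()
print(len(keep), "elements retained")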
The first thing to look at is the data distribution. It is common for elements such as Au, Cu, Zn, etc to show a lognormal distribution, while elements such as Al, Ca, K, Na show a normal distribution.
Looking at the distributions, that is apparent here – the histograms are colored to reflect known lithology in our sample set, with the black values representing unclassified lithologic values.
This slide shows the elements transformed to log values and you can see an improvement in the distribution of some of the elements, for example Cu. Based on a visual inspection and skewness/kurtosis values, 16 of the elements were transformed to log values for further work.
Log of Au, S, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Sr, Y, W, Pb, Th
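A minimal sketch of the transform step, reusing the hypothetical xrf DataFrame and keep list from above; the numeric skewness threshold is an assumption standing in for the visual inspection described here:

import numpy as np
from scipy.stats import skew

for col in keep:
    vals = xrf[col].dropna()
    # Log-transform strongly right-skewed elements; concentrations are
    # positive, so log10 is defined. The 1.0 threshold is assumed.
    if skew(vals) > 1.0:
        xrf["log_" + col] = np.log10(vals)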
Next, I created probability plots of the elements. What I'm looking for here is an indication of the number of populations I may be dealing with and what's happening with the left and right tails of the plots. If the data is from a single population, the graph should be a relatively straight line; if there are bends or breaks in the line, there may be several populations present. Flat-line tails on the left side show me the number of samples at or below detection limit. Scatter at the top of the right tail may indicate an additional population, possibly related to mineralization.
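A minimal sketch of one such plot, using the hypothetical column from the previous step:

import scipy.stats as stats
import matplotlib.pyplot as plt

# Normal probability (Q-Q) plot for one transformed element; bends or
# breaks suggest multiple populations, a flat left tail marks
# detection-limit values, and right-tail scatter may flag mineralization.
stats.probplot(xrf["log_Cu"].dropna(), dist="norm", plot=plt)
plt.title("Normal probability plot: log Cu")
plt.show()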
The discriminant analysis technique creates a set of rules to assign a sample to one of a set of groups. To come up with these 'rules', a training dataset is used for which the group memberships are known ahead of time.
14 drill holes from the 1986 drilling with lithology derived from cross sections – training set
289 lithology values
4 lith classes: Qt (3 samples – eliminated), Tba (8), Pca (211), Pcs (67)
Qt – Quaternary gravels
Tba – Tertiary volcanics
Pca – PreCambrian acidic gneiss, metarhyolite
Pcs – PreCambrian mixed gneiss schist
Discriminant Projection Analysis is similar to principal component analysis in that it tries to maximize the variance between groupings, except that here we already know what our grouping variables are going to be – in this case, lithology.
The projection step uses the training data to calculate linear discriminant functions, which are linear combinations of the original variables that maximize the differences between the predefined lithology groups. The Discriminant Projection 1 (DP1) calculation is a combination of elements that explains the majority of the variance in the population; DP2 explains the next largest amount of variance, and so on. It creates a value for each Discriminant Projection – in this case DP1 and DP2 – for each data point. These functions allow the samples to be plotted in the discriminant space so that group separation can be visualized and investigated.
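ioGAS does this internally; a minimal sketch of the same projection-and-classification idea using scikit-learn's linear discriminant analysis (the array names are assumptions):

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Assumed inputs: X_train (samples x elements) and lith_train (lithology
# labels) from the 14 training holes; X_unknown for the unclassified pulps.
lda = LinearDiscriminantAnalysis(n_components=2)
dp = lda.fit_transform(X_train, lith_train)  # dp[:, 0] ~ DP1, dp[:, 1] ~ DP2

# Each unclassified sample is assigned the lithology whose region of the
# discriminant space it falls into.
pred = lda.predict(X_unknown)
print(dict(zip(*np.unique(pred, return_counts=True))))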
UR – the graph in the upper right of the screen displays a scatter diagram of Discriminant Projection Axis 1 vs DP2 – Tba samples are in blue, Pca samples in red, and Pcs in orange. You can see that the Tertiary volcanics separate well (except for one lone sample) from the Precambrian units, while within the Precambrian units there is some overlap of the acidic and mixed gneissic units.
UL – displays the same graph with the classification colored in the background, so any new sample that falls in the blue area would be classified as Tba.
LL – probability plot of all data colored by lithology – in the case of DP1 you can see Tba samples plotting differently from the Precambrian units.
LR – a split probability plot subdivides the data according to the color attribute groups (lithology) and plots a normal probability plot for each group for the selected variables on the same diagram (with the respective color from the attribute dialog). You can see that in the case of DP2 it's doing a better job of separating the lithologies.
Once we had the calculations from the training set, we applied them to the other samples.
ioGAS classified 640 samples with unknown lithologies into 3 groups based on the training set parameters
Tba – 239 items
Pca – 145 items
Pcs – 256 items
Scatter diagram showing DP1 and DP2 on the x,y axes similar to what was shown in the training set.
As you can see it did a fairly decent job of classifying the unknown values, however there is still a bit of scatter with the samples in each grouping
Split probability plot of DP2 and again you can see that there is a visual difference between the lithologies
So now I'm fairly comfortable with the results; however, I still need to check them.
640 samples were checked against the known database of lithologies and Munsell color designations.
The discriminant function only classifies against groupings it knows; if there is a lithology that is not in the training set, it will still classify it as best it can.
Several holes showed discrepancies – Hole 31 in Area II near the QMP intrusive; the pulps were a relatively light gray in color, so I'm thinking those values could be from the intrusive.
Area I Pca/Pcs order reversed. Those are exploration holes outside the resource area – possibly a new lithology?
In all, 32 drill holes were added to the data base
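A minimal sketch of this kind of cross-check, assuming a hypothetical DataFrame named checked that pairs each classified sample with whatever independent lithology call exists for it (a cross-section intercept or a Munsell color group):

import pandas as pd

# Off-diagonal counts in the agreement table flag holes worth a second look.
print(pd.crosstab(checked["predicted_lith"], checked["known_lith"], margins=True))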
That data was merged with the existing lithologic data base. So now I have a database with additional lithologies that I can use to create a geologic model
It seems that everyone wants to jump onto the implicit modelling bandwagon.
Implicit modelling uses mathematical tools to derive the model from the data. A definition from the Micromine web site says that an implicit model is a continuous mathematical representation of an attribute across a volume: "This method is not only efficient, it eliminates the personal perceptions and interpretations of geologists because it is a numerical process that is free of bias, which means less subjectivity and greater reliability."
Traditional modelling methods (those relying heavily on manual digitizing) are referred to as 'explicit'.
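To make the contrast concrete, here is a minimal sketch of the implicit idea (not Leapfrog's actual algorithm): interpolate a signed distance to a contact from scattered drill-hole picks with a radial basis function, then take the zero isosurface as the modeled contact. All values are invented for illustration:

import numpy as np
from scipy.interpolate import RBFInterpolator

# Invented drill-hole picks: (x, y, z) plus a signed distance to the
# Tertiary/Precambrian contact (+ above the contact, - below it).
pts = np.array([[0., 0., 120.], [50., 10., 95.], [10., 60., 110.],
                [80., 80., 70.], [40., 40., 100.], [90., 20., 85.]])
sd = np.array([15., -5., 8., -20., 0., -10.])

rbf = RBFInterpolator(pts, sd, kernel="thin_plate_spline")

# Evaluate the implicit function on a grid; the contact surface is the
# field's zero isosurface (extractable with, e.g., marching cubes).
ax_xy = np.linspace(0, 100, 21)
ax_z = np.linspace(50, 150, 21)
gx, gy, gz = np.meshgrid(ax_xy, ax_xy, ax_z, indexing="ij")
field = rbf(np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])).reshape(gx.shape)
print(f"{(field > 0).mean():.0%} of the test volume lies above the modeled contact")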
Implicit modeling techniques were used to create a 3D geologic model of the deposit. We used Leapfrog to model Areas III and IV separately from Area II. Those results were then tied back to historic observations and existing cross sections to confirm the validity of the model.
We modeled just 3 lithologies: the Tertiary upper unit shown here in green, the detachment fault shown in red, and the Precambrian basement shown in brown. You'll see that there are also yellow Quaternary units displayed on some of the drill holes. We started out using only the surface geology and drill holes to create the model. We did include strikes and dips from the historical mapping. We then cut sections along the same section lines as the 4 historical sections we had previously recovered. The results showed marked dissimilarities in contacts between units. To resolve the discrepancies, we digitized the boundaries on the sections back into the model. The result was more in line with what was encountered in the drill logs, sections, and diagrams in reports.
Area II was more complex to model. We had the same lithologic units as in Areas III and IV, but we also had faulting that ran through the area. We modeled the fault blocks separately. Once we'd resolved the issues involved with merging the surface geology with drill holes, Leapfrog did a good job of handling the fault blocks.
As of September 1, that's what we've done to move the Van Deemen forward.
There is still more to do. Our goal is to move the project forward towards production.
As far as science goes, I need to see if we can tie the Munsell color assignments from the pulps to lithology to gold values to predict gold mineralization. This is something that Steve Van Nort was able to do at Picacho
I'd like to do additional multivariate work with the geochem data to see if I can get alteration signatures and test whether they are important to mineralization.
I need to update the resource to include the geologic domains.
Once that is complete, we can create a preliminary pit design; from there, the mine engineer moves the project forward.
Try to find as much information as possible so you don't have to regenerate it. As we said earlier, thirty years ago costs for reverse circulation drilling were in the range of $8.50/foot; now it can cost $35/foot or more, and there may now be access and permitting issues. The AZGS provides a wonderful service with its document repository.
Statistical techniques need to follow standard practice. Treat each data set separately; they are different.
Check your data to confirm results - this applies to everything from data entry, to modeling to resource estimation
Implicit modeling needed addition of historical cross sections to refine the model
While these techniques provide tools to aid in interpretation, they are not a replacement for boot leather on the ground – field observation and understanding the geology and mineralization of the deposit.
Thank you for your time. For additional information or questions please feel free to contact me at 520 275-6173 or bcarroll@geografxworld.com