This presentation will provide an update on the Lidar Division's work to produce an ASPRS document on measuring and reporting airborne lidar density. The scope of this document is:
• To clarify various definitions of density and related terms
• To document various methods for quantifying density
• To develop and document various methods for representation of density
• To document various tradeoffs among the methods of representation and quantification of density
• To provide recommendations on the use of density
An in-depth examination of airborne lidar density measurement and reporting, reviewing issues with existing methodologies and comparing them to a proposed replacement.
How is airborne LiDAR point density measured? How is it reported? What constitutes point density acceptance? The varied answers may surprise you. LiDAR providers, data QC entities, and end users approach this in many different ways. It is eye-opening to learn how open-ended this topic is, and how data density can be masked to look good while actually being compromised, and vice versa. So which method is best? The answer may depend on the application. This presentation will address these topics and show how a simple concept can be far more complicated than one would expect. A better understanding of LiDAR point density is needed so that everyone involved can have clear measurement and reporting expectations at the beginning of a project.
2023_ASPRS_LiDAR_Division_Update.pdf
1. ASPRS LiDAR Division Update
With a focus on quantifying horizontal sampling density of aerial lidar point cloud data
2023 ASPRS Conference at GeoWeek
February 15, 2023
Matt Bethel
Assistant Lidar Division Director for ASPRS
Director of Operations and Technology
Merrick & Company
2. ASPRS LiDAR Division Update Working Groups
• Best Practices and Guidelines Working Group
  • Control points
  • Data acquisition
  • Data processing
  • Standards
• LAS Working Group
  • Update to LAS Domain Profile (LDP) Description: Topobathy Lidar Version 2.0
• Bathy Working Group
  • Created December 2022
  • Initial focus is to complete and release the draft of the Bathy Lidar Specification being worked on by federal partners for community feedback
3. ASPRS Lidar Division Update
The airborne lidar calibration and validation working group has been focused almost entirely on the completion of a new document titled "Quantifying horizontal sampling density of aerial lidar point cloud data". This includes the following:
• Requirements for lidar point density measurement and reporting
• Review of the needs for visualizing density and violations
• Description of the methods typically used for estimating and reporting lidar point density
• Comparisons of methods and identification of issues/limitations
• Recommendations
Why is change needed?
5. Density Per Swath

Points per square meter by edge clipping:

Line number | 100% of swath | 95% of swath | 75% of swath | 50% of swath
1           | 14.76         | 13.02        | 11.72        | 11.40
2           | 13.85         | 12.33        | 11.10        | 10.77
3           | 14.14         | 12.55        | 11.33        | 11.06
4           | 12.96         | 11.83        | 10.53        | 10.30

[Chart: Single Swath Densities by Edge Clipping; points per square meter for Lines 1 through 4 at 100%, 95%, 75%, and 50% of swath]

Pros
• Ideal to compare against planned swath density
• Relatively easy to compute
• Reasonably batchable – one process per flightline
• Decent to use for reporting
• Is not biased (inflated) by sidelap
• Very straightforward

Cons
• Does not adequately account for localized density variations such as changes in aircraft speed or sudden variations in pitch
• Needs interpretation if flying >50% sidelap or multiple passes to achieve planned density
• Results from lidar systems with inconsistent scanner swath densities can adversely affect the reported density results. Edge exclusion may need to be used.
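To make the edge-clipping computation concrete, here is a minimal Python sketch of per-swath density with a configurable clip fraction. It assumes the swath's points are already expressed in an along-track/across-track frame; the function name and inputs are illustrative, not from the presentation.

```python
import numpy as np

def swath_density(along, across, keep_fraction=1.0):
    """First-return density (points/m^2) for one swath, edge-clipped.

    along, across: per-point along-track / across-track coordinates in
    metres (assumes the swath was already rotated into this frame).
    keep_fraction=0.75 keeps the central 75% of the swath width.
    """
    centre = (across.max() + across.min()) / 2.0
    half_width = (across.max() - across.min()) / 2.0
    # Keep only points within the central fraction of the swath width.
    keep = np.abs(across - centre) <= keep_fraction * half_width
    length = along[keep].max() - along[keep].min()
    width = 2.0 * keep_fraction * half_width
    return keep.sum() / (length * width)
```

Running this once per flightline at keep_fraction values of 1.0, 0.95, 0.75, and 0.50 would produce the columns of the table above.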
6. Aggregate / Project Wide Point Density

Pros
• Considers all collected points (if linear mode, only first or last return is used)
• Straightforward approach (number of first or last return points / area of project boundary)

Cons
• Swath edge densities, crosslines, sidelap, collection block overlap, and patches can inflate density results
• Tabular reporting alone will not identify localized density failures. A thematic raster is needed for locating potential density issues, but a thematic density raster can be difficult to interpret and unreliable to use due to aliasing.

Number of First Return Points | Area of Polygon (m²) | Point Density (points/m²)
339,650,243                   | 17,204,792           | 19.742
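As a cross-check of the table above, the aggregate method is a single division; the only real work is computing the boundary polygon's area. A minimal sketch using the shoelace formula (variable names are illustrative):

```python
import numpy as np

def polygon_area(xy):
    # Shoelace formula for a simple (non-self-intersecting) boundary
    # polygon, xy given as an (N, 2) array of vertices in metres.
    x, y = xy[:, 0], xy[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# Reproducing the figures reported above:
n_first_returns = 339_650_243
area_m2 = 17_204_792.0
print(f"{n_first_returns / area_m2:.3f} points/m^2")   # 19.742
```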
7. Grid / Point in Pixel Counting / Tile Based Density Measurement Method

Typical grid analysis
Pros
• Straightforward approach – use a grid or tile scheme to count points and report normalized point counts per grid/tile area
• Fast and easy to calculate
• Easy to use for reporting – pass/fail percentage results and graphic
Cons
• Integer rounding is inherent in this process; it lacks the decimal precision of a representative-area density calculation
• A different user-defined processing cell size changes the results
• Inherent aliasing problems invalidate the results

Hybrid of swath and grid analysis using the sweet spot of the swath
Pros
• Useful to compare against planned swath density
• Relatively easy to compute
• Reasonably batchable – one process per flightline
• Is not biased (inflated) by sidelap nor by densification at the edges of some scanners' swaths
Cons
• Integer rounding is inherent in this process; it lacks the decimal precision of a representative-area density calculation
• Needs interpretation if flying >50% sidelap or multiple passes to achieve planned density
• Does not show density everywhere
• A different user-defined processing cell size changes the results
• Inherent aliasing problems invalidate the results
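A minimal sketch of the grid-counting method using NumPy's 2-D histogram. Note how the user-chosen cell size enters the result directly, which is the cell-size sensitivity listed in the cons (names are illustrative):

```python
import numpy as np

def gridded_density(x, y, cell=1.0):
    """Per-cell point counts normalised to points/m^2.

    Counts are integers per cell, so small cells quantise the density
    coarsely: the integer-rounding limitation noted above.
    """
    x_edges = np.arange(x.min(), x.max() + cell, cell)
    y_edges = np.arange(y.min(), y.max() + cell, cell)
    counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    return counts / cell**2
```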
9. Binary Raster for Pass/Fail Density Assessment

Typical grid analysis
Pros
• Seemingly straightforward approach – use a grid or tile scheme to count points and report normalized point counts per grid/tile area
• Fast and easy to calculate
• Easy to use for reporting – pass/fail percentage results and graphic
Cons
• Integer rounding is inherent in this process; it lacks the decimal precision of a representative-area density calculation
• The results are in pass/fail cell counts, yet there are no established parameters for use or analysis (no passing thresholds)
• Results are severely misunderstood, yet widely used and relied upon by some in our industry
• A different user-defined processing cell size changes the results
• Inherent aliasing problems invalidate the results

Hybrid of swath and grid analysis using the sweet spot of the swath
Pros
• Useful to compare against planned swath density
• Relatively easy to compute
• Reasonably batchable – one process per flightline
• Is not biased (inflated) by sidelap nor by densification at the edges of some scanners' swaths
Cons
• Integer rounding is inherent in this process; it lacks the decimal precision of a representative-area density calculation
• Needs interpretation if flying >50% sidelap or multiple passes to achieve planned density
• Does not show density everywhere
• A different user-defined processing cell size changes the results
• Inherent aliasing problems invalidate the results
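The binary raster is just a threshold applied to those gridded counts. A minimal sketch reusing the gridded_density function above; the 8 points/m² requirement is an example value, not from the presentation:

```python
density = gridded_density(x, y, cell=1.0)   # points/m^2 per cell
passing = density >= 8.0                    # example required density
print(f"{100.0 * passing.mean():.1f}% of cells pass")
```

As the cons note, the passing percentage this reports shifts with the chosen cell size and grid placement, even for identical point data.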
11. What is Aliasing?
Aliasing is classically defined as the distortion or artifact that results when a signal reconstructed from samples differs from the original continuous signal. In the lidar density context, aliasing is the distortion or artifact that results when measurements over evenly spaced samples (grid cells) are used to create a raster product from randomly spaced points.
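A small synthetic experiment illustrating one visible symptom: the same randomly spaced points, evaluated on grids of different cell sizes, yield different failure rates even though the underlying density never changes. All numbers below are made-up illustration values.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic tile: ~8 points/m^2 placed at random over a 100 m x 100 m area.
pts = rng.uniform(0.0, 100.0, size=(80_000, 2))

for cell in (0.5, 1.0, 2.0, 5.0):
    edges = np.arange(0.0, 100.0 + cell, cell)
    counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=[edges, edges])
    failing = (counts / cell**2) < 8.0
    print(f"cell = {cell:3.1f} m -> {failing.mean():5.1%} of cells below 8 points/m^2")
```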
20. Voronoi Density Measurement Method

Pros
• Most accurate representation of point density
• Measurement is an area of point influence; density can be derived as 1 / Voronoi area
• Pass/fail is not biased by scanner type, sidelap, crosslines, or acquisition approach (e.g., >50% sidelap or multiple sensors)
• Is not affected by aliasing or varying tile sizes
• Preserves decimal precision rather than being integer limited

Cons
• Generally longer processing time than other methods, but this can be mitigated with parallel and even distributed processing
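A minimal sketch of per-point Voronoi density using SciPy. Cells on the edge of the data are unbounded, so their density is left at zero here; a production implementation would clip those cells to the project boundary instead. Function and variable names are illustrative.

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_density(points):
    """Per-point density as 1 / Voronoi cell area (points/m^2)."""
    vor = Voronoi(points)
    density = np.zeros(len(points))
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:
            continue                 # unbounded cell on the data's edge
        poly = vor.vertices[region]  # ordered polygon vertices (2-D case)
        x, y = poly[:, 0], poly[:, 1]
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        density[i] = 1.0 / area
    return density
```

Per-point pass/fail then follows directly, e.g. (voronoi_density(points) >= 8.0).mean() for the passing fraction at an example 8 points/m² requirement.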
21. All Swaths Density Results Using Voronoi Method
(Charts of 4X [default] and 8X Required Density)