The European Copernicus programme, with its Sentinel satellites, is one of the most ambitious Earth observation programmes to date, with all data freely accessible. Copernicus addresses several thematic areas, including land, marine, atmosphere, climate change, emergency management, and security. Different satellite types have been launched and more will follow; hence, weather-independent radar data as well as optical and infrared data are now available. In Europe the revisit time is between three and five days, allowing the same areas to be monitored at high frequency. Current land use, forest structure, and vegetation phases can be recorded promptly, to name only a few examples. While the Copernicus programme is well known in the Earth observation community, the new data sets remain widely unnoticed or underused in the GIS community and in public administration, partly due to the sheer amount of available data (in the petabyte range) and the need for substantial computational power. This is a great opportunity for specialized service providers to develop new applications for administration, science, and business, finding new ways of retrieving information from petabytes of raw data.
In our talk we will present an open source approach to processing such data in a cloud-based system that provides standardized OGC Web Services through the GeoServer and MapProxy software. The backend of the system is able to post-process and analyze Sentinel data promptly and in an automated way using GRASS GIS and GDAL. We have developed a REST-API-based system that allows users to automatically derive thematic data layers from algorithms provided by the portal. This greatly simplifies the user's work, since custom thematic layers can be generated without deep technical knowledge of software, hardware, or time series management. We believe this approach is likely to widen the potential user group of the Copernicus programme. At the same time, it connects two worlds that are often unnecessarily separated: the GIS and remote sensing communities.
The presentation concludes with examples and practical use cases illustrating the workflow and the architecture of the portal.
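To make the idea of automatically derived thematic layers concrete, here is a minimal sketch of the per-pixel NDVI computation that such a backend might run over Sentinel-2 red and near-infrared bands. This is not code from the portal itself; the pure-Python raster representation (lists of rows) is an assumption made for illustration only.

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one pixel pair."""
    if red + nir == 0:
        return 0.0  # guard against division by zero over no-data pixels
    return (nir - red) / (nir + red)

def ndvi_layer(red_band, nir_band):
    """Derive an NDVI layer from two equally sized band rasters (lists of rows)."""
    return [[ndvi(r, n) for r, n in zip(rrow, nrow)]
            for rrow, nrow in zip(red_band, nir_band)]
```

A production backend would read the bands with GDAL and operate on full rasters in bulk, but the per-pixel formula is the same.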
Generation of a high resolution DSM using UAV images - Nepal Flying Labs
A final-year project by Geomatics Engineering students at Kathmandu University, Dhulikhel, Kavre.
All the datasets for this project were downloaded from Trimble. The project uses 27 high-resolution (2.4 cm average spatial resolution) UAV-acquired images of a sand mine at Tielt-Winge, Belgium. The images were acquired by a Sony NEX-5R digital camera mounted on a Trimble UX5 Imaging Rover, a fixed-wing UAV. Three software packages were used for image processing: LPS, Agisoft PhotoScan, and Pix4D.
The team members:
1. Uttam Pudasaini: utmpudasaini@hotmail.com
2. Niroj Panta: sadrose777@gmail.com
3. Biplov Bhandari: bionicbiplov45@gmail.com
4. Upendra Oli: Upendraoli@gmail.com
New features presentation: meteodyn WT 4.8 software - Wind Energy - Jean-Claude (Meteodyn)
New features of meteodyn WT, a CFD software package for wind resource assessment and wind farm optimisation: a worldwide terrain database, convergence improvements, and other improvements.
Extreme weather events pose great potential risks to ecosystems, infrastructure, and human health. Analyzing extreme weather in the observed record (satellite and reanalysis products) and characterizing changes in extremes in simulations of future climate regimes is an important task. Thus far, extreme weather events have typically been specified by the community through hand-coded, multivariate threshold conditions. Such criteria are usually subjective, and there is often little agreement in the community on the specific algorithm that should be used. We propose a different approach: machine learning, and in particular deep learning. If human experts can provide spatio-temporal patches of a climate dataset and associated labels, a machine learning system can learn the underlying feature representation. The trained machine learning (ML) system can then be applied to novel datasets, thereby automating the pattern detection step. Summary statistics, such as the location, intensity, and frequency of such events, can easily be computed as a post-process.
We will report compelling results from our investigations of deep learning for the tasks of classifying tropical cyclones, atmospheric rivers, and weather fronts. For all of these events, we observe 90-99% classification accuracy. We will also report on progress in localizing such events, namely drawing a bounding box (of the correct size and scale) around the weather pattern of interest. Both tasks currently use multi-layer convolutional networks in conjunction with hyper-parameter optimization. We use HPC systems at NERSC to perform the optimization across multiple nodes, and highly tuned libraries to exploit multiple cores on a single node. We will conclude with thoughts on the frontier of deep learning and the role of humans (vis-à-vis AI) in the scientific discovery process.
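As a toy illustration of the building block these classifiers stack many times, here is a pure-Python forward pass of a single 2D convolution filter followed by a ReLU activation. The kernel values and inputs are illustrative only; the actual networks in this work are multi-layer and trained on labeled climate data.

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution of one channel: the innermost
    operation of a convolutional layer's forward pass."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # dot product of the kernel with the image patch at (i, j)
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(feature_map):
    """Rectified linear activation applied element-wise."""
    return [[max(0.0, v) for v in row] for row in feature_map]
```

Real implementations use GPU-tuned libraries for exactly this computation; stacking such filtered, activated maps and training the kernel weights is what lets the network learn the spatio-temporal patterns of cyclones or atmospheric rivers.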
Geopsy is a widely used professional program. In particular, experience with professional software is a much sought-after skill in recent graduates. A student of mine is planning to use it in their work.
How can the use of computer simulation benefit the monitoring and mitigation ... - Brennan Minns
This research essay explores how accurate computer simulation could vastly benefit the future mitigation and monitoring of orbital debris.
Artificial intelligence (AI) has seen steadily growing interest in recent years, for good reason: innovative algorithms and methods such as machine learning and deep neural networks, in which knowledge is acquired and applied based on data, enable the automation of a wide range of processes and quickly deliver precise results. AI is also becoming more and more popular in the space sector. The Institute of Space Technology & Space Applications (ISTA) at the Universität der Bundeswehr in Munich conducts research on AI for space operations, science, and technology. An overview of ISTA's activities and current developments in fault management, autonomous collision avoidance, autonomous landing, and radio science will be presented.
Advanced weather forecasting for RES applications: Smart4RES developments tow... - Leonardo ENERGY
Recording at: https://youtu.be/45Zpjog95QU
This is the third Smart4RES webinar; it will address technological and market challenges in RES prediction and introduce the Smart4RES strategy for improving high-resolution weather forecasting models.
Innovative Numerical Weather Prediction and Large-Eddy Simulation approaches will be presented through wind and solar applications.
This is a presentation of the JGrass-NewAGE system held in Potenza on February 24, 2017. It contains an overview of the concepts and ideas behind JGrass-NewAGE and shows some achievements in a critical manner.
Invited talk at workshop "Exascale Computing in Astrophysics" held in Ascona, Switzerland, 8-13 September 2013.
http://www.itp.uzh.ch/exastro2013/Home.html
On 29 February and 1 March 2016, the Fundación Ramón Areces examined the relationship between big data and climate change at a conference. Can big data help reduce climate change? How will massive data analysis contribute to preventing and managing natural disasters? These are just some of the questions the speakers will try to answer. For the climate sciences, big data is a very promising tool for addressing the various phenomena associated with climate change.
Construction of Structurally and Stratigraphically Consistent Structural Mode... - Laurent Souche
The following slides were presented at the International Petroleum Technology Conference in Kuala Lumpur, in December 2014.
This presentation addresses the general problem of building a geological model from interpretation data. Three topics are discussed in detail:
- The interpolation of the model in the presence of sparse/incomplete data;
- The incorporation of dense data and of complex geological constraints;
- The improvements that an initial model can bring to geological interpretations.
The technical solution we propose is based on the interpolation of a relative geological age attribute in the volume of interest. This is the volume-based modeling (VBM) technique.
A case study detailing the construction of a 3D model from a dataset located offshore Australia is also presented to demonstrate this technique.
Authors: Laurent Souche (1), Gulnara Iskenova (1), Francois Lepage(1,2) and David Desmarest (1)
(1) Schlumberger
(2) Rocksoft
Social Vulnerability Datasets through the OpenQuake Platform and Description of a Case-Scenario of Integrated Risk and Resilience using OpenQuake Tools.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview - Prayukth K V
The IoT and OT threat landscape report was prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an early stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 - Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Transcript: Selling digital books in 2024: Insights from industry leaders - T... - BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Elevating Tactical DDD Patterns Through Object Calisthenics - Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
OpenQuake: Impact of Engine v 1.0 launch on worldwide #seismic hazard assessment #GEMRVL2013
1. OpenQuake: Impact of Engine 1.0 launch on worldwide seismic hazard assessment
Marco Pagani, GEM Hazard Team, GEM Foundation
Damiano Monelli, GEM Hazard Team, SED-ETH
Graeme Weatherill, GEM Hazard Team, GEM Foundation
Laurentiu Danciu, GEM Hazard Team, SED-ETH
on behalf of the GEM Risk and GEM IT Teams and the GEM Community
2. Motivations
Reproducibility and transparency
“anything less than release of the actual source code is an indefensible approach for any scientific result that depends on computation, because not releasing such code raises needless, and needlessly confusing roadblocks to reproducibility” (Ince et al., 2012; Nature)
A community-based development process
We wanted to shift from the current one-one-one paradigm to a one-many-many development model. Only in this way is it possible to ensure long-term maintenance, incorporate the newest ideas and features, and aim at a large community of users.
A multi-HW and multi-OS software
3. Main features
A modular software: the OQ-engine is organized into a number of libraries:
- oq-hazardlib
- oq-risklib
- oq-nrmllib
A multi-purpose tool: software for the calculation of hazard and physical risk.
Open and transparent: take a look at http://github.com/gem
4. Main features (contd.)
Documented: we produce documentation explaining how to use the oq-engine and its methods.
Tested: testing is an integral part of the development process.
6. OQ-engine risk module
The risk module of the oq-engine currently comprises five risk calculation workflows:
• Two calculate losses and damage distributions due to a single earthquake
• Two calculate seismic risk using probabilistic seismic hazard
• One uses loss exceedance curves to assess whether retrofitting measures would be economically viable
Main result typologies:
• Loss maps, loss statistics, loss curves, total loss curve
• Damage distribution per asset or per taxonomy, collapse maps
• Benefit/cost ratio map
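As a hedged sketch of how a loss exceedance curve can feed the benefit/cost assessment, here is a simplified computation (trapezoidal integration, no discounting of future benefits; this is illustrative code, not taken from the oq-engine):

```python
def average_annual_loss(curve):
    """Integrate a loss exceedance curve, given as [(loss, annual_rate), ...]
    sorted by increasing loss, with the trapezoidal rule over loss."""
    aal = 0.0
    for (l0, r0), (l1, r1) in zip(curve, curve[1:]):
        aal += (l1 - l0) * (r0 + r1) / 2.0
    return aal

def benefit_cost_ratio(aal_before, aal_after, retrofit_cost, years=50):
    """Crude BCR: avoided average annual losses over the horizon divided by
    the retrofit cost. A real calculator would discount future benefits."""
    return (aal_before - aal_after) * years / retrofit_cost
```

A BCR greater than 1 indicates that, under these simplifying assumptions, retrofitting would be economically viable.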
8. Input model
The oq-engine natively supports PSHA input models accounting for epistemic uncertainties by means of a logic tree structure. The input generally consists of:
- Configuration file
- Seismic source model logic tree
- Ground motion logic tree
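The logic tree assigns weights to alternative models of the epistemic uncertainty; the end-branch results are then combined into weighted statistics. A minimal sketch of combining end-branch hazard curves into a weighted mean curve (illustrative only, not oq-engine code):

```python
def mean_hazard_curve(branches):
    """Weighted mean hazard curve over logic-tree end branches.
    `branches` is a list of (weight, curve) pairs, where each curve is a list
    of probabilities of exceedance, one per intensity measure level.
    Branch weights must sum to 1."""
    assert abs(sum(w for w, _ in branches) - 1.0) < 1e-9
    n = len(branches[0][1])
    return [sum(w * curve[i] for w, curve in branches) for i in range(n)]
```

The same weighted combination yields quantile curves when the branches are sorted per intensity level instead of averaged.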
9. A collection of source typologies
Five source typologies:
‒ Point Source
‒ Area Source
These can be used for modeling distributed seismicity.
‒ Simple Fault Source (with floating ruptures)
‒ Complex Fault Source (with floating ruptures)
‒ Characteristic Fault Source (without floating ruptures)
These can be used for modeling shallow crustal faults and subduction faults.
10. A growing list of Ground Motion Prediction Equations for different Tectonic Regions
Stable continental regions
‣ Atkinson and Boore (2006)
‣ Campbell (2003)
‣ Toro et al. (1997)
Shallow earthquakes in active tectonic regions
‣ Abrahamson and Silva (2008)
‣ Akkar and Bommer (2010)
‣ Akkar and Cagnan (2010)
‣ Akkar et al. (2013)
‣ Boore and Atkinson (2008)
‣ Cauzzi and Faccioli (2008)
‣ Chiou and Youngs (2008)
‣ Sadigh et al. (1997)
Subduction
‣ Atkinson and Boore (2003)
‣ Lin and Lee (2008)
‣ Youngs et al. (1997)
‣ Zhao et al. (2006)
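A GMPE maps rupture properties (magnitude, distance) to expected ground motion. The following toy relation shows only the general Cornell-style functional form; its coefficients are invented for illustration and do not correspond to any of the published equations listed above:

```python
import math

def toy_gmpe_ln_pga(magnitude, distance_km, a=-1.0, b=0.5, c=1.0):
    """Toy attenuation relation ln(PGA) = a + b*M - c*ln(R + 10).
    Coefficients are illustrative only, not from any published GMPE.
    Ground motion grows with magnitude and attenuates with distance."""
    return a + b * magnitude - c * math.log(distance_km + 10.0)
```

Real GMPEs add terms for site conditions, faulting style, and (as noted later in the deck) near-source effects, plus a model of the aleatory variability around this median.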
11. Calculators:
‣ Classical Probabilistic Seismic Hazard Analysis (PSHA)
‣ Event-based PSHA
‣ Scenario hazard
‣ Disaggregation (currently only for Classical PSHA)
One code serving different needs …
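The classical PSHA calculator ultimately combines per-rupture annual occurrence rates with conditional probabilities of exceedance (from a GMPE) under a Poisson assumption. A minimal sketch of that final step (illustrative, not oq-engine code):

```python
import math

def classical_poe(rupture_terms, time_span=50.0):
    """Classical PSHA: combine per-rupture annual rates and conditional
    exceedance probabilities into the probability of exceeding an intensity
    level in `time_span` years, assuming Poissonian rupture occurrence.
    `rupture_terms` is a list of (annual_rate, P(IM > x | rupture))."""
    total_rate = sum(rate * poe for rate, poe in rupture_terms)
    return 1.0 - math.exp(-total_rate * time_span)
```

Evaluating this for a grid of intensity levels yields the hazard curve at a site; repeating over a grid of sites yields a hazard map.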
16. Incorporating models from the community
‣ United States 2008 (USGS)
‣ Canada (Canada Geological Survey)
‣ Alaska 2007 (USGS)
‣ Japan 2012 (J-SHIS – NIED)
‣ Australia (Geoscience Australia)
‣ Taiwan (Cheng et al., 2007)
‣ SHARE (regional program for Europe)
‣ EMME (Regional program for the Middle East)
‣ South America 2010 (USGS)
‣ Global Uniform Model
18. Testing, testing, testing
Testing typologies in the OpenQuake-engine:
- Unit tests
- Quality-assurance tests
“Many of these scientists rely on the fact that the software has appeared in a peer-reviewed article, recommendations, and personal opinion, as their reason for adopting software. This is scientifically misplaced, as the software code used to conduct the science is not formally peer-reviewed.” (Joppa et al., 2013; Science)
21. Verification calculations: SSHAC Level 3 project
‣ An application of the Classical PSHA methodology
‣ SSHAC Levels 3 and 4 are the most sophisticated PSHA models
‣ Compared the results of some Ground Motion Prediction Equations implemented in the OQ-engine against those prepared inside the project using commercial software for Probabilistic Seismic Hazard Analysis
‣ Computed hazard curves for a selected set of test cases
23. Index Event Table for the Bosphorus 1 deal
‣ An application of the Event-based PSHA methodology
‣ Computed Stochastic Event Sets (SES) of different durations using one of the SHARE models
‣ From the SES we obtained a collection of spatially correlated Ground Motion Fields
‣ For each Ground Motion Field we computed the corresponding Event Index
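The event-based workflow starts by sampling a stochastic event set from the source model's annual rates. A minimal sketch assuming Poissonian sources (illustrative only, not oq-engine code):

```python
import random

def sample_event_set(sources, duration_years, seed=42):
    """Draw a stochastic event set over an investigation time by sampling
    Poissonian occurrence times for each source.
    `sources` maps a source id to its annual occurrence rate."""
    rng = random.Random(seed)
    events = []
    for src, rate in sources.items():
        # Poisson process sampled via exponential inter-arrival times
        t = rng.expovariate(rate)
        while t < duration_years:
            events.append((src, t))
            t += rng.expovariate(rate)
    return events
```

Each sampled event is then expanded into a spatially correlated Ground Motion Field via a GMPE, which is the collection the slide above describes.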
28. Potential new features
Tsunami hazard
Tsunamis pose a serious risk in several coastal areas of the world. A module in the OQ-engine would thus be extremely useful.
Short-term hazard
The classical PSHA methodology takes into account only mainshocks. For this reason it is necessary to implement a specialised calculator for the assessment of losses produced by long aftershock sequences.
Non-parametric sources
This source typology will allow the calculation of hazard using virtually any PSHA input model. A non-parametric source is a list of ruptures, each with an associated probability of occurrence in a given time span.
Near-source effects
Some of the newest GMPEs incorporate terms accounting for near-source effects. Since the ground motion close to faults is largely controlled by these phenomena, their incorporation into hazard calculations can be of paramount importance.
30. Relative difference (%): J-SHIS – OpenQuake
Underestimation: probably due to missing modeling of the correction for anomalous seismic intensity distribution.
Overestimation in inland locations: J-SHIS uses point ruptures, while OpenQuake uses finite ruptures.