Presentation by Peter Vermeulen (Deltares) at the iMOD International User Day 2018, during Delft Software Days - Edition 2018. Tuesday 13 November 2018, Delft.
Detecting probability of ice formation on overhead lines of the Dutch railway... (Irene Garcia-Marti)
Slides used during my presentation at IEEE eScience 2018 conference in Amsterdam, during the parallel session "Weather & Climate Science in the Digital Era"
In computer science and mathematics, graphs are abstract data structures that model structural relationships among objects. They are now widely used for data modeling in application domains where identifying relationship patterns, rules, and anomalies is useful, such as the web graph and social networks. The ever-increasing size of graph-structured data in these applications creates a critical need for scalable systems that can process large amounts of it efficiently. The project aims to build a benchmarking tool that tests the performance of graph algorithms such as BFS, DFS, and PageRank on MapReduce, Giraph, and GraphLab, to determine which approach works best on which kinds of graphs.
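As a minimal single-machine sketch of the kind of measurement such a benchmarking tool would automate, the snippet below times a plain BFS over an adjacency list. The function names and the toy graph are illustrative assumptions, not part of the project described above.

```python
from collections import deque
import time

def bfs(adj, source):
    """Breadth-first search over an adjacency-list graph; returns visit order."""
    seen = {source}
    order = []
    queue = deque([source])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in adj.get(v, []):
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return order

def benchmark(algorithm, graph, source):
    """Time a single run of a graph algorithm, as a benchmark harness might."""
    start = time.perf_counter()
    result = algorithm(graph, source)
    return result, time.perf_counter() - start

# Toy directed graph: 0->1, 0->2, 1->3
g = {0: [1, 2], 1: [3], 2: [], 3: []}
order, elapsed = benchmark(bfs, g, 0)
```

A real harness would repeat runs, vary graph sizes and shapes, and swap in PageRank or DFS via the same `benchmark` interface.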
Hashing has witnessed an increase in popularity over the past few years due to the promise of compact encoding and fast query time. To be effective, hashing methods must maximally preserve the similarity between data points in the underlying binary representation. The current best-performing hashing techniques have utilised supervision. In this paper we propose a two-step iterative scheme, Graph Regularised Hashing (GRH), for incrementally adjusting the positioning of the hashing hypersurfaces to better conform to the supervisory signal: in the first step, the binary bits are regularised using a data similarity graph so that similar data points receive similar bits; in the second step, the regularised hashcodes form targets for a set of binary classifiers, which shift the position of each hypersurface so as to separate opposite bits with maximum margin. GRH exhibits superior retrieval accuracy to competing hashing methods.
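The two steps above can be sketched in a few lines of NumPy. This is a simplified illustration under stated assumptions, not the paper's implementation: the similarity graph is a row-normalised matrix, bit smoothing is a blend with neighbours' bits, and a plain perceptron stands in for GRH's max-margin classifiers.

```python
import numpy as np

def regularise_bits(B, S, alpha=0.5):
    """Step 1: smooth each point's +/-1 bits towards its graph neighbours' bits.
    B: (n, k) bit matrix; S: (n, n) row-normalised similarity graph."""
    blended = alpha * B + (1 - alpha) * S @ B
    return np.where(blended >= 0, 1, -1)

def fit_hypersurfaces(X, B, epochs=50, lr=0.1):
    """Step 2: one linear classifier per bit, refit so its hypersurface
    separates the +1 from the -1 bits (perceptron updates, for simplicity)."""
    W = np.zeros((X.shape[1], B.shape[1]))
    for _ in range(epochs):
        preds = np.where(X @ W >= 0, 1, -1)
        wrong = preds != B              # update only on misclassified bits
        W += lr * X.T @ (B * wrong)
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))                          # toy data points
B = np.where(rng.normal(size=(6, 2)) >= 0, 1, -1)    # initial 2-bit codes
S = np.full((6, 6), 1 / 6)                           # toy uniform similarity graph
B_reg = regularise_bits(B, S)
W = fit_hypersurfaces(X, B_reg)
codes = np.where(X @ W >= 0, 1, -1)                  # updated hashcodes
```

Iterating the two steps (re-encode with `W`, regularise again, refit) gives the incremental adjustment the abstract describes.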
Crop identification using geospatial technologies (GodiSaiKiran)
The ability to identify crop type makes it possible to estimate the area allocated to each crop and thus compute relevant statistics, providing essential information for controlling area-based subsidies. Accurate and fast estimation of crop area is also essential for projecting yearly agricultural output and setting agricultural policy.
Government agencies and agricultural managers require information on the spatial distribution and area of cultivated crops for planning purposes. Agencies can more adequately plan the import and export of food products based on such information. Although some ministries of agriculture and food security annually commission their staff to map different crop types, these ground surveys are expensive and cover only a sample of farms. Remote sensing data, together with ancillary information, enable the determination of the spatial distribution of crops at varying spatial scales with relatively few financial resources.
ACT Science Coffee, Towards super-resolution for astronomical applications, A... (Advanced-Concepts-Team)
Super-resolution techniques enable measurements beyond the resolution limit of conventional systems. Although such techniques have already been demonstrated in some fields, like microscopy, there is still no practically applicable method that would enable super-resolution in astronomy. By applying quantum estimation theory, we have taken the first steps towards a super-resolution strategy that could find various applications in astronomy, from the characterization of binary star and exoplanet systems to high-precision measurement of stellar magnetic fields. In this talk, the main principles, potential applications, and further challenges of our method will be discussed.
Building Science: Heat exchange analysis of a residential building (Shyam Anandjiwala)
The thermal performance of a building refers to the process of modelling the energy transfer between a building and its surroundings. For a conditioned building, it estimates the heating and cooling load, so the sizing and selection of HVAC equipment can be made correctly. For a non-conditioned building, it calculates the temperature variation inside the building over a specified period and helps one estimate the duration of uncomfortable conditions. These quantifications enable one to determine the effectiveness of a building's design and help in evolving improved designs for realising energy-efficient buildings with comfortable indoor conditions. The lack of proper quantification is one of the reasons why passive solar architecture is not popular among architects. Clients would like to know how much energy might be saved, or by how much the temperature might be reduced, to justify any additional expense or design change. Architects, too, need to know the relative performance of buildings to choose a suitable alternative. Thus, knowledge of the methods for estimating building performance is essential to the design of passive solar buildings.
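A minimal sketch of the non-conditioned case is a lumped-capacitance energy balance, C·dT/dt = Q_internal − UA·(T_in − T_out), integrated over time. This is a textbook single-zone simplification, not the analysis method of the presentation; all parameter values below are illustrative.

```python
def simulate_indoor_temperature(t_out, q_internal, ua, capacity, t_start,
                                seconds, dt=60.0):
    """Euler integration of C * dT/dt = Q_internal - UA * (T_in - T_out).
    t_out, t_start in degrees C; q_internal in W; ua in W/K; capacity in J/K."""
    t_in = t_start
    for _ in range(int(seconds / dt)):
        t_in += dt * (q_internal - ua * (t_in - t_out)) / capacity
    return t_in

def steady_state(t_out, q_internal, ua):
    """Long-run indoor temperature: gains balance envelope losses."""
    return t_out + q_internal / ua

# Toy building: 0 C outside, 1 kW internal gains, UA = 100 W/K, C = 1 MJ/K.
t_final = simulate_indoor_temperature(t_out=0.0, q_internal=1000.0, ua=100.0,
                                      capacity=1e6, t_start=20.0, seconds=200000)
```

Counting the hours for which `t_in` leaves a comfort band during such a simulation is exactly the "duration of uncomfortable periods" estimate mentioned above.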
Presentation on Salt Lake City Solar Energy Modeling project done in partnership with Utah Clean Energy and the Automated Geographic Reference Center (done in the style of Ignite lightning talks, with a bit of cheating).
Prepare LiDAR Data To Meet Your Requirements (Safe Software)
Watch the webinar video on demand at: www.safe.com/webinars
Find out how to quickly prepare LiDAR data to meet your requirements with FME, the leading technology for spatial data transformation. Through demonstrations, you'll see how you can easily perform coordinate system re-projection, format translation, and integration with GIS, CAD and raster data on millions or billions of points in seconds. We'll also share how the enhancements in FME 2012 make it even easier to get the most out of LiDAR data.
Using a Trimble TX9 terrestrial laser scanner, our surveying team scanned an active runway in Western Australia to a tolerance specification of 3 mm. Because the teams were working on a live site where aircraft had right of way, they had to set up and take down the scanner and targets to yield to any aircraft movements around the site and airspace. The surveyors took a ground-based approach rather than drone (UAS) capture in order to maintain tighter vertical control than drone capture can achieve. We were tasked with looking for deviations and rutting, and with identifying areas from which to derive Pavement Condition Index (PCI) criteria for the client's asset.
Once the data was captured, the surveying teams used TopoDot to assemble the raw scans into a consolidated model. They then attempted to run the software's pavement roughness algorithms against the nearly 3.4 billion points of classified data, but had to split the datasets into halves and quarters for the processing runs to complete. The Bentley product has a built-in "Road condition tool" that reports on pavement roughness characteristics, but it has preset expected pavement widths (road widths, not runway widths) set in the software. We explained to our surveyors that the algorithms might run faster in another product. This allowed us to explore FME as a point cloud processing workflow, using its feature table functionality to quickly generate the statistics required for the reporting deliverables from the entire dataset in one process.
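To make the per-tile statistics idea concrete, here is a hedged NumPy sketch of one crude rutting indicator: bin points into square tiles and report each tile's deviation of elevation from a best-fit plane. This is not the TopoDot or FME algorithm; the tile size, fields, and toy data are assumptions for illustration.

```python
import numpy as np

def tile_deviation_stats(points, tile_size=1.0):
    """Bin XYZ points into square tiles and report, per tile, the mean and
    max-absolute deviation of z from the surface's best-fit plane."""
    xyz = np.asarray(points, dtype=float)
    # Fit z = a*x + b*y + c by least squares over all points.
    A = np.column_stack([xyz[:, 0], xyz[:, 1], np.ones(len(xyz))])
    coeff, *_ = np.linalg.lstsq(A, xyz[:, 2], rcond=None)
    residual = xyz[:, 2] - A @ coeff
    # Group residuals by integer tile index.
    keys = np.floor(xyz[:, :2] / tile_size).astype(int)
    stats = {}
    for key, r in zip(map(tuple, keys), residual):
        stats.setdefault(key, []).append(r)
    return {k: (float(np.mean(v)), float(np.max(np.abs(v))))
            for k, v in stats.items()}

# Level 4 m x 4 m toy surface with one 10 mm rut point in the first tile.
pts = [(x * 0.5, y * 0.5, 0.0) for x in range(8) for y in range(8)]
pts.append((0.25, 0.25, -0.01))
report = tile_deviation_stats(pts, tile_size=1.0)
```

At billions of points the grouping would be vectorised or streamed tile-by-tile rather than built with Python lists, which is precisely where single-process tools start to struggle.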
GraphChi: Large-Scale Graph Computation on Just a PC
by Aapo Kyrola, Guy Blelloch, and Carlos Guestrin
[OSDI 2012]
Handling a large graph containing millions of vertices and billions of edges normally requires a distributed computing cluster, since the amount of data such a graph contains is also large. Using cloud services, we can easily perform operations on the graph in a distributed environment, but distributed systems bring disadvantages of their own: concurrency, security, scalability, and failure handling. What makes large graphs hard from a systems perspective is therefore the computation itself. A somewhat surprising motivation comes from thinking about what scalability means at large scale: from the programmer's perspective, writing, debugging, and optimizing distributed algorithms is hard.
If such big problems could instead be run on a single machine, with your IDE and its debugger, productivity and efficiency would improve. GraphChi is a disk-based system able to compute on large-scale graphs efficiently. The key is a novel "parallel sliding windows" method; using it, GraphChi can execute several advanced data mining algorithms on very large graphs using just a single consumer-level computer.
Clusters are complex and expensive to scale, whereas in this new model scaling is simple: you can double the throughput by doubling the machines. Industry often wants to compute many tasks on the same graph, but with a cluster dedicated to a single task, the only way to compute tasks faster is to grow the cluster. This work allows a different approach: since one machine can handle one big task, you can dedicate one task per machine.
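The shard layout behind parallel sliding windows can be sketched in miniature: vertices are split into intervals, shard p stores every edge whose destination lies in interval p, sorted by source, and one pass visits each interval with its in-edge shard plus a window over the other shards. The in-memory code below only illustrates the partitioning and access pattern, under assumed names; GraphChi itself streams the shards from disk.

```python
def build_shards(edges, num_vertices, num_shards):
    """Partition vertices into equal intervals; shard p holds every edge whose
    destination falls in interval p, sorted by source (GraphChi's layout)."""
    span = (num_vertices + num_shards - 1) // num_shards
    shards = [[] for _ in range(num_shards)]
    for src, dst in edges:
        shards[dst // span].append((src, dst))
    for shard in shards:
        shard.sort()
    return shards, span

def sliding_window_pass(shards, span, update):
    """One full pass: for each interval, its own shard supplies the in-edges,
    and a window over every shard supplies the out-edges."""
    for p, shard in enumerate(shards):
        lo, hi = p * span, (p + 1) * span
        out_edges = [e for s in shards for e in s if lo <= e[0] < hi]
        for v in range(lo, hi):
            update(v,
                   [e for e in shard if e[1] == v],      # in-edges of v
                   [e for e in out_edges if e[0] == v])  # out-edges of v

# Count total degree of each vertex in one pass over a toy graph, 2 shards.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 0)]
degree = {}
shards, span = build_shards(edges, num_vertices=4, num_shards=2)
sliding_window_pass(shards, span,
                    lambda v, ins, outs: degree.__setitem__(v, len(ins) + len(outs)))
```

Because each shard is sorted by source, the on-disk version can read the out-edge "window" for an interval sequentially, which is what makes the method efficient on a single machine.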
GraphChi (Michael Leznik, Head of BI - London, King)
GraphChi is a disk-based system for computing efficiently on graphs with billions of edges. By using a well-known method to break large graphs into small parts, and a novel parallel sliding windows method, GraphChi is able to execute several advanced data mining, graph mining, and machine learning algorithms on very large graphs, using just a single consumer-level computer.
Big Spatial(!) Data Processing with GeoMesa. AGIT 2019, Salzburg, Austria. (Anita Graser)
This talk introduces GeoMesa and discusses how it can be used to store and analyze massive amounts of movement data.
Talk recording: https://av.tib.eu/media/42874
Preparing LiDAR for Use in ArcGIS 10.1 with the Data Interoperability Extension (Safe Software)
Find out how to quickly prepare LiDAR data for use in ArcGIS 10.1 with the Data Interoperability Extension. Through demos, you’ll see how to perform: format translation; coordinate system re-projection; and integration with GIS, CAD, and raster data on millions of points in seconds. You'll also learn how to clip, tile, split, combine and more - overall enabling you to prepare LiDAR data according to your precise requirements and use it immediately in ArcGIS.
Despite the existence of data analysis tools such as R, SQL, and Excel, they are still insufficient to cope with today's big data analysis needs. The author proposes a CUI (character user interface) toolset with dozens of functions to neatly handle tabular data in TSV (tab-separated values) files. It implements many basic and useful functions that have not been implemented in existing software, with each function borrowing the ideas of the Unix philosophy and covering the most frequent pre-analysis tasks of the initial exploratory stage of data analysis projects. It also greatly speeds up basic analysis tasks, such as drawing cross tables and Venn diagrams, for which existing software inevitably requires rather complicated programming and debugging. Here, tabular data mainly means TSV files, as well as other CSV (comma-separated values)-type files, which are widely used for storing data and are well suited to data analysis.
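As an illustration of one of the basic tasks mentioned, a cross table over two TSV columns, here is a short Python sketch. It is not the author's toolset; the function name and sample data are assumptions.

```python
import csv
import io
from collections import Counter

def cross_table(tsv_text, row_field, col_field):
    """Count co-occurrences of two columns in TSV text, returning a nested
    dict {row_value: {col_value: count}} -- the kind of cross table the
    toolset would produce with a single command."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    counts = Counter((row[row_field], row[col_field]) for row in reader)
    table = {}
    for (r, c), n in counts.items():
        table.setdefault(r, {})[c] = n
    return table

tsv = "city\tproduct\nDelft\tA\nDelft\tB\nLeiden\tA\nDelft\tA\n"
table = cross_table(tsv, "city", "product")
```

A Unix-philosophy CUI tool would expose the same operation as a filter reading TSV on stdin and writing TSV on stdout, so it composes with the rest of the toolset in a pipeline.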
Data Usability Assessment for Remote Sensing Data: Accuracy of Interactive Data Quality Interpretation (Beniamino Murgante)
Erik Borg, Bernd Fichtelmann - German Aerospace Center, German Remote Sensing Data Center
Hartmut Asche - Department of Geography, University of Potsdam
Presentation about two emerging standards activities that I started and led in MPEG: point cloud compression, a new image and video format, and NBMP for media delivery in 5G networks. Presented at Philips R&D in Eindhoven, the Netherlands.
Wireless network implementation is a viable option for building network infrastructure in rural communities, where people lack the network infrastructure needed for information services and socio-economic development. The aim of this study was to develop a wireless network infrastructure architecture for delivering network services to rural dwellers. A user-centered approach was applied, and a wireless network infrastructure was designed and deployed to cover five rural locations. Data were collected and analyzed to assess the performance of the network facilities. The results show that the system performed adequately without any downtime, serving an average of 200 users per month, and that the quality of service remained high. The transmit/receive rate of 300 Mbps was three times the standard Ethernet transmit/receive specification, with an average throughput of 1 Mbps. The multiple-input/multiple-output (MIMO) point-to-multipoint network design increased the network throughput and the quality of service experienced by the users.
COMPLETE END-TO-END LOW COST SOLUTION TO A 3D SCANNING SYSTEM WITH INTEGRATED... (ijcsit)
3D reconstruction is a technique used in computer vision which has a wide range of applications in areas like object recognition, city modelling, virtual reality, physical simulations, video games, and special effects. Previously, performing a 3D reconstruction required specialized hardware; such systems were often very expensive and available only for industrial or research purposes. With the rising availability of high-quality, low-cost 3D sensors, it is now possible to design inexpensive, complete 3D scanning systems. The objective of this work was to design an acquisition and processing system that can perform 3D scanning and reconstruction of objects seamlessly. In addition, the goal included making the 3D scanning process fully automated by building a turntable and integrating it with the software, meaning the user can perform a full 3D scan with a press of a few buttons in our dedicated graphical user interface. Three main steps take the data from point cloud acquisition to the finished reconstructed 3D model. First, our system acquires point cloud data of a person or object using an inexpensive camera sensor. Second, it aligns and converts the acquired point cloud data into a watertight mesh of good quality. Third, it exports the reconstructed model to a 3D printer to obtain a proper 3D print of the model.
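The alignment part of the second step is commonly built on SVD-based rigid registration (the Kabsch method), which is the inner step of ICP-style pipelines. The sketch below assumes known point correspondences and is a generic illustration, not the paper's implementation.

```python
import numpy as np

def rigid_align(source, target):
    """Best rigid transform (R, t) mapping source onto target, given known
    point correspondences, via the SVD (Kabsch) method."""
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)      # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# A scan rotated 90 degrees about z and shifted should align back exactly.
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
cloud = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
moved = cloud @ Rz.T + np.array([2.0, -1.0, 0.5])
R, t = rigid_align(cloud, moved)
aligned = cloud @ R.T + t
```

In a real turntable scanner, correspondences are unknown, so ICP alternates nearest-neighbour matching with this closed-form solve until the scans converge, before meshing produces the watertight model.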
ICFHR 2014 Competition on Handwritten KeyWord Spotting (H-KWS 2014) (Konstantinos Zagoris)
H-KWS 2014 is the Handwritten Keyword Spotting Competition organized in conjunction with the ICFHR 2014 conference. The main objective of the competition is to record current advances in keyword spotting algorithms using established performance evaluation measures frequently encountered in the information retrieval literature. The competition comprises two distinct tracks, namely a segmentation-based and a segmentation-free track. Five distinct research groups participated in the competition, with three methods for the segmentation-based track and four methods for the segmentation-free track. The benchmarking datasets used in the contest contain both historical and modern documents from multiple writers. In this paper, the contest details are reported, including the evaluation measures and the performance of the submitted methods, along with a short description of each method.
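One of the standard retrieval measures used in such evaluations is mean average precision (MAP). The short sketch below shows the measure itself; it is generic information retrieval code, not the competition's evaluation tool, and the toy relevance lists are assumptions.

```python
def average_precision(ranked_relevance):
    """AP for one query: ranked_relevance is a list of 0/1 relevance flags in
    rank order; AP averages precision at each rank where a hit occurs."""
    hits, precision_sum = 0, 0.0
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / hits if hits else 0.0

def mean_average_precision(queries):
    """MAP: the mean of per-query average precision."""
    return sum(average_precision(q) for q in queries) / len(queries)

# Two toy keyword queries: a perfect ranking, and one with a miss at the top.
score = mean_average_precision([[1, 1, 0], [0, 1, 1]])
```

For a spotting contest, each query is a keyword, and the ranked list comes from each system's similarity scores over word images (segmentation-based) or page regions (segmentation-free).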
Scratch to Supercomputers: Bottoms-up Build of Large-scale Computational Lens... (inside-BigData.com)
In this deck from the 2018 Swiss HPC Conference, Gilles Fourestey from EPFL presents: Scratch to Supercomputers: Bottoms-up Build of Large-scale Computational Lensing Software.
"LENSTOOL is a gravitational lensing software that models mass distribution of galaxies and clusters. It was developed by Prof. Kneib, head of the LASTRO lab at EPFL, et al., starting from 1996. It is used to obtain sub-percent precision measurements of the total mass in galaxy clusters and constrain the dark matter self-interaction cross-section, a crucial ingredient to understanding its nature.
However, LENSTOOL lacks efficient vectorization and only uses OpenMP, which limits its execution to one node and can lead to execution times that exceed several months. Therefore, LASTRO and the EPFL HPC group decided to rewrite the code from scratch; to minimize risk and maximize performance, a bottom-up approach that focuses on exposing parallelism at the hardware and instruction levels was used. The result is a high-performance code, fully vectorized on Xeons, Xeon Phis, and GPUs, that currently scales up to hundreds of nodes on CSCS' Piz Daint, one of the fastest supercomputers in the world."
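To illustrate what "exposing parallelism at the instruction level" buys, here is a toy comparison between a scalar loop and the same array-wide computation, using a point-mass-style deflection kernel (direction scaled by 1/r²) as a stand-in. This is not LENSTOOL's code; the kernel, names, and inputs are illustrative assumptions.

```python
import numpy as np

def deflection_scalar(xs, ys, eps=1e-9):
    """Scalar loop: per-point kernel (x, y) / (x^2 + y^2), one point at a time,
    which compilers struggle to vectorize across iterations."""
    out = []
    for x, y in zip(xs, ys):
        r2 = x * x + y * y + eps
        out.append((x / r2, y / r2))
    return out

def deflection_vectorized(xs, ys, eps=1e-9):
    """Same kernel expressed over whole arrays, so it maps onto SIMD units
    (and ports naturally to Xeon Phi or GPU back ends)."""
    x = np.asarray(xs, float)
    y = np.asarray(ys, float)
    r2 = x * x + y * y + eps
    return np.column_stack([x / r2, y / r2])

grid_x = [1.0, 2.0, -1.0]
grid_y = [0.0, 1.0, 0.5]
result = deflection_vectorized(grid_x, grid_y)
```

The bottom-up rewrite described in the talk applies the same idea at the C/intrinsics level: make the innermost kernels data-parallel first, then layer node-level parallelism on top.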
Watch the video: https://wp.me/p3RLHQ-ili
Learn more: https://infoscience.epfl.ch/record/234382/files/EPFL_TH8338.pdf?subformat=pdfa
and
http://www.hpcadvisorycouncil.com/events/2018/swiss-workshop/agenda.php
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
A Predictive Stock Data Analysis with SVM-PCA Model
Divya Joseph and Vinai George Biju
HOV-kNN: A New Algorithm to Nearest Neighbor Search in Dynamic Space
Mohammad Reza Abbasifard, Hassan Naderi and Mohadese Mirjalili
A Survey on Mobile Malware: A War without End
Sonal Mohite and Prof. R. S. Sonar
An Efficient Design Tool to Detect Inconsistencies in UML Design Models
Mythili Thirugnanam and Sumathy Subramaniam
An Integrated Procedure for Resolving Portfolio Optimization Problems using Data Envelopment Analysis, Ant Colony Optimization and Gene Expression Programming
Chih-Ming Hsu
Emerging Technologies: LTE vs. WiMAX
Mohammad Arifin Rahman Khan and Md. Sadiq Iqbal
Introducing E-Maintenance 2.0
Abdessamad Mouzoune and Saoudi Taibi
Detection of Clones in Digital Images
Minati Mishra and Flt. Lt. Dr. M. C. Adhikary
The Significance of Genetic Algorithms in Search, Evolution, Optimization and Hybridization: A Short Review
2. Recent Developments
LLUR (Landesamt für Landwirtschaft, Umwelt und ländliche Räume, Geologischer Dienst Schleswig-Holstein):
• Double precision coordinates
AGS (Alberta Geological Survey):
• Extended queries for boreholes
• Artificial boreholes
• GEN files
16 november 2018
3. Double-Precision
Example in single precision (7 significant digits):
120000.0,440000.0
Such a coordinate allows a maximal accuracy of 0.1 m to be obtained; shifts smaller than 0.1 m cannot be projected accurately.
4. Double-Precision
Example (ETRS89/UTM) in single precision (7 significant digits):
590000.0,5659000.0
Such a coordinate allows a maximal accuracy of 1.0 m to be obtained; shifts smaller than 1.0 m cannot be projected accurately.
Using double precision enlarges the number of significant digits to 15.
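The loss of sub-metre accuracy in single precision can be demonstrated with a short Python sketch (illustration only, not iMOD code): round-trip a UTM northing through 32-bit storage with the standard library and compare against 64-bit arithmetic.

```python
import struct

def to_f32(x: float) -> float:
    """Round-trip a Python float (64-bit) through 32-bit storage."""
    return struct.unpack("f", struct.pack("f", x))[0]

northing = 5659000.0      # UTM northing from the slide's example
shifted = northing + 0.1  # a 0.1 m shift

# Near 5,659,000 the spacing between representable float32 values
# is 0.5 m, so the 0.1 m shift disappears entirely in single precision.
print(to_f32(shifted) - to_f32(northing))   # 0.0
# In double precision (~15 significant digits) the shift survives.
print(round(shifted - northing, 6))         # 0.1
```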
5. Double-Precision
iMOD uses a “shadow” coordinate
system
iMODMap
0,01000000,5000000
dx=100,dy=1001000100,5000100
16 november 2018
0,0
dx,dy
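The shadow-coordinate idea can be sketched as follows (a minimal illustration under assumed names, not iMOD's implementation): the double-precision map origin is stored once, and drawing happens in small local offsets that stay accurate even in single precision.

```python
# Map origin and cell sizes taken from the slide's example figure.
ORIGIN_X, ORIGIN_Y = 1000000.0, 5000000.0
DX = DY = 100.0

def to_shadow(x: float, y: float) -> tuple:
    """Translate a map coordinate to the local 'shadow' system."""
    return (x - ORIGIN_X, y - ORIGIN_Y)

# The map point (1000100, 5000100) becomes the local point (dx, dy).
print(to_shadow(1000100.0, 5000100.0))   # (100.0, 100.0)
```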
6. Double-Precision
IDF Files: choice between single and double precision. iMOD converts IDF files between single and double precision; besides the coordinates, the data will be double precision too (which doubles the size of an IDF-file).
Notes:
• If ASCII-files are imported, iMOD “sees” whether the coordinates need to be in double precision.
• If IDF Calc is used, single or double precision is inherited from the first mentioned IDF-file.
7. Double-Precision
ISG Files: choice between single and double precision. iMOD converts ISG files between single and double precision; besides the coordinates, the data (coordinates, elevations, cross-sections) will be double precision. An additional time attribute is added to the data for future usage.
9. GEN-files
Current GEN-file:
1
245143,516806
245114,516811
245095,516796
245073,516776
245027,516753
245004,516739
244994,516730
244938,516700
244927,516697
244917,516698
244903,516692
244895,516687
244887,516673
244883,516665
244834,516632
244815,516595
END
END
id,province,capital
1,Drenthe,Assen
2,Flevoland,Lelystad
3,Schiermonnikoog,Schiermonnikoog
4,Ameland,Nes
5,'Het Rif',-
6,Terschelling,West-Terschelling
7,Friesland,Leeuwarden
8,Vlieland,Oost-Vlieland
9,Richel,-
11,Gelderland,Arnhem
12,Rottemerplaat,-
13,Rottemeroog,-
14,Zuiderstrand,-
15,Simonszand,-
Drawback: very slow drawing of huge GEN-files.
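A reader for the ASCII layout above is easy to sketch (a hypothetical helper, not part of iMOD): one id line per feature, x,y pairs, an END that closes the feature, and a final END that closes the file.

```python
def read_ascii_gen(text: str) -> dict:
    """Parse the ASCII GEN polyline format shown above."""
    features, fid, points = {}, None, []
    for raw in text.splitlines():
        line = raw.strip()
        if not line:
            continue
        if line.upper() == "END":
            if fid is None:          # second END in a row: end of file
                break
            features[fid] = points   # END after points: close the feature
            fid, points = None, []
        elif fid is None:
            fid = line               # feature id line
        else:
            x, y = line.split(",")
            points.append((float(x), float(y)))
    return features

sample = "1\n245143,516806\n245114,516811\n245095,516796\nEND\nEND\n"
print(read_ascii_gen(sample))
```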
10. GEN-Files
ASCII and BINARY GEN-files both supported, however, for several
features in iMOD it is obliged to use the binary format. iMOD offers
the possibility to convert ASCII GEN-files into BINARY GEN-files.
16 november 2018
New BINARY format (*.GEN)
includes coordinates (double
precision) and attributes in a
single file.
11. GEN-files
The new BINARY format (*.GEN) displays much quicker, and performance increases further when zoomed in (example: a 140 MB GEN-file).
12. Extra’s
• Resizable and movable scalebar (interactively)
• Movable axes (interactively)
• Clipboard automatically takes the selected area bounded by the axes
13. Extra’s
• Capable of sorting IDF files in the iMOD Manager based upon keywords; in this manner it is easy to sort a list as TOP_L1.IDF, BOT_L1.IDF, TOP_L2.IDF, BOT_L2.IDF, etc.
• Select files in the iMOD Manager via a search string; this is handy to select multiple files to be given a similar legend.
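The keyword sorting described above can be imitated with a small sort key (an illustrative sketch, not the iMOD Manager's actual algorithm): group files per model layer via the `_L<n>` suffix, with TOP preceding BOT within a layer.

```python
import re

def layer_key(name: str) -> tuple:
    """Sort IDF file names per layer, TOP before BOT within a layer."""
    m = re.search(r"_L(\d+)\.IDF$", name, re.IGNORECASE)
    layer = int(m.group(1)) if m else 0
    # TOP first, BOT second; any other keyword sorts after both.
    keyword = {"TOP": 0, "BOT": 1}.get(name.split("_")[0].upper(), 2)
    return (layer, keyword)

files = ["BOT_L2.IDF", "TOP_L1.IDF", "BOT_L1.IDF", "TOP_L2.IDF"]
print(sorted(files, key=layer_key))
# ['TOP_L1.IDF', 'BOT_L1.IDF', 'TOP_L2.IDF', 'BOT_L2.IDF']
```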