WaterScope in Water Industry 4.0 (2022) – Csaba Ilcsik
Water Industry 4.0 - WaterScope S2N, developed for water industry experts and water utilities: data collection and wireless data transfer from sensors in the water distribution system to the server, with real-time monitoring and AI/BI-aided analysis.
An overview of ICTs as a strategic instrument in Smart Water Management; this deck acts as a catalyst for further discussion and future successful implementation of SWM initiatives in India.
Presentation by Steffen Zeuch, Researcher at German Research Center for Artificial Intelligence (DFKI) and Post-Doc at TU Berlin (Germany), at the FogGuru Boot Camp training in September 2018.
The evolution of machine learning and IoT has made it possible for manufacturers to build more effective applications for predictive maintenance than ever before. Despite the huge potential that machine learning offers for predictive maintenance, it's challenging to build solutions that can handle the speed of IoT data streams and the massively large datasets required to train models that can forecast rare events like mechanical failures. Solving these challenges requires knowledge of state-of-the-art dataware, such as MapR, and cluster computing frameworks, such as Spark, which give developers foundational APIs for consuming and transforming data into feature tables useful for machine learning.
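The feature-table idea above does not depend on any particular engine; in production one would use Spark's DataFrame APIs at scale, but the core transformation, turning a raw sensor stream into rolling-window feature rows, can be sketched in plain Python. All names here are illustrative, not from any specific product API:

```python
from collections import deque
from statistics import mean, stdev

def build_feature_rows(readings, window=5):
    """Turn a raw (timestamp, value) sensor stream into feature rows.

    Emits one row of rolling statistics per reading once the window
    is full; a cluster framework would do the same thing at scale.
    """
    buf = deque(maxlen=window)   # sliding window over the last N values
    rows = []
    for ts, value in readings:
        buf.append(value)
        if len(buf) == window:
            rows.append({
                "ts": ts,
                "mean": mean(buf),
                "std": stdev(buf),
                "min": min(buf),
                "max": max(buf),
            })
    return rows
```

A model for rare-event forecasting would then train on rows like these, each labelled with whether a failure occurred within some horizon after `ts`.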
It seems that everyone is talking about Big Data these days. As the Industrial Internet evolves and continues to feed the Big Data machine, companies are finding it more and more critical to develop strategies for turning data into information and information into intelligence. There's certainly no shortage of technologies in the marketplace for starting to play with the petabytes of data coming from within and outside of the enterprise.
This is the PowerPoint presentation used by our guest speaker Barbara (Barb) Kruetzkamp, IT Leader in Data Management at GE Aviation, to discuss approaches and frameworks for enhancing business intelligence capabilities by linking industrial and enterprise (internal) data. We also compared traditional vs. transformational IT execution models, and how to put data first.
Barb was born and raised in Cincinnati, Ohio. She attended Thomas More College in Kentucky, graduating with a B.A. in Computer Science and Business Administration. She built technical depth leading infrastructure architecture, then as a Chief Enterprise Architect at GE Corporate. Barb returned to GE Aviation in 2014 to lead the master data management initiative. Barb enjoys volunteering with the developmentally disabled, STEM and high school band students. She also likes to cook, jazzercise, and travel abroad. Her 3 kids keep life active and fun.
Digital Technologies for Improved Performance in Cognitive Production Plants – Mário Gamas
Developing new technologies to realise cognitive production plants, with improved efficiency and sustainability, through smart and networked sensor technologies, intelligent handling and online evaluation of various forms of data streams, as well as new methods for self-organizing processes and process chains.
In Short: Go from Smart to Smarter (Cognitive).
The world is changing and the Internet of Things is fast becoming the "new normal" in the Industry 4.0 age. Learn more about how IoT can be leveraged in your industry and what the Kaleidoscope IoT platform can do to lower the barrier to entry for IoT.
Review of Cloud Computing Simulation Platforms and Related Environments – RECAP Project
This presentation was given by Dr. James Byrne at the Cloud Simulation Workshop @ NC4 2017 on 11th April 2017. Dr. Byrne presents a review of cloud computing simulation platforms and related environments. He provides an overview and multi-level feature analysis of DES tools for cloud computing environments and discusses how these cloud simulation platforms are being used for research purposes.
From Cloud to Fog: the Tao of IT Infrastructure Decentralization – FogGuru MSCA Project
Keynote by Dr. Guillaume Pierre, Professor of Computer Science at the University of Rennes 1 (France), at the IEEE CloudNet conference, 4th November 2019.
Presentation by Guillaume Pierre, Professor of Computer Science at the University of Rennes 1 (France), at the FogGuru Boot Camp training in September 2018.
Optimizing Monitorability of Multi-cloud Applications – Monica Vitali
Performance is important, but so is monitoring data: how to take the monitorability capabilities of different cloud providers into account for application deployment.
Unlocking the Power of IoT: A comprehensive approach to real-time insights – Confluent
In today's data-driven world, the Internet of Things (IoT) is revolutionizing industries and unlocking new possibilities. Join Data Reply, Confluent, and Imply as we unveil a comprehensive solution for IoT that harnesses the power of real-time insights.
Michael will discuss some of the issues and challenges around Big Data. It is all very well building Big Data friendly databases to manage the tidal wave of real-time data that the IoT inevitably creates but this must also be incorporated into legacy data to deliver actionable insight.
Big Data LDN 2018: FORTUNE 100 LESSONS ON ARCHITECTING DATA LAKES FOR REAL-TI... – Matt Stubbs
Date: 13th November 2018
Location: Fast Data Theatre
Time: 11:10 - 11:40
Speaker: Sunil Mistry
Organisation: Attunity
About: How do you maximise the value from your operational data? There is a growing need to process and analyse data in motion, as your business looks to generate additional value from multiple data sources. Analysis of real-time data streams can deliver competitive business advantage, reduce costs, and create new revenue streams.
Come learn how Attunity, through CDC (change data capture) technology, helps organisations on this journey from a batch-oriented world to the modern streaming architecture, on premises and in the cloud. Learn how to bring your most valuable data from relational OLTP systems, mainframes and SAP into this modern data architecture.
Get Cloud Resources to the IoT Edge with Fog Computing – Biren Gandhi
Fog Computing as a foundational architectural concept for Internet of Things (IoT) and Internet of Everything (IoE).
Embedded devices in the IoT are hampered by the compute, storage, and service limitations of life on the edge. As IoT edge devices come to comprise broader sensor networks for industrial automation, transportation, and other safety-critical applications, their high uptime requirements are non-negotiable and service latencies must be kept within real-time or near-real-time parameters. However, the size, weight, power, and cost constraints of edge platforms also limit the on-device resources available for executing such functions. In this session, Gandhi will introduce Fog Computing, a new paradigm for the IoT that extends compute, storage, and application resources from the cloud to the network edge. Beyond the interplay between Fog and Cloud, Gandhi will show how Fog services can be leveraged across a range of heterogeneous platforms—from end-user devices and access points to edge routers and switches—through software technology that facilitates the collection, storage, analysis, and fusion of data to drive success in your next IoT device deployment.
Industrial production is becoming increasingly interlinked with modern information and communication technology. On the foundation of intelligent, digitally networked systems, largely self-organized production becomes possible. In Industrie 4.0, people, machinery, plants, logistics and products communicate and cooperate directly. To connect these different strands, a unified, flexible, high-performance system is needed to provide a company-wide, real-time information flow.
To target these issues, we developed enterprise:inmation.
It securely and efficiently gathers data from manufacturing, process control and IT systems all around the globe, contextualizes it and transforms it into actionable information, which is presented to every decision-maker on any device, anytime, at any location.
Software made by industrial system integration pros, in close cooperation with industry leaders. Business performance in real time, anytime, anywhere, for all decision-makers: that is enterprise:inmation.
Standing on the Shoulders of Open-Source Giants: The Serverless Realtime Lake... – HostedbyConfluent
Unlike just a few years ago, today the lakehouse architecture is an established data platform embraced by all major cloud data companies, such as AWS, Azure, Google, Oracle, Microsoft, Snowflake and Databricks.
This session kicks off with a technical, no-nonsense introduction to the lakehouse concept, dives deep into the lakehouse architecture and recaps how a data lakehouse is built from the ground up with streaming as a first-class citizen.
Then we focus on serverless for streaming use cases. Serverless concepts are well-known from developers triggering hundreds of thousands of AWS Lambda functions at a negligible cost. However, the same concept becomes more interesting when looking at data platforms.
We have all heard the principle "It runs best on PowerPoint", so I decided to skip slides here and bring a serverless demo instead:
A hands-on, fun, and interactive serverless streaming use case example where we ingest live events from hundreds of mobile devices (don't miss out - bring your phone and be part of it!!). Based on this use case I will critically explore how much of a modern lakehouse is serverless and how we implemented that at Databricks (spoiler alert: serverless is everywhere from data pipelines, workflows, optimized Spark APIs, to ML).
TL;DR benefits for data practitioners:
- Recap the OSS foundation of the Lakehouse architecture and understand its appeal
- Understand the benefits of leveraging a lakehouse for streaming and what's there beyond Spark Structured Streaming.
- Meat of the talk: the serverless lakehouse. I give you the tech bits beyond the hype. How does a serverless lakehouse differ from other serverless offerings?
- Live, hands-on, interactive demo exploring serverless data engineering end-to-end. For each step we take a critical look and I explain what it means for you, e.g. saving costs and removing operational overhead.
Webinar: Cutting Time, Complexity and Cost from Data Science to Production – Iguazio
Imagine a system where one collects real-time data, develops a machine learning model… Runs analysis and training on powerful GPUs… Clicks on a magic button and then deploys code and ML models to production… All without any heavy lifting from data and DevOps engineers. Today, data scientists work on laptops with just a subset of data and time is wasted while waiting for data and compute.
It’s about efficient use of time! Join Iguazio and NVIDIA so that you can get home early today! Learn how to speed up data science from development to production:
- Access to large scale, real-time and operational data without waiting for ETL
- Run high performance analytics and ML on NVIDIA GPUs (Rapids)
- Work on a shared, pre-integrated Kubernetes cluster with Jupyter Notebook and leading data science tools
- One-click (really!) deployment to production
Speakers: Yaron Haviv, CTO at Iguazio; Or Zilberman, Data Scientist at Iguazio; and Jacci Cenci, Sr. Technical Marketing Engineer at NVIDIA.
Thomas Weise, Apache Apex PMC Member and Architect/Co-Founder, DataTorrent - ... – Dataconomy Media
Thomas Weise, Apache Apex PMC Member and Architect/Co-Founder of DataTorrent presented "Streaming Analytics with Apache Apex" as part of the Big Data, Berlin v 8.0 meetup organised on the 14th of July 2016 at the WeWork headquarters.
Edge-Optimized Architecture for Fabric Defect Detection in Real Time – Shuquan Huang
In the textile industry, fabric defect detection traditionally relies on human inspection, which is inaccurate, inconsistent, inefficient and expensive. Automatic defect detection systems have been developed that identify faults in the fabric surface using image and video processing techniques. However, the existing solutions fall short in defect data sharing, backhaul interconnect, maintenance and more. By evolving to an edge-optimized architecture, we can help the textile industry improve fabric quality, reduce operating costs and increase production efficiency. In this session, I'll share:
What’s edge computing and why it’s important to intelligence manufacturing
What’s the characteristics, strengths and weaknesses of traditional fabric defect detection method
Why textile industry can benefit from edge computing infrastructure
How to design and implement an edge-enabled application for fabric defect detection in real-time
Insights, synergy and future research directions
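As a rough illustration of the anomaly-threshold idea behind automated defect detection (a sketch, not the method from the talk), the following flags scan rows whose mean intensity deviates strongly from the rest of the fabric; `find_defects` and the z-score threshold are assumptions for illustration:

```python
from statistics import mean, stdev

def find_defects(row_intensities, z_threshold=3.0):
    """Flag scan rows whose average gray level is a statistical outlier.

    `row_intensities` is a list of per-row mean intensities from a fabric
    scan; a real system would run full image processing on device, this
    only sketches the thresholding step.
    """
    mu = mean(row_intensities)
    sigma = stdev(row_intensities)
    if sigma == 0:           # perfectly uniform fabric: nothing to flag
        return []
    return [i for i, v in enumerate(row_intensities)
            if abs(v - mu) / sigma > z_threshold]
```

In an edge deployment, a check like this runs next to the camera, so only the flagged row indices (not raw video) need to cross the backhaul link.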
Workload Automation for Cloud Migration and Machine Learning Platform – Activeeon
Activeeon has developed two innovative solutions based on workflows for:
1. Workload Automation for Cloud Migration
2. Data Science and Machine Learning Platform
Independent of the source of data, the integration of event streams into an enterprise architecture is getting more and more important in a world of sensors, social media streams and the Internet of Things. Events have to be accepted quickly and reliably, and they have to be distributed and analyzed, often with many consumers or systems interested in all or part of the events. Storing such huge event streams in HDFS or a NoSQL datastore is feasible and no longer much of a challenge. But if you want to be able to react fast, with minimal latency, you cannot afford to first store the data and do the analysis later; you have to include part of your analytics right after you consume the data streams.

Products for event processing, such as Oracle Event Processing or Esper, have been available for quite a long time and used to be called Complex Event Processing (CEP). In the past few years, another family of products has appeared, mostly out of the Big Data technology space, called Stream Processing or Streaming Analytics. These are mostly open-source products/frameworks, such as Apache Storm, Spark Streaming, Flink and Kafka Streams, as well as supporting infrastructure such as Apache Kafka. In this talk I will present the theoretical foundations of stream processing, discuss the core properties a stream processing platform should provide, and highlight the differences you might find between the more traditional CEP and the more modern stream processing solutions.
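To make the stream-processing idea concrete, here is a minimal tumbling-window aggregation, the kind of operator that Kafka Streams, Flink or Spark Streaming provide natively, sketched in plain Python (function and parameter names are illustrative):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per key in fixed (tumbling) time windows.

    `events` is an iterable of (epoch_seconds, key) pairs. Each event
    belongs to exactly one window, identified by its start timestamp.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)
```

A CEP engine would instead match declarative patterns across events; a streaming framework additionally handles out-of-order data, state checkpointing and scaling, which this sketch deliberately ignores.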
MIPM PCo to Kafka: Faurecia SAP co-innovation at Hannover Messe 2017 – Jose Gascon
Co-innovation between Faurecia and SAP in the context of IIoT, capturing process data from manufacturing machines in real time, directly through SAP Leonardo into the data lake via the Apache Kafka message broker.
Most existing infrastructure handles stormwater passively. These systems, designed for a targeted event or average performance over the long term, often function poorly and contribute to negative environmental impacts including combined sewer overflows, poor water quality, and rapid runoff. In this eShowcase, Marcus Quigley will share how Internet-of-Things technology can provide intelligent, forecast-based controls to optimize the performance of stormwater infrastructure. Several case studies will highlight how cost-efficient retrofits result in optimized performance of existing infrastructure, keeping our water clean and our cities safe.
Climate Impact of Software Testing at Nordic Testing Days – Kari Kakkonen
My slides from Nordic Testing Days, 6 June 2024.
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Pushing the limits of ePRTC: 100ns holdover for 100 days – Adtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... – UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Transcript: Selling digital books in 2024: Insights from industry leaders - T... – BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
UiPath Test Automation using UiPath Test Suite series, part 4 – DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
The New Frontiers of AI in RPA with UiPath Autopilot™ – UiPathCommunity
In this free online event, organized by the Italian UiPath Community, you can explore the new features of Autopilot, the tool that integrates Artificial Intelligence into the processes of developing and using automations.
📕 Together we will look at some examples of using Autopilot in different tools of the UiPath Suite:
Autopilot for Studio Web
Autopilot for Studio
Autopilot for Apps
Clipboard AI
GenAI applied to Document Understanding
👨🏫👨💻 Speakers:
Stefano Negro, UiPath MVPx3, RPA Tech Lead @ BSP Consultant
Flavio Martinelli, UiPath MVP 2023, Technical Account Manager @UiPath
Andrei Tasca, RPA Solutions Team Lead @NTT Data
Enhancing Performance with Globus and the Science DMZ – Globus
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
The Metaverse and AI: how can decision-makers harness the Metaverse for their... – Jen Stirrup
The Metaverse is popularized in science fiction, and now it is becoming closer to being a part of our daily lives through the use of social media and shopping companies. How can businesses survive in a world where Artificial Intelligence is becoming the present as well as the future of technology, and how does the Metaverse fit into business strategy when futurist ideas are developing into reality at accelerated rates? How do we do this when our data isn't up to scratch? How can we move towards success with our data so we are set up for the Metaverse when it arrives?
How can you help your company evolve, adapt, and succeed using Artificial Intelligence and the Metaverse to stay ahead of the competition? What are the potential issues, complications, and benefits that these technologies could bring to us and our organizations? In this session, Jen Stirrup will explain how to start thinking about these technologies as an organisation.
DevOps and Testing slides at DASA Connect – Kari Kakkonen
Slides by Rik Marselis and me from the DASA Connect conference, 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We also held a lovely workshop with the participants, exploring different ways to think about quality and testing in different parts of the DevOps infinity loop.
PHP Frameworks: I want to break free (IPC Berlin 2024) – Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex ProofsAlex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
High Performance Green Infrastructure, New Directions in Real-Time Control
1. An internet-based system for adding real-time monitoring, control, and dashboards to existing and new infrastructure. OptiRTC
2. OptiRTC is… a means for adding real-time monitoring, conditional decision-making, control, and communications to existing infrastructure
3. OptiRTC is… a method of making existing and future active BMP technologies adaptive to changing environmental conditions (and maybe an incentive to add active controls to passive BMPs)
4. OptiRTC Overview
- Interfaces with in-the-field measurement devices and internet data feeds
- Logs data to a secure cloud-based solution
- Runs models on logged data, producing "decision space" data
- With measured data, decision-space data, and conditional logic: actuates devices in the field and sends internet-based communications
- Client-specific data visualization/control dashboards and mobile applications
7. Platform Architecture (diagram): field monitoring and control (sensors, gauges, and actuators) and internet-based weather forecasts or other data sources (METSTAT or other web service APIs) feed, via data logging and telemetry solutions, into Azure Tables/Blobs storage and the OptiRTC data aggregator and decision space, which drives the OptiRTC user interface (web services and user dashboards) and notifications by email, tweet, SMS, and voice autodial.
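The loop the architecture slide describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not OptiRTC's actual code: all function and field names are invented, the log list stands in for cloud storage, and the notifications list stands in for the email/SMS/tweet/voice channels.

```python
log = []            # stands in for Azure Tables/Blobs data logging
notifications = []  # stands in for email/SMS/tweet/voice channels

def on_reading(reading, forecast_rain_in):
    """Log a field measurement, build a decision-space record from measured
    plus forecast data, and apply conditional logic to pick an action."""
    log.append(reading)
    decision = {"depth_ft": reading["depth_ft"],
                "forecast_rain_in": forecast_rain_in}
    # Illustrative rule: open the outlet ahead of significant forecast rain.
    action = ("open_valve"
              if decision["depth_ft"] > 2.0 and decision["forecast_rain_in"] > 0.5
              else "hold")
    if action == "open_valve":
        notifications.append("SMS: opening valve ahead of forecast rain")
    return action

print(on_reading({"depth_ft": 3.2}, forecast_rain_in=1.1))  # open_valve
```

In the real platform the "actuation" step would call out to field hardware and the notification step to messaging services; the shape of the loop (sense, log, decide, act, communicate) is the point here.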
8. Major OptiRTC Component Technologies
- Server stack/cloud platform and storage solution
- Federated authentication service
- Physical-world interface
- Primary user interface for dashboards
- Future mobile applications
9. Key Design Features of OptiRTC
- Lay-friendly interface with no special software requirements for end users
- Web-based integration of data collection, analysis, and control
- Built out of proven technologies
- Treats environmental data as simply another enterprise data stream
- Easily customizable end-user experience
- Scalable cloud-based data processing and storage
- "Controllers" are just standard web services
- OData protocol support
- Very cost effective
Design feature details:
10. Key Design Feature 1: Built out of proven technologies. A highly flexible and scalable platform; cost effective. And… operating each component will only become less expensive as the service is scaled up and as the component technologies continue to improve.
11. Highly Flexible Organic "System" Design: the field hardware layer and the user experience do not need to be directly coupled. (Diagram: Systems 1-3, each mapped to its own decision dataset.)
13. Key Design Feature 2: OptiRTC treats environmental data as simply another enterprise data stream. This leverages readily available and powerful enterprise data management solutions, and new developments as they occur: database solutions; open-source visualization, statistical analysis, and reporting tools; and mobile platforms supported by a large and competent developer base (e.g., Microsoft Azure, Silverlight, HTML 5, etc.). Definition: an enterprise data stream is precisely defined, easily integrated, and effectively retrieved data for both internal applications and external communication. Data in well-run modern corporations are enterprise data streams (e.g., Geosyntec's accounting data).
14. Key Design Feature 3: "Controllers" are just standard web services. ANY internet-accessible structured dataset (e.g., weather feeds) can be collected and integrated into the decision space, and this can be done in real time.
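As a concrete illustration of folding an internet dataset into the decision space, the sketch below parses a hypothetical weather-feed JSON payload into a decision variable. The feed format and field names are invented for illustration; a real feed (METSTAT or any other web service) would simply be fetched over HTTP first.

```python
import json

# Hypothetical weather-feed payload; a real "controller" would fetch this
# from a web service rather than hard-coding it.
SAMPLE_FEED = ('{"station": "KBOS", "forecast": '
               '[{"hour": 1, "rain_in": 0.0}, {"hour": 2, "rain_in": 0.4}]}')

def rain_in_next_hours(feed_json, hours):
    """Reduce a structured feed to one decision variable: total forecast
    rainfall (inches) over the next `hours` hours."""
    feed = json.loads(feed_json)
    return sum(p["rain_in"] for p in feed["forecast"] if p["hour"] <= hours)

print(rain_in_next_hours(SAMPLE_FEED, 2))  # 0.4
```

Once reduced to a number like this, the feed is indistinguishable from a field sensor channel as far as the decision logic is concerned, which is the point of treating controllers as standard web services.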
15. Key Design Feature 4: Cloud-based data processing and storage
- No physical server hardware at Geosyntec
- Bandwidth availability (internet-facing external connectivity of 99.95%)
- Forward compatibility
- Massively redundant data storage
- Scalability
- 99.95% application uptime
17. Key Design Feature 5: OData Protocol Support. Clients can access their real-time data from within Microsoft Excel, or write their own web pages using their live data. Seamless support of modeling software.
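The kind of OData query a client (or Excel's data connector) issues is just a URL with standard system query options such as `$filter` and `$top`. The sketch below builds one; the endpoint and entity names are illustrative, not OptiRTC's actual service.

```python
def odata_query(base_url, entity, filter_expr=None, top=None):
    """Build an OData query URL using the $filter and $top query options."""
    parts = []
    if filter_expr is not None:
        # Percent-encode spaces so the URL is valid as-is.
        parts.append("$filter=" + filter_expr.replace(" ", "%20"))
    if top is not None:
        parts.append("$top=" + str(top))
    return base_url + "/" + entity + ("?" + "&".join(parts) if parts else "")

print(odata_query("https://example.com/odata", "Readings",
                  filter_expr="Depth gt 2.0", top=10))
# https://example.com/odata/Readings?$filter=Depth%20gt%202.0&$top=10
```

Because the protocol is an open standard, any OData-aware tool (spreadsheets, reporting packages, modeling software) can issue queries like this against live data without custom integration work.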
18. Run Complicated Models in Real Time: run SWMM or another DSS model as part of decision-space calculations; incorporate spatial processing libraries (e.g., the MapWindow GIS .NET spatial processing library and the EPA SWMM model); control active systems in real time based on model output.
20. Technology Application - CSO Control
- >$10B in green infrastructure will be built over the next 20 years as part of LTCP compliance requirements
- OptiRTC provides a high-performance return on investment compared with passive control technologies or traditional grey infrastructure alone
- OptiRTC provides new solutions that are not possible with passive control
Examples:
- Advanced rainwater harvesting systems (conventional rainwater harvesting is not a viable CSO control measure)
- Active blue and green roofs: significantly increased performance and reduced cost for the same level of control
- Retrofit of existing ponds and water features: drawdown in advance of precipitation
- "Smart" detention: instantaneous real-time detention coupled with in-sewer and weather monitoring/modeling
(Images: Geosyntec DDOE project self-cleaning inverted siphon; advanced rainwater harvesting system design rendering, construction in May 2011)
21. Technology Application - CSO Control. Simplest definition of advanced rainwater harvesting: drain storage in advance of predicted rainfall or another trigger.
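The "drain in advance of predicted rainfall" rule can be sketched as a simple timing calculation: start draining once the time needed to empty the tank is about to exceed the time remaining before the forecast storm. The drain rate, rain trigger, and example values below are illustrative assumptions, not numbers from the DDOE project.

```python
def should_start_drawdown(stored_gal, drain_rate_gpm, hours_to_storm,
                          forecast_rain_in, rain_trigger_in=0.25):
    """Decide whether to begin draining a harvesting/detention tank now."""
    if forecast_rain_in < rain_trigger_in:
        return False  # no significant rain predicted: keep water for reuse
    hours_to_empty = stored_gal / (drain_rate_gpm * 60.0)
    return hours_to_empty >= hours_to_storm

# 950 gallons draining at 2 gpm takes about 7.9 hours to empty, so with a
# storm forecast 7 hours out and 1.0 in of rain predicted, drawdown begins.
print(should_start_drawdown(950, 2, 7, 1.0))  # True
```

This is what distinguishes advanced harvesting from passive detention: when no rain is forecast the water stays available for onsite use, and the tank is only emptied just in time to capture the coming event.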
22. Flow Comparison – DDOE Modeling: timing the release relative to the forecast (blue line) allows a dramatic reduction in wet-weather discharge without giving up harvesting performance. Note there is no discharge during the baseline event. Water remains in the system for potential onsite use while providing improved CSO flow control; the system drains only right before events, so the detention tank is empty except during rainfall.
23. Technology Application - CSO Control: DDOE Modeling Summary
- Baseline runoff volume: 12,680 cf/yr
- Passive detention wet-weather runoff volume: 11,326 cf/yr (11% reduction)
- OptiRTC-controlled wet-weather runoff volume: 3,899 cf/yr (69% reduction in wet-weather flow volume)
Note: no harvesting factored in; assumes accurate forecasts.
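The percentage reductions quoted on this slide follow directly from the stated annual volumes, as a quick check shows:

```python
# Annual runoff volumes from the DDOE modeling summary (cubic feet per year).
baseline = 12680
passive = 11326      # passive detention
controlled = 3899    # OptiRTC-controlled

passive_reduction = (1 - passive / baseline) * 100
controlled_reduction = (1 - controlled / baseline) * 100
print(round(passive_reduction))     # 11
print(round(controlled_reduction))  # 69
```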
24. Technology Application – Proposed Gowanus Canal Smart Detention Retrofit: >85% reduction in wet-weather volume with a 950-gallon smart detention system; >90% reduction in wet-weather volume with a 1,200-gallon smart detention system.
26. Technology Application - Retrofit for CSO and Water Quality Control: retrofit the outlet to draw down the pond or wetland (slightly) prior to rain events that might cause CSOs downstream. Minimal investment with high ROI for the client.
27. Technology Application – Wetland Retrofit (depth time series figures): average hydraulic residence time is 13 days with an uncontrolled outlet versus 24 days with an actively controlled outlet.
KEY: Clients get a social incentive to buy Geosyntec services…
Key point: Standing on the shoulders of giants… not re-inventing the wheel, assembling the carriage.
Systems are arbitrary groupings of controllers in our system. A controller can be part of n systems. The variables available in a system, as defined by the channels of the controllers in that system, define the collection of variables available for use in calculating the dataset of decision variables. Theoretically, this could result in multiple decision datasets and sets of actions (also grouped at the system level) acting on the same controller's devices. In practice, the wizard for building these systems is being set up to check for this, and it is possible to enforce such logic with referential integrity rules at the database level (read: constrained, made impossible).
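The note about making conflicting actions "impossible" at the database level can be illustrated with a uniqueness constraint: if the actions table allows at most one row per device, no two systems' decision datasets can ever target the same device. The schema below is a sketch with invented table names, not OptiRTC's actual schema, shown with SQLite for brevity.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE devices (id INTEGER PRIMARY KEY)")
conn.execute("""
    CREATE TABLE actions (
        system_id INTEGER NOT NULL,
        device_id INTEGER NOT NULL REFERENCES devices(id),
        UNIQUE (device_id)  -- at most one system may act on each device
    )""")
conn.execute("INSERT INTO devices (id) VALUES (1)")
conn.execute("INSERT INTO actions VALUES (1, 1)")  # system 1 controls device 1
try:
    conn.execute("INSERT INTO actions VALUES (2, 1)")  # system 2: rejected
except sqlite3.IntegrityError:
    print("second system rejected")
```

With the constraint in place, the wizard's check becomes a convenience for the user rather than the only line of defense: even a buggy client cannot create two decision datasets acting on the same device.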