The document discusses Digital Twin, a key element of Industry 4.0. It provides an overview of Digital Twin including definitions, examples, and the features and capabilities of Eclipse Ditto, an open source Digital Twin platform. Specifically, Eclipse Ditto allows [1] creation of virtual representations of physical devices through abstractions and APIs, [2] interaction with and modification of Digital Twins, and [3] searching across large sets of Digital Twins through filtering and querying. The presentation aims to demonstrate how Digital Twin technologies like Eclipse Ditto can help implement key Industry 4.0 use cases.
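As an illustration of points [1] and [3], a Ditto-style Thing representation and an RQL-style search filter might look like the following sketch; the thing ID, feature names, and threshold are invented for this example, not taken from the presentation.

```python
import json

# A minimal Ditto-style Thing: a namespaced ID plus "features" holding the
# device's state. All identifiers here are hypothetical.
thing = {
    "thingId": "com.example:milling-machine-01",
    "attributes": {"location": "hall-3"},
    "features": {
        "spindle": {"properties": {"rpm": 4200, "temperature": 61.5}}
    },
}

# Ditto's things-search accepts RQL-style filters; this one would match all
# twins whose spindle temperature exceeds a threshold.
search_filter = "gt(features/spindle/properties/temperature,60.0)"

payload = json.dumps(thing)
print(search_filter)
print(json.loads(payload)["features"]["spindle"]["properties"]["rpm"])
```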
This document provides an introduction and overview of ElasticSearch, including:
- ElasticSearch is an open source search and analytics engine that is written in Java and based on Lucene. It allows for full text search, real-time analytics, and distributed scaling.
- The document outlines the server-side and client-side architecture, including indexing data, performing searches, and using plugins and APIs.
- Examples are provided of starting an ElasticSearch cluster, indexing and searching for data, and using features like scroll searching and facets.
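The indexing and search examples mentioned above can be sketched as the JSON bodies an Elasticsearch client would send; the document fields and values here are invented for illustration.

```python
import json

# A document to index, with made-up fields.
index_doc = {"title": "Digital Twin workshop", "views": 1200}

# A full-text "match" query combined with a range filter, the kind of
# query body used for search requests.
query = {
    "query": {
        "bool": {
            "must": [{"match": {"title": "digital twin"}}],
            "filter": [{"range": {"views": {"gte": 100}}}],
        }
    }
}

body = json.dumps(query)
print(body)
```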
This document discusses serverless computing and Nuclio, an open source serverless platform. It provides an overview of serverless and its benefits, describes Nuclio's features like high performance, support for any programming language or data source, and easy deployment. Examples are given for real-time serverless applications involving IoT, analytics, and AI.
The Alfresco Development Framework (ADF) provides over 100 reusable Angular components and services, development tools to streamline building applications, and is based on standard technologies like Angular and Material Design; it has four pillars including the JavaScript library, Angular components, app generator, and example apps; and the framework core utilizes technologies like JavaScript, HTML5, CSS, TypeScript, Angular, and development tools like Node, NPM, and GitHub.
Testing API Platform with Behat BDD tests (Stefan Adolf)
An add-on talk for the API Platform talk that goes into the depth of BDD support in API Platform. Use Behat to write human-readable feature and scenario descriptions. Accompanying demo repo for the basic Behat test cases: https://github.com/coding-berlin/great-countries
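A minimal sketch of the human-readable scenarios Behat executes, with the step-to-definition pattern matching mimicked in Python; the feature wording and endpoint below are hypothetical, not taken from the demo repo.

```python
import re

# A Gherkin scenario like the ones Behat runs; wording is invented.
scenario = """\
Feature: Country listing
  Scenario: Fetch all countries
    Given the fixture countries are loaded
    When I send a GET request to "/api/countries"
    Then the response status code should be 200
"""

# Behat binds each step to a context method via a pattern; this mimics
# that matching in Python.
step_pattern = re.compile(r'I send a GET request to "([^"]+)"')
match = step_pattern.search(scenario)
print(match.group(1))
```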
Lessons Learnt from Running Thousands of On-demand Spark Applications (Itai Yaffe)
Ada Sharoni (Software Engineering Architect) @ Hunters:
Imagine you had to manage thousands of Spark applications that are automatically spinning up on-demand upon every customer interaction.
Our unique constraints in Hunters have led us to adopt an architecture and concepts that we believe many other companies will find useful.
In this lecture we will share our solutions and insights in running many lightweight, cheap Spark applications on Kubernetes that can easily survive frequent restarts and smartly share resources on Spot EC2 instances.
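The kind of configuration such on-demand Spark-on-Kubernetes applications are submitted with can be sketched as follows; the image, namespace, and resource values are placeholders, not Hunters' actual settings.

```python
# Assemble a spark-submit command line for Kubernetes; all values are
# illustrative placeholders.
conf = {
    "spark.kubernetes.container.image": "registry.example.com/spark-app:latest",
    "spark.kubernetes.namespace": "spark-jobs",
    "spark.executor.instances": "2",
    "spark.dynamicAllocation.enabled": "true",  # share cluster resources
    "spark.kubernetes.executor.request.cores": "500m",
}

args = ["spark-submit", "--master", "k8s://https://kube-apiserver:6443"]
for key, value in sorted(conf.items()):
    args += ["--conf", f"{key}={value}"]
print(" ".join(args))
```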
Building AOL's High Performance, Enterprise Wide Mail Application With Silver... (goodfriday)
Come join the Rich Internet Application engineering team from AOL and see first-hand how AOL created a rich, scalable mail application using Microsoft Silverlight 2.
OpenShift.Run 2019: Use Dapr Before You Get Tired of Microservice Development (kei omizo)
Dapr is an open source, portable runtime that makes it easy for developers to build distributed applications across cloud and edge. It provides common services for microservices applications such as service invocation, state management, publish/subscribe, secrets management and more. Dapr runs as a sidecar to manage application logic and provides a standard way for applications to integrate with infrastructure components like databases, queues and object storage.
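Dapr's sidecar model can be illustrated with its state-management building block: the application talks to the local sidecar over HTTP (port 3500 by default). The store name and key below are hypothetical.

```python
import json

# Dapr state API: the app POSTs key/value pairs to its local sidecar,
# which forwards them to the configured state store component.
DAPR_PORT = 3500
STORE = "statestore"  # component name; hypothetical

url = f"http://localhost:{DAPR_PORT}/v1.0/state/{STORE}"
body = json.dumps([{"key": "order-42", "value": {"status": "shipped"}}])
print(url)
print(body)
```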
Sparkling Water Webinar, October 29th, 2014 (Sri Ambati)
Sparkling Water is the newest application on the Apache Spark in-memory platform to extend Machine Learning for better predictions and to quickly deploy models into production. H2O is proud to partner with Cloudera and Databricks to bring this capability to a wide audience.
H2O is for data scientists and business analysts who need scalable and fast machine learning. H2O is an open source predictive analytics platform. Unlike traditional analytics tools, H2O provides a combination of extraordinary math and high performance parallel processing with unrivaled ease of use. H2O speaks the language of data science with support for R, Python, Scala, Java and a robust REST API. Smart business applications are powered by H2O's NanoFast™ Scoring Engine. Learn more by going to http://www.h2o.ai and contact us for more information.
- Powered by the open source machine learning software H2O.ai. Contributors welcome at: https://github.com/h2oai
- To view videos on H2O open source machine learning software, go to: https://www.youtube.com/user/0xdata
Fully Automate Application Delivery with Puppet and F5 - PuppetConf 2014 (Puppet)
The document discusses F5 programmability and using Puppet for automation and deployment. It provides an overview of F5 programmability tools like iRules, iApps, and iControl. It then covers benefits of using Puppet for infrastructure as code and automation. Examples are given of using REST APIs and languages like Perl and Python to programmatically configure F5 devices.
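The programmatic configuration described above can be sketched against BIG-IP's iControl REST interface; the virtual-server name, address, and pool below are made-up examples, and only the request payload is constructed (no device is contacted).

```python
import json

# iControl REST drives BIG-IP configuration over HTTPS; this sketches the
# payload for creating an LTM virtual server. Values are illustrative.
endpoint = "/mgmt/tm/ltm/virtual"
virtual_server = {
    "name": "vs_web",
    "destination": "10.0.0.10:443",
    "pool": "pool_web",
}

request_body = json.dumps(virtual_server)
print(endpoint, request_body)
```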
Oracle GoldenGate 18c - REST API Examples (Bobby Curtis)
This document provides examples of using RESTful APIs with Oracle GoldenGate 18c. It includes examples of creating, deleting, and listing extracts, replicats, credentials, and distribution paths using cURL commands. It also provides examples of using RESTful APIs within shell scripts to automate administration tasks like adding extracts and replicats.
This document provides an overview of Apache Apex and real-time data visualization. Apache Apex is a platform for developing scalable streaming applications that can process billions of events per second with millisecond latency. It uses YARN for resource management and includes connectors, compute operators, and integrations. The document discusses using Apache Apex to build real-time dashboards and widgets using the App Data Framework, which exposes application data sources via topics. It also covers exporting and packaging dashboards to include in Apache Apex application packages.
ApacheCon @Home 2020
StreamPipes is an open source self-service IoT toolbox to enable non-technical users to connect, analyze and explore IoT data streams.
https://streampipes.apache.org/
Choreo Community Call 1: How to Create a Service in Choreo!
Agenda:
1. Introduction to Choreo
2. Introduction to Service Components
- What Service components are and their use cases
- Capabilities
3. Demo
- Deploy a Todo list web application
Watch the video at : https://youtube.com/live/FX06RgpNUB4
Kubernetes is a container cluster manager that aims to provide a platform for automating deployment, scaling, and operations of application containers across clusters of machines. It uses pods as the basic building block, which are groups of application containers that share storage and networking resources. Kubernetes includes control planes for replication, scheduling, and services to expose applications. It supports deployment of multi-tier applications through replication controllers, services, labels, and pod templates.
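A minimal sketch of how labels tie a pod template to a Service, using a Deployment (the modern successor to the replication controllers mentioned above); all names and the image are illustrative.

```python
# A minimal Deployment manifest as a dict: the selector's labels must match
# the pod template's labels, and the Service routes via the same label.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "web"}},
        "template": {
            "metadata": {"labels": {"app": "web"}},
            "spec": {"containers": [{"name": "web", "image": "nginx:1.25"}]},
        },
    },
}

# The Service exposes the pods carrying the same label.
service = {
    "apiVersion": "v1",
    "kind": "Service",
    "metadata": {"name": "web"},
    "spec": {"selector": {"app": "web"}, "ports": [{"port": 80}]},
}

# The label linkage is what makes multi-tier deployment work.
assert deployment["spec"]["template"]["metadata"]["labels"] == service["spec"]["selector"]
```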
This document provides an overview of cloud-native patterns for building applications. It begins with an agenda that includes a cloud-native patterns overview, applying the patterns to apps, and a demo. It then maps out various cloud-native patterns and principles including microservices, API design, service discovery, resiliency patterns, observability, and chaos engineering. It demonstrates applying these patterns through a demo application and discusses monitoring the application with Prometheus and Grafana. The document emphasizes building cloud-native software through a methodology, tools, and platform approach.
How do you apply modern Cloud-native patterns to your apps? In this talk, you'll find how to use frameworks like Spring Boot & Spring Cloud to build agile & resilient apps, leveraging Cloud platforms. Get the app source code here: https://github.com/alexandreroman/yatc.
Going FaaSter, Functions as a Service at Netflix (Yunong Xiao)
The document discusses Netflix's use of serverless computing via its own Function as a Service (FaaS) platform. Some key points:
- Netflix built its own FaaS platform called Titus that runs functions at scale using containers for portability and efficiency.
- The platform handles operations concerns so developers can focus on business logic. It provides a full runtime API and handles updates, metrics, and management automatically.
- Netflix developed tools like NEWT to improve the developer experience with one-click setup, local development and debugging, testing, and CI/CD integration for fast and reliable software development.
Apache Ambari BOF Meet Up @ Hadoop Summit 2013
APIs and SPIs – How to Integrate with Ambari
http://www.meetup.com/Apache-Ambari-User-Group/events/119184782/
The document provides steps to connect to a CloudFoundry environment and deploy a sample Predix application. It includes instructions on installing the CF CLI, logging in, listing services, creating a PostgreSQL service instance, pushing a sample app, and binding the app to the database. The steps cover common operations for deploying and managing apps on Pivotal CloudFoundry and interacting with services on Predix.
Docker Berlin Meetup June 2015: Docker powering Radical Agility @ Zalando Tech (Henning Jacobs)
Docker allows for radical agility in deploying applications. STUPS is Zalando's cloud platform that uses Docker and microservices to give development teams autonomy while maintaining compliance. It provides immutable infrastructure, centralized logging, and monitoring across isolated AWS accounts for each team.
SymfonyCon Berlin 2016 - Symfony Plugin for PhpStorm - 3 years later (Haehnchen)
In 2013 the "Symfony Plugin" for PhpStorm was born. Today we see over 1 million downloads and several other plugins for projects like Laravel, Drupal, Shopware, ... that help to improve your productivity.
I will talk about Symfony related features and will give you some tips and tricks. Also, we take a look at the infrastructure behind these plugins and how I maintain all of them.
Knative - Deploy and Manage Modern Container-based Serverless Workloads (Elad Hirsch)
Knative is the new kid in town in the Serverless community.
As Kubernetes is de facto our cloud infrastructure, Knative allows us to focus more on our business logic and less on infrastructure, all while committing to the new paradigm of Serverless computing.
This session will explore a high-level overview of Knative and follow the architectural design of a modern data pipeline shifting from AWS Lambda to Knative.
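A minimal Knative Service sketch illustrating the Lambda-to-Knative shift described above: Knative scales the revision down to zero when idle, which is what makes the serverless model fit on Kubernetes. The name and image are placeholders.

```python
# Minimal Knative Service manifest as a dict; Knative manages revisions,
# routing, and scale-to-zero for the container. Values are illustrative.
knative_service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "event-transformer"},
    "spec": {
        "template": {
            "spec": {
                "containers": [
                    {"image": "registry.example.com/transformer:latest"}
                ]
            }
        }
    },
}
print(knative_service["apiVersion"])
```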
This document discusses transforming monolithic applications into microservices using Red Hat OpenShift. It provides an overview of OpenShift capabilities like application lifecycle management, container orchestration, security and monitoring. It then describes a hands-on lab where developers will learn OpenShift concepts, efficient development workflows and promoting applications between environments using CI/CD pipelines.
DeployR: Revolution R Enterprise with Business Intelligence Applications (Revolution Analytics)
The document discusses Revolution R Enterprise, a product from Revolution Analytics that allows scaling of R to the enterprise level. It enables distributed high performance analytics and building/deploying analytics applications. Revolution R Enterprise uses Revolution DeployR, which provides a way to make R accessible and scalable to any web-enabled application through common APIs. It also discusses stateless and stateful execution of R scripts/code through DeployR's RESTful API and examples of implementation in JavaScript.
Sebastien Thomas, System Architect at Coyote Amerique, gave a presentation on operator frameworks. His talk covered how Operator SDK can be used to create Kubernetes Operators with Go.
This document describes a hands-on technical workshop on transforming applications from monoliths to microservices using Red Hat OpenShift. It provides an overview of OpenShift capabilities including container orchestration, application lifecycle management, and cloud-native development tools. It also outlines the goal and steps for a lab on developing applications on OpenShift, including separating development and production environments and promoting applications between environments using CI/CD pipelines.
System and Software Engineering for Industry 4.0 (Pankesh Patel)
This document provides an overview of Industry 4.0 concepts including:
- Examples of Industry 4.0 use cases like predictive maintenance, quality control, and remote monitoring.
- The Industry 4.0 architecture including devices, edge computing, data lakes, analytics, and applications.
- The technology stack required including connectivity, security, device and data management, analytics, and digital twins.
Getting Started for SMEs in Industry 4.0 (Pankesh Patel)
The document describes an Industry 4.0 assessment tool for manufacturers to evaluate their current position and maturity levels. The tool consists of a questionnaire that covers different assessment dimensions, including current products/services, customer access, value chains/processes, and IT architecture. It provides examples of questions to understand manufacturers' Industry 4.0 practices. The tool was used by a company called Semforex to assess its maturity levels and identify problems and solutions, such as implementing a web portal for online product measurements.
The document discusses how cloud platforms can help enable Industry 4.0 applications, providing examples of companies using Microsoft Azure and AWS for uses like predictive maintenance, quality assurance, and enhancing services. It also covers technologies on these cloud platforms that can be used to develop Industry 4.0 applications, such as Azure IoT and various Azure preconfigured solutions and accelerators.
Similar to Hands-on Workshop on Building Digital Twin for Factory of the Future
Software Tools for Building Industry 4.0 Applications (Pankesh Patel)
The document discusses software tools for building Industry 4.0 applications. It describes challenges like fragmentation, complexity, and lock-in in Industry 4.0. The document presents an approach using middleware and rapid application development tools to address these challenges. Specific tools discussed include Node-RED for rapid prototyping, SMEWB to close gaps between technical experts and software, and IoTSuite as a toolkit for prototyping IoT applications.
Accelerating Application Development in the Internet of Things using Model-dr... (Pankesh Patel)
This document discusses model-driven development approaches for accelerating application development in the Internet of Things (IoT).
It introduces IoTSuite, a toolkit that enables IoT application development with minimal effort through separation of concerns. Domain, functionality, and deployment specifications are compiled to generate programming frameworks. This reduces development time and effort and improves reusability.
It also describes the SMEWB (Subject Matter Expert Workbench), which aims to empower industrial subject matter experts to create, reuse, and deploy analytic algorithms with little coding. It allows dragging and dropping to develop analytic modules and supports various deployment options.
Smart Factory - App Based Quality Monitoring (Pankesh Patel)
The document discusses the development of an app-based monitoring system to improve quality assurance in manufacturing. It notes that visual inspections currently make up 90% of quality checks but are inefficient. The proposed system would use an app to randomly select visual checks at different steps, upload images for verification, and store them for future reference to create an easy and low-cost quality monitoring solution. Screenshots provide an example of how the app would guide users through recording checks during a sample manufacturing process.
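The random spot-check idea described above could be sketched as follows: pick a subset of manufacturing steps for visual inspection instead of checking every step. The step names and sample size are invented for illustration.

```python
import random

# Manufacturing steps eligible for a visual check; names are hypothetical.
steps = ["cutting", "welding", "coating", "assembly", "packaging"]

def select_checks(steps, k, seed=None):
    """Randomly choose k steps to photograph and verify via the app."""
    rng = random.Random(seed)
    return sorted(rng.sample(steps, k))

selected = select_checks(steps, 2, seed=7)
print(selected)
```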
This document discusses accelerating the development of analytic models by subject matter experts (SMEs) at ABB. It describes the traditional process where SMEs worked with developers to create models, which took months to integrate into solutions. The new approach allows SMEs to independently create and evolve models using a drag-and-drop workbench, with models integrated into solutions within minutes by developers. This closes the gaps between SME knowledge and solutions by empowering SMEs as end-user developers and decoupling model development from solution development lifecycles.
This is the user manual for IoTSuite. It describes how to set up IoTSuite on your PC for programming and how to use the development environment provided by IoTSuite.
A step-by-step video guide is available at URL: https://www.youtube.com/watch?v=nS_Je7IzPvM
IoTSuite: A Framework to Design, Implement, and Deploy IoT Applications (Pankesh Patel)
IoTSuite is a framework that provides modeling languages and automation techniques to design, implement, and deploy IoT applications with reduced development effort compared to existing approaches. It integrates different life-cycle phases like design, implementation, and deployment. Early results show that IoTSuite requires fewer lines of code than general purpose languages or Node-RED to develop a smart home IoT application.
Towards application development for the internet of things (Pankesh Patel)
Application development in the Internet of Things (IoT) is challenging because it involves dealing with a wide range of related issues, such as a lack of separation of concerns and a lack of high-level abstractions to address both the large scale and heterogeneity. Moreover, stakeholders involved in application development have to address issues that can be attributed to different life-cycle phases. First, the application logic has to be analyzed and then separated into a set of distributed tasks for an underlying network. Then, the tasks have to be implemented for the specific hardware. Apart from handling these issues, stakeholders have to deal with other life-cycle aspects such as changes in application requirements and deployed devices.
Several approaches have been proposed in the closely related fields of wireless sensor networks, ubiquitous and pervasive computing, and software engineering in general to address the above challenges. However, existing approaches only cover limited subsets of these challenges when applied to the IoT. This work proposes an integrated approach for addressing them. The main contributions of this work are: (1) a development methodology that separates IoT application development into different concerns and provides a conceptual framework to develop an application, and (2) a development framework that implements the development methodology to support the actions of stakeholders. The development framework provides a set of modeling languages to specify each development concern and abstracts the scale- and heterogeneity-related complexity. It integrates code generation, task-mapping, and linking techniques to provide automation. Code generation supports the application development phase by producing a programming framework that allows stakeholders to focus on the application logic, while our mapping and linking techniques together support the deployment phase by producing device-specific code, resulting in a distributed system collaboratively hosted by individual devices. Our evaluation based on two realistic scenarios shows that the use of our approach improves the productivity of stakeholders involved in application development.
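The code-generation idea described in the abstract — producing a programming framework so developers only fill in application logic — can be illustrated with a minimal sketch. The spec format, component names, and generated method names below are illustrative assumptions, not IoTSuite's actual languages or output:

```python
# Minimal illustration of generating a skeleton from a declarative
# component spec. The developer fills the on_* hooks with application
# logic; emit_* stubs would be wired to the middleware by the linker.
# Spec layout and naming are assumptions for illustration only.

SPEC = {
    "component": "RoomAvgTemp",
    "inputs": ["tempMeasurement"],
    "outputs": ["roomAvgTemp"],
}

def generate_skeleton(spec: dict) -> str:
    """Emit Python source for a component skeleton from a spec."""
    lines = [f"class {spec['component']}:"]
    for inp in spec["inputs"]:
        lines.append(f"    def on_{inp}(self, value):")
        lines.append("        pass  # application logic goes here")
    for out in spec["outputs"]:
        lines.append(f"    def emit_{out}(self, value):")
        lines.append("        pass  # wired to middleware by the linker")
    return "\n".join(lines)

print(generate_skeleton(SPEC))
```

Running this prints a class skeleton with one handler per input and one emit stub per output, showing how generation lets the stakeholder focus on logic rather than plumbing.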
Cloud computing, which provides cheap and pay-as-you-go computing resources, is rapidly gaining momentum as an alternative to traditional IT infrastructure. As more and more consumers delegate their tasks to cloud providers, Service Level Agreements (SLAs) between consumers and providers emerge as a key aspect. Due to the dynamic nature of the cloud, continuous monitoring of Quality of Service (QoS) attributes is necessary to enforce SLAs. Numerous other factors, such as trust in the cloud provider, also come into consideration, particularly for enterprise customers that may outsource their critical data. This complex nature of the cloud landscape warrants a sophisticated means of managing SLAs. This paper proposes a mechanism for managing SLAs in a cloud computing environment using the Web Service Level Agreement (WSLA) framework, developed for SLA monitoring and SLA enforcement in a Service-Oriented Architecture (SOA). We use the third-party support feature of WSLA to delegate monitoring and enforcement tasks to other entities in order to solve the trust issues. We also present a real-world use case to validate our proposal.
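The monitoring step the abstract describes — a third party continuously checking measured QoS against agreed targets — can be sketched briefly. The metric names, thresholds, and dictionary-based SLA representation are illustrative assumptions, not the WSLA framework's actual schema:

```python
# Hedged sketch of SLA monitoring delegated to a third party, in the
# spirit of the WSLA approach summarized above. Metric names and
# thresholds are illustrative assumptions.

SLA = {"availability_pct": 99.9, "avg_response_ms": 200}  # agreed QoS targets

def check_sla(measured: dict, sla: dict = SLA) -> list:
    """Return the SLA attributes violated in one monitoring window."""
    violations = []
    if measured["availability_pct"] < sla["availability_pct"]:
        violations.append("availability_pct")
    if measured["avg_response_ms"] > sla["avg_response_ms"]:
        violations.append("avg_response_ms")
    return violations

# A third-party monitor feeds in measurements and triggers enforcement
# (penalties, notifications) when the returned list is non-empty:
window = {"availability_pct": 99.95, "avg_response_ms": 250}
print(check_sla(window))  # → ['avg_response_ms']
```

Delegating this check to an entity other than the provider is what addresses the trust issue: neither party has to take the other's measurements on faith.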
Towards application development for the physical cyber-social systems – Pankesh Patel
This document discusses challenges in developing applications for physical-cyber-social systems due to heterogeneity, large scale, and requiring multiple expertise. It proposes a development framework that separates concerns through high-level programming abstractions and automation to address these challenges. The framework uses vocabulary and architecture languages and frameworks to specify domains and functionality, compiles these into middleware, and links generated code for different devices and platforms.
A model driven development framework for developing sense-compute-control app... – Pankesh Patel
This document proposes a model-driven development framework for developing Sense-Compute-Control (SCC) applications. SCC applications involve sensors collecting data from the environment, computational services processing the data, and actuators controlling the environment. The framework aims to enable development of SCC applications with minimal effort from stakeholders by integrating modeling languages, addressing SCC characteristics, and automation techniques. It includes vocabulary, architecture, and deployment modeling languages as well as frameworks to generate code for different platforms from the models.
A tool suite for prototyping internet of things applications – Pankesh Patel
This document describes a ToolSuite for prototyping Sense-Compute-Control (SCC) applications. The ToolSuite aims to enable development of SCC applications with minimal effort from the various stakeholders involved. It provides common concepts at different levels, such as functionality, domain, and deployment, to enable reusability; these concepts include entities of interest, sensors, actuators, and computational services. In the ToolSuite, domain experts specify vocabularies, software designers specify architectures, and application developers implement the application logic. It generates code for different platforms and links all components through a middleware.
Enabling high level application development for internet of things – Pankesh Patel
Application development in the Internet of Things (IoT) is challenging because it involves dealing with a wide range of related issues such as lack of separation of concerns, and lack of high-level abstractions to address both the large scale and heterogeneity. Moreover, stakeholders involved in the application development have to address issues that can be attributed to different life-cycle phases when developing applications. First, the application logic has to be analyzed and then separated into a set of distributed tasks for an underlying network. Then, the tasks have to be implemented for the specific hardware. Apart from handling these issues, they have to deal with other aspects of the life-cycle such as changes in application requirements and deployed devices.
Several approaches have been proposed in the closely related fields of wireless sensor networks, ubiquitous and pervasive computing, and software engineering in general to address the above challenges. However, existing approaches only cover limited subsets of the above-mentioned challenges when applied to the IoT. This paper proposes an integrated approach for addressing the above-mentioned challenges. The main contributions of this paper are: (1) a development methodology that separates IoT application development into different concerns and provides a conceptual framework to develop an application, (2) a development framework that implements the development methodology to support actions of stakeholders. The development framework provides a set of modeling languages to specify each development concern and abstracts the scale and heterogeneity related complexity. It integrates code generation, task-mapping, and linking techniques to provide automation. Code generation supports the application development phase by producing a programming framework that allows stakeholders to focus on the application logic, while our mapping and linking techniques together support the deployment phase by producing device-specific code to result in a distributed system collaboratively hosted by individual devices. Our evaluation based on two realistic scenarios shows that the use of our approach improves the productivity of stakeholders involved in the application development.
Enabling high level application development for internet of things – Pankesh Patel
This document proposes an approach for enabling high-level application development for the Internet of Things (IoT). The approach addresses challenges like heterogeneity, scale, and the involvement of multiple areas of expertise through: 1) modeling languages that abstract heterogeneity and scale, 2) a clear division of roles for stakeholders, and 3) code generators that automate the development process and reduce hand-written code. It presents a smart buildings example and an evaluation showing that the approach reduces development effort by generating 72-74% of the code compared to manual development. Ongoing work includes further evaluation and support for end-user applications and evolution.
Application development for the internet of things – Pankesh Patel
1. The document discusses application development challenges for the Internet of Things (IoT), including heterogeneity of devices, large scale, lack of separation of concerns, and life-cycle issues.
2. It proposes a conceptual model that classifies IoT concepts and relates development concerns to promote reusability.
3. A multi-stage model-driven approach is presented using a set of modeling languages to abstract heterogeneity, scale, and support automation across development stages.
Enabling High Level Application Development In The Internet Of Things – Pankesh Patel
The Internet of Things (IoT) combines Wireless Sensor and Actuation Networks (WSANs), pervasive computing, and the elements of the "traditional" Internet such as Web and database servers. This leads to the dual challenges of scale and heterogeneity in these systems, which comprise a large number of devices of different characteristics. In view of the above, developing IoT applications is challenging because it involves dealing with a wide range of related issues, such as lack of separation of concerns, the need for domain experts to write low-level code, and the lack of specialized domain-specific languages (DSLs). Existing software engineering approaches only cover a limited subset of the above-mentioned challenges.
In this work, we propose an application development process for the IoT that aims to comprehensively address the above challenges. We first present the semantic model of the IoT, based on which we identify the roles of the various stakeholders in the development process, viz., domain expert, software designer, application developer, device developer, and network manager, along with their skills and responsibilities. To aid them in their tasks, we propose a model-driven development approach which uses customized languages for each stage of the development process: Srijan Vocabulary Language (SVL) for specifying the domain vocabulary, Srijan Architecture Language (SAL) for specifying the architecture of the application, and Srijan Network Language (SNL) for expressing the properties of the network on which the application will execute; each is customized to the skill level and area of expertise of the relevant stakeholder. For the application developer specifying the internal details of each software component, we propose the use of a customized generated framework using a language such as Java. Our DSL-based approach is supported by code generation and task-mapping techniques in an application development tool developed by us. Our initial evaluation based on two realistic scenarios shows that the use of our techniques/framework succeeds in improving productivity while developing IoT applications.
Towards application development for the internet of things updated – Pankesh Patel
The document discusses developing a domain model for Internet of Things (IoT) applications. It identifies common IoT behaviors like data collection, sense-compute-actuate, and intermittent sensing. An IoT domain model is presented that captures key concepts like entities, sensors, actuators, devices, and software components, as well as their relationships. The domain model provides benefits like a common understanding of IoT terminology, modeling invariant properties, and enabling modular application design.
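The concepts the domain model captures — entities, sensors, actuators, devices, and their relationships — can be sketched as a small class model. The class and attribute names below are assumptions chosen for illustration, not the paper's actual metamodel:

```python
# Illustrative sketch of the kind of concepts an IoT domain model
# captures and how they relate. Names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class EntityOfInterest:
    """The thing being observed or controlled, e.g. a room."""
    name: str

@dataclass
class Device:
    """Physical host for sensors and actuators."""
    identifier: str
    located_in: EntityOfInterest

@dataclass
class Sensor:
    """Observes a property of an entity via its hosting device."""
    measures: str
    attached_to: Device

@dataclass
class Actuator:
    """Affects a property of an entity via its hosting device."""
    controls: str
    attached_to: Device

# Wiring the concepts together the way the domain model relates them:
room = EntityOfInterest("meeting-room-1")
node = Device("node-42", located_in=room)
temp = Sensor(measures="temperature", attached_to=node)
heater = Actuator(controls="heating", attached_to=node)
```

The value of such a model is exactly the benefit the summary names: everyone working on the application shares one vocabulary for what a "sensor" or "device" is and how they relate.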
What is an RPA CoE? Session 2 – CoE Roles – DianaGray10
In this session, we will review the players involved in the CoE and how each role impacts opportunities.
Topics covered:
• What roles are essential?
• What place in the automation journey does each role play?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc... – DanBrown980551
This LF Energy webinar took place June 20, 2024. It featured:
-Alex Thornton, LF Energy
-Hallie Cramer, Google
-Daniel Roesler, UtilityAPI
-Henry Richardson, WattTime
In response to the urgency and scale required to effectively address climate change, open source solutions offer significant potential for driving innovation and progress. Currently, there is a growing demand for standardization and interoperability in energy data and modeling. Open source standards and specifications within the energy sector can also alleviate challenges associated with data fragmentation, transparency, and accessibility. At the same time, it is crucial to consider privacy and security concerns throughout the development of open source platforms.
This webinar will delve into the motivations behind establishing LF Energy’s Carbon Data Specification Consortium. It will provide an overview of the draft specifications and the ongoing progress made by the respective working groups.
Three primary specifications will be discussed:
-Discovery and client registration, emphasizing transparent processes and secure and private access
-Customer data, centering around customer tariffs, bills, energy usage, and full consumption disclosure
-Power systems data, focusing on grid data, inclusive of transmission and distribution networks, generation, intergrid power flows, and market settlement data
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectors – DianaGray10
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations, for seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
"What does it really mean for your system to be available, or how to define w... – Fwdays
We will talk about system monitoring from a few different angles. We will start by covering the basics, then discuss SLOs, how to define them, and why understanding the business well is crucial for success in this exercise.
AppSec PNW: Android and iOS Application Security with MobSF – Ajin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf – Chart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
The Department of Veteran Affairs (VA) invited Taylor Paschal, Knowledge & Information Management Consultant at Enterprise Knowledge, to speak at a Knowledge Management Lunch and Learn hosted on June 12, 2024. All Office of Administration staff were invited to attend and received professional development credit for participating in the voluntary event.
The objectives of the Lunch and Learn presentation were to:
- Review what KM ‘is’ and ‘isn’t’
- Understand the value of KM and the benefits of engaging
- Define and reflect on your “what’s in it for me?”
- Share actionable ways you can participate in Knowledge Capture & Transfer
Essentials of Automations: Exploring Attributes & Automation Parameters – Safe Software
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
Northern Engraving | Nameplate Manufacturing Process - 2024 – Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
QA or the Highway - Component Testing: Bridging the gap between frontend appl... – zjhamm304
These are the slides for the presentation, "Component Testing: Bridging the gap between frontend applications" that was presented at QA or the Highway 2024 in Columbus, OH by Zachary Hamm.
"Scaling RAG Applications to serve millions of users", Kevin Goedecke – Fwdays
How we managed to grow and scale a RAG application from zero to thousands of users in 7 months. Lessons from technical challenges around managing high load for LLMs, RAGs and Vector databases.
What is an RPA CoE? Session 1 – CoE Vision – DianaGray10
In the first session, we will review the organization's vision and how this has an impact on the COE Structure.
Topics covered:
• The role of a steering committee
• How do the organization’s priorities determine CoE Structure?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
ScyllaDB is making a major architecture shift. We’re moving from vNode replication to tablets – fragments of tables that are distributed independently, enabling dynamic data distribution and extreme elasticity. In this keynote, ScyllaDB co-founder and CTO Avi Kivity explains the reason for this shift, provides a look at the implementation and roadmap, and shares how this shift benefits ScyllaDB users.
Main news related to the CCS TSI 2023 (2023/1695) – Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Dandelion Hashtable: beyond billion requests per second on a commodity server – Antonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, which go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state of the art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design (1) offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. On a commodity server with a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
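The core structural idea — closed addressing with short, bounded chains so deletes free slots immediately — can be shown in a deliberately simplified sketch. This is single-threaded Python with an arbitrary bucket capacity; DLHT's real design is lock-free, cache-line-aware, and prefetch-driven, none of which is modeled here:

```python
# Simplified sketch of a closed-addressing hashtable with bounded
# buckets. BUCKET_SLOTS and the resize policy are illustrative
# assumptions, standing in for "entries that fit one cache line".

BUCKET_SLOTS = 7

class BoundedChainTable:
    def __init__(self, n_buckets=8):
        self.n_buckets = n_buckets
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % self.n_buckets]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # update in place
                return
        if len(bucket) >= BUCKET_SLOTS:  # chain full: grow the index
            self._resize()
            bucket = self._bucket(key)
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

    def delete(self, key):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket.pop(i)  # slot is freed instantly, unlike tombstones
                return True
        return False

    def _resize(self):
        old = [pair for b in self.buckets for pair in b]
        self.n_buckets *= 2
        self.buckets = [[] for _ in range(self.n_buckets)]
        for k, v in old:
            self._bucket(k).append((k, v))

t = BoundedChainTable()
for i in range(100):
    t.put(i, i * i)
t.delete(3)
```

The contrast with open addressing is visible in `delete`: removing a pair from a chain frees the slot immediately, whereas open-addressing schemes typically need tombstones or blocking compaction.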
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency – ScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.