SAP Secure Login Service for SAP GUI: Overview of the Successor to SAP S... (IBsolution GmbH)
Contents:
In March of this year, SAP released the successor to SAP Single Sign-On. Numerous companies with SSO in use now face the challenge of implementing this new solution. We show you how the switch from SSO to the SAP Secure Login Service proceeds and what advantages SAP's new cloud-based solution brings.
Target audience:
- IT managers
- CSO
- CIO
Agenda:
- Replacement of SAP Single Sign-On
- Introduction to SAP Secure Login Service for SAP GUI
- Functions, benefits, and added value
More about us:
Website: https://www.ibsolution.com/
Careers portal: https://ibsolution.de/karriere/
Webinars: https://www.ibsolution.com/academy/webinare
YouTube: https://www.youtube.com/user/IBSolution
LinkedIn: https://de.linkedin.com/company/ibsolution-gmbh
Xing: https://www.xing.com/companies/ibsolutiongmbh
Facebook: https://de-de.facebook.com/IBsolutionGmbH/
Instagram: https://www.instagram.com/ibsolution/?hl=de
Further information:
https://www.ibsolution.com/academy/webinar-aufzeichnungen/sap-security-sap-idm-wartungsende-2027-und-was-nun
This document provides an agenda and introduction for a session on automating Enterprise Performance Management (EPM) Cloud solutions using EPM Automate. The agenda includes an introduction to EPM Cloud and automation, an overview of EPM Automate, examples of Windows batch automation architecture, EPM Automate algorithms, use cases, challenges and other possibilities. The introduction provides background on the presenter and their experience with EPM solutions, as well as an overview of EPM Cloud products and the need for automation in EPM processes.
The document provides an overview of SAP Cloud Platform, including key use cases for integrating apps and data, extending existing cloud and on-premise apps, and building new cloud apps. It also discusses connecting people and data. Customer stories demonstrate how companies are using SAP Cloud Platform for integration, innovation, Internet of Things applications, and digital experiences. Architectural blueprints illustrate potential implementations involving SAP and non-SAP systems and applications.
Aljoscha Krettek is the PMC chair of Apache Flink and Apache Beam, and co-founder of data Artisans. Apache Flink is an open-source platform for distributed stream and batch data processing. It allows for stateful computations over data streams in real-time and historically. Flink supports batch and stream processing using APIs like DataSet and DataStream. Data Artisans originated Flink and provides an application platform powered by Flink and Kubernetes for building stateful stream processing applications.
The document provides an overview of the actions available in the SAP interface for managing SAP transactions and function calls. It describes the SAP BC Start Session, SAP BC Connector, SAP BC Commit, and SAP BC Rollback actions for starting a session, executing calls, and committing or rolling back transactions. It also summarizes the SAP Web AS Interface action and SAP Interface Repository action. Common properties for connecting to SAP systems are also defined.
Introduction to Extracting Data from SAP S/4HANA with ABAP CDS Views (Luc Vanrobays)
The document provides guidance on extracting data from SAP S/4HANA to SAP BW/4HANA using CDS views. It discusses that SAP has developed a communication scenario requiring configuration in both S/4HANA and BW. Extracts can be done in full or delta mode, with hierarchies supported in full mode. CDS views containing the extractor annotation can be found using the view browser in S/4HANA.
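For orientation, a minimal sketch of such an extraction-enabled CDS view. The view and field names are hypothetical; only the `@Analytics.dataExtraction` annotations are the actual mechanism the document refers to:

```abap
@AbapCatalog.sqlViewName: 'ZSDORDEXTR'
@Analytics.dataExtraction.enabled: true
// Delta extraction driven by a timestamp-like element (field name is illustrative)
@Analytics.dataExtraction.delta.byElement.name: 'LastChangedAt'
define view Z_SalesOrder_Extract
  as select from vbak
{
  key vbeln        as SalesOrder,
      erdat        as CreationDate,
      aedat        as LastChangedAt  // hypothetical change-date element for the delta
}
```

Views carrying these annotations are the ones the S/4HANA view browser surfaces as extraction candidates.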
DevOps and APIs: Great Alone, Better Together (MuleSoft)
DevOps has emerged as a critical enabler of agility in enterprise IT; a DevOps model increases reliability and minimizes disruption, with the added benefit of increasing speed. But that isn’t enough. DevOps must be balanced with a focus on asset consumption and reuse to make sure the organization is extracting maximum value out of all the newly built assets. And that’s where an API strategy comes in. In this session, we'll discuss how organizations use DevOps and API-led connectivity to reduce time to market 3-4x.
From Postgres to Event-Driven: Using docker-compose to Build CDC Pipelines in... (Confluent)
Mark Teehan, Principal Solutions Engineer, Confluent
Use the Debezium CDC connector to capture database changes from a Postgres database (or MySQL or Oracle), streaming them into Kafka topics and onwards to an external data store. Examine how to set up this pipeline using Docker Compose and Confluent Cloud, and how to use various payload formats such as Avro, Protobuf, and JSON Schema.
https://www.meetup.com/Singapore-Kafka-Meetup/events/276822852/
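The change events Debezium publishes to those Kafka topics share a standard envelope: before/after row images plus an op code. A minimal Python sketch of applying such events to a local table; the payloads below are illustrative, not taken from the talk:

```python
import json

def apply_change(event_json, table):
    """Apply one Debezium-style change event (JSON envelope) to an in-memory
    dict keyed by primary key 'id'. Handles create/read, update, and delete."""
    payload = json.loads(event_json)["payload"]
    op = payload["op"]  # 'c'=create, 'u'=update, 'd'=delete, 'r'=snapshot read
    if op in ("c", "r", "u"):
        row = payload["after"]
        table[row["id"]] = row
    elif op == "d":
        row = payload["before"]
        table.pop(row["id"], None)
    return table

# Illustrative events shaped like Debezium's JSON output (schemas omitted):
create = json.dumps({"payload": {"op": "c", "before": None,
                                 "after": {"id": 1, "name": "alice"}}})
update = json.dumps({"payload": {"op": "u", "before": {"id": 1, "name": "alice"},
                                 "after": {"id": 1, "name": "bob"}}})
delete = json.dumps({"payload": {"op": "d", "before": {"id": 1, "name": "bob"},
                                 "after": None}})

table = {}
for ev in (create, update, delete):
    apply_change(ev, table)
```

In the real pipeline the same logic sits behind a Kafka consumer (or a sink connector does it for you); the envelope handling is what the payload-format choice — Avro, Protobuf, or JSON Schema — changes on the wire, not in structure.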
BW Migration to HANA Part 1 - Preparation in the BW System (Linh Nguyen)
This series of publications provides an overview and explanation of the major steps and considerations for BW on HANA migrations from anyDB (any database). The complex procedure involves:
1) Preparatory work in the BW system
2) SUM DMO Upgrade and Actual migration
3) Post processing on the migrated systems
This first part focuses on the preparation tasks on the BW system.
By OZSoft Consulting for ITConductor.com
Author: Terry Kempis
Editor: Linh Nguyen
Kafka Error Handling Patterns and Best Practices | Hemant Desale and Aruna Ka... (Hosted by Confluent)
Transaction Banking from Goldman Sachs is a high-volume, latency-sensitive digital banking platform offering. We have chosen an event-driven architecture to build highly decoupled, independent microservices in a cloud-native manner, designed to meet the objectives of security, availability, latency, and scalability. Kafka was a natural choice: it decouples producers and consumers and scales easily for high-volume processing. However, certain aspects require careful consideration: handling errors and partial failures, managing downtime of consumers, and securing communication between brokers and producers/consumers. In this session, we will present the patterns and best practices that helped us build robust event-driven applications, along with a solution approach that has been reused across multiple application domains. We hope that by sharing our experience, we can establish a reference implementation that application developers can benefit from.
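One widely used pattern for the error-handling concerns mentioned above is bounded retries with a dead-letter topic for poison messages. A plain-Python sketch of the control flow (Kafka plumbing omitted; all names are illustrative, not from the session):

```python
def process_with_dlq(messages, handler, max_retries=3):
    """Try each message up to max_retries times; messages that still fail
    are routed to a dead-letter list instead of blocking the consumer."""
    processed, dead_letter = [], []
    for msg in messages:
        for attempt in range(1, max_retries + 1):
            try:
                processed.append(handler(msg))
                break
            except Exception as exc:
                if attempt == max_retries:
                    # Poison message: park it (with the error) for offline inspection
                    dead_letter.append((msg, str(exc)))
    return processed, dead_letter

# Illustrative handler that fails permanently on malformed input
def handler(msg):
    if msg == "bad":
        raise ValueError("cannot parse")
    return msg.upper()

ok, dlq = process_with_dlq(["a", "bad", "b"], handler)
```

The key property is that one bad record cannot stall the partition: the consumer keeps committing offsets while failures accumulate on a separate topic for later replay.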
S4H188: SAP S/4HANA Cloud Implementation with SAP Activate (Lokesh Modem)
The document provides an overview of deploying SAP S/4HANA Cloud using SAP Activate methodology. It discusses the SAP Activate implementation journey and tools that guide customers through trial, setup, fit-to-standard analysis, configuration, testing and deployment phases. The methodology ensures standardized, efficient and innovative implementations through pre-configured best practice processes, roles and content.
SAP HANA SPS09 - Multitenant Database Containers (SAP Technology)
This document provides an overview of SAP HANA multitenant database containers, a new feature in SAP HANA SPS09. It discusses how the feature allows a single SAP HANA system to host multiple isolated tenant databases. Each tenant database gets dedicated resources and administration while being managed through a central system database. The initial focus is on replacing multiple component deployments and addressing cloud and multi-tenant on-premise scenarios. Status updates are provided on installation, parameters, security, backup/recovery and other technical aspects.
SAP Cloud Platform - Integration, Extensibility & Services (Andrew Harding)
SAP Cloud Platform enables businesses to extend their SAP solutions to create new applications and to integrate with other SAP solutions and external third parties (applications, businesses, and government), with cloud services providing access to the latest technologies such as IoT, machine learning, and intelligent RPA.
SAP BusinessObjects Private Cloud Edition (PCE) (Wiiisdom)
Discover everything you need to know about SAP BusinessObjects Private Cloud Edition:
- Can I convert my on-premise licenses for PCE?
- What version do I need to be able to migrate to PCE?
- What will SAP manage on PCE?
- What are the differences between SAP BusinessObjects on-premise and SAP BusinessObjects on PCE?
Watch for more details: https://youtu.be/RcUuyAy8dmc
Visit our website to learn more: https://wiiisdom.com/sap-pce-package/
BW Migration to HANA Part 3 - Post-processing on the Migrated System (Linh Nguyen)
This series of publications provides an overview and explanation of the major steps and considerations for BW on HANA migrations from anyDB (any database). The complex procedure involves:
1) Preparatory work in the BW system
2) SUM DMO Upgrade and Actual migration
3) Post processing on the migrated systems
This part focuses on post-processing, which includes standard tasks after upgrade and HANA-specific post-tasks.
LO Extraction – Part 5: Sales and Distribution (SD) DataSource Overview (JNTU University)
This document provides an overview of Sales and Distribution (SD) data sources that can be used for LO extraction in SAP BI. It describes the different event types that can trigger data transfers to the data warehouse and lists various SD extractors and their assigned data sources. Useful SAP notes are also referenced that provide more details on specific SD data sources.
AmerisourceBergen implemented SAP Solution Manager Change Request Management (ChaRM) to streamline their transport management process and bring organization and controls to managing transports across their complex SAP landscape. They assembled a core team to define requirements and implement ChaRM in a multi-phase project. This included configuring ChaRM, establishing transport approval workflows, and integrating it with their change and transport systems. AmerisourceBergen realized benefits like improved planning and coordination of changes, reduced transport inspection time, and keeping their dual non-production landscapes synchronized.
The document discusses SAP S/4HANA migration from SAP ERP. Key points include:
1) SAP S/4HANA leverages SAP HANA as the underlying platform for real-time processing and simplified data models.
2) There are multiple upgrade paths for migrating from SAP ERP, including upgrading the database to SAP HANA and installing additional code for new capabilities.
3) The migration process involves preparation, installation, customization, data migration, and post-migration activities like reconciliation to ensure accuracy. Testing is critical before migrating production systems.
Oracle GoldenGate and Apache Kafka: A Deep Dive Into Real-Time Data Streaming (Michael Rainey)
We produce quite a lot of data! Much of the data are business transactions stored in a relational database. More frequently, the data are non-structured, high volume and rapidly changing datasets known in the industry as Big Data. The challenge for data integration professionals is to combine and transform the data into useful information. Not just that, but it must also be done in near real-time and using a target system such as Hadoop. The topic of this session, real-time data streaming, provides a great solution for this challenging task. By integrating GoldenGate, Oracle’s premier data replication technology, and Apache Kafka, the latest open-source streaming and messaging system, we can implement a fast, durable, and scalable solution.
Presented at Oracle OpenWorld 2016
The NRB Group Mainframe Day 2021 - IBM Z-Strategy & Roadmap - Adam John Sturg... (NRB)
This presentation covers the IBM Z software strategy: the key points of IBM's strategy for the platform, including hardware and software, with a quick view of future roadmaps.
The document discusses SAP S/4HANA, the next generation ERP suite from SAP that is designed to run only on the SAP HANA in-memory database. It provides an overview of S/4HANA and its benefits such as a simplified data model, improved user experience with SAP Fiori, and real-time analytics. The document also discusses deployment options for S/4HANA including on-premise, private cloud, public cloud, and hybrid models. It outlines different transition scenarios for moving to S/4HANA and the latest release of S/4HANA 1809.
Hepsistream Real-Time Click-Stream Data Analytics Platform (Hepsiburada)
The Hepsistream data analytics platform collects, in real time, the actions of users reaching the Hepsiburada platform through desktop, mobile, and mobile-site channels — product views, page views, add-to-cart events, and so on — and processes them on a big data infrastructure using a lambda architecture. Touching on the Hepsistream big data stack, the talk presents the technologies used and the lessons learned while building a real-time data discovery and monitoring tool at a scale as large as Legendary Friday (Efsane Cuma).
Data Warehouse Cloud (DWC) is SAP's newest data warehouse (DWH) product. As a software-as-a-service solution, it is based on the new HANA Cloud Services. Sooner or later, DWC is intended to cover not only a self-service data preparation use case but also a fully fledged enterprise data warehouse.
The most widespread SAP DWH solution to date is SAP Business Warehouse (BW). What happens to SAP BW now? Is DWC the creeping end of SAP BW?
We answer these and other questions and give an overview of the positioning of SAP's DWH solutions. Using a showcase, we also demonstrate the potential of hybrid architectures.
Complex event processing (CEP) and stream analytics are commonly treated as distinct classes of stream processing applications. While CEP workloads identify patterns from event streams in near real-time, stream analytics queries ingest and aggregate high-volume streams. Both types of use cases have very different requirements which resulted in diverging system designs. CEP systems excel at low-latency processing whereas engines for stream analytics achieve high throughput. Recent advances in open source stream processing yielded systems that can process several millions of events per second at a sub-second latency. One of these systems is Apache Flink and it enables applications that include typical CEP features as well as heavy aggregations.
Guided by examples, I will demonstrate how Apache Flink enables the user to process CEP and stream analytics workloads alike. Starting from aggregations over streams, we will next detect temporal patterns in our data triggering alerts and finally aggregate these alerts to gain more insights from our data. As an outlook, I will present Flink's CEP-enriched StreamSQL interface providing a declarative way to specify temporal patterns in your SQL query.
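The progression described above — windowed aggregation feeding CEP-style alerting — can be imitated in a few lines of plain Python. This is a toy stand-in for Flink's DataStream/CEP APIs, not Flink code; the window size, threshold, and event names are made up:

```python
from collections import defaultdict

def windowed_counts(events, window=10):
    """Tumbling-window count per key; events are (timestamp, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        counts[(ts // window, key)] += 1  # integer division assigns the window
    return dict(counts)

def alerts(counts, threshold=3):
    """CEP-style rule: alert when a key meets the threshold within one window."""
    return [(win, key, n) for (win, key), n in counts.items() if n >= threshold]

# Three failed logins inside window 0, one in window 1
events = [(1, "login_fail"), (2, "login_fail"), (4, "login_fail"), (12, "login_fail")]
c = windowed_counts(events)
a = alerts(c)
```

In Flink proper, the first step would be a keyed window aggregation and the second a pattern on the derived alert stream; the point is that one engine expresses both stages.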
SaaS is all around. It does not change integration needs, but makes you face new challenges. In this presentation, you will discover the impact of SaaS on cloud integration.
In this presentation, we show how Data Reply helped an Austrian fintech customer to overcome previous performance limitations in their data analytics landscape, leverage real-time pipelines, break down monoliths, and foster a self-service data culture to enable new event-driven and business-critical use cases.
SAP on AWS: Big Businesses, Big Workloads, Big Time featuring Ingram-Micro - ... (Amazon Web Services)
In order to increase business agility and reduce costs, a large number of enterprise customers are moving their entire SAP landscapes, including their production environments, to AWS. Some examples of enterprise customers running their core businesses on AWS are BP, Kellogg’s, Brooks Brothers, AIG, and Ingram-Micro. In this session, hear how customers are running mission-critical workloads on AWS, and understand how we guide Fortune 50 companies as they rapidly adopt emerging technologies and accelerate greater innovation on AWS.
HP CloudSystem is an integrated platform that enables enterprises and service providers to build and manage private, public, and hybrid cloud environments. It integrates system management, servers, storage, networking, and security to address challenges like speeding innovation and reducing time to revenue. Key components include Cloud Service Automation, Insight Orchestration, Server Automation, Network Automation, and SiteScope monitoring. The platform provides infrastructure, templates, and automation tools to deploy both internal and external cloud services.
Using Mainframe Data in the Cloud: Design Once, Deploy Anywhere in a Hybrid W... (Precisely)
Your company is storing and processing more data in the cloud – and mainframe data is no exception. Whether you’re centralizing enterprise data for analytics, streaming it to real-time cloud-native applications, or archiving for regulatory compliance, you know your mainframe data has to be included. Unfortunately, as with most mainframe initiatives, this is easier said than done!
View this webcast on-demand to learn some practical strategies for leveraging mainframe data in the cloud. We will cover:
• Common use cases for mainframe data in the cloud
• Challenges in using mainframe data in the cloud – and how to solve them
• How to get started
Lessons from Building Large-Scale, Multi-Cloud, SaaS Software at DatabricksDatabricks
The cloud has become one of the most attractive ways for enterprises to purchase software, but it requires building products in a very different way from traditional software
Confluent Partner Tech Talk with Synthesisconfluent
A discussion on the arduous planning process, and deep dive into the design/architectural decisions.
Learn more about the networking, RBAC strategies, the automation, and the deployment plan.
IBM Smart Cloud Orchestrator is a cloud management platform that allows businesses to leverage IBM as a business partner. It provides an overview of Smart Cloud Orchestrator and the IBM Cloud Marketplace. It discusses how business partners can engage with IBM and benefit from opportunities to reach new customers by providing integration content and solutions on the marketplace. It also covers how partners can get technical enablement, certification for their content, early access to betas, and exposure at IBM conferences. The document provides information for business partners on how to develop and deliver content for Smart Cloud Orchestrator and the IBM Cloud Marketplace.
This document discusses placing the SAP Application Server Central Services (ASCS) into containers on Kubernetes. It proposes using containers for the ASCS and Enqueue Replication Server (ERS) with anti-affinity rules to ensure high availability without traditional clustering. Benefits include simplified high availability without requiring cluster technology while still providing required features and allowing SAP systems to utilize anonymous compute nodes rather than dedicated hardware. Considerations include licensing and ensuring the Message Server and ERS are never placed on the same node.
This session provides an overview of how organizations can migrate workloads to the AWS cloud at scale. We will go through available migration frameworks and best practices with common use case examples during this session. After migrating the initial workloads, understand how to migrate at scale to the AWS cloud. Hear about real life experiences from the AWS Professional Services team and learn about common use case examples, frameworks, and best practices. Hear about what to avoid when migrating applications at scale to AWS and understand the tools and partner services that can assist you when migrating applications to AWS.
The document provides an overview of lessons learned from brokering cloud services. It discusses 5 key lessons: open source technologies can be more closed than they appear; managing customer expectations to control scope creep; avoiding vendor lock-in ("stickiness") by using multi-cloud orchestration tools; security opportunities exist in leveraging cloud service provider security controls; and the importance of trust between brokers and their customers.
This document provides a case study and requirements for planning the migration of Contoso's SAP systems from an on-premise environment to Azure. Contoso currently uses SAP ECC and BW and wants to migrate these workloads to Azure to reduce datacenter costs. The requirements include sizing estimates for migrating BW first within 3 months, then ECC, as well as plans for high availability, disaster recovery, backups, user access, and system integrations between Azure and on-premise. The document also discusses selection of Azure VMs, storage, and networking to meet SAP certification requirements and optimize performance for Contoso's SAP workloads.
This document outlines an agenda for a webinar on building secure, event-driven microservices with Confluent Cloud on AWS. The agenda includes presentations on building modern streaming analytics with Confluent on AWS, event streaming made easy with Confluent, and a lab on building end-to-end streaming data pipelines with Confluent Cloud. The hosts for the webinar are Ahmed Zamzam from Confluent and Nuno Barreto from AWS.
Build real-time streaming data pipelines to AWS with Confluentconfluent
Traditional data pipelines often face scalability issues and challenges related to cost, their monolithic design, and reliance on batch data processing. They also typically operate under the premise that all data needs to be stored in a single centralized data source before it's put to practical use. Confluent Cloud on Amazon Web Services (AWS) provides a fully managed cloud-native platform that helps you simplify the way you build real-time data flows using streaming data pipelines and Apache Kafka.
Service-Level Objective for Serverless Applicationsalekn
Deploying commercial applications that meet their expected business needs is challenging due to the differences between how business goals are specified and how the system is evaluated. Furthermore, business goals are dynamic, requiring deployment to change constantly over time. Such difficulties make it costly to maintain application quality as the underlying infrastructure is not always fast enough to keep up with business changes. Nowadays, serverless opens a new approach to build application. By abstracting out the deployment details, serverless application can be implemented with minimum deployment efforts. Serverless also reduces maintenance cost with auto-scaling and pay-as-you-go. Such abilities make us believe that by adopting serverless, we can build application that can meet and quickly adapt to business goals.
However, simply writing applications with serverless is not sufficient. Due to best-effort invocation mechanisms and the lack of application structure awareness, serverless performance is highly variable and often fails to support applications with rigorous quality of service requirements. In this study, we aim to mitigate such limitations by coupling serverless deployment with business needs. In particular, we define an Serverless Service-Level Objective (SLO) interface that allows developers to describe their application structure and business goals in terms of software-level objectives. We implement an SLO enforcer, which uses this information in combination with the system performance metrics to decide a proper serverless deployment and resource allocation for meeting business goals. The Serverless SLO leverages blueprint model, which allow developers to describe applications' architecture and runtime characteristics needs, to map application description to serverless function deployment on the top of Knative. We deploy our proposed system on KinD, a tool to run Kubernetes cluster over our local Docker container, and evaluate it with different system configurations. Evaluation results showed that SLO definition and enforcement helps serverless application use resources in accordance with business goals.
MuleSoft London Community October 2017 - Hybrid and SAP IntegrationPace Integration
Our latest MuleSoft meetup in London covered both hybrid connectivity and SAP integration patterns. Real business scenarios for customer and sales order management - and how to turn these into a seamless API design.
Presenting the newest version of Cloudify - 4.6 including a orchestrated SD-WAN demo from MEF18 where Cloudify is used as the orchestration platform for uCPE based on containers.
This document discusses running enterprise applications like Microsoft, SAP, and Oracle on Amazon Web Services. It covers extending an existing enterprise data center into AWS for backup/storage, development/testing, and disaster recovery. It also discusses running full production workloads in AWS, including using AWS services like storage gateway, VPC networking, databases, and reference architectures for SAP, SharePoint, and high availability. Case studies show cost savings of 50% compared to traditional hosting.
This document provides an overview of enterprise cloud transformation best practices. It discusses key aspects of cloud maturity models, alignment of IT and business strategy, agile cloud development practices, and software defined networking (SDN). Specific topics covered include virtualization maturity, cloud brokerage, application lifecycles, and network functions virtualization. Examples from AT&T and Virtela are given to illustrate real-world SDN implementations.
Similar to Principal Propagation with SAP Cloud Platform (20)
Checkout the latest article by Darryl Griffiths from Aliter Consulting. SAP on Azure Web Dispatcher High Availability provides an overview of how to utilise an Azure Internal Load Balancer in conjunction with the parallel SAP Web Dispatchers to achieve a highly available, load-balanced and scalable solution for fronting SAP Fiori and other SAP components. This deployment is proving very successful on a current SAP Fiori and SAP S/4HANA implementation project for one of our clients.
Aliter Consulting's latest challenge on a customer project was the integration of SAP on Azure into the customer’s SaaS Office 365 environment for outbound and inbound email for SAP S/4HANA to support inbound email for OpenText VIM and SAP GRC, and other general outbound mail requirements...
OpenText Archive Center 16.2 Single File Vendor Interface (VI) using Microsoft Azure Storage Account as a storage device is now supported on Linux. Checkout this brief overview of its usage on one of our current projects. Thanks to Manish Shah (Microsoft) for his contribution and working with OpenText to achieve support on Linux, to Supriya Pande for her article on the Microsoft Azure Storage Explorer, to Oleh Khrypko (SAP) for his input to handling disaster recovery on OpenText Archive Center and Gary Jackson (Aliter Consulting) for the article.
SAP HANA System Replication (HSR) versus SAP Replication Server (SRS)Gary Jackson MBCS
This document provides information about SAP HANA System Replication (HSR) and compares it to SAP Replication Server (SRS). HSR replicates transaction log entries from a primary HANA database to secondary databases. It supports synchronous and asynchronous replication and can be used for high availability and disaster recovery. The document outlines the initial setup process and ongoing administration of HSR configurations.
Tips on implementing SAP adaptive computing design with SAP LaMa on Microsoft Azure. We discuss the best options for SAP and some of the challenges faced.
This document provides instructions for setting up SSL connectivity between SAP LVM and the SAP Host Agent using x509 certificate authentication. It involves generating a certificate signing request for the LVM server, having it signed by a certificate authority, uploading the signed certificate and CA/ICA certificates to the LVM keystore. It also describes adding the CA/ICA certificates to the Host Agent's PSE, configuring the host profile, and testing the SSL connection between LVM and the Host Agent.
This document provides instructions for integrating SAP Business Process Automation (BPA) with SAP Landscape Virtualization Management (LVM). It involves creating a custom operation in LVM that allows controlling BPA queues. This is done by creating a provider implementation and custom operation in LVM along with a process definition and web service in BPA. It also requires registering a script with the host agent to connect the LVM and BPA configurations. The custom operation then allows holding or releasing BPA queues from the LVM interface.
This document provides an overview of how to customize SAP Landscape Virtualization Management (LVM) with custom operations and hooks. It describes defining a provider implementation ("LVM_CustomOperation_ClusterAdm") and custom operations ("Freeze", "Unfreeze", "Relocate") for managing a Red Hat cluster. A sample script ("ClusterAdm.ksh") demonstrates how custom operations could freeze/unfreeze the cluster before SAP instance start/stop operations. The provider implementation and custom operations/hooks allow LVM to integrate cluster management operations.
This document provides instructions for installing SAP Router using Secure Network Communication (SNC) and registering it with SAP. It outlines downloading the installation files, creating a dedicated system user and filesystem, unpacking and configuring the software, generating and importing an SNC certificate, creating a router table, and starting/stopping the SAP Router service.
This document provides guidance on customizing SAP Landscape Virtualization Management (LVM) to manage custom instance types. It describes how to configure generic operations like detect, monitor, start, and stop by creating scripts referenced in configuration files. An example is provided for managing SAP Replication Server (SRS) instances, with configuration files and sample scripting code shown.
The document discusses SAP Web Dispatcher 7.40, which is a load balancer that provides intelligent load distribution for SAP Portal. It can handle stateful or stateless sessions over HTTP or HTTPS invisibly to clients. It supports round-robin load distribution for non-SAP backends like Tomcat. It also allows for multiple SSL certificates to handle multiple domains and backends. SAP Web Dispatcher provides reliability, security, and high performance to handle thousands of concurrent users. It includes features like maintenance mode, custom error pages, and is free to use with an SAP license.
Neo4j - Product Vision and Knowledge Graphs - GraphSummit ParisNeo4j
Dr. Jesús Barrasa, Head of Solutions Architecture for EMEA, Neo4j
Découvrez les dernières innovations de Neo4j, et notamment les dernières intégrations cloud et les améliorations produits qui font de Neo4j un choix essentiel pour les développeurs qui créent des applications avec des données interconnectées et de l’IA générative.
DDS Security Version 1.2 was adopted in 2024. This revision strengthens support for long runnings systems adding new cryptographic algorithms, certificate revocation, and hardness against DoS attacks.
E-commerce Application Development Company.pdfHornet Dynamics
Your business can reach new heights with our assistance as we design solutions that are specifically appropriate for your goals and vision. Our eCommerce application solutions can digitally coordinate all retail operations processes to meet the demands of the marketplace while maintaining business continuity.
SOCRadar's Aviation Industry Q1 Incident Report is out now!
The aviation industry has always been a prime target for cybercriminals due to its critical infrastructure and high stakes. In the first quarter of 2024, the sector faced an alarming surge in cybersecurity threats, revealing its vulnerabilities and the relentless sophistication of cyber attackers.
SOCRadar’s Aviation Industry, Quarterly Incident Report, provides an in-depth analysis of these threats, detected and examined through our extensive monitoring of hacker forums, Telegram channels, and dark web platforms.
Utilocate offers a comprehensive solution for locate ticket management by automating and streamlining the entire process. By integrating with Geospatial Information Systems (GIS), it provides accurate mapping and visualization of utility locations, enhancing decision-making and reducing the risk of errors. The system's advanced data analytics tools help identify trends, predict potential issues, and optimize resource allocation, making the locate ticket management process smarter and more efficient. Additionally, automated ticket management ensures consistency and reduces human error, while real-time notifications keep all relevant personnel informed and ready to respond promptly.
The system's ability to streamline workflows and automate ticket routing significantly reduces the time taken to process each ticket, making the process faster and more efficient. Mobile access allows field technicians to update ticket information on the go, ensuring that the latest information is always available and accelerating the locate process. Overall, Utilocate not only enhances the efficiency and accuracy of locate ticket management but also improves safety by minimizing the risk of utility damage through precise and timely locates.
Hand Rolled Applicative User ValidationCode KataPhilip Schwarz
Could you use a simple piece of Scala validation code (granted, a very simplistic one too!) that you can rewrite, now and again, to refresh your basic understanding of Applicative operators <*>, <*, *>?
The goal is not to write perfect code showcasing validation, but rather, to provide a small, rough-and ready exercise to reinforce your muscle-memory.
Despite its grandiose-sounding title, this deck consists of just three slides showing the Scala 3 code to be rewritten whenever the details of the operators begin to fade away.
The code is my rough and ready translation of a Haskell user-validation program found in a book called Finding Success (and Failure) in Haskell - Fall in love with applicative functors.
Do you want Software for your Business? Visit Deuglo
Deuglo has top Software Developers in India. They are experts in software development and help design and create custom Software solutions.
Deuglo follows seven steps methods for delivering their services to their customers. They called it the Software development life cycle process (SDLC).
Requirement — Collecting the Requirements is the first Phase in the SSLC process.
Feasibility Study — after completing the requirement process they move to the design phase.
Design — in this phase, they start designing the software.
Coding — when designing is completed, the developers start coding for the software.
Testing — in this phase when the coding of the software is done the testing team will start testing.
Installation — after completion of testing, the application opens to the live server and launches!
Maintenance — after completing the software development, customers start using the software.
Enterprise Resource Planning System includes various modules that reduce any business's workload. Additionally, it organizes the workflows, which drives towards enhancing productivity. Here are a detailed explanation of the ERP modules. Going through the points will help you understand how the software is changing the work dynamics.
To know more details here: https://blogs.nyggs.com/nyggs/enterprise-resource-planning-erp-system-modules/
What is Augmented Reality Image Trackingpavan998932
Augmented Reality (AR) Image Tracking is a technology that enables AR applications to recognize and track images in the real world, overlaying digital content onto them. This enhances the user's interaction with their environment by providing additional information and interactive elements directly tied to physical images.
Software Engineering, Software Consulting, Tech Lead, Spring Boot, Spring Cloud, Spring Core, Spring JDBC, Spring Transaction, Spring MVC, OpenShift Cloud Platform, Kafka, REST, SOAP, LLD & HLD.
Takashi Kobayashi and Hironori Washizaki, "SWEBOK Guide and Future of SE Education," First International Symposium on the Future of Software Engineering (FUSE), June 3-6, 2024, Okinawa, Japan
OpenMetadata Community Meeting - 5th June 2024OpenMetadata
The OpenMetadata Community Meeting was held on June 5th, 2024. In this meeting, we discussed about the data quality capabilities that are integrated with the Incident Manager, providing a complete solution to handle your data observability needs. Watch the end-to-end demo of the data quality features.
* How to run your own data quality framework
* What is the performance impact of running data quality frameworks
* How to run the test cases in your own ETL pipelines
* How the Incident Manager is integrated
* Get notified with alerts when test cases fail
Watch the meeting recording here - https://www.youtube.com/watch?v=UbNOje0kf6E
Mobile App Development Company In Noida | Drona InfotechDrona Infotech
Looking for a reliable mobile app development company in Noida? Look no further than Drona Infotech. We specialize in creating customized apps for your business needs.
Visit Us For : https://www.dronainfotech.com/mobile-application-development/
UI5con 2024 - Boost Your Development Experience with UI5 Tooling ExtensionsPeter Muessig
The UI5 tooling is the development and build tooling of UI5. It is built in a modular and extensible way so that it can be easily extended by your needs. This session will showcase various tooling extensions which can boost your development experience by far so that you can really work offline, transpile your code in your project to use even newer versions of EcmaScript (than 2022 which is supported right now by the UI5 tooling), consume any npm package of your choice in your project, using different kind of proxies, and even stitching UI5 projects during development together to mimic your target environment.
WhatsApp offers simple, reliable, and private messaging and calling services for free worldwide. With end-to-end encryption, your personal messages and calls are secure, ensuring only you and the recipient can access them. Enjoy voice and video calls to stay connected with loved ones or colleagues. Express yourself using stickers, GIFs, or by sharing moments on Status. WhatsApp Business enables global customer outreach, facilitating sales growth and relationship building through showcasing products and services. Stay connected effortlessly with group chats for planning outings with friends or staying updated on family conversations.
Need for Speed: Removing speed bumps from your Symfony projects ⚡️Łukasz Chruściel
No one wants their application to drag like a car stuck in the slow lane! Yet it’s all too common to encounter bumpy, pothole-filled solutions that slow the speed of any application. Symfony apps are not an exception.
In this talk, I will take you for a spin around the performance racetrack. We’ll explore common pitfalls - those hidden potholes on your application that can cause unexpected slowdowns. Learn how to spot these performance bumps early, and more importantly, how to navigate around them to keep your application running at top speed.
We will focus in particular on tuning your engine at the application level, making the right adjustments to ensure that your system responds like a well-oiled, high-performance race car.
2. Drivers

Automation Core
• Technology improvements mean that computing tasks which previously required interaction with people can now be fully automated.
• Automation brings repeatability, reduced error rates and easy scalability of service provision.

Platform Agnostic
• Future interoperability and open standards will mean businesses can swap easily between cloud providers.
• It is key that solutions are designed to operate in such a platform-agnostic manner, outside the bounds of normal technical architecture design (i.e. no fixed O/S choices or fixed DB platforms).

Established Technological Principles
• Solutions today should be built using already established technological principles.
• Using bleeding edge rarely produces the perceived benefits in places such as core business systems without significant buy-in from business leaders.
• Pre-empting standards that are not already widely adopted could produce a "Betamax" scenario.

Future Assurance
• Technology solutions should deliver for a minimum timeframe within the context of the lifecycle of the related business system.
• Example: scripts re-written during any platform migration should not just use the coolest scripting language; they should use a commonly known language that is widely used and understood.
3. About Principal Propagation
• Permits federated authentication (single sign-on) into customer SAP systems via an IdP such as SAP IDM.
• Authentication to on-premise SAP IDM is possible.
• Subsequent SAP systems can authenticate against the IDM-generated SAP logon ticket (MYSAPSSO2 cookie) or SAML2 token.
• SAP Cloud Platform (SCP) users (S-users) can use SAP Cloud Platform services such as Web IDE, authenticating into the customer SAP systems against their respective SAP system account in the IdP (usually their corporate identity).
4. About SAP Cloud Platform
• SAP Cloud Platform, a.k.a. SCP (previously called SAP HANA Cloud).
• A PaaS: a set of tools, utilities and cloud capabilities for use with SAP and non-SAP products, all provided in the cloud.
• Accessed over the internet.
• Is the future of SAP software integration and will also provide the basis for many SAP SaaS applications.
• Can be accessed from "on-premise" (or your cloud provider) using the SAP Cloud Connector (SCC), which acts as a reverse proxy.
5. Architecture Overview
[Diagram] A developer with an S-user account logs in to SCP; the IdP (SAP IDM) holds the developer's corporate identity and account (UME). The SCP sub-account (ABC123) has a destination BE1:1234 = https://be1.corp, routed through the SAP Cloud Connector to the on-premise landscape ("on-premise" could itself be cloud-hosted IaaS). Key elements from the diagram:
• SCC trust store: CA cert, system cert and BE1 SSL cert chain.
• Flow to the back-end: an SSL connection carrying the SCC cert chain, with the x.509 client cert (derived from the SAML token) forwarded in an HTTP header.
• BE1 – SAP (https://be1.corp), optionally fronted by a Web Dispatcher: the ICM (+Web Dispatcher) trust store holds the SCC CA cert; the target is an ICF service.
• Parameters on BE1:
login/certificate_mapping_rulebased = "1"
icm/trusted_reverse_proxy_0 = <SCC System CA>
icm/HTTPS/verify_client = 1
• Customise: STRUST, CERTRULE, RZ10, Web Dispatcher SSL chain.
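The back-end parameters called out in the diagram would typically live in the instance profile (maintained via RZ10). A minimal sketch, assuming an illustrative profile name and placeholder SUBJECT/ISSUER values for the SCC system certificate (see SAP Note 2052899 for the exact syntax):

```
# Illustrative instance profile fragment (e.g. BE1_D00_be1host)
# Use rule-based certificate mapping, maintained via transaction CERTRULE
login/certificate_mapping_rulebased = 1
# Trust the SAP Cloud Connector as a reverse proxy that forwards the
# end-user x.509 certificate in an HTTP header (values are placeholders)
icm/trusted_reverse_proxy_0 = SUBJECT="CN=scc.corp", ISSUER="CN=Corp CA"
# Request an SSL client certificate from the caller
icm/HTTPS/verify_client = 1
```

A restart of the instance is needed for the ICM parameters to take effect.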
6. Areas for Configuration

SCP:
• Create S-user account(s).
• Create a destination to the back-end SAP system via the SCC, with Principal Propagation enabled and pointing to your IdP.

IdP:
• SAML: configure SAML token creation for SCP users after authentication.

SCC:
• Sub-Account: register SCP sub-accounts for incoming connections from SCP.
• On-Premise: configure the trust store with the back-end SAP system's SSL server cert and the optional Web Dispatcher SSL cert.
• On-Premise: configure Principal Propagation user x.509 client cert creation upon SAML token receipt.

BE1:
• ICM: transaction STRUST to trust the SCC client x.509 cert.
• AUTH: transaction CERTRULE to map the SCC dynamic x.509 client cert CN to SAP system user accounts.
• ICM: transaction RZ10 to configure ICM parameters to enable trusting of client x.509 certs forwarded in the HTTP header.

Optional Web Dispatcher:
• ICM: add the SCC client x.509 cert to the SAPSSLS PSE.
• ICM: DEFAULT.PFL to configure ICM parameters to enable trusting of client x.509 certs forwarded in the HTTP header.
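For the Web Dispatcher step, the PSE can be maintained from the command line with sapgenpse. A hedged sketch, where the certificate file name and the use of $SECUDIR are assumptions for this landscape:

```
# Illustrative only: import the SCC system certificate into the server PSE
# so the Web Dispatcher trusts requests forwarded by the SCC
# (run as the <sid>adm user with SECUDIR set)
sapgenpse maintain_pk -p $SECUDIR/SAPSSLS.pse -a scc_system_cert.crt
# List the PSE's certificate list to verify the import
sapgenpse maintain_pk -p $SECUDIR/SAPSSLS.pse -l
```

The same import can equally be done in STRUST on an ABAP system; sapgenpse is the option for a standalone Web Dispatcher.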
7. Summary
• Principal Propagation should enable smooth, efficient access to back-end SAP systems via the SAP Cloud Connector from the SAP Cloud Platform.
• A secure setup is always recommended, paying attention to SAP recommendations for SCC networking and HA.
• The future direction of SAP integration will need to use the SCC more and more. Example: SAP Analytics Cloud.
• The Principal Propagation trust setup is complex and involves multiple certificates, leaving you open to the probability of certificate expiration causing an outage.
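Given the expiry risk noted above, a simple scheduled check of the certificates in the chain is worthwhile. A hedged sketch using openssl; for demonstration it generates a throwaway self-signed certificate, but in practice you would point CERT at the real SCC or back-end certificate file (path is an assumption):

```shell
# Demo setup: create a throwaway self-signed cert valid for 90 days
CERT=demo_cert.crt
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout demo_key.pem -out "$CERT" -days 90 >/dev/null 2>&1

# -checkend N exits 0 only if the cert is still valid N seconds from now
if openssl x509 -in "$CERT" -checkend $((30*24*3600)) -noout >/dev/null; then
  echo "OK: certificate valid for at least 30 days"
else
  echo "WARNING: certificate expires within 30 days"
fi
```

Run from cron against each certificate in the Principal Propagation trust chain (SCC system cert, back-end SSL cert, Web Dispatcher cert) to get warning of an expiry before it becomes an outage.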
8. References

SAP Notes:
• SAP Note 2462533 - Configuring Principal Propagation to an ABAP System
• SAP Note 2052899 - ICM - Multiple Trusted Reverse Proxies
• SAP Note 2461375 - How to connect SAP Cloud Platform Identity Authentication Service to on-premise user store

SAP Guides:
• SCC secure setup recommendations:
https://help.sap.com/viewer/cca91383641e40ffbe03bdc78f00f681/Cloud/en-US/e7ea82a4bb571014a4ceb61cb7e3d31f.html
• Configure Principal Propagation for an ABAP system:
https://help.sap.com/viewer/cca91383641e40ffbe03bdc78f00f681/Cloud/en-US/a8bb87a72d094e0d981d2b1f67df7bc3.html