Tips on implementing SAP adaptive computing design with SAP LaMa on Microsoft Azure. We discuss the best options for SAP and some of the challenges faced.
This document discusses placing the SAP Application Server Central Services (ASCS) into containers on Kubernetes. It proposes using containers for the ASCS and Enqueue Replication Server (ERS) with anti-affinity rules to ensure high availability without traditional clustering. Benefits include simplified high availability without requiring cluster technology while still providing required features and allowing SAP systems to utilize anonymous compute nodes rather than dedicated hardware. Considerations include licensing and ensuring the Message Server and ERS are never placed on the same node.
Check out the latest article by Darryl Griffiths from Aliter Consulting. SAP on Azure Web Dispatcher High Availability provides an overview of how to utilise an Azure Internal Load Balancer in conjunction with parallel SAP Web Dispatchers to achieve a highly available, load-balanced and scalable solution for fronting SAP Fiori and other SAP components. This deployment is proving very successful on a current SAP Fiori and SAP S/4HANA implementation project for one of our clients.
SAP HANA System Replication (HSR) versus SAP Replication Server (SRS) – Gary Jackson MBCS
This document provides information about SAP HANA System Replication (HSR) and compares it to SAP Replication Server (SRS). HSR replicates transaction log entries from a primary HANA database to secondary databases. It supports synchronous and asynchronous replication and can be used for high availability and disaster recovery. The document outlines the initial setup process and ongoing administration of HSR configurations.
This document summarizes the SAP products, database/operating system combinations, and Amazon Web Services (AWS) EC2 instance types that are currently supported on AWS infrastructure. It lists the supported SAP applications, databases, operating systems, and EC2 instance types and sizes. It also provides references to other SAP notes for additional details on AWS support prerequisites and requirements.
AWS re:Invent 2016: Optimizing workloads in SAP HANA with Amazon EC2 X1 Insta... – Amazon Web Services
AWS and SAP have worked together closely to certify the AWS platform so that companies of all sizes can fully realize all the benefits of the SAP HANA in-memory database platform on the AWS cloud. By placing SAP systems in the cloud, organizations are achieving greater agility, flexibility, and cost efficiency while saving resources to focus on their core businesses. We will discuss recent SAP and AWS innovations including the Amazon EC2 X1 instance type that offers up to 2TB of RAM, and dive into features of the AWS platform that bring significant flexibility to SAP HANA deployments.
AWS Webcast - Best Practices for Deploying SAP Workloads on AWS – Amazon Web Services
With AWS, it is easier for enterprises to deploy SAP workloads such as HANA and Business Suite. In this webinar, you'll learn the best practices for deploying SAP HANA and Business Suite on AWS. Additionally, you will learn about architectural considerations and pitfalls to look out for when migrating from on premises to AWS. This webinar will also discuss how Kellogg deployed SAP HANA on AWS and on-premises.
Learning Objectives:
• How to set up SAP workloads on AWS
• Migration tips and tricks
• How to set up your architecture for optimal results
Who Should Attend:
• Business and technical professionals who use SAP
Low latency, high throughput streaming using Apache Apex and Apache Kudu – DataWorks Summit
True streaming is fast becoming a necessity for many business use cases. At the same time, data set sizes and volumes are growing exponentially, compounding the complexity of data processing pipelines. There exists a need for true low-latency streaming coupled with very high-throughput data processing. Apache Apex as a low-latency, high-throughput data processing framework and Apache Kudu as a high-throughput store form a combination which solves this pattern very efficiently.
This session will walk through a use case which involves writing a high-throughput stream using Apache Kafka, Apache Apex and Apache Kudu. The session will start with a general overview of Apache Apex and the capabilities of Apex that form the foundation for a low-latency and high-throughput engine, with Apache Kafka as an example input source of streams. Subsequently we walk through the Kudu integration with Apex via various patterns like end-to-end exactly once, selective column writes and timestamp propagation for out-of-band data. The session will also cover additional patterns that this integration supports for enterprise-level data processing pipelines.
The session will conclude with some metrics for latency and throughput numbers for the use case that is presented.
Speaker
Ananth Gundabattula, Senior Architect, Commonwealth Bank of Australia
The document discusses sizing storage for SAP implementations. It describes different types of sizing including greenfield, brownfield, and hybrid. It also covers sizing tools from SAP like QuickSizer that estimate hardware requirements based on metrics like CPU time, memory usage, and disk space. The document emphasizes that sizing is an iterative process that requires validating assumptions with usage data and testing.
Operationalizing Data Science Using Cloud Foundry – VMware Tanzu
The document discusses how operationalizing machine learning models through continuous deployment and monitoring is important to realize business value but often overlooked, and describes how Alpine Data's Chorus platform in combination with Pivotal's Big Data Suite and Cloud Foundry can provide a turn-key solution for operationalizing models by deploying scalable scoring engines that can consume models exported in the PFA format. The platform aims to make it simple to deploy both individual models and complex scoring flows represented as PFA documents to ensure models have maximum impact on the business.
Hadoop for the Data Scientist: Spark in Cloudera 5.5 – Cloudera, Inc.
Inefficient data workloads are all too common across enterprises - causing costly delays, breakages, hard-to-maintain complexity, and ultimately lost productivity. For a typical enterprise with multiple data warehouses, thousands of reports, and hundreds of thousands of ETL jobs being executed every day, this loss of productivity is a real problem. Add to all of this the complex handwritten SQL queries, and there can be nearly a million queries executed every month that desperately need to be optimized, especially to take advantage of the benefits of Apache Hadoop. How can enterprises dig through their workloads and inefficiencies to easily see which are the best fit for Hadoop and what’s the fastest path to get there?
Cloudera Navigator Optimizer is the solution - analyzing existing SQL workloads to provide instant insights into your workloads and turns that into an intelligent optimization strategy so you can unlock peak performance and efficiency with Hadoop. As the newest addition to Cloudera’s enterprise Hadoop platform, and now available in limited beta, Navigator Optimizer has helped customers profile over 1.5 million queries and ultimately save millions by optimizing for Hadoop.
(BIZ401) Kellogg Company Runs SAP in a Hybrid Environment | AWS re:Invent 2014 – Amazon Web Services
Many enterprises today are moving their SAP workloads to the cloud in order to achieve business agility. In this session, learn strategies and recommended practices for architecting and implementing a phased ("hybrid") approach for SAP workloads, while optimizing for availability and performance. Kellogg Company will walk through the business justification and how they leveraged a hybrid approach when implementing SAP Business Warehouse (BW) on SAP HANA on the AWS cloud.
Database Week at the San Francisco Loft
Oracle and SQL Server on the Cloud
Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. In this session we'll look at open commercial databases supported by Amazon RDS.
Level: 200
Speakers:
Joyjeet Banerjee - Enterprise Solutions Architect, AWS
Vishwajit Tigadi - Manager, Strategic Accounts, AWS
Microsoft SQL Server is a commonly-used commercial relational database, especially for organizations that use Microsoft development tools. We’ll look at how to run SQL Server on the AWS Cloud, with examples of organizations using it.
VMworld 2013: Strategic Reasons for Classifying Workloads for Tier 1 Virtuali... – VMworld
This document discusses the importance of classifying workloads before virtualizing tier 1 applications. Workload classification involves measuring existing application and database workloads to properly size and place them in a new virtualized environment. This reduces risk and speeds up implementation by providing the proper analysis. The document outlines challenges, opportunities, models, metrics and tools, and gives an example of how MolsonCoors used workload classification to virtualize their SAP landscape.
In addition to running databases in Amazon EC2, AWS customers can choose among a variety of managed database services. These services save effort, save time, and unlock new capabilities and economies. In this session, we make it easy to understand how they differ, what they have in common, and how to choose one or more. We explain the fundamentals of Amazon DynamoDB, a fully managed NoSQL database service; Amazon RDS, a relational database service in the cloud; Amazon ElastiCache, a fast, in-memory caching service in the cloud; and Amazon Redshift, a fully managed, petabyte-scale data-warehouse solution that can be surprisingly economical. We’ll cover how each service might help support your application, how much each service costs, and how to get started.
A closer look at the MySQL and PostgreSQL compatible relational database built for the cloud that combines the performance and availability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases. We’ll explore how Aurora uses the AWS cloud to provide high reliability, high durability, and high throughput.
Migrating and Running DBs on Amazon RDS for Oracle – Maris Elsins
The process of migrating Oracle DBs to Amazon RDS is quite complex. Some of the challenges are capacity planning, efficient loading of data, dealing with the limitations of RDS, provisioning instance configurations, and the lack of SYSDBA access to the database. The author has migrated over 20 databases to Amazon RDS and will provide an insight into how these challenges can be addressed. Once the migrations are done, the support of the databases is very different too, because SYSDBA access is not provided. The author will talk about his experience migrating to and supporting databases on Amazon RDS for Oracle from an Oracle DBA's perspective, and will reveal the different problems encountered as well as the solutions applied.
Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. In this session we'll look at open commercial databases supported by Amazon RDS.
Speakers:
Roger Dahlstrom - Solutions Architect, AWS
Peter Dachnowicz - Sr. Technical Account Manager, AWS
by Darin Briskman, Technical Evangelist, AWS
Microsoft SQL Server is a commonly-used commercial relational database, especially for organizations that use Microsoft development tools. We’ll look at how to run SQL Server on the AWS Cloud, with examples of organizations using it.
Building Efficient Pipelines in Apache Spark – Jeremy Beard
This document provides an overview of techniques for optimizing Apache Spark pipelines. It discusses fundamentals of Spark execution including jobs, stages and tasks. It then provides recommendations for tuning aspects like sizing executors, using DataFrames/Datasets over RDDs, caching frequently used data, joining techniques to avoid shuffling large datasets, and addressing skew. The document aims to help debug and optimize Spark applications.
Maximizing performance via tuning and optimization – MariaDB plc
This document provides an overview of best practices for maximizing performance of MariaDB Server through tuning and optimization. It discusses general best practices like service level agreements and metrics collection. It also covers specific areas like server, storage, and network configuration, connection pooling, MariaDB configuration settings, query tuning using indexes and EXPLAIN, and monitoring tools like performance schema. The goal is to help users get the most out of their MariaDB deployment through performance analysis and tuning.
The correct answer is B. To enable encryption for future RDS database backups, we need to modify the backup section of the database configuration in RDS and toggle the "Enable encryption" checkbox. This will encrypt all new backups taken after this change. The other options are incorrect:
A) Enabling default encryption on the S3 bucket won't encrypt existing backups or future RDS backups taken by RDS.
C) Creating an encrypted snapshot from an unencrypted one doesn't help meet the requirements - we need future automated backups from RDS to be encrypted.
So the best option is B - modifying the database configuration directly in RDS to enable encryption for all new automated backups.
The answer is B.
SAP Landscape Management (LaMa) is a tool that helps manage SAP application landscapes across physical and cloud infrastructures. It automates tasks like system provisioning, monitoring, and maintenance. The latest version (3.0 SP13) adds support for managing SAP HANA environments through capabilities like automated system refresh, high availability management, and near-zero downtime maintenance. An adapter was also developed to integrate LaMa with Microsoft Azure, allowing end-to-end automation of SAP applications and cloud infrastructure on Azure.
Azure for SAP Solutions - Use Cases and Migration Options – myCloudDoor
Discover the advantages of deploying SAP Solutions on Azure Cloud. What are the use cases, best timing for migration, how to migrate SAP to the Cloud. We install any SAP Solution on Azure for free. Ask for details.
Providing SAP customers with an SAP BTP-based tool to self-evaluate SAP landscape usage, optimize existing and simulate future SAP user license requirements by calculating a reliable number of Full Use Equivalents (FUEs).
We address three main scenarios for reliable self-assessment and monitoring of your SAP landscape in terms of SAP S/4HANA Enterprise Management Users, or Full Use Equivalents requirements.
Use Case #1 | Reliable Landscape Evaluation in Full Use Equivalents (FUEs)
A key element of moving towards SAP S/4HANA Enterprise Management is understanding how legacy SAP ECC On-Premise Named User Licenses (NUL) are converted to SAP S/4HANA Enterprise Management Users, or Full Use Equivalent requirements.
For an SAP system, or consolidation of SAP systems, you can see Full Use Equivalents and estimated costs according to your regional SAP Price List.
Use Case #2 | “What If” Scenarios Modeling
Self-assessment of FUE requirements provides enhanced clarity and helps to model different “what if” scenarios via adjustment of SAP Roles, SAP Roles assignment and SAP User responsibilities in the landscape.
Now you can make a fast and reliable budget assessment of your implementation project's impact on SAP software use.
Use Case #3 | Full Use Equivalents (FUEs) Monitoring
Once you have optimized your SAP landscape's Full Use Equivalents, you can keep monitoring FUE trends and get notifications when usage increases, so that you can understand what leads to the increase and whether a planned authorization adjustment explains it.
Continuous evaluation of FUE requirements provides licensing transparency, predictability, and simplicity.
Presenting the newest version of Cloudify, 4.6, including an orchestrated SD-WAN demo from MEF18 where Cloudify is used as the orchestration platform for uCPE based on containers.
Mastering SAP Monitoring - SAP HANA Monitoring, Management & Automation – Linh Nguyen
Part 7 of the Mastering SAP Monitoring series http://www.itconductor.com/blog/mastering-sap-monitoring-without-sap-ccms-or-solman explains SAP HANA monitoring and management challenges and solutions.
HANA use cases have grown rapidly from BW to Suite on HANA to S/4HANA, along with a myriad of platform choices such as on-premise, HANA Cloud Platform, HANA Enterprise Cloud, public clouds like AWS, and private clouds such as VirtualStream. No matter the scenario or platform, one thing is certain: it has to be monitored and managed to ensure the best possible performance, availability and ROI. Run Simple with SAP may mean simple for users; however, for Basis and IT Operations we also need tools to help simplify the life cycle management aspects.
We will explain these topics in detail with regard to SAP and the 10 principles of Application-Centric Service Management & Automation.
Benefits:
1) Look at an updated list of tools available from SAP and other solutions
2) Focus on availability, performance and alerts management, plus proactive health checks
3) Automation of common housekeeping tasks and trend analysis
Audience: SAP Basis Administrator, SAP DBA, HANA Admin, IT operations and managers of environments.
SAP HANA is an in-memory database platform that stores transactional data in memory for real-time analytics by eliminating disk input/output lag; it uses columnar storage with efficient compression and built-in indexing for each column to enable fast queries and joins; SAP HANA features include predictive analytics, multitenant containers, high availability, and tools for data modeling, administration, and development.
This document proposes a method for installing SAP systems within an Active Directory domain without requiring Domain Administrator rights. It involves delegating rights to a group to create user accounts, groups, and computer accounts within an organizational unit. This allows SAP administrators to install systems without help from Domain Administrators, taking advantage of Active Directory features like single sign-on and synchronization.
This technical pitch deck summarizes SAP solutions on Microsoft Azure. It outlines challenges with on-premises SAP environments and how moving to SAP HANA in the cloud on Azure can enable faster processes, accelerated innovation, and 360-degree insights. It then covers the journey to migrating SAP landscapes to SAP HANA and Azure, including lifting SAP systems with any database to Azure, migrating to SAP HANA, and migrating to S/4HANA. Finally, it discusses how Azure enables insights from SAP and non-SAP data.
Learn 15 different use cases of how you can deploy Azure services, the Microsoft Cloud platform for your SAP systems and applications, and what type of problems they can solve for a business. If you are interested in the technical feasibility or a proof of concept, myCloudDoor is offering it for free until the end of 2015.
VMware vCOPs Management Pack for SAP CCMS Overview – Blue Medora
This document discusses a management pack for VMware vCenter Operations Manager that provides visibility into SAP workloads. The management pack integrates vCenter Ops with SAP's Computing Center Management System (CCMS) to collect key performance indicators and metrics about SAP resources. It provides out-of-the-box dashboards and mappings of SAP workloads running on VMware vSphere. The management pack enables vCenter Ops to ingest SAP data via CCMS, giving customers insights into the performance, capacity, and health of their SAP environments.
The document provides information about a webcast on SAP HANA Finance Accelerators and cloud scenarios. It includes details on how to access recordings and ask questions during the webcast. It also shares information on the HANA Finance Accelerators cloud "try-and-decide" option, which allows testing accelerated ERP processes and reporting in the cloud at minimal cost and maximum flexibility. The presentation provides an overview of the accelerators and scenarios available.
Advanced Application Monitoring and Management in Microsoft Azure with KEMP360 – Kemp
Enterprise level management and support tools for simplified and seamless management of your application deployment fabric - from managing LoadMaster, F5 Big-IP, Amazon ELB, HAProxy and NGINX to 24/7/365 pro-active monitored support team, alert management, issue diagnosis and escalation.
This document provides a case study and requirements for planning the migration of Contoso's SAP systems from an on-premise environment to Azure. Contoso currently uses SAP ECC and BW and wants to migrate these workloads to Azure to reduce datacenter costs. The requirements include sizing estimates for migrating BW first within 3 months, then ECC, as well as plans for high availability, disaster recovery, backups, user access, and system integrations between Azure and on-premise. The document also discusses selection of Azure VMs, storage, and networking to meet SAP certification requirements and optimize performance for Contoso's SAP workloads.
You don’t have to rethink your payroll process when you move your HR processes to the Cloud. In this webinar experts explain the simple transition path for Payroll so you can maximize all the benefits without having to re-implement SAP Payroll.
SMA, the hybrid provisioning engine for public clouds – Stijn Callebaut
In this session we are going to demonstrate the power of Service Management Automation (SMA).
The components of the solution and how you can extend the functionality of the engine are the focus points for this automation trip.
Different public clouds are targeted in this automation scenario where we will make use of different System Center products to submit and provision the request.
Less slides, lots of demonstrations where we will explain the complete configuration that supports the demo scenario.
Sysctr Track: SMA, the hybrid provisioning engine for public clouds – ITProceed
by Kurt Van Hoecke, Stijn Callebaut
Join us in this session where we are going to demonstrate the power of Service Management Automation (SMA).
The components of the solution and how you can extend the functionality of the engine are the focus points for this automation trip.
Different public clouds are targeted in this automation scenario where we will make use of different System Center products to submit and provision the request.
Less slides, lots of demonstrations where we will explain the complete configuration that supports the demo scenario.
Acumatica SaaS provides benefits like disaster recovery, backups, high availability, software updates and maintenance that surpass most external hosting providers. It uses 24/7 monitoring to ensure consistent performance. Data is securely hosted on AWS and accessible from any device. Automated backups are taken every 2 hours and retained for months. The optional backup access service allows downloading backups. Failover protection is included, and the recovery process involves restoring from the additional backup location. Customizations can be easily maintained through upgrades due to Acumatica's APIs.
[CON6985] Expanding DBaaS Beyond Data Centers Hybrid Cloud Onboarding via Orac... – Bharat Paliwal
This document discusses using Oracle Enterprise Manager to manage hybrid cloud environments with Oracle Cloud. It outlines the key capabilities including planning workload migrations, migrating workloads securely to the cloud, and operating hybrid environments at scale with unified monitoring, lifecycle management, and self-service capabilities extended to Oracle Cloud. The document also discusses capabilities like automated synchronization between on-premises Enterprise Manager and Oracle Cloud, data cloning and refresh across clouds, and unveils upcoming support for managing Oracle Database Cloud Service via Enterprise Manager.
Aliter Consulting's latest challenge on a customer project was the integration of SAP on Azure into the customer’s SaaS Office 365 environment for outbound and inbound email for SAP S/4HANA to support inbound email for OpenText VIM and SAP GRC, and other general outbound mail requirements...
OpenText Archive Center 16.2 Single File Vendor Interface (VI) using Microsoft Azure Storage Account as a storage device is now supported on Linux. Checkout this brief overview of its usage on one of our current projects. Thanks to Manish Shah (Microsoft) for his contribution and working with OpenText to achieve support on Linux, to Supriya Pande for her article on the Microsoft Azure Storage Explorer, to Oleh Khrypko (SAP) for his input to handling disaster recovery on OpenText Archive Center and Gary Jackson (Aliter Consulting) for the article.
This document provides instructions for setting up SSL connectivity between SAP LVM and the SAP Host Agent using x509 certificate authentication. It involves generating a certificate signing request for the LVM server, having it signed by a certificate authority, uploading the signed certificate and CA/ICA certificates to the LVM keystore. It also describes adding the CA/ICA certificates to the Host Agent's PSE, configuring the host profile, and testing the SSL connection between LVM and the Host Agent.
This document provides instructions for integrating SAP Business Process Automation (BPA) with SAP Landscape Virtualization Management (LVM). It involves creating a custom operation in LVM that allows controlling BPA queues. This is done by creating a provider implementation and custom operation in LVM along with a process definition and web service in BPA. It also requires registering a script with the host agent to connect the LVM and BPA configurations. The custom operation then allows holding or releasing BPA queues from the LVM interface.
This document provides an overview of how to customize SAP Landscape Virtualization Management (LVM) with custom operations and hooks. It describes defining a provider implementation ("LVM_CustomOperation_ClusterAdm") and custom operations ("Freeze", "Unfreeze", "Relocate") for managing a Red Hat cluster. A sample script ("ClusterAdm.ksh") demonstrates how custom operations could freeze/unfreeze the cluster before SAP instance start/stop operations. The provider implementation and custom operations/hooks allow LVM to integrate cluster management operations.
This document provides instructions for installing SAP Router using Secure Network Communication (SNC) and registering it with SAP. It outlines downloading the installation files, creating a dedicated system user and filesystem, unpacking and configuring the software, generating and importing an SNC certificate, creating a router table, and starting/stopping the SAP Router service.
This document provides guidance on customizing SAP Landscape Virtualization Management (LVM) to manage custom instance types. It describes how to configure generic operations like detect, monitor, start, and stop by creating scripts referenced in configuration files. An example is provided for managing SAP Replication Server (SRS) instances, with configuration files and sample scripting code shown.
The document discusses SAP Web Dispatcher 7.40, which is a load balancer that provides intelligent load distribution for SAP Portal. It can handle stateful or stateless sessions over HTTP or HTTPS invisibly to clients. It supports round-robin load distribution for non-SAP backends like Tomcat. It also allows for multiple SSL certificates to handle multiple domains and backends. SAP Web Dispatcher provides reliability, security, and high performance to handle thousands of concurrent users. It includes features like maintenance mode, custom error pages, and is free to use with an SAP license.
Enhancing adoption of Open Source Libraries. A case study on Albumentations.AI – Vladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Removing Uninteresting Bytes in Software Fuzzing – Aftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
20 Comprehensive Checklist of Designing and Developing a Website – Pixlogix Infotech
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
Securing your Kubernetes cluster: a step-by-step guide to success! – KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack – shyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
What do a Lego brick and the XZ backdoor have in common? – Speck&Tech
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might seem to have in common only the fact that they are both building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the XZ backdoor case have much more in common than that.
Join the presentation to dive into a story of interoperability, standards and open formats, and then discuss the important role that contributors play in a sustainable open source community.
BIO: An advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several LibreOffice-related events, migrations and training activities. She previously worked on LibreOffice migrations and training courses for several public administrations and private organisations. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when not following her passion for computers and for Geeko she cultivates her curiosity about astronomy (which is where her nickname deneb_alpha comes from).
Full-RAG: A modern architecture for hyper-personalization – Zilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 – Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... – Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! – SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
2. Introduction
• Since we started this series of presentations, SAP has renamed its Landscape Virtualization Management (LVM) product to simply Landscape Management (LaMa) to avoid confusion
• This is the fifth presentation in our series dedicated to SAP Landscape Management (LaMa)
• SAP LaMa Cloud Managers allow customers to integrate their public cloud operations into SAP LaMa
• This document provides a quick overview of how you can integrate the Deactivate/Activate of Microsoft Azure virtual machines hosting SAP systems with the Microsoft Azure Connector for SAP LaMa
3. Overview
• As of Q2/2018, SAP LaMa Cloud Managers support the following public cloud platforms:
– Microsoft Azure
– Amazon Web Services
• The Microsoft Azure Connector provides the following scenarios:
– Activate
– Deactivate (Power Off)
– SAP System Relocate
– SAP System Copy & Clone
– SAP System Refresh
• The setup described in this document is based on SAP LaMa 3.0 SP06 on SAP NetWeaver 7.50 SP10
4. Drivers
• Monitoring hosting costs in public cloud environments is of paramount importance, otherwise some of the perceived benefits will not be realised
• The ability to deactivate virtual machines during periods of system inactivity (e.g. weekends or overnight), coupled with activation when the system is required, becomes important
• Seamlessly integrating this deactivation/activation with the stop/start of the hosted SAP system and database is crucial
• Up to 50% savings could be achieved by running a system 16 hours per day for 20 days per month (320 VM-hours) as opposed to 24 hours per day for 30 days per month (720 VM-hours)
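To sanity-check that figure, the arithmetic can be worked through directly. The sketch below uses a placeholder hourly rate rather than a real Azure price; the raw compute saving comes out at roughly 56%, and the deck's more conservative "up to 50%" presumably allows for charges such as managed disks, which continue to accrue while a VM is deallocated.

```python
# Rough compute-hour comparison for the scenario on this slide.
# HOURLY_RATE is illustrative only; substitute the rate for your VM size and region.
HOURLY_RATE = 1.00  # placeholder price per VM-hour

always_on_hours = 24 * 30   # 720 VM-hours per month
working_hours = 16 * 20     # 320 VM-hours per month

saving = 1 - (working_hours / always_on_hours)
print(f"{working_hours} vs {always_on_hours} VM-hours per month")
print(f"Compute cost: {working_hours * HOURLY_RATE:.2f} vs {always_on_hours * HOURLY_RATE:.2f}")
print(f"Compute saving: {saving:.0%}")  # ~56% on compute; storage still bills while deallocated
```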
5. Pre-Requisites
• Some of the necessary configuration steps for the setup are taken from SAP Note "2343511 Microsoft Azure connector for SAP Landscape Management (LaMa)"
• The use of Cloud Managers in SAP LaMa requires the installation of the SAP Adaptive Extension 1.0 with patch level 39 or above
• An Azure Service Principal with a sufficient level of access to the virtual machines in the Azure Resource Group must be defined in advance of the configuration in SAP LaMa
• Any firewall must be open to allow HTTPS access to management.azure.com on port 443
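Before configuring the Cloud Manager, it can be useful to confirm these prerequisites outside of LaMa by checking, with the same Service Principal credentials, that management.azure.com is reachable on port 443 and that the principal can enumerate the VMs in the resource group. A minimal sketch using the azure-identity and azure-mgmt-compute Python packages is shown below; every ID and name is a placeholder you would substitute with your own values.

```python
# Sanity check of the Azure connector prerequisites, run from the SAP LaMa host.
# All values in angle brackets are placeholders.
import socket

from azure.identity import ClientSecretCredential
from azure.mgmt.compute import ComputeManagementClient

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<service-principal-app-id>"       # corresponds to the "Name" field in the Cloud Manager
CLIENT_SECRET = "<service-principal-key>"      # corresponds to the "Password" field in the Cloud Manager
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group-with-sap-vms>"

# 1. Firewall check: HTTPS access to the Azure Resource Manager endpoint on port 443
socket.create_connection(("management.azure.com", 443), timeout=10).close()
print("management.azure.com:443 reachable")

# 2. Service Principal check: can we authenticate and list the VMs in the resource group?
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
compute = ComputeManagementClient(credential, SUBSCRIPTION_ID)
for vm in compute.virtual_machines.list(RESOURCE_GROUP):
    print("visible to the Service Principal:", vm.name)
```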
6. Cloud Manager for Azure
• Cloud connections are created in the Infrastructure → Cloud Manager section within SAP LaMa
• Key fields for the Azure Cloud Manager configuration are Name, Password, Subscription and AD Identity
• The Name and Password fields equate to the Service Principal and Key
• Details of the Service Principal should be safeguarded carefully
7. Manual Cloud Operations
• Once the configuration has been completed, cloud operations can be performed within the Cloud tab from the Operations menu
• The Azure Service Principal is displayed with a summary of the resources from the resource groups it can access
8. Manual Cloud Operations
• Navigating to the Service Principal expands the available resource groups shown under OS RESOURCE POOLS
• Selecting a resource group will drill further into the virtual machines
9. Manual Cloud Operations
• A listed virtual machine can be shut down and deactivated in Azure by choosing Operations → Deactivate (Power off)
• Monitor the progress of the operation by navigating to the Logs in the usual way
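For context, the connector's Deactivate (Power off) and Activate operations correspond to deallocating and starting the virtual machine at the Azure Resource Manager level (deallocation is what stops compute billing). The sketch below shows the equivalent calls with the azure-mgmt-compute SDK purely as an illustration of those underlying ARM operations, not of how the connector itself is implemented; all names are placeholders.

```python
# Equivalent ARM operations to LaMa "Deactivate (Power off)" and "Activate".
# Placeholders in angle brackets; credentials as in the prerequisites sketch.
from azure.identity import ClientSecretCredential
from azure.mgmt.compute import ComputeManagementClient

credential = ClientSecretCredential("<tenant-id>", "<service-principal-app-id>", "<service-principal-key>")
compute = ComputeManagementClient(credential, "<subscription-id>")

# Deactivate (Power off): deallocate the VM so compute billing stops
compute.virtual_machines.begin_deallocate("<resource-group>", "<sap-vm-name>").result()

# Activate: start the VM again before the SAP system is started
compute.virtual_machines.begin_start("<resource-group>", "<sap-vm-name>").result()
```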
10. Scheduled Cloud Operations
• Since SAP LaMa 3.0 SP06, cloud operations can be scheduled in the LaMa scheduler
• This is a very interesting innovation which effectively allows SAP systems to be stopped and their hosting VM(s) to be deactivated on a time basis, thus saving pay-as-you-go hosting costs
• Start by creating two Operation Templates:
– one to stop the SAP system and deactivate the VM
– one to activate the VM and start the SAP system
• Then include the two Operation Templates in the SAP LaMa schedule
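The Operation Templates themselves are defined in the LaMa UI, so there is no code to write; the outline below simply captures, under assumed template names, the step ordering each one needs so that an SAP system is never left running on a VM that is about to be deallocated.

```python
# Illustrative only: the step ordering each LaMa Operation Template should encode.
# Template names and step wording are assumptions, not LaMa identifiers.
OPERATION_TEMPLATES = {
    "Stop and Deactivate": [
        "Stop the SAP system and database",
        "Deactivate (Power off) the hosting VM(s) via the Azure connector",
    ],
    "Activate and Start": [
        "Activate the hosting VM(s) via the Azure connector",
        "Start the database and SAP system",
    ],
}

for name, steps in OPERATION_TEMPLATES.items():
    print(name)
    for number, step in enumerate(steps, 1):
        print(f"  {number}. {step}")
```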
11. Reference Material
• The following pages on SAP Help are useful:
– SAP Landscape Management 3.0
– Configuring Cloud Managers
• The following SAP notes provide some information:
– 2343511 Microsoft Azure connector for SAP Landscape Management (LaMa)
• Further details are available on request from our SAP LaMa Certified Consultants – mailto:info@aliterconsulting.co.uk