Scale-Up or Scale-Out?
Which is better?
What to do after Go-Live?
How to improve/fine-tune performance?
Can you share some results of such an optimisation exercise?
Shrink your DB and increase SAP BW performance (DataVard)
This presentation will show you 5 steps you should consider in your data management for SAP BW. Smart data management offers tactics to keep BW performing at a high level and to keep data growth under control.
Azure SQL Database Managed Instance is a new flavor of Azure SQL Database that is a game changer. It offers near-complete SQL Server compatibility and network isolation to easily lift and shift databases to Azure (you can literally back up an on-premises database and restore it into an Azure SQL Database Managed Instance). Think of it as an enhancement to Azure SQL Database that is built on the same PaaS infrastructure and maintains all its features (e.g. active geo-replication, high availability, automatic backups, database advisor, threat detection, intelligent insights, vulnerability assessment) but adds support for databases up to 35 TB, VNET, SQL Agent, cross-database querying, replication, etc. So you can migrate your databases from on-premises to Azure with very little migration effort, which is a big improvement over the current Singleton and Elastic Pool flavors, which can require substantial changes.
Power BI Desktop | Power BI Tutorial | Power BI Training | Edureka (Edureka!)
This Edureka "Power BI Desktop" tutorial will help you to understand what Power BI Desktop is, with examples and a demo. Below are the topics covered in this tutorial:
1. Why Power BI?
2. What is Power BI?
3. Who uses Power BI?
4. Flow of Work
5. Power BI Trends
Introduction to Power BI to make smart decisions (VIVEK GURURANI)
Power BI: Business intelligence like never before!
Power BI is a tool that allows accounting and finance managers and professionals to have relevant information at the right time to make strategic decisions in a flexible, centralized, and secure way. Transform your business with predictive analytics, data visualization, and information in real-time, thanks to Power BI.
Presentation Agenda:
1. Introduction to Power BI
2. Why would your business need Power BI?
3. Power BI Architecture
4. Data Sanitization & Cleansing Capabilities
5. Analytical Insights
6. Generating Department Wise Reports & Visualizations
7. Latest & Upcoming Power BI features
8. Live Q&A
Power BI Governance and Development Best Practices - Presentation at #MSBIFI ... (Jouko Nyholm)
Selected slides from presentation regarding Power BI Governance and Development Best Practices. Presentation was held at MS BI & Power BI User Group Finland event 12.6.2018 at Microsoft Flux, Helsinki.
Without the animations & hands-on demos the slides do not tell the whole story, but hopefully valuable to some nevertheless.
Power BI Online Training Hyderabad | power bi online training ben... (Big IT Trainings)
Power BI Desktop integrates proven Microsoft technologies: the powerful query engine, data modeling, and visualizations. It can connect to all sorts of different data sources, then combine and shape them in ways that facilitate building interesting reports, as covered in Power BI online training.
The current Microsoft Power BI governance enablement and recommendations, including the changes following the November Power BI release and the PASS conference announcements.
Power BI Tutorial For Beginners | Power BI Tutorial | Power BI Demo | Power B... (Edureka!)
( Power BI Training - https://www.edureka.co/power-bi-training )
This Edureka video on "Power BI Tutorial" will provide you with the fundamental knowledge on Power BI (Blog: https://goo.gl/uFTDU3). Below are the topics covered in this tutorial:
1. Why do we need Business Intelligence?
2. What is Self Service Business Intelligence?
3. Why Power BI?
4. What is Power BI?
5. Demo: Report and Dashboard Creation
The presentation discusses the different aspects of Power BI like Power BI for O365, Data Discovery, Data Analysis, Data Visualization & Power Maps, Natural Language Search etc.
It's a business analytics solution presented by Netwoven at the Microsoft Power BI workshop held on Oct 30th at SVC Microsoft, Mountain View.
Power BI Overview, Deployment and Governance (James Serra)
Deploying Power BI in a large enterprise is a complex task, and one that requires a lot of thought and planning. The purpose of this presentation is to help you make your Power BI deployment a success. After a quick Power BI overview, I’ll discuss deployment strategies, common usage scenarios, how to store and refresh data, prototyping options, how to share externally, and then finish with how to administer and secure Power BI. I’ll outline considerations and best practices for achieving an optimal, well-performing, enterprise level Power BI deployment.
Azure Synapse Analytics is Azure SQL Data Warehouse evolved: a limitless analytics service, that brings together enterprise data warehousing and Big Data analytics into a single service. It gives you the freedom to query data on your terms, using either serverless on-demand or provisioned resources, at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate business intelligence and machine learning needs. This is a huge deck with lots of screenshots so you can see exactly how it works.
Power BI: Introduction with a use case and solution (Alvina Verghis)
This PPT gives a brief introduction to the Power BI software, together with a use case showing how Power BI is used at Heathrow Airport to simplify operations.
Slides from the meeting of the Manchester Power BI User Group on June 27th, 2019.
Subject: Power BI for Developers about Power BI Embedded and Power BI Custom Visuals
With the launch of Guided Configuration and, of course, SAP HANA and S/4HANA, configuration work has become easier and companies can save up to 75% of configuration effort. The Guided Configuration tool also helps to validate and accurately compose this important exercise by removing the human errors experienced in traditional use of IMG and SPRO.
CIO CFO Planning Forum, London 2016. Speaking as an independent expert without any company references. All views/opinions are solely those of Ajay Kumar Uppal, speaker at the Forum.
VMworld 2013: Big Data: Virtualized SAP HANA Performance, Scalability and Bes... (VMworld)
VMworld 2013
Bob Goldsand, VMware
Todd Muirhead, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
Cloud centric consumption based services for SAP, HANA, Concur, Ariba, C4C (Ajay Kumar Uppal)
Customers are advised NOT to buy Hosting, Application Development and Application Maintenance Services on Fixed Price basis....Customers should buy ''Services'' on flexible pay-as-you-consume model with provision of ramping-up, ramping down and even terminating contracts without penalties/termination fees.... Outsourcing contracts should allow customers to move consumption based on what business needs as opposed to getting tied up in contracts...
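The pay-as-you-consume argument above can be illustrated with a toy cost model. All rates and usage figures below are hypothetical, purely to show why metered billing can beat a fixed-price contract for seasonal workloads:

```python
# Toy comparison of a fixed-price contract versus pay-as-you-consume
# billing for hosting. All figures are hypothetical.

def fixed_price_cost(months, monthly_fee):
    """Fixed contract: the same fee is owed regardless of usage."""
    return months * monthly_fee

def consumption_cost(usage_by_month, rate_per_unit):
    """Pay-as-you-consume: cost tracks actual usage and can ramp
    down in quiet months without penalties or termination fees."""
    return sum(units * rate_per_unit for units in usage_by_month)

# A seasonal workload: heavy at quarter-ends, light otherwise.
usage = [100, 80, 300, 90, 70, 310, 85, 75, 305, 95, 80, 320]

fixed = fixed_price_cost(12, monthly_fee=5000)
metered = consumption_cost(usage, rate_per_unit=20)

print(f"fixed-price contract: {fixed}")
print(f"pay-as-you-consume:   {metered}")
```

With this (made-up) demand curve the metered model is cheaper; with steadier, near-peak demand the fixed contract could win, which is exactly the flexibility trade-off the paragraph argues for.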
Architecture review certificate generation of client files (Ajay Kumar Uppal)
Example: an Architecture Review Certificate, which is an instrument to safeguard an organisation's ICT spend and ensure that all changes/projects are aligned with the overall business strategy, ICT strategy, and architectural principles and guidelines. It helps to stop people's 'pet projects' and also helps to re-align Business with IT.
Hana Memory Scale out using the hecatonchire Project (Benoit Hudzia)
This session will present a memory scale-out solution that liberates SAP HANA, or similar memory demanding enterprise applications, from the classical limitation of underlying physical servers. The solution relies on key enabling technology developed within SAP. It allows applications or hypervisors to go beyond the boundaries of the underlying hardware, and effectively enables a fluid transformation from commodity sized physical nodes to very large virtual instances, in order to meet the rapidly growing demand of memory intensive applications.
Running SAP Business Warehouse in the AWS Cloud - SAPPHIRE NOW 2016 (Amazon Web Services)
In this presentation, learn how AWS works with SAP to certify non-production and production workloads, best practices for sizing and deploying SAP Business Warehouse on AWS, pricing, and how to get started.
SAP HANA Financial Closing can help you ACCELERATE your financial closing cycle. Benefit from increased governance, higher user efficiency and automation, strong collaboration, and real-time insight.
SAP HANA Online Training / SAP HANA Interview Questions (Globustrainings)
SAP HANA is an in-memory database. It is a combination of hardware and software made to process massive real-time data using in-memory computing. It combines row-based and column-based database technology.
Data resides in main memory (RAM) and no longer on a hard disk. It is best suited for performing real-time analytics, and for developing and deploying real-time applications.
An in-memory database means all the data is stored in memory (RAM). No time is wasted loading data from hard disk into RAM, or shuttling some data between RAM and disk during processing. Everything is in memory all the time, which gives the CPUs quick access to the data for processing.
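The "data lives entirely in RAM" idea can be demonstrated with SQLite's `:memory:` mode. SQLite is unrelated to HANA and is row-based, so this is only a minimal sketch of the in-memory half of the story, not of HANA's column store:

```python
# Minimal illustration of an in-memory database: connecting to
# ":memory:" keeps the whole SQLite database in RAM, so queries
# never touch disk.
import sqlite3

conn = sqlite3.connect(":memory:")   # database exists only in RAM
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("APAC", 80.0), ("EMEA", 45.5)],
)

# An analytics-style query served entirely from memory.
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'EMEA'"
).fetchone()[0]
print(total)  # 165.5
conn.close()
```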
BW Migration to HANA Part 1 - Preparation in BW System (Linh Nguyen)
This series of publications intends to provide an overview and explanation of major steps and considerations for BW on HANA migrations from anyDB (any database). The complex procedure involves:
1) Preparatory work in the BW system
2) SUM DMO Upgrade and Actual migration
3) Post processing on the migrated systems
This first part focuses on the preparation tasks on the BW system.
By OZSoft Consulting for ITConductor.com
Author: Terry Kempis
Editor: Linh Nguyen
Best Practices to Administer, Operate, and Monitor an SAP HANA System (SAPinsider Events)
Review this session from HANA 2015 in Las Vegas. Coming to Europe! www.HANA2015.com
Best Practices to Administer, Operate, and Monitor an SAP HANA System by Kurt Hollis, Deloitte
This session provides easy to understand, step-by-step instruction for operation and administration of SAP HANA post go-live. Through live demo and detailed instruction, attendees will:
· Learn how to use the SAP HANA studio for security, user management, credential management, high availability administration, system maintenance, and performance optimization
· Gain a comprehensive understanding of available SAP HANA platform lifecycle management tools, deployment options, and system relocation
· Get an introduction to SAP HANA HA/DR capabilities, and learn best practices for backup and recovery of the SAP HANA system
Prepare for your interview with these top 20 SAP HANA interview questions. For more IT profiles, sample resumes, practice exams, interview questions, live training and more, visit ITLearnMore, a trusted website for the learning needs of students, graduates and working professionals.
Looking to add weight to your resume? Check out ITLearnMore for a variety of online IT courses at affordable prices, intended to boost your career. There is plenty in store for both fresh graduates and professionals. Get updated with the current IT job market requirements and related courses. For more information visit http://www.ITLearnMore.com.
What is SAP HANA - Convista Consulting Asia.pdf (ankeetkumar4)
SAP HANA is the most recent in-memory database and platform, which can be deployed on-premises or in the cloud. SAP HANA is a combination of hardware and software that integrates several components: the SAP HANA database, SAP SLT (System Landscape Transformation) Replication Server, SAP HANA Direct Extractor Connection, and Sybase replication.
HANA has seen remarkable adoption by SAP customers. SAP HANA is capable of processing large volumes of real-time data in a very short time.
Enterprise Architecture - Information Security
Promotes innovation and creativity, and transforms the enterprise in a secure manner by creating common insights and overviews of relationships and inter-dependencies to reduce miscommunication and misunderstandings, and to take decisions with confidence.
Adaptive computing and pay-as-you-model for SAP.....
Customers should stop buying on CAPEX and signing long term contracts.... it is time to move to Consumption Based Computing...
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, with an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Enhancing Performance with Globus and the Science DMZ (Globus)
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns, and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
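DIAR's actual scoring is described in the paper; the sketch below only shows the general idea of dropping seed bytes whose removal leaves the observed program behaviour unchanged. The `behaviour` function here is a made-up stand-in for real coverage feedback from a fuzzer:

```python
def minimize_seed(seed: bytes, behaviour) -> bytes:
    """Greedily drop bytes whose removal does not change the
    observed behaviour (e.g. coverage) of the target program."""
    baseline = behaviour(seed)
    out = bytearray(seed)
    i = 0
    while i < len(out):
        candidate = out[:i] + out[i + 1:]
        if behaviour(bytes(candidate)) == baseline:
            out = candidate          # byte was uninteresting: drop it
        else:
            i += 1                   # byte matters: keep it, move on
    return bytes(out)

# Stand-in target: behaviour only depends on which tokens appear,
# the way coverage might only depend on which parser paths fire.
def behaviour(data: bytes):
    return (b"<a>" in data, b"</a>" in data)

seed = b"xx<a>padding bytes</a>yy"
lean = minimize_seed(seed, behaviour)
print(lean)  # b'<a></a>'
```

Against a real target this loop would be far too slow byte-by-byte; the point is only the shape of the idea: mutate lean seeds, not bloated ones.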
PHP Frameworks: I want to break free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Pushing the limits of ePRTC: 100ns holdover for 100 days (Adtran)
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
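Capturing a deployment bill of materials can be sketched very simply: record each deployed artifact with a content hash so the exact bits in production can be audited later. The field names below are illustrative only, not any standard DBOM schema:

```python
# Toy sketch of a deployment bill of materials (DBOM): each deployed
# artifact is recorded with a SHA-256 digest of its content.
import hashlib
import json

def dbom_entry(name: str, version: str, content: bytes) -> dict:
    """One DBOM record: identity plus a verifiable content hash."""
    return {
        "name": name,
        "version": version,
        "sha256": hashlib.sha256(content).hexdigest(),
    }

def build_dbom(artifacts) -> str:
    """Serialize (name, version, content) triples into a DBOM document."""
    entries = [dbom_entry(n, v, c) for n, v, c in artifacts]
    return json.dumps({"artifacts": entries}, indent=2)

dbom = build_dbom([
    ("payments-service", "1.4.2", b"fake image bytes"),
    ("config-bundle", "2024-06", b"fake config bytes"),
])
print(dbom)
```

In a real pipeline the content would be container image layers or release bundles, and the DBOM would be emitted by the deployment step itself so it cannot drift from what actually shipped.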
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Welcome to the first live UiPath Community Day Dubai! Join us for this unique occasion to meet our local and global UiPath Community and leaders. You will get a full view of the MEA region's automation landscape and the AI Powered automation technology capabilities of UiPath. Also, hosted by our local partners Marc Ellis, you will enjoy a half-day packed with industry insights and automation peers networking.
📕 Curious on our agenda? Wait no more!
10:00 Welcome note - UiPath Community in Dubai
Lovely Sinha, UiPath Community Chapter Leader, UiPath MVPx3, Hyper-automation Consultant, First Abu Dhabi Bank
10:20 A UiPath cross-region MEA overview
Ashraf El Zarka, VP and Managing Director MEA, UiPath
10:35: Customer Success Journey
Deepthi Deepak, Head of Intelligent Automation CoE, First Abu Dhabi Bank
11:15 The UiPath approach to GenAI with our three principles: improve accuracy, supercharge productivity, and automate more
Boris Krumrey, Global VP, Automation Innovation, UiPath
12:15 Discover how Marc Ellis leverages tech-driven solutions in recruitment and managed services.
Brendan Lingam, Director of Sales and Business Development, Marc Ellis
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing is discussed in the talk. ICT and testing must carry their part of global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
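One concrete test-minimization technique is pairwise (all-pairs) selection: instead of running every combination of parameter values, pick a small suite in which every pair of values still appears together. A rough greedy sketch (the parameter domains are made up):

```python
from itertools import combinations, product

def pair_set(test):
    """All (param_i, param_j, value_i, value_j) pairs a test covers."""
    return {(i, j, test[i], test[j])
            for i, j in combinations(range(len(test)), 2)}

def greedy_pairwise(domains):
    """Pick tests until every value pair of every parameter pair is
    covered; usually far fewer runs than the full cartesian product."""
    all_tests = list(product(*domains))
    uncovered = set().union(*(pair_set(t) for t in all_tests))
    suite = []
    while uncovered:
        best = max(all_tests, key=lambda t: len(uncovered & pair_set(t)))
        uncovered -= pair_set(best)
        suite.append(best)
    return suite

# 3 parameters with 4 values each: 64 exhaustive combinations.
domains = [range(4), range(4), range(4)]
suite = greedy_pairwise(domains)
print(len(list(product(*domains))), "exhaustive vs", len(suite), "pairwise")
```

Fewer test runs means fewer machine-hours, which is one small, measurable way testing can shrink its carbon footprint.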
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
2. BW on HANA: Scale-Up Versus Scale-Out Configuration

Capacity ramp-up, flexibility, scalability
  Scale-Out: Can be ramped up easily to almost 200 TB; allows maximum flexibility and scalability.
  Scale-Up: Limited scalability.

Capacity ramp-down
  Scale-Out: Available.
  Scale-Up: No provision.

SAP certification for ramp-up from 1 TB to 4 TB and beyond
  Scale-Out: Already available.
  Scale-Up: Not certain; at the mercy of SAP.

Business downtime during capacity ramp-up
  Scale-Out: Zero.
  Scale-Up: Equal to the migration time (roughly 18 hours or more).

Costs associated with capacity ramp-up
  Scale-Out: No project fees.
  Scale-Up: Has to be done as a migration project.

Installed base
  Scale-Out: >95% of customers.
  Scale-Up: <5% of customers.

Largest production instance size
  Scale-Out: >60 TB.
  Scale-Up: <4 TB.

Architecture, operations and support
  Scale-Out: Multi-node system(s) with more parts; calls for a little extra management effort compared to Scale-Up, but allows better flexibility.
  Scale-Up: Single node; calls for less management and support effort, but if the single node fails, the only fallback is DR.

Table re-distribution
  Scale-Out: May call for table re-distribution once or twice a year; this can be done during planned maintenance.
  Scale-Up: Little to no need, as everything is stored on a single node.

Reliability
  Scale-Out: Higher reliability; a spare node can be provisioned in case one node goes down.
  Scale-Up: Low reliability; if the single node goes down, the entire system fails.
3. BW on HANA: Performance Optimisation
Some items that reliably result in performance optimization:
• DataStore Object (DSO) activation is a critical step in transferring data from source systems to the business warehouse. We have achieved activation times 54 times faster than the previous process.
• Faster DSO activation makes data available more quickly and supports more frequent loading and updating of data, for availability closer to real time.
• It speeds the flow of data from source systems.
• Conversion of in-memory objects, whereby the extended star schema is slimmed down and dimensions are realigned, provides scope for re-architecting some data flows without disruption.
• Define a Data Volume Optimization strategy to keep the system clean and lean.
• In a tool-supported post-migration step, InfoCubes can be selected and converted to SAP HANA-optimized objects.
4. Housekeeping is the single most significant contributor
SAP HANA Data Volume Management Tasks (Sample Template)

Priority | Action | Deadline
High | Define retention times for PSA records individually for all DataSources and delete outdated data. | After BW on HANA go-live
High | Schedule a periodic batch job to delete outdated entries from table ODQDATA_F. | After BW on HANA go-live
High | Enable the non-active data concept as suggested in SAP Note 1767880. | After BW on HANA go-live
High | Archive / move to NLS old or unused data from the DSOs and InfoCubes. | After BW on HANA go-live
High | End-to-end review with the Philips team and recommendations regarding the top BW schema tables. | After BW on HANA go-live
High | Frequent review and check of HANA DB parameter settings. | After BW on HANA go-live
High | Check whether power-save mode is active (for more information, see SAP Note 1890444). | After BW on HANA go-live
High | Weekly check of HANA DB trace settings. | After BW on HANA go-live
High | Review and propose a best practice for the backup procedure. | Ongoing
High | At least weekly review and monitoring of the recommendations for the alerts generated in the HANA DB system. | After BW on HANA go-live
High | Use report RSDRI_RECONVERT_DATASTORE to convert HANA-optimized DSOs back to classic objects, since as of BW 7.3 SP5 standard DSOs are supported on the HANA schema algorithm. | After BW on HANA go-live
Medium | Consider partitioning for tables that are expected to grow rapidly, in order to ensure parallelization and adequate performance. | After BW on HANA go-live
Medium | Propose re-partitioning of tables that are expected to grow; HP recommends re-partitioning tables before inserting mass data or while they are still small. | After BW on HANA go-live
Medium | Review, test, and implement SAP basis and memory management parameter recommendations to avoid out-of-memory (OOM) issues. | After BW on HANA go-live
5. Optimization 1: Recommendations to reduce the data footprint on the HANA database. As a rule of thumb, at least 45-50% of SAP HANA memory should be reserved for SQL computations, SAP HANA services, analytics, and other OS-related services; the rest can be occupied by the actual data in the column and row stores. Check the BW on HANA system frequently for how much memory is occupied by data and how much is left for computations. If data takes more than its share, performance suffers: the number of table unloads from memory to disk increases, which further deteriorates performance and leads to high memory peaks in SAP HANA. Ongoing monitoring keeps the BW on HANA system in line with best practices.
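The 45-50% rule of thumb above can be sketched as a simple check. The threshold and the example figures below are illustrative assumptions; in a real system the numbers would come from HANA monitoring views rather than constants.

```python
# Rule-of-thumb check: keep at least ~50% of SAP HANA memory free for
# SQL computations, HANA services, and OS-related services; the rest
# may hold actual column/row-store data.

def data_footprint_ok(total_memory_gb: float, data_memory_gb: float,
                      max_data_share: float = 0.50) -> bool:
    """True if table data stays within the recommended share of memory."""
    return data_memory_gb <= total_memory_gb * max_data_share

# Illustrative example: a 2 TB appliance holding 1.2 TB of table data
# exceeds the 50% guideline, so table unloads become likely.
print(data_footprint_ok(2048, 1200))  # False -> footprint too high
print(data_footprint_ok(2048, 900))   # True  -> within guideline
```

A failing check is the point at which unload counts and memory peaks should be investigated.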
Optimization 2: The provider proposes frequent analysis of the HANA database configuration, including a review of HANA DB parameters, CPU frequency settings, and trace settings. Objects with a high number of records should be analyzed for table partitioning if those tables are expected to grow rapidly in the future.
Optimization 3: To reduce the data footprint in the HANA database, review and implement the following recommendations:
• Track the size of the top (roughly 30) PSA tables and assess the data retention policies.
• Cold/warm data can be unloaded to disk.
• Define retention times for PSA records individually for all DataSources and delete outdated data, starting with the largest PSA tables.
BW on HANA project: additional questions
6. Optimization 4: Delete the outdated entries from table ODQDATA_F by scheduling the periodic batch job ODQ_CLEANUP, as suggested in SAP Note 1836773.
Recommendation: Table ODQDATA_F is part of the operational delta queue. Refer to SAP Note 1836773 ("How to delete outdated entries from delta queues - SAP Data Services") and delete the outdated entries from this table using the batch job ODQ_CLEANUP.
Once a day, a cleanup process removes all outdated entries from the delta queues so they do not fill up. This is a regular batch job and can be maintained as such. The job and the retention interval can be configured in transaction ODQMON:
• In transaction ODQMON, choose Goto -> Reorganize Delta Queues.
• Schedule a job for reorganization, e.g. ODQ_CLEANUP_CLIENT_004.
• By default the job is scheduled daily at 01:23:45 system time.
• If needed, adapt the start time and frequency in transaction SM37.
• If needed, adapt the retention time for recovery (see the F1 help for details).
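The retention behaviour that such a cleanup job applies can be sketched as follows. The entry layout and the seven-day window are hypothetical illustrations, not the real ODQDATA_F schema or default retention setting.

```python
# Sketch of delta-queue retention cleanup: entries older than the
# configured retention window are removed, everything newer is kept.
from datetime import datetime, timedelta

def cleanup_delta_queue(entries, retention_days=7, now=None):
    """Keep only delta-queue entries younger than the retention window."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=retention_days)
    return [e for e in entries if e["created_at"] >= cutoff]

now = datetime(2024, 1, 15)
queue = [
    {"id": 1, "created_at": datetime(2024, 1, 1)},   # outdated, removed
    {"id": 2, "created_at": datetime(2024, 1, 14)},  # within retention, kept
]
print([e["id"] for e in cleanup_delta_queue(queue, retention_days=7, now=now)])
# -> [2]
```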
Optimization 5: Enable the non-active data concept for BW on SAP HANA; review and implement the code corrections contained in SAP Note 1767880 ("Non-active data concept for BW on SAP HANA DB").
After implementing the code corrections, follow the manual steps to ensure that the unload priorities of all tables are set correctly.
This ensures that Persistent Staging Area (PSA) tables, change-log tables, and write-optimized DataStore objects are flagged as EARLY UNLOAD by default, which means these objects are displaced from memory before other BW objects (such as InfoCubes and standard DataStore objects).
7. Optimization 6: Understand and review the CPU type, CPU clock frequency, and the hosts. If the CPU clock frequency is
set too low, this has a negative impact on the overall performance of the SAP HANA system. Usually the CPU clock
frequency should be above 2000 MHz.
Optimization 7: If an inappropriate trace level is set for SAP HANA database components, a large amount of trace information may be generated during routine operation. This can impair system performance and lead to unnecessary consumption of disk space.
Recommendation: For production use of your SAP HANA database, set the trace level of all components to the recommended values.
Background: Traces can be switched on and off in the 'Trace Configuration' tab of the SAP HANA studio Administration Console.
Optimization 8: Largest non-partitioned column tables: There are objects with a high number of records (more than 300 million). This is not yet critical with regard to the technical limit of SAP HANA (2 billion records), but table partitioning should be considered if these tables are expected to grow rapidly in the future.
Recommendation: Consider partitioning for tables that are expected to grow rapidly, in order to ensure parallelization and adequate performance. We recommend partitioning tables before inserting mass data or while they are still small.
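A minimal sketch of the partition-count decision implied by the 2-billion-record limit. The 50% headroom factor is our own assumption to leave room for growth, not an SAP guideline.

```python
import math

# SAP HANA's technical limit per non-partitioned column table.
HANA_RECORD_LIMIT = 2_000_000_000

def partitions_needed(expected_records: int, headroom: float = 0.5) -> int:
    """Number of partitions so each stays well below the record limit."""
    effective_limit = int(HANA_RECORD_LIMIT * headroom)  # growth headroom
    return max(1, math.ceil(expected_records / effective_limit))

print(partitions_needed(300_000_000))    # 1 -> not yet critical
print(partitions_needed(5_000_000_000))  # 5 -> partition before mass insert
```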
8. Optimization 9: Largest partitioned column tables (records): Consider re-partitioning tables that are expected to grow. Re-partition tables before inserting mass data or while they are still small.
For more information, see SAP Note 1650394 or the SAP HANA Administration Guide.
Optimization 10: Largest column tables in terms of delta size: The separation into main and delta storage allows high compression and high write performance at the same time. Write operations are performed on the delta store, and changes are transferred from the delta store to the main store asynchronously during the delta merge. The column store automatically performs a delta merge according to several technical limits defined by parameters. If applications require more direct control over the merge process, the smart merge function can be used for certain tables (for example, BW prevents delta merges during data loading for performance reasons).
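The main/delta mechanics described above can be illustrated with a toy model. The threshold, the compression ratio, and the blocking flag are deliberate simplifications of HANA's parameter-driven auto merge and BW's smart merge, not real parameter values.

```python
# Toy model of a column-store table: writes land in the delta store;
# a merge into the compressed main store fires once the delta exceeds
# a threshold, unless the application (e.g. BW during a load) blocks it.

class ColumnTable:
    def __init__(self, name, delta_merge_threshold_mb=1000):
        self.name = name
        self.main_mb = 0.0
        self.delta_mb = 0.0
        self.threshold = delta_merge_threshold_mb
        self.merge_blocked = False  # "smart merge": defer during data loads

    def write(self, mb):
        self.delta_mb += mb
        if self.delta_mb >= self.threshold and not self.merge_blocked:
            self.merge()

    def merge(self):
        # Delta contents are compressed into the main store (assumed 0.4x).
        self.main_mb += self.delta_mb * 0.4
        self.delta_mb = 0.0

t = ColumnTable("SALES")
t.merge_blocked = True      # BW data load in progress
t.write(1500)               # delta passes the threshold, merge is deferred
print(round(t.delta_mb))    # 1500
t.merge_blocked = False
t.write(1)                  # next write triggers the deferred merge
print(round(t.delta_mb), round(t.main_mb))  # 0 600
```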
Optimization 11: Memory utilization details for HANA services
The memory usage of the SAP HANA engines (services) is only a snapshot taken at the time of data collection.
Different aspects of the memory consumption of the HANA database are relevant: "Physical Memory Used by Services" corresponds to the "Database Resident Size" in the SAP HANA studio and can be compared with the resident size of the service in the operating system. The sum of "Heap Memory Used Size" and "Shared Allocated Size" roughly corresponds to the memory usage of the SAP HANA database, shown in the SAP HANA studio as "Database Memory Usage". The difference between "Database Memory Usage" and "Resident Database Memory" can usually be explained by the "Allocated Heap Size".
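The relationships between these figures reduce to simple arithmetic. The numbers below are a made-up snapshot for illustration; in practice they would come from HANA monitoring, not constants.

```python
# Illustrative snapshot of HANA service memory figures (all in GB).
heap_used_gb = 310.0         # "Heap Memory Used Size"
shared_allocated_gb = 40.0   # "Shared Allocated Size"
resident_gb = 295.0          # "Physical Memory Used by Services"

# "Database Memory Usage" as shown in SAP HANA studio is roughly the
# sum of used heap and shared allocations:
database_memory_usage_gb = heap_used_gb + shared_allocated_gb

# The gap to the resident size is usually explained by allocated-but-
# unused heap:
gap_gb = database_memory_usage_gb - resident_gb

print(database_memory_usage_gb)  # 350.0
print(gap_gb)                    # 55.0
```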
9. Optimization 12: Reducing table sizes. All tables located in the row store are loaded into main memory when the database is started. Furthermore, row-store tables cannot be compressed as much as tables located in the column store. Therefore, the row store should be kept as small as possible.
RSDDSTAT* data: BW statistical data saved in the RSDDSTAT* tables is located in the row store. Since new data is continuously loaded into the Business Warehouse (BW), the amount of statistical data is always increasing. It is therefore essential to keep these statistical tables, which also provide information about the performance of your queries, as small as possible.
Recommendation: Reduce the number of records saved in the RSDDSTAT* tables. Consider the following:
• When maintaining the settings for the query statistics, note that deactivating the statistics is internally the same as activating the statistics with a specific detail level.
• The settings on the "InfoProvider" tab page affect the collection of statistical data for queries, as do the settings on the "Query" tab page (transaction RSDDSTAT). For Web templates, workbooks, and InfoProviders, you can only choose between activating and deactivating the statistics. If you did not maintain settings for the individual objects, the default setting for the object is used; if you did not change the default settings, the statistics are activated.
• You can delete statistics data using report RSDDSTAT_DATA_DELETE or via the corresponding graphical interface accessible through transaction RSDDSTAT.
10. Optimization 13: Conversion of InfoCubes and DataStore objects. After an upgrade to SAP NetWeaver BW 7.30 SP5 or later on SAP HANA, all DataStore objects and InfoCubes remain unchanged. In a tool-supported post-processing step (transaction RSMIGRHANADB or report RSDRI_CONVERT_CUBE_TO_INMEMORY), DataStore objects and InfoCubes can be selected and converted to SAP HANA-optimized objects.
All InfoCubes should be converted to fully benefit from the advantages provided by SAP BW powered by HANA. On the other hand, we do not recommend converting DataStore objects, as the advantages of the converted objects can be achieved without modifying the DSOs.
Optimization 14: SAP HANA-optimized DataStore objects
Background: All advantages of HANA-optimized DataStore objects are now available for standard DSOs too, which renders conversion unnecessary. While HANA-optimized DSOs will still be supported in the future, we do not recommend converting DSOs; rather, reconvert any existing HANA-optimized DSOs back to classic objects. Starting with BW 7.30 SP10 (BW 7.31 SP09, BW 7.40 SP04), converting classic DSOs to HANA-optimized DSOs is no longer possible.
SAP HANA-optimized DataStore objects cannot be included in an SAP BW 3.x data flow. If you want to optimize a DataStore object that is part of a 3.x data flow, you first have to migrate the data flow itself.
Furthermore, an SAP HANA-optimized DataStore object cannot be populated directly with real-time data acquisition (RDA).
The 'unique records' property does not provide any performance gain. In addition, the uniqueness check is not performed in BW at all; instead, uniqueness is checked by an SQL statement (DBMS exit).
Never Generate SIDs: with this option, SIDs are never generated. It is useful for DSOs that are used only for further processing into other DSOs or InfoCubes, as it is not possible to run a query directly on this kind of DSO.
11. Optimization 15: SAP HANA-optimized InfoCubes. With SAP HANA-optimized InfoCubes, the star schema is transferred to a flat structure, which means the dimension tables are no longer physically available. Since no dimension IDs have to be created for SAP HANA-optimized InfoCubes, the loading process is accelerated, and the faster insertion of data means it is available for reporting earlier.
Optimization 16 : Analytic Indexes - Analytic indexes can be created in the APD (transaction RSANWB) or they can be an
SAP HANA model published in the SAP BW system. If you want to use SAP BW OLAP functions to report on SAP HANA
analytic or calculation views, you can publish these SAP HANA models to the SAP BW system (transaction
RSDD_HM_PUBLISH).
Optimization 17: MultiProvider Queries
For MultiProvider queries based on SAP HANA, the standard setting for the "operations in BWA" query property (transaction RSRT) is "3 Standard". However, if the MultiProvider spans a mixed landscape (both SAP HANA-optimized and non-converted InfoProviders underneath), performance problems might occur.
Recommendation: If you are running queries on top of a MultiProvider containing SAP HANA-optimized InfoProviders as well as standard InfoProviders, either convert all InfoProviders to SAP HANA-optimized, or explicitly set the "operations in BWA" property to mode "3 Standard".
Last but not least:
Always remember to test and to take a full system backup before implementing any changes in a productive environment.
12. Performance testing – Query
[Chart: impact on current run times using Scale-Out BW on HANA — number of queries per runtime bucket (less than 10 s, 10 s to 30 s, 30 s to 60 s, more than 60 s) before and after the move.]
[Chart: average query runtime (Avg Before vs. Avg After) per query type, before and after the BW on HANA move.]
13. Data-Load results: 'Customer' 12 TB BW on HANA PoC

Application     | Impacting / long-running | Number of loads | Improvement
A2A             | Long running load        | 1 | 78%
CL SCM          | Impacting load           | 1 | 95%
CORE 1          | Long running load        | 1 | 15%
Master data     | Long running load        | 1 | 87%
One PI          | Long running load        | 1 | 91%
PDS             | Impacting load           | 1 | 98%
POS             | Impacting load           | 1 | 89%
QXP             | Impacting load           | 2 | 91%
SCM Dashboards  | Impacting load           | 3 | 81%
SMART - SRM     | Impacting load           | 3 | 57%
SMART - VBM     | Impacting load           | 1 | 88%
SMART - VBM     | Long running load        | 2 | 66%
VIPP LI         | Impacting load           | 3 | 64%
VIPP PH         | Impacting load           | 2 | 45%
VIPP PH         | Long running load        | 4 | 71%
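As a quick way to digest these PoC results, a short script can compute the total number of measured loads and the load-weighted average improvement. The figures are copied from the results above; weighting by number of loads is our own choice of summary, not part of the PoC report.

```python
# Each tuple: (application, load type, number of loads, improvement %).
results = [
    ("A2A", "long", 1, 78), ("CL SCM", "impacting", 1, 95),
    ("CORE 1", "long", 1, 15), ("Master data", "long", 1, 87),
    ("One PI", "long", 1, 91), ("PDS", "impacting", 1, 98),
    ("POS", "impacting", 1, 89), ("QXP", "impacting", 2, 91),
    ("SCM Dashboards", "impacting", 3, 81), ("SMART - SRM", "impacting", 3, 57),
    ("SMART - VBM", "impacting", 1, 88), ("SMART - VBM", "long", 2, 66),
    ("VIPP LI", "impacting", 3, 64), ("VIPP PH", "impacting", 2, 45),
    ("VIPP PH", "long", 4, 71),
]

total_loads = sum(n for _, _, n, _ in results)
weighted_avg = sum(n * pct for _, _, n, pct in results) / total_loads

print(total_loads)               # 27
print(round(weighted_avg, 1))    # 71.7
```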