Azure Synapse Analytics is Azure SQL Data Warehouse evolved: a limitless analytics service that brings together enterprise data warehousing and Big Data analytics into a single service. It gives you the freedom to query data on your terms, using either serverless on-demand or provisioned resources, at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate business intelligence and machine learning needs. This is a huge deck with lots of screenshots so you can see exactly how it works.
Want to see a high-level overview of the products in the Microsoft data platform portfolio in Azure? I’ll cover products in the categories of OLTP, OLAP, data warehouse, storage, data transport, data prep, data lake, IaaS, PaaS, SMP/MPP, NoSQL, Hadoop, open source, reporting, machine learning, and AI. It’s a lot to digest but I’ll categorize the products and discuss their use cases to help you narrow down the best products for the solution you want to build.
Amazon Redshift Deep Dive - Serverless, Streaming, ML, Auto Copy (New feature... – Amazon Web Services Korea
Join this session for a deep dive into the new features of Amazon Redshift. With details and demos covering Amazon Data Sharing, Amazon Redshift Serverless, Redshift Streaming, Redshift ML, auto copy, and more, it is well suited to anyone who wants to learn what is new in Amazon Redshift.
The new Microsoft Azure SQL Data Warehouse (SQL DW) is an elastic data-warehouse-as-a-service and a Massively Parallel Processing (MPP) solution for "big data" with true enterprise-class features. The SQL DW service is built for data warehouse workloads from a few hundred gigabytes to petabytes of data, with distinctive features such as disaggregated compute and storage that let customers scale the service to match their needs. In this presentation, we take an in-depth look at implementing a SQL DW, elastic scale (grow, shrink, and pause), and hybrid data clouds with Hadoop integration via PolyBase, which enables a true SQL experience across structured and unstructured data.
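As a rough sketch of the PolyBase pattern the session demonstrates, the T-SQL below (composed here as Python strings) defines an external data source, a file format, and an external table over files in Blob Storage. All object names, columns, and storage locations are hypothetical, not taken from the deck; only the DDL keywords follow the documented PolyBase syntax.

```python
# Illustrative PolyBase DDL for querying files in Azure Blob Storage
# alongside relational warehouse tables. Names are made up for this sketch.

# External data source pointing at a storage container (hypothetical location).
create_data_source = """
CREATE EXTERNAL DATA SOURCE WebLogsSource
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://logs@mystorageaccount.blob.core.windows.net');
"""

# File format describing the delimited text files.
create_file_format = """
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));
"""

# External table exposing the files as a queryable, schema-on-read table.
create_external_table = """
CREATE EXTERNAL TABLE dbo.weblogs_ext (
    url NVARCHAR(400),
    hits INT
)
WITH (LOCATION = '/2024/', DATA_SOURCE = WebLogsSource, FILE_FORMAT = CsvFormat);
"""

# A single T-SQL query can then read external (file-backed) data directly.
query = "SELECT TOP 10 url, hits FROM dbo.weblogs_ext ORDER BY hits DESC;"

for stmt in (create_data_source, create_file_format, create_external_table, query):
    print(stmt.strip())
```

Once the external table exists, it can also be joined to ordinary warehouse tables in the same statement, which is what gives the "true SQL experience across structured and unstructured data" described above.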
Learn to Use Databricks for Data Science – Databricks
Data scientists face numerous challenges throughout the data science workflow that hinder productivity. As organizations continue to become more data-driven, a collaborative environment is more critical than ever: one that provides easier access and visibility into the data, reports and dashboards built against the data, reproducibility, and insights uncovered within the data. Join us to hear how Databricks’ open and collaborative platform simplifies data science by enabling you to run all types of analytics workloads, from data preparation to exploratory analysis and predictive analytics, at scale — all on one unified platform.
Organizations are struggling to manually classify and inventory distributed, heterogeneous data assets in order to deliver value. However, the new Azure service for enterprises, Azure Synapse Analytics, is poised to help organizations fill the gap between data warehouses and data lakes.
Microsoft Azure is the only hybrid cloud to help you migrate your apps, data, and infrastructure with cost-effective and flexible paths. At this event you’ll learn how thousands of customers have migrated to Azure, at their own pace and with high confidence by using a reliable methodology, flexible and powerful tools, and proven partner expertise. Come to this event to learn how Azure can help you save—before, during, and after migration, and how it offers unmatched value during every stage of your cloud migration journey. Learn about assessments, migration offers, and cost management tools to help you migrate with confidence.
Architect’s Open-Source Guide for a Data Mesh Architecture – Databricks
Data Mesh is an innovative concept addressing many data challenges from an architectural, cultural, and organizational perspective. But is the world ready to implement Data Mesh?
In this session, we will review the importance of core Data Mesh principles, what they can offer, and when it is a good idea to try a Data Mesh architecture. We will discuss common challenges with implementation of Data Mesh systems and focus on the role of open-source projects for it. Projects like Apache Spark can play a key part in standardized infrastructure platform implementation of Data Mesh. We will examine the landscape of useful data engineering open-source projects to utilize in several areas of a Data Mesh system in practice, along with an architectural example. We will touch on what work (culture, tools, mindset) needs to be done to ensure Data Mesh is more accessible for engineers in the industry.
The audience will leave with a good understanding of the benefits of Data Mesh architecture, common challenges, and the role of Apache Spark and other open-source projects for its implementation in real systems.
This session is targeted at architects, decision-makers, data engineers, and system designers.
Azure SQL Database (SQL DB) is a database-as-a-service (DBaaS) that provides nearly full T-SQL compatibility so you can gain tons of benefits for new databases or by moving your existing databases to the cloud. Those benefits include provisioning in minutes, built-in high availability and disaster recovery, predictable performance levels, instant scaling, and reduced overhead. And gone will be the days of getting a call at 3am because of a hardware failure. If you want to make your life easier, this is the presentation for you.
by Darin Briskman, Technical Evangelist, AWS
Database Freedom means being able to use the database engine that’s right for you as your needs evolve. Being locked into a specific technology can prevent you from achieving your mission. Fortunately, AWS Database Migration Service makes it easy to switch between different database engines. We’ll look at how to use the AWS Schema Conversion Tool with DMS to switch from a commercial database to open source. You’ll need a laptop with a Firefox or Chrome browser.
This is based on the following publications:
Azure Strategy and Implementation Guide by Joachim Hafner, Simon Schwingel, Tyler Ayers, and Rolf Masuch. Introduction by Britt Johnston.
With reference to Enterprise Cloud Strategy, 2nd Edition by Eduardo Kassner and Barry Briggs.
All links to resources are at the end of the presentation.
Azure Role-Based Access Control, with a use case and an explanation of concepts such as Global Administrators, Role Assignments, Account Administrators, Azure Roles, and Custom Roles, for both Azure AD and Azure subscriptions.
Think of big data as all data, no matter what the volume, velocity, or variety. The simple truth is a traditional on-prem data warehouse will not handle big data. So what is Microsoft’s strategy for building a big data solution? And why is it best to have this solution in the cloud? That is what this presentation will cover. Be prepared to discover all the various Microsoft technologies and products, from collecting data, transforming it, and storing it, to visualizing it. My goal is to help you not only understand each product but understand how they all fit together, so you can be the hero who builds your company’s big data solution.
Internal Architecture of Amazon Aurora (Level 400) - Presenter: 정달영, APAC RDS Speci... – Amazon Web Services Korea
The Amazon Aurora database is a relational database built for the cloud. Aurora combines the performance and availability of commercial databases with the simplicity and cost-effectiveness of open-source databases. This session is aimed at advanced Aurora users and explores Aurora's internal architecture and performance optimization.
This deck covers the basics of Azure and gives you the knowledge necessary to take the AZ-900 Microsoft Azure Fundamentals exam.
It summarizes the Azure Fundamentals learning path on Microsoft Learn: https://docs.microsoft.com/en-us/learn/paths/azure-fundamentals/.
How Can I Build a Landing Zone & Extend my Operations into AWS to Support my ... – Amazon Web Services
AWS Landing Zone accelerates customer adoption of the cloud by providing a prescriptive set of instructions for deploying an AWS-recommended foundation of interrelated AWS accounts, networks, and core services. AWS Landing Zone provides prescriptive guidance and best practice templates that a customer can deploy into their initial AWS environment with confidence that it will grow to meet future business needs including security and regulatory compliance requirements. Learn More: https://aws.amazon.com/government-education/
Migrating On-Premises Workloads to Azure SQL Database – Parikshit Savjani
Azure SQL Database is a fully managed cloud database service with built-in intelligence, elastic scale, performance, reliability, and data protection that enables enterprises and ISVs to reduce their total cost of ownership and operational overhead. In this session, I will share real-world experience from successfully migrating existing SaaS applications and on-premises workloads for some of our tier 1 customers and ISV partners to the Azure SQL Database service. The session walks through planning, assessment, migration tools, and best practices drawn from the proven experience of migrating real-world applications to the Azure SQL Database service.
Azure SQL Database now has a Managed Instance, offering near-100% compatibility when lifting and shifting applications running on Microsoft SQL Server to Azure. Contact me for more information.
Supercharge your data analytics with BigQuery – Márton Kodok
Powering interactive data analysis requires massive infrastructure and the know-how to build a fast real-time computing system. BigQuery solves this problem by enabling super-fast, SQL-like queries against petabytes of data using the processing power of Google’s infrastructure. We will cover its core features: creating tables, columns, and views; working with partitions and clustering for cost optimization; streaming inserts; User Defined Functions; and several use cases for the everyday developer, such as funnel analytics, behavioral analytics, and exploring unstructured data.
The other part will be about BigQuery ML, which enables users to create and execute machine learning models in BigQuery using standard SQL queries. BigQuery ML democratizes machine learning by enabling SQL practitioners to build models using existing SQL tools and skills. BigQuery ML increases development speed by eliminating the need to move data.
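As an illustrative sketch of the BigQuery ML workflow described above, the standard SQL below (composed here as Python strings) trains and scores a model without moving data out of BigQuery. The dataset, table, column, and model names (demo.visits, demo.churn_model, churned) are hypothetical; the CREATE MODEL and ML.PREDICT syntax follows the BigQuery ML documentation.

```python
# Train a logistic-regression model directly in BigQuery with standard SQL;
# no data movement into a separate ML system is needed. Names are made up.
create_model = """
CREATE OR REPLACE MODEL `demo.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT pageviews, session_seconds, churned
FROM `demo.visits`;
"""

# Score new rows with the trained model, again in plain SQL.
predict = """
SELECT *
FROM ML.PREDICT(MODEL `demo.churn_model`,
                (SELECT pageviews, session_seconds FROM `demo.visits`));
"""

for stmt in (create_model, predict):
    print(stmt.strip())
```

This is the sense in which BigQuery ML "democratizes" machine learning: the entire train-and-predict loop stays inside the SQL tooling an analyst already uses.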
Triple C - Centralize, Cloudify and Consolidate Dozens of Oracle Databases (O... – Lucas Jellema
Dozens of Oracle databases: each health center location has one on its local server, with the same data model and the same set of applications. These databases have to be centralized, cloudified, and consolidated into one or as few databases as possible, to lower costs, ease operations, and enable innovation. Each location can access only its own data, applications do not have to be changed, and different locations can run different versions of applications and database objects. This is the story of a critical migration: the cloud-readiness analysis, the proofs of concept with the Oracle Database features VPD and Edition-Based Redefinition, the scalability investigation, the redesign of change management, rollout, and operational management processes, and the careful modernization of a 25-year-old platform on the latest database release and a shiny new, fully automated cloud platform.
This is also the story of an organization that had state-of-the-art systems in the mid-90s, and has those same systems today, no longer state of the art. They can keep the systems alive, but barely, and at increasing cost. In the fall of 2020, we started an investigation into the feasibility of bringing the hundreds of databases from the individual locations together, in a central location, in the cloud, and finally consolidated into one or at least as few database instances as possible. Using Oracle Database Virtual Private Database and Edition-Based Redefinition, a smart database connection configuration at each site, and a limited reimplementation of non-cloud/non-consolidated mechanisms (interaction with the local file system, for example), we have designed and proven a working new design and migration approach.
Why NBC Universal Migrated to MongoDB Atlas – Datavail
NBCUniversal, a worldwide mass media corporation, was looking for a more affordable and easier way to manage their database solution that hosts their extensive online digital assets. With Datavail’s assistance, NBCUniversal made the move from MongoDB 3.6 to MongoDB Atlas on AWS.
In this presentation, learn how making this move enabled the entertainment titan to reduce overhead and labor costs associated with managing its database environment.
In this session, you will learn the difference between Azure SQL Database, SQL Managed Instances, Elastic Pools, and SQL Virtual Machines. You will learn how to use tools to test migrations for issues before you start the migration process. You will learn how to successfully migrate your database schema and data to the cloud. Finally, you will learn how to determine which performance tier is a good starting point for your existing workload(s) and how to monitor your workload over time to make sure your users have a great experience while you save as much money as possible.
Developing Scalable Enterprise Serverless Applications on Azure with .NET – Callon Campbell
Over the years we have seen an accelerated shift to adopting serverless and cloud-native application architectures. Benefits of these architectures include decreased infrastructure costs and improved time to market; however, it's still important to consider high availability and resiliency in your application design. In this session, Callon will talk about developing scalable enterprise serverless applications on Azure with .NET, using the real-world example of a solution he developed that is running in production.
Azure Cognitive Search deck from Ignite the Tour Seoul, 2020-01-23. The hands-on Azure Cognitive Search lab materials, which use Form Recognizer and Azure Functions, are available at aka.ms/AIML10repo.
Microsoft Bot Framework with Azure Bot Service and Azure Cognitive Services. Bring rich AI capabilities into Microsoft's chatbot development environment: easily compose conversations, including natural language processing and answer composition, and deploy your service with ease.
Microsoft Azure Cognitive Services - vision demo app 'Intelligent Kiosk' user guide. Intelligent Kiosk is Microsoft's real-time image and video processing demo app. Use Azure Cognitive Services APIs such as Computer Vision, Face, and Custom Vision to train on photos and review the photo-analysis results.
Power BI portfolio overview. Power BI is a self-service BI visualization tool and report-publishing service. This deck introduces the Power BI capabilities supported both in the cloud and on-premises. https://powerbi.com/ https://portal.azure.com/
2018.11
Microsoft Azure Cognitive Services OCR with Computer Vision hands-on lab guide. A guide to extracting and processing text in images using Computer Vision. https://westus.dev.cognitive.microsoft.com/docs/services/ https://portal.azure.com/
2019.01
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024 – Tobias Schneck
As AI technology pushes into IT, I was wondering, as an "infrastructure container Kubernetes guy", how does this fancy AI technology get managed from an infrastructure and operations point of view? Is it possible to apply our beloved cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need in order to apply it to our own infrastructure and make it work from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working for real.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... – Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... – UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Connector Corner: Automate dynamic content and events by pushing a button – DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
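The approve/reject branching in the second workflow can be sketched as plain code. The function and action names below are hypothetical stand-ins, not the actual Integration Service connector calls; a real implementation would invoke the Slack and Jira/Zendesk connectors where this sketch returns strings.

```python
# Generic human-in-the-loop branching sketch. All names are hypothetical.

def route_approval(decision: str) -> str:
    """Map a reviewer's button click to the follow-up action."""
    if decision == "Approve":
        # Approved: raise a ticket for the marketing design team.
        return "create_jira_or_zendesk_ticket"
    elif decision == "Reject":
        # Rejected: alert colleagues via a Slack message.
        return "send_slack_alert"
    raise ValueError(f"unknown decision: {decision}")

print(route_approval("Approve"))  # create_jira_or_zendesk_ticket
print(route_approval("Reject"))   # send_slack_alert
```

The point of the pattern is that the workflow pauses on a human decision and then resumes down exactly one branch, rather than running unattended end to end.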
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Transcript: Selling digital books in 2024: Insights from industry leaders - T... – BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
UiPath Test Automation using UiPath Test Suite series, part 4 – DianaGray10
Welcome to part 4 of the UiPath Test Automation using UiPath Test Suite series. In this session, we will cover an overview of Test Manager along with the SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
GraphRAG is All You Need? LLM & Knowledge Graph – Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality – Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
2. PostgreSQL is more popular than ever
• Stack Overflow Developer Survey 2019: PostgreSQL ranks among the most loved and most wanted databases
• DB-Engines named PostgreSQL "DBMS of the Year"
• DB-Engines' ranking trend shows PostgreSQL's popularity steadily climbing
Sources:
https://insights.stackoverflow.com/survey/2019?utm_source=so-owned&utm_medium=blog&utm_campaign=dev-survey-2019&utm_content=launch-blog
https://db-engines.com/en/blog_post/76
https://db-engines.com/en/ranking_trend/system/PostgreSQL
3. More and more organizations are shifting open source workloads to the cloud to benefit from key advantages:
• Improved manageability and security
• Improved performance and intelligence
• Global scalability
6. Build or migrate your workloads with confidence
Two deployment options: Single Server and Hyperscale (Citus) (NEW)
• Fully managed and secure
• High-performance scale-out with Hyperscale (Citus)
• Intelligent performance optimization
• Flexible and open
9. Built-in High Availability
• 99.99% SLA
• 3 copies of data for data reliability (Azure Storage)
• Scale compute up or down in seconds
• Scale storage instantaneously
• Diagram: an application in US West connects to the PostgreSQL server at server=server.postgresql.database.azure.com (port 5432); on failover a replacement PostgreSQL server is attached to the same storage, and the application simply retries against the same endpoint
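The failover flow above relies on the application simply retrying against the unchanged DNS endpoint. A minimal Python sketch of that client-side retry loop (the `connect` callable is a hypothetical stand-in; a real application would pass a driver call such as psycopg2's connect):

```python
import time

def connect_with_retry(connect, retries=5, base_delay=1.0):
    """Retry a connection attempt with exponential backoff.

    Failover replaces the node behind the same DNS name within ~30-45 s,
    so retrying against the unchanged endpoint is enough to reconnect.
    """
    for attempt in range(retries):
        try:
            return connect()
        except ConnectionError:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # 1 s, 2 s, 4 s, ...

# Stand-in connect function that fails twice, then succeeds:
attempts = []
def fake_connect():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("server is failing over")
    return "connected"

print(connect_with_retry(fake_connect, base_delay=0.01))  # → connected
```

Because no replica promotion or IP change is visible to the client, this retry loop is the only failover handling the application needs.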
10. Built-in High Availability, 99.99% SLA
High-availability cost comparison, PostgreSQL on Azure VM (IaaS) vs. Azure DB for PostgreSQL (PaaS): $285 vs. $132, and $285 vs. $262
11. Backup and Restore
• Built-in backups
• Choose locally redundant (LRS) or geo-redundant (GRS) backup storage
• Restore from geo-redundant backups for disaster recovery (RPO <= 1 hr.)
• 1x backup storage (equal to provisioned server storage) included
• Point-in-time restore (PITR) up to 35 days (min. 7 days)
13. Read replicas to scale out workloads
• Up to 5 read replicas per master server
• Replicas receive asynchronous updates from the master
• Offload read-heavy workloads (application dashboards, BI and analytics, reporting) to replicas
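A common way to use read replicas is client-side read/write splitting. A hypothetical Python sketch, assuming reads can be identified by a leading SELECT (the class and server names are illustrative, not an Azure API):

```python
from itertools import cycle

class ReadWriteRouter:
    """Send writes to the master and spread reads across read replicas.

    Replication is asynchronous, so replica reads may lag the master
    slightly -- acceptable for dashboards, BI/analytics, and reporting.
    """
    def __init__(self, master, replicas):
        self.master = master
        self._replicas = cycle(replicas)  # simple round-robin

    def route(self, sql: str) -> str:
        is_read = sql.lstrip().upper().startswith("SELECT")
        return next(self._replicas) if is_read else self.master

router = ReadWriteRouter("master", [f"replica{i}" for i in range(1, 6)])
print(router.route("SELECT count(*) FROM ads"))   # → replica1
print(router.route("UPDATE ads SET clicks = 0"))  # → master
print(router.route("SELECT * FROM campaigns"))    # → replica2
```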
14. Deployment options
• Single Server: best for a broad range of traditional transactional workloads
• Hyperscale (Citus): best for ultra-high performance and data needs beyond 100 GB
15. Scaled-out transactions
The application runs an ordinary transaction:
  BEGIN;
  UPDATE campaigns SET feedback = 'relevance' WHERE company_type = 'platinum';
  UPDATE ads SET feedback = 'relevance' WHERE company_type = 'platinum';
  COMMIT;
The coordinator node, which holds the metadata, fans the transaction out across the worker nodes (W1, W2, W3 … Wn), running a two-phase commit per shard:
  BEGIN … assign_distributed_transaction_id … UPDATE campaigns_2009 … COMMIT PREPARED …
  BEGIN … assign_distributed_transaction_id … UPDATE campaigns_2001 … COMMIT PREPARED …
  BEGIN … assign_distributed_transaction_id … UPDATE campaigns_2017 … COMMIT PREPARED …
Shard your Postgres database across multiple nodes to give your application more memory, compute, and disk storage. Easily add worker nodes to achieve horizontal scale.
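The COMMIT PREPARED fragments on this slide are PostgreSQL two-phase commit: the coordinator commits only once every worker has prepared. A toy Python sketch of that protocol (all class and method names here are hypothetical stand-ins, not a Citus API):

```python
class ShardWorker:
    """Toy stand-in for a worker node holding one shard of the transaction."""
    def __init__(self, name):
        self.name = name
        self.state = "idle"
    def prepare(self) -> bool:      # models PREPARE TRANSACTION on the shard
        self.state = "prepared"
        return True
    def commit_prepared(self):      # models COMMIT PREPARED
        self.state = "committed"
    def rollback_prepared(self):    # models ROLLBACK PREPARED
        self.state = "aborted"

def two_phase_commit(workers) -> bool:
    """Commit only if every worker prepares successfully; otherwise roll back."""
    prepared = [w for w in workers if w.prepare()]
    if len(prepared) == len(workers):
        for w in workers:
            w.commit_prepared()
        return True
    for w in prepared:
        w.rollback_prepared()
    return False

workers = [ShardWorker("W1"), ShardWorker("W2"), ShardWorker("W3")]
print(two_phase_commit(workers), [w.state for w in workers])
# → True ['committed', 'committed', 'committed']
```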
16. Co-located join
The application runs a join on the distribution column:
  SELECT count(*)
  FROM ads JOIN campaigns ON ads.company_id = campaigns.company_id
  WHERE ads.designer_name = 'Isaac'
  AND campaigns.company_id = 'Elly Co';
The coordinator node routes the query to the single worker that holds both matching shards:
  SELECT … FROM ads_1001, campaigns_2001 …
It is logical to place shards containing related rows of related tables together on the same nodes. Join queries between related rows can then reduce the amount of data sent over the network.
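Co-location works because both tables are distributed on company_id with the same shard count, so equal values hash to the same shard index. A simplified Python illustration (the real Citus placement uses hash ranges; NUM_SHARDS and shard_for are hypothetical):

```python
import hashlib

NUM_SHARDS = 32  # co-location requires the same shard count for both tables

def shard_for(distribution_value: str) -> int:
    """Deterministically map a distribution-column value to a shard index."""
    digest = hashlib.sha256(distribution_value.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# Rows of `ads` and `campaigns` sharing a company_id hash to the same shard
# index, so their shards sit on the same worker and the join stays local:
company = "Elly Co"
shard = shard_for(company)
print(f"ads_{shard} and campaigns_{shard} are co-located for {company!r}")
```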
17. Cloud Shard Rebalancer
• The shard rebalancer redistributes shards across old and new worker nodes for balanced data scale-out
• The shard rebalancer recommends a rebalance when shards can be placed more evenly
• The schema can be updated when table types and the scale-out strategy change; a schema change issued on the coordinator, such as
    ALTER TABLE campaigns ADD COLUMN company_type text;
  is propagated to every worker node (W1, W2, W3 … Wn) and its shards (campaigns_2001, campaigns_2009, campaigns_2017, …)
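The rebalancer's job can be pictured as evening out shard counts across workers. A greedy Python sketch (purely illustrative; the real rebalancer also weighs shard sizes and co-location groups):

```python
def rebalance(placement, new_nodes):
    """Greedy sketch: move shards from the most- to the least-loaded node
    until no node holds more than one shard above any other."""
    nodes = {n: list(shards) for n, shards in placement.items()}
    for n in new_nodes:
        nodes.setdefault(n, [])   # freshly added workers start empty
    while True:
        most = max(nodes, key=lambda n: len(nodes[n]))
        least = min(nodes, key=lambda n: len(nodes[n]))
        if len(nodes[most]) - len(nodes[least]) <= 1:
            return nodes
        nodes[least].append(nodes[most].pop())  # relocate one shard

before = {"W1": ["s1", "s2", "s3", "s4"], "W2": ["s5", "s6"]}
after = rebalance(before, new_nodes=["W3"])
print({n: len(s) for n, s in after.items()})  # → {'W1': 2, 'W2': 2, 'W3': 2}
```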
21. Protect data with multiple layers of security
Azure provides multiple layers of security to safeguard your data:
• Built-in encryption for data at rest and in motion
• Secure SSL connectivity
• Server firewall rules
• Virtual Network service endpoints
• Native authentication
• Threat detection
23. Security & Compliance
• Security built in, with native and Azure AD (AAD) authentication
• Control access with secure SSL, server firewall rules, and VNet service endpoints
• Built-in encryption for data and backups, in motion and at rest
• Protect your data with up-to-date security and compliance features using the Azure IP Advantage
• Leading compliance offerings: SOC 2 Type 2, ISO, CSA STAR Certification, PCI DSS Level 1, HIPAA, etc.
24. Monitoring and Alerting
• Built-in monitoring
• Database engine monitoring enabled by default
• Configurable alerts
• Automatic notifications
25. Server Logs
Built-in server logs for troubleshooting database errors or performance issues
• Set log_statement to 'all' to capture every statement when analyzing performance issues
• log_min_duration_statement sets the minimum execution time (in milliseconds) above which statements are logged
• Log files consume provisioned server storage
• Log files rotate every hour or at 100 MB, whichever comes first
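Once log_min_duration_statement is set, slow statements can be pulled out of the downloaded server logs. A small Python sketch, assuming the stock Postgres log line format (exact prefixes depend on your log_line_prefix setting):

```python
import re

# Postgres emits lines like the samples below when log_min_duration_statement
# is set; the regex targets the "duration: ... ms  statement: ..." portion.
DURATION_RE = re.compile(r"duration: (?P<ms>\d+(?:\.\d+)?) ms\s+statement: (?P<sql>.+)")

def slow_statements(log_text: str, threshold_ms: float = 1000.0):
    """Return (duration_ms, statement) pairs at or above the threshold, slowest first."""
    hits = []
    for line in log_text.splitlines():
        m = DURATION_RE.search(line)
        if m and float(m.group("ms")) >= threshold_ms:
            hits.append((float(m.group("ms")), m.group("sql")))
    return sorted(hits, reverse=True)

sample = """\
2019-10-01 09:00:01 UTC LOG:  duration: 42.119 ms  statement: SELECT 1
2019-10-01 09:00:07 UTC LOG:  duration: 1275.652 ms  statement: SELECT count(*) FROM ads
"""
print(slow_statements(sample))  # → [(1275.652, 'SELECT count(*) FROM ads')]
```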
27. Intelligent Performance
Built-in intelligence optimizes your database
within minutes, without the need to be an expert
• Query Store
• Query Performance Insights
• Performance Recommendations
34. Migrating a database
Pre-migration
• Discover: inventory database assets and discover the application stack
• Assess: assess workloads and fix recommendations
• Convert: convert the source schema to work in the target environment (only relevant for heterogeneous migrations)
Migration
• Migrate schema, data & objects: migrate the source schema, then migrate the source data to the target
• Data sync: sync your target schema and data with the source (only relevant for minimal-downtime migrations)
• Cutover: cut over from the source to the target environment (only relevant for minimal-downtime migrations)
Post-migration
• Remediate applications: iteratively make any necessary changes to your applications
• Run functional & performance tests: iteratively run functional and performance tests
• Optimize: based on the tests you performed, address any performance issues, and then retest to confirm the performance improvements
35. Ora2pg for assessment and migration
• The ora2pg tool migrates Oracle to Postgres
• ora2pg reads the Oracle catalog and creates the equivalent Postgres objects (tables, views, sequences, indexes), with unique, primary, foreign key and check constraints, without syntactic pitfalls
• If also used for data migration, ora2pg connects to Oracle and dumps data in a Postgres-compatible format (highly configurable; it can also connect to Postgres and migrate everything on the fly)
• Azure DMS is another data migration option
• Ora2pg provides a migration assessment report
• Ora2pg creates migration projects
• All triggers, functions, procedures and packages are exported and converted to PL/pgSQL
• More complicated procedures may need to be translated manually
• Oracle-specific code always needs to be rewritten:
  • External modules (DBMS_*, UTL_*, ...)
  • CONNECT BY (use a recursive CTE, WITH RECURSIVE)
  • OUTER JOIN (+) syntax
  • DECODE (use CASE)
  • Oracle Spatial exports to PostGIS
• Installation steps and a sample config: http://ora2pg.darold.net/
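As an example of the CONNECT BY rewrite mentioned above: the recursive-CTE form runs on Postgres (and, conveniently for a self-contained demo, on sqlite3 as well). The employees table and its data here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (id INTEGER, name TEXT, manager_id INTEGER);
INSERT INTO employees VALUES (1, 'Ada', NULL), (2, 'Ben', 1), (3, 'Cyd', 2);
""")

# Oracle:   SELECT name FROM employees
#           START WITH manager_id IS NULL CONNECT BY PRIOR id = manager_id;
# Postgres rewrite with a recursive CTE (runs unchanged on sqlite3 here):
rows = conn.execute("""
WITH RECURSIVE org AS (
    SELECT id, name FROM employees WHERE manager_id IS NULL
    UNION ALL
    SELECT e.id, e.name FROM employees e JOIN org ON e.manager_id = org.id
)
SELECT name FROM org;
""").fetchall()
print([r[0] for r in rows])  # → ['Ada', 'Ben', 'Cyd']
```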
36. Online Migration with Azure Database Migration Service
On-premises databases are assessed and migrated to Microsoft Azure through the Azure Database Migration Service.
Database Migration Guide: https://datamigration.microsoft.com/
39. Demo: Oracle migration to Azure Database for PostgreSQL with Azure Database Migration Service
https://docs.microsoft.com/ko-kr/azure/dms/resource-network-topologies
42. Database Migration Assessment Offering
Microsoft, together with specialized data partners, offers pre-assessment analysis and consulting to help you establish an effective database migration strategy.
Phases: Assessment → Schema conversion / application conversion → Data migration → Verification / test service on Azure
• Survey the data environment through interviews and collection of AS-IS system information
• Define the assessment method and scope
• Run the assessment tools and analyze the results
• Report and consult on findings: TCO/ROI analysis, configuration plans for the migration and transition, table design changes, data standardization and data quality review, application and metadata impact analysis, per-DBMS characteristics
• Final review and report
Delivered in partnership with Microsoft data partners
43. “After migrating to Citus, we can onboard Vonto customers 20X faster, in 2 minutes vs. the 40+ minutes it used to take. And with the launch of Hyperscale (Citus) on Azure Database for PostgreSQL, we are excited to see what we can build next on Azure.”
– Vonto by ASB
44. Azure Database for PostgreSQL is fully managed, community PostgreSQL
• Global reach
• Security
• Scale up & out
• Built-in HA
• Compliance
• Intelligent performance
• Easy ecosystem integration
• Extension support: JSONB, full text search, geospatial support, rich indexing
Thank you.
Welcome, and thank you for joining.
Self-introduction.
This is the second DMIAD workshop; the previous webinar covered SQL Server to Azure SQL DB migration.
Today's topic.
Audience: current Oracle and PostgreSQL users moving to the cloud.
What today's webinar covers:
Cloud migration options for Oracle/PostgreSQL users (briefly).
(Today's main topic) An introduction to the PaaS Azure Database for PostgreSQL, and
how much simpler migration becomes when you use Azure DMS (Database Migration Service).
I will switch screens along the way and walk through live demos.
PostgreSQL is widely used these days; it keeps being selected.
Its rich feature set makes it enterprise-ready.
In particular, Oracle users tasked with cutting costs and adopting open source choose it most often.
It has many similarities to Oracle, so it takes the least effort when migrating a database.
PostgreSQL has gained credibility of enterprise ready and feature-rich database
Reduce total cost of ownership (TCO)
Shifting to adopt open source
Similarities between Oracle and PostgreSQL to ease effort of migration
Sources:
https://insights.stackoverflow.com/survey/2019?utm_source=so-owned&utm_medium=blog&utm_campaign=dev-survey-2019&utm_content=launch-blog
https://db-engines.com/en/blog_post/76
https://db-engines.com/en/ranking_trend/system/PostgreSQL
Running PostgreSQL in the cloud
brings advantages in manageability, security, performance and scale, and global reach.
Options when moving to Microsoft's Azure cloud:
Oracle users: install Oracle directly on an Azure VM.
Oracle users and existing open-source Postgres users:
open-source PostgreSQL on an Azure VM (IaaS), or
Azure Database for PostgreSQL (PaaS).
Note:
Postgres on a VM is the IaaS option.
Advantages it has over IaaS because it is a PaaS service.
Key advantages:
1. High availability, performance, and scaling are fully managed by Microsoft; storage and compute can be scaled independently.
2. Intelligent performance optimization.
3. Includes the extensions Microsoft developers have contributed to community PostgreSQL;
you get the unmodified open-source engine, up to the latest version.
Two deployment options when creating the resource.
Microsoft acquired Citus, a company founded by PostgreSQL developers,
so you can use Hyperscale, a high-performance server group that supports sharding.
Overview: Microsoft has numerous database services, from open source to SQL, all with the built-in intelligence, flexibility, and trust you expect from an Azure PaaS offering.
Talking Points:
We’re uniquely positioned to address the complexity our customers face because we see ourselves as a data platform company, not an engine company
Our relational cloud assets are all built on the same platform
Our aspiration is that platform innovations are shared across engines, so customers can leverage the features that make them more productive in the engine of their choice.
Our strategy is built upon pillars that uniquely differentiate us in the market. We provide scalable, performant, secure and intelligent relational databases for:
Born in the cloud applications and
Existing applications which are either being modernized on-premises or moving to the cloud.
Let’s walk through the pillars:
Hybrid – we’re providing a frictionless migration experience for existing apps, whether moving to a fully-managed database as a service or transitioning over time with a hybrid strategy.
Enterprise Scale and Performance – we’re helping customers manage their resources and build for the future with dynamic scaling up to 100TB.
Security & Compliance – Security management can be complex, particularly when working across entire data estates. We are simplifying security with a consistent and comprehensive policy-based approach across the platform
Built-in Intelligence – we’ve been enabling customers to be more productive and gather new insights with adaptive and ML-based features for a couple years now. We gather telemetry across millions of databases to fine tune our algorithms to do more and help our customers be more productive than ever.
Choice - Our platform is under-girded by choice that guides customers to the right solution for their workloads at the best TCO.
Customers can exercise choice and flexibility across the relational database platform, and be assured that they can maximize productivity, efficiency and ROI for any of their workloads.
--------------------------------
Choice of hosting – on-premises, hybrid, VM or fully-managed PaaS
Choice of engine – SQL, PostgreSQL, MySQL, MariaDB
Choice of deployment options – instance and database scoped, compute and IO-intensive
Choice of resources – wide spectrum of compute and storage
Choice of languages - Python, Java, Node.js, PHP, Ruby, and .NET
From here on:
1. Features:
high availability and business continuity,
performance and scalability,
security and management.
2. Then I will demonstrate the migration process,
3. and introduce customer cases.
Azure Database for PostgreSQL
high availability and business continuity:
backed by a 99.99% SLA.
Like Azure SQL DB, it is built on Service Fabric-based technology,
with no complex failover configuration and no expensive high-availability solutions required.
Azure Database Services is built upon the SQL Database platform, which is a Service Fabric-based PaaS solution. As such, rather than having to boot up an entire OS stack to bring up a new server (as in IaaS), Azure Database Services runs the database engine in a custom container technology which you can think of as a secure “pico process”. The time it takes to bring up a new server in this custom container is a matter of seconds. This means that in the event that your database server has hung or “gone away”, the Azure Database Management Service detects the failure, brings up a new server in this lightweight container, maps the new IP address to the DNS name of your instance, and maps it to your storage. This entire process takes between 30-45 seconds. This is built into all performance tiers of Azure Database Services, and since a replica instance isn’t needed, there is no additional cost to customers. In contrast, an AWS RDS server deployed in a single AZ would take minutes to start – and that does not account for how you would detect the failure and switch over.
Scale compute up or down in seconds
Scale storage instantaneously
High availability without the need for replicas
Setting-up high availability for database servers is hard, requiring either custom code to manage detection/failover, or expensive 3rd party solutions to make it a bit easier.
Compute redundancy:
If a node-level interruption occurs, the database server automatically creates a new node and attaches data storage to the new node. Any active connections are dropped and any inflight transactions are not committed.
Data reliability:
3 copies of data for data reliability
Compared to Oracle, savings of up to 95% are possible.
Even compared with IaaS within Azure,
you benefit on cost.
Other cloud vendors' PaaS databases are VM-based.
Customers who are IaaS customers in Azure today need to understand that the specs of a VM do not equate directly to the specs of Azure Database Services. The reason is two-fold:
Customers do not size a VM based on their typical workload, rather they size wisely to handle workload spikes so as not to impact performance of the application. With Azure Database Services, the ability to scale performance on the fly means they SHOULD size their instance based on typical workload needs and then elastically scale when necessary. This lowers costs.
A VM has to support the performance requirements for both the database engine as well as the host OS. With Azure Database Services, the SQLPAL isolated pico-process (mini-OS) significantly lowers the HW needs compared to a VM.
So in this example, if I have a D4S_V3 VM with 4 vCPUs and 32GB of SSD, when I choose an Azure Database for MySQL the customer can likely choose a smaller size of 2 vCores with the same storage (and in fact, they would get more storage as the storage for Azure Database Services is dedicated to the database, logs, etc. – no host OS footprint here). The customer can then profile their workload and determine if it meets their performance requirements, and if it does not, they can easily scale-up to the next tier.
More importantly, in an IaaS VM implementation, if you want to achieve HA you need a second server (replica). This will double their costs, in this case from $143/mo. to $286/mo. With Azure Database Services with built-in HA, there are no additional replicas needed and as such – there is no cost impact. So to sum up this example, a HA IaaS MySQL VM costs $286/mo., whereas Azure Database for MySQL would cost $132/mo. That’s a saving of $154/mo.
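The arithmetic in the example above, spelled out (figures are the ones quoted in these notes, not current prices):

```python
# Figures as quoted in the notes above; illustration only.
vm_monthly = 143            # one database VM per month
iaas_ha = 2 * vm_monthly    # IaaS HA needs a second replica VM
paas_ha = 132               # managed tier with built-in HA, no extra replica
saving = iaas_ha - paas_ha

print(f"IaaS HA ${iaas_ha}/mo vs PaaS ${paas_ha}/mo: save ${saving}/mo")
# → IaaS HA $286/mo vs PaaS $132/mo: save $154/mo
```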
Locally redundant storage / geo-redundant storage
Geo-restore
Point-in-time restore
Reference:
https://docs.microsoft.com/en-us/azure/postgresql/concepts-business-continuity
https://docs.microsoft.com/en-us/azure/postgresql/concepts-backup
All backups are encrypted using AES 256-bit encryption.
Performance and scalability
Read replicas help improve performance and scale of read-intensive workloads such as BI and analytics
Consider the read replica features in scenarios when delays in synching data between the master and replicas are acceptable
Create a replica in a different Azure region from the master for a disaster recovery plan, where a replica replaces the master in cases of regional disasters
Data storage on replica servers grows automatically without impacting workloads
Reference:
https://docs.microsoft.com/en-us/azure/postgresql/concepts-read-replicas
https://docs.microsoft.com/en-us/azure/postgresql/howto-read-replicas-portal
Single Server vs. ultra-high-performance server group (Hyperscale) deployments.
For Single Server, although it varies by tier:
up to 64 vCores, with up to 10 GB of memory per vCore.
Storage up to 16 TB, with up to 20,000 IOPS supported.
Azure premium storage (General Purpose and Memory Optimized tiers).
The Hyperscale architecture:
a coordinator node plus worker nodes, supporting sharding.
Horizontal scale-out to as many as 20 worker nodes.
Queries are parallelized, so they respond faster.
Hyperscale has no separate tiers:
up to 64 vCores, with 8 GiB of memory per vCore.
It uses SSD storage,
with a maximum of 2 TiB of storage per node;
2 TiB across 20 worker nodes gives
40 TiB of total storage and 122,960 IOPS.
Aggregating data before transactions avoids rewriting each row and can save write overhead and table bloat
Bulk aggregation avoids concurrency issues
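The point about aggregating before writing can be sketched with a simple count: batching events per key turns one write per event into one write per distinct key (the event data here is invented for illustration):

```python
from collections import Counter

# Click events arriving for individual ads:
events = ["ad1", "ad2", "ad1", "ad1", "ad2", "ad3"]

# Naive: one UPDATE per event rewrites a row each time (6 writes here),
# and concurrent updates to the same row contend with each other.
per_event_writes = len(events)

# Bulk: aggregate first, then issue one UPDATE per distinct key.
totals = Counter(events)      # Counter({'ad1': 3, 'ad2': 2, 'ad3': 1})
bulk_writes = len(totals)     # 3 writes instead of 6

print(per_event_writes, bulk_writes, dict(totals))
# → 6 3 {'ad1': 3, 'ad2': 2, 'ad3': 1}
```

Fewer, larger writes also mean less row rewriting and less table bloat, which is the concern the note raises.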
Related rows of related tables are placed on the same node,
so joining related rows avoids unnecessary data movement over the network.
When nodes are added, shards are redistributed so that old and new worker nodes stay balanced.
Beyond scaled-out transactions and shard rebalancing,
Azure Database for PostgreSQL Hyperscale
supports a wide range of further features: partitioning, parallel index builds, savepoints, window functions, and more.
Transactional support
Savepoint support
Multi-value inserts
PostgreSQL10, PostgreSQL11
Window functions
Online shard rebalancing
Scaled-out transactions
Distinct on/count distinct
CTE support
Native PostgreSQL partitioning
Enhanced SQL support
TopN
Citus MX (beta)
Rename of scale-out tables
Parallel index
Parallel vacuum
Scaled-out backups
Hyperscale (Citus) Cloud Shard Rebalancer
Shard rebalancer redistributes shards across old and new worker nodes for balanced data scale-out
Shard rebalancer will recommend rebalance when shards can be placed more evenly
For more control, use tenant isolation to easily allocate dedicated nodes to specific tenants with greater needs
https://docs.microsoft.com/ko-kr/azure/postgresql/concepts-pricing-tiers
https://docs.microsoft.com/ko-kr/azure/postgresql/concepts-hyperscale-configuration-options
Single Server, depending on the tier:
up to 64 vCores, with up to 10 GB of memory per vCore,
Azure premium storage with a maximum size of 16 TB,
and up to 20,000 IOPS.
Hyperscale has no separate tiers:
a coordinator node plus up to 20 worker nodes,
up to 64 vCores with 8 GiB of memory per vCore,
SSD storage with a maximum of 2 TiB per node,
2 TiB across 20 worker nodes, with more than 12,000 IOPS.
Demo: creating the Azure Database for PostgreSQL Single Server and Hyperscale deployments I just described.
Switch to the PC screen.
Back to the slides to continue the feature overview.
Security and management.
Multiple layers of security features are supported.
Existing authentication methods plus AAD: access control with no additional configuration.
SSL connections, firewall rules, and VNets are naturally supported,
protecting storage (encryption) and the network.
AWS Directory Services integration requires additional coding
AWS Identity & Access Management requires creation of additional users
GuardDuty integration requires additional configuration
An optional add-on: Advanced Threat Protection.
On detection, administrators are alerted.
Azure PostgreSQL threat detection provides an additional layer of security intelligence which detects suspicious activities going on in the database.
A simple way to enable threat detection using the Azure portal, requiring no modifications to existing application code or client applications.
A proprietary set of algorithms works around the clock to learn, profile and detect suspicious database activities, indicating potentially harmful attempts to access or exploit data in the database. Examples:
Someone has logged in from an unusual location: a change in the access pattern from an unusual geographical location.
An unfamiliar principal successfully logged in: a change in the access pattern using an unusual SQL user.
Someone is attempting to brute-force SQL credentials: an abnormally high number of failed logins with different credentials.
Someone has logged in from a potentially harmful application.
It provides actionable alerts over email and in the Azure portal with details of the suspicious activity and recommendations on how to further investigate and mitigate the threat.
----------------------------------------------------------
We are embedding machine learning directly into our cloud services to deliver intelligent data services that keep your data safe. For example, consider the security features in Azure SQL DB
Our ML systems analyze and learn from over 700 TB data/per day to ensure we keep your applications highly efficient and data safe – through automatic auditing and threat detection. With active Threat Detection, the service can identify anomalies in your workload and alert you of a potential attack like SQL injection. The service does the hard work so you don’t have to – so you can focus on the business problems you’re solving and creating breakthrough applications.
---------------------------------------------------------------------------------------------------
SQL Threat Detection allows you to detect suspicious activities indicating a possible malicious intent to access, breach or exploit data in the database. SQL Database Threat Detection runs multiple sets of algorithms which detect potential vulnerabilities and SQL injection attacks, as well as anomalous database access patterns (such as access from an unusual location or by an unfamiliar principal). Security officers or other designated administrators get email notification once a threat is detected on the database. Each notification provides details of the suspicious activity and recommends how to further investigate and mitigate the threat.
“Azure SQL Database Threat Detection is now generally available. Threat Detection leverages machine learning to provide an additional layer of security built into the SQL Database service, enabling SQL Database customers to protect their databases within minutes without needing to be an expert in database security. It works around the clock to profile and alert you of anomalous activities on your databases. Threat detection alerts can be viewed from Azure Security Center and provide details of suspicious activity and recommend action on how to investigate and mitigate the threat. To learn more about Threat Detection, including pricing, visit the Azure blog.”
We hold a number of security-related certifications;
those who must comply with corporate regulations should take note.
Protecting your innovation in the cloud: Reduce risk, innovate with confidence, and operate with freedom in the cloud. Azure IP Advantage provides the industry’s most comprehensive protection against intellectual property (IP) risks.
-Best-in-industry intellectual property protection
-Build confidently with uncapped indemnification
-Deter and defend lawsuits with patent pick
-Get broad protection with a springing license
Based on customer demand from various industry verticals
SOC2 - Service Organization Controls standards for operational security
ISO 27001 - Information Security Management Standards
ISO 27018 - Code of Practice for Protecting Personal Data in the Cloud
CSA STAR - Cloud Security Alliance: Security, Trust & Assurance Registry (STAR)
PCI DSS Level 1 - Payment Card Industry (PCI) Data Security Standard (DSS) Level 1 Service Provider
HIPAA / HITECH Act - Health Insurance Portability and Accountability Act / Health Information Technology for Economic and Clinical Health Act
ISO 27017:2015 - Code of Practice for Information Security Controls
ISO 9001:2015 Quality Management Systems Standards
ISO 22301:2012 Business Continuity Management Standard
ISO/IEC 20000-1:2011 Information Technology Service Management
Monitoring is built in, so a monitoring view is available by default.
User alerts and automatic notifications can be configured.
Server logging is built in and simple to configure.
Logs that used to require complicated management in a self-hosted PostgreSQL database
can be managed easily from the Azure portal.
Very useful performance-optimization features are built in; the following three:
Query Store:
immediately find the longest-running queries, the queries using the most resources, and so on.
Query Performance Insight:
view them directly in the built-in monitoring screens.
Performance Recommendations, such as suggestions to create or drop indexes,
help you use your working hours effectively.
Take advantage of many parts of the cloud ecosystem.
For application developers using PostgreSQL, Azure provides integration with popular frameworks like Drupal, Django, etc., and with popular languages like Python.
We have done work to make it simple for application developers to provision both applications and PostgreSQL, with built-in connectivity to Azure App Service and other services within Azure.
We have several customers building interesting solutions (which I will talk about later) that leverage advanced analytics and AI scenarios. PostgreSQL has deep integration with intelligent Azure services like the Cortana APIs.
Our customers are building solutions to reach their customer base worldwide. PostgreSQL does and will take advantage of Azure's global reach of 50+ regions.
We also have several customers wanting to migrate off of on-premises/private clouds to Azure. The Azure Database Migration Service provides online migration capabilities to Azure PostgreSQL without the application taking any downtime.
Span with Azure’s availability in more regions worldwide than any other cloud provider
Power BI
Azure Functions
To show where to find these features,
we will look together at the Azure DB for PostgreSQL instance we created
and demo connecting to it with a query tool.
Switch screens.
http://127.0.0.1:52934/browser/
SELECT version();
SELECT * FROM pg_tables;
Assess and analyze the AS-IS source database.
Convert to fit the target database.
Migrate.
Optimize.
Offline migration, or,
with no service interruption and no downtime,
online migration.
DMS handles moving your data into an Azure database.
It is a PaaS service.
Final demo:
Oracle to Azure PostgreSQL online migration.
The prepared source environment is Oracle Express Edition installed on a Windows virtual machine;
Linux users can follow the same steps once the environment is configured.
Switch screens.
If you need pre-assessment analysis and consulting to establish a database migration strategy,
Microsoft can pair you with specialized data partners.
From AS-IS database assessment and analysis,
through schema and application conversion and data migration,
to verification testing and production rollout,
you can carry out these database migration steps together with DataSolution and Metanet T Platform (MTP).
Finally, the customer cases.
Microsoft's own Windows team
stores more than 1.5 PB of data for real-time analytics;
they have shared publicly that 95% of queries respond in under 4 seconds
and 75% respond in under 200 ms.
I will show the details in a video in a moment.
That is everything I prepared for today.
For current Oracle and PostgreSQL users,
I introduced the advantages of Azure PostgreSQL and how to realize them,
and demonstrated first-hand how easy online migration is with Azure DMS (Database Migration Service).
All technical documentation, including pricing, is publicly available; please see the links below.
This concludes the webinar. Thank you very much for listening.
https://www.youtube.com/watch?v=TBZdOMv8a6Q
OCI Enterprise does not support partitioning (Azure PostgreSQL supports full partitioning from version 11.5 onward).
On AWS, zone redundancy requires two or more VMs (more than 2x the cost).
AWS offers a 99.95% SLA (Azure: 99.99%).
Because DMS supports online migration (Oracle 10, 11, and 12 are supported), minimal-downtime migration is possible.