This document provides an overview of how to successfully migrate Oracle workloads to Microsoft Azure. It begins with an introduction of the presenter and their experience. It then discusses why customers might want to migrate to the cloud and the different Azure database options available. The bulk of the document outlines the key steps in planning and executing an Oracle workload migration to Azure, including sizing, deployment, monitoring, backup strategies, and ensuring high availability. It emphasizes adapting architectures for the cloud rather than directly porting on-premises systems. The document concludes with recommendations around automation, education resources, and references for Oracle-Azure configurations.
This is the second session of the learning pathway at PASS Summit 2019, which is still a stand-alone session teaching you how to write proper Linux Bash scripts.
High concurrency, Low latency analytics using Spark/Kudu - Chris George
With the right combination of open source projects, you can have high-concurrency, low-latency Spark jobs for doing data analysis. We'll show both REST and JDBC access to data from a persistent Spark context, and then show how the combination of Spark Job Server, Spark Thrift Server and Apache Kudu can create a scalable backend for low latency analytics.
SUSE, Hadoop and Big Data Update - Stephen Mogg, SUSE UK (huguk)
This session will give you an update on what SUSE is up to in the Big Data arena. We will take a brief look at SUSE Linux Enterprise Server and why it makes the perfect foundation for your Hadoop Deployment.
This one-hour presentation covers the tools and techniques for migrating SQL Server databases and data to Azure SQL DB or SQL Server on VM. Includes SSMA, DMA, DMS, and more.
Low latency high throughput streaming using Apache Apex and Apache Kudu - DataWorks Summit
True streaming is fast becoming a necessity for many business use cases. On the other hand, data set sizes and volumes are also growing exponentially, compounding the complexity of data processing pipelines. There exists a need for true low latency streaming coupled with very high throughput data processing. Apache Apex as a low latency and high throughput data processing framework and Apache Kudu as a high throughput store form a nice combination which solves this pattern very efficiently.
This session will walk through a use case which involves writing a high throughput stream using Apache Kafka, Apache Apex and Apache Kudu. The session will start with a general overview of Apache Apex and the capabilities of Apex that form the foundation for a low latency and high throughput engine, with Apache Kafka being an example input source of streams. Subsequently, we walk through Kudu integration with Apex via various patterns like end-to-end exactly-once processing, selective column writes and timestamp propagation for out-of-band data. The session will also cover additional patterns that this integration enables for enterprise level data processing pipelines.
The session will conclude with some metrics for latency and throughput numbers for the use case that is presented.
Speaker
Ananth Gundabattula, Senior Architect, Commonwealth Bank of Australia
Big Data Day LA 2016 / Big Data Track - How To Use Impala and Kudu To Optimize... - Data Con LA
This session describes how Impala integrates with Kudu for analytic SQL queries on Hadoop and how this integration, taking full advantage of the distinct properties of Kudu, has significant performance benefits.
Cloudera: Apache Kudu, Updatable Analytical Storage for Modern Data Platform - Rakuten Group, Inc.
Apache Kudu is an open source distributed storage engine for real-time analytical workloads. Since it supports Updates and Inserts, Kudu can be used as both a real-time operational database and an analytic database. In this session, I will describe the detailed architecture of Kudu to reveal how it supports Update and Insert on a columnar storage architecture.
Upgrade Without the Headache: Best Practices for Upgrading Hadoop in Production - Cloudera, Inc.
Walk through some of the best practices to keep in mind when it comes to upgrading your cluster, and learn how to leverage new Upgrade Wizard features in Cloudera Enterprise 5.3.
For most mission critical workloads, downtime is never an option. Any downtime can have a direct impact on revenue and lead to frantic calls in the middle of the night. For this reason, upgrading the software that powers these workloads can often be a daunting task. It can cause unpredictable issues without access to support. That’s why an enterprise-grade administration tool is crucial for running Hadoop in production. Hadoop consists of dozens of components, running across multiple machines, all with their own configurations. That can lead to a lot of complexity and uncertainty - especially when taking the upgrade plunge.
Cloudera Manager makes it easy and is the only production-ready administration tool for Hadoop. Not only does Cloudera Manager feature zero-downtime rolling upgrades, but it also has a built-in Upgrade Wizard to make upgrades simple and predictable.
Intel and Cloudera: Accelerating Enterprise Big Data Success - Cloudera, Inc.
The data center has gone through several inflection points in the past decades: adoption of Linux, migration from physical infrastructure to virtualization and Cloud, and now large-scale data analytics with Big Data and Hadoop.
Please join us to learn about how Cloudera and Intel are jointly innovating through open source software to enable Hadoop to run best on IA (Intel Architecture) and to foster the evolution of a vibrant Big Data ecosystem.
Azure Identity (AD, ADFS 2.0, AAD, ADB2C, OAuth, OpenID, PingID, AD Custom Policies),
Azure PaaS (Azure Functions, Serverless computing, Azure Cosmos DB, Webhooks, API Apps, Logic Apps, Kudu, Azure Websites), Azure Functions, Lambda Function, Event Functions, Serverless architecture, Implementing Azure Functions on GitHub comment feature, Why Azure Functions, Azure Virtual Machines, Azure Cloud Services, Azure Web Apps & WebJobs, Service Fabric, Consumption Plans, Billing Model, Benefits of Azure Functions, What is serverless, Implementing bigger solutions into smaller Azure Functions, Microservices, Use cases, Function App, Implementation storing unstructured data using Azure Functions into Cosmos DB, Cosmos DB, Custom Azure Functions, Azure Cosmos DB, IoT, Document DB, Doc DB, How to set up a Jenkins build server and automatically trigger code from Visual Studio Online, Azure App Service, App Service Environment, Azure Stack, Managing Azure App Services, Azure PowerShell, Azure CLI, REST APIs, Azure Portal, Templates, Kudu Console access, Run Git commands on Kudu Console, Locking Azure Resources, Configuring Custom Domains, Adding Extensions to Azure Web App/Websites, App Service deployment options, Data Services in Azure, Azure SQL, Azure SQL Server, Azure SQL Database vs SQL Server in an Azure VM, SQL Tiers, DTU, Database Transaction Unit, Planning & provisioning Azure SQL databases, Migrating SQL Databases, Azure SQL Server, SQL Server transactional replication, Deploy Database to Microsoft Azure Database Wizard, DAC package, DAC, SQL compatibility issues, Migrating SQL with downtime, DMA, Data Migration Assistant, Database Snapshot, Migrating SQL without downtime, Recommendations for best performance during the SQL import process, Transactional Replication, T-SQL, Task to implement whatever you learnt till now.
What is in a modern BI architecture? In this presentation, we explore PaaS, Azure Active Directory and storage options including SQL Database and SQL Data Warehouse.
In this session we will present the different ways to use SQL Server in a cloud infrastructure (Microsoft Azure). We will cover hybrid scenarios, migration, backup, and hosting of SQL Server databases in IaaS or PaaS mode.
[db tech showcase Tokyo 2018] #dbts2018 #B31 『1,2,3 and Done! 3 easy ways to ... - Insight Technology, Inc.
[db tech showcase Tokyo 2018] #dbts2018 #B31
『1,2,3 and Done! 3 easy ways to migrate to the cloud!』
Data Intensity - Director of Innovation, Francisco Munoz Alvarez
Migrating on-premises workload to Azure SQL Database - Parikshit Savjani
Azure SQL Database is a fully managed cloud database service with built-in intelligence, elastic scale, performance, reliability, and data protection that enables enterprises and ISVs to reduce their total cost of ownership and operational cost and overheads. In this session, I will share real-world experience of successfully migrating existing SaaS applications and on-premises workloads for some of our tier 1 customers and ISV partners to the Azure SQL Database service. The session walks through planning, assessment, migration tools and best practices from the proven experiences and practices of migrating real-world applications to the Azure SQL Database service.
VMworld 2013: Virtualizing Databases: Doing IT Right - VMworld
VMworld 2013
Michael Corey, Ntirety, Inc
Jeff Szastak, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
This presentation is for those of you who are interested in moving your on-prem SQL Server databases and servers to Azure virtual machines (VMs) in the cloud so you can take advantage of all the benefits of being in the cloud. This is commonly referred to as a "lift and shift" as part of an Infrastructure-as-a-Service (IaaS) solution. I will discuss the various Azure VM sizes and options, migration strategies, storage options, high availability (HA) and disaster recovery (DR) solutions, and best practices.
These are my keynote slides from SQL Saturday Oregon 2023 on AI and the intersection of AI, Machine Learning and Economic Challenges as a Technical Specialist.
"Impact of front-end architecture on development cost", Viktor Turskyi - Fwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes work. It takes vision, leadership and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for technology and making things work, along with a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... - UiPath Community
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
PHP Frameworks: I want to break free (IPC Berlin 2024) - Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Key Trends Shaping the Future of Infrastructure.pdf - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
GraphRAG is All You Need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
2. Kellyn Gorman
Engineer in Customer Success, SME for Oracle on Azure
23 years of Oracle and SQL Server experience
Instructor on Linux and DevOps
Previous president of both the Rocky Mtn Oracle User Group and the Denver SQL Server User Group
Author of several books on Oracle and Microsoft technology, plus a couple on diversity and inclusion
Presenter on database, optimization and automation at various conferences around the world
Meetup owner of Girl Geek Dinners Denver and several STEM groups
3.
4. Thank you
Thank you for being here today!
Thank you to the PASS Database Virtual Chapter.
Please ask questions throughout the session and I'll try to answer them; I may hold them to the end if it makes sense.
My contact info will be provided at the end.
The slides will be uploaded and available; there are A LOT of links for deeper information on all the content I will be covering.
5. Why Migrate to the Cloud
Get rid of the datacenter
Scalability
Pay for what's used
Easier access to resources
More agile infrastructure
Going to the cloud will not grant you massive savings.
6. Azure Data Platform
Although I specialize in Oracle on Azure IaaS, there are more opportunities to migrate to Azure than I can keep up with!
https://azure.microsoft.com/en-us/services/#databases
7. Azure Database 101
What is Azure Infrastructure as a Service (IaaS)?
What is Azure Platform as a Service (PaaS)?
What Azure SQL Database Service Tiers are there?
How do I find the main information about Azure Databases?
Understanding Azure Single Databases and Elastic Pools.
What is an Azure SQL Managed Instance?
Main How-to Guides for Azure Databases.
Where do I find the Azure Roadmap for the future of Databases?
8. Oracle on Azure
Oracle to PostgreSQL - 10%
Oracle Cloud Infrastructure (OCI) and Azure Interconnect - 10%
Oracle to Azure IaaS - 80%
Yeah, I'm the last one, btw
9. Oracle Workloads to Azure
Lift and shift the workload, not the hardware.
Use the Oracle sizing tool with Automatic Workload Repository (AWR) reports to ensure sizing is correct.
Don't combine upgrades with migrations, and architect for the cloud.
Separate sizing from optimizing.
10. Certified Deployment of Oracle on Azure
Operating System: Oracle Enterprise Linux
Database or Application: Oracle 12.2, Oracle 18c, Oracle 19c, Oracle WebLogic
Monitoring & Management: Oracle Cloud Control, Cloud Control Express
Backup and DR: Oracle Data Guard, Oracle RMAN
Certified! Oracle on Azure: http://www.oracle.com/us/corporate/pricing/authorized-cloud-environments-3493562.pdf
11. Oracle for Azure Sizing Tool
Learn more about it here.
Uses AWR reports to size out workloads
Calculates the vCPU, memory and IO/MBps that will be required to run the workload in Azure
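As a rough illustration of what an AWR-driven sizing pass produces, the sketch below applies a headroom factor to peak CPU, SGA+PGA memory, and read/write throughput pulled from an AWR report. The field names and the 20% headroom factor are my own assumptions for the example, not the actual logic of the Oracle-on-Azure sizing tool:

```python
# Hypothetical sketch of AWR-based sizing; names and factors are illustrative.

def size_from_awr(avg_cpu_cores, peak_cpu_cores, sga_gb, pga_gb,
                  read_mbps, write_mbps, headroom=1.2):
    """Return a rough vCPU / memory / throughput target with headroom."""
    vcpus = max(2, round(peak_cpu_cores * headroom))   # size for peaks, not averages
    memory_gb = round((sga_gb + pga_gb) * headroom)    # total Oracle memory footprint
    io_mbps = round((read_mbps + write_mbps) * headroom)
    return {"vcpus": vcpus, "memory_gb": memory_gb, "io_mbps": io_mbps}

estimate = size_from_awr(avg_cpu_cores=6, peak_cpu_cores=10,
                         sga_gb=48, pga_gb=16,
                         read_mbps=180, write_mbps=70)
print(estimate)  # {'vcpus': 12, 'memory_gb': 77, 'io_mbps': 300}
```

A real sizing exercise would also map these targets onto concrete VM SKUs and check the per-VM throughput caps discussed later in the deck.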
13. Migration to Azure - High Level Steps
Document: Begin with a spreadsheet of databases to migrate or create a DR environment for.
Collect: Collect workload usage for 24 hrs or 7 days, depending on the workload type (consistent vs. varied).
Create: Review and create a proposal.
Propose: Review with the customer to ensure this meets their needs.
Choose: Choose a database to use for a POC if desired.
Perform: Perform a POC for the expected scenario.
Adjust: If needed, make adjustments to the architecture based on the results.
Agree: Adjust pricing, ensure the customer is in agreement and begin deployment of Azure services.
Begin: Begin the migration/build.
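Treated as a checklist, these phases run from documenting the estate through to the build. The sketch below is illustrative only; the phase wording is paraphrased from the slide:

```python
# Illustrative only: the high-level migration phases tracked as a checklist,
# listed in execution order (paraphrased from the slide).
MIGRATION_PHASES = [
    "Document databases to migrate or protect with DR",
    "Collect workload usage (24 hrs or 7 days)",
    "Create a proposal",
    "Propose and review with the customer",
    "Choose a database for a POC",
    "Perform the POC",
    "Adjust the architecture from POC results",
    "Agree on pricing and begin deploying Azure services",
    "Begin the migration/build",
]

def next_phase(completed):
    """Return the first phase not yet completed, or None when done."""
    for phase in MIGRATION_PHASES:
        if phase not in completed:
            return phase
    return None

print(next_phase({"Document databases to migrate or protect with DR"}))
```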
14. Most Common Bottlenecks
Refactoring of the database or application tier
Attempting to upgrade while migrating
Not getting stakeholder buy-in
Packaging complex goals in with the migration
15. Global Tip #1
Don't architect cloud solutions like you architected for on-prem.
Cloud infrastructure is different than an on-prem data center; treating them the same may result in extra redundancy and/or gaps.
Educate on the infrastructure and security already in the cloud.
Take egress into consideration with how processing occurs between systems.
Identify, inventory, document and diagram everything, and involve all stakeholders.
Choose your battles and create battle plans on what will be addressed first.
Use tools available from the cloud provider.
16. Azure Planning Tools
The SQL DTU Calculator is a translator and calculator that sizes out the Azure environment you will require vs. the current on-premises environments you have.
A DTU (Database Transaction Unit) is a blended cloud measure of average CPU, IO and memory, used to come up with the minimum service level and tier of Azure service needed for migration:
Processor - % Processor Time
Logical Disk - Disk Reads/sec
Logical Disk - Disk Writes/sec
Database - Log Bytes Flushed/sec
Azure Instances Information will provide the information you need to understand the wide variety of configurations and service levels/tiers that are available.
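To make the blended-measure idea concrete, here is a toy version of that calculation: normalize each counter against an assumed per-DTU capacity and take the highest of them, since the DTU combines CPU, IO and log throughput into one bottleneck measure. The weights below are invented for illustration; the real SQL DTU Calculator applies its own service-tier model:

```python
# Toy DTU-style estimate from the four perfmon counters named above.
# The per-DTU capacities are illustrative assumptions, not real calibration.

def estimate_dtu(cpu_percent, reads_per_sec, writes_per_sec,
                 log_bytes_per_sec):
    """Normalize each counter against an assumed per-DTU capacity and
    take the maximum, since DTUs reflect the binding bottleneck."""
    cpu_dtu = cpu_percent / 1.0                       # assume ~1% CPU per DTU
    io_dtu = (reads_per_sec + writes_per_sec) / 2.5   # assume ~2.5 IOPS per DTU
    log_dtu = log_bytes_per_sec / 100_000             # assume ~100 KB/s log per DTU
    return max(cpu_dtu, io_dtu, log_dtu)

print(round(estimate_dtu(cpu_percent=45, reads_per_sec=300,
                         writes_per_sec=200, log_bytes_per_sec=4_000_000)))
```

In this example the IO counters dominate, which is the point of the max(): an IO-bound workload needs a tier sized for its IO, no matter how idle the CPU is.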
17. Database Archival and Blob Storage
Azure Stretch Database is a way to move read-only databases to low-cost storage and have them automatically move from warm (queried) to cold (unused) storage that's very cost effective.
Azure Archive Storage is a way to store years of data at a lower cost; similar to Stretch, but less RDBMS focused, with automatic movement from warm to cold storage.
Azure Blob Storage is an excellent choice for unstructured data, backup files, etc. Again, it possesses automation to move files from hot, to warm, to cold storage.
This goes for IaaS, too.
Azure Site Recovery snapshots and backups can take advantage of this.
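The hot/warm/cold movement described here is essentially a policy on data age. A minimal sketch, assuming illustrative age thresholds rather than Azure's actual lifecycle-management defaults:

```python
# Sketch of tier selection by last access; thresholds are assumptions,
# not Azure lifecycle-management defaults.
from datetime import date

def choose_tier(last_accessed, today=None):
    """Pick a storage tier from how long ago the data was touched."""
    today = today or date.today()
    age_days = (today - last_accessed).days
    if age_days <= 30:
        return "Hot"        # frequently queried
    if age_days <= 180:
        return "Cool"       # occasionally queried
    return "Archive"        # rarely or never queried

print(choose_tier(date(2019, 1, 1), today=date(2019, 12, 1)))  # Archive
```

In practice Azure Blob lifecycle-management rules express the same policy declaratively, so no code runs on your side.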
18. Global Tip #2
Use the Right Tool for the Job
Don't try to duplicate everything you have on-prem in the cloud.
Locate what you need to accomplish and then discover the most efficient and compatible tool to reach your goal.
19. Tools to Help With Migration
Azure Cloud Assessment reviews your entire A-to-Z environment, discovers what needs to be migrated, helps select a migration strategy, assists in the migration and helps optimize post-migration.
Building a High Availability Architecture in Azure covers SQL DB; for Oracle, use this plus Oracle Data Guard.
Azure Database Migration Assistant is available to ease the migration steps if performed one-off.
Database Platform Migration Assistant - no matter the data source, there is a migration path to Azure.
Azure Site Recovery allows for backups to Azure and then a path to migrate environments to Azure.
Azure Data Sync is a free service that allows data to be synced from on-premises to Azure environments.
Azure Migrate is actually a VM migration tool for Azure.
20. Global Tip #3
Don't boil the ocean
Break the project into achievable, bite-size pieces.
Don't allow others to distract with unimportant issues or insignificant challenges.
Ensure everyone signs off on what is agreed to.
Perform a POC if it makes sense, but choose a viable workload to test.
21. Tip #4
Consider Services to Scale
The most common failure point is trying to LITERALLY lift and shift what's on-prem into the cloud, instead of taking the workload and its features and moving those to the cloud.
Consider:
Cloud backup solutions like Veeam, Commvault, ANF, ASR, etc.
Data movement and transformation products like ADF, AAS, etc.
Cloud analytics that are simpler to refactor than a database or application platform.
Automate as part of your migration using Azure DevOps, GitHub, etc.
Use thin and thick clones over archaic data refresh technology.
22. Templates, Libraries, etc.
Use a scripting/automation factory to perform the work in an automated fashion:
ARM Templates
Image Libraries
Data Migration GitHub Solutions
Modern Data Warehouse
23. Heavy IO and Throughput Demands
When sizing a VM to use with an IaaS solution, ensure that you identify the max cached and temp storage throughput for both IOPS and MB/s.
A larger disk may not always be the wise solution, when smaller, striped disks may provide more throughput.
When needs surpass the VM capabilities, consider capacity pools from Azure NetApp Files or other solutions.
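A quick back-of-the-envelope comparison shows why striping can win on throughput. The per-disk figures below are assumptions shaped loosely like Azure managed disk tiers, not quoted limits, and a stripe's aggregate is still subject to the VM's own throughput caps:

```python
# Illustrative comparison: one large disk vs a stripe of smaller disks.
# Per-disk numbers are assumptions, not quoted Azure limits.

def striped_throughput(per_disk_iops, per_disk_mbps, count):
    """Striping (e.g., LVM or Storage Spaces) aggregates per-disk limits,
    up to the VM's own cached/uncached throughput cap."""
    return per_disk_iops * count, per_disk_mbps * count

one_large = (5000, 200)                   # a single large premium disk (assumed)
stripe = striped_throughput(500, 100, 8)  # eight smaller disks striped (assumed)
print(one_large, stripe)
```

Here the stripe trails on IOPS but delivers several times the MB/s, which is exactly the trade-off the slide warns you to size for.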
25. Azure Partners/Products that can Help With Performance
Pure Storage
Azure NetApp Files
FlashGrid
CloudSimple
Pacemaker Cluster
26. Monitoring
Azure Insights, part of Azure Monitor
Azure Data Studio
SolarWinds Database Performance Analyzer (DPA) (multi-platform)
Oracle Enterprise Manager (Cloud Control)
27. Evolve
Think about how you've done work on-prem:
Will it perform and satisfy the requirements in the cloud?
Are there services that can perform the task simpler and for less?
What archaic practices can you leave behind?
Can new projects be implemented on newer technology/platforms?
28. Backup
Servers
Databases
Applications
Copy ARM Templates
Consider newer ways to "backup"
Identify the RPO/RTO and build out a solution that meets the requirements.
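A simple way to sanity-check a backup design against the stated RPO: worst-case data loss is the time elapsed since the last backup, so the backup interval must not exceed the RPO. The figures are illustrative:

```python
# Sketch: does a backup schedule satisfy a stated RPO?
# RPO = maximum tolerable data loss; worst case equals the backup interval.

def meets_rpo(backup_interval_minutes, rpo_minutes):
    """True when worst-case data loss stays within the RPO."""
    return backup_interval_minutes <= rpo_minutes

print(meets_rpo(backup_interval_minutes=60, rpo_minutes=15))  # hourly backups, 15 min RPO
print(meets_rpo(backup_interval_minutes=15, rpo_minutes=15))  # 15 min log backups
```

RTO is the separate question of how fast you can restore, which depends on backup size, storage tier and rehearsed runbooks rather than the schedule alone.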
29. Security
Encrypt data at the right layer, both at rest and in motion.
Use best practices when securing the database server as you would on-prem, but add additional security for cloud environments.
Identify the overhead of security products and build out a suite of solutions that meets requirements and SLAs.
Consider Azure Security Center.
30. High Availability
Consider co-location.
Use Availability Zones and Availability Sets.
Use Always On Availability Groups and Oracle Data Guard for database tiers.
Use Azure Site Recovery (ASR) for VM snapshots, and schedule snapshots for PaaS with viable retention times.
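The reason redundancy across zones or sets pays off can be shown with a one-line availability calculation, assuming instances fail independently. The SLA figure used here is an example, not a quoted Azure SLA:

```python
# Illustrative math: composite availability of N redundant instances,
# assuming independent failures. 0.999 is an example figure, not an Azure SLA.

def composite_availability(single_instance, instances=2):
    """Probability that at least one of N independent instances is up."""
    return 1 - (1 - single_instance) ** instances

print(round(composite_availability(0.999, 2), 6))  # redundant pair
print(round(composite_availability(0.999, 1), 6))  # single instance
```

Real failures are not fully independent (shared power, shared updates), which is why zones and availability sets exist: they reduce the correlated failure modes that this simple model ignores.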
31. Education Resources
Microsoft Virtual Academy: https://mva.microsoft.com/product-training/microsoft-azure
edX offers a number of free classes that offer accreditation for a small cost (free for the learning): https://www.edx.org/course?search_query=azure
Azure Readiness Kits on GitHub: https://github.com/Azure-Readiness
There are others you can find at the following link: https://www.businessnewsdaily.com/10711-free-microsoft-azure-online-training.html
32. References
Oracle Data Guard on Azure: https://docs.microsoft.com/en-us/azure/virtual-machines/workloads/oracle/configure-oracle-dataguard
Oracle Data Guard Far Sync: https://docs.oracle.com/database/121/SBYDB/create_fs.htm#SBYDB5416
Oracle Data Guard standby from RAC to single instance: https://docs.oracle.com/en/database/oracle/oracle-database/19/sbydb/configuring-data-guard-standby-databases-in-oracle-RAC.html#GUID-3140A293-DDD8-4559-8493-B6C21646E90F
Azure VM Sizing:
Generation 1: https://docs.microsoft.com/en-us/azure/virtual-machines/linux/sizes
Generation 2: https://docs.microsoft.com/en-us/azure/virtual-machines/linux/generation-2
Isolated Bare Metal: https://docs.microsoft.com/en-us/azure/virtual-machines/linux/isolation
ExpressRoute Documentation: https://docs.microsoft.com/en-us/azure/expressroute/
Ultra Disks for Azure Linux VMs: https://docs.microsoft.com/en-us/azure/virtual-machines/linux/disks-enable-ultra-ssd