Project BI007 - Analysis
BI007 - Sample name for a real case
[Architecture diagram: Main Region storage with a Synapse Workspace, dedicated SQL Pool and Data Lake (GZRS); Data Share links the environment to data providers and to consumers.]
BI007 - DR Scenarios
RTO : Recovery Time Objective
RPO : Recovery Point Objective
Normal BI solution:
RPO = 0 –> (data can be re-imported from the sources)
RTO = how much downtime is acceptable?
Data Share
To be prepared for a data center outage, the data provider can
have a data share environment provisioned in a secondary
region. Measures can be taken to ensure a smooth failover in the
event that a data center outage does occur.
In this context data consumers can have an active share
subscription that is idle for DR purposes.
https://docs.microsoft.com/en-us/azure/data-share/disaster-recovery
Data Lake
Storage accounts that have hierarchical namespace enabled (such
as for Data Lake Storage Gen2) are not supported for failover at
this time.
https://docs.microsoft.com/en-us/azure/storage/common/storage-disaster-recovery-guidance
Trigger Failover – not available
Failover for storage accounts with hierarchical namespace enabled (Azure Data Lake Storage Gen2 storage accounts)
is not supported at this time.
Copying data as an alternative to failover
If your storage account is configured for read access to the secondary, then you can design
your application to read from the secondary endpoint. If you prefer not to fail over in the
event of an outage in the primary region, you can use tools such as AzCopy, Azure
PowerShell, or the Azure Data Movement library to copy data from your storage account in
the secondary region to another storage account in an unaffected region. You can then
point your applications to that storage account for both read and write availability.
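As an illustration of this approach, a container-level copy from the read-only secondary endpoint could look like the AzCopy command below. The account, container and SAS values are placeholders, and the "-secondary" suffix is the read-only secondary endpoint exposed by RA-GRS/RA-GZRS accounts:
# Copy from the read-only secondary endpoint to an account in an unaffected region
azcopy copy "https://<account>-secondary.blob.core.windows.net/<container>?<sas_token>" "https://<unaffected_account>.blob.core.windows.net/<container>?<sas_token>" --recursive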
Data Lake
RA on secondary
Geo-redundant storage (with GRS or GZRS) replicates your data to another physical location
in the secondary region to protect against regional outages. However, that data is available
to be read only if the customer or Microsoft initiates a failover from the primary to
secondary region.
When you enable read access to the secondary region, your data is available to be
read at all times, including in a situation where the primary region becomes unavailable.
For read access to the secondary region, enable read-access geo-redundant storage (RA-GRS) or read-access geo-zone-redundant storage (RA-GZRS).
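Enabling read access on an existing GZRS account is a SKU change. A minimal Azure PowerShell sketch, with placeholder resource group and account names:
# Switch the data lake account to RA-GZRS so the secondary endpoint becomes readable
Set-AzStorageAccount -ResourceGroupName "rg-bi007" -Name "stbi007datalake" -SkuName Standard_RAGZRS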
Synapse
Geo-backups and disaster recovery
A geo-backup is created once per day to a paired data center. The RPO for a geo-restore is 24
hours. You can restore the geo-backup to a server in any other region where dedicated SQL pool
is supported. A geo-backup ensures you can restore the data warehouse in case you cannot access
the restore points in your primary region.
You can also create a user-defined restore point and restore from the newly created restore point
to a new data warehouse in a different region. After you have restored, you have the data
warehouse online and can pause it indefinitely to save compute costs. The paused database
incurs storage charges at the Azure Premium Storage rate. Another common pattern for a
shorter recovery point is to ingest data into primary and secondary instances of a data
warehouse in parallel. In this scenario, data is ingested from a source (or sources) and persisted
to two separate instances of the data warehouse (primary and secondary). To save on compute
costs, you can pause the secondary instance of the warehouse. If you need an active copy of the
data warehouse, you can resume, which should take only a few minutes.
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/backup-and-restore
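The user-defined restore point mentioned above can be created with Azure PowerShell before a planned operation. A minimal sketch for a dedicated SQL pool on a logical SQL server, with placeholder names:
# Create a user-defined restore point on the dedicated SQL pool
New-AzSqlDatabaseRestorePoint -ResourceGroupName "rg-bi007" -ServerName "sql-bi007" -DatabaseName "dwh_bi007" -RestorePointLabel "pre-dr-drill"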
Synapse
Move Synapse from one region to another
https://docs.microsoft.com/en-us/azure/synapse-analytics/how-to-move-workspace-from-one-region-to-another
Summary of steps:
• Provision new Synapse instance and restore your last state to it from
the Automated or User-Defined snapshot.
• Proper permissions should be granted
• Connections to Azure Services should be reestablished
• New connection parameters should be propagated to the end-users
• Model drift should be mitigated
[Diagram – Current scenario: Main Region (Synapse Workspace, SQL Pool, Data Lake GZRS, Data Share) with a Paired Region holding the secondary Data Lake copy (LRS) and the copied automated snapshot. RPO < 24h (geo-backup), RPO = 8h (automated snapshot), RPO < 15m with no SLA (Data Lake geo-replication); storage failover is not available due to the hierarchical namespace.]
Current scenario
Data Lake (RPO < 15 minutes)
• Activate read access on the secondary region (RA-GZRS) – allows read-only access to the secondary copy
SQL Pool (RPO 8 h + 24 h)
• Possibility to use user-defined restore points (snapshot backups)
[Diagram – Scenario 1 (recover from current scenario): the Main Region environment (Synapse Workspace, SQL Pool, Data Lake GZRS, Data Share) is provisioned anew in the Paired Region; the read-only LRS Data Lake copy (Microsoft-activated) is synced to a new writable Data Lake, the SQL Pool is restored from the copied snapshot, and the Data Share links are re-established.]
Scenario 1 : recover from current scenario
• Provision new Azure services – BI007 infrastructure (network, roles, data lake permissions, …)
• Wait for failover activated by Microsoft – allows read-only access to the secondary region
• Restore the SQL Pool database
[Diagram – Scenario 2 (paired region provisioned, stand-by / stepped activation): the full environment is pre-provisioned in the Paired Region. Step 1: sync the read-only LRS Data Lake copy (Microsoft-activated) to the writable Data Lake and restore the SQL Pool from the copied snapshot. Step 2: activate the Synapse pipelines. Step 3: activate Data Share.]
Scenario 2 : Paired region provisioned, stand-by
Deployment is made to both regions – CI/CD?
All Azure services are deployed and configured:
1. Activate Data Lake synchronization (read-only to read/write) and restore the SQL Pool from a snapshot
2. Activate Synapse pipeline triggers
3. Activate Data Share triggers
[Diagram – Scenario 3 (current environment replicated, hot stand-by): a second active environment (Synapse Workspace, SQL Pool, Data Share) runs in the Paired Region against its own writable GZRS Data Lake, activated daily.]
Scenario 3 : current environment replicated, hot stand-by
Two production systems, one in each region – CI/CD deployment
Data Share and Synapse pipelines activated
Start/pause the SQL Pool as needed
Possibility to have two online systems
Two production systems to maintain
If data lake replication is removed => data lake cost will be roughly equal
Additional cost for the Synapse pipelines
Low RPO and RTO – replicated environment
BI007 – Step by Step
Scenario 2
[Diagram – Scenario 2 recap: same as the Scenario 2 diagram above (paired region provisioned, stand-by / stepped activation).]
Scenario 2 : Paired region provisioned, stand-by
Pre-Requisites
Phase 0 : Provisioning and failover validation
Phase 1 : Data Lake synchronization and SQL Pool restore
Phase 2 : Activate Synapse trigger pipelines
Phase 3 : Activate Data Share triggers
Phase 4 : Adjust/Notify consumers for new endpoints
References
Pre-Requisites
Ensure that the current geo-redundant Storage Account has read access to the secondary enabled (RA-GZRS)
Ensure that a read/write Storage Account is provisioned in the secondary region, using ZRS
Ensure that the Synapse dedicated SQL Pool geo-backup policy is enabled
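A quick verification sketch with Azure PowerShell, assuming the pool is a dedicated SQL pool on a logical SQL server (a Synapse workspace pool would use the Az.Synapse cmdlets instead) and placeholder resource names:
# Expect Standard_RAGZRS
(Get-AzStorageAccount -ResourceGroupName "rg-bi007" -Name "stbi007datalake").Sku.Name
# Geo-backup policy state should be Enabled
Get-AzSqlDatabaseGeoBackupPolicy -ResourceGroupName "rg-bi007" -ServerName "sql-bi007" -DatabaseName "dwh_bi007"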
Phase 0 : Provisioning and failover validation
• Use the current automated deployment strategy (Terraform)
• Provision all Azure services that make up the current MD Data hub infrastructure (as in the previous diagram)
• Ensure network, roles, data lake permissions, etc.
• Replicate the Data Share subscription(s)
• Consider a secondary backup region from the data provider(s)
• Use the current automation (DevOps CI/CD) to replicate the latest developments
• Validate the failover periodically
Phase 1 : Data Lake synchronization and SQL Pool restore
Evaluate possible data loss against the recovery objectives (check the geo-replication Last Sync Time, as in the sketch below)
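A minimal sketch for the Last Sync Time check (placeholder names); blobs written after this timestamp may not have been geo-replicated yet:
# Read the geo-replication Last Sync Time of the data lake account
$acct = Get-AzStorageAccount -ResourceGroupName "rg-bi007" -Name "stbi007datalake" -IncludeGeoReplicationStats
$acct.GeoReplicationStats.LastSyncTime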
These steps can run in parallel:
• Sync the Data Lake (avg. 3 s/GB)
• Execute an AzCopy script to sync the read-only Storage Account to the read/write Storage Account
azcopy sync "https://<source_storage>.blob.core.windows.net/?<sas_token>" "https://<destination_storage>.blob.core.windows.net/?<sas_token>" --recursive --delete-destination=true
https://docs.microsoft.com/en-us/azure/storage/common/storage-ref-azcopy-sync?toc=/azure/storage/blobs/toc.json
• SQL Pool
• Restore from geo-backup (using PowerShell or the Azure portal – see the sketch below)
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-restore-from-geo-backup#restore-from-an-azure-geographical-region-through-powershell
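A minimal PowerShell sketch of the geo-restore, following the documentation linked above; the resource names and service objective are placeholders for this project:
# Restore the latest geo-backup of the dedicated SQL pool to the paired-region server
$geoBackup = Get-AzSqlDatabaseGeoBackup -ResourceGroupName "rg-bi007" -ServerName "sql-bi007" -DatabaseName "dwh_bi007"
Restore-AzSqlDatabase -FromGeoBackup -ResourceGroupName "rg-bi007-dr" -ServerName "sql-bi007-dr" -TargetDatabaseName "dwh_bi007" -ResourceId $geoBackup.ResourceID -ServiceObjectiveName "DW500c"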
Phase 2 : Activate Synapse trigger pipelines
Using Synapse Workspace:
• Open Synapse Studio
• Go to Manage / Integration / Triggers (menu)
• Start triggers
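If scripting is preferred over Synapse Studio, the Az.Synapse trigger cmdlets can do the same; a minimal sketch, assuming the recovered workspace name is a placeholder:
# Start every pipeline trigger in the recovered workspace
Get-AzSynapseTrigger -WorkspaceName "synw-bi007-dr" |
  ForEach-Object { Start-AzSynapseTrigger -WorkspaceName "synw-bi007-dr" -Name $_.Name }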
Phase 3 : Activate Data Share triggers
Using Azure Portal:
• Go to Data Share Service
• Select Received Shares (on left menu)
• For each share subscription:
• On the snapshot schedule, enable the recurrence interval
Phase 4 : Adjust/Notify consumers for new endpoints
• Distribute the new endpoints to the consumers, or
• If DNS-based, point the existing DNS names to the new endpoints
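If the endpoints are fronted by a custom zone in Azure DNS (an assumption; zone, record and endpoint names below are placeholders), repointing a CNAME could look like:
# Point the existing CNAME at the paired-region Synapse SQL endpoint
$rs = Get-AzDnsRecordSet -Name "sql" -RecordType CNAME -ZoneName "bi007.example.com" -ResourceGroupName "rg-dns"
$rs.Records[0].Cname = "synw-bi007-dr.sql.azuresynapse.net"
Set-AzDnsRecordSet -RecordSet $rs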
References
Synapse
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/backup-and-restore
https://www.sqlshack.com/restore-dedicated-sql-pools-in-azure-synapse-analytics/
https://docs.microsoft.com/en-us/azure/synapse-analytics/how-to-move-workspace-from-one-region-to-another
Data Lake
https://docs.microsoft.com/en-us/azure/storage/common/storage-disaster-recovery-guidance#copying-data-as-an-alternative-to-failover
https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy?toc=/azure/storage/blobs/toc.json
https://docs.microsoft.com/en-us/azure/storage/common/last-sync-time-get?tabs=azure-powershell
https://docs.microsoft.com/en-us/azure/automation/automation-runbook-execution
https://docs.microsoft.com/en-us/azure/automation/automation-runbook-types#powershell-runbooks
Data Share
https://docs.microsoft.com/en-us/azure/data-share/disaster-recovery
Ricardo Linhares
BI Specialist | Data & AI Solutions @ DevScope
Started with SQL Server 2000
r.linhas@gmail.com
https://www.linkedin.com/in/r-linhares/
https://twitter.com/RLinhas