Deep Dive into Azure Data Factory v2
Eric Bragas
• Senior Business Intelligence
Consultant with DesignMind
• Always had a passion for art, design,
and clean engineering (aka. I own a
Dyson vacuum)
• Undergoing my Accelerated Freefall
training to become a certified
skydiver
• And I often overuse parentheses
(and commas).
https://www.linkedin.com/in/ericbragas93/
@ericbragas
eric@designmind.com
Agenda
• Overview
• Azure Data Factory v2
• ADF and SSIS
• Components
• Expressions, Functions, Parameters, and System Variables
• Development
• Monitoring and Management
• Q&A
Overview
What is Azure Data Factory v2?
Overview
• "[Azure Data Factory] is a cloud-based data integration service that allows you to create data-driven workflows in the cloud that orchestrate and automate data movement and data transformation."
• Version 1 – service for batch processing of time series data
• Version 2 – a general-purpose data processing and workflow orchestration tool
Comparison
(diagram comparing ADFv1, ADFv2, and SSIS)
ETL vs. ELT
• Key difference is where the transformations are processed
• ETL – transforms are processed by the integration tool (e.g. SSIS)
• ELT – transforms are processed by the target database (e.g. Data Lake, SQL, etc.)
• Main benefit is scalability to larger data volumes
• Main drawback is the added step between source and destination
• This isn’t always a drawback when you are feeding multiple consumers from the same pool of raw data
My preference is ELT, even in non-big-data scenarios, because a database engine can typically perform set-based transformations faster than SSIS
Version 1 vs. Version 2
Version 1:
• Time-series based
• Schedules driven by dataset
availability
• Developed using Visual Studio
• Pretty cool
Version 2:
• General purpose
• Explicit and Tumbling-window
scheduling
• Freaking awesome
Version 1 vs. Version 2
Component changes:
• Datasets – no longer use the "availability" property
• Linked Services – include the new "connectVia" property, which allows selection of the Integration Runtime to use (see Integration Runtime section)
• Pipelines – unit of scheduling instead of activities
• Activities – control and non-control activity types; dependencies between activities
• Triggers – new scheduling component
• Integration Runtime – replacement for the Data Management Gateway
Version 2 vs. SSIS
• Pipelines ~= Packages
• Can use similar master-child patterns
• Linked Services ~= Connection Managers
• SSIS usually extracts, transforms, and loads data all as a single
process. ADF leverages external compute services to do
transformation.
Can also deploy SSIS packages to ADFv2 and execute them using the Azure-SSIS Integration Runtime
Sample of Supported Sources/Sinks
Components
Building Blocks
Linked Services
• A saved connection string to a data storage or compute service
• Doesn’t specify anything about the data itself, just the means of
connecting to it
• Referenced by Datasets
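A rough JSON sketch of a linked service (the names and connection string are placeholders, not taken from the deck); the optional "connectVia" property is what selects an Integration Runtime:

{
  "name": "MyAzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": {
        "type": "SecureString",
        "value": "Server=tcp:<server>.database.windows.net;Database=<db>;User ID=<user>;Password=<password>"
      }
    },
    "connectVia": {
      "referenceName": "MySelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}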
Dataset
• A data structure within a storage linked service
• Think: SQL table, blob file or folder, HTTP endpoint, etc.
• Can be read from and written to by Activities
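Continuing the sketch above (hypothetical names), a dataset pointing at a SQL table exposed by that linked service might look roughly like:

{
  "name": "MySqlTableDataset",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": {
      "referenceName": "MyAzureSqlLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "tableName": "dbo.SalesOrders"
    }
  }
}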
Activity
• A component within a Pipeline that performs a single operation
• Control and Non-control activities
• Copy
• Lookup
• Web Request
• Execute U-SQL Job
• Can be linked together via dependencies
• On Success
• On Failure
• On Completion
• On Skip
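A hedged sketch of a single Copy activity as it sits inside a pipeline's "activities" array; the "dependsOn" block is where the On Success / On Failure / On Completion / On Skip dependencies are expressed (activity and dataset names are hypothetical):

{
  "name": "CopySalesOrders",
  "type": "Copy",
  "dependsOn": [
    { "activity": "TruncateStagingTable", "dependencyConditions": [ "Succeeded" ] }
  ],
  "inputs": [ { "referenceName": "MySqlTableDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "MyBlobDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "SqlSource" },
    "sink": { "type": "BlobSink" }
  }
}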
Pipeline
• Pipelines are the containers for a series of activities that make up a workflow
• Started via a trigger, accept parameters, and maintain system variables such as @pipeline().RunId
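A minimal pipeline wrapper, sketched under the same assumptions as above (the Copy activity body from the previous slide is elided for brevity):

{
  "name": "LoadSalesOrders",
  "properties": {
    "parameters": {
      "tableName": { "type": "String", "defaultValue": "dbo.SalesOrders" }
    },
    "activities": [
      { "name": "CopySalesOrders", "type": "Copy" }
    ]
  }
}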
Triggers
• Schedules that trigger pipeline execution
• More than one pipeline can subscribe to a single trigger
• Explicit schedule - e.g. every Monday at 3 AM, or…
• Tumbling window - e.g. every 6 hours starting today at 6 AM
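A sketch of a schedule trigger for "every Monday at 3 AM" (names and start time are placeholders); a tumbling-window trigger looks similar but uses "type": "TumblingWindowTrigger" with an hourly frequency and interval:

{
  "name": "MondayThreeAM",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "startTime": "2018-01-01T00:00:00Z",
        "timeZone": "UTC",
        "schedule": { "weekDays": [ "Monday" ], "hours": [ 3 ], "minutes": [ 0 ] }
      }
    },
    "pipelines": [
      { "pipelineReference": { "referenceName": "LoadSalesOrders", "type": "PipelineReference" } }
    ]
  }
}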
Integration Runtime
An activity defines the action to be performed.
A linked service defines the target data store or compute service.
An integration runtime provides the bridge between the two.
• Data Movement: copies data between public and private (on-premises) data stores; provides built-in connectors, format conversion, column mapping, etc.
• Activity Dispatch: dispatch and monitor transformation activities to
services such as: SQL Server, HDInsight, AML, etc.
• SSIS Package Execution: natively execute SSIS packages.
Integration Runtime (cont’d)
Types of IR:
• Azure (default)
• Self-Hosted
• Azure-SSIS
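A self-hosted IR is itself just another named resource that linked services point at via "connectVia" (as in the earlier linked service sketch); a minimal, hypothetical definition:

{
  "name": "MySelfHostedIR",
  "properties": {
    "type": "SelfHosted",
    "description": "Runtime installed on an on-premises VM to reach private-network data stores"
  }
}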
Demo
Create Copy Pipeline
Expressions, Functions,
Parameters, and System Variables
Oh my!
Expressions
• Syntax evaluated during execution of an activity that allows for dynamic
changes to the property configurations they are used within
• Reference things such as parameters, the output of previous activities, and
provide access to the current item being iterated over by loops
@pipeline().parameters.myParam
@formatDateTime(item().value.myDateAttr, 'yyyy-MM-dd')
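In the JSON authoring model, an expression is just a property value; for example (a hypothetical fragment from a Copy activity's source, where pipeline() is in scope):

"source": {
  "type": "SqlSource",
  "sqlReaderQuery": {
    "value": "@concat('SELECT * FROM ', pipeline().parameters.tableName)",
    "type": "Expression"
  }
}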
Custom State Passing
• Custom State Passing refers to the ability for a downstream activity to
access the output of an upstream activity
• Expressions can be used to access these output states and change
configuration of the currently executing activity
@activity('myUpstreamActivity').output.rowsRead
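A hedged sketch of the idea: a downstream Stored Procedure activity logs the upstream Copy activity's output together with a system variable (procedure, table, and activity names are hypothetical):

{
  "name": "LogRowCount",
  "type": "SqlServerStoredProcedure",
  "dependsOn": [ { "activity": "CopySalesOrders", "dependencyConditions": [ "Succeeded" ] } ],
  "linkedServiceName": { "referenceName": "MyAzureSqlLinkedService", "type": "LinkedServiceReference" },
  "typeProperties": {
    "storedProcedureName": "dbo.LogPipelineRun",
    "storedProcedureParameters": {
      "RowsRead": { "value": "@activity('CopySalesOrders').output.rowsRead", "type": "Int64" },
      "RunId": { "value": "@pipeline().RunId", "type": "String" }
    }
  }
}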
Functions
• String – string manipulation
• Collection – operate over arrays, strings, and sometimes dictionaries
• Logic – conditions
• Conversion – convert between native types
• Math – can be used on integer or float
• Date
There is not currently a way to add or define additional functions
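One illustrative expression per family (parameter and activity names are hypothetical):

String – @concat('raw/', pipeline().parameters.tableName)
Collection – @length(activity('LookupTables').output.value)
Logic – @if(equals(pipeline().parameters.env, 'prod'), 'Sales', 'Sales_Test')
Conversion – @int('42')
Math – @add(1, div(100, 4))
Date – @adddays(utcnow(), -1)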
Parameters
• Key-value pairs that can be passed to a pipeline when it is started or a dataset
when it is used by an activity
• The receiving pipeline or dataset must first be configured with a parameter of a specific name and data type before the calling component can pass a value (see the sketch below)
Two types of Parameters:
• Pipeline
• Dataset
@pipeline().parameters.myParam
@dataset().myParam
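A sketch of how the two parameter types connect (hypothetical names): the pipeline declares the parameter, an activity's dataset reference passes it, and the dataset consumes it with @dataset().

In the pipeline's properties:
"parameters": { "tableName": { "type": "String" } }

In an activity's dataset reference:
{ "referenceName": "MyParameterizedDataset", "type": "DatasetReference",
  "parameters": { "tableName": "@pipeline().parameters.tableName" } }

In the dataset (which declares its own "parameters" block):
"typeProperties": { "tableName": "@dataset().tableName" }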
System Variables
• System Variables are read-only values that are managed by the Data
Factory and provide metadata to the current execution
• These can be used for custom logging or within expressions
• They can be either Pipeline-scoped or Trigger-scoped
@pipeline().DataFactory
@trigger().scheduledTime
ADFv2 Development
Tools and Techniques
Design Patterns
• Delta/Incremental Loading
• Dynamic table loading
• Custom logging
• Using database change tracking
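As a rough illustration of the delta/incremental pattern (table, column, and activity names are hypothetical): a Lookup activity reads the last watermark from a control table, the Copy activity's source query filters on it, and a final Stored Procedure activity writes the new watermark back. The filtered source query might be sketched as:

"source": {
  "type": "SqlSource",
  "sqlReaderQuery": {
    "value": "@concat('SELECT * FROM dbo.SalesOrders WHERE ModifiedDate > ''', activity('LookupWatermark').output.firstRow.LastWatermark, '''')",
    "type": "Expression"
  }
}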
Monitoring and Management
Tools and Techniques
Tools for Monitoring
• Azure Portal – Author and Monitor
• PowerShell
Deployment
• Use separate dev/test/prod resource groups and Data Factory
services
• Deploy to separate services using ARM Templates (until a Visual Studio extension is available)
• Can also script deployments using PowerShell or Python SDK
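A heavily simplified ARM template sketch for promoting a pipeline between environments, with the factory name supplied per environment (dev/test/prod); treat the apiVersion and property shapes as assumptions to verify against your ADF version:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": { "factoryName": { "type": "string" } },
  "resources": [
    {
      "type": "Microsoft.DataFactory/factories/pipelines",
      "apiVersion": "2017-09-01-preview",
      "name": "[concat(parameters('factoryName'), '/LoadSalesOrders')]",
      "properties": { "activities": [] }
    }
  ]
}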
Debugging
• Use the Monitor view, drill into a pipeline run, and view error messages directly on the activity
• Cannot see the result of an evaluated expression, so you may need to
be clever
• Depending on the error, you may get a message that is completely
useless. Good luck.
Deploying SSIS to Azure-SSIS Integration Runtime
• Allows deployment and execution of native SSIS packages
• Use Azure SQL Database to host SSISDB Catalog
• Limitations exist with using the Azure SDK for SSIS
• Cannot execute U-SQL jobs
• Lift-and-shift option for existing SSIS packages
Contact Info
https://www.linkedin.com/in/ericbragas93/
@ericbragas
eric@designmind.com
https://github.com/ebragas
fin.
Editor's Notes

  1. ADFv2 is more similar to SSIS than ADFv1. It's a general-purpose ELT tool with similar scheduling mechanisms. Still supports time-series sources using tumbling window schedules.
  2. http://www.jamesserra.com/archive/2012/01/difference-between-etl-and-elt/
  Other reasons I prefer ELT: set-based code; performance tune T-SQL instead of packages; allocate more resources to the database engine over SSIS; code looks cooler.
  3. https://docs.microsoft.com/en-us/azure/data-factory/compare-versions
  4. https://docs.microsoft.com/en-us/azure/data-factory/compare-versions
  5. All supported services in v1: https://docs.microsoft.com/en-us/azure/data-factory/v1/data-factory-create-datasets Supported Services in v2: https://docs.microsoft.com/en-us/azure/data-factory/concepts-datasets-linked-services
  6. Previously in ADFv1, the Activity was the unit of scheduling as well as a unit of execution. This meant that activities had the ability to execute on different schedules within a given pipeline. This is no longer the case as the Pipeline is now the unit of scheduling and is kicked off by a trigger (or on-demand).
  7. Because of the change from time-series-based scheduling to explicit scheduling, pipelines can no longer be set up to back-fill data using time-slices, i.e. every day at noon starting 3 months ago. Now schedules can only start today and go forward. This means that any back-filling logic must be explicitly defined by the developer.
  8. Types of IR: Azure, Self-Hosted, Azure-SSIS
  9. https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
  10. When debugging a pipeline, you can't really see the result of an expression, so you have to be attentive to figure out how it's resolving. https://docs.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions
  11. https://docs.microsoft.com/en-us/azure/data-factory/control-flow-system-variables
  12. https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-overview https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-multiple-tables-portal