Building Scalable, Robust Data Pipelines with Apache Airflow
❖ A brief introduction to Qubole
❖ Apache Airflow
❖ Operational Challenges in managing an ETL
❖ Alerts and Monitoring
❖ Quality Assurance in ETLs
About Qubole Data Service
❖ A self-service platform for big data analytics.
❖ Delivers best-in-class Apache tools such as Hadoop, Hive, Spark, etc., integrated into an enterprise-feature-rich platform optimized to run in the cloud.
❖ Enables users to focus on their data rather than the platform.
Data Team @ Qubole
❖ Data Warehouse for Qubole
❖ Provides Insights and Recommendations to users
❖ Just Another Qubole Account
❖ Enabling data driven features within QDS
Multi-Tenant Nature of Data
Apache Airflow For ETL
❖ Developer Friendly
❖ A rich collection of Operators, CLI utilities, and a UI to author and manage your workflows.
❖ Horizontally Scalable.
❖ Tight Integration With Qubole
Operational Challenges in the ETL World
❖ How to effectively manage ETLs in a multi-tenant environment?
❖ How do we make ETLs aware of the Data Warehouse?
Airflow Variables for ETL Configuration
❖ Stores information as key-value pairs in Airflow.
❖ Extensive support through the CLI, UI, and API for managing variables.
❖ Can be used from within the Airflow script.
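A minimal sketch of the pattern (the variable name `etl_config` and its keys are illustrative, not from the original deck). Airflow stores Variables as strings; storing JSON lets one Variable hold a whole ETL configuration. Inside a DAG one would call `Variable.get("etl_config", deserialize_json=True)` from `airflow.models`, which is equivalent to `json.loads()` on the stored string:

```python
import json

# Hypothetical value stored under the Airflow Variable key "etl_config".
stored_value = '{"source_db": "rds_prod", "target_schema": "warehouse"}'

# Airflow's Variable.get(..., deserialize_json=True) would return the
# same structure as parsing the stored JSON string directly.
config = json.loads(stored_value)

print(config["target_schema"])  # -> warehouse
```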
❖ A leaf out of Ruby on Rails: Active Record migrations.
❖ Each migration is tagged and committed as a single commit to version control, along with the ETL code.
The process is easy: run any new migrations.
❖ Traditional deployment is too messy when multiple users are handling Airflow.
❖ Data Apps for ETL deployment.
❖ Provides a CLI option like `<ETL_NAME> deploy -r <version_tag> -d <start_date>`, which copies the final script file into place.
❖ This graph has 90+ tasks of 8-9 different types.
❖ Clearly, error-prone!
IMPORTANCE OF DATA
❖ The application’s correctness depends on the correctness of its data.
❖ Increase confidence in data by quantifying data quality.
❖ Correcting existing data can be expensive: prevention is better than cure!
❖ Stopping critical downstream tasks if the data is invalid.
❖ Monitor dips, peaks, anomalies.
❖ Hard problem!
❖ Not real time.
❖ One size doesn’t fit all - Different ETLs manipulate data in different ways.
❖ Difficult to maintain.
Using Apache Airflow check operators:
❖ A check operator for validation queries running on the warehouse.
❖ Fail the operator if the validation fails.
Limitations and Enhancements to open-source Apache Airflow
1. Compare data across engines
Problem: Airflow check operators required pass_value to be defined before the ETL starts.
Use case: validating data import logic.
Solution: make pass_value an Airflow template field. This way it can be configured at run time; once it is a template field, the pass value can be injected through multiple mechanisms.
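A sketch of why templating `pass_value` helps: the expected value is resolved at run time from the task context instead of being hard-coded when the DAG is authored. Airflow renders template fields with Jinja; the tiny `render` stand-in below is an assumption used only to keep the example self-contained:

```python
def render(template: str, context: dict) -> str:
    # Stdlib stand-in for Jinja templating: substitute {{ key }} markers.
    for key, value in context.items():
        template = template.replace("{{ %s }}" % key, str(value))
    return template

# pass_value is declared as a template when the DAG is written...
pass_value_template = "{{ source_row_count }}"

# ...and resolved just before the check runs, e.g. from an upstream task
# that counted rows in the source engine.
context = {"source_row_count": 41523}
pass_value = int(render(pass_value_template, context))

print(pass_value)  # -> 41523
```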
2. Validate multiline results
Problem: currently, Apache Airflow check operators consider a single row for validation.
Use case: run group-by queries and compare each of the values against the pass_value.
Solution: QuboleCheckOperator adds a `results_parser_callable` parameter, a parser function passed to the check. The function it points to holds the logic to return the list of records on which the checks are performed.
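A sketch of the `results_parser_callable` idea: the callable turns the raw query result into a list of records, and the check is then applied to every record rather than only the first row. The row shape and function names here are illustrative, not the operator's real internals:

```python
def parse_results(raw_rows):
    # Hypothetical parser: raw_rows mimics a cursor result of
    # (group_key, value) tuples from a group-by query; keep the values.
    return [row[1] for row in raw_rows]

def check_all(records, pass_value):
    # Apply the check to every record, not just the first row.
    failures = [r for r in records if r != pass_value]
    if failures:
        raise ValueError("check failed for values: %s" % failures)
    return True

rows = [("acct_1", 24), ("acct_2", 24), ("acct_3", 24)]
print(check_all(parse_results(rows), 24))  # -> True
```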
Integration of Apache Airflow Check Operators with Qubole ETLs
ETL # 1: Data Ingestion. Imports data from RDS tables into the Data Warehouse for analysis.
Problems: mismatch with source data:
1. Data duplication.
2. Data missing for a certain duration.
Checks: count comparison across the two data stores, source and destination.
How checks have helped us: verify and rectify the upsert logic (which is not a plain copy of the source).
PS: Runtime fetching of expected values!
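The count-comparison check above can be sketched as a single assertion between the two stores; catching both duplication (destination too large) and missing data (destination too small). The function name and counts are illustrative:

```python
def compare_counts(source_count: int, dest_count: int) -> bool:
    # Any mismatch signals either duplication (dest > source) or
    # missing rows (dest < source) in the warehouse copy.
    if source_count != dest_count:
        raise ValueError(
            "row count mismatch: source=%d dest=%d" % (source_count, dest_count)
        )
    return True

print(compare_counts(41523, 41523))  # -> True
```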
ETL # 2: Data Repartitioning. Repartitions a day’s worth of data into hourly partitions.
Problems:
1. Data ending up in a single partition field (the default Hive partition).
2. Wrong ordering of values in fields.
Checks:
1. The number of partitions created is 24 (one for every hour).
2. Check the value of the critical field, “source”.
How checks have helped us: verify and rectify the repartitioning logic.
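A sketch of the partition checks above: a day’s data should land in exactly 24 hourly partitions, and none of it should fall into Hive’s default partition (`__HIVE_DEFAULT_PARTITION__`, which is where rows with a NULL partition value go). The partition naming is illustrative:

```python
def check_partitions(partitions):
    # One partition per hour of the day.
    if len(partitions) != 24:
        raise ValueError("expected 24 hourly partitions, got %d" % len(partitions))
    # Rows with a NULL partition key end up in Hive's default partition.
    if "__HIVE_DEFAULT_PARTITION__" in partitions:
        raise ValueError("data fell into the default Hive partition")
    return True

hours = ["hour=%02d" % h for h in range(24)]
print(check_partitions(hours))  # -> True
```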
ETL # 3: Cost Computation. Computes Qubole Compute Units.
Situation: we are narrowing down the granularity of cost computation from daily to hourly.
How checks have helped us: to monitor the new data and raise an alarm in case of a mismatch in trends between the old and new data.
ETL # 4: Parses customer queries and outputs table usage information.
Problems:
1. Data missing for a customer account.
2. Data loss due to different syntaxes across engines.
3. Data loss due to query syntax changes across different versions of data engines.
Checks:
1. Group by account IDs; if any of them is 0, raise an alert.
2. Group by engine type and account IDs; if the error % is high, raise an alert.
How checks have helped us:
- Insights into the amount of data loss.
- Provided feedback that helped us make syntax checking more robust.
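The two group-by checks above can be sketched as follows; the 5% error threshold and the account names are assumed values, not from the original deck:

```python
def check_accounts(counts_by_account):
    # Alert if any account produced zero parsed rows (problem 1).
    empty = [a for a, c in counts_by_account.items() if c == 0]
    if empty:
        raise ValueError("no data for accounts: %s" % empty)
    return True

def check_error_rate(errors, total, threshold=0.05):
    # Alert if the parse-error percentage for an (engine, account)
    # group exceeds the threshold (problems 2 and 3).
    rate = errors / total
    if rate > threshold:
        raise ValueError("error rate %.1f%% exceeds threshold" % (rate * 100))
    return True

print(check_accounts({"acct_1": 120, "acct_2": 87}))  # -> True
print(check_error_rate(errors=2, total=100))          # -> True
```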
❖ Ability to plug in different alerting mechanisms.
❖ Dependency management and failure handling.
❖ Ability to parse the output of the assert query in a user-defined manner.
❖ Run-time fetching of the pass_value against which the comparison is made.
❖ Ability to generate failure/success reports.
❖ One size doesn’t fit all: estimating data trends is a difficult validation task.
Source code has been contributed to Apache Airflow:
AIRFLOW-2228: Enhancements in Check operator
AIRFLOW-2213: Adding Qubole Check Operator
In data we trust!
You can find us at: