Come hear about T-Mobile's success story!
MongoDB was our choice for serving customers at high velocity and improving customer experience with lightning-fast responses to complex read queries.
Before
[Architecture diagram: user interfaces (Retail, Self-Care, Care) call Read & Write SOAP services on a monolithic EIP app, hosted on physical Unix boxes and backed by the Oracle SQL EIP DB.]
Challenges
1) Complex SOAP services
2) No focused services
3) Hard to expand with new capabilities
4) Hard to monetize the services for new customers
5) Need for faster delivery
6) Complex PL/SQL query needs
7) Service latency and throughput
8) Security (transport, access, data, threat, script ingestion, lock/unlock, quota, etc.)
9) Scalability
Digital API First Approach
[Architecture diagram: APIGEE fronts the user interfaces (Retail, Self-Care, Care, Loan History, Loan Profile). New read-only REST services (Loan MS and Lease MS, serving the Loan History and Loan Profile "All Gets") sit alongside the existing Read & Write SOAP services of the monolithic app on physical Unix boxes, with SAM (Simple Account Management) and the Oracle SQL EIP DB behind them.]
Hook up the new REST service (version 1) as a wrapper around the SOAP service read data.
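The "wrap around" step above can be sketched as a thin read facade: the v1 REST endpoint simply delegates to the existing SOAP read client. This is a minimal illustration (class and method names are hypothetical, and the SOAP client is abstracted as a function), not the actual T-Mobile service:

```java
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch: a v1 REST read service that wraps the existing SOAP
// read service, so reads can move behind APIGEE without touching the write path.
public class LoanReadFacade {
    // Stand-in for the generated SOAP client: accountNumber -> loan profile payload.
    private final Function<String, Map<String, Object>> soapReadClient;

    public LoanReadFacade(Function<String, Map<String, Object>> soapReadClient) {
        this.soapReadClient = soapReadClient;
    }

    // Body of a GET /loans/{accountNumber} handler: validate, delegate, pass through.
    public Map<String, Object> getLoanProfile(String accountNumber) {
        if (accountNumber == null || accountNumber.isEmpty()) {
            throw new IllegalArgumentException("accountNumber is required");
        }
        return soapReadClient.apply(accountNumber);
    }
}
```

Because the facade owns the REST contract from day one, the SOAP delegate can later be swapped for a MongoDB read without changing the API consumers see.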
Data Migration & Reconcile
[Diagram: loans and lease data are extracted from the Oracle SQL EIP DB, migrated into collections in the Mongo cluster, and reconciled after insertion.]
Data prep into SQL DB (the big time sink):
• Design a flat table structure with around 80 columns, as required
• Fill the flat table (over 90 million records) through a number of transformations with complex queries
• Create partitions on the flat table, 5 million records each, for faster retrieval
Prepare the data extraction environment:
• Spark 2.4
• Spring Boot app
• PKS space
• Optimize the Spark data load on K8s
Reconciliation jobs: Spring jobs that auto-run at configured intervals and send notifications
Prepare the MongoDB environment:
• Standard setup includes the Spark connector
• Create space and collections
• Create indexes
• Set up sharding
• High-availability replication
• Security
• Monitoring tools
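The reconciliation jobs above can be sketched as a periodic count comparison between source and target. The suppliers and notifier below are hypothetical stand-ins for the Oracle flat-table count query, the Mongo collection count, and the notification hook:

```java
import java.util.function.Consumer;
import java.util.function.LongSupplier;

// Hypothetical sketch of one reconciliation pass: compare record counts from
// the Oracle flat table and the Mongo collection, and notify when they drift.
public class ReconJob {
    public static boolean reconcile(LongSupplier sourceCount, LongSupplier targetCount,
                                    Consumer<String> notifier) {
        long src = sourceCount.getAsLong();
        long tgt = targetCount.getAsLong();
        if (src != tgt) {
            notifier.accept("count mismatch: source=" + src + " target=" + tgt);
            return false;
        }
        return true;
    }
}
```

In the deck's setup this would run as a Spring scheduled job at configured intervals; a count check is only a first-line signal, and field-level checksums catch subtler drift.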
Mongo ETL Spark Java Code Snippet
Get Mongo Spark Session on Target MongoDB Cluster
return SparkSession.builder()
    .master(masterUri)
    .appName(appName)
    .config("spark.mongodb.output.uri", mongoUrl)
    .config("spark.driver.cores", "4")
    .config("spark.executor.memory", "2g")
    .config("spark.mongodb.output.database", mongoEntity)
    .config("spark.mongodb.output.collection", collectionName)
    .config("spark.mongodb.output.maxBatchSize", "5000000")
    .config("spark.mongodb.output.shardKey", shardKey)
    .config("spark.mongodb.keep_alive_ms", "120000")
    .getOrCreate();
Get DataFrameReader & load data into Datasets from the source Oracle DB
Map<String, String> map = new HashMap<>();
map.put("url", jdbcUrl);
// Query for the dbtable option, e.g.:
//   SELECT /*+ full(F) parallel(F,16) */ * FROM EIP_STGS3 F
//   WHERE sno BETWEEN 1 AND 9999999 AND FINANCINGMODELTYPE = 'LOAN'
map.put("dbtable", dbtable);
map.put("user", user);
map.put("password", password);
map.put("driver", driver);
map.put("numPartitions", numPartitions);
map.put("partitionColumn", partitionColumn);
map.put("lowerBound", lowerBound);
map.put("upperBound", upperBound);
map.put("pushDownPredicate", pushDownPredicate);
map.put("fetchsize", fetchsize);
Dataset<Row> jdbcDF = sparkSession.read().format("jdbc").options(map).load();
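The numPartitions / partitionColumn / lowerBound / upperBound options above control how Spark splits the Oracle read into parallel range queries. The sketch below (illustrative, not Spark's exact internal code) shows the kind of range predicates that splitting produces on a numeric column like the flat table's SNO:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of JDBC range partitioning: split [lower, upper] on the
// partition column into numPartitions ranges; each predicate becomes one
// parallel SELECT against the source table.
public class JdbcPartitionPlanner {
    public static List<String> predicates(String column, long lower, long upper, int numPartitions) {
        List<String> preds = new ArrayList<>();
        long stride = (upper - lower) / numPartitions;
        long start = lower;
        for (int i = 0; i < numPartitions; i++) {
            if (i == numPartitions - 1) {
                preds.add(column + " >= " + start); // last range is open-ended
            } else {
                long end = start + stride;
                preds.add(column + " >= " + start + " AND " + column + " < " + end);
                start = end;
            }
        }
        return preds;
    }
}
```

Aligning these ranges with the 5-million-row partitions created on the flat table is what lets each executor scan a single Oracle partition.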
Mongo ETL Spark Java Code Snippet
Map the SQL columns to the required attribute data types
Dataset<Row> jdbcDF1 = jdbcDF
    .withColumn("accountNumber", functions.col("AC_ACCOUNTNUMBER").cast(DataTypes.StringType))
    .drop("AC_ACCOUNTNUMBER")
    .drop("SNO");
Construct the JSON structure with the domain object
jdbcDF1 = jdbcDF1
    .withColumn("_class", functions.lit("com.xxxx.finance.migration.model.Loans"))
    .withColumn("nominalInterestRate", functions.col("annualPercentageRate"))
    .withColumn("account", functions.struct(
        functions.col("accountNumber"), functions.col("billCycle"),
        functions.col("accountStatus"), functions.col("accountSubType"),
        functions.col("accountType")))
    .drop("billCycle").drop("accountStatus").drop("accountSubType").drop("accountType");
Save the data into the target Mongo cluster
MongoSpark.write(jdbcDF1).option("collection", collectionName).mode(SaveMode.Append).save();
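Under the struct construction above, the flat account columns end up nested under an account subdocument in each Mongo document; the shape is roughly as follows (field values are illustrative, not real data):

```json
{
  "_class": "com.xxxx.finance.migration.model.Loans",
  "nominalInterestRate": 24.99,
  "annualPercentageRate": 24.99,
  "account": {
    "accountNumber": "123456789",
    "billCycle": "15",
    "accountStatus": "ACTIVE",
    "accountSubType": "I",
    "accountType": "LOAN"
  }
}
```

The `_class` field follows the Spring Data convention, so the read microservices can map these documents straight onto the same domain model.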
Mongo ETL Spark Log Snippet
Spark Master Log :
2019-03-28 12:20:17 INFO Master:2566 - Started daemon with process name: 25430@prdplbgat0001
2019-03-28 12:20:18 INFO Master:54 - I have been elected leader! New state: ALIVE
.
.
2019-03-28 12:21:36 INFO Master:54 - Registering worker 10.135.50.121:37489 with 1 cores, 2.0 GB RAM
2019-03-28 12:21:36 INFO Master:54 - Registering worker 10.135.50.121:45031 with 1 cores, 2.0 GB RAM
2019-03-28 12:21:36 INFO Master:54 - Registering worker 10.135.50.121:37956 with 1 cores, 2.0 GB RAM
2019-03-28 12:21:36 INFO Master:54 - Registering worker 10.135.50.121:43157 with 1 cores, 2.0 GB RAM
.
.
2019-03-28 12:33:00 INFO Master:54 - Registering app spark
Mongo ETL Spark Log Snippet
Slave 2
2019-03-28 12:21:33 INFO Worker:2566 - Started daemon with process name: 25742@prdplbgat0001
2019-03-28 12:21:35 INFO Utils:54 - Successfully started service 'sparkWorker' on port 45031.
2019-03-28 12:21:35 INFO Worker:54 - Starting Spark worker 10.135.50.121:45031 with 1 cores, 2.0 GB RAM
2019-03-28 12:33:00 INFO Worker:54 - Asked to launch executor app-20190328123300-0000/1 for spark
2019-03-28 12:50:27 INFO Worker:54 - Asked to kill executor app-20190328123300-0000/1
2019-03-28 12:50:27 INFO ExecutorRunner:54 - Runner thread for executor app-20190328123300-0000/1 interrupted
2019-03-28 12:50:27 INFO ExecutorRunner:54 - Killing process!
2019-03-28 12:50:27 INFO Worker:54 - Executor app-20190328123300-0000/1 finished with state KILLED exitStatus 1
Slave 3
2019-03-28 12:21:33 INFO Worker:2566 - Started daemon with process name: 25738@prdplbgat0001
2019-03-28 12:21:35 INFO Utils:54 - Successfully started service 'sparkWorker' on port 43157.
2019-03-28 12:21:35 INFO Worker:54 - Starting Spark worker 10.135.50.121:43157 with 1 cores, 2.0 GB RAM
2019-03-28 12:21:35 INFO Worker:54 - Running Spark version 2.4.0
2019-03-28 12:21:36 INFO Worker:54 - Successfully registered with master spark://10.135.50.121:7077
2019-03-28 12:33:00 INFO Worker:54 - Asked to launch executor app-20190328123300-0000/2 for spark
2019-03-28 12:50:27 INFO Worker:54 - Asked to kill executor app-20190328123300-0000/2
2019-03-28 12:50:27 INFO ExecutorRunner:54 - Runner thread for executor app-20190328123300-0000/2 interrupted
2019-03-28 12:50:27 INFO ExecutorRunner:54 - Killing process!
Data Load and Data Sync
[Architecture diagram: mass data load from the Oracle SQL DB into the Mongo cluster, with reconcile jobs. The monolithic app publishes loan/lease data messages to RMQ (event sourcing); Loans and Lease listeners consume the real-time messages and write to the Loans Data and Lease Data collections in the Mongo cluster. Read-all GET microservices (Loan History "All Get", Loan Profile "All Get") serve the user interfaces through APIGEE on Cloud Foundry, while the Read & Write SOAP services remain on physical Unix boxes alongside SAM (Simple Account Management).]
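The loans/lease listeners in the diagram can be sketched as an idempotent upsert loop. In this illustration RabbitMQ and MongoDB are replaced by an in-memory queue and map, and the message shape is hypothetical:

```java
import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the event-sourcing sync path: drain loan/lease change
// messages (RMQ in the deck; any Queue here) and upsert each document into the
// read store (the Mongo collection in the deck; a map here), keyed by account
// number so redelivered messages are harmless.
public class LoanListener {
    private final Map<String, Map<String, Object>> readStore = new ConcurrentHashMap<>();

    public void drain(Queue<Map<String, Object>> messages) {
        Map<String, Object> msg;
        while ((msg = messages.poll()) != null) {
            readStore.put((String) msg.get("accountNumber"), msg); // upsert by key
        }
    }

    public Map<String, Object> read(String accountNumber) {
        return readStore.get(accountNumber);
    }
}
```

Keying the upsert on the account number is what makes message redelivery safe, which matters when the mass load and the live message stream overlap.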
Microservice Canary Transition to MongoDB
[Architecture diagram: same topology as the previous slide. Read traffic is gradually shifted from the SOAP read service to the Mongo-backed read-all GET microservices behind APIGEE, while reconcile jobs keep the Mongo cluster in sync with the Oracle SQL DB and loan/lease messages flow through RMQ to the Loans and Lease listeners.]
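A canary transition like the one above needs a routing rule. One common approach (a sketch, not necessarily what T-Mobile used) is percentage-based routing with a stable hash of the account number, so each customer consistently hits the same side during the rollout:

```java
// Illustrative canary router: send a configurable percentage of read traffic to
// the Mongo-backed REST path and the rest to the legacy SOAP path. Hashing the
// account number keeps a given customer on one side for the whole rollout.
public class CanaryRouter {
    private final int mongoPercent; // 0..100

    public CanaryRouter(int mongoPercent) {
        if (mongoPercent < 0 || mongoPercent > 100) {
            throw new IllegalArgumentException("percent must be 0..100");
        }
        this.mongoPercent = mongoPercent;
    }

    public String route(String accountNumber) {
        int bucket = Math.floorMod(accountNumber.hashCode(), 100); // stable 0..99
        return bucket < mongoPercent ? "mongo-read" : "soap-read";
    }
}
```

Raising the percentage in small steps, while the reconcile jobs confirm both stores agree, limits the blast radius of any data-quality surprise.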
Better Customer Experience
[Architecture diagram: final state. All Loan History and Loan Profile reads are served by the read-all GET microservices from the Mongo cluster through APIGEE on Cloud Foundry; writes continue through the Read & Write SOAP services to the Oracle SQL DB; and recon jobs keep the system of record and the read DB in sync, with loan/lease messages flowing over RMQ to the Loans and Lease listeners.]
Lessons
§ Build data quality tests upfront: a few data quality issues surfaced in production that weren't identified in Dev/QA, and we ended up doing 3 canary deployments to address them
§ Ensure data is reconciled within a short window to avoid sync issues
§ The data migration job failed a couple of times due to network issues; ensure stable, performant network connectivity and build in some redundancy if possible
§ Leverage the MongoDB Enterprise encrypted storage engine to encrypt data at rest; secure PII and CPNI data
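The first lesson can be made concrete with a small validation gate run over migrated records in Dev/QA; the field names and rules here are illustrative, not the actual checks used:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical data-quality gate: validate each migrated record so bad rows
// surface in Dev/QA rather than forcing canary redeployments in production.
public class DataQualityCheck {
    public static List<String> validate(Map<String, Object> record) {
        List<String> errors = new ArrayList<>();
        Object acct = record.get("accountNumber");
        if (acct == null || acct.toString().isEmpty()) {
            errors.add("missing accountNumber");
        }
        Object type = record.get("financingModelType");
        if (!"LOAN".equals(type) && !"LEASE".equals(type)) {
            errors.add("unexpected financingModelType: " + type);
        }
        return errors;
    }
}
```

Running such checks as part of the migration pipeline, and failing the load on a threshold of errors, turns data quality into a test rather than a production discovery.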
What effect did it have on the Business?
§ Zero-downtime deployments: small daytime changes as common practice
§ Fewer and shorter incidents: faster resolutions
§ Faster apps: over 60% reduction in service response time
§ More frequent changes: 20x increase in planned changes