NEW TECHNOLOGY FOUNDATIONS FOR THE BALANCE SHEET OPTIMIZATION
EBA’S BANKING SECTOR SUMMARY LOOKS GOOD!
PROFITABILITY REMAINS A CONCERN FOR THE EU BANKING SECTOR…
High competition
Regulatory constraints
Volatile environment
“Banks that prove adroit in managing their liquidity, risk, and balance sheets will have a clear advantage over their peers. By adopting a new treasury operating model—one that gives a clearer mandate, centralized governance, and enhanced system and data capabilities—treasuries can improve their collateral, liquidity, and interest-rate maturity transformation. And those changes to the operating model can help treasuries boost net interest income (NII) by 10% to 15% and reduce balance sheet consumption by 10% to 20%” (Boston Consulting Group)
NEW MODELS AND ALGORITHMS ARE THE NEW FUEL FOR BETTER BUSINESS DECISIONS
• LINEAR/NON-LINEAR PROGRAMMING FOR DETERMINISTIC PROBLEMS
• STOCHASTIC PROGRAMMING (RECURSIVE / DEF / MULTI-STAGE, ETC.)
• UNDERLYING METHODS (INCLUDING NEURAL NETWORKS, ETC.)
DO WE NEED THE NEW QUALITY OF TECHNOLOGY TO SUPPORT THE TREASURY OPTIMIZATION?
BUT WHY?
• NEW MODELS REQUIRE SIGNIFICANTLY MORE DATA
• NEED TO CONTROL EMERGING MODEL RISK
• INCREASING DEMAND FOR COMPUTATIONAL POWER
• ONE SINGLE MODEL CAN'T HELP TO OPTIMIZE THE REAL BALANCE SHEET OF A UNIVERSAL BANK
• SKYROCKETING COMPLEXITY OF MODELS AND INFRASTRUCTURE MAINTENANCE
AN EXAMPLE OF SCALE… TREASURY AT A TIER-1 RUSSIAN BANK
• PART OF THE WORLD'S TOP-100 RANKING, ~500 BLN USD AUM
• 137 TECH PEOPLE IN THE ALM TECHNOLOGY TEAM, ~50 PEOPLE DEDICATED TO DYNAMIC BS
• ~300-350 MODELS USED IN DYNAMIC BALANCE CONSTRUCTION
• ~14 BLN CONTRACT EVENTS PROCESSED DURING DYNAMIC BS FORECASTING
• A DEDICATED TEAM OF 5 DATA SCIENTISTS
CONSIDERATIONS OF THE STRAIGHTFORWARD OPTIMIZATION APPROACH
• NEED TO IDENTIFY A REALISTIC IMPACT FUNCTION OF FUTURE EVENTS ON DECISIONS AND ITS COST (PENALTY FUNCTION / TECHNOLOGY MATRIX)
• NEED TO TAKE INTO ACCOUNT CUSTOMER BEHAVIOR IMPACT (PENALTY FUNCTION + MANAGEMENT ACTIONS)
• NEED TO GENERATE CONSISTENT UNDERLYING SCENARIOS (MACRO, MARKETS, ASSET-CLASS LEVEL)
• NEED TO TAKE INTO ACCOUNT RISK-WEIGHTED PARAMETERS (RoRWA/RoROE) – NEED TO MODEL PROVISIONS AND COST OF FUNDING
• NEEDS TO BE INTEGRATED INTO THE BANK'S STRESS TESTING FRAMEWORK (DETERMINISTIC FORECASTING)
EVEN BY MEANS OF ADVANCED NON-LINEAR OPTIMIZATION ALGORITHMS (E.G. NEURAL NETWORKS) IT CAN BE CHALLENGING TO GET IMPLEMENTABLE RESULTS
E.G. ICAAP & ILAAP PARAMETERS + ECONOMIC CAPITAL
E.G. OPEX, MACROECONOMIC ANALYSIS, FINANCIAL MARKETS ANALYSIS, ETC.
EARLY WITHDRAWALS, NON-PERFORMANCE, ETC.
COST OF BS RESTRUCTURING
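The straightforward deterministic formulation can be illustrated with a deliberately simplified sketch: maximize portfolio yield (a stand-in for NII) subject to a fixed balance-sheet size, a liquidity floor and an RWA cap. All figures, asset classes and limits below are hypothetical; a real formulation would add the behavioral, provisioning and funding-cost terms listed above.

```python
# Toy deterministic balance-sheet allocation via linear programming.
# All numbers (yields, risk weights, limits) are illustrative only.
from scipy.optimize import linprog

# Decision variables: x = [loans, bonds, cash], in currency units.
yields = [0.05, 0.03, 0.01]     # asset returns (stand-in for NII drivers)
c = [-y for y in yields]        # linprog minimizes, so negate the yields

A_ub = [
    [1.0, 0.2, 0.0],            # RWA cap: 1.0*loans + 0.2*bonds <= 60
    [0.0, -1.0, -1.0],          # liquidity floor: bonds + cash >= 30
]
b_ub = [60.0, -30.0]
A_eq = [[1.0, 1.0, 1.0]]        # balance-sheet size is fixed
b_eq = [100.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
loans, bonds, cash = res.x
print(f"allocation: loans={loans:.1f}, bonds={bonds:.1f}, cash={cash:.1f}")
print(f"yield: {-res.fun:.2f}")
```

Replacing the fixed yields with scenario-dependent returns and adding recourse decisions is what turns this toy problem into the stochastic-programming formulations mentioned earlier.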
STOCHASTIC SCENARIO MODELING
DETERMINISTIC SCENARIO MODELING
INTEGRATION LAYER
OPTIMIZATION MODELS
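As a minimal illustration of what the stochastic scenario modeling layer produces, here is a Monte Carlo generator of short-rate paths under a Vasicek-type mean-reverting process. The parameters are invented for the sketch; a production layer would calibrate them to market data and generate consistent macro, market and asset-class scenarios.

```python
# Minimal Monte Carlo short-rate scenario generator (Vasicek-style dynamics).
# Parameters are purely illustrative, not calibrated.
import numpy as np

def simulate_short_rate(r0=0.01, kappa=0.5, theta=0.03, sigma=0.01,
                        years=10, steps_per_year=12, n_paths=1000, seed=42):
    """Euler simulation of dr = kappa*(theta - r)*dt + sigma*dW."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / steps_per_year
    n_steps = years * steps_per_year
    paths = np.empty((n_paths, n_steps + 1))
    paths[:, 0] = r0
    for t in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        paths[:, t + 1] = (paths[:, t]
                           + kappa * (theta - paths[:, t]) * dt
                           + sigma * dw)
    return paths

paths = simulate_short_rate()
print(paths.shape)                    # (1000, 121)
print(round(paths[:, -1].mean(), 4))  # terminal mean reverts toward theta
```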
THE 80/20 MACHINE LEARNING RULE:
4 DAYS OF EACH BUSINESS WEEK ARE SPENT ON GATHERING DATA, WHILE ONLY ONE IS SPENT ON RUNNING ALGORITHMIC MODELS
…AND PROBABLY ANOTHER ½ DAY IS SPENT ON DEPLOYMENT AND RUNTIME MAINTENANCE…
BS OPTIMIZATION COMPLEXITY (AND BEYOND): THE NEW TECHNOLOGY FOUNDATION
• 'MODEL DEVOPS' LAYER
• DATA LAKE & SANDBOX MANAGEMENT LAYER
• MODEL GOVERNANCE LAYER
• INTEGRATION WITH BSM/ERM
GENERIC ARCHITECTURE
External sources: Credit Bureaus; Rating Agencies & Data vendors; Financial Data; News Feeds; Social Media; Financial Markets (prices & events); streaming data in real time
Internal sources and systems: Treasury FO & BO; Loan Origination; Core Banking Systems; DWH…
Neoflex DataGram platform (cloud/on-premise): Sandbox Layer; Data Lake Management; Model Deployment & Execution; Model DevOps/Governance (Modeling Toolkit, Data Quality, Model Quality Metrics, Models Repository); streaming data processing with business rules, machine learning & AI; streaming analytics and alert generation in real time
Service layer and consumers: Corporate Incident Management System; Business Intelligence; Integration Services; Online Alerts; Enterprise Risk Management – sensitivities for dynamic stress testing, Fund Transfer Pricing, ALM & scenario analysis, actual Basel II/III ratios, RAROC/RoRWA, Limits Management, scenario modeling and forecasting
Key data flows: limits for the BS optimization model; input parameters for the OM & current BS; BS exposure deltas to create the dynamic BS in FR
INTEGRATED DATA LAKE MANAGEMENT: ESSENTIAL FEATURES
MODEL DEVOPS IN A NUTSHELL
• DYNAMIC HARDWARE SCALING
• DEPLOYMENT LIBRARIES
• CONTINUOUS MODEL INTEGRATION, TESTING AND DELIVERY
EACH MODEL (OPTIMIZATION, ML/AI AND STATISTICAL MODELS) IS PLACED INTO A CONTAINER THAT RUNS ON A CLOUD-BASED PLATFORM
MODEL CONTAINER – WHAT'S INSIDE?
EACH MODEL CONTAINER INCLUDES THE FULL TECHNOLOGY STACK TO RUN THE MODEL:
• OPERATING SYSTEM
• PYTHON WITH ML LIBRARIES (E.G. SCIKIT-LEARN, ETC.)
• THE FLASK MICROFRAMEWORK
• EXECUTABLE CODE OF THE SERVICE
• THE TRAINED MODEL
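A minimal sketch of the service code such a container might hold, assuming the stack listed above (Python, scikit-learn, Flask). The endpoint name, payload shape and toy model here are invented for illustration, not taken from any real deployment.

```python
# Sketch of a model-serving microservice: a trained scikit-learn model
# wrapped in a Flask app. Endpoint and payload format are hypothetical.
import numpy as np
from flask import Flask, jsonify, request
from sklearn.linear_model import LinearRegression

# Stand-in for the "trained model" shipped inside the container: y = 2x + 1.
X_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([1.0, 3.0, 5.0, 7.0])
model = LinearRegression().fit(X_train, y_train)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [[x1], [x2], ...]}.
    features = np.array(request.get_json()["features"])
    return jsonify({"predictions": model.predict(features).tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # port exposed by the container
```

Because the model, the service code and their dependencies travel together in one image, the DevOps pipeline can test, deploy and scale each model independently.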
SAMPLE DEVOPS PIPELINE FOR THE MODEL
THIS TYPE OF INFRASTRUCTURE ALLOWS YOU TO CREATE MODELS THAT USE AND CONTROL UNDERLYING MODELS
CASE STUDY: NEW TECHNOLOGY FOUNDATION FOR ENTERPRISE RISK MANAGEMENT
“We have gained successful experience in integrating disparate external and internal data into a single information environment to improve the quality and speed of risk assessment, using combined approaches to analyzing and processing information: from classical statistical analysis to machine learning methods…”
Maxim Kondratenko, Board Member, VTB
https://www.neoflex.ru/press-center/publications/tsifrovoe-budushchee-kak-vtb-vystraivaet-protsess-transformatsii-biznesa/?sphrase_id=1896
BIG DATA (CLOUDERA HADOOP, SPARK, HIVE, IMPALA)
DATA SCIENCE & ANALYTICS (ZEPPELIN, JUPYTER, ML LIBRARIES, SPAGOBI)
PROJECT 'KANTOR' – OVERVIEW
TRANSITION FROM DESCRIPTIVE AND 'DIAGNOSTIC' (A.K.A. "HOUSTON, WE HAVE A PROBLEM!") ANALYTICS TO PREDICTIVE MACHINE LEARNING MODELS WITHIN THE RISK MANAGEMENT UNIVERSE OF THE BANK
• INCREASE THE CREDIT QUALITY OF THE LOAN PORTFOLIO USING ENHANCED CREDIT RISK MODELS AND AN EARLY WARNING FRAMEWORK
• DECREASE MANUAL WORKLOAD FOR ANALYSTS
• CONTRIBUTION TO CUSTOMER BEHAVIORAL MODELING
INTERNAL SOURCES (DWH, СРР, …) AND EXTERNAL SOURCES FEED INTERNAL AND EXTERNAL DATA INTO THE DATA LAKE, THE DATA SCIENCE SANDBOX, RISK DATA MARTS AND RISK ANALYTIC WORKSPACES
DATA SCIENCE & ARTIFICIAL INTELLIGENCE: SOME ACHIEVEMENTS SO FAR…
• EXTERNAL DATA (YANDEX.METRICA, GOOGLE ANALYTICS) + STAT. TESTS AND RANDOM FOREST (FEATURE IMPORTANCE) → CUSTOMER BEHAVIOUR HYPOTHESIS, SIGNIFICANT FACTORS
• DWH DATA (C/A'S OF LARGE AND MIDDLE-SIZED CLIENTS) + RIDGE REGRESSION AND SARIMA → BALANCES FORECAST
• DWH DATA (POSTINGS DESCRIPTION, GUARANTEES ISSUED) + BigARTM (ADDITIVE REGULARIZATION TOPIC MODEL) → OPERATIONS CLASSIFICATION BY POSTING TYPE
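The feature-importance step mentioned above can be sketched as follows on synthetic data. The feature names and the signal are fabricated purely to show the mechanics of ranking candidate behavioral drivers, not drawn from the actual project.

```python
# Sketch: ranking candidate drivers of client balances with random-forest
# feature importances. Data and feature names are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
features = {
    "site_visits": rng.normal(size=n),  # strong driver in this synthetic setup
    "day_of_week": rng.normal(size=n),  # pure noise
    "fx_rate": rng.normal(size=n),      # pure noise
}
X = np.column_stack(list(features.values()))
y = 3.0 * features["site_visits"] + 0.1 * rng.normal(size=n)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
for name, imp in zip(features, forest.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

In this setup the synthetic driver dominates the importance ranking, which is exactly the kind of output used to form significant-factor hypotheses about customer behaviour.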
CONTACT US!
Alexey Antonov, Ph.D
Director | Capital Markets and Risk
Management Technology
E: aantonov@neoflex.ru
M: +79852884394

New technology foundations for the balance sheet optimization


Editor's Notes

  • #2 - Thank you for joining the event, and thanks a lot to our partners at Finastra for inviting us. - Following up on the previous speakers, who highlighted the amazing opportunities that new models and algorithms bring to the ALM world, and since I'm a representative of an IT fintech company, I'd like to speak today about the technology that would make it all possible. - But before we do a deep dive, let's have a high-level look at the current state of the art in the banking sector.
  • #3 - On this slide you can see some charts featuring the key indicators of the European banking sector, taken from the recent EBA risk dashboard. As you can see, the summary looks good, and we can say that the banking sector has, to some degree, recovered from the consequences of the financial crises of 2008/2009 and 2014. Banks have managed to build up the capital required to cover evolving regulatory requirements, and you can see that the CET1 ratio is going up. Banks have improved their risk management policies, which resulted in a decrease in non-performing exposures. Many banks came through the recent EBA stress test well, including Polish banks from what I saw. ROE has generally stabilized at a level of 6 to 10%.
  • #4 - But profitability remains a concern – this is actually the EBA's observation. - There are a number of reasons behind that. - First of all, increasing regulatory pressure – you have all just experienced the impact of IFRS 9, and probably some of you have faced a P&L impact due to the ECL provision build-up (in the global banking sector, the provision build-up is generally around 10%). Secondly, competition is becoming tougher, including pressure from fintech companies trying to penetrate the financial services market. Finally, we live in a volatile world and face many risks, such as political risks, that may affect the business.
  • #5 - So the environment is tough, but the good news is that ALM optimization has great potential to improve a bank's profitability. - For instance, a recent BCG (Boston Consulting Group) study says that proper treasury optimization may unlock a 10-15% increase in NII, which is especially important for universal banks.
  • #6 - It's clear that to unlock this value we need models that will help treasurers make better business decisions. Specifically, when we speak about BS optimization, we usually make use of linear/non-linear optimization techniques or stochastic programming models (which may be of various types – e.g. anticipative, adaptive, recursive, Deterministic Equivalent Formulation), as well as some of the underlying techniques. A number of approaches to the optimization problem relevant to the ALM world can be found in the literature, but they all have one thing in common – they're all sophisticated and require heavy computations.
  • #7 This leads me to the main statement that I would like to stress in today's presentation – TO MAKE USE OF THOSE COMPLEX MODELS, BANKS NEED TO SHIFT TO THE NEW TECHNOLOGY LEVEL. Let me briefly point out a few things that we see as drivers of this shift (READ SLIDE)
  • #8 I can give you an idea of the problem/complexity scale using the example of one of our clients, Russia's oldest and largest bank – Sberbank. READ THE FACTS FROM SLIDE. So you can get a feel for how difficult it might be to manage this kind of scale.
  • #9 - To illustrate this point further, we may turn to an example of a straightforward BS optimization approach, which can be found in the literature and in practice as well. - Normally our ultimate target is to maximize some profitability-related parameter like ROE, ROA or income, which is a function of our assets' returns, the cost of our liabilities, and OPEX. We also need to consider a number of constraints, related both to regulation (e.g. BIII ratios, ICAAP/ILAAP) and to the bank's business model and risk appetite. Even such a simple model has a number of considerations that increase the level of complexity dramatically (read the slide).
  • #10 - Summing this up, a modern BS optimization model structure looks like an iceberg, with the optimization algorithm on top and sometimes dozens of underlying models that we need to use to model and forecast the optimization model's parameters and constraint values.
  • #11 - Another analogy that I like is a space station, where you have a module with your optimization algorithms (including the underlying technology!), a module for stochastic modeling (running simulations of interest rates and asset returns), as well as a module for deterministic modeling that is normally used to run CF and economic value simulations. - This all needs to be integrated. - Underneath we have the whole planet of data, which becomes especially relevant when we start speaking about ML algorithms and neural networks.
  • #12 - And the new data requirements posed by contemporary models lead us to another observation, usually called the 80/20 ML rule. - Actual data science work (and we observe this in our DS team as well) involves lots of digging in data. - This takes a lot of time from expensive DS resources, and data scientists don't actually like doing it; they prefer to do math and research rather than data integration, cleansing and management. Interestingly, the necessity of doing too much data prep is one of the key reasons why data scientists are willing to change jobs – this is what I personally observe during the interviews we run with candidates for the DS team.
  • #13 - So we've spoken enough about the problem; now it's time to look at how modern technology offered by fintechs may help banks find a resolution. - From our point of view, there are three relatively new technology trends with great potential to boost model productivity in both the ALM and risk management worlds: microservices architecture, Big Data (which has been on the radar for a while), and DevOps – a technology stack intended to support development operations. At a logical level this translates into new components of the Risk/Treasury architecture, namely Model DevOps (as we call it) or a model maintenance layer; a Data Lake management layer (required to manage Big Data and build/train models); a Model governance layer – in the classical sense (quality metrics and business process management); and an integration layer to connect the whole thing with the ERM that is used for many important calculations. It's all linked together, and on the next slide I'll explain how these things fit into the Risk and Treasury architecture of the bank.
  • #14 - This is exactly how these new components fit into the typical Treasury/Risk architecture. The Data Lake management component is used for collecting data of various natures (structured vs. unstructured, batch vs. streaming), data cleansing and model creation. Once the model is created, it can be dropped into the DevOps pipeline, which will automatically pass it through the operations cycle (testing/training/production) and take care of technical maintenance such as monitoring, logging, and auto-scaling for models requiring heavy computations. This layer should be integrated with ALM/ERM systems and can also be linked to your corporate incident management (e.g. Jira or Redmine, to notify you of any issues with models or events detected by streaming models) and a BI layer such as Fusion Insight, for instance, or any other.
  • #15 - Now let me briefly talk through some features that we consider very important for the aforementioned components. - As I mentioned earlier, it's very important for the DLM platform to have a pack of ETL functionality, powerful DQ and a modeling toolkit, all adapted to heavy computations on Big Data samples. - Another crucial thing that will define the future of such platforms is the ability to remove the complexity related to big data technology. It should be visual, with no manual coding of transport and no Scala! E.g. at Neoflex we have some developments in this area (you can see some screenshots of our software on the slide); our DS team uses it themselves, and it makes their lives easier.
  • #16 - Another important concept I mentioned is the Model DevOps layer. - In a nutshell, the concept revolves around the idea of thinking of a model as a standalone microservice that can receive input (model inputs) and give an output (model results or management commands). A collection of models can be run on top of a microservices orchestration platform, which supports standardized API management and maintenance features (logging, monitoring, scaling, deployment infrastructure, etc., integrated with the deployment pipeline). So this platform manages multiple models that can have various natures (R/Python/MLlib/C++, etc.) and may interact with the bank's or external systems as well as with each other. Finally, the platform should incorporate the DevOps technology stack that enables automated lifecycle management for the model.
  • #17 - This slide briefly illustrates what the model container may look like. - List components.
  • #18 - And this slide illustrates the sample DevOps pipeline that we're using in some of our projects (SHIFT SLIDE). It is indeed sophisticated, and it's really important to remove the complexity from the bank's side. To us, the most promising way to do that is to wrap it into a managed service that can be hosted by fintechs.
  • #19 - Summing it up: in the context of BS optimization, this type of infrastructure will allow you to control your 'model iceberg', which we saw on the previous slides.
  • #20 - Concluding my speech, I'd like to briefly highlight one use case we had with another large Russian bank, the 2nd largest – VTB Group. About a year ago, the Bank developed a new strategy and made the decision to march down the path of a risk management technology upgrade. They started to implement elements of the new architecture I was just talking about. The project, successfully completed last year, included the implementation of the Data Lake management layer,
  • #23 This is basically it, and we're happy to answer your questions, if any.