Big Data Analytics for Dodd-Frank


Published in: Business, Economy & Finance


  1. Big Data Analytics for Dodd-Frank Financial Regulations. By Dr. Shyam Sundar Sarkar, CEO, RiskCompute. E-mail: shyam.sarkar@riskcompute.com. 06/20/12 All Rights Reserved by Dr. Shyam Sarkar of RiskCompute, CA
  2. Events Preceding the Dodd-Frank Act
     • Outsized, unregulated OTC derivatives market (Bank for International Settlements data);
     • 2008 collapse of Wall Street banks (Lehman, Bear Stearns, Morgan Stanley);
     • Record number of home foreclosures (mortgage crisis);
     • Unprecedented US Govt. support and spending (backstop of banks and money market funds, bailout of GM, AIG, etc.);
     • World-wide crisis and recession;
  3. Evolution of Financial Regulations
     • Following SOX, firms rushed to implement a robust control environment, and the onus was placed squarely on compliance;
     • In the US, regulators are again cracking down on financial services to ensure firms not only have a secure control framework but also can prove their framework is relevant to all the risks experienced in the organization;
     • Risk factors aren't just financial reporting any more -- the issues are now broad risk management and governance; the controls and activities senior stakeholders want are more related to operational risk or compliance;
  4. Bottom-up vs. Top-down Approach
     • In the initial years of SOX, the response was process- and controls-driven and was built from the bottom up;
     • Because the program was based largely on process-level controls, it helped companies comply but did not take a top-down, risk-based view;
     • Historically, financial services firms have built a bottom-up approach rather than a top-down, risk-focused approach, which usually means having more controls;
     • The post-financial-crisis period is a good time to challenge whether that bottom-up approach is right in terms of control frameworks;
  5. Wall Street Reform and Consumer Protection Act of 2010 (Dodd-Frank, signed into law July 21, 2010)
     • The Act contains Titles I to XVI;
     • Financial stability: Titles I through VIII;
     • Investor protection: Title IX;
     • Consumer financial protection: Title X;
     • Mortgage reform: Title XIV;
     • Miscellaneous provisions: Titles XI-XIII, XV, XVI;
  6. IT and Dodd-Frank Implementations
     • Title VI (Regulation of Banks and Savings Companies): Data warehousing, core systems and risk analytics, as well as maintaining clean, consistent data, will be imperative;
     • Title VII (Wall Street Transparency and Accountability): This law changes how broker-dealers, mutual funds, hedge funds and end users trade and clear OTC derivatives;
       (1) Swaps trades will be guaranteed by the clearinghouse to eliminate exposure to counterparty risk;
       (2) Cleared swaps contracts must be traded on a registered venue, either an exchange or a swap execution facility (SEF);
  7. IT and Dodd-Frank Implementations (contd.)
     • Title VII (Wall Street Transparency and Accountability), contd.: Firms need connectivity to the clearinghouses and to multiple swap execution facilities; real-time data reporting will enable regulators to see how risk is shifting through the markets;
     • Title VIII (Payment, Clearing and Settlement Supervision): This law takes Title VII a step further by placing clearinghouses under the watch of regulators; the clearinghouses are classified as systemically important; an infrastructure is needed to take swaps trades conducted on swap execution facilities and funnel them to the clearinghouses for monitoring;
  8. Current Timeline for Implementation (Source: Ernst & Young LLP)
  9. Computational Model for Systemic Risk Analysis (Office of Financial Research)
  10. Big Data, Big Process and Process Mining
     • Big Process is an enterprise-wide business process transformation (not just improvement) program driven by top executives;
     • Big Process analysis embraces Big Data analytics;
     • Big Process mining provides an important bridge between data mining/data analysis and business process modeling and analysis, providing techniques to more rigorously check compliance and regulations (Dodd-Frank, SOX, Basel II/III);
     • Big Process mining is an enabling technology for Continuous Process Improvement (CPI), Business Process Improvement (BPI), Total Quality Management (TQM), and Six Sigma, widely used in financial services processes and regulations;
  11. Big Process and Big Data Analytics (diagram)
     • Business processes (trading exchanges, supports, machines, organizations) are implemented and controlled by software systems (repositories, databases, metadata, communication systems);
     • These systems emit records, events and transactions at high volume and velocity into big data analytics systems;
     • A declarative model specifies and configures the processes and analyzes them, supporting process discovery, conformance checking against compliance rules/policies and privilege specifications, and process enhancements;
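The conformance-checking idea on this slide can be sketched in a few lines: replay each event trace against declarative precedence rules and flag traces that violate them. The event names and rules below are illustrative assumptions, not taken from the deck.

```python
# Minimal sketch of declarative conformance checking over an event log.
# Rule form: activity `before` must occur before activity `after` in a trace.
PRECEDENCE_RULES = [("trade_executed", "trade_confirmed"),
                    ("trade_confirmed", "trade_cleared")]

def conforms(trace, rules=PRECEDENCE_RULES):
    """Return True if every (before, after) precedence rule holds for the trace."""
    for before, after in rules:
        if after in trace:
            if before not in trace or trace.index(before) > trace.index(after):
                return False
    return True

log = [
    ["trade_executed", "trade_confirmed", "trade_cleared"],  # conformant
    ["trade_executed", "trade_cleared"],                     # cleared without confirmation
]
print([conforms(t) for t in log])  # [True, False]
```

Real process-mining tools add frequency-weighted diagnostics on top of this replay idea, but the core check is the same.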
  12. Business Intelligence and Process Analytics (diagram)
     • Business intelligence and big data analytics (dashboards, KPIs) sit on top of Big Process analysis (Six Sigma, TQM, BPI, BAM, Dodd-Frank, SOX, compliance);
     • Underneath: streaming data (volume and velocity), data mining, and process metadata management;
     • Process model discovery, conformance and adjustment feed back into the stack;
  13. Straight-Through Processing
     • Straight-Through Processing (STP) is designed to automate the front office (trading), through the middle office (risk management, confirmation and allocations), and on to the back office (clearing, payments and reporting);
     • A true STP environment links all of these elements electronically, without the need for any manual re-keying of data anywhere along the line;
     • An optimal business system to meet Dodd-Frank requirements will offer a high degree of automation to support a full STP workflow;
  14. Straight-Through Processing
  15. The Issue of Business Entity Identification (Source: Financial InterGroup)
  16. Non-trivial Mapping of Business Entity Identifiers
     • Significant operational risk is created by the failed/insecure interaction of manual (operations) and automated (applications) processes with data;
     • Data tagging at source and common data identifiers will minimize operational risk, lower costs, allow regulators access to individual-firm and industry-wide data for systemic risk analysis, and lead the industry to Straight-Through Processing;
  17. US Regulators on Legal Entity Identifier
     • Lack of a global standard leads to much less transparency, inability to aggregate information efficiently, and no audit trail;
     • An LEI will allow global Straight-Through Processing;
     • ISO 17442:2012, Financial services -- Legal Entity Identifier (LEI), is aimed at meeting the data collection and analysis needs of both national and global regulators in their responses to problems arising from the world financial crisis;
     • An example LEI: F50EOCWSQFAUVO9Q8Z97
  18. ISO Standard for LEI
     ISO 17442 describes a 20-character alphanumeric code, as well as additional elements for reference data attributes. Key attributes of the standard include the following:
     • Enables unique identification of global entities requiring an LEI
     • Defines robust open governance of the issuance and maintenance of the LEI scheme
     • Defines an LEI that contains no embedded intelligence
     • Can be applied worldwide to support the financial services industry
     • Leverages the expertise of ISO/TC 68 in defining and maintaining identifier standards
     • Is persistent
     • Defines a scheme that is scalable and free from assignment limitations
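The last two characters of an LEI are check digits verified with ISO 7064 MOD 97-10: map letters A-Z to 10-35, keep digits as-is, and the resulting number modulo 97 must equal 1. A minimal validation sketch, which accepts the example LEI shown on the previous slide:

```python
def lei_is_valid(lei: str) -> bool:
    """Check a 20-character ISO 17442 LEI via ISO 7064 MOD 97-10 check digits."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # int(c, 36) maps '0'-'9' to 0-9 and 'A'-'Z' to 10-35, as the standard requires.
    digits = "".join(str(int(c, 36)) for c in lei.upper())
    return int(digits) % 97 == 1

print(lei_is_valid("F50EOCWSQFAUVO9Q8Z97"))  # True
```

Because the code carries no embedded intelligence, this syntactic check is all a consumer can verify locally; entity reference data must come from the issuing registry.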
  19. Unique Identifier Format (Source: Financial InterGroup)
  20. Common Identifier System (Source: Financial InterGroup)
  21. Cloud on the Horizon for Capital Markets
     • The era of big proprietary data centers is over for capital markets;
     • NYSE Technologies' new, industry-specific Capital Markets Community Platform addresses many of Wall Street's concerns about public clouds;
     • Straight-Through Processing with a common identifier system and pipelined large data sets will need cloud-based Hadoop MapReduce, HBase, Hive and BI tools;
  22. Straight-Through Processing (STP) and Pipelined Map/Reduce
     • Multiple rounds of Map/Reduce (a parallel algorithm can be structured): multiple rounds of Map/Reduce lead to pipelined Map/Reduce;
     • Straight-Through Processing for Dodd-Frank financial regulations can be implemented using multiple rounds of (pipelined) Map/Reduce phases executed in parallel;
  23. Matrix Definition Over Securities Trade (Millions or Billions of Rows and Columns)

     (Identities)      Buyer_1  Buyer_2  Buyer_3  Buyer_4
     Seller_1(Obj_1)      1        0        1        0
     Seller_1(Obj_2)      0        0        0        1
     Seller_2(Obj_3)      1        1        1        0
     Seller_3(Obj_4)      1        0        1        0
     Seller_3(Obj_5)      0        0        1        0
     ...                 ...      ...      ...      ...
     Seller_n(Obj_m)      1        0        1        0
  24. Straight-Through Processing
  25. Straight-Through Processing and Multiple Rounds of Map/Reduce
     The first stage of Straight-Through Processing is equivalent to matrix multiplication and rule/policy application, as follows:
     • Map function = (Structure for Securities Trade) x (Structure for Pricing from Market Data);
     • Reduce function = rule/policy application on the result of the Map function;
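The stage above can be sketched as one round of Map/Reduce: a Map phase that emits partial products of a sparse matrix multiply, and a Reduce phase that sums them per cell and applies a rule. The trade incidence matrix, prices, and shapes below are toy assumptions, not the deck's actual schemas.

```python
from collections import defaultdict

def map_phase(A, B):
    """Emit ((i, k), partial product) pairs for C = A x B over sparse dicts."""
    for (i, j), a in A.items():
        for (j2, k), b in B.items():
            if j == j2:
                yield (i, k), a * b

def reduce_phase(pairs, rule=lambda x: x):
    """Sum partial products per output cell, then apply a rule/policy per cell."""
    cells = defaultdict(float)
    for key, value in pairs:
        cells[key] += value
    return {key: rule(total) for key, total in cells.items()}

A = {(0, 0): 1, (0, 2): 1, (1, 3): 1}            # trade incidence (slide 23 style)
B = {(0, 0): 101.5, (2, 0): 99.0, (3, 0): 42.0}  # pricing structure (toy values)
C = reduce_phase(map_phase(A, B))
print(C)  # {(0, 0): 200.5, (1, 0): 42.0}
```

In a real Hadoop job the `yield`ed key/value pairs would be shuffled to reducers keyed on the output cell; the in-memory dict here stands in for that shuffle.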
  26. Straight-Through Processing and Multiple Rounds of Map/Reduce (contd.)
     • The next stage of Map/Reduce will be another matrix multiplication over the resulting matrix from the last stage and the matrix created from trading portfolios;
     • This stage will have a Reduce phase with specific rules to apply;
     • Each stage of Map/Reduce can run in parallel;
     • It is possible to generate a dashboard from the result of any intermediate stage;
     • Pipeline: Hadoop Map/Reduce -> HBase -> Hive -> Pentaho or Jaspersoft Dashboard (MySQL)
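Chaining such rounds is just function composition: each stage consumes the previous stage's output, and any intermediate result can be tapped for a dashboard. The stage names and numbers below are illustrative assumptions, not the deck's actual workflow.

```python
from functools import reduce

def price_trades(trades):
    """Round 1 (sketch): join trades against market data, here a flat price."""
    return [t * 100.0 for t in trades]

def apply_policy(priced):
    """Round 2 (sketch): Reduce-side rule, e.g. cap per-trade exposure at 150."""
    return [min(p, 150.0) for p in priced]

def pipeline(data, stages):
    """Run Map/Reduce-style stages in sequence over the data."""
    return reduce(lambda acc, stage: stage(acc), stages, data)

print(pipeline([1.0, 2.0], [price_trades, apply_policy]))  # [100.0, 150.0]
```

Frameworks such as Oozie or Cascading play the role of `pipeline` for real Hadoop jobs, wiring one job's HDFS output to the next job's input.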
  27. Map/Reduce
  28. Hadoop Architecture
  29. Tools for Data Exchange
  30. Pipeline
  31. Implementation with the Hadoop Ecosystem
  32. References
     1. InformationWeek Dodd-Frank Cheat Sheet
     2. Financial InterGroup
     3. U.S. Congress website: http://
     4. Office of Financial Research
     5. Office of Financial Research Working Paper Series: papers.aspx
  33. Q/A. E-mail: shyam.sarkar@riskcompute.com
  34. Sessions will resume at 11:25am