Primatics Financial - Parallel, High Throughput Risk Calculations On The Cloud
Comments
  • I have a question about cloud computing in general, and I think it applies to the financial world more than any other.

    It seems to me that cloud computing is to the digital network world what nuclear power plants are to the energy world. There is no question that it is cheaper, more efficient, and more sustainable for the long term than the local networks and coal plants of the past. But isn't it flawed in terms of the one-in-a-billion accident? What protects the information if the cloud fails for an extended amount of time?

    Is it like the nuke plants whose backup diesel generators are in worse shape than the rest of the plant at the moment they are needed? It seems it would be far too easy to outgrow the brick-and-mortar backup.

    I know Microsoft has already had an incident or two. Granted, these are the early days, but at the same time it has only been a couple of years now. What happens when we get a big one?

    It seems like the financial records of an already fragile economy need to be grounded at some level.

    Thanks,

    Finance Guy

Presentation Transcript

  • Evolv™ Risk: Cloud Computing Use Case for JavaOne 2009. Nati Shalom, CTO, GigaSpaces (natishalom.typepad.com, twitter.com/natishalom); Francis de la Cruz, Manager, and Argyn Kuketayev, Senior Consultant, Primatics Financial
  • Use Case Outline
    • Business Challenge
      • Client Needs and Application Evolution
      • Business Needs
    • Architectural Challenges
      • Risk 1.5 Application Stack
      • Risk 2.0 Application Stack
      • Processing Unit diagram
      • Lessons Learned
  • Business Case
    • Many mid-size regional banks with sizeable mortgage portfolios
    • Outsource sophisticated mortgage credit risk modeling
    • Outsource IT infrastructure to perform costly computations, suited for on-demand PaaS
  • Primatics’ Story in Cloud Computing
    • Company with roots in financial services and IT
    • Multiple clients in financial services, 2 products
    • 2007
      • EVOLV Risk V1: web-based hosted solution for a regional bank
    • 2008
      • V2: embraced cloud computing on Amazon EC2
      • V1.5: big national bank M&A, portfolio valuation, AWS EC2
    • 2009
      • V2 “reloaded”: ground-up redesign on GigaSpaces
  • Business Needs: Accounting-Cashflows Risk Analytics Platform
    • Stochastic (Monte Carlo) simulation: for each loan, run 100 paths of 360 months each (see the sketch below)
    • Input:
      • Loan Portfolio (100MB): 100k loans x 1KB of data
      • Forecast Scenarios (100MB): 1MB per simulation path
      • Market History (<1MB)
      • Assumptions, Settings, Dials (<1KB)
    • Output:
      • Cashflows (3GB): 100k loans x 360 months x 10 doubles
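
The per-loan loop and the output sizing above can be sketched in a few lines of Java. This is only an illustration: the Loan, ScenarioPath, and projectMonth names are hypothetical placeholders, not Primatics' model API; the main method simply reproduces the slide's arithmetic (100k loans x 360 months x 10 doubles x 8 bytes ≈ 2.9 GB).

```java
/** Illustrative placeholder for a loan record (~1 KB of fields in practice). */
class Loan { String id; double balance; double rate; int remainingTerm; }

/** Illustrative placeholder for one simulated forecast path (~1 MB in practice). */
class ScenarioPath { double[] hpi; double[] interestRates; }

public class LoanSimulator {

    static final int PATHS = 100;   // simulation paths per loan
    static final int MONTHS = 360;  // 30-year monthly horizon
    static final int FIELDS = 10;   // doubles emitted per loan-month

    /** Average the projected monthly cashflow vectors over all paths for one loan. */
    double[][] simulate(Loan loan, ScenarioPath[] paths) {
        double[][] cashflows = new double[MONTHS][FIELDS];
        for (ScenarioPath path : paths) {
            for (int m = 0; m < MONTHS; m++) {
                double[] monthly = projectMonth(loan, path, m);
                for (int f = 0; f < FIELDS; f++) {
                    cashflows[m][f] += monthly[f] / PATHS;
                }
            }
        }
        return cashflows;
    }

    /** Stand-in for the pluggable credit/prepayment model mentioned later in the deck. */
    private double[] projectMonth(Loan loan, ScenarioPath path, int month) {
        return new double[FIELDS];
    }

    public static void main(String[] args) {
        // Back-of-the-envelope output size, matching the ~3 GB figure on the slide.
        long loans = 100000L;
        long bytes = loans * MONTHS * FIELDS * 8L; // 8 bytes per double
        System.out.printf("Cashflow output ~ %.1f GB%n", bytes / 1e9); // ~2.9 GB
    }
}
```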
  • Business Needs: Scale the Risk Analytics Platform
    • Many jobs by the same user
    • Many users in a firm
    • Many firms
    • More loans, more simulations, longer time horizon
  • Overarching Challenges
    • Quantitative financial applications (models) with heavy CPU load
    • Large volumes of data I/O
    • Infrastructure to support a growing number of clients and users
    • Varying load on the system, with spikes of heavy load
    • Avoid lock-in to cloud vendors
  • Evolv Risk 1.5 vs 2.0 Software Stack
    • [Side-by-side stack diagram. Risk 1.5: prior grid framework on a J2EE platform with in-built models, a Sun JMS broker, discovery, a persistence manager, and a manual scaling process; provisioning, monitoring, billing, and AWS registration/management were manual, and auto throttling, Space-Based Architecture, SLA, and node management were not in the architecture. Risk 2.0: GigaSpaces Space-Based Architecture with SLA-driven scaling, node management, auto throttling, provisioning, monitoring, job scheduling, failover/recovery, load balancing, and AWS billing/registration/management. Shared application layers in both: web interface over SSL, reporting warehouse, model API, custom templates and validation, pluggable client models, model validation & statistics, scenarios (cohorts, HPI/interest/market/discount rates), loss & valuation, loss forecasts, and OLAP reports.]
  • Performance, Scalability, Data Security, Data Integrity
    • [Architecture diagram]
  • Performance, Scalability, Data Security, Data Integrity
    • [Architecture diagram: Primatics Gateway and AWS Provisioning]
  • Lessons Learned: Cloud Integration
    • Provisioning
      • Configuration API vs. Using Scripts
    • Auto Throttling vs. Manual Throttling
    • Failover and Recovery provided by framework
    • Monitoring via the GS console and API vs. an in-house application
    • Infrastructure took 5 weeks to develop vs. 8 weeks
    • Achieving linear scalability was a matter of creating Processing Units (a worker sketch follows below)
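
As a rough illustration of what such a Processing Unit worker might contain, here is a minimal sketch using the standard OpenSpaces annotation-based polling container. RiskRequest and RiskResult are hypothetical POJOs invented for the example, and in a real deployment the class would be wired up through the PU's Spring configuration; none of this is Primatics' actual code.

```java
import com.gigaspaces.annotation.pojo.SpaceId;
import com.gigaspaces.annotation.pojo.SpaceRouting;
import org.openspaces.events.EventDriven;
import org.openspaces.events.EventTemplate;
import org.openspaces.events.adapter.SpaceDataEvent;
import org.openspaces.events.polling.Polling;

/** Hypothetical work item: one batch of loans to value, written to the space by the master. */
class RiskRequest {
    private String id;
    private Integer loanBatch;
    private Boolean processed;

    @SpaceId(autoGenerate = true)
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    @SpaceRouting   // routes each batch to one partition of the clustered space
    public Integer getLoanBatch() { return loanBatch; }
    public void setLoanBatch(Integer loanBatch) { this.loanBatch = loanBatch; }

    public Boolean getProcessed() { return processed; }
    public void setProcessed(Boolean processed) { this.processed = processed; }
}

/** Hypothetical result object written back to the space when a batch is done. */
class RiskResult {
    private String requestId;
    private double[][] cashflows;

    public RiskResult() { }
    public RiskResult(String requestId, double[][] cashflows) {
        this.requestId = requestId;
        this.cashflows = cashflows;
    }

    @SpaceId
    public String getRequestId() { return requestId; }
    public void setRequestId(String requestId) { this.requestId = requestId; }

    public double[][] getCashflows() { return cashflows; }
    public void setCashflows(double[][] cashflows) { this.cashflows = cashflows; }
}

/** Polling worker deployed inside each Processing Unit instance. */
@EventDriven
@Polling
public class RiskWorker {

    @EventTemplate
    RiskRequest unprocessed() {
        RiskRequest template = new RiskRequest();
        template.setProcessed(Boolean.FALSE);  // match only work that has not been taken yet
        return template;
    }

    @SpaceDataEvent
    RiskResult process(RiskRequest request) {
        // Run the loan-level models for this batch; the polling container
        // writes the returned RiskResult back to the space.
        double[][] cashflows = runModels(request);
        return new RiskResult(request.getId(), cashflows);
    }

    private double[][] runModels(RiskRequest request) {
        return new double[360][10];  // placeholder for the actual credit/valuation model
    }
}
```

The template returned by unprocessed() is what turns the space into a work queue: each Processing Unit instance takes only the unprocessed requests routed to its own partition, which is where the linear scalability comes from.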
  • Lessons Learned: Data and Processing
    • Data Distribution
      • Uses Space-Based Architecture vs. an in-house-developed JMS broker and discovery to push data to the grid (see the master-side sketch after this list)
    • Deployment
      • Uses the GS API vs. pushing data to the grid or using machine images
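
The master side of that space-based data distribution could look roughly like the sketch below, reusing the hypothetical RiskRequest/RiskResult classes from the worker example; the space name, the jini:// lookup URL, and the batching scheme are assumptions for illustration, not the deck's actual design.

```java
import org.openspaces.core.GigaSpace;
import org.openspaces.core.GigaSpaceConfigurer;
import org.openspaces.core.space.UrlSpaceConfigurer;

public class RiskJobSubmitter {
    public static void main(String[] args) {
        // Connect to an already-deployed clustered space ("riskSpace" is illustrative).
        GigaSpace gigaSpace = new GigaSpaceConfigurer(
                new UrlSpaceConfigurer("jini://*/*/riskSpace").space()).gigaSpace();

        // Distribute the work: one request per loan batch. The @SpaceRouting field on
        // RiskRequest spreads the batches across the partitioned space, where each
        // Processing Unit's polling worker picks them up.
        int batches = 100;
        for (int i = 0; i < batches; i++) {
            RiskRequest request = new RiskRequest();
            request.setLoanBatch(i);
            request.setProcessed(Boolean.FALSE);
            gigaSpace.write(request);
        }

        // Collect results as the workers write them back (60 s timeout per take).
        for (int collected = 0; collected < batches; collected++) {
            RiskResult result = gigaSpace.take(new RiskResult(), 60000);
            if (result != null) {
                System.out.println("Got cashflows for request " + result.getRequestId());
                // ... hand off to the reporting warehouse ...
            }
        }
    }
}
```

Because requests and results flow through the same clustered space, there is no separate broker, discovery service, or persistence manager to build and operate, which is the point of the comparison above.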
  • Lessons Learned:
    • Problems are now high-class problems.
      • The GigaSpaces framework spared us from low-level problems.
      • The code we write is more about answering ‘why’ than answering ‘how’.
    • GigaSpaces:
      • Provisioning out of the box
      • Failover and Recovery much easier
      • Monitoring through API and Console
      • SLA out of the box (see the deployment sketch below)
      • Integrated into the Cloud
      • Significantly lowers the barrier to entry into cloud computing
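
As a closing illustration of the "provisioning and SLA out of the box" point, here is a minimal sketch assuming the GigaSpaces Admin API: deploying a packaged Processing Unit with a partitioned, one-backup SLA. The lookup group "risk-grid" and the "risk-pu.jar" file name are made up for the example.

```java
import java.io.File;

import org.openspaces.admin.Admin;
import org.openspaces.admin.AdminFactory;
import org.openspaces.admin.gsm.GridServiceManager;
import org.openspaces.admin.pu.ProcessingUnit;
import org.openspaces.admin.pu.ProcessingUnitDeployment;

public class DeployRiskPu {
    public static void main(String[] args) {
        // Discover the grid (lookup group name is illustrative) and grab a GSM.
        Admin admin = new AdminFactory().addGroup("risk-grid").createAdmin();
        GridServiceManager gsm = admin.getGridServiceManagers().waitForAtLeastOne();

        // Deploy the packaged Processing Unit with a partitioned, 1-backup SLA:
        // 4 primaries, 4 backups, no two instances of the same partition on one VM.
        ProcessingUnit pu = gsm.deploy(new ProcessingUnitDeployment(new File("risk-pu.jar"))
                .numberOfInstances(4)
                .numberOfBackups(1)
                .maxInstancesPerVM(1));

        pu.waitFor(8);   // block until all 8 instances (primaries + backups) are up
        admin.close();
    }
}
```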