1 May 2009
Experts in Data Warehousing, BI and Data Mining
About Kaizentric
Kaizentric Technologies was founded in February 2008. The founder has 13 years of data warehousing experience. Team strength is 20, scalable to 200 with resources from partners; staff can be hired on demand within 2 to 4 weeks. Kaizentric serves insurance clients on data management technologies, conducts extensive research in data warehousing, and develops on-demand BI products (SaaS architecture). The team has experience and expertise in building data warehouses using Oracle, Sybase, and SQL Server as backend databases, Informatica as the ETL tool, and Kalido as the data warehouse lifecycle management tool. We specialize in data mining using SAS and in reporting using Business Objects and MicroStrategy.
Kaizentric’s Business
Experience in Data Warehousing

Technology focus | Project | Duration
Data Warehouse | Photons – Insurance Data Warehouse | 7 months
Data Warehouse | Hornet – HR Repository | 12 months
Data Warehouse | Group Benefits Insurance | 9 months
Data Integration | Marketing and Sales Linkage | 4 months
Data Warehousing and Data Mining | Mortgage Backed Securities | 6 months
Data Integration Services | Data Integration Center of Excellence | 2 years
Data Warehousing and Data Mining | Be InformEd – Education Industry | 4 months
Kaizentric’s Products
Kaizentric’s Products - WIP
Photons – Insurance DW
Photons is a data warehouse product developed by Kaizentric to help small and medium enterprises. It is built on SaaS architecture (Software as a Service), focuses on Employer Benefits / Group Benefits, and follows the ACORD data model standard. There are about 50 dimension tables covering customer, product, and employee profiles, and about 10 fact tables for transactions involving claims processing, benefit payments, other overheads, reserves, etc. About 30 ETL processes load data from a typical ACORD model into this data warehouse.
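As an illustration of the star-schema layout described above, here is a minimal Python/pandas sketch of a claims fact table joined to two dimensions for a BI-style rollup. The table names, columns, and values are invented for the example and are not the actual Photons schema.

```python
import pandas as pd

# Hypothetical slices of two dimension tables and one fact table in a
# Photons-style star schema; real table and column names will differ.
dim_customer = pd.DataFrame({
    "customer_key": [1, 2],
    "customer_name": ["Acme Corp", "Globex Inc"],
    "industry": ["Manufacturing", "Retail"],
})
dim_product = pd.DataFrame({
    "product_key": [10, 11],
    "product_name": ["Short-Term Disability", "Group Life"],
})
fact_claims = pd.DataFrame({
    "customer_key": [1, 2, 1],
    "product_key": [10, 11, 11],
    "claim_amount": [1200.0, 540.0, 980.0],
})

# A typical BI query: total claim amount by customer and product,
# produced by joining the fact table to its dimensions.
report = (
    fact_claims
    .merge(dim_customer, on="customer_key")
    .merge(dim_product, on="product_key")
    .groupby(["customer_name", "product_name"])["claim_amount"]
    .sum()
    .reset_index()
)
print(report)
```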
Photons – Architecture
The architecture diagram shows data flowing from the source systems (OpCo, POS, EDI, Siebel, Master Data) into a staging area and on to the Oracle DWH, which feeds the interfaces and BI reports. Process components cover data source identification, data mapping, data audit and data quality definition, data validation, and data standardization against the ACORD data standard (rectify data errors and enrich data), along with data unification components and code administration for data consolidation. Business rules are implemented for data integrity, validation rules for cleansing, and transformation rules for formatting and consolidation.
Hornet – HR Data Warehouse
Hornet is another data warehouse product by Kaizentric that assists HR consultants in finding the right candidate for a given job and vice versa. Unstructured information from emails and job portals is cleansed, standardized, and stored in a data warehouse; ETL processes are used throughout the data transformation. Customized data mining reports are generated for every client, and reports are generated to show the usefulness of the product. The product is used in the USA for IT recruitment.
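A minimal sketch of the kind of standardization step Hornet's ETL pipeline performs on unstructured text, assuming a simple keyword-based skill list and a regex for contact e-mails; the field names and rules are illustrative only, not the product's actual logic.

```python
import re

# Assumed skill vocabulary for the example; the real product would use a
# much richer taxonomy.
SKILL_KEYWORDS = {"oracle", "informatica", "sas", "java", "business objects"}

def standardize_candidate(raw_text: str) -> dict:
    """Extract a contact e-mail and known skills from free-form resume text."""
    email_match = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", raw_text)
    text_lower = raw_text.lower()
    skills = sorted(skill for skill in SKILL_KEYWORDS if skill in text_lower)
    return {
        "email": email_match.group(0) if email_match else None,
        "skills": skills,
    }

record = standardize_candidate(
    "Resume: 8 yrs Oracle and Informatica ETL work. Reach me at jane.doe@example.com"
)
print(record)  # {'email': 'jane.doe@example.com', 'skills': ['informatica', 'oracle']}
```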
Hornet – Architecture
The architecture mirrors Photons: data sources (reports, PDFs, flat files, documents, hotlists, emails) feed a staging area and then the Oracle DWH, which produces the jobs vs. candidates reports. Process components cover data source identification, data mapping, data audit and data quality definition, data validation, and data standardization (rectify data errors and enrich data), with data unification components and code administration for data consolidation, and business rules for data integrity, validation rules for cleansing, and transformation rules for formatting and consolidation.
Insurance – Claims Mining
The IBM data model IIW (Insurance Information Warehouse) is used for the data warehouse design. The data warehouse is built for disability products such as short-term disability, long-term disability, family medical leave, and premium-waiver life products. Informatica was used for ETL, Kalido for warehouse lifecycle management, and SAS for data mining. The warehouse holds 1.5 TB of data from source systems such as Claims, Losses, Reserves, Manual Claims, Country-wide Losses, Premium, Special Investigation, Litigation, etc. Data mining algorithms are used to score claims based on demography, employer characteristics, and other data.
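To make the scoring idea concrete, here is a toy Python sketch of a rule-based claim score using a few of the attributes mentioned above. The production models were built in SAS; the features, weights, and thresholds below are invented for illustration.

```python
# Illustrative-only scoring sketch; every rule and weight here is an assumption.
def score_claim(claim: dict) -> float:
    """Return a 0-1 priority score from a few claim attributes."""
    score = 0.0
    if claim.get("claimant_age", 0) < 30:
        score += 0.2
    if claim.get("employer_size", 0) < 50:
        score += 0.3
    if claim.get("prior_claims", 0) >= 2:
        score += 0.3
    if claim.get("litigation_flag"):
        score += 0.2
    return min(score, 1.0)

claims = [
    {"claim_id": "C-001", "claimant_age": 27, "employer_size": 40, "prior_claims": 2},
    {"claim_id": "C-002", "claimant_age": 55, "employer_size": 900, "prior_claims": 0},
]
for c in claims:
    print(c["claim_id"], score_claim(c))
```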
Insurance – High-Level Design
The design diagram shows Source A, Source B, and Source C flowing through a stage and iStage layer into Kalido and the mart, with attribute maps maintained for each source as well as for business-owned spreadsheets and Access databases. The Enterprise Logical Data Model will not be built in a single effort; instead, projects requiring data will incrementally contribute to its build-out. The logical data model allows users to locate which systems contain particular data entities. It also has attribute mappings that allow a user to know which tables and attributes in the sources map to the enterprise logical data model; the mapping would reference all sources that have that type of data. In addition, the system of record would be identified for each type (and possibly segment) of data. The logical data model allows users to know whether data elements are in the data warehouse and where in the environment they are located. Other business-owned data sources (such as spreadsheets and Access databases) will also be mapped to the enterprise data model. This will give the organization a better understanding of data that is not part of the application portfolio, and of where duplicate data is being stored by the business. In the diagram, the same person attributes (first name, last name, street address, city, phone number, date of birth, Social Security number, age, employer) appear in each source and map to the Enterprise Logical Data Model.
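The attribute-mapping idea can be pictured with a small sketch: a dictionary keyed by enterprise attributes that records which source columns map to them and which source is the system of record. The source names and column codes are assumptions; in practice this metadata lives in the modelling environment (e.g., Kalido), not in application code.

```python
# Hypothetical source-to-enterprise attribute mappings for two attributes.
ENTERPRISE_MAPPING = {
    "first_name": {
        "source_a": "FST_NM", "source_b": "FirstName", "system_of_record": "source_a",
    },
    "date_of_birth": {
        "source_a": "DOB", "source_c": "BIRTH_DT", "system_of_record": "source_c",
    },
}

def locate_attribute(entity_attribute: str) -> dict:
    """Show which source systems carry a given enterprise attribute."""
    return ENTERPRISE_MAPPING.get(entity_attribute, {})

print(locate_attribute("date_of_birth"))
# {'source_a': 'DOB', 'source_c': 'BIRTH_DT', 'system_of_record': 'source_c'}
```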
MSL – Marketing and Sales Link
The disconnect between Marketing and Sales, as far as lead generation is concerned, is solved in this project through data interchange and integration. Information from the marketing system is validated against the sales system, and unique, non-overlapping leads are suggested to the sales team. Data from Party tables, the Enterprise Data Warehouse (EDW), Salesforce.com, and Siebel CRM is used to evaluate the accuracy of data and is reported to the sales team for lead qualification. Recent marketing history and recent contact activity are the key inputs to the lead qualification process. The platform is built on Java, Informatica, Oracle, and Unix.
MSL Business Rules
People who respond to marketing campaigns (~800k responses per year) are screened: 2-5% of end-user interest is qualified via a live agent, and the 95-98% who have interest with the company are populated in the myleads portal or the lead tab in SFDC (Salesforce.com). Contacts who responded to marketing recently and are not already in SFDC are treated as new contacts and sent to SFDC; contacts already in SFDC go to marketing history. Contacts showing interest in products are selected via flexible filtering rules; example filter rules include budget confirmed, time frame, and desire to be contacted.
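A sketch of this routing logic, assuming hypothetical field names, a 90-day recency window, and an in-memory stand-in for the SFDC contact lookup:

```python
from datetime import date, timedelta

existing_sfdc_contacts = {"alice@example.com"}  # stand-in for a Salesforce lookup

def qualifies(response: dict) -> bool:
    """Example filter rules: recent response, budget confirmed, wants contact."""
    recent = response["response_date"] >= date.today() - timedelta(days=90)
    return recent and response.get("budget_confirmed") and response.get("wants_contact")

def route_lead(response: dict) -> str:
    """Route a marketing response per the rules sketched above."""
    if not qualifies(response):
        return "discard"
    if response["email"] in existing_sfdc_contacts:
        return "marketing_history"   # already in SFDC
    return "send_to_sfdc"            # new contact for the myleads portal / lead tab

lead = {"email": "bob@example.com", "response_date": date.today(),
        "budget_confirmed": True, "wants_contact": True}
print(route_lead(lead))  # send_to_sfdc
```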
MBS – Innovative Research
Mortgage-Backed Securities are analyzed using this system to suggest a better design of hedge funds. Unstructured, semi-structured, and structured data are parsed by a Java scanner; Informatica ETL then standardizes the data into a data warehouse, and Java is used to produce and publish reports to a portal. Sybase and Sybase IQ are used for the data warehouse. Around 70 ETL processes, including auditing processes, are in use. The system is used by a leader in the financial industry.
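A toy Python dispatcher illustrating the parse-then-standardize step: structured (JSON), semi-structured (delimited), and unstructured lines are routed to different parsers. In the actual system this role is played by the Java scanner feeding Informatica; the record layouts below are invented.

```python
import csv
import io
import json

def parse_record(raw: str) -> dict:
    """Route a raw feed line to a JSON, delimited, or free-text parser."""
    raw = raw.strip()
    if raw.startswith("{"):                      # structured: JSON
        return json.loads(raw)
    if raw.count(",") >= 2:                      # semi-structured: delimited
        pool, coupon, balance = next(csv.reader(io.StringIO(raw)))
        return {"pool": pool, "coupon": float(coupon), "balance": float(balance)}
    return {"note": raw}                         # unstructured: keep as text

for line in ['{"pool": "FN-123", "coupon": 5.5, "balance": 1200000}',
             "FN-456,6.0,800000",
             "Servicer commentary: prepayments trending higher"]:
    print(parse_record(line))
```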
MBS – Business Requirement (Return, Risk, Cost)
Proactive approach to index changes; detailed control of risk and value; coordinated global trading; global technology platform; strict control of operational and investment risk; multi-faceted risk control structure; diverse client base leading to better trading costs; proprietary trading cost model; scale leading to better market trading.
Data Integration COE
A leading financial services provider implemented a COE (Center of Excellence) for data interchange. The system is built on MQ Series, Informatica RT (Real Time), Ab Initio ETL, mainframe systems, Informatica PowerConnect, and so on, and follows a Service-Oriented Architecture (SOA). Different costing models for the utilization/consumption of services were built in. Informatica 8.1 and MQ Series provide the failover options for reliability. DB2 UDB on Unix serves as the backend database.
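One way to picture a consumption-based costing model is a simple rate card applied to metered usage. The services, rates, and figures below are assumptions for illustration, not the COE's actual pricing.

```python
# Hypothetical rate card for metered data-integration services.
RATE_CARD = {
    "realtime_message": 0.002,   # per message moved via MQ / Informatica RT
    "batch_gb":         1.50,    # per GB moved in batch ETL
}

def monthly_charge(usage: dict) -> float:
    """Compute a consumer's monthly charge from metered service usage."""
    return sum(RATE_CARD[service] * quantity for service, quantity in usage.items())

print(monthly_charge({"realtime_message": 250_000, "batch_gb": 40}))  # 560.0
```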
DI COE – Technical Architecture
Be InformEd – DW for Colleges and Schools
Be InformEd is a data warehouse product developed at the request of Kaizentric's partner, Saveetha Engineering College. The college had an operational system; their requirement was to build a data warehouse on top of it with intelligent data mining capability to improve their service to students, staff, and parents. The DW & BI solution is built on SaaS architecture (Software as a Service). An ETL process is used to standardize and load information from the college database to the data warehouse hosted on Kaizentric's premises, which also offers disaster recovery capability. Databases: MS SQL Server as source and Oracle as target. The BI (Business Intelligence) reports are delivered over the web.
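A minimal sketch of the standardize step that sits between extract and load, with the SQL Server and Oracle I/O stubbed out; the column names and the attendance record layout are assumptions.

```python
def standardize_attendance(row: dict) -> dict:
    """Normalize one attendance record pulled from the college's source system."""
    return {
        "student_id": row["StudentID"].strip().upper(),
        "att_date": row["AttDate"],                 # assumed already ISO yyyy-mm-dd
        "present": row["Status"].strip().lower() in {"p", "present"},
    }

# In the real pipeline these rows would come from MS SQL Server and the
# standardized output would be loaded into the Oracle DWH.
source_rows = [
    {"StudentID": " cs1021 ", "AttDate": "2009-03-02", "Status": "Present"},
    {"StudentID": "EC2040",  "AttDate": "2009-03-02", "Status": "A"},
]
warehouse_rows = [standardize_attendance(r) for r in source_rows]
print(warehouse_rows)
```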
Be InformEd – Architecture
Data sources at the educational institute (student, staff, marks, attendance, and others) feed a staging area and the Oracle DWH hosted at Kaizentric's location, which serves the interfaces and BI reports. Process components cover data source identification, data mapping, data audit and data quality definition, data validation, and data standardization (rectify data errors and enrich data), with data unification components and code administration for data consolidation, and business rules for data integrity, validation rules for cleansing, and transformation rules for formatting and consolidation.
Our Team
Thank you
For clarifications, please contact:
Azhagarasan Annadorai
Kaizentric Technologies Pvt Ltd
+91-90947-98789
azhagarasan@kaizentric.com
www.kaizentric.com
Head office: New #126, Old #329, Arcot Road, Kodambakkam, Chennai 600 024, India
Phone: +91-44-64990787
