Kaizentric is a data analytics firm based in Chennai, India. Statistical analysis is performed on a well-built, client-specific data warehouse, supported by data mining.

Kaizentric Presentation

  1. 1 May 2009. Experts in Data Warehousing, BI and Data Mining.
  2. About Kaizentric
     • Kaizentric Technologies was founded in February 2008.
     • The founder has 13 years of data warehousing experience. Team strength is 20, scalable to 200 with resources from partners; staff can be hired on demand within 2 to 4 weeks.
     • Serves insurance clients on data management technologies.
     • Conducts extensive research in data warehousing.
     • Develops on-demand BI products (SaaS architecture).
     • Experienced in building data warehouses using Oracle, Sybase and SQL Server as backend databases, Informatica as the ETL tool and Kalido for data warehouse lifecycle management. We specialize in data mining using SAS and reporting using Business Objects and MicroStrategy.
  3. Kaizentric’s Business
  4. Experience in Data Warehousing

     Technology focus                      Project                                    Duration
     Data Warehouse                        Photons – Insurance Data Warehouse         7 months
     Data Warehouse                        Hornet – HR Repository                     12 months
     Data Warehouse                        Group Benefits Insurance                   9 months
     Data Integration                      Marketing and Sales Linkage                4 months
     Data Warehousing and Data Mining      Mortgage Backed Securities                 6 months
     Data Integration Services             Data Integration Center of Excellence      2 years
     Data Warehousing and Data Mining      Be InformEd – Education Industry           4 months
  5. Kaizentric’s Products
  6. Kaizentric’s Products (work in progress)
  7. Photons – Insurance DW
     • Photons is a data warehouse product developed by Kaizentric to help small and medium enterprises.
     • Built on SaaS (Software as a Service) architecture.
     • Focuses on employer benefits / group benefits.
     • Built on the ACORD data model standard.
     • About 50 dimension tables cover customer, product and employee profiles, and about 10 fact tables hold transactions for claims processing, benefit payments, other overheads, reserves, etc. (a minimal schema sketch follows this slide).
     • About 30 ETL processes load data from a typical ACORD model into this data warehouse.
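     A minimal schema sketch, in Oracle SQL, of the dimension/fact structure described above. Table and column names are illustrative assumptions, not the actual Photons or ACORD schema:

         CREATE TABLE dim_customer (
             customer_key    NUMBER       PRIMARY KEY,  -- surrogate key
             customer_id     VARCHAR2(20),              -- natural key from the source system
             customer_name   VARCHAR2(100),
             segment         VARCHAR2(30),
             effective_date  DATE,
             expiry_date     DATE                       -- slowly changing dimension validity
         );

         CREATE TABLE fact_claim_payment (
             customer_key    NUMBER REFERENCES dim_customer (customer_key),
             product_key     NUMBER,                    -- FK to a product dimension
             claim_date_key  NUMBER,                    -- FK to a date dimension
             claim_amount    NUMBER(12,2),
             benefit_paid    NUMBER(12,2),
             reserve_amount  NUMBER(12,2)
         );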
  8. Photons – Architecture
     Physical components: data sources (OpCo, POS, EDI, Siebel, Master Data) feed a source staging area, which loads the Oracle DWH, which in turn serves interfaces and BI reports.
     Process components: data source identification; data mapping; data validation and unification; code administration for data consolidation against the ACORD data standard; rectification of data errors and data enrichment; data audit and data quality definition; and business rules for data integrity, validation rules for cleansing, and transformation rules for formatting and consolidation (an illustrative rule follows this slide).
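     An illustrative standardization and validation step of the kind listed under the process components above, assuming hypothetical staging tables stg_policy_raw and stg_policy_clean:

         INSERT INTO stg_policy_clean (policy_id, opco_code, effective_date)
         SELECT TRIM(policy_id),                    -- formatting rule: strip stray whitespace
                UPPER(NVL(opco_code, 'UNKNOWN')),   -- consolidation rule: default missing OpCo codes
                effective_date
         FROM   stg_policy_raw
         WHERE  policy_id IS NOT NULL               -- integrity rule: key must be present
         AND    effective_date <= SYSDATE;          -- validation rule: no future-dated policies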
  9. Hornet – HR Data Warehouse
     • Another data warehouse product by Kaizentric, which assists HR consultants in finding the right candidate for a given job and vice versa (a matching-query sketch follows this slide).
     • Unstructured information from emails and job portals is cleansed, standardized and stored in a data warehouse.
     • ETL processes are used throughout the data transformation.
     • Customized data mining reports are generated for every client, demonstrating the usefulness of the product.
     • The product is used in the USA for IT recruitment.
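     A minimal sketch of a jobs vs. candidates matching query, assuming hypothetical tables job_required_skill and candidate_skill built from the cleansed email and portal data:

         SELECT j.job_id,
                c.candidate_id,
                COUNT(*) AS matched_skills          -- required skills the candidate covers
         FROM   job_required_skill j
         JOIN   candidate_skill    c ON c.skill_code = j.skill_code
         GROUP  BY j.job_id, c.candidate_id
         HAVING COUNT(*) >= 3                       -- illustrative threshold only
         ORDER  BY matched_skills DESC;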
  10. Hornet – Architecture
      Physical components: data sources (PDF and flat-file documents, hotlists, emails) feed a source staging area, which loads the Oracle DWH, which in turn serves the jobs vs. candidates reports.
      Process components: data source identification; data mapping; data validation and unification; code administration for data consolidation; rectification of data errors and data enrichment; data audit and data quality definition; and business rules for data integrity, validation rules for cleansing, and transformation rules for formatting and consolidation.
  11. Insurance – Claims Mining
      • The IBM IIW (Insurance Information Warehouse) data model is used for the data warehouse design.
      • The data warehouse covers disability products such as short-term disability, long-term disability, family medical leave and premium-waiver life products.
      • Informatica was used for ETL, Kalido for warehouse lifecycle management and SAS for data mining.
      • 1.5 TB of data comes from source systems such as claims, losses, reserves, manual claims, country-wide losses, premium, special investigation and litigation.
      • Data mining algorithms score claims based on demography, employer characteristics and other data (an illustrative scoring query follows this slide).
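      The actual scoring is done with SAS data mining models; as a stand-in, here is a simple rule-based risk score in SQL over assumed warehouse columns and purely illustrative thresholds:

          SELECT claim_id,
                 CASE WHEN claim_amount > 50000            THEN 30 ELSE 0 END   -- large claims
               + CASE WHEN litigation_flag = 'Y'           THEN 40 ELSE 0 END   -- litigation involvement
               + CASE WHEN employer_industry = 'HIGH_RISK' THEN 20 ELSE 0 END   -- employer characteristics
                   AS risk_score
          FROM   fact_claim
          ORDER  BY risk_score DESC;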
  12. Insurance – High-Level Design
      Diagram: sources A, B and C, along with spreadsheets and Access databases, are mapped to the Enterprise Logical Data Model; data flows through iStage and Stage into Kalido and onward to the mart. Each source carries party attributes such as first name, last name, street address, city, phone number, date of birth, Social Security number, age and employer.
      • The Enterprise Logical Data Model will not be built in a single effort; instead, projects requiring data will incrementally contribute to its build-out.
      • The logical data model lets users locate which systems contain particular data entities. Its attribute mappings show which source tables and attributes map to the enterprise logical data model, referencing every source that holds that type of data. In addition, the system of record is identified for each type (and possibly segment) of data.
      • The logical data model lets users know whether data elements are in the data warehouse and where in the environment they are located.
      • Other business-owned data sources (such as spreadsheets and Access databases) will also be mapped to the enterprise data model, giving the organization a better understanding of data outside the application portfolio and of duplicate data stored by the business.
      (A sketch of such an attribute-mapping structure follows this slide.)
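      One way to hold the attribute mappings this slide describes is a metadata table; the names below are assumptions for illustration:

          CREATE TABLE ldm_attribute_map (
              ldm_entity        VARCHAR2(50),   -- e.g. 'PARTY'
              ldm_attribute     VARCHAR2(50),   -- e.g. 'DATE_OF_BIRTH'
              source_system     VARCHAR2(30),   -- e.g. 'SOURCE_A', or a spreadsheet name
              source_table      VARCHAR2(50),
              source_column     VARCHAR2(50),
              system_of_record  CHAR(1)         -- 'Y' if this source is authoritative for the attribute
          );

          -- "Which systems hold date of birth, and which one is the system of record?"
          SELECT source_system, source_table, source_column, system_of_record
          FROM   ldm_attribute_map
          WHERE  ldm_attribute = 'DATE_OF_BIRTH';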
  13. MSL – Marketing and Sales Link
      • The disconnect between Marketing and Sales around lead generation is solved in this project through data interchange and integration.
      • Information from the marketing system is validated against the sales system, and unique, non-overlapping leads are suggested to the sales team.
      • Data from Party tables, the Enterprise Data Warehouse (EDW), Salesforce.com and Siebel CRM is used to evaluate data accuracy and is reported to the sales team for lead qualification.
      • Recent marketing history and recent contact activity are the key inputs to the lead qualification process.
      • The platform is built on Java, Informatica, Oracle and Unix.
  14. MSL Business Rules
      • People who respond to marketing campaigns (~800k responses per year) are evaluated; 2-5% of end-user interest is qualified via a live agent, and the 95-98% who have interest with the company are populated in the myleads portal or the lead tab in SFDC (Salesforce.com).
      • Contacts who show interest in products are selected based on flexible filtering rules. Example filter rules: budget confirmed, time frame, desire to be contacted.
      • Contacts who responded to marketing recently and are not already in SFDC are sent to SFDC as new contacts; contacts already in SFDC go to marketing history.
      (A sketch of such a filter rule in SQL follows this slide.)
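      A sketch of such a filter rule in SQL, assuming hypothetical tables mkt_response and sfdc_contact; the actual rules are flexible and configurable:

          SELECT r.contact_id, r.campaign_id, r.response_date
          FROM   mkt_response r
          WHERE  r.response_date    >= ADD_MONTHS(SYSDATE, -3)    -- responded to marketing recently
          AND    r.budget_confirmed  = 'Y'                        -- example filter: budget confirmed
          AND    r.contact_requested = 'Y'                        -- example filter: desire to be contacted
          AND    NOT EXISTS (SELECT 1
                             FROM   sfdc_contact s
                             WHERE  s.contact_id = r.contact_id); -- not already in SFDC: send as new contact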
  15. MBS – Innovative Research
      • Mortgage-backed securities are analyzed using this system to suggest better designs for hedge funds.
      • Unstructured, semi-structured and structured data are parsed by a Java scanner; Informatica ETL then standardizes the data into a data warehouse, and Java is used to produce and publish reports to a portal.
      • Sybase and Sybase IQ are used for the data warehouse.
      • Around 70 ETL processes, including auditing processes, are in use (an audit sketch follows this slide).
      • The system is used by a leader in the financial industry.
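      A common shape for the auditing processes mentioned above is a source-to-target row-count reconciliation; the table names are assumptions, not the actual audit schema:

          SELECT s.batch_id,
                 s.rows_extracted,
                 t.rows_loaded,
                 s.rows_extracted - t.rows_loaded AS rows_dropped
          FROM   audit_extract s
          JOIN   audit_load    t ON t.batch_id = s.batch_id
          WHERE  s.rows_extracted <> t.rows_loaded;   -- only batches that need investigation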
  16. MBS – Business Requirement (Return, Risk, Cost)
      • Proactive approach to index changes.
      • Detailed control of risk and value.
      • Coordinated global trading.
      • Global technology platform.
      • Strict control of operational and investment risk.
      • Multi-faceted risk control structure.
      • Diverse client base, leading to better trading costs.
      • Proprietary trading cost model.
      • Scale, leading to better market trading.
  17. Data Integration COE
      • A leading financial services provider implemented a COE (Center of Excellence) for data interchange.
      • The system is built on MQ Series, Informatica RT (Real Time), Ab Initio ETL, mainframe systems, Informatica PowerConnect and so on.
      • The system is built on a Service-Oriented Architecture (SOA).
      • Different costing models for the utilization and consumption of services were built in (a chargeback sketch follows this slide).
      • Informatica 8.1 and MQ Series provide failover options for reliability.
      • DB2 UDB on Unix serves as the backend database.
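      A sketch of one possible consumption-based costing model for the COE services, assuming a hypothetical usage log and rate card rather than the actual chargeback design:

          SELECT u.application,
                 u.service_name,
                 SUM(u.message_count)          AS messages_processed,
                 SUM(u.message_count * r.rate) AS charge
          FROM   svc_usage_log u
          JOIN   svc_rate_card r ON r.service_name = u.service_name
          GROUP  BY u.application, u.service_name;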
  18. DI COE – Technical Architecture
  19. Be InformEd – DW for Colleges and Schools
      • Be InformEd was a data warehouse product developed at the request of Kaizentric’s partner, Saveetha Engineering College.
      • The college had an operational system; the requirement was to build a data warehouse on top of it, with intelligent data mining capability, to improve service to students, staff and parents.
      • The DW and BI solution is built on SaaS (Software as a Service) architecture.
      • An ETL process standardizes and loads information from the college database into the data warehouse hosted on Kaizentric’s premises, which also provides disaster recovery capability (a load sketch follows this slide).
      • Databases: MS SQL Server source and Oracle target. The BI (Business Intelligence) reports are delivered over the web.
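      A minimal sketch of the load step into the Oracle warehouse, assuming a hypothetical staging table already extracted from the college’s MS SQL Server system:

          MERGE INTO fact_attendance f
          USING stg_attendance s
          ON (f.student_id = s.student_id AND f.class_date = s.class_date)
          WHEN MATCHED THEN
              UPDATE SET f.present_flag = s.present_flag            -- refresh already-loaded days
          WHEN NOT MATCHED THEN
              INSERT (student_id, class_date, present_flag)
              VALUES (s.student_id, s.class_date, s.present_flag);  -- load new attendance rows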
  20. Be InformEd – Architecture
      Physical components: data sources at the educational institute (student, staff, marks, attendance, others) feed a source staging area, which loads the Oracle DWH at Kaizentric’s location, which in turn serves interfaces and BI reports.
      Process components: data source identification; data mapping; data validation and unification; code administration for data consolidation; rectification of data errors and data enrichment; data audit and data quality definition; and business rules for data integrity, validation rules for cleansing, and transformation rules for formatting and consolidation.
  21. Our Team
  22. Thank you. For clarifications, please contact:
      Azhagarasan Annadorai
      Kaizentric Technologies Pvt Ltd
      +91-90947-98789, azhagarasan@kaizentric.com, www.kaizentric.com
      Head office: New #126, Old #329, Arcot Road, Kodambakkam, Chennai 600 024, India. Phone: +91-44-64990787
