I’ll be giving you a brief overview of the project I am going to be working on this year. I will talk a bit about the company, what they do, and what I will be working on with the company.
Definition of data integration
DI can be used for querying data across multiple sources.
The Company
Torque IT Solutions is a start-up that started last year, and the current system has only the bare minimum. The company provides I.T. solutions for automotive finance companies and dealers by assisting car dealerships, distributors, manufacturers etc. to make the most profit from car sales, finance and income, and after-sales, i.e. servicing and parts.

Dealer Performance Management System
I will be working on the DPMS, which stands for Dealer Performance Management System. This is the web application that will be used by the car dealers to help them make decisions.

Show Profit Potential
The system shows profit potential and where business and operational process improvements can be made, using KPIs (key performance indicators) that evaluate the success of deals in car sales, insurance and so on. Example KPIs:
• ROI (rate of income): how much profit you make on a sale
• F&I handover time
• How many deals have been converted

Turn Data Into Information
These dealers already have records of sales; however, these records are data, not information. The purpose of this system is to turn this data into information, for example by finding trends in the data. Information is quite simply an understanding of the relationships between pieces of data, or between pieces of data and other information. In the system this is done through KPIs, which deliver improved performance, competitive advantage, innovation, the sharing of lessons learned and continuous improvement of the organisation.
Avoid Manual Data Entry
Currently the only way to input data into DPMS is manual data entry: the user must manually fill in 50 or so fields. This leaves room for the user to make mistakes or enter inaccurate data, and it also takes up a lot of time.

Use Existing Data
So we need a way for users to import their existing data from external systems into the application to help auto-complete fields. The process needs to be automated.

A Way to Extract Data
DPMS needs a way to use vehicle detail information from currently operated systems, i.e. Point of Sale (POS) systems and vehicle databases, and to allow users to auto-populate the vehicle detail fields.

Automated Import of Data
Design and implement an interface that will allow the user to load the data and auto-populate form fields.

Reconcile With Existing Data
The system will convert different data formats into a standardised format that will be consumed by the system.

Flexible Interface
The interface needs to be as flexible as possible so that it will be able to interface with as many systems as possible.
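As a concrete illustration, the flexible import interface could be built around a common source abstraction that each external system implements, plus an auto-populate step that fills only the empty form fields. This is a minimal sketch under assumptions: the names (VehicleDataSource, CsvDumpSource, the "registration" field) and the CSV format are illustrative, not DPMS's actual design.

```python
# Minimal sketch of a flexible import interface. All names are illustrative
# assumptions (not the real DPMS API): VehicleDataSource, CsvDumpSource,
# and the standardised "registration" field.
import csv
from abc import ABC, abstractmethod

class VehicleDataSource(ABC):
    """Common interface so new external sources can be plugged in easily."""

    @abstractmethod
    def fetch(self, query: str) -> list:
        """Return matching vehicle records as dicts with standardised keys."""

class CsvDumpSource(VehicleDataSource):
    """Reads vehicle records from a CSV export of an external system."""

    def __init__(self, path: str):
        self.path = path

    def fetch(self, query: str) -> list:
        with open(self.path, newline="") as f:
            rows = list(csv.DictReader(f))
        # Simple case-insensitive match on the registration field.
        return [r for r in rows if query.lower() in r.get("registration", "").lower()]

def auto_populate(form: dict, record: dict) -> dict:
    """Fill empty form fields from an imported record; user-entered values win."""
    return {field: form[field] or record.get(field, "") for field in form}
```

A new source system then only needs its own `VehicleDataSource` subclass (e.g. one that calls a POS web service) and the rest of the import pipeline stays unchanged.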
Data Sources
The DPMS user should have a choice of where to pull data from (Mototrack or POS, for example).
• P.O.S (Point of Sale): presume the Point of Sale system will provide a web service search.
• Data dump: contains a record of the table structure and the data from a vehicle database, usually in the form of a list of SQL statements. A customised import process will convert the data from the data dump.
A new set of tables will hold the reference data.

ETL
• Extract: I will extract the data from the export.
• Transform: then transform the data, e.g. combine or split columns and reformat values; if the data is in a different format, the idea is to make all the data consistent with our database.
• Load: finally, the data will be loaded into the reference tables.
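The three ETL steps above can be sketched end-to-end. This is a minimal illustration under assumptions: a hypothetical combined "vehicle" column that gets split, a dd/mm/yyyy source date format that gets standardised, and a ref_vehicle reference table; the real import process and schema will differ.

```python
# Minimal ETL sketch using an in-memory SQLite database. The "vehicle"
# column, the dd/mm/yyyy source format and the ref_vehicle table are
# illustrative assumptions, not the actual DPMS schema.
import sqlite3
from datetime import datetime

def extract(rows):
    """Extract: take the raw records as exported from the external system."""
    return list(rows)

def transform(rows):
    """Transform: split a combined column and standardise the date format."""
    out = []
    for r in rows:
        make, model = r["vehicle"].split(" ", 1)          # split one column into two
        sold = datetime.strptime(r["sold"], "%d/%m/%Y")   # source uses dd/mm/yyyy
        out.append((make, model, sold.strftime("%Y-%m-%d")))
    return out

def load(conn, rows):
    """Load: insert the standardised rows into the reference table."""
    conn.execute("CREATE TABLE IF NOT EXISTS ref_vehicle (make TEXT, model TEXT, sold TEXT)")
    conn.executemany("INSERT INTO ref_vehicle VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
raw = [{"vehicle": "Toyota Corolla", "sold": "05/10/2012"}]
load(conn, transform(extract(raw)))
```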
With a rise in the complexity of data and business demands, enterprises today find it challenging to handle a mass of application data and the relevant data issues in an efficient and flexible way. In this thesis, we proposed a Web-based data service system –
A scalable solution
TORQUE IT SOLUTIONS
BTECH 451: Empowering Automotive Finance Data Integration
Final Presentation
Shawn D’souza, Oct 2012
DEFINITION
NEED FOR DI
CHALLENGES FOR DI
APPROACHES
PREVIOUSLY
TECHNICAL DETAILS
DEMO
SOLUTION ANALYSIS
FUTURE WORK
CONCLUSION
EXPERIENCE GAINED
THANK YOU
Data integration involves combining data residing in different sources and providing users with a unified view of these data. Maurizio Lenzerini (2002). "Data Integration: A Theoretical Perspective". PODS 2002. pp. 233–246
Data Warehouse
Pros:
• Reports run against the Data Warehouse rather than your production database, so your production database can be dedicated to transactional processing rather than reporting
• Reporting can be faster
• Static metadata is provided in the Data Warehouse
Cons:
• Building or buying pre-built Data Warehouses is more expensive than a Live Data strategy
• "IT intensive", with heavy reliance on IT support
• Resource intensive to manage, maintain, and provide additional content on an ongoing basis
• The frequency of data being refreshed in the Data Warehouse may impact reporting
• Requires additional database software to store data and ETL software to populate your Data Warehouse

Live Reporting
Pros:
• Less costly
• Less complicated
• "IT lite", with much less reliance on IT resources
• Reports run against live production data rather than a Data Warehouse, so you know all data returned in reports is guaranteed to be the most recent data in the DPMS environment
• Reports may run up to 10 to 30 times faster with Live Data reporting than with the existing DPMS
Cons:
• If POS tables are purged, then tables will often have to be copied first if you want to report historical information with a Live Data strategy
• Report processing is shared with transactional processing on the DPMS database
• Querying on business activities, for statistical analysis, online analytical processing (OLAP), and data mining, in order to enable forecasting, decision making, and enterprise-wide planning
• To gain sustainable competitive advantages
• Requirements for improved customer service or self-service
• Data quality
  • The data integration team must promote data quality to a first-class citizen.
• Transparency and auditability
  • Even high-quality results will be questioned by business consumers. Providing complete transparency into how the data results were produced will be necessary to relieve business consumers’ concerns around data quality.
• Tracking history
  • The ability to correctly report results at a particular period in time is an ongoing challenge, particularly when there are adjustments to historical data.
• Reducing processing times
  • Efficiently processing very large volumes of data within ever-shortening processing windows is an ongoing challenge for the data integration team.
[Dittrich and Jonscher, 1999], All Together Now — Towards Integrating the World’s Information Systems
• Manual Integration
  • Users directly interact with all relevant information systems and manually integrate selected data.
• Common User Interface
  • The user is supplied with a common user interface (e.g., a web browser) that provides a uniform look and feel.
• Integration by Applications
  • Applications access various data sources and return integrated results to the user.
• Integration by Middleware
  • Reusable functionality that is generally used to solve dedicated aspects of the integration problem.
• Uniform Data Access
  • A logical integration of data is accomplished at the data access level.
• Common Data Storage
  • Physical data integration is performed by transferring data to a new data storage.
• Enterprise Information Integration (EII)
  • This pattern loosely couples multiple data stores by creating a semantic layer above the data stores and using industry-standard APIs such as ODBC, OLE-DB, and JDBC to access the data in real time.
• Enterprise Application Integration (EAI)
  • This pattern supports business processes and workflows that span multiple application systems. It typically works on a message-/event-based model and is not data-centric (i.e., it is parameter-based and does not pass more than one “record” at a time).
• Extract, Transform, and Load (ETL)
  • This pattern extracts data from sources, transforms the data in memory, and then loads it into a destination.
• Extract, Load, and Transform (ELT)
  • This pattern first extracts data from sources and loads it into a relational database. The transformation is then performed within the relational database and not in memory.
• Replication
  • This is a relational database feature that detects changed records in a source and pushes the changed records to a destination or destinations. The destination is typically a mirror of the source, meaning that the data is not transformed on the way from source to destination.
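The ETL/ELT distinction can be made concrete: in ELT the raw data is loaded into the database first, and the transformation then runs inside the database as SQL rather than in application memory. A minimal sketch, using an in-memory SQLite database and illustrative table and column names:

```python
# Minimal ELT sketch: load raw data first, then transform inside the
# relational database with SQL. Table and column names (staging,
# ref_vehicle, "vehicle") are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")

# Load step: raw, untransformed data goes straight into a staging table.
conn.execute("CREATE TABLE staging (vehicle TEXT)")
conn.executemany("INSERT INTO staging VALUES (?)",
                 [("Toyota Corolla",), ("Ford Focus",)])

# Transform step: runs in the database engine, not in application memory.
conn.execute("""
    CREATE TABLE ref_vehicle AS
    SELECT substr(vehicle, 1, instr(vehicle, ' ') - 1) AS make,
           substr(vehicle, instr(vehicle, ' ') + 1)    AS model
    FROM staging
""")
```

In the ETL pattern, by contrast, the split would happen in application code before any row reached the database.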
The Company: Torque IT Solutions
• Start-up
• Provides I.T. solutions for automotive finance companies and car dealerships
• Dealer Performance Management System
• Show profit potential
The Goal
Implement an interface that will allow users to import data from external databases
• Limitations
• Pros
  • Flexibility: allows new external data sources to be easily configured
• Cons
  • Exact match
• Bulk import
• Edge server caching
• Database caching at edge servers enables dynamic content to be replicated at the edge of the network, thereby improving the scalability and the response time of Web applications.
• Integrates data service technology and edge server data replication architecture, in order to improve Web services’ data performance and address a variety of data issues in the SOA network.
• Provide data services with edge server data replication to clients
• Increase data service performance
• Reduce client-perceived response time
• Make data consistency easier to achieve
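The reduction in client-perceived response time comes from answering repeat queries at the edge without an origin round trip. A minimal sketch with a hypothetical TTL-based cache; the class and its parameters are illustrative, not the actual edge-replication architecture described above:

```python
# Minimal sketch of edge-server result caching with a time-to-live (TTL).
# EdgeCache and its parameters are illustrative assumptions.
import time

class EdgeCache:
    """Caches query results at the edge to cut client-perceived response time."""

    def __init__(self, origin_fetch, ttl_seconds=60):
        self.origin_fetch = origin_fetch   # function that queries the origin database
        self.ttl = ttl_seconds             # how long a cached result stays fresh
        self.store = {}                    # query -> (result, fetched_at)
        self.origin_hits = 0               # counts round trips to the origin

    def get(self, query):
        entry = self.store.get(query)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]                # served from the edge: no origin round trip
        self.origin_hits += 1
        result = self.origin_fetch(query)
        self.store[query] = (result, time.time())
        return result
```

The TTL is the consistency knob: a shorter TTL keeps edge copies closer to the origin data at the cost of more round trips.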
• Importance of DI
• Issues for DI
• How you can improve DI
• Scalability considerations for DI