The document discusses the business intelligence (BI) lifecycle, which includes 6 key stages: 1) Analyzing business requirements, 2) Designing a data model, 3) Designing the physical schema, 4) Building the data warehouse, 5) Creating project metadata, and 6) Developing BI objects. It also describes the Enterprise Performance Lifecycle (EPLC) framework, which manages project deliverables and reviews across various stages to minimize risk and ensure best practices are followed throughout the project lifecycle.
2. Introduction
It’s no secret that Business Intelligence (BI) projects are both time-consuming
and resource-intensive, often suffer from poor communication between the
business and IT, and are usually inflexible to change once development has
started. This is due, in large part, to the method by which BI projects are
traditionally implemented. Regardless of the methodology employed, a
successful BI iteration requires:
• Business requirements identification
• Data analysis
• Data architecture and modeling
• Data integration (ETL, ELT, data virtualization, etc.)
• Front-end development
• Testing and release management
3. 1. Business requirements identification
• Define requirements precisely
• Identify and prioritize requirements
E.g.: Swiggy works as a bridge between buyers and sellers, using an
innovative technology platform that acts as a single point of contact.
Its business strategy covers not only food delivery but also medicines,
groceries, gift shops, flower shops, etc.
2. Data analysis is the process of inspecting, cleansing, transforming,
and modeling data with the goal of discovering useful information,
informing conclusions, and supporting decision-making.
4. 3. Data architecture and modeling - Data architecture is concerned with which
tools and platforms to use for storing and analyzing data, while data modeling
focuses on the representation of that data. Data architecture is about the
infrastructure housing the data, while data modeling is all about the accuracy
of the data.
E.g.: Microsoft Azure is a cloud computing service created by Microsoft for
building, testing, deploying, and managing applications and services through
Microsoft-managed data centers. It provides software as a service (SaaS),
platform as a service (PaaS), and infrastructure as a service (IaaS), and
supports many different programming languages, tools, and frameworks,
including both Microsoft-specific and third-party software and systems.
4. Data integration refers to the technical and business processes used to
combine data from multiple sources to provide a unified, single view of the data.
• Extract, Transform and Load (ETL): copies of datasets from disparate sources
are gathered together, harmonized, and loaded into a data warehouse or database.
• Extract, Load and Transform (ELT): data is loaded as-is into a big data system
and transformed at a later time for particular analytics uses.
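A minimal ETL sketch, using Python's standard `sqlite3` module as a stand-in warehouse; the source rows are invented for illustration.

```python
import sqlite3

# Hypothetical source extract; in practice this comes from files or databases.
source_rows = [("1", "alice ", "200"), ("2", "BOB", "350")]

conn = sqlite3.connect(":memory:")  # stands in for the data warehouse
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, spend REAL)")

# ETL: transform in a staging step *before* loading into the warehouse.
staged = [(int(i), n.strip().title(), float(s)) for i, n, s in source_rows]
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", staged)

total = conn.execute("SELECT SUM(spend) FROM customers").fetchone()[0]
print(total)  # 550.0
```

Under ELT, by contrast, the raw rows would be loaded unchanged and the same cleanup expressed later as SQL inside the warehouse itself.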
5. The five critical differences between ETL and ELT:
• ETL is the Extract, Transform, and Load process for data; ELT is the
Extract, Load, and Transform process.
• In ETL, data moves from the data source into staging and then into the
data warehouse.
• ELT leverages the data warehouse to do basic transformations; there is no
need for data staging. (Data staging is one of the data warehousing processes
in BI. BI provides mechanisms for staging data (master data, transaction data,
metadata) from various sources and describes how the data can be transferred
from those sources. Extraction and transfer generally take place when
requested by BI.)
• ETL can help with data privacy and compliance by cleansing sensitive and
secure data even before loading it into the data warehouse.
• ETL can perform sophisticated data transformations and can be more
cost-effective than ELT.
7. • Data virtualization - data from different systems is virtually
combined to create a unified view rather than being loaded into a new
repository. It is an approach to data management that allows an
application to retrieve and manipulate data without requiring
technical details about the data, such as how it is formatted at the
source or where it is physically located, and it can provide a single
view of the overall data.
• Unlike the traditional extract, transform, load (ETL) process, the
data remains in place, and real-time access to the source system is
given for the data. This reduces the risk of data errors and the
workload of moving around data that may never be used, and it does
not attempt to impose a single data model on the data.
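The "data stays in place" idea can be sketched as a view function that combines two sources at query time; both sources and all field names here are invented for illustration.

```python
# Data-virtualization sketch: a unified view over two "systems" that
# leaves the data in place and fetches it on demand.
crm = {"c1": {"name": "Acme"}, "c2": {"name": "Globex"}}       # system 1
billing = {"c1": {"balance": 120.0}, "c2": {"balance": 80.0}}  # system 2

def customer_view(customer_id):
    """Combine both sources at query time; nothing is copied or staged."""
    record = {"id": customer_id}
    record.update(crm.get(customer_id, {}))
    record.update(billing.get(customer_id, {}))
    return record

print(customer_view("c1"))  # {'id': 'c1', 'name': 'Acme', 'balance': 120.0}
```

A real virtualization layer does the same joining against live databases and APIs, with caching and query pushdown, but the principle is the same: the unified record is assembled per request, not persisted.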
8. 5. Front end development - refers to ‘client side’ development, where
the focus is on what users visually see first in their browser or
application. Front end developers are responsible for the look and feel
of the application. It uses:
• HTML - HyperText Markup Language - create the structure, layout
• Cascading Style Sheets (CSS) - Stylize the website
• JavaScript - Increase interactivity
9. 6. Testing and release management - Testing verifies the staging data,
ETL process, BI reports and ensures the implementation is correct.
Release Management is the process that handles software
deployments and change initiatives. It starts with planning what will be
contained within a release, managing the software build through
different stages and environments, testing stability and finally,
deployment.
10. Business Intelligence Life cycle
The challenge with the way in which BI projects are usually
implemented is that the design and development steps across the
architecture aren't integrated with input from the business.
In addition, since the tools usually employed for design and
development aren't integrated with each other, both initial development
and subsequent changes require a significant amount of manual effort
and change management.
If companies want to improve upon the traditional BI development
process, they need to start approaching this process as a business
driven lifecycle, as well as integrate and automate as much of the
development process as possible.
11. Meaning
Business intelligence lifecycle management is a design, development
and management approach to BI that incorporates business users into
the design process and focuses on generating the data models,
database objects, data integration mappings and front-end semantic
layers directly from the business user input.
12.
13. Analyze Business Requirements - Review business requirements to determine the
types of analysis users need to perform.
Design Data Model - Based on the business requirements, design the logical data
model, which shows the information that users want to analyze and the
relationships that exist within the data.
Design the Physical Schema - Using the data model, design the physical schema,
which defines the content and structure of the data warehouse.
Build the Data Warehouse - Build the data warehouse according to the schema
design and load data into the warehouse from source systems.
Create the Project Structure (Metadata) - Create the metadata and begin to
connect and map the metadata to tables in the data warehouse.
Develop the BI Objects - Develop objects, such as attributes, facts, metrics, reports
and dashboards.
Administer and Maintain the Project - Administer and maintain the project as it
undergoes continued development and changes, monitor performance and make
adjustments to improve it, manage security, and perform other ongoing
administrative tasks.
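The metadata step above can be illustrated with a small sketch: logical BI objects (attributes, facts, metrics) mapped to physical warehouse tables, from which reports can be generated. All table and column names are hypothetical.

```python
# Sketch of project metadata: logical BI objects mapped to physical
# warehouse tables and columns (all names are hypothetical).
metadata = {
    "attributes": {
        "Customer": {"table": "dim_customer", "id_col": "customer_id",
                     "desc_col": "customer_name"},
        "Month":    {"table": "dim_time", "id_col": "month_id",
                     "desc_col": "month_name"},
    },
    "facts": {
        "Revenue": {"table": "fact_sales", "column": "revenue_amt"},
    },
    "metrics": {
        "Total Revenue": {"fact": "Revenue", "aggregation": "SUM"},
    },
}

def metric_sql(metric_name):
    # Generate the SQL for a metric from the metadata alone.
    m = metadata["metrics"][metric_name]
    f = metadata["facts"][m["fact"]]
    return f'SELECT {m["aggregation"]}({f["column"]}) FROM {f["table"]}'
```

Because reports are generated from the mapping, renaming a warehouse column only requires updating the metadata, not every report.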
14. EPLC framework elements
Enterprise Performance Life Cycle (EPLC) is a framework to enhance
Information Technology (IT) governance through rigorous application of
sound investment and project management principles and industry
best practices.
Enterprise Performance Life Cycle (EPLC) Framework Elements
describes the essential elements of the EPLC framework: the life cycle
stages, the stakeholders, the activities and deliverables of the various
phases, the exit criteria, and the project reviews. The EPLC framework is
designed to offer the flexibility required to effectively manage risk, funds,
and benefits, while allowing for variations in project size, complexity,
scope, duration, and approach.
15. The EPLC framework organizes the activities, deliverables, and reviews of a
project into various life-cycle stages. It offers a project management approach
that guides project managers, business owners, essential partners, IT
representatives, and other stakeholders through the life cycle stages of the
project, ensuring that a project perspective is maintained during planning,
implementation, and other procedures. Although one of the purposes of the
EPLC framework is to standardize project management based on best
practices, the framework also allows certain methods to be tailored to the
particular circumstances of each project, such as its size, duration,
complexity, and delivery strategy.
Applying the EPLC framework to project management is intended to
minimize risk across projects and investment portfolios. IT projects are
handled and executed in a planned way, using effective project management
practices and ensuring strong participation by business stakeholders and
technical specialists throughout the project’s life cycle.
16. Life Cycle Phases
The EPLC framework involves ten life-cycle stages. Below are the stages
and a brief explanation of each stage:
1. Concept: This phase identifies the high-level business and
functional needs essential to shape the product and the overall
benefits of the planned investment. The outcome of this
phase is stakeholder agreement on the initial project
benefits, scope, cost, schedule and performance, and a rough
estimate of project size.
2. Initiation: This phase identifies the business requirements, a rough
estimate of project cost and schedule, and the fundamental business
and technical risks. The outcome of this stage is the decision to
invest in a complete business case analysis, an approved project
charter, and an initial project management plan.
17. 3. Planning: This phase completes the development of the full
project management plan and refines the project cost, schedule
and performance baselines as required. The outcome of the planning
stage is that sufficient planning and adequate requirements are
established to confirm the development approach and the project baselines.
4. Requirements Analysis: This phase develops detailed functional and
non-functional requirements, creates the Requirements Traceability Matrix
(RTM), and awards contracts if required. The outcome of the
Requirements Analysis phase is the award of the necessary contracts and
approval of the requirements.
5. Design: This phase prepares the design document. The outcome of the
design stage is completion of the business product design and successful
completion of the preliminary and detailed design reviews.
18. 6. Development: This phase develops the code and other
deliverables essential to assemble the business product. The
outcome of the development stage is the successful completion of all
coding and related documents: user, operator and maintenance
documentation, and test planning.
7. Test: This phase carries out detailed testing and review of the business
product’s design, code and documentation. The outcome of the test phase
is completed acceptance testing and readiness for the preparation and
revision of the project plan.
8. Transition: This phase conducts user and operator training, determines
readiness to implement, and carries out the implementation plan,
including any phased deployment. The outcome of the transition stage is
the successful establishment of full production capability and completion
of the post-implementation review.
19. 9. Operations: This phase operates and maintains the production system
and carries out annual operational assessments. The outcome of the
operations stage is successful operation of the asset against current
cost, schedule and performance baselines.
10. Disposition: This final phase retires the asset when operational
analysis indicates that it is no longer cost-effective to operate. The
outcome of this phase is the deliberate and systematic decommissioning
of the business product with proper consideration of data archiving and
security, migration of data or functionality to new assets, and
incorporation of lessons learned over the investment life cycle.
In the EPLC framework life cycle, each stakeholder plays an important role
in the implementation of the framework and the success of
projects. The responsibilities of each stakeholder vary throughout the
life cycle.
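The ten phases above can be sketched as an ordered sequence with a simple gate rule: a project advances only after the current phase's exit review is passed. This is an illustrative model, not part of the official framework.

```python
from enum import IntEnum

# The ten EPLC phases in order; in this simplified gate model a project
# moves forward only when the current phase's exit review is passed.
class Phase(IntEnum):
    CONCEPT = 1
    INITIATION = 2
    PLANNING = 3
    REQUIREMENTS_ANALYSIS = 4
    DESIGN = 5
    DEVELOPMENT = 6
    TEST = 7
    TRANSITION = 8
    OPERATIONS = 9
    DISPOSITION = 10

def advance(current, exit_review_passed):
    # Stay in the current phase until the gate is cleared; Disposition
    # is terminal.
    if not exit_review_passed or current is Phase.DISPOSITION:
        return current
    return Phase(current + 1)
```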
20. Human factors in BI
Business intelligence synchronizes with business data so that
decisions can be made rationally. As businesses mature, a variety of
business intelligence solutions are being introduced.
Whenever an organization chooses a business intelligence solution,
it anticipates a good return on the investment. Some organizations
fail to turn this opportunity into business growth and must then begin
root cause analysis to find out why the expected return was not
achieved. Every successful business intelligence implementation,
irrespective of size and scope, must deal with how the project is
influenced by human factors. The best possible processes can be put in
place and the finest software and infrastructure technologies acquired,
yet human factors still determine success.
21. Human Factors in BI Implementation refers to the abilities, methodologies,
tools, technologies and practices used to enhance decision making;
a BI system is therefore known as a decision support system. Even though
there are various aspects that could influence the implementation of a BI
system, studies reveal the following significant human factors for a
business intelligence implementation:
• Focus on business processes and needs: Organizations frequently
focus on technological capabilities and pay no attention to how
business operations are to be carried out and what the chief business
needs are. Once these have been established, it is easier to build a more
effective BI system.
• Focus on attaining a strong ROI (Return on Investment): This requires
developing a sound business case, setting up key performance
measures, establishing baselines and targets for those measures, and
evaluating performance.
22. • Strong project management and resource commitment: Make sure
there is an effective project manager to motivate the team and take part
in the project.
• Commitment from organization executives: Along with the support from
a manager, backing from the CEO and senior executives is
also essential.
• Take time to plan ahead: Make sure things are planned
properly at the start of the project rather than wasting time
resolving problems later.
• Sufficient training and change management: Help people
understand and efficiently use the BI system.
23. Business Intelligence strategy
A business intelligence strategy refers to all the steps undertaken in
order to implement business intelligence in a company. It starts with
diving into the BI process and defining the stakeholders and main actors,
then moves on to assessing the situation, defining the goals and finding
the performance indicators that will help measure the effort toward these
goals.
We define the strategy in terms of vision, organization, processes,
architecture and solutions, and then draw a roadmap based on the
assessment, the priority and the feasibility.
24. How To Create A Business Intelligence Strategy
• Assess the situation: analyze the organizational structure, processes and
software stack – or the absence of such. Find out what is working and what
isn’t, to save time on already functioning processes. Ask the right business
questions and define the strategic goals that company wants to achieve.
• Building the BI roadmap: It refers to establishing the steps before hitting the
road. It is to prepare ourselves for surprises and problems to handle.
• Defining your team: from the head of BI to the business analyst to the
developer, a solid team with clear roles that will be able to carry out the
different tasks is needed.
• Organizing your BI system: the data warehouse, the data sources, the
software drawing out insights… There’s a lot of thinking behind this that
shouldn’t be neglected, as it will be the central tool to navigate the data and
bring out insightful analytics.
25. • Get ready to hit the road - As one would say, the company is now
ready to start. It has all the keys in hand to take the first step of the
roadmap and launch the new BI strategy.
26.
27.
28. BI Transformation process
BI Transformation Process (or Business Intelligence Transformation Process) is
an encapsulation of the depth and breadth of effort required to transform Data
Analytics and BI functions to excel in the digital age to support business
decisions and transform the company into a data-driven intelligent enterprise.
BI Transformation Approach
Value Proposition
Once the company is convinced about the value that BI brings to the
organization, they have to share that value proposition with others. The
concerned people have to “sell” the concept to the employees. It can be a
tough transition from making decisions based on instinct to making decisions
based on the analysis of data. Most companies have been successful with a
top-down approach; once the board and top management are convinced,
it’s easier to convince lower-level management.
29. Strategy
In this step the company has to be clear about the following: What is the company’s
vision for this system? What should BI do for the company? What are the company’s
business objectives and key performance indicators (KPIs)? Many businesses share
three main KPIs: risk level, productivity, and financial value, but how should those be
measured for a particular business? What are the informational needs of the entire
organization? Perhaps customer expectations are more important than the income of
the company.
Plan
Once the strategy is developed, a practical implementation plan needs to
be developed. It is better to start small, even if you have big ideas. It is likely that
there will be limited IT staff and an even more limited budget, so developing the
data analysis and essential reporting that will maximize cost-benefit is required.
Find initial areas of the business that make sense to undergo transformation. Companies
that have already implemented BI state that there are natural areas to use as
starting points, with many companies beginning with finance, logistics, and
customer relations management.
30. Build and Implement
Most companies use a mixture of vendors and products to build their BI system.
Better to keep it simple and think about ease of use for all people involved—
employees, customers, IT department and external partners. Ideally, groups
within the company should be able to create their own reports according to
their own unique needs, and should be able to share data across departments.
The system should be easy to configure, and it should be easy to import data
into and extract data out of the system. The data warehouse is the foundation
on which companies build other modules and reports, and the system should be
easily scalable as business changes and grows. The data interface and
presentation capabilities are key, and a majority of time should be spent on
designing the interface for various users.
31. Business Intelligence Transformation Drivers
What is motivating big companies to take the BI transformation
plunge? The following are the reasons and drivers as to why it makes
sense for one to transform a company’s business intelligence systems
and processes.
Business, as Usual, is a Death Knell
One of the main drivers for business intelligence transformation is the
cost of not taking the plunge. Companies that delay the transformation
do so at a significant cost. Companies without appropriate business
intelligence transformation risk competitive advantage and at times
their very survival in the fast-changing dynamics of the new economy.
32. Cost Factors
Maintenance of old inflexible systems is costly, and the lack of
scalability means that most companies spend money on other
applications or third party systems to make the antiquated system do
what they want the systems to do. A recent study showed that
organizations which moved to standardized BI tools had 36 percent
lower BI spend as a percentage of revenue.
There is also a hidden, more intangible cost of operational inefficiency
that comes with not having simple and consolidated BI tools for
employees, not to mention the lost opportunities of not having data
insight that can lead companies to new customers, product lines and
markets.
33. Constant Innovation
Today’s marketplace is one of constant innovation of products and their features.
Products, as well as new features of existing products, are released in a lightning-fast
time-to-market environment. If companies are trying to broaden sales scope, they
need better BI, because competitors are likely to have it. There is cut throat
competition, and more and more companies are turning to BI to survive in the
marketplace because it greatly reduces the time they need to react to market changes
and pressure from the competition. Smart companies also use it to reduce many risks
they face in today’s global market.
Effectiveness
Companies today have to be lean and effective, and BI dramatically increases
employees’ efficiency which in turn improves financial performance. Successful
companies are those that quickly respond in a company-wide fashion to the vastly
changing competitive marketplace. It’s not easy–in fact, it can be an onslaught.
Changes in legislation or technology, customer demand, and competitors’ actions all
require companies to be fast and nimble.
34. Underperformance
Many companies that have experienced declines in profits, market share and sales
have used BI transformation to turn low performance around.
Non-Traditional Thinking
Perhaps the most exciting driver, and sometimes the hardest one to follow, is that a
different way of thinking is an imperative in today’s market. We’re all used to the
traditional BI reports with historical information, which are no doubt useful. But BI
shifts the playing field by changing the kinds of questions you can ask and get
answered about your business. Traditional reports are after-the-fact: how was our
performance last quarter, and so on. With BI, there is predictive value. You can ask
questions like “What will happen if I change this part of my product line?” Companies
that ask this type of question see the impact that certain choices will have on
their overall strategy. They are the ones recognizing new risks as well as new niches.
The effectiveness and insight that a transformed business intelligence function
provides add tremendous value to the organization. There are many Business
intelligence transformation drivers. Irrespective of which business intelligence
transformation drivers are necessary, the key is to start BI transformation now to reap
the results.
35. Parallel development tracks
Every BI decision-support project has at least three development tracks
running in parallel after the project requirements have been defined
and before implementation.
1. The ETL Track
The ETL track is often referred to as the back end. The purpose of this
development track is to design and populate the BI target databases.
The ETL track is the most complicated and important track of a BI
decision-support project. The fanciest OLAP tools in the world will not
provide major benefits if the BI target databases are not designed
properly or if they are populated with dirty data. The team working on
the ETL track is usually staffed with knowledgeable business analysts,
experienced database administrators, and senior programmers.
36. 2. The Application Track
The Application track is often referred to as the front end. The purpose
of this development track is to design and build the access and analysis
applications. After all, the key reasons for building a BI decision-support
environment are to:
- Deliver value-added information
- Provide easy, spontaneous access to the business data
The team for the Application track is usually staffed with subject matter
experts, "power users," and programmers who know Web languages,
can effectively use OLAP tools, and have experience building
client/server-based decision-support applications that incorporate
graphical user interfaces.
37. 3. The Meta Data Repository Track
Meta data is a mandatory deliverable with every BI application. It can
no longer be shoved aside as documentation because it must serve the
business community as a navigation tool for the BI decision-support
environment. Therefore, the purpose of this development track is to
design, build, and populate a meta data repository. The team members
are responsible for designing and building the access interfaces as well
as the reporting and querying capabilities for the meta data repository.
The team working on the Meta Data Repository track is usually staffed
with a meta data administrator and developers who have experience
with building client/server-based interfaces and are knowledgeable
about Web applications.
40. Steps in BI road map
1. Go into the process with eyes wide open
The right mindset will help to address issues like complicated data
problems, change management resistance, waning sponsorship, IT
reluctance and user adoption challenges.
2. Determine stakeholder objectives
Though everyone at the organization benefits from increased data
access and insights, determining the key stakeholders is important. Then
find out what they need: visible and vocal executive sponsorship is a
must. Gathering and setting executive team expectations early is
paramount.
41. 3. Choose a sponsor
While a business intelligence strategy should include multiple
stakeholders, it is imperative to have a sponsor to spearhead the
implementation. It should be sponsored by an executive who has
bottom-line responsibility, a broad picture of the organization’s strategy
and goals, and who knows how to translate the company mission into
mission-focused KPIs. CFOs and CMOs are good fits. They can govern
the implementation with a documented business case and be
responsible for changes in scope.
4. BI is not just a technology initiative
To succeed, BI deployment must have the support of key business
areas. IT should be involved to ensure governance, knowledge transfer,
data integrity, and the actual implementation. But every stakeholder
and their respective business areas should also be involved throughout
the process.
42. 5. Employ a Chief Data Officer (CDO)
A CDO is required for data gathering, management, optimization, and security.
6. Assess the current situation
Usually a BI deployment isn’t quick or easy. There is a lot of work to do on the
front end. One of the biggest sections on a business intelligence roadmap
should be assessing the current situation. Once all the right stakeholders
agree on the BI implementation, the next step is analyzing the current software
stack, and the processes and organizational structures surrounding it. Find out
what is working and document everything that isn’t working.
7. Clean the data
Cleaning data may not be simple, but it will ensure the success of BI. It is
crucial to guarantee solid data quality management, as it will help maintain
the cleanest data possible for better operational activities and for the
decision-making that relies on that data.
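A minimal sketch of what cleaning can involve: trimming whitespace, normalizing casing, removing duplicates, and routing rows with missing required fields to a data-quality queue. The records and rules below are hypothetical.

```python
# Minimal data-cleaning sketch: trim whitespace, normalize casing,
# drop exact duplicates, and flag rows missing required fields.
def clean(records, required=("email",)):
    seen, cleaned, rejected = set(), [], []
    for rec in records:
        row = {k: v.strip().lower() if isinstance(v, str) else v
               for k, v in rec.items()}
        if any(not row.get(k) for k in required):
            rejected.append(row)        # route to a data-quality queue
            continue
        key = tuple(sorted(row.items()))
        if key not in seen:             # deduplicate
            seen.add(key)
            cleaned.append(row)
    return cleaned, rejected

raw = [{"name": " Alice ", "email": "A@X.COM"},
       {"name": "alice", "email": "a@x.com"},
       {"name": "Bob", "email": ""}]
```

Here the first two records normalize to the same row and are deduplicated, while the record with no email is rejected for follow-up rather than silently loaded.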
43. 8. Develop a “Data Dictionary”
For business intelligence to succeed there needs to be at least a
consensus on data definitions and business calculations. The lack of
agreement on definitions is a widespread problem in companies today.
For example, finance and sales may define “gross margin” differently,
leading to their numbers not matching.
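The gross margin example can be made concrete with a tiny data dictionary entry that pairs one agreed definition with one agreed calculation, so every department computes the figure the same way. The entry and figures are illustrative.

```python
# A tiny "data dictionary" entry: one agreed definition and one agreed
# calculation for gross margin, shared by finance and sales alike.
DATA_DICTIONARY = {
    "gross_margin": {
        "definition": "Revenue minus cost of goods sold, as a percentage of revenue",
        "formula": lambda revenue, cogs: (revenue - cogs) / revenue * 100,
    }
}

def gross_margin(revenue, cogs):
    # Every department calls the same shared formula.
    return DATA_DICTIONARY["gross_margin"]["formula"](revenue, cogs)
```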
9. Identify key performance indicators (KPIs)
KPIs are measurable values that show how effectively a company is
achieving their business objectives. They sit at the core of a good BI
strategy. KPIs indicate where businesses are on the right track and
where improvements are needed. When implementing a BI strategy, it
is crucial to consider the company’s individual strategy and align KPIs to
company’s objectives. It may be tempting to create KPIs for everything.
It is best to start with the most important KPIs; then create standards
and governance with KPI examples in mind.
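A KPI can be sketched as a measurable value compared against a target aligned to a business objective. The helper below is illustrative, not part of any BI tool; the numbers are hypothetical.

```python
# KPI sketch: compare an actual value against its target and report
# attainment; some KPIs (e.g. risk level) are better when lower.
def kpi_status(actual, target, higher_is_better=True):
    attainment = actual / target * 100
    on_track = attainment >= 100 if higher_is_better else attainment <= 100
    return {"attainment_pct": round(attainment, 1), "on_track": on_track}
```

For example, revenue of 90 against a target of 120 gives 75% attainment and is flagged as off track, while a risk score of 4 against a ceiling of 5 is on track.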
44. 10. Choose the right tool for business
Companies need to make sure to choose a solution that can start small but
easily scale as the company and its needs grow. Look for flexible solutions
that address the needs of all the users in the company.
11. Pursue a phased approach
A successful BI strategy takes an iterative approach. Think “actionable” and take
baby steps. Choose a few KPIs and build a few business dashboards as
examples. Gather feedback. Repeat with new releases every few weeks.
Continuously check what is working and which stakeholders are benefiting.
A good BI roadmap doesn’t have an end date. Organization should be invested
in it for the long term. Companies should continually measure and refine
processes, data and reports.
45. BI Framework
The advantages of Business Intelligence to your organization are many.
But the trick is making BI work at the lowest possible cost. How to
make sure the right information reaches the right user, and in the right
form? And without generating an entire battery of management
dashboards and reports. How to tackle this issue smartly and without
investing a ton of money, while maintaining the proper, high quality?
A BI framework seamlessly connects the various elements of a business:
organizational roles, KPIs, authorization, and visualization. This helps to
implement Business Intelligence plans more easily and quickly.
A BI framework helps to structure the improvement process of
Business Intelligence. On top of that, it lets you implement the BI strategy
in a very cost-effective way. Developing this framework makes
high-quality BI available at a reasonable price.
46. Why do you need a BI framework?
• Relevant information which is effortlessly available to whoever needs
it, at any given time.
• Information is 100% current, interactive, and fast.
• Users no longer need to struggle with complex file structures or click
through myriad tabs looking for the right report or dashboard.
Advantages
• Making a new dashboard available in just a few clicks.
• Developing new BI applications within a few days.
• The ability to create new mobile BI applications quickly.
• Lower administrative costs and a lower Total Cost.
• Excellent performance due to the high rate of reuse and caching.
(Caching - data stored for future use)
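The reuse-and-caching point can be illustrated with a memoized report function: the expensive query runs once, and repeated dashboard requests are served from the cache. The report itself is a hypothetical stand-in.

```python
from functools import lru_cache

calls = {"count": 0}   # track how often the "expensive query" runs

@lru_cache(maxsize=128)
def report(region):
    # Stand-in for an expensive warehouse query; the result is cached
    # and reused for repeated requests with the same arguments.
    calls["count"] += 1
    return f"sales report for {region}"

report("EMEA")
report("EMEA")   # second call is served from the cache
```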
47. The results of our Business Intelligence framework
• More effective use of Business Intelligence software by more users.
• More successful BI projects that cost less and add more value.
• More efficiency across the organization, meaning higher margins.