The document discusses business intelligence vendors and their capabilities. It notes that the winners will be those able to quickly gather, analyze, and use data to make decisions. It also discusses how vendors are integrating different business intelligence functions into unified suites and how database vendors are building predictive analytics directly into their databases to enable real-time decision making from transactional data.
Vendor comparisons: the end game in business intelligence
A great deal of past innovation in the business intelligence industry will culminate in the
emergence of players who will dominate the industry for years to come. The winners will be
companies which are able to reduce the latencies in data gathering, analysis and decision
making. Since information for decision making requires multi-dimensional data, users have to be
able to aggregate information from diverse sources; the information should include not only
quantitative information but also qualitative information such as conversations, notes, images and
videos. The patterns in the data have to be discerned and understood as quickly as possible so
that action is taken before an opportunity is lost.
Feedback from customers provides clinching evidence that ease of integration is the most valued
attribute for customers. Inter-linked transaction systems allow companies to aggregate data from
their CRM, SCM, production and financial systems. All this data has to be free from errors and
the definitions have to be consistent across all sources. The extraction of patterns of data is aided
by machine learning systems and comprehended quickly if it is vividly visualized. Decision
making and its implementation involves the broad majority of employees in a company who need
to be able to compare their data with agreed standards of performance before they can take
action. All employees have to be able to share the same data and access it in user-friendly form. In
order to take action, the processes of companies have to be flexible enough to respond to
situations as they happen.
The winners among vendors will have exceptional capability in implementation of large projects
pulling together capabilities in reporting and querying, multidimensional analysis, analytics, data
management, visualization and business process management. Pure play business intelligence
vendors, such as Cognos, Business Objects, Hyperion, as much as ERP vendors like
Oracle/Siebel, IBM and advanced analytics players, such as SAS and SPSS, and upstarts like
Qliktech are all looking to provide suites of business intelligence functionalities. While pure play
vendors have an edge in consolidating products, the ERP players have accumulated competence
in integrating applications, business processes and data; the advanced analytics group of
vendors has strengths in enhancing the value of data by drawing insights for decision making;
and upstarts continue to tap disruptive new technologies.
Enterprise scale business intelligence suites are the preferred flavor in the business community
eager to reduce complexity and costs. Pure Play business intelligence vendors have responded
to competition from ERP companies by tightly integrating their products. They are also agnostic
about the databases and applications and are more inclined to use service-oriented architecture
to be able to access any source of data on their network. BusinessObjects, for example, has
integrated its Crystal Enterprise and BusinessObjects products as a single suite which can be
operated with a common set of administrative tools thereby lowering the costs of installation. In
addition, users are able to take advantage of the composite of business intelligence functions
including reporting, ad hoc queries, OLAP and dashboards. Cognos 8 has unified its OLAP
(PowerPlay), Visualizer, Metrics Manager and NoticeCast into the services-oriented architecture
that undergirds its operational reporting tool ReportNet in a web-based environment. In addition,
Cognos has improved access to data from the entire enterprise as a result of integration with its
DecisionStream ETL tool which can now be managed by ReportNet. As a result of the
partnership with Composite software, Cognos 8 users have the ability to query data from any
database such as Oracle, DB2, etc.
On the other hand, database and ERP vendors such as Oracle, Teradata and IBM are
consolidating their products by integrating the business intelligence functions into their databases
which considerably lowers the latencies in the transfer of data for analytical purposes. Microsoft
bundles its online analytic processing (OLAP) with SQL Server and has added data mining and a
reporting server. Oracle's10g database incorporates several of the routine business intelligence
functions into its database. SAP, Oracle, Siebel and Microsoft all offer products with automated
business processes; an update on a table triggers processes within the database, sets
applications and business processes in motion, causes updates in other databases, initiates
communication with users, and even triggers remote procedures in external systems. A tight
integration of Siebel Analytics package into its CRM applications helps to steer workflow and
receive real-time information.
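As a minimal sketch of this idea, the Python snippet below uses SQLite as a stand-in for the commercial databases named above; the schema, threshold and table names are invented. A trigger inside the database reacts to an update on a transactional table and sets a downstream replenishment process in motion without any application code.

```python
import sqlite3

# Illustrative only: SQLite stands in for the enterprise databases discussed above,
# and the tables, columns and reorder rule are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE inventory (sku TEXT PRIMARY KEY, on_hand INTEGER, reorder_point INTEGER);
CREATE TABLE replenishment_queue (sku TEXT, requested_qty INTEGER, created_at TEXT);

-- The trigger fires inside the database when a stock update drops below the reorder
-- point, queueing a replenishment task for a downstream business process.
CREATE TRIGGER low_stock_alert AFTER UPDATE OF on_hand ON inventory
WHEN NEW.on_hand < NEW.reorder_point
BEGIN
    INSERT INTO replenishment_queue VALUES (NEW.sku, NEW.reorder_point * 2, datetime('now'));
END;
""")

conn.execute("INSERT INTO inventory VALUES ('SKU-1', 100, 40)")
conn.execute("UPDATE inventory SET on_hand = 25 WHERE sku = 'SKU-1'")  # falls below reorder point
print(conn.execute("SELECT * FROM replenishment_queue").fetchall())
```

A production system would route the queued task to the relevant application or process engine rather than a local table, but the mechanism is the same: the event is detected and acted upon where the transactional data lives.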
All this is much harder with pure play business intelligence vendors who have to partner or make
risky acquisitions to achieve the same objective. These capabilities are important to lower the
latencies between the time a decision is taken and relevant actions are executed.
The other major advantage the traditional ERP and database vendors have to offer are their
platforms that support the broad range of functions such as composite applications, business
processes and data integration technologies. In the web services and SOA environment,
platforms are particularly useful to join myriad services. Within an SOA framework such as Oracle's
Project Fusion, SAP's mySAP or Microsoft's SQL Server 2005, a diversity of functionalities can
be incorporated, rendering business intelligence packages irrelevant.
REAL TIME PREDICTIVE ANALYTICS AND DATABASE PROVIDERS
One instance of the advantage database providers have in the business intelligence domain is
their ability to build in predictive analytics required for operational purposes. Several decisions are
recurring in nature yet they require up-to-date data for sound decisions. Financial institutions
have to be able to make judgments about credit worthiness, increasingly in real time, before they
can accept credit card applications from retail customers. Similarly, customers are often swayed
by fads when they choose colors for their cars or clothes or specific models. Other times some
combinations of products sell well and they are better displayed next to each other. Customer
churn, cross-selling and pricing policies are other problems that need to be addressed in real
time. Sales people have to be able to make impromptu decisions about their stocking policy as
such information flows into their transaction databases.
In the world of data warehouses and cubes, any kind of data analysis is preceded by an elaborate
process of cleaning and reformatting data before it will be ready for analysis. In operational
decisions, moreover, data volumes become overwhelmingly large and data warehouses much too
clunky to cope with the pressures of real time decision making. Increasingly, database providers
are looking to build-in canned models for the analysis of data required for decisions in routine
processes.
The change has come with the advent of the Predictive Model Markup Language (PMML), an
open-standards XML-based language, which facilitates the transfer of models created in one
environment, such as SAS, to a relational database. The XML tagging helps to
describe data inputs into data mining models, the transformations used in preparing data for data
mining, and the parameters defining the data mining models so that the data mining algorithms
can be transferred to any environment whether it is CRM, SCM or production data.
IBM, for example, has a partnership with SAS, to create scoring models and transfer them to the
relational context of its DB2 database. While SQL is not meant to address complex queries, it has
the ability to find answers to relatively simple questions of most operational staff. Once the
predictive models are embedded in relational databases, they can be accessed and modified
using SQL. The data for such purposes is drawn directly from transactional databases and does
not have to be processed in a data warehouse. Similarly, Microsoft is building equivalent
capabilities for its SQL Server 2005.
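As an illustrative sketch only, and not IBM's or Microsoft's actual implementation, the following Python snippet registers a pre-trained logistic scoring function inside SQLite so that an operational query can score credit applications directly in SQL; the coefficients, table and function names are hypothetical.

```python
import math
import sqlite3

# Hypothetical coefficients exported from a modeling environment (e.g. via PMML).
COEF = {"intercept": -2.0, "income_k": 0.03, "delinquencies": -0.8}

def credit_score(income_k, delinquencies):
    """Logistic scoring function embedded in the database as a SQL-callable UDF."""
    z = COEF["intercept"] + COEF["income_k"] * income_k + COEF["delinquencies"] * delinquencies
    return 1.0 / (1.0 + math.exp(-z))

conn = sqlite3.connect(":memory:")
conn.create_function("credit_score", 2, credit_score)  # expose the model to SQL
conn.execute("CREATE TABLE applications (id INTEGER, income_k REAL, delinquencies INTEGER)")
conn.executemany("INSERT INTO applications VALUES (?, ?, ?)",
                 [(1, 85.0, 0), (2, 32.0, 3)])

# Operational staff (or an application) can score directly from the transactional table,
# with no round trip through a data warehouse.
for row in conn.execute("SELECT id, credit_score(income_k, delinquencies) FROM applications"):
    print(row)
```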
The pure play BI vendors, on the other hand, are at a disadvantage as they have traditionally
used cubes for their analytical routines. Microstrategy is one exception among them with its long
standing ROLAP capabilities and is using embedded DBMS models to provide predictive
modeling capabilities for report generation. Hyperion has incorporated predictive analytics in its
Essbase product where the predictive model is added as another dimension.
Intelligence in the language I understand
The ideal that customers want to achieve in integration is to find information as close to human
natural language as possible and in a form that reflects their thought processes. This implies
pulling together data in any form whether structured or unstructured. They want to be able to
search information that matches concepts rather than specific queries. XML has helped to break
some barriers by affording an ability to describe data. Web Services and SOA architecture have
helped to join information from a variety of sources. Semantic metadata has built the foundation
for natural language searches.
When information is available close to natural language, decision makers are better able to
visualize a scenario before they can make decisions. In order for these decisions to be
actionable, decision makers need levers to implement their decisions. Integration, in other words,
is not simply a question of inter-linking applications and business processes as Enterprise
Applications Integration does. Similarly, integration is more than linking all data sources as
Enterprise Information Integration does. With the integration of data sources, enterprises can
begin to use metadata and federated queries to parse data spread all around their network. Both
EAI and EII have proved to be expensive so vendors are turning increasingly to grid computing to
lower costs. In the final analysis, integration is the sum total of integration of all sources of data,
metadata, applications and business processes.
Large scale vendors with their scalable platforms are best positioned to unify the diverse
elements that can help to extract information and present it in a form that is intelligible to decision
makers. Companies like Oracle, IBM, and Microsoft have long had strengths in application
servers that can help to bring together the composite of services required for business
intelligence purposes. Pure play business intelligence vendors, on the other hand, have
collaborated with companies, such as Composite Software, which provide integration servers.
Vendors have progressively moved from exposing legacy software as services and integrating
it with software of more recent vintage, to portfolios of services or composite
applications, to department-wide integration of exposed services, to tentative efforts at enterprise-
wide services-oriented architecture. One example of the early attempts at creating tools for
enterprise scale services architecture is SAP's Enterprise Services Architecture which provides
the tools to create services for use across an enterprise, to weave business processes with the
services using the SAP Composite Application Framework and to implement those processes on
the NetWeaver application server. BPEL-based service orchestration is used to integrate
enterprise services with SAP and non-SAP applications, including their business processes. The
distinctive aspect of SAP’s SOA strategy is that it exposes its ERP applications to services thus
saving its customers a major overhaul of their architecture. Siebel has also redesigned its CRM
so that it can be exposed as a service.
Oracle is able to integrate web services with its "Oracle Fusion Middleware," centered around its
Oracle Application Server 10g, web services orchestrated on J2EE (Java 2 Enterprise Edition)
Application Server Web services infrastructure, ESBs (enterprise service buses) and integration
server. At the base of the Fusion stack lies Oracle's version of the database grid, supported by
clusters of several computers. Although Oracle initially expected customers to choose its own
application server, it has increasingly been willing to integrate the products of other companies
including WebSphere.
Additional tools are available for business process management and activity monitoring,
business intelligence and enterprise portals, as well as Oracle's data hubs and the Oracle
Collaboration Suite. Oracle also uses BPEL based business process integration; unlike SAP, it
extends the scope to all processes. Fusion middleware is an open systems architecture for integrating
services including those from other vendors such as IBM.
The action plans
The acid test for competing vendors would be the ability to orchestrate business processes with
applications and data flows. Flexible business processes are pivotal to translating strategy into
actions. Business users should be able to change business processes with ease using visual
tools. They also need to be able to monitor business processes and performance parameters in order
to determine where efficiencies are possible.
A more flexible approach to management of business processes is now possible with the
emergence of Business Process Execution Language (BPEL). Standards based languages, such
as BPEL, enable companies to avoid vendor lock-in and facilitate widespread adoption. This
language is intuitive enough to let business users set up a flow of business activity; a series of
processes can be automated when the first of them is initiated. So a customer could request a
ticket for travel which will trigger a series of actions such as checking for availability, selecting a
seat, issuing a ticket, receiving a payment and depositing the money in a bank. More complex
business process integration will record the revenue in accounting software. All of this can be
done as a seamless flow of activity if none of these processes are embedded in any specific
application.
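BPEL itself is an XML language, so the sketch below uses Python rather than BPEL to convey the orchestration idea; the step functions are hypothetical stand-ins for calls to reservation, payment and accounting services.

```python
# A toy orchestration of the ticket-booking flow described above, in Python rather
# than BPEL. Each step function is a hypothetical stand-in for a call to a reservation,
# payment or accounting service exposed on the network.
def check_availability(order): return {**order, "seat_available": True}
def select_seat(order):        return {**order, "seat": "14C"}
def issue_ticket(order):       return {**order, "ticket_id": "TKT-001"}
def receive_payment(order):    return {**order, "paid": True}
def deposit_funds(order):      return {**order, "deposited": True}
def record_revenue(order):     return {**order, "booked_in_ledger": True}

PROCESS = [check_availability, select_seat, issue_ticket,
           receive_payment, deposit_funds, record_revenue]

def run_process(order):
    """Initiating the first step drives the whole chain of automated steps."""
    for step in PROCESS:
        order = step(order)
    return order

print(run_process({"customer": "A. Traveler", "route": "SFO-JFK"}))
```

Because none of these steps is embedded in any specific application, the flow can be rearranged or extended without touching the underlying systems, which is the flexibility the standard is meant to deliver.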
While integration of a few of the business processes has happened already, the more complex
integrations are beginning to happen with the entry of the larger vendors. Comprehensive
integration enables a company to optimize and simulate to extract efficiencies. Companies can
use data from their business processes to take decisions on improving operational efficiency.
Additional benefits follow when automated business processes respond to a new event. For
example, a retail store may find that inventories are lower than expected and its business
processes will initiate action to replenish them.
The emergence of business process management tools which can be operated by business
users, without the assistance from IT, is illustrated by Microsoft’s BizTalk Server. Business
processes can be programmed with the use of Visual Studio .NET 2003 development
environment. Alternatively, the server can work with Visio, allowing business users to alter
business processes as they see fit.
The integration of business intelligence software with business processes paves the way for
business activity monitoring as well as event based monitoring of workflows. Business activity
monitoring compares planned performance to the actual. Events management software sets up
alerts so that managers receive warnings when exceptions occur. One of the
leading players in this segment is Teradata with its Active Warehouse. Oracle and IBM have also
introduced sophisticated programs for business process management.
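A schematic example of the planned-versus-actual comparison described above, with invented metrics and thresholds rather than any vendor's alerting engine:

```python
# Compare actuals to plan and raise an alert when the shortfall exceeds a tolerance.
# Metric names, values and the tolerance are all made up for illustration.
PLAN = {"daily_orders": 1200, "on_time_shipments_pct": 97.0}
ACTUAL = {"daily_orders": 980, "on_time_shipments_pct": 98.1}
TOLERANCE_PCT = 10.0  # alert when actual falls more than 10% short of plan

def variance_pct(planned, actual):
    return (actual - planned) / planned * 100.0

for metric, planned in PLAN.items():
    gap = variance_pct(planned, ACTUAL[metric])
    if gap < -TOLERANCE_PCT:
        print(f"ALERT: {metric} is {abs(gap):.1f}% below plan")
    else:
        print(f"OK: {metric} within tolerance ({gap:+.1f}%)")
```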
Master data: Navigating complex information systems
Master data management provides consistent definitions of data in a services-oriented
architecture where heterogeneous applications have to co-exist. The availability of consistent
definitions helps to considerably improve efficiencies by smoother cross-flow of processes. In the
past, each application had its own way to define business process and logic. When these
applications were integrated, the data stored with these applications had inconsistent definitions.
In addition, the data was duplicated in several applications and created confusion when it was
combined. Master data management systems provide a centralized library of business operations
such as querying customer information. The availability of master data management systems
helps to solve problems of data quality. However, applications have to be able to call information
from master data management systems before they can be utilized.
The extent of standardization of definitions can vary across different business intelligence
providers. Ideally, enterprises would like a master data management system comprehensive
enough to include both unstructured and structured data and to span all types of business
processes such as CRM, supply chain management and manufacturing data. For vendors, the
cost and complexity of Master Data Management systems grows as they aggregate more
information. The MDM solution offered by SAP, for example, is focused on transaction processing
while Hyperion's MDM product is oriented toward business intelligence.
The usability of master data management systems depends greatly on how well they are
integrated with transaction and business intelligence systems and the ability to use them at run-
time. SAP, for example, has integrated its master data into its Enterprise Services Architecture
which enables its applications to look up data at run time. Siebel Systems has a market leading
master data management system in its Siebel's Universal Customer Master (UCM) management
and Universal Application Networking (UAN) systems which can work together to call data
definitions at run-time. IBM is also incorporating its Master Data Management System in its
service-oriented architecture so that data definitions can be called when applications need them.
Mining structured data
Increasingly, enterprises are looking for data mining solutions that provide analysis in real time to
feed into decisions here and now. The typical problems they want to solve are anticipating
customer churn or determining the creditworthiness of their customers. Analysis in such a short
period of time implies that data has to be extracted, cleaned and prepared for analysis in short
enough intervals for decisions to be made. Some new entrants have found an opportunity in this
largely unaddressed segment of the market. KXEN, for example, offers its Analytic Framework
product, which reduces the time to define, develop and run a model. KXEN's Consistent Coder
module automatically transforms raw, inconsistent data into clean, uniformly formatted data ready
for modeling.
Several other vendors have offered solutions for the improvement of business processes in real
time. BusinessObjects XI has added BusinessObjects Process Tracker and BusinessObjects
Process Analysis that embed analytics in their business processes. The data on performance
metrics are linked to alerting capabilities so that managers can take action in real time. Cognos
has software, e-Applications, for supply chain management processes such as procurement,
sales and inventory. The tool allows customers to keep track of the performance metrics of their
suppliers and respond to alerts about events.
Mining unstructured data
Unstructured data is available in much larger volumes and is of greater interest in operational
situations where qualitative information, such as the lifestyles of customers, is more relevant.
Information from text documents can be extracted in a variety of ways including categorization,
classification, information extraction, summarization, identifying themes or topics, concepts,
information visualization and responses to questions.
Information extraction looks for key phrases such as “tourists vacationing in San Francisco tend
to come from China and other East Asian countries” in order to find data on the travel behavior of
tourists in California. When conducting topic searches, text mining tools search for broad themes
such as “the Japanese stock market performance” to cull out the information most relevant to the
topic. Text summarization tools scour for words like “in short” or “in conclusion” to find related
information that tells the gist of the text. Categorization can be accomplished by looking at the
frequency of usage of words and their synonyms; the recurrence of a word such as mining would
suggest that the document is about decision support analysis tools. Similarly, clustering methods
comb through text to find frequently occurring words and themes; a text on business intelligence
would have clusters of predictive mining, decision-making, etc. It is also possible to extract
information related to a specific concept; a doctor could be looking for information on allergies
and would like to obtain related information on weather conditions, food habits and lifestyles. Text
mining tools are also capable of visualizing information in the form of link tables that help to
understand the mass of information. Finally, text mining can be used to answer specific questions
and the words used would suggest the required information.
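As a small illustration of the clustering approach described above, and not any particular vendor's tool, the following Python sketch groups a handful of invented snippets by their recurring terms using TF-IDF features and k-means (scikit-learn is assumed to be available):

```python
# Illustrative clustering of a few invented snippets by their recurring terms,
# along the lines of the clustering methods described above. Requires scikit-learn.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "predictive mining and decision making with analytics",
    "data mining supports decision support analysis",
    "tourists vacationing in San Francisco come from East Asia",
    "travel behavior of tourists visiting California hotels",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)                 # term-frequency features per document
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for label, doc in zip(model.labels_, docs):
    print(label, doc)                              # documents sharing vocabulary cluster together
```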
Vendors can be differentiated by their inclination to use text mining tools for knowledge extraction
and real time operational needs of companies. Text mining tools, such as those offered by SAS
and SPSS, have strengths in information or knowledge extraction and categorization while the
more recent entrants are focused on addressing the real time text analysis requirements which
will focus more on clustering, summarization and visualization. SAS Text Miner has capabilities in
information extraction, categorization and concept linkage. SPSS has strengths in information
extraction, categorization and information visualization. Megacomputer, on the other hand, unlike
canned software such as Cognos, has capabilities in summarization, clustering and answering
questions, besides categorization and information extraction.
For the broader category of unstructured data, IBM Omnifind promises to lead the
marketplace with its capabilities in searching content including e-mails. The search engine in the
software indexes the information and lays the ground for queries.
Intelligent data cleaning
Data cleaning software has a number of tools such as data profiling which finds inconsistencies,
parsing which identifies different types of data and places them in the relevant fields,
standardization which brings consistency into data from a variety of sources and verification tools
for comparing data against a universal master such as the U.S. Postal Service, matching which
links interrelated files and consolidation which eliminates duplicate entries. The Master Data used
by most of such software does not take into account the context, the connotations, the overtone
or the undertone in much of human expression. Inevitably, a large majority of the conversions that
happen are prone to error and require inordinate human effort to make the corrections.
Increasingly, vendors are looking to semantic metadata to do the translations of data from one
source to another. Such semantic metadata is conscious of the context in which language is
used. Some of the newer technologies, such as those offered by Silver Creek Systems, are able
to improve the efficiencies in data cleaning.
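A rough sketch of a few of the mechanical steps named above, standardization, matching and consolidation of duplicates, written with pandas and entirely made-up customer records:

```python
# A toy pass over invented customer records illustrating standardization and
# consolidation of duplicates, two of the data cleaning steps described above.
import pandas as pd

records = pd.DataFrame({
    "name":  ["ACME Corp.", "Acme Corporation", "Globex Inc"],
    "state": ["calif.", "CA", "ny"],
    "phone": ["415-555-0100", "(415) 555 0100", "212-555-0199"],
})

# Standardization: map state variants to one canonical form, strip phone formatting.
records["state"] = records["state"].str.upper().replace({"CALIF.": "CA"})
records["phone"] = records["phone"].str.replace(r"\D", "", regex=True)

# Matching and consolidation: a crude key (normalized name prefix + phone) links the
# two spellings of the same company, and the duplicate is dropped.
records["match_key"] = (records["name"].str.lower().str.replace(r"\W", "", regex=True).str[:4]
                        + records["phone"])
deduped = records.drop_duplicates(subset="match_key")
print(deduped)
```

Real cleaning tools add verification against external masters and far more sophisticated matching, but the sequence of profiling, standardizing, matching and consolidating is the same.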
Pricing pressures
The launch of suites and the entry of ERP players in the business intelligence market have
intensified the pressure to lower prices which will work to the disadvantage of pure play BI
vendors. By all accounts, the recognized price leader in the market is Microsoft SQL Server.
Microsoft's SQL Server 2005, with its Analysis and Reporting Services, is priced at $80,000 for
1,000 users while most BI suites have a list price in the range of $450,000 to $700,000. However,
pricing data about business intelligence packages has several layers of complexity as strategic
pricing is the norm and it is not always possible to make apples-to-apples comparisons.
The data revealed during the anti-trust investigation conducted when Oracle made a
bid for PeopleSoft shows the many caveats that have to be added when comparing prices.
Oracle's well-kept secret spilled out: it was willing to price out Microsoft at almost
any cost, with discounts as high as 90%. Customer acquisition has a lucrative reward in the
maintenance earnings that both companies can collect, estimated at 25-30% of the license
revenues.
Mass adoption
Real time decisions have to be necessarily complemented by collaboration across the enterprise.
The broad majority of employees in organizations can participate if their familiar tools, such as
spreadsheets, are embedded in business intelligence tools. In the future, however, spreadsheets
cannot be used in isolation and have to be incorporated in enterprise wide systems; they have to
be able to migrate from desktops to servers. In the past, spreadsheets also allowed individual
users, often highly educated business analysts, to manually configure their spreadsheets to suit
their analytical needs. The formulas and the data were often lost when an employee left. In a
business intelligence environment, users have to be able to share their data and analytical
techniques with the larger community. It should be possible for all users, whatever their skill level,
to reuse the formulas somebody else might have created.
In the past, the data from business intelligence tools was, at best, exported to Excel sheets where
it could be manipulated in intractable ways and the final results were not imported back.
Increasingly, customers are looking for tools that integrate Excel spreadsheets with corporate
databases, relational or multi-dimensional, consistent with the format and architecture of their
business intelligence systems. The data should be available across all information systems and
all users should be able to trace back the methods used in analysis.
The players that stand out in their integration of Excel spreadsheets into business intelligence
systems are Hyperion, Actuate, Information Builders, Business Objects, OutlookSoft, SAP and
lately Oracle. Hyperion was one of the earliest among leading Business Intelligence players and
SRC Software, later acquired by Business Objects, was among the first to offer an Excel
interface.
The value of integrating Microsoft Office is potentially more than the usability of a familiar
interface. Much greater benefits can be reaped when the Office applications integrate with the
applications, data and business processes of enterprises. Business intelligence vendors are
increasingly trying to gain an edge over their competitors by linking inter-related processes,
applications and data with a convenient Office interface. The joint product “Mendocino”, created
by the partnership between SAP and Microsoft, typifies the competitive trends in the industry; the
processes that were earlier integrated by APIs are increasingly integrated on a SOA platform, which helps
to realize much larger gains in productivity.
Reporting tools
Two main types of reporting are available with business intelligence tools and these are
production reporting and management reporting. Production reporting generates routine
documents such as invoices and bank statements, which are repeatable. Management reports, on the
other hand, are ad hoc in nature and extract data to answer decision-related questions such as how many
customers bought goods worth more than $2,000 in the Christmas season. Increasingly, vendors
seek to gain competitive advantage by building in the capability to generate more intelligent
reports. Users of production reports soon begin to ask questions such as the reasons for
exceptionally high debits recorded in their bank statements. Ad hoc reporting is meant for the
business analysts in companies. Over time, vendors have discovered a larger market for static
reports, with pre-defined templates and drill-down capabilities, for a much larger client base in the
operational staff of companies which is usually satisfied with simple queries most relevant for
their roles.
The leaders among the group of vendors who focus on the reporting space are recognized to be
Cognos, whose Cognos 7 was recently upgraded to its eighth edition with greater
integration of multidimensional cubes and the reporting engine, and Actuate 8. Cognos has been
a strong enough player to provoke Business Objects to acquire Crystal Reports to match its
reporting capabilities. Cognos stands out for its capabilities in ad hoc queries and a web interface.
Actuate 8 has gained considerable recognition after its acquisition of Nimble
Technologies improved its ability to integrate with a diversity of data sources using EII
technologies. In addition, Actuate has its eSpreadsheet interface.
The visual big pictures in the detail
Business intelligence vendors see interactive visualization as a means to gain an edge by
providing customers a way to extract insight from large data stores. Several different
approaches are available to achieve this objective including geographical data, interactivity,
animation, super-imposing objects on data and dimensionality of the graphics.
Mapping of geo-spatial data is one of the means of relating data to location to understand trends
in terms of the who, where and how behind them. A typical example could be the mapping of
concentrations of population to understand the impact store location could have on the
purchasing behavior of customers. Insights can be extracted by visually estimating the time it
would take customers to reach the store location. Additional insights could be extracted if store
locations are compared with the centers of crime in the city. Vendors seek to gain a competitive
edge by integrating business intelligence and location information so that they can be juxtaposed
on graphs which can be depicted without getting bogged down in tedious processes of data
extraction. SAS, one of the leaders in combining geographical information and business data,
now offers SAS/GIS, which integrates business and location data that it draws from ESRI, a long-
time market leader in location information.
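A back-of-the-envelope version of the drive-time insight mentioned above, with invented coordinates and an assumed average speed standing in for real geographic data from a GIS provider:

```python
# Straight-line (haversine) distance from invented customer locations to a hypothetical
# candidate store, converted to a rough drive time at an assumed average urban speed.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

STORE = (37.7749, -122.4194)           # hypothetical candidate store location
CUSTOMERS = [(37.8044, -122.2712), (37.6879, -122.4702), (37.3382, -121.8863)]
AVG_SPEED_KMH = 30.0                   # assumed average driving speed

for lat, lon in CUSTOMERS:
    km = haversine_km(lat, lon, *STORE)
    print(f"{km:5.1f} km, ~{km / AVG_SPEED_KMH * 60:4.0f} min drive")
```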
Interactive visuals are another means to gain insight. While exploring information, users of
business analytics software want to view data from a variety of angles and want to portray
information as their thought process evolves. They want to see not just pretty pictures but
relationships which would require them to slide, move and juxtapose components of their visuals
to compare, contrast and highlight to bring into relief patterns and trends. They want to shuffle the
visuals to ask “what if” questions. In a typical application involving balanced scorecards, they
want to compare the planned and the actual performance. Infommersion, recently acquired by
Business Objects, provides these very features relevant for decision-support analytical
presentations.
Users can gain better understanding of their data if they have the ability to conduct visual queries
which enable them to select their data and their visuals to address the specific questions they
have in their mind. Tableau, a start-up, has pioneered visual queries using business intelligence
data. Instead of slicing and dicing data, users are able to flip visuals to spot any anomalies in their
data, noticeable trends or patterns that would be elusive especially in large data sets. The same
product has been licensed and renamed as Hyperion Visual Explorer.