"Building Data Foundations and Analytics Tools Across The Product" by Crystal...Tech in Asia ID
Crystal is a data nerd, self-taught programmer, and avid non-fiction reader.
Having joined GO-JEK over two years ago, she has first-hand experience of the many challenges involved with scaling data-driven teams at Indonesia’s first unicorn startup. She currently leads the strategy and vision of the Business Intelligence team’s internal products and data culture across the company. Her team aims to produce actionable insights for all of the different verticals on the GO-JEK platform.
This slide was shared at Tech in Asia Product Development Conference 2017 (PDC'17) on 9-10 August 2017.
Get more insightful updates from TIA by subscribing at techin.asia/updateselalu
MongoDB IoT City Tour LONDON: Hadoop and the future of data management. By M... (MongoDB)
Mark Lewis, Senior Marketing Director EMEA, Cloudera.
Hadoop and the Future of Data Management. As Hadoop takes the data management market by storm, organisations are evolving the role it plays in the modern data centre. Explore how this disruptive technology is quickly transforming an industry and how you can leverage it today, in combination with MongoDB, to drive meaningful change in your business.
This presentation given by Think Big's senior data scientist Eliano Marques at Digital Natives conference in Berlin, Germany (November 2015), details how to go from experimentation to productionization for a predictive maintenance use case.
This webinar will go over various Big Data ideas and insights and answer the question: what does “Big Data” really mean? It will also include a segment on Amazon Redshift and a demonstration from Informatica Cloud, and will end with a few TechTuesdays Tips for Success.
Augmented Intelligence: Bridging the Gap Between BI and AI (Jenny Midwinter)
In this session Matt will give an overview of the Qlik Cognitive Engine and Qlik’s vision for Augmented Intelligence. Using artificial intelligence and machine learning, the Qlik Cognitive Engine helps analytics users find insight in their data faster in a self-service framework. Matt will highlight the Augmented Intelligence approach and implementation and share the technical journey from research to delivery of the functionality.
** From the Ottawa AI/ML Meetup November 26, 2018.
Sisense provides business analytics software that simplifies working with complex data for over 1,000 customers in 50 countries. It offers a single tool that can ingest disparate data sources, handle terabyte-scale datasets, and create dashboards in as little as 90 minutes without needing IT expertise. Sisense's patented In-Chip and Single Stack technologies allow for immediate insights without the pitfalls of traditional analytics, such as lengthy data preparation, high costs for complex data, and overdependence on IT. The software offers a complete solution from data ingestion to visualization and has won numerous industry awards for ease of use, customer value, and high customer retention rates.
Platfora - An Analytics Sandbox In A World Of Big Data (Mark Ginnebaugh)
As Big Data becomes the norm in dealing with data volume, variety, and velocity, it becomes increasingly harder for the data analyst to understand and work with data sets. To overcome this we introduce Platfora, a Hadoop-backed data analysis framework which nicely complements more traditional data warehousing and BI solutions. This presentation covers ingestion of new data and the building of data sets and visualizations, in a system that requires no more work than interacting with a graphical interface. You'll see examples of peer-to-peer lending and how insights on loan applicants and their risk profiles can be quickly revealed with no ETL development or demanding data transformation.
"Building Data Foundations and Analytics Tools Across The Product" by Crystal...Tech in Asia ID
Crystal is a data nerd, self-taught programmer, and avid non-fiction reader.
Having joined GO-JEK over two years ago, she has first-hand experience of the many challenges involved with scaling data-driven teams at Indonesia’s first unicorn startup. She currently leads the strategy and vision of the Business Intelligence team’s internal products and data culture across the company. Her team aims to produce actionable insights for all of the different verticals on the GO-JEK platform.
This slide was shared at Tech in Asia Product Development Conference 2017 (PDC'17) on 9-10 August 2017.
Get more insightful updates from TIA by subscribing techin.asia/updateselalu
MongoDB IoT City Tour LONDON: Hadoop and the future of data management. By, M...MongoDB
Mark Lewis, Senior MArketing Director EMEA, Cloudera.
Hadoop and the Future of Data Management. As Hadoop takes the data management market by storm, organisations are evolving the role it plays in the modern data centre. Explore how this disruptive technology is quickly transforming an industry and how you can leverage it today, in combination with MongoDB, to drive meaningful change in your business.
This presentation given by Think Big's senior data scientist Eliano Marques at Digital Natives conference in Berlin, Germany (November 2015), details how to go from experimentation to productionization for a predictive maintenance use case.
This webinar will go over various Big Data ideas and insights. It will also answer the question, What Does “Big Data” Really Mean? Amazon RedShift will also speak and there will be a Demonstration from Informatica Cloud. We will end with a few TechTuesdays Tips for Success.
Augmented Intelligence Bridging the Gap Between BI and AIJenny Midwinter
In this session Matt will give an overview of the Qlik Cognitive Engine and Qlik’s vision for Augmented Intelligence. Using Artificial Intelligence and Machine Learning the Qlik Cognitive Engine help analytics users find insight in their data faster in a self-service framework. Matt will give highlights of the Augmented Intelligence approach and implementation and share the technical journey from research to delivery of the functionality.
** From the Ottawa AI/ML Meetup November 26, 2018.
Sisense provides business analytics software that simplifies working with complex data for over 1,000 customers in 50 countries. It offers a single tool that can ingest disparate data sources, handle large datasets of terabytes, and create dashboards in just 90 minutes without needing IT expertise. Sisense's patented In-Chip and Single Stack technologies allow for immediate insights without the pitfalls of traditional analytics like lengthy data preparation, high costs for complex data, and overdependence on IT. The software offers a complete solution from data ingestion to visualization and has numerous industry awards for ease of use, customer value, and high customer retention rates.
Platfora - An Analytics Sandbox In A World Of Big DataMark Ginnebaugh
As Big Data becomes the norm in dealing with data volume, variety, and velocity, it becomes increasingly harder for the Data Analyst to understand and work with data sets. To overcome this we introduce Platfora, a Hadoop backed data analysis framework which nicely complements more traditional data warehousing and BI solutions. This presentation covers ingestion of new data and building of data sets and visualizations,in a system that requires no more work than interacting with a graphical interface. You'll see examples of peer-to-peer lending and how insights on loan applicants and their risk profiles can be quickly revealed with no ETL development or demanding data transformation.
Tableau Conference 2018: Binging on Data - Enabling Analytics at Netflix (Blake Irvine)
In this conference session we share how we are using Tableau “out of the box” and also describe how it fits into our overall data environment. In addition, we’ll describe how we expect to use the Data Catalog and Object Model, our explorations of large-scale data stores, and challenges we are working on including governance and data lineage. Video of session can be viewed here: https://youtu.be/Nr24tw3dmZQ
MongoDB IoT City Tour STUTTGART: Analysing the Internet of Things. By Pentaho (MongoDB)
Dominik Claßen, Sales Engineering Team Lead at Pentaho
Drawing on Pentaho's wide experience in solving customers' big data issues, Dominik positions the importance of analytics in the IoT.
[-] Understanding the challenges behind data integration & analytics for IoT
[-] Future proofing your information architecture for IoT
[-] Delivering IoT analytics, now and tomorrow
[-] Real customer examples of where Pentaho can help
The document discusses business analytics frameworks and trends. It provides Gartner's key findings that enterprises will use a combination of products and services to support diverse analytics needs. A strategic view requires defining decision-making and analytical processes. Gartner recommends using a framework to develop an implementation plan and portfolio of capabilities. The business analytics framework defines the people, processes, and platforms needed to take a strategic approach to business intelligence and analytics initiatives. Major focus areas for analytics include healthcare, finance, and supply chain.
This document discusses building a competitive advantage from a data lake. It recommends building a data reservoir with governance to hold large and complex datasets. Organizations should start small by working with business partners to develop new analytics from low-hanging fruit projects that provide value. This will help drive adoption and extension of the data lake approach over time.
MongoDB IoT City Tour LONDON: Analysing the Internet of Things: Davy Nys, Pen... (MongoDB)
1) The document discusses Pentaho's beliefs around Internet of Things (IoT) analytics, including applying the right data source and processing for different analytics needs, gaining insights by blending multiple data sources on demand, and planning for agility, flexibility and near real-time analytics.
2) It describes how emerging big data use cases demand blending different data sources and provides examples like improving operations and customer experience.
3) The document advocates an Extract-Transform-Report approach for IoT analytics that provides flexibility to integrate diverse data sources and enables real-time insights.
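The Extract-Transform-Report idea described above can be sketched in a few lines. This is an illustrative outline only, not Pentaho's implementation; the two CSV feeds, field names, and the per-site average are hypothetical examples of blending IoT sources on demand:

```python
import csv
import io
from collections import defaultdict

# Extract: two hypothetical IoT feeds (in practice these would be files,
# message queues, or API responses rather than in-memory strings).
sensor_csv = io.StringIO("device_id,temp_c\nd1,21.5\nd1,22.5\nd2,30.0\n")
registry_csv = io.StringIO("device_id,site\nd1,berlin\nd2,london\n")

sensors = list(csv.DictReader(sensor_csv))
registry = {row["device_id"]: row["site"] for row in csv.DictReader(registry_csv)}

# Transform: blend the two sources on device_id and aggregate per site.
totals = defaultdict(lambda: [0.0, 0])  # site -> [sum, count]
for reading in sensors:
    site = registry.get(reading["device_id"], "unknown")
    totals[site][0] += float(reading["temp_c"])
    totals[site][1] += 1

# Report: average temperature per site.
report = {site: round(s / n, 2) for site, (s, n) in totals.items()}
print(report)  # {'berlin': 22.0, 'london': 30.0}
```

The point of the pattern is that the blend happens at report time, so adding a third source is a new join step rather than a new ETL project.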
BUSINESS ENABLED ANALYTICAL DATA FLOW MANAGEMENT @ YOUNGCAPITAL - Big Data Ex... (webwinkelvakdag)
This presentation will focus on the importance of business-enabled analytical data flow management to support business processes and decision making within YoungCapital. Part of the presentation will be an example of the ROI being realized with this approach.
At YoungCapital, big data is used to find suitable candidates for jobs in the marketplace and to match jobs to candidates (and candidates to jobs) using advanced, in-house-developed artificial intelligence. These solutions are based on (web) behavior, demographic, registration, and resume data. In BI we need to report and analyze on millions of transactions per year. With thousands of candidates submitting daily declarations in both our systems and our customers' systems, we need flexibility in consolidating all that data for operational processes as well as business analytics and artificial intelligence. Instead of using data engineers in combination with business analysts to manage this, YoungCapital uses Alteryx to enable business analysts to combine their analysis and business expertise with the capability of creating data flows that directly benefit the business. This frees up our data engineers for larger projects and improves the speed and quality of information delivered into our decision making.
About YoungCapital: YoungCapital is a recruitment specialist for young talent. It manages 26+ recruitment sites in 9 countries. The candidate database contains 6m+ profiles, with 50k+ added per month. With a revenue of €465 million, YoungCapital is the fastest-growing company in the Dutch HR sector and number 5 in the top 250 fastest-growing companies in the Netherlands.
Presented by BrainSell, a top Sage ERP partner, www.brainsell.net
Sage Intelligence works with Sage 100, Sage 300, Sage 500 and more. See how it can make your life better today!
www.brainsell.net
Is your quality monitoring tech stack secure? (Etech)
Stay ahead of compliance violations with QEval. With real-time quality monitoring, QEval delivers near-instant alerts for critical processes and supports custom alerts. It also automatically tracks compliance performance and reduces the possibility of risk.
The document provides a summary of various strategy tools that can be useful for startups. It discusses tools for context awareness when dealing with complex systems (Cynefin), analyzing value chain evolution and movement (Wardley Maps), platform and ecosystem plays using an Innovate-Leverage-Commoditize approach, identifying bounded contexts and subdomain strategies using Strategic Domain-Driven Design, capturing customer insights using a Value Proposition Canvas and Business Model Canvas, and validating hypotheses using the Lean Startup methodology. The document recommends these various tools as part of a startup's strategic toolbox to help guide strategic decision making.
Projects are dynamic. Staying ahead of the curve means having real-time access to your project data and its leading indicators for actionable intelligence. A real-time, flexible platform, the Tivitie Insights service enables organizations to visualize critical Microsoft Project Online data in real time, spot trends as they occur, and produce actionable facts for better decision making in a fraction of the time.
How an intelligence lab can accelerate your business - Big Data Expo 2019 (webwinkelvakdag)
More and more business departments within Rabobank want to make use of big data and advanced analytics capabilities. Most of the time the “real question” is not yet clear, or the tools, data and knowledge are not available within the business teams. To solve this problem the Data Science & Business Consulting department has introduced the Intelligence Lab, where Dataiku is one of its core components.
Within this Intelligence Lab we provide data, tools, knowledge and support on different levels to help the business accelerate in creating viable products within a short timeframe. In this talk we will share some insights on our journey and the issues and choices we faced along the way.
Lightning-Fast, Interactive Business Intelligence Performance with MicroStrat... (Tyler Wishnoff)
See how extreme query speeds and ultra-high concurrency on MicroStrategy, and any other business intelligence (BI) tool, on Big Data is possible through the Kyligence platform. Learn more here: https://kyligence.io/
This document provides an overview of setting up a database for a social benefit organization. It discusses planning the database by reviewing current and future data needs, business processes, and reporting requirements. It also covers requisite resources like acquisition, maintenance, and ongoing costs. The document recommends creating a request for proposal, budget, and timeline and highlights advanced data capabilities like business intelligence, analytics, and benchmarking dashboards.
Presentation by Luc Delanglez (DataLumen) at the Data Vault Modelling and Dat... (Patrick Van Renterghem)
This document discusses data governance and provides guidance on getting started with a data governance program. It outlines key building blocks like a data catalog, workflows, and business and technical lineage. It also covers challenges like governing a multi-store data environment and ensuring data quality. The document recommends starting with understanding your data and building out foundational elements like a data catalog before operationalizing governance through workflows.
Charter Global builds Big Data business solutions that provide real-time, predictive analytics for measurable ROI results to help you find hidden opportunities for increased revenue and cost savings.
Platfora provides an interactive, in-memory business intelligence platform for Hadoop. It aims to solve challenges with existing BI offerings on Hadoop by using "software defined" data marts that are automatically and iteratively built from Hadoop data. The Platfora solution integrates a web-based BI application, an in-memory data mart and processing engine, and an automated Hadoop data refinery to enable powerful closed-loop analysis of big data. It was demonstrated using a case study of Edmunds.com, which moved to Platfora to gain faster access and agility with their large and growing web and mobile data.
From Business Intelligence to Big Data - hack/reduce Dec 2014 (Adam Ferrari)
Talk given on Dec. 3, 2014 at MIT, sponsored by Hack/Reduce. This talk looks at the history of Business Intelligence, from first-generation OLAP tools through modern Data Discovery and visualization tools, and asks what we can learn from that evolution as numerous new tools and architectures for analytics emerge in the Big Data era.
Agile Data Warehouse Design for Big Data Presentation (Vishal Kumar)
Synopsis:
[Video link: http://www.youtube.com/watch?v=ZNrTxSU5IQ0 ]
Jim Stagnitto and John DiPietro of consulting firm a2c will discuss Agile Data Warehouse Design - a step-by-step method for data warehousing / business intelligence (DW/BI) professionals to better collect and translate business intelligence requirements into successful dimensional data warehouse designs.
The method utilizes BEAM✲ (Business Event Analysis and Modeling) - an agile approach to dimensional data modeling that can be used throughout analysis and design to improve productivity and communication between DW designers and BI stakeholders. BEAM✲ builds upon the body of mature "best practice" dimensional DW design techniques, and collects "just enough" non-technical business process information from BI stakeholders to allow the modeler to slot their business needs directly and simply into proven DW design patterns.
BEAM✲ encourages DW/BI designers to move away from the keyboard and their entity relationship modeling tools and begin "white board" modeling interactively with BI stakeholders. With the right guidance, BI stakeholders can and should model their own BI data requirements, so that they can fully understand and govern what they will be able to report on and analyze.
The BEAM✲ method is fully described in Agile Data Warehouse Design, a text co-written by Lawrence Corr and Jim Stagnitto.
About the speakers:
Jim Stagnitto Director of a2c Data Services Practice
Data Warehouse Architect: specializing in powerful designs that extract the maximum business benefit from Intelligence and Insight investments.
Master Data Management (MDM) and Customer Data Integration (CDI) strategist and architect.
Data Warehousing, Data Quality, and Data Integration thought-leader: co-author with Lawrence Corr of "Agile Data Warehouse Design", guest author of Ralph Kimball’s “Data Warehouse Designer” column, and contributing author to Ralph Kimball and Joe Caserta's book “The Data Warehouse ETL Toolkit”.
John DiPietro Chief Technology Officer at A2C IT Consulting
John DiPietro is the Chief Technology Officer for a2c. Mr. DiPietro is responsible for setting the vision, strategy, delivery, and methodologies for a2c’s Solution Practice Offerings for all national accounts. The a2c CTO brings with him an expansive depth and breadth of specialized skills in his field.
Sponsor Note:
Thanks to:
Microsoft NERD for providing an awesome venue for the event.
http://A2C.com IT Consulting for providing the food and drinks.
http://Cognizeus.com for providing a book to give away as a raffle prize.
Data Science Salon: Quit Wasting Time – Case Studies in Production Machine Le... (Formulatedby)
Presented by Yashas Vaidya, Sr. Data Scientist at Dataiku
Next DSS MIA Event - https://datascience.salon/miami/
This talk covers the steps to taking a machine learning model to production, modern architectures and technologies for building production machine learning, and the talent and processes for creating and maintaining production machine learning.
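The train / serialize / serve steps behind taking a model to production can be sketched minimally. This is my own illustration under stated assumptions, not the talk's code: the "model" is a deliberately trivial mean-predictor, and pickle stands in for whatever artifact store a real serving stack would use:

```python
import pickle
import statistics

# "Train": fit a trivial model that predicts the mean of its history.
history = [3.0, 4.0, 5.0]
model = {"kind": "mean-predictor", "mean": statistics.mean(history)}

# "Deploy": serialize the trained artifact, as a model registry would store it.
blob = pickle.dumps(model)

# "Serve": a separate process deserializes the artifact and answers requests.
served = pickle.loads(blob)

def predict(m, _features):
    # The mean-predictor ignores its features; a real model would use them.
    return m["mean"]

print(predict(served, {"hour": 12}))  # 4.0
```

The separation matters: once training emits an immutable artifact, the serving side can be scaled, rolled back, and monitored independently of the training pipeline.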
Smarter Analytics: Supporting the Enterprise with Automation (Inside Analysis)
The Briefing Room with Barry Devlin and WhereScape
Live Webcast on June 10, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=5230c31ab287778c73b56002bc2c51a
The data warehouse is intended to support analysis by making the right data available to the right people in a timely fashion. But conditions change all the time, and when data doesn’t keep up with the business, analysts quickly turn to workarounds. This leads to ungoverned and largely un-managed side projects, which trade short-term wins for long-term trouble. One way to keep everyone happy is by creating an integrated environment that pulls data from all sources, and is capable of automating both the model development and delivery of analyst-ready data.
Register for this episode of The Briefing Room to hear data warehousing pioneer and analyst Barry Devlin as he explains the critical components of a successful data warehouse environment, and how traditional approaches must be augmented to keep up with the times. He’ll be briefed by WhereScape CEO Michael Whitehead, who will showcase his company’s data warehousing automation solutions and discuss how a fast, well-managed and automated infrastructure is the key to empowering faster, smarter, repeatable decision making.
Visit InsideAnalysis.com for more information.
This document discusses using data warehouses in retail and finance. It provides examples of how data warehouses are used in both industries, including for market basket analysis, product placement, supply chain management, and customer profiling. It also outlines some opportunities and challenges of implementing data warehouses, such as improved sales and customer loyalty but also large data volumes and data preparation difficulties. Specific company examples are given, like how Netflix uses customer streaming data and how Raymond James improved data backups and reporting with a new solution.
The Data Lake: Empowering Your Data Science Team (Senturus)
Data science overview: defined, purpose, relation to BI, differences from BI and benefits from using both data science and BI. View the webinar video recording and download this deck: http://www.senturus.com/resources/data-lake-empowering-data-science-team/.
Learn how the data lake can empower data science teams and free up valuable data warehouse resources.
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
Tableau Conference 2018: Binging on Data - Enabling Analytics at NetflixBlake Irvine
In this conference session we share how we are using Tableau “out of the box” and also describe how it fits into our overall data environment. In addition, we’ll describe how we expect to use the Data Catalog and Object Model, our explorations of large-scale data stores, and challenges we are working on including governance and data lineage. Video of session can be viewed here: https://youtu.be/Nr24tw3dmZQ
MongoDB IoT City Tour STUTTGART: Analysing the Internet of Things. By, PentahoMongoDB
Dominik Claßen, Sales Engineering Team Laed at Pentaho
Drawing on Pentaho's wide experience in solving customers' big data issues, Dominik positions the importance of analytics in the IoT.
[-] Understanding the challenges behind data integration & analytics for IoT
[-] Future proofing your information architecture for IoT
[-] Delivering IoT analytics, now and tomorrow
[-] Real customer examples of where Pentaho can help
The document discusses business analytics frameworks and trends. It provides Gartner's key findings that enterprises will use a combination of products and services to support diverse analytics needs. A strategic view requires defining decision-making and analytical processes. Gartner recommends using a framework to develop an implementation plan and portfolio of capabilities. The business analytics framework defines the people, processes, and platforms needed to take a strategic approach to business intelligence and analytics initiatives. Major focus areas for analytics include healthcare, finance, and supply chain.
This document discusses building a competitive advantage from a data lake. It recommends building a data reservoir with governance to hold large and complex datasets. Organizations should start small by working with business partners to develop new analytics from low-hanging fruit projects that provide value. This will help drive adoption and extension of the data lake approach over time.
MongoDB IoT City Tour LONDON: Analysing the Internet of Things: Davy Nys, Pen...MongoDB
1) The document discusses Pentaho's beliefs around Internet of Things (IoT) analytics, including applying the right data source and processing for different analytics needs, gaining insights by blending multiple data sources on demand, and planning for agility, flexibility and near real-time analytics.
2) It describes how emerging big data use cases demand blending different data sources and provides examples like improving operations and customer experience.
3) The document advocates an Extract-Transform-Report approach for IoT analytics that provides flexibility to integrate diverse data sources and enables real-time insights.
BUSINESS ENABLED ANALYTICAL DATA FLOW MANAGEMENT @ YOUNGCAPITAL - Big Data Ex...webwinkelvakdag
This presentation will focus on the importance of business enabled analytical data flow management to support business processes and decision making within YoungCapital. Part of the presentation will be an example of the ROI that is being realized with this approach.
At YoungCapital big data is used to find suitable candidates for jobs in the market place and to match jobs to candidates and candidates to jobs and using advanced in house developed artificial intelligence. These solutions are based on (web) behavior, demographic, registration and resume data. In BI we need to report and analyze on millions of transactions per year. With thousands of candidates putting in daily declarations both in our systems as well as our customers systems we need flexibility in consolidating all that data for both operational processes as well as business analytics and artificial intelligence. In stead of using data engineers in combination with business analysts to manage this YoungCapital is using Alteryx to enable business analysts to combine their analysis & business expertise with the capability of creating data flows that directly benefit the business. This frees up our data engineers for larger projects and improves the speed and quality of information delivered into our decision making.
About YoungCapital: YoungCapital is a recruitment specialist for young talent. It manages 26+ recruitment sites in 9 countries. The candidate database contains 6m+ profile. 50k+ per month are being added. With a revenue of €465 million, YoungCapital is the fastest growing company in the Dutch HR sector and number 5 in the top 250 of fastest growing companies in the Netherlands.
Presented by BrainSell, a top Sage ERP partner, www.brainsell.net
Sage Intelligence works with Sage 100, Sage 300, Sage 500 and more. See how it can make your life better today!
www.brainsell.net
Is your quality monitoring tech stack secure?Etech
Stay ahead of compliance violations with QEval. With real-time quality monitoring, we ensure near-instant alerts for critical processes and custom alerts. Additionally, automatically track compliance performance and reduce the possibility of risk.
The document provides a summary of various strategy tools that can be useful for startups. It discusses tools for context awareness when dealing with complex systems (Cynefin), analyzing value chain evolution and movement (Wardley Maps), platform and ecosystem plays using an Innovate-Leverage-Commoditize approach, identifying bounded contexts and subdomain strategies using Strategic Domain-Driven Design, capturing customer insights using a Value Proposition Canvas and Business Model Canvas, and validating hypotheses using the Lean Startup methodology. The document recommends these various tools as part of a startup's strategic toolbox to help guide strategic decision making.
Projects are dynamic. Staying ahead of the curve means having real-time access to your project data and it’s leading indicators for actionable intelligence. A real-time, flexible platform Tivitie Insights service enables organizations to visualize critical Microsoft Project Online data in real-time, spot trends as they occur and produce actionable facts for better decision making in a fraction of the time.
How an intelligence lab can accelerate your business - Big Data Expo 2019webwinkelvakdag
More and more business departments within the Rabobank want to make use of big data and advanced analytics capabilities. Most of the times the “real question” is not yet clear, or the tools, data and knowledge is not available within the business teams. To solve this problem the Data Science & Business Consulting department has introduced the Intelligence Lab where Dataiku is one of its core components.
Within this Intelligence Lab we provide data, tools, knowledge and support on different levels to help the business accelerate in creating viable products within a short timeframe. In this talk we will share some insights on our journey and the issues and choices we faced along the way.
Lightning-Fast, Interactive Business Intelligence Performance with MicroStrat...Tyler Wishnoff
See how extreme query speeds and ultra-high concurrency on MicroStrategy, and any other business intelligence (BI) tool, on Big Data is possible through the Kyligence platform. Learn more here: https://kyligence.io/
This document provides an overview of setting up a database for a social benefit organization. It discusses planning the database by reviewing current and future data needs, business processes, and reporting requirements. It also covers requisite resources like acquisition, maintenance, and ongoing costs. The document recommends creating a request for proposal, budget, and timeline and highlights advanced data capabilities like business intelligence, analytics, and benchmarking dashboards.
Presentation by Luc Delanglez (DataLumen) at the Data Vault Modelling and Dat...Patrick Van Renterghem
This document discusses data governance and provides guidance on getting started with a data governance program. It outlines key building blocks like a data catalog, workflows, and business and technical lineage. It also covers challenges like governing a multi-store data environment and ensuring data quality. The document recommends starting with understanding your data and building out foundational elements like a data catalog before operationalizing governance through workflows.
Charter Global builds Big Data business solutions that provide real-time, predictive analytics for measurable ROI results to help you find hidden opportunities for increased revenue and cost savings.
Platfora provides an interactive, in-memory business intelligence platform for Hadoop. It aims to solve challenges with existing BI offerings on Hadoop by using "software defined" data marts that are automatically and iteratively built from Hadoop data. The Platfora solution integrates a web-based BI application, an in-memory data mart and processing engine, and an automated Hadoop data refinery to enable powerful closed-loop analysis of big data. It was demonstrated using a case study of Edmunds.com, which moved to Platfora to gain faster access and agility with their large and growing web and mobile data.
From Business Intelligence to Big Data - hack/reduce Dec 2014Adam Ferrari
Talk given on Dec. 3, 2014 at MIT, sponsored by Hack/Reduce. This talk looks at the history of Business Intelligence, from first-generation OLAP tools through modern Data Discovery and visualization tools, and asks what we can learn from that evolution as numerous new tools and architectures for analytics emerge in the Big Data era.
Agile Data Warehouse Design for Big Data PresentationVishal Kumar
Synopsis:
[Video link: http://www.youtube.com/watch?v=ZNrTxSU5IQ0 ]
Jim Stagnitto and John DiPietro of consulting firm a2c will discuss Agile Data Warehouse Design - a step-by-step method for data warehousing / business intelligence (DW/BI) professionals to better collect and translate business intelligence requirements into successful dimensional data warehouse designs.
The method utilizes BEAM✲ (Business Event Analysis and Modeling) - an agile approach to dimensional data modeling that can be used throughout analysis and design to improve productivity and communication between DW designers and BI stakeholders. BEAM✲ builds upon the body of mature "best practice" dimensional DW design techniques, and collects "just enough" non-technical business process information from BI stakeholders to allow the modeler to slot their business needs directly and simply into proven DW design patterns.
BEAM✲ encourages DW/BI designers to move away from the keyboard and their entity relationship modeling tools and begin "white board" modeling interactively with BI stakeholders. With the right guidance, BI stakeholders can and should model their own BI data requirements, so that they can fully understand and govern what they will be able to report on and analyze.
The BEAM✲ method is fully described in
Agile Data Warehouse Design - a text co-written by Lawrence Corr and Jim Stagnitto.
About the speaker:
Jim Stagnitto Director of a2c Data Services Practice
Data Warehouse Architect: specializing in powerful designs that extract the maximum business benefit from Intelligence and Insight investments.
Master Data Management (MDM) and Customer Data Integration (CDI) strategist and architect.
Data Warehousing, Data Quality, and Data Integration thought-leader: co-author with Lawrence Corr of "Agile Data Warehouse Design", guest author of Ralph Kimball’s “Data Warehouse Designer” column, and contributing author to Ralph and Joe Caserta's latest book: “The DW ETL Toolkit”.
John DiPietro Chief Technology Officer at A2C IT Consulting
John DiPietro is the Chief Technology Officer for a2c. Mr. DiPietro is responsible
for setting the vision, strategy, delivery, and methodologies for a2c’s Solution
Practice Offerings for all national accounts. The a2c CTO brings with him an
expansive depth and breadth of specialized skills in his field.
Sponsor Note:
Thanks to:
Microsoft NERD for providing an awesome venue for the event.
http://A2C.com IT Consulting for providing the food and drinks.
http://Cognizeus.com for providing a book to give away as a raffle prize.
Data Science Salon: Quit Wasting Time – Case Studies in Production Machine Le...Formulatedby
Presented by Yashas Vaidya, Sr. Data Scientist at Dataiku
Next DSS MIA Event - https://datascience.salon/miami/
The steps for taking a machine learning model to production. Modern architectures and technologies for building production machine learning. An overview of the talent and processes for creating and maintaining production machine learning.
Smarter Analytics: Supporting the Enterprise with AutomationInside Analysis
The Briefing Room with Barry Devlin and WhereScape
Live Webcast on June 10, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=5230c31ab287778c73b56002bc2c51a
The data warehouse is intended to support analysis by making the right data available to the right people in a timely fashion. But conditions change all the time, and when data doesn’t keep up with the business, analysts quickly turn to workarounds. This leads to ungoverned and largely un-managed side projects, which trade short-term wins for long-term trouble. One way to keep everyone happy is by creating an integrated environment that pulls data from all sources, and is capable of automating both the model development and delivery of analyst-ready data.
Register for this episode of The Briefing Room to hear data warehousing pioneer and Analyst Barry Devlin as he explains the critical components of a successful data warehouse environment, and how traditional approaches must be augmented to keep up with the times. He’ll be briefed by WhereScape CEO Michael Whitehead, who will showcase his company’s data warehousing automation solutions. He’ll discuss how a fast, well-managed and automated infrastructure is the key to empowering faster, smarter, repeatable decision making.
Visit InsideAnalysis.com for more information.
This document discusses using data warehouses in retail and finance. It provides examples of how data warehouses are used in both industries, including for market basket analysis, product placement, supply chain management, and customer profiling. It also outlines some opportunities and challenges of implementing data warehouses, such as improved sales and customer loyalty but also large data volumes and data preparation difficulties. Specific company examples are given, like how Netflix uses customer streaming data and how Raymond James improved data backups and reporting with a new solution.
The Data Lake: Empowering Your Data Science TeamSenturus
Data science overview: defined, purpose, relation to BI, differences from BI and benefits from using both data science and BI. View the webinar video recording and download this deck: http://www.senturus.com/resources/data-lake-empowering-data-science-team/.
Learn how the data lake can empower data science teams and free up valuable data warehouse resources.
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
Accelerating Data Lakes and Streams with Real-time AnalyticsArcadia Data
As organizations modernize their data and analytics platforms, the data lake concept has gained momentum as a shared enterprise resource for supporting insights across multiple lines of business. The perception is that data lakes are vast, slow-moving bodies of data, but innovations like Apache Kafka for streaming-first architectures put real-time data flows at the forefront. Combining real-time alerts and fast-moving data with rich historical analysis lets you respond quickly to changing business conditions with powerful data lake analytics to make smarter decisions.
Join this complimentary webinar with industry experts from 451 Research and Arcadia Data who will discuss:
- Business requirements for combining real-time streaming and ad hoc visual analytics.
- Innovations in real-time analytics using tools like Confluent’s KSQL.
- Machine-assisted visualization to guide business analysts to faster insights.
- Elevating user concurrency and analytic performance on data lakes.
- Applications in cybersecurity, regulatory compliance, and predictive maintenance on manufacturing equipment that benefit from streaming visualizations.
The demand for BI continues to grow, and while there's no question that analytics brings value, there is often uncertainty about how BI initiatives will deliver bottom-line benefits. Your business case for BI should prove ROI, but this is not always a straightforward process.
Building a 360 Degree View of Your Customers on BICSPerficient, Inc.
Why there is a need for Customer 360 and what the proposed cloud based solution is. We cover the stages of strategic marketing and how Oracle BI can help.
Collaborate 2018: How to Get Cross Functional Reporting with an Enterprise Da...Datavail
Many organizations not only lack the ability to look at their data across the organization as whole, but often have no lens into the metrics that they need to report against or manage the business of their own departments.
How beneficial would it be to have a central data information repository – we call it an Enterprise Data Warehouse – from which to retrieve accurate data from across all aspects of your business? This presentation explains how this, and more, can be a reality for your business, in a relatively short amount of time.
Seeing Redshift: How Amazon Changed Data Warehousing ForeverInside Analysis
The Briefing Room with Claudia Imhoff and Birst
Live Webcast April 9, 2013
What a difference a day can make! When Amazon announced their new Redshift offering – a data warehouse in the cloud – the entire industry of information management changed. The most notable disruption? Price. At a whopping $1,000 per year for a terabyte, Redshift achieved a price-point improvement of at least two orders of magnitude, if not three, compared to its top-tier competitors. But pricing is just one change; there's also the entire process by which data warehousing is done.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Claudia Imhoff explain why a new cloud-based reality for data warehousing significantly changes the game for business intelligence and analytics. She'll be briefed by Brad Peters of Birst who will tout his company's BI solution, which has been specifically architected for cloud-based hosting. Peters will discuss several key intricacies of doing BI in the cloud, including the unique provisioning, loading and modeling requirements. Founded in 2004, Birst has nearly a decade of doing cloud-based BI and Analytics.
Visit: http://www.insideanalysis.com
When and How Data Lakes Fit into a Modern Data ArchitectureDATAVERSITY
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Build the data lake, not the data swamp! The tool ecosystem is building up around the data lake, and soon many organizations will have a robust lake alongside the data warehouse. We will discuss policies to keep them straight, send data to its best platform, and keep users’ confidence in their data platforms high.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
How Western Alliance Bank is Innovating with Oracle Analytics CloudPerficient, Inc.
Western Alliance Bank had good visibility into profitability but lacked a platform that could pull data from different sources, model it, and create visual stories that empowered users to spot opportunities, forecast efficiently, and reduce risk.
Andrew Boucher, former vice president of FP&A at Western Alliance Bank, led the bank’s effort to find a cloud analytics platform that would meet its needs without being overwhelming to deploy and manage. The company chose Oracle Analytics Cloud (OAC) and partnered with Perficient to help.
We joined Andrew to discuss Western Alliance Bank’s experience with OAC, lessons learned, and how OAC could benefit your organization.
Discussion included:
-Challenges with the legacy environment
-Benefits realized with consolidated analytics, ability to trend KPIs, and drill to detail
-Best practices and methodology
-Latest development on Oracle Analytics Cloud
Data Discovery and BI - Is there Really a Difference?Inside Analysis
The Briefing Room with John O'Brien and Birst
Live Webcast Dec. 3, 2013
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?AT=pb&SP=EC&rID=7869542&rKey=1f6574abc879ca42
While the disciplines of business intelligence and discovery certainly overlap, there are key distinctions between the two, both in terms of design point and user interface. While traditionally it is believed different architectures are required to address these differing analytic needs, is that really the case? Or is discovery simply another key capability within an overall BI platform?
Register for this episode of The Briefing Room to learn from veteran Analyst John O'Brien of Radiant Advisors as he outlines best practices for enabling high-quality business intelligence and discovery, and the architectural capabilities to enable both. He'll be briefed by Brad Peters of Birst who will tout his company's cloud BI platform. In particular, Peters will demonstrate how the Birst architecture was especially designed for enterprise-caliber BI and argue for a more inclusive future BI architecture.
Visit InsideAnalysis.com for more information
Case Study: Lessons from Newell Rubbermaid's SAP HANA Proof of ConceptSAPinsider Events
View this session from Reporting & Analytics 2014. Coming to Las Vegas in November! www.reporting2015.com
In this session, Newell Rubbermaid guides you through the key elements that comprised its SAP HANA business case and proof of concept, including an emphasis on process improvement. Learn firsthand how Newell Rubbermaid:
· Identified which business processes were most likely to realize significant improvement as a result of utilizing SAP HANA
· Established a “current state” baseline and demonstrated a “projected state” that could be realized through the use of SAP HANA
· Determined which SAP BI tools to use based on specific reporting scenarios and end user requirements
Foundational Strategies for Trust in Big Data Part 1: Getting Data to the Pla...Precisely
Teams working on new business initiatives, whether for enhancing customer engagement, creating new value, or addressing compliance considerations, know that a successful strategy starts with the synchronization of operational and reporting data from across the organization into a centralized repository for use in advanced analytics and other projects. However, the range and complexity of data sources, as well as the lack of specialized skills needed to extract data from critical legacy systems, often cause inefficiencies and gaps in the data being used by the business.
The first part of our webcast series on Foundational Strategies for Trust in Big Data provides insight into how Syncsort Connect, with its design-once, deploy-anywhere approach, supports a repeatable pattern for data integration by enabling enterprise architects and developers to ensure data from ALL enterprise data sources – from mainframe to cloud – is available in the downstream data lakes for use in these key business initiatives.
Data Refinement: The missing link between data collection and decisionsVivastream
The document discusses the importance of data refinement between data collection and decision making. It emphasizes the need to transform raw data into useful insights through techniques like data summarization, categorization, and predictive modeling in order to provide accurate marketing answers and improve targeting, costs, and results. Specifically, it recommends structuring data into a model-ready environment, creating descriptive variables from transaction histories, matching data to the appropriate analytical goals and levels, and categorizing non-numeric attributes.
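The refinement steps described above can be sketched in a few lines. This is a rough illustration only: the transaction records, field names, and the RFM-style descriptive variables (recency, frequency, monetary) are hypothetical stand-ins, not the document's actual method.

```python
from datetime import date

# Hypothetical raw transaction history: (customer_id, date, amount).
transactions = [
    ("C1", date(2024, 1, 5), 120.0),
    ("C1", date(2024, 3, 2), 80.0),
    ("C2", date(2024, 2, 14), 45.0),
]

def refine(history, as_of):
    """Summarize raw transactions into model-ready descriptive variables:
    recency (days since last purchase), frequency (count), monetary (total)."""
    rolled = {}
    for cust, when, amount in history:
        p = rolled.setdefault(cust, {"last": when, "frequency": 0, "monetary": 0.0})
        p["last"] = max(p["last"], when)
        p["frequency"] += 1
        p["monetary"] += amount
    return {
        cust: {
            "recency_days": (as_of - p["last"]).days,
            "frequency": p["frequency"],
            "monetary": round(p["monetary"], 2),
        }
        for cust, p in rolled.items()
    }

profiles = refine(transactions, as_of=date(2024, 4, 1))
print(profiles)
```

Summaries like these turn raw transaction rows into per-customer variables that can be matched to an analytical goal and level, as the description recommends.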
The document discusses Bon-Ton Stores working with Armeta Solutions to improve their analytics capabilities. An assessment found that merchants were overloaded with raw data dumps rather than actionable insights. The technical infrastructure was sufficient but not optimized. A roadmap was created to deliver intuitive analytic tools that empower merchants, moving Bon-Ton from a state of analyzing little to one where information can be a strategic weapon. The goal is reconnecting with customers by giving merchants local-level visibility through enhanced data-driven decision making.
Innovative Data Leveraging for Procurement AnalyticsTejari
This webinar will explore the types of problems and questions faced by procurement executives that can benefit most through the application of analytical solutions (e.g. innovation, strategic cost management, risk mitigation, etc.). In addition, we will cover the different forms of cognitive solutions that are emerging to drive real-time decision-making and predictive sourcing capabilities.
Similar to Webinar: If Your Data Could Talk, What Story Would it Tell? Would it Be a Documentary, a Thriller, or Horror Story? - QueBIT Consulting (20)
QueBIT Agile Crisis Planning - A Better Way to Plan for Uncertain TimesQueBIT Consulting
The QueBIT solution, which was built on IBM Planning Analytics, enabled the client to achieve two things that were not possible before the implementation. First, the solution identified correlations between historical sales and external factors such as inflation, Consumer Price Index (CPI), Gross Domestic Product (GDP), and commodity prices. This process, built on IBM Planning Analytics and enhanced using QueBIT’s Euclid Studio tool, validated some prior assumptions by the business, while invalidating others. Second, the solution made it possible for the client to create numerous scenarios around reopening and consumer behaviors, including calculations of the resulting impact to business performance (volume and mix by channel). The future is still dynamic, and the QueBIT solution allows the business to identify its best assumptions and the resulting financial plan based on those assumptions.
JLG Case Study: Prescriptive Analytics & CPLEX Decision Optimization and TM1 ...QueBIT Consulting
The document discusses how JLG Industries used IBM Planning Analytics and CPLEX Decision Optimization integrated with TM1 to automate and optimize their master production scheduling process. This resulted in estimated annual savings of $700K from reducing the scheduling time from 10 days to 3 days per month, and $2.5M from improved forecast accuracy. The solution aligned production slot allocation across different factors based on optimization rules in 12 minutes, compared to 10 days previously.
Webinar December 2018 - Planning Analytics Workspace (PAW) Tips & Tricks. Today’s webinar is part of an advanced webinar series offered by QueBIT. Our next webinar is scheduled for Thursday, January 10th at 2pm Eastern. Learn about the advancements in the Cognos Analytics 11.1 release. These changes will bring the power of artificial intelligence, machine learning, and advanced analytics to all Cognos Analytics users to empower, enlighten, and facilitate a new breed of boundless data explorers! Register today by accessing the Events page on our website at quebit.com/news-events.
Telling “Your Story” Using Cognos Analytics WebinarQueBIT Consulting
Agenda:
-Introduction to QueBIT
-Overview of data “stories”
-Defining key elements in an effective “story”
-Demonstration of data delivery using these elements
-A technical dive into the Cognos Analytics features used in the demonstration
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...QueBIT Consulting
Why is good testing so hard to do? Not enough time, not enough testers, inconsistent or incomplete test scripts, a lack of performance metrics, and results that are difficult to summarize.
Practical Implementation Tips For Implementing a Financial Planning - QueBIT ...QueBIT Consulting
We’re driven to help organizations improve their agility to make intelligent decisions that create value.
This is why we’re committed to excellence in analytics strategy, implementation, and training.
QueBIT is a trusted expert in business analytics. They help organizations apply analytics to key functions to improve performance. QueBIT uses a collaborative approach to implement customized analytics solutions tailored for each client. Their solutions and expertise span data management, business intelligence, and advanced analytics. This allows clients to gain insights from their data to make better decisions and outperform competitors.
"Price optimization avoids loss; loss of a customer because you priced too high, or loss of margin because you priced too low. You are likely losing customers and margin every day without price optimization" -Gary Quirke, CEO
QueBIT Planning is a cloud-based financial planning and analytics solution that allows customers to leverage the benefits of TM1 without the cost and overhead of an on-premise system. It can replace an existing problematic planning process within weeks with a streamlined, collaborative cloud model. Key benefits include reusable Excel reports, predictive modeling, minimal training needs, an intuitive user experience, and rapid implementation managed by QueBIT.
Webinar Agenda:
-Brief QueBIT Introductions
-What is Price Optimization?
-QueBIT’s Approach to Price Optimization
-Achilles Demonstration
-Key Benefits
While predictive analytics is based on sophisticated mathematics, Achilles gives deep insights into how these models work. These models consume all available data (not just price) to ensure that pricing recommendations are realistic and take into account the dynamic marketplace.
QueBIT Planning is a cloud-based financial planning and analytics solution that will enable customers to use all the benefits of TM1 without the cost or overhead of owning an on-premise custom application.
JLG Case Study - Prescriptive Analytics & CPLEX Decision Optimization and TM1...QueBIT Consulting
The document discusses how JLG Industries used IBM Planning Analytics and CPLEX Decision Optimization integrated with TM1 to automate and optimize their master production scheduling process. This resulted in estimated annual savings of $700K from reducing the scheduling time from 10 days to 3 days per month, and $2.5M from improved forecast accuracy. The solution aligned production plans from their Sales, Inventory, and Operations Plan more quickly and accurately to improve supplier forecasts.
Introduction to TM1 TurboIntegrator Debugger Webinar - Quebit ConsultingQueBIT Consulting
Today’s webinar is part of a monthly advanced webinar series offered by QueBIT. Register for future webinars by accessing the events page on our website at quebit.com/news-events
AGENDA:
What is the TurboIntegrator Debugger?
Main Features/Main Window Walkthrough
How will the TI debugger help me?
Limitations
Installation and Configuration
QueBIT aims to make it easy to help you find the right information. Our mission is to empower you with the training you need, so that you can apply analytic techniques with confidence. We want you to succeed and see the power in the data that is at your fingertips, so that you can make better informed decisions. QueBIT is a full-service operation, offering flexible training sessions to meet your busy schedules. Our training is presented by certified, expert, technical trainers.
Cognos Analytics/Business Intelligence Training Catalog - Self Paced, Instruc...QueBIT Consulting
QueBIT aims to make it easy to help you find the right information. Our mission is to empower you with the training you need, so that you can apply analytic techniques with confidence. We want you to succeed and see the power in the data that is at your fingertips, so that you can make better informed decisions. QueBIT is a full-service operation, offering flexible training sessions to meet your busy schedules.
Auditable Financial System for Government Contracting at Accenture Federal Se...QueBIT Consulting
AGENDA:
Introductions and Company Overview
Government Contracting Industry Overview
AFS TM1 Model Overview
Business and Architectural Challenges
Solution Approach
Outcome and Results Achieved
Enhancements since Re-Architecture
Future Plans/Developments
Closing Statement
Q&A Session
Leveraging IBM Cognos TM1 for Merchandise Planning at Tractor Supply Company ...QueBIT Consulting
AGENDA:
Introductions and Company Overviews
TSC Merchandise Planning Solution Overview
Prior State
Solution and Implementation
Tips & Tricks for TM1 Perspectives Templates
Q&A
AGENDA:
Introductions and Company Overviews
JLG Master Scheduling Solution Overview
Problem Overview
What is CPLEX?
CPLEX & TM1 Integration
Solution and Implementation
Q&A
AGENDA:
Introductions and Company Overviews
Stein Mart’s Business Objectives
Leveraging the IBM Cloud and QueBIT FrameWORQ to maximize time to value
Stein Mart’s Reporting Solution
Results Achieved
Building Your Employer Brand with Social MediaLuanWise
Presented at The Global HR Summit, 6th June 2024
In this keynote, Luan Wise will provide invaluable insights to elevate your employer brand on social media platforms including LinkedIn, Facebook, Instagram, X (formerly Twitter) and TikTok. You'll learn how compelling content can authentically showcase your company culture, values, and employee experiences to support your talent acquisition and retention objectives. Additionally, you'll understand the power of employee advocacy to amplify reach and engagement – helping to position your organization as an employer of choice in today's competitive talent landscape.
The 10 Most Influential Leaders Guiding Corporate Evolution, 2024.pdfthesiliconleaders
In the recent edition, The 10 Most Influential Leaders Guiding Corporate Evolution, 2024, The Silicon Leaders magazine gladly features Dejan Štancer, President of the Global Chamber of Business Leaders (GCBL), along with other leaders.
Recruiting in the Digital Age: A Social Media MasterclassLuanWise
In this masterclass, presented at the Global HR Summit on 5th June 2024, Luan Wise explored the essential features of social media platforms that support talent acquisition, including LinkedIn, Facebook, Instagram, X (formerly Twitter) and TikTok.
IMPACT Silver is a pure silver-zinc producer with over $260 million in revenue since 2008 and a large, 100%-owned 210km Mexico land package - 2024 catalysts include the new 14% grade zinc Plomosas mine and 20,000m of fully funded exploration drilling.
At Techbox Square, in Singapore, we're not just creative web designers and developers, we're the driving force behind your brand identity. Contact us today.
Company Valuation webinar series - Tuesday, 4 June 2024FelixPerez547899
This session provided an update as to the latest valuation data in the UK and then delved into a discussion on the upcoming election and the impacts on valuation. We finished, as always with a Q&A
Event Report - SAP Sapphire 2024 Orlando - lots of innovation and old challengesHolger Mueller
Holger Mueller of Constellation Research shares his key takeaways from SAP's Sapphire conference, held in Orlando, June 3rd to 5th, 2024, in the Orange County Convention Center.
The Evolution and Impact of OTT Platforms: A Deep Dive into the Future of Ent...ABHILASH DUTTA
This presentation provides a thorough examination of Over-the-Top (OTT) platforms, focusing on their development and substantial influence on the entertainment industry, with a particular emphasis on the Indian market. We begin with an introduction to OTT platforms, defining them as streaming services that deliver content directly over the internet, bypassing traditional broadcast channels. These platforms offer a variety of content, including movies, TV shows, and original productions, allowing users to access content on-demand across multiple devices.
The historical context covers the early days of streaming, starting with Netflix's inception in 1997 as a DVD rental service and its transition to streaming in 2007. The presentation also highlights India's television journey, from the launch of Doordarshan in 1959 to the introduction of Direct-to-Home (DTH) satellite television in 2000, which expanded viewing choices and set the stage for the rise of OTT platforms like Big Flix, Ditto TV, Sony LIV, Hotstar, and Netflix.
The business models of OTT platforms are explored in detail. Subscription Video on Demand (SVOD) models, exemplified by Netflix and Amazon Prime Video, offer unlimited content access for a monthly fee. Transactional Video on Demand (TVOD) models, like iTunes and Sky Box Office, allow users to pay for individual pieces of content. Advertising-Based Video on Demand (AVOD) models, such as YouTube and Facebook Watch, provide free content supported by advertisements. Hybrid models combine elements of SVOD and AVOD, offering flexibility to cater to diverse audience preferences.
Content acquisition strategies are also discussed, highlighting the dual approach of purchasing broadcasting rights for existing films and TV shows and investing in original content production. This section underscores the importance of a robust content library in attracting and retaining subscribers.
The presentation addresses the challenges faced by OTT platforms, including the unpredictability of content acquisition and audience preferences. It emphasizes the difficulty of balancing content investment with returns in a competitive market, the high costs associated with marketing, and the need for continuous innovation and adaptation to stay relevant.
The impact of OTT platforms on the Bollywood film industry is significant. The competition for viewers has led to a decrease in cinema ticket sales, affecting the revenue of Bollywood films that traditionally rely on theatrical releases. Additionally, OTT platforms now pay less for film rights due to the uncertain success of films in cinemas.
Looking ahead, the future of OTT in India appears promising. The market is expected to grow by 20% annually, reaching a value of ₹1200 billion by the end of the decade. The increasing availability of affordable smartphones and internet access will drive this growth, making OTT platforms a primary source of entertainment for many viewers.
Storytelling is an incredibly valuable tool to share data and information. To get the most impact from stories there are a number of key ingredients. These are based on science and human nature. Using these elements in a story you can deliver information impactfully, ensure action and drive change.
Industrial Tech SW: Category Renewal and CreationChristian Dahlen
Every industrial revolution has created a new set of categories and a new set of players.
Multiple new technologies have emerged, but Samsara and C3.ai are only two companies which have gone public so far.
Manufacturing startups constitute the largest pipeline share of unicorns and IPO candidates in the SF Bay Area, and software startups dominate in Germany.
3. 3
Agenda
Introduction to QueBIT
What’s your story?
Identifying Common Pitfalls
How to capture, process, and store data
How to rationalize key reference data
Reporting vs. Analytics
Use cases from the field
Q&A
4. 4
Housekeeping
Today’s webinar is part of an advanced webinar series offered by QueBIT.
Our next webinar is scheduled for Thursday, December 13th at 2pm Eastern.
We will demonstrate PAx/PAW Tips and Tricks and the benefits of upgrading.
Register today by accessing the Events page on our website at quebit.com/news-events.
Miss a past webinar? No problem! Visit the Resources page on our website: //quebit.com/who-we-are/video-catalog/
Please type all questions in the Questions Pane located on the GTW toolbar. As time permits, the questions will be addressed and answered at the end of the webinar.
5. 5
900+ successful implementations
400+ customers in numerous industries
100+ employees across the US
Award-winning leaders in analytics, 6 years in a row
QueBIT is the leading end-to-end analytics provider.
7. 7
“84% of CEOs are concerned about the data they’re basing their decisions on.”
- Forbes Insights and KPMG
8. 8
Who are your target audiences and what is your message?
Government Agencies
Investors
Executives
Management
Customers
Employees
Other Stakeholders
SEC, IRS, EPA
Growth Expectations
New Market Entry
Product Growth
Service Usage
Local Market Performance
9. 9
Common Pitfalls
• Your story is…unclear
• Your audience is…not well-defined
• Your data is…poorly structured
• You have no…authoritative source(s)
• Your solution is built on…“quick wins”
10. 10
Your story is unclear
• Lack of requirements
• Uncommitted or no project sponsor
• Incomplete vision
12. 12
Poorly Structured Data (a Frankenstein solution)
• Copy of transactional structures
• Little or no integration
• Structured when unstructured is needed
14. 14
“Quick Wins”
• “Dirty” outlives “Quick”
• Cost to “fix it” later
• No plan for follow-up project
• “No Plan’s Land”
• Follow-up/through
15. TRUSTED EXPERTS IN ANALYTICS
Business Sponsorship
Capture, Process, Store (ETL)
Rationalization
Reporting vs Analytics
Get these right and you can avoid the
pitfalls…
16. 16
Business Sponsorship
• What are you building?
• Who is funding it?
• When do they need it and why?
• How often do they need it?
• Are they committed?
17. 17
Capture, Process, Store
(a.k.a. ETL)
Capture
The right information at the right time
Process
The information is integrated
Store
The information is available and fit for purpose
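The capture, process, store flow above can be sketched end to end. This is a minimal illustration only, using invented data, a hypothetical `customer_key` helper, and an in-memory SQLite database standing in for the real store:

```python
import csv
import io
import sqlite3

# Capture: read raw rows (an in-memory CSV stands in for an extract file)
raw = io.StringIO("customer,amount\n Acme Corp ,100\nacme corp,250\n")
rows = list(csv.DictReader(raw))

# Process: integrate -- standardize the customer name and assign one
# enterprise-wide key shared by all spellings of the same customer
def standardize(name):
    return " ".join(name.strip().lower().split())

keys = {}
def customer_key(name):
    std = standardize(name)
    return keys.setdefault(std, len(keys) + 1)

# Store: land the integrated rows in a queryable structure
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (customer_key INTEGER, amount REAL)")
for r in rows:
    db.execute("INSERT INTO sales VALUES (?, ?)",
               (customer_key(r["customer"]), float(r["amount"])))

total = db.execute(
    "SELECT SUM(amount) FROM sales WHERE customer_key = 1").fetchone()[0]
```

Both source spellings of the customer roll up under one key, which is the point of the Process step: without the shared identifier, the two rows would report as two different customers.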
18. 18
Rationalizing key data
• Reference data
• Data Stewards
• Standardize before matching
• Not a one-time process
19. 19
Reporting vs. Analytics
Reporting
• Descriptive
• Backward-looking
• Raises questions
• Data into information
• Reports and dashboards
• What is happening
Analytics
• Prescriptive
• Forward-looking
• Answers questions
• Information into actionable insights
• Findings, predictions,
recommendations
• Why is it happening
20. 20
Use Case 1
Situation:
• Company XYZ’s finance department uses flat files from multiple ERP
systems to create monthly financial statements.
• They want I.T. to build them a data warehouse by connecting to the ERP
systems directly and extracting the data into the database so that they can
have a one-stop shop for their reporting source.
21. 21
Use Case 1
Concerns:
• Which ERP system takes priority for data accuracy when values conflict?
• Is connecting to the ERP systems directly the most efficient and effective
way to get the data into a database?
• Does the finance department realize that this is not necessarily a data
warehouse, but rather more of a data store that stores this finance data from
multiple sources?
22. 22
Use Case 1
Resolution:
• Flat-file extracts. These aren’t a problem if your update frequency
doesn’t need to be near real time.
• Set a longer-term goal to build out a data warehouse.
• This builds on the idea of using reporting software on top of the
data warehouse to build the finance report.
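The flat-file approach in this resolution can be sketched as a bulk load: stage the extract, then insert it in one batch instead of querying the ERP systems row by row. This is a minimal illustration with invented table and column names, using an in-memory SQLite database as a stand-in for the staging database:

```python
import csv
import io
import sqlite3

# A flat-file extract (in-memory here); in practice this would be an
# export file dropped by the ERP system, e.g. into blob storage
extract = io.StringIO(
    "account,period,balance\n1000,2023-01,500\n2000,2023-01,750\n")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE gl_extract (account TEXT, period TEXT, balance REAL)")

# Bulk-load the file in one batch rather than issuing per-row round
# trips against the source system
reader = csv.reader(extract)
next(reader)  # skip the header row
db.executemany("INSERT INTO gl_extract VALUES (?, ?, ?)", reader)

count = db.execute("SELECT COUNT(*) FROM gl_extract").fetchone()[0]
```

The design point is the same one the speaker notes make: hit the source system as little as possible, and let the database's bulk path do the loading.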
23. 23
Use Case 2
Situation:
• ABC Music Services collects royalties for song writers. They currently store
their music catalog and performance data in a legacy mainframe database.
• They would like to create a cloud-based portal to allow their writers to be
able to see which songs in their catalogs play when, by which service, by
geographic location. The music catalog does not change frequently. The
performance information is captured daily from many different sources in
many different formats.
24. 25
Use Case 2
Recommendation:
Star schema in a relational database.
Why?
Relational database systems support star schemas well. The problem
statement describes the need to query by writer, by song, by time, by
which service, by geography. This is a natural fit for a star schema.
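The recommended star schema can be sketched as one fact table joined to one dimension table per query axis. All table names, columns, and rows below are invented for illustration; SQLite stands in for the real relational database:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Dimensions: writer, song, service; fact: one row per song/service/day/region
db.executescript("""
CREATE TABLE dim_writer  (writer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_song    (song_id INTEGER PRIMARY KEY, title TEXT,
                          writer_id INTEGER);
CREATE TABLE dim_service (service_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_plays  (song_id INTEGER, service_id INTEGER,
                          play_date TEXT, region TEXT, plays INTEGER);
INSERT INTO dim_writer  VALUES (1, 'A. Writer');
INSERT INTO dim_song    VALUES (10, 'Song One', 1);
INSERT INTO dim_service VALUES (100, 'StreamCo');
INSERT INTO fact_plays  VALUES (10, 100, '2023-01-01', 'US', 42);
INSERT INTO fact_plays  VALUES (10, 100, '2023-01-02', 'US', 8);
""")

# The portal query: plays by writer, song, and service --
# one simple join from the fact table to each dimension
row = db.execute("""
    SELECT w.name, s.title, sv.name, SUM(f.plays)
    FROM fact_plays f
    JOIN dim_song s     ON s.song_id = f.song_id
    JOIN dim_writer w   ON w.writer_id = s.writer_id
    JOIN dim_service sv ON sv.service_id = f.service_id
    GROUP BY w.name, s.title, sv.name
""").fetchone()
```

Each "by writer, by song, by service" slice in the problem statement maps directly to a dimension join, which is why BI reporting tools handle this shape so well.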
25. 26
What we covered…
• What’s your story?
• Identifying Common Pitfalls
• Business sponsorship
• How to capture, process, and store data
• How to rationalize key reference data
• Reporting vs. Analytics
• Use cases from the field
26. 27
Data. It’s what we do.
QueBIT designs and builds sustainable analytic solutions that enable you to
unravel the full power of your data. We work with you to understand your
business and technology needs and deliver short and long term projects. We
cater to companies of all sizes and industries and have solutions and offerings
that can work for you.
Data Focus Areas:
Data Integration & Warehousing
Data Governance
Data Quality
Data Architecture
27. 28
Q&A
Submit questions by typing them in the Questions Pane
on the GoToWebinar toolbar
Visit our website for additional information www.quebit.com
Or email us at info@quebit.com
Thank you for attending and have a wonderful day!
Editor's Notes
If your data could talk, what story would it tell? Would it be a documentary, a thriller, or horror story?
QueBIT cares passionately about helping good companies become great, through our analytics solutions and service offerings.
We advise clients on analytics best practices and embed analytics into every day business processes. We also develop and implement our own analytics products and solutions, and resell analytics software which together with our own solutions, help us deliver results for our clients.
Client success is at the heart of our mission. We listen to our clients and intensely focus our efforts on solving their business challenges. We hire the best talent we can find, we listen to our employees, and tirelessly work to create a culture of openness, collaboration, fun and opportunity. We believe that passionate, motivated and talented people enable success for our clients.
And we pride ourselves on being the leading end-to-end business analytics provider.
Thanks Mike! Welcome to the QueBIT Data Management webinar: “What’s Your Story?” My name is Keith Hollen and I’ll be presenting this part of the webinar.
According to Forbes Insights and KPMG, “84% of CEOs are concerned about the data they’re basing their decisions on.” That’s a powerful statement. Only 16% of CEOs are comfortable with the data they use every day to run their businesses. With all the technology available to us today, you would think this wouldn’t be a problem. Yet we still find that companies are struggling to provide quality data in a reliable and timely manner. Today we’ll talk about why this happens and discuss some things you can do to help your CEOs be more comfortable with the information you provide.
When you think about what you do as an organization, what’s your story? Who is your target audience? You likely have multiple stories for different target audiences.
For instance, you probably report to one or more government agencies, such as the SEC, IRS, or EPA.
You have a story for your investors. For example, Growth Expectations.
Internally, you may have a story for your Executives, mid-level and line management about New Market entry and Product Growth.
What about your customers? Perhaps Service or Product usage
And your employees may be interested in local market performance.
Some, or all of these, are potential audiences for your story.
When you think of each of these, what do you want to say? Do you have the information necessary? Do you trust that information? Is it up-to-date?
In the end, to tell your story, you need the right information, in the right place, at the right time.
So how do you get there? During this webinar, we’ll discuss common pitfalls, key concepts you need to know to avoid these pitfalls, and we’ll follow up with some real-world use cases.
Let’s now talk about some common pitfalls which lead to untrusted data.
Your story is unclear,
your audience is not well-defined,
your data is poorly structured,
you have no, or multiple, authoritative sources,
your solution is built on “quick wins”.
You really don’t have a clear definition of the picture you’re trying to paint. This is usually caused by a lack of requirements and/or an uncommitted project sponsor.
Perhaps you’re building a data warehouse with the “Build it and they will come” philosophy or you’re getting requirements second hand. This approach often results in a data repository that doesn’t get used.
Your data repository is a collection of copies of transactional system tables. This might be fine for operational reporting, but this isn’t a data warehouse pattern.
Data living in the same house is not a data warehouse. If it isn’t integrated, it’s just a bunch of data…
You also may be using the wrong structures for the type of reporting you’re doing.
You don’t have a single source of truth. This is common when multiple systems are in use – for example, multiple ERP systems.
We need it now and we’ll fix it later. A.K.A. “Quick and Dirty”.
The problem is that the dirty long outlives the quick, and you never have time to go back and fix it.
However, we all run into this situation. Usually it’s because someone at a higher pay grade has made a commitment, but it could also be a time-to-market issue.
So what do you do? Mitigate this by having a budget and a plan to fix this at a future time.
This way you can better manage expectations and your chances of actually fixing what you put in are higher than if you take a “someone else will fix it” approach.
We’ve identified some common pitfalls and some of the reasons they may occur. Now let’s talk about four things you need to do or understand to avoid or mitigate these pitfalls.
Business Sponsorship – Make sure you have it. Someone needs to provide the vision or the blueprint.
Capture, Process, Store (ETL) – Think through the process. Build for flexibility. Keep it as simple as possible.
Rationalization – Identify key concepts and unify them across systems.
Reporting vs Analytics – Understand the difference. They often require different structures.
This is the most important thing you need to have in order to build trusted data solutions.
Why do you need a business sponsor? Well, ask yourself this: “Do you understand the business case?” How can you build a solution without knowing the problem you’re trying to solve?
A business sponsor should be able to answer the questions:
What are you building?
Who is funding it?
When do they need it and why?
How often do they need it?
Are they committed?
Capture – This is where you make sure you have the right information at the right time.
Can the information be a day old or does it need to be current as of five minutes ago? Do you need history? How far back? Can you realistically meet the expectations? If not, should the project continue?
Process – This is where you make sure the information is integrated. Keep it simple but build for flexibility. This is where it’s important to assign an enterprise-wide identifier to each concept. I’d recommend you create one even if you have a single system of record for a particular domain.
Store – At this stage, you need to make sure the information is available in the right format and fit for purpose – not all reporting/analysis methods use the same structures.
Reference Data
What do I mean by “Rationalizing Key Data?” We’re really talking about uniquely identifying reference data. Think of it this way: When you ask for information to make decisions, you’ll ask for something like “Show me sales for customer X by product, quarter, and sales person”. In this example, customer, product, quarter, and salesperson are all reference data. This information is captured as part of a sales transaction.
Transactions exist, or they don’t. You’re simply recording an event. But reference data is what ties it all together. If you don’t get this right, you won’t have true integrated reporting. Especially if you record sales across multiple systems. You need a way to uniquely identify reference information, such as customers, products, locations, and sales people, across all your systems if you want an accurate picture enterprise-wide.
Data Stewards
Who does this? Somebody has to be responsible for defining rules for rationalizing this data. This responsibility falls on the Data Stewards. Data stewards are the people who define which reference data is needed for a domain and in what format. They’ll define required key attributes such as contact methods and demographic information. They’ll also work with others to determine techniques for matching this information from different source systems.
So, who should fill the data steward role? It should generally be a business person close to, or responsible for, the entry point of the data. The person in this role should also be recognized as the authority on the reference data for their domain.
Once the matching techniques have been defined, much of this can be automated. However, this is not a one-time process and the matching techniques used should be reviewed periodically to validate effectiveness. Even with automation, not everything will be matched with a high level of confidence. Data stewards are responsible for managing the processes, and for the manual review of these edge cases.
Standardize before matching – Pick a technique for standardizing names. You can store the standardized name as additional information or have a reference to the standard form. You should do the same for addresses as well.
And remember, this is not a one-time process.
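"Standardize before matching" can be sketched in a few lines. The rule set below (lowercase, strip punctuation, collapse whitespace, expand a few abbreviations) is a hypothetical example of a standardization technique, not a complete one; real stewardship rules would be richer:

```python
# A hypothetical abbreviation table a data steward might maintain
ABBREVIATIONS = {"inc": "incorporated", "corp": "corporation"}

def standardize(name):
    # Lowercase, replace punctuation with spaces, collapse whitespace,
    # then expand known abbreviations word by word
    cleaned = "".join(ch if ch.isalnum() else " " for ch in name.lower())
    words = [ABBREVIATIONS.get(w, w) for w in cleaned.split()]
    return " ".join(words)

# Two source-system spellings now agree on their standardized form,
# so they can be matched to the same reference-data record
a = standardize("ACME Corp.")
b = standardize("Acme  corp")
```

Storing this standardized form alongside the original (or referencing a standard-form record, as the note suggests) is what makes automated matching across systems repeatable.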
It’s important to understand the difference between reporting and analytics because they require different approaches to managing data. Reporting is stating known facts; primarily against structured data. Analytics derives new facts from both structured and unstructured data and is usually more fluid. The message here is don’t be too rigid in how you store your data.
So, what are the key differences between reporting and analytics?
Descriptive vs. Prescriptive.
Backward-looking vs. Forward-looking
Raises questions vs. Answers questions
Turns data into information vs. Information into actionable insights
Consist of reports and dashboards vs. Findings, predictions, and recommendations
What is happening vs. Why is it happening
Problem statement
How it was resolved
Our recommendation
Flat file extracts. These aren’t a problem if your update frequency doesn’t need to be near real time. You want to hit the source system as little as possible. In the past we’ve gotten at least 10x better performance by uploading extract files to blob storage and running bulk inserts to load the data. Bulk inserts don’t overwrite existing data so they can be used for periodic refreshes.
Set a longer-term goal to build out a data warehouse. The effort going into building the data store may not be worth it if it’s just used as a convenient one-stop data source, as opposed to a source for an operational data store and eventually a data warehouse.
This builds on the idea of using reporting software on top of the data warehouse to build the finance report
Situation:
ABC Music Services collects royalties for song writers. They currently store their music catalog and performance data in a legacy mainframe database.
They would like to create a cloud-based portal to allow their writers to be able to see which songs in their catalogs play when, by which service, by geographic location. The music catalog does not change frequently. The performance information is captured daily from many different sources in many different formats.
How would you store the portal data to allow for fast processing by typical BI reporting tools?
Unstructured (big data solution)
Normalized tables in a relational database
Star Schema in a relational database
Call the Data Wrangler…
Recommendation:
Star schema in a relational database.
Why?
Relational database systems support star schemas well. The problem statement describes the need to query by writer, by song, by time, by which service, by geography. This is a natural fit for a star schema. Relational databases can also use techniques to optimize for fast reads.
Typical BI reporting tools support relational formats well. You could use Normalized tables, but normalization works best for transactional systems where you need fast inserts into tables that only allow valid data.
Many BI tools have connectors to big data platforms (such as Hadoop/Hive). My biggest concern here is performance. Hadoop is a platform originally designed for processing very large data files in batch mode.
The Data Wrangler would also be an excellent choice, but he is very busy and probably won’t be available…