What if, ten years from now, agencies replaced their strategic planners with natural-language-generation software? The question is admittedly still a bit of an extrapolation, but robotic report production is already a reality, as Quill demonstrates in the United States.
The document discusses how analytics have become more widely accessible but advanced analytics requiring sophisticated skills have remained out of reach for most employees. It introduces IBM Watson Analytics as a solution that simplifies analytics so more people can access insights without technical skills. Watson Analytics removes obstacles like data preparation, addresses the skills gap through an intuitive interface, and leverages the cloud to make powerful analytics available anywhere.
The document discusses 5 trends for 2018: 1) The continued growth of the Internet of Things. 2) Embedded analytics becoming more common and useful. 3) A shift to providing predictions rather than focusing on predictive analytics. 4) Continued development of artificial intelligence but ensuring it helps rather than replaces humans. 5) Increased monetization of data through new data products and services.
This document discusses 5 limitations of spreadsheets for data analysis and visualization and provides alternatives:
1. Spreadsheets can't handle large, diverse datasets from multiple sources like databases and data warehouses. Integrating and analyzing all relevant data is important for accurate insights.
2. Complex calculations and macros can slow down spreadsheets, wasting time. Connecting to live data sources allows fast analysis of large datasets.
3. Blending and cleaning data from different sources is difficult in spreadsheets. Joining datasets on common fields provides a unified view.
4. Spreadsheets offer limited basic charts but advanced visualizations like maps and dashboards provide faster, more intuitive understanding.
5. Interactive dashboards with up
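Point 3 above argues that joining datasets on common fields gives a unified view that is hard to maintain by hand in a spreadsheet. A minimal sketch of that idea using an in-memory SQL join — all table names and figures below are invented for illustration, not taken from the document:

```python
import sqlite3

# Two datasets that would typically live in separate sources:
# sales rows from a database export, regional targets from a spreadsheet.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.execute("CREATE TABLE targets (region TEXT, target REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("north", 80.0), ("south", 90.0)])
conn.executemany("INSERT INTO targets VALUES (?, ?)",
                 [("north", 150.0), ("south", 100.0)])

# Joining on the common 'region' field yields the unified view that is
# awkward to rebuild manually with spreadsheet lookups.
rows = conn.execute("""
    SELECT s.region, SUM(s.amount) AS total, t.target
    FROM sales s JOIN targets t ON s.region = t.region
    GROUP BY s.region
""").fetchall()
for region, total, target in rows:
    print(region, total, target)
```

The same join-on-a-common-field pattern applies whether the blend happens in a database, a BI tool, or a data-preparation layer.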
Mattel implemented a shallow-dive analytics approach to gain visibility into key metrics and drive a more data-driven supply chain culture. Employees were overwhelmed by large amounts of data, so Mattel focused on a select few critical metrics in real-time, such as on-time delivery rates. This allowed executives to quickly identify issues and take action. The shallow-dive approach helped Mattel steer its large, complex supply chain and reinforce strategic goals using data rather than feelings. It also engaged employees by giving them access to the same real-time metrics seen by executives.
This document discusses the importance of alignment in digital analytics. It emphasizes establishing roles like actors, influencers, and stakeholders who are aligned around key performance indicators. Actors need to receive the appropriate level and frequency of information to take action. Influencers are those whose actions impact an actor's success even if they are not directly accountable to each other. A stakeholder's success depends directly on the actor, and stakeholders set cascading KPIs. Overall, the document stresses the need for organizational alignment around roles and KPIs in order to take effective action based on analytics.
How Companies Turn Data Into Business Value (Jamie Hribal)
This document discusses how businesses can capture, combine, and turn data into actionable insights. It summarizes Umbric Data Services, a company that provides data solutions to help businesses harness data to improve strategies, operations, and revenue. The document outlines common misconceptions about big data, how to ask the right questions to examine customer value, and ways companies are using data analytics, including to find new customers, increase retention, improve service, manage marketing, and track social media.
The document discusses how algorithmic forecasting and artificial intelligence (AI) can enhance financial planning and analysis (FP&A) by providing better insights into future business performance. It explains that traditional forecasting approaches using Excel or reporting tools are limited and cannot comprehend millions of data points. New approaches using AI algorithms can identify patterns and correlations to help analysts. The document also gives examples of how predictive analytics firms are using machine learning to analyze large amounts of structured and unstructured data to gauge risks and probabilities of future events. Finally, it discusses how AI can provide a more nuanced look within an organization to understand how transactions and business drivers will affect future financial performance.
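To make the contrast concrete, here is what the "traditional" end of the spectrum amounts to: a closed-form least-squares trend fit, the kind of formula-driven forecast a spreadsheet produces. The figures are invented; the AI approaches the document describes use far richer models over millions of data points.

```python
# Fit a straight-line trend to a short revenue series with closed-form
# least squares and extrapolate one period ahead.
def linear_forecast(series):
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept + slope * n  # predicted value for the next period

revenue = [100.0, 110.0, 120.0, 130.0]  # perfectly linear toy data
print(linear_forecast(revenue))  # -> 140.0
```

A single trend line cannot capture interactions between business drivers; that gap is exactly where the document positions machine-learning approaches.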
The document discusses business performance management (BPM) and how analytics can help drive better performance. It summarizes a webcast with experts who discussed challenges with BPM and how organizations can improve. Some of the key issues addressed are ensuring high quality data, using advanced analytics beyond spreadsheets, identifying the most important metrics, and using multiple forecasting methods to improve accuracy. The experts encourage organizations to evolve their use of analytics to help strategy, decision making and performance.
Visualisation & Storytelling in Data Science & Analytics (Felipe Rego)
The document provides an overview of data visualization and storytelling in data science and analytics. It discusses key concepts like what data visualization is, compelling reasons to visualize data like Anscombe's Quartet, visualization in the context of analytics workflows, components of effective storytelling, considerations for presentation, guidelines for data storytelling, and examples of interesting data visualizations. Throughout the document, the author emphasizes best practices like keeping visualizations clear, addressing the intended audience, and avoiding bias.
Everyone's talking about big data – getting our arms around it and putting it to work for us. This paper summarizes a panel discussion at the 2012 SAS Financial Services Executive Summit where industry leaders shared their ideas about big data and what their organizations are doing with it. Aditya Bhasin from Bank of America talked about how to extract more value from the data you already have, even if it's just a fraction of what's out there. Robert Kirkpatrick, who leads the UN Global Pulse initiative, talked about how data can help us better understand global economies and human welfare. Charles Thomas, a market research and analytics executive at USAA, described how his company is navigating the shift to more real-time and predictive analysis. Request the full whitepaper at: http://www.sas.com/reg/wp/corp/50060?&utm_source=NAFCUServices&utm_medium=landingpage&utm_campaign=SASwhitepaper82912. More info at: www.nafcu.org/sas
1) The document discusses how digital intelligence powered by data and analytics can provide competitive advantages for organizations. It argues that to fully benefit, organizations need to be able to easily access, analyze, and act on both structured and unstructured data from various sources.
2) It describes how cognitive systems can understand data in new ways, allowing organizations to explore more types of information and generate new insights. This helps organizations overcome limitations that currently prevent them from utilizing much of their available data.
3) Empowering various roles across an organization to access and analyze data independently can accelerate innovation and improve business outcomes. Developers, data scientists, business professionals, CIOs, and others need tools and technologies that make the most of
The document discusses how modern data analytics are transforming business. It introduces the topic and explains that data is doubling every two years and analytics are becoming more valuable. The rest of the document is organized into five sections that will discuss topics like how analytics are changing business models, new technology platforms, industry examples, research, and marketing. The introduction of each section provides a brief overview of what essays in that section will cover. The overall goal is to provide insights from different perspectives on how analytics are rapidly evolving and playing an increasingly important role.
Algorithms and the technology of personalisation (Colin Strong)
This presentation was created for a Google working group meeting which explores the nature of the relationship between the consumer and the Internet. Presented at Google Offices, Berlin June 5th 2015.
Prediction - the future of game analytics - white paper (June Lee)
This document discusses predictive analytics and its applications in game analytics. It begins by explaining that predictive analytics uses patterns found in historical player data and game logs to predict future player behavior. While not 100% accurate, predictive models can provide accuracy rates from 50-90% depending on the quality of the data. The document provides examples of how predictive analytics can be used to predict player churn rates and lifetime values in order to target retention promotions more effectively. It emphasizes interpreting predictive values based on both the predicted likelihood and the model's accuracy rating.
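The closing point — interpreting a prediction in light of both the predicted likelihood and the model's accuracy rating — can be sketched as an expected-value targeting rule. The blending scheme and every number below are my own illustrative assumptions, not taken from the white paper:

```python
# Hypothetical targeting rule: discount the model's churn prediction by its
# accuracy rating, then send a retention offer only when the expected saved
# lifetime value exceeds the cost of the promotion.
def should_target(churn_prob, model_accuracy, lifetime_value, promo_cost):
    # Shrink the prediction toward a 50/50 prior as accuracy drops: a
    # 90%-accurate model is trusted far more than a coin-flip model.
    adjusted = model_accuracy * churn_prob + (1 - model_accuracy) * 0.5
    expected_saved_value = adjusted * lifetime_value
    return expected_saved_value > promo_cost

# High-value player flagged by a reasonably accurate model: worth targeting.
print(should_target(0.8, 0.9, lifetime_value=50.0, promo_cost=10.0))  # True
# Same prediction from a 50%-accurate model on a low-value player: not worth it.
print(should_target(0.8, 0.5, lifetime_value=12.0, promo_cost=10.0))  # False
```

The point of the sketch is that the same 80% churn score leads to opposite decisions once model accuracy and player value are priced in.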
The presentation addresses the claim that data science is "the sexiest job of the 21st century". Its main theme is the challenges the industry faces and how to overcome them.
This document summarizes a presentation about using data-driven marketing approaches. It discusses trends like treating customers like royalty through personalized experiences, using big data and predictive analytics to gain insights about customers. It also covers challenges of data silos and lack of contextual data. The presentation advocates for using multi-dimensional customer data management, predictive analytics, streaming analytics and bi-directional digital platforms to better understand and interact with customers in real-time.
The document discusses how organizations during the Dot Com era either succeeded or failed based on their ability to turn data into actionable business intelligence and performance management. It emphasizes that data alone is worthless, but data transformed into knowledge through performance management reports and metrics can guide strategic decision making. The document provides tips for organizations to harness historical performance data and metrics to create predictive exception-based reports, balance short and long-term goals, align individual and company objectives, and leverage technology to empower employees with relevant data and knowledge.
Big Data & Analytics Trends 2016 (Vin Malhotra)
This document discusses several trends in analytics for 2016:
1. Data security is a major concern as data volumes grow exponentially and security risks increase. Analytics can help secure data but requires integration across innovation, analytics, connectivity and technology.
2. The Internet of Things generates massive sensor data that requires new analytics to extract value, though challenges remain in integrating sensor and structured data in real time.
3. Open source analytics solutions like Hadoop are increasingly used by enterprises but also require careful risk management and a clear strategy to ensure they align with technology needs.
Staying up to date is not enough for companies today. Organizations must also constantly watch trends in order to predict and forecast the next steps for their business. The following document is an executive summary of the current situation and of the most notable trends, covering the basics of the analytics market.
Big Data In Small Steps is a document that addresses common questions about big data including what it is, how it can provide value, and how to implement it. It defines big data using the four V's of volume, velocity, variety and veracity. It provides examples of how insurance and telecom companies can use big data for customer loyalty, risk management, claims processing, segmentation, capacity planning and promotional optimization. The document recommends establishing a center of excellence and identifies the key roles needed including an executive leader, project manager, data technologist, data scientist and data analyst. It advocates prototyping solutions and developing repeatable processes for extracting value from big data.
Data is becoming an engine for many businesses in the information age, and every company needs to consider how this applies to its own business model.
This is an introductory guest lecture for students at the Stockholm School of Entrepreneurship.
Analytics in Financial Services: Keynote Presentation for TDWI and NY Tech Co... (Fitzgerald Analytics, Inc.)
Keynote presentation given in New York City on March 30th at a joint event of The Data Warehousing Institute (TDWI) and the New York Technology Council. This keynote by Jaime Fitzgerald focused on "Bridging the Gap" between business goals and the data and analytic enablers for achieving those goals.
This article discusses how community institutions can benefit from using data analytics to make better informed decisions. It provides examples of how institutions like Orrstown Bank have used metrics and customer data to improve processes and customer experiences. The article advocates that institutions should start small with basic analytics and dashboards using existing internal data resources to gain insights before pursuing more sophisticated techniques.
Big Data Management For Dummies, Informatica (Fiona Lew)
This document is the introduction chapter of the book "Big Data Management For Dummies, Informatica Special Edition". It provides an overview of the book and its purpose. The book aims to provide a solution to struggling big data projects through the concept of big data management. Big data management is based on three pillars - integration, governance, and security - which provide processes and technologies to ensure data is clean, governed, and secure in order to discover insights and deliver business value from big data projects.
Big data refers to the large amounts of data collected from various sources like cell phones, credit cards, and the internet. As big data continues to grow exponentially, accountants must learn new skills to analyze and manage all this data. Traditional accounting methods like Excel spreadsheets are no longer sufficient. Accountants will need to form new partnerships and collaborate across departments to extract useful insights from big data. While big data presents many challenges, it also provides opportunities to improve business decisions, target customers more effectively, and even combat fraud through advanced data analysis. The role of accountants is evolving to be more strategic and analytic as businesses rely more heavily on big data.
This document discusses how to build a culture of measurement within an organization. It provides examples of companies like Cabela's and Barclaycard that have successfully cultivated measurement cultures. The key benefits are using data to increase revenue, lower costs, and improve customer satisfaction. However, changing an organization's culture presents challenges. The document recommends understanding current cultural roots, deploying measurement strategies in a phased approach, and communicating the value of measurement through stories.
Mastering privacy in the digital age (CHEMISTRY AGENCY)
The more our lives go digital, the more our personal data circulates. Consumers are gradually becoming aware of this, and their concerns are growing in the face of multiplying privacy scandals on the internet. Brands must therefore change their approach to big data in order to (re)build a genuine relationship of trust.
The document presents a talk on the Internet of Things using Azure as a backend. It describes the IoT context and introduces the Netduino and Arduino devices, along with the MQTT and RSMB protocols for communication between devices. It shows how to use Azure to host data and backend services for IoT applications.
If data is the new oil, then interfaces are the new delivery means -- Ignite ... (3scale)
"If data is the new oil, then interfaces are the new delivery means"
These are the slides of Manfred's Ignite talk at the O'Reilly Strata+Hadoop conference in Barcelona. It was about getting the interfaces right as an enabler of the benefits of the Second Machine Age.
A recording is included, and is also available on YouTube:
http://youtu.be/00eNinS50PU
A write-up will follow and be posted on the 3scale blog:
http://www.3scale.net/blog/
This document discusses the importance of load testing APIs to avoid performance issues like Twitter's "Fail Whale" outage. It recommends understanding an API's expected throughput, peak usage, and how traffic is distributed. Various load testing tools are compared and examples are given of using Loader.io and wrk to establish a baseline and maximum throughput. It also provides tips for identifying and addressing bottlenecks, such as optimizing available connections and using an API gateway. The goal of load testing is to refine an API until it meets performance targets.
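The baseline-throughput idea described above can be illustrated with a small, self-contained sketch. The document's own examples use Loader.io and wrk; the version below is a stdlib-only stand-in that spins up a stub API on localhost and measures requests per second (the endpoint, worker count, and request count are hypothetical placeholders):

```python
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class StubAPI(BaseHTTPRequestHandler):
    """A stand-in for the API under test: always answers 200 OK."""
    def do_GET(self):
        body = b'{"status": "ok"}'
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# start the stub server on a free local port
server = ThreadingHTTPServer(("127.0.0.1", 0), StubAPI)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def hit(_):
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        return resp.status

# fire 100 requests from 10 concurrent workers and measure throughput
n = 100
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    statuses = list(pool.map(hit, range(n)))
elapsed = time.perf_counter() - start

print(f"{n} requests in {elapsed:.2f}s -> {n / elapsed:.0f} req/s")
server.shutdown()
```

In a real load test the stub would be replaced by the deployed API endpoint, and a dedicated tool such as wrk would drive far higher concurrency than a thread pool can.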
iBANK is a platform that provides intelligent banking solutions through open banking initiatives, cognitive intelligence, banking bots, and other services. It aims to pioneer banking automation through compliance and security. The company's team of experts in areas like blockchain, AI, and analytics provide professional and customer support, as well as consulting services to help banks transform. iBANK has experienced rapid growth, with turnover increasing from £100k to £10 million since 2016, and has offices in London, New York, and Bangalore serving many partners in the banking industry.
Fiorano ESB: Integration Solution for Banks (Ashraf Imran)
Fiorano provides an integration solution called the Fiorano ESB for banks to integrate their applications and systems. It has over 520 deployments worldwide and offers adapters and tools to integrate core banking, payments, and other systems. The Fiorano ESB provides a centralized, peer-to-peer architecture to reduce integration costs and complexity compared to point-to-point integration. It has pre-built components for databases, files, messaging and other integrations.
This document summarizes a presentation by Jason Bloomberg on integrating microservices in the cloud. Jason Bloomberg is the president of Intellyx and advises companies on digital transformation initiatives. The presentation discusses what microservices are, how to design them for qualities like granularity, parsimony and cohesion. It also covers challenges of integrating microservices in the cloud, including managing state across ephemeral instances and asynchronous interactions.
The document describes the DataBearings platform for semantic data integration. DataBearings uses semantic technologies to integrate heterogeneous data sources on-the-fly without loading data into a warehouse first. This allows for live access to data, integration of IoT data sources, and cost savings by leveraging existing data. DataBearings provides a lightweight solution for dynamic data integration through semantic annotations, reusable components, and a semantic agent programming language to define integration logic and automations.
This deck presents some basic concepts of IoT and some more advanced concepts, reviews the current market players and future of IoT as well as the key ingredients and architecture for success.
APIs for your Business + Stages of the API Lifecycle (3scale)
The document discusses APIs and the API lifecycle. It describes four stages of the API lifecycle: plan/design, build/integrate, operate/manage, and share/engage. Each stage is important to consider when developing APIs. The document also provides examples of how companies have benefited from APIs by creating new revenue sources, increasing reach, fostering innovation, and improving efficiency. It emphasizes the importance of managing APIs after they are built to secure access and monitor usage and traffic.
Integrating, exposing and managing distributed data with RESTful APIs and op... (3scale)
This was a 1h demo and talk co-presented by Red Hat's Cojan van Ballegooijen and 3scale's Manfred Bortenschlager talking about data integration of various diverse sources via data virtualization. Then we exposed the data via RESTful APIs and added the 3scale API Management layer on top to get full control and visibility about API access.
This presentation is from the Integration Monday session organized by Integration User Group held on September 19, 2016. In this presentation, Microsoft Integration Consultant Eldert Grootenboer gives an introduction on "Integration of Things". In this session, Eldert will show how you can set up integration by integrating your IoT devices and process, store and analyze the data in real time.
These are the slides of our talk and demo at the PAPIs.io conference on Predictive APIs and Apps in Barcelona, November 17/18.
In the demo we showed integration of various Web APIs: Bicing, Google Maps, and BigML. We customised API requests and responses exactly to our needs.
That's the power of APItools.com
A video of the live demo will be added soon. In the meantime you can find screenshots at the end of the slide deck.
For more info get in touch:
hello@apitools.com
(And check out the APItools Middleware contest.)
API Management Workshop (at Startupbootcamp Berlin), by 3scale
These are the slides from the API Management Workshop, held at the Startupbootcamp Berlin on October 17.
We covered benefits of APIs for an organisation (regardless of size, sector, stage or purpose) and gave examples of successful deployment of APIs.
We then described the typical API lifecycle:
plan/design > build/integrate > operate/manage > share/engage.
We covered many best practices and tools for each stage and gave practical demos about how to secure and manage APIs.
This document presents a maturity model for artificial intelligence adoption in enterprises. It outlines four stages of maturity: exploring, experimenting, formalizing, and integrating. It also discusses four macro trends affecting AI success: the shift from screen-based to sensory interactions; from rules-based to probabilistic decision making; from data analytics to data engineering; and from expertise-driven to data-driven leadership. Key aspects of maturity include having a data strategy, using AI in product development, establishing ethics principles, and integrating AI throughout the organization.
This document discusses the importance of data fluency skills in the 21st century. It defines key terms like data science, machine learning, data literacy, and statistical literacy. While these fields require extensive training, the document argues that domain expertise combined with basic data analysis skills can solve many problems. These basic skills include understanding data structures, using programming to interact with data, and exploratory data analysis through visualization. The data analysis process involves defining problems, collecting and preparing data, visualization and modeling, and communicating results. RStudio is presented as a tool that can support the entire data analysis process within a single integrated development environment.
A powerful data-driven narrative opens up new perspectives and concepts within the minds of those who read it by strategically utilizing narrative, data analysis, data visualization, and storytelling techniques.
This newsletter provides an overview of analytics and highlights some ways organizations are using analytics. It discusses how 140 million customer interactions can be analyzed to understand customers and how analytics is beginning to be used beyond basic reporting. Examples are given of analytics being used for customer segmentation, risk mitigation, and reducing transportation costs. Predictive analytics and big data are also discussed.
Converting Big Data To Smart Data | The Step-By-Step Guide! (Kavika Roy)
1. The document discusses how to convert big data into smart data through machine learning and artificial intelligence techniques. It involves filtering big data through criteria like timeframes and media channels to create more focused data streams.
2. Analytics are then used to derive insights from the filtered data by identifying themes, influential actors, emotions, and other patterns. This process of filtering and analyzing turns large amounts of raw data into actionable business intelligence.
3. The final stage is integrating smart data with other internal and external data sources through APIs and data sharing to develop a comprehensive view of customers and business operations. This full conversion process extracts strategic lessons from big data to guide decision-making.
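The filter-then-analyze stages described above can be sketched in a few lines of Python (the records, channels, and keyword heuristic below are hypothetical placeholders, not taken from the source document):

```python
from datetime import date

# hypothetical raw "big data" records: (date, channel, text)
records = [
    (date(2024, 1, 5), "twitter", "love the new product"),
    (date(2023, 6, 1), "email", "old billing complaint"),
    (date(2024, 2, 9), "twitter", "shipping was slow"),
    (date(2024, 3, 2), "blog", "great support team"),
]

# step 1: filter big data by timeframe and media channel into a focused stream
focused = [r for r in records
           if r[0] >= date(2024, 1, 1) and r[1] == "twitter"]

# step 2: derive a simple signal (here, a crude keyword-based emotion tag)
POSITIVE = {"love", "great"}
tags = ["positive" if POSITIVE & set(text.split()) else "other"
        for _, _, text in focused]

print(list(zip([t for _, _, t in focused], tags)))
```

The third stage, integrating the resulting "smart data" with other sources, would in practice happen over APIs rather than in-process lists; the sketch only covers the filtering and analysis steps.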
Business Analytics Lesson Of The Day August 2012 (Pozzolini)
Business analytics involves collecting and analyzing large amounts of data to help companies make better business decisions. While data analysis has been used in business for over a century, it is only recently that companies have had the capabilities to analyze huge volumes of data in real-time and make predictive decisions. However, many companies still struggle with issues like poor quality data that can lead to inaccurate analyses. To successfully implement business analytics, companies need to focus on developing skills, ensuring accurate data, and having the right technologies to capture and make sense of their data.
Difference Between Data Analytics, Data Analysis, Data Mining, Data Science, Machine Learning, and Big Data
The most popular and rapidly evolving technologies in the world are Data Analytics, Data Analysis, Data Mining, Data Science, Machine Learning, and Big Data. All firms, large and small, are increasingly looking for IT experts who can filter through the data and help with the efficient implementation of sound business decisions. In light of the current competitive environment, Data Analytics, Data Analysis, Data Mining, Data Science, Machine Learning, and Big Data are essential technologies that drive company growth and development. In this topic, “Difference Between Data Analytics, Data Analysis, Data Mining, Data Science, Machine Learning, And Big Data,” we will examine the key definitions and skills needed to obtain them. We will also examine the main differences between Data Analytics, Data Analysis, Data Mining, Data Science, Machine Learning, and Big Data. So let’s start by briefly introducing each concept.
Data Analysis vs Data Analytics
Data Analysis is the process of examining, organizing, and manipulating a collection of data to extract relevant information. An "analytics platform" is a piece of software that enables data and statistics to be generated and examined systematically, whereas a "business analyst" is a person who applies an analytical method to a collection of information for a specific goal. As the practice becomes increasingly popular, the corporate sector has started to adopt it broadly. Data analysis makes data easier to understand and provides important historical context for what has occurred in the recent past.
Data Analytics encompasses both decision-making processes and performance enhancement through relevant forecasts. Businesses may use data analytics to improve business decisions, evaluate market trends, and analyze customer satisfaction, all of which can lead to the creation of new and enhanced products and services. By examining past data, data analytics makes it possible to produce more accurate forecasts for the future.
Data Analytics vs. Data Analysis:
- Data Analytics is used to draw conclusions based on data; Data Analysis is the subset of data analytics used to examine data and derive specific insights from it.
- Using historical data and customer expectations, data analytics helps businesses develop a solid business strategy; making the most of historical data through analysis helps organizations identify new possibilities, promote business growth, and make more effective decisions.
- The term "data analytics" refers to the collection and assessment of data involving one or more users.
Augmented analytics essentially takes over all but the first and last steps of the general BI workflow and delivers increasingly relevant business insights.
Understanding Data Science: Unveiling the Basics
What is Data Science?
Data science is an interdisciplinary field that combines techniques from statistics, mathematics, computer science, and domain knowledge to extract insights and knowledge from data. It involves collecting, processing, analyzing, and interpreting large and complex datasets to solve real-world problems.
Importance of Data Science
In today's data-driven world, organizations are inundated with data from various sources. Data science allows them to convert this raw data into actionable insights, enabling informed decision-making, improved efficiency, and innovation.
Intersection of Data Science, Statistics, and Computer Science
Data science borrows heavily from statistics and computer science. Statistical methods help in understanding data patterns, while computer science provides the tools to process and analyze large datasets efficiently.
Key Components of Data Science
Data Collection and Storage
The first step in data science is gathering relevant data from various sources. This data is then stored in databases or data warehouses for further processing.
Data Cleaning and Preprocessing
Raw data is often messy and inconsistent. Data cleaning involves removing errors, duplicates, and irrelevant information. Preprocessing includes transforming data into a usable format.
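A minimal sketch of this cleaning step in Python, assuming hypothetical survey records that contain duplicates and invalid ages:

```python
# hypothetical raw survey records: (name, age) with duplicates and bad values
raw = [("Ana", "34"), ("Bob", "n/a"), ("Ana", "34"), ("Cruz", "29"), ("Dee", "-5")]

cleaned = []
seen = set()
for name, age in raw:
    if (name, age) in seen:      # drop exact duplicate records
        continue
    seen.add((name, age))
    try:
        age_num = int(age)       # drop non-numeric ages ("n/a")
    except ValueError:
        continue
    if age_num < 0:              # drop impossible values (negative age)
        continue
    cleaned.append((name, age_num))

print(cleaned)  # -> [('Ana', 34), ('Cruz', 29)]
```

Real pipelines apply the same ideas (deduplication, type coercion, range checks) with dedicated tooling, but the logic is the same as this sketch.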
Exploratory Data Analysis (EDA)
EDA involves visualizing and summarizing data to uncover patterns, trends, and anomalies. It helps in forming hypotheses and guiding further analysis.
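For instance, a quick exploratory pass over a hypothetical series of daily sales figures might compute summary statistics and flag anomalies:

```python
import statistics

# hypothetical daily sales figures; one day (90) looks anomalous
sales = [12, 15, 14, 30, 13, 16, 15, 90, 14, 17]

mean = statistics.mean(sales)      # 23.6
median = statistics.median(sales)  # 15.0
spread = statistics.stdev(sales)

# flag values more than two standard deviations from the mean
outliers = [x for x in sales if abs(x - mean) > 2 * spread]
print(f"mean={mean}, median={median}, outliers={outliers}")
```

The gap between the mean and the median already hints at skew; the outlier check then pinpoints the anomalous day, which is exactly the kind of hypothesis-forming EDA is meant to support.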
Machine Learning and Predictive Modeling
Machine learning algorithms are used to build predictive models from data. These models can make predictions and decisions based on new, unseen data.
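As a toy illustration of fitting a predictive model (the data points below are made up), ordinary least squares regression can be written in a few lines of plain Python:

```python
# toy training data, roughly following y = 2x
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.1, 9.9]

# fit y = slope * x + intercept by ordinary least squares
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

def predict(x):
    """Predict y for a new, unseen x using the fitted model."""
    return slope * x + intercept

print(f"slope={slope:.2f}, intercept={intercept:.2f}, predict(6)={predict(6):.1f}")
```

Libraries such as scikit-learn wrap this pattern (fit on training data, then predict on new data) behind a uniform interface, but the underlying idea is the same.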
Data Visualization
Visual representations of data, such as graphs and charts, help in understanding complex information quickly. Data visualization aids in conveying insights effectively.
The Data Science Process
Problem Definition
The data science process begins with understanding the problem you want to solve and defining clear objectives.
Data Collection and Understanding
Collect relevant data and understand its context. This step is crucial as the quality of the analysis depends on the quality of the data.
Data Preparation
Clean, preprocess, and transform the data into a suitable format for analysis. This step ensures that the data is accurate and ready for modeling.
Model Building
Select appropriate algorithms and build predictive models using machine learning techniques. This step involves training and fine-tuning the models.
Model Evaluation and Deployment
Evaluate the model's performance using metrics and test datasets. If the model performs well, deploy it for making predictions on new data.
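Evaluation metrics like these are computed directly from a model's predictions on a held-out test set; in this made-up example the labels and predictions are hypothetical:

```python
# hypothetical labels from a held-out test set and a model's predictions
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # true negatives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(f"accuracy={accuracy}, precision={precision}, recall={recall}")
```

Only when such metrics look acceptable on data the model has never seen is it deployed to make predictions on new data.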
Technologies Driving Data Science
Programming Languages
Languages like Python and R are widely used in data science due to their extensive libraries and versatility.
Machine Learning Libraries
Libraries like Scikit-Learn and TensorFlow provide ready-made implementations of machine learning and deep learning algorithms.
Data visualization refers to visually representing data through charts, graphs, and other images to more easily identify patterns and insights. It is an important tool for understanding data, communicating findings to others, and making informed decisions. Effective data visualization requires choosing the right type of visual based on the data, ensuring the data is accurate and from a reliable source, and using the visualization to tell a story or answer key questions. There are many tools available for creating data visualizations, from Excel and Google Sheets for basic charts to more advanced options like Tableau and Photoshop.
Data analytics for the mid-market: myth vs. reality (Deloitte Canada)
This document discusses 5 myths that prevent mid-market companies from making smarter decisions using data analytics. The myths are that they are not big enough to benefit, they just need more data, it is IT's responsibility, they are not equipped for it, and it won't provide new insights. The realities are that analytics levels the playing field, visualization tools make existing data more useful, analytics requires business leadership partnering with IT, starting small using cloud solutions is possible, and having a vision and strategy is key to realizing value from analytics.
The Value of Signal (and the Cost of Noise): The New Economics of Meaning-Making (Cognizant)
It’s a new era in business, in which growth will be driven by finding meaning and insights in data. Recent research demonstrates what separates winners from losers and how to rise to the top as a "meaning maker."
The document provides an overview of data science. It defines data science as a field that encompasses data analysis, predictive analytics, data mining, business intelligence, machine learning, and deep learning. It explains that data science uses both traditional structured data stored in databases as well as big data from various sources. The document also describes how data scientists preprocess and analyze data to gain insights into past behaviors using business intelligence and then make predictions about future behaviors.
Scott Thomson, Darren Drew: Getting data fit (betterbigdata)
Getting data-fit requires selecting and consuming the right kinds of data, applying it where it matters most by linking it to creative workstreams, and recognizing organizational barriers. Brands need to collect smart, clean, and purposeful data that is properly vetted and linked to objectives. Data should inform what brands communicate and the experiences they design by feeding into expectations and experience management. However, most attempts to leverage data fail due to organizational barriers like legacy systems, data silos, lack of skills, and not celebrating successes. Overcoming these barriers requires investment, building cross-functional relationships, internal training, focusing on easy wins, and learning from others.
The document is a white paper on trends in financial planning and analysis (FP&A) authored by Tanbir Jasimuddin and Larysa Melnychuk. It discusses three key elements of effective FP&A storytelling: building a narrative to guide audiences through data, visualizing data to communicate insights, and leveraging digital tools for analytical transformation. It provides examples of using frameworks like Rappaport's value drivers and analytical pathways to structure narratives. It also emphasizes using data visualization over static tables and exploring data in real-time to answer questions.
The document discusses several key trends in analytics for 2015:
1. Data security is a major concern as data volumes grow exponentially, requiring companies to quadruple down on security efforts through innovation, analytics, and tighter integration.
2. The rise of the Internet of Things generates massive sensor data that requires new analytics to extract value, though challenges remain in integrating these systems.
3. Some argue that data should be monetized as an asset, but this brings risks around privacy, ethics, and real costs that companies need to consider carefully.
4. Cognitive analytics is enhancing decision-making by providing users with vast new sources of knowledge, though questions remain about how these systems will impact human roles over time.
This document summarizes an article that is critical of the term "Big Data" and argues that it is primarily a marketing term used by business intelligence vendors. Some key points:
1) The author argues that "Big Data" is just the latest marketing campaign by BI vendors and does not represent a meaningful change, as data has always been large and growing exponentially.
2) While vendors tout new sources of data and increased volumes, the author claims this is just "more of the same" and does not require fundamentally new approaches. Greater data does not necessarily lead to better insights or decisions.
3) Quotes and claims by vendors about the potential value and benefits of "Big Data" are exaggerated and
The Analytics Stack Guidebook (Holistics), by Truong Bomi
Chapter 1: High-level Overview of an Analytics Setup
Chapter 2: Centralizing Data
Chapter 3: Data Modeling for Analytics
Chapter 4: Using Data
+++
Quoting Huy, the book's author and co-founder & CTO of Holistics:
+++
"How do you design a BI stack that fits your own company?"
Have you ever been tasked with setting up your company's BI/analytics stack, only to panic once you went online because every article and every acquaintance recommended a different set of tools and technologies? ETL or ELT, Hadoop or BigQuery, Data Warehouse or Data Lake, ...
Then you start wondering: what kind of analytics stack actually fits your company's current needs? How do you start fast yet still scale (without tearing it all down and rebuilding) when data demands grow?
Instead of ten people giving ten opinions, you wish you had a map to orient yourself in this complex BI/analytics world. A map that shows the components of a BI system, how they fit together, and the trade-offs between the different approaches.
Well, after two grueling months, our team has drawn that map, in the shape of a... book:
"The Analytics Setup Guidebook: How to build scalable analytics & BI stacks in modern cloud era."
The book is a crash course that turns you into a "part-time data architect," helping you understand today's complex analytics landscape more clearly.
It explains the high-level overview of an analytics system, how the components interact with one another, and goes into enough detail on each component and its best practices.
The book is written for somewhat technical readers who have been put in charge of their company's analytics system. You might be a data analyst doing BI, a software engineer pulled in to help with data engineering, or simply a Product Manager wondering why your company's data processes are so slow...
It also includes more advanced material, such as Data Modeling and BI evolution, suited to readers with long experience in BI.
Unless you belong to the electrosensitive part of the population, we rarely pay attention to them, and yet we are almost constantly surrounded by electromagnetic fields. We are bathing in an ocean of waves!
This document is an affordability index report produced by YouthfulCities that ranks 25 major cities based on their affordability for youth. It finds that Paris has the most affordable cost of living relative to minimum wage, followed by Toronto, Los Angeles, Chicago, and Berlin. The report considers costs of items like food, housing, transportation, and entertainment in each city. It also profiles initiatives that different cities are taking to increase affordability, such as discounted public transit for youth, subsidized housing, and free university tuition. The report concludes by acknowledging challenges in comparing costs and incomes across different cities and outlines the methodology used to calculate the affordability rankings.
The Inevitability of a Mobile Only Customer Experience (Eric Espinosa)
My precious... some advertisers will say. Long perceived as the second screen, mobile is elbowing its way forward and rapidly seizing first-screen status among the majority of connected consumers. This takeover requires brands to adapt their customer experiences. A new report from the Altimeter Group shows the way.
Responsible consumption is almost a given! But while everyone is increasingly committed to it, each person acts according to their own concerns or motivations, even as the power of collective action grows ever stronger. A paradox that reflects civil society, decoded by "Un pour un, Un pour tous" ("One for one, one for all"), the consumer typology study conducted by Ethicity.
This document describes a study that used both explicit and implicit questioning methods to gain a deeper understanding of consumer values, attitudes, and brands. The study found that consciously reported values and brand preferences often differed significantly from unconscious motivations. This suggests consumers have conflicting inner drives and a more complex psyche than traditional segmentation allows. The study also identified a new consumer group called "Generation World" that defies traditional demographics and values individuality, fluid identity, and empowerment through technology. This group feels marketers do not understand their multidimensional nature. The findings imply a need for marketing that speaks to universal tensions rather than singular concepts.
Codeless Generative AI Pipelines
(GenAI with Milvus)
https://ml.dssconf.pl/user.html#!/lecture/DSSML24-041a/rate
Discover the potential of real-time streaming in the context of GenAI as we delve into the intricacies of Apache NiFi and its capabilities. Learn how this tool can significantly simplify the data engineering workflow for GenAI applications, allowing you to focus on the creative aspects rather than the technical complexities. I will guide you through practical examples and use cases, showing the impact of automation on prompt building. From data ingestion to transformation and delivery, witness how Apache NiFi streamlines the entire pipeline, ensuring a smooth and hassle-free experience.
Timothy Spann
https://www.youtube.com/@FLaNK-Stack
https://medium.com/@tspann
https://www.datainmotion.dev/
Predictably Improve Your B2B Tech Company's Performance by Leveraging Data (Kiwi Creative)
Harness the power of AI-backed reports, benchmarking and data analysis to predict trends and detect anomalies in your marketing efforts.
Peter Caputa, CEO at Databox, reveals how you can discover the strategies and tools to increase your growth rate (and margins!).
From metrics to track to data habits to pick up, enhance your reporting for powerful insights to improve your B2B tech company's marketing.
- - -
This is the webinar recording from the June 2024 HubSpot User Group (HUG) for B2B Technology USA.
Watch the video recording at https://youtu.be/5vjwGfPN9lw
Sign up for future HUG events at https://events.hubspot.com/b2b-technology-usa/
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working with unstructured data. Speakers present on related topics such as vector databases, LLMs, and managing data at scale. The intended audience includes machine learning engineers, data scientists, data engineers, software engineers, and PMs. This meetup was formerly the Milvus Meetup, and is sponsored by Zilliz, maintainers of Milvus.
Global Situational Awareness of A.I. and where it's headed (vikram sood)
You can see the future first in San Francisco.
Over the past year, the talk of the town has shifted from $10 billion compute clusters to $100 billion clusters to trillion-dollar clusters. Every six months another zero is added to the boardroom plans. Behind the scenes, there’s a fierce scramble to secure every power contract still available for the rest of the decade, every voltage transformer that can possibly be procured. American big business is gearing up to pour trillions of dollars into a long-unseen mobilization of American industrial might. By the end of the decade, American electricity production will have grown tens of percent; from the shale fields of Pennsylvania to the solar farms of Nevada, hundreds of millions of GPUs will hum.
The AGI race has begun. We are building machines that can think and reason. By 2025/26, these machines will outpace college graduates. By the end of the decade, they will be smarter than you or I; we will have superintelligence, in the true sense of the word. Along the way, national security forces not seen in half a century will be unleashed, and before long, The Project will be on. If we’re lucky, we’ll be in an all-out race with the CCP; if we’re unlucky, an all-out war.
Everyone is now talking about AI, but few have the faintest glimmer of what is about to hit them. Nvidia analysts still think 2024 might be close to the peak. Mainstream pundits are stuck on the wilful blindness of “it’s just predicting the next word”. They see only hype and business-as-usual; at most they entertain another internet-scale technological change.
Before long, the world will wake up. But right now, there are perhaps a few hundred people, most of them in San Francisco and the AI labs, that have situational awareness. Through whatever peculiar forces of fate, I have found myself amongst them. A few years ago, these people were derided as crazy—but they trusted the trendlines, which allowed them to correctly predict the AI advances of the past few years. Whether these people are also right about the next few years remains to be seen. But these are very smart people—the smartest people I have ever met—and they are the ones building this technology. Perhaps they will be an odd footnote in history, or perhaps they will go down in history like Szilard and Oppenheimer and Teller. If they are seeing the future even close to correctly, we are in for a wild ride.
Let me tell you what we see.
The Ipsos AI Monitor 2024 Report (Social Samosa)
According to Ipsos AI Monitor's 2024 report, 65% of Indians said that products and services using AI have profoundly changed their daily life in the past 3-5 years.
State of Artificial Intelligence Report 2023 (kuntobimo2016)
Artificial intelligence (AI) is a multidisciplinary field of science and engineering whose goal is to create intelligent machines.
We believe that AI will be a force multiplier on technological progress in our increasingly digital, data-driven world. This is because everything around us today, ranging from culture to consumer products, is a product of intelligence.
The State of AI Report is now in its sixth year. Consider this report as a compilation of the most interesting things we’ve seen with a goal of triggering an informed conversation about the state of AI and its implication for the future.
We consider the following key dimensions in our report:
Research: Technology breakthroughs and their capabilities.
Industry: Areas of commercial application for AI and its business impact.
Politics: Regulation of AI, its economic implications and the evolving geopolitics of AI.
Safety: Identifying and mitigating catastrophic risks that highly-capable future AI systems could pose to us.
Predictions: What we believe will happen in the next 12 months and a 2022 performance review to keep us honest.
Narrative Analytics | A Narrative Science Whitepaper
Big Data Excitement and Frustration
The point of computers was always to make us smarter. A core mission of computer science is to get us the information we need to make important decisions and make us smarter about how we make them. This goal of helping us, and making us smarter, is the driver that sustains work in Artificial Intelligence and is particularly important in the growing world of Big Data.
The rise of Big Data has been an interesting combination of excitement and frustration. Excitement because everyone knows that there is tremendous value to be found in the mass of data that is flowing through the world around us. Frustration because the enormous investment – hundreds of billions of dollars globally – has resulted in shockingly few success stories.
This result is really not surprising. The push toward Big Data began with the realization that we had been both passively and actively amassing data, and we wanted to do something with it. This, in turn, led to a massive investment in large-scale data analytics and machine learning to mine the value that we knew was there. The expansion of Business Intelligence and visualization tools has been driven by the need to glean something from the data in which we find ourselves immersed. And, most importantly, the work has not only proceeded in a "bottom up" approach, but has also been focused on "data as data" rather than the role it should play in decision-making. People took their eye off the prize – to get insight – and focused instead on the gathering and management of data as the end goal.
But there is a solution. We can turn the machines that hold onto all of our data into something useful and meaningful.
We just need to teach them how to talk to us in a way that we understand. We can transform them into systems that
make us smarter.
AI That Works for You
Progress with Big Data has always been reliant on a human interpreting the data as it’s displayed. All of our investment in
automation and high-speed performance at massive scale comes down to one guy in a chair looking at a screen, and we’re
relying on him to figure out what is going on and communicate it to everyone who needs the information. Recently, we
have elevated this role and invented a new type of analyst called a “data scientist.” Ironically, we need this role because we
have been given tools that are astoundingly difficult to understand, even as they are being cast as “easy to use.” The tools
themselves have made it nearly impossible for even the most data literate people to extract anything meaningful from the
data at hand.
We need something better, something with real power behind it that can empower every one of us. It is time to take an
approach based on business needs and address what organizations really want from data. Instead of having to go to a
machine, build queries, do the analysis and find the meaning hidden in the data, the machine should deliver the insight,
the meaning and the story to us. To put it simply, the machine should tell us the story that it finds in the data. Research
in Artificial Intelligence has already proven that computers can deliver on this promise; it’s now a matter of applying the
science universally.
To do this, we need a new approach and a new kind of analysis: analysis that is focused on business and communication
goals as the driver for the examination and analysis of data; an approach that looks at how to deliver meaning and insight
in a form that makes natural sense to us, as narratives. No spreadsheets, no charts, no struggle - just the story.
Narrative Analytics:
Driven by the Story, Not the Numbers
Everything we want from the data that surrounds us, the information and the insight we need to help make
us smarter and make better decisions, can be derived from Narrative Analytics, a method that leverages the
tremendous potential of artificial intelligence to automatically transform data into meaning, insight and stories.
In the world defined by Narrative Analytics, the machine thinks like you. It considers what information you
need and drives all of its thinking and calculation to get you that information. Once it does this, it transforms
the results into clear, concise prose that you can simply read. You don’t have to struggle with the machine for
help. It comes to you with insight. It is smart enough to help make you smarter.
The central idea behind Narrative Analytics is simple: We need to know what is happening in the world
around us. In particular, we want to know about those aspects of the world that are important and relevant
to us. Obtaining this knowledge requires more than just exposure to the data. We need clear and instructive
communication that is focused on our needs, our interests and the decisions we have to make on a daily basis.
And computers, through Artificial Intelligence, are there to do all of this for us.
If we want a machine to communicate, we need to teach it to not only extract meaning from the data it
manages, but also to derive relevant insight from it. Fortunately, both of these tasks require that the systems
we build understand how to analyze data in order to extract meaning and insight.
The important distinction with this type of analysis is that it must be completely driven by the needs of the
narrative. We need to start with communication goals like "I want to know about my logistical problems," "I want to know how my sales team is doing," or "I want to know how my portfolio is performing." These goals
drive the analysis. Any analysis of the data or even data collection is simply a waste of time if it doesn’t result in
some sort of communication or reporting that someone needs.
This is one of the crucial differences between the Narrative Analytics approach and a traditional data analytics
view. With data analytics, the algorithm is the driver, and data scientists always want more algorithms. From a
Narrative Analytics point of view, the story is the driver. Certainly, there are algorithms, analysis and data. But
they are all instrumental to telling the story, writing the report and communicating the insight.
[Figure: a sample restaurant sales spreadsheet for the week ending 21-Dec, showing daily, weekly-total, and percent-of-total columns for Food, Beer, Wine, Liquor, Total Sales, Meals Tax, and Gift Certificates Sold, less Discounts, Comp Food & Bev, G.C. Redeemed, and Other.]
Data vs. Decision-Making
This focus on the story and its impact on communication goals is only the first part of the equation. The real
impact is realized by what gets produced: a narrative that gives voice to the important meaning and insight from
the data and presents it in natural language.
The differences between the current approaches to data analysis and presentation and Narrative Analytics are not subtle.

A spreadsheet allows you to see all of the numbers and perform the necessary calculations to figure out what you need to know. But the job of doing this work is yours, and if you are not completely fluent with the numbers, you will be lost.

Visualizations allow you to actually see those numbers in a form that may be easier to deal with, but you still have to interpret the visualizations to pull out the relevant components. Again, you are presented with the data but are left to figure out what is meaningful.

But when you are presented with a narrative that expresses the information that is truly important to you and relevant to the decisions you need to make, all you have to do is read it:

Sales at the bar went up this week with a huge spike on Sunday. Overall dinner sales
stayed on par with last week, but lunch sales dropped a bit. If these trends persist, it might
make sense to pull a waiter off the lunch shift and get another bartender on Sunday.

It comes down to this: a simple choice between data and decision-making. You can either spend time figuring it all out, or have it easily and quickly explained to you so that you know what the data means and can make decisions based upon it.
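The jump from raw weekly figures to a readable sentence like the one above can be sketched in a few lines. This is a minimal illustration with hypothetical data, names, and thresholds, not Narrative Science's actual system:

```python
# Minimal narrative-generation sketch: weekly figures in, one-line story out.
# All data, metric names, and thresholds here are hypothetical.

def describe_trend(name, this_week, last_week, spike_day=None):
    """Render one sales metric as a plain-English clause."""
    change = (this_week - last_week) / last_week
    if change > 0.05:
        clause = f"{name} went up this week"
    elif change < -0.05:
        clause = f"{name} dropped a bit"
    else:
        clause = f"{name} stayed on par with last week"
    if spike_day:
        clause += f" with a spike on {spike_day}"
    return clause

weekly = {
    "Sales at the bar": (1200, 980, "Sunday"),
    "Dinner sales": (5400, 5350, None),
    "Lunch sales": (2100, 2300, None),
}
story = ". ".join(describe_trend(n, *v) for n, v in weekly.items()) + "."
print(story)
```

The reader gets the finished sentence; the spreadsheet arithmetic happens out of sight.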
The Anatomy of Narrative Analytics
The starting point of Narrative Analytics is always the story. From the story, we establish a set of
communication goals that reflect what it is we want to say. These, in turn, define the analysis we need to
perform in order to get to the truth that supports the communication goals and the story.
Consider a simple example, such as a quarterly earnings report. The goal of this report is to provide
information about how a company is doing based upon its current and historical earnings. So, the content
of the story needs to include a combination of history and a comparison against expectations. These
communication goals then define the analysis that needs to happen and the data that needs to be used. The
process looks like this:
Communication Goal: What I want to say.
Are the company's earnings improving or on the decline?

Information Needs: What I need to know.
Current earnings compared to last quarter's.

Data Analysis: How I can figure it out.
Conduct a time-series analysis of quarterly results.

Data Input: What data do I need.
Quarterly earnings numbers, expected values, actual values and any confidence weights on the expected values.

The analysis serves the story and is, in fact, defined by the story. If I want my communication to satisfy my goals and be true, then I need certain information. To get that information, I am going to have to perform specific analysis of the data. And to perform that analysis, I need to have my data organized in a way that makes the analysis possible. This is the case for any communication goal you can imagine. If you want to speak the truth, you need to ground your content in the right data and analysis.
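The goal-to-data process can be sketched as a tiny pipeline. The figures and function names below are illustrative assumptions, not the whitepaper's implementation:

```python
# Goal-driven pipeline sketch: the communication goal ("are earnings
# improving?") determines the analysis, which determines the data needed.
# All figures are hypothetical.

# Data input: quarterly earnings, most recent quarter last.
earnings = [1.10, 1.15, 1.21, 1.30]

# Data analysis: a simple time-series comparison of the last two quarters.
def quarter_over_quarter_change(series):
    return (series[-1] - series[-2]) / series[-2]

# Information need: is the latest quarter up or down?
change = quarter_over_quarter_change(earnings)

# Communication goal: state the answer in prose.
direction = "improving" if change > 0 else "on the decline"
summary = f"Earnings are {direction}, {abs(change):.1%} versus last quarter."
print(summary)
```

Notice the direction of flow: the question fixes the analysis, and the analysis fixes which data must be on hand.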
After establishing what it is you want to say, the next question follows: “What do I need to know in order to say
the things I want to say?” This step is extremely important in that it defines the fact base that any communication
is going to utilize. For example, if I want to say how well a company is doing, I need to know if its earnings are on the
rise and if the rate of change is going up or down. If I want to say a game was a thrashing, I need to know if the margin
on the score was above a particular threshold. If I want to say how a salesperson is doing, I need to know how his
performance has changed over time. At this stage, I am defining my information needs.
This transition from communication goals to information needs is crucial in defining the analysis that must be done
in order to actually write a document. Once we know that we want to understand how a company’s earnings have
changed over the past five years, we need to perform a time-series analysis of earnings over that time frame and use
the earnings data that will support that analysis. At this point, we have identified what analysis is required and exactly
what data we need to support it.
Of course, once the nature of the analysis has been determined, the data requirements are set. For example, in order
to do time series analysis around a particular metric, you need to have the historical data associated with that metric.
If you are comparing two objects, you need the data associated with the elements that are going to be compared. By
starting with communication goals and identifying the information, analysis and data needed to support those goals,
we end up with a complete set of requirements to drive the narrative.
While this may seem strikingly obvious, it hasn't been addressed to date in the world of Big Data. Today's "bottom-up" approach ignores the notion that if you want to say something about the world, you need to have the data that
will support what you want to say. With Narrative Analytics, however, the linkage between story, analytics and data is
fundamental – you need the data, analysis, and information in order to say what you want to say.
Narrative Structure
There is tremendous power to be gained by viewing the world of data and analytics through the
lens of communication goals. This lens focuses the type of computation that needs to be run, and it provides
a needed link between the data we are collecting and the messages that we are trying to communicate. It is even
more powerful when we think of these elements working together. Individual communication goals work together
to allow us to craft more complex narratives that tell complete stories.
A powerful way to look at these relationships is through the eyes of standard story types. In particular, there are
recurring story types that define related collections of communication goals that can be packaged together. These
packages of communication goals have parallel analytics that are equally repeatable. In the world of machines,
repeatability translates into scale. Because so many of the stories, reports and narratives that we use to
communicate with each other are the same, the types of processing required to support their generation
are the same as well. There are always differences, but the reality is that even these differences can be captured,
characterized and then turned into the parameters for systems that generate the stories we care about.
There is a natural, human desire to believe that the exact opposite of this is true, a desire
to believe that everything we do is unique, absolutely and fundamentally different from
anything that anyone else does. But when it comes to communicating things about the world,
this idea of uniqueness is the enemy. In fact, our ability to draw the variety of information about the world
into clear and coherent categories is part of what makes us intelligent in the first place. The Narrative Analytics
approach provides us with the ability to see the world through a clear lens.
Think, for example, about a performance review report.
If we are putting together a performance review, no matter the object of focus, we need to define a set of metrics
against which we will evaluate that object. Those metrics may have components associated with them, or drivers.
And those drivers may themselves have further components as well.
If we want to say something about how this object is doing, we have to consider how those metrics and drivers
are changing over time. If we want to say how it compares to other things that are related to it, we have to run
a set of comparisons. And if we want to say something about how this object can get better, we have to know
what can change and what the opportunities for change are (such as a driver that has fallen off over time or is
underperforming in comparison to similar objects).
We can do exactly the same sort of mapping if we are looking at how a salesperson is doing.
Or how a company is performing.
Or how a student, an exercise plan, or an investment portfolio is doing. No matter what the object, the pattern of
the story you need to hear, the analysis that is required, and the nature of the data behind it remains the same.
Certainly there are differences between these examples. But they are also predictable and regular. While the
metrics are different, they are still metrics and can be treated with the same sort of analysis. While the time scale
for the analysis might change from case to case, the cycles of how often we want to look at the changes and the
overall time frame we are viewing can simply become parameters to the analytics. And while the definition of a
comparison cohort may be different, it still ends up just being another parameter.
The similarities far outweigh the differences, and it is the similarities that make it possible to create models of
what needs to be considered when any content of this sort is generated. In fact, it is the similarity and our ability
to model it that gives us a language for even talking about the ways in which the content is different. Once we
are talking about “metrics,” “drivers,” “benchmarks” and “cohorts,” we can begin to characterize things like how a
version of a performance review prioritizes one of these elements over another, or how a report is only focused
on the positive versus the negative drivers. The commonality gives us this language to explain the differences.
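One way to make this concrete is to treat a story type as a parameterized template, where the metrics, drivers, time scale, and cohort are the parameters. This is a sketch under assumed field names, not Narrative Science's actual schema:

```python
# A "performance review" story type as a parameterized template (sketch).
# Swapping the parameters re-targets the same story at a salesperson,
# a company, or a metro real-estate market.
from dataclasses import dataclass, field

@dataclass
class PerformanceReviewStory:
    subject: str                 # the object under review
    metric: str                  # top-level metric, e.g. "sales"
    drivers: list                # components that drive the metric
    time_scale: str              # cycle for the time-series analysis
    cohort: list = field(default_factory=list)  # comparison benchmark

    def required_analyses(self):
        """The communication goals fix which analyses must run."""
        runs = [f"{self.time_scale} time series of {self.metric}"]
        runs += [f"{self.time_scale} time series of {d}" for d in self.drivers]
        if self.cohort:
            runs.append(f"benchmark {self.metric} against {len(self.cohort)} peers")
        return runs

rep = PerformanceReviewStory(
    subject="a salesperson", metric="sales closed",
    drivers=["small deals", "large deals"], time_scale="monthly",
    cohort=["regional reps"])
print(rep.required_analyses())
```

The same class, re-parameterized with home sales as the metric and other metro areas as the cohort, drives a real-estate story just as well.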
This pattern of standardization of communication goals and information needs around a type of story is found
everywhere. Communication, at least the sort we are considering here, is about insight based on truth. And truth
is derived from the data. So it is absolutely understandable that these patterns of standard communication define
patterns of standardized analysis.
This may seem very abstract, but now think about this in terms of a single metro area’s real estate market. The
top-level metric is home sales. The drivers are new, existing and short sales. The time-series analysis is month-over-month. The comparison is to other similar-sized metro areas. The opportunities are defined by seeing
growth in one driver in a comparable metro area that can be improved upon in our target. So the abstract
becomes the specific:
In the Decatur, IL market, home sales declined last month by nearly 6%. A decline in new home
sales (down 8% from last month) was the biggest driver, but existing home sales fell as well. This
decline was more pronounced than the fall in sales felt by nearby Bloomington, which saw a 4% drop.
Dave Schmitt’s overall sales performance is up a bit this month. He has been closing smaller deals at
a higher than expected rate and still has larger deals in the pipeline. He remains in the middle of the
pack in the Southwest Region.
Zebra Technologies Corp.’s (ZBRA) fourth quarter profit is a continuation of the four consecutive
quarters of earnings growth we’ve seen over the last year. This quarter’s results put the company
in the top 10% of manufacturing firms in terms of earnings.
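The first of these narratives can be reproduced from a handful of parameters. The figures below are taken from the example text itself; the function and field names are hypothetical:

```python
# Instantiating the abstract story for one metro market (sketch).
def metro_story(metro, sales_change, driver_changes, peer, peer_change):
    direction = "declined" if sales_change < 0 else "rose"
    top_driver = min(driver_changes, key=driver_changes.get)  # largest drop
    comparison = ("more pronounced than" if sales_change < peer_change
                  else "milder than")
    return (f"In the {metro} market, home sales {direction} last month by "
            f"nearly {abs(sales_change):.0%}. A decline in {top_driver} "
            f"(down {abs(driver_changes[top_driver]):.0%}) was the biggest "
            f"driver. This was {comparison} the {abs(peer_change):.0%} move "
            f"in nearby {peer}.")

print(metro_story("Decatur, IL", -0.06,
                  {"new home sales": -0.08, "existing home sales": -0.02},
                  "Bloomington", -0.04))
```

Changing the arguments retells the same story for any other metro, salesperson, or company: the abstract template becomes the specific narrative.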
Narrative Analytics Makes Us Smarter
The mission is not just to provide an answer. That would be useful, but it doesn't help you communicate with
others. What you need is the answer and the reasoning behind the answer. The rationale for thinking something is
true is what makes you smarter.
Technology that makes us smarter should always be the goal. Not just technology that is smart in and of
itself, but technology that is able to communicate with people in a way that amplifies our own abilities rather
than supplants them. Narrative Analytics solves this issue. By using the goals of communication as the driver,
Narrative Analytics opens the door to a world of communication in which the machine takes on the
task of explaining what it knows to us in a way that is both rigorous and natural. It tells us the story of the data
and the insight that it contains.
All you have to do is read.