Data driven video advertising campaigns - JustWatch & Snowplow (Giuseppe Gaviani)
This document summarizes JustWatch's journey in building a universal customer relationship management (CRM) system focused on the movie industry. It discusses how JustWatch started using the Snowplow open-source data pipeline to collect user data from their movie apps. JustWatch then automated building targeted audiences and managing user identities across channels to power personalized, data-driven movie marketing campaigns on platforms like YouTube and Facebook. The document outlines the key components developed, including an Audience Builder, ID Service, and interactions module. It also shares learnings around questioning real-time needs and the value of separating operational and analytics data stores.
Implementing improved and consistent arbitrary event tracking company-wide us... (yalisassoon)
Talk on the role Snowplow plays as part of the larger project to make data accessible to product marketing and other data-driven teams at StumbleUpon. Touches on technical and organizational challenges.
Metail uses Snowplow to collect customer data and Cascalog to process that data into normalized batch views for analysis. Cascalog transforms the raw Snowplow event stream into structured tables for things like customer body shape, orders, items ordered, returns, and browsers. This makes the data more manageable and complex analysis and aggregation easier. For example, Cascalog is used to calculate key performance indicators by grouping customer data and summing metrics from the batch views. The output is then analyzed further in R. Looker will also allow business analysts to access and explore the batch views and raw Snowplow event data.
Snowplow - Evolve your analytics stack with your business (Giuseppe Gaviani)
The document discusses how analytics stacks need to evolve with businesses over time as products, questions, and data change. It describes how Snowplow users can define their own event and entity schemas to model their data. Self-describing data validated against these schemas allows adding new sources, updating existing events/entities, and recomputing data models on the full dataset as needed. This enables the analytics pipeline to evolve flexibly in response to new business questions or tracked information.
Flows in the Service Console, Gotta Go with the Flow! by Duncan Stewart (Salesforce Admins)
The document discusses using Salesforce Flow in the Service Console to simplify case management processes for technical support and logistics teams. It describes how custom objects and flows can be used to capture more detailed information for different teams while minimizing back-and-forth. The solution involves using the Service Console, custom objects to store additional details, and Visual Workflow to guide users through the process. Key points covered include how to update cases from flows while allowing users to still edit case details, and addressing timing issues when updating records from flows.
Snowplow Analytics: from NoSQL to SQL and back again (Alexander Dean)
A talk I gave to London NoSQL about Snowplow's journey from using NoSQL (via Amazon S3 and Hive), to columnar storage (via Amazon Redshift and PostgreSQL), and most recently to a mixed model of NoSQL and SQL, including S3, Redshift and Elasticsearch.
2016 09 measurecamp - event data modeling (yalisassoon)
Presentation by Christophe Bogaert to Measurecamp London September 2016. Christophe discussed what makes consuming and analysing event-streams difficult, and outlined a number of techniques for overcoming those obstacles.
The analytics journey at Viewbix - how they came to use Snowplow and the setu... (yalisassoon)
This document summarizes the evolution of video measurement and analytics solutions used by a company. It describes several solutions the company implemented, including sending tracking events to a server hosted on Rackspace [1], distributing event collection to Akamai and processing with Hadoop/Hive/SQL in Azure [2], and ultimately implementing a solution using Snowplow that addressed all of their requirements [3]. Key benefits of Snowplow included no limits on data, flexible data modeling, fast reporting and owning their own data. The document ends by discussing lessons learned around data quality, infrastructure costs, modeling needs and focusing on small, actionable insights from big data.
Snowplow: open source game analytics powered by AWS (Giuseppe Gaviani)
This is a presentation by Alex Dean and Yali Sassoon at Snowplow about open source game analytics powered by AWS. It was presented at the Game Developers Conference (GDC) in San Francisco, February 2017.
Introducing Sauna - Decisioning and response platform from Snowplow (Giuseppe Gaviani)
Snowplow allows companies to track customer data across channels and warehouses to build intelligence. Sauna then takes this intelligence and allows companies to act on it by pushing insights to different marketing platforms like Salesforce and MailChimp. Sauna currently integrates with Optimizely and SendGrid and is open source to easily add more integrations.
Why use big data tools to do web analytics? And how to do it using Snowplow a... (yalisassoon)
There are a number of mature web analytics products that have been on the market for ~20 years. Big data tools have only really taken off in the last 5 years. So why use big data tools to mine web analytics data?
In this presentation, I explore the limitations of traditional approaches to web analytics, and explain how big data tools can be used to address those limitations and drive more value from the underlying data. I explain how a combination of Snowplow and Qubole can be used to do this in practice.
This document discusses event data and the Snowplow data pipeline. It notes that 3 years ago, analyzing user behavior and engagement using tools like Google Analytics was difficult. The Snowplow data pipeline was created to collect and analyze event-level data at scale using open source big data technologies. The pipeline has expanded to encompass different types of digital event data by developing a schema for structured JSON events and a versioning system. A real-time version of the pipeline is also being built to feed event data into applications in addition to batch processing. Developing a semantic model and standard framework for describing events is discussed as being important for enabling downstream applications to consume structured event data.
How we use Hive at SnowPlow, and how the role of Hive is changing (yalisassoon)
The document summarizes how SnowPlow uses Apache Hive and other big data technologies to perform web analytics. It discusses how Hive is used at SnowPlow, the strengths and weaknesses of Hive versus alternatives, and how SnowPlow is leveraging technologies like Scalding, Infobright, and Mahout for more robust ETL, faster queries, and machine learning capabilities beyond SQL.
On the importance of evolving your data pipeline with your business, and how Snowplow enables that through self-describing data and the ability to recompute your data models on the entire event data set.
Our cofounder Alex Dean gave an introduction to Snowplow and then talked about our roadmap for 2017. Alex touched on several topics including support for more clouds, support for more storage targets, tailoring Snowplow to your industry, more intelligent event sources, moving our batch pipeline to Spark, mega-scale Snowplow and real-time support for Sauna, our decisioning and response system. Presented on 5 April 2017.
Using Snowplow for A/B testing and user journey analysis at CustomMade (yalisassoon)
This document discusses user journeys, analyzing how users interact with a website over time to understand conversion rates. It describes building a unified data model to visualize customer journeys and see that most visitors to listing pages leave the site. The document also discusses how more page views are linked to more conversions and how A/B testing can be done using event tracking to test different page variants and their impact on user behavior and conversions.
Simply Business is a leading insurance provider for small business in the UK and we are now growing to the USA. In this presentation, I explain how our data platform is evolving to keep delivering value and adapting to a company that changes really fast.
Snowplow at DA Hub emerging technology showcase (yalisassoon)
This document discusses Snowplow Analytics and its approach to reinventing digital analytics. Snowplow allows users to define their own event types, track events across all channels, and answer any questions by storing all digital event data in their own data warehouse. This enables users to join their data to other data sets, pick their own processing logic, and plug in any analytics tools. Snowplow is also open source, free, allows users to own their data and intelligence, and is scalable for tracking large volumes of events.
How to evolve your analytics stack with your business using Snowplow (Giuseppe Gaviani)
This document discusses evolving an analytics stack to match business changes. It recommends defining self-describing event and entity schemas that can be updated over time. Event data modeling aggregates raw events into modeled data like users and sessions for easier analysis. To evolve the data pipeline, businesses should use self-describing data that allows recomputing models on historical data when new questions arise or data collection changes.
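The modeling step described here (rolling raw events up into modeled data like sessions) can be sketched in a few lines of Python. This is an illustrative assumption, not the deck's actual implementation: the event shape and the 30-minute inactivity threshold are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical raw event stream: (user_id, timestamp) pairs.
events = [
    ("u1", datetime(2017, 1, 1, 9, 0)),
    ("u1", datetime(2017, 1, 1, 9, 10)),
    ("u1", datetime(2017, 1, 1, 12, 0)),   # > 30 min gap -> new session
    ("u2", datetime(2017, 1, 1, 9, 5)),
]

def sessionize(events, gap=timedelta(minutes=30)):
    """Count sessions per user, splitting on gaps of inactivity."""
    sessions = {}
    last_seen = {}
    for user, ts in sorted(events, key=lambda e: (e[0], e[1])):
        # Start a new session on the first event, or after a long gap.
        if user not in last_seen or ts - last_seen[user] > gap:
            sessions[user] = sessions.get(user, 0) + 1
        last_seen[user] = ts
    return sessions  # user_id -> session count

print(sessionize(events))  # {'u1': 2, 'u2': 1}
```

Recomputing a model like this over the full historical event set, as the document recommends, is just a matter of re-running the aggregation with a new definition (for example, a different gap).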
Snowplow is at the core of everything we do (yalisassoon)
This document discusses Bauer Media's use of Snowplow for data collection and analytics across their 116 websites in Australia and New Zealand. Some key points:
- Bauer Media started collecting Snowplow data in 2014 without a specific use case in mind.
- They now use Snowplow data for cross-site reporting, ad hoc analysis, checking audience reports, and stalking individual users.
- Snowplow allows them to track things like page views, user behavior, content metadata, and ads that can't be tracked as well with Google Analytics.
Big data meetup Budapest - adding data schemas to Snowplow (yalisassoon)
The document discusses adding data schemas to Snowplow, an open-source web and event analytics platform. It describes how Snowplow is evolving from a web analytics platform to a general event analytics platform to handle an infinite number of possible event types from any connected device. To address this, the document proposes adding JSON schemas to define the structure of each event type. These schemas would be versioned and stored in a central schema repository/registry to define the structure of raw and enriched events processed by Snowplow.
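The wrapping pattern proposed here (a versioned schema reference attached to each event payload) can be sketched as a small Python example. The vendor, event name, and fields below are hypothetical; only the shape, a schema URI plus a data payload, reflects the proposal.

```python
import json

# A hypothetical self-describing event: the payload is wrapped with a
# versioned schema URI so downstream consumers know how to validate it.
event = {
    "schema": "iglu:com.example/link_click/jsonschema/1-0-0",
    "data": {
        "target_url": "http://example.com/offer",
        "element_id": "promo-banner",
    },
}

serialized = json.dumps(event)
decoded = json.loads(serialized)

# A consumer can dispatch on vendor / name / version parsed from the URI.
schema_uri = decoded["schema"]
vendor, name, fmt, version = schema_uri.split(":", 1)[1].split("/")
print(vendor, name, version)  # com.example link_click 1-0-0
```

Versioning the schema in the URI is what lets the registry hold multiple revisions of the same event type side by side while older data stays interpretable.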
From Architecture to Analytics: A look at Simply Business’s data strategy (Looker)
Revamping Your Data Approach to Enable the Pace & Flexibility Needed to Make Timely Business Decisions
This slide deck is from Stewart Duncan of Simply Business and Zach Taylor of Looker. They discussed how Simply Business revamped their data approach to enable the pace and flexibility they needed to make timely decisions based on their data.
Simply Business offers a B2B web product that simplifies the experience of selecting and purchasing business insurance. With over 300,000 policyholders, Simply Business is the largest SME insurer in the UK. Removing friction for users as they complete the insurance quote process is critical to providing a hassle-free user experience and driving continuous improvements in conversion rates. Learn how data analytics helps them strive for these improvements.
Get ideas on the following:
- Using data for rapid A/B testing and to analyze user journeys to inform product development.
- Transitioning from a traditional data architecture to a modern cloud-based stack integrating MongoDB, AWS Redshift, Hadoop and Looker.
- Designing a data platform and organizational process that drives data-driven behavior across an organization.
- Empowering product and marketing teams to get into granular online engagement and attribution data which has removed the bottleneck to making informed decisions.
Big Data Explained - Case study: Website Analytics (deep.bi)
This is an example case study showing what big data can mean for a small website that generates just 5000 visits a day.
It all depends on what we want to get from our assets, like website traffic. If we only measure the number of people who visited our site, then we do not need to worry about “big data”. We just have to count total visits (5,000 a day, 150,000 monthly).
But this simple measure alone tells us nothing about our visitors and customers, so on its own it is pretty useless.
On the following slides we present what a website owner can gain from advanced website analytics and why big data technologies are recommended.
Optimisator: 8 Ways Analytics Helps Your Business Grow (Optimisator)
Organizations, people, and things are generating massive amounts of data every day. In a 24-hour period, we collectively send 294 billion e-mails and 500 million tweets. We plug 3.5 billion searches into Google. Our connected cars generate a whopping four petabytes of data. Even our watches, fridges, and TVs are constantly creating and sharing data.
By Optimisator
The document discusses setting up an effective analytics framework. It outlines eight key phases: 1) attributing traffic sources, 2) defining the sales funnel, 3) integrating data, 4) attributing conversions, 5) optimizing conversions, 6) segmenting prospects, 7) analyzing the post-purchase funnel, and 8) ongoing reporting. It emphasizes regularly analyzing data and taking action. The goal is to understand customer behavior and improve revenue. Setting up the right analytics framework requires auditing current data, developing a collection strategy, and ongoing reporting and analysis.
This document discusses how an experimentation program manager, Kristen, can empower teams within her organization to make data-driven decisions through experimentation at scale. It outlines three key challenges Kristen faces: teams cannot measure what matters to them, cannot leverage experiment results, and lack confidence in results. The document then presents solutions from Optimizely's data platform to address each challenge by allowing teams to: 1) measure custom metrics that are important to their work, 2) analyze results through their preferred tools and workflows, and 3) gain confidence in results by reducing biases, discrepancies and false discoveries. This will help Kristen grow and maintain adoption of her organization's experimentation program.
Fuel for the cognitive age: What's new in IBM predictive analytics (IBM SPSS Software)
IBM recently launched an updated version of its predictive analytics platform. Explore the latest features, including R, Python and Spark integration and more powerful decision optimization.
5733 A deep dive into IBM Watson Foundation for CSP (WFC) (Arvind Sathi)
This document provides an overview of the Watson Foundations for CSPs (WFC) architecture, which uses four analytics components - discovery, detection, decision and drive - to enable use cases across various telecommunications business capabilities. It describes how WFC integrates predictive modeling using SPSS, real-time analytics using InfoSphere Streams, and other components to power analytics applications for areas such as customer experience management, fraud detection, location-based services and more.
How the World's Leading Independent Automotive Distributor is Reinventing Its... (NUS-ISS)
In this captivating session, we'll unveil the profound impact of AI, poised to revolutionise the business landscape. Prepare to shift your perspective, as we transition from the lens of a data scientist to the visionary mindset of a product manager. We're about to demystify the captivating world of Generative AI, dispelling myths and illuminating its remarkable potential. We will also delve into the pioneering applications that Inchcape is leading, pushing the boundaries of what's achievable. Join us for an exhilarating journey into the future of AI, where professionalism meets unparalleled excitement, and innovation takes center stage!
Content marketing analytics: what you should really be doing (Daniel Smulevich)
My presentation from Digital Marketing Show 2014. #DMSLDN
A journey through web analytics processes, from setting up KPIs to integrating data sources and automating reports.
Experian is a leading global information services company with over $15 billion in revenue. It uses advanced analytics and machine learning to drive innovation and embed new techniques into its business. This includes using web data, transactional data, and voice data to improve risk scoring, fraud detection, and customer insights. Experian develops products like its Web Data Insights and Transactional Data for Fraud Insights to provide these advanced analytics capabilities to its clients.
Operationalizing Customer Analytics with Azure and Power BI (CCG)
Many organizations fail to realize the value of data science teams because they are not effectively translating the analytic findings produced by these teams into quantifiable business results. This webinar demonstrates how to visualize analytic models like churn and turn their output into action. Senior Business Solution Architect, Mike Druta, presents methods for operationalizing analytic models produced by data science teams into a repeatable process that can be automated and applied continuously using Azure.
Ad Yield Optimization @ Spotify - DataGotham 2013 (Kinshuk Mishra)
This document discusses Spotify's approach to optimizing ad yield on its platform. It describes how Spotify uses a subset of impression data, filters it by criteria like location and age, and extrapolates the results using a simple growth model to forecast available ad impressions at scale in real-time. Some challenges addressed include organic growth, cold starts for new markets, and seasonality. The solution uses Hadoop to store log data, Postgres for booked campaigns, and a forecasting engine to handle queries over filtered and extrapolated data.
Slashing Big Data Complexity: How Comcast X1 Syndicates Streaming Analytics w... (Amazon Web Services)
Comcast's X1 Platform delivers a dramatically new entertainment experience to viewers. And not just to Comcast subscribers, but to several other major cable companies. This requires a massive integration layer between companies. A big part of that integration is delivering billions of data points per day to syndication partners. Find out how the X1 Platform uses Amazon Kinesis as a data bus, drastically simplifying data integration with others. As part of this, see how X1 uses Lambda, EMR Spark, and S3 - all leading to a near serverless big data backbone.
This document discusses how web analytics can be used to optimize marketing strategies. It explains that web analytics provides insights into customer behavior across channels to understand what is driving conversions. Effective attribution methods are needed to give credit to all contributing marketing channels. Emerging trends in web analytics include removing data silos, predictive analytics, and personalizing experiences based on individual shopping behaviors. The future of web analytics involves better integrating online and offline data to analyze complex multi-channel customer journeys.
This document discusses Vpon's mobile advertising system and recommender model. It describes the basic concept, challenges, and infrastructure of Vpon's ad serving platform. It then focuses on the recommender system, outlining the design, implementation, and evaluation process. Key steps include calculating ad and user similarities, predicting user preferences, optimizing ad delivery, and continuously improving based on results. The recommender significantly increased click-through and conversion rates while reducing costs.
Captricity at Corinium Chief Data Officer Forum Keynote - Brian Cox Captricity
Chief Data Officer Forum, Insurance
September 15, 2016
How Insurers are Leveraging Modern Technology for Improved Customer Experiences
This document discusses how insurers can use modern technologies like cloud services, machine learning, and robotic process automation to improve customer experiences. It provides examples of how insurers have used technologies like Captricity's automation platform to extract data from documents, automate processes, and gain insights from customer data to reduce costs, speed up processes, and unlock new opportunities. The document highlights the challenges insurers face in leveraging historical paper records and improving analytics capabilities.
The document discusses how big data and analytics can help companies gain insights and drive business impact. It provides examples of how companies have used big data platforms and cross-channel analytics to improve customer experiences, increase revenue, reduce costs, and implement a continuous process of testing and learning. Specifically, it highlights how one telecommunications company increased digital sales by over $60 million in three years by leveraging big data analytics to optimize marketing campaigns and customer journeys.
Stay Competitive in Programmatic with Advanced Data StrategiesHilary Ip
The document discusses strategies for staying competitive in programmatic advertising through advanced data analytics. It outlines common business goals of publishers such as simplifying and standardizing how programmatic data is handled, maximizing demand partner performance by tracking metrics like revenue and fill rates, and growing new products like video ads. The presentation then asks key questions about how organizations currently collect, store, transform, normalize and report on programmatic data from various sources to draw insights and addresses solutions for buyer analysis and improving video performance.
Similar to Simply Business and Snowplow - Multichannel Attribution Analysis (20)
Beyond the Basics of A/B Tests: Highly Innovative Experimentation Tactics You...Aggregage
This webinar will explore cutting-edge, less familiar but powerful experimentation methodologies which address well-known limitations of standard A/B Testing. Designed for data and product leaders, this session aims to inspire the embrace of innovative approaches and provide insights into the frontiers of experimentation!
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Round table discussion of vector databases, unstructured data, ai, big data, real-time, robots and Milvus.
A lively discussion with NJ Gen AI Meetup Lead, Prasad and Procure.FYI's Co-Found
4th Modern Marketing Reckoner by MMA Global India & Group M: 60+ experts on W...Social Samosa
The Modern Marketing Reckoner (MMR) is a comprehensive resource packed with POVs from 60+ industry leaders on how AI is transforming the 4 key pillars of marketing – product, place, price and promotions.
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will come present about related topics such as vector databases, LLMs, and managing data at scale. The intended audience of this group includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs.This meetup was formerly Milvus Meetup, and is sponsored by Zilliz maintainers of Milvus.
Analysis insight about a Flyball dog competition team's performanceroli9797
Insight of my analysis about a Flyball dog competition team's last year performance. Find more: https://github.com/rolandnagy-ds/flyball_race_analysis/tree/main
Learn SQL from basic queries to Advance queriesmanishkhaire30
Dive into the world of data analysis with our comprehensive guide on mastering SQL! This presentation offers a practical approach to learning SQL, focusing on real-world applications and hands-on practice. Whether you're a beginner or looking to sharpen your skills, this guide provides the tools you need to extract, analyze, and interpret data effectively.
Key Highlights:
Foundations of SQL: Understand the basics of SQL, including data retrieval, filtering, and aggregation.
Advanced Queries: Learn to craft complex queries to uncover deep insights from your data.
Data Trends and Patterns: Discover how to identify and interpret trends and patterns in your datasets.
Practical Examples: Follow step-by-step examples to apply SQL techniques in real-world scenarios.
Actionable Insights: Gain the skills to derive actionable insights that drive informed decision-making.
Join us on this journey to enhance your data analysis capabilities and unlock the full potential of SQL. Perfect for data enthusiasts, analysts, and anyone eager to harness the power of data!
#DataAnalysis #SQL #LearningSQL #DataInsights #DataScience #Analytics
The Ipsos - AI - Monitor 2024 Report.pdfSocial Samosa
According to Ipsos AI Monitor's 2024 report, 65% Indians said that products and services using AI have profoundly changed their daily life in the past 3-5 years.
Simply Business and Snowplow - Multichannel Attribution Analysis
1. Using our event analytics platform for fun and profit
Stewart Duncan – Director of Data Science
2. A little about us…
• Amongst the largest business insurance providers in the UK
• Almost 300,000 customers (and growing fast)
• Using tech to make insurance simpler, easier and more personalised
• Customer service is our beating heart
• Building a data-driven culture
3. Our analytics team a year ago…
Alberto, Daniele, Anthony, Imtiaz, Natalie & Emma
4. How had we got there?
– Core OLTP platform migration
– Single-tool approach to our data warehouse
– Siloed web analytics
Resulted in…
Which meant we did a lot of this…
5. Making the case for a new data architecture
Opportunities:
– Run The Business: use data to optimise existing processes
– Change The Business: use data to optimise the creation of new business processes
– Find New Business: explore data to identify new opportunities
– Leverage Position: use data to leverage our position in the market
Mechanisms:
– Best-practice data warehouse
– Scalable data exploration platform
– Unified event processing framework
– Automated event enrichment and loading
– Twitch analytics for product owners
– Analyst toolkit for discovery
– Data syndication (in & out)
– External analytics applications
6. Mining our granular event data
Business challenge:
– Shopping sessions can last up to a week
– Customers use many channels in that time
– Paid search is becoming increasingly competitive
– First-touch attribution hides the impact of nurturing channels
7. The exam question…
What is the true value of each marketing channel, so that we can allocate budget accordingly?
11. Modelling a Bayesian Network based on Bayes' rule:
P(Conversion | Channel) = P(Channel | Conversion) × P(Conversion) / P(Channel)
– P(Conversion | Channel): what we are looking for
– P(Channel | Conversion): the likelihood function that needs to be calculated
– P(Conversion): simply the frequency of conversions in the data set
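The Bayes' rule relationship behind this slide can be sketched in a few lines of Python. All input numbers below are illustrative stand-ins, not figures from the deck:

```python
# Minimal sketch of the Bayes' rule step:
# P(conversion | channel) = P(channel | conversion) * P(conversion) / P(channel)

def posterior(p_channel_given_conv: float, p_conv: float, p_channel: float) -> float:
    """Posterior propensity to convert, given exposure to a channel."""
    return p_channel_given_conv * p_conv / p_channel

# Assumed inputs: the channel appears in 20% of converting journeys,
# the overall conversion rate is 5%, and the channel appears in 10%
# of all journeys.
print(posterior(p_channel_given_conv=0.20, p_conv=0.05, p_channel=0.10))
```

The likelihood term, P(Channel | Conversion), is exactly what the next slide estimates from journey frequencies.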
12. Calculating P(Channel | Conversion)
Count the frequency at which a channel appears in a journey to conversion…

PPC
Appearances  Frequency  Probability
11           1          0.10%
10           1          0.10%
9            1          0.10%
8            1          0.10%
7            2          0.20%
6            2          0.20%
5            7          0.71%
4            8          0.81%
3            56         5.69%
2            166        16.85%
1            740        75.13%

Affiliates
Appearances  Frequency  Probability
6            2          0.99%
5            1          0.49%
4            2          0.99%
3            10         4.93%
2            21         10.34%
1            167        82.27%

…etc..
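The counting step behind these tables can be sketched as follows. The deck does not show code, and the journey data here is made up for illustration:

```python
from collections import Counter

# Each conversion journey is an ordered list of channel touches.
# Illustrative data only, not from the deck.
journeys = [
    ["PPC", "Email", "PPC"],   # PPC appears twice in this journey
    ["Affiliates", "PPC"],
    ["PPC"],
]

def appearance_table(journeys, channel):
    """P(channel appears k times | conversion), for each k >= 1."""
    counts = Counter(j.count(channel) for j in journeys if channel in j)
    total = sum(counts.values())
    return {k: freq / total for k, freq in sorted(counts.items(), reverse=True)}

print(appearance_table(journeys, "PPC"))
```

Running this per channel yields tables of the same shape as the PPC and Affiliates tables above.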
13. Calculating P(Channel | Conversion)
Derive a probability density function for each channel from these frequencies, then adjust it by iteratively adding ‘noise’ to smooth the curve and maximise entropy…
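One way to read the smoothing step — my own interpretation, since the deck gives no formula — is to iteratively blend the empirical PDF with a uniform distribution, which flattens sharp peaks and raises entropy:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def smooth(p, noise=0.05, steps=3):
    """Iteratively blend `p` with the uniform distribution and renormalise."""
    n = len(p)
    for _ in range(steps):
        p = [(1 - noise) * x + noise / n for x in p]
        total = sum(p)
        p = [x / total for x in p]
    return p

# Empirical appearance PDF for the PPC channel from the previous slide
# (appearances 1..11, highest count first).
ppc_pdf = [0.7513, 0.1685, 0.0569, 0.0081, 0.0071, 0.0020, 0.0020,
           0.0010, 0.0010, 0.0010, 0.0010]
smoothed = smooth(ppc_pdf)
print(entropy(ppc_pdf), entropy(smoothed))
```

Because the uniform distribution has maximum entropy, each blending step can only raise (never lower) the entropy of the curve, matching the "maximise entropy" goal on the slide.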
14. And the result…

Channel                Propensity
Direct                 5.0%
White Label Partner 1  12.0%
PPC                    15.0%
Email                  19.0%
White Label Partner 2  12.0%
Natural Search         13.0%
Affiliate              8.0%
Partner Landlord       16.0%
Display                less than 1%

After inputting into the equation: Email has a 19% influence on conversion across all journeys.
This method:
– Weights each channel, taking non-converting paths into consideration
– Allows us to calculate a more accurate ROAS (return on ad spend) per channel
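As a final illustration of that last point, a hypothetical ROAS calculation that uses the channel propensities as attribution weights. All spend and revenue figures here are invented, not from the deck:

```python
# Attribute total revenue to channels in proportion to their propensity
# weights, then divide by channel spend to get ROAS. Figures are made up.
propensity = {"Email": 0.19, "PPC": 0.15, "Display": 0.01}
spend = {"Email": 1_000.0, "PPC": 5_000.0, "Display": 2_000.0}
total_revenue = 100_000.0

weight_sum = sum(propensity.values())
roas = {
    ch: (total_revenue * w / weight_sum) / spend[ch]
    for ch, w in propensity.items()
}
print(roas)
```

A high-propensity, low-spend channel like Email then shows a far higher ROAS than a low-propensity one like Display, which is the budget-allocation signal the deck is after.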