Presenter: Tess Nesbitt, Senior Statistician, UpStream Software
Presentation Date: February 26, 2013
This presentation describes how Hadoop and Revolution R Enterprise provide the predictive analytics models for UpStream's revenue attribution application.
Big Data, Bigger Campaigns: Using IBM’s Unica and Netezza Platforms to Increa... – graemeknows
Is your organization challenged by the explosion of data and rising expectations for results? Unica Campaign Management and IBM Netezza appliances provide capabilities to address and overcome these challenges. This presentation offers customer case histories and performance studies that provide insights for today's world, where digital and traditional channels are increasingly intertwined.
A CRM comparison sheet that lets you compare the industry's best CRM products on their features and pricing. This comparison by Nayvug Infosolutions will help you decide which CRM solutions to consider, based on the factors listed in the sheet.
TechConnectr's Big Data Connection. Digital Marketing KPIs, Targeting, Analy... – Bob Samuels
This presentation was given at the Deep Dive Conference in November 2013.
Big Data Applications... example, digital marketing, and targeting and optimization...
Feedback and additional perspectives are appreciated.
Thank you,
Bobby Samuels
TechConnectr.com
Datalicious media-attribution-optimising-digial-marketing-spend-in-financial-... – Peerasak C.
Marketers have been applying science, specifically statistics and econometric modeling, for many decades now to answer the key question all advertisers face: how to allocate media budgets across channels to maximise overall return on advertising spend (ROAS) with a limited budget.
Slides Ladislav Bartos recently used in his discussion with mentees of The Product Mentor.
The Product Mentor is a program designed to pair Product Mentors and Mentees from around the world, across all industries, from start-up to enterprise, guided by the fundamental goals: Better Decisions. Better Products. Better Product People.
Throughout the program, each mentor leads a conversation in an area of their expertise that is live streamed and available to both mentees and the broader product community.
http://TheProductMentor.com
Adaptive software development processes epitomized by Agile methodologies are based on continual improvement – incremental changes that emerge as teams iterate and learn about the product they are developing. This appears to conflict with the world of the program office, responsible for defining the software development lifecycle (SDLC), in which a stable and repeatable development process with well-defined ownership and controls is a common objective. Using recent examples in which agile methods have been successfully introduced into large organizations with existing SDLCs, we consider the difficulties of creating a verifiable process when the process itself is continually being modified, and look at how software development can be managed and controlled without stifling the benefits of adaptive software development processes.
Overview of Selenium, WebDriver, Watir and related open source cross-browser testing technologies. Presentation given by Martin Kleppmann, founder of browser testing service Go Test It, at Ruby Manor 2009.
Continuous Integration promises faster delivery of higher quality software through an integrated automated build, test, and release management.
The greater challenge lies not within a single project or team, but in scaling this across a larger organization or enterprise. How do you allow teams to choose their own tools and processes, yet ensure all stakeholders have full visibility and traceability across all your delivery pipelines, in real time?
In this webinar, we will demonstrate how you can implement a CI environment leveraging popular open source tools (or any tool) using TeamForge.
End-2-End Monitoring – the test bench for every SLA – explained in 15 minutes! – MAXXYS AG
Even the most detailed performance measurement data is of no practical use if you don't know what response times end users actually experience every day. Yet although this end-user metric is critically important, many companies do not capture it consistently. Why is that? This webinar addresses the challenges of end-to-end monitoring and shows you how to master them, so you can truly get a grip on the performance of your infrastructure and its impact on the user experience.
Valtech - Big Data for Marketing
Aurélie Hornoy, Digital Performance Lead, Valtech
aurelie.hornoy@valtech.fr
The benefits of data-driven marketing
Event - Big Data: from analytics to creativity ...
Valtech - 29/11
Marketing Automation & CRM: Terrible Twosome or Dynamic Duo? – Pardot
No matter where you are in your marketing automation/CRM lifecycle, whether beginning your search, choosing a system, implementing and integrating or utilizing already existing systems, this webinar is presented with you in mind!
- Choosing a New System – Learn how to give your current process a thorough “once-over” and use this information to establish your shopping list
- Implementing / Integrating a System – Learn what to do – after you’ve made the big purchase
- Optimizing an Existing System – Learn how to maximize your current process approach and bring marketing and sales together – how to play “nice” and get things done
Social CRM - #Datamarketing @DM2013Toronto – ArCompany
CRM is not a new concept, but with the emergence of Big Data, it has the ability to transform organizations more than ever.
The rise of the social customer has also given rise to communities, friends, and recommendation sites that have a profound influence on purchase decisions. Companies have gotten too big to think about the individual customer; they have ignored the statistically insignificant.
These days, that same customer has the ability to bring down Goliath. I always come back to the "United Breaks Guitars" incident. The truth is that we've now come full circle: these days, to win a customer and keep them, you have to go beyond merely meeting their expectations.
A recent article: "The Reason So Many Brands Fail On Social Media Is That They Don't Actually Talk To Their Customers" indicated the following:
The wealth of data on customer desires being generated is helping organizations work more effectively, and achieve better results.
Social customer management doubles the percentage of sales leads that result in actual sales, relative to traditional CRM approaches.
Corporations are starting to recognize that paying attention to customer comments, interests and preferences, once deemed "irrelevant" by brands, is becoming the competitive differentiator.
Big Data: Unveiling opportunities in Email Marketing – Email Monks
This infographic by Email Monks unveils big data opportunities, discussing multi-channel data management, big data analytics models, customer data types, and more, to help marketers apply best practices for using big data in email marketing.
Anderson SAA 2014 Using CRM Data for "Big Picture" Research – dinaa_proj
David G. Anderson (University of Tennessee) presented his paper, “Using CRM Data for ‘Big Picture’ Research,” at the 79th annual meeting of the Society for American Archaeology in Austin, TX, in April 2014. This paper details the importance of CRM research in the development of Archaeology over the last forty years. Giving credit to the hundreds of thousands of technical reports and other forms of archaeological data stemming from ever-increasing amounts of CRM research in the Southeast, Anderson says this is the basis on which big picture research can now be accomplished. As technology and storage have caught up with the massive scale of new archaeological questions, digital repositories like DINAA can be utilized as highly effective tools.
PR Congress 2011 | Plenary 5 - Have They Come Back for Seconds?prcongress2011
Plenary 5 of the 18th National PR Congress, held 22–23 September, dealt with Customer Relationship Management. Customer service has evolved from merely addressing customer queries to maintaining relationships with customers. It is not just a matter of acquiring customers; more importantly, it is about retaining them.
The session underscored the importance of trust-based discussions and sharing mutually-beneficial objectives in customer relationship management. Panelists’ inputs revolved around strategies, obstacles, and imperatives in building customer trust.
Topic Presenter:
Ms. Ichay Bulaong, CRM Head, ABS-CBN
Moderator:
Ms. Angelica Esguerra-Petterson, Executive Director, Australia New Zealand Chamber of Commerce Philippines
Topic Discussants:
Ms. Sandra Puno, Director for Communications, Nestle Philippines
Dr. Francisco Tranquilino, Advisor to the Ethics Committee, Pharmaceutical and Healthcare Association of the Philippines
This three part series provides a complete overview of End-to-End Customer Engagement and reveals the proven best-practices of In-Store Clienteling; Multi-Channel Campaign Management and Analytics; Ecommerce and Consumer-Facing Digital Technologies.
PART 1: In-Store Clienteling
The panel of industry experts offer an informative, in-depth look at In-Store Clienteling including:
• Clienteling defined: the trends and technologies of In-Store Clienteling
• Managing the shift in the retailer-customer relationship
• Culture change: techniques for creating a Clienteling culture
• Implementation: best-practices in capturing, converting and keeping customers
• Calculating the ROI: real-world examples
• Clienteling outlook: what the future has in store
This document proposes advanced data analytics as the key solution for building intimate knowledge of our customers’ behaviour, preferences and aspirations, an essential requirement for maximizing revenue in our current competitive environment.
Tim Watson, Independent Email Marketing Consultant
In the recent Econsultancy Email Marketing Census 2012, 49% of email marketers said that the lack of an email strategy was a major barrier to effective email marketing.
Presented to eRum (Budapest), May 2018
There are many common workloads in R that are "embarrassingly parallel": group-by analyses, simulations, and cross-validation of models are just a few examples. In this talk I'll describe the doAzureParallel package, a backend to the "foreach" package that automates the process of spawning a cluster of virtual machines in the Azure cloud to process iterations in parallel. This will include an example of optimizing hyperparameters for a predictive model using the "caret" package.
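The backend-swapping pattern the talk describes can be sketched locally. This is a minimal example, assuming the reader has the `foreach` and `doParallel` packages installed; the local `doParallel` backend stands in for doAzureParallel, which registers a pool of Azure VMs the same way, leaving the `%dopar%` loop itself unchanged.

```r
# A minimal local sketch of the foreach pattern described in the talk.
library(foreach)
library(doParallel)

cl <- makeCluster(2)          # with doAzureParallel: a cluster of Azure VMs
registerDoParallel(cl)        # register the cluster as the foreach backend

# Embarrassingly parallel: each iteration is independent of the others
results <- foreach(i = 1:4, .combine = c) %dopar% {
  mean(rnorm(10000, mean = i))
}

stopCluster(cl)
print(round(results))
```

Because the backend is registered separately from the loop, the same `%dopar%` code scales from a laptop to a cloud cluster without modification.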
By David Smith. Presented at Microsoft Build (Seattle), May 7 2018.
Your data scientists have created predictive models using open-source tools, proprietary software, or some combination of both, and now you are interested in lifting and shifting those models to the cloud. In this talk, I'll describe how data scientists can transition their existing workflows — while using mostly the same tools and processes — to train and deploy machine learning models based on open source frameworks to Azure. I'll provide guidance on keeping connections to data sources up-to-date, evaluating and monitoring models, and deploying applications that make use of those models.
Presentation delivered by David Smith to NY R Conference https://www.rstats.nyc/, April 2018:
Minecraft is an open-world creativity game, and a hit with kids. To get kids interested in learning to program with R, we created the "miner" package. This package is a collection of simple functions that allow you to connect with a Minecraft instance, manipulate the world within by creating blocks and controlling the player, and to detect events within the world and react accordingly.
The miner package is intended mainly for kids, to inspire them to learn R while playing Minecraft. But the development of the package also provides some useful insights into how to build an R package to interface with a persistent API, and how to instruct others on its use. In this talk I'll describe how to set up your own Minecraft server, and how to use and extend the package. I'll also provide a few examples of the package in action in a live Minecraft session.
While Python is a widely-used tool for AI development, in this talk I'll make the case for considering R as a platform for developing models for intelligent applications. Firstly, R provides a first-class experience working with deep learning frameworks through its keras integration. Equally importantly, it provides the most comprehensive suite of statistical data analysis tools, which are extremely useful for many intelligent applications such as transfer learning. I'll give a few high-level examples in this talk, and we'll go into further detail in the accompanying interactive code lab.
There are many common workloads in R that are "embarrassingly parallel": group-by analyses, simulations, and cross-validation of models are just a few examples. In this talk I'll describe several techniques available in R to speed up workloads like these, by running multiple iterations simultaneously, in parallel.
Many of these techniques require the use of a cluster of machines running R, and I'll provide examples of using cloud-based services to provision clusters for parallel computations. In particular, I will describe how you can use the SparklyR package to distribute data manipulations using the dplyr syntax, on a cluster of servers provisioned in the Azure cloud.
Presented by David Smith at Data Day Texas in Austin, January 27 2018.
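The cluster-based approach in the abstract above can be sketched with base R's `parallel` package. This is only an illustrative example: a cloud service would provision the worker nodes, whereas here `makeCluster()` starts local worker processes, but `parLapply()` is used the same way against a remote cluster object. The fold-fitting task is hypothetical.

```r
# A minimal sketch of cluster-based parallelism with base R's parallel package.
library(parallel)

cl <- makeCluster(2)          # locally-spawned workers stand in for cloud nodes

# Cross-validation-style independent tasks: fit one model per "fold"
fits <- parLapply(cl, 1:4, function(seed) {
  set.seed(seed)
  x <- rnorm(100)
  y <- 2 * x + rnorm(100)
  coef(lm(y ~ x))[["x"]]      # estimated slope; the true value is 2
})

stopCluster(cl)
print(unlist(fits))
```

Each task runs in its own R process, so this pattern suits exactly the "embarrassingly parallel" workloads named in the abstract: group-by analyses, simulations, and cross-validation.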
A look at the changing perceptions of R, from the early days of the R project to today. Microsoft sponsor talk, presented by David Smith to the useR!2017 conference in Brussels, July 5 2017.
Predicting Loan Delinquency at One Million Transactions per Second – Revolution Analytics
Real-time applications of predictive models must be able to generate predictions at the rate that transactions are generated. Previously, such applications of models trained using R needed to be converted to other languages like C++ or Java to achieve the required throughput. In this talk, I’ll describe how to use the in-database R processing capabilities of Microsoft R Server to detect fraud in a SQL Server database of loan records at a rate exceeding one million transactions per second. I will also show the process of training the underlying gradient-boosted tree model on a large training set using the out-of-memory algorithms of Microsoft R.
Presented by David Smith at The Data Science Summit, Chicago, April 20 2017.
The ability to independently reproduce results is a critical issue within the scientific community today, and is equally important for collaboration and compliance in business. In this talk, I'll introduce several features available in R that help you make reproducibility a standard part of your data science workflow. The talk will include tips on working with data and files, combining code and output, and managing R's changing package ecosystem.
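A couple of the habits this kind of talk covers can be shown in a few lines; the specific tips in the talk may differ, so treat this as a generic sketch of reproducible-workflow practice in R.

```r
# Fix the random number generator so stochastic results repeat exactly
set.seed(42)
sim1 <- mean(rnorm(1000))

set.seed(42)                  # re-running with the same seed...
sim2 <- mean(rnorm(1000))     # ...reproduces the identical result
stopifnot(identical(sim1, sim2))

# Record the exact R version and loaded packages alongside your results,
# so collaborators can reconstruct the environment later
writeLines(capture.output(sessionInfo()), "session-info.txt")
```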
Presented by David Smith, R Community Lead (Microsoft), at Monktoberfest October 2016.
The value of open source isn’t just in the software itself. The communities that form around open source software provide just as much value and sometimes even more: in ongoing development, in documentation, in support, in marketing, and as a supply of ready-trained employees. Companies that build on open source tend to focus on the software, but neglect communities at their peril.
In this talk, I share some of my experiences in building community for an open-source software company, Revolution Analytics, and perspectives since the acquisition by Microsoft in 2015.
R is more than just a language. Many of the reasons why R has become such a popular tool for data science come from the ecosystem surrounding the R project. R users benefit from the many resources and packages created by the community, while commercial companies (including Microsoft) provide tools to extend and support R, and services to help people use R.
In this talk, I will give an overview of the R Ecosystem and describe how it has been a critical component of R’s success, and include several examples of Microsoft’s contributions to the ecosystem.
(Presented to EARL London, September 2016)
(Presented by David Smith at useR!2016, June 2016. Recording: https://channel9.msdn.com/Events/useR-international-R-User-conference/useR2016/R-at-Microsoft )
Since the acquisition of Revolution Analytics in April 2015, Microsoft has embarked upon a project to build R technology into many Microsoft products, so that developers and data scientists can use the R language and R packages to analyze data in their data centers and in cloud environments.
In this talk I will give an overview (and a demo or two) of how R has been integrated into various Microsoft products. Microsoft data scientists are also big users of R, and I'll describe a couple of examples of R being used to analyze operational data at Microsoft. I'll also share some of my experiences in working with open source projects at Microsoft, and my thoughts on how Microsoft works with open source communities including the R Project.
Hadoop is famously scalable. Cloud Computing is famously scalable. R – the thriving and extensible open source Data Science software – not so much. But what if we seamlessly combined Hadoop, Cloud Computing, and R to create a scalable Data Science platform? Imagine exploring, transforming, modeling, and scoring data at any scale from the comfort of your favorite R environment. Now, imagine calling a simple R function to operationalize your predictive model as a scalable, cloud-based Web Service. Learn how to leverage the magic of Hadoop on-premises or in the cloud to run your R code, thousands of open source R extension packages, and distributed implementations of the most popular machine learning algorithms at scale.
Epistemic Interaction - tuning interfaces to provide information for AI support – Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf – Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
UiPath Test Automation using UiPath Test Suite series, part 4 – DianaGray10
Welcome to part 4 of the UiPath Test Automation using UiPath Test Suite series. In this session, we will cover a Test Manager overview along with the SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Communications Mining Series - Zero to Hero - Session 1 – DianaGray10
This session provides an introduction to UiPath Communication Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How it can help today’s businesses, and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 – Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 – Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... – Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf – 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf – Paige Cruz
Monitoring and observability aren’t traditionally taught in software curricula, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to, plus whatever is part of our current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share foundational concepts to build on.
GridMate - End to end testing is a critical piece to ensure quality and avoid...ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Elevating Tactical DDD Patterns Through Object Calisthenics
The Impact of Big Data On Marketing Analytics (UpStream Software)
1. The Impact of Big Data on Marketing Analytics
FEBRUARY 2013
2. Who we are
Company Overview
• Experienced team with a proven history of solving difficult analytics problems for Fortune 500 companies
• Cloud-based software to manage marketing’s big data problems: customer-level revenue attribution and multi-channel optimization, triggered marketing, and planning and reporting
• Locations: San Francisco, Seattle, and Hyderabad
3. Marketing Analytics Goals
• Identify the most profitable channels for every customer and the most profitable customers for every channel.
• Target the right customers at the right time with the right message.
• Understand what the spend in each marketing channel contributes to sales.
“Advanced Revenue Attribution”
4. Challenges with Multi-Channel Retail
Multi-channel marketers are unsure where to spend their next dollar.
• Messy data: many marketing and order channels, disparate databases, various execution platforms
• No understanding of how spending on marketing affects conversion
• No easy way to identify the most profitable channels for every customer
5. How do you approach the problem?
Enable retailers to conduct customer-level analysis on big data to understand what motivates individuals to buy.
• Assemble and standardize all of a marketer’s data into a Hadoop cluster
• Apply the rigor of a medical researcher with patented methodology
• Identify and attribute the revenue drivers
• Know whom to reach
6. Advanced Revenue Attribution
What is it?
Data-driven, time-to-event statistical modeling used to establish an objective and accurate revenue distribution, all done at the individual user level.
What are Common Attribution Buckets?
• A “Big Data” platform that handles and connects all of a company’s online and offline data (sales, web analytics logs, catalog and email send data, display and search advertising logs, etc.)
• Augment marketing campaign data with supplementary information to correctly distribute variance across all contributing factors (e.g., Customer Driven (store location, seasonal factors), Special Cased (branded search, economic conditions))
How is it different?
• Modeling is done at the customer level
– facilitates micro- and macro-level analyses in tandem, for the most comprehensive insights a marketer can extract
– empowers marketers to customize their strategies at this same granular level
• The focus on modeling time enables the targeting of specific customers with specific treatments at specific times
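The “time-to-event statistical modeling” above borrows the survival-analysis toolkit from biostatistics (the Editor’s Notes make this explicit: the outcome is “buy” instead of “die”). As a purely illustrative sketch with hypothetical data — UpStream’s patented methodology is not public and is not reproduced here — a hand-rolled Kaplan–Meier estimator of the probability that a customer has *not yet* purchased by day t:

```python
# Minimal Kaplan-Meier "time to purchase" sketch (illustrative only).
# Each customer has an observation time in days and an event flag:
# event=1 means they purchased at that time, event=0 means censored
# (the observation window ended before they bought).

def kaplan_meier(times, events):
    """Return [(t, S(t))] survival estimates at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        purchases = 0
        removed = 0
        # group all customers sharing this time (purchases and censored)
        while i < len(data) and data[i][0] == t:
            purchases += data[i][1]
            removed += 1
            i += 1
        if purchases:
            survival *= 1.0 - purchases / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
    return curve

# Hypothetical data: days until first purchase after a marketing touch.
times  = [3, 5, 5, 8, 10, 12, 12, 20]
events = [1, 1, 0, 1,  0,  1,  1,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The real models on slide 10 are GAM survival models with time-dependent covariates, far beyond this estimator, but the core object — a conversion curve over time, respecting censoring — is the same.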
10. Architecture: Hadoop – Revolution Integration
Current State: Revo v6
[Diagram: the UpStream Data Format (UDF) in Hadoop feeds Revolution R, with custom variables passed between them]
UpStream Data Format (UDF):
• ETL
• N marketing channels
• Behavioral variables
• Promotional data per customer
• Overlay data
Revolution R:
• Functions to read Hadoop output; xdf creation
• Exploratory data analysis (PMML)
• GAM survival models
• Scoring for inference
• Scoring for prediction
• 5 billion scores per day
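Per the Editor’s Notes, finished models are pushed back to Hadoop, where scoring runs at roughly 5 billion scores per day per retailer. A hedged sketch of what a Hadoop-Streaming-style scoring mapper can look like — the field layout, variable names, and coefficients below are all hypothetical stand-ins, not UpStream’s:

```python
# Sketch of a Hadoop-Streaming-style scoring mapper (illustrative only:
# the field layout and coefficients are hypothetical, not UpStream's).
# Each input line is a tab-separated customer record; the mapper emits
# customer_id <TAB> score, so billions of rows can be scored in parallel.
import math
import sys

# Coefficients exported from a fitted model (hypothetical values).
COEFS = {"recency_days": -0.02, "email_clicks": 0.3, "catalog_sends": 0.1}
INTERCEPT = -1.5

def score(record):
    """Logistic score for one parsed record dict."""
    z = INTERCEPT + sum(COEFS[k] * record[k] for k in COEFS)
    return 1.0 / (1.0 + math.exp(-z))

def map_line(line):
    """Parse one tab-separated line and emit 'customer_id<TAB>score'."""
    cust_id, recency, clicks, sends = line.rstrip("\n").split("\t")
    rec = {"recency_days": float(recency),
           "email_clicks": float(clicks),
           "catalog_sends": float(sends)}
    return "%s\t%.6f" % (cust_id, score(rec))

if __name__ == "__main__":
    for line in sys.stdin:
        print(map_line(line))
```

Because each record is scored independently, throughput scales with the number of map slots — which is what makes billions of daily scores feasible on a cluster.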
11. Why Revolution R?
We used to prep data and build models with SAS / WPS.
Current hardware: Linux CentOS 6
We switched to Revolution R for the following reasons:
• Cost effective
• Comprehensive and easy-to-use statistical packages (especially familiar for people coming from academia)
• Scale and performance (4x increase with RevoScaleR)
– (RevoScaleR) rxLogit on 36MM rows and 30 variables (full input data is 68MB) runs in under 4 minutes
– Descriptive and modeling functions operate on compressed xdf files to preserve disk space
• Beautiful graphics with a high degree of user control
• Open-source environment enables the best and brightest in both academia and industry to contribute R packages every day; unlimited growth potential
• Ongoing Revolution support – an extremely receptive team to work with
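Functions like rxLogit get their scale by streaming data in chunks rather than loading it all into memory. A minimal Python sketch of that chunked, out-of-core idea — illustrative only; RevoScaleR’s actual algorithm is not reproduced here — fitting a logistic regression by accumulating the full-data gradient one chunk at a time:

```python
# Sketch of the chunked, out-of-core idea behind tools like rxLogit
# (illustrative only; not RevoScaleR's actual algorithm). Data too
# large for memory is processed one chunk at a time: each pass builds
# the full-data gradient from per-chunk pieces, then takes one gradient
# step, so memory use is bounded by the chunk size.
import math

def chunks(rows, size):
    """Yield successive fixed-size chunks (stands in for file reads)."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def fit_logistic(rows, n_features, passes=200, lr=0.5, chunk_size=2):
    """rows: list of (features, label). Returns weights; last slot is
    the intercept."""
    w = [0.0] * (n_features + 1)
    n = len(rows)
    for _ in range(passes):
        grad = [0.0] * (n_features + 1)
        for chunk in chunks(rows, chunk_size):   # stream chunk by chunk
            for x, y in chunk:
                z = w[-1] + sum(wi * xi for wi, xi in zip(w, x))
                p = 1.0 / (1.0 + math.exp(-z))
                for j, xj in enumerate(x):
                    grad[j] += (p - y) * xj
                grad[-1] += p - y
        w = [wi - lr * g / n for wi, g in zip(w, grad)]
    return w

# Tiny hypothetical example: the label turns on as the feature grows.
data = [([0.0], 0), ([1.0], 0), ([2.0], 1), ([3.0], 1)]
w = fit_logistic(data, n_features=1)
p_hi = 1.0 / (1.0 + math.exp(-(w[1] + w[0] * 3.0)))  # P(buy | x=3)
p_lo = 1.0 / (1.0 + math.exp(-(w[1] + w[0] * 0.0)))  # P(buy | x=0)
```

The key property: because each pass only ever holds one chunk in memory plus the running gradient, the same loop handles 36MM rows as easily as four — only wall-clock time changes.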
12. Case Study: Top Multi-Channel Retailer
Attribution Impact
• Presented results that were contrary to the company’s expectation; the client validated the results internally
• Within 3 months, reallocated a $5MM marketing budget to another channel, with more changes to follow
Insights
• Marketing is responsible for ~50% of overall sales (offline and online); the other half is accounted for by the customer’s buying habits and store trade area
• Ecommerce is significantly more influenced by marketing than the retail or call-center channels
• Direct Load: UpStream credits marketing activities that drove user “navigation” to the website
[Chart: attributed share of sales by bucket, Before vs. After (0–180%): Direct Load, Other, Search, Display Remarketing, Customer Driven/Trade Area, Catalog, Email]
13. Case Study: Top Multi-Channel Retailer
Optimization Impact
• Already field-tested head-to-head against an industry-leading model
• +14% lift in response rate
• +$270K in new revenue in a single campaign
• Reallocated marketing circulation: identified the best prospects to not mail – those likely to purchase without receiving a catalog
• Scored 22MM households with 9 models, all in the cloud
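Finding “best prospects to not mail” is the incremental-response idea the Editor’s Notes mention: rank customers by uplift, the gap between their purchase probability with and without the treatment. A hedged sketch with hypothetical scores (the two probabilities would come from models like those above; the names, threshold, and data here are invented for illustration):

```python
# Sketch of the incremental-response idea behind "best prospects to not
# mail" (illustrative only; not UpStream's actual models or threshold).
# Rank customers by estimated uplift: P(buy | mailed) - P(buy | not mailed).

def uplift(p_mailed, p_not_mailed):
    """Incremental purchase probability attributable to the mailing."""
    return p_mailed - p_not_mailed

def suppression_list(customers, threshold=0.02):
    """Customers whose estimated uplift is below threshold: they are
    likely to buy anyway (or not at all), so catalog spend adds little."""
    return [c["id"] for c in customers
            if uplift(c["p_mailed"], c["p_not_mailed"]) < threshold]

# Hypothetical scores for four customers.
customers = [
    {"id": "a", "p_mailed": 0.30, "p_not_mailed": 0.29},  # buys anyway
    {"id": "b", "p_mailed": 0.20, "p_not_mailed": 0.05},  # persuadable
    {"id": "c", "p_mailed": 0.01, "p_not_mailed": 0.01},  # unlikely either way
    {"id": "d", "p_mailed": 0.40, "p_not_mailed": 0.10},  # persuadable
]
print(suppression_list(customers))  # customers to drop from the mailing
```

Dropping low-uplift households is how circulation can shrink while response rate and net revenue both rise, as in the case study numbers above.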
14. Summary
The World is Changing:
• The way customers purchase services is changing
• Managing marketing budgets in the multi-channel world is challenging
• Understanding attribution is critical to deploying your marketing budget successfully
To Be Successful, Your Attribution Solution Should:
• Cover all of your data – both online and offline
• Be statistically rigorous – guesswork doesn’t count
• Be scalable and flexible – make sure you have the right technology platform and tools
16. Example Findings
Google keywords often perform worse than you think
• In many cases 20–40% worse
Display advertising performs better than you think
• Certain types of display, such as retargeting, perform better than you think and can have a strong influence, especially at retail stores – which most attribution tools fail to pick up
Customer loyalty has the most impact at the retail store
• Retail sales are often due to habit and loyalty, but the same trend doesn’t hold online
Retail sales are influenced by the presence of a store near home
• Unfortunately, the same doesn’t hold online: web purchases are not typically driven by having a store nearby
Seasonality is much stronger on the Internet than in retail or the call center
• The impact of seasonal purchasing online is almost double that of retail
Customer tenure shows significant differences
• Newer customers are more sensitive to marketing, seasonal factors, and store area than established customers
Editor's Notes
Tess Nesbitt, Statistician and Senior Consultant at UpStream / Business Researchers
We are a team of number crunchers with backgrounds in econ, math, statistics, physics, astrophysics, business – the whole gamut of scientific and technical disciplines. We started as BRI, a consulting company, but have developed another aspect of the company called UpStream, which has been going for about 2 years and focuses primarily on big data problems for marketing.
We hear the word “multi-channel” used a lot in retail, but it is a pretty ambiguous word. We have two definitions of channel: those on the left-hand side are where you spend marketing budget; those on the right-hand side are where purchases are made. We separate these two out so we can see crossings (how much is email driving store sales? how much is direct mail driving online sales?).
This is an observational data problem. We read in a lot of data: every impression served, every click to the website, every email delivered and clicked on, every catalog, every postcard, and all the order data from every channel as well – we look at the entire gamut of how marketing reaches customers. We tie this data together and later model it; we borrow techniques from biostatistics and medical research and apply them to this data (the outcome is “buy” instead of “die”). Once we understand what drives conversion, we can use that to split up orders among the channels that drove them, and once you understand what drives sales, you can decide what marketing to buy next. So what we are doing is assigning credit for sales to the various types of marketing you are conducting. When we have figured out what drives sales, we move to figuring out how to redirect budget (targeting). Strategic allocation: use this information to make better decisions about how and when to market to customers. Incremental response: see how receptive people are to various types of marketing (reallocate catalog to the customers who are most moved by certain treatments).
We want to understand the co-occurrence of marketing phenomena. Most of these survival analysis techniques are for small data, but we apply them to huge data: the outcome is time-dependent, and the majority of our inputs are time-dependent covariates. Competing risks: the survival framework is designed to handle competing risks – you are exposing people to a cocktail of drugs, and we want to know whether it was the aspirin that killed you.
Assume we have already built a model – what can we do with it? The recency table is in days; sales are in dollars. Retrospectively: at 2 months the email is well below the fold and you aren’t clicking on it (its effect has decayed down to nearly zero), so the catalog gets the credit; the email gets more credit in the second case. We take into account both the amplitude of the effect and its timing.
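The decayed-effect reasoning in this note can be sketched as exponential time-decay weighting – a common attribution heuristic, used here only as an illustration; UpStream’s actual model-based credit assignment (amplitude and timing estimated from the survival models) is not public and is not what this shows:

```python
# Illustrative exponential time-decay attribution (a common heuristic,
# NOT UpStream's model-based method): each touch's weight decays with
# the days elapsed between touch and purchase, so a 2-month-old email
# contributes almost nothing next to a fresh catalog.
import math

def decayed_credit(touches, half_life_days=14.0):
    """touches: list of (channel, days_before_purchase).
    Returns {channel: share of credit}, shares summing to 1."""
    weights = {}
    for channel, days in touches:
        # weight halves every half_life_days
        w = math.exp(-math.log(2) * days / half_life_days)
        weights[channel] = weights.get(channel, 0.0) + w
    total = sum(weights.values())
    return {ch: w / total for ch, w in weights.items()}

# Hypothetical order: an email sent 60 days ago vs. a catalog 7 days ago.
credit = decayed_credit([("email", 60.0), ("catalog", 7.0)])
```

With a 14-day half-life, the 60-day-old email keeps only about 5% of its original weight, so nearly all of the order’s credit lands on the recent catalog – the same qualitative outcome the note describes.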
This distribution is what we are up against, what we are trying to model. It is highly nonlinear. Part of our methodology is to put terms in the model that control for a distribution like this, so we control for it while overlaying the marketing treatments.
We treat UpStream as a scoring system; the same scoring system makes the data for modeling. In Hadoop we do all the ETL: we handle lots of data and files, create behavioral variables (time between purchases, number of purchases, promotional schedule, etc.), and add overlay data such as demographics. We push the data out in a cleansed form for survival modeling. We use Revolution R for exploratory work and modeling; when the models are finished, they are pushed back to Hadoop for scoring – scoring for prediction (lift charts, using the model for selection, etc.) – creating 5 billion scores per day per retailer.
The retailer was double-counting their sales (over 100%), and savvy marketers want to know the incremental effect. These percentages might not be the same if we looked only at web sales, or only at retail, etc.; in this example we have combined all order channels. This is 1 year of data and it is retrospective – could we use this information going forward?