We all know that we need data, and lots of it, but are we really using it in a way that delivers a winning competitive advantage and growth? Do we and our businesses truly know which advanced data collection technologies and methods are available to increase revenue, or are we leaving money on the table?
In this masterclass, Jason Tan, a data analytics industry leader, together with Bright Data, the leading SaaS platform for online data collection, will demonstrate how to incorporate competitor data into an analytics platform and optimise revenue.
So much so that you never leave any money on the table.
During this brief, business-focused session, Jason will present verified use cases from the e-commerce and insurance sectors and show you the five steps required to develop a revenue optimisation platform.
By attending this masterclass, you will walk away knowing how to embed analytics directly into your business frontline and make better, more profitable decisions based on real-time or near-real-time actionable data.
Originally presented live alongside Bright Data in 2021
Presentation can be viewed here:
https://www.youtube.com/watch?v=Pjq_bETnbvo
Automate and Optimise with a Competitor Pricing Engine - CDO Jakarta - Jason Tan
1. A five-step process for developing a revenue optimisation platform
2. Reviewing verified use cases in the E-commerce and insurance sectors
3. Why you should start to embed analytics directly into your business frontline
Originally presented live at Chief Data Officer Jakarta 2021
Embedding Analytics into the Frontline to Optimise Revenue - DSC Europe - Jason Tan
Here is what I’m going to show you in this presentation:
1. How you can build an independent analytics platform and use it to complement the legacy system
2. How to embed this analytics platform to optimise the customer experience and revenue
3. My five-step implementation to put everything together
By the end of this presentation, I hope you can walk away with the ideas and aspirations you need to achieve all three of these points.
Originally presented live at Data Science Conference Europe 2021
Automate and Optimise with a Competitor Pricing Engine - Google - Jason Tan
Many businesses are now building up their data and analytics capabilities to drive growth and a winning competitive advantage. However, for too long, the intelligence generated by countless analytics and dashboards has never made it beyond the PowerPoint packs.
In this event, Jason Tan will demonstrate how to:
- Incorporate competitor data into the analytics models and optimise revenue
- Embed analytics directly into the business frontline
- Develop an independent analytics platform to complement the legacy system
Originally presented live to a Google team
Building a Marketing Data Warehouse from Scratch - SMX Advanced 202 - Christopher Gutknecht
This deck covers the journey of starting with BigQuery, adding more data sources, and building a process around your data warehouse. It covers the three phases (greenfield, dashboards, and operational analytics) and the necessary data components.
The code for uploading your product feed can be found here:
https://gist.github.com/ChrisGutknecht/fde93092e21039299ab76715596eac01
If you have any questions, reach out to me on LinkedIn!
Revolutionising Storage for your Future Business Requirements - NetApp
Non-disruptive operations, efficiency, and seamless scale are all topics of discussion for organisations facing challenging growth in the volumes of data they store. In this session, Julian Wheeler, NetApp Channel SE Manager, investigates new storage infrastructures that enable you to manage growth, scale, and efficiency while improving the service to the business.
Learn more - http://www.talend.com/products/talend-6
When you’re ready to move to Big Data, connect in the cloud, and across the Internet of Things, Talend 6 streamlines the process. Convert traditional data integration jobs and MapReduce jobs to Spark with the click of a button, and realize the potential of real-time data-driven decision making. Learn more about Talend and Spark.
Talend 6 also brings continuous delivery, MDM REST API, plus data masking and semantic discovery to our products.
Building Data Products with BigQuery for PPC and SEO (SMX 2022) - Christopher Gutknecht
In this data management session, Christopher describes how to build robust and reliable data products in BigQuery and dbt for PPC and SEO use cases. After an introduction to the modern data stack, six principles of reliable data products are presented, followed by these use cases:
- Google Ads Conversion upload
- SEO sitemap efficiency report
- Google Shopping product rating sync
- Large-Scale link checker with advertools
- Inventory-based PPC campaigns with dbt
Here is the referenced selection of gists on GitHub: https://gist.github.com/ChrisGutknecht
Understanding DataOps and Its Impact on Application Quality - DevOps.com
Modern applications are data-driven and data-rich. The infrastructure your backends run on is a critical aspect of your environment and requires unique monitoring tools and techniques. In this webinar, learn what DataOps is and how critical good DataOps is to the integrity of your application. Intelligent APM for your data is critical to the success of modern applications. In this webinar, you will learn:
The power of APM tailored for Data Operations
The importance of visibility into your data infrastructure
How AIOps makes DataOps actionable
A basic introduction to BigQuery, how it works, and what it can do, including a use case that combines Google Analytics and CRM data to create a powerful remarketing list.
Data Con LA 2022 - Why Data Quality vigilance requires an End-to-End, Automat... - Data Con LA
Curtis ODell, Global Director Data Integrity at Tricentis
Join me to learn about a new end-to-end data testing approach designed for modern data pipelines that fills dangerous gaps left by traditional data management tools—one designed to handle structured and unstructured data from any source. You'll hear how you can use unique automation technology to reach up to 90 percent test coverage rates and deliver trustworthy analytical and operational data at scale. Several real world use cases from major banks/finance, insurance, health analytics, and Snowflake examples will be presented.
Key Learning Objectives
1. Data journeys are complex, and you have to ensure the integrity of the data end to end across this journey, from source to final reporting, for compliance
2. Data management tools do not test data; at best they profile and monitor, leaving serious gaps in your data testing coverage
3. Automation integrated with DevOps and DataOps CI/CD processes is key to solving this
4. How this approach applies in your vertical
Data Integration and Marketing Attribution - ROIVENUE™
Microsoft and ROIVENUE™ have teamed up to provide a glimpse into the benefits of integrating all your marketing data: the latest advancements in data management powered by Azure, and how ROIVENUE™ helps marketers identify where best to allocate their digital spend with its Marketing Attribution models and Budget Optimizer™.
Power to the People: A Stack to Empower Every User to Make Data-Driven Decisions - Looker
Infectious Media runs on data. But, as an ad-tech company that records hundreds of thousands of web events per second, they have to deal with data at a scale not seen by most companies. You cannot make decisions with data when people need to write SQL by hand, only for queries to take 10-20 minutes to return. Infectious Media made the switch to Google BigQuery and Looker, and now every member of every team can get the data they need in seconds.
Infectious Media shares:
- Why they chose their current stack
- Why faster data means happier customers
- Advantages and practical implications of storing and processing that much data
Check out the recording at https://info.looker.com/h/i/308848878-power-to-the-people-a-stack-to-empower-every-user-to-make-data-driven-decisions
In the age of IoT, almost everyone is talking about data lakes. For the most part, we all agree on the value data lakes deliver, but beyond this conceptual agreement, there are still many practical questions that need answers. The key to success comes down to how data lakes are implemented and managed.
Chuck Yarbrough outlines the five keys to creating a data lake, along with strategies for defining, ingesting, governing, managing, and analyzing it in ways that enable transformative benefits in IoT and other use cases. This session shows how real-world data lake implementations are changing the world. Chuck focuses on automation of the data lake, from ingesting data to managing metadata at scale and applying machine learning to drive significant results. Along the way, Chuck explores tools and procedures that help create a well-organized, well-governed, and well-managed data lake, without the risk of creating a dreaded data swamp. You'll leave armed with the five keys to successfully creating and managing a killer data lake.
How Does the Denodo Platform Accelerate Your Time to Insights? - Denodo
Watch full webinar here: https://bit.ly/3PRcuby
In this demo session, we will illustrate the power of Denodo and delve into how Denodo helps organisations make sense of disparate silos of data. We will demonstrate the Denodo advanced data catalog and our AI/ML features that help organizations democratize and govern their data.
How Does the Denodo Platform Accelerate Your Time to Insights? (German-language webinar) - Denodo
Watch full webinar here: https://bit.ly/3ayILnx
In this demo session, we will illustrate the power of Denodo and delve into how Denodo helps organisations make sense of disparate silos of data. We will demonstrate the Denodo advanced data catalog and our AI/ML features that help organizations democratize and govern their data.
Inside 6 Dimensional Model for Industry 4.0 Smart Factory by Webonise - Webonise Lab
Webonise uses a six-dimensional approach to help manufacturing companies on their path to a smart factory in the Fourth Industrial Revolution: a deep dive into pure tech adoption strategies plus a data-driven play for future factories.
Presented at 3|SHARE's EVOLVE'15 - The Adobe Experience Manager Community Summit on August 18th, 2015 at the Hard Rock Hotel in San Diego, CA. http://evolve.3sharecorp.com
Use cases for Hadoop and Big Data Analytics - InfoSphere BigInsights - Gord Sissons
This presentation is from TDWI's event in Boston in the summer of 2014. IBM InfoSphere BigInsights is IBM's enterprise-grade Hadoop offering. It combines the best of open-source Hadoop with advanced capabilities, including Big SQL, that clients can optionally deploy to get to market faster with a variety of big data and analytics applications.
Adaptiva provides innovative IT efficiency and PC power management software solutions. A Microsoft Partner, Adaptiva extends Microsoft System Center Configuration Manager providing revolutionary hierarchy simplification, client health assurance and PC energy management.
Adaptiva helps users decrease the environmental impact of desktop computing, increase desktop manageability and end-user productivity, and significantly reduce IT costs. Adaptiva solutions leverage existing IT resources to enhance the operation of Configuration Manager without disrupting service to end users. Based in Bellevue, Washington, Adaptiva is deployed on more than 1 million devices globally and sold directly and through a network of partners and channel resellers.
How to Build a Diversified Investment Portfolio.pdf - Trims Creators
Building a diversified investment portfolio is a fundamental strategy to manage risk and optimize returns. For both novice and experienced investors, diversification offers a pathway to a more stable and resilient financial future. Here’s an in-depth guide on how to create and maintain a well-diversified investment portfolio.
11. Ethical data collection
What makes a data collection platform ethical?
● Leads with transparency
● Preserves the digital ecosystem: matches data collection scale with website capabilities (don't be a bad bot)
● Abides by global regulations (GDPR, CCPA)
Thank you, Anna, for the kind introduction. Hi, my name is Jason Tan, and I embed data science directly into the business frontline.
Today, I’m going to show you why a revenue optimisation engine can make your organisation more data-driven and how to use it to optimise revenue.
Here is what I’m going to show you today.
What a world-class revenue optimisation engine used by the leading insurers and retailers looks like
How to use a five-module implementation to create a revenue optimisation engine
Why this will boost your revenue by at least 10%
As you probably already know, platforms like Amazon and Alibaba are optimising their prices for every customer. They embrace a data-driven philosophy, using internal and external data to market and sell products to their customers.
Now you can achieve the same results. Thanks to advances in cloud and analytics-as-a-service platforms, you can start implementing the same solution within six months.
I want to tell you the back story of how the insurers and I have been doing this for years and, most importantly, how you can implement this concept in your business.
So back in the mid-2000s, I worked as a pricing analyst at one of the largest insurance companies here in Australia.
We hired dozens of backpackers to collect data for us manually. They would sit in front of a computer and get thousands of insurance quotes from competitor websites.
The actuaries would then reverse engineer the quotes to work out what the other insurers were charging for each rating factor: gender, age, vehicle brand, claim count, and all the other essential factors.
These are the factors that make up the final premium you pay for your insurance. Once we had that sorted, we would simulate how much the competitors would be charging our customers.
Together with other data, this information was then fed into the pricing optimisation platform to generate the optimal price before customers renewed their policies.
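The simulation step described above can be sketched as a base premium multiplied by a table of rating-factor relativities. This is a minimal illustration; the factor names and multipliers below are invented for the example, not real market figures.

```python
# Hypothetical rating-factor relativities reverse-engineered from
# competitor quotes. All numbers here are illustrative only.
BASE_PREMIUM = 400.0

FACTOR_TABLES = {
    "age_band":      {"18-24": 1.80, "25-39": 1.10, "40-64": 0.95, "65+": 1.05},
    "vehicle_brand": {"toyota": 1.00, "bmw": 1.30, "holden": 1.05},
    "claim_count":   {0: 0.90, 1: 1.15, 2: 1.45},
}

def simulate_competitor_premium(profile: dict) -> float:
    """Multiply the base premium by the relativity for each rating factor."""
    premium = BASE_PREMIUM
    for factor, table in FACTOR_TABLES.items():
        premium *= table[profile[factor]]
    return round(premium, 2)

# A 25-39 year old Toyota driver with no claims:
# 400 * 1.10 * 1.00 * 0.90 = 396.00
simulate_competitor_premium(
    {"age_band": "25-39", "vehicle_brand": "toyota", "claim_count": 0}
)
```

With one such table per competitor, you can price any customer profile against the whole market before renewal.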
We were pleasantly surprised by how often, thanks to the premium branding we held in the market, we could increase the price and still not leave any money on the table.
I have since implemented this for my own retail business and for my corporate clients.
Imagine you are the Chief Revenue Officer of this retailing company called Running Warehouse. You have an online store and also a few physical stores.
You proudly ship to over 20 countries worldwide, and it’s truly an international business.
The primary products you sell are running shoes for runners. Like many other retailers, you compete with both physical and online stores.
As for consumers, they often visit multiple websites before they shop, especially if the products are identical.
For example, these Ultra Boost 21 running shoes from Adidas are trendy, and they are one of my favourites.
At Running Warehouse, you are currently selling them at $219.96, and there is still stock in all sizes. It's a pretty good price. Let's look at a few other competitors' stores and see how much they are charging.
Adidas is selling at $270.00 at their store. And that is $50.04 dearer, which is 23% more than what you are charging.
Do you agree that is quite a big difference? I certainly do. Although, I think Adidas can charge this price and get away with it because of their market positioning.
How about the largest sporting goods retailer in Australia, Rebel Sport?
It looks like it is one cent cheaper than Adidas.
Finally, let us see how much they are selling for at Amazon: $244.53 plus $26.64 in delivery fees.
Assuming the buyer does not have Amazon Prime, that is $271.17, which makes it the most expensive of all. That is unbelievable! Did you expect that?
However, if we look closer at what Amazon is charging, I think it is evident how Amazon optimises its price and revenue.
Without the delivery fees, $244.53 sits between Running Warehouse and all the other major retailers, allowing Amazon to win the sale from the other big-box retailers.
At the same time, they use delivery fees as a hook to sign people up to Amazon Prime for free and fast shipping all year.
Finally, no human would come up with a price ending in 53 cents. It is the optimisation engine dictating the price, as you might have guessed.
What have we learnt so far? Running Warehouse is clearly the cheapest of them all. On average, you are charging $51 less than your competitors.
You're leaving so much money on the table. So the questions we want to ask are:
-> Would you always be aware you are selling at a price that's 23% below everyone else?
-> What if you could sell it at a higher price but remain competitive?
-> What if you could monitor and adjust the price for all the products automatically?
I am sure your answer to those questions is a simple yes. Using modern cloud platforms, you, too, can develop a revenue optimisation engine like Amazon.
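At its core, the monitoring behind such an engine is just a running comparison of your price against the market. Here is a toy gap report in Python, using the Ultra Boost prices from the example above; the 10% alert threshold is an arbitrary choice for the sketch.

```python
def price_gap_report(our_price: float, competitor_prices: list[float]) -> dict:
    """Compare our price with the competitor average and flag underpricing."""
    avg = sum(competitor_prices) / len(competitor_prices)
    gap = avg - our_price
    return {
        "competitor_avg": round(avg, 2),
        "gap": round(gap, 2),
        "gap_pct": round(gap / avg * 100, 1),
        "underpriced": gap / avg > 0.10,  # arbitrary 10% alert threshold
    }

# Running Warehouse vs Adidas, Rebel Sport and Amazon (incl. delivery):
price_gap_report(219.96, [270.00, 269.99, 271.17])
# {'competitor_avg': 270.39, 'gap': 50.43, 'gap_pct': 18.6, 'underpriced': True}
```

Run over every product every day, a report like this answers the first question automatically; the pricing modules then act on the flagged gaps.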
To do that you will need five modules, and we will go through each of them in this masterclass.
The first module is to collect external data, such as product, colour, price, location, from your competitors.
You can choose to include as many or as few of the competitors as you like, but make sure you have the key competitors.
To collect this external data, we need a mechanism to manage it at scale. The good news is that it has never been easier, and you won't have to recruit an army of backpackers to do the job manually.
Instead, you can rely on modern data collection platforms like Bright Data and their infrastructure to scale up and down according to your needs.
As we are in 2021, many companies and people are well aware of web scraping. Because of that, it’s a lot harder to scrape the data these days.
But, it is entirely possible to solve all of these challenges.
Let’s start with two main issues you could face if you are building your own web scraping solution.
1 - The websites will quickly block your IP address even when dealing with public web data
2 - They could mislead you by showing you different information
These are the reasons why we often recommend Bright Data to all our clients.
With Bright Data, you can rely on a versatile data collection tool and a residential proxy network. This means that your data collection activity will never be misled or blocked by the websites.
In addition, you don’t have to worry about maintaining any infrastructure whilst controlling your costs based on your metered usage.
Even better, you can get up and running straight away by using their ready-made templates. If, however, the website you want to access is not available as a template, Bright Data will create one for you in a matter of days.
When working with an online data collection platform, the ethical aspect is just as crucial as business performance.
Make sure you address the following:
-> Commitment to transparency - don’t be afraid to look behind the scenes at the processes and platform itself
-> Preserve the digital ecosystem - While collecting online data, you don’t want to overload the website with too many requests - or in short - don’t be a bad bot.
A good analogy is fishing: if you catch too many fish, you cause an ecological disaster. The same goes for the digital ecosystem.
-> Lastly, always check whether the platform you are using abides by international regulations
With Bright Data, we can now monitor and keep ourselves up to date with our competitors' offers, and incorporate this critical information to optimise product pricing.
Remember, we are not merely mimicking what the competitors are charging. Instead, we will take it further and optimise the revenue in the subsequent modules.
In module 2, we need to map and translate the external data to our products. These include the brand name, model, colour, delivery fees, and potentially location.
This will make sure we can compare all the products like for like.
In other more complex scenarios, the products are not necessarily identical. For example, general insurance products are one of them.
This is when you need to have the domain expert providing the knowledge to map and translate the data.
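For the simpler, like-for-like case, the mapping can be sketched as a normalised lookup key. The product names, colours, and SKU codes below are made up for illustration; the more complex cases, such as insurance products, would replace this lookup with rules provided by the domain expert.

```python
# Hypothetical sketch: map competitor offers onto our own catalogue by
# normalising the attributes we match on (brand, model, colour).

def normalise(offer: dict) -> tuple:
    """Build a comparison key so 'ADIDAS ' and 'adidas' match."""
    return tuple(offer[k].strip().lower() for k in ("brand", "model", "colour"))

# Our own products, keyed by the normalised attributes.
OUR_CATALOGUE = {
    normalise({"brand": "Adidas", "model": "Ultraboost 21", "colour": "Black"}): "SKU-1001",
}

def map_offer_to_sku(offer: dict):
    """Return our SKU for a like-for-like competitor offer, or None."""
    return OUR_CATALOGUE.get(normalise(offer))

competitor_offer = {"brand": "ADIDAS", "model": "ultraboost 21",
                    "colour": "black", "price": 271.17}
print(map_offer_to_sku(competitor_offer))  # SKU-1001
```

The point of the normalised key is exactly the like-for-like guarantee: two offers compare only if every matching attribute agrees after cleaning.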
When it comes to price optimisation, there are a few things we should always have in place to safeguard the business and avoid catastrophic outcomes.
The first one is the ceiling price and the second one is the floor price. A ceiling price is the maximum amount you would want to charge for a product.
In the retail industry, you wouldn't want to deviate too much from the regular price. For example, $100 dearer than everyone else for the same product.
As for the floor price, it is the minimum price you want to charge for a product. Unless it's a loss leader, you do not wish to sell it at below cost.
In our case, our ceiling price is $271.17 plus a 5% loading, which is about $284.73, depending on where you want to position yourself in the market. Our floor price is $219.96, which is our original price.
So basically, you can find an optimal number between the ceiling and floor price but you would not want to go outside these two numbers.
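These two safeguards can be expressed in a couple of lines, using the figures from this example. This is a minimal sketch, not a full pricing engine; the function name is my own.

```python
# Safeguards from this example: never price below the floor or above the ceiling.
FLOOR_PRICE = 219.96                     # our original price
CEILING_PRICE = round(271.17 * 1.05, 2)  # dearest competitor + 5% loading = 284.73

def safeguard(candidate: float) -> float:
    """Clamp a candidate price into the [floor, ceiling] band."""
    return max(FLOOR_PRICE, min(CEILING_PRICE, candidate))

print(safeguard(300.00))  # 284.73 - capped at the ceiling
print(safeguard(199.99))  # 219.96 - lifted to the floor
print(safeguard(257.96))  # 257.96 - already inside the band
```

Whatever the downstream models suggest, every price passes through this clamp last, so a modelling error can never become a catastrophic one.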
Other than that, you would also want to consider if you’re an online pure-play or you have both an online and offline presence.
If you have offline stores, what are the locations and what is your typical consumer demographics visiting the store?
Is it at a premium high street location where they are generally affluent shoppers?
And finally, how do you design your online stores and customer experience for your most loyal shoppers?
For example, do you have a way to actively encourage the shoppers to always log in for a personalised experience?
If you do, you will know your customers so much better.
You will have more opportunities to create a great experience for them, make more upsells, and increase their loyalty.
There are a few ways to achieve this, but this topic alone is worthy of another discussion for another day.
After we have optimised the price, the next step is optimising your revenue.
And this is the holy grail of a data-driven revenue optimisation platform. You can make it very advanced by including many complex models using both internal and external data.
Generally, I would advise starting simple and gradually increasing the model's complexity over time.
Price is one of the crucial factors, but it’s not the only factor. There are many factors to consider, such as the demand, inventory, clearance rate, and locality.
Overall, this is where you want to optimise your revenue for the business with the data you have.
By considering all these factors, you can then design a strategy to maximise the results.
One important note about this part here is, you don’t have to have all of them from day 1. Instead, you can add or subtract any of them as you see fit.
The most important thing about this is how you design the platform to be able to add more or remove any of them quickly. Like a switch.
Remember, the last thing you want to have is that anytime you make an update, you will need to engage a team of engineers to make the changes that could take months to implement.
Instead, you want to move faster than your competitors. You want to make the changes in a matter of days or even hours.
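The "like a switch" design above can be sketched as a registry of small, independent factor functions. The specific factors and their adjustments here are invented for illustration; the point is only the shape, where each factor can be enabled or disabled without engaging a team of engineers.

```python
# Hypothetical sketch: each pricing factor is a small function you can
# switch on or off without re-engineering the platform.

def demand_factor(price, context):
    # Nudge the price up when demand is high (illustrative adjustment).
    return price * 1.05 if context.get("demand") == "high" else price

def clearance_factor(price, context):
    # Discount lines flagged for clearance (illustrative adjustment).
    return price * 0.80 if context.get("clearance") else price

# Flip a factor on or off like a switch - no code changes elsewhere.
FACTORS = {
    "demand": (demand_factor, True),
    "clearance": (clearance_factor, False),  # disabled today
}

def optimise(base_price, context):
    price = base_price
    for factor, enabled in FACTORS.values():
        if enabled:
            price = factor(price, context)
    return round(price, 2)

print(optimise(219.96, {"demand": "high", "clearance": True}))  # 230.96
```

Because the clearance switch is off, that flag in the context is simply ignored; turning it back on is a one-line configuration change, which is exactly the days-or-hours turnaround you want.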
In our experience, this very last step can be a showstopper at some companies.
Like many other companies, they have a legacy system from prominent software vendors, with various customisations made to make the IT system work for their business.
However, these legacy IT systems do not necessarily support the data-driven concepts of modern IT architecture. As a result, they cannot adapt to changes at speed.
So the delay or complexity of updating the legacy system can prevent all of these models from ever realising their full potential. But don't worry, I can assure you that it is entirely possible to work with the IT team to get the architecture design right.
Once we have optimised the price and revenue, we will now have an entirely new set of pricing for each distribution channel that achieves the optimal results.
It includes a personalised price for every known shopper, a competitive price on the website, and store prices for each location.
This brings us to the final step. It’s about updating the analytical results into the system automatically.
If you have a modern web-based system, it is very likely that you already have an API, and we can easily update the price through it.
If you have a legacy system, we can work with the IT team to push the data.
Finally, we want to streamline the changes with the team members at the brick and mortar stores.
You may also want to provide them with training on responding to the customers who question the price discrepancy.
With all that said, the key here is to have as little human intervention as possible. Getting this step right is critical because you will minimise the human error, costs, and resources involved in a manual update. Instead, automation increases the speed of delivering and optimising the results. You can tune the system to run daily, weekly, or monthly; from our experience, a fortnightly or monthly run is suitable for most businesses.
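The automated update step can be sketched as below. The endpoint path, payload shape, and client are all hypothetical, and a stub stands in for a real HTTP client so the sketch is self-contained; a real integration would use your commerce platform's own API.

```python
# Hypothetical sketch: package the optimised prices and push them to the
# commerce system with no human in the loop.

def build_price_updates(optimised_prices: dict) -> list:
    """Turn {sku: price} into the update records the (assumed) API expects."""
    return [{"sku": sku, "price": price, "currency": "AUD"}
            for sku, price in sorted(optimised_prices.items())]

def push_updates(updates: list, client) -> int:
    """Send each update; return how many succeeded."""
    return sum(1 for u in updates if client.put(f"/products/{u['sku']}/price", u))

class StubClient:
    """Stands in for a real HTTP client (e.g. requests.Session)."""
    def put(self, path, payload):
        print(f"PUT {path} -> {payload['price']}")
        return True

updates = build_price_updates({"SKU-1001": 257.96})
print(push_updates(updates, StubClient()))  # 1
```

The same script would then be scheduled, fortnightly or monthly as suggested above, so the whole loop from collection to update runs without manual effort.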
Now, let’s look at the result from the example we have used in this masterclass.
Before the optimisation, the original price was $219.96. Based on the external data we have collected, we know there is room for a price uplift.
On average, we are now charging an extra $38, or 17% more than before, whilst still remaining competitive.
Bear in mind, the above is only one product. Imagine the financial impact you can make when you replicate this across the entire portfolio.
From a cost perspective, how many FTEs could we save, and how much human error could we avoid?
Most importantly, you will never have to worry about competitors undercutting your price, because you are constantly monitoring them and staying one step ahead.
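The uplift figures in this example can be checked with a few lines, using only the numbers already quoted above.

```python
# Sanity-check the result figures from this example.
original = 219.96          # price before optimisation
uplift = 38.00             # average extra charged per sale
optimised = original + uplift

print(round(optimised, 2))               # 257.96
print(round(uplift / original * 100))    # 17 (per cent uplift)
```

The optimised price also sits comfortably inside the floor of $219.96 and the ceiling of $284.73 defined earlier, so the safeguards were never triggered.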
I know from the registration details, there are people from different industries.
So some of you are wondering if this applies to your industry.
From our experience and research, many companies in the industries you see on the screen have already benefited from such a revenue optimisation platform.
We are almost at the end of the masterclass.
Let me briefly restate the main points we discussed today so you can take them back to the office and start implementing them to future-proof your business.
1 - Modern data collection with Bright Data
2 - Map & translate the external data accordingly
3 - Optimise product pricing with a competitor pricing engine
4 - Optimise portfolio revenue with more data and analytic models
5 - Update your result with an automated process into the system
Before I let you go, I want to quote what Jeff Bezos said back in 2000.
“If we want to have 20 million customers, then we want to have 20 million stores.”
The moral of this quote is that if this concept and framework are among the critical things that made Bezos the wealthiest person in the world, why do we hesitate and not do it at our own organisations?
It makes no sense.
And remember, developing and maintaining a revenue optimisation platform is easier than ever before. You don’t need hundreds of engineers like Amazon once did to build every single element from scratch.
Instead, you can take advantage of technology platforms like Bright Data and expertise like DDA. The cost of such a platform is lower and more accessible than ever.
What Amazon spent years building in the past, you could now achieve yourself in six months at a fraction of the cost. If you need assistance, call Bright Data and DDA.
Finally, we have some special gifts to thank you for attending this masterclass.
This week or next, contact Bright Data and claim your $50 bonus to begin your Data Collector journey.
Contact me or DDA to receive a tailored report on whether your business can benefit from a revenue optimisation engine.
So, take action and take action now.
Thank you!