What are the common assumptions about A/B (split) testing that are wrong? What lies are told by vendors and consultants, and what have you convinced yourself of? What is illusory, what can you trust, and what is testing really all about? Twenty top myths, debunked after asking fellow CRO professionals what is on THEIR top list.
Myths and Illusions of Cross Device Testing - Elite Camp June 2015Craig Sullivan
A compendium of the most common mistakes and problems people encounter when trying to optimise or split test cross device experiences (mobile, tablet, desktop, app, tv etc.)
Surviving the hype cycle Shortcuts to split testing successCraig Sullivan
In this talk, I show the key shortcuts to stop doing stupid testing and move towards innovative and transformative design & build methodologies, including innovation through split testing exploration
Condensed testing syrup - @OptimiseorDie @sydney sep 2011 - 4 years of testin...Craig Sullivan
A summary of my 4 years of A/B and Split testing, with case studies of work, photography guidelines, and key advice on which elements of the page to test for quick wins.
I enjoyed giving this talk at the Online Retailer Conference in Sydney, which is a fine place to visit.
The presentation covers a really good 'pizza' analogy for explaining testing to senior management and budget holders, then moves on to how you discover good places to test on your site, and which tools will help you get that data.
Lastly, I explore what worked for me in testing, show some examples of how similar our winners are across the globe and then cover some cross channel testing. The last one here is a big growth area and involves optimising contact centres and channels, using the web as a tool. Some interesting work going on here and I show some of ours, as well as new things in the pipeline.
There are some great resources attached, including a list of remote user testing services and the best 'guides' I could find on 'Conversion Rate Optimisation'. Hope you enjoyed the talk and thank you Sydney.
#Measurecamp : 18 Simple Ways to F*** up Your AB TestingCraig Sullivan
An expanded deck of the top 18 blockers to getting successful AB or Multivariate test results. In this deck, you get a complete checklist of the stuff you need to prepare, watch, launch and monitor your testing, so it gets you the *right* conclusions.
Slides to go with a talk on rapid, lightweight research you can do before tackling a landing page, funnel step or lead-gen form. Comes complete with all the Google Analytics reports you'll need to mine useful data to share! Less bullshit, more truth in meetings!
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015Craig Sullivan
My Slides from Reaktor Breakpoint 2015 - This is by far the best deck (and hopefully talk) I've done this year. Masses of info, reading, articles, useful reports and more.
Web Analytics Wednesday - Session Replay Tools are VitalCraig Sullivan
Session Replay or Screen Recording tools are now part of an arsenal of discovery toolkits that can drive optimisation, bug fixes, funnel and journey analysis - using qual and quant techniques. Without these tools, the analytics data misses emotion, frustration, friction and more - I've collated the best tips, tricks, tools and approaches to yield the most valuable insights for CRO / Growth Hacking.
Product design is Poo - And how to fix it!Craig Sullivan
A look at why product design is still so poor, even after 22 years of digital design work. Why do these problems exist and how can we remove them from the way we build products? Lean corporate and startup growth models are explored in the solutions to this horrendous problem!
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...Craig Sullivan
A roundup of all the things to help you maintain a competitive edge in experience design and conversion optimisation. With examples of companies putting this stuff together, the tools they are using and their project management approaches, this presentation delves deeper into the cultural aspects of CRO.
The top reasons and solutions for not getting value out of your AB tests - some practical tips for designing insightful and correctly instrumented tests.
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to MeCraig Sullivan
An updated deck of a short talk (30m) given at the first Brighton CRO meetup. Contains useful AB testing tools as well as full speaker notes for most of the slides.
Mobile presentation - Sydney Online Retailer - 26 Sep 2011Craig Sullivan
In this presentation, I use analytics data from our global mobile reach, to illustrate the trends that are driving growth, how to take opportunity from them and what to do with your own site. I present a case for device and user knowledge, to allow you to optimise conversion rates, revenue and delight for visitors.
Cross Device Optimisation - Google Analytics ShortcutsCraig Sullivan
In this session, we explain how to mine GA for broken device experiences, flows, funnel blocks and more... Using a new grid tool we've developed, you can pull multi-dimensional segmented funnel and metric data from Google Analytics - we explain how it works, why you need it and what problems it solves. Find where your site is leaking money through data
20 Ways to Shaft your Split Testing : Conversion ConferenceCraig Sullivan
This talk is the latest deck showing common problems that will easily break or skew your ab and multivariate testing results. Avoid these problems by following the simple advice in this deck!
Product Design is Poo - And we're all going to dieCraig Sullivan
A humorous presentation about what is wrong with the current way of building digital products. Showing what is wrong, explaining the signs and giving you a checklist for reforming your company - are laid out with links, resources and further reading.
Onboard like a juggernaut - Elite camp 2015Conversionista
Conversionista's presentation at Digital EliteCamp in Estonia, 2015.
Find the critical conversion points in your SaaS user onboarding journey.
- Get Registered Prospects to use the service
- Get Active users to pay up
- Get Paying Customers to stay
- Get churned users to return
- Get all of them to refer more users
Do this - And rock!
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014Craig Sullivan
A practical toolkit for getting inside customers' heads, in order to design and create persuasive psychological approaches to copy, pages, buttons, designs and your entire service. Craig shows you how to mine what you already have - to design a better bank balance and a continuously improving future for your company, staff and customers.
Why Does My Conversion Rate Suck? Craig Sullivan, Senior Optimisation Consult...PRWD
Craig Sullivan, Senior Optimisation Consultant, covers the top 10 reasons why your conversion rate might suck. Packed with actionable tips and resources, this presentation is for anyone wanting to improve their conversion optimisation. Craig covers common problems and topic areas such as issues with Google Analytics setup, inputs, tools, testing, testing cycles, product cycles, photo UX, how to analyse statistics and data, segmentation, and multi-channel optimisation. The resource pack also includes a maturity model, crowd-sourced UX, collaborative tools, testing tools for CRO & QA, a Belron methodology example, and CRO and testing resources.
SAMPLE SIZE – The indispensable A/B test calculation that you’re not makingZack Notes
If you’re a marketer, it’s very likely that you’ve run an A/B test. It’s also likely that you’ve never calculated the sample size for your tests, and instead run tests until they reach statistical significance. If so, your strategy is statistically flawed. Committing to a sample size requires marketers to wait longer for test results, but choosing to ignore it will produce false positives and lead to bad decisions.
This deck was created for an email audience, but there are valuable lessons for anyone who runs A/B tests.
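The arithmetic behind this warning is simple to sketch. In the example below, the figures (5% baseline conversion, a 10% relative lift to detect, alpha = 0.05, 80% power) are invented for illustration and are not from the deck; the formula is the standard two-proportion sample-size calculation:

```python
# Sketch: minimum sample size per variant for a two-proportion A/B test.
# Assumed illustration figures (not from the deck): 5% baseline conversion,
# detecting a 10% relative lift (5.0% -> 5.5%), alpha = 0.05, power = 0.80.
from statistics import NormalDist

def sample_size_per_variant(p_base, p_var, alpha=0.05, power=0.80):
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    p_bar = (p_base + p_var) / 2                   # average proportion
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p_base * (1 - p_base)
                             + p_var * (1 - p_var)) ** 0.5) ** 2
    return numerator / (p_base - p_var) ** 2

n = sample_size_per_variant(0.05, 0.055)
print(round(n))  # roughly 31,000 visitors per variant, before the test starts
```

Small lifts on low baselines demand big numbers: halving the detectable lift roughly quadruples the required sample, which is why "run it until it looks significant" is so tempting, and so flawed.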
Talks@Coursera - A/B Testing @ Internet Scalecourseratalks
Talks@Coursera
This tech talk will describe how to build an experiment platform that can handle large-scale experiments. The talk will also discuss several best practices in designing and analyzing online experiments learned from companies like Coursera, Microsoft and LinkedIn.
About the Speakers
Ya Xu has been working in the domain of online A/B testing for over 4 years. She currently leads a team of engineers and data scientists building a world-class online A/B testing platform at LinkedIn. She also spearheads taking LinkedIn's A/B testing culture to the next level by evangelizing best practices and pushing for broad-based platform adoption. She holds a Ph.D. in Statistics from Stanford University.
Chuong (Tom) Do currently leads a team of data engineers and analysts in the Analytics team at Coursera, which is responsible for data infrastructure and quantitative analysis in support of the product and business. He completed his Ph.D. in Computer Science at Stanford University in 2009 and worked as a scientist in the personal genetics company 23andMe until 2012, where his research has collectively spanned the fields of machine learning, computational biology, and statistical genetics.
Writing quality text content has a great ROI. It generates trust, empathy and improves conversion. A few tips on how we do it at Drivy. Paulin Dementhon presented those slides at the Blend Conference in Lyon on October, 2nd 2013.
Human-Centered Copywriting: How Your Words Can Make or Break Your User Experi...UserTesting
The words you include in a website, app, email, or ad are your brand’s opportunity to speak directly to your users and build a relationship. The problem is that most copy is written to serve the company, not the customer.
Learn:
• How bad copy can destroy your UX
• How good copy can help you win more business and increase customer loyalty
• Some surprising human quirks, and how we can use them to our advantage when we write
UX/CASE STUDY-STYLE COPYWRITING: Product Buying GuidesAdam Stanley
Written with a customer-centric tone of voice. Highlighted key product features and potential FAQs to create a user guide that was easy to follow. Top-level style copywriting. Succinct and relevant.
USECON RoX 2015: Slip into your customers' shoes - Mobile EthnographyUSECON
Speaker: Klaus Schwarzenberger (CTO Experience Fellows)
Customer Experience Research - an innovative piece of software
The customer journey is a complex thing, playing out across a multitude of channels. Particularly when developing new products or evolving existing ones, quantitative approaches often hit their limits in terms of explanatory power. Mobile ethnography and other qualitative methods deliver valuable insights for understanding customer needs and making the company fit for the experience economy.
If you have any questions or would like the slides, please contact us at office@usecon.com
Nathalie Nahai - 5 psychological principles of persuasive designNathalie Nahai
In this presentation for the Habit Summit, I outline 5 of the most powerful psychological principles for persuasive product design:
1. Endowed progress
2. Sunk-cost fallacy
3. Appointment dynamic
4. Opportunity Cost
5. Hedonic adaptation
During the talk I’ll explain their scientific basis and illustrate their application with numerous case studies, as well as give practical tips on how you can implement these principles in your own work.
Hope you enjoy it!
NN x
Beyond SEO: copywriting for professionalsJoost de Valk
SEO has gone from a technical trade to being more marketing-focussed. Joost & Marieke will talk about how to gain great rankings & satisfied visitors by writing quality content. Focussing on SEO copywriting has a major pitfall. We give examples and tips on how to write a post that is both readable and SEO-friendly.
When designing for web and mobile platforms, the copy matters. Whether it’s in a form field, a check-out flow, or a call to action, copy can make or break the user's experience with a product or brand. When we engage customers through digital experiences for entertainment or e-commerce, it is imperative to consider the copy within the context of the medium, user flow, visual design, and overarching brand narrative. In this class for General Assembly NYC, students will learn best practices, tips, and tools for writing the best copy or microcopy possible.
https://generalassemb.ly/education/copywriting-strategies-for-better-ux
AB Testing and UX - a love story with numbers and people (by Craig Sullivan a...Northern User Experience
AB Testing and UX - a love story with numbers and people
Slides from the NUX6 talk by Craig Sullivan, Friday 27th October 2017.
2017.nuxconf.uk / nuxuk.org
Synopsis:
What’s wrong with the web these days? The mobile experience sucks. The customer experience sucks. It doesn’t work. It’s too hard to use. The text is too small. The interaction patterns suck. Nobody measures any of this happening. Nobody ever calls up to complain, but nobody does anything about it anyway. Millions of people lose countless days to friction, poor design and frustrating moments on their devices.
There may be thousands of things you can fix that look promising – but how do you know where to start? What if you could measure what sucked, where it sucked and how big the problem was? Using lightweight research methods and tools, you can stop making excuses and start knowing exactly what to do. Life becomes much simpler and easier with a scientific method of optimising growth or delight within your product.
Craig has trained over 500 people on how to measure and optimise their product experience, finding 100M of ‘lost revenue’ using just one of the techniques you will learn. With reports, checklists, downloadable templates and toolkits for every budget and stage of growth – you can stop guessing tomorrow.
Experimental statistics is only one of the many powerful analytical techniques companies are using to supercharge their experiment ideation, segmentation, and analysis. Check out this content for a refresher on key stats issues and a discussion of how to use data for better tests and bigger wins.
Fail and Win: Why a Failed Test Isn’t a Bad ThingOptimizely
Caleb Whitmore, CEO, Analytics Pros
Ryan Lillis, Strategic Optimization Consultant, Optimizely
Here's something you don't expect to hear at a CRO conference: most A/B tests don't produce a variation that's better than what you already have.
If all you're doing is running an A/B test, viewing select metrics, and giving a "thumbs up" or "thumbs down," you won't have a successful optimization program — even if you happen upon a few "winners."
But you don't have to run your optimization program this way.
A/B testing done right allows you to draw winning insights from "losing" tests that have the power to genuinely affect your business.
Caleb Whitmore, Founder and CEO of Analytics Pros, shows that you can achieve a 360-degree view of data that leverages your analytics engine as well as your testing platform to drive deep and genuine insights about the effects of your tests.
You'll learn a holistic approach to testing that goes way beyond "winners" and "losers."
Patrick McKenzie Opticon 2014: Advanced A/B TestingPatrick McKenzie
A/B Testing Beyond Headlines and Button Colors -- ideas for tests (particularly for B2B SaaS), common pitfalls in organizations, and how to overcome them.
One of the most commonly asked questions is “when is an MVT experiment or AB test finished?”
Is it at 30 days...? 100 conversions...? 10,000 visitors...?
The short answer is... it depends.
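One part of the "it depends" that is not negotiable: however you fix the horizon, evaluate significance once, when you reach it. A minimal two-proportion z-test sketch, where the visitor and conversion counts are invented for illustration:

```python
# Sketch: a two-proportion z-test evaluated once, at the pre-committed horizon.
# The visitor and conversion counts below are invented for illustration.
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))       # two-sided p-value

# 5.0% control vs 5.6% variant, 10,000 visitors per arm:
p = two_proportion_p_value(500, 10_000, 560, 10_000)
print(round(p, 3))  # about 0.058 - NOT significant at alpha = 0.05
```

Note that a 12% observed relative lift on 10,000 visitors per arm can still fail a 5% significance threshold, which is exactly why "it depends" is the honest answer.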
Craig Sullivan - Keynote speaker summary & final thoughts - Conversion Hotel ...Webanalisten .nl
Slides of the keynote by Craig Sullivan (UK) at Conversion Hotel 2015, Texel, the Netherlands (#CH2015): "You already listened to 10 keynotes – number 11 will refresh your memory, make you laugh and will leave you with some final thoughts for the trip home." http://conversionhotel.com
Condensed version of Peep Laja's "Master the Essentials of Conversion Optimization" to help understand conversion optimization and increase knowledge, while saving time.
Research and Discovery Tools for Experimentation - 17 Apr 2024 - v 2.3 (1).pdfVWO
You can utilize various forms of Generative Research to deepen your understanding of how people interact with your product or service.
Craig has amassed a vast toolkit of research methods, which he has employed to optimize websites and apps for over 500 companies. He'll share which methods yielded the highest return on investment, identified key customer pain points, and generated the best experiment ideas.
By sharing the top inspection methods essential for our work, Craig will provide advice for each technique. Anticipate insights on driving experiment hypotheses from research, a list of essential toolkit components for tomorrow, and additional resources for further reading.
How to Turn Your Optimization Team into a Revenue Doubling MachineOptimizely
How do you create a world-leading optimization team? One that you can trust to consistently deliver ROI year after year? Check out these slides and learn:
- How a high traffic website has gained A/B tested revenue lift two years in a row.
- The process that radically improves your optimization program’s effectiveness
- Surprising personalization case study test results
- The gold standard method for proving ROI for upper management
- One A/B test your designer will hate and your CFO will love
Even the most comprehensive strategies will only ever be as effective as the teams, tools, and processes that bring them to life. Review this content and learn how to:
- Get started: earn executive buy-in and build your team
- Maintain momentum despite setbacks: turn initial losses into wins
- Scale up: build a company-wide culture for long-term winning streaks
3 TED style talks of 15-20 minutes, featuring:
(1) Conversion methodologies, Lean UX and Agile? What gives?
(2) #Measurecamp and my Top Analytics Tips of 2013
(3) Conversion tools of the CRO masters
The tools used by the CRO masters around the world to optimise analytics, UX, VOC, insight and testing - all to improve your insight or conversion figures.
1.Wireless Communication System_Wireless communication is a broad term that i...JeyaPerumal1
Wireless communication involves the transmission of information over a distance without the help of wires, cables or any other forms of electrical conductors.
Wireless communication is a broad term that incorporates all procedures and forms of connecting and communicating between two or more devices using a wireless signal through wireless communication technologies and devices.
Features of Wireless Communication
The evolution of wireless technology has brought many advancements with its effective features.
The transmitted distance can be anywhere between a few meters (for example, a television's remote control) and thousands of kilometers (for example, radio communication).
Wireless communication can be used for cellular telephony, wireless access to the internet, wireless home networking, and so on.
This 7-second Brain Wave Ritual Attracts Money To You.!nirahealhty
Discover the power of a simple 7-second brain wave ritual that can attract wealth and abundance into your life. By tapping into specific brain frequencies, this technique helps you manifest financial success effortlessly. Ready to transform your financial future? Try this powerful ritual and start attracting money today!
Multi-cluster Kubernetes Networking- Patterns, Projects and GuidelinesSanjeev Rampal
Talk presented at Kubernetes Community Day, New York, May 2024.
Technical summary of Multi-Cluster Kubernetes Networking architectures with focus on 4 key topics.
1) Key patterns for Multi-cluster architectures
2) Architectural comparison of several OSS/ CNCF projects to address these patterns
3) Evolution trends for the APIs of these projects
4) Some design recommendations & guidelines for adopting/ deploying these solutions.
APNIC Foundation, presented by Ellisha Heppner at the PNG DNS Forum 2024APNIC
Ellisha Heppner, Grant Management Lead, presented an update on APNIC Foundation to the PNG DNS Forum held from 6 to 10 May, 2024 in Port Moresby, Papua New Guinea.
2. @OptimiseOrDie
• UX, Analytics, Split Testing and Growth Rate Optimisation
• Started doing testing & CRO 2004
• Split tested over 45M visitors in 19 languages
• 67+ mistakes I MADE with AB testing
• Like riding a bike…
• Optimise your optimisation? Get in touch!
3.
4. Myths, Lies and Illusions of Optimisation
1. Optimisation and Testing are the same
2. Optimisation and Testing are easy
3. Statistical significance tells you when to stop
4. High traffic always makes tests shorter
5. It’s all about Conversion Rates
6. You don’t need to test the test
7. Your tests will give you the quoted lift
8. High traffic pages give the best tests
9. It’s about getting all your tests to win
10. Optimisation is only about websites
@OptimiseOrDie
11. Segmentation will tell you what happened
12. You can’t run multiple simultaneous tests
13. Testing is great for settling arguments
14. You can spot trends early on in a test
15. More test volumes = better results
16. Your tests tell you truths that last forever
17. You can test even on low traffic sites
18. Other people’s tests are ‘Best Practice’
19. Doesn’t involve changing the way you work
20. Testing makes you a Data Scientist
11. What is CRO/Optimisation?
• “Using Analytics data and Customer feedback to improve the performance of your website.”
• “Finding out why visitors aren’t converting and then resolving these issues.”
• “Running loads of crappy split tests randomly until the heat death of the universe”
@OptimiseOrDie
12. What is my definition?
“A structured, systematic and continuous application of techniques that are used to discover, quantify and prioritise issues. These can be turned into hypotheses to drive experiments and opportunity in the following business outcomes:”
• Increased revenue or profitability
• Increasing LTV, loyalty, NPS/Sat scores
• Removing cost from the business or contact centre
• Higher productivity or labour flexibility
• Delighting customers
• Reduced development effort
@OptimiseOrDie
13. What Optimisation is NOT!
• A way to change things you (or others) hate
• A methodology for running split tests
• A guarantee of increased conversion
• A methodology for looking at analytics data
• A rescue for silo-bound or non-agile design or development processes
• A way to trick people into buying
• A bolt-on – it IS the process
@OptimiseOrDie
14. Optimisation is a:
• Way of joining the worlds of Customer insight, UX, Analytics, Split testing and Business Strategy
• Overarching design and development process which prioritises work around opportunity
• Strategic, not tactical, response to wasted development effort or product change
• Way to create a deep and meaningful connection between the team, customers, business and the outcomes of making product changes
• The killer app to remove ego, opinion, assumptions, cherished notions or ‘we just do it that way’ from decision making
• More powerful method than UX research or analytics alone, in guiding the directionality of product change
@OptimiseOrDie
15. Optimisation includes these:
• Qualitative research
• Analytics, Quant analysis and insight
• UX inspection and discovery
• Competitive Intelligence
• Priority based opportunity
• VOC, Surveys and Customer Satisfaction (NPS)
• Call tracking & Call centre optimisation
• AB testing
• Multivariate testing
• Photography optimisation
• EEG / ECG / Galvanic response
• Web performance tuning
• Forms analytics
• Eye tracking
• Market research
• Big & Unstructured data analysis
• PPC optimisation
• Session replay analysis
• Customer journey mapping
• Ethnographic (diary) study research
• Cross device, platform and channel insight
• Email optimisation
@OptimiseOrDie
17. Surely you just add Javascript?
• This work is harder than anything I’ve ever done!
• You all have expensive & limited resources for testing - like airport take-off slots
• You MUST make use of these efficiently
• You HAVE to balance resource cost with opportunity by prioritising carefully
• You’re doing the equivalent of drug trials!
• For large OR small companies, the instrumentation, analytics, tools setup, plan design, test methodology and analysis cannot be ‘done later on’. It’s not complex but it is vital.
• The best companies have a structural shell (process, methodology, management) around their activities
• If you’re not optimising the process continuously (Kaizen), you won’t increase your velocity of iterations.
• Optimise the Optimisation
@OptimiseOrDie
19. The 95% Stopping Problem
@OptimiseOrDie
• Many people use 95, 99% ‘confidence’ to stop
• This value is unreliable and moves around
• Nearly all my tests reach significance before they are actually ready
• You can hit 95% early in a test (18 minutes!)
• If you stop, it could be a false result
• Read this Nature article : bit.ly/1dwk0if
• Optimizely have changed their stats engine
• This 95% thingy – must be LAST on your stop list
• Let me explain
20. The 95% Stopping Problem
@OptimiseOrDie
                   Scenario 1     Scenario 2     Scenario 3     Scenario 4
After 200 obs.     Insignificant  Insignificant  Significant!   Significant!
After 500 obs.     Insignificant  Significant!   Insignificant  Significant!
End of experiment  Insignificant  Significant!   Insignificant  Significant!
“You should know that stopping a test once it’s significant is deadly sin
number 1 in A/B testing land. 77% of A/A tests (testing the same thing
as A and B) will reach significance at a certain point.”
Ton Wesseling, Online Dialogue
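The effect of peeking is easy to demonstrate with a simulation. This is an illustrative sketch, not the talk's own data: the 3% rate, the peek interval and the sample sizes are my assumptions. Two identical variants, checked every 200 visitors, cross the 95% "significance" line far more often than the nominal 5%.

```python
# Illustrative A/A peeking simulation (all numbers are assumptions):
# two identical 3% variants, checked every 200 visitors, still hit
# p < 0.05 at some peek much more than 5% of the time.
import math
import random

def z_test_p(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value from a two-proportion z-test."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_a / n_a - conv_b / n_b) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def aa_test_ever_significant(rate=0.03, peek_every=200, max_n=6000):
    conv_a = conv_b = 0
    for i in range(1, max_n + 1):
        conv_a += random.random() < rate
        conv_b += random.random() < rate
        if i % peek_every == 0 and z_test_p(conv_a, i, conv_b, i) < 0.05:
            return True   # we'd have stopped here and declared a "winner"
    return False

random.seed(42)
runs = 500
hits = sum(aa_test_ever_significant() for _ in range(runs))
print(f"{100 * hits / runs:.0f}% of A/A tests looked 'significant' at some peek")
```

The exact percentage depends on how often and how long you peek, which is exactly the point: "reached 95%" on its own tells you almost nothing about when to stop.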
21. The 95% Stopping Problem
@OptimiseOrDie
“Statistical Significance does not equal Validity”
http://bit.ly/1wMfmY2
“Why every Internet Marketer should be a Statistician”
http://bit.ly/1wMfs1G
“Understanding the Cycles in your site”
http://mklnd.com/1pGSOUP
23. Business & Purchase Cycles
@OptimiseOrDie
• Customers change
• Your traffic mix changes
• Markets, competitors
• Be aware of all the waves
• Always test whole cycles
• Don’t exclude slower buyers
• When you stop, let test
subjects still complete!
[Diagram: test start and finish plotted against the average purchase cycle]
24. How Long? Simple Rules to follow
• TWO BUSINESS CYCLES minimum (week/month)
• 1 PURCHASE CYCLE minimum
• 250 CONVERSIONS minimum per creative (e.g. checkouts)
• 350 & MORE! if response is very similar
• FULL WEEKS/CYCLES – never part of one
• KNOW what marketing, competitors and cycles are doing
• RUN a test length calculator - bit.ly/XqCxuu
• SET your test run time, RUN IT, STOP IT, ANALYSE IT
• ONLY RUN LONGER if you need more data
• DON’T RUN LONGER just because the test isn’t giving the result you want!
@OptimiseOrDie
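A minimal sketch of what a test length calculator does. Assumptions: the usual two-sided 5% significance / 80% power defaults, via Lehr's n ≈ 16·p(1−p)/δ² approximation; the calculator linked above may use different formulas, and the traffic figures below are hypothetical.

```python
# Rough test-length estimate using Lehr's approximation
# n ≈ 16·p(1-p)/δ² per variant (5% significance, 80% power).
# All input numbers below are hypothetical.
import math

def sample_size_per_variant(base_rate, relative_lift):
    delta = base_rate * relative_lift   # absolute difference to detect
    return math.ceil(16 * base_rate * (1 - base_rate) / delta ** 2)

def weeks_to_run(base_rate, relative_lift, visitors_per_week, n_variants=2):
    n = sample_size_per_variant(base_rate, relative_lift)
    # Round up to full weeks and never run less than two business cycles
    return max(2, math.ceil(n * n_variants / visitors_per_week))

# e.g. 3% baseline, detecting a 10% relative lift, 20,000 visitors/week
print(sample_size_per_variant(0.03, 0.10))   # 51734 visitors per variant
print(weeks_to_run(0.03, 0.10, 20000))       # 6 weeks
```

Note how fast the numbers grow: halve the detectable lift and the required sample roughly quadruples, which is why low-traffic sites struggle to test small changes.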
25. 5. It’s ALL about Conversion Rate
@OptimiseOrDie
26. It’s all about the business
@OptimiseOrDie
• You’re optimising a business here, not a page or site
• Tricking, pushing or persuading people at a superficial level to
take an action is not a viable strategy
• Your optimisation strategy is a series of steps, not a tool
• Testing is about learning, not converting.
• Tests that fail to tell you anything (regardless of outcome) are a failure themselves
• If you don’t shift the business goals, your optimisation and testing budget will be threatened
27. 6. You don’t need to test the test – just go
@OptimiseOrDie
Browser testing www.crossbrowsertesting.com
www.browserstack.com
www.spoon.net
www.saucelabs.com
www.multibrowserviewer.com
Mobile devices www.appthwack.com
www.deviceanywhere.com
www.opendevicelab.com
Read this article bit.ly/1wBccsJ
28. 7. The test result gives the promised lift
@OptimiseOrDie
29. The result is a range
@OptimiseOrDie
• Version A is 3% conversion
• Version B is 4% conversion
• Yay! That’s a 25% lift
• Let’s tell everyone
• When it goes live, you only get a 5.7% lift
• That’s because it was A RANGE
• 3% +/- 0.5
• 4% +/- 0.4
• Actual result was 3.5% for A
• Actual result was 3.7% for B
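The slide's numbers can be reproduced with simple confidence intervals. A sketch: the 5,000-visitors-per-variant figure and the Wald interval are my assumptions, chosen to roughly match the ±0.5 and ±0.4 on the slide.

```python
# 95% confidence intervals around the slide's 3% vs 4% result
# (Wald intervals; the 5,000 visitors per variant are assumed).
import math

def wald_ci(conversions, visitors, z=1.96):
    p = conversions / visitors
    half = z * math.sqrt(p * (1 - p) / visitors)
    return p - half, p + half

lo_a, hi_a = wald_ci(150, 5000)   # version A: 3.0% measured
lo_b, hi_b = wald_ci(200, 5000)   # version B: 4.0% measured
print(f"A: {lo_a:.1%} to {hi_a:.1%}")
print(f"B: {lo_b:.1%} to {hi_b:.1%}")
# The ranges overlap: A could truly be 3.5% and B 3.7%, a ~6% lift,
# nowhere near the 25% headline gap between the point estimates.
```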
31. 8. Testing is best on high traffic pages
@OptimiseOrDie
Think like the CEO of a department store!
If you can’t refurbish the entire store, which floors or departments will you invest in optimising?
Wherever there is:
• Footfall
• Low return
• Opportunity
34. 9. It’s all about WINNING test results
@OptimiseOrDie
35. Failing is good
@OptimiseOrDie
• Tests that are ‘about the same’ are a failure
• They’re also very hard to call
• That means you have to be BOLD not conservative
• A test that comes out negative is NOT a failure
• If a ‘negative’ test teaches you something, it’s a success!
• If you hit 40/50/60% failed tests, that’s fine
• If you aren’t failing regularly, you’re not BOLD enough
• Success is about the number of tests you finish each month, and what you learn from them
36. We believe that doing [A] for People [B] will make outcome [C] happen.
We’ll know this when we observe data [D] and obtain feedback [E].
(reverse)
@OptimiseOrDie
38. Optimisation is just for websites
@OptimiseOrDie
• Service Design (Airbnb)
• Onboarding flows with emails
• Email templates
• Apps – testing, debugging, tracking
• Phone tracking and call centre optimisation
• Social, Display, TV, Video and other advertising
• Print adverts, Direct Mail
• In-store promotions
• Product manuals, guides, interfaces
• EVERYTHING has elasticity – just find it
• Even Multi-variate call centre scripts
40. Segmentation explains stuff
@OptimiseOrDie
• Beware of small sample sizes
• A = 350 conversions
• B = 300 conversions
• A Conversions for Safari = 20
• B Conversions for Safari = 25
• Only needs 2 people to change that from 25% lift to 14%
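The slide's arithmetic, made concrete (the 1,000-visitors-per-variant segment size is a hypothetical figure):

```python
# How fragile a segment-level "lift" is at small sample sizes.
def lift(conv_a, n_a, conv_b, n_b):
    return (conv_b / n_b) / (conv_a / n_a) - 1

n = 1000  # assumed Safari visitors per variant
print(f"{lift(20, n, 25, n):.0%}")   # 25 vs 20 conversions -> 25%
print(f"{lift(22, n, 25, n):.0%}")   # just 2 more A conversions -> 14%
```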
41. 12. You can’t run concurrent split tests
@OptimiseOrDie
42. Oh yes you can, with GA!
@OptimiseOrDie
• If you push events or variables into GA, you can report on behaviour for A or B (or any variations).
• If you do it that way, you can easily run multiple tests on different page targets simultaneously.
• You grab AAA, AAB, ABA, ABB and analyse.
• Test subjects get a recipe of tests, so one caveat:
• If you pick things that clash or jar the experience
• If you have $5 delivery but then $2 in another test
• Apart from that, it’s fine to run these
• They tell you about the right experience recipe across several pages being optimised in concert
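One way the "recipe" idea can be sketched: assign each visitor independently per test by hashing a stable visitor id, then record the combined recipe string in GA as an event or custom dimension for later segmentation. The test names and hashing scheme here are my own illustration, not a specific GA API.

```python
# Stable per-visitor variant assignment across three concurrent tests.
# The combined 'recipe' (e.g. 'ABA') is what you'd record in analytics.
import hashlib

TESTS = ["homepage_hero", "delivery_banner", "checkout_cta"]  # hypothetical

def variant(visitor_id, test_name):
    # Hash visitor id + test name so assignments are independent per test
    digest = hashlib.md5(f"{visitor_id}:{test_name}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def recipe(visitor_id):
    return "".join(variant(visitor_id, t) for t in TESTS)

# The same visitor always gets the same recipe, so reports stay consistent
print(recipe("visitor-123"), recipe("visitor-123"))
```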
43. 13. Testing is great for settling arguments
@OptimiseOrDie
44. 14. You can spot trends early in a test
@OptimiseOrDie
45. You can spot trends early
@OptimiseOrDie
• Tests are volatile in the early stages
• Watch but shut your mouth and wait
• Keep an eye on any odd behaviour
• Be patient!
47. More tests = Better results
@OptimiseOrDie
• Increasing volume without optimising velocity
• Not optimising the process means it scales badly
• If you put garbage in, you’ll get garbage out
• Ramping up before you have it tuned is crazy
• Get the team, process, project management, methodology, toolkits, selling and PR nailed first
• THEN optimise and scale
• I know that one UK retailer is doing 140 per month!
• That’s LOW compared to some companies I work with
• Big companies doing 2 a month? Meh.
49. Tests are a Truth Forever
@OptimiseOrDie
• Traffic changes
• Prices change
• Product mix changes
• Advertising evolves and changes
• Markets are different
• Customers have changed
• Competitors or regulatory landscape moves
• Things happen outside of your control
• You need to revisit tests or ITERATE
• Always be trying to beat the new winner, not basking in the glory of a test you ran 9 months ago
• The lift may have vanished
• Schrödinger’s AB test
51. Testing is fine on low traffic sites
@OptimiseOrDie
• Yes, if you estimate your minimum testing unit
• This is the time to AB test a sample of, say, 250 conversions, with the rules I set earlier.
• A payday loan company? 2 months minimum!
• Run the calculations
• Check the test length
• If it takes like, 8 million years, what can you do?
• Read this or download my AB testing decks: bit.ly/1umy5Y6
52. 18. Other people’s tests are Best Practice
@OptimiseOrDie
“STOP copying your competitors
They may not know what the
f*** they are doing either”
Peep Laja, ConversionXL
53. Tests you see online?
@OptimiseOrDie
• Your customers are not the same
• Your site is not the same
• Your advertising and traffic is not the same
• Your UX is not the same
• How the f*** do you think it guarantees a result?
• Use them to inform or suggest ideas
• They’re like the picture on meal packets
• Serving Suggestion Only
70. #12 : The Best Companies…
• Invest continually in analytics instrumentation, tools, people
• Use an Agile, iterative, cross-silo, one team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLEDD*
• Are fixing bugs, cruft, bad stuff as well as optimising
• Source photos and content that support persuasion and utility
• Have cross channel, cross device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually reduce cycle (iteration) time in their process
• Blend ‘long’ design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
* Single Large Expensive Doomed Developments
This is the title of my talk today.
Wonderful picture, isn’t it?
It’s from an IBM advert from 1951 and is a great piece of work, especially the copywriting. The whole message here is “Buying an IBM computer gets you the same power as 150 extra engineers”. And not a feature in sight – the trick is they’re not selling the computer, they’re selling what the computer will do for your business and your life.
And what am I talking about today? Well the fact that most split tests being run these days are just bullshit – the slide rules don’t add up for a lot of companies.
Many C level execs I’ve spoken to complain about the variability of return or success on this kind of testing.
There’s a reason for this -
And here’s a boring slide about me – and where I’ve been driving over 400M of additional revenue in the last few years. For the sharp eyed amongst you, you’ll see that Lean UX hasn’t been around since 2008. Many startups and teams were doing this stuff before it got a new name, even if the approach was slightly different. For the last 4 years, I’ve been optimising sites using a blend of these techniques.
And here are some of the clients I’ve been working for.
Dull bit is now officially over.
And don’t worry – if it’s not working for you – and looks like this, it’s OK – you’re just doing it the wrong way.
Although I admire AB testing companies - all of them - for championing the right to test and making it easy for anyone to implement - there's a problem. Democratisation of testing brings with it a large chunk of stupidity too.
When YouTube first appeared, did anyone think “Oh boy, there’s only ever going to be high quality content to see on here”? Seriously. No.
And this crappy AB testing is basically the equivalent of funny cat videos
People taking videos of themselves playing video games
And like, wow, there are 6.9 million Gangnam Style videos. Just incredible.
But hidden in those big numbers, YouTube will always have a tiny percentage of really great stuff, very little good stuff and a long tail of absolute bollocks.
And the same is true of split testing - there's some really well run stuff, getting very good results and there's a lot of air guitar going on.
So – the first myth. Some people think that testing and optimisation are the same thing.
AB testing is just one of the techniques that I’ll use to optimise a business and there are many more. Let’s talk about definitions.
It has taken me a long time to find out where all the bear traps are hidden. Mainly from screwing up tests and figuring out what was wrong, through lots of testing time.
And most companies and teams are stepping on these bear traps without even realising. And they wonder why the test results aren’t replicated in the bank account results. Hah.
I have a list now of about 60 ways to easily break, skew, bias or screw up your tests completely. But here are some real biggies to watch for:
I once explained to my daughter – you know, when adults look really in control, making decisions and appearing not to suffer from indecision? Don’t believe it for a minute – we’re just better at winging it because we’re older.
And this is the huge hole that’s gnawing at the heart of many digital operations: the inability to understand what you can and can’t be confident about – but nobody wants to admit they’re guessing a lot of the time.
There is one answer to this trap, which I call taking a visit to Guessaholics Anonymous – surrendering to the higher power of testing and innovation, using consumer psychological insight and data to guide your hand, and recognising you’re powerless at deciding what’s best or second-guessing what will win.
It's actually liberating to not be sitting in a meeting room, arguing about the wording of a bloody button for 4 fucking hours, ever again.
And this was the state of my head in 2004. The inability to understand what you can and can’t be confident about – but nobody wants to admit they’re fucking guessing a lot of the time.
And it took me a long time to figure out I didn’t know anything really – it was all assumptions and cherished notions. It was pretty crushing to test my way to this realisation but I’m MUCH happier now.
Now I think I know this much - but I might know a wee bit more than I think I do – but I’m erring on the side of caution.
That’s because I'm always questioning everything I do through the lens of that consumer insight and testing.
Without customers and data driven insights, you can’t shape revenue and delight. They’ll give you the very psychological insights you need to apply levers to influence them, if you only ask questions. Everything else is just a fucking guess.
Even with tests, if the only inputs you’ve got are ego and opinion, they’re going to be lousy guesses and you’re wasting your experiments.
And now a bit about something I call Rumsfeldian Space – exploring the unknowns. This is vital if you want to make your testing bold enough to get great results.
You need to inhabit the contextual and emotional landscape of the consumer to really shape product or service experience. The only way to do this is to have teams and cultures that create a direct and meaningful connection between the team and the customer, so everyone feels the impact that every change has on the outcome.
Every atom of every piece of copy, design, error message, email, website, support and help content – absolutely bloody everything you do – has to be framed with knowledge of, and empathy for, the consumer’s fears, worries, barriers and pain, but also the real problems we solve by designing products not as features but as life-enhancing. And this is the best marketing of all, like the IBM ad.
Business Model Optimisation requires a watchmaker’s eye – a complete understanding of the watch from macro to micro – the flow of delight and money that can be shaped inside every customer experience, website and interaction, at both a component and a service design level.
Most people have 1 or 2 legs at most. The best companies I've worked with are doing all of these.
Darwin did NOT say 'survival of the fittest' – that was actually another guy called Herbert Spencer. What Darwin actually pushed was that the key ingredients were heritability of traits, variation and selection based on survival. If only your marketing programme was quite as ruthless eh?
And if you want variation and innovation, the survival of good ideas in favour of bad and knowledge that you pass on – you need a culture of adaptability, improvement and change. Agile is about a shared mind-set across managers, leaders and everyone in the team.
There’s a Harvard survey about how the *most* productive teams communicate. Not in meetings but all the time - deskside, IM, phone, skype, GitHub, agile tools, apps - these are the telegraph wires of the collaborative, participative and mission oriented teams.
My key insight of the last 10 years in building and leading teams is that agile, open, flat, cross-silo, participative, flexible and collaborative environments produce customer connected products of high quality. Autoglass NPS higher than Apple.
I hope you enjoyed it as much as I enjoyed writing it. All my details are here and slides will be uploaded shortly.
Thank you for your time today.
So – what’s driving this change then? Well there have been great books on selling and persuading people – all the way back to ‘Scientific Advertising’ in 1923.
And my favourite here is the Cialdini work – simply because it’s a great help for people to find practical uses for these techniques.
I’ve also included some analytics and testing books here – primarily because they help so MUCH in augmenting our customer insight, testing and measurement efforts.
There are lots of books with really cool examples, great stories and absolutely no fucking useful information you can use on your website – if you’ve read some of these, you’ll know exactly what I mean. These are the tomes I got most practical use from and I’d recommend you buy the whole lot – worth every penny.
These are all people on twitter who cover hybrid stuff – where usability, psychology, analytics and persuasive writing collide. If you follow this lot, you’ll be much smarter within a month, guaranteed.
And here are the most useful resources I regularly use or share with people. They have the best and most practical advice – cool insights but with practical applications.
In my opinion, these are the attributes of companies doing great things with optimisation and continuous improvement.
This is the future of testing. A machine learning system that will test out variants and tell you what’s driving response to all your experiments.
Know if that offer works because of someone’s age or past spending patterns – let the tool explain to you where the value is and let it exploit these patterns as an intelligent agent under your control.