BrightonSEO 2018 (September 28): How to Optimize for Visual Search
- Independent consultant, working with Google, Columbia Business School, MIT, and many others.
- 9 years in search.
- Research lead for ClickZ and Search Engine Watch.
Visual search turns the camera into a search input: scan an object or landscape and the search engine will identify features in the image, then return relevant results. This goes beyond traditional image search, as we will see.
How does visual search work?
Search what you see.
How it works: Pinterest Lens
1. Query understanding: object shape, size, color. Annotations help connect this to Pinterest’s taxonomy.
2. Results pulled from Visual Search (similar aesthetic), Object Search (similar objects) and Image Search (similar text annotations).
3. Blending ratios are weighted dynamically based on query understanding, including related boards and past user engagement.
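The blending step above can be sketched in code. This is a toy illustration, not Pinterest's actual algorithm: it merges ranked lists from the three retrieval sources using query-dependent weights, and all item names and weight values are hypothetical.

```python
# Toy sketch of dynamic result blending (not Pinterest's real code).
# Each source contributes results scored by reciprocal rank, scaled
# by a weight chosen from query understanding.

def blend_results(visual, objects, image_text, weights):
    """Merge three ranked result lists into one, favoring
    higher-weighted sources near the top of the blended list."""
    scored = []
    for source, weight in zip((visual, objects, image_text), weights):
        for rank, item in enumerate(source):
            # Reciprocal-rank score scaled by the source's weight.
            scored.append((weight / (rank + 1), item))
    # Deduplicate, keeping each item's best score.
    best = {}
    for score, item in scored:
        best[item] = max(score, best.get(item, 0.0))
    return [item for item, _ in sorted(best.items(), key=lambda kv: -kv[1])]

# A fashion query might weight aesthetic similarity highly;
# these weights and items are invented for illustration.
results = blend_results(
    visual=["scarf_a", "scarf_b"],
    objects=["scarf_c", "scarf_a"],
    image_text=["scarf_d"],
    weights=(0.6, 0.3, 0.1),
)
```

Because "scarf_a" appears at the top of the highly weighted aesthetic list, it leads the blended results even though it also appears lower in another source.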
“In the English language there are 180,000 words, and we only use 3,000 to 5,000 of them. If you’re trying to do voice recognition, there’s a small set of things you need to be able to recognize. Think about how many objects there are in the world, distinct objects, billions, and they all come in different shapes and sizes.”
- Clay Bavor, Google
Why does visual search matter for marketers?
“Shopping has always been visual. We’ve just been taught to do the…”
- Amy Dziewiontkoski, Pinterest
Consumers - particularly younger generations - are more likely to engage with brands through visual media. Google and Amazon have both realized this, and visual search is a central tenet of their strategy to encourage discovery beyond the search box.
Visual search both extends and shortens the search journey
Intent state Input Output
Open to ideas
Looking for a style
Looking for a specific type of
Ready to buy
- Visual search creates a new space for image-based discovery.
- It can also collapse the purchase journey, allowing someone to go from image to purchase in just a few moments.
- Visual search will be integrated with Google Express and shopping through Snapchat, and already works with Pinterest’s Lens Your Look.
● Visual searches via Pinterest Lens every month
● 97% of all Pinterest searches are non-branded
● 93% of consumers consider images to be the key deciding factor in
a purchasing decision (KissMetrics)
● 72% of internet users search for visual content before making a purchase
How can I optimize for visual search?
Some practical tips to gain a competitive advantage
1. Upload your inventory: sitemaps and indexation
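An image XML sitemap uses Google's image sitemap extension to list the images hosted on each URL, helping search engines discover your full inventory. A minimal example, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Example image sitemap; URLs are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/products/blue-lamp</loc>
    <image:image>
      <image:loc>https://www.example.com/images/blue-lamp.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Each `<url>` entry can list multiple `<image:image>` elements, one per image on that page.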
2. Order images based on stylistic/aesthetic relations
“The images that appear in both the style ideas and similar
items grids are also algorithmically ranked, and will prioritize
those that focus on a particular product type or that appear
as a complete look and are from authoritative sites.”
This matters because it shows how
visual search differs from traditional
image search optimization - but also
how it can be similar, too.
3. Image best practices for visual search
➔ Alt attributes and captions: include descriptive, relevant text.
➔ Insert images on high authority,
relevant pages. Google will
prioritize images that are central
to their host page.
➔ Only use stock photography if it
has been edited to make it unique.
➔ Maintain a consistent aesthetic -
this helps search engines
understand the relation between
your images.
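In practice, descriptive alt text and a visible caption might look like the following; the filename, product name, and copy here are purely illustrative:

```html
<!-- Descriptive alt text plus a caption; all values are examples. -->
<figure>
  <img src="/images/green-velvet-armchair.jpg"
       alt="Green velvet armchair with oak legs in a living room"
       width="1200" height="800">
  <figcaption>The Hargrove armchair in emerald velvet.</figcaption>
</figure>
```

The alt text describes what is actually in the image, rather than stuffing keywords, which serves both accessibility and visual search.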
3. Remove clutter from images
Automatic object detection works best when the focal points of the image are in the foreground. Tell search engines what your image is about by giving prominence to the items you want to rank for.
This allows the technology to produce a
feature map, which can be used to find
relevant images in the database.
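The idea of a feature map can be illustrated with a toy convolution: a vertical-edge filter responds strongly where a product's silhouette meets a clean background. This is a deliberately simplified stand-in for the learned filters real visual search systems use, and the pixel values are made up.

```python
# Toy "feature map": convolve a grayscale image with an edge filter.
# Real systems apply many learned filters; this shows the mechanics.

def convolve2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation) over a
    grayscale image given as a list of lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = 0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A bright product (9s) against a dark background (0s): the
# vertical-edge filter fires along the product's left silhouette.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
vertical_edge = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
feature_map = convolve2d(image, vertical_edge)
```

An uncluttered image with a clear focal point yields strong, unambiguous responses like this; a busy background produces noisy activations that make the object harder to match.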
4. Use structured data to make context clear
Help search engines understand your content by using structured data for all relevant elements of the page.
As a guideline, always mark up:
● Product name
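Product pages are typically marked up with schema.org Product in JSON-LD. A sketch with illustrative values; the product, price, and URLs are invented for the example:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Emerald Velvet Armchair",
  "image": "https://www.example.com/images/green-velvet-armchair.jpg",
  "description": "Green velvet armchair with solid oak legs.",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "349.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
```

This block goes in a `<script type="application/ld+json">` tag on the product page, tying the image to the product's name, price, and availability.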
5. Use visual search to unite the physical and digital
Maps integration through Augmented Reality
PinCodes in store lead consumers to online listings. Ensure consumers have a cohesive experience across channels.
1. Upload your product inventory to your website and social media profiles
a. Image XML sitemap
b. Check indexation status of images
2. Research trends (both keywords and styles)
a. Map keywords to images
b. Logical taxonomy
3. Optimize your images
a. Make images central to your landing pages
b. Maintain a consistent aesthetic
c. Don’t use stock images, or at least edit them to make them unique
4. Make context clear
a. Structured data is essential (!)
5. Link your physical and digital presences
a. PinCodes in store
b. Maps optimization
You can find me at @ClarkBoyd & email@example.com
Google’s computer vision technology couldn’t tell
apart a Maltese and a Maltipoo(!)
Zia Chishti said in an interview with The Economist this week, “Much of what we think of as AI is just the same old algorithms, but faster. True AI will learn on its own from its environment, increasing in accuracy over time.”
Google is continuously analyzing images and updating its feature recognition.
In this instance, it has separated out the two dogs into different albums as it
identified their individual features.
Separate photo albums!
“A field linguist has gone to visit a culture whose
language is entirely different from our own. The linguist is
trying to learn some words from a helpful native speaker,
when a rabbit scurries by.
The native speaker declares “gavagai”, and the linguist is
left to infer the meaning of this new word. The linguist is
faced with an abundance of possible inferences,
including that “gavagai” refers to rabbits, animals, white
things, that specific rabbit, or “undetached parts of
rabbits”. There is an infinity of possible inferences to be
made. How are people able to choose the correct one?”
Consumers are looking for images on Google, but also on Pinterest and other platforms; they are using images to find styles that they like.
Brands are responsible for providing the algorithms with better content to fuel results. Stock images are at odds with what the visual searcher wants to see, and Lens technologies cannot read beyond the surface of these images.
How different search engines work
- Pinterest: Meta data (Pins, board names, image tags) help it understand context. The ‘blender’ that decides the weighting
of shape/color/texture is dynamic and effective. The high quantity of visual searches on the platform is helping Pinterest
improve accuracy of object recognition. Pinterest is closest to understanding the ‘essence’ of an object, beyond its form
or color. This allows it to deliver satisfactory results for fashion and decor image queries.
- Google: For now, Google turns the image into a text query based on the object it recognizes. Search with an image of a
mug using Google Lens and it will return results for mugs, but it will not detect the style of the mug based on any design
patterns it contains. Google is very effective at picking out text on items such as clothing and uses these to form queries,
too. Knowledge Graph and Maps integration will see Google’s results improve, but for now it is behind Pinterest in
identifying the intangibles that escape the grasp of language.
- Amazon: Amazon could not recognize any aspects of Loafie, even though the item is in its inventory. In general, Amazon
is effective at recognizing everyday objects and returning related results. Search with an image of a kettle and it will
return ‘kettle’ results. This makes it useful for its core purpose as a retailer, but Amazon will also want to branch out into
the ‘inspiration’ space. Better visual search results will be an important part of this strategy.
- Bing: The results for Bing show that it focuses on identifying color and shape, but not necessarily the category of the
object. Bing has some useful new features that allow for object isolation within complex images, but the algorithms
require more data and training if Bing is to expand its remit beyond everyday objects. The results in this test were the
most erratic of all the technology providers.
- Camfind: As a specialist visual search tool, Camfind performed impressively, even if the results were not entirely
accurate. Years of training on a wide range of objects has led to a more nuanced understanding of objects beyond just
color and shape. It will be interesting to see where Camfind sits in this market as the likes of Google make visual search a
priority and integrate it into services like Shopping.
- There are numerous layers of interpretation when analyzing
an image: size, shape, color, object purpose, style, context…
- Different technologies approach this in different ways.
Pinterest is best at blending these factors, for now.
- Even when an object exists in the image inventory (in the
case of Amazon), there are no guarantees that the visual
search engine will recognize it.
- We need to help search engines as much as possible.
Camera-based search leads to:
● 48% more product views
● 75% greater likelihood to return
● 51% higher time on site
● 9% higher average order value
Google image search is already a huge
opportunity. As Google integrates
Lens into more products, brands will
be able to connect with their audience
in new and more effective ways.
Accessible visual search technology is allowing new players like Hutch to innovate. The advantage of visual search here for both the consumer and the brand is clear. Consumers get better results that allow them to ‘see’ the products, and brands get the chance to show off their products in context.