Columnist Kristopher Jones talks visual search -- its current state, its implications for the future and strategies for SEOs looking to capitalize on this evolving technology.

The arrival of the Pinterest Lens and Google Lens has ignited a battle for visual search engine supremacy. Beyond opening up a new revenue stream for e-commerce stores, visual search could completely alter consumer habits and purchasing decisions.

In a world driven by instant gratification, visual search can open the door toward “snap and surf” purchasing, streamlining the search interface. This provides a promising outlook for e-commerce stores that develop their product listing ads (PLAs) and online catalogs for the visual web.

While still in its infancy, optimizing for visual search could greatly improve your website’s user experience, conversion rate and online traffic. Yet images are often given very little attention by SEO experts, who generally focus more on optimizing them for speed than for alt attributes and visual appeal.

While visual search won’t displace the use of keywords and the importance of text-based search, it could completely disrupt the SEO and SEM industry. I’d like to discuss some of the fundamentals of visual search and how it will affect our digital marketing strategy moving forward.

What is visual search?

There are currently three different visual search processes being employed by major search companies:

Traditional image search that relies on textual queries.

Reverse image search that relies on structured data to determine similar characteristics.

Pixel-by-pixel image searches that enable “snap and search” by image or by parts of the image.

In this article, I’m focusing mainly on the third type, which allows consumers to discover information or products online by simply uploading or snapping a picture and focusing their query on the part of the image they’d like to research. It’s essentially the same as text search, just with an image representing the query that’s being matched to it.

TinEye launched the first visual search application, which is still in use today. It matches an image to other images on the web based on similar characteristics, such as shapes and colors. Unfortunately, TinEye supports only a limited range of search applications because it doesn’t map out the outlines of different objects within an image.

Today’s image recognition technology can actually recognize multiple shapes and outlines contained within a single image to allow users to match to different objects. For example, Microsoft’s image search technology allows users to search for specific items pictured within a larger image.

Microsoft is even working on detecting when the selected portion of the image has a shopping intent, showing “related products” in these instances. Unfortunately, Microsoft’s visual search is fairly limited to a few verticals, such as home appliances and travel.

Right now, this technology is limited. What companies like Pinterest, Microsoft and Google are investing in is a visual search application powered by machine learning technology and deep neural networks.

The idea is to get machines to recognize different shapes, sizes and colors in images the same way the human brain does. When we look at specific pictures, we do not see a sea of points and dotted lines. We immediately identify patterns and shapes based on past experiences. Unfortunately, we still barely understand how our minds interpret images, so programming this into a machine presents some obvious complications.

Visual search engines have come to rely on neural networks that use machine learning to improve their processes. Google benefits from a wealth of data that allows its Lens application to constantly improve its search functionality. Google Lens is not only able to identify different objects within pictures but can also match them to locations near you, provide customer reviews and sort listings by the same principles that govern Google’s own search algorithms.

Implications and future

So, what does this technology mean for users and businesses? Imagine being able to snap a picture of a restaurant and have a search engine tell you its name, location, peak demand times and menu specials for the night. You could feasibly snap a picture of a pair of shoes in a magazine or on a passerby and order them right there.

For e-commerce stores, visual search puts people very high in the funnel. With some unique images, product reviews and a good product description, you can entice buyers to make a purchasing decision on the spot.

This will also open up the field of competition a bit. Pinterest’s visual search engine is one of the most disruptive on the market. However, it only redirects pinners to posts on Pinterest, meaning you’ll need to develop a presence on that platform to reach those audience members.

With the rise of voice search and natural language processing (NLP) accompanying this trend, this technology could help kick-start the trend of interface-free SEO. (Although I suspect that keywords and text-based search will retain their importance, even for shopping and purchasing decisions.)

Potential strategies

In terms of optimizing for visual search, some of the most fundamental SEO practices will still apply. Structured data remains incredibly important, especially for visual search algorithms like Microsoft’s that still rely on it to match characteristics.
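To make the structured-data point concrete, here is a minimal sketch, in Python, of generating schema.org Product markup as a JSON-LD block for a product page. The product details and URL are hypothetical, used purely for illustration; structured data like this gives search engines explicit characteristics to match against an image.

```python
import json

# Hypothetical product details -- illustrative only, not a real listing.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Classic Leather Sneaker",
    "image": "https://example.com/images/classic-leather-sneaker.jpg",
    "description": "Handmade leather sneaker with a rubber sole.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "89.00",
        "availability": "https://schema.org/InStock",
    },
}

# The JSON-LD script block you would embed in the product page's <head>.
markup = (
    '<script type="application/ld+json">\n'
    + json.dumps(product, indent=2)
    + "\n</script>"
)
print(markup)
```

The same pattern works for any schema.org type; the key is that the `image` property points at the same image shoppers would snap or see in a visual search result.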

It’s important that images are displayed clearly and free of clutter so that visual applications have an easier time processing them. Beyond this, you should stick to the basics of image-based search optimization:

Use descriptive file names and alt text for every image.

Compress images so they load quickly, especially on mobile.

Submit an image sitemap so search engines can discover your visuals.

Add structured data to pages that feature product images.

Conclusion

Visual search will provide a new revenue stream for e-commerce stores and vastly improve the user’s shopping experience. This could have a major impact on SEO and paid media, bringing a renewed focus to image optimization, which SEO practitioners have long ignored. This new frontier of search will only reinforce existing SEO strategies and make the need to optimize for mobile search and your visual web presence more pressing.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.

About The Author

Kristopher Jones is a serial entrepreneur, angel investor, and best-selling author of "SEO Visual Blueprint" by Wiley (2008, 2010, 2013). Kris was the founder and former CEO of digital marketing and affiliate agency Pepperjam (sold to eBay) and has since founded multiple successful businesses, including ReferLocal.com, APPEK Mobile Apps, French Girls App, and LSEO.com, where he serves as CEO. Most recently, Kris appeared on Apple's first TV Show, "Planet of the Apps," where he and his business partner, comedian / actor Damon Wayans, Jr., secured $1.5 million for an on-demand LIVE entertainment booking app called Special Guest.