Google SEO News and Discussion Forum

Whether you like instant previews or not is not the topic of this discussion. However, feel free to express how you feel about instant previews in the ongoing discussion [webmasterworld.com], which is more general than this one.

I want to discuss what impact instant previews will have on traffic, who will benefit, and how we can benefit from it.

Good for lower ranked sites?
Clicking around, I noticed that I ventured lower in the SERPs. Is it possible that instant previews may benefit websites that rank in position five and lower?

Page quotes and click-through rates
Some sites have page quotes in a larger font, bordered with an orange box. Could the right quote inspire more click-throughs? How do we get Google to grab the right quote?

Nosnippet meta tag
The nosnippet meta tag is designed to keep Google from using page text in the SERPs, beneath your title. Using the nosnippet meta tag will also remove the instant preview. If you are currently using the nosnippet tag, is it time to consider removing it in order to test the benefit of displaying an instant preview?
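For reference, the nosnippet directive discussed above is set in the page head, either for all crawlers that honor it or for Googlebot specifically:

```html
<!-- Applies to all crawlers that honor the directive -->
<meta name="robots" content="nosnippet">

<!-- Or target Googlebot only -->
<meta name="googlebot" content="nosnippet">
```

Removing whichever of these is present is the "test" being suggested here.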

It seems pretty obvious that sites with an unattractive or missing preview are likely to suffer in comparison to those that display well.

Unfortunately there seems to be no way for a webmaster to control what appears in the preview, and I have seen many reports of missing images and stylesheets in current previews (and Flash content never shows at all).

So at the moment it seems something of a lottery - a matter of luck rather than skill.

I think the impact of instant previews on traffic is being diffused. Here are two reasons why:

Mobile apps - they have opened a new way to generate traffic without requiring users to pass through the SERPs. The percentage of mobile users varies for each site; my network averages about 10% of traffic from mobile users, and this number grows each month.

Stupid users - Google does a great job adding new features, but let's not underestimate how stupid users can be. A large number of people still go to google.com and search for "google". I bet many users don't even know to click on the magnifying glass icon to see the instant preview.

That being said, I do think we should figure out how best to address instant previews. Some options to think about are:

Should you cloak a pretty page for Googlebot? Should you fill out short pages so they don't look empty? Should you increase the use of h1 tags to make previews easier to read? Should you shrink the font size and make users click through to read?

Those are some of the ideas bouncing around in my head about instant previews.

I don't think Google is deliberately trying to reduce the visibility of their AdWords. The instant preview only obscures the AdWords when the user actively selects it. I doubt many users will want to see the instant preview and AdWords at the same time.

Google is well known for testing countless versions of their SERPs. I would bet that they have tested it and noticed minimal impact on AdWords, so they went ahead with the current placement. I also would not be surprised if they tweak the instant preview display based on their constant testing.

As much as I like the new feature, it is completely exploitable. I have already completed a test where an advertising message is displayed as the preview instead of the webpage. Obviously this will be counted as black hat by Google, but I'm sure it won't be long before sites start to use this as a method to increase traffic. It would take a manual Google check to catch this out, so it would be very hard to detect.

I have already completed a test where an advertising message is displayed as the preview

In my view, this is how it should have been all along - give the webmaster control of the "thumbnail" for corporate branding purposes, a kind of hybrid avatar and favicon, if you will.

Obviously this will be counted as black hat by Google

We shall see (I have conducted similar tests).

In the current implementation webmasters will no longer feel able to keep images out of the index (because their previews will suffer). It is an absurd waste - and hardly "fair use" - to screenshot every page in the SERPs, but that is where we are.

So, staying on-topic ("how we can benefit from it") I would say that in order to benefit from previews you must, at the very least, lift any restrictions on indexing images as a prerequisite.

Google is also not playing fair with their preview bot. It comes into my server, across all sites, from Googlebot IPs that have a proper rDNS and from IPs that have no rDNS at all. "We will always run our bots with reverse lookup available so you can check it's genuine." Yeah, right - you and Bing both!

How do we know the preview bot is genuine? Its user agent is easily forged, and a fair few of Google's IPs are used by the general public or (often worse) by app creators, some of whom have been proven to be hackers or even criminals. To display previews of our sites on Google's now-corrupted SERPs, we have no obvious recourse but to allow ANY Google IP onto our servers provided it carries the Web Preview UA. There is no way of checking its legitimacy.
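For context, the forward-confirmed reverse DNS check that Google recommends for verifying Googlebot can be sketched in a few lines of Python (the helper name is mine). The complaint above is precisely that the preview bot does not always pass this test:

```python
import socket

def is_google_ip(ip):
    """Forward-confirmed reverse DNS check, as recommended for Googlebot.

    1. Reverse-resolve the IP address to a hostname.
    2. Verify the hostname ends in googlebot.com or google.com.
    3. Forward-resolve that hostname and confirm it maps back to the IP.
    """
    try:
        host, _, _ = socket.gethostbyaddr(ip)        # step 1: rDNS lookup
    except OSError:
        return False                                  # no rDNS record at all
    if not host.endswith((".googlebot.com", ".google.com")):
        return False                                  # step 2: wrong domain
    try:
        _, _, addrs = socket.gethostbyname_ex(host)   # step 3: forward lookup
    except OSError:
        return False
    return ip in addrs
```

An IP with no rDNS record, or one whose hostname is outside Google's domains, fails at step 1 or 2 - which is exactly what makes the no-rDNS preview hits impossible to verify.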

And what about images disallowed in robots.txt? Some of our sites show images culled (by the preview bot?) illegally (the images directory is disallowed in robots.txt), others show no images at all - one site even shows as "not available" (and contrary to one suggestion, it is a school site and certainly not a #*$! site!).
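For reference, the kind of disallow rule being ignored here is the standard one (the directory name is just an illustration):

```
User-agent: *
Disallow: /images/
```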

I read one report which suggested that previews could be scraped from the SERPs - easily enough done - and used on other web sites. Now that's going to be interesting!

in order to benefit from previews you must, at the very least, lift any restrictions on indexing images as a prerequisite

I can say that hasn't been necessary on my end. I have the images folder blocked in robots.txt for all crawlers, including Google. I also have hotlink protection in place through Apache, and yet the sites on my server are still generating previews. I can see that Google Web Preview is being returned a 200 response when requesting the images, so I really don't know how they are doing this, but I have bigger fish to fry right now than to worry about a single engine stepping around explicit instructions not to crawl or link to server images.
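For anyone wondering why hotlink protection doesn't stop the preview bot, a typical Apache mod_rewrite setup looks something like this sketch (the domain and extensions are placeholders, not the poster's actual config):

```apache
# Classic referer-based hotlink protection in .htaccess (requires mod_rewrite)
RewriteEngine On
# Allow requests with an empty referer (direct hits, many bots)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests refered from our own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everything else asking for images gets a 403
RewriteRule \.(jpe?g|png|gif)$ - [F,NC]
```

Note that rules like this conventionally let empty-referer requests through, which is one plausible reason a bot that "acts like a browser" fetching a page's own images sails straight past them with a 200.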

But overall, as far as how to benefit from this I'm still assessing it myself. I don't think it's a totally bad thing or I would simply ban those web preview IPs at my firewall.

What I am realizing, based on my observations, is that sites with Flash and other technologies that generate blank areas are no doubt going to suffer. Casual users might think that if they go to the site they will be required to install some sort of plugin, and just skip it rather than be bothered by it.

More than ever now, I think it will be important to present a visually appealing static presentation. At least for the main page anyway. For me that means business as usual because it is what I've always preferred. And also, Google is just one source of traffic. It still stands with me to design for the consensus and not get bogged down with trying to appease any one source of traffic. In the long run it balances itself out.

For me this probably means increasing the size of thumbnail images. My thumbs are about as small as I can reasonably get away with to improve performance, which means that in the preview they are smaller than I can reasonably get away with and in many cases you can't really make out what the image is. Time to do some mass automated image resizing.

On the other hand, this gives me an excuse to reduce the number of thumbnails displayed per page, which will immediately increase the total number of pages. Hmmm... that could be useful...

I can see that the Google web preview is being returned a 200 response when requesting the images

That is because the Google Web Preview bot does not consult robots.txt and acts "like a browser".

It is actually a prefetcher bot that appears to work like this:

If a site appears in the SERPs and its robots.txt restricts Googlebot access to images, CSS etc then the Google Web Preview bot will - once the "preview" option is chosen for a particular SERP - prefetch all such pages in the listed SERPs (not just the one selected for previewing) and will also cache the results.

So seeing the bot in your logs does not necessarily mean that someone previewed your site.

sites with flash and other technologies that generate blank areas are no doubt going to suffer

There are many reports on Google's webmaster forum and elsewhere of missing images and stylesheets producing uninviting screenshots and blank areas.

Martinibuster's topic here is "how we can benefit from the previews".

The first step towards any benefit is ensuring a good-looking preview image.

For now, at least, cloaking to the prefetcher bot gives the best results here.

Anyone know if there's any rhyme or reason to how often these images get updated? Does it correspond to a new page cache, or is it some other time period? Has anyone changed their page and seen the preview image change to reflect the new page? How long did it take?

caribguy, I think it would depend on the niche, whether you want to highlight your text content or your image content or both. I would think that if you're in arts, crafts, any sort of design-related trade (architecture, construction, graphic design, decorating etc) or anything else where people are primarily interested in what your products look like, you would want to highlight the images.

If you're in an informational or research niche, services, B2B, news, etc you might be more inclined to highlight the text.

Q: Can I show different content in the preview?
A: No. You must show Googlebot and the Google Web Preview the same content that users from that region would see (see our Help Center article on cloaking).

The article is undated, but seems to have been published very recently (post launch).

Q: I want to block my images from being indexed, but I'm happy with them appearing on a preview image; how can I juggle the two?
A: In order for images to be embedded in previews, it is important that they are not disallowed by your robots.txt file. In order to block crawlable images from being indexed, you can use the "noindex" X-Robots-Tag HTTP header element.
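Google's suggested setup can be expressed as a short Apache config sketch (requires mod_headers; the file extensions are illustrative): the images stay crawlable for the preview bot, but the header keeps them out of the image index.

```apache
# Serve images with a noindex header instead of blocking them in robots.txt
<FilesMatch "\.(jpe?g|png|gif)$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>
```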

Until it settles down and shows the images consistently, I think it will probably lose us as much traffic as we gain.

If I do site:www.example.com on my site, half of the results show images and half don't. Half of them show background images from the CSS and half of them don't - and this is with the image directory blocked in robots.txt. So half of them will attract traffic (maybe) and the other half will send visitors running away.

Q: I want to block my images from being indexed, but I'm happy with them appearing on a preview image; how can I juggle the two?
A: In order for images to be embedded in previews, it is important that they are not disallowed by your robots.txt file. In order to block crawlable images from being indexed, you can use the "noindex" X-Robots-Tag HTTP header element.

If we remove the block in robots.txt, doesn't that mean that all the thousands and thousands of images we have on our sites will now get downloaded by all the bots? Because they won't be able to tell it's noindex until they grab it. Or does it not work like that?

I can't see X-Robots-Tag working any differently from the normal meta tag, with the proviso that, as far as I'm aware, a lot of SEs do not obey X-Robots-Tag anyway. So they are saying: "Do what we say and stuff the other SEs!"

Oh, yes. It was Google's assertion that I could use that tag to block the images that prompted that comment. :(

Some time back I dreamed up a scheme that would allow webmasters to specify exactly what could be taken, by whom, and for what purpose. It included a variable file path/name specified in robots.txt and the potential for as many sections as necessary, along the lines of Linux ini files. When I mentioned it around WebmasterWorld, it was suggested I read the spec for a certain robots.txt replacement scheme, which in my opinion was not terribly good (I forget the name now). I didn't push mine because (e.g.) Google would ignore it anyway, but it's coming up time to force some such scheme onto the SEs. The problem is: how? I'm sure Google, for one, would ignore it.

Interesting topic! I'm just finding my way around these previews - in fact I only started paying attention after reading this thread, which probably goes to show that it will be months before the "general public" catches on.

I have a couple of questions that someone may already know answers for, please oblige if you do:

Does the preview bot share its data with the indexing engine in the way the AdSense (Mediapartners-Google) bot does?

The way the web page renders is NOT exactly like FF, IE, or Opera, which I test my layouts in. Some of the images get misaligned, and it also looks like they render for page widths slightly less than 800px - a rarity these days. What type of browser do you guys think Google's rendering looks most like?

Quick note: a week after I changed a page, the SERPs are now showing the new Instant Preview image. The latest cache shows a date three days ago. So: four days to a new cache, and another three days to a new preview image. Obviously that will differ for different sites/pages, but I thought I'd at least share my one test with you all.

The preview completely obscures the AdWords block on the right - is this deliberate?

Scooterdude has got the point. People will try to click on the preview. How many of those clicks "won't work" and will become clicks on AdWords instead? One tiny tweak to the JavaScript that displays the preview (for only a small % of users) and Google gets 1%+ more clicks on AdWords - that's like a billion-plus dollars. Best of all, nobody can tell but Google.