robots.txt file issue

I have uploaded several feeds. The first time, I uploaded 5 products and received this message on just 2 of them:

The robots.txt file for your page prevented us from crawling the image

I looked at the robots.txt file and found it was blocking /images and /assets, so I removed those disallow rules. I then uploaded a new feed with 10 products, and about 3 of them were rejected with the same message. My questions:

If robots.txt is blocking the crawler, why is it not blocking all products? Why are some approved?

Why would there still be an issue when I have removed the block? Is there some robots.txt file somewhere other than in the main web folder?

Why does it say the image can't be crawled, when clicking the image links in Merchant Center shows the correct image every time?

Re: robots.txt file issue

This can be an issue with cached information. Also, where are you seeing this message?

If it is in the Diagnostics tab, then note that this information is collected over a period of 30 days. That means Google does not recheck the images every day or every time you submit the feed, only over that 30-day window. The message should disappear on its own soon.

However, if you are seeing this in the Data feed tab, then there must be another problem.
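In the meantime, you can verify your robots.txt fix locally instead of waiting for Google's recheck. The sketch below uses Python's standard `urllib.robotparser` with rules matching what you described (a blanket `User-agent: *` group is assumed, as are the example domain and image paths; Google's image crawler identifies itself as `Googlebot-Image`):

```python
# Check robots.txt rules locally with the standard-library parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse rules directly instead of fetching them, so the example is
# self-contained. This mirrors a file where the /images disallow was
# removed but /assets is still blocked.
rp.parse("""
User-agent: *
Disallow: /assets
""".splitlines())

# The /images path is no longer disallowed, so crawling is permitted:
print(rp.can_fetch("Googlebot-Image",
                   "https://example.com/images/product1.jpg"))  # True

# A path still under a Disallow rule is rejected:
print(rp.can_fetch("Googlebot-Image",
                   "https://example.com/assets/photo.jpg"))  # False
```

If `can_fetch` returns True for your live image URLs (point the parser at your actual file with `set_url` plus `read`), the remaining errors are almost certainly just the cached 30-day data described above.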