I'm having some trouble with Facebook's scraper after migrating my project to a new server.
After updating my DNS records to point to the new server's IP, Facebook's scraper still hits the old IP, which makes me wonder: is there a way to force the scraper to update its DNS cache?

On the old box I'm returning a short max-age in the Cache-Control header (`Cache-Control: max-age=300`). The situation has persisted since the migration, approx. one week ago.

I can't seem to find any relevant solutions on SO or even in Facebook's documentation.

The only relevant tip I've come across so far is manually running Facebook's Linter against the URL, which forces the cache to update, but with over 10 million image shares that's obviously not a workable approach for me.

I see your point, but that's not going to help in my case, because the objects are already posted on profiles and circulating on Facebook. Or did I misunderstand your tip?
–
Mostafa Torbjørn Berg — Jul 20 '13 at 18:07

If you were able to solve the problem by using the Facebook Linter for any one URL, then this should solve your problem. The Linter is just a front end for an API endpoint, so you can automate the linting process.
–
Agent47DarkSoul — Jul 21 '13 at 5:21

The above solution is not going to repost the object. What it does is allow you to tell Facebook to re-scrape your object (the same thing you do manually in the Linter). Since this is just an API endpoint, you can make as many calls to it as you want. E.g., say your object is a web page whose URL is example.com/webpage. You can make a POST request to the endpoint like this: https://graph.facebook.com/?id=http://www.example.com/webpage&scrape=true This forces the Facebook crawler to re-scrape the object. Now you can run a loop and update as many URLs as you want.
–
Agent47DarkSoul — Jul 30 '13 at 8:57
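
The loop described in the comment above could be sketched roughly like this (a minimal Python illustration using only the standard library; the helper names and the example URL list are my own, and depending on your app you may also need to pass an `access_token` parameter and respect Graph API rate limits):

```python
# Sketch: batch-refresh Facebook's cached scrape of many URLs by POSTing
# to the Graph API endpoint with id=<url>&scrape=true, as described above.
import urllib.parse
import urllib.request

GRAPH_ENDPOINT = "https://graph.facebook.com/"

def build_scrape_request(object_url: str) -> tuple[str, bytes]:
    """Return (endpoint, POST body) asking the crawler to re-scrape object_url."""
    body = urllib.parse.urlencode({"id": object_url, "scrape": "true"})
    return GRAPH_ENDPOINT, body.encode("ascii")

def force_rescrape(object_url: str) -> bytes:
    """POST to the Graph API, forcing Facebook to re-fetch the URL's OG data.
    Makes a real network call; the response is the refreshed object as JSON."""
    endpoint, body = build_scrape_request(object_url)
    with urllib.request.urlopen(endpoint, data=body) as resp:
        return resp.read()

# Usage (performs live requests, so the target URLs must be publicly reachable):
# for url in ["http://www.example.com/webpage", ...]:
#     force_rescrape(url)
```

With ~10M shared URLs you would presumably want to throttle this loop and/or use the Graph API's batching support rather than one request per URL, but the request shape is the same.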