Facebook debugger: Clear whole site cache

I am aware that Facebook caches the Like data for specific pages on your site once they're visited for the first time, and that entering the URL into the debugger page clears the cache. However, we've now improved our Facebook descriptions/images/etc., and we need to flush the cache for the entire site (about 300 pages).

Is there a simple way to do this, or if we need to write a routine to correct them one by one, what would be the best way to achieve this?

Scully answered 16/11, 2012 at 15:57

Is there a simple way to do this,

Not as simple as a button that clears the cache for a whole domain, no.

or if we need to write a routine to correct them one by one, what would be the best way to achieve this?

You can get an Open Graph URL re-scraped by making a POST request to:

https://graph.facebook.com/?id=<URL>&scrape=true&access_token=<app_access_token>

So you'll have to do that in a loop for your 300 objects. But don't do it too fast, or you might hit your app's rate limit; leave a few seconds between requests, which, according to a recent discussion in the FB developers group, should work fine. (And don't forget to URL-encode the <URL> value properly before inserting it into the API request URL.)
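
For illustration, here is a minimal sketch of a single re-scrape request in PHP using cURL; the page URL and the token value are placeholders you would substitute with your own:

$pageUrl = 'https://example.com/some-page/'; // placeholder: the page whose cache you want cleared
$appAccessToken = 'APP_ID|APP_SECRET';       // placeholder: your app access token

// Build the Graph API request; note the page URL is encoded before insertion
$ch = curl_init('https://graph.facebook.com/?id=' . urlencode($pageUrl)
    . '&scrape=true&access_token=' . $appAccessToken);
curl_setopt($ch, CURLOPT_POST, true);           // the scrape has to be requested via POST
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response instead of echoing it
$response = curl_exec($ch);                     // JSON describing the freshly scraped object
curl_close($ch);

echo $response . "\n";

For 300 pages, wrap this in a loop over your URLs with a sleep() between iterations, as described above.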

Ass answered 16/11, 2012 at 16:14

The simple solution in WordPress: go to Settings → Permalinks and switch to a custom permalink structure. In my case I just added an underscore, so I used this... /_%postname%/

Facebook then has no info on the (now) new URLs, so it scrapes them all fresh.

I was looking for this same answer, and all the existing answers were super complicated for me as a non-coder.

Turns out there is a very simple answer, and I came up with it all by myself :)

I have a WordPress website where, using a variety of plugins, I bulk uploaded over 4,000 images, which created 4,000 posts.
The problem was that I uploaded them and then tried setting up the Facebook share plugins before sorting out the og:meta tag issue, so all 4,000 posts were scraped by FB with no og:meta, and when I then added the tags it made no difference. The FB debugger could not be used as I had over 4k posts.

I must admit I'm a bit excited; for many years I have got helpful answers from Google searches sending me to this forum. Often the suggestions I found were well over my head, as I'm not a coder, I'm a "copy paster".

I'm so happy to be able to give back to this great forum and help someone else out :)

Erhard answered 9/5, 2017 at 14:22
Comment (Predicable): Do this if you want to kill (turn into 404) all your entries in search engines.

Well, I also had the same scenario and used a hack, and it works. But obviously, as @Cbroe mentioned in his answer, the API call has some rate limiting, so you should take care of that; in my case I only had 100 URLs to re-scrape.

So here is the solution:

$xml = file_get_contents('http://example.com/post-sitemap.xml'); // <-- Because I have a WordPress site, which has a sitemap.

$xml = simplexml_load_string($xml); // Load it as XML
$applicationAccessToken = 'YourToken'; // Application access token. You can get it from https://developers.facebook.com/tools/explorer/

$urls = [];
foreach ($xml->url as $url) {
    $urls[] = (string) $url->loc; // Collect the URLs from the sitemap into our new array
}

$file = fopen("response.data", "a+"); // Write the API responses to another file so we can debug them later.
$context = stream_context_create(['http' => ['method' => 'POST']]); // The scrape call has to be a POST request.
foreach ($urls as $url) {
    echo "Sending URL for scrape: $url\n";
    $data = file_get_contents('https://graph.facebook.com/?id=' . urlencode($url) . '&scrape=true&access_token=' . $applicationAccessToken, false, $context);

    fwrite($file, $data . "\n"); // Put the response in the file
    sleep(5); // Sleep for 5 seconds to stay under the rate limit!
}
fclose($file); // Close the file once all the URLs are scraped.

echo "Bingo, it's completed!";
Slipon answered 27/7, 2017 at 9:01
