Facebook like - showing cached version of og:image, any way to refresh or reindex it?

Having an issue with Facebook like and a cached og:image.

Long story short: Facebook has cached an older version of our like image. The meta content URL can't be changed. Is there anything I can do to refresh it?

Long story: The site I'm working on has a meta tag for an og:image that Facebook uses when a page is liked. This meta tag uses the same image URL on all pages across the site. The image is simply a branding image for the site.

The issue is the site recently updated their branding, and we can't get the Facebook like image to update. When a user clicks the like link, the resulting post to Facebook still shows the old branding image.

The meta tag is similar to:

<meta property="og:image" content="http://[domain].com/images/bookmark/apple-touch-icon.png"/>

Whenever a like makes its way to Facebook, the URL to the image is changed to the cached Facebook URL, similar to this:

http://external.ak.fbcdn.net/safe_image.php?d=AQDajxm-qgVNdfEL&w=90&h=90&url=http%3A%2F%2F[domain].com%2Fimages%2Fbookmark%2Fapple-touch-icon.png

This URL displays the older version of the site's branding. It has been over a week, and it has not updated yet.

Is there any way to force Facebook to reindex the image/clear its cache? Or does Facebook do this automatically on some schedule? I couldn't find any relevant information on this.

I know that changing the URL in the meta tag could fix the issue, but the meta tag is generated by code shared across multiple sites and cannot be changed. I also tried the linter tool, as others suggested. No luck.

Extended answered 27/9, 2011 at 16:23 Comment(5)
Are you caching the page on the server? Try appending something like ?123 to the end of the URL and try the Facebook debugger again.Evincive
Doesn't work. I'm sure adding a query string creates a new cache entry, but the og:image still shows the older cached image. There are hundreds of pages on this site, all using the same og:image URL. I think FB recognizes that and has cached the og:image URL independently of the liked URL, rather than creating a separate og:image cache entry for each URL liked. The problem is, there doesn't appear to be any way to clear out that cache. I can go through the source of multiple pages and see that the og:image URL is changed to the same external.ak.fbcdn.net... URL each time.Extended
This is not a duplicate of the question that this links to. Additionally, today in 2013 simply re-linting the URL, as the answer on the linked question says, actually does not fix the cache issue.Dice
@cosmicbdog - I've re-opened this. Are you going to surprise us with a great answer? :)Channing
@Channing Unfortunately I have yet to surprise anybody with a great answer! lol But if I find the answer to this I will definitely post it.Dice

Insert your URL into their linter and it should reload its cache
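
If you have many URLs to refresh, the same re-scrape can be triggered programmatically through the Graph API by POSTing the page URL with scrape=true. A minimal PHP sketch, assuming you have a valid app access token ($accessToken and the forceRescrape helper are placeholders, not an official SDK call):

<?php
// Ask Facebook's Graph API to re-scrape a URL so its cached OG data is refreshed.
// $accessToken must be a valid app access token (a placeholder here).
function forceRescrape($url, $accessToken)
{
    $ch = curl_init('https://graph.facebook.com/');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'id'           => $url,   // page whose OG tags should be refreshed
            'scrape'       => 'true', // force a fresh scrape instead of serving the cache
            'access_token' => $accessToken,
        ]),
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response; // JSON describing the freshly scraped OG object
}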

Perlman answered 27/9, 2011 at 16:28 Comment(7)
Tried that already, didn't work. The og:image on the resulting debugger page still shows the cached version. I'm guessing the og:image is cached independently of the URL entered into the debugger.Extended
This worked, but I first had to change the og:image URL (both on the server and in the tag) to get it to pick up the change. You can then change it back to the original URL afterward, and it will pick up the change again (if you had the OP's problem of not being able to change it permanently). Also, I don't think this question should be closed; it is a different problem from the linked question.Alisander
This worked, and it's so lame that this is how it works. Got 200 URLs to 'lint' now.Collative
To change the og:image URL one just has to add a "?v=1" to the image path. No need to rename the file.Devilment
Worked for me after a couple of attempts. Clicked both buttons shown after the initial 'Debug' click, before it worked.Condolence
For me, the scraper/debugger didn't work until I removed the meta tags completely, scraped, re-added the ones I wanted to appear, and scraped again. It seems as if just changing the URL of an existing og:image tag wouldn't be picked up outright.Mesne
There's a button "Scrape again" on this page. This is what does the refreshing. i.imgur.com/0FsidR6.pngCutshall

You can use Facebook's Object Debugger, which allows you to enter the page URL and then, on the next page, re-submit it with a request to 'Fetch new scrape information'. This will clear Facebook's cache for the given URL. Note that it may take some time to propagate around all their cache nodes.

Facebook's Object Debugger can be found here: https://developers.facebook.com/tools/debug/

We recently found that Facebook was caching URLs by the relative URL alone and ignoring the query string, which messed up a few dynamic images we were serving purely based on the query string.

It turns out that you can specify a last-modified timestamp (in Unix timestamp format) to help ensure that when FB crawls your site, it always gets the correct image.

This can be done by including the following OG meta tag:

<meta property="og:updated_time" content="123465789" />

For dynamic sites you'll want to generate the content value; using PHP, the current Unix timestamp can be inserted as follows:

<meta property="og:updated_time" content="<?=time()?>" />
Record answered 24/6, 2015 at 9:36 Comment(2)
If the og:image url hasn't changed it'll use the old (cached) version though?Catachresis
Generate unix timestamp with JavaScript. var t = Math.floor((new Date().getTime()) / 1000); https://mcmap.net/q/92859/-how-can-i-generate-unix-timestampsHornswoggle

I've thought of a possible solution: what if you add a random string to the end of the URL?

Like www.server.com/something.php?v=<?php echo rand() ?> or www.server.com/something.jpg?v=<?php echo rand() ?>

I guess Facebook caches the object depending on the URL; changing it randomly could help.
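
As a concrete sketch of that idea on a PHP-rendered page (the domain and file name are the placeholders from above):

<?php
// Sketch: append a random cache-buster so Facebook treats the image URL
// as a brand-new object and scrapes it fresh.
// Caveat (see the comments below): every random URL is a distinct object
// to Facebook, so like/share counts no longer aggregate on one canonical URL.
$imageUrl = 'http://www.server.com/something.jpg?v=' . rand();
?>
<meta property="og:image" content="<?= htmlspecialchars($imageUrl) ?>" />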

Slimsy answered 14/5, 2013 at 10:20 Comment(4)
This worked for me. What I'm wondering now is: will this cause problems with Google and SEO?Boucicault
Regarding Google and SEO in general, I can say that we have updated some sites we were doing SEO on with the logic above, and we haven't noticed any penalization or SEO issues, but this is totally empirical.Mannerheim
But then you can't count the number of shares.Lewse
Well, I've solved that by tracking this event manually via an Ajax call when the user clicks on the "like" link.Mannerheim

Seven years after this post was made, this is still a problem, but it's not Facebook's cache: it is human error (allow me to elaborate).

OG:TYPE affects your image scrape:

https://ogp.me/#type_article is not the same as https://ogp.me/#type_website

Be aware that og:type=website will cause any /sub-pages/ of that URL to be treated as canonical. This means you will have trouble getting your images to update using the scraper, no matter what you do.

Consider this "assumption and common mistake"

<meta property="og:type" content="website" /> => https://www.example.org (parent)
<meta property="og:type" content="website" /> => https://www.example.org/sub-page/
<meta property="og:type" content="website" /> => https://www.example.org/sub-page/child-2/

Ergo: /sub-page/ and /child-2/ will inherit the og:image of the parent.

Those are not all "websites": one is a website, the others are articles.

If you do that, Facebook will think all of those are canonical, and it will put the FIRST og:image into all of them (try it, you'll see). If you set the og:url to be your root or parent domain, you've told Facebook they are all canonical. (There is good reason for that, but it's off topic.)

Consider this solution (which is what most people "really want"):

<meta property="og:type" content="article" /> => https://www.example.org/sub-page/
<meta property="og:type" content="article" /> => https://www.example.org/sub-page/child-2/

If you do that, Facebook will give you far fewer problems when scraping your NEW images.
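
Put together, a per-page head along these lines avoids the canonical trap (a sketch; the example.org URLs and image path are placeholders):

<!-- Each sub-page declares itself as an article with its own URL and image -->
<meta property="og:type" content="article" />
<meta property="og:url" content="https://www.example.org/sub-page/" />
<meta property="og:image" content="https://www.example.org/images/sub-page-share.jpg" />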

In closing: YES, the cache busters, random vars, changing URLs, and other suggestions here can work, but they will seem like "intermittent voodoo" if the og:type is not specified correctly.

PS: remember that a CDN or server-side cache may serve stale content to Facebook's scraper even if you "think" you can see the most recent version. (I won't spend any time on this other than to point out that it will waste colossal amounts of your time if not double-checked.)

Hagood answered 6/9, 2019 at 19:29 Comment(1)
They will find it eventually, after trying everything else that does not work (which is a healthy experience). Took me long enough. lolEncy
  1. Change the URL of the og:image when you update the image.

example

<meta property="og:image"         content="https://abc.lk/img/share-english.jpg" />

to

<meta property="og:image"         content="https://abc.lk/img/share-english-1.jpg" />
  2. Go to https://developers.facebook.com/tools/debug/sharing

  3. Add the URL and click Debug.

  4. Check "Time Scraped" and click "Scrape Again".

Devoir answered 21/10, 2019 at 6:09 Comment(1)
No. Change property="og:type" content="website" to content="article", because every page on your website is not a separate website; it's an article. [See the post above titled "OG:TYPE affects your image scrape".]Ency
<meta property="og:image" content="https://example.com/image.jpg?v=<?= time() ?>" />

This was my solution, using ?v=UNIX_TIMESTAMP, but you still have to fetch the page again via the sharing debugger:

https://developers.facebook.com/tools/debug/sharing

Versicle answered 15/7, 2020 at 18:55 Comment(0)
