Is there a google API to read cached content? [closed]

I know you can go to http://webcache.googleusercontent.com/search?q=cache:http://example.com/ to view Google's cache of any URL, but do they provide an API to hit thousands of these and pay for access?

I don't want to just hammer these URLs with raw HTTP GETs, get my IP address banned, or otherwise upset Google.

Just wondering if they offer a way to pay and do this through official channels like they do with their search API.

Unwritten answered 25/9, 2013 at 16:16

Google doesn't appear to offer an official API for accessing its cached results.

There are some attempts to scrape the cache and wrap it in an API, such as this Perl module.

Other than that, the Wayback Machine has an API for retrieving archived versions of sites. Perhaps that will do?
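To illustrate, querying the Wayback Machine's public availability endpoint (`archive.org/wayback/available`) might look like the sketch below. This is a minimal example, not a full client; error handling and retries are omitted:

```python
import json
import urllib.parse
import urllib.request
from typing import Optional

WAYBACK_API = "https://archive.org/wayback/available"


def availability_url(page_url: str) -> str:
    """Build the Wayback availability query URL for a given page."""
    return WAYBACK_API + "?" + urllib.parse.urlencode({"url": page_url})


def latest_snapshot(page_url: str) -> Optional[str]:
    """Return the URL of the closest archived snapshot, or None if absent."""
    with urllib.request.urlopen(availability_url(page_url)) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None
```

Calling `latest_snapshot("example.com")` would return an `https://web.archive.org/web/...` URL when a snapshot exists, so a batch job could loop over thousands of URLs against Archive.org's API instead of scraping Google's cache pages.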

Lavernelaverock answered 3/11, 2014 at 2:57

Currently there's no tool I've found that does this. You'd have to write your own script to fetch the cached pages in batches. To avoid Google blocking you, I suggest capping the number of URLs scraped per run. Not ideal, but running a script 10 times is better than looking at 1,000 cached URLs individually. :/
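A capped, paced fetch loop along those lines could be sketched as follows. The `webcache.googleusercontent.com` URL format is the one from the question; the cap and delay values are arbitrary illustrations, and Google may still block automated requests, so treat this as a sketch rather than a supported method:

```python
import time
import urllib.error
import urllib.request

CACHE_PREFIX = "http://webcache.googleusercontent.com/search?q=cache:"


def cache_url(page_url: str) -> str:
    """Google cache lookup URL for a page (format from the question)."""
    return CACHE_PREFIX + page_url


def fetch_cached(urls, per_run=10, delay_seconds=5.0):
    """Fetch at most `per_run` cached pages, pausing between requests."""
    results = {}
    for page_url in urls[:per_run]:
        try:
            with urllib.request.urlopen(cache_url(page_url)) as resp:
                results[page_url] = resp.read()
        except urllib.error.URLError:
            results[page_url] = None  # blocked or unavailable
        time.sleep(delay_seconds)  # pace requests to stay polite
    return results
```

Running `fetch_cached` repeatedly on successive slices of a URL list gives the "10 small runs instead of 1,000 lookups" workflow described above.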

If you want to see whether anything you edit on your site would affect your potential rankings in Google, check out SEORadar.com; they'll do that for you.

Khalsa answered 12/4, 2017 at 18:28