I'm using the Python requests library for the first time, and I am confused. When I run the function below in a for loop with different base URLs, each call appears to receive a response, but the content returned is the same for both URLs.
If I look at the API URL in my browser, I can see that it's the content for the first URL that's being returned both times. What am I missing?
import requests

base_urls = ['http://kaweki.wikia.com/', 'http://solarmovie.wikia.com/']

def getEdits(wikiObj, limit=500):
    # Query the wiki's LatestActivity endpoint and return its items.
    payload = {'limit': limit}
    r = requests.get('{}api/v1/Activity/LatestActivity'.format(wikiObj),
                     params=payload)
    edits = r.json()
    return edits['items']

for url in base_urls:
    print(getEdits(url))
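One way to narrow this down (an editor's debugging sketch, not part of the original post) is to print the exact URL each iteration builds, and then, after a live call, inspect `r.url` and `r.history`, since requests follows redirects by default and both domains may end up at the same place. The offline part of the check:

```python
base_urls = ['http://kaweki.wikia.com/', 'http://solarmovie.wikia.com/']

# Print the exact URL each loop iteration will request; a typo or a missing
# trailing slash here would silently make both calls hit the same endpoint.
urls = ['{}api/v1/Activity/LatestActivity'.format(base) for base in base_urls]
for full_url in urls:
    print(full_url)

# After a live call, r.url and r.history reveal whether the server
# redirected both requests to the same final URL:
#   r = requests.get(full_url, params={'limit': 500})
#   print(r.url, r.history)
```

If `r.url` turns out identical for both base URLs, the server (not requests) is collapsing them.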
The same thing happens with curl. Funny enough, when you run the requests in your browser (Chrome) you *will* get different results. I tried adding the 'Cache-Control': 'no-cache' header to the request, but it didn't solve it. To debug further, we would need to see the server-side Apache logs to learn why the server treats these as identical requests when they come from code/curl but as different requests when they come from a browser. – Yellowgreen
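For reference, the no-cache attempt the comment describes could look like the sketch below. It builds the request without sending it (via `requests.Request` and `prepare()`), so the outgoing header and URL can be inspected offline; the URL and limit value are taken from the question.

```python
import requests

# Build, but don't send, a GET carrying the Cache-Control header the comment
# mentions; prepare() exposes exactly what would go over the wire.
req = requests.Request(
    'GET',
    'http://solarmovie.wikia.com/api/v1/Activity/LatestActivity',
    params={'limit': 500},
    headers={'Cache-Control': 'no-cache'},
)
prepared = req.prepare()
print(prepared.headers['Cache-Control'])
print(prepared.url)
```

Note that `Cache-Control: no-cache` only asks intermediaries not to serve a cached copy; if the origin server itself is returning the same body for both hosts, the header won't change anything, which matches what the comment observed.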