This will not only change the max_retries but also enable a backoff strategy, which makes requests to all http:// addresses sleep for a period of time before retrying (up to a total of 5 times):
import requests
from requests.adapters import HTTPAdapter, Retry

s = requests.Session()
retries = Retry(total=5,
                backoff_factor=0.1,
                status_forcelist=[500, 502, 503, 504])
s.mount('http://', HTTPAdapter(max_retries=retries))
s.get('http://httpstat.us/500')
As per the documentation for Retry: if the backoff_factor is 0.1, then sleep() will sleep for [0.05s, 0.1s, 0.2s, 0.4s, ...] between retries. It will also force a retry if the status code returned is 500, 502, 503 or 504.
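For illustration, the quoted sleep sequence can be reproduced with an exponential formula of the form backoff_factor * (2 ** (retry_number - 1)). This is only a sketch of the arithmetic, not a urllib3 API call; the exact formula and its cap depend on the urllib3 version:

# Sketch only: reproduce the delays quoted above for backoff_factor = 0.1.
backoff_factor = 0.1
delays = [backoff_factor * (2 ** (n - 1)) for n in range(4)]
print(delays)  # [0.05, 0.1, 0.2, 0.4]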
Various other options to Retry allow for more granular control (a combined example follows the list):
- total – Total number of retries to allow.
- connect – How many connection-related errors to retry on.
- read – How many times to retry on read errors.
- redirect – How many redirects to perform.
- method_whitelist – Set of uppercased HTTP method verbs that we should retry on.
- status_forcelist – A set of HTTP status codes that we should force a retry on.
- backoff_factor – A backoff factor to apply between attempts.
- raise_on_redirect – Whether, if the number of redirects is exhausted, to raise a MaxRetryError, or to return a response with a response code in the 3xx range.
- raise_on_status – Similar meaning to raise_on_redirect: whether we should raise an exception, or return a response, if status falls in status_forcelist range and retries have been exhausted.
NB: raise_on_status is relatively new and has not yet made it into a release of urllib3 or requests. The raise_on_status keyword argument appears to have made it into the standard library at most in Python version 3.6.
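As a sketch of how several of these options can be combined (the values here are arbitrary examples, and raise_on_status is omitted per the note above):

import requests
from requests.adapters import HTTPAdapter, Retry

# Sketch only: arbitrary example values for the options listed above.
retries = Retry(total=10,                 # overall cap on retries
                connect=5,                # connection-related errors
                read=2,                   # read errors
                redirect=5,               # how many redirects to perform
                status_forcelist=[500, 502, 503, 504],
                backoff_factor=0.5,
                raise_on_redirect=False)  # return the 3xx response instead of raising
s = requests.Session()
s.mount('https://', HTTPAdapter(max_retries=retries))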
To make requests retry on specific HTTP status codes, use status_forcelist. For example, status_forcelist=[503] will retry on status code 503 (service unavailable).
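A minimal sketch (httpstat.us, used in the snippet above, simply returns whatever status code you request):

import requests
from requests.adapters import HTTPAdapter, Retry

# Sketch only: retry up to 3 times, but only when the server answers 503.
retries = Retry(total=3, backoff_factor=0.5, status_forcelist=[503])
s = requests.Session()
s.mount('http://', HTTPAdapter(max_retries=retries))
s.get('http://httpstat.us/503')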
By default, the retry only fires for these conditions:
- Could not get a connection from the pool.
- TimeoutError
- HTTPException raised (from http.client in Python 3, httplib in Python 2). These seem to be low-level HTTP exceptions, such as a URL or protocol not formed correctly.
- SocketError
- ProtocolError
Notice that these are all exceptions that prevent a regular HTTP response from being received. If any regular response is generated, no retry is done. Without using the status_forcelist, even a response with status 500 will not be retried.
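To see this in practice, compare a plain integer max_retries (which only covers the connection-level failures above) with a Retry object that sets status_forcelist. This is a sketch using the same httpstat.us test URL as the snippet at the top:

import requests
from requests.adapters import HTTPAdapter, Retry

s = requests.Session()

# Integer max_retries: only connection-level failures are retried,
# so the 500 response comes back immediately.
s.mount('http://', HTTPAdapter(max_retries=3))
s.get('http://httpstat.us/500')

# With status_forcelist, the 500 response itself triggers retries, with a
# back-off sleep between attempts; once retries are exhausted this may
# raise a retry-related exception instead of returning the response.
retries = Retry(total=3, backoff_factor=0.1, status_forcelist=[500])
s.mount('http://', HTTPAdapter(max_retries=retries))
s.get('http://httpstat.us/500')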
To make it behave in a manner which is more intuitive for working with a remote API or web server, I would use the above code snippet, which forces retries on statuses 500, 502, 503 and 504, all of which are not uncommon on the web and (possibly) recoverable given a big enough backoff period.
Comments:
- requests.get(url, max_retries=num_max_retries, delay_between_retries=3) – Anarchy
- just.get and just.post in github.com/kootenpv/just – Duaneduarchy