SSL failure on Windows using python requests
Apologies for the very long post, but I'm really trying to be thorough...

I have a dedicated web site that serves as bridge to exchange data between various environmental models operated from remote servers and running on different types of OSes (Linux, MacOS and Windows). Basically each server can upload/download data files to the web site, and files are then used for further processing with a different model on another server.

The web site has some basic protection (IP filtering, password and SSL using LetsEncrypt certificates). All the remote servers can access the site and upload/download data through a simple web interface that we have created.

Now we are trying to automate some of the exchange with a simple Python (2.7) daemon based on the requests module. The daemon monitors certain folders and uploads their content to the web site.

The daemon works fine on all of the remote servers, except for one running Windows 7 Enterprise 64bit. This server has Python 2.7.13 installed and the following packages: DateTime (4.1.1), psutil (5.2.0), pytz (2016.10), requests (2.13.0), zope.interface (4.3.3).

From this server the SSL connection works fine through a web browser, but the daemon always returns:

raise SSLError(e, request=request)
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)

Here is what we tried so far:

  • setting verify=False. This works fine, but we cannot use it in our final production environment.
  • copying the certificate from another server where the daemon works, and setting verify=(name of the certificate file) (no success)
  • setting the 'User-agent' to the exact same string that we get from the Windows machine on the web site when the connection is done with a web browser (no success)

What other setting should we be looking at on the Windows server to try to solve the problem? Can it be a firewall setting that somehow allows the browser's SSL connection through but blocks the Python daemon?

UPDATE
The organization that is running the Windows remote server that was producing the error substitutes all SSL certificates at the proxy level.
Their IT people solved our problem by adding the URL of our web site to the list of "pass through" sites on their proxy settings.

This works and it's fine for now. However, I'm wondering if we could have handled the certificate substitution directly in Python...
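One way it could probably have been handled in Python, assuming the proxy's root CA certificate can be exported from the Windows certificate store (certmgr.msc, export as a Base-64 encoded X.509 file), is to point requests at that exported file. The file name below is hypothetical:

```python
import requests

# Hypothetical file: the proxy's root CA, exported from the Windows
# certificate store as a Base-64 (PEM) encoded .cer/.pem file.
PROXY_CA = 'proxy-root-ca.pem'

s = requests.Session()
s.verify = PROXY_CA  # every request on this session verifies against this CA
# response = s.get('https://myinternalsite')
```

With the proxy's CA trusted explicitly, the substituted certificates would verify without disabling verification altogether.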

Cornerstone answered 23/3, 2017 at 16:11 Comment(0)
It is possible to make the Requests library use Python's built-in ssl module for the SSL portion of the HTTP connection. This works because the urllib3 utilities that Requests relies on allow passing a Python SSLContext into them.

However, note that this may depend on the necessary certificates already being loaded into the trust store by a previous Windows access (see this comment).

Some sample code follows (this needs a recent version of Requests; it works with 2.18.4):

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.ssl_ import create_urllib3_context

class SSLContextAdapter(HTTPAdapter):
    def init_poolmanager(self, *args, **kwargs):
        context = create_urllib3_context()
        kwargs['ssl_context'] = context
        context.load_default_certs() # this loads the OS defaults on Windows
        return super(SSLContextAdapter, self).init_poolmanager(*args, **kwargs)

s = requests.Session()
adapter = SSLContextAdapter()
s.mount('https://myinternalsite', adapter)
response = s.get('https://myinternalsite')
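As noted in the comments, a variant that avoids importing the private create_urllib3_context helper is to build the context with the public ssl module API; ssl.create_default_context() already calls load_default_certs() internally. A sketch (the class name here is my own):

```python
import ssl
import requests
from requests.adapters import HTTPAdapter

class OSTrustAdapter(HTTPAdapter):
    # Verify against the OS trust store using only the public ssl API.
    def init_poolmanager(self, *args, **kwargs):
        # create_default_context() loads the OS default certs itself,
        # so no explicit load_default_certs() call is needed here.
        kwargs['ssl_context'] = ssl.create_default_context()
        return super(OSTrustAdapter, self).init_poolmanager(*args, **kwargs)

s = requests.Session()
s.mount('https://myinternalsite', OSTrustAdapter())
```

This runs unchanged on both Python 2.7 and Python 3.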
Miche answered 7/5, 2018 at 13:37 Comment(11)
Doesn't work for me. I'm using Requests v2.19.1, and it gives me this error: 'PyOpenSSLContext' object has no attribute 'load_default_certs'Tumulus
For Python 3.6.5 and requests 2.19.1, I had to replace the create_urllib3_context import with import ssl and then change the context's assignment to be context = ssl.create_default_context().Unmoving
Thanks for the feedback. On Windows 10, using Python 2.7.15 and requests 2.21.0, the create_urllib3_context import and code above still works correctly for me.Miche
@kraussian, what platform are you running on? How did you install Python?Miche
@tm1212, on requests 2.18.4, 2.19.0 and 2.21.0, the call to SSLContext fails for me with TypeError: __new__() takes at least 2 arguments: 1 given. Are you doing something different?Miche
@Unmoving you raise a good point... the ssl create_default_context method actually does the load_default_certs operation automatically, and this is a simpler way to set up the context without having to import a private member from urllib3. This seems to work on both Python 2.7 and Python 3. My testing of the above code also works on Python 3, but it could be simpler using your method.Miche
I had it working with Python3 and only tried it against requests 2.12.4 and 2.19.1 . I removed my comment as @Unmoving has the better solution.Thrash
Tried this solution and it works with Python 3.7, requests 2.21.0. Is there a way to apply this solution to all requests without a session?Different
@Josh, Every call to requests.get(), etc. always uses a session behind the scenes. So, there's no way to use requests without a session.Felike
Awesome, thanks for the recipe. Confirmed working on Win 10, Python 3.6.8, requests==2.20.Mccormack
Is there a way to make this work for any site, instead of just a specific domain? This is needed if you are on a corporate network that MitMs all HTTPS requests.Tungstic
Requests doesn't use your Windows root CA store like your browser does.

From the docs: By default, Requests bundles a set of root CAs that it trusts, sourced from the Mozilla trust store. However, these are only updated once for each Requests version.

This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
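For example, setting it from Python before the first request is made (the bundle path below is hypothetical):

```python
import os

# Hypothetical path to a PEM bundle containing your trusted root CAs.
# requests consults this variable when no verify= argument is passed.
os.environ['REQUESTS_CA_BUNDLE'] = 'corp-ca-bundle.pem'
```

Setting it in the environment before launching the daemon works the same way and avoids touching the code.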

You can literally do this:

import requests

cafile = 'cacert.pem'  # download from http://curl.haxx.se/ca/cacert.pem
r = requests.get(url, verify=cafile)

Or you can use certifi if your CA cert is signed by a public entity.
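certifi ships the Mozilla CA bundle as a package, so you can pass its path explicitly instead of relying on whichever copy your Requests version bundled:

```python
import certifi

# certifi.where() returns the filesystem path of the Mozilla CA bundle
cafile = certifi.where()
# r = requests.get('https://example.com', verify=cafile)
```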

Furnace answered 23/3, 2017 at 17:33 Comment(9)
Thanks for your reply. We already tried this approach (second bullet point), but without success. I edited my question to make it clearer.Cornerstone
Which certificate did you copy? You need the issuing certificate authority cert, not the webserver cert.Furnace
We copied the cacert.pem file from another Windows server where the daemon works. From C:\Python27\lib\site-packages\requests\cacert.pemCornerstone
If it is working on another machine, with the same code, then there's something up with the file, like getting mangled on copy. Otherwise I'm not sure.Furnace
The file is fine... we tried the same file on other Windows machines to double check. That's why we are questioning other possible settings at the OS level...Cornerstone
Perhaps your requests is old, or your SSL libraries are old and don't support the required ciphers? It makes no sense trying to come up with a different solution; fix the root problem on your box.Furnace
I'm not sure what you mean by "solve the root problem on your box"... If you read my question you'll see that everything works just fine on every other system we have tried it on. And the machine on which it does not work has exactly the same version of python and requests (and all the other installed packages) as all the other servers... so I'm just trying to figure out what else I can (or should) be looking into on the machine on which it does not work.Cornerstone
Right, my point is the problem is not the python code. It is another issue on your computer.Furnace
You can also set verify= to a directory containing certificates, but you must first run openssl rehash on the directory. See requests.kennethreitz.org/en/master/user/advanced/… for more info.Felike
