I manage a lot of HTTPS proxies (that is, proxies which have an SSL connection of their own). I'm building a diagnostic tool in Python that attempts to connect to a page through each proxy and emails me if it can't connect through one of them.
The way I've set out to do this is to use urllib to connect through each proxy and fetch a page which should say "success", with the code below.
    import urllib

    def fetch(url):
        connection = urllib.urlopen(
            url,
            proxies={'http': "https://" + server + ':443'}
        )
        return connection.read()

    print fetch(testURL)
This fetches the page I want perfectly. The problem is that it will still fetch the page even if the proxy server information is incorrect or the proxy server is inactive. So either it never uses the proxy server, or it tries the proxy and silently connects without it when that fails.
How can I correct this?
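One workaround I've been experimenting with is to skip urllib entirely for the health check and talk to the proxy directly: open a TLS connection to the proxy and send it a CONNECT request, so a dead or misconfigured proxy fails loudly instead of being silently bypassed. This is only a sketch in Python 3 (`socket`/`ssl` from the stdlib); the function name `proxy_alive` and the `target` host are my own placeholders, not anything from a library.

```python
import socket
import ssl

def proxy_alive(proxy_host, proxy_port=443, target="example.com", timeout=5):
    """Return True if the TLS-speaking proxy accepts a CONNECT request,
    False otherwise. Never falls back to a direct connection."""
    try:
        raw = socket.create_connection((proxy_host, proxy_port), timeout=timeout)
    except OSError:
        return False  # TCP connect to the proxy itself failed
    try:
        # The proxy speaks TLS, so handshake with it before sending CONNECT.
        ctx = ssl.create_default_context()
        tls = ctx.wrap_socket(raw, server_hostname=proxy_host)
    except (OSError, ssl.SSLError):
        raw.close()
        return False  # TLS handshake with the proxy failed
    try:
        tls.sendall(("CONNECT %s:443 HTTP/1.1\r\nHost: %s:443\r\n\r\n"
                     % (target, target)).encode("ascii"))
        reply = tls.recv(4096)
        # A working proxy answers the CONNECT with a 200 status line.
        return reply.startswith(b"HTTP/1.1 200") or reply.startswith(b"HTTP/1.0 200")
    except OSError:
        return False
    finally:
        tls.close()
```

The diagnostic tool could then loop over the proxy list and email for every `proxy_alive(...)` that returns False, rather than trusting urllib's fetch to fail.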
Edit: No one seems to know how to do this. I'm going to start reading through other languages' libraries to see if they handle it better. Does anyone know if it's easier in another language, like Go?
Edit: I just wrote this in a comment below, but I think there might be a misunderstanding going around. "The proxy has its own SSL connection. So if I go to google.com, I first do a key exchange with foo.com, and then another with the destination address bar.com or the destination address baz.com. The destination doesn't have to be HTTPS; the proxy is HTTPS."
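To make that concrete for the simpler case where the destination is plain HTTP: the only key exchange is with the proxy itself, and the request to the destination just travels inside that TLS tunnel after a CONNECT. A minimal Python 3 stdlib sketch (the function name and error handling are mine; assume the proxy listens for TLS on 443, as in the code above):

```python
import socket
import ssl

def fetch_via_https_proxy(proxy_host, dest_host, path="/", timeout=5):
    """Fetch http://dest_host<path> through a TLS-speaking proxy on port 443.
    The single TLS handshake is with the proxy; the destination is plain
    HTTP, so no second handshake happens inside the tunnel."""
    raw = socket.create_connection((proxy_host, 443), timeout=timeout)
    tls = ssl.create_default_context().wrap_socket(raw, server_hostname=proxy_host)
    try:
        # Ask the proxy to open a tunnel to the destination's HTTP port.
        tls.sendall(("CONNECT %s:80 HTTP/1.1\r\nHost: %s:80\r\n\r\n"
                     % (dest_host, dest_host)).encode("ascii"))
        status = tls.recv(4096)
        if b" 200" not in status.split(b"\r\n", 1)[0]:
            raise IOError("proxy refused CONNECT: %r" % status[:64])
        # Plain HTTP to the destination, carried inside the proxy's TLS layer.
        tls.sendall(("GET %s HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n"
                     % (path, dest_host)).encode("ascii"))
        chunks = []
        while True:
            data = tls.recv(4096)
            if not data:
                break
            chunks.append(data)
        return b"".join(chunks)
    finally:
        tls.close()
```

An HTTPS destination would add a second handshake inside the tunnel, which the stdlib's socket-level `wrap_socket` doesn't support directly (TLS-in-TLS needs `ssl.MemoryBIO`/`ssl.SSLObject`), so I've left that case out of the sketch.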