I have an application that sends data to a server using an HTTPS POST. I use a System.Net.WebClient object to do this. Here is a function that sends some data:
private byte[] PostNameValuePairs(string uri, NameValueCollection pairs)
{
    byte[] response;
    string responsestring = "";
    using (WebClient client = new WebClient())
    {
        client.Headers = GetAuthenticationHeader();
        string DataSent = GetNameValueCollectionValuesString(pairs);
        try
        {
            response = client.UploadValues(uri, pairs);
            responsestring = Encoding.ASCII.GetString(response);
        }
        catch (Exception e)
        {
            responsestring = "CONNECTION ERROR: " + e.Message;
            return Encoding.ASCII.GetBytes(responsestring);
        }
        finally
        {
            _communicationLogger.LogCommunication(uri, client.Headers.ToString(), DataSent, responsestring);
        }
    }
    return response;
}
We are passing in a URI beginning with https://
This has been working great for a long time. Today, we started getting the following connection error: "The underlying connection was closed: An unexpected error occurred on a send". After some troubleshooting with the owner of the server, they narrowed it down: they had changed their server to block TLS 1.0, and said we now need to send our data using either TLS 1.1 or TLS 1.2.
What do I need to set in my WebClient object (or elsewhere in my function) to make it use TLS 1.1 or 1.2 instead of TLS 1.0?
We are using .NET Framework 4.5, if that makes a difference.
That other Q&A doesn't mention WebClient at all, so explaining how these classes are related could be useful, and the asker and other readers might also be interested in potential solutions for setting this on a per-instance basis, for example. – Oestradiol
There are already hundreds of "System.Net.ServicePointManager.SecurityProtocol" answers without any proper explanation in them. And according to Set the SecurityProtocol (Ssl3 or TLS) on the .net HttpWebRequest per request, you can't set it per request. WebClient uses HttpWebRequest internally anyway. – Horrid