503 Errors or Blank Page from Azure Front Door

I've read through the link below, extended the timeout, disabled compression on the origin/Azure Front Door, and added a Rules Set rule to remove the Accept-Encoding header from byte-range requests. However, I am still getting these random 503 errors or blank pages.

https://learn.microsoft.com/en-us/azure/frontdoor/front-door-troubleshoot-routing
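For reference, the byte-range case that article describes can be reproduced with something like the following sketch (Python with the requests library; the URL and path are placeholders, not my real endpoint):

```python
# Minimal sketch: send byte-range GETs through the Front Door endpoint with
# compression disabled and watch for intermittent 503s. Placeholder URL/path.
import requests

url = "https://app.contoso.com/some/large/file"  # placeholder

for attempt in range(20):
    resp = requests.get(
        url,
        headers={
            "Range": "bytes=0-1023",        # byte-range request
            "Accept-Encoding": "identity",  # ask the origin not to compress
        },
        timeout=30,
    )
    # 200/206 are expected; an immediate 503 is the intermittent failure
    print(attempt, resp.status_code)
```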

In my web app, I've added the custom domain (app.contoso.com) with my own certificate.

(screenshot: custom domain and certificate on the web app)

And in my Front Door, I've added my custom domain (app.contoso.com) with my own certificate and set up a backend pool and a routing rule.

(screenshots: Front Door custom domain, backend pool, and routing rule setup)

And in the "Update backend" pane, I've left the backend host header empty, as I want requests to keep the custom domain instead of showing the web app URL.

(screenshot: the Update backend pane with the backend host header left empty)

And in my routing rule, I've set the following

(screenshot: routing rule settings)

I've followed the instructions provided by Azure closely but am still getting random 503 errors or blank pages from Azure Front Door. I can confirm that my web app (contoso.azurewebsites.net) is working at the times I get 503 errors or blank pages from the custom domain on Front Door. I've also ensured that I use the same SSL certificate on the web app and the Front Door custom domain.

In my DNS, I've mapped the following

app.contoso.com CNAME contoso.azurefd.net
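For completeness, the mapping can be sanity-checked from the client side with a small dnspython sketch (hostnames as above; dig app.contoso.com CNAME +short gives the same answer):

```python
# Quick check that the custom domain really CNAMEs to the Front Door endpoint.
# Requires the dnspython package; hostnames are the ones from the question.
import dns.resolver

answers = dns.resolver.resolve("app.contoso.com", "CNAME")
for rdata in answers:
    print(rdata.target)  # expected: contoso.azurefd.net.
```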

Is there anything I've missed, or does anybody have a solution to this?

Follow up

I've tried enabling and disabling certificate subject name validation and changing the send/receive timeout from 30 to 240 (and from 240 back to 30), but I'm still getting the same issues.

(screenshot: backend pool settings with certificate subject name validation and send/receive timeout)

Update: it's giving a blank page or a 503 error (see screenshot below).

I have Front Door set up in front of multiple backend web apps. I've noticed a number of random 503 errors each day. The requests themselves are valid, and retrying generally works; however, the initial failing request never hits the backend web app. Backend health at the time of the errors is also 100%, and it can't be a timeout because the error is immediate (the timeout has also been extended to 240 seconds). It seems like Front Door itself is having some kind of health issue, but no health issues are reported.

(screenshot: the blank page / 503 error returned from the Front Door custom domain)
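In the meantime, since a retry almost always succeeds, a client-side retry on 503 at least works around the symptom. Something like this sketch with Python requests and urllib3's Retry (placeholder URL; retrying POSTs is only safe when they are idempotent):

```python
# Client-side mitigation sketch: automatically retry the intermittent 503s
# with exponential backoff. Not a fix for the underlying issue.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

retries = Retry(
    total=3,                          # up to 3 retries per request
    backoff_factor=0.5,               # exponential backoff between attempts
    status_forcelist=[503],           # only retry the 503 responses
    allowed_methods=["GET", "POST"],  # POST retries: only if idempotent
)

session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retries))

resp = session.get("https://app.contoso.com/", timeout=30)  # placeholder URL
print(resp.status_code)
```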

Madge answered 30/9, 2021 at 12:51 Comment(3)
Could you please let us know whether you have configured this setting in Azure Front Door as described in the Microsoft document [1]: i.sstatic.net/UFi0r.png - Lapin
@AjayKumarGhose-MT I've updated the post; see the Follow up section. Initially, 30 didn't work, so I changed it to 240 based on the Azure troubleshooting article (learn.microsoft.com/en-us/azure/frontdoor/…) - Madge
We are experiencing the same problem. Has this been resolved yet? - Anesthetist

I have observed what I thought were random 503 errors from Front Door, and eventually narrowed it down to a particular set of circumstances: in the Azure portal, the App Service Client Certificates mode was set to Required or Allow, and the request was an HTTP POST with a body over 100 KB.

My experience was that the problem is not in Front Door - it was just reporting the problem. The issue is that the application hosted by the Azure App Service never receives the request sent by Front Door and therefore doesn't respond, leading to a 503 after the timeout. Increasing the timeout in Front Door won't solve the problem.

From what I've read, the underlying cause is that "when the server sends the SSL renegotiation request the client has already sent too much data for the server to buffer before it can receive the SSL renegotiation response" (quoted from here: https://github.com/dotnet/runtime/issues/17336).

There are a couple of workarounds that may help in your scenario: setting ServicePointManager.Expect100Continue = true (from https://github.com/dotnet/runtime/issues/17336), or alternatively performing a HEAD request followed by the POST (documented here: https://github.com/fasigpt/appserviceclientcertauth/blob/fa0862dcd44ae570594e21f4bbab4f328cd5eadb/clientcert/Program.cs#L35).
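The second workaround translates roughly like this in Python with the requests library (endpoint, payload size, and content type are placeholders; it assumes the HEAD and the POST reuse the same kept-alive connection, so the client-certificate renegotiation happens before the large body is sent):

```python
# Rough sketch of the HEAD-then-POST idea: warm up the connection with a
# cheap HEAD so the TLS client-certificate (re)negotiation completes, then
# send the large POST on the same session. Placeholders throughout.
import requests

url = "https://app.contoso.com/api/upload"  # placeholder endpoint
payload = b"x" * (200 * 1024)               # >100 KB body, like the failing case

session = requests.Session()
session.head(url, timeout=30)               # small request first
resp = session.post(
    url,
    data=payload,
    headers={"Content-Type": "application/octet-stream"},
    timeout=240,
)
print(resp.status_code)
```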

But the only way I've found to reliably eliminate the 503 errors reported by Front Door is to set the Client Certificates mode to "Ignore" in the App Service settings section in the Azure portal. This may or may not be suitable for your particular scenario (YMMV) - I didn't find it ideal, but in my case it was less bad than intermittent (and reasonably frequent) 503 errors.

Recruitment answered 1/12, 2021 at 14:13 Comment(1)
That's interesting! Our Client Certificates mode is already set to "Ignore" in the App Service settings section in the Azure portal. However, we are still getting intermittent 503 errors. The other two suggestions are geared more towards .NET, and we are on PHP. - Madge

I tried this in my local environment and it works fine for me, following this MS doc.

Here is my configuration:

(screenshot: overall Front Door configuration)

Make sure that you have added your custom domain properly, with the destination pointing to your frontend host name.

(screenshot: custom domain added with the frontend host as destination)

In the backend pool, I set the health probe interval to 5 (a small value) so Front Door knows quickly when a backend is down and stops routing traffic to it.

(screenshot: backend pool health probe with the interval set to 5)
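Before relying on such a short interval, it can also help to confirm that the backend's probe path responds quickly and consistently; a quick timing sketch (Python/requests, hostname and path are placeholders):

```python
# Quick sanity check that the backend's probe path responds quickly and
# consistently. Hostname and path ("/") are placeholders for your own setup.
import time
import requests

backend = "https://contoso.azurewebsites.net/"  # placeholder probe URL

for _ in range(5):
    start = time.monotonic()
    resp = requests.get(backend, timeout=5)
    print(f"{resp.status_code} in {time.monotonic() - start:.2f}s")
```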

In the routing rules, make sure that you have checked all your frontends/domains.

(screenshot: routing rule with all frontends/domains selected)

After that, you can test by hitting your Front Door host.

Please refer to this Microsoft documentation for more information: Add a custom domain to your Front Door.

Lapin answered 1/10, 2021 at 8:39 Comment(3)
I've followed the steps accordingly. I am getting the Front Door page, but at times it keeps giving a blank page or 503 errors. See the Follow up section; I have edited the question. - Madge
Could you please refer to this discussion in MS Q&A: Azure Front Door returns a 503 Service Unavailable - Lapin
I had read the link you provided before posting my issue here. I've also tried the suggestions in that link, but still no success. #66964352 - Madge
