Can one cache and secure a REST API with Cloudflare?

I am designing a RESTful API that is intended to be consumed by a single-page application and a native mobile app. Some calls of this API return public results that can be cached for a certain time. Moreover, there is a need for rate limiting to protect the API against unauthorized clients (spiders).

Can I use Cloudflare to implement caching and rate-limiting / DDOS protection for my RESTful API?

Caching: Cloudflare supports HTTP cache control headers, so the API can decide for each entity requested via GET whether it is public and how long it can be cached.

  • However, it is not clear whether the cache control header is also passed downstream to the client, so that it will also trigger the browser to cache the response. This may not be desirable, as it could make troubleshooting more difficult.
  • Akamai has an Edge-Control header to ensure content is cached in the CDN but not the browser. Can one do something similar with Cloudflare? (A sketch of the behaviour I'm after follows this list.)
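For illustration, this is roughly the split I'm after, sketched with plain Node (the header values are only examples): a long s-maxage so that a shared cache such as the CDN keeps the response, and a short max-age so the browser does not hold on to it.

  // Sketch of the intended behaviour (values are examples only):
  // per standard HTTP caching, s-maxage governs shared caches (the CDN),
  // while max-age governs the browser's private cache.
  const http = require("http");

  http.createServer((req, res) => {
    // Cache at the CDN edge for 200s, but let the browser keep it for only 60s
    // (use max-age=0 if the browser should not cache at all).
    res.setHeader("Cache-Control", "public, s-maxage=200, max-age=60");
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ hello: "world" }));
  }).listen(8080);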

DDOS Protection: Cloudflare support has an article recommending that DDOS protection be disabled for backend APIs, but this does not apply to my use case where each client is supposed to make few requests to the API. The native DDOS protection actually fits my requirements for protecting the API against bots.

  • I need to know how I can programmatically detect when Cloudflare serves a Captcha / "I'm under attack" etc. page. This would then allow the SPA / mobile app to react intelligently and redirect the user to a web view where she can demonstrate her "humanness".

  • From the Cloudflare documentation, it is not obvious what HTTP status code is sent when a DDOS challenge is presented. The open-source cloudscraper tool for bypassing Cloudflare DDOS protection seems to indicate that Captcha and challenge pages are delivered with HTTP status 200. Is there a better way than parsing the response body to find out whether DDOS protection kicked in?

  • Cloudflare apparently uses cookies to record who solved the Captcha successfully. This obviously creates some extra complexity with native apps. Is there a good way to transfer the Cloudflare session cookies back to a native app after the challenge has been solved?

This is probably something of an advanced Cloudflare use case, but I think it's promising and I would be happy to hear if anyone has experience with something like this (on Cloudflare or another CDN).

Cop answered 10/4, 2015 at 19:10 Comment(6)
Can you update us on what you found? – Hypercorrect
No update yet, except that Cloudflare support was not able to tell me how to programmatically detect DDOS pages. – Cop
"Akamai has an Edge-Control header to ensure content is cached in CDN but not the browser. Can one do something similar with Cloudflare?" I am REALLY interested in this, and cannot find a workaround. If you want immediate changes to, say, a user avatar, you cannot cache in the browser; but Cloudflare (as of today) also does not allow you to cache on their servers only, even for Enterprise customers, who have a 30-sec minimum via page rules. – Noggin
Any new update, buddy? More than 2 years have passed, and it looks like Cloudflare has not made any progress with DDOS protection for APIs. – Malines
Hey, how did you resolve it? Any update on this? – Wideawake
It is possible to set an "edge" TTL different to the browser TTL using the header Cache-Control: s-maxage=200, max-age=60 (s-maxage = edge TTL); see support.cloudflare.com/hc/en-us/articles/… – Cop

Cloudflare has published a list of best practices for using it with APIs.

TL;DR, they recommend setting a page rule that matches all API requests and putting the following settings on it (a sketch of creating the same rule through the API follows the list):

  1. Cache Level: Bypass
  2. Always Online: OFF
  3. Web Application Firewall: OFF
  4. Security Level: Anything but "I'm under attack"
  5. Browser Integrity Check: OFF
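For reference, a page rule with these settings can also be created through Cloudflare's v4 API. The following is only a sketch: ZONE_ID and API_TOKEN are placeholders, and the action identifiers and values ("cache_level", "always_online", "waf", "security_level", "browser_check") should be checked against the current Page Rules API documentation and your plan before relying on them.

  // Hypothetical sketch: creating the recommended API page rule via Cloudflare's v4 API.
  const ZONE_ID = "your-zone-id";     // placeholder
  const API_TOKEN = "your-api-token"; // placeholder

  async function createApiPageRule() {
    const res = await fetch(
      `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/pagerules`,
      {
        method: "POST",
        headers: {
          "Authorization": `Bearer ${API_TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          targets: [
            { target: "url", constraint: { operator: "matches", value: "api.example.com/*" } },
          ],
          actions: [
            { id: "cache_level", value: "bypass" },    // 1. Cache Level: Bypass
            { id: "always_online", value: "off" },     // 2. Always Online: OFF
            { id: "waf", value: "off" },               // 3. Web Application Firewall: OFF
            { id: "security_level", value: "medium" }, // 4. anything but "under_attack"
            { id: "browser_check", value: "off" },     // 5. Browser Integrity Check: OFF
          ],
          status: "active",
        }),
      }
    );
    return res.json();
  }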
Haematoblast answered 25/10, 2017 at 13:39 Comment(7)
It basically means "disable all". Passing API traffic through Cloudflare doesn't make much sense anymore (besides only hiding your IP)? – Ferriferous
@SergeyKostrukov The requests still go through their points of presence (cloudflare.com/network), so this can still be beneficial for your end-users' performance. Also, I would say it protects you against DDoS. – Festal
Given this list of "disables", I wonder if it's even useful to go via Cloudflare for a pure REST-y API. Your cloud hosting provider may offer some kind of DDOS protection without the need to use an external proxy. – Cynthla
@SergeyKostrukov Hiding your IP is not that important anymore, as it's fairly common today that your box has a pool-allocated IP that rotates (for internet-facing IPs). – Cynthla
I don't see the Always Online and Web Application Firewall options in the page rules. – Plumley
@SlavaFominII It seems they've disappeared(?) – Geter
I don't think this works; can someone look at my problem: #77701594 – Outbreak

This is a 5-year-old question from @flexresponsive, with the most recent answer written 3 years ago and commented upon 2 years ago. While I'm sure the OP has by now found a solution, be it within CloudFlare or elsewhere, I will update the solutions in a contemporary (2020) fashion while staying within CloudFlare. Detailed Page Rules are always a good idea for anyone; however, for the OP's specific needs, the following set of rules in combination with a "CloudFlare Workers" script will be of benefit:

  1. Edge Cache TTL: set to the time you want CloudFlare to cache your API content at its "Edge" (which edge node / server-farm location serves the content depends on one's account plan, with "Free" being of lowest priority and thus more likely to serve content from a location with a higher latency to your consumers).

  2. However, with Edge Cache TTL > 0 (basically, using it at all), this will not allow setting the following, which may or may not be of importance to your API:

  3. Cache Deception Armor: ON

  4. Origin Cache Control: ON if #3 is being used and you want to do the following:

  5. Use Cache Level: Cache Everything in combination with a worker that runs during calls to your API. Staying on-topic, I'll show two headers to use, specific to your API's route/address.

  addEventListener("fetch", event => {
  event.respondWith(fetchAndReplace(event.request));
    });
    async function fetchAndReplace(request) {
    const response = await fetch(request);
      let type = response.headers.get("Content-Type") || "";
      if (!type.startsWith("application/*")) {
      return response;
        }
      let newHeaders = new Headers(response.headers);
         'Cache-Control', 's-maxage=86400';
         'Clear-Site-Data', '"cache"';
    return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: newHeaders
    });
   }

In setting the two cache-specific headers, you are saying "only shared proxies can cache this". It's impossible to fully control how any shared proxy actually behaves, though, so depending on the API payload the no-transform value may be worth adding. If only JSON is in play, you'd probably be fine without it, unless a misbehaving cache decides to mangle it along the way; but if you'll be serving anything requiring an integrity hash or a nonce, then no-transform is a must to ensure the payload isn't altered at all (an altered payload cannot be verified as the file coming from your API). The Clear-Site-Data header with the "cache" value instructs the consumer's browser to essentially clear its cache as it receives the payload; "cache" needs to be within double quotes in the HTTP header for it to function.

As for running checks to ensure that your consumers aren't hitting a blocking situation where the API payload cannot be delivered to them because an hCaptcha kicks in: inspect the final destination for a query string containing a cf parameter (I don't recall the exact layout, but it will definitely have the CloudFlare cf in it, and it is definitely not where you want your consumers landing). Beyond that, the "normal" DDoS protection that CloudFlare uses would not be triggered by normal interaction with the API. I'd also recommend not following CloudFlare's specific advice to use a security level of anything but "I'm Under Attack"; on that point I must point out that even though the 5-second redirect won't occur on each request, hCaptchas will still be triggered on the Low, Medium and High security levels. Setting the security level to "Essentially Off" does not mean a security level of null; additionally, the WAF will catch standard violations, and that of course may be adjusted according to what is being served from your API.
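As a rough client-side heuristic (a sketch only; the function name is illustrative and it assumes your API only ever returns JSON), the SPA could treat any non-JSON response as a possible challenge/block page and hand the user over to a web view to solve it:

  // Sketch: detect a probable CloudFlare challenge/block page from an SPA.
  // Assumes the API only ever answers with application/json; anything else
  // (an HTML challenge page, for instance) is treated as "human check needed".
  async function callApi(url) {
    const response = await fetch(url, { credentials: "include" });
    const type = response.headers.get("Content-Type") || "";
    if (!type.startsWith("application/json")) {
      // Probable challenge page: open it in a web view / new tab so the
      // user can solve it, then retry the original request.
      return { challenged: true, response };
    }
    return { challenged: false, data: await response.json() };
  }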

Hopefully this is of use, if not to the OP then at least to other would-be visitors.

Aspectual answered 18/4, 2020 at 19:0 Comment(1)
Is it recommended to use caching at the CDN level and purge as required for a REST API? – Construction

Yes, CloudFlare can help with DDOS protection, and no, it does not implement caching and rate-limiting for your API by itself. You have to implement those yourself, or use a framework that does.

You can use CloudFlare to protect your API endpoint by using it as a proxy. CloudFlare protects the entire URL, but you can use page rules to tweak the settings for your API endpoint.

Example: https://api.example.com/*
  • Reduce the security for this rule to low or medium so as not to show a captcha.
  • APIs are not meant to show a captcha; you protect them with authorization and access codes.
  • You can implement HTTP Strict Transport Security (HSTS) and Access-Control (CORS) headers on your responses; a sketch follows this list.
  • Cloud hosting providers (e.g. DigitalOcean, Vultr, etc.) have free or paid DDoS protection. You can subscribe to it for just that public-facing VM. This will be a big plus, because now you have double DDOS protection.
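For example, here is a minimal sketch using plain Node (the header values and allowed origin are illustrative; tune max-age and the origin to your own setup):

  // Minimal sketch: adding HSTS and Access-Control (CORS) headers to API
  // responses with plain Node. Values shown are illustrative defaults.
  const http = require("http");

  http.createServer((req, res) => {
    // HSTS: tell browsers to only ever reach this host over HTTPS.
    res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
    // CORS: allow the single-page application's origin to call the API.
    res.setHeader("Access-Control-Allow-Origin", "https://app.example.com");
    res.setHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
    res.setHeader("Access-Control-Allow-Headers", "Authorization, Content-Type");
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ ok: true }));
  }).listen(8080);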

To cache API responses:

Create a page rule like https://api.example.com/*.json
  • Set the Caching Level for that rule such that CloudFlare caches it on its servers for a specific duration.

There are so many other ways you can protect APIs. Hope this answer has been of help.

Philippic answered 17/5, 2016 at 16:0 Comment(0)
