Rails fragment caching: would 100K+ fragments degrade performance?

I have a site with a large amount of data and I'm doing "Russian doll" caching on all pages like this:

# articles.html.haml
- cache "list of articles", expires_in: 15.minutes do
  = render partial: "article", collection: @articles

# _article.html.haml
- cache article do
  = article.body
  = render partial: "comment", collection: article.comments

# _comment.html.haml
- cache comment do
  = comment.body

This would create hundreds of thousands of fragments.

1. Would this degrade performance with so many fragment files in the /tmp/cache directory?

2. Does Rails automatically delete old fragments once they expire?

P.S. The site runs on a single Ubuntu server with 4 GB of RAM. It's not using memcached as the cache store, just the standard file-based store that ships with Rails.

Botanist answered 21/1, 2014 at 12:40 Comment(0)

At some point you'll have to ask yourself whether a cache hit is still cheaper than generating the fragment. If the answer is yes (and this can be benchmarked), caching still improves performance compared to an uncached situation. With hundreds of thousands of cache fragments stored on a physical (platter) hard drive, though, I would be very wary of an I/O bottleneck. If that becomes an issue, you could limit the depth of your caching strategy to reduce the number of files. But again, please benchmark. The hit rate is a very important statistic here, because a high hit rate will limit I/O in this specific case.
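For example, you can get a rough number for the cache-read side from a Rails console and compare it against the partial render times Rails logs for an uncached request. This is only a sketch; the fragment key shown is hypothetical, real keys are generated by the `cache` helper:

# In a Rails console. The key below is illustrative only; inspect
# tmp/cache (or your logs) to find the keys your app actually writes.
require 'benchmark'

key = "views/articles/1-20140121120000"
ms  = Benchmark.realtime { Rails.cache.read(key) } * 1000
puts "fragment read took #{ms.round(2)} ms"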

If performance worries you, also look at how often fragments are expired. In your specific situation, the "list of articles" fragment becomes invalid every time a comment is posted. You currently expire it every 15 minutes, but if you want your output to stay consistent, it should actually be expired immediately after a comment or article is posted or edited. If you get multiple comments per minute on your list of articles, you're absolutely right to cache even individual comments here. If I/O becomes a problem, you can always add some RAM and start using memcached (or Redis, for that matter).
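A minimal sketch of the usual way to get that immediate expiry, assuming the Article/Comment models implied by your partials: let the child model touch its parent, so the parent's `updated_at` (and therefore its cache key) changes and the stale fragment is simply never read again.

# app/models/comment.rb — a minimal sketch, assuming a standard
# Comment belongs_to Article association.
class Comment < ActiveRecord::Base
  # Bumps article.updated_at whenever a comment is created, updated or
  # destroyed, which changes the key used by `cache article` in the view.
  belongs_to :article, touch: true
end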

However, because you've got multiple layers of caching, you might be totally fine here with only a couple of hits to your file system in addition to the hits to the parent "list of articles" fragment.

Intellectuality answered 21/1, 2014 at 16:37 Comment(0)

Unless you're hosted on a Windows server, 100K files spread out over several folders in Rails' cache isn't going to cause any noticeable performance hit.
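If you want to see how the file store actually spreads fragments across subdirectories of tmp/cache, a quick illustrative check from the application root (not anything built into Rails, just plain Ruby):

# Count cached fragment files and the directories they are sharded into.
files = Dir.glob("tmp/cache/**/*").select { |path| File.file?(path) }
dirs  = Dir.glob("tmp/cache/**/").count
puts "#{files.size} fragments across #{dirs} directories"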

Oshiro answered 3/2, 2014 at 4:18 Comment(0)
