I implemented a simple sitemap class using Django's default sitemap application. As it was taking a long time to execute, I added manual caching:
from django.contrib.sitemaps import Sitemap


class ShortReviewsSitemap(Sitemap):
    changefreq = "hourly"
    priority = 0.7

    def items(self):
        # Try to retrieve from cache first
        # (get_cache/set_cache are my own helpers around the memcached-backed cache)
        result = get_cache(CACHE_SITEMAP_SHORT_REVIEWS, "sitemap_short_reviews")
        if result is not None:
            return result
        result = ShortReview.objects.all().order_by("-created_at")
        # Store in cache for subsequent requests
        set_cache(CACHE_SITEMAP_SHORT_REVIEWS, "sitemap_short_reviews", result)
        return result

    def lastmod(self, obj):
        return obj.updated_at
The problem is that Memcached only allows objects up to 1 MB by default. This one was bigger than 1 MB, so storing it in the cache failed:
>7 SERVER_ERROR object too large for cache
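What actually gets sent to memcached is the pickled queryset, and a quick sketch shows how big that payload is (pickling a QuerySet forces it to evaluate, which is effectively what happens when it is stored in the cache; the import path is illustrative):

import pickle

from reviews.models import ShortReview  # illustrative import path

payload = pickle.dumps(ShortReview.objects.all().order_by("-created_at"))
print(len(payload))  # in my case this is well over memcached's default 1 MB item limit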
The other issue is that Django has an automated way of deciding when it should divide the sitemap file into smaller ones. According to the documentation:
You should create an index file if one of your sitemaps has more than 50,000 URLs. In this case, Django will automatically paginate the sitemap, and the index will reflect that.
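For reference, the index and the paginated sitemaps are wired up in urls.py roughly like this (a sketch following the sitemaps documentation; the sitemaps dict and import path are mine, and newer Django versions use path() instead of url()):

from django.conf.urls import url
from django.contrib.sitemaps import views as sitemaps_views

from myapp.sitemaps import ShortReviewsSitemap  # illustrative import path

sitemaps = {"shortreviews": ShortReviewsSitemap}

urlpatterns = [
    # The index lists sitemap-shortreviews.xml, sitemap-shortreviews.xml?p=2, and so on
    url(r"^sitemap\.xml$", sitemaps_views.index, {"sitemaps": sitemaps}),
    url(r"^sitemap-(?P<section>.+)\.xml$", sitemaps_views.sitemap, {"sitemaps": sitemaps},
        name="django.contrib.sitemaps.views.sitemap"),
]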
What do you think would be the best way to enable caching of sitemaps?
- Hacking into the Django sitemaps framework to restrict the size of a single sitemap to, let's say, 10,000 records seems like the best idea (see the first sketch after this list). Why was 50,000 chosen in the first place? Google's advice? A random number?
- Or maybe there is a way to let Memcached store bigger objects (second sketch below)?
- Or perhaps, once generated, the sitemaps should be made available as static files (third sketch below)? This would mean that instead of caching with Memcached I'd have to manually store the results in the filesystem and retrieve them from there the next time the sitemap is requested (perhaps cleaning the directory daily with a cron job).
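For the first option, a minimal sketch, assuming a Django version recent enough to document the limit attribute on Sitemap (older releases hardcode 50,000 in the paginator):

from django.contrib.sitemaps import Sitemap


class ShortReviewsSitemap(Sitemap):
    changefreq = "hourly"
    priority = 0.7
    # Maximum number of URLs per paginated sitemap page; the framework default is 50,000
    limit = 10000

    def items(self):
        return ShortReview.objects.all().order_by("-created_at")

    def lastmod(self, obj):
        return obj.updated_at

Each page would then have to be cached separately, for example by wrapping the sitemap view in django.views.decorators.cache.cache_page, so that no single cached object approaches the 1 MB limit.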
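For the second option, a sketch of raising the limit on both sides; this assumes a memcached server that accepts the -I flag (1.4.2 and later) and a Django/python-memcached combination where OPTIONS are passed through to the client constructor (values are illustrative):

# settings.py
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.MemcachedCache",
        "LOCATION": "127.0.0.1:11211",
        "OPTIONS": {
            # python-memcached refuses to send values larger than this (in bytes)
            "server_max_value_length": 2 * 1024 * 1024,
        },
    }
}
# The server itself also has to be started with a matching item size limit,
# e.g. memcached -I 2m

This only raises the cap, though, so it postpones the problem as the number of reviews grows.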
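And for the third option, a sketch of the manual filesystem variant, keeping the rest of the class as above (the path is illustrative, and a daily cron job would simply delete the file):

import os
import pickle

from django.contrib.sitemaps import Sitemap

SITEMAP_CACHE_FILE = "/var/tmp/sitemap_short_reviews.pickle"  # illustrative path


class ShortReviewsSitemap(Sitemap):
    changefreq = "hourly"
    priority = 0.7

    def items(self):
        # Reuse the result written by an earlier request, until the cron job
        # removes the file and forces a refresh
        if os.path.exists(SITEMAP_CACHE_FILE):
            with open(SITEMAP_CACHE_FILE, "rb") as fh:
                return pickle.load(fh)
        result = list(ShortReview.objects.all().order_by("-created_at"))
        with open(SITEMAP_CACHE_FILE, "wb") as fh:
            pickle.dump(result, fh)
        return result

    def lastmod(self, obj):
        return obj.updated_at

A lighter variant of the same idea would be to keep the get_cache/set_cache calls and just point them at Django's file-based cache backend (django.core.cache.backends.filebased.FileBasedCache), which does not have the 1 MB restriction.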
All of those seem very low-level, and I'm wondering whether an obvious solution exists...