What is meant by the cold cache and warm cache concepts?

I read a paper that used the terms cold cache and warm cache. I googled these terms but didn't find anything useful (only a thread here).

What do these terms mean?

Hord answered 31/3, 2014 at 7:45 Comment(0)

TL;DR There is an analogy with a car's cold engine and warm engine. A cold cache doesn't hold any values and can't give you any speedup because, well, it's empty. A warm cache holds some values and can give you that speedup.

A cache is a structure that holds some values (inodes, memory pages, disk blocks, etc.) for faster lookup.

A cache works by storing short references in a data structure that is fast to search (a hash table, a B+ tree) or on faster-access media (RAM vs. HDD, SSD vs. HDD).

For this fast lookup to pay off, your cache needs to hold values. Let's look at an example.

Say you have a Linux system with some filesystem. To access a file in that filesystem you need to know where the file starts on disk. This information is stored in the inode. For simplicity, let's say the inode table is stored somewhere on disk (the so-called "superblock" part).

Now imagine that you need to read the file /etc/fstab. To do this you read the inode table from disk (~10 ms), parse it to get the starting block of the file, and then read the file itself (~10 ms). Total: ~20 ms.

That is way too many disk operations. So you add a cache in the form of a hash table in RAM. RAM access takes about 10 ns - roughly a million(!) times faster than a 10 ms disk read. Each row in that hash table holds 2 values:

(inode number or filename) : (starting disk block)

But the problem is that at the start your cache is empty - such a cache is called a cold cache. To get any benefit from the cache you need to fill it with values. How does that happen? When you look for some file, you first check your inode cache. If you don't find the inode there (a cache miss), you do the full read cycle: read the inode table, parse it, and read the file itself. But after the parsing step you save the inode number (or filename) and the parsed starting disk block in your cache. And so it goes on and on: you try to read another file, you look in the cache, you get a cache miss (your cache is cold), you read from disk, you add a row to the cache.
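Here is a minimal sketch of that fill-on-miss pattern. It assumes hypothetical helpers readInodeTableFromDisk and readBlocksFromDisk that merely stand in for the slow (~10 ms) disk operations:

```typescript
// Hypothetical illustration of the lazy fill-on-miss pattern described above.
// readInodeTableFromDisk and readBlocksFromDisk simulate the slow (~10 ms) disk reads.

const inodeCache = new Map<string, number>(); // filename -> starting disk block

function readInodeTableFromDisk(filename: string): number {
  // Pretend this costs ~10 ms: read the on-disk inode table and parse it.
  return 42; // dummy starting disk block for the file
}

function readBlocksFromDisk(startBlock: number): string {
  // Pretend this also costs ~10 ms: read the file's data blocks.
  return `contents starting at block ${startBlock}`;
}

function readFile(filename: string): string {
  let startBlock = inodeCache.get(filename);
  if (startBlock === undefined) {
    // Cache miss (cold cache): pay the full ~20 ms path, then warm the cache.
    startBlock = readInodeTableFromDisk(filename);
    inodeCache.set(filename, startBlock);
  }
  // With a warm cache we land here directly and skip the inode-table read.
  return readBlocksFromDisk(startBlock);
}
```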

So a cold cache doesn't give you any speedup, because you are still reading from disk. In some cases a cold cache even makes your system slower, because you're doing extra work (the extra lookup in the table) to warm up the cache.

After some time you'll have some values in your cache, and at some point you try to read a file, look it up in the cache and BAM! you find the inode (a cache hit)! Now you have the starting disk block, so you skip reading the superblock and start reading the file itself. You have just saved ~10 ms!

That cache is called a warm cache - a cache with some values in it that give you cache hits.
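Using the readFile sketch above, warming up is just the difference between the first and second call (the timings are the illustrative figures from this answer, not measurements):

```typescript
readFile("/etc/fstab"); // cold cache: miss, inode-table read + file read (~20 ms)
readFile("/etc/fstab"); // warm cache: hit, file read only (~10 ms)
```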

Taiwan answered 31/3, 2014 at 8:34 Comment(1)
Nit: 10ns is 10^6 times faster than 10ms. – Irrigation


Background:

A cache is a small, faster memory that helps the CPU avoid accessing main memory (bigger and slower), saving time (cache reads are ~100x faster than reads from main memory). But this only helps if the data your program needs has been cached (read from main memory into the cache) and is still valid. Also, the cache gets populated with data over time. So, a cache can:
1. be empty, or
2. contain irrelevant data, or
3. contain relevant data.


Now, to your question:

Cold cache: the cache is empty or holds irrelevant data, so the CPU has to do slower reads from main memory to satisfy your program's data requests.

Hot cache: the cache holds the relevant data, and all the reads for your program are satisfied from the cache itself.

So, hot caches are desirable, cold caches are not.
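One informal way to observe this from user code is to process the same data twice and compare timings. This is a noisy, machine-dependent sketch: the first pass pays the cold costs (page faults, cold CPU caches, and in Node.js the JIT still warming up), while the second pass runs over data that is already resident:

```typescript
// Noisy, machine-dependent demo: sum the same array twice and compare timings.
// The first pass is typically slower (cold pages, cold CPU caches, JIT warm-up);
// the second pass reads data that is already resident, i.e. a warm cache.
const data = new Float64Array(1_000_000);

function sum(a: Float64Array): number {
  let s = 0;
  for (let i = 0; i < a.length; i++) s += a[i];
  return s;
}

let t = performance.now();
sum(data); // cold pass
console.log(`cold pass: ${(performance.now() - t).toFixed(3)} ms`);

t = performance.now();
sum(data); // warm pass - usually noticeably faster
console.log(`warm pass: ${(performance.now() - t).toFixed(3)} ms`);
```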

Superinduce answered 1/4, 2014 at 0:58 Comment(0)

Very nice response @avd.

A cold cache is just a blank cache, or one with stale data.

A hot cache, on the other hand, holds useful data that your system requires. It helps you achieve faster processing; it is mostly used for near real-time processing of requests. Some systems/processes need certain information at hand before they start serving user requests; for example, a trading platform requires market data, risk info, security info, etc. before it can process a user request. It would be time-consuming if the process had to query a DB/service for this critical information on every request, so it is a good idea to cache it - and that is what a hot cache gives you. Such a cache should be maintained regularly (updates/removals, etc.); otherwise, over time it may grow with unnecessary data and you may notice performance degradation.

To create a hot cache, one method is lazy population: as and when you get requests, you populate the cache. In that case the initial requests are slow, but subsequent ones are quicker. Another approach is to load the data at process start-up (or before user requests start coming in) and maintain the cache for as long as the process lives.
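A brief sketch of both approaches, using a hypothetical fetchFromDb helper as a stand-in for the slow DB/service query (all names here are made up for illustration):

```typescript
// Hypothetical reference-data cache for a request-serving process.
const refDataCache = new Map<string, unknown>();

async function fetchFromDb(key: string): Promise<unknown> {
  // Stand-in for the expensive DB/service query.
  return { key, loadedAt: Date.now() };
}

// Approach 1: lazy population - the first request for a key is slow,
// subsequent requests are served from the (now warm) cache.
async function getLazily(key: string): Promise<unknown> {
  if (!refDataCache.has(key)) {
    refDataCache.set(key, await fetchFromDb(key)); // miss: warm the cache on demand
  }
  return refDataCache.get(key);
}

// Approach 2: pre-warm at start-up - load the known keys before any user
// request arrives, so the very first request already hits a hot cache.
async function preWarm(keys: string[]): Promise<void> {
  for (const key of keys) {
    refDataCache.set(key, await fetchFromDb(key));
  }
}
```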

Electrojet answered 19/11, 2015 at 12:23 Comment(0)

There is a similar concept in frontend web development; almost all good JS frontend frameworks do warm caching these days.

Cold cache: the frontend app calls the backend server for the first time.

Warm cache: after the frontend fetches the data for the first time, it saves it in its local cache. The next time it needs that data, it fetches the item from its local cache instead of calling the backend.
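A minimal sketch of that pattern, assuming a hypothetical /api/items endpoint (the URL and the cache shape are made up for illustration):

```typescript
// Hypothetical frontend-side cache of backend responses.
const responseCache = new Map<string, unknown>();

async function getItems(url = "/api/items"): Promise<unknown> {
  const cached = responseCache.get(url);
  if (cached !== undefined) {
    return cached;                     // warm cache: no network round trip
  }
  const response = await fetch(url);   // cold cache: hit the backend
  const data: unknown = await response.json();
  responseCache.set(url, data);        // warm the cache for the next call
  return data;
}
```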

reference: Caching Data in frontend

Hospodar answered 11/10, 2019 at 21:37 Comment(1)
cold/warm are part of a pair... the cache starts cold and is warmed once data is populated in it. This can be called a "pull-through" cache, since we read our information through it and it keeps a copy of the answer for next time. Where data is pre-fetched, self-warming, or where a source is actively pushing data into the cache, we refer to that as "hot", because the cache data is always ready to go. In the F/E context I'm inclined to use the expression pre-warmed, as the data is not independently loaded into the cache but is coupled to the read actions. Not exactly hot, but close. – Frederickson

In a multiprogramming environment, if tasks retain a significant portion of their working set from a previous execution, the cache is called a warm cache.

A cache that has no history from prior executions is called a cold cache.

Jasper answered 30/12, 2019 at 6:22 Comment(0)
