Both solutions try to sync database changes to the cache as soon as possible, i.e. keep the database and cache as consistent as possible, so that application code can read the latest data most of the time.
The following are the cons of each solution.
Solution A: deleting the item in cache
If the item is updated very frequently, this solution deletes the cache entry all the time, and you get lots of cache misses. Most requests hit the database, which makes the cache useless.
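Solution A can be sketched as follows. This is a minimal illustration, with plain dicts standing in for a real database and cache, and the function names (`update_item`, `read_item`) are hypothetical:

```python
db = {}     # stand-in for the database
cache = {}  # stand-in for the cache

def update_item(key, value):
    db[key] = value         # 1. write the new value to the database
    cache.pop(key, None)    # 2. delete the (possibly stale) cache entry

def read_item(key):
    if key in cache:        # cache hit
        return cache[key]
    value = db.get(key)     # cache miss: fall back to the database
    if value is not None:
        cache[key] = value  # populate the cache for later reads
    return value

# A frequently updated item causes a miss after every write:
update_item("counter", 1)
read_item("counter")        # miss -> hits the database, fills the cache
update_item("counter", 2)   # entry deleted again; the next read misses too
```

Every `update_item` empties the entry, so a write-heavy key never stays cached, which is exactly the downside described above.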
Solution B: updating the item in cache
This solution might insert infrequently read items into the cache and lower the cache hit ratio. When an item is updated in the database, you have no idea whether it is a hot item. If it is a cold one and you insert it into the cache, a hot item might be evicted (since the cache size is limited).
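The eviction problem can be shown with a tiny LRU cache. This is a sketch under assumed names, using a capacity-2 cache so a single cold write is enough to evict a hot entry:

```python
from collections import OrderedDict

db = {}
cache = OrderedDict()   # insertion/recency-ordered: first entry is the LRU one
CAPACITY = 2

def cache_put(key, value):
    if key in cache:
        cache.move_to_end(key)
    cache[key] = value
    if len(cache) > CAPACITY:
        cache.popitem(last=False)   # evict the least-recently-used entry

def update_item(key, value):
    db[key] = value
    cache_put(key, value)           # Solution B: write the fresh value into the cache

def read_item(key):
    if key in cache:
        cache.move_to_end(key)      # mark as recently used
        return cache[key]
    value = db.get(key)
    if value is not None:
        cache_put(key, value)
    return value

# Two hot items fill the cache...
update_item("hot1", "a")
update_item("hot2", "b")
# ...then an update to a cold item evicts "hot1", even though nobody reads "cold".
update_item("cold", "c")
```

After the last write, `"hot1"` is gone from the cache and `"cold"` occupies a slot it may never earn back through reads.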
The cache inconsistency problem still exists
Although both solutions try to keep the cache as consistent as possible, they cannot guarantee it. Take the following case as an example.
- client A reads the cache, and finds that the entry does not exist.
- client A reads the database and gets the value: old-value.
- client B updates the database with a new value: new-value.
- client B updates the cache with new-value, or deletes the entry in the cache (although it might not exist there yet).
- client A updates the cache with old-value.
In this case, the cache still holds the old value.
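The interleaving above can be replayed step by step. In this sketch, `db` and `cache` are dicts standing in for the real stores, and each statement is one client action in the order listed:

```python
db = {"k": "old-value"}
cache = {}

# client A: cache miss, then reads the old value from the database
a_read = cache.get("k")     # None -> miss
a_value = db["k"]           # "old-value"

# client B: writes the new value and deletes the cache entry
db["k"] = "new-value"
cache.pop("k", None)        # nothing to delete yet

# client A: finally fills the cache with its stale read
cache["k"] = a_value

# The cache now disagrees with the database:
# cache["k"] is "old-value" while db["k"] is "new-value".
```

Client B's delete (or update) happens before client A's write, so it cannot help; the stale value lands in the cache last and wins.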
Solution C: make friends with inconsistency
Since you're using a cache, you cannot entirely avoid inconsistency with the database. So in most cases, we don't need to update/delete cache items when updating the database. Check this and this for more detail.
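One common way to bound how stale the cache can get under this approach is to give every entry a time-to-live (TTL). The original text does not name this technique, so treat the following as an illustrative sketch, not the author's prescribed method; all names here are hypothetical:

```python
import time

db = {}
cache = {}      # key -> (value, expires_at)
TTL = 0.05      # staleness bound in seconds (tiny here for demonstration)

def update_item(key, value):
    db[key] = value                 # writes touch only the database

def read_item(key):
    entry = cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]             # entry is still fresh: serve from cache
    value = db.get(key)             # expired or missing: re-read the database
    cache[key] = (value, time.monotonic() + TTL)
    return value

update_item("k", "v1")
read_item("k")          # "v1" is read from the database and cached
update_item("k", "v2")  # the cache is NOT touched; reads may be stale for up to TTL
read_item("k")          # still serves the cached "v1"
time.sleep(TTL)         # wait for the entry to expire
read_item("k")          # now re-reads the database and gets "v2"
```

Readers may see a value up to `TTL` old, but the inconsistency window is bounded, and writes stay simple because they never touch the cache.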