I previously asked this question on Apollo’s community chat, but am reposting it here because it wasn’t answered after two weeks.
This question is about the new cache introduced by Apollo Client 3.0. This update deprecated local resolvers in favour of “type policies” (or “field policies”), which include several tools for working with the cache, such as a `merge` function that can be defined per field to specify how incoming data should be merged with the existing data in the cache.

I’m wondering which use cases are better suited for the `merge` function, versus writing a precalculated update value via `writeQuery` or `writeFragment`.
Let’s say I have an array of objects, and this array is a property of another type. For example:

```graphql
type Person {
  hobbies: [Hobby!]!
}
```
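For reference, my understanding is that such a `merge` function lives in the field policy for `Person.hobbies`, which is passed to `new InMemoryCache({ typePolicies })`. A minimal sketch (written standalone here so the logic can be exercised directly; `Ref` stands in for Apollo’s normalized references, and the concat behaviour is only a placeholder, not what I’d actually want):

```typescript
// Sketch of a field policy for Person.hobbies, as it would be passed to
// new InMemoryCache({ typePolicies }). Ref mimics Apollo's normalized
// cache references; the concat behaviour is a placeholder.
type Ref = { __ref: string };

const typePolicies = {
  Person: {
    fields: {
      hobbies: {
        // Called whenever new data is written to Person.hobbies.
        // `existing` is undefined on the very first write.
        merge(existing: Ref[] | undefined, incoming: Ref[]): Ref[] {
          return existing ? [...existing, ...incoming] : incoming;
        },
      },
    },
  },
};
```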
The application allows me to delete, add, and update elements in this array. Should `hobbies` make use of the `merge` function? Previously, I wrote this logic in local mutation resolvers, but these have been deprecated. I can easily mimic the old behaviour by defining a function for each of these resolvers, which contains the add/remove/update logic and uses `writeFragment` to store the results.
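To make the comparison concrete, the old-style logic I would mirror boils down to pure helpers that precalculate the new array; the result would then be stored with `cache.writeFragment`. (The names and shapes below are my own, purely illustrative.)

```typescript
// Illustrative only: precalculated updates in the style of the old local
// resolvers. Each helper returns a new array; the caller would then pass
// the result to cache.writeFragment.
interface Hobby {
  id: string;
  name: string;
}

function addHobby(hobbies: Hobby[], hobby: Hobby): Hobby[] {
  return [...hobbies, hobby];
}

function removeHobby(hobbies: Hobby[], id: string): Hobby[] {
  return hobbies.filter((h) => h.id !== id);
}

function updateHobby(hobbies: Hobby[], hobby: Hobby): Hobby[] {
  // Unmodified elements are kept as-is (shallow copies in the new array).
  return hobbies.map((h) => (h.id === hobby.id ? hobby : h));
}
```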
I can see the benefit of using `merge` because (I think) it’s a lower layer of abstraction, but can it be used in the case of an incoming `hobbies` array? As far as I can see, we would have to deduce the type of mutation from the incoming data. For example:

- If the incoming array contains one item fewer (or contains only the deleted item), we did a delete operation, and we can assume that all other elements remained the same.
- If the incoming array contains just one element and it is new, we did an add operation, and will merge this one element into the array.
- If the incoming array contains the same number of elements (or has just one updated item), we must have updated one and must now figure out which one to replace (let’s assume that, due to the implementation, it is not possible to directly update a single `Hobby`, because other hobbies may be affected as well, e.g. due to non-overlapping time constraints).
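For illustration, the deduction I have in mind would look roughly like this — a sketch of the three heuristics above, with `Ref` standing in for normalized `Hobby` references, and exactly as fragile as described:

```typescript
// Sketch: a merge function that tries to deduce the mutation type from
// the incoming array, per the three cases above. Assumption: when one
// item fewer arrives, the incoming array lists the survivors.
type Ref = { __ref: string };

function mergeHobbies(existing: Ref[] | undefined, incoming: Ref[]): Ref[] {
  if (!existing) return incoming;

  const existingIds = new Set(existing.map((r) => r.__ref));
  const incomingIds = new Set(incoming.map((r) => r.__ref));

  // Case 2: a single, previously unknown element => add.
  if (incoming.length === 1 && !existingIds.has(incoming[0].__ref)) {
    return [...existing, incoming[0]];
  }

  // Case 1: one item fewer => delete; keep the survivors in order.
  if (incoming.length === existing.length - 1) {
    return existing.filter((r) => incomingIds.has(r.__ref));
  }

  // Case 3: same length => assume an update. With normalized refs the
  // refs themselves don't change, which already hints that this logic
  // belongs elsewhere.
  return incoming;
}
```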
This seems less elegant than mirroring the old approach, where the new value for `hobbies` is calculated before calling `writeFragment`.
Is there a performance benefit to using `merge`? Are existing, unaffected items in `hobbies` kept in place when using `merge`, but overwritten when using `writeFragment`? (Assume the newly calculated data passed to `writeFragment` contains an array with shallow copies of the unmodified elements.)
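To pin down what I mean by “shallow copies”: the array itself is new, but unmodified elements keep their object identity. A plain TypeScript illustration:

```typescript
// The array is new, but untouched elements are the same object
// references; only the modified element is a new object.
interface Hobby {
  id: string;
  name: string;
}

const hobbies: Hobby[] = [
  { id: "1", name: "chess" },
  { id: "2", name: "rowing" },
];

const updated: Hobby[] = hobbies.map((h) =>
  h.id === "2" ? { ...h, name: "sculling" } : h
);
```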
Thank you very much for clearing this up for me!
Does `merge` result in simpler code than a read/writeQuery after each mutation? If you have mutations that add/remove a single item and others that overwrite the entire set, then you need some merge magic to handle all that. Isn't it better to keep doing manual reads/writes, so your cache update logic sits next to the mutations making the changes? The only use case I see for `merge` is pagination, but the docs/console warnings seem to push it for everything (although v3.4 made cache writes default to overwriting again). – Analysand