Redux RTK Query: Invalidating only single element from list

I have the following API defined for the RTK query:

import { createApi, fetchBaseQuery } from "@reduxjs/toolkit/query/react";

export const postsApi = createApi({
  reducerPath: "postsApi",
  baseQuery: fetchBaseQuery({ baseUrl: "https://myservice.co/api/v2/" }),
  tagTypes: ["Posts"],
  endpoints: (builder) => ({
    getAllPosts: builder.query({
      query: () => ({ method: "GET", url: "/posts" }),
      transformResponse: (response) => response.posts,
      providesTags: (result) =>
        result
          ? [
              ...result.map(({ id }) => ({ type: "Posts", id })),
              { type: "Posts", id: "LIST" },
            ]
          : [{ type: "Posts", id: "LIST" }],
    }),
    updatePost: builder.mutation({
      query: ({ postId, ...body }) => ({
        url: `/posts/${postId}`,
        method: "POST",
        body,
      }),
      invalidatesTags: (_result, _error, arg) => [{ type: "Posts", id: arg.postId }],
    }),
    getPost: builder.query({
      query: (postId) => ({
        method: "GET",
        url: `/posts/${postId}`,
      }),
      providesTags: (_result, _error, id) => [{ type: "Posts", id }],
    }),
  }),
});

export const { useGetAllPostsQuery, useUpdatePostMutation, useGetPostQuery } = postsApi;

What I would like is that when updatePost succeeds, it only invalidates the cache for that single post and uses the getPost query to refetch the information instead of getAllPosts. Is this possible in any way? The posts are shown in a table fetched with the getAllPosts query.

Manna answered 4/8, 2021 at 8:22 Comment(0)

No. RTK Query is a document cache (full response = document), not a normalized cache. It knows nothing about the contents or structure of the cached responses, and it will never try to "stitch" anything into them by itself.

You can do that manually with an optimistic update, but the only thing RTK Query does automatically is refetch, which should be more than enough in most use cases.
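
A minimal sketch of what that manual update could look like for the getAllPosts list from the question, assuming the mutation argument carries the updated fields (nothing is invalidated here, so no list refetch is triggered):

updatePost: builder.mutation({
  query: ({ postId, ...body }) => ({
    url: `/posts/${postId}`,
    method: "POST",
    body,
  }),
  // Patch the cached list immediately, then roll the patch back on failure.
  async onQueryStarted({ postId, ...patch }, { dispatch, queryFulfilled }) {
    const patchResult = dispatch(
      postsApi.util.updateQueryData("getAllPosts", undefined, (draft) => {
        const cachedPost = draft.find((post) => post.id === postId);
        if (cachedPost) {
          Object.assign(cachedPost, patch);
        }
      })
    );
    try {
      await queryFulfilled;
    } catch {
      // Revert the optimistic patch if the request fails.
      patchResult.undo();
    }
  },
}),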

Metamorphosis answered 4/8, 2021 at 10:24 Comment(3)
Thanks for the answer! Is there any way to do the optimistic update so that the whole list fetch is not triggered? The point here is that each post involves calls to external services, and fetching them all every time one of them updates gets quite heavy. If this is not possible, the only option is to add a cache to the service providing the posts to cut the extra traffic to the other services. – Manna
Then do not add the single entries to providesTags. providesTags means "if this is invalidated, refetch this"; see the sketch after these comments. – Metamorphosis
See the Redux docs on manual cache updates for more info. They mention that one use case is changing a single item in a large list. – Storax
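
A rough sketch of what that suggestion could look like for the two query endpoints from the question: only getPost keeps a per-post tag, so updatePost's invalidation refetches the single post but leaves the cached list alone.

getAllPosts: builder.query({
  query: () => ({ method: "GET", url: "/posts" }),
  transformResponse: (response) => response.posts,
  // Only the list-level tag: invalidating { type: "Posts", id: somePostId }
  // no longer triggers a refetch of the whole list.
  providesTags: [{ type: "Posts", id: "LIST" }],
}),
getPost: builder.query({
  query: (postId) => ({ method: "GET", url: `/posts/${postId}` }),
  // The per-post tag stays here, so this is the query that refetches on update.
  providesTags: (_result, _error, id) => [{ type: "Posts", id }],
}),

Note that with this setup the table rendered from getAllPosts will only reflect the change if the list cache is also patched manually (as in the sketch above) or the "LIST" tag is invalidated.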

You can handle this case by manually updating the cache, either pessimistically or optimistically, in the updatePost endpoint.

We will use the updateQueryData util from the RTK Query API to update an already existing cache entry.

For a pessimistic update, you must return the new version of the post as the result of your mutation. You then update the list cache by finding the post to update in the list:

updatePost: builder.mutation({
  query: ({ postId, ...body }) => ({
    url: `/posts/${postId}`,
    method: "POST",
    body,
  }),
  invalidatesTags: (_result, _error, arg) => [{ type: "Posts", id: arg.postId }],
  async onQueryStarted({ ...patch }, { dispatch, queryFulfilled }) {
    // Here you could do an optimistic update using the patch values,
    // or wait for the mutation result as below (pessimistic update).
    const { data: updatedPost } = await queryFulfilled;
    dispatch(
      postsApi.util.updateQueryData("getAllPosts", undefined, (draft) => {
        // Find the updated post in the cached list and merge the new data in.
        const cachedPost = draft.find((p) => p.id === updatedPost.id);
        if (cachedPost) {
          Object.assign(cachedPost, updatedPost);
        }
      })
    );
  },
}),
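
For completeness, a hypothetical component-side usage (component and field names are assumed, not from the original post): because the list cache entry is patched in place, a table rendered from useGetAllPostsQuery re-renders with the new data without refetching the whole list.

import { useGetAllPostsQuery, useUpdatePostMutation } from "./postsApi"; // path assumed

function PostsTable() {
  const { data: posts = [] } = useGetAllPostsQuery();
  const [updatePost] = useUpdatePostMutation();
  return (
    <table>
      <tbody>
        {posts.map((post) => (
          <tr key={post.id}>
            <td>{post.title}</td>
            <td>
              {/* The mutation patches the cached list entry in onQueryStarted. */}
              <button onClick={() => updatePost({ postId: post.id, title: "Updated title" })}>
                Update
              </button>
            </td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}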
Rosemonde answered 28/6 at 12:35 Comment(0)
