You can add a filter, intercept the current HttpServletRequest, and wrap it in a custom HttpServletRequestWrapper. In your custom HttpServletRequestWrapper, you read the request body, cache it, and then implement getInputStream and getReader to read from the cached value. Since the cached value is always present after wrapping the request, you can read the request body multiple times:
@Component
public class CachingRequestBodyFilter extends GenericFilterBean {

    @Override
    public void doFilter(ServletRequest servletRequest, ServletResponse servletResponse, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest currentRequest = (HttpServletRequest) servletRequest;
        MultipleReadHttpRequest wrappedRequest = new MultipleReadHttpRequest(currentRequest);
        chain.doFilter(wrappedRequest, servletResponse);
    }
}
After this filter, every downstream component sees the wrappedRequest, which can be read multiple times:
public class MultipleReadHttpRequest extends HttpServletRequestWrapper {

    private final ByteArrayOutputStream cachedContent = new ByteArrayOutputStream();

    public MultipleReadHttpRequest(HttpServletRequest request) throws IOException {
        super(request);
        // Read the request body once and populate the cachedContent
        request.getInputStream().transferTo(cachedContent);
    }

    @Override
    public ServletInputStream getInputStream() throws IOException {
        // Create a fresh input stream over cachedContent and return it
        ByteArrayInputStream buffer = new ByteArrayInputStream(cachedContent.toByteArray());
        return new ServletInputStream() {
            @Override public boolean isFinished() { return buffer.available() == 0; }
            @Override public boolean isReady() { return true; }
            @Override public void setReadListener(ReadListener listener) { throw new UnsupportedOperationException(); }
            @Override public int read() { return buffer.read(); }
        };
    }

    @Override
    public BufferedReader getReader() throws IOException {
        // Create a reader over cachedContent, honoring the request's character encoding
        String encoding = getCharacterEncoding() != null ? getCharacterEncoding() : "UTF-8";
        return new BufferedReader(new InputStreamReader(getInputStream(), encoding));
    }
}
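The core trick is independent of the servlet API: consume the source stream exactly once into a byte array, then hand out a fresh stream over those bytes on every read. Here is a minimal sketch with a hypothetical CachedBody class (not part of any framework) that isolates just that idea:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical helper: the source stream is drained once into a byte[],
// and newStream() returns a fresh, independently readable stream each time.
public class CachedBody {
    private final byte[] cached;

    public CachedBody(InputStream source) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        source.transferTo(buffer); // the one and only read of the original stream
        this.cached = buffer.toByteArray();
    }

    public InputStream newStream() {
        return new ByteArrayInputStream(cached); // re-readable as often as needed
    }
}
```

Each call to newStream() starts from the beginning of the cached bytes, which is exactly the property the wrapper's getInputStream and getReader rely on.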
For implementing MultipleReadHttpRequest, you can take a look at ContentCachingRequestWrapper from the Spring Framework, which does essentially the same thing.
This approach has its own disadvantages. First of all, it's somewhat inefficient, since the request body is read at least twice for every request. The other important drawback is memory consumption: if the request body is a 10 GB stream, you read all of that 10 GB and, even worse, buffer it in memory for further examination.
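One way to limit the memory risk is to refuse to cache bodies beyond a configured size, so a huge upload fails fast instead of exhausting the heap. A hypothetical sketch of such a bounded copy (the class and method names are illustrative, not from any library):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical guard: copy at most maxBytes from the stream, failing fast
// instead of buffering an arbitrarily large body in memory.
public class BoundedCopy {
    public static byte[] readAtMost(InputStream in, int maxBytes) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            if (out.size() + n > maxBytes) {
                throw new IOException("request body exceeds " + maxBytes + " bytes");
            }
            out.write(chunk, 0, n);
        }
        return out.toByteArray();
    }
}
```

A caching filter could call something like this in the wrapper's constructor and reject oversized requests with an appropriate error status instead of caching them.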