I have a Spring Batch project that reads a huge zip file containing more than 100,000 XML files.
I am using MultiResourcePartitioner, and I have a memory issue: my batch fails with
java.lang.OutOfMemoryError: GC overhead limit exceeded.
It seems as if all the XML files are loaded into memory and never garbage collected after processing.
Is there a performant way to do this?
Thanks.
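
For context, here is a minimal sketch of the kind of configuration the question describes, assuming Spring Batch 4.x; the bean names, the worker step, and the extracted-files location are assumptions, not the actual project code:

    import java.io.IOException;

    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.partition.support.MultiResourcePartitioner;
    import org.springframework.context.annotation.Bean;
    import org.springframework.core.io.Resource;
    import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

    public class PartitionConfig {

        // Hypothetical names: "workerStep" processes a single XML file, and
        // /tmp/unzipped is where the zip is assumed to have been extracted.
        @Bean
        public Step partitionedStep(StepBuilderFactory steps, Step workerStep) throws IOException {
            Resource[] xmlFiles = new PathMatchingResourcePatternResolver()
                    .getResources("file:/tmp/unzipped/**/*.xml");

            // One partition (ExecutionContext + StepExecution) is created per resource.
            MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
            partitioner.setResources(xmlFiles);

            return steps.get("partitionedStep")
                    .partitioner("workerStep", partitioner)
                    .step(workerStep)
                    .gridSize(10)
                    .build();
        }
    }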
Comments:

You could write a custom Partitioner that groups files together into chunks. – Atthia

MultiResourcePartitioner creates one partition (and therefore one ExecutionContext and one StepExecution) per file. With 200,000 files, you may want to group them together so that you have fewer partitions. – Atthia
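
The comments suggest replacing the one-partition-per-file behaviour with a partitioner that assigns a group of files to each partition. Below is a minimal sketch of such a grouping Partitioner, assuming Spring Batch 4.x imports; the class name, the "fileUrls" key, and the group size are assumptions, not an established API:

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.util.HashMap;
    import java.util.Map;

    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;
    import org.springframework.core.io.Resource;

    // Hypothetical partitioner: each partition receives a group of file URLs
    // instead of a single file, so far fewer ExecutionContexts/StepExecutions
    // are created for 100,000+ files.
    public class GroupingResourcePartitioner implements Partitioner {

        private final Resource[] resources;
        private final int filesPerPartition;

        public GroupingResourcePartitioner(Resource[] resources, int filesPerPartition) {
            this.resources = resources;
            this.filesPerPartition = filesPerPartition;
        }

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            int partitionIndex = 0;
            for (int start = 0; start < resources.length; start += filesPerPartition) {
                int end = Math.min(start + filesPerPartition, resources.length);
                String[] urls = new String[end - start];
                for (int i = start; i < end; i++) {
                    try {
                        urls[i - start] = resources[i].getURL().toExternalForm();
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                }
                ExecutionContext context = new ExecutionContext();
                // Store only the URLs; the worker step resolves and reads them
                // one at a time, so whole files need not be kept in memory.
                context.put("fileUrls", urls);
                partitions.put("partition" + partitionIndex++, context);
            }
            return partitions;
        }
    }

The worker step would then read the group sequentially, for example with a step-scoped reader whose resources are rebuilt from the stored URLs, so only one file is open at a time.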