I'm currently trying to optimize code execution for a contest, and was looking at the ObjectPool pattern to favor object reuse over instantiating new objects.
I've put together a small project (a single test class) to investigate some of the things I see and don't understand.
What I'm doing:
- compare the creation of very simple objects for 5 000 000 iterations using both the new() and Pool.get() operations
- vary three axes, running all tests (a stripped-down sketch of the loop follows this list):
- with and without a "warmup" that runs the loop once before taking the measurements
- with and without assigning the newly created object to a local variable and using it for some computation
- with fixed vs random parameters as arguments
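For reference, here is a stripped-down sketch of the kind of loop I am measuring. It is not the actual contest code: Point, Pool and the System.currentTimeMillis() timing are just placeholders for my real classes and measurement setup.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Random;

// Placeholder for the "very simple objects" being created in the test.
class Point {
    double x, y;
    Point(double x, double y) { this.x = x; this.y = y; }
    void set(double x, double y) { this.x = x; this.y = y; }
}

// Minimal pool: hands back previously released instances, allocates only when empty.
class Pool {
    private final Deque<Point> free = new ArrayDeque<>();

    Point get(double x, double y) {
        Point p = free.poll();
        if (p == null) return new Point(x, y);
        p.set(x, y);
        return p;
    }

    void release(Point p) { free.push(p); }
}

public class PoolBenchmark {
    static final int ITERATIONS = 5_000_000;

    public static void main(String[] args) {
        Random random = new Random();
        Pool pool = new Pool();

        // "warmup" axis: run both loops once before timing so the JIT has seen them.
        runWithNew(random, true);
        runWithPool(pool, random, true);

        long t0 = System.currentTimeMillis();
        double accNew = runWithNew(random, true);         // "random parameters" + "object used"
        long t1 = System.currentTimeMillis();
        double accPool = runWithPool(pool, random, true);
        long t2 = System.currentTimeMillis();

        System.out.println("new:  " + (t1 - t0) + " ms (" + accNew + ")");
        System.out.println("pool: " + (t2 - t1) + " ms (" + accPool + ")");
    }

    static double runWithNew(Random random, boolean useObject) {
        double acc = 0;
        for (int i = 0; i < ITERATIONS; i++) {
            // "fixed parameters" variant: new Point(1.0, 2.0) instead of the random calls.
            Point p = new Point(random.nextDouble(), random.nextDouble());
            if (useObject) acc += p.x + p.y;              // "object used" axis: small computation
        }
        return acc;
    }

    static double runWithPool(Pool pool, Random random, boolean useObject) {
        double acc = 0;
        for (int i = 0; i < ITERATIONS; i++) {
            Point p = pool.get(random.nextDouble(), random.nextDouble());
            if (useObject) acc += p.x + p.y;
            pool.release(p);
        }
        return acc;
    }
}
```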
The results I have are below; figures are for new instantiation vs object pool, for 5 000 000 iterations:

| Warmup | Object used | Parameters | new | Pool.get() |
|--------|-------------|------------|-----|------------|
| no     | no          | random     | 417 | 457        |
| no     | no          | fixed      | 11  | 84         |
| no     | yes         | random     | 515 | 493        |
| no     | yes         | fixed      | 64  | 90         |
| yes    | no          | random     | 284 | 419        |
| yes    | no          | fixed      | 8   | 55         |
| yes    | yes         | random     | 410 | 397        |
| yes    | yes         | fixed      | 69  | 82         |
What I notice from that:
- Using fixed parameters has a huge impact when a new object is instantiated but never used. My guess was that the compiler performed some optimization: seeing no side effects, it would remove the object instantiation altogether. However, comparing the timings with an empty loop shows that something still happens (a sketch of the pattern I suspect gets optimized away follows this list).
- Using fixed parameters also has a significant impact (though less pronounced) on the speed of plain new, making it faster than the object pool version in some cases.
- The object pool is faster in the "real life" scenarios (i.e. the new objects are actually used and the parameters are somewhat random), but not in most of the other scenarios, which also hints at a compiler optimization.
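To make the first point concrete, the first loop below is the pattern I suspect the JIT can eliminate: with fixed parameters and an unused result, the object never escapes and its fields are never read, so the allocation can in principle be dropped (escape analysis / dead-code elimination). This is only my guess, I have not inspected the generated code, and Point is again just a placeholder.

```java
import java.util.Random;

public class EliminationHypothesis {
    // Same kind of placeholder value object as in the sketch above.
    static class Point {
        double x, y;
        Point(double x, double y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) {
        Random random = new Random();

        // Fixed parameters, result never used: nothing observable depends on the
        // object, so the JIT is free to remove the allocation entirely. This would
        // explain the very low "fixed parameters, no use" timings.
        for (int i = 0; i < 5_000_000; i++) {
            Point p = new Point(1.0, 2.0);
        }

        // Random parameters, result used: random.nextDouble() must run on every
        // iteration and the fields are actually read, so far less can be removed.
        double acc = 0;
        for (int i = 0; i < 5_000_000; i++) {
            Point p = new Point(random.nextDouble(), random.nextDouble());
            acc += p.x + p.y;
        }
        System.out.println(acc); // keep acc observable so this loop is not eliminated too
    }
}
```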
What I'm looking for here is to understand these results, and to get pointers to docs / books that I could read to gain a solid understanding of what happens behind the scenes in these cases.
Thanks!
Comments:
- "[…] random.nextDouble() rather than to any compiler / JIT optimizations that you are hypothesizing." – Lanfri
- "[…] fill it up and you start reusing objects previously allocated […]" – Lanfri
- "Not sure I get your meaning here. The pool is filled and reuses items as far as I can see. As for freeing objects, that's something I'll want to have a look at later on. My use case is pretty simple, so a flag on the object (or cloning the few objects I want to keep) might be enough." – Expose
- "Once you add the concept of freeing, things will become a lot more complicated." – Lanfri
- "Got it. In this particular case, the objects are instantiated for a short processing and can be reused afterwards, so do I need to explicitly free them?" – Expose
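For what it's worth, here is a minimal sketch of the "flag on the object" idea mentioned in the comments. PooledPoint and FlaggedPool are made-up names for illustration, not code from the discussion.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical pooled object carrying an "in use" flag.
class PooledPoint {
    double x, y;
    boolean inUse;   // true while handed out, false once "freed"
}

// Minimal pool that reuses any instance whose flag is cleared instead of allocating.
class FlaggedPool {
    private final List<PooledPoint> items = new ArrayList<>();

    PooledPoint get(double x, double y) {
        for (PooledPoint p : items) {
            if (!p.inUse) {                 // reuse a previously "freed" instance
                p.inUse = true;
                p.x = x;
                p.y = y;
                return p;
            }
        }
        PooledPoint p = new PooledPoint();  // no free instance: grow the pool
        p.inUse = true;
        p.x = x;
        p.y = y;
        items.add(p);
        return p;
    }

    // "Freeing" just clears the flag; the instance stays in the pool for reuse.
    void free(PooledPoint p) { p.inUse = false; }
}

public class FlagPoolDemo {
    public static void main(String[] args) {
        FlaggedPool pool = new FlaggedPool();
        PooledPoint p = pool.get(1.0, 2.0);  // short processing with p...
        pool.free(p);                        // ...then hand the instance back
        PooledPoint q = pool.get(3.0, 4.0);  // the same instance is reused
        System.out.println(p == q);          // prints true
    }
}
```

The linear scan over items is only there to keep the sketch short; a real pool would track free instances separately.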