I wanted to upgrade my Spring Boot application, which uses Hazelcast 3.12.9 as its caching mechanism, to Java 11 and Tomcat 9. When I deploy locally everything appears to work fine and caching succeeds. But when the application runs on the cluster, all three available nodes report the following error:
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.some.service.some.server.domain.ClassA
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:88)
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:77)
at com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:48)
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.toObject(AbstractSerializationService.java:187)
at com.hazelcast.map.impl.proxy.MapProxySupport.toObject(MapProxySupport.java:1237)
at com.hazelcast.map.impl.proxy.MapProxyImpl.get(MapProxyImpl.java:120)
at com.hazelcast.spring.cache.HazelcastCache.lookup(HazelcastCache.java:162)
at com.hazelcast.spring.cache.HazelcastCache.get(HazelcastCache.java:67)
at com.some.service.some.server.domain.ClassACache.get(ClassACache.java:28)
at com.some.service.some.server.domain.ClassAFacade.getClassA(ClassAFacade.java:203)
at com.some.service.some.server.domain.ClassAFacade.getClassA(ClassAFacade.java:185)
at com.some.service.some.server.domain.ClassALogic.lambda$getClassAInParallel$1(ClassALogic.java:196)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:952)
at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(ReduceOps.java:926)
at java.base/java.util.stream.AbstractTask.compute(AbstractTask.java:327)
at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.lang.ClassNotFoundException: com.some.service.some.server.domain.ClassA
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at com.hazelcast.nio.ClassLoaderUtil.tryLoadClass(ClassLoaderUtil.java:288)
Hazelcast customizer:
@Configuration
public class ClassAHazelcastConfig {

    private static final MaxSizePolicy HAZELCAST_DEFAULT_MAX_SIZE_POLICY = MaxSizePolicy.PER_NODE;
    private static final EvictionPolicy HAZELCAST_DEFAULT_EVICTION_POLICY = EvictionPolicy.LRU;

    @Bean
    HazelcastConfigurationCustomizer customizer(CachePropertiesHolder cacheProperties) {
        return config -> {
            config.addMapConfig(new MapConfig()
                    .setName(CLASS_A_CACHE)
                    .setMaxSizeConfig(new MaxSizeConfig(cacheProperties.getMaxsize(), HAZELCAST_DEFAULT_MAX_SIZE_POLICY))
                    .setEvictionPolicy(HAZELCAST_DEFAULT_EVICTION_POLICY)
                    .setTimeToLiveSeconds(cacheProperties.getTtl()));
            config.getSerializationConfig().addSerializerConfig(
                    new SerializerConfig()
                            .setImplementation(new OptionalStreamSerializer())
                            .setTypeClass(Optional.class));
        };
    }
}
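One thing I am also considering is pinning the class loader on the Hazelcast config explicitly, since the exception comes from Hazelcast failing to resolve `ClassA` during deserialization. Whether this is the right fix is just a guess on my part; it is a minimal config-fragment sketch, assuming the failing lookup happens on the webapp's embedded Hazelcast instance:

```java
@Bean
HazelcastConfigurationCustomizer classLoaderCustomizer() {
    // Assumption: Hazelcast falls back to a loader that cannot see the
    // webapp's classes under Tomcat 9 / Java 11. Config.setClassLoader
    // tells it which loader to use when deserializing cached values.
    return config ->
            config.setClassLoader(Thread.currentThread().getContextClassLoader());
}
```

Here `classLoaderCustomizer` is a hypothetical bean name; the call assumes the bean is created on a thread whose context class loader is the webapp's.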
@Configuration
@EnableConfigurationProperties(CachePropertiesHolder.class)
public class CacheConfig implements CachingConfigurer, EnvironmentAware, ApplicationContextAware {

    public static final String CLASS_A_CACHE = "CACHE_A";

    private Environment environment;
    private ApplicationContext applicationContext;

    @Override
    @Bean(name = "cacheManager")
    public CacheManager cacheManager() {
        boolean cachingEnabled = Boolean.parseBoolean(environment.getProperty("cache.enabled"));
        if (cachingEnabled) {
            HazelcastInstance instance = (HazelcastInstance) applicationContext.getBean("hazelcastInstance");
            return new HazelcastCacheManager(instance);
        }
        return new NoOpCacheManager();
    }

    @Override
    public CacheResolver cacheResolver() {
        return new SimpleCacheResolver(Objects.requireNonNull(cacheManager()));
    }

    @Bean
    @Override
    public KeyGenerator keyGenerator() {
        return new SimpleKeyGenerator();
    }

    @Bean
    @Override
    public CacheErrorHandler errorHandler() {
        return new SimpleCacheErrorHandler();
    }

    @Override
    public void setEnvironment(@NotNull Environment environment) {
        this.environment = environment;
    }

    @Override
    public void setApplicationContext(@NotNull ApplicationContext applicationContext) throws BeansException {
        this.applicationContext = applicationContext;
    }
}
Everything works fine with Java 8 and Tomcat 8.
Update:
After some days of investigation, I see that the only place these exceptions are thrown is inside a parallel stream that is submitted to a ForkJoinPool:
return forkJoinPool.submit(() ->
        items.parallelStream()
                .map(item -> {
                    try {
                        return biFunction.apply(item);
                    } catch (Exception e) {
                        LOG.error("Error", e);
                        return Optional.<Item>empty();
                    }
                })
The weird thing is that with Java 8 and Tomcat 8 I did not have that issue.
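My working hypothesis (unconfirmed) is that the ForkJoinPool worker threads do not carry the webapp's context class loader, so when Hazelcast deserializes a cached value on one of those threads it cannot resolve `ClassA`. As an experiment, I am building the pool with a thread factory that propagates the creating thread's class loader into every worker. This is a sketch, not my actual code; `ContextAwareForkJoinPool` is a hypothetical helper:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.ForkJoinWorkerThread;

public class ContextAwareForkJoinPool {

    // Builds a ForkJoinPool whose workers inherit the class loader of the
    // thread that creates the pool (under Tomcat, the webapp class loader),
    // rather than whatever loader worker threads would otherwise get.
    public static ForkJoinPool create(int parallelism) {
        ClassLoader creatorLoader = Thread.currentThread().getContextClassLoader();
        ForkJoinPool.ForkJoinWorkerThreadFactory factory = pool -> {
            ForkJoinWorkerThread thread =
                    ForkJoinPool.defaultForkJoinWorkerThreadFactory.newThread(pool);
            // Carry the creator's loader into the worker thread.
            thread.setContextClassLoader(creatorLoader);
            return thread;
        };
        return new ForkJoinPool(parallelism, factory, null, false);
    }

    public static void main(String[] args) throws Exception {
        ClassLoader expected = Thread.currentThread().getContextClassLoader();
        ForkJoinPool pool = create(4);
        // A task running in the pool should observe the creator's loader.
        ClassLoader seen = pool.submit(
                () -> Thread.currentThread().getContextClassLoader()).get();
        System.out.println(seen == expected);
        pool.shutdown();
    }
}
```

If the hypothesis is right, running the parallel stream on such a pool (instead of the common pool) should let Hazelcast resolve `ClassA` again; I have not verified this on the cluster yet.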