Spring Batch 4.2.4: Unable to deserialize the execution context
I was using spring-batch:4.2.2.RELEASE as part of spring-boot-starter-batch:2.2.4.RELEASE. After upgrading the latter to version 2.3.1.RELEASE, I get the following exception when starting a job:

java.lang.IllegalArgumentException: Unable to deserialize the execution context
    at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:328)
    at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:312)
    at org.springframework.jdbc.core.RowMapperResultSetExtractor.extractData(RowMapperResultSetExtractor.java:94)
    at org.springframework.jdbc.core.RowMapperResultSetExtractor.extractData(RowMapperResultSetExtractor.java:61)
    at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:679)
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:617)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:669)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:700)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:712)
    at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:768)
    at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.getExecutionContext(JdbcExecutionContextDao.java:129)
    at org.springframework.batch.core.explore.support.SimpleJobExplorer.getStepExecutionDependencies(SimpleJobExplorer.java:238)
    at org.springframework.batch.core.explore.support.SimpleJobExplorer.getJobExecutions(SimpleJobExplorer.java:87)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
    at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
    at com.sun.proxy.$Proxy145.getJobExecutions(Unknown Source)
...
Caused by: com.fasterxml.jackson.databind.exc.InvalidTypeIdException: Missing type id when trying to resolve subtype of [map type; class java.util.HashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]]: missing type id property '@class'
 at [Source: (ByteArrayInputStream); line: 1, column: 192]
    at com.fasterxml.jackson.databind.exc.InvalidTypeIdException.from(InvalidTypeIdException.java:43)
    at com.fasterxml.jackson.databind.DeserializationContext.missingTypeIdException(DeserializationContext.java:1790)
    at com.fasterxml.jackson.databind.DeserializationContext.handleMissingTypeId(DeserializationContext.java:1319)
    at com.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._handleMissingTypeId(TypeDeserializerBase.java:303)
    at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedUsingDefaultImpl(AsPropertyTypeDeserializer.java:166)
    at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:107)
    at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserializeWithType(MapDeserializer.java:400)
    at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:68)
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4482)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3479)
    at org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer.deserialize(Jackson2ExecutionContextStringSerializer.java:123)
    at org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer.deserialize(Jackson2ExecutionContextStringSerializer.java:102)
    at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao$ExecutionContextRowMapper.mapRow(JdbcExecutionContextDao.java:325)
    ... 45 common frames omitted


I understand that the new version handles JSON deserialization more restrictively. I attempted to implement the suggested fix from the Jackson2ExecutionContextStringSerializer Javadoc, but the problem persists:

@EnableBatchProcessing
@Configuration
class BatchConfig(

    val properties: BatchProperties,
    val dataSource: DataSource,
    val transactionManagerCustomizers: TransactionManagerCustomizers,
    val entityManagerFactory: EntityManagerFactory
) : JpaBatchConfigurer(properties, dataSource, transactionManagerCustomizers, entityManagerFactory) {

    override fun createJobRepository(): JobRepository {
        val factory = JobRepositoryFactoryBean()
        val map = PropertyMapper.get()
        map.from(dataSource).to { dataSource: DataSource? -> factory.setDataSource(dataSource!!) }
        map.from { determineIsolationLevel() }.whenNonNull().to { isolationLevelForCreate: String? -> factory.setIsolationLevelForCreate(isolationLevelForCreate!!) }
        map.from { properties.tablePrefix }.whenHasText().to { tablePrefix: String? -> factory.setTablePrefix(tablePrefix!!) }
        map.from { transactionManager }.to { transactionManager: PlatformTransactionManager? -> factory.transactionManager = transactionManager!! }
        factory.afterPropertiesSet()

        val serializer = configureContextSerializer()
        factory.setSerializer(serializer)

        return factory.getObject()
    }

    private fun configureContextSerializer(): Jackson2ExecutionContextStringSerializer {
        // ObjectMapper with default typing enabled, so type info (@class) is written and read
        val objectMapper = ObjectMapper()
        val polymorphicTypeValidator = LaissezFaireSubTypeValidator()
        objectMapper.activateDefaultTyping(polymorphicTypeValidator)
        val serializer = Jackson2ExecutionContextStringSerializer()
        serializer.setObjectMapper(objectMapper)
        return serializer
    }
}

The craziest part is that the execution context is actually empty; the database value is always "{}". I even tried changing all values in the DB to {"@class":"java.util.HashMap"}, but I still get the same exception.

Does anyone have an idea how to fix this? Is the configuration from my fix attempt wrong?

Banksia answered 3/7, 2020 at 15:8 Comment(7)
Are you trying to run a new job instance or restart a failed one? Execution contexts that were serialized with 4.2.2 do not contain the type information (@class, @property, etc.) that is required when deserializing with 4.2.4. If you run a new job with 4.2.4 and enable default typing, Jackson will add type info to the serialized execution context.Respectively
@MahmoudBenHassine I am running a new one, but I want to read the last execution time of a previously successful instance. Do I understand correctly that this will not be possible, having run the past jobs with 4.2.2?Banksia
You would need to add type infos (as you tried) to be able to deserialize the previous context (the one serialized with 4.2.2). I think your attempt to add @class in "{"@class":"java.util.HashMap"}" is incorrect. Try to see how jackson serializes a new execution context and update your existing ones with a similar content.Respectively
@MahmoudBenHassine Actually, {"@class":"java.util.HashMap"} was correct, I had to add it to batch_step_execution_context as well as to batch_job_execution_context.Banksia
ok good to know. Indeed, you need to make sure to update both job and step execution contexts obviously. Is your issue resolved now?Respectively
@MahmoudBenHassine yes, I posted an answer with a sample fixBanksia
I've got the same issue with Spring Batch 4.3.1 (Spring Boot 2.4.2). I downgraded Spring Batch to 4.2.2 to solve it.Acima
Thanks to @MahmoudBenHassine for pointing me in the direction of the fix:

My attempt to manually add the type information to the database values was correct, but I didn't take it far enough.

There are two tables whose values needed updating:

  • table batch_job_execution_context, column short_context
  • table batch_step_execution_context, column short_context

I did this with a liquibase script:

    <changeSet id="update-job_execution_context-for-spring-batch-4.2.4" author="kpentchev">
        <update tableName="batch_job_execution_context">
            <column name="short_context" valueComputed="REPLACE(short_context, '{', '{&quot;@class&quot;:&quot;java.util.HashMap&quot;,')" />
        </update>
    </changeSet>

    <changeSet id="update-step_execution_context-for-spring-batch-4.2.4" author="kpentchev">
        <update tableName="batch_step_execution_context">
            <column name="short_context" valueComputed="REPLACE(short_context, '{', '{&quot;@class&quot;:&quot;java.util.HashMap&quot;,')" />
        </update>
    </changeSet>
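For illustration, the intended rewrite can be mirrored in plain Java (a hypothetical sketch, not part of the migration). Note that SQL REPLACE rewrites every '{' in the value, which would also corrupt braces of nested objects, and an empty {} context would gain a trailing comma; the sketch below anchors on the first brace only and special-cases the empty context:

```java
public class ContextMigrationSketch {

    // Hypothetical helper: prepend the Jackson type id to a serialized
    // execution context. Unlike SQL REPLACE, only the first '{' is
    // rewritten, and an empty context is handled explicitly to avoid
    // producing a trailing comma.
    static String addTypeInfo(String shortContext) {
        if (shortContext.equals("{}")) {
            return "{\"@class\":\"java.util.HashMap\"}";
        }
        return shortContext.replaceFirst("\\{", "{\"@class\":\"java.util.HashMap\",");
    }

    public static void main(String[] args) {
        System.out.println(addTypeInfo("{}"));
        System.out.println(addTypeInfo("{\"map\":[]}"));
    }
}
```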

The configuration overriding was not needed.

It would have been nice to have a migration service as part of the Spring Batch release, but here is a workaround.

Banksia answered 6/7, 2020 at 9:55 Comment(7)
The configuration overriding was not needed.: which Jackson version do you have on your classpath? I'm not sure there is a migration service we can (or should) provide, as users' execution contexts can contain any sort of data. Your use case is specific, since you need some info from the previous execution to run a new one; the problem should not happen for new job instances.Respectively
@MahmoudBenHassine I have jackson version 2.11.0 after the upgrade to spring-boot 2.3.1.RELEASE and had 2.10.2 before.Banksia
That explains it. Spring Batch 4.2.x uses Jackson 2.10.x, with which you would need the config override (the upcoming v4.3 will use Jackson 2.11), but it's fine, and in fact better, that you have Jackson 2.11 (brought by Spring Boot 2.3). Glad you fixed your issue!Respectively
@MahmoudBenHassine Wouldn't hurt to put this somewhere in release notes? I checked and couldn't find anything, but that could just be me looking up wrong stuff.Lodging
@Lodging It's in the release notes: github.com/spring-projects/spring-batch/releases/tag/… , issue 3729.Respectively
There was a typo in the original scripts: the first tableName should be batch_job_execution_context instead of batch_step_execution_context.Archdeaconry
I faced the same issue during a Spring Batch version upgrade. With the older version the context starts with {"map":[{"entry":, while with the new version it starts with {"@class":"java.util.HashMap". This caused a Jackson deserialization error, and I had to clean both tables before proceeding with the new Spring Batch version.Spanker
I recently ran into a similar problem while upgrading from Spring Batch 4.2.1.RELEASE to 4.2.4.RELEASE.

@kpentchev provides a good solution by directly modifying the serialized execution context JSON in the database.

Another solution is to override Jackson2ExecutionContextStringSerializer#deserialize(InputStream), catch the exception thrown when deserializing the old JSON format, and fall back to a second, legacy ObjectMapper.

I've provided one such implementation below.


import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.MapperFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;
import com.fasterxml.jackson.databind.exc.InvalidTypeIdException;
import com.fasterxml.jackson.databind.module.SimpleModule;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.lang.reflect.Field;
import java.nio.charset.StandardCharsets;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;
import javax.validation.constraints.NotNull;
import lombok.NonNull;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.JobParameter;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.repository.dao.Jackson2ExecutionContextStringSerializer;
import org.springframework.util.ReflectionUtils;

/**
 * Extends {@link Jackson2ExecutionContextStringSerializer} in order to support deserializing JSON
 * that was serialized using Spring Batch 4.2.1.RELEASE, and persisted in the database.
 *
 * <p>This class has been tested upgrading from Spring Batch 4.2.1.RELEASE to 4.2.4.RELEASE.
 */
@Slf4j
public class BackwardsCompatibleSerializer extends Jackson2ExecutionContextStringSerializer {

  private final ObjectMapper newObjectMapper;

  private final ObjectMapper legacyObjectMapper;

  public BackwardsCompatibleSerializer() {
    newObjectMapper = getNewObjectMapper();
    legacyObjectMapper = createLegacyObjectMapper();
  }

  /**
   * Overrides the default deserialization method.  If an {@link InvalidTypeIdException} is thrown
   * during deserialization, the exception is caught, and an attempt is made to deserialize the JSON
   * using the legacy {@link ObjectMapper} instance.
   */
  @Override
  public @NotNull Map<String, Object> deserialize(@NotNull InputStream in) throws IOException {
    String json = inputStreamToString(in);
    TypeReference<HashMap<String, Object>> typeRef = new TypeReference<>() {};
    try {
      return newObjectMapper.readValue(json, typeRef);
    } catch (InvalidTypeIdException e) {
      log.info("Couldn't deserialize JSON: will attempt to use legacy ObjectMapper");
      log.debug("Stacktrace", e);
      return legacyObjectMapper.readValue(json, typeRef);
    }
  }

  /**
   * Uses Java reflection to access the new {@link ObjectMapper} instance from the private
   * superclass field.  This will be used to serialize and deserialize JSON created using Spring
   * Batch 4.2.4.RELEASE.
   *
   * @return the new {@link ObjectMapper} instance
   */
  private ObjectMapper getNewObjectMapper() {
    ObjectMapper newObjectMapper;
    Field field = ReflectionUtils.findField(Jackson2ExecutionContextStringSerializer.class,
        "objectMapper", ObjectMapper.class);
    Objects.requireNonNull(field, "objectMapper field is null");
    ReflectionUtils.makeAccessible(field);
    newObjectMapper = (ObjectMapper) ReflectionUtils.getField(field, this);
    return newObjectMapper;
  }

  /**
   * Creates the {@link ObjectMapper} instance that can be used for deserializing JSON that was
   * previously serialized using Spring Batch 4.2.1.RELEASE.  This instance is only used if an
   * exception is thrown in {@link #deserialize(InputStream)} when using the new {@link
   * ObjectMapper} instance.
   *
   * @return the {@link ObjectMapper} instance that can be used for deserializing legacy JSON
   */
  @SuppressWarnings("deprecation")
  private ObjectMapper createLegacyObjectMapper() {
    ObjectMapper legacyObjectMapper = new ObjectMapper();
    legacyObjectMapper.configure(MapperFeature.DEFAULT_VIEW_INCLUSION, false);
    legacyObjectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, true);
    legacyObjectMapper.enableDefaultTyping();
    legacyObjectMapper.registerModule(new JobParametersModule());
    return legacyObjectMapper;
  }

  private static String inputStreamToString(@NonNull InputStream inputStream) throws IOException {
    ByteArrayOutputStream result = new ByteArrayOutputStream();
    byte[] buffer = new byte[1024];
    int length;
    while ((length = inputStream.read(buffer)) != -1) {
      result.write(buffer, 0, length);
    }
    return result.toString(StandardCharsets.UTF_8);
  }

  /*
   * The remainder of this file was copied from here:
   *
   * https://github.com/spring-projects/spring-batch/blob/4.2.1.RELEASE/spring-batch-core/src/main/java/org/springframework/batch/core/repository/dao/Jackson2ExecutionContextStringSerializer.java
   */

  // BATCH-2680

  /**
   * Custom Jackson module to support {@link JobParameter} and {@link JobParameters}
   * deserialization.
   */
  private static class JobParametersModule extends SimpleModule {

    private static final long serialVersionUID = 1L;

    private JobParametersModule() {
      super("Job parameters module");
      setMixInAnnotation(JobParameters.class, JobParametersMixIn.class);
      addDeserializer(JobParameter.class, new JobParameterDeserializer());
    }

    private abstract static class JobParametersMixIn {

      @JsonIgnore
      abstract boolean isEmpty();
    }

    private static class JobParameterDeserializer extends StdDeserializer<JobParameter> {

      private static final long serialVersionUID = 1L;
      private static final String IDENTIFYING_KEY_NAME = "identifying";
      private static final String TYPE_KEY_NAME = "type";
      private static final String VALUE_KEY_NAME = "value";

      JobParameterDeserializer() {
        super(JobParameter.class);
      }

      @SuppressWarnings("checkstyle:all")
      @Override
      public JobParameter deserialize(JsonParser parser, DeserializationContext context)
          throws IOException {
        JsonNode node = parser.readValueAsTree();
        boolean identifying = node.get(IDENTIFYING_KEY_NAME).asBoolean();
        String type = node.get(TYPE_KEY_NAME).asText();
        JsonNode value = node.get(VALUE_KEY_NAME);
        Object parameterValue;
        switch (JobParameter.ParameterType.valueOf(type)) {
          case STRING: {
            parameterValue = value.asText();
            return new JobParameter((String) parameterValue, identifying);
          }
          case DATE: {
            parameterValue = new Date(value.get(1).asLong());
            return new JobParameter((Date) parameterValue, identifying);
          }
          case LONG: {
            parameterValue = value.get(1).asLong();
            return new JobParameter((Long) parameterValue, identifying);
          }
          case DOUBLE: {
            parameterValue = value.asDouble();
            return new JobParameter((Double) parameterValue, identifying);
          }
        }
        return null;
      }
    }
  }
}
Rheumy answered 12/8, 2020 at 19:13 Comment(0)
I relied on @kpentchev's solution and used the following SQL commands:

update BATCH_JOB_EXECUTION_CONTEXT
set SHORT_CONTEXT = replace(SHORT_CONTEXT, '{"map"', '{"@class":"java.util.HashMap","map"')
WHERE SHORT_CONTEXT LIKE '{"map":%';

update BATCH_STEP_EXECUTION_CONTEXT
set SHORT_CONTEXT = replace(SHORT_CONTEXT, '{"map"', '{"@class":"java.util.HashMap","map"')
WHERE SHORT_CONTEXT LIKE '{"map":%';

commit;
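For illustration, the same rewrite can be mirrored in plain Java (hypothetical helper): anchoring the replacement on the legacy {"map" prefix, as the WHERE clause above does, leaves already-migrated or empty contexts untouched.

```java
public class ShortContextRewrite {

    // Hypothetical helper mirroring the UPDATE above: only values in the
    // legacy format (starting with {"map") are rewritten.
    static String rewrite(String shortContext) {
        if (!shortContext.startsWith("{\"map\"")) {
            return shortContext; // already migrated, or empty
        }
        return shortContext.replace("{\"map\"", "{\"@class\":\"java.util.HashMap\",\"map\"");
    }

    public static void main(String[] args) {
        System.out.println(rewrite("{\"map\":[]}"));
        System.out.println(rewrite("{}"));
    }
}
```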
Marabou answered 30/4, 2021 at 13:47 Comment(0)
An alternative solution is to clean up the database by deleting all batch metadata for the affected job, child tables first:

DELETE FROM BATCH_STEP_EXECUTION_CONTEXT WHERE STEP_EXECUTION_ID IN (SELECT STEP_EXECUTION_ID FROM BATCH_STEP_EXECUTION WHERE JOB_EXECUTION_ID IN (SELECT JOB_EXECUTION_ID FROM BATCH_JOB_EXECUTION WHERE JOB_INSTANCE_ID IN (SELECT JOB_INSTANCE_ID FROM BATCH_JOB_INSTANCE WHERE JOB_NAME = 'myJob')));

DELETE FROM BATCH_STEP_EXECUTION WHERE JOB_EXECUTION_ID IN (SELECT JOB_EXECUTION_ID FROM BATCH_JOB_EXECUTION WHERE JOB_INSTANCE_ID IN (SELECT JOB_INSTANCE_ID FROM BATCH_JOB_INSTANCE WHERE JOB_NAME = 'myJob'));

DELETE FROM BATCH_JOB_EXECUTION_CONTEXT WHERE JOB_EXECUTION_ID IN (SELECT JOB_EXECUTION_ID FROM BATCH_JOB_EXECUTION WHERE JOB_INSTANCE_ID IN (SELECT JOB_INSTANCE_ID FROM BATCH_JOB_INSTANCE WHERE JOB_NAME = 'myJob'));

DELETE FROM BATCH_JOB_EXECUTION_PARAMS WHERE JOB_EXECUTION_ID IN (SELECT JOB_EXECUTION_ID FROM BATCH_JOB_EXECUTION WHERE JOB_INSTANCE_ID IN (SELECT JOB_INSTANCE_ID FROM BATCH_JOB_INSTANCE WHERE JOB_NAME = 'myJob'));

DELETE FROM BATCH_JOB_EXECUTION WHERE JOB_INSTANCE_ID IN (SELECT JOB_INSTANCE_ID FROM BATCH_JOB_INSTANCE WHERE JOB_NAME = 'myJob');

DELETE FROM BATCH_JOB_INSTANCE WHERE JOB_NAME = 'myJob';

Sollars answered 23/5, 2023 at 13:20 Comment(0)
