I am using Spring's JdbcTemplate with a BatchPreparedStatementSetter to perform a batch update on a PostgreSQL database. I want to capture the erroneous records, and after going through some posts I found that catching BatchUpdateException and then looking for Statement.EXECUTE_FAILED in the update counts should let me identify which records failed. However, with the implementation below, I never get a BatchUpdateException.
Here I am deliberately inserting the same id (120) for every row, so that a unique-constraint violation forces an exception at the database level.
@Override
@Transactional
public void batchInsertTenantPayloads(List<Payload> payloadMessages) {
    try {
        jdbcTemplate.batchUpdate(DBQueryConstants.INSERT_INTO_MESSAGE_METERING_PAYLOAD, new BatchPreparedStatementSetter() {
            @Override
            public int getBatchSize() {
                return payloadMessages.size();
            }

            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                ps.setLong(1, 120); // same id for every row, to force a unique-constraint violation
                ps.setString(2, payloadMessages.get(i).getId());
                ps.setDate(3, Date.valueOf(payloadMessages.get(i).getTs().toLocalDate()));
                ps.setString(4, payloadMessages.get(i).getContent().getId());
                ps.setInt(5, payloadMessages.get(i).getStatus().ordinal());
                ps.setString(6, MyUtils.toJSON(payloadMessages.get(i)));
            }
        });
    } catch (Exception e) {
        if (e.getCause() instanceof BatchUpdateException) {
            LOGGER.info("Batch exception occurred");
            BatchUpdateException be = (BatchUpdateException) e.getCause();
            int[] batchUpdateCounts = be.getUpdateCounts();
            for (int index = 0; index < batchUpdateCounts.length; index++) {
                if (batchUpdateCounts[index] == Statement.EXECUTE_FAILED) {
                    LOGGER.error("[MMC] batch update error at index " + index
                            + ", codeFail: " + batchUpdateCounts[index]
                            + ", line: " + payloadMessages.get(index));
                }
            }
        }
    }
}
The challenge is that I never enter the BatchUpdateException block. Instead I get an
org.springframework.dao.DuplicateKeyException
and the nested exception is
org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "message_metering_payload_pkey"
What can I do differently so that I can capture the BatchUpdateException, log just the failed rows, and commit the good ones?
My batch size is 500, the PostgreSQL version is 9.6, and I am using Spring Boot 2.0 with Spring JDBC 5.1.3.
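One thing I have been wondering about: Spring's exception translation may set the driver's PSQLException, rather than the BatchUpdateException, as the immediate cause, so my `e.getCause() instanceof BatchUpdateException` check could miss one that sits deeper in the chain. A minimal sketch of walking the entire cause chain instead (the helper name `findBatchUpdateException` is my own, not a Spring API):

```java
import java.sql.BatchUpdateException;
import java.sql.Statement;

public class BatchExceptionUtil {

    // Walk the whole cause chain: the translated exception may nest the
    // BatchUpdateException more than one level deep, or not contain one at all.
    static BatchUpdateException findBatchUpdateException(Throwable t) {
        for (Throwable cur = t; cur != null; cur = cur.getCause()) {
            if (cur instanceof BatchUpdateException) {
                return (BatchUpdateException) cur;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // Simulate a doubly wrapped BatchUpdateException with one failed row.
        int[] counts = {1, Statement.EXECUTE_FAILED, 1};
        BatchUpdateException bue = new BatchUpdateException(counts);
        Exception wrapped = new RuntimeException("outer", new RuntimeException("inner", bue));

        BatchUpdateException found = findBatchUpdateException(wrapped);
        System.out.println(found != null && found.getUpdateCounts()[1] == Statement.EXECUTE_FAILED);
    }
}
```

Even with this, I am not sure the driver ever attaches a BatchUpdateException in my case, which is really the core of my question.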
The stack trace looks something like this:
org.springframework.dao.DuplicateKeyException: PreparedStatementCallback; SQL [insert into message_metering_payload(id,tenant_name,source_dt,exchange_id,status,payload)values(?,?,?,?,?,?)]; ERROR: duplicate key value violates unique constraint "message_metering_payload_pkey"
Detail: Key (id)=(120) already exists.; nested exception is org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "message_metering_payload_pkey"
Detail: Key (id)=(120) already exists.
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:242)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1444)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:632)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:646)
at org.springframework.jdbc.core.JdbcTemplate.batchUpdate(JdbcTemplate.java:936)