Can you initialize Spring Batch metadata tables with Liquibase?
Currently I have the setup below. When the batch job runs locally, Spring Boot creates the necessary metadata tables automatically using the data-source property values, since initialize-schema is set to always. Liquibase also runs and creates any tables listed in its changelog.

Here is my application.yml file

spring:
  batch:
    initialize-schema: always
    job:
      enabled: true
  liquibase:
    url: db_url
    user: deploy_user
    password: deploy_pass
    change-log: classpath:db/changelog/db.changelog-master.yaml
    enabled: true
data-source:
  mysql:
    user: r_user
    password: r_pass
    jdbc-url: db_url

Here is my db.changelog-master.yaml file.

databaseChangeLog:

  - changeSet:
      dbms: mysql
      id: create-sample-table
      author: me
      changes:
        - sql:
            sql: CREATE TABLE sample_table (
              sample_id VARCHAR(255) NOT NULL,
              sample_text TEXT,
              PRIMARY KEY (sample_id)
              ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;

MySQL datasource config:

@Configuration
public class DataSourceConfiguration {

    @Primary
    @Bean(name = "mySQLDataSource")
    @ConfigurationProperties("data-source.mysql")
    public DataSource mySQLDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}

Liquibase Configuration (probably posting more than what's needed):

@Configuration
@EnableConfigurationProperties(LiquibaseProperties.class)
public class LiquibaseConfiguration {
    private static final Logger LOG = LoggerFactory.getLogger(LiquibaseConfiguration.class);

    @Autowired
    private LiquibaseProperties liquibaseProperties;


    public DataSource liquibaseDataSource() {
        DataSourceBuilder factory = DataSourceBuilder
                .create()
                .url(liquibaseProperties.getUrl())
                .username(liquibaseProperties.getUser())
                .password(liquibaseProperties.getPassword());

        return factory.build();
    }

    public void testLiquibaseConnection() throws SQLException {

        LOG.info("Testing connection to Liquibase (in case PCF restarts and we have stale dynamic secrets)...");
        liquibaseDataSource().getConnection();
        LOG.info("Testing connection to Liquibase (in case PCF restarts and we have stale dynamic secrets)... Succeeded");
    }

    @Bean
    public SpringLiquibase liquibase() {
        try {
            testLiquibaseConnection();
        } catch (Exception ex) {
            LOG.warn("WARNING: Could not connect to the database using " + liquibaseProperties.getUser() + ", so we will be skipping the Liquibase Migration for now. ", ex);
            return null;
        }
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setChangeLog(this.liquibaseProperties.getChangeLog());
        liquibase.setContexts(this.liquibaseProperties.getContexts());
        liquibase.setDataSource(liquibaseDataSource());
        liquibase.setDefaultSchema(this.liquibaseProperties.getDefaultSchema());
        liquibase.setDropFirst(this.liquibaseProperties.isDropFirst());
        liquibase.setShouldRun(this.liquibaseProperties.isEnabled());
        liquibase.setLabels(this.liquibaseProperties.getLabels());
        liquibase.setChangeLogParameters(this.liquibaseProperties.getParameters());
        return liquibase;
    }

}

The issue is we have different credentials for creating/deploying tables and reading/writing to tables in our deployed environments. So the below setup will work to create tables via Liquibase, but fail creating the metadata tables due to having the incorrect credentials upon deployment. Our current work-around to get the metadata tables created is to deploy with the data-source properties having deploy credentials, run the job to initialize the tables and then redeploy with read/write credentials. (We can't just leave the deploy credentials for reads because they have very short TTL).

Is it possible to create the metadata tables for Spring Batch via Liquibase automatically? Specifically, without adding the creation SQL manually to the changelog files?

UPDATE:

Using veljkost's answer below, a changelog file that looks like this works:

databaseChangeLog:
  - changeSet:
      dbms: mysql
      id: create-spring-batch-metadata
      author: dev.me
      changes:
        - sqlFile:
            encoding: UTF-8
            path: classpath:/org/springframework/batch/core/schema-mysql.sql
            relativeToChangelogFile: false
            splitStatements: true
            stripComments: true
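For context, this changeset references the schema script that ships inside the spring-batch-core jar, so nothing has to be copied into the project. A minimal sketch of how such a changeset file could be wired into the db.changelog-master.yaml referenced in application.yml (the included file names here are assumptions, not from the original setup):

```yaml
# db/changelog/db.changelog-master.yaml
databaseChangeLog:
  # Run the Spring Batch metadata changeset first, so the tables exist
  # before any job runs (file name is a hypothetical example)
  - include:
      file: db/changelog/db.changelog-spring-batch.yaml
  # Application tables follow
  - include:
      file: db/changelog/db.changelog-tables.yaml
```

Because Liquibase records each changeset in DATABASECHANGELOG, the schema script only executes once even across redeploys.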
Onitaonlooker answered 12/11, 2019 at 22:13 Comment(3)
the job will create the necessary metadata tables automatically: that's not correct, Spring Batch does not create tables automatically, it is up to you to do it manually or tell Spring boot to do it for you (as in your case). You need to make sure your liquibase script runs before the job so that Spring Batch can find the tables and report its meta-data.Savoy
That might just be improper wording on my end, sorry about that; since I have initialize-schema: always set, the schema is created before a job is run. Edited the question to make that more explicit. Also, Liquibase does run first; the main question is can Liquibase somehow create those metadata tables without them being explicitly in the changelog.Onitaonlooker
ok thanks for the clarification. the main question is can liquibase somehow create those metadata tables without them being explicitly in the changelog.: this is a Liquibase question rather than a Spring Batch question. I'm not too familiar with Liquibase, so I will let someone with better knowledge about it answer your question.Savoy
Yes, you can reference the schema files that already ship in the Spring Batch project. In the org.springframework.batch.core package you can find schema-*.sql files, where * is the name of the target database. Since you are running on MySQL, your change set would look something like this:

- changeSet:
    id: 1234
    author: adam.sandler
    changes:
      - sqlFile:
          encoding: utf8
          path: classpath:/org/springframework/batch/core/schema-mysql.sql
          relativeToChangelogFile: false
          splitStatements: true
          stripComments: true
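The same approach should work for any database Spring Batch supports, since spring-batch-core ships one schema-*.sql per platform (and matching schema-drop-*.sql scripts for tearing the tables down). A hedged PostgreSQL variant of the changeset above, with placeholder id and author values:

```yaml
- changeSet:
    id: create-spring-batch-metadata   # placeholder id
    author: adam.sandler               # placeholder author
    dbms: postgresql
    changes:
      - sqlFile:
          encoding: UTF-8
          # schema-postgresql.sql ships inside the spring-batch-core jar
          path: classpath:/org/springframework/batch/core/schema-postgresql.sql
          relativeToChangelogFile: false
          splitStatements: true
          stripComments: true
```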
Drews answered 14/11, 2019 at 10:45 Comment(2)
Ah, awesome, thanks for pointing this out. I didn't realize you could specify an sqlFile.Onitaonlooker
For other databases, the SQL files can be found in the GitHub repo.Tiler
To have the metadata schema created automatically without the use of Liquibase, add

spring.batch.initialize-schema=always

to your application.properties file; Spring Boot will then create the Spring Batch metadata tables in the configured data source on startup.
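Worth noting: the property key changed in later Spring Boot versions, so the exact form depends on the Boot version in use. A sketch of both forms:

```properties
# Spring Boot 2.4 and earlier
spring.batch.initialize-schema=always

# Spring Boot 2.5+ (the old key was deprecated in favor of this one)
spring.batch.jdbc.initialize-schema=always
```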

Grazia answered 1/12, 2020 at 10:26 Comment(1)
While this should work, our goal was to manage all database schema via liquibase to prevent any conflicts.Onitaonlooker
For XML change sets, I had to add an end delimiter as well:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    logicalFilePath="/common/run-once/feature-base.xml"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog 
                             http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.0.xsd">
    <changeSet author="tim.doege" id="feature/spring-batch">
        <sqlFile dbms="mysql" encoding="UTF-8"
            path="classpath:/org/springframework/batch/core/schema-mysql.sql"
            relativeToChangelogFile="false" splitStatements="true"
            endDelimiter=";" stripComments="true" />
    </changeSet>
</databaseChangeLog>
Pomace answered 21/4, 2023 at 10:43 Comment(0)
For SQL changesets, I had to download and unzip the spring-batch-core jar to find schema-postgresql.sql, then copy and paste the SQL statements into Liquibase.

Rubious answered 18/5, 2023 at 6:10 Comment(1)
Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center.Phanerozoic
