Spring Kafka integration test Error while writing to highwatermark file
I'm writing an integration test using spring-kafka 2.2.0 in a Spring Boot application. I'm nearly there: the test case passes, but I still see multiple errors after it runs.

2019-02-21 11:12:35.434 ERROR 5717 --- [       Thread-7] kafka.server.ReplicaManager              : [ReplicaManager broker=0] Error while writing to highwatermark file in directory /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645

org.apache.kafka.common.errors.KafkaStorageException: Error while writing to checkpoint file /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/replication-offset-checkpoint
Caused by: java.io.FileNotFoundException: /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/replication-offset-checkpoint.tmp (No such file or directory)

Test Config

@EnableKafka
@TestConfiguration
public class KafkaProducerConfigTest {

@Bean
public EmbeddedKafkaBroker embeddedKafkaBroker() {
    return new EmbeddedKafkaBroker(1,false,2,"test-events");
}


@Bean
public ProducerFactory<String, Object> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker().getBrokersAsString());
    props.put(ProducerConfig.RETRIES_CONFIG, 0);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return new DefaultKafkaProducerFactory<>(props);
}

@Bean
public KafkaTemplate<String, Object> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}

@Bean("consumerFactory")
public ConsumerFactory<String, Professor> createConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker().getBrokersAsString());
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
    JsonDeserializer<Professor> jsonDeserializer = new JsonDeserializer<>(Professor.class, false);
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), jsonDeserializer);
}

@Bean("kafkaListenerContainerFactory")
public ConcurrentKafkaListenerContainerFactory<String, Professor> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Professor> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(createConsumerFactory());
    factory.setBatchListener(true);
    factory.getContainerProperties().setAckMode(AckMode.BATCH);
    return factory;
}

@Bean
public StringJsonMessageConverter converter() {
    return new StringJsonMessageConverter();
}

@Bean
public Listener listener() {
    return new Listener();
}

public class Listener {
    public final CountDownLatch latch = new CountDownLatch(1);

    @Getter
    public List<Professor> list;

    @KafkaListener(topics = "test-events", containerFactory = "kafkaListenerContainerFactory")
    public void listen1(List<Professor> foo) {
        list = foo;
        this.latch.countDown();
    }
}

}

Test class

@EnableKafka
@SpringBootTest(classes = { KafkaProducerConfigTest.class })
@RunWith(SpringRunner.class)
public class KafkaProducerServiceTest {

@Autowired
private KafkaConsumerService kafkaConsumerService;

@Autowired
private Listener listener;

@Test
public void testReceive() throws Exception {
    Professor professor = new Professor("Ajay", new Department("social", 1234));
    List<Professor> pro = new ArrayList<>();
    pro.add(professor);
    System.out.println(pro);
    kafkaConsumerService.professor(pro);
    System.out.println("The professor object is sent to kafka -----------------------------------");
    listener.latch.await();
    List<Professor> result = listener.getList();
    Professor resultPro = result.get(0);
    System.out.println(result);
    System.out.println(resultPro);

    assertEquals(pro.get(0).getName(), result.get(0).getName());

    }

}

The test case testReceive() passes, but it still produces multiple error messages.

Error 1 with Stack-trace

2019-02-21 11:12:35.434 ERROR 5717 --- [       Thread-7] kafka.server.ReplicaManager              : [ReplicaManager broker=0] Error while writing to highwatermark file in directory /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645

org.apache.kafka.common.errors.KafkaStorageException: Error while writing to checkpoint file /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/replication-offset-checkpoint
Caused by: java.io.FileNotFoundException: /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/replication-offset-checkpoint.tmp (No such file or directory)

Error 2 with Stack-trace

2019-02-21 11:12:35.446  WARN 5717 --- [pool-8-thread-1] kafka.utils.CoreUtils$                   : /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/__consumer_offsets-4/00000000000000000000.index (No such file or directory)

java.io.FileNotFoundException: /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/__consumer_offsets-4/00000000000000000000.index (No such file or directory)

Error 3 with Stack-trace

2019-02-21 11:12:35.451  WARN 5717 --- [pool-8-thread-1] kafka.utils.CoreUtils$                   : /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/test-events-0/00000000000000000000.timeindex (No such file or directory)

java.io.FileNotFoundException: /var/folders/s3/rz83xz3n1j13lgy9mtwkln594g3x0g/T/kafka-1246121392091602645/test-events-0/00000000000000000000.timeindex (No such file or directory)
at java.io.RandomAccessFile.open0(Native Method) ~[na:1.8.0_191]
Delacroix answered 21/2, 2019 at 17:35

Do you actually have permissions to write to /var/folders/s3 ...?

You can override the location with

@Bean
public EmbeddedKafkaBroker embeddedKafkaBroker() {
    return new EmbeddedKafkaBroker(1,false,2,"test-events")
        .brokerProperties(Collections.singletonMap(KafkaConfig.LogDirProp(), "/tmp/foo"));
}
Greyback answered 21/2, 2019 at 21:45
Is there a way to pass a custom port for the embedded Kafka broker? @Gary Russell — Delacroix
Don't ask new questions in comments on old answers; always ask a new question. The broker starts on a random port by default (which is the best idea for CI builds). Use new EmbeddedKafkaBroker(1, false, 2, "test-events").kafkaPorts(1234) to specify a specific port (or ports, if you are starting multiple brokers). — Greyback
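To make the comment above concrete, a fixed-port broker bean might look like the sketch below. The port 9092 is an arbitrary choice for illustration; as noted, the random default is usually preferable for CI builds.

```java
@Bean
public EmbeddedKafkaBroker embeddedKafkaBroker() {
    // kafkaPorts(...) pins the broker to a fixed port instead of the random default.
    return new EmbeddedKafkaBroker(1, false, 2, "test-events")
            .kafkaPorts(9092);
}
```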

I had a similar issue and, with the help of Gary Russell's answer, solved it by pointing the log dir at the build output directory: log.dir=out/embedded-kafka for Gradle, or log.dir=target/embedded-kafka for Maven.

The following code snippet shows how to do it using @EmbeddedKafka.

@ExtendWith(SpringExtension.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE, classes = {Application.class})
@EmbeddedKafka(
        topics = "topic",
        partitions = 1,
        controlledShutdown = true,
        brokerProperties={
                "log.dir=out/embedded-kafka"
        })
@TestPropertySource(
        properties = {
                "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}"
        })
public class OutboxEventsTest {
...
}
Apotropaic answered 5/7, 2019 at 19:29
Just a tip: you can also set the properties with the @SpringBootTest(properties = {"..."}) annotation. — Poly
Gosh. I was using the same configuration for my integration test and it worked like a charm. Wish I could give 10 upvotes here. — Steppe
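As a sketch of the tip in the comment above, the separate @TestPropertySource from the answer could be folded into @SpringBootTest; the class names are the same placeholders used in the answer.

```java
@ExtendWith(SpringExtension.class)
@SpringBootTest(
        webEnvironment = SpringBootTest.WebEnvironment.NONE,
        classes = {Application.class},
        // replaces the separate @TestPropertySource annotation
        properties = {"spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}"})
@EmbeddedKafka(
        topics = "topic",
        partitions = 1,
        controlledShutdown = true,
        brokerProperties = {"log.dir=out/embedded-kafka"})
public class OutboxEventsTest {
    // ...
}
```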

Just change the broker properties for the embedded Kafka broker:

@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = {MyApplication.class})
@TestPropertySource(locations = "classpath:application-test.properties")
@EmbeddedKafka(
        topics = {"my_topic_name"},
        partitions = 1,
        brokerProperties = {"log.dir=target/kafka"}
        )
Aureliaaurelian answered 21/5, 2020 at 15:8
