How do you ingest Spring Boot logs directly into Elasticsearch?

I’m investigating the feasibility of sending Spring Boot application logs directly into Elasticsearch, without using Filebeat or Logstash. I believe the ingest node functionality may help with this.

My initial thought was to do this using Logback over TCP:

https://github.com/logstash/logstash-logback-encoder

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
      <destination>127.0.0.1:4560</destination>
      <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
  </appender>

  <root level="DEBUG">
      <appender-ref ref="stash" />
  </root>
</configuration>

So looking at the above, you can send logs directly into Logstash. I’m just wondering if it is possible to use the newer ingest functionality and skip Logstash entirely, by sending JSON-encoded logs directly to Elasticsearch over the network?

https://www.elastic.co/blog/new-way-to-ingest-part-1

My question

Is this possible? If so, could you explain how you would do it, and what the possible pitfalls would be?
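For context, "sending JSON-encoded logs directly to Elasticsearch" ultimately means an HTTP POST of a JSON document against the index API. Below is a minimal sketch in plain Java of what such a payload looks like; the index name `spring-logs` and pipeline name `logback` are illustrative assumptions, and no request is actually sent so the sketch runs standalone:

```java
// Sketch of what "logs straight into Elasticsearch" means at the HTTP level.
// The index name "spring-logs" and pipeline "logback" are illustrative.
public class DirectEsLogSketch {

    // Build one log event as a JSON document for the ES index API.
    static String toJson(String level, String message) {
        return String.format("{\"level\":\"%s\",\"message\":\"%s\"}", level, message);
    }

    public static void main(String[] args) {
        String doc = toJson("INFO", "Hello from Spring Boot");
        // An appender would POST this body to something like:
        //   http://localhost:9200/spring-logs/_doc?pipeline=logback
        // (e.g. via java.net.http.HttpClient; omitted to stay self-contained)
        System.out.println(doc);
    }
}
```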

Alchemy answered 10/8, 2017 at 13:24 Comment(3)
The TCP appender is not suited for this, as you need to send data over HTTP. I would look at the Loggly HTTP appender instead, which should do the job just fine: loggly.com/docs/java-logback – Neiman
Doesn't look like that's sending it to Elastic? Also it's a paid-for product – Alchemy
You can change the URL, of course, to send it to your own ES. See the source code; it will send your logs to whatever URL you have configured – Neiman

I just tried my suggestion and it worked out perfectly.

First, add this dependency in your POM:

    <dependency>
        <groupId>org.logback-extensions</groupId>
        <artifactId>logback-ext-loggly</artifactId>
        <version>0.1.2</version>
    </dependency>

Then, in your logback.xml configuration, add an appender and a logger like this:

<appender name="ES" class="ch.qos.logback.ext.loggly.LogglyAppender">
    <endpointUrl>http://localhost:9200/tests/test?pipeline=logback</endpointUrl>
    <pattern>%m</pattern>
</appender>
<logger name="es" level="INFO" additivity="false">
    <appender-ref ref="ES"/>
</logger>
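Note that with `%m` as the pattern, only the raw log message is sent as the document body. If you also want the timestamp, level, and logger name as fields, one option (an untested sketch; the field names are my own, and the message must itself be JSON-safe, since quotes in `%m` would break the document) is to have the pattern emit JSON:

```xml
<appender name="ES" class="ch.qos.logback.ext.loggly.LogglyAppender">
    <endpointUrl>http://localhost:9200/tests/test?pipeline=logback</endpointUrl>
    <!-- sketch: emit one JSON document per event; %m must be JSON-safe -->
    <pattern>{"@timestamp":"%d{yyyy-MM-dd'T'HH:mm:ss.SSSXXX}","level":"%level","logger":"%logger","message":"%m"}</pattern>
</appender>
```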

You also need to define an ingest pipeline like this:

PUT _ingest/pipeline/logback
{
  "description": "logback pipeline",
  "processors": [
    {
      "set" : {
        "field": "source",
        "value": "logback"
      }
    }
  ]
}
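Before wiring up the appender, you can dry-run the pipeline with the `_simulate` API to confirm the `set` processor adds the `source` field:

```json
POST _ingest/pipeline/logback/_simulate
{
  "docs": [
    { "_source": { "message": "Hello World from Logback!" } }
  ]
}
```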

Then, in your code, you can use that logger to send whatever data you have to your ES:

private Logger esLogger = LoggerFactory.getLogger("es");
...
esLogger.info("{\"message\": \"Hello World from Logback!\"}");

And this document will end up in your ES:

{
    "_index": "tests",
    "_type": "test",
    "_id": "AV3Psj5MF_PW7ho1yJhQ",
    "_score": 1,
    "_source": {
      "source": "logback",
      "message": "Hello World from Logback!"
    }
}
Neiman answered 11/8, 2017 at 5:19 Comment(1)
Thanks Val, this really helps and it does log messages :-) How do I expand it to include the other standard log fields such as time, host, log level, etc.? I will be using Kibana to make sense of the logs. Also, can you pass in a plain string rather than a JSON-encoded string? – Alchemy

I asked the same question earlier: Direct integration of Logback with Elasticsearch

Take a look at https://github.com/internetitem/logback-elasticsearch-appender

UPDATE: Development of logback-elasticsearch-appender stalled in October 2017, with breaking issues for ES v6.x.

United answered 28/8, 2017 at 5:29 Comment(3)
I believe the issue is now closed – Megalith
It was closed in 2019. The latest release to Maven Central is from 2017. Effectively it is unavailable for consumption: mvnrepository.com/artifact/com.internetitem/… – United
Thank you for the update. Is there any other resource where we can achieve the same? – Megalith