Anyone knows how structured logging is usually implemented with SLF4J?
Is there any open source already out there handling this?
SLF4J added support for structured logging (and a fluent API) in v2.0.0 (alpha as of October 2019):
http://www.slf4j.org/apidocs/org/slf4j/spi/LoggingEventBuilder.html
int newT = 15;
int oldT = 16;
// using classical API
logger.debug("oldT={} newT={} Temperature changed.", oldT, newT);
// using fluent API
logger.atDebug()
.addKeyValue("oldT", oldT)
.addKeyValue("newT", newT)
.log("Temperature changed.");
If you use SLF4J in conjunction with Logback and Logstash, structured logging is supported via StructuredArguments. You can find documentation about this on the logstash-logback-encoder page on GitHub.
A simple example of how it works: this log line..
log.debug("Retrieved file {}", StructuredArguments.value("filename", upload.getOriginalFilename()));
..yields the following log json output:
{
"filename": "simple.zip",
"@timestamp": "2019-02-12T14:31:31.631+00:00",
"severity": "DEBUG",
"service": "upload",
"thread": "http-nio-9091-exec-1",
"logger": "some.great.ClassName",
"message": "Retrieved file simple.zip"
}
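For reference, wiring this up typically means pointing a Logback appender at the Logstash encoder. A minimal logback.xml sketch (the appender name and log level here are illustrative, not required values):

```xml
<configuration>
  <appender name="JSON" class="ch.qos.logback.core.ConsoleAppender">
    <!-- Emits each log event as a single JSON object -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="JSON"/>
  </root>
</configuration>
```

With this in place, StructuredArguments passed to a log call show up as top-level JSON fields, as in the output above.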
This relies on the encoder class net.logstash.logback.encoder.LogstashEncoder. Unfortunately it is not compatible with Log4j or other logging frameworks, and it is even incompatible with other Logback formatters; it only works with its own supplied formatters. – Arron
There is an example on GitHub that is implemented using SLF4J. Hope it will help you.
To learn more, you can go through this tutorial.
FYI - I've just open-sourced a structured logging wrapper for SLF4J. We're using it at my day job to front logging into Splunk, and it's been a big improvement. Maybe you will find it useful.
https://github.com/Randgalt/maple
You define a "schema" and then wrap an SLF4J logger. E.g.
public interface LoggingSchema {
LoggingSchema name(String name);
LoggingSchema date(Instant date);
... etc ...
}
...
MapleLogger<LoggingSchema> logger = MapleFactory.getLogger(slf4j, LoggingSchema.class);
logger.info("my message", s -> s.name("john doe").date(Instant.now()));
This generates an SLF4J log line like:
slf4j.info("my message name=\"john doe\" date=2019-10-08T18:52:15.820177Z");
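The rendering above can be sketched in plain Java. This is a hypothetical illustration of the key=value formatting idea, not Maple's actual implementation (the class and method names here are invented for the sketch):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of how a structured-logging wrapper like Maple
// might render key-value fields into a flat SLF4J message string.
public class KeyValueMessage {
    public static String render(String message, Map<String, Object> fields) {
        StringBuilder sb = new StringBuilder(message);
        for (Map.Entry<String, Object> e : fields.entrySet()) {
            sb.append(' ').append(e.getKey()).append('=');
            Object v = e.getValue();
            // Quote string values so embedded spaces stay parseable downstream
            if (v instanceof String) {
                sb.append('"').append(v).append('"');
            } else {
                sb.append(v);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> fields = new LinkedHashMap<>();
        fields.put("name", "john doe");
        fields.put("count", 3);
        System.out.println(render("my message", fields));
        // prints: my message name="john doe" count=3
    }
}
```

The schema-interface approach in Maple adds compile-time safety on top of this: you can only log fields the schema declares.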
For anyone who stumbles upon this rather old question: as an alternative approach that is far more developer-friendly than manually setting each parameter into MDC, you could use a structured logger like https://github.com/jacek99/structlog4j, ideally along with a YAML or JSON formatter. Then it is really easy to feed the logs to an ELK stack and query all logs based on parameters (you won't have to create a log-entry parser manually, as all relevant fields will already be there in structured form). Or create your own simple logger on top of SLF4J that takes any varargs passed to the .log
method, automatically creates key-value pairs in MDC, and pairs that with a structured formatter, e.g. if your runtime uses Logstash: https://github.com/logstash/logstash-logback-encoder#mdc-fields
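To make the MDC idea concrete: MDC is essentially a per-thread map of key-value pairs that a formatter can append to every log line. Below is a minimal, dependency-free sketch of that mechanism; the real class is org.slf4j.MDC, and this toy version only illustrates how it behaves:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of how an MDC (Mapped Diagnostic Context) works:
// a per-thread map whose entries a formatter attaches to every log line.
public class MiniMdc {
    private static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) {
        CONTEXT.get().put(key, value);
    }

    public static void remove(String key) {
        CONTEXT.get().remove(key);
    }

    // What a structured formatter would do with the context at log time
    public static String format(String message) {
        StringBuilder sb = new StringBuilder(message);
        for (Map.Entry<String, String> e : CONTEXT.get().entrySet()) {
            sb.append(' ').append(e.getKey()).append('=').append(e.getValue());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        put("requestId", "42");
        try {
            System.out.println(format("Handling upload"));
            // prints: Handling upload requestId=42
        } finally {
            remove("requestId"); // clean up so the value doesn't leak into pooled threads
        }
    }
}
```

The try/finally cleanup matters in real code too: with the actual org.slf4j.MDC on a thread pool, stale entries from a previous request would otherwise appear on unrelated log lines.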
You may try to use Logstage in Scala: https://izumi.7mind.io/latest/release/doc/logstage/index.html
We have an effectful adapter for SLF4J, and we perform zero-cost structuring of your logs.
You may also find that we offer advantages beyond replacing MDC: for effect libraries we provide a context for Fiber and FiberLocal, and we generate automatic structure identifiers so logs can be processed more easily in structured databases.
© 2022 - 2024 — McMap. All rights reserved.