Is it a good idea to use Serilog to write logs directly to Elasticsearch?

I'm evaluating different options about the distributed log server.

In the Java world, as far as I can see, the most popular solution is Filebeat + Kafka + Logstash + Elasticsearch + Kibana.

However, in the .NET world there is Serilog, which can send structured logs directly to Elasticsearch. So the only required components are Elasticsearch + Kibana.
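
To make it concrete, this is roughly the kind of setup I have in mind, using the Serilog.Sinks.Elasticsearch sink (the node URI, index format, and property names below are only placeholders for illustration, not a production config):

using System;
using Serilog;
using Serilog.Sinks.Elasticsearch;

// Sketch only: node URI and index format are placeholders.
Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        AutoRegisterTemplate = true,            // create the index template on startup
        IndexFormat = "myapp-logs-{0:yyyy.MM}"  // roll to a new index each month
    })
    .CreateLogger();

// Properties such as OrderId become separate, searchable fields in Elasticsearch.
Log.Information("Order {OrderId} processed in {Elapsed} ms", 42, 17);
Log.CloseAndFlush();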

I have searched a lot, but there is not much information about this solution in production, and I have no idea whether it can handle large volumes of logs.

Can anyone give me some suggestions? Thanks.

Mender answered 4/6, 2018 at 13:51 Comment(0)

I had exactly the same issue. Our system worked with the "classic" ELK stack architecture, i.e. FileBeat -> LogStash -> Elasticsearch (-> Kibana). But as we found out, in big projects with a lot of logs, Serilog is a much better solution, for the following reasons:

  1. CI/CD - when you have different types of logs with different structures that you want indexed differently, Serilog's structured logging comes in handy. In LogStash you need to create a separate filter to break each message down according to its pattern, which creates tight coupling between the log structure and the LogStash configuration - very bug prone. (See the sketch after this list.)
  2. Maintenance - because of the easy CI/CD and the single point of change, it is easier to maintain a large volume of logs.
  3. Scalability - FileBeat has trouble handling big chunks of data because of its registry file, which tends to "explode" - reference from personal experience: Stack Overflow question; Elastic forum question.
  4. Fewer failure points - with Serilog the log is sent directly to Elasticsearch, while with FileBeat you have to pass through LogStash - one more place to fail.
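
To illustrate point 1, a rough sketch of the difference (the grok pattern and property names are made up for this example):

// With LogStash you keep a grok filter in sync with every message format, e.g.:
//   grok { match => { "message" => "Order %{NUMBER:order_id} processed in %{INT:elapsed} ms" } }
// With Serilog the structure travels with the event itself:
Log.Information("Order {OrderId} processed in {Elapsed} ms", order.Id, elapsedMs);
// The sink ships OrderId and Elapsed as separate, queryable JSON fields,
// so changing the message format does not break a downstream filter.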

Hope it helps you with your evaluation.

Syringomyelia answered 14/11, 2018 at 10:34 Comment(1)
I second this answer with an additional point: 5. Costs - using other applications like Logstash and/or Beats, with some queuing mechanism such as SQS mixed in, can increase costs quite heavily compared with only scaling your application and Elasticsearch cluster. – Knives

Update (Dec 2021):

The Elasticsearch logger provider has been moved to the Elastic ECS DotNet project.

Find the latest version here: https://github.com/elastic/ecs-dotnet/blob/master/src/Elasticsearch.Extensions.Logging/ReadMe.md

The NuGet package is here: https://www.nuget.org/packages/Elasticsearch.Extensions.Logging/1.6.0-alpha1

It is still labelled an alpha release (although it has more functionality than the Essential.LoggerProvider version), so currently (Dec 2021) you need to specify the version when adding the package:

dotnet add package Elasticsearch.Extensions.Logging --version 1.6.0-alpha1

Disclaimer: I am the author

ORIGINAL ANSWER

There is now also a stand-alone logger provider that writes .NET Core logging directly to Elasticsearch, following the Elasticsearch Common Schema (ECS) field specifications: https://github.com/sgryphon/essential-logging/tree/master/src/Essential.LoggerProvider.Elasticsearch

To use this from your .NET Core application, add a reference to the Essential.LoggerProvider.Elasticsearch package:

dotnet add package Essential.LoggerProvider.Elasticsearch

Then, add the provider to the loggingBuilder during host construction, using the provided extension method.

using Essential.LoggerProvider;
// ...
    // e.g. chained on Host.CreateDefaultBuilder(args)
    .ConfigureLogging((hostContext, loggingBuilder) =>
    {
        loggingBuilder.AddElasticsearch();
    })
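
Once the provider is registered, you log through the usual Microsoft.Extensions.Logging abstractions; a small sketch (the class and property names below are only for illustration):

using Microsoft.Extensions.Logging;

public class OrderService
{
    private readonly ILogger<OrderService> _logger;

    public OrderService(ILogger<OrderService> logger)
    {
        _logger = logger;
    }

    public void ProcessOrder(int orderId)
    {
        // Structured properties end up as fields on the Elasticsearch document.
        _logger.LogInformation("Processing order {OrderId}", orderId);
    }
}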

The default configuration will write to a local Elasticsearch running at http://localhost:9200/.

Once you have sent some log events, open Kibana (e.g. http://localhost:5601/) and define an index pattern for "dotnet-*" with the time filter "@timestamp".

This reduces the dependencies even more: rather than pulling in the entire Serilog pipeline (App -> Microsoft ILogger -> Serilog provider/adapter -> Elasticsearch sink -> Elasticsearch), you now only have (App -> Microsoft ILogger -> Elasticsearch provider -> Elasticsearch).

The ElasticsearchLoggerProvider also writes events following the Elasticsearch Common Schema (ECS) conventions, so it is compatible with events logged from other sources, e.g. Beats.

Palatine answered 28/3, 2020 at 7:19 Comment(1)
Something similar: github.com/amccool/AM.Extensions.Logging.ElasticSearch – Sindee
