How to decrease Logstash Memory Usage

I am using Logstash 5.6.5 (on Windows) running on a standalone system (no cloud or cluster). I plan to watch some log files and post them to a locally running Elasticsearch. But when I checked Logstash's memory usage, it was around 600 MB even without any configuration to watch a file. When I add file input pipeline configurations, the memory usage grows further (watching 3 log files added about 70 MB, and I plan to add up to 20 logs).

1. Is this the expected behaviour?
2. Is there any way to reduce Logstash's high memory usage?

Panta answered 2/2, 2018 at 6:7

After researching for a couple of days, below is my answer to my own question.

Below are the ways we can optimize Logstash memory:

  1. Logstash memory usage is primarily driven by the JVM heap size. This can be controlled effectively by setting the heap size in the LS_JAVA_OPTS environment variable before launching Logstash, as below (for the Windows version in my case):

    set "LS_JAVA_OPTS=-Xms512m –Xmx512m"
    

Alternatively, this line can be added at the beginning of setup.bat.

In this way I have limited Logstash's total memory usage to a maximum of about 620 MB.
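If setting an environment variable is not convenient, the same heap limits can also be set in config/jvm.options, which Logstash 5.x reads at startup. A minimal sketch (the exact default values shipped in that file may differ):

    # config/jvm.options (excerpt) - initial and maximum heap size
    -Xms512m
    -Xmx512m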

  2. Logstash pipeline configurations (input/filter/output) can be optimized using the methods mentioned here.

In this way I verified that my Logstash filter configurations are optimized; a sketch of the kind of filter tuning meant is shown below.
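As an illustration only (the grok pattern and field names below are hypothetical, not taken from my actual pipeline), typical filter optimizations are anchoring grok patterns so non-matching lines fail fast, running grok only on events that need it, and removing fields that will not be indexed:

filter {
  # Only parse lines that start with a timestamp.
  if [message] =~ /^\d{4}-\d{2}-\d{2}/ {
    grok {
      # Anchored pattern fails fast on lines that do not match.
      match => { "message" => "^%{TIMESTAMP_ISO8601:log_time} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
    }
    date {
      # Use the parsed timestamp as the event time.
      match => [ "log_time", "ISO8601" ]
    }
    mutate {
      # Drop the intermediate field so it is not sent to Elasticsearch.
      remove_field => [ "log_time" ]
    }
  }
}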

  3. The pipeline's file input configuration can also be optimized with the few properties below to ignore or close old log files, as explained here; this prevents Logstash from tracking and keeping open files it no longer needs.

    • ignore_older - in seconds - completely ignore any file whose last modification is older than the given number of seconds
    • max_open_files - a number - limit the maximum number of files held open at once
    • close_older - in seconds - close files that have not been modified within the given number of seconds
    • exclude - array of unwanted file names (with or without wildcards)

In my case I only needed to watch the most recent files and ignore the older ones, so I set the configuration as below:

input {
  file {
    # The application log path; the wildcard matches the rolling log files.
    path => "c:/path/to/log/app-1.0-*.log"

    # Ignore files that have not been modified in the last hour.
    # If an older file gets a new entry, it becomes current again
    # and the new entry will be read by Logstash.
    ignore_older => 3600

    # Keep only the most recent files open for watching.
    # Since I know there won't be more than 5 files, I set it to 5.
    max_open_files => 5

    # If a log file is not updated for 5 minutes, close it.
    # If a new entry gets added, it will be opened again.
    close_older => 300
  }
}
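To verify the effect of these settings, the JVM heap usage can be checked while Logstash is running through its monitoring API (available in Logstash 5.x on the default port 9600, assuming that has not been changed):

    curl -XGET "http://localhost:9600/_node/stats/jvm?pretty"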
Panta answered 6/2, 2018 at 17:51
