Adding fields depending on event message in Logstash not working

I have ELK installed and working on my machine, but now I want to do more complex filtering and add fields depending on the event message.

Specifically, I want to set "id_error" and "descripcio" depending on the message pattern.

I have been trying a lot of code combinations in the "logstash.conf" file, but I am not able to get the expected behavior.

Can someone tell me what I am doing wrong, what I have to do or if this is not possible? Thanks in advance.

This is my "logstash.conf" file with the last test I made, which results in no events being captured in Kibana:

input { 
    file {
        path => "C:\xxx.log"
    }
}

filter {
    grok {
        patterns_dir => "C:\elk\patterns"
        match => [ "message", "%{ERROR2:error2}" ]
        add_field => [ "id_error", "2" ]
        add_field => [ "descripcio", "error2!!!" ]
    }
    grok {
        patterns_dir => "C:\elk\patterns"
        match => [ "message", "%{ERROR1:error1}" ]
        add_field => [ "id_error", "1" ]
        add_field => [ "descripcio", "error1!!!" ]
    }
    if ("_grokparsefailure" in [tags]) { drop {} }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}
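
To check what each event actually looks like after the filters run, a stdout output with the rubydebug codec can be added next to the elasticsearch output. This is only a generic debugging sketch, not part of my original configuration:

output {
  # Temporary debugging output: prints every event with all of its fields and
  # tags to the console, which makes it easy to see whether "_grokparsefailure"
  # was added or whether "id_error"/"descripcio" became arrays.
  stdout { codec => rubydebug }
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}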

I have also tried the following code. It results in the fields "id_error" and "descripcio" containing both values, "[1,2]" and "[error1!!!,error2!!!]" respectively, in every matched event.

As "break_on_match" is set "true" by default, I expect getting only the fields behind the matching clause, but this doesn't occur.

input { 
  file {
    path => "C:\xxx.log"
  }
}

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR1:error1}" ]
    add_field => [ "id_error", "1" ]
    add_field => [ "descripcio", "error1!!!" ]
    match => [ "message", "%{ERROR2:error2}" ]
    add_field => [ "id_error", "2" ]
    add_field => [ "descripcio", "error2!!!" ]
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}
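
Note on "break_on_match": within a single grok filter it applies to multiple patterns listed for the same field; grok tries them in order and stops at the first one that matches. "add_field", on the other hand, is applied once whenever the filter as a whole succeeds, no matter which pattern matched, which is what the "[1,2]" values above suggest: both add_field settings belong to the same filter and are applied together. A minimal sketch of that layout, reusing the ERROR1/ERROR2 patterns and patterns_dir from above (newer hash-style match syntax; the "matched" field is just a made-up example):

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    # break_on_match => true (the default) makes grok stop at the first
    # pattern in this list that matches the message.
    match => { "message" => [ "%{ERROR1:error1}", "%{ERROR2:error2}" ] }
    # add_field runs once, after the filter succeeds, regardless of which
    # pattern matched, so it cannot distinguish ERROR1 from ERROR2 by itself.
    add_field => { "matched" => "true" }
  }
}
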
Pallaton asked 23/4, 2015 at 14:33

Comments (6):
Free: add_field (and add_tag) are only run when the filter itself succeeds, so if your grok{} doesn't match the pattern, the field won't be added.
Pallaton: I have checked all the different error patterns individually and all of them succeed, so that's not the problem.
Free: So your output contains the error1 and error2 fields?
Pallaton: Yes, of course, but that's not the problem. I mean that for every matching event, whether it matches the "error1" pattern or the "error2" pattern, I get a list of values in the "id_error" and "descripcio" fields, like id_error = [1,2] and descripcio = [error1!!!,error2!!!].
Free: That probably means that both groks were matching, so both add_fields would be run, thus creating an array out of your field.
Pallaton: I do not think they can both match the same event. As I said, I had checked all the error patterns individually and all of them were matching correctly. I do not know what the behavior was internally, but now I have working code. Thank you, Alain.

I have solved the problem. I get the expected results with the following code in "logstash.conf":

input { 
  file {
    path => "C:\xxx.log"
  }
}

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR1:error1}" ]
    match => [ "message", "%{ERROR2:error2}" ]
  }
  if [message] =~ /error1_regex/ {
    grok {
        patterns_dir => "C:\elk\patterns"
        match => [ "message", "%{ERROR1:error1}" ]
    }
    mutate {
        add_field => [ "id_error", "1" ]
        add_field => [ "descripcio", "Error1!" ]
        remove_field => [ "message" ]
        remove_field => [ "error1" ]
    }
  }
  else if [message] =~ /error2_regex/ {
    grok {
        patterns_dir => "C:\elk\patterns"
        match => [ "message", "%{ERROR2:error2}" ]
    }
    mutate {
        add_field => [ "id_error", "2" ]
        add_field => [ "descripcio", "Error2!" ]
        remove_field => [ "message" ]
        remove_field => [ "error2" ]
    }
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "xxx-%{+YYYY.MM.dd}"
  }
}
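
A shorter variant should also work: since the first grok already captures either "error1" or "error2", the event can be branched on which field exists instead of matching the raw message a second time. This is just a sketch along the same lines (input and output sections as above), not something I have tested as thoroughly:

filter {
  grok {
    patterns_dir => "C:\elk\patterns"
    match => [ "message", "%{ERROR1:error1}" ]
    match => [ "message", "%{ERROR2:error2}" ]
  }
  # Branch on whichever capture the grok above produced instead of running
  # grok against the message again.
  if [error1] {
    mutate {
      add_field => [ "id_error", "1" ]
      add_field => [ "descripcio", "Error1!" ]
      remove_field => [ "message", "error1" ]
    }
  } else if [error2] {
    mutate {
      add_field => [ "id_error", "2" ]
      add_field => [ "descripcio", "Error2!" ]
      remove_field => [ "message", "error2" ]
    }
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
}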
Pallaton answered 24/4, 2015 at 15:00

Comment (1):
Tortoiseshell: As of Logstash 8.5, does this still work? It doesn't work for me on 8.5.
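
One likely reason this fails on recent releases: the elasticsearch output no longer accepts the "host" and "protocol" options used above; they were replaced by "hosts". A hedged sketch of the output section for a modern Logstash, assuming a default local Elasticsearch:

output {
  elasticsearch {
    # "host" and "protocol" were removed from the elasticsearch output in
    # later releases; "hosts" takes a list of URLs instead.
    hosts => ["http://localhost:9200"]
    index => "xxx-%{+YYYY.MM.dd}"
  }
}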
