Converting date format to YYYY-MM-DD from YYYY/MM/DD HH:MM:SS format in Logstash for nginx error logs
I have nginx error logs of the following form:

2015/09/30 22:19:38 [error] 32317#0: *23 [lua] responses.lua:61: handler(): Cassandra error: Error during UNIQUE check: Cassandra error: connection refused, client: 127.0.0.1, server: , request: "POST /consumers/ HTTP/1.1", host: "localhost:8001"

As mentioned here, I am able to parse these logs.

My filter configuration is as follows:

filter {
  # First pass: extract the timestamp, severity and pid; the rest of the
  # line is kept in "mymessage" for further parsing. Patterns are tried
  # in order, from most to least specific.
  grok {
    match => {
      "message" => [
        "%{DATESTAMP:mydate} \[%{DATA:severity}\] (%{NUMBER:pid:int}#%{NUMBER}: \*%{NUMBER}|\*%{NUMBER}) %{GREEDYDATA:mymessage}",
        "%{DATESTAMP:mydate} \[%{DATA:severity}\] %{GREEDYDATA:mymessage}",
        "%{DATESTAMP:mydate} %{GREEDYDATA:mymessage}"
      ]
    }
    add_tag => ["nginx_error_pattern"]
  }

  if ("nginx_error_pattern" in [tags]) {
    # Second pass: pull the individual key/value pairs out of "mymessage".
    grok {
      match => {
        "mymessage" => [
          "server: %{DATA:[request_server]},"
        ]
      }
    }

    grok {
      match => {
        "mymessage" => [
          "host: \"%{IPORHOST:[request_host]}:%{NUMBER:[port]}\""
        ]
      }
    }

    grok {
      match => {
        "mymessage" => [
          "request: \"%{WORD:[request_method]} %{DATA:[request_uri]} HTTP/%{NUMBER:[request_version]:float}\""
        ]
      }
    }

    grok {
      match => {
        "mymessage" => [
          "client: %{IPORHOST:[clientip]}",
          "client %{IP:[clientip]} "
        ]
      }
    }

    grok {
      match => {
        "mymessage" => [
          "referrer: \"%{DATA:[request_referrer]}\""
        ]
      }
    }
  }
}
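
For reference, the example line above ends up with roughly the following fields (a sketch of the output; the referrer grok simply does not match here, since the line carries no referrer):

"mydate"          => "15/09/30 22:19:38",
"severity"        => "error",
"pid"             => 32317,
"request_server"  => "",
"request_host"    => "localhost",
"port"            => "8001",
"request_method"  => "POST",
"request_uri"     => "/consumers/",
"request_version" => 1.1,
"clientip"        => "127.0.0.1"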

Note that mydate ends up with only a two-digit year (an artifact of how the DATESTAMP pattern matches):

"mydate" => "15/09/30 22:19:38"

Can someone let me know how I can add one more field (say, log_day) holding the date in the form 2015-09-30?

Primalia answered 6/10, 2015 at 12:23 Comment(0)

It is always a good idea to save the time/date in a field of type date. It enables you to do complex range queries with Elasticsearch or Kibana.

You can use Logstash's date filter to parse the date.

Filter:

date {
    match => [ "mydate", "YY/MM/dd HH:mm:ss" ]
}

Result:

"@timestamp" => "2015-09-30T20:19:38.000Z"

The date filter puts the result into the @timestamp field by default. Note that the value is stored in UTC; the two-hour shift from the 22:19 in the log reflects the parsing host's timezone.

To avoid overwriting @timestamp, specify a target field such as "log_day":

Filter:

date {
    match => [ "mydate", "YY/MM/dd HH:mm:ss" ]
    target => "log_day"
}

Result:

"log_day" => "2015-09-30T20:19:38.000Z"

Once you have a field of type date, you can build further operations on it. For example, the date_formatter filter (a community plugin) can create another date field in your desired format:

date_formatter {
    source  => "log_day"
    pattern => "YYYY-MM-dd"
}

Result: "log_day" => "2015-09-30"

Sidwell answered 6/10, 2015 at 13:19 Comment(2)
Thanks @hurb. This is working as expected. I have a follow-up question: can we do this without modifying the @timestamp field? Right now this will overwrite @timestamp with the date in the log. – Primalia
Sure, just add target => "log_day" to your date filter. See my edit. – Sidwell
