how to replace logstash @timestamp with log timestamp

The timestamps in my logs are in the following format:

2016-04-07 18:11:38.169, i.e. yyyy-MM-dd HH:mm:ss.SSS

This log file is not a live one (it is a stored/old file), and I am trying to replace the Logstash @timestamp value with this log timestamp so that the Kibana visualizations reflect the actual log time.

My Logstash filter is as follows:

grok {
    match => {
        "message" => [ "(?<timestamp>(\d){4}-(\d){2}-(\d){2} (\d){2}:(\d){2}:(\d){2}.(\d){3}) %{SYSLOG5424SD} ERROR u%{BASE16FLOAT}.%{JAVACLASS} - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::\"2b948ed5-12c0-4ae0-9b99-f1ee01191001\"- Actual Time taken to process \:\: %{NUMBER:responseTime:int}" ]
    }
}

date {
    match => [ "timestamp:date" , "yyyy-MM-dd HH:mm:ss.SSS Z" ]
    timezone => "UTC"
    target => "@timestamp"
}

But it is not replacing the @timestamp value. The resulting JSON document is:

{
  "_index": "logstash-2017.02.09",
  "_type": "logs",
  "_id": "AVoiZq2ITxwgj2avgkZa",
  "_score": null,
  "_source": {
    "path": "D:\\SoftsandTools\\Kibana\\Logs_ActualTimetakentoprocess.log",
    "@timestamp": "2017-02-09T10:23:58.778Z", **logstash @timestamp**
    "responseTime": 43,
    "@version": "1",
    "host": "4637",
    "message": "2016-04-07 18:07:01.809 [SimpleAsyncTaskExecutor-3] ERROR s.v.wsclient.RestClient - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::\"2b948ed5-12c0-4ae0-9b99-f1ee01191001\"- Actual Time taken to process :: 43",
    "timestamp": "2016-04-07 18:07:01.809"   **Mine time stamp**
  }

Sample log line -

2016-04-07 18:11:38.171 [SimpleAsyncTaskExecutor-1] ERROR s.v.wsclient.RestClient - TransId:2b948ed5-12c0-4ae0-9b99-f1ee01191001 - TransactionId ::"2b948ed5-12c0-4ae0-9b99-f1ee01191001"- Actual Time taken to process :: 521

Could you please help and let me know where I am going wrong here?

Junoesque answered 9/2, 2017 at 10:27 Comment(2)
Please add a sample log line. – Veron
Thanks for the response; updated. – Junoesque

You should basically have a grok match in order to use the timestamp of your log line:

grok {
    patterns_dir => ["give your path/patterns"]
    match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}%{GREEDYDATA}" }          
}

In your patterns file, make sure to have the pattern which matches the timestamp in your log; it could look something like this:

LOGTIMESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}
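
Side note: if you would rather not maintain a patterns file, the stock grok patterns shipped with Logstash already include TIMESTAMP_ISO8601, which matches a layout like 2016-04-07 18:11:38.169. A minimal sketch, keeping the logtimestamp field name from above:

grok {
    # TIMESTAMP_ISO8601 is part of Logstash's default grok patterns,
    # so no patterns_dir is needed for this timestamp layout.
    match => { "message" => "^%{TIMESTAMP_ISO8601:logtimestamp}%{GREEDYDATA}" }
}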

Then, once the grok filtering is done, you can use the captured value like this:

mutate {
    add_field => { "newtimestamp" => "%{logtimestamp}" }
    remove_field => ["logtimestamp"]
}
date {
    match => [ "newtimestamp" , "ISO8601" , "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"   # the field you want to overwrite with the log time
    locale => "en"
    timezone => "UTC"
}
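
To check that the replacement actually happened while you test, you can print the events to the console; @timestamp should then show the time parsed from the log line instead of the ingestion time. A minimal sketch of such an output block:

output {
    # rubydebug pretty-prints every event, including @timestamp, so you
    # can confirm it now carries the parsed log time.
    stdout { codec => rubydebug }
}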

Hope this helps!

Carlton answered 9/2, 2017 at 10:38 Comment(6)
Thanks for the response. I do have my grok, apologies; I have updated the question again. I will try with mutate and update. Thanks again. – Junoesque
Yes please, let me know. – Carlton
And make sure to adjust the grok match to your own log format. – Carlton
@Junoesque any luck on this? – Carlton
Apologies for the delay. I tried the one you suggested; I could not get it through, but a slight tweak on it worked. Below is the date filter that worked for me: date { match => [ "timestamp" , "ISO8601" ] target => "@Logtimestamp" locale => "en" timezone => "UTC" }. I am getting @Logtimestamp as a new date attribute in the Kibana visualization, which I can use for plotting graphs. I did not use mutate, and my grok is %{TIMESTAMP_ISO8601:timestamp}. – Junoesque
You can also copy the logged date/time string to [@metadata][logtimestamp] so you don't have to remove the logtimestamp field with an explicit remove_field clause, since nothing under the @metadata field is written to the output. – Ionize
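
For reference, a minimal sketch of the @metadata variant mentioned in the last comment (the field name is purely illustrative): grok captures straight into [@metadata], the date filter reads from it, and the scratch field never appears in the stored event, so no remove_field step is needed.

grok {
    # Capture the log time directly into @metadata; it is never persisted.
    match => { "message" => "^%{TIMESTAMP_ISO8601:[@metadata][logtimestamp]}%{GREEDYDATA}" }
}
date {
    match => [ "[@metadata][logtimestamp]", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
    timezone => "UTC"
}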

You can use the date filter plugin of Logstash:

date {
    match => ["timestamp", "UNIX"]
}
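
Note that UNIX expects an epoch-seconds value. For the log format in this question, the match would instead need to spell out the layout; a sketch, assuming the field captured by grok is named timestamp:

date {
    # Joda-style pattern matching "2016-04-07 18:11:38.169".
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
    timezone => "UTC"
}
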
Tongue answered 25/12, 2018 at 10:22 Comment(0)
