Data type conversion using logstash grok
Basic is a float field. The index mentioned below does not yet exist in Elasticsearch. When I run the config file with logstash -f, I get no exceptions, yet the data that ends up in Elasticsearch has Basic mapped as a string. How do I rectify this? And how do I do this for multiple fields?

input {  
      file {
          path => "/home/sagnik/work/logstash-1.4.2/bin/promosms_dec15.csv"
          type => "promosms_dec15"
          start_position => "beginning"
          sincedb_path => "/dev/null"
      }
}
filter {
    grok{
        match => [
            "Basic", " %{NUMBER:Basic:float}"
        ]
    }

    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }  
    ruby {
          code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }

}
output {  
    elasticsearch { 
        action => "index"
        host => "localhost"
        index => "promosms-%{+dd.MM.YYYY}"
        workers => 1
    }
}
Km answered 18/12, 2014 at 14:36 Comment(1)
If your CSV is really only two columns, you could grok{} it yourself, then the %{NUMBER:Basic:float} trick will work just fine. – Noncommittal
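A minimal sketch of what that comment suggests, assuming the raw CSV line arrives in the default "message" field; the pattern below is illustrative and not from the original post:

grok {
    # parse both CSV columns in one pass; Basic is cast to float here
    match => [
        "message", "%{DATA:Generation_Date},%{NUMBER:Basic:float}"
    ]
}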
You have two problems. First, your grok filter is listed before the csv filter, and because filters are applied in order, there won't be a "Basic" field to convert when the grok filter runs.

Second, unless you explicitly allow it, grok won't overwrite existing fields. In other words,

grok{
    match => [
        "Basic", " %{NUMBER:Basic:float}"
    ]
}

will always be a no-op. Either specify overwrite => ["Basic"] or, preferably, use mutate's type conversion feature:

mutate {
    convert => ["Basic", "float"]
}
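Putting this together, a minimal sketch of the corrected filter section from the question, assuming the rest of the config stays the same (csv runs first so the "Basic" field exists, then mutate converts it; the ruby date parsing is kept unchanged):

filter {
    csv {
        columns   => ["Generation_Date", "Basic"]
        separator => ","
    }
    mutate {
        # convert the already-parsed field; additional fields can be
        # converted the same way with further convert entries
        convert => ["Basic", "float"]
    }
    ruby {
        code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }
}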
Vedic answered 18/12, 2014 at 15:53 Comment(0)
