Logstash reports [0] _grokparsefailure when parsing logs

I have logs coming in that are in the format shown below, and I have assigned each field to one of the predefined Grok patterns that ship with Logstash, listed underneath the sample line. I believe I have each of these elements mapped properly. However, when I run Logstash it reports [0] "_grokparsefailure", indicating that it is unable to parse the event. I am at a loss for what exactly is wrong with my conf. Does anyone here have any idea what could be causing it? I am pretty new to Logstash. Thanks in advance.

1383834858 0 71.172.136.12 20097903 198.2.20.171 80 TCP_HIT/200 252 HEAD http://podcasts.someserver.com/80830A/podcasts.someserver.com/nyv/voice-film-club/2013/11/the-sexy-god-thor.mp3 - 0 355 "-" "Podcasts/2.0" 33546 "-"

Here is how I have broken it down, field by field:

%{BASE10NUM:timestamp} = 1383834858
%{BASE10NUM:time_taken} = 0
%{IP:clientip} = 71.172.136.12
%{BASE10NUM:filesize} = 20097903
%{IP:serverip} = 198.2.20.171
%{BASE10NUM:port} = 80
%{WORD:status_code} = TCP_HIT/200
%{BASE10NUM:sc_bytes} = 252
%{WORD:method} = HEAD
%{URI:cs_uri} = http://podcasts.someserver.com/80830A/podcasts.someserver.com/nyv/voice-film-club/2013/11/the-sexy-god-thor.mp3
%{NOTSPACE:ignore2} = -
%{BASE10NUM:rs_duration} = 0
%{BASE10NUM:rs_bytes} = 355
%{QS:c_referrer} = "-"
%{QS:user_agent} = "Podcasts/2.0"
%{BASE10NUM:customerid} = 33546
%{QS:ignore} = "-"

My logstash.conf file looks like this:

input {
    #wpa_media logs from the CDN(see puppet module)
    redis {
        type => "wpc_media"
        host => "devredis1.somedomain.com"
        # these settings should match the output of the agent
        data_type => "list"
        key => "wpc_media"
        codec => json
        debug => true
    }
}


filter {
    grok {
        type    => "wpc_media"
        pattern => [ "%{BASE10NUM:timestamp} %{BASE10NUM:time_taken} %{IP:clientip} %{BASE10NUM:filesize} %{IP:serverip} %{BASE10NUM:port} %{WORD:status_code} %{BASE10NUM:sc_bytes} %{WORD:method} %{URI:cs_uri} %{NOTSPACE:ignore2} %{BASE10NUM:rs_duration} %{BASE10NUM:rs_bytes} %{QS:c_referrer} %{QS:user_agent} %{BASE10NUM:customerid} %{QS:ignore} " ]
    }

    mutate {
        #just something to cover up the error not really fixing it
        #remove_tag  => [ "_grokparsefailure" ]
        remove => [ "customer_id", "ignore", "c_referrer", "time_taken" ]
    }
}
output {
    stdout { debug => true debug_format => "ruby"}
}
Waxler asked 7/11, 2013 at 22:44 Comment(0)

For your own reference, the GrokDebugger site is really handy for problems like this.

For the particular log event you provided, %{WORD} is not matching TCP_HIT/200.

One quick fix is to match with %{DATA:status_code} instead (you can see built-in patterns on GitHub). You could certainly build a more targeted match, but it's hard to do so without seeing possible inputs.
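Against the sample event above, that quick fix would capture the whole token, in the same field = value style as the breakdown in the question (untested beyond this one line):

%{DATA:status_code} = TCP_HIT/200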

If you're always expecting word/number, something like (?<status_code>%{WORD}/%{INT}) could work.
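Here is a minimal sketch of the full filter with that capture group swapped in, keeping the 1.x-style grok options and every other token from the question's config, so treat the rest of the pattern as untested:

filter {
    grok {
        # same pattern as in the question; only the status_code token is changed
        type    => "wpc_media"
        pattern => [ "%{BASE10NUM:timestamp} %{BASE10NUM:time_taken} %{IP:clientip} %{BASE10NUM:filesize} %{IP:serverip} %{BASE10NUM:port} (?<status_code>%{WORD}/%{INT}) %{BASE10NUM:sc_bytes} %{WORD:method} %{URI:cs_uri} %{NOTSPACE:ignore2} %{BASE10NUM:rs_duration} %{BASE10NUM:rs_bytes} %{QS:c_referrer} %{QS:user_agent} %{BASE10NUM:customerid} %{QS:ignore}" ]
    }
}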

Grigri answered 7/11, 2013 at 22:54 Comment(1)
That's what did it! I ended up breaking TCP_HIT and 200 into two separate fields, like %{WORD:result_code}/%{INT:status_code}, since after a bit more research they are really two separate results anyway. Thank you for your assistance. - Waxler
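In the same field-by-field style as the breakdown above, that split captures:

%{WORD:result_code} = TCP_HIT
%{INT:status_code} = 200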
