I'm having issues with grok parsing. In Elasticsearch/Kibana the lines I match come up with the tag _grokparsefailure.
Here is my logstash config:
input {
  file {
    type => "logfile"
    path => ["/var/log/mylog.log"]
  }
}

filter {
  if [type] == "logfile" {
    mutate {
      gsub => ["message", "\"", "'"]
    }
    grok {
      match => { "message" => "L %{DATE} - %{TIME}: " }
    }
  }
}

output {
  elasticsearch { host => "localhost" port => 9300 }
}
Line I'm trying to match: L 08/02/2014 - 22:55:49: Log file closed : " finished "
I tried the debugger at http://grokdebug.herokuapp.com/ and it works fine; my pattern matches correctly.
Lines I want to parse might contain double quotes, and I've read there can be issues with the way grok handles and escapes them. So I tried a mutate to replace " with ' to avoid issues, but no luck.
Any ideas? How can I debug this?
Thanks
A good strategy for debugging this is to create a test file containing the expected log line, use a config with input { stdin {} } and output { stdout { codec => rubydebug } }, and then do logstash -f test_conf < test_file and see what's going on. If you do that and post the input/output, it might be easier to help. As is, your filter is correct for the line given and outputs correctly, although you aren't capturing the grok results... for example %{DATE:date}
– Gamboa
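A minimal sketch of the debug setup described in the comment above (the field names `date` and `time` in the named captures are illustrative, not from the original config):

```
# test_conf -- stdin/stdout debug config; run as: logstash -f test_conf < test_file
input {
  stdin {}
}

filter {
  grok {
    # named captures such as %{DATE:date} store the matched text in a field,
    # so it shows up in the rubydebug output below
    match => { "message" => "L %{DATE:date} - %{TIME:time}: " }
  }
}

output {
  # rubydebug prints every event field, including tags like _grokparsefailure
  stdout { codec => rubydebug }
}
```

Feeding the sample line through this config shows whether grok itself fails or whether the problem lies elsewhere in the pipeline (e.g. the file input or the elasticsearch output).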