Configure logstash to read logs from Amazon S3 bucket
I have been trying to configure Logstash to read logs that are being generated in my Amazon S3 bucket, but have not been successful. Details below:

  1. I have installed Logstash on an EC2 instance
  2. My logs are all .gz files in the S3 bucket
  3. The conf file looks like this:
  input {
    s3 {
      access_key_id => "MY_ACCESS_KEY_ID"
      bucket => "MY_BUCKET"
      region => "MY_REGION"
      secret_access_key => "MY_SECRET_ACCESS_KEY"
      prefix => "/"
      type => "s3"
      add_field => { source => gzfiles }
    }
  }

  filter {
    if [type] == "s3" {
      csv {
        columns => [ "date", "time", "x-edge-location", "sc-bytes", "c-ip", "cs-method", "Host", "cs-uri-stem", "sc-status", "Referer", "User-Agent", "cs-uri-query", "Cookie", "x-edge-result-type", "x-edge-request-id" ]  
      }
    }

    if([message] =~ /^#/) {
      drop{}
    } 
  }

  output {
    elasticsearch {
      host => "ELASTICSEARCH_URL" protocol => "http"
    } 
  }
Cement answered 5/8, 2015 at 13:39 Comment(2)
Do you have any error in the log file? – Pimentel
Nopes. Added separator => "\t" within csv to get it working. – Cement
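
Building on the comment above: CloudFront access logs are tab-delimited, so the csv filter needs separator => "\t", and the #-prefixed header lines should be dropped before parsing rather than after. A minimal sketch of the corrected filter section (field names taken from the question; untested against a live bucket):

  filter {
    # CloudFront log files begin with "#Version" / "#Fields" header lines;
    # drop them before they reach the csv filter
    if [message] =~ /^#/ {
      drop {}
    }

    if [type] == "s3" {
      csv {
        # CloudFront access logs use tabs, not commas
        separator => "\t"
        columns => [ "date", "time", "x-edge-location", "sc-bytes", "c-ip", "cs-method", "Host", "cs-uri-stem", "sc-status", "Referer", "User-Agent", "cs-uri-query", "Cookie", "x-edge-result-type", "x-edge-request-id" ]
      }
    }
  }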