Logstash http_poller: first URL request's response should be input to second URL's request param
I have two URLs (due to security concerns I will explain using dummy ones):

 a> https://xyz.company.com/ui/api/token
 b> https://xyz.company.com/request/transaction?date=2016-01-21&token=<tokeninfo>

When you hit the URL in point 'a', it generates a token (say, a string of 16 characters).

That token should then be used as the token parameter in the second request (point 'b').


Updated

 The second URL's response is what matters to me: it is a JSON response. I need
 to filter the JSON data, extract the required fields, and output them to standard
 output and Elasticsearch.

Is there any way of doing this in Logstash using the "http_poller" plugin, or any other plugin?

Note: these request URLs should be executed one after another, i.e. the URL in point 'a' should be executed first, and the URL in point 'b' next, after receiving the new token.

Please suggest.

Bernhardt answered 25/5, 2016 at 11:55 Comment(2)
Do you need this sequence of calls to run only once or repeatedly? – Gay
I need to call these URLs in intervals, let it be every 60 minutes. – Bernhardt

Yes, it's possible with a mix of an http_poller input and an http output.

Here is the config I came up with:

input {
   # 1. trigger new token requests every hour
   http_poller {
     urls => {
       token => "https://xyz.company.com/ui/api/token"
     }
     interval => 3600
     add_field => {"token" => "%{message}"}
   }
}
filter {
}
output {
   # 2. call the API
   http {
     http_method => "get"
     url => "https://xyz.company.com/request/transaction?date=2016-01-21&token=%{token}"
   }
}
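
Note that in more recent versions of the http_poller plugin, the `interval` option was replaced by `schedule`, which takes rufus-scheduler style syntax. A sketch of an equivalent input section (check the option names against your plugin version):

```
input {
   http_poller {
     urls => {
       token => "https://xyz.company.com/ui/api/token"
     }
     # run every hour (replaces the older "interval => 3600")
     schedule => { "every" => "1h" }
     add_field => {"token" => "%{message}"}
   }
}
```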

UPDATE

If you want to get the content of the API call and store it in ES, you need a hybrid solution: set up a cron job that runs a script performing the two HTTP calls and appending the results to a file, then let Logstash tail that file and forward the results to ES.

Shell script to put on cron:

#!/bin/sh

# 1. Get the token
TOKEN=$(curl -s -XGET https://xyz.company.com/ui/api/token)

# 2. Call the API with the token and append JSON to file
curl -s -XGET "https://xyz.company.com/request/transaction?date=2016-01-21&token=$TOKEN" >> api_calls.log

The above script can be set on cron using crontab (or similar), there are plenty of examples out there on how to achieve this.
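
For example, a crontab entry that runs the script every 60 minutes could look like this (the script path and name are placeholders; the script must be executable):

```
# m h dom mon dow  command
0 * * * * /path/to/fetch_transactions.sh
```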

Then the Logstash config can be very simple. It just needs to tail the api_calls.log file and send each document to ES:

input {
    file {
        path => "api_calls.log"
        start_position => "beginning"
    }
}
filter {
    json {
        source => "message"
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "my_index"
        document_type => "my_type"
    }
    stdout {
        codec => "rubydebug"
    }
}
Gay answered 25/5, 2016 at 13:27 Comment(8)
Thank you for the solution, Val, but how will I catch the output of the second URL call? – Bernhardt
That will be logged as an event from which you can then extract the content. – Gay
Does it mean I need to log it to a file, read it, and push it into Elasticsearch? Don't we have any other approach in a single call? Any sample example would help me a lot. Thanks. – Bernhardt
You haven't really explained what you want to do with the API response. Please update your question with any additional useful information and I'll amend my answer. – Gay
Val, do you have any breakthrough... any suggestion will help! – Bernhardt
I have a solution but it involves putting a shell script on cron in addition to some Logstash config. Are you open to that? – Gay
If Logstash alone has limitations and cannot serve the purpose, yes, I may try a shell script or Python. – Bernhardt
Could you please provide the approach to the solution? – Bernhardt

© 2022 - 2024 — McMap. All rights reserved.