I have the following Logstash configuration for importing a few CSV files:
input {
  file {
    path => [
      "C:\Data\Archive_ATS_L1\2016-10-08-00-00_to_2016-10-09-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv",
      "C:\Data\Archive_ATS_L1\2016-10-09-00-00_to_2016-10-10-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv",
      "C:\Data\Archive_ATS_L1\2016-10-10-00-00_to_2016-10-11-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv",
      "C:\Data\Archive_ATS_L1\2016-10-11-00-00_to_2016-10-12-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv",
      "C:\Data\Archive_ATS_L1\2016-10-12-00-00_to_2016-10-13-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv",
      "C:\Data\Archive_ATS_L1\2016-10-13-00-00_to_2016-10-14-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv",
      "C:\Data\Archive_ATS_L1\2016-10-14-00-00_to_2016-10-15-00-00\S2KHistorian\Historian\S2KEventMsg_Table.csv"
    ]
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => [
      "MessageCode", "SourceGuid", "DateTimeGenerated",
      "Code1", "Code2", "Code3", "Code4",
      "LanguageCode", "AlarmSeverity", "Message",
      "Guid1", "Guid2", "Guid3", "Guid4",
      "MessageOrigin", "RequestId",
      "Bool1", "Bool2", "Bool3", "Bool4", "Bool5", "Bool6", "Bool7", "Bool8",
      "Code5", "Code6",
      "Bool9", "Bool10", "Bool11",
      "Code7"
    ]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "S2K"
    workers => 1
  }
  stdout {}
}
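(The plain stdout {} output is what I normally rely on to see events in the console; I understand stdout { codec => rubydebug } would print each event in a more readable form, but I don't think the codec choice matters for my problem.)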
I launch Logstash with this command line:
logstash.bat -f ..\conf\logstash.conf --verbose
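(If it helps, I believe Logstash 2.x, which I assume I'm on since --verbose is still accepted, also has a --configtest flag to validate the config file without starting the pipeline:
logstash.bat -f ..\conf\logstash.conf --configtest
)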
Usually I can see the data being imported into Elasticsearch scroll by in the console. This time, however, all I get is a single line saying "Pipeline main started", and it stays like that.
How can I check from Logstash whether the data was imported? On the Elasticsearch side, I tried running:
curl http://localhost:9200/_aliases
This usually returns the list of indices, but the index from this config (called S2K) is not listed.
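(As an alternative check, I understand the cat API lists every index together with its document count:
curl http://localhost:9200/_cat/indices?v
I assume it reads from the same cluster state as _aliases, so I'd expect the same result.)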
I'm new to ELK, so how can I check whether Logstash is doing its job? Please note that I'm using Windows 7.