I am wondering how to create separate indexes for different logs fetched into Logstash (and later passed on to Elasticsearch), so that in Kibana I can define separate index patterns for them and use Discover on each.
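To make the goal concrete, this is roughly the kind of Logstash output I imagine, not a working config: the `log_type` field, hosts and index names are placeholders.

```
output {
  # Route events to different indexes based on a field set upstream by Filebeat
  if [fields][log_type] == "redis" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "redis-%{+YYYY.MM.dd}"
    }
  } else if [fields][log_type] == "mongodb" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "mongodb-%{+YYYY.MM.dd}"
    }
  } else {
    # Anything not matched above goes to a catch-all index
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "other-%{+YYYY.MM.dd}"
    }
  }
}
```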
In my case, I have a few client servers (each with Filebeat installed) and a centralized log server (ELK). Each client server has different kinds of logs, e.g. `redis.log`, Python logs, MongoDB logs, which I would like to sort into different indexes and store in Elasticsearch.
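On the client side, I picture something like the following in `filebeat.yml`, tagging each kind of log with `document_type` (paths and type names are just examples, and this assumes a Filebeat version that still supports `document_type`):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/redis/redis.log
  document_type: redis
- input_type: log
  paths:
    - /var/log/mongodb/*.log
  document_type: mongodb
- input_type: log
  paths:
    - /var/log/myapp/*.log
  document_type: python-app

# Ship everything to the central Logstash instance
output.logstash:
  hosts: ["elk-server:5044"]
```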
Each client server also serves a different purpose, e.g. databases, UIs, applications. Hence I would also like to give them different index names (by changing the output index in `filebeat.yml`?).
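For the per-server naming, I assume a custom field in `filebeat.yml` could carry the server role; a sketch of what I mean (the field name `server_role` and its values are hypothetical):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log
  # Value would differ per client server, e.g. "db", "ui", "app".
  # Filebeat nests custom fields under "fields" unless fields_under_root is set.
  fields:
    server_role: db

output.logstash:
  hosts: ["elk-server:5044"]
```

If that works the way I expect, the Logstash output could presumably build the index name from it, e.g. `index => "%{[fields][server_role]}-%{+YYYY.MM.dd}"`, though I am not sure this is the recommended approach.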
`document_type` in Filebeat will create a `[@metadata][type]` field in the Logstash event and not a `[type]` field? I think it should read `index => "%{type}-%{+YYYY.MM.dd}"` instead. – Runyan
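Depending on which field the beats input actually populates, the index line in the Logstash output would be one of the following (a sketch of the two alternatives the comment raises, not verified):

```
# if the event carries a top-level "type" field
index => "%{type}-%{+YYYY.MM.dd}"

# if only the metadata field is set by the beats input
index => "%{[@metadata][type]}-%{+YYYY.MM.dd}"
```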