I am receiving 300K+ metrics per minute in a Kafka topic as time series. I want to store and query this data. Grafana is the visualisation tool that satisfies my requirements. To store and query efficiently, I am thinking of storing these time series in Prometheus.
Kafka topic with lot of timeseries -> Prometheus -> Grafana
I am not sure how to achieve this, as Prometheus uses a pull-based scraping model. Even if I write a pull service, will it be able to handle 300K metrics per minute? The messages in the topic look like:
SYS 1, UNIX TIMESTAMP, CPU%, 10
SYS 1, Processor, UNIX TIMESTAMP, CPUCACHE, 10
SYS 2, UNIX TIMESTAMP, CPU%, 30
.....
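To make the pull-service idea concrete, here is a minimal sketch of what I have in mind: a Kafka consumer that keeps only the latest value per series in memory and exposes them on an HTTP endpoint for Prometheus to scrape. The topic name, broker address, port, and record layout (first field = system id, last field = value, second-to-last = metric name) are my assumptions, not givens:

```python
def parse_record(line):
    """Parse one topic record into (system, metric_name, value).

    Assumes the first field is the system id, the last field the value,
    and the second-to-last field the metric name; any fields in between
    (e.g. "Processor") are ignored in this sketch.
    """
    fields = [f.strip() for f in line.split(",")]
    return fields[0], fields[-2], float(fields[-1])


def run_exporter(topic="metrics", bootstrap="localhost:9092", port=8000):
    # Hypothetical wiring; requires kafka-python and prometheus_client.
    from kafka import KafkaConsumer
    from prometheus_client import Gauge, start_http_server

    start_http_server(port)  # Prometheus scrapes http://host:<port>/metrics
    # One gauge labelled by system and metric name. Prometheus only sees
    # the latest value per series at scrape time, not every Kafka message,
    # so the scrape payload stays far smaller than 300K samples.
    gauge = Gauge("kafka_metric", "Latest value from the topic",
                  ["system", "name"])
    consumer = KafkaConsumer(topic, bootstrap_servers=bootstrap)
    for msg in consumer:
        system, metric, value = parse_record(msg.value.decode())
        gauge.labels(system=system, name=metric).set(value)
```

My concern with this approach is that a scrape only captures the value current at scrape time, so intermediate samples between scrapes would be lost; is that acceptable, or is there a better ingestion path?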
Most articles talk about the Kafka exporter/JMX exporter for monitoring Kafka itself. I am not looking for Kafka monitoring; rather, I want to ship the time series data stored in a topic and leverage the Prometheus query language and Grafana to analyze it.